Please use this identifier to cite or link to this item:
|Title:||A Global Multi-year (2000-2007) Validated Burnt Area Product (L3JRC) Derived from Daily SPOT VEGETATION Data|
|Authors:||Tansey Kevin; Grégoire Jean-Marie; Pereira José M.C.; Defourny Pierre; Leigh Roland; Pekel Jean-François; Silva Joao M.; Van Bogaert Eric; Bartholomé Etienne; Bontemps Sophie|
|Citation:||Proceedings of the 6th International Workshop on Advances in Remote Sensing and GIS Applications in Forest Fire Management, p. 154-158|
|Publisher:||Office for Official Publications of the European Communities|
|Type:||Articles in periodicals and books|
|Keywords:||Globe; Burned Area; Fire monitoring; SPOT VEGETATION; L3JRC|
|Abstract:||Global burned area products are in high demand from research groups and communities interested in modelling the carbon cycle, understanding the relationships between fire regime and climate, atmospheric emissions and pollution resulting from fires, and the impact of vegetation burning on land cover change. Currently, global burned area products at medium resolution, such as GBA2000 and GlobScar, are limited to the year 2000. Whilst these global products provided the user community with strong evidence of the scale of global vegetation burning, multi-annual products are needed to strengthen the arguments for relationships between vegetation, climate and fire. Recent initiatives by ESA to implement a number of regional algorithms from the GBA2000 product, combined with GlobScar results over the period 1998-2007, have resulted in some problems caused by scaling up of the algorithms. At this time, we still await the MODIS global burned area product. It is in this context that the L3JRC (pronounced L-three-J-R-C) product has been developed. Its name refers to the consortium of institutions involved in its development: the University of Leicester (UK), the Catholic University of Louvain-la-Neuve (BE), the Tropical Research Institute, Lisbon (PT) and the Joint Research Centre of the European Commission (EU). A single algorithm was used to classify the SPOT-VEGETATION data to burned areas. Originally developed by D. Ershov and colleagues, the algorithm has subsequently been modified by the L3JRC consortium. It makes use of a temporal index in the near infrared channel.
Global, daily, atmospherically corrected SPOT VGT S1 data were used as input. In the pre-processing module, cloud, snow and fire smoke masks are generated. A viewing zenith mask is applied that restricts observations to angles less than 50.5 degrees. A cloud shadow mask is then derived using solar and view azimuth and zenith angles and assuming a constant cloud height of 10 km. Post-processing of the data serves to utilise the latest land cover information to remove some over-detections, believed to be due mainly to shadowing not excluded by the relief/sun shadow mask, the multi-annual detection of leaf-off conditions in temperate regions, and lake melt conditions at high northern latitudes. The GLC2000 land cover product was used to provide updated information on water bodies, snow and ice, bare surfaces and urban areas. The L3JRC product has been evaluated against a large number of Landsat TM and ETM+ image pairs and a number of regional products derived by in situ or remotely sensed means. We evaluate the product's ability to correctly quantify the amount of burnt area by computing comparative values over a global hexagonal grid with a cell spacing of 60 km. This is done over a number of different vegetation and biome types. The L3JRC product is available at full resolution in binary and ASCII format in geographic coordinates (lat-lon).|
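The abstract says only that the classification uses a temporal index in the near-infrared channel; the details of the Ershov algorithm and its L3JRC modifications are not given here. As a minimal illustrative sketch of the general idea — burning sharply depresses NIR reflectance relative to a pixel's recent history — one could flag candidate burned pixels like this; the function name, window length, and threshold are hypothetical, not the values used in L3JRC:

```python
import numpy as np

def temporal_nir_index(nir_series, window=20, drop_threshold=0.3):
    """Flag candidate burned pixels from a daily NIR reflectance series.

    Illustrative sketch only: each day's NIR reflectance is compared with
    its mean over the preceding `window` days, and a pixel is flagged
    where reflectance drops sharply (burning darkens vegetation in the
    near infrared).

    nir_series: array of shape (days, rows, cols), reflectance in [0, 1].
    Returns a boolean array of the same shape (False in the warm-up window).
    """
    days = nir_series.shape[0]
    flags = np.zeros(nir_series.shape, dtype=bool)
    for t in range(window, days):
        # Baseline: mean NIR over the preceding window, per pixel.
        baseline = nir_series[t - window:t].mean(axis=0)
        with np.errstate(divide="ignore", invalid="ignore"):
            rel_drop = (baseline - nir_series[t]) / baseline
        flags[t] = rel_drop > drop_threshold
    return flags
```

In practice such a detector would sit downstream of the cloud, snow, smoke and shadow masks described in the abstract, so that reflectance drops caused by clouds or shadows are not mistaken for fire scars.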
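The cloud shadow mask is essentially a geometry computation: given a cloud observed in the image, its shadow on the ground is displaced by an amount determined by the assumed 10 km cloud height and the solar and viewing angles. A sketch of that geometry is below; the function name, sign conventions (azimuths clockwise from north, pointing toward the sun/sensor) and the single-layer flat-ground assumption are illustrative, not taken from the L3JRC implementation:

```python
import math

def cloud_shadow_offset(cloud_height_m, sun_zenith_deg, sun_azimuth_deg,
                        view_zenith_deg, view_azimuth_deg):
    """Ground offset (east, north) in metres from an observed cloud pixel
    to the expected centre of its shadow, for a fixed cloud height.

    Two displacements combine:
      1. Parallax: a cloud at height h viewed off-nadir appears displaced
         from its true nadir point; correcting it moves the cloud back
         toward the sensor by h * tan(view_zenith).
      2. Shadow casting: the shadow falls away from the sun, displaced
         from the cloud's nadir point by h * tan(sun_zenith).
    Azimuths are degrees clockwise from north, toward the sun/sensor.
    """
    d_view = cloud_height_m * math.tan(math.radians(view_zenith_deg))
    d_sun = cloud_height_m * math.tan(math.radians(sun_zenith_deg))
    az_view = math.radians(view_azimuth_deg)
    az_sun = math.radians(sun_azimuth_deg)
    # Move toward the sensor (parallax), then away from the sun (shadow).
    east = d_view * math.sin(az_view) - d_sun * math.sin(az_sun)
    north = d_view * math.cos(az_view) - d_sun * math.cos(az_sun)
    return east, north
```

For a 10 km cloud and a 45-degree solar zenith this gives a shadow some 10 km from the cloud, which is why a constant-height assumption is workable at 1 km resolution: the mask need only be approximately placed to exclude contaminated pixels.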
|JRC Institute:||Institute for Environment and Sustainability|