Title: Landsat 7 Enhanced Thematic Mapper JRC-FAPAR Algorithm Theoretical Basis Document
Authors: GOBRON Nadine; TABERNER Malcolm
Other identifiers: EUR 23554 EN
Type: EUR - Scientific and Technical Research Reports
Abstract: This Algorithm Theoretical Basis Document (ATBD) describes the Joint Research Centre (JRC) procedure used to retrieve the fraction of photosynthetically active radiation absorbed by vegetated terrestrial surfaces from an analysis of Top Of Atmosphere (TOA) data acquired by the Landsat 7 Enhanced Thematic Mapper Plus (ETM+) instrument. The data consist of eight spectral bands, with a spatial resolution of 30 meters for bands 1 to 5 and band 7, 60 meters for band 6 (thermal infrared), and 15 meters for band 8 (panchromatic). The approximate scene size is 170 km north-south by 183 km east-west. The algorithm takes the form of a set of formulae that transform calibrated spectral directional reflectances into a single numerical value. These formulae are designed to extract the Fraction of Absorbed Photosynthetically Active Radiation (FAPAR) in the plant canopy from the measurements. The methodology described in this document has been optimized to assess the presence of healthy live green vegetation on the ground. The optimization procedure has been constrained to provide an estimate of FAPAR in the plant canopy, although the outputs are expected to be used in a wide range of applications. In addition to the FAPAR product, the algorithm delivers the so-called rectified reflectance values in the red and near-infrared spectral bands (Landsat 7 ETM+ Band 3 and Band 4). These are virtual reflectances largely decontaminated from atmospheric and angular effects. It also provides a categorization of pixel types thanks to a pre-processing identification based on multi-spectral properties.
JRC Institute: Sustainable Resources
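The abstract describes a final step that combines the rectified red and near-infrared reflectances into a single FAPAR value through a parametric formula. A minimal sketch of the generic ratio-of-polynomials form used in JRC-FAPAR-type indices is shown below; the coefficient values here are round placeholders for illustration only, not the optimized Landsat 7 ETM+ coefficients given in the ATBD itself:

```python
def fapar_generic(rho_red_rect, rho_nir_rect, coeffs):
    """Generic ratio-of-polynomials form of JRC-FAPAR-type indices.

    rho_red_rect, rho_nir_rect: rectified reflectances in the red and
    near-infrared bands (Landsat 7 ETM+ Bands 3 and 4), i.e. virtual
    reflectances largely decontaminated from atmospheric and angular
    effects, as described in the abstract.
    coeffs: six coefficients (l1..l6). The actual values are obtained
    by an instrument-specific optimization and are documented in the
    ATBD; the numbers used below are placeholders.
    """
    l1, l2, l3, l4, l5, l6 = coeffs
    numerator = l1 * rho_nir_rect - l2 * rho_red_rect - l3
    denominator = (l4 - rho_red_rect) ** 2 + (l5 - rho_nir_rect) ** 2 + l6
    return numerator / denominator


# Illustrative call with placeholder coefficients: a vegetated pixel
# typically has low rectified red and high rectified NIR reflectance.
placeholder_coeffs = (0.3, 0.3, 0.005, -0.1, -0.1, 0.8)
fapar = fapar_generic(0.05, 0.40, placeholder_coeffs)
```

Because the coefficients are instrument-specific, this sketch only conveys the functional shape: a polynomial in the two rectified reflectances divided by a positive quadratic distance term, which keeps the output bounded and well-behaved.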
Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.