Full metadata record
DC Field: Value (Language)
dc.contributor.author: KISSIYAR Ouns (en_GB)
dc.contributor.author: BARTALEV SVYATOSLAV (en_GB)
dc.contributor.author: ACHARD Frederic (en_GB)
dc.identifier.citation: EARSeL eProceedings vol. 13 no. S1 p. 82-88 (en_GB)
dc.description.abstract: The purpose of this study is to develop a monitoring tool for boreal forest cover change at continental scale and at high resolution. The system is based on Landsat satellite imagery and has been implemented for the epochs 1990, 2000 and 2010. To identify and classify forest cover within a large volume of satellite imagery, a robust methodological approach combining multi-date image segmentation and cluster-based supervised automated classification was chosen. An object-based automatic classification method is thus combined with validation by regional experts to produce regional-scale land cover statistics over Russia and Mongolia. High-resolution satellite imagery is used to accurately estimate land cover and land cover change for these epochs. The overall method consists of four distinct steps: (i) automatic image preprocessing and pre-interpretation, (ii) validation by regional experts, (iii) computation of statistics and (iv) accuracy assessment. The main objective of the automated procedures is to identify objects unequivocally, so as to minimise manual post-classification interventions and visual interpretation. A total of 14 land cover classes are defined in the legend. Given the focus on forests, special attention was devoted to differentiating 8 forest cover types, down to species level. (en_GB)
dc.description.sponsorship: JRC.H.3 - Forest Resources and Climate (en_GB)
dc.publisher: BIS Verlag (en_GB)
dc.title: Monitoring forest cover change in boreal forests: a methodological approach (en_GB)
dc.type: Articles in periodicals and books (en_GB)
JRC Directorate: Sustainable Resources
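The cluster-based supervised classification mentioned in the abstract can be sketched as follows. This is a minimal toy illustration, not the authors' implementation: pixels are first grouped by unsupervised k-means clustering, then each cluster is assigned a land cover class by majority vote over labelled reference samples. All data, class names and parameters here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical two-band "spectra" for two toy land cover classes.
forest = rng.normal(loc=[0.2, 0.6], scale=0.05, size=(100, 2))
water = rng.normal(loc=[0.05, 0.1], scale=0.05, size=(100, 2))
pixels = np.vstack([forest, water])
truth = np.array(["forest"] * 100 + ["water"] * 100)


def kmeans(X, k, iters=20, seed=0):
    """Plain k-means: returns a cluster index for each row of X."""
    r = np.random.default_rng(seed)
    centroids = X[r.choice(len(X), size=k, replace=False)].copy()
    for _ in range(iters):
        # Squared distance of every pixel to every centroid.
        dists = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
        assign = dists.argmin(axis=1)
        for j in range(k):
            if np.any(assign == j):
                centroids[j] = X[assign == j].mean(axis=0)
    return assign


# Step 1: unsupervised clustering of all pixels.
assign = kmeans(pixels, k=2)

# Step 2: label each cluster by majority vote of the reference samples
# that fall into it (the "supervised" part of the cluster-based scheme).
cluster_label = {}
for j in range(2):
    classes, counts = np.unique(truth[assign == j], return_counts=True)
    cluster_label[j] = classes[counts.argmax()]

predicted = np.array([cluster_label[j] for j in assign])
accuracy = float((predicted == truth).mean())
print(f"overall accuracy: {accuracy:.2f}")
```

In a real workflow the clustering would run on segmented multi-date Landsat objects rather than raw pixels, and the cluster labels would come from the regional expert validation step.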

Files in This Item:
There are no files associated with this item.
