Deep Learning Based Burn Area Mapping Using Sentinel-1 for the Santa Cruz Mountains Lightning Complex (CZU) and Creek Fires 2020
The study presented here builds on previous synthetic aperture radar (SAR) burn area estimation models and presents the first U-Net (a convolutional network architecture for fast and precise image segmentation) combined with a ResNet50 encoder (a residual network widely used as a backbone for computer vision tasks) applied to SAR, digital elevation model, and land cover data for burn area mapping in near-real time. Forest fires on the West Coast of the United States have caused devastating environmental, social, and economic losses, and they are increasing in frequency. Methods to effectively monitor burn area are critical for informing resource managers during these events and for ensuring recovery afterwards. The recent launches of the Sentinel-1 and Sentinel-2 satellite constellations have opened productive avenues for research in burn area monitoring. However, existing studies rely primarily on multispectral imagery, and their frameworks consequently cannot be implemented in real time or near-real time. This research area is developing rapidly with the arrival of the Sentinel constellations and the growth of space-market investment and of companies that deliver added value from raw imagery. Here we present a burn area estimation framework utilizing SAR data captured by Sentinel-1, automatic data label generation in Google Earth Engine, and a semi-supervised deep learning framework utilizing a U-Net architecture with a ResNet50 encoder.
LUFT Harrison;
SCHILLACI Calogero;
CECCHERINI Guido;
SIMOES VIEIRA Diana;
LIPANI Aldo;
2022-10-26
MDPI
JRC130082
2571-6255 (online)
https://www.mdpi.com/2571-6255/5/5/163
https://publications.jrc.ec.europa.eu/repository/handle/JRC130082
10.3390/fire5050163 (online)