Advances in Automatic Interpolation for Real-time Mapping
A number of critical environmental variables, such as atmospheric pollutants, radiation levels, seismic activity and rainfall fields, are continuously monitored by a variety of environmental sensors. Broadcasting data from these sensors in real time implies that the data are, to a large extent, processed automatically. In contrast to data obtained by remote sensing techniques, environmental variables monitored by sampling networks must have their measurements interpolated onto a defined grid before a map can be produced. This interpolation step is probably the biggest obstacle to the automatic production of information intended for further modelling and/or decision making, largely because of the variety of methods, and of parameters within those methods, that need to be defined. The subject of spatial interpolation is not new, and many authors have already explored and discussed the relative advantages and drawbacks of various mapping algorithms applied to different problems, without arriving at a decisive answer regarding a "best" method. Developments in the field of spatial statistics, and the parallel growth of practitioners' awareness of them in recent years, have encouraged the creation of increasingly complex Geographic Information Systems (GIS). As a result, maps of higher accuracy and lower uncertainty are more likely to be produced today than a decade ago. The main cost of these improvements is that GIS have become more difficult to operate, a serious drawback in emergency situations when maps need to be generated as quickly as possible. This complexity is also an issue when maps need to be processed in batch without any human intervention, such as when services are chained in a service-oriented architecture. These obstacles to real-time mapping are, however, only secondary to the many statistical challenges an automatic mapping algorithm has to deal with.
While mapping algorithms tailored to the routine monitoring of a given variable can be designed accurately using historical records and prior knowledge of the monitored process, things become much more complicated in emergencies, when extreme values are likely to appear. One may thus legitimately ask whether building generic real-time mapping algorithms is possible, or even advisable.
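To make the interpolation step concrete, the following is a minimal sketch of mapping sparse sensor readings onto a regular grid using inverse distance weighting, one of the simpler deterministic interpolators discussed in the automatic-mapping literature. The function name, the toy station layout and the readings are illustrative assumptions, not part of the original work; a production system would more likely use geostatistical methods such as kriging.

```python
import numpy as np

def idw_interpolate(xy_obs, values, xy_grid, power=2.0, eps=1e-12):
    """Inverse-distance-weighted estimate of `values` at each grid point."""
    # Pairwise distances between grid points and stations: shape (n_grid, n_obs)
    d = np.linalg.norm(xy_grid[:, None, :] - xy_obs[None, :, :], axis=2)
    # Weights decay with distance; eps avoids division by zero at a station
    w = 1.0 / np.maximum(d, eps) ** power
    # Weighted average of the observations at every grid node
    return (w @ values) / w.sum(axis=1)

# Toy sensor network: four monitoring stations with measured values
stations = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
readings = np.array([1.0, 2.0, 3.0, 4.0])

# Regular 3x3 grid over the unit square, flattened to (9, 2) coordinates
gx, gy = np.meshgrid(np.linspace(0, 1, 3), np.linspace(0, 1, 3))
grid = np.column_stack([gx.ravel(), gy.ravel()])

field = idw_interpolate(stations, readings, grid)
```

The estimate honours the observations (a grid node coinciding with a station reproduces its reading) and stays within the range of the measured values, which illustrates one of the statistical challenges the paper raises: a purely deterministic interpolator of this kind cannot extrapolate the extreme values that matter most in emergencies.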
Author: DUBOIS Gregoire
Publication date: 2009-01-30
Publisher: SPRINGER
JRC publication ID: JRC37231
ISSN: 1436-3240
URI: https://publications.jrc.ec.europa.eu/repository/handle/JRC37231
DOI: 10.1007/s00477-007-0159-5