|Title:||An automated method for detection of selected non-conformities within LPIS database based on image object extraction approach|
|Abstract:||The Land Parcel Identification System (LPIS) is a spatial database of reference parcels managed by Member States in the context of the Common Agricultural Policy (CAP). The maximum eligible area is a crucial attribute of a reference parcel for area-based subsidies. Ineligible elements included in a reference parcel create non-conformities, or anomalies. The database should be kept up to date with respect to all elements that are ineligible for subsidies. To achieve this, Member States renew their entire orthocoverage every 3 to 5 years. As a consequence, the boundaries of millions of reference parcels in the LPIS should then be verified by operators through Computer Aided Photo Interpretation. However, a fully manual revision (parcel by parcel) is time consuming and expensive, so automatic detection of parcels that potentially need to be updated could significantly improve time efficiency and reduce the cost of the update process. The aim of this thesis was to develop an automated method for the detection of non-conformities in the LPIS in order to support a systematic update process. The proposed method uses the data available in the LPIS and considers the actual update needs. In order to understand the essential update needs and to address the challenge correctly, extensive analyses of anomalies in the systems were undertaken. In the survey, more than 21 000 parcels in 12 zones located in 11 Member States were examined. In total, close to 10 000 anomalies were found. More than 50% of all identified anomalies were caused by patches of trees and buildings located within the reference parcels. The further research work was therefore focused on extracting the outlines of the selected objects (buildings and patches of trees). 
To detect the selected objects, the developed object extraction method fuses a radiometric image characteristic (NDVI) with height data (a normalised Digital Surface Model). Two approaches were combined: a pixel-based approach, to localise the potential target objects (called object primitives), and an object-based approach, to delineate the outlines of buildings and patches of trees. This combination reduced the processing time, since the object-based method was applied only to the image subsets where the objects are located, instead of applying computationally heavy analysis to the entire image scene. Moreover, localising the potential candidates gave better class separation, because class variability is lower in the small image subsets containing the primitives than in the full image space. In the object-based approach, the Mean Shift segmentation algorithm, developed in the field of Computer Vision, was successfully applied to aerial ortho-imagery in order to partition the image into adjacent regions. To assure the correctness of the segmentation, considered a key element of object-based analyses, the segmentation parameters were optimised to the characteristics of the target objects. It was demonstrated that, using a standard LPIS image dataset as input to the proposed object extraction method, it is possible to automatically detect 96% of potential non-conformities caused by buildings and 100% of those caused by patches of trees. The UltraCamD image data were found suitable for automated image analyses and capable of serving as a source for Digital Surface Model generation using commercial off-the-shelf software in the agricultural environment. In line with the objectives of the thesis, to reach an automated solution for detecting the objects causing the LPIS non-conformities, the proposed object extraction method was implemented in the IDL programming environment.|
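The pixel-based localisation step described above can be sketched as a simple fusion of NDVI with nDSM height: tall, vegetated pixels are tree candidates, while tall, non-vegetated pixels are building candidates. The following Python/NumPy sketch is illustrative only; the function name and all thresholds are assumptions, not values taken from the thesis (which was implemented in IDL):

```python
import numpy as np

def candidate_mask(red, nir, ndsm,
                   ndvi_tree=0.4, ndvi_bldg=0.2, min_height=2.5):
    """Flag candidate pixels for trees and buildings by fusing NDVI
    (radiometric cue) with a normalised Digital Surface Model (height cue).

    All parameter values are illustrative assumptions.
    """
    # NDVI = (NIR - R) / (NIR + R); guard against division by zero
    ndvi = (nir - red) / np.maximum(nir + red, 1e-6)
    elevated = ndsm >= min_height            # above-ground objects only
    trees = elevated & (ndvi >= ndvi_tree)   # tall and vegetated
    buildings = elevated & (ndvi <= ndvi_bldg)  # tall and non-vegetated
    return trees, buildings
```

In the described workflow, connected regions of such candidate pixels (the "object primitives") would then be passed to the object-based Mean Shift segmentation stage for outline delineation, so the expensive segmentation runs only on small image subsets.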
|JRC Institute:||Institute for Environment and Sustainability|