Title: A Multi-Scale Expectation Maximization Semisupervised Classifier Suitable for Badly-Posed Image Classification
Authors: BARALDI Andrea; BRUZZONE Lorenzo; BLONDA Palma
Citation: IEEE TRANSACTIONS ON IMAGE PROCESSING, vol. 15, no. 8, p. 2208-2225
Publication Year: 2006
JRC Publication N°: JRC32615
URI: http://publications.jrc.ec.europa.eu/repository/handle/JRC32615
Type: Articles in Journals
Abstract: This paper deals with the problem of badly-posed image classification. Although underestimated in practice, bad-posedness is likely to affect many real-world image classification tasks where reference samples are difficult to collect (e.g., in Remote Sensing (RS) image mapping) and/or spatial autocorrelation is relevant. For image classification contexts affected by a lack of reference samples, an original inductive-learning multi-scale image classifier, termed the Multi-scale Semisupervised Expectation Maximization (MSEM) classifier, is proposed. The rationale behind MSEM is to combine useful complementary properties of two alternative data mapping procedures recently published outside of the image processing literature, namely, the multi-scale Modified Pappas Adaptive Clustering (MPAC) algorithm and the sample-based Semisupervised Expectation Maximization (SEM) classifier. To demonstrate its potential utility, MSEM is compared against non-standard classifiers, such as MPAC, SEM and the single-scale Contextual SEM (CSEM) classifier, as well as against well-known standard classifiers, in two RS image classification problems featuring few reference samples and modestly useful texture information. These experiments yield numerous quantitative map quality indexes which, although individually weak (subjective), are consistent with both theoretical considerations and qualitative evaluations by expert photointerpreters. According to these quantitative results, MSEM is competitive in terms of overall image mapping performance at the cost of a computational overhead three to six times that of its most interesting rival, SEM. More generally, our experiments confirm that, even though they rely on strong class-conditional normal distribution assumptions that may not hold in many real-world problems (e.g., in highly textured images), semisupervised classifiers based on the iterative Expectation Maximization (EM) Gaussian mixture model solution can be very powerful in practice when: i) there is a lack of reference samples with respect to the problem/model complexity and ii) texture information is considered negligible (i.e., a piecewise constant image model holds).
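
Note: As an illustration of the core technique the abstract refers to, semisupervised classification via the iterative EM solution of a Gaussian mixture model, the following minimal Python sketch fits one class-conditional Gaussian per class from a small labeled set plus a pool of unlabeled samples. This is a hedged sketch under simplifying assumptions, not the paper's SEM or MSEM implementation: the function name semisupervised_em and all of its parameters are hypothetical, the contextual (CSEM) and multi-scale (MSEM) machinery is omitted, and each class is assumed to have at least a few labeled samples.

import numpy as np
from scipy.stats import multivariate_normal

def semisupervised_em(X_lab, y_lab, X_unl, n_classes, n_iter=50, reg=1e-6):
    # Hypothetical sketch: one Gaussian per class, labeled responsibilities
    # fixed at their one-hot class indicators, unlabeled ones re-estimated.
    d = X_lab.shape[1]
    # Initialize each class-conditional Gaussian from labeled samples only.
    means = np.array([X_lab[y_lab == k].mean(axis=0) for k in range(n_classes)])
    covs = np.array([np.cov(X_lab[y_lab == k], rowvar=False) + reg * np.eye(d)
                     for k in range(n_classes)])
    priors = np.bincount(y_lab, minlength=n_classes) / len(y_lab)
    R_lab = np.eye(n_classes)[y_lab]  # labeled responsibilities stay one-hot

    for _ in range(n_iter):
        # E-step: posterior class memberships for the unlabeled samples.
        lik = np.column_stack([
            priors[k] * multivariate_normal.pdf(X_unl, means[k], covs[k])
            for k in range(n_classes)])
        R_unl = lik / lik.sum(axis=1, keepdims=True)

        # M-step: re-estimate priors, means and covariances from all samples.
        R = np.vstack([R_lab, R_unl])
        X = np.vstack([X_lab, X_unl])
        Nk = R.sum(axis=0)
        priors = Nk / len(X)
        means = (R.T @ X) / Nk[:, None]
        for k in range(n_classes):
            diff = X - means[k]
            covs[k] = (R[:, k, None] * diff).T @ diff / Nk[k] + reg * np.eye(d)

    return priors, means, covs

def classify(X, priors, means, covs):
    # Maximum a posteriori labeling under the fitted mixture.
    post = np.column_stack([
        priors[k] * multivariate_normal.pdf(X, means[k], covs[k])
        for k in range(len(priors))])
    return post.argmax(axis=1)

On a toy two-class problem (two well-separated Gaussian blobs, ten labeled samples per class, a few hundred unlabeled samples), the unlabeled pool typically tightens the parameter estimates relative to using the labeled samples alone, which mirrors the abstract's claim that EM mixture classifiers help when reference samples are scarce and a piecewise constant image model holds.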
JRC Institute: Institute for the Protection and Security of the Citizen
