Improving ferns ensembles by sparsifying and quantising posterior probabilities
Ferns ensembles offer accurate and efficient multiclass non-linear classification, commonly at the expense of a large memory footprint. We introduce a two-fold contribution that produces large reductions in their memory consumption. First, an efficient L0-regularised cost optimisation finds a sparse representation of the posterior probabilities in the ensemble by discarding elements with zero contribution to valid responses in the training samples. As a by-product, this can produce a prediction accuracy gain that, if required, can be traded for further reductions in memory size and prediction time. Second, the posterior probabilities are quantised and stored in a memory-friendly sparse data structure. We report a memory reduction of at least 75% for different types of classification problems using generative and discriminative ferns ensembles, without increasing prediction time or classification error. For image patch recognition, our proposal produced a 90% memory reduction and improved prediction accuracy by several percentage points.
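The sketch below is a minimal Python illustration of the storage idea described in the abstract, not the authors' implementation: per-leaf class posteriors of a fern are pruned, quantised to 8-bit codes, and kept in a CSR-like sparse layout that is read back at prediction time. The magnitude threshold used as the pruning criterion is an assumption standing in for the paper's L0-regularised optimisation, and all function and parameter names are hypothetical.

```python
import numpy as np

def sparsify_and_quantise(posteriors, threshold=1e-3, n_bits=8):
    """posteriors: float array (n_leaves, n_classes) of per-leaf class scores.

    Drops near-zero entries (surrogate for the L0-regularised selection),
    quantises the survivors to n_bits integers, and returns a CSR-like
    sparse table plus the de-quantisation parameters.
    """
    kept = np.abs(posteriors) > threshold
    lo, hi = posteriors[kept].min(), posteriors[kept].max()
    scale = (2 ** n_bits - 1) / (hi - lo + 1e-12)

    values, col_idx, row_ptr = [], [], [0]
    for leaf in posteriors:
        for c, p in enumerate(leaf):
            if abs(p) > threshold:
                values.append(int(round((p - lo) * scale)))  # quantised code
                col_idx.append(c)
        row_ptr.append(len(values))          # one row per fern leaf

    return (np.asarray(values, dtype=np.uint8),
            np.asarray(col_idx, dtype=np.uint16),
            np.asarray(row_ptr, dtype=np.uint32),
            (lo, scale))

def predict(leaf_indices, sparse_tables, n_classes):
    """Accumulate de-quantised scores over all ferns for one test sample.

    leaf_indices[i] is the leaf reached in fern i; sparse_tables[i] is the
    tuple returned by sparsify_and_quantise for that fern.
    """
    scores = np.zeros(n_classes)
    for i, (values, col_idx, row_ptr, (lo, scale)) in enumerate(sparse_tables):
        leaf = leaf_indices[i]
        start, end = row_ptr[leaf], row_ptr[leaf + 1]
        scores[col_idx[start:end]] += values[start:end] / scale + lo
    return int(np.argmax(scores))
```

Only the non-zero codes, their class indices, and one row pointer per leaf are stored, which is where the memory saving over a dense per-leaf probability table comes from.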
RODRIGUEZ LOPEZ Antonio;
SEQUEIRA Vitor;
2016-04-21
INSTITUTE OF ELECTRICAL AND ELECTRONICS ENGINEERS
JRC97320
1550-5499
http://www.cv-foundation.org/openaccess/content_iccv_2015/papers/Rodriguez_Improving_Ferns_Ensembles_ICCV_2015_paper.pdf
https://publications.jrc.ec.europa.eu/repository/handle/JRC97320
10.1109/ICCV.2015.467