
A visual vocabulary for flower classification

Summary


Maria-Elena Nilsback and Andrew Zisserman. A visual vocabulary for flower classification. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, volume 2, pages 1447–1454, Los Alamitos, CA, USA, 2006.

Abstract

We investigate to what extent "bag of visual words" models can be used to distinguish categories which have significant visual similarity. To this end we develop and optimize a nearest neighbour classifier architecture, which is evaluated on a very challenging database of flower images. The flower categories are chosen to be indistinguishable on colour alone (for example), and have considerable variation in shape, scale, and viewpoint. We demonstrate that by developing a visual vocabulary that explicitly represents the various aspects (colour, shape, and texture) that distinguish one flower from another, we can overcome the ambiguities that exist between flower categories. The novelty lies in the vocabulary used for each aspect, and how these vocabularies are combined into a final classifier. The various stages of the classifier (vocabulary selection and combination) are each optimized on a validation set. Results are presented on a dataset of 1360 images consisting of 17 flower species. It is shown that excellent performance can be achieved, far surpassing standard baseline algorithms using (for example) colour cues alone.
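The abstract outlines a bag-of-visual-words pipeline: local descriptors are quantised against a learned vocabulary, each image becomes a histogram of visual-word frequencies, and a nearest-neighbour classifier compares histograms. The sketch below illustrates that general pipeline in minimal form; it is not the paper's implementation (which builds separate colour, shape, and texture vocabularies and combines them), and the function names, the plain k-means vocabulary, and the χ² distance are illustrative assumptions.

```python
import numpy as np

def build_vocabulary(descriptors, k, iters=20, seed=0):
    """Cluster local descriptors into k visual words with naive k-means.

    A stand-in for the paper's per-aspect vocabulary construction.
    """
    rng = np.random.default_rng(seed)
    centres = descriptors[rng.choice(len(descriptors), k, replace=False)]
    for _ in range(iters):
        # Assign each descriptor to its nearest centre, then recompute means.
        dists = np.linalg.norm(descriptors[:, None] - centres[None], axis=2)
        labels = dists.argmin(axis=1)
        for j in range(k):
            members = descriptors[labels == j]
            if len(members):
                centres[j] = members.mean(axis=0)
    return centres

def bow_histogram(descriptors, centres):
    """Quantise an image's descriptors and return a normalised word histogram."""
    dists = np.linalg.norm(descriptors[:, None] - centres[None], axis=2)
    words = dists.argmin(axis=1)
    hist = np.bincount(words, minlength=len(centres)).astype(float)
    return hist / hist.sum()

def nearest_neighbour(query_hist, train_hists, train_labels):
    """Classify by the chi-squared-nearest training histogram."""
    eps = 1e-10  # avoid division by zero on empty bins
    d = 0.5 * np.sum((train_hists - query_hist) ** 2
                     / (train_hists + query_hist + eps), axis=1)
    return train_labels[int(d.argmin())]
```

In the paper, one such histogram is computed per aspect (colour, shape, texture), and the per-aspect distances are combined with weights tuned on a validation set before the final nearest-neighbour decision.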

Bibtex entry

@INPROCEEDINGS { mnilsback-2006a,
    AUTHOR = { Maria-Elena Nilsback and Andrew Zisserman },
    TITLE = { A visual vocabulary for flower classification },
    BOOKTITLE = { Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition },
    PUBLISHER = { IEEE Computer Society },
    YEAR = { 2006 },
    ADDRESS = { Los Alamitos, CA, USA },
    VOLUME = { 2 },
    PAGES = { 1447--1454 },
    DOI = { 10.1109/CVPR.2006.42 },
}