Fast, Energy-Efficient Content-Based Image Retrieval (CBIR)

Large media and consumer-generated image databases are growing so fast that their utility will depend on automated search rather than manual labeling. Current image classifiers are performance-limited, both because low error rates require processing very high-dimensional feature spaces and because user response times must be kept short. We are using Intel's PIRO code-base and will apply it to consumer and industrial image databases.

Useful search of large media databases is currently possible only through explicit user-supplied tags. Content-based image retrieval (CBIR) systems [Smeulders, Worring et al. 2000] instead search large image databases using implicit criteria generated from a set of positive and negative examples. Such functionality is quickly becoming a necessity, since the value of an image database depends on the ability to find relevant images. The system core is a set of feature extractors and classifiers. The feature extraction routines analyze each image to compute a set of signatures, while the classifiers learn boundaries separating relevant from irrelevant images and then classify the remaining images in the database. The classifiers are based on support-vector machines and k-nearest-neighbor classifiers (both use matrix dwarfs). Current image classification systems are performance-limited. Manycore will enable more intelligent feature extraction and more precise queries, thereby increasing the relevance of search results.

We will leverage Intel's PIRO (Personal Image Retrieval Organizer) code-base, which consists of 35K lines of code for the base content-based image retrieval system and ~165K lines of libraries for feature extraction and other utilities. We have already performed a conceptual re-architecting of PIRO using a hierarchical pipe-and-filter stream pattern at the top level. The feature extractor, classifier trainer, and classifier elements of PIRO are then filters based on variants of the MapReduce [Dean and Ghemawat 2004] pattern.
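As a rough illustration of the classifier core described above, the sketch below trains a support-vector machine on signatures extracted from positive and negative example images and then ranks the remaining database images by relevance. It is a minimal, hypothetical sketch: the toy color-histogram signature and the scikit-learn SVC classifier stand in for PIRO's far richer feature extractors and classifiers, and none of the names below come from the PIRO code-base.

    # Minimal sketch of the CBIR classifier core: a signature (feature vector) is
    # extracted from each image, an SVM is trained on positive/negative examples,
    # and the remaining database images are ranked by relevance.
    # Toy color-histogram features and scikit-learn's SVC; not PIRO code.
    import numpy as np
    from sklearn.svm import SVC

    def color_histogram(image, bins=8):
        """Toy feature extractor: per-channel intensity histogram of an RGB image
        given as an (H, W, 3) uint8 array, concatenated into one signature."""
        hist = [np.histogram(image[..., c], bins=bins, range=(0, 256))[0]
                for c in range(3)]
        sig = np.concatenate(hist).astype(float)
        return sig / sig.sum()            # normalize so image size does not matter

    def train_relevance_classifier(positive_images, negative_images):
        """Learn a boundary separating relevant (positive) from irrelevant
        (negative) example images."""
        X = np.array([color_histogram(im) for im in positive_images + negative_images])
        y = np.array([1] * len(positive_images) + [0] * len(negative_images))
        clf = SVC(kernel="rbf", probability=True)
        clf.fit(X, y)
        return clf

    def rank_database(clf, database_images):
        """Score every remaining database image by its probability of relevance."""
        X = np.array([color_histogram(im) for im in database_images])
        scores = clf.predict_proba(X)[:, 1]
        return np.argsort(scores)[::-1]   # indices of images, most relevant first

    # Example with random "images"; replace with real decoded image arrays.
    rng = np.random.default_rng(0)
    fake = lambda: rng.integers(0, 256, size=(64, 64, 3), dtype=np.uint8)
    clf = train_relevance_classifier([fake() for _ in range(5)],
                                     [fake() for _ in range(5)])
    print(rank_database(clf, [fake() for _ in range(20)])[:5])

A k-nearest-neighbor classifier can be substituted for the SVM in the same structure; in either case the per-image feature extraction and scoring are independent, which is what makes the workload a good target for manycore parallelism.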
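The second sketch illustrates only the intended top-level structure: a hierarchical pipe-and-filter stream in which each filter follows a MapReduce-style pattern, with an independent per-item map stage (the natural target for manycore parallelism) followed by a reduce stage that combines partial results. The MapReduceFilter and run_pipeline names and the toy filter bodies are hypothetical, not taken from PIRO.

    # Structural sketch of the re-architected top level: a pipe-and-filter stream
    # whose filters (feature extractor, classifier trainer, classifier) each follow
    # a MapReduce-style pattern: an independent per-item "map" stage that can be
    # spread across cores, then a "reduce" stage that combines partial results.
    # The filter bodies below are toy placeholders, not PIRO code.
    from concurrent.futures import ThreadPoolExecutor
    from functools import reduce

    class MapReduceFilter:
        """One pipeline filter: map a function over incoming items in parallel,
        then reduce the partial results into the value passed downstream."""
        def __init__(self, map_fn, reduce_fn, initial):
            self.map_fn, self.reduce_fn, self.initial = map_fn, reduce_fn, initial

        def __call__(self, items):
            with ThreadPoolExecutor() as pool:                      # map: data-parallel
                partials = list(pool.map(self.map_fn, items))
            return reduce(self.reduce_fn, partials, self.initial)   # reduce: combine

    def run_pipeline(filters, source):
        """Pipe-and-filter composition: each filter's output feeds the next."""
        data = source
        for f in filters:
            data = f(data)
        return data

    # Toy "feature extractor" filter: image name -> (name, fake signature).
    extract = MapReduceFilter(map_fn=lambda name: (name, hash(name) % 100),
                              reduce_fn=lambda acc, item: acc + [item],
                              initial=[])
    # Toy "classifier" filter: keep images whose signature clears a threshold.
    # (A classifier-trainer filter would fit the same mold, reducing labeled
    # signatures into a trained model instead of a list.)
    classify = MapReduceFilter(map_fn=lambda item: item if item[1] > 50 else None,
                               reduce_fn=lambda acc, item: acc + [item[0]] if item else acc,
                               initial=[])

    relevant = run_pipeline([extract, classify],
                            ["img%03d.jpg" % i for i in range(8)])
    print(relevant)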