Image Recognition in Context: Application to Microscopic Urinalysis

Part of Advances in Neural Information Processing Systems 12 (NIPS 1999)

Authors

Xubo Song, Joseph Sill, Yaser Abu-Mostafa, Harvey Kasdan

Abstract

We propose a new and efficient technique for incorporating contextual information into object classification. Most current techniques face the problem of exponential computational cost. In this paper, we propose a new general framework that incorporates partial context at a linear cost. This technique is applied to microscopic urinalysis image recognition, resulting in a significant improvement in recognition rate over the context-free approach. This gain would have been impossible using conventional context incorporation techniques.

1 BACKGROUND: RECOGNITION IN CONTEXT

There are a number of pattern recognition problem domains where the classification of an object should be based on more than simply the appearance of the object itself. In remote sensing image classification, where each pixel is part of the ground cover, a pixel is more likely to be a glacier if it lies in a mountainous area than if it is surrounded by pixels of residential areas. In text analysis, one can expect to find certain letters occurring regularly in particular arrangements with other letters (qu, ee, est, tion, etc.). The information conveyed by the accompanying entities is referred to as contextual information. Human experts apply contextual information in their decision making [2][6]. It therefore makes sense to design techniques and algorithms that let computers aggregate and utilize a more complete set of information in their decision making, the way human experts do.


In pattern recognition systems, however, the primary (and often only) source of information used to identify an object is the set of measurements, or features, associated with the object itself. Augmenting this information by incorporating context into the classification process can yield significant benefits.

Consider a set of $N$ objects $T_i$, $i = 1, \ldots, N$. With each object we associate a class label $c_i$ that is a member of a label set $\Omega = \{1, \ldots, D\}$. Each object $T_i$ is characterized by a set of measurements $x_i \in \mathbf{R}^p$, which we call a feature vector. Many techniques [1][2][4][6] incorporate context by conditioning the posterior probability of the objects' identities on the joint features of all accompanying objects, i.e., $P(c_1, c_2, \ldots, c_N \mid x_1, \ldots, x_N)$, and then maximizing it with respect to $c_1, c_2, \ldots, c_N$. It can be shown that $p(c_1, c_2, \ldots, c_N \mid x_1, \ldots, x_N) \propto p(c_1 \mid x_1) \cdots p(c_N \mid x_N)\, p(c_1, \ldots, c_N)$ given certain reasonable assumptions.
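To make the cost of this full-context formulation concrete, the following Python sketch (illustrative only, not the authors' code; the per-object posteriors and the joint label prior are randomly generated stand-ins) maximizes the factored posterior above by enumerating all $D^N$ label combinations.

```python
# Illustrative sketch: brute-force maximization of the factored joint posterior
# over all D**N label combinations. The per-object posteriors p(c_i | x_i) and
# the joint prior p(c_1, ..., c_N) are random stand-ins for whatever a real
# classifier and context model would supply.
import itertools

import numpy as np

D = 3  # size of the label set {1, ..., D}
N = 5  # number of accompanying objects

rng = np.random.default_rng(0)

# Hypothetical context-free posteriors p(c_i | x_i), one row per object.
post = rng.dirichlet(np.ones(D), size=N)            # shape (N, D)

# Hypothetical joint label prior p(c_1, ..., c_N), one entry per combination.
joint_prior = rng.dirichlet(np.ones(D ** N)).reshape((D,) * N)

best_score, best_labels = -np.inf, None
# Exhaustive search over all D**N assignments -- the exponential cost that
# conventional context-incorporation techniques incur.
for labels in itertools.product(range(D), repeat=N):
    score = joint_prior[labels] * np.prod(post[np.arange(N), list(labels)])
    if score > best_score:
        best_score, best_labels = score, labels

print("context-free labels :", post.argmax(axis=1))    # argmax of p(c_i | x_i) alone
print("context-based labels:", np.array(best_labels))  # argmax of the joint posterior
```

Even for modest problem sizes the loop above visits $D^N$ combinations ($3^5 = 243$ here, but $10^{20}$ for an image with 20 objects and 10 classes), which is the exponential cost that motivates the partial-context framework proposed in this paper.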