Faculty Candidate: Kristen Grauman/Computer Science and Artificial Intelligence Laboratory, MIT. Efficient Matching for Recognition and Retrieval, in ACES 2.302

Contact Name: Jenna Whitney
Date: Apr 4, 2006 11:00am - 12:00pm


There is a signup schedule for this event.

Speaker Name/Affiliation: Kristen Grauman/Computer Science and Artificial Intelligence Laboratory, MIT

Talk Title: Efficient Matching for Recognition and Retrieval

Date/Time: April 4, 2006 at 11:00 a.m.

Coffee: 10:45 a.m.

Location: ACES 2.302

Host: Ben Kuipers

Talk Abstract:
Local image features have emerged as a powerful way to describe images of objects and scenes. Their stability under variable image conditions is critical for success in a wide range of recognition and retrieval applications. However, comparing images represented by their collections of local features is challenging, since each set may vary in cardinality and its elements lack a meaningful ordering. Existing methods compare feature sets by searching for explicit correspondences between their elements, which is too computationally expensive in many realistic settings.

I will present the pyramid match, which efficiently forms an implicit partial matching between two sets of feature vectors. The matching has linear time complexity, naturally forms a Mercer kernel, and is robust to clutter or outlier features, a critical advantage for handling images with variable backgrounds, occlusions, and viewpoint changes. I will show how this dramatic increase in performance enables accurate and flexible image comparisons to be made on large-scale data sets and removes the need to artificially limit the size of images' local descriptions. As a result, we can now access a new class of applications that relies on the analysis of rich visual data, such as place or object recognition and meta-data labeling.

I will provide results on several important vision tasks, including our algorithm's state-of-the-art recognition performance on a challenging data set of object categories.
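
The abstract describes the pyramid match only at a high level: multi-resolution histograms of the two feature sets are intersected, and matches that first appear at a coarser resolution are weighted less, giving an implicit partial matching in time linear in the number of features. As a rough, simplified sketch of that idea (not the talk's actual implementation), assuming features lie in the unit hypercube and using an illustrative choice of grid levels and weights:

```python
import numpy as np

def pyramid_match(X, Y, L=4):
    """Simplified pyramid match score between two feature sets.

    X, Y: arrays of shape (n, d) and (m, d); features are assumed to
    lie in [0, 1)^d. L is the number of pyramid levels (illustrative
    choice); the grid cell side doubles at each coarser level.
    """
    score = 0.0
    prev_inter = 0.0
    for level in range(L):               # level 0 = finest grid
        cell = 1.0 / (2 ** (L - 1 - level))
        # Hash each feature to its grid cell and count occupancy.
        hx, hy = {}, {}
        for p in X:
            key = tuple((p // cell).astype(int))
            hx[key] = hx.get(key, 0) + 1
        for p in Y:
            key = tuple((p // cell).astype(int))
            hy[key] = hy.get(key, 0) + 1
        # Histogram intersection counts features matched at this level.
        inter = sum(min(c, hy.get(k, 0)) for k, c in hx.items())
        # Only matches NEW at this coarser level contribute, weighted
        # inversely with cell size, so finer matches count more.
        score += (inter - prev_inter) / (2 ** level)
        prev_inter = inter
    return score
```

Each pass over the features is linear in the set sizes, which mirrors the linear-time claim in the abstract; the weighted sum of new intersections is what makes the score behave like a partial matching without ever computing explicit correspondences.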