
Page 1: By Suren Manvelyan, //hays/compvision2017/lectures/19.pdf · Sampling strategies 61 K. Grauman, B. Leibe Image credits: F-F. Li, E. Nowak, J. Sivic Sparse, at interest Dense, uniformly

By Suren Manvelyan, http://www.surenmanvelyan.com/gallery/7116

Pages 2–12: By Suren Manvelyan, http://www.surenmanvelyan.com/gallery/7116

Pages 10–11: Indy
Page 13:

Image Categorization

Training: Training Images → Image Features → Classifier Training (uses Training Labels)

Testing: Test Image → Image Features → Trained Classifier → Prediction ("Outdoor")

Page 14:

History of ideas in recognition

• 1960s – early 1990s: the geometric era

• 1990s: appearance-based models

• Mid-1990s: sliding window approaches

• Late 1990s: local features

• Early 2000s: parts-and-shape models

• Mid-2000s: bags of features

Svetlana Lazebnik

Page 15:

Bag-of-features models

Svetlana Lazebnik

Page 16:

Object → Bag of ‘words’

Bag-of-features models

Svetlana Lazebnik

Page 17:

Objects as texture

• All of these are treated as being the same

• No distinction between foreground and background: scene recognition?

Svetlana Lazebnik

Page 18:

Origin 1: Texture recognition

• Texture is characterized by the repetition of basic elements or textons

• For stochastic textures, it is the identity of the textons, not their spatial arrangement, that matters

Julesz, 1981; Cula & Dana, 2001; Leung & Malik, 2001; Mori, Belongie & Malik, 2001; Schmid, 2001; Varma & Zisserman, 2002, 2003; Lazebnik, Schmid & Ponce, 2003

Page 19:

Origin 1: Texture recognition

Universal texton dictionary → histogram

Julesz, 1981; Cula & Dana, 2001; Leung & Malik, 2001; Mori, Belongie & Malik, 2001; Schmid, 2001; Varma & Zisserman, 2002, 2003; Lazebnik, Schmid & Ponce, 2003

Page 20:

Origin 2: Bag-of-words models

• Orderless document representation: frequencies of words from a dictionary (Salton & McGill, 1983)

Page 21:

Origin 2: Bag-of-words models

US Presidential Speeches Tag Cloud, http://chir.ag/phernalia/preztags/

• Orderless document representation: frequencies of words from a dictionary (Salton & McGill, 1983)


Page 24:

Bag-of-features steps

1. Extract features

2. Learn “visual vocabulary”

3. Quantize features using visual vocabulary

4. Represent images by frequencies of “visual words”
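Taken together, the four steps can be sketched in a few lines. A minimal illustration on synthetic data, using SciPy's k-means; the random 128-D vectors stand in for real SIFT descriptors, and the vocabulary size K = 20 is an arbitrary choice:

```python
import numpy as np
from scipy.cluster.vq import kmeans2, vq

rng = np.random.default_rng(0)

# 1. Extract features: stand-in for SIFT descriptors from training images
#    (here random 128-D vectors; real code would run a detector + descriptor).
train_descriptors = rng.normal(size=(500, 128))

# 2. Learn "visual vocabulary": cluster descriptors; centers are visual words.
K = 20
vocabulary, _ = kmeans2(train_descriptors, K, minit="++")

# 3. Quantize the features of one image to their nearest visual words.
image_descriptors = rng.normal(size=(60, 128))
word_ids, _ = vq(image_descriptors, vocabulary)

# 4. Represent the image as a normalized frequency histogram of visual words.
hist = np.bincount(word_ids, minlength=K).astype(float)
hist /= hist.sum()
```

The resulting fixed-length histogram is what gets fed to a classifier or compared across images.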

Page 25:

1. Feature extraction

• Regular grid or interest regions

Page 26:

1. Feature extraction

Detect patches → Normalize patch → Compute descriptor

Slide credit: Josef Sivic

Page 27:

1. Feature extraction

Slide credit: Josef Sivic

Page 28:

2. Learning the visual vocabulary

Slide credit: Josef Sivic

Page 29:

2. Learning the visual vocabulary

Clustering

Slide credit: Josef Sivic

Page 30:

2. Learning the visual vocabulary

Clustering

Slide credit: Josef Sivic

Visual vocabulary

Page 31:

K-means clustering

• Want to minimize the sum of squared Euclidean distances between points x_i and their nearest cluster centers m_k:

D(X, M) = \sum_{k} \sum_{i \in \text{cluster } k} \lVert x_i - m_k \rVert^2

Algorithm:

• Randomly initialize K cluster centers

• Iterate until convergence:
  • Assign each data point to the nearest center
  • Recompute each cluster center as the mean of all points assigned to it
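The two-step loop maps directly to NumPy. A small from-scratch sketch; the blob data and seed are illustrative:

```python
import numpy as np

def kmeans(X, K, n_iters=100, seed=0):
    """Minimize D(X, M) = sum_k sum_{i in cluster k} ||x_i - m_k||^2."""
    rng = np.random.default_rng(seed)
    # Randomly initialize K cluster centers (K distinct data points).
    centers = X[rng.choice(len(X), size=K, replace=False)]
    for _ in range(n_iters):
        # Assignment step: each point goes to its nearest center.
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        labels = d2.argmin(axis=1)
        # Update step: each center becomes the mean of its assigned points
        # (an empty cluster keeps its old center).
        new_centers = np.array([X[labels == k].mean(axis=0) if np.any(labels == k)
                                else centers[k] for k in range(K)])
        if np.allclose(new_centers, centers):  # converged
            break
        centers = new_centers
    return centers, labels

# Two well-separated blobs; k-means typically recovers the blob means.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.5, (50, 2)), rng.normal(10, 0.5, (50, 2))])
centers, labels = kmeans(X, K=2)
```

Note that k-means only finds a local minimum of D, so practical codebook learning usually runs several restarts or a smarter initialization.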

Page 32:

Clustering and vector quantization

• Clustering is a common method for learning a visual vocabulary or codebook
  • Unsupervised learning process
  • Each cluster center produced by k-means becomes a codevector
  • Codebook can be learned on a separate training set
  • Provided the training set is sufficiently representative, the codebook will be “universal”

• The codebook is used for quantizing features
  • A vector quantizer takes a feature vector and maps it to the index of the nearest codevector in the codebook
  • Codebook = visual vocabulary
  • Codevector = visual word
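The quantizer described here is just a nearest-neighbor lookup into the codebook. A toy sketch; the 2-D codebook is illustrative (real codevectors would be, e.g., 128-D SIFT descriptors):

```python
import numpy as np

def quantize(features, codebook):
    """Map each feature vector to the index of its nearest codevector (visual word)."""
    # Squared Euclidean distance from every feature to every codevector.
    d2 = ((features[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=2)
    return d2.argmin(axis=1)

# Toy codebook of three 2-D codevectors ("visual words").
codebook = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
features = np.array([[0.1, -0.1], [0.9, 0.2], [-0.2, 1.1]])
words = quantize(features, codebook)  # → array([0, 1, 2])
```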

Page 33:

Example codebook

Source: B. Leibe

Appearance codebook

Page 34:

Visual vocabularies: Issues

• How to choose vocabulary size?
  • Too small: visual words not representative of all patches
  • Too large: quantization artifacts, overfitting

• Computational efficiency
  • Vocabulary trees (Nister & Stewenius, 2006)

Page 35:

But what about layout?

All of these images have the same color histogram

Page 36:

Spatial pyramid

Compute histogram in each spatial bin

Page 37:

Spatial pyramid representation

• Extension of a bag of features

• Locally orderless representation at several levels of resolution (level 0, level 1, level 2)

Lazebnik, Schmid & Ponce (CVPR 2006)
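One way to sketch the level 0/1/2 construction: divide the image into 1×1, 2×2, and 4×4 grids and histogram the visual words falling in each cell. The feature locations, word assignments, and image size below are illustrative placeholders:

```python
import numpy as np

def spatial_pyramid(points, words, K, img_w, img_h, levels=(0, 1, 2)):
    """Concatenate per-cell visual-word histograms over a 2^l x 2^l grid per level."""
    hists = []
    for level in levels:
        n = 2 ** level  # grid is n x n at this level
        # Which cell does each feature point fall in?
        cx = np.minimum((points[:, 0] * n / img_w).astype(int), n - 1)
        cy = np.minimum((points[:, 1] * n / img_h).astype(int), n - 1)
        for i in range(n):
            for j in range(n):
                in_cell = (cx == i) & (cy == j)
                hists.append(np.bincount(words[in_cell], minlength=K))
    return np.concatenate(hists)

rng = np.random.default_rng(0)
points = rng.uniform(0, 1, size=(200, 2)) * [640, 480]  # feature (x, y) locations
words = rng.integers(0, 10, size=200)                   # assigned visual words
h = spatial_pyramid(points, words, K=10, img_w=640, img_h=480)
# Levels 0, 1, 2 give 1 + 4 + 16 = 21 cells, each a K-bin histogram.
```

The published representation additionally weights the levels when matching (coarser levels count less); that weighting is omitted here for brevity.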

Page 40:

Scene category dataset

Multi-class classification results

(100 training images per class)

Page 41:

Caltech101 dataset, http://www.vision.caltech.edu/Image_Datasets/Caltech101/Caltech101.html

Multi-class classification results (30 training images per class)

Page 42:

Bags of features for action recognition

Juan Carlos Niebles, Hongcheng Wang and Li Fei-Fei, Unsupervised Learning of Human Action Categories Using Spatial-Temporal Words, IJCV 2008.

Space-time interest points

Page 43:

History of ideas in recognition

• 1960s – early 1990s: the geometric era

• 1990s: appearance-based models

• Mid-1990s: sliding window approaches

• Late 1990s: local features

• Early 2000s: parts-and-shape models

• Mid-2000s: bags of features

• Present trends: combination of local and global methods, context, deep learning

Svetlana Lazebnik

Page 44:

Large-scale Instance Retrieval

Computer Vision

James Hays

Many slides from Derek Hoiem and Kristen Grauman

Page 45:

Multi-view matching

• Matching two given views for depth

• vs. search for a matching view for recognition

Kristen Grauman

Page 46:


Video Google System

1. Collect all words within query region

2. Inverted file index to find relevant frames

3. Compare word counts

4. Spatial verification

Sivic & Zisserman, ICCV 2003

• Demo online at: http://www.robots.ox.ac.uk/~vgg/research/vgoogle/index.html

Query region → Retrieved frames

Kristen Grauman
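Step 3, "compare word counts", is commonly implemented as cosine similarity between tf-idf weighted word-frequency vectors, following the text-retrieval convention the system borrows. A sketch with made-up counts:

```python
import numpy as np

def tfidf_cosine(query_counts, db_counts):
    """Rank database images by cosine similarity of tf-idf weighted word counts."""
    db = np.asarray(db_counts, dtype=float)  # shape: (n_images, n_words)
    q = np.asarray(query_counts, dtype=float)
    # idf: log(N / number of images containing the word); unseen words get 0 weight.
    n_images = len(db)
    df = (db > 0).sum(axis=0)
    idf = np.log(n_images / np.maximum(df, 1)) * (df > 0)
    qv = q * idf
    dv = db * idf
    # Cosine similarity, guarding against zero vectors.
    norms = np.linalg.norm(dv, axis=1) * max(np.linalg.norm(qv), 1e-12)
    sims = dv @ qv / np.maximum(norms, 1e-12)
    return np.argsort(-sims)  # best match first

db = [[5, 0, 1, 0],   # image 0
      [0, 4, 0, 2],   # image 1
      [5, 0, 2, 0]]   # image 2
query = [4, 0, 1, 0]
ranking = tfidf_cosine(query, db)  # image 0 ranks first, image 1 last
```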

Page 47:


Application: Large-Scale Retrieval

[Philbin CVPR’07]

Query → Results from 5k Flickr images (demo available for 100k set)

Page 48:

Page 49:

Simple idea

See how many keypoints are close to keypoints in each other image

Lots of Matches

Few or No Matches

But this will be really, really slow!

Page 50:

Indexing local features

• Each patch / region has a descriptor, which is a point in some high-dimensional feature space (e.g., SIFT)

Descriptor’s feature space

Kristen Grauman

Page 51:

Indexing local features

• When we see close points in feature space, we have similar descriptors, which indicates similar local content.

Descriptor’s feature space

Database images

Query image

Easily can have millions of features to search!

Kristen Grauman

Page 52:

Indexing local features: inverted file index

• For text documents, an efficient way to find all pages on which a word occurs is to use an index…

• We want to find all images in which a feature occurs.

• To use this idea, we’ll need to map our features to “visual words”.

Kristen Grauman

Page 53:

Visual words

• Map high-dimensional descriptors to tokens/words by quantizing the feature space

• Quantize via clustering; let cluster centers be the prototype “words”

• Determine which word to assign to each new image region by finding the closest cluster center.

Descriptor’s feature space (Word #2)

Kristen Grauman

Page 54:

Visual words

• Example: each group of patches belongs to the same visual word

Figure from Sivic & Zisserman, ICCV 2003

Kristen Grauman

Page 55:

Visual vocabulary formation

Issues:

• Vocabulary size, number of words

• Sampling strategy: where to extract features?

• Clustering / quantization algorithm

• Unsupervised vs. supervised

• What corpus provides features (universal vocabulary?)

Kristen Grauman

Page 56:

Sampling strategies

K. Grauman, B. Leibe. Image credits: F-F. Li, E. Nowak, J. Sivic

Dense, uniformly · Sparse, at interest points · Randomly · Multiple interest operators

• To find specific, textured objects, sparse sampling from interest points is often more reliable.

• Multiple complementary interest operators offer more image coverage.

• For object categorization, dense sampling offers better coverage.

[See Nowak, Jurie & Triggs, ECCV 2006]
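Dense, uniform sampling, the simplest of the strategies above, just places patch centers on a regular grid. A sketch; the step and patch sizes are arbitrary choices:

```python
import numpy as np

def dense_grid(img_w, img_h, step=16, patch=16):
    """Centers of patches sampled densely on a uniform grid over the image."""
    half = patch // 2
    # Keep every patch fully inside the image bounds.
    xs = np.arange(half, img_w - half + 1, step)
    ys = np.arange(half, img_h - half + 1, step)
    gx, gy = np.meshgrid(xs, ys)
    return np.stack([gx.ravel(), gy.ravel()], axis=1)

centers = dense_grid(128, 96, step=16, patch=16)
# xs: 8, 24, ..., 120 (8 values); ys: 8, 24, ..., 88 (6 values) → 48 centers
```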

Page 57:

Inverted file index

• Database images are loaded into the index, mapping words to image numbers

Kristen Grauman

Page 58:

Inverted file index

• New query image is mapped to indices of database images that share a word.

Kristen Grauman

Page 59:

Inverted file index

• Key requirement for the inverted file index to be efficient: sparsity

• If most pages/images contain most words, then you’re no better off than exhaustive search.
  – Exhaustive search would mean comparing the word distribution of a query versus every page.
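The idea on the last three slides can be sketched directly: the index maps each visual word to the images containing it, so a query only touches images sharing at least one word. The word lists below are made up for illustration:

```python
from collections import defaultdict

def build_inverted_index(image_words):
    """Map each visual word -> set of image ids whose features used that word."""
    index = defaultdict(set)
    for image_id, words in image_words.items():
        for w in set(words):
            index[w].add(image_id)
    return index

def candidate_images(index, query_words):
    """Images sharing at least one visual word with the query (ranked afterwards)."""
    candidates = set()
    for w in set(query_words):
        candidates |= index.get(w, set())
    return candidates

# Image id -> visual words observed in that image.
db = {0: [3, 7, 7, 9], 1: [1, 2], 2: [7, 4]}
index = build_inverted_index(db)
hits = candidate_images(index, [7, 5])  # images 0 and 2 contain word 7
```

This is where sparsity matters: if every image contained every word, `candidate_images` would return the whole database and nothing would be saved over exhaustive search.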

Page 60:

Instance recognition: remaining issues

• How to summarize the content of an entire image? And gauge overall similarity?

• How large should the vocabulary be? How to perform quantization efficiently?

• Is having the same set of visual words enough to identify the object/scene? How to verify spatial agreement?

• How to score the retrieval results?

Kristen Grauman