Ensemble Tracking, Shai Avidan, IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, February 2007


Page 1:

Ensemble Tracking

Shai Avidan

IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE

February 2007

Page 2:

Outline

Prior knowledge: AdaBoost
Introduction
Ensemble tracking
Implementation issues
Experiments

Page 3:

AdaBoost

Resampling for classifier design: Bagging

Use multiple versions of a training set:
• Each version is created by drawing n' < n samples from D with replacement (i.e., if a sample is drawn, it is not removed from D but is reconsidered in the next sampling).
• Each data set is used to train a different component classifier.
• The final classification decision is based on the vote of the component classifiers.
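Below is a minimal Python sketch of this bagging procedure. It is not taken from the slides; the decision-tree component classifier, the sample fraction, and the helper names are illustrative assumptions.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier  # stands in for any component classifier

def bagging_fit(X, y, n_classifiers=10, sample_frac=0.8, seed=0):
    """Train component classifiers on versions of the training set drawn with replacement."""
    rng = np.random.default_rng(seed)
    n = len(X)
    n_prime = int(sample_frac * n)              # n' < n samples per version of the training set
    ensemble = []
    for _ in range(n_classifiers):
        idx = rng.integers(0, n, size=n_prime)  # with replacement: the same index may repeat
        ensemble.append(DecisionTreeClassifier(max_depth=2).fit(X[idx], y[idx]))
    return ensemble

def bagging_predict(ensemble, X):
    """Final decision by the vote of the component classifiers, labels in {-1, +1}."""
    votes = np.stack([clf.predict(X) for clf in ensemble])
    return np.sign(votes.sum(axis=0))           # ties map to 0; break them as needed
```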

Page 4:

AdaBoost

Boosting: generate complementary classifiers by training the next component classifier on the mistakes of the previous ones.
• Use the subset of the training data that is most informative given the current set of component classifiers.

AdaBoost trains a weak classifier on increasingly more difficult examples and combines the results to produce a strong classifier that is better than any of the weak classifiers.

Weak classifier: $h_k(x) \in \{-1, +1\}$

Strong classifier: $g(x) = \sum_{k=1}^{K_{\max}} \alpha_k h_k(x)$, with final label $y = \mathrm{sign}[g(x)] \in \{-1, +1\}$

Page 5:

AdaBoost

AdaBoost (adaptive boosting): use the same training set over and over.

Each training pattern receives a weight $W_k(i)$: the probability that the i-th pattern is drawn to train the k-th component classifier. Uniform initialization: $W_1(i) = 1/n$.

If a training pattern is accurately classified, $h_k(x_i) = y_i$, its chance of being used again is reduced; otherwise, $h_k(x_i) \neq y_i$ and its weight is increased.

Training set: $\{(x_i, y_i),\ i = 1, \dots, n\}$, with $y_i \in \{-1, +1\}$

Weight update:
$W_{k+1}(i) = W_k(i)\, e^{-\alpha_k}$ if $h_k(x_i) = y_i$ (correctly classified)
$W_{k+1}(i) = W_k(i)\, e^{+\alpha_k}$ if $h_k(x_i) \neq y_i$ (misclassified)

with $\alpha_t = \frac{1}{2}\ln\!\left(\frac{1 - E_t}{E_t}\right)$, where $E_k$ is the training error measured on $D$ using $W_k(i)$.
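These update rules correspond to the standard discrete AdaBoost loop. A minimal sketch, assuming a caller-supplied `train_weak(X, y, w)` helper (an assumption, not from the slides) that fits a weak classifier to the weighted sample and returns a callable with outputs in {-1, +1}:

```python
import numpy as np

def adaboost_fit(X, y, train_weak, K_max=50):
    """Discrete AdaBoost: y in {-1, +1}; returns weak classifiers h_k and weights alpha_k."""
    n = len(X)
    w = np.full(n, 1.0 / n)                   # uniform initialization W_1(i) = 1/n
    classifiers, alphas = [], []
    for _ in range(K_max):
        h = train_weak(X, y, w)               # weak classifier trained on the weighted set
        pred = h(X)                           # predictions in {-1, +1}
        err = np.clip(np.sum(w * (pred != y)), 1e-10, 1 - 1e-10)  # weighted error E_k
        alpha = 0.5 * np.log((1 - err) / err)
        w *= np.exp(-alpha * y * pred)        # decrease weight if correct, increase if wrong
        w /= w.sum()                          # renormalize so the weights remain a distribution
        classifiers.append(h)
        alphas.append(alpha)
    return classifiers, alphas

def adaboost_predict(classifiers, alphas, X):
    """Strong classifier: sign of the alpha-weighted vote."""
    g = sum(a * h(X) for h, a in zip(classifiers, alphas))
    return np.sign(g)
```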

Page 6:

AdaBoost

Final decision: $g(x) = \sum_k \alpha_k h_k(x)$

Page 7:

AdaBoost

$K_{\max}$ component classifiers

Weak classifier: $h_k(x) \in \{-1, +1\}$

Strong classifier: $g(x) = \sum_{k=1}^{K_{\max}} \alpha_k h_k(x)$, $y = \mathrm{sign}[g(x)] \in \{-1, +1\}$

Training set: $\{(x_i, y_i),\ i = 1, \dots, n\}$, $y_i \in \{-1, +1\}$

Exponential loss: $J = \sum_{i=1}^{n} \exp(-y_i\, g(x_i))$

At step $t$, $g_{t-1}(x) = \sum_{k=1}^{t-1} \alpha_k h_k(x)$, where $h_1(\cdot), \dots, h_{t-1}(\cdot)$ and $\alpha_1, \dots, \alpha_{t-1}$ are already fixed.

Page 8:

AdaBoost

At step $t$:

$g_t(x) = g_{t-1}(x) + \alpha_t h_t(x)$

$J_t = \sum_{i=1}^{n} \exp(-y_i\, g_t(x_i)) = \sum_{i=1}^{n} \exp(-y_i\, g_{t-1}(x_i) - \alpha_t y_i h_t(x_i)) = \sum_{i=1}^{n} w_t(i)\, \exp(-\alpha_t y_i h_t(x_i))$

where $w_t(i) = \exp(-y_i\, g_{t-1}(x_i))$.

Page 9:

AdaBoost

$J_t = e^{-\alpha_t}(1 - E_t) + e^{\alpha_t} E_t$

$\frac{\partial J_t}{\partial \alpha_t} = -e^{-\alpha_t}(1 - E_t) + e^{\alpha_t} E_t = 0$

$e^{2\alpha_t} E_t = 1 - E_t$

$\alpha_t = \frac{1}{2}\ln\!\left(\frac{1 - E_t}{E_t}\right)$
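As a quick sanity check of this derivation (not part of the slides), the closed-form $\alpha_t$ can be compared against a brute-force minimization of $J_t(\alpha) = e^{-\alpha}(1 - E_t) + e^{\alpha} E_t$:

```python
import numpy as np

E_t = 0.3                                    # illustrative weighted training error
alphas = np.linspace(-2.0, 2.0, 100001)
J = np.exp(-alphas) * (1 - E_t) + np.exp(alphas) * E_t
alpha_numeric = alphas[np.argmin(J)]         # numerical minimizer of J_t(alpha)
alpha_closed = 0.5 * np.log((1 - E_t) / E_t)
print(alpha_numeric, alpha_closed)           # both approximately 0.4236
```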

Page 10:

AdaBoost

$w_{t+1}(i) = e^{-y_i g_t(x_i)} = e^{-y_i (g_{t-1}(x_i) + \alpha_t h_t(x_i))} = e^{-y_i g_{t-1}(x_i)}\, e^{-\alpha_t y_i h_t(x_i)} = w_t(i)\, e^{-\alpha_t y_i h_t(x_i)}$

Page 11:

Introduction

Tracking is considered a binary classification problem.

Ensemble tracking is a method for training classifiers on time-varying distributions.

An ensemble of weak classifiers is trained online to distinguish between the object and the background.
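A minimal sketch of that per-pixel classification step, assuming per-pixel feature vectors and an AdaBoost-style ensemble as above (the array layout and function names are assumptions, not the paper's implementation):

```python
import numpy as np

def confidence_map(features, classifiers, alphas):
    """Score every pixel with the strong classifier g(x) = sum_k alpha_k h_k(x).

    features: (H, W, d) array of per-pixel feature vectors.
    Returns an (H, W) confidence map with negative margins clipped to zero
    and positive margins rescaled to [0, 1]; high values suggest object pixels.
    """
    H, W, d = features.shape
    X = features.reshape(-1, d)
    margin = sum(a * h(X) for h, a in zip(classifiers, alphas))  # per-pixel margin g(x)
    conf = np.maximum(margin, 0.0)                               # clip negative margins to zero
    if conf.max() > 0:
        conf = conf / conf.max()                                 # rescale positives to [0, 1]
    return conf.reshape(H, W)
```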

Page 12:

Introduction

Page 13:
Page 14:

Introduction

Ensemble tracking maintains an implicit representation of both the foreground and the background, instead of describing the foreground object explicitly on its own.

Ensemble tracking is not a template-based method; template-based methods maintain the spatial integrity of the object and are especially suited to handling rigid objects.

Page 15:

Introduction

Ensemble tracking extends traditional mean-shift tracking in a number of important directions. Mean-shift tracking usually works with histograms of RGB colors, because gray-scale images do not provide enough information for tracking, and high-dimensional feature spaces cannot be modeled with histograms due to their exponential memory requirements.

Page 16:

Introduction

This is in contrast to existing methods that either represent the foreground object using the most recent histogram or some ad hoc combination of the histograms of the first and last frames.

Page 17:

Introduction

Other advantages:
It breaks the time-consuming training phase into a sequence of simple, easy-to-compute learning tasks that can be performed online.
It can also integrate offline and online learning seamlessly.
Integrating classifiers over time improves the stability of the tracker in cases of partial occlusions or illumination changes.

Page 18:

In each frame, we keep the K "best" weak classifiers, discard the remaining T-K weak classifiers, train T-K new weak classifiers on the newly available data, and reconstruct the strong classifier.

The margin of the weak classifier h(x) is mapped to a confidence measure c(x) by clipping negative margins to zero and rescaling the positive margins to the range [0,1].
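A rough sketch of this per-frame ensemble update (keep the K best weak classifiers, train T-K new ones, and reconstruct the strong classifier), reusing the hypothetical `train_weak` helper from the AdaBoost sketch above; the ranking and reweighting details here are assumptions rather than the paper's exact procedure, and the margin-to-confidence mapping was sketched earlier:

```python
import numpy as np

def update_ensemble(classifiers, alphas, X, y, train_weak, K=3, T=5):
    """Keep the K best weak classifiers on the new frame's data, train T-K new ones,
    and recompute the alpha weights of the strong classifier (illustrative defaults)."""
    n = len(X)
    w = np.full(n, 1.0 / n)
    # Rank existing weak classifiers by their weighted error on the new data.
    errors = [np.sum(w * (h(X) != y)) for h in classifiers]
    keep = list(np.argsort(errors)[:K])                      # the K "best" weak classifiers
    new_classifiers, new_alphas = [], []
    for h in [classifiers[k] for k in keep] + [None] * (T - K):
        if h is None:
            h = train_weak(X, y, w)                          # train a replacement weak classifier
        err = np.clip(np.sum(w * (h(X) != y)), 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)                # recompute alpha on the new frame
        w *= np.exp(-alpha * y * h(X))
        w /= w.sum()
        new_classifiers.append(h)
        new_alphas.append(alpha)
    return new_classifiers, new_alphas
```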

Page 19:

Ensemble update

Page 20:

Ensemble tracking

Page 21:
Page 22:

During Step 7 of choosing the K best weak classifiers, weak classifiers that do not perform much better than chance are removed.

We allow only a limited number of existing weak classifiers to be removed this way, because removing a large number might be a sign of occlusion, in which case we keep the ensemble unchanged for this frame.
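A small illustrative sketch of this pruning guard; the limit of two removals per frame mirrors the later experiments slide, and the function and argument names are assumptions:

```python
def prune_weak_classifiers(classifiers, alphas, errors, max_remove=2, chance_err=0.5):
    """Drop weak classifiers whose weighted error is no better than chance.

    If more than max_remove would be dropped, treat it as a possible occlusion and
    keep the ensemble unchanged for this frame.
    """
    bad = [k for k, e in enumerate(errors) if e >= chance_err]
    if len(bad) > max_remove:
        return classifiers, alphas, False            # likely occlusion: no update this frame
    keep = [k for k in range(len(classifiers)) if k not in bad]
    return [classifiers[k] for k in keep], [alphas[k] for k in keep], True
```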

Page 23:

Implementation issues

Outlier Rejection

Page 24:

Implementation issues

Page 25:

Implementation issues

Multiresolution Tracking

Page 26:

Implementation issues

Page 27:

Experiments

The first version uses five weak classifiers, each working on an 11D feature vector per pixel that consists of an 8-bin local histogram of oriented gradients calculated on a 5x5 window, as well as the pixel R, G, and B values.

To improve robustness, we only count edges that are above some predefined threshold, which was set to 10 intensity values.
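A rough sketch of building this 11D per-pixel feature vector; the gradient operator, the orientation binning, and the threshold scaling are assumptions rather than the paper's exact implementation:

```python
import numpy as np
from scipy.ndimage import sobel, uniform_filter

def pixel_features(rgb, n_bins=8, win=5, edge_thresh=10.0):
    """Per-pixel 11D features: an 8-bin local histogram of oriented gradients over a
    5x5 window plus the pixel's R, G, B values (a rough sketch)."""
    gray = rgb.astype(np.float64).mean(axis=2)
    gx, gy = sobel(gray, axis=1), sobel(gray, axis=0)
    mag = np.hypot(gx, gy)
    ang = np.mod(np.arctan2(gy, gx), np.pi)                 # orientation folded into [0, pi)
    mag = np.where(mag >= edge_thresh, mag, 0.0)            # ignore edges below the threshold
    bins = np.minimum((ang / np.pi * n_bins).astype(int), n_bins - 1)
    hog = np.zeros(gray.shape + (n_bins,))
    for b in range(n_bins):
        votes = np.where(bins == b, mag, 0.0)
        hog[..., b] = uniform_filter(votes, size=win)       # local 5x5 average of the votes
    return np.concatenate([hog, rgb.astype(np.float64)], axis=2)   # shape (H, W, 11)
```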

Page 28:

Experiments

We found that the original feature space was not stable enough and used a nonlinear version of that feature space instead.

We use only three, instead of five, weak classifiers, and three levels of the pyramid. In each frame, we drop one weak classifier and add a newly trained weak classifier.

Nonlinear feature space: $[x_i,\ x_i^2,\ x_i^3]$
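A one-line sketch of that nonlinear expansion, applied elementwise to each original feature vector (illustrative only):

```python
import numpy as np

def nonlinear_expand(x):
    """Map each feature x_i to [x_i, x_i^2, x_i^3], tripling the feature dimension."""
    x = np.asarray(x, dtype=np.float64)
    return np.concatenate([x, x**2, x**3], axis=-1)
```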

Page 29:

Experiments

We allow the tracker to drop up to two weak classifiers per frame, because dropping more than that could be a sign of occlusion, and we therefore do not update the ensemble in such a case.

Page 30:

Experiments

Results on Color Sequences: a pedestrian crossing the street

Page 31:

Experiments

Results on Color Sequences:

tracking a couple walking with a hand-held camera.

Page 32:

Experiments

Results on Color Sequences:

tracking a face exhibiting out-of-plane rotations

Page 33:

Experiments

Results on Color Sequences: tracking a red car that is undergoing out-of-plane rotations and partial occlusions.

An 11D feature vector, a single scale, and an ensemble of three classifiers were enough to obtain robust and stable tracking.

Page 34:

Experiments

Analyze the importance of the update scheme for tracking:

Page 35:

Experiments

Analyze how often the weak classifiers are updated.

Page 36:

Experiments

Analyze how their weights change over time.

Page 37:

Experiments

Analyze how this method compares with a standard AdaBoost classifier that trains all its weak classifiers on a given frame.

Page 38:

Experiments

Results on gray-scale sequence:

Page 39:

Experiments

Results on IR sequence:

Page 40:

Experiments

Handling long-period occlusion:
The classification rate is the fraction of pixels that were correctly classified.
As long as the classification rate is high, the tracking goes unchanged.
When the classification rate drops (< 0.5), switch to prediction mode.
Once occlusion is detected, we start sampling, according to the particle filter, possible locations where the object might appear.
In each such location, we compute the classification score. If it is above a threshold (0.7), then tracking resumes.
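A minimal control-flow sketch of this occlusion-handling logic; the particle-filter sampler and the scoring routine are placeholder callables, not the paper's implementation:

```python
def occlusion_aware_step(classification_rate, sample_candidates, score_at,
                         occlusion_thresh=0.5, resume_thresh=0.7):
    """One tracking step with long-occlusion handling.

    classification_rate: fraction of correctly classified pixels at the current location.
    sample_candidates:   callable returning candidate locations from the particle filter.
    score_at:            callable returning the classification score at a location.
    """
    if classification_rate >= occlusion_thresh:
        return "tracking", None                    # classification rate is high: keep tracking
    # Classification rate dropped below 0.5: switch to prediction mode.
    for loc in sample_candidates():                # sample locations via the particle filter
        if score_at(loc) > resume_thresh:          # score above 0.7: tracking resumes here
            return "tracking", loc
    return "predicting", None                      # stay in prediction mode this frame
```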

Page 41:

Experiments

Handling occlusions:

Page 42:

Experiments