
Post on 20-Dec-2015



A simple classifier

Ridge regression

A variation on standard linear regression

Adds a “ridge” term that has the effect of “smoothing” the weights

Equivalent to training a linear network with weight decay.
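The ridge idea above can be sketched in a few lines. This is a minimal illustration, not code from the slides: the `lam * I` term is the “ridge” that shrinks the weights, and the closed-form solve below is the standard ridge estimator. The tiny dataset and the sign-based classification step are made up for the demo.

```python
import numpy as np

def ridge_fit(X, y, lam=1.0):
    # Closed-form ridge solution: w = (X^T X + lam*I)^{-1} X^T y.
    # The lam*I "ridge" term smooths/shrinks the weights, which is
    # equivalent to training a linear network with weight decay.
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

# Toy demo: labels in {-1, +1}, classify by the sign of X @ w.
X = np.array([[1.0, 2.0], [2.0, 1.0], [-1.0, -2.0], [-2.0, -1.0]])
y = np.array([1.0, 1.0, -1.0, -1.0])
w = ridge_fit(X, y, lam=0.1)
pred = np.sign(X @ w)
```

Larger `lam` pulls the weights toward zero; `lam=0` recovers ordinary least squares (when `X.T @ X` is invertible).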

A “Strong” Classifier: SNoW – Sparse Network of Winnows

• Roth et al. 2000 – Currently best reported face detector

• 1. Turn each pixel into a sparse, binary vector

• 2. Activation = sign( Σᵢ wᵢxᵢ )

• 3. Train with the Winnow update rule
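The Winnow update rule named in step 3 can be sketched as follows. This is a generic textbook Winnow (multiplicative promotion/demotion over binary features), not the SNoW implementation of Roth et al.; the threshold choice (θ = n), the promotion factor `alpha`, and the toy dataset are all illustrative assumptions.

```python
import numpy as np

def winnow_train(X, y, alpha=2.0, epochs=10):
    # Winnow: multiplicative updates over sparse binary features.
    # Predict positive when sum_i w_i * x_i exceeds threshold theta.
    n = X.shape[1]
    w = np.ones(n)
    theta = n  # a common default threshold
    for _ in range(epochs):
        for x, label in zip(X, y):
            pred = 1 if w @ x > theta else 0
            if pred != label:
                if label == 1:
                    w[x == 1] *= alpha   # promote on a false negative
                else:
                    w[x == 1] /= alpha   # demote on a false positive
    return w

# Toy demo (made up): target concept is "feature 0 is on".
X = np.array([[1, 0, 0, 0], [0, 1, 0, 0], [1, 1, 0, 0], [0, 0, 1, 1]])
y = np.array([1, 0, 1, 0])
w = winnow_train(X, y)
preds = np.array([1 if w @ x > X.shape[1] else 0 for x in X])
```

Because updates are multiplicative and touch only the active features, the mistake bound scales with the number of *relevant* features, which is why Winnow suits sparse binary inputs like the per-pixel encoding in step 1.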

AdaBoost for Feature Selection

Viola and Jones (2001) used AdaBoost as a feature selection method

For each round of AdaBoost:

For each patch, train a classifier using only that one patch.

Select the best one as the classifier for this round.

Reweight the distribution based on that classifier.
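The per-round loop above can be sketched as follows. This is a simplified stand-in for Viola and Jones's procedure: each "patch" is reduced to a single feature column, the per-patch classifier is a threshold stump, and the reweighting uses the standard discrete-AdaBoost rule. Dataset, round count, and the exhaustive threshold search are illustrative choices, not details from the slides.

```python
import numpy as np

def adaboost_select(X, y, rounds=3):
    # AdaBoost as feature selection: each round trains one threshold
    # stump per feature ("patch"), keeps the stump with the lowest
    # weighted error, then reweights the distribution.
    m, d = X.shape
    D = np.full(m, 1.0 / m)              # example weights (the distribution)
    chosen = []
    for _ in range(rounds):
        best = None
        for j in range(d):               # one classifier per patch/feature
            for thresh in np.unique(X[:, j]):
                for polarity in (1, -1):
                    pred = np.where(polarity * (X[:, j] - thresh) >= 0, 1, -1)
                    err = D[pred != y].sum()
                    if best is None or err < best[0]:
                        best = (err, j, thresh, polarity, pred)
        err, j, thresh, polarity, pred = best
        err = max(err, 1e-10)            # guard against division by zero
        alpha = 0.5 * np.log((1 - err) / err)
        D *= np.exp(-alpha * y * pred)   # upweight the examples it got wrong
        D /= D.sum()
        chosen.append((j, thresh, polarity, alpha))
    return chosen

# Toy demo (made up): feature 0 perfectly separates the classes.
X = np.array([[0.0, 5.0], [1.0, 5.0], [2.0, 0.0], [3.0, 0.0]])
y = np.array([-1, -1, 1, 1])
chosen = adaboost_select(X, y, rounds=2)
```

The selected `(feature, threshold, polarity, alpha)` tuples double as both the chosen features and the weighted weak classifiers of the final strong classifier.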

Results

[Bar chart: error rates from 0.00% to 0.80% for Single SNoW, SNoW + Bagging, Ridge + AdaBoost, SNoW + AdaBoost, SNoW + Bagging + patches, SNoW + AdaBoost + patches, and Ridge + AdaBoost + patches]

AdaBoost consistently improves performance

[Bar chart: error rates from 0% to 25% for Global + Ridge, Global + SNoW, Patches + Ridge, and Patches + SNoW, each shown as a Single System, with Bagging, and with AdaBoost]

AdaBoost consistently improves performance

[Bar chart: error rates from 0% to 7% for Global + SNoW, Patches + Ridge, and Patches + SNoW, each shown as a Single System, with Bagging, and with AdaBoost]