CSSE463: Image Recognition, Day 31
Today: Bayesian classifiers. Tomorrow: project day.
Questions?
Exam 3 Thursday
Closed book, notes, computer. BUT you may bring notes (index card or 1 side of paper). You may also want a calculator.
PDF of review questions? Not cumulative: focus is k-means and later. More hints tomorrow?
Bayesian classifiers
Use training data to calculate the probabilities of each feature.
If 2 classes: classes ω1 and ω2. Say, circles vs. non-circles. A single feature, x. Both classes equally likely. Both types of errors equally bad.
Where should we set the threshold between classes? Here?
Where in the graph are the 2 types of errors?
[Figure: class-conditional densities p(x|ω1) (non-circles) and p(x|ω2) (circles) plotted against feature x; the region beyond the threshold is detected as circles.]
Q1-4
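A minimal sketch of that picture; the Gaussian class-conditional densities and their parameters below are hypothetical, chosen only to illustrate the equal-priors threshold and the two error types:

```python
import numpy as np

# Minimal sketch of the two-class threshold picture above. The Gaussian
# class-conditional densities and their parameters are hypothetical.
def gaussian_pdf(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

x = np.linspace(-5.0, 11.0, 2000)
p1 = gaussian_pdf(x, mu=1.0, sigma=1.5)  # p(x|w1), non-circles
p2 = gaussian_pdf(x, mu=5.0, sigma=1.5)  # p(x|w2), circles

# Equal priors, equally bad errors: set the threshold where the densities
# cross, i.e., the first x at which p(x|w2) overtakes p(x|w1).
threshold = x[np.argmax(p2 > p1)]
print(f"threshold ~ {threshold:.2f}")  # midway between the means here

# The two error types are the areas under the "wrong" density:
dx = x[1] - x[0]
p_false_alarm = p1[x > threshold].sum() * dx  # non-circles detected as circles
p_miss = p2[x <= threshold].sum() * dx        # circles missed
print(f"P(false alarm) ~ {p_false_alarm:.3f}, P(miss) ~ {p_miss:.3f}")
```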
What if we have prior information?
Bayesian probabilities say that if we only expect 10% of the objects to be circles, that should affect our classification.
Q5-8
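To make that concrete, here is a worked example with illustrative numbers (not from the slides): suppose at some x the two class-conditional densities tie, p(x|circle) = p(x|non-circle) = 0.2, but P(circle) = 0.1. Then

P(circle|x) ∝ p(x|circle) P(circle) = 0.2 × 0.1 = 0.02
P(non-circle|x) ∝ p(x|non-circle) P(non-circle) = 0.2 × 0.9 = 0.18

so x is classified as a non-circle even though the densities are equal. The threshold shifts toward the circle class: a point is only called a circle where p(x|circle) is at least 9× p(x|non-circle).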
Bayesian classifier in generalBayesian classifier in general Bayes rule:Bayes rule:
Verify with exampleVerify with example For classifiers:For classifiers:
x = feature(s)x = feature(s) ii = class = class P(P(|x) = posterior probability|x) = posterior probability P(P() = prior) = prior P(x) = unconditional probability P(x) = unconditional probability Find best class by Find best class by maximum a maximum a
posteriori (MAP) posteriori (MAP) priniciple.priniciple. Find Find class iclass i that maximizes P(that maximizes P(ii|x).|x).
Denominator doesn’t affect Denominator doesn’t affect calculationscalculations
Example: indoor/outdoor classification
P(ωi|x) = p(x|ωi) P(ωi) / p(x)
p(x|ωi): learned from examples (histogram)
P(ωi): learned from the training set (or leave out if unknown)
p(x): fixed
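A minimal sketch of the MAP rule with histogram likelihoods; the feature distributions, bin edges, and priors below are hypothetical placeholders:

```python
import numpy as np

# MAP classifier sketch: histogram likelihoods p(x|w_i) plus priors P(w_i).
# All training data, bin edges, and priors here are hypothetical.
bins = np.linspace(0.0, 1.0, 11)  # 10 bins over a feature in [0, 1]

def histogram_likelihood(samples, bins):
    """Estimate p(x|class) as a normalized histogram of training samples."""
    counts, _ = np.histogram(samples, bins=bins)
    return counts / counts.sum()

# Hypothetical training features for two classes
train_w1 = np.random.default_rng(0).beta(2, 5, size=500)  # class w1 skews low
train_w2 = np.random.default_rng(1).beta(5, 2, size=500)  # class w2 skews high
likelihoods = [histogram_likelihood(train_w1, bins),
               histogram_likelihood(train_w2, bins)]
priors = np.array([0.9, 0.1])  # e.g., only 10% of objects are class w2

def classify(x):
    """Return the MAP class index: argmax_i p(x|w_i) P(w_i)."""
    b = int(np.clip(np.digitize(x, bins) - 1, 0, len(bins) - 2))
    posteriors = np.array([lk[b] for lk in likelihoods]) * priors
    return int(np.argmax(posteriors))  # p(x) omitted: same for every class

print(classify(0.7))  # the likelihood ratio must overcome the 9:1 prior
```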
Indoor vs. outdoor classification
I can use low-level image info (color, texture, etc.).
But there's another source of really helpful info!
Camera Metadata Distributions
[Figure: histograms of camera metadata conditioned on class, indoor (I) vs. outdoor (O): p(FF|I) vs. p(FF|O) for flash fired, p(SD|I) vs. p(SD|O) for subject distance, p(ET|I) vs. p(ET|O) for exposure time, and p(BV|I) vs. p(BV|O) for scene brightness.]
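As a sketch of where such curves come from, here is how one conditional histogram, p(FF|class), might be estimated by counting over labeled EXIF records; the records and field names below are hypothetical, not real training data:

```python
from collections import Counter

# Sketch: estimating p(flash_fired | class) from labeled EXIF records.
# These records are hypothetical placeholders.
records = [
    {"flash": 1, "label": "indoor"}, {"flash": 1, "label": "indoor"},
    {"flash": 0, "label": "indoor"}, {"flash": 0, "label": "outdoor"},
    {"flash": 0, "label": "outdoor"}, {"flash": 1, "label": "outdoor"},
]

def flash_distribution(records, label):
    """Normalized histogram p(FF | label) over flash off/on."""
    counts = Counter(r["flash"] for r in records if r["label"] == label)
    total = sum(counts.values())
    return {ff: counts.get(ff, 0) / total for ff in (0, 1)}

p_ff_indoor = flash_distribution(records, "indoor")    # p(FF|I)
p_ff_outdoor = flash_distribution(records, "outdoor")  # p(FF|O)
print(p_ff_indoor, p_ff_outdoor)
```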
Why we need Bayes Rule
Problem:
We know conditional probabilities like P(flash was on | indoor).
We want to find conditional probabilities like P(indoor | flash was on, exp time = 0.017, sd = 8 ft, SVM output).
Let ω = class of image and x = all the evidence.
More generally, we know P(x|ω) from the training set (why?), but we want P(ω|x):
P(ωi|x) = p(x|ωi) P(ωi) / p(x)
Using Bayes Rule
P(ω|x) = P(x|ω) P(ω) / P(x). The denominator is constant for an image, so
P(ω|x) ∝ P(x|ω) P(ω)
Q9
Using Bayes Rule (continued)
We have two types of features: from image metadata (M) and from low-level features, like color (L).
Conditional independence means P(x|ω) = P(M|ω) P(L|ω), so
P(ω|x) ∝ P(M|ω) P(L|ω) P(ω)
P(M|ω): from histograms. P(L|ω): from SVM. P(ω): priors (initial bias).
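A sketch of how the three factors might be fused; the histogram values, the priors, and the sigmoid mapping from SVM margin to P(L|ω) (a Platt-style squashing) are all hypothetical choices, not the paper's exact method:

```python
import math

# Sketch: fusing metadata likelihoods, an SVM-based low-level likelihood,
# and class priors. All numbers below are hypothetical placeholders.
classes = ("indoor", "outdoor")
priors = {"indoor": 0.5, "outdoor": 0.5}

# p(M|class): a metadata histogram lookup (here just flash fired = 1,
# read from hypothetical trained histograms)
p_meta = {"indoor": 0.67, "outdoor": 0.12}

def p_low_level(svm_margin, cls):
    """Map an SVM margin to pseudo-likelihoods P(L|class) via a sigmoid
    (Platt-style squashing; the scale constant 2.0 is a hypothetical choice)."""
    p_indoor = 1.0 / (1.0 + math.exp(-2.0 * svm_margin))
    return p_indoor if cls == "indoor" else 1.0 - p_indoor

def posterior(svm_margin):
    """P(class|x) ∝ P(M|class) P(L|class) P(class), then normalize."""
    scores = {c: p_meta[c] * p_low_level(svm_margin, c) * priors[c]
              for c in classes}
    z = sum(scores.values())
    return {c: s / z for c, s in scores.items()}

print(posterior(svm_margin=-0.4))  # low-level cue says outdoor; flash says indoor
```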
Bayesian network
An efficient way to encode conditional probability distributions and calculate marginals.
Use it for classification by putting the classification node at the root.
Examples:
Indoor-outdoor classification
Automatic image orientation detection
Indoor vs. outdoor classification
[Diagram: Bayesian network for indoor/outdoor classification; node labels include SVM, KL Divergence, Color Features, Texture Features, and EXIF header.]
Each edge in the graph has an associated matrix of conditional probabilities.
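A minimal sketch of that idea, assuming a tree-shaped network with the class at the root and one conditional-probability matrix per edge; the nodes, states, and numbers are hypothetical, not the paper's network:

```python
import numpy as np

# Tiny tree-shaped Bayesian network: a 2-state class node at the root,
# one CPT matrix per edge to an observed child. Numbers are hypothetical.
root_prior = np.array([0.5, 0.5])  # P(class): [indoor, outdoor]

# One conditional-probability matrix per edge: rows = class, cols = child state
cpts = {
    "flash":  np.array([[0.3, 0.7],    # P(flash = off/on | indoor)
                        [0.9, 0.1]]),  # P(flash = off/on | outdoor)
    "bright": np.array([[0.8, 0.2],    # P(brightness = low/high | indoor)
                        [0.2, 0.8]]),  # P(brightness = low/high | outdoor)
}

def root_posterior(evidence):
    """P(class | evidence) in a tree with the class at the root:
    multiply the prior by each edge's CPT column for the observed state."""
    scores = root_prior.copy()
    for node, state in evidence.items():
        scores = scores * cpts[node][:, state]
    return scores / scores.sum()  # normalizing removes P(evidence)

# Flash fired (state 1) and brightness low (state 0):
print(root_posterior({"flash": 1, "bright": 0}))  # strongly favors indoor
```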
Effects of Image Capture Context
Recall for a class C is the fraction of true C examples classified correctly.
Orientation detection
See the IEEE TPAMI paper (hardcopy or posted).
Also uses a single-feature Bayesian classifier (answer to #1-4).
Keys: it's a 4-class problem (North, South, East, West), and priors really helped here!
You should be able to understand the two papers (both posted).