Generative learning methods for bags of features
• Model the probability of a bag of features given a class
Many slides adapted from Fei-Fei Li, Rob Fergus, and Antonio Torralba
Generative methods
• We will cover two models, both inspired by text document analysis:
  • Naïve Bayes
  • Probabilistic Latent Semantic Analysis
The Naïve Bayes model
Csurka et al. 2004
• Assume that each feature is conditionally independent given the class
• f_i: ith feature in the image
• N: number of features in the image

$$p(f_1, \ldots, f_N \mid c) = \prod_{i=1}^{N} p(f_i \mid c)$$
The Naïve Bayes model
Csurka et al. 2004
$$p(f_1, \ldots, f_N \mid c) = \prod_{i=1}^{N} p(f_i \mid c) = \prod_{j=1}^{M} p(w_j \mid c)^{n(w_j)}$$
• Assume that each feature is conditionally independent given the class
• w_j: jth visual word in the vocabulary
• M: size of visual vocabulary
• n(w_j): number of features of type w_j in the image
• f_i: ith feature in the image
• N: number of features in the image
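The equivalence between the product over individual features and the product over word types raised to their counts can be checked numerically. A minimal sketch, with a hypothetical 3-word vocabulary and made-up probabilities:

```python
import numpy as np

# Hypothetical toy example: vocabulary of M = 3 visual words.
# p(w | c) for one class c, assumed already estimated.
p_w_given_c = np.array([0.5, 0.3, 0.2])

# An image as a sequence of N = 4 quantized features (word indices).
features = [0, 2, 0, 1]

# Product over individual features: prod_i p(f_i | c)
prod_features = np.prod(p_w_given_c[features])

# ...equals the product over word types: prod_j p(w_j | c)^{n(w_j)}
counts = np.bincount(features, minlength=3)   # n(w_j) = [2, 1, 1]
prod_counts = np.prod(p_w_given_c ** counts)

assert np.isclose(prod_features, prod_counts)
```

The two forms agree because multiplication is commutative; the count form only needs the bag-of-words histogram, not the feature order.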
The Naïve Bayes model
Csurka et al. 2004
• Assume that each feature is conditionally independent given the class
$$p(w_j \mid c) = \frac{\text{No. of features of type } w_j \text{ in training images of class } c}{\text{Total no. of features in training images of class } c}$$

$$p(f_1, \ldots, f_N \mid c) = \prod_{i=1}^{N} p(f_i \mid c) = \prod_{j=1}^{M} p(w_j \mid c)^{n(w_j)}$$
The Naïve Bayes model
Csurka et al. 2004
• Assume that each feature is conditionally independent given the class
$$p(w_j \mid c) = \frac{\text{No. of features of type } w_j \text{ in training images of class } c + 1}{\text{Total no. of features in training images of class } c + M}$$

(Laplace smoothing to avoid zero counts)

$$p(f_1, \ldots, f_N \mid c) = \prod_{i=1}^{N} p(f_i \mid c) = \prod_{j=1}^{M} p(w_j \mid c)^{n(w_j)}$$
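The smoothed estimate above is just frequency counting plus one. A minimal sketch with a hypothetical 2-class, 3-word count table (the numbers are made up for illustration):

```python
import numpy as np

# Hypothetical training data: counts[c, j] = number of features of
# type w_j seen in training images of class c.
counts = np.array([[10, 0, 5],
                   [ 2, 8, 2]], dtype=float)
M = counts.shape[1]  # vocabulary size

# Laplace-smoothed estimate: (count + 1) / (total + M)
p_w_given_c = (counts + 1) / (counts.sum(axis=1, keepdims=True) + M)

# Each row is a valid distribution, and no probability is exactly zero,
# even for word w_1 which never occurred in class 0.
assert np.allclose(p_w_given_c.sum(axis=1), 1.0)
assert (p_w_given_c > 0).all()
```

Without the +1/+M terms, the unseen word would get probability zero and a single occurrence of it at test time would zero out the whole likelihood.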
The Naïve Bayes model

Csurka et al. 2004

• Maximum A Posteriori decision:

$$c^* = \arg\max_c \; p(c) \prod_{j=1}^{M} p(w_j \mid c)^{n(w_j)} = \arg\max_c \left[ \log p(c) + \sum_{j=1}^{M} n(w_j) \log p(w_j \mid c) \right]$$

(you should compute the log of the likelihood instead of the likelihood itself in order to avoid underflow)
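The MAP rule in log space is a single dot product per class. A minimal sketch, assuming a hypothetical fitted model with 2 classes and 3 visual words (all numbers invented for illustration):

```python
import numpy as np

# Hypothetical model: uniform prior, class-conditional word distributions.
log_prior = np.log(np.array([0.5, 0.5]))
p_w_given_c = np.array([[0.6, 0.2, 0.2],    # class 0 favors word 0
                        [0.2, 0.2, 0.6]])   # class 1 favors word 2

def classify(word_counts):
    # argmax_c [ log p(c) + sum_j n(w_j) log p(w_j | c) ]
    log_post = log_prior + word_counts @ np.log(p_w_given_c).T
    return int(np.argmax(log_post))

# An image dominated by word 0 goes to class 0; by word 2, to class 1.
assert classify(np.array([5, 1, 0])) == 0
assert classify(np.array([0, 1, 5])) == 1
```

Working in log space turns the product of many small probabilities into a sum, which avoids floating-point underflow for images with many features.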
The Naïve Bayes model
Csurka et al. 2004
• "Graphical model": c → w, with the word node w inside a plate repeated N times (the class generates each of the N words independently)
Probabilistic Latent Semantic Analysis
T. Hofmann, Probabilistic Latent Semantic Analysis, UAI 1999
• An image is modeled as a mixture of "visual topics" (e.g. zebra, grass, tree):

Image = p_1 · (zebra) + p_2 · (grass) + p_3 · (tree)
Probabilistic Latent Semantic Analysis
• Unsupervised technique
• Two-level generative model: a document is a mixture of topics, and each topic has its own characteristic word distribution

document d → topic z (via P(z|d)) → word w (via P(w|z))

T. Hofmann, Probabilistic Latent Semantic Analysis, UAI 1999
Probabilistic Latent Semantic Analysis
• Unsupervised technique
• Two-level generative model: a document is a mixture of topics, and each topic has its own characteristic word distribution

document d → topic z → word w

T. Hofmann, Probabilistic Latent Semantic Analysis, UAI 1999

$$p(w_i \mid d_j) = \sum_{k=1}^{K} p(w_i \mid z_k)\, p(z_k \mid d_j)$$
The pLSA model
$$p(w_i \mid d_j) = \sum_{k=1}^{K} p(w_i \mid z_k)\, p(z_k \mid d_j)$$

• p(w_i | d_j): probability of word i in document j (known)
• p(w_i | z_k): probability of word i given topic k (unknown)
• p(z_k | d_j): probability of topic k given document j (unknown)
The pLSA model
$$p(w_i \mid d_j) = \sum_{k=1}^{K} p(w_i \mid z_k)\, p(z_k \mid d_j)$$

In matrix form, pLSA is a decomposition [M×N] = [M×K] · [K×N] (words × documents = words × topics · topics × documents):

• Observed codeword distributions p(w_i|d_j): M×N
• Codeword distributions per topic (class) p(w_i|z_k): M×K
• Class distributions per image p(z_k|d_j): K×N
Maximize the likelihood of the data:

$$\sum_{i=1}^{M} \sum_{j=1}^{N} n(w_i, d_j) \log p(w_i \mid d_j)$$

• n(w_i, d_j): observed count of word i in document j
• M: number of codewords
• N: number of images

Slide credit: Josef Sivic
Learning pLSA parameters
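The standard fitting procedure is Expectation-Maximization: the E-step computes the topic posterior p(z | w, d) under the current parameters, and the M-step re-estimates p(w|z) and p(z|d) from the posterior-weighted counts. A minimal sketch (function name, shapes, and iteration count are illustrative choices, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(0)

def plsa_em(n_wd, K, iters=50):
    """Fit p(w|z) (M x K) and p(z|d) (K x N) to a count matrix n_wd (M x N)
    by EM, increasing sum_ij n(w_i, d_j) log p(w_i | d_j)."""
    M, N = n_wd.shape
    # Random positive initialization, columns normalized to distributions.
    p_w_z = rng.random((M, K)); p_w_z /= p_w_z.sum(0)
    p_z_d = rng.random((K, N)); p_z_d /= p_z_d.sum(0)
    for _ in range(iters):
        # E-step: posterior p(z | w, d), shape (K, M, N)
        joint = p_w_z.T[:, :, None] * p_z_d[:, None, :]
        post = joint / joint.sum(0, keepdims=True)
        # M-step: redistribute the observed counts by the posterior
        weighted = post * n_wd[None, :, :]   # (K, M, N)
        p_w_z = weighted.sum(2).T            # (M, K)
        p_w_z /= p_w_z.sum(0)
        p_z_d = weighted.sum(1)              # (K, N)
        p_z_d /= p_z_d.sum(0)
    return p_w_z, p_z_d
```

On a count matrix with clear block structure, the recovered topics separate the two word groups; in practice one would also monitor the log-likelihood for convergence.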
Inference
• Finding the most likely topic (class) for an image:

$$z^* = \arg\max_z \; p(z \mid d)$$
Inference
• Finding the most likely topic (class) for an image:

$$z^* = \arg\max_z \; p(z \mid d)$$

• Finding the most likely topic (class) for a visual word in a given image:

$$z^* = \arg\max_z \; p(z \mid w, d) = \arg\max_z \frac{p(w \mid z)\, p(z \mid d)}{\sum_{z'} p(w \mid z')\, p(z' \mid d)}$$
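Both inference rules are one-liners given the fitted distributions. A minimal sketch with a hypothetical 2-topic, 3-word model (the numbers are invented); note the normalizer in the word-level rule does not depend on z, so it can be dropped from the argmax:

```python
import numpy as np

# Hypothetical fitted model for one document d.
p_w_z = np.array([[0.7, 0.1],
                  [0.2, 0.2],
                  [0.1, 0.7]])   # p(w | z), columns sum to 1
p_z_d = np.array([0.6, 0.4])     # p(z | d)

# Most likely topic for the image: argmax_z p(z | d)
z_image = int(np.argmax(p_z_d))

# Most likely topic for word w in this image:
# argmax_z p(w | z) p(z | d) -- the denominator is constant in z.
w = 2
z_word = int(np.argmax(p_w_z[w] * p_z_d))

assert z_image == 0   # topic 0 dominates the document overall
assert z_word == 1    # but word 2 is better explained by topic 1
```

This is how pLSA can label individual features with topics (and hence roughly localize objects) even though the document-level mixture favors another topic.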
Topic discovery in images
J. Sivic, B. Russell, A. Efros, A. Zisserman, B. Freeman, Discovering Objects and their Location in Images, ICCV 2005
Application of pLSA: Action recognition
Juan Carlos Niebles, Hongcheng Wang and Li Fei-Fei, Unsupervised Learning of Human Action Categories Using Spatial-Temporal Words, IJCV 2008.
Space-time interest points
Application of pLSA: Action recognition
Juan Carlos Niebles, Hongcheng Wang and Li Fei-Fei, Unsupervised Learning of Human Action Categories Using Spatial-Temporal Words, IJCV 2008.
pLSA model
– w_i = spatial-temporal word
– d_j = video
– n(w_i, d_j) = co-occurrence table (# of occurrences of word w_i in video d_j)
– z = topic, corresponding to an action
$$p(w_i \mid d_j) = \sum_{k=1}^{K} p(w_i \mid z_k)\, p(z_k \mid d_j)$$

• p(w_i | d_j): probability of word i in video j (known)
• p(w_i | z_k): probability of word i given topic k (unknown)
• p(z_k | d_j): probability of topic k given video j (unknown)
Action recognition example
Multiple Actions
Multiple Actions
Summary: Generative models
• Naïve Bayes
  • Unigram model from document analysis
  • Assumes conditional independence of words given the class
  • Parameter estimation: frequency counting
• Probabilistic Latent Semantic Analysis
  • Unsupervised technique
  • Each document is a mixture of topics (image is a mixture of classes)
  • Can be thought of as matrix decomposition
  • Parameter estimation: Expectation-Maximization