Hidden Variables, the EM Algorithm, and Mixtures of Gaussians

Computer Vision
Jia-Bin Huang, Virginia Tech
Many slides from D. Hoiem


Page 1

Hidden Variables, the EM Algorithm, and Mixtures of Gaussians

Computer Vision

Jia-Bin Huang, Virginia Tech

Many slides from D. Hoiem

Page 2

Administrative stuffs

• Final project proposal due soon: extended to Monday, Oct 29

• Tips for the final project
  – Set up several milestones
  – Think about how you are going to evaluate your results
  – A demo is highly encouraged

• HW 4 out tomorrow

Page 3

Sample final projects

• State quarter classification

• Stereo Vision - correspondence matching

• Collaborative monocular SLAM for Multiple Robots in an unstructured environment

• Fight Detection using Convolutional Neural Networks

• Actor Rating using Facial Emotion Recognition

• Fiducial Markers on Bat Tracking Based on Non-rigid Registration

• Im2Latex: Converting Handwritten Mathematical Expressions to Latex

• Pedestrian Detection and Tracking

• Inference with Deep Neural Networks

• Rubik's Cube

• Plant Leaf Disease Detection and Classification

• MBZIRC Challenge-2017

• Multi-modal Learning Scheme for Athlete Recognition System in Long Video

• Computer Vision In Quantitative Phase Imaging

• Aircraft pose estimation for level flight

• Automatic segmentation of brain tumor from MRI images

• Visual Dialog

• PixelDream

Page 4

Superpixel algorithms

• Goal: divide the image into a large number of regions, such that each region lies within object boundaries

• Examples
  – Watershed
  – Felzenszwalb and Huttenlocher graph-based
  – Turbopixels
  – SLIC

Page 5

Watershed algorithm

Page 6

Watershed segmentation

(Figure: input image, gradient, watershed boundaries)

Page 7

Meyer’s watershed segmentation

1. Choose local minima as region seeds

2. Add neighbors to priority queue, sorted by value

3. Take the top-priority pixel from the queue
   1. If all labeled neighbors have the same label, assign that label to the pixel
   2. Add all non-marked neighbors to the queue

4. Repeat step 3 until finished (all remaining pixels in the queue are on the boundary)

Meyer 1991

Matlab: seg = watershed(bnd_im)
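A minimal end-to-end sketch (assuming the Image Processing Toolbox; `im` here stands for a hypothetical grayscale image, not a variable defined on the slides):

g = imgradient(im);   % soft boundary map: gradient magnitude
seg = watershed(g);   % Meyer's flooding; returns a label image, 0 on watershed lines
bnd = (seg == 0);     % boundary pixels between regions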

Page 8

Simple trick

• Use a Gaussian or median filter to reduce the number of regions
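For example (a sketch; the filter sizes are arbitrary choices, and `imgaussfilt`/`medfilt2` assume the Image Processing Toolbox):

im_s = imgaussfilt(im, 2);          % Gaussian smoothing merges shallow local minima
% im_s = medfilt2(im, [5 5]);       % a median filter is an alternative
seg = watershed(imgradient(im_s));  % fewer minima -> fewer regions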

Page 9

Watershed usage

• Use as a starting point for hierarchical segmentation
  – Ultrametric contour map (Arbelaez 2006)

• Works with any soft boundaries
  – Pb (without non-max suppression)
  – Canny (without non-max suppression)
  – Etc.

Page 10

Watershed pros and cons

• Pros
  – Fast (< 1 sec for a 512x512 image)
  – Preserves boundaries

• Cons
  – Only as good as the soft boundaries (which may be slow to compute)
  – Not easy to get a variety of regions for multiple segmentations

• Usage
  – Good algorithm for superpixels and hierarchical segmentation

Page 11

Felzenszwalb and Huttenlocher: Graph-Based Segmentation

+ Good for thin regions
+ Fast
+ Easy to control coarseness of segmentations
+ Can include both large and small regions
- Often creates regions with strange shapes
- Sometimes makes very large errors

http://www.cs.brown.edu/~pff/segment/

Page 12

TurboPixels: Levinshtein et al. 2009
http://www.cs.toronto.edu/~kyros/pubs/09.pami.turbopixels.pdf

Tries to preserve boundaries like watershed, but produces more regular regions

Page 13

SLIC (Achanta et al. PAMI 2012)

1. Initialize cluster centers on pixel grid in steps S

- Features: Lab color, x-y position

2. Move centers to position in 3x3 window with smallest gradient

3. Compare each pixel to cluster center within 2S pixel distance and assign to nearest

4. Recompute cluster centers as mean color/position of pixels belonging to each cluster

5. Stop when residual error is small

http://infoscience.epfl.ch/record/177415/files/Superpixel_PAMI2011-2.pdf

+ Fast: 0.36 s for a 320x240 image
+ Regular superpixels
+ Superpixels fit boundaries
- May miss thin objects
- Large number of superpixels

Page 14

Choices in segmentation algorithms

• Oversegmentation
  – Watershed + structured random forest
  – Felzenszwalb and Huttenlocher 2004 (http://www.cs.brown.edu/~pff/segment/)
  – SLIC
  – Turbopixels
  – Mean-shift

• Larger regions (object-level)
  – Hierarchical segmentation (e.g., from Pb)
  – Normalized cuts
  – Mean-shift
  – Seed + graph cuts (discussed later)

Page 15

Multiple segmentations

• Don’t commit to one partitioning

• Hierarchical segmentation
  – Occlusion boundaries hierarchy: Hoiem et al. IJCV 2011 (uses a trained classifier to merge)
  – Pb + watershed hierarchy: Arbelaez et al. CVPR 2009
  – Selective search: FH + agglomerative clustering
  – Superpixel hierarchy

• Vary segmentation parameters
  – E.g., multiple graph-based or mean-shift segmentations

• Region proposals
  – Propose a seed superpixel, then try to segment out the object that contains it (Endres & Hoiem ECCV 2010, Carreira & Sminchisescu CVPR 2010)

Page 16

Review: Image Segmentation

• Gestalt cues and principles of organization

• Uses of segmentation
  – Efficiency
  – Provide feature supports
  – Propose object regions
  – Want the segmented object

• Segmentation and grouping
  – Gestalt cues
  – By clustering (k-means, mean-shift)
  – By boundaries (watershed)
  – By graph (merging, graph cuts)
  – By labeling (MRF) <- next lecture

Page 17

HW 4: SLIC (Achanta et al. PAMI 2012)

1. Initialize cluster centers on pixel grid in steps S

- Features: Lab color, x-y position

2. Move centers to position in 3x3 window with smallest gradient

3. Compare each pixel to cluster center within 2S pixel distance and assign to nearest

4. Recompute cluster centers as mean color/position of pixels belonging to each cluster

5. Stop when residual error is small

http://infoscience.epfl.ch/record/177415/files/Superpixel_PAMI2011-2.pdf

+ Fast: 0.36 s for a 320x240 image
+ Regular superpixels
+ Superpixels fit boundaries
- May miss thin objects
- Large number of superpixels
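For HW 4, a rough MATLAB sketch of steps 3-5 above (illustrative code under my own variable names, not the authors' reference implementation; step 2 is omitted, a fixed iteration count stands in for the residual check in step 5, `lab` is assumed to be an HxWx3 image already converted to Lab space, and implicit expansion requires MATLAB R2016b+):

[H, W, ~] = size(lab);
K = 200;                            % desired number of superpixels
S = round(sqrt(H * W / K));         % grid step between cluster centers (step 1)
m = 10;                             % compactness weight
[gx, gy] = meshgrid(round(S/2):S:W, round(S/2):S:H);
cxy = [gx(:), gy(:)];               % center positions on a grid
cc = zeros(size(cxy, 1), 3);        % center colors
for k = 1:size(cxy, 1)
    cc(k, :) = squeeze(lab(cxy(k, 2), cxy(k, 1), :))';
end
L = zeros(H, W);
for iter = 1:10                     % fixed iterations instead of step 5
    D = inf(H, W);
    for k = 1:size(cxy, 1)          % step 3: assign within a 2S x 2S window
        xs = max(1, cxy(k,1)-S) : min(W, cxy(k,1)+S);
        ys = max(1, cxy(k,2)-S) : min(H, cxy(k,2)+S);
        patch = double(lab(ys, xs, :));
        dc = sum((patch - reshape(cc(k,:), 1, 1, 3)).^2, 3);  % color distance
        [px, py] = meshgrid(xs, ys);
        ds = (px - cxy(k,1)).^2 + (py - cxy(k,2)).^2;         % spatial distance
        d = dc + (m / S)^2 * ds;                              % combined distance
        Dw = D(ys, xs); Lw = L(ys, xs);
        upd = d < Dw;
        Dw(upd) = d(upd); Lw(upd) = k;
        D(ys, xs) = Dw; L(ys, xs) = Lw;
    end
    for k = 1:size(cxy, 1)          % step 4: recompute centers as means
        [yy, xx] = find(L == k);
        if isempty(xx), continue; end
        cxy(k, :) = round([mean(xx), mean(yy)]);
        idx = sub2ind([H, W], yy, xx);
        for c = 1:3
            ch = lab(:, :, c);
            cc(k, c) = mean(ch(idx));
        end
    end
end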

Page 18

Today’s Class

• Examples of Missing Data Problems
  – Detecting outliers
  – Latent topic models
  – Segmentation (HW 4, problem 2)

• Background
  – Maximum Likelihood Estimation
  – Probabilistic Inference

• Dealing with “Hidden” Variables
  – EM algorithm, Mixture of Gaussians
  – Hard EM

Page 19

Missing Data Problems: Outliers

You want to train an algorithm to predict whether a photograph is attractive. You collect annotations from Mechanical Turk. Some annotators try to give accurate ratings, but others answer randomly.

Challenge: Determine which people to trust and the average rating by accurate annotators.

Photo: Jam343 (Flickr)

Annotator ratings: 10, 8, 9, 2, 8

Page 20

Missing Data Problems: Object Discovery

You have a collection of images and have extracted regions from them. Each region is represented by a histogram of “visual words”.

Challenge: Discover frequently occurring object categories, without pre-trained appearance models.

http://www.robots.ox.ac.uk/~vgg/publications/papers/russell06.pdf

Page 21

Missing Data Problems: Segmentation

You are given an image and want to assign each pixel a foreground/background label.

Challenge: Segment the image into figure and ground without knowing what the foreground looks like in advance.

Foreground

Background

Page 22

Missing Data Problems: Segmentation

Challenge: Segment the image into figure and ground without knowing what the foreground looks like in advance.

Three steps:

1. If we had labels, how could we model the appearance of foreground and background?
   • Maximum Likelihood Estimation

2. Once we have modeled the fg/bg appearance, how do we compute the likelihood that a pixel is foreground?
   • Probabilistic Inference

3. How can we get both labels and appearance models at once?
   • Expectation-Maximization (EM) Algorithm

Page 23

Maximum Likelihood Estimation

1. If we had labels, how could we model the appearance of foreground and background?

Foreground

Background

Page 24

Maximum Likelihood Estimation

\hat{\theta} = \operatorname*{argmax}_{\theta} \; p(\mathbf{x} \mid \theta)

\hat{\theta} = \operatorname*{argmax}_{\theta} \prod_{n=1}^{N} p(x_n \mid \theta)

where \mathbf{x} = \{x_1, \ldots, x_N\} is the data and \theta are the parameters.

Page 25

Maximum Likelihood Estimation

\hat{\theta} = \operatorname*{argmax}_{\theta} \; p(\mathbf{x} \mid \theta) = \operatorname*{argmax}_{\theta} \prod_{n=1}^{N} p(x_n \mid \theta)

Gaussian distribution:

p(x_n \mid \mu, \sigma^2) = \frac{1}{\sqrt{2\pi\sigma^2}} \exp\left( -\frac{(x_n - \mu)^2}{2\sigma^2} \right)

Page 26

Maximum Likelihood Estimation

\hat{\theta} = \operatorname*{argmax}_{\theta} \; p(\mathbf{x} \mid \theta) = \operatorname*{argmax}_{\theta} \; \log p(\mathbf{x} \mid \theta)

Log-likelihood:

\hat{\theta} = \operatorname*{argmax}_{\theta} \sum_{n} \log p(x_n \mid \theta) = \operatorname*{argmax}_{\theta} \; L(\theta)

For the Gaussian distribution p(x_n \mid \mu, \sigma^2) = \frac{1}{\sqrt{2\pi\sigma^2}} \exp\left( -\frac{(x_n - \mu)^2}{2\sigma^2} \right):

L(\theta) = -\frac{N}{2} \log 2\pi - \frac{N}{2} \log \sigma^2 - \frac{1}{2\sigma^2} \sum_{n} (x_n - \mu)^2

\frac{\partial L(\theta)}{\partial \mu} = \frac{1}{\sigma^2} \sum_{n} (x_n - \mu) = 0 \;\Rightarrow\; \hat{\mu} = \frac{1}{N} \sum_{n} x_n

\frac{\partial L(\theta)}{\partial \sigma} = -\frac{N}{\sigma} + \frac{1}{\sigma^3} \sum_{n} (x_n - \mu)^2 = 0 \;\Rightarrow\; \hat{\sigma}^2 = \frac{1}{N} \sum_{n} (x_n - \hat{\mu})^2

Page 27

Maximum Likelihood Estimation

\hat{\theta} = \operatorname*{argmax}_{\theta} \prod_{n=1}^{N} p(x_n \mid \theta), \qquad p(x_n \mid \mu, \sigma^2) = \frac{1}{\sqrt{2\pi\sigma^2}} \exp\left( -\frac{(x_n - \mu)^2}{2\sigma^2} \right)

Maximum likelihood estimates:

\hat{\mu} = \frac{1}{N} \sum_{n} x_n, \qquad \hat{\sigma}^2 = \frac{1}{N} \sum_{n} (x_n - \hat{\mu})^2

Page 28

Example: MLE

>> mu_fg = mean(im(labels))                       % ML mean of foreground pixels

mu_fg = 0.6012

>> sigma_fg = sqrt(mean((im(labels)-mu_fg).^2))   % ML std of foreground pixels

sigma_fg = 0.1007

>> mu_bg = mean(im(~labels))                      % ML mean of background pixels

mu_bg = 0.4007

>> sigma_bg = sqrt(mean((im(~labels)-mu_bg).^2))  % ML std of background pixels

sigma_bg = 0.1007

>> pfg = mean(labels(:));                         % fraction of foreground pixels

(Figure: im and labels. Parameters used to generate the image: fg: mu = 0.6, sigma = 0.1; bg: mu = 0.4, sigma = 0.1)

Page 29

Probabilistic Inference

2. Once we have modeled the fg/bg appearance, how do we compute the likelihood that a pixel is foreground?

Foreground

Background

Page 30

Probabilistic Inference

Compute the likelihood that a particular model generated a sample

p(z_n = m \mid x_n, \theta) \qquad (z_n = m: component or label)

Page 31

Probabilistic Inference

Compute the likelihood that a particular model generated a sample (z_n = m: component or label):

p(z_n = m \mid x_n, \theta) = \frac{p(z_n = m, x_n \mid \theta)}{p(x_n \mid \theta)}

Conditional probability: P(A \mid B) = \frac{P(A, B)}{P(B)}

Page 32

Probabilistic Inference

Compute the likelihood that a particular model generated a sample (z_n = m: component or label):

p(z_n = m \mid x_n, \theta) = \frac{p(z_n = m, x_n \mid \theta)}{p(x_n \mid \theta)} = \frac{p(z_n = m, x_n \mid \theta)}{\sum_{k} p(z_n = k, x_n \mid \theta)}

Marginalization: P(A) = \sum_{k} P(A, B = k)

Page 33

Probabilistic Inference

Compute the likelihood that a particular model generated a sample (z_n = m: component or label):

p(z_n = m \mid x_n, \theta) = \frac{p(z_n = m, x_n \mid \theta)}{\sum_{k} p(z_n = k, x_n \mid \theta)} = \frac{p(x_n \mid z_n = m, \theta_m)\, p(z_n = m \mid \pi_m)}{\sum_{k} p(x_n \mid z_n = k, \theta_k)\, p(z_n = k \mid \pi_k)}

Joint distribution: P(A, B) = P(B)\, P(A \mid B)

Page 34

Example: Inference

>> pfg = 0.5;                                          % prior p(fg)

>> px_fg = normpdf(im, mu_fg, sigma_fg);               % likelihood p(x | fg)

>> px_bg = normpdf(im, mu_bg, sigma_bg);               % likelihood p(x | bg)

>> pfg_x = px_fg*pfg ./ (px_fg*pfg + px_bg*(1-pfg));   % posterior p(fg | x), by Bayes' rule

(Figure: im and p(fg | im). Learned parameters: fg: mu = 0.6, sigma = 0.1; bg: mu = 0.4, sigma = 0.1)

Page 35

Dealing with Hidden Variables

3. How can we get both labels and appearance parameters at once?

Foreground

Background

Page 36

Mixture of Gaussians

p(x_n, z_n = m \mid \boldsymbol{\mu}, \boldsymbol{\sigma}^2, \boldsymbol{\pi}) = p(x_n \mid z_n = m, \theta_m)\, p(z_n = m \mid \pi_m) \qquad (z_n = m: mixture component)

p(x_n \mid \boldsymbol{\mu}, \boldsymbol{\sigma}^2, \boldsymbol{\pi}) = \sum_{m} \pi_m \frac{1}{\sqrt{2\pi\sigma_m^2}} \exp\left( -\frac{(x_n - \mu_m)^2}{2\sigma_m^2} \right)

with component priors \pi_m and component model parameters \mu_m, \sigma_m^2.
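As a quick numerical check of this density (a sketch with made-up example parameters; `normpdf` is from the Statistics Toolbox, as used elsewhere in this lecture):

ppi = [0.5, 0.3, 0.2];                 % component priors pi_m (sum to 1)
mu  = [0.3, 0.55, 0.8];                % component means mu_m
sig = [0.08, 0.05, 0.10];              % component standard deviations sigma_m
x = linspace(0, 1, 500);
px = zeros(size(x));
for m = 1:numel(ppi)
    px = px + ppi(m) * normpdf(x, mu(m), sig(m));   % pi_m * N(x; mu_m, sigma_m^2)
end
trapz(x, px)                           % integrates to ~1: a valid density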

Page 37

Mixture of Gaussians

• With enough components, a mixture of Gaussians can approximate virtually any probability density function
• Widely used as a general-purpose pdf estimator

Page 38

Segmentation with Mixture of Gaussians

Pixels come from one of several Gaussian components
• We don't know which pixels come from which components
• We don't know the parameters of the components

Problem:

- Estimate the parameters of the Gaussian Mixture Model.

What would you do?

Page 39

Simple solution

1. Initialize parameters

2. Compute the probability of each hidden variable given the current parameters

3. Compute new parameters for each model, weighted by likelihood of hidden variables

4. Repeat 2-3 until convergence

Page 40

Mixture of Gaussians: Simple Solution

1. Initialize parameters

2. Compute likelihood of hidden variables for current parameters

3. Estimate new parameters for each model, weighted by likelihood

Step 2 (E-step): compute the responsibilities

\alpha_{nm} = p(z_n = m \mid x_n, \mu^{(t)}, \sigma^{2\,(t)}, \pi^{(t)})

Step 3 (M-step): re-estimate each model's parameters, weighted by the responsibilities

\hat{\mu}_m^{(t+1)} = \frac{\sum_n \alpha_{nm}\, x_n}{\sum_n \alpha_{nm}}, \qquad \hat{\sigma}_m^{2\,(t+1)} = \frac{\sum_n \alpha_{nm} \left( x_n - \hat{\mu}_m^{(t+1)} \right)^2}{\sum_n \alpha_{nm}}, \qquad \hat{\pi}_m^{(t+1)} = \frac{\sum_n \alpha_{nm}}{N}
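A compact MATLAB sketch of this loop for 1-D data (illustrative code; `x` is an N-by-1 data vector, a fixed iteration count stands in for a convergence test, `normpdf` assumes the Statistics Toolbox, and the elementwise normalization uses implicit expansion, R2016b+):

M = 2; N = numel(x);
mu = [min(x), max(x)];              % crude initialization (step 1)
sigma2 = var(x) * ones(1, M);
ppi = ones(1, M) / M;               % component priors (named ppi to avoid shadowing pi)
for iter = 1:100
    % E-step: alpha(n,m) = p(z_n = m | x_n, theta^(t))
    alpha = zeros(N, M);
    for m = 1:M
        alpha(:, m) = ppi(m) * normpdf(x, mu(m), sqrt(sigma2(m)));
    end
    alpha = alpha ./ sum(alpha, 2);            % normalize over components
    % M-step: updates weighted by the responsibilities alpha
    Nm = sum(alpha, 1);
    mu = (alpha' * x)' ./ Nm;
    for m = 1:M
        sigma2(m) = sum(alpha(:, m) .* (x - mu(m)).^2) / Nm(m);
    end
    ppi = Nm / N;
end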

Page 41

Expectation Maximization (EM) Algorithm

Goal: \hat{\theta} = \operatorname*{argmax}_{\theta} \; \log \sum_{\mathbf{z}} p(\mathbf{x}, \mathbf{z} \mid \theta)

The log of a sum is intractable to optimize directly.

Jensen's inequality: f(\mathrm{E}[X]) \ge \mathrm{E}[f(X)] for concave functions f(x), so we maximize the lower bound!

See here for a proof: www.stanford.edu/class/cs229/notes/cs229-notes8.ps
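Spelling out the step the slide alludes to (a standard derivation, with q(\mathbf{z}) any distribution over the hidden variables):

\log p(\mathbf{x} \mid \theta) = \log \sum_{\mathbf{z}} q(\mathbf{z}) \, \frac{p(\mathbf{x}, \mathbf{z} \mid \theta)}{q(\mathbf{z})} \;\ge\; \sum_{\mathbf{z}} q(\mathbf{z}) \, \log \frac{p(\mathbf{x}, \mathbf{z} \mid \theta)}{q(\mathbf{z})}

The bound is tight when q(\mathbf{z}) = p(\mathbf{z} \mid \mathbf{x}, \theta), which is why the E-step on the next slide sets q to the posterior under the current estimate \theta^{(t)}.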

Page 42

Expectation Maximization (EM) Algorithm

Goal: \hat{\theta} = \operatorname*{argmax}_{\theta} \; \log \sum_{\mathbf{z}} p(\mathbf{x}, \mathbf{z} \mid \theta)

1. E-step: compute

\mathrm{E}_{\mathbf{z} \mid \mathbf{x}, \theta^{(t)}} \left[ \log p(\mathbf{x}, \mathbf{z} \mid \theta) \right] = \sum_{\mathbf{z}} p(\mathbf{z} \mid \mathbf{x}, \theta^{(t)}) \, \log p(\mathbf{x}, \mathbf{z} \mid \theta)

2. M-step: solve

\theta^{(t+1)} = \operatorname*{argmax}_{\theta} \sum_{\mathbf{z}} p(\mathbf{z} \mid \mathbf{x}, \theta^{(t)}) \, \log p(\mathbf{x}, \mathbf{z} \mid \theta)

Page 43

Expectation Maximization (EM) Algorithm

Goal: \hat{\theta} = \operatorname*{argmax}_{\theta} \; \log \sum_{\mathbf{z}} p(\mathbf{x}, \mathbf{z} \mid \theta)

By Jensen's inequality, f(\mathrm{E}[X]) \ge \mathrm{E}[f(X)]: the intractable log of an expectation of p(\mathbf{x} \mid \mathbf{z}) is replaced by the expectation of its log.

1. E-step: compute

\mathrm{E}_{\mathbf{z} \mid \mathbf{x}, \theta^{(t)}} \left[ \log p(\mathbf{x}, \mathbf{z} \mid \theta) \right] = \sum_{\mathbf{z}} p(\mathbf{z} \mid \mathbf{x}, \theta^{(t)}) \, \log p(\mathbf{x}, \mathbf{z} \mid \theta)

2. M-step: solve

\theta^{(t+1)} = \operatorname*{argmax}_{\theta} \sum_{\mathbf{z}} p(\mathbf{z} \mid \mathbf{x}, \theta^{(t)}) \, \log p(\mathbf{x}, \mathbf{z} \mid \theta)

Page 44

EM for Mixture of Gaussians - derivation

Mixture model:

p(x_n \mid \boldsymbol{\mu}, \boldsymbol{\sigma}^2, \boldsymbol{\pi}) = \sum_{m} p(x_n, z_n = m \mid \boldsymbol{\mu}, \boldsymbol{\sigma}^2, \boldsymbol{\pi}) = \sum_{m} \pi_m \frac{1}{\sqrt{2\pi\sigma_m^2}} \exp\left( -\frac{(x_n - \mu_m)^2}{2\sigma_m^2} \right)

1. E-step: compute \mathrm{E}_{\mathbf{z} \mid \mathbf{x}, \theta^{(t)}} \left[ \log p(\mathbf{x}, \mathbf{z} \mid \theta) \right] = \sum_{\mathbf{z}} p(\mathbf{z} \mid \mathbf{x}, \theta^{(t)}) \, \log p(\mathbf{x}, \mathbf{z} \mid \theta)

2. M-step: solve \theta^{(t+1)} = \operatorname*{argmax}_{\theta} \sum_{\mathbf{z}} p(\mathbf{z} \mid \mathbf{x}, \theta^{(t)}) \, \log p(\mathbf{x}, \mathbf{z} \mid \theta)

Page 45

EM for Mixture of Gaussians

Mixture model:

p(x_n \mid \boldsymbol{\mu}, \boldsymbol{\sigma}^2, \boldsymbol{\pi}) = \sum_{m} \pi_m \frac{1}{\sqrt{2\pi\sigma_m^2}} \exp\left( -\frac{(x_n - \mu_m)^2}{2\sigma_m^2} \right)

1. E-step: \alpha_{nm} = p(z_n = m \mid x_n, \mu^{(t)}, \sigma^{2\,(t)}, \pi^{(t)})

2. M-step:

\hat{\mu}_m^{(t+1)} = \frac{\sum_n \alpha_{nm}\, x_n}{\sum_n \alpha_{nm}}, \qquad \hat{\sigma}_m^{2\,(t+1)} = \frac{\sum_n \alpha_{nm} \left( x_n - \hat{\mu}_m^{(t+1)} \right)^2}{\sum_n \alpha_{nm}}, \qquad \hat{\pi}_m^{(t+1)} = \frac{\sum_n \alpha_{nm}}{N}

Page 46

EM algorithm - derivation

http://lasa.epfl.ch/teaching/lectures/ML_Phd/Notes/GP-GMM.pdf

Page 47

EM algorithm – E-Step

Page 48

EM algorithm – E-Step

Page 49

EM algorithm – M-Step

Page 50

EM algorithm – M-Step

Take the derivative with respect to \mu_l
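The derivation on this slide is an image in the original deck; in the notation used earlier (with responsibilities \alpha_{nm} = p(z_n = m \mid x_n, \theta^{(t)})), the key step is:

\frac{\partial}{\partial \mu_l} \sum_n \sum_m \alpha_{nm} \log p(x_n \mid z_n = m, \theta) = \sum_n \alpha_{nl} \, \frac{x_n - \mu_l}{\sigma_l^2} = 0 \;\Rightarrow\; \hat{\mu}_l = \frac{\sum_n \alpha_{nl}\, x_n}{\sum_n \alpha_{nl}}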

Page 51

EM algorithm – M-Step

Take the derivative with respect to \sigma_l^{-1}

Page 52

EM Algorithm for GMM

Page 53

EM Algorithm

• Maximizes a lower bound on the data likelihood at each iteration

• Each step increases the data likelihood
  – Converges to a local maximum

• Common tricks in the derivation
  – Find terms that sum or integrate to 1
  – Lagrange multipliers to deal with constraints (worked example below)
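For instance, the constraint \sum_m \pi_m = 1 is handled with a Lagrange multiplier (a standard sketch of the step the slide names, in the earlier notation):

\frac{\partial}{\partial \pi_m} \left[ \sum_n \sum_{m'} \alpha_{nm'} \log \pi_{m'} + \lambda \left( \sum_{m'} \pi_{m'} - 1 \right) \right] = \frac{\sum_n \alpha_{nm}}{\pi_m} + \lambda = 0

so \pi_m = -\frac{1}{\lambda} \sum_n \alpha_{nm}; summing over m and applying the constraint gives \lambda = -N (since the responsibilities sum to 1 over m), hence \hat{\pi}_m = \frac{1}{N} \sum_n \alpha_{nm}.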

Page 54

Convergence of EM Algorithm

Page 55

EM Demos

• Mixture of Gaussians demo

• Simple segmentation demo

Page 56

“Hard EM”

• Same as EM, except compute z* as the most likely values of the hidden variables

• K-means is an example

• Advantages
  – Simpler: can be applied when the full EM updates cannot be derived
  – Sometimes works better if you want to make hard predictions at the end

• But
  – Generally, the estimated pdf parameters are not as accurate as with EM
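In code, only the E-step of the earlier GMM sketch changes (illustrative; `alpha`, `N`, and `M` are the variables from the EM loop above):

[~, zstar] = max(alpha, [], 2);            % z*_n = argmax_m p(z_n = m | x_n, theta)
hard = zeros(N, M);
hard(sub2ind([N, M], (1:N)', zstar)) = 1;  % one-hot "responsibilities"
% The M-step then uses `hard` in place of `alpha`; with fixed, equal
% variances and uniform priors this reduces to k-means.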

Page 57

Missing Data Problems: Outliers

You want to train an algorithm to predict whether a photograph is attractive. You collect annotations from Mechanical Turk. Some annotators try to give accurate ratings, but others answer randomly.

Challenge: Determine which people to trust and the average rating by accurate annotators.

Photo: Jam343 (Flickr)

Annotator ratings: 10, 8, 9, 2, 8

Page 58

Next class

• MRFs and Graph-cut Segmentation

• Think about your final projects (if not done already)