Image as Markov Random Field and Applications
TRANSCRIPT
Talk Outline

- intro, applications
- MRF, labeling . . .
- how can it be computed at all?
- applications in segmentation: GraphCut, GrabCut, demos
Image as Markov Random Field and Applications¹
Tomáš Svoboda, [email protected]
Technical University in Prague, Center for Machine Perception
http://cmp.felk.cvut.cz
Last update: July 14, 2010

¹Please note that the lecture will be accompanied by several sketches and derivations on the blackboard and a few live-interactive demos in Matlab.
2/50 About this very lecture

A few notes before we start:

- MRF is a complicated topic
- this lecture is introductory
- some simplifications are made in order not to lose the whole picture
- the most important references are provided
- many accessible explanations exist on the web (Wikipedia . . . )
3/50 Markov Random Field – MRF

From Wikipedia²:

A Markov random field, Markov network or undirected graphical model is a graphical model in which a set of random variables have a Markov property described by an undirected graph.

A more formal definition, which we follow, can be found in the first chapter³ of the book [4].

²http://en.wikipedia.org/wiki/Markov_random_field
³freely available at: http://www.nlpr.ia.ac.cn/users/szli/MRF_Book/MRF_Book.html
Let us think about images:

- image intensities are the random variables
- the values depend only on their immediate spatial neighborhood, which is the Markov property
- images are organized on a regular grid, which can be seen as an undirected graph
4/50 Labeling for image analysis

Many image analysis and interpretation problems can be posed as labeling problems:

- Assign a label to each image pixel (or to features in general).
- Image intensity itself can be considered a label (think about palette images).

Sites

Let S index a discrete set of m sites,

S = {1, . . . , m}

A site could be:

- an individual pixel
- an image region
- a corner point, line segment, surface patch . . .
5/50 Labels and sites

A label is an event that may happen to a site.

Set of labels L. We will discuss discrete labels,

L = {l1, . . . , lM}

or shortly

L = {1, . . . , M}

What can a label be?

- an intensity value
- an object label
- in edge detection, a binary flag L = {edge, nonedge}
- . . .
6/50 Ordering of labels

Some labels can be ordered, some cannot.

Ordered labels can be used to measure the distance between labels.
7/50 The Labeling Problem

Assigning a label from the label set L to each of the sites in S.

Example: edge detection in an image

Assign a label fi from the set L = {edge, nonedge} to site i ∈ S, where the elements of S index the image pixels. The set

f = {f1, . . . , fm}

is called a labeling.
Unique labels

When each site is assigned a unique label, a labeling can be seen as a mapping from S to L,

f : S −→ L

A labeling is also called a coloring in mathematical programming.
8/50 How many possible labelings?

Assuming all m sites share the same label set L,

F = L × L × . . . × L (m times) = L^m

Imagine the image restoration problem [3]. There, m is the number of pixels in the image and |L| equals the number of intensity levels.

Many, many possible labelings. Usually only a few are good.
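To get a feel for the numbers, here is a back-of-the-envelope computation. The image size below (10 × 10) is made up for illustration; any realistic image makes the count astronomically larger still.

```python
# Number of possible labelings grows as |L|**m.
# Illustrative toy restoration problem: a 10x10 grayscale image.
num_sites = 10 * 10          # m: one site per pixel
num_labels = 256             # |L|: intensity levels 0..255
num_labelings = num_labels ** num_sites

print(num_labelings)             # 256**100, an enormous integer
print(len(str(num_labelings)))   # number of decimal digits: 241
```

Exhaustive search over 10^240 labelings is hopeless, which is why the rest of the lecture is about structured (approximate) minimization.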
9/50 Labeling problems in image analysis

- image restoration
- region segmentation
- edge detection
- object detection and recognition
- stereo
- . . .
10/50 Labeling with contextual analysis

In images, the site neighborhood matters.

The probability P(fi) does not depend only on the site itself but also on the labeling around it. Mathematically speaking, we must consider the conditional probability

P(fi | {fi′})

where {fi′} denotes the set of the other labels.

No context:

P(f) = ∏_{i∈S} P(fi)

Markov Random Field – MRF:

P(fi | fS−{i}) = P(fi | fNi)

where fNi stands for the labels at the sites neighboring i.
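The no-context case can be checked numerically: with independent sites the joint probability is just the product of the per-site marginals. The marginal values below are illustrative, not from the lecture.

```python
# Without context the labeling probability factorizes: P(f) = prod_i P(f_i).
# Toy example: three independent binary sites with the same marginal.
p_site = {0: 0.7, 1: 0.3}    # illustrative per-site label probabilities
f = (0, 1, 0)                # one particular labeling

p_f = 1.0
for label in f:
    p_f *= p_site[label]

print(p_f)  # 0.7 * 0.3 * 0.7 = 0.147
```

The MRF case replaces these independent marginals with conditionals on the neighborhood, which is exactly what makes the joint distribution harder to work with directly.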
11/50 MRF and Gibbs Random Fields

How do we specify an MRF: in terms of the conditional probabilities P(fi | fNi) or of the joint probability P(f)?

The Hammersley–Clifford theorem establishes the equivalence between an MRF and a Gibbs distribution.

A set of random variables F is said to be a Gibbs Random Field on S with respect to N iff its configurations obey a Gibbs distribution

P(f) = Z⁻¹ × e^(−U(f)/T)

where

Z = ∑_{f∈F} e^(−U(f)/T)

is a normalizing constant.

[Plot: P = e^(−U) as a function of the energy term U]
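A minimal sketch of the Gibbs distribution, assuming a tiny hypothetical configuration space with hand-picked energies (the names f1–f3 and the values are mine, purely for illustration):

```python
import math

# Gibbs distribution P(f) = Z**-1 * exp(-U(f)/T) over a toy set of
# labelings, each with a given (made-up) energy U(f).
energies = {"f1": 0.0, "f2": 1.0, "f3": 2.5}
T = 1.0

Z = sum(math.exp(-U / T) for U in energies.values())       # partition function
P = {f: math.exp(-U / T) / Z for f, U in energies.items()}

print(P)
# Probabilities sum to one, and lower energy means higher probability,
# which is why minimizing U(f) maximizes P(f).
```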
12/50 Gibbs distribution

P(f) = Z⁻¹ × e^(−U(f)/T)

- T is the temperature; T = 1 unless stated otherwise
- U(f) is the energy function

[Plot: P = e^(−U/T) as a function of the energy term U, for T = 0.5, 1.0, 2.0, 5.0]
13/50 Energy function

U(f) = ∑_{c∈C} Vc(f)

is a sum of clique potentials Vc(f) over all possible cliques C.

[Illustration from the book [4]]
14/50 Simple cliques, auto-models

Contextual constraints on pairs of labels:

U(f) = ∑_{i∈S} V1(fi) + ∑_{i∈S} ∑_{i′∈Ni} V2(fi, fi′)

This can be interpreted as

U(f) = Udata(f) + Usmooth(f)
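A minimal sketch of this auto-model energy on a small grid, assuming a quadratic data term and a Potts-style pair term on 4-neighbors. The function name, grid, and λ value are illustrative, not from the lecture demos.

```python
# U(f) = U_data(f) + U_smooth(f) on a 2D grid of labels.
# Data term: (f_i - I_i)**2; smoothness: lam per disagreeing 4-neighbor pair.

def energy(labels, observed, lam=1.0):
    """labels, observed: 2D lists of equal shape."""
    rows, cols = len(labels), len(labels[0])
    u_data = sum((labels[r][c] - observed[r][c]) ** 2
                 for r in range(rows) for c in range(cols))
    u_smooth = 0.0
    for r in range(rows):
        for c in range(cols):
            # count each right/down neighbor pair once
            if r + 1 < rows and labels[r][c] != labels[r + 1][c]:
                u_smooth += lam
            if c + 1 < cols and labels[r][c] != labels[r][c + 1]:
                u_smooth += lam
    return u_data + u_smooth

observed = [[0, 0], [0, 1]]
print(energy(observed, observed))          # perfect fit, pays only smoothness
print(energy([[0, 0], [0, 0]], observed))  # smooth labeling, pays one data unit
```

Note the trade-off already visible on this 2 × 2 toy: copying the observation costs 2.0 in smoothness, while the all-zero labeling costs 1.0 in data term and nothing in smoothness.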
15/50 Neighborhood and cliques on regular lattices

[Illustration from the book [4]]
16/50 Energy minimization

What is to be computed? The Gibbs distribution:

P(f) = Z⁻¹ × e^(−U(f)/T)

Recall that f is the desired labeling,

f : S −→ L

In the MAP formulation we seek the labeling that maximizes P(f).

The best labeling minimizes the energy U(f).

This is a combinatorial problem. The next explanation follows mainly [2].
17/50 Energy – data term and smoothness term

U(f) = Udata(f) + Usmooth(f)

The data term measures how well a label fp fits the particular pixel (site) p. Globally,

Udata(f) = ∑_{p∈P} Dp(fp)

Example: in image restoration, Dp(fp) = (fp − Ip)², where Ip is the observed intensity and fp is the label (assigned intensity).

Smoothness term

Expresses the context. Setting a proper smoothness term is much trickier:

- smooth, but not everywhere (think about object boundaries)
- discontinuity preserving
18/50 Interacting pixels

We consider the energy

U(f) = ∑_{p∈P} Dp(fp) + ∑_{{p,q}∈N} Vp,q(fp, fq)

where N is the set of interacting pixels, typically adjacent ones. Dp is assumed to be nonnegative.

Interaction of adjacent pixels may have a long-range impact!

Vp,q is called the interaction penalty.
19/50 (Reasonable) interaction penalties

For labels α, β, γ ∈ L:

V(α, β) = 0 ⇔ α = β ,       (1)
V(α, β) = V(β, α) ≥ 0 ,      (2)
V(α, β) ≤ V(α, γ) + V(γ, β)  (3)

The penalty is a metric if all three conditions hold, and a semimetric if only (1) and (2) are satisfied.

Examples of discontinuity-preserving penalties:

- truncated quadratic V(α, β) = min(K, (α − β)²)
- truncated absolute distance V(α, β) = min(K, |α − β|)
- Potts model V(α, β) = K·T(α ≠ β), where T(·) = 1 if the argument is true, otherwise 0
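The three penalties above are one-liners; a quick sketch (with an arbitrary truncation constant K) also checks the axioms on a few label values:

```python
# The three discontinuity-preserving penalties from the slide.
K = 5.0  # truncation constant, illustrative value

def trunc_quadratic(a, b):
    return min(K, (a - b) ** 2)

def trunc_absolute(a, b):
    return min(K, abs(a - b))

def potts(a, b):
    return K * (1 if a != b else 0)

# Spot-check axioms (1) and (2) on a few labels:
for V in (trunc_quadratic, trunc_absolute, potts):
    assert V(3, 3) == 0            # V(a,b) = 0 iff a = b
    assert V(1, 4) == V(4, 1) >= 0 # symmetry, nonnegativity

# Truncated absolute distance and Potts also satisfy the triangle
# inequality (metrics); truncated quadratic in general does not
# (e.g. V(0,1) + V(1,2) = 2 < 4 = V(0,2)), so it is only a semimetric.
print(trunc_quadratic(0, 10), trunc_absolute(0, 10), potts(0, 10))
```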
20/50 Towards the best labeling (minimum energy)

Catch: finding the global minimum is NP-hard even for the simple Potts model.

→ a local minimum is sought.

Problem: if the solution is poor, either

- the energy function was chosen poorly, or
- the local minimum is far from the global one.
21/50 Local minimum

A labeling f is a local minimum of the energy U if

U(f) ≤ U(f′)

for any f′ near to f. In the case of discrete labelings, near means within a single move of f.

Many local minimization methods use standard moves, where only one pixel (site) may change its label at a time.

Example: greedy optimization

For each pixel, the label which gives the largest decrease of the energy is chosen.
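A minimal sketch of this greedy scheme with standard moves (the structure of ICM-style relaxation), under the toy energy from before: quadratic data term plus Potts smoothness. All names, the grid, and the λ value are illustrative, not the lecture's Matlab demo.

```python
# Greedy local minimization: sweep the pixels, and at each one pick the
# label with the lowest local energy, all other labels held fixed.

LABELS = (0, 1)
LAM = 0.5  # Potts weight, illustrative

def local_energy(labels, observed, r, c, l):
    """Energy terms that involve pixel (r, c) if it takes label l."""
    rows, cols = len(labels), len(labels[0])
    e = (l - observed[r][c]) ** 2
    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        rr, cc = r + dr, c + dc
        if 0 <= rr < rows and 0 <= cc < cols and labels[rr][cc] != l:
            e += LAM
    return e

def icm(observed, sweeps=5):
    labels = [row[:] for row in observed]   # initialize with the observation
    for _ in range(sweeps):
        changed = False
        for r in range(len(labels)):
            for c in range(len(labels[0])):
                best = min(LABELS,
                           key=lambda l: local_energy(labels, observed, r, c, l))
                if best != labels[r][c]:
                    labels[r][c] = best
                    changed = True
        if not changed:
            break                           # local minimum reached
    return labels

noisy = [[0, 0, 0], [0, 1, 0], [0, 0, 0]]   # one "noisy" pixel
print(icm(noisy))                           # the lone pixel gets smoothed away
```

Each move only ever decreases the energy, so the sweep terminates in a local minimum; nothing prevents that minimum from being far from the global one.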
22/50 Fast approximate energy minimization via graph cuts

We only sketch the main ideas from the seminal work [2].⁶

Allow more than just one label change at a time:⁷ the α-β swap and the α-expansion moves.

⁶A freely available implementation exists. Many difficult problems in computer vision were solved by using this method and implementation.
⁷image from [2]
26/50 α-β swap

- How many iterations in each cycle?
- A cycle is successful if a strictly better labeling is found in any iteration.
- Cycling stops after the first unsuccessful cycle.
27/50 α-β swap

The best α-β swap is found by constructing a graph and finding a min-cut.

- min-cut is equivalent to max-flow (the Ford–Fulkerson theorem)
- max-flow between terminals is a standard problem in combinatorial optimization
- algorithms with low-order polynomial complexity exist
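To make the min-cut/max-flow machinery concrete, here is a compact Edmonds–Karp max-flow (BFS augmenting paths) run on a tiny two-terminal graph. The graph below is an arbitrary toy example of mine, not a real swap-move segmentation graph.

```python
from collections import deque

def max_flow(capacity, s, t):
    """Edmonds-Karp: repeatedly augment along shortest residual paths."""
    n = len(capacity)
    flow = [[0] * n for _ in range(n)]
    total = 0
    while True:
        # BFS in the residual graph from s towards t
        parent = [-1] * n
        parent[s] = s
        q = deque([s])
        while q and parent[t] == -1:
            u = q.popleft()
            for v in range(n):
                if parent[v] == -1 and capacity[u][v] - flow[u][v] > 0:
                    parent[v] = u
                    q.append(v)
        if parent[t] == -1:
            return total              # no augmenting path: flow is maximal
        # bottleneck residual capacity along the found path
        bottleneck = float("inf")
        v = t
        while v != s:
            u = parent[v]
            bottleneck = min(bottleneck, capacity[u][v] - flow[u][v])
            v = u
        # push the bottleneck along the path (and record reverse flow)
        v = t
        while v != s:
            u = parent[v]
            flow[u][v] += bottleneck
            flow[v][u] -= bottleneck
            v = u
        total += bottleneck

# nodes: 0 = source terminal, 3 = sink terminal
cap = [[0, 3, 2, 0],
       [0, 0, 1, 2],
       [0, 0, 0, 2],
       [0, 0, 0, 0]]
print(max_flow(cap, 0, 3))  # value of the max-flow = capacity of the min-cut
```

By Ford–Fulkerson duality, the returned value equals the capacity of the minimum s-t cut, which is exactly the quantity a swap move minimizes over its specially constructed graph.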
34/50 Segmentation with seeds – Interactive GraphCuts

Idea: mark a few pixels that truly belong to the object or to the background and then refine (grow) the segmentation by using soft constraints.
- data term
- boundary penalties and/or pixel interactions
- how to find the optimal boundary between object and background
36/50 Edges, cuts, segmentation
Explained on the blackboard.
See [1] for details.
37/50 What data term?

Udata(obj, bck) = ∑_{p∈obj} D(obj) + ∑_{p∈bck} D(bck)

The better the match between a pixel and the “obj” or “bck” model, the lower the energy (penalty).

How to model it?

(For simplicity, you may think about the opposite case: the better the match, the higher the value.)

- background and foreground (object) pixels
- intensity or color distributions
- histograms
- parametric models: GMM – Gaussian Mixture Model
38/50 Image intensities – 1D GMM

[Plot: relative frequency of intensities, intensity 0–255 on the x-axis]¹⁰

¹⁰Demo codes from [6]
39/50 Image intensities – 1D GMM

[Plot: individual Gaussians over the intensity histogram]
40/50 Image intensities – 1D GMM

[Plot: the resulting Gaussian mixture over the intensity histogram]
41/50 2D GMM

[Plot: 2D data]
42/50 2D GMM

[Plot: 2D data]
43/50 2D GMM

[Plot: 2D data]
44/50 Data term by GMM – math summary

p(x|fi) = ∑_{k=1}^{K} w_k^{fi} (2π)^{−d/2} |Σ_k^{fi}|^{−1/2} exp( −(1/2)(x − μ_k^{fi})ᵀ (Σ_k^{fi})⁻¹ (x − μ_k^{fi}) ),

where d is the dimension.

- K is the number of Gaussians. User defined.
- for each label L = {obj, bck}, different w_k, μ_k, Σ_k are estimated from the data (seeds)
- x is the pixel value; it can be an intensity, a color vector, . . .

Data term

D(obj) = − ln p(x|obj)
D(bck) = − ln p(x|bck)

[Plot: the data term − ln p as a function of the probability p]
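A minimal 1D sketch of this data term: evaluate the mixture density and take the negative log. The mixture parameters below are made up for illustration; in the lecture they are estimated from the seed pixels.

```python
import math

# Data term from a 1D GMM: D(obj) = -ln p(x | obj).

def gmm_pdf(x, weights, means, sigmas):
    """Mixture of 1D Gaussians: sum_k w_k N(x; mu_k, sigma_k**2)."""
    return sum(w * math.exp(-0.5 * ((x - m) / s) ** 2) / (s * math.sqrt(2 * math.pi))
               for w, m, s in zip(weights, means, sigmas))

def data_term(x, weights, means, sigmas):
    return -math.log(gmm_pdf(x, weights, means, sigmas))

# Hypothetical object/background intensity models (illustrative parameters):
obj = dict(weights=[0.6, 0.4], means=[60.0, 90.0], sigmas=[10.0, 15.0])
bck = dict(weights=[1.0], means=[200.0], sigmas=[20.0])

x = 65.0  # a pixel intensity near the object mode
print(data_term(x, **obj), data_term(x, **bck))
# The pixel matches the object model better, so D(obj) < D(bck):
# a better match means a lower penalty, exactly as the slide states.
```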
45/50 Results – data term only

See the live demo.¹¹

¹¹Demo codes courtesy of V. Franc and A. Shekhovtsov
46/50 What boundary (interaction) term?

The Potts model: V(p, q) = λT(p ≠ q), where T(·) = 1 if the argument is true, otherwise 0.

The effect of λ: see the live demo.

Results for data and interaction terms
47/50 GrabCut – going beyond the seeds

The main idea: iterate the graph cut and refine the data terms in each iteration. Stop when the energy (penalty) no longer decreases. [5]

The practical motivation: further reduce the user interaction.¹²

¹²images from [5]
48/50 GrabCut – algorithm

Init: specify the background pixels TB. TF = ∅; TU = T̄B

Iterative minimization:

1. assign GMM components to the pixels in TU
2. learn the GMM parameters from the data
3. segment by using min-cut (the graph cut algorithm)
4. repeat from step 1 until convergence

Optional: edit some pixels and repeat the min-cut.
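A drastically simplified sketch of the iterate-and-refine loop above, for 1D intensities. Only the structure is faithful to the slide: the real algorithm learns multi-component GMMs and segments with min-cut, whereas here each class is a single Gaussian and the "segmentation" step is a per-pixel data-term comparison. All names and data are mine, for illustration only.

```python
import math

def fit_gaussian(xs):
    """Step 2 stand-in: mean/variance of the current class members."""
    mu = sum(xs) / len(xs)
    var = sum((x - mu) ** 2 for x in xs) / len(xs)
    return mu, max(var, 1e-6)          # variance floor for degenerate classes

def neg_log_gauss(x, mu, var):
    """Per-pixel data term D = -ln N(x; mu, var)."""
    return 0.5 * math.log(2 * math.pi * var) + (x - mu) ** 2 / (2 * var)

def grabcut_1d(pixels, init_fg, iters=5):
    fg = set(init_fg)                  # indices currently labeled object
    for _ in range(iters):
        fg_mu, fg_var = fit_gaussian([pixels[i] for i in fg])
        bg_mu, bg_var = fit_gaussian([pixels[i] for i in range(len(pixels))
                                      if i not in fg])
        # Step 3 stand-in: per-pixel decision instead of a min-cut
        new_fg = {i for i, x in enumerate(pixels)
                  if neg_log_gauss(x, fg_mu, fg_var) < neg_log_gauss(x, bg_mu, bg_var)}
        if new_fg == fg:
            break                      # labeling stable: stop iterating
        fg = new_fg
    return fg

pixels = [10, 12, 11, 200, 205, 198, 13, 202]
print(sorted(grabcut_1d(pixels, init_fg={3, 4})))
# starting from two seeds, all four bright pixels end up as foreground
```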
49/50 References

[1] Yuri Boykov and Marie-Pierre Jolly. Interactive graph cuts for optimal boundary & region segmentation of objects in N-D images. In Proceedings of the International Conference on Computer Vision (ICCV), 2001.
[2] Yuri Boykov, Olga Veksler, and Ramin Zabih. Fast approximate energy minimization via graph cuts. IEEE Transactions on Pattern Analysis and Machine Intelligence, 23(11):1222–1239, November 2001.
[3] S. Geman and D. Geman. Stochastic relaxation, Gibbs distributions, and the Bayesian restoration of images. IEEE Transactions on Pattern Analysis and Machine Intelligence, 6(6):721–741, 1984.
[4] Stan Z. Li. Markov Random Field Modeling in Image Analysis. Springer, 2009.
[5] Carsten Rother, Vladimir Kolmogorov, and Andrew Blake. GrabCut: Interactive foreground extraction using iterated graph cuts. ACM Transactions on Graphics (SIGGRAPH’04), 2004.
[6] Tomáš Svoboda, Jan Kybic, and Václav Hlaváč. Image Processing, Analysis and Machine Vision: A MATLAB Companion. Thomson, 2007. Accompanying www site http://visionbook.felk.cvut.cz.