Hidden Markov Models, Tunghai University, Fall 2005 (transcript)
Hidden Markov Models
Tunghai University
Fall 2005
Simple Model - Markov Chains
• Markov Property: The state of the system at time t+1 only depends on the state of the system at time t
X1 → X2 → X3 → X4 → X5

P[X_{t+1} = x_{t+1} | X_t = x_t, X_{t-1} = x_{t-1}, …, X_1 = x_1, X_0 = x_0] = P[X_{t+1} = x_{t+1} | X_t = x_t]
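Operationally, the Markov property means a simulator needs only the current state to draw the next one. A minimal sketch with a hypothetical two-state chain (the matrix values below are for illustration only, not from the slides):

```python
import random

def sample_chain(P, states, x0, steps, seed=0):
    """Sample a Markov chain: the next state depends only on the current one."""
    rng = random.Random(seed)
    x, path = x0, [x0]
    for _ in range(steps):
        # The entire history is irrelevant; only the current state x is used.
        x = rng.choices(states, weights=P[x])[0]
        path.append(x)
    return path

# Hypothetical two-state chain (states 0 and 1), for illustration
P = {0: [0.7, 0.3], 1: [0.4, 0.6]}
print(sample_chain(P, [0, 1], x0=0, steps=10))
```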
Markov Chains
Stationarity Assumption
• Probabilities are independent of t when the process is "stationary". So,

P[X_{t+1} = x_j | X_t = x_i] = p_ij

This means that if the system is in state i, the probability that the system will transition to state j is p_ij, no matter what the value of t is.
Simple Example
Weather:
– raining today → rain tomorrow: p_rr = 0.4
– raining today → no rain tomorrow: p_rn = 0.6
– not raining today → rain tomorrow: p_nr = 0.2
– not raining today → no rain tomorrow: p_nn = 0.8
Simple Example
Transition Matrix for Example
• Note that rows sum to 1
• Such a matrix is called a Stochastic Matrix
• If the rows of a matrix and the columns of a matrix all sum to 1, we have a Doubly Stochastic Matrix
P = | 0.4  0.6 |
    | 0.2  0.8 |
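The stochastic-matrix property can be checked numerically; a quick sketch with NumPy (the use of NumPy is an assumption, the matrix values are from the slide):

```python
import numpy as np

# Weather transition matrix, rows/columns ordered (rain, no rain)
P = np.array([[0.4, 0.6],
              [0.2, 0.8]])

print(P.sum(axis=1))  # each row sums to 1: a stochastic matrix
print(P.sum(axis=0))  # columns do not all sum to 1, so P is not doubly stochastic
```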
Gambler’s Example
– At each play we have the following:
• Gambler wins $1 with probability p
• Gambler loses $1 with probability 1-p
– Game ends when gambler goes broke, or gains a fortune of $100
– Both $0 and $100 are absorbing states
States: 0, 1, 2, …, N-1, N (with N = $100); start at $10.
From each intermediate state, move up with probability p and down with probability 1-p.
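The absorption probability can be estimated by simulation. A sketch with the goal scaled down from $100 to $20 (and a fixed seed and trial count) so it runs quickly; these parameters are illustrative assumptions:

```python
import random

def gamblers_ruin(p, start, goal, trials=5000, seed=0):
    """Estimate P(reach `goal` before going broke) by Monte Carlo simulation."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        money = start
        while 0 < money < goal:          # 0 and `goal` are absorbing states
            money += 1 if rng.random() < p else -1
        wins += (money == goal)
    return wins / trials

# For a fair game (p = 0.5), the classical answer is start/goal.
est = gamblers_ruin(0.5, start=2, goal=20)
print(est)  # close to 2/20 = 0.1
```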
Coke vs. Pepsi
Given that a person’s last cola purchase was Coke, there is a 90% chance that her next cola purchase will also be Coke.
If a person’s last cola purchase was Pepsi, there is an 80% chance that her next cola purchase will also be Pepsi.
Transition diagram: Coke → Coke 0.9, Coke → Pepsi 0.1; Pepsi → Pepsi 0.8, Pepsi → Coke 0.2.
Coke vs. Pepsi
Given that a person is currently a Pepsi purchaser, what is the probability that she will purchase Coke two purchases from now?
The transition matrix is (corresponding to one purchase ahead):

P = | 0.9  0.1 |
    | 0.2  0.8 |

P² = | 0.9  0.1 |²  =  | 0.83  0.17 |
     | 0.2  0.8 |      | 0.34  0.66 |

So the probability that a current Pepsi purchaser buys Coke two purchases from now is (P²)_{Pepsi,Coke} = 0.34.
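The matrix squaring can be checked numerically; a small NumPy sketch (NumPy itself is an assumption, not part of the slides):

```python
import numpy as np

# States ordered (Coke, Pepsi)
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])

P2 = P @ P
print(P2)
# P(Coke two purchases from now | Pepsi now) = (P^2)[Pepsi, Coke]
print(round(P2[1, 0], 2))  # 0.34
```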
Coke vs. Pepsi
Given that a person is currently a Coke drinker, what is the probability that she will purchase Pepsi three purchases from now?
P³ = | 0.83  0.17 | · | 0.9  0.1 |  =  | 0.781  0.219 |
     | 0.34  0.66 |   | 0.2  0.8 |     | 0.438  0.562 |

So the probability that a current Coke drinker purchases Pepsi three purchases from now is (P³)_{Coke,Pepsi} = 0.219.
Coke vs. Pepsi
Assume each person makes one cola purchase per week. Suppose 60% of all people now drink Coke, and 40% drink Pepsi.
What fraction of people will be drinking Coke three weeks from now?
Let (Q0, Q1) = (0.6, 0.4) be the initial probabilities, regarding Coke as state 0 and Pepsi as state 1. With

P = | 0.9  0.1 |
    | 0.2  0.8 |

we want to find P(X3 = 0):

P(X3 = 0) = Σ_i Qi · p_{i0}^(3) = Q0 · p_{00}^(3) + Q1 · p_{10}^(3) = 0.6 × 0.781 + 0.4 × 0.438 = 0.6438
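The same computation, propagating the initial distribution through three steps of the chain (a NumPy sketch; NumPy is an assumption):

```python
import numpy as np

P = np.array([[0.9, 0.1],    # states ordered (Coke = 0, Pepsi = 1)
              [0.2, 0.8]])
Q = np.array([0.6, 0.4])     # initial fractions of Coke and Pepsi drinkers

# Distribution three weeks from now: Q P^3
Q3 = Q @ np.linalg.matrix_power(P, 3)
print(round(Q3[0], 4))  # fraction drinking Coke: 0.6438
```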
Hidden Markov Models - HMM
H1 → H2 → … → HL-1 → HL    (hidden variables Hi)
 ↓    ↓         ↓      ↓
X1   X2   …   XL-1    XL    (observed data Xi)
Coin-Tossing Example
An HMM for L tosses: hidden states Hi ∈ {Fair, Loaded}, observations Xi ∈ {Head, Tail}.

H1 → H2 → … → HL-1 → HL    (Fair/Loaded)
 ↓    ↓         ↓      ↓
X1   X2   …   XL-1    XL    (Head/Tail)

Start: Fair 1/2, Loaded 1/2
Transitions: Fair → Fair 0.9, Fair → Loaded 0.1; Loaded → Loaded 0.9, Loaded → Fair 0.1
Emissions: Fair coin: head 1/2, tail 1/2; Loaded coin: head 1/4, tail 3/4
Coin-Tossing Example
Query: what is the most likely sequence of values in the H-nodes that generates the given data?
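This query is answered by the Viterbi algorithm. A minimal NumPy sketch, using the 0.9/0.1 transition probabilities from the slide; the loaded coin's emission bias (head 1/4, tail 3/4) is my reading of the figure and is an assumption:

```python
import numpy as np

# Fair/loaded HMM; states 0 = fair, 1 = loaded; observations 0 = head, 1 = tail.
# The loaded coin's bias (head 1/4, tail 3/4) is an assumption read off the figure.
start = np.array([0.5, 0.5])
trans = np.array([[0.9, 0.1],
                  [0.1, 0.9]])
emit = np.array([[0.50, 0.50],
                 [0.25, 0.75]])

def viterbi(obs):
    """Most likely hidden-state sequence for `obs`, computed in log space."""
    logp = np.log(start) + np.log(emit[:, obs[0]])
    back = []
    for o in obs[1:]:
        cand = logp[:, None] + np.log(trans)   # cand[i, j]: best score ending i -> j
        back.append(cand.argmax(axis=0))       # best predecessor for each state j
        logp = cand.max(axis=0) + np.log(emit[:, o])
    path = [int(logp.argmax())]                # best final state, then backtrack
    for bp in reversed(back):
        path.append(int(bp[path[-1]]))
    return path[::-1]

# A long run of tails is best explained by the loaded coin throughout.
print(viterbi([1] * 10))
```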
Coin-Tossing Example
Query: what are the probabilities of fair/loaded coins given the set of outcomes {x1,…,xL}?
Seeing the set of outcomes {x1,…,xL}, compute p(loaded | x1,…,xL) for each coin toss:
1. Compute the posterior belief in Hi (for a specific i) given the evidence {x1,…,xL}, for each of Hi's values hi; namely, compute p(hi | x1,…,xL).
2. Do the same computation for every Hi, but without repeating the first task L times.
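The second task is exactly what the forward-backward algorithm computes, in one forward and one backward pass. A sketch; the loaded coin's emission probabilities (head 1/4, tail 3/4) are my reading of the figure and are an assumption:

```python
import numpy as np

# Fair/loaded HMM; states 0 = fair, 1 = loaded; observations 0 = head, 1 = tail.
start = np.array([0.5, 0.5])
trans = np.array([[0.9, 0.1],
                  [0.1, 0.9]])
emit = np.array([[0.50, 0.50],   # fair coin
                 [0.25, 0.75]])  # loaded coin (assumed bias, read off the figure)

def posteriors(obs):
    """p(H_i = h | x_1..x_L) for every position i, via forward-backward."""
    L, S = len(obs), len(start)
    fwd = np.zeros((L, S))               # fwd[t, s] = p(x_1..x_t, H_t = s)
    fwd[0] = start * emit[:, obs[0]]
    for t in range(1, L):
        fwd[t] = (fwd[t - 1] @ trans) * emit[:, obs[t]]
    bwd = np.ones((L, S))                # bwd[t, s] = p(x_{t+1}..x_L | H_t = s)
    for t in range(L - 2, -1, -1):
        bwd[t] = trans @ (emit[:, obs[t + 1]] * bwd[t + 1])
    post = fwd * bwd                     # proportional to p(H_t = s | x_1..x_L)
    return post / post.sum(axis=1, keepdims=True)

p_loaded = posteriors([1, 1, 1, 1, 1, 0, 0, 0])[:, 1]
print(np.round(p_loaded, 3))  # p(loaded | x_1..x_L) at each toss
```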
C-G Islands Example
C-G islands: DNA regions that are very rich in C and G.

[Figure: two Markov chains over the nucleotides {A, C, G, T}, one for regular DNA and one for a C-G island, linked by "change" transitions with probabilities p and q; within-chain transition labels visible in the figure include (1-p)/4, p/6, p/3, q/4, (1-q)/6, and (1-q)/3.]
C-G Islands Example
[Figure: the two A/C/G/T chains with their "change" transitions, now drawn as an HMM:]

H1 → H2 → … → HL-1 → HL    (hidden: C-G island?)
 ↓    ↓         ↓      ↓
X1   X2   …   XL-1    XL    (observed: A/C/G/T)