Hidden Markov Model
Nghia Bui, Nov 2016
Andrei Markov (1856-1922)
The weather problem
• I talk to Jane every day by telephone. Each day she tells me what she did: either "walk", "shop", or "clean", only one!
• I know that on any day the weather in her city is either "sunny" or "rainy", only one!
• But she never tells me the actual weather on those days, or how it affected her actions.
• So I have to figure it out by myself: with an HMM!
An HMM is just a set of 3 rules
• If today the weather is $s_i$, then tomorrow it will be $s_j$ with probability $a_{ij}$.
• When the weather is $s_i$, Jane will do action $v_k$ with probability $b_i(k)$.
• On the first day, the weather is $s_i$ with probability $\pi_i$.
https://en.wikipedia.org/wiki/Hidden_Markov_model
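The three rules are exactly a transition matrix $A$, an emission matrix $B$, and an initial distribution $\pi$. A minimal sketch in NumPy; the probability values below are made up for illustration, not numbers from the slides:

```python
import numpy as np

states = ["sunny", "rainy"]          # hidden weather states
actions = ["walk", "shop", "clean"]  # Jane's observable actions

# Hypothetical parameters for illustration:
A = np.array([[0.7, 0.3],            # A[i, j]: P(tomorrow = j | today = i)
              [0.4, 0.6]])
B = np.array([[0.6, 0.3, 0.1],       # B[i, k]: P(action = k | weather = i)
              [0.1, 0.4, 0.5]])
pi = np.array([0.6, 0.4])            # pi[i]: P(first day's weather = i)

# Each rule is a probability distribution, so every row must sum to 1.
assert np.allclose(A.sum(axis=1), 1)
assert np.allclose(B.sum(axis=1), 1)
assert np.isclose(pi.sum(), 1)
```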
What is hidden?
• The weather states {"sunny", "rainy"} are not observable: they are hidden.
• The actions {"walk", "shop", "clean"} are observed as an index sequence $O = o_1 o_2 \ldots o_T$, where $o_t \in \{1, 2, 3\}$ is the index of the action Jane did on day $t$.
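What "hidden" means can be seen by simulating such a model: the weather path is drawn internally, but only the action sequence is emitted. A sketch with hypothetical parameters:

```python
import numpy as np

rng = np.random.default_rng(0)
states = ["sunny", "rainy"]
actions = ["walk", "shop", "clean"]
A = np.array([[0.7, 0.3], [0.4, 0.6]])            # hypothetical transitions
B = np.array([[0.6, 0.3, 0.1], [0.1, 0.4, 0.5]])  # hypothetical emissions
pi = np.array([0.6, 0.4])                          # hypothetical initial dist.

def simulate(T):
    """Run the model for T days; only `observed` would be visible to us."""
    q = rng.choice(2, p=pi)              # first day's weather (rule 3)
    hidden, observed = [], []
    for _ in range(T):
        hidden.append(states[q])
        observed.append(actions[rng.choice(3, p=B[q])])  # emission (rule 2)
        q = rng.choice(2, p=A[q])        # tomorrow's weather (rule 1)
    return hidden, observed

hidden, observed = simulate(5)
```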
Two common tasks
1. Given a model $\lambda = (A, B, \pi)$ and a sequence of action indexes $O$, calculate the probability $P(O \mid \lambda)$ that the model generates the sequence. The forward algorithm.
2. Given a sequence $O$, build a model $\lambda$ so that $P(O \mid \lambda)$ is maximum. The Baum-Welch algorithm.
The forward algorithm
• Let $\alpha_t(i)$ be the probability of generating the sequence $o_1 \ldots o_t$ and ending up at state $s_i$: $\alpha_t(i) = P(o_1 \ldots o_t,\, q_t = s_i \mid \lambda)$
• Using dynamic programming we have: $\alpha_1(i) = \pi_i\, b_i(o_1)$ and $\alpha_{t+1}(j) = \Big[\sum_i \alpha_t(i)\, a_{ij}\Big]\, b_j(o_{t+1})$
• And the result: $P(O \mid \lambda) = \sum_i \alpha_T(i)$
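The recurrence translates directly into code. A sketch of the forward algorithm in NumPy, with hypothetical parameters for the weather example:

```python
import numpy as np

A = np.array([[0.7, 0.3], [0.4, 0.6]])            # hypothetical transitions
B = np.array([[0.6, 0.3, 0.1], [0.1, 0.4, 0.5]])  # hypothetical emissions
pi = np.array([0.6, 0.4])                          # hypothetical initial dist.

def forward(A, B, pi, obs):
    """alpha[t, i] = P(obs[0..t], state at time t is i | model)."""
    T, N = len(obs), len(pi)
    alpha = np.zeros((T, N))
    alpha[0] = pi * B[:, obs[0]]                   # base case: the first day
    for t in range(1, T):
        # sum over yesterday's state, then emit today's action
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
    return alpha

obs = [0, 1, 2]                 # indexes of "walk", "shop", "clean"
alpha = forward(A, B, pi, obs)
prob = alpha[-1].sum()          # P(O | model)
```

The loop costs $O(TN^2)$, whereas brute-force enumeration of all hidden state paths is exponential in $T$.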
The Baum-Welch algorithm
Main idea: initialize with a random model and make it better incrementally.
• Given a model $\lambda$, imagine using it to generate many sequences, but consider only the ones that emit $O$.
• Nothing is hidden in these sequences: the state paths are known! Now we simply use them to estimate a better $\lambda$.
Estimate $\lambda = (A, B, \pi)$
• To estimate $a_{ij}$: count the transitions from $s_i$ to $s_j$, and the transitions from $s_i$ to all states.
• To estimate $b_i(k)$: count the appearances of $s_i$ that emit action index $k$, and also count all the appearances of $s_i$.
• To estimate $\pi_i$: count the appearances of $s_i$ at the first element of all sequences, and count the number of all sequences too.
• But to count all of the things above, we need… the forward and backward variables.
Forward and backward variables
• Using the forward algorithm we have $\alpha_t(i)$.
• Using the backward algorithm we have $\beta_t(i)$: the probability of generating the rest of the sequence starting from tomorrow, given the state of today, $\beta_t(i) = P(o_{t+1} \ldots o_T \mid q_t = s_i,\, \lambda)$. Dynamic programming is used again: $\beta_T(i) = 1$ and $\beta_t(i) = \sum_j a_{ij}\, b_j(o_{t+1})\, \beta_{t+1}(j)$
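A sketch of the backward recurrence, with the same hypothetical parameters as the weather example. As a sanity check, $\sum_i \pi_i\, b_i(o_1)\, \beta_1(i)$ must equal the $P(O \mid \lambda)$ that the forward algorithm computes:

```python
import numpy as np

A = np.array([[0.7, 0.3], [0.4, 0.6]])            # hypothetical transitions
B = np.array([[0.6, 0.3, 0.1], [0.1, 0.4, 0.5]])  # hypothetical emissions
pi = np.array([0.6, 0.4])                          # hypothetical initial dist.

def backward(A, B, obs):
    """beta[t, i] = P(obs[t+1:] | state at time t is i)."""
    T, N = len(obs), A.shape[0]
    beta = np.zeros((T, N))
    beta[T - 1] = 1.0               # nothing left to emit after the last day
    for t in range(T - 2, -1, -1):
        # step to tomorrow's state, emit tomorrow's action, then continue
        beta[t] = A @ (B[:, obs[t + 1]] * beta[t + 1])
    return beta

obs = [0, 1, 2]
beta = backward(A, B, obs)
p_O = (pi * B[:, obs[0]] * beta[0]).sum()   # equals the forward result
```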
Estimate $a_{ij}$
• Count the transitions from $s_i$ to $s_j$ at time $t$: $\xi_t(i, j) = \dfrac{\alpha_t(i)\, a_{ij}\, b_j(o_{t+1})\, \beta_{t+1}(j)}{P(O \mid \lambda)}$
• Thus: $\bar{a}_{ij} = \dfrac{\sum_{t=1}^{T-1} \xi_t(i, j)}{\sum_{t=1}^{T-1} \sum_{j'} \xi_t(i, j')}$
Estimate $b_i(k)$
• Count the appearances of state $s_i$ at time $t$: $\gamma_t(i) = \dfrac{\alpha_t(i)\, \beta_t(i)}{P(O \mid \lambda)}$
• Thus: $\bar{b}_i(k) = \dfrac{\sum_{t:\, o_t = k} \gamma_t(i)}{\sum_{t=1}^{T} \gamma_t(i)}$
Estimate $\pi_i$
• Count the appearances of $s_i$ at the first element of each sequence: $\gamma_1(i)$.
• Count the number of all sequences: call it $R$.
• Thus: $\bar{\pi}_i = \frac{1}{R} \sum_{r=1}^{R} \gamma_1^{(r)}(i)$; with a single observed sequence, simply $\bar{\pi}_i = \gamma_1(i)$.
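Putting the three estimates together gives one full Baum-Welch re-estimation step. A self-contained sketch for a single observed sequence (so $\bar{\pi}_i = \gamma_1(i)$); the model parameters and the observation sequence are hypothetical:

```python
import numpy as np

A = np.array([[0.7, 0.3], [0.4, 0.6]])            # current (e.g. random) model
B = np.array([[0.6, 0.3, 0.1], [0.1, 0.4, 0.5]])
pi = np.array([0.6, 0.4])
obs = np.array([0, 1, 2, 0, 0, 2, 1])             # observed action indexes
T, N = len(obs), len(pi)

# Forward and backward variables.
alpha = np.zeros((T, N))
alpha[0] = pi * B[:, obs[0]]
for t in range(1, T):
    alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
beta = np.zeros((T, N))
beta[T - 1] = 1.0
for t in range(T - 2, -1, -1):
    beta[t] = A @ (B[:, obs[t + 1]] * beta[t + 1])
p_O = alpha[-1].sum()

# gamma[t, i]: prob. of being in state i at time t, given the sequence.
gamma = alpha * beta / p_O
# xi[t, i, j]: prob. of the transition i -> j at time t, given the sequence.
xi = (alpha[:-1, :, None] * A[None, :, :]
      * (B[:, obs[1:]].T * beta[1:])[:, None, :]) / p_O

new_pi = gamma[0]                                          # first-day counts
new_A = xi.sum(axis=0) / gamma[:-1].sum(axis=0)[:, None]   # transition counts
new_B = np.zeros_like(B)
for k in range(B.shape[1]):                                # emission counts
    new_B[:, k] = gamma[obs == k].sum(axis=0) / gamma.sum(axis=0)
```

Iterating this step never decreases $P(O \mid \lambda)$ (Baum-Welch is an EM algorithm), so the random initial model improves monotonically.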