The Viterbi Algorithm
• Application of Dynamic Programming - the Principle of Optimality
• Search of Citation Index - 213 references since 1998
• Applications
– Telecommunications
• Convolutional codes, Trellis codes
• Inter-symbol interference in digital transmission
• Continuous phase transmission
• Magnetic recording, Partial Response signaling
– Diverse others
• Image restoration
• Rainfall prediction
• Gene sequencing
• Character recognition
Milestones
• Viterbi (1967) decoding convolutional codes
• Omura (1968) VA optimal
• Kobayashi (1971) Magnetic recording
• Forney (1973) Classic survey recognizing the generality of the VA
• Rabiner (1989) Influential survey paper on hidden Markov chains
Example-Principle of Optimality
[Figure: map of routes from the EE Building to the Faculty Club ("Publish & Perish"), crossing bridges labeled N and S, with segment distances between 0.2 and 1.2]
Professor X chooses an optimum path on his trip to lunch
Optimal: 6 adds; Brute force: 8 adds
With N bridges: Optimal: 4(N+1) adds; Brute force: (N-1)2^N adds
Find optimal path to each bridge
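The principle of optimality can be sketched as a layered shortest-path computation: keep only the best distance to each bank of each bridge, instead of enumerating all 2^N routes. A minimal sketch; the two-node-per-bank layout and the costs in the example are illustrative, not the slide's actual map.

```python
def best_path(layers):
    """Layered shortest path with two nodes per bank.

    layers: list of 2x2 cost matrices, cost[i][j] = distance from
    node i on one bank to node j on the next (hypothetical values).
    Work per layer is constant, so total work grows linearly in N,
    versus exponential growth for brute-force enumeration.
    """
    dist = [0.0, 0.0]                      # best distance to each starting node
    for cost in layers:
        # optimal path to each node of the next bank
        dist = [min(dist[i] + cost[i][j] for i in range(2)) for j in range(2)]
    return min(dist)
```

For example, `best_path([[[1, 2], [3, 4]], [[5, 1], [1, 5]]])` evaluates each bank once and returns 2.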
Digital Transmission with Convolutional Codes
Information Source → Convolutional Encoder → BSC(p) → Viterbi Algorithm → Information Sink

Input sequence: A_N = a_1, a_2, ..., a_N
Encoded sequence: c_1, c_2, ..., c_N
Received sequence: B_N = b_1, b_2, ..., b_N
Maximum a Posteriori (MAP) Estimate
Define D(A_N, B_N) = Hamming distance between sequences A_N and B_N.

max over a_1,...,a_N of P(b_1,...,b_N | a_1,...,a_N) = max over a_1,...,a_N of p^D(A_N,B_N) (1-p)^(N - D(A_N,B_N))

where p = bit error probability.

Equivalently (for p < 1/2):

min over a_1,...,a_N of D(A_N, B_N)

since the log-likelihood is N log(1-p) + D(A_N, B_N) log(p/(1-p)), and log(p/(1-p)) < 0.
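The equivalence can be shown in a few lines: over a memoryless BSC with p < 1/2, the maximum-likelihood codeword is simply the one closest in Hamming distance. A minimal sketch; the tiny two-word codebook in the example is a hypothetical stand-in for a real code.

```python
def hamming(a, b):
    # Hamming distance D(A, B) between two equal-length sequences
    return sum(x != y for x, y in zip(a, b))

def map_estimate(received, codebook):
    # For a memoryless BSC with p < 1/2, maximizing p^D (1-p)^(N-D)
    # over codewords is the same as minimizing the Hamming distance D.
    return min(codebook, key=lambda c: hamming(c, received))
```

For example, `map_estimate((1, 0, 1), [(0, 0, 0), (1, 1, 1)])` returns `(1, 1, 1)`, the codeword at distance 1 rather than 2.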
Brute force = Exponential Growth with N
Example: (3,1) code
Input: 110100
Output: 111 100 010 110 011 001 000

Encoder: two-stage shift register with contents s1, s2; initial state s1 = s2 = 0.
For input bit i: c1 = i, c2 = i ⊕ s1, c3 = i ⊕ s1 ⊕ s2; state update s2 ← s1, s1 ← i.
Branch labels: input - output; efficiency = input/output = 1/3.
A final 0 flushes the register (seventh triple 000).
Convolutional codes-Encoding a sequence
State diagram (Fig. 2.14). States = shift register contents s1 s2; branches labeled input - output:

From 00: 0 - 000 (to 00), 1 - 111 (to 10)
From 10: 0 - 011 (to 01), 1 - 100 (to 11)
From 01: 0 - 001 (to 00), 1 - 110 (to 10)
From 11: 0 - 010 (to 01), 1 - 101 (to 11)
Markov chain for Convolutional code
State s1s2 | 0 input: output / next state | 1 input: output / next state
00         | 000 / 00                     | 111 / 10
10         | 011 / 01                     | 100 / 11
01         | 001 / 00                     | 110 / 10
11         | 010 / 01                     | 101 / 11
Trellis Representation
Iteration for Optimization
Let s_i = shift register contents at step i. Choosing the inputs a_1,...,a_N is equivalent to choosing the state sequence:

min over a_1,...,a_N of D(A_N, B_N) = min over s_1,...,s_N of D(A_N, B_N)

Memorylessness of the BSC makes the distance a sum of per-symbol terms:

D(A_N, B_N) = Σ_{i=1}^{N} d(a_i, b_i)

Split off the last term and minimize in stages (the notation x/y means "over x, with y held fixed"):

min_{s_1,...,s_N} D(A_N, B_N)
 = min_{s_1,...,s_N} [ D(A_{N-1}, B_{N-1}) + d(a_N, b_N) ]
 = min_{s_N} min_{s_1,...,s_{N-1}/s_N} [ D(A_{N-1}, B_{N-1}) + d(a_N, b_N) ]
 = min_{s_N} min_{s_{N-1}/s_N} [ d(a_N, b_N) + min_{s_1,...,s_{N-2}/s_{N-1},s_N} D(A_{N-1}, B_{N-1}) ]

Key step! Once s_{N-1} is fixed, conditioning on s_N is redundant:

min_{s_1,...,s_{N-2}/s_{N-1},s_N} D(A_{N-1}, B_{N-1}) = min_{s_1,...,s_{N-2}/s_{N-1}} D(A_{N-1}, B_{N-1})

This gives the recursion:

min_{s_1,...,s_N} D(A_N, B_N)
 = min_{s_N} min_{s_{N-1}/s_N} [ d(a_N, b_N) + min_{s_1,...,s_{N-2}/s_{N-1}} D(A_{N-1}, B_{N-1}) ]
                                 (incremental distance)   (accumulated distance)

Linear growth in N.
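The recursion above is the whole algorithm: for each received triple, update every state's accumulated distance from its best predecessor and keep a backpointer, then trace back from the best final state. A minimal sketch, assuming the (3,1) encoder c = (i, i ⊕ s1, i ⊕ s1 ⊕ s2) with next state (i, s1), which reproduces the slide's example output 111 100 010 110 011 001 000 for input 110100 plus a flush bit.

```python
def hamming(a, b):
    # per-branch incremental distance d(a_i, b_i)
    return sum(x != y for x, y in zip(a, b))

def viterbi(observed):
    """Decode the assumed (3,1) code, starting from state (0, 0)."""
    states = [(0, 0), (0, 1), (1, 0), (1, 1)]
    INF = float("inf")
    D = {s: (0 if s == (0, 0) else INF) for s in states}  # accumulated distances
    back = []                                             # backpointers per step
    for b in observed:
        newD = {s: (INF, None) for s in states}
        for (s1, s2), acc in D.items():
            if acc == INF:
                continue
            for i in (0, 1):                    # candidate input bit
                out = (i, i ^ s1, i ^ s1 ^ s2)  # branch output
                nxt = (i, s1)                   # next state
                cand = acc + hamming(out, b)
                if cand < newD[nxt][0]:         # keep best predecessor only
                    newD[nxt] = (cand, ((s1, s2), i))
        back.append({s: newD[s][1] for s in states})
        D = {s: newD[s][0] for s in states}
    s = min(D, key=D.get)                       # best final state
    bits = []
    for choice in reversed(back):               # trace back the survivor path
        prev, i = choice[s]
        bits.append(i)
        s = prev
    return bits[::-1]
```

Work per step is one comparison per branch, so total work grows linearly in N, as the derivation promises.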
Deciding Previous State
The same recursion at each step i decides the surviving previous state:

min_{s_1,...,s_i} D(A_i, B_i) = min_{s_i/s_{i-1}} [ d(a_i, b_i) + min_{s_1,...,s_{i-1}} D(A_{i-1}, B_{i-1}) ]

[Figure: one trellis stage from states at i-1 to states at i. Each candidate predecessor carries an accumulated distance D(A_{i-1}, B_{i-1}) (e.g. 4 and 2); each branch adds an incremental distance d(a_i, b_i) between the received triple b_i = 010 and the branch output (e.g. a_i = 000 or 001). The smallest sum selects the predecessor.]

Search previous states.
First step of the Viterbi Algorithm: find the shortest path (in Hamming distance) from s0 to each state. The VA is thus a shortest-path search used to detect the sequence; for trellis codes the same search uses Euclidean distance.
Optimum Sequence Detection
Trace through successive states
[Figure: trellis over states s0 - s3 with the optimum path traced through successive states.]
Inter-symbol Interference
Transmitter → Channel → Equalizer → VA → Decisions

Transmitted signal: Σ_{i=1}^{N} a_i p(t - iT)

Received signal: z(t) = Σ_{i=1}^{N} a_i h(t - iT) + n(t)

Pulse correlations: r_{i-j} = ∫ h(t - iT) h(t - jT) dt

Finite memory channel: r_{i-j} = 0 for |i - j| > m
AWGN Channel-MAP Estimate
min over a_1,...,a_N of ∫_0^∞ [ z(t) - Σ_{i=1}^{N} a_i h(t - iT) ]² dt

(Euclidean distance between received and possible signals)

Simplification:

min over a_1,...,a_N of [ -2 Σ_{i=1}^{N} a_i Z_i + Σ_{i=1}^{N} Σ_{j=1}^{N} a_i a_j r_{i-j} ]

where Z_i = ∫ z(t) h(t - iT) dt is the output of the matched filter.
Memory m: define the state s_k = (a_{k-m+1}, ..., a_k), the m most recent symbols.

Accumulated distance:

D(Z_1,...,Z_k; s_1,...,s_k) = -2 Σ_{i=1}^{k} a_i Z_i + Σ_{i=1}^{k} Σ_{j=1}^{k} a_i a_j r_{i-j}

Incremental distance:

d(Z_k; s_{k-1}, s_k) = -2 a_k Z_k + 2 a_k Σ_{i=k-m}^{k-1} a_i r_{k-i} + a_k² r_0

Same recursion as before:

min_{s_1,...,s_k} D(Z_1,...,Z_k; s_1,...,s_k)
 = min_{s_k/s_{k-1}} [ d(Z_k; s_{k-1}, s_k) + min_{s_1,...,s_{k-1}} D(Z_1,...,Z_{k-1}; s_1,...,s_{k-1}) ]
Viterbi Algorithm for ISI
State = the m symbols in the channel memory; number of states = (alphabet size)^m
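The ISI recursion can be sketched directly: states are the last m symbols, and each branch adds the incremental distance d = -2 a_k Z_k + 2 a_k Σ a_i r_{k-i} + a_k² r_0. A minimal sketch; the binary ±1 alphabet and the memoryless demo channel in the usage note are assumptions, not part of the slide.

```python
from itertools import product

def viterbi_isi(Z, r, m=1, symbols=(-1, 1)):
    """Sequence detection for a finite-memory ISI channel (sketch).

    Z : matched-filter outputs Z_1..Z_N
    r : correlations r[0]..r[m] (r_j = 0 beyond the memory m)
    State s_k = last m symbols; any starting state is assumed free.
    """
    states = list(product(symbols, repeat=m))
    D = {s: 0.0 for s in states}          # accumulated distances
    back = []
    for Zk in Z:
        newD, choice = {}, {}
        for prev in states:
            for a in symbols:
                s = prev[1:] + (a,)       # shift the new symbol in
                # incremental distance from the slide's formula
                d = (-2 * a * Zk
                     + 2 * a * sum(prev[m - j] * r[j] for j in range(1, m + 1))
                     + a * a * r[0])
                cand = D[prev] + d
                if s not in newD or cand < newD[s]:
                    newD[s], choice[s] = cand, prev
        D = newD
        back.append(choice)
    s = min(D, key=D.get)                 # best final state
    seq = []
    for choice in reversed(back):         # trace back the survivor path
        seq.append(s[-1])
        s = choice[s]
    return seq[::-1]
```

With no ISI (r = [1.0, 0.0]), the detector reduces to symbol-by-symbol sign decisions, e.g. `viterbi_isi([0.5, -0.3, 2.0], [1.0, 0.0])` gives `[1, -1, 1]`.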
Magnetic Recording
Magnetic flux passes over the read heads; the recorded pulses are differentiated.

Magnetization pattern: m(t) = Σ_k a_k [ u(t - kT) - u(t - (k+1)T) ]

Output: e(t) = (dm(t)/dt) * h(t) = 2 Σ_k x_k h(t - kT), where x_k = (a_k - a_{k-1})/2

with h(t) a Nyquist pulse; sample at t = kT. Controlled ISI.
Same model applies to Partial Response signaling.
Continuous Phase FSK
Transmitted signal: y_k(t) = cos(a_k t + x_k), kT ≤ t ≤ (k+1)T

Digital input sequence: a_1, a_2, ..., a_N

Constraint (continuous phase): a_k t + x_k = a_{k+1} t + x_{k+1} (mod 2π) at the boundary t = (k+1)T

x_k = 0 for an even number of ones; x_k = π for an odd number of ones.
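The phase bookkeeping can be sketched as a running parity count. A minimal sketch, assuming x_k is set by the number of ones transmitted before symbol k (whether the count is taken before or including symbol k is an assumption here).

```python
import math

def phase_offsets(bits):
    """Phase offsets x_k in {0, pi}: 0 after an even number of ones,
    pi after an odd number, keeping the transmitted phase continuous."""
    ones = 0
    xs = []
    for b in bits:
        xs.append(0.0 if ones % 2 == 0 else math.pi)
        ones += b
    return xs
```

For example, `phase_offsets([1, 0, 1, 1])` gives `[0.0, pi, pi, 0.0]`: the offset toggles only after a transmitted one.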
Example-Binary signaling
[Figure: binary CPFSK waveforms over one signaling interval, 0 ≤ t ≤ T; one frequency completes whole cycles, the other an odd number of half cycles.]
Merges and State Reduction
All paths merge
Computations of order (number of states)²
Carry only high probability states
Force merges to reduce complexity
Optimal paths through trellis
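The "carry only high-probability states" heuristic can be sketched as a pruning step run after each trellis stage: keep the few states with the smallest accumulated distance and discard the rest, trading optimality for complexity. A minimal sketch; the beam width and the example distances are illustrative.

```python
import heapq

def beam_prune(D, beam=4):
    """Keep only the `beam` states with the smallest accumulated distance.

    D : dict mapping state -> accumulated distance after a trellis stage.
    The full VA cost per step is of order (number of states)^2; pruning
    (sometimes called the M-algorithm) caps the surviving states instead.
    """
    survivors = heapq.nsmallest(beam, D.items(), key=lambda kv: kv[1])
    return dict(survivors)
```

For example, `beam_prune({'a': 3, 'b': 1, 'c': 2, 'd': 5}, beam=2)` keeps only `{'b': 1, 'c': 2}`; the pruned search is no longer guaranteed optimal, which is the price of forcing merges.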
Input pixel → Optical channel → AWGN → blurred output

Optical output signal:

s(i, j) = Σ_{l=-L}^{L} Σ_{m=-L}^{L} a(i - l, j - m) h(l, m) + n(i, j)

where L = optical blur width
Blurring Analogous to ISI
Row Scan
Known state transitions and decision feedback utilized for state reduction
VA for optimal row sequence
Hidden Markov Chain
• Data suggests Markovian structure
• Estimate initial state probabilities
• Estimate transition probabilities
• VA used for estimation of Probabilities
• Iteration
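Once initial and transition probabilities are estimated, the VA finds the most likely hidden state sequence. A minimal log-domain sketch; the dictionary-based interface and the rainfall-flavored probabilities in the test are illustrative assumptions.

```python
import math

def hmm_viterbi(obs, states, start_p, trans_p, emit_p):
    """Most likely hidden state sequence for an HMM (log-domain Viterbi).

    start_p[s], trans_p[s][t], emit_p[s][o] are assumed probability dicts.
    """
    # initialize with the first observation
    V = {s: math.log(start_p[s]) + math.log(emit_p[s][obs[0]]) for s in states}
    back = []
    for o in obs[1:]:
        newV, choice = {}, {}
        for s in states:
            # best predecessor for state s at this step
            prev = max(states, key=lambda p: V[p] + math.log(trans_p[p][s]))
            newV[s] = V[prev] + math.log(trans_p[prev][s]) + math.log(emit_p[s][o])
            choice[s] = prev
        V = newV
        back.append(choice)
    s = max(V, key=V.get)          # best final state
    path = [s]
    for choice in reversed(back):  # trace back the survivor path
        s = choice[s]
        path.append(s)
    return path[::-1]
```

Working in log probabilities turns products into sums (the same shortest-path form as the distance metrics above) and avoids numerical underflow on long observation sequences.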
Rainfall Prediction
States: No rain, Showery-dry, Showery-wet, Rainy-dry, Rainy-wet
Rainfall observations
DNA Sequencing
• DNA - double helix
– Sequences of four nucleotides: A, T, C and G
– Pairing between strands
– Bonding between A-T and C-G
[Figure: nucleotide sequence CGGATTC spanned by Gene 1, Gene 2 and Gene 3; codon A lies in all three (overlapping) genes.]
• Genes
– Made up of codons, i.e. triplets of adjacent nucleotides
– Overlapping of genes
Hidden Markov Chain - Tracking genes

States: S (start: first codon of gene), P1-P4 (+1,...,+4 from start), E (stop), H (gap), M1-M4 (-1,...,-4 from start)

Initial and transition probabilities known.
Recognizing Handwritten Chinese Characters
Text-line images
Estimate stroke width
Set up an m × n grid
Estimate initial and transition probabilities
Detect possible segmentation paths by VA
Results on next slide
Example: Segmenting Handwritten Characters
All possible segmentation paths
Removal of overlapping paths
Eliminating redundant paths
Discarding near paths