Widrow-Hoff Learning
Outline
• 1 Introduction
• 2 ADALINE Network
• 3 Mean Square Error
• 4 LMS Algorithm
• 5 Analysis of Convergence
• 6 Adaptive Filtering
Introduction
• In 1960, Bernard Widrow and his doctoral student Marcian Hoff introduced the ADALINE (ADAptive LInear NEuron) network and the LMS (Least Mean Square) algorithm.
Perceptron Network
• Figure: a = hardlim(Wp + b)
ADALINE Network
• Figure: a = purelin(Wp + b) = Wp + b
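The figure's computation is simple enough to state in code. A minimal NumPy sketch (function names are mine, not from the slides):

```python
import numpy as np

def purelin(n):
    """Linear transfer function: the output equals the net input."""
    return n

def adaline(W, b, p):
    """ADALINE forward pass: a = purelin(W p + b) = W p + b."""
    return purelin(W @ p + b)
```

Unlike the perceptron's hardlim, purelin leaves the net input unchanged, which is what makes the mean square error a smooth quadratic in the weights.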
Mean Square Error (cont.)
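For a single ADALINE, stacking the weights and bias into one vector turns the mean square error into a quadratic in the parameters; the standard development is:

```latex
\mathbf{x}=\begin{bmatrix}\mathbf{w}\\ b\end{bmatrix},\qquad
\mathbf{z}=\begin{bmatrix}\mathbf{p}\\ 1\end{bmatrix},\qquad
a=\mathbf{x}^{T}\mathbf{z},
\qquad\text{so}\qquad
\begin{aligned}
F(\mathbf{x}) &= E\!\left[e^{2}\right]
              = E\!\left[(t-\mathbf{x}^{T}\mathbf{z})^{2}\right] \\
              &= E\!\left[t^{2}\right]
               - 2\,\mathbf{x}^{T}\mathbf{h}
               + \mathbf{x}^{T}\mathbf{R}\,\mathbf{x},
\end{aligned}
\qquad
\mathbf{h}=E[t\,\mathbf{z}],\quad
\mathbf{R}=E[\mathbf{z}\mathbf{z}^{T}].
```

Here h is the cross-correlation between the target and the augmented input, and R is the input correlation matrix.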
Error Analysis
$F(\mathbf{x}) = c + \mathbf{d}^{T}\mathbf{x} + \tfrac{1}{2}\mathbf{x}^{T}\mathbf{A}\mathbf{x}$
Error Analysis (cont.)
• Comparing with the general quadratic form gives $c = E[t^2]$, $\mathbf{d} = -2\mathbf{h}$, and $\mathbf{A} = 2\mathbf{R}$.
• Setting the gradient to zero, $\nabla F(\mathbf{x}) = \mathbf{d} + \mathbf{A}\mathbf{x} = -2\mathbf{h} + 2\mathbf{R}\mathbf{x} = 0$, gives the stationary point $\mathbf{x}^{*} = \mathbf{R}^{-1}\mathbf{h}$.
• This is the unique global minimum when the correlation matrix $\mathbf{R}$ is positive definite.
Example 1 (cont.)
Approximate Steepest Descent
Approximate Gradient
Approximate Gradient (cont.)
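The LMS trick is to approximate the mean square error by the squared error of the current sample; differentiating that single-sample cost gives the stochastic gradient. This is the standard Widrow-Hoff derivation, with $\mathbf{z}(k) = [\mathbf{p}(k);\,1]$:

```latex
\begin{aligned}
\hat{F}(\mathbf{x}) &= \big(t(k)-a(k)\big)^{2} = e^{2}(k),\\[2pt]
\frac{\partial e^{2}(k)}{\partial w_{1,j}} &= -2\,e(k)\,p_{j}(k),
\qquad
\frac{\partial e^{2}(k)}{\partial b} = -2\,e(k),\\[2pt]
\hat{\nabla}F(\mathbf{x}) &= -2\,e(k)\,\mathbf{z}(k).
\end{aligned}
```

Dropping the expectation makes the gradient computable from the single current sample, which is what turns steepest descent into the online LMS algorithm.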
LMS Algorithm (cont.)
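Applying approximate steepest descent with this gradient gives the Widrow-Hoff update W(k+1) = W(k) + 2α e(k) pᵀ(k), b(k+1) = b(k) + 2α e(k). A self-contained NumPy sketch (the function name and argument layout are my choices):

```python
import numpy as np

def lms_train(P, T, alpha, epochs=1, W=None, b=None):
    """Train an ADALINE with the LMS (Widrow-Hoff) rule.

    P: inputs, shape (n_inputs, n_samples); T: targets, shape (n_outputs, n_samples).
    """
    n_out, n_in = T.shape[0], P.shape[0]
    W = np.zeros((n_out, n_in)) if W is None else W.astype(float)
    b = np.zeros((n_out, 1)) if b is None else b.astype(float)
    for _ in range(epochs):
        for k in range(P.shape[1]):
            p = P[:, [k]]                # current input column
            e = T[:, [k]] - (W @ p + b)  # e(k) = t(k) - a(k)
            W = W + 2 * alpha * e @ p.T  # W(k+1) = W(k) + 2*alpha*e(k)*p(k)^T
            b = b + 2 * alpha * e        # b(k+1) = b(k) + 2*alpha*e(k)
    return W, b
```

On a realizable linear target the updates drive the error to zero, so the weights converge to the exact solution rather than merely oscillating around it.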
Example 2 (cont.)
Analysis of Convergence
Analysis of Convergence (cont.)
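The standard result is that LMS converges in the mean provided the step size satisfies 0 < α < 1/λmax, where λmax is the largest eigenvalue of the input correlation matrix R = E[zzᵀ]. A sketch of that check using a sample estimate of R (the function name is mine):

```python
import numpy as np

def max_stable_alpha(Z):
    """Upper bound on the LMS step size from a sample of augmented inputs.

    Z: shape (n_features, n_samples), columns z = [p; 1].
    Convergence in the mean requires 0 < alpha < 1 / lambda_max(R).
    """
    R = Z @ Z.T / Z.shape[1]                 # sample estimate of R = E[z z^T]
    lam_max = np.max(np.linalg.eigvalsh(R))  # largest eigenvalue (R is symmetric)
    return 1.0 / lam_max
```

For the two-pattern input set Z = [[1, -1], [1, 1]], the estimate of R is the identity matrix, so any step size below 1 keeps the mean weight trajectory stable.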
Perceptron Rule vs. LMS Algorithm
Perceptron Rule vs. LMS Algorithm (cont.)
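The two rules differ mainly in which output the error is computed from: the perceptron rule uses the hard-limited output and converges in finitely many steps on linearly separable data, while LMS uses the linear output and minimizes mean square error, so it keeps adapting. A side-by-side sketch of one update of each (helper names are mine):

```python
import numpy as np

def hardlim(n):
    """Hard-limit transfer function used by the perceptron."""
    return (n >= 0).astype(float)

def perceptron_step(W, b, p, t):
    """Perceptron rule: error from the hard-limited output; fixed step size."""
    e = t - hardlim(W @ p + b)
    return W + np.outer(e, p), b + e

def lms_step(W, b, p, t, alpha):
    """LMS rule: error from the linear output; step scaled by 2*alpha."""
    e = t - (W @ p + b)
    return W + 2 * alpha * np.outer(e, p), b + 2 * alpha * e
```

Because the LMS error never saturates, its decision boundary tends to end up farther from the training patterns, at the cost of possibly misclassifying some of them.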
Adaptive Filtering
Tapped Delay Line
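An adaptive filter is built by feeding an ADALINE from a tapped delay line: at time k the input vector is the current and R-1 previous samples, p(k) = [y(k), y(k-1), ..., y(k-R+1)]ᵀ, so the filter output is a weighted sum of delayed inputs. A small sketch of the delay line itself (the function name is mine):

```python
import numpy as np

def tapped_delay_line(y, R):
    """Stack delayed copies of y: column k is [y(k), y(k-1), ..., y(k-R+1)],
    with zeros before the start of the signal."""
    y = np.asarray(y, dtype=float)
    P = np.zeros((R, len(y)))
    for i in range(R):
        P[i, i:] = y[:len(y) - i]  # row i holds y delayed by i steps
    return P
```

Feeding these columns to an ADALINE makes it exactly an adaptive FIR filter with R taps.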
Adaptive Noise Cancellation
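In adaptive noise cancellation the ADALINE is fed a reference of the noise source, learns the unknown path from that reference to the contamination, and its error output e(k) becomes the restored signal. A toy end-to-end sketch (the signal, noise path, and all parameter values are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000
s = np.sin(2 * np.pi * np.arange(n) / 50)            # signal we want to recover
v = rng.standard_normal(n)                           # measurable reference noise
m = 0.8 * v + 0.3 * np.concatenate(([0.0], v[:-1]))  # noise after an unknown FIR path
d = s + m                                            # contaminated measurement

R, alpha = 4, 0.01                                   # taps and LMS step size
w = np.zeros(R)
e = np.zeros(n)
for k in range(n):
    # tapped delay line on the reference: [v(k), v(k-1), ..., v(k-R+1)]
    z = np.array([v[k - i] if k - i >= 0 else 0.0 for i in range(R)])
    a = w @ z                     # filter's current estimate of the noise m(k)
    e[k] = d[k] - a               # error output = restored signal estimate
    w = w + 2 * alpha * e[k] * z  # LMS update
```

Because the clean signal s is uncorrelated with the reference v, minimizing E[e²] drives the filter to cancel only the noise, leaving e(k) ≈ s(k).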