
Page 1: ELG5377 Adaptive Signal Processing

ELG5377 Adaptive Signal Processing

Lecture 6: LMS Algorithm Continued

Page 2: ELG5377 Adaptive Signal Processing

Coefficient Error Vector Covariance Matrix

• c(k) = w(k) - wo.

• cov[c(k)] = E[c(k)cH(k)] = K(k).
• Recall that

• c(k+1) = [I - μx(k)xH(k)]c(k) + μx(k)eo*(k).

• K(k+1) = E{[I - μx(k)xH(k)]c(k)cH(k)[I - μx(k)xH(k)]H} + μE{[I - μx(k)xH(k)]c(k)eo(k)xH(k)} + μE{x(k)eo*(k)cH(k)[I - μx(k)xH(k)]H} + μ²E[|eo(k)|²x(k)xH(k)].

• Under the independence assumption (fourth-order moments of x(k) approximated through R) and the orthogonality principle (E[x(k)eo*(k)] = 0), the two cross terms vanish and E[|eo(k)|²x(k)xH(k)] ≈ JminR, so

• K(k+1) = [I - μR]K(k)[I - μR]H + μ²JminR.

• Since R is Hermitian, K(k+1) = [I - μR]K(k)[I - μR] + μ²JminR.
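As a numerical illustration of this recursion (the R, Jmin and μ values below are made up, not taken from the lecture's example), the following sketch iterates it from K(0) = 0 and shows that K(k) settles to a fixed point:

```python
import numpy as np

# Illustrative values only (not the lecture's example).
R = np.array([[1.0, 0.5],
              [0.5, 1.0]])   # input correlation matrix
Jmin = 0.1                   # minimum MSE
mu = 0.05                    # LMS step size

I = np.eye(2)
A = I - mu * R
K = np.zeros((2, 2))         # K(0) = 0; any K(0) converges to the same fixed point
for _ in range(5000):
    # K(k+1) = (I - mu*R) K(k) (I - mu*R)^H + mu^2 * Jmin * R
    K = A @ K @ A.conj().T + mu**2 * Jmin * R

print("K at convergence:\n", K)
# For small mu this should be close to (mu*Jmin/2)*I, which matches the
# steady-state analysis on the next slides.
print("(mu*Jmin/2)*I:\n", (mu * Jmin / 2) * I)
```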

Page 3: ELG5377 Adaptive Signal Processing

Coefficient Error Vector Covariance Matrix 2

• At steady state (or for large k), K(k+1) ≈ K(k).
• Therefore

– K(k) = [I - μR]K(k)[I - μR] + μ²JminR.

– Expanding and dividing through by μ: 0 = -K(k)R - RK(k) + μRK(k)R + μJminR.

– For small μ, the μRK(k)R term is negligible (K(k) is itself of order μ), leaving

– K(k)R + RK(k) = μJminR.
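A worked check of this relation: substituting K(k) = (μJmin/2)I gives (μJmin/2)(R + R) = μJminR, so this K(k) satisfies it, and when R is positive definite it is the unique solution. In other words, at steady state each tap-weight error has variance of roughly μJmin/2 and the errors are approximately uncorrelated.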

Page 4: ELG5377 Adaptive Signal Processing

Mean Square Error

• e(k) = d(k)-y(k) = d(k)-wH(k)x(k).

• e(k) = d(k)-y(k) = d(k)-(w(k)-wo)Hx(k)-woHx(k).

• e(k) = eo(k)-cH(k)x(k).

• E[|e(k)|²] = E[|eo(k)|²] + E[cH(k)x(k)xH(k)c(k)] (the cross terms vanish by the orthogonality principle).

• E[|eo(k)|²] = Jmin.

• E[cH(k)x(k)xH(k)c(k)] = E[tr{cH(k)x(k)xH(k)c(k)}] = E[tr{c(k)cH(k)x(k)xH(k)}] = tr{E[c(k)cH(k)x(k)xH(k)]} ≈ tr{K(k)R} (by the independence assumption, c(k) is independent of x(k)).

• tr{K(k)R} = tr{RK(k)}.

• K(k)R + RK(k) = μJminR.

• tr{K(k)R + RK(k)} = μJmintr{R}.

– Therefore tr{K(k)R} = μJmintr{R}/2.

Page 5: ELG5377 Adaptive Signal Processing

Mean Square Error 2

• Therefore the MSE at the output of the LMS filter is

– J = Jmin + μJmintr{R}/2.

– J = Jmin[1 + (μ/2)Σiλi], where the λi are the eigenvalues of R.

• Suppose R has a dominant eigenvalue (λmax >> λi for all other i).

• J ≈ Jmin(1 + μλmax/2).
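As a small numerical illustration of these two expressions (the R, Jmin and μ below are made-up values, not the lecture's example), the following sketch evaluates J from the full eigenvalue sum and from the dominant-eigenvalue approximation:

```python
import numpy as np

# Illustrative values only (not the lecture's example).
Jmin = 0.1
mu = 0.05
R = np.array([[2.0, 0.9],    # correlation matrix with one dominant eigenvalue
              [0.9, 1.0]])

lam = np.linalg.eigvalsh(R)                  # eigenvalues of R
J_full = Jmin * (1 + (mu / 2) * lam.sum())   # J = Jmin[1 + (mu/2)*sum(lambda_i)]
J_dom = Jmin * (1 + mu * lam.max() / 2)      # dominant-eigenvalue approximation

print("eigenvalues of R:     ", lam)
print("J (full sum):         ", J_full)
print("J (lambda_max only):  ", J_dom)
```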

Page 6: ELG5377 Adaptive Signal Processing

Excess Mean Square Error

• Jex = J – Jmin.

• Jex = μJmintr{R}/2 = Jmin(μ/2)Σiλi.
• If R has a dominant eigenvalue, then

– Jex ≈ Jmin(μλmax/2).

Page 7: ELG5377 Adaptive Signal Processing

Misadjustment

• M = Jex/Jmin.

• For LMS filters,

– M = (μ/2)tr{R} = (μ/2)Mr(0) = (μ/2)Σiλi (here Mr(0) is the number of taps M times the input power r(0), so tr{R} = Mr(0)).
– M ≈ μλmax/2.

• In our example in the previous lecture, Jmin = 0.0985.

• For the LMS filter with μ = 0.1, the misadjustment should be

– 0.05*3.57 = 0.1785.

• Simulated misadjustment = (0.1255-0.0985)/0.0985 = 0.274.

• For the LMS filter with μ = 0.3,

– Theoretical = 0.536
– Simulated = 2.57
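To show how such a comparison is produced, here is a minimal Monte Carlo sketch. It is not the lecture's actual experiment: the filter length, the optimum filter wo, the noise variance (and hence Jmin), the white input (so that tr{R} equals the number of taps), and the step size are all made-up values, chosen only to illustrate measuring (J - Jmin)/Jmin against the prediction (μ/2)tr{R}.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup (not the lecture's example): system identification where
# d(k) = wo^T x(k) + v(k), so the optimum filter is wo and Jmin = var(v).
M = 4
wo = np.array([0.5, -0.3, 0.2, 0.1])
Jmin = 0.01          # variance of the measurement noise v(k)
mu = 0.05            # LMS step size
n_samples = 10000
n_runs = 20          # independent realizations for averaging

# White unit-variance input => R = I, so tr{R} = M.
tr_R = M
M_theory = (mu / 2) * tr_R

mse_tail = []
for _ in range(n_runs):
    x_sig = rng.standard_normal(n_samples + M)
    v = np.sqrt(Jmin) * rng.standard_normal(n_samples)
    w = np.zeros(M)
    errs = []
    for k in range(n_samples):
        x = x_sig[k:k + M][::-1]   # current tap-input vector
        d = wo @ x + v[k]          # desired response
        e = d - w @ x              # a-priori error
        w = w + mu * x * e         # LMS update (real-valued case)
        errs.append(e * e)
    # Average the squared error over the last quarter (steady state).
    mse_tail.append(np.mean(errs[-n_samples // 4:]))

J_sim = np.mean(mse_tail)
M_sim = (J_sim - Jmin) / Jmin

print(f"predicted misadjustment (mu/2)*tr(R): {M_theory:.3f}")
print(f"simulated misadjustment:              {M_sim:.3f}")
```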

Page 8: ELG5377 Adaptive Signal Processing

Conclusion

• Performance of the LMS algorithm as a function of μ.
• Increasing μ shortens the convergence time at the cost of increased misadjustment.
• Misadjustment and convergence time are inversely proportional.