Website Chapter 3



CHAPTER 3

RECURSIVE ESTIMATION FOR LINEAR MODELS

• Organization of chapter in ISSO
  – Linear models
• Relationship between least-squares and mean-square
  – LMS and RLS estimation
• Applications in adaptive control
  – LMS, RLS, and Kalman filter for time-varying solution
  – Case study: Oboe reed data

Slides for Introduction to Stochastic Search and Optimization (ISSO) by J. C. Spall


Basic Linear Model

• Consider estimation of vector θ in a model that is linear in θ
• Model has classical linear form
    z_k = h_k^T θ + v_k,
  where z_k is the kth measurement, h_k is the corresponding "design vector," and v_k is the unknown noise value
• Model used extensively in control, statistics, signal processing, etc.
• Many estimation/optimization criteria based on "squared-error"-type loss functions
  – Leads to criteria that are quadratic in θ
  – Unique (global) estimate


Least-Squares Estimation

• Most common method for estimating θ in the linear model is by the method of least squares
• Criterion (loss function) has form
    (1/2) Σ_{k=1}^{n} (z_k − h_k^T θ)² = (1/2) (Z_n − H_n θ)^T (Z_n − H_n θ),
  where Z_n = [z_1, z_2, …, z_n]^T and H_n is the n × p concatenated matrix of h_k^T row vectors
• Classical batch least-squares estimate is
    θ̂^(n) ≡ (H_n^T H_n)^{−1} H_n^T Z_n
• Popular recursive estimates (LMS, RLS, Kalman filter) may be derived from the batch estimate
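A minimal numerical sketch of the batch estimate (not from ISSO; the dimensions, true θ, and noise level below are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 100, 3                        # number of measurements and parameter dimension (illustrative)
theta_true = np.array([1.0, -2.0, 0.5])

H = rng.normal(size=(n, p))          # rows of H_n are the design vectors h_k^T
v = 0.1 * rng.normal(size=n)         # unknown noise values v_k
Z = H @ theta_true + v               # z_k = h_k^T theta + v_k

# Batch least-squares estimate: (H^T H)^{-1} H^T Z
theta_batch = np.linalg.solve(H.T @ H, H.T @ Z)
# np.linalg.lstsq(H, Z, rcond=None)[0] gives the same answer in a more numerically stable way
print(theta_batch)
```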


Geometric Interpretation of Least-Squares Estimate when p = 2 and n = 3


Recursive Estimation

• Batch form not convenient in many applications
  – E.g., data arrive over time and want "easy" way to update estimate at time k to estimate at time k+1
• Least-mean-squares (LMS) method is a very popular recursive method
  – Stochastic analogue of steepest descent algorithm
• LMS recursion:
    θ̂_{k+1} = θ̂_k − a h_{k+1} (h_{k+1}^T θ̂_k − z_{k+1}),  a > 0
• Convergence theory based on stochastic approximation (e.g., Ljung et al., 1992; Gerencsér, 1995)
  – Less rigorous theory based on connections to steepest descent (ignores noise) (Widrow and Stearns, 1985; Haykin, 1996)
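A minimal sketch of the LMS recursion above (the gain a, data-generating model, and iteration count are illustrative assumptions, not values from ISSO):

```python
import numpy as np

rng = np.random.default_rng(1)
p = 3
theta_true = np.array([1.0, -2.0, 0.5])
theta_hat = np.zeros(p)              # theta_hat_0
a = 0.05                             # constant gain a > 0 (tuning choice)

for k in range(2000):
    h = rng.normal(size=p)                       # design vector h_{k+1}
    z = h @ theta_true + 0.1 * rng.normal()      # measurement z_{k+1}
    # LMS: theta_{k+1} = theta_k - a * h_{k+1} * (h_{k+1}^T theta_k - z_{k+1})
    theta_hat = theta_hat - a * h * (h @ theta_hat - z)

print(theta_hat)                     # should be close to theta_true
```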


LMS in Closed-Loop Control

• Suppose process is modeled according to autoregressive (AR) form:
    x_{k+1} = β_0 x_k + β_1 x_{k−1} + ⋯ + β_m x_{k−m} + γ u_k + w_k,
  where x_k represents state, γ and β_i are unknown parameters, u_k is control, and w_k is noise
• Let target ("desired") value for x_k be d_k
• Optimal control law known (minimizes mean-square tracking error):
    u_k = (d_{k+1} − β_0 x_k − β_1 x_{k−1} − ⋯ − β_m x_{k−m}) / γ
• Certainty equivalence principle justifies substitution of parameter estimates for unknown true parameters
  – LMS used to estimate γ and β_i in closed-loop mode
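A minimal sketch of certainty-equivalence control with LMS estimation for the first-order case (m = 0); the true β_0 and γ, the gain, and the target value are invented for illustration and are not the settings behind the figure on the next slide:

```python
import numpy as np

rng = np.random.default_rng(2)
beta0, gamma = 0.7, 1.5              # true (unknown) parameters, invented for illustration
theta_hat = np.array([0.0, 1.0])     # estimates of (beta0, gamma); nonzero gamma estimate keeps control defined
a = 0.02                             # LMS gain
x, d = 0.0, 1.0                      # current state and constant target d_k = 1

for k in range(500):
    # Certainty equivalence: plug current estimates into the optimal control law
    u = (d - theta_hat[0] * x) / theta_hat[1]
    x_next = beta0 * x + gamma * u + 0.1 * rng.normal()   # first-order AR process

    # LMS update of (beta0, gamma) with design vector h_k = (x_k, u_k) and measurement x_{k+1}
    h = np.array([x, u])
    theta_hat = theta_hat - a * h * (h @ theta_hat - x_next)
    x = x_next

print(theta_hat)                     # should move toward (0.7, 1.5); x should track d
```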


LMS in Closed-Loop Control for First-Order AR Model


Recursive Least Squares (RLS)

• Alternative to LMS is RLS
  – Recall LMS is stochastic analogue of steepest descent ("first-order" method)
  – RLS is stochastic analogue of Newton-Raphson ("second-order" method) ⇒ faster convergence than LMS in practice
• RLS algorithm (2 recursions):
    P_{k+1} = P_k − P_k h_{k+1} h_{k+1}^T P_k / (1 + h_{k+1}^T P_k h_{k+1})
    θ̂_{k+1} = θ̂_k − P_{k+1} h_{k+1} (h_{k+1}^T θ̂_k − z_{k+1})
• Need P_0 and θ̂_0 to initialize RLS recursions
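A minimal sketch of the two RLS recursions; the initialization P_0 = (large constant)·I and θ̂_0 = 0 is a common convention assumed here, not prescribed by the slides:

```python
import numpy as np

rng = np.random.default_rng(3)
p = 3
theta_true = np.array([1.0, -2.0, 0.5])
theta_hat = np.zeros(p)              # theta_hat_0
P = 1000.0 * np.eye(p)               # P_0: large diagonal reflects weak prior information

for k in range(200):
    h = rng.normal(size=p)
    z = h @ theta_true + 0.1 * rng.normal()
    # P_{k+1} = P_k - P_k h h^T P_k / (1 + h^T P_k h)
    Ph = P @ h
    P = P - np.outer(Ph, Ph) / (1.0 + h @ Ph)
    # theta_{k+1} = theta_k - P_{k+1} h (h^T theta_k - z)
    theta_hat = theta_hat - P @ h * (h @ theta_hat - z)

print(theta_hat)                     # typically converges faster than LMS
```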


Recursive Methods for Estimation of Time-Varying Parameters

• It is common to have the underlying true θ evolve in time (e.g., target tracking, adaptive control, sequential experimental design, etc.)
  – Time-varying parameters implies θ replaced with θ_k
• Consider modified linear model
    z_k = h_k^T θ_k + v_k
• Prototype recursive form for estimating θ_k is
    θ̂_{k+1} = A_k θ̂_k − Γ_k h_{k+1} (h_{k+1}^T A_k θ̂_k − z_{k+1}),
  where choice of A_k and Γ_k depends on specific algorithm
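A sketch of the prototype update as reconstructed above; choosing A_k = I and Γ_k = a·I reduces it to the LMS form, and the slow random-walk drift in θ_k is an invented illustration:

```python
import numpy as np

rng = np.random.default_rng(4)
p = 2
theta_k = np.array([1.0, -1.0])      # true time-varying parameters
theta_hat = np.zeros(p)
a = 0.1

for k in range(1000):
    theta_k = theta_k + 0.01 * rng.normal(size=p)   # slow random-walk drift (illustrative)
    h = rng.normal(size=p)
    z = h @ theta_k + 0.1 * rng.normal()
    A = np.eye(p)                    # A_k: identity here, as in LMS
    Gamma = a * np.eye(p)            # Gamma_k: constant gain here, as in LMS
    # Prototype: theta_hat_{k+1} = A theta_hat_k - Gamma h (h^T A theta_hat_k - z)
    theta_hat = A @ theta_hat - Gamma @ (h * (h @ (A @ theta_hat) - z))

print(theta_hat, theta_k)            # estimate tracks the drifting true parameters
```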


Three Important Algorithms for Estimation of Time-Varying Parameters

• LMS
  – Goal is to minimize instantaneous squared-error criteria across iterations
  – General form for evolution of true parameters θ_k
• RLS
  – Goal is to minimize weighted sum of squared errors
  – Sum criterion creates "inertia" not present in LMS
  – General form for evolution of θ_k
• Kalman filter
  – Minimizes instantaneous squared-error criteria
  – Requires precise statistical description of evolution of θ_k via state-space model
• Details for above algorithms in terms of prototype algorithm (previous slide) are in Section 3.3 of ISSO


Case Study: LMS and RLS with Oboe Reed Data

"…an ill wind that nobody blows good"
  – Comedian Danny Kaye in speaking of the oboe in "The Secret Life of Walter Mitty" (1947)

• Section 3.4 of ISSO reports on linear and curvilinear models for predicting quality of oboe reeds

  – Linear model has 7 parameters; curvilinear has 4 parameters

• This study compares LMS and RLS with batch least-squares estimates
  – 160 data points for fitting models (reeddata-fit); 80 (independent) data points for testing models (reeddata-test)
  – reeddata-fit and reeddata-test data sets available from the ISSO Web site


Oboe with Attached Reed


Comparison of Fitting Results for reeddata-fit and reeddata-test

• To test similarity of fit and test data sets, performed model fitting using the test data set
• This comparison is for checking consistency of the two data sets; not for checking accuracy of LMS or RLS estimates

• Compared model fits for parameters in
  – Basic linear model (eqn. 3.25 in ISSO) (p = 7)
  – Curvilinear model (eqn. 3.26 in ISSO) (p = 4)
• Results on next slide for basic linear model


Comparison of Batch Parameter Estimates for Basic Linear Model; Approximate 95% Confidence Intervals Shown in [·, ·]


Comparison of Batch and RLS with Oboe Reed Data

• Compared batch and RLS using 160 data points in reeddata-fit and 80 data points for testing models in reeddata-test
• Two slides to follow present results
  – First slide compares parameter estimates in pure linear model
  – Second slide compares prediction errors for linear and curvilinear models


Batch and RLS Parameter Estimates for Basic Linear Model (Data from reeddata-fit)

    Parameter              Batch estimate   RLS estimate
    Constant, θ_const          −0.156          −0.09
    Top close, θ_T              0.102           0.101
    Appearance, θ_A             0.055           0.046
    Ease of gouge, θ_E          0.15            0.11
    Vascular, θ_V               0.044           0.043
    Shininess, θ_S              0.056           0.056
    First blow, θ_F             0.59            0.540


Mean and Median Absolute Prediction Errors for the Linear and Curvilinear Models (Model Fits from reeddata-fit; Prediction Errors from reeddata-test)

               Batch linear   RLS linear   Batch curvilinear   RLS curvilinear
               model          model        model               model
    Mean           0.242          0.242         0.235               0.235
    Median         0.243          0.250         0.22                0.224

• Ran matched-pairs t-test on linear versus curvilinear models; used one-sided test (see the sketch below)
• P-value for Batch/linear versus Batch/curvilinear is 0.07
• P-value for RLS/linear vs. RLS/curvilinear is 0.10
• Modest evidence for superiority of curvilinear model
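As referenced above, a minimal sketch of the one-sided matched-pairs t-test (SciPy's ttest_rel with the alternative argument, which requires SciPy ≥ 1.6); the error arrays are hypothetical placeholders, not the reed-data results:

```python
import numpy as np
from scipy import stats

# Hypothetical per-observation absolute prediction errors on the same test points
# under the two models being compared (placeholders, not the reed-data values).
err_linear = np.array([0.31, 0.22, 0.40, 0.18, 0.27])
err_curvilinear = np.array([0.28, 0.20, 0.41, 0.15, 0.24])

# Matched-pairs (paired) t-test, one-sided: are the linear-model errors larger
# on average than the curvilinear-model errors?
t_stat, p_value = stats.ttest_rel(err_linear, err_curvilinear, alternative="greater")
print(t_stat, p_value)
```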