MIT 6.241 Dynamic Systems and Control, Lecture 2: Least Squares Estimation


TRANSCRIPT

  • Slide 1/10

    6.241 Dynamic Systems and Control

    Lecture 2: Least Squares Estimation

    Readings: DDV, Chapter 2

    Emilio Frazzoli
    Aeronautics and Astronautics, Massachusetts Institute of Technology

    February 7, 2011

  • Slide 2/10

    Outline

1. Least Squares Estimation

  • Slide 3/10

    Least Squares Estimation

    Consider a system of m equations in n unknowns, with m > n, of the form

    y = Ax.

    Assume that the system is inconsistent: there are more equations than unknowns, and these equations are not linear combinations of one another.

    Under these conditions, there is no x such that y − Ax = 0. However, one can write e = y − Ax, and find the x that minimizes ‖e‖.

    In particular, the problem

    minₓ ‖e‖² = minₓ ‖y − Ax‖²

    is a least squares problem. The optimal x̂ is the least squares estimate.

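To make the setup concrete, here is a minimal numerical sketch (in Python with NumPy; not part of the original slides): an overdetermined, inconsistent system in which no x solves y = Ax exactly, and the least squares estimate minimizes the residual norm.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((8, 3))   # m = 8 equations, n = 3 unknowns (m > n)
y = rng.standard_normal(8)        # a generic y is not in the range of A,
                                  # so y = Ax has no exact solution

# Least squares estimate: the x minimizing ||e||^2 = ||y - Ax||^2
x_hat, *_ = np.linalg.lstsq(A, y, rcond=None)

# Any other x gives a residual norm at least as large
x_other = x_hat + rng.standard_normal(3)
print(np.linalg.norm(y - A @ x_hat))    # minimal residual norm
print(np.linalg.norm(y - A @ x_other))  # larger
```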
  • Slide 4/10

    Computing the Least-Squares Estimate

    The set M := {z ∈ ℝᵐ : z = Ax, x ∈ ℝⁿ} is a subspace of ℝᵐ, called the range of A, written R(A), i.e., the set of all vectors that can be obtained by linear combinations of the columns of A.

    Recall the projection theorem. We are now looking for the element of M that is closest to y, in terms of the 2-norm. We know the solution is such that

    e = (y − Ax) ⊥ R(A).

    In particular, if aᵢ is the i-th column of A, it is also the case that

    (y − Ax) ⊥ R(A)  ⇔  aᵢ′(y − Ax) = 0, i = 1, …, n
                     ⇔  A′(y − Ax) = 0
                     ⇔  A′Ax = A′y.

    A′A is an n×n matrix; is it invertible? If it were, then at this point it would be easy to recover the least-squares solution as

    x̂ = (A′A)⁻¹A′y.

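As a numerical check (a sketch, assuming NumPy; the slides themselves don't prescribe an implementation), solving the normal equations A′Ax = A′y reproduces the least-squares solution, and the residual is indeed orthogonal to the columns of A:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((8, 3))
y = rng.standard_normal(8)

# Normal equations A'A x = A'y (A'A is invertible when the columns
# of A are independent; see the Gram matrix lemma below)
x_ne = np.linalg.solve(A.T @ A, A.T @ y)

# Agrees with the library least-squares solver
x_ls, *_ = np.linalg.lstsq(A, y, rcond=None)
print(np.allclose(x_ne, x_ls))               # True

# The error e = y - Ax is orthogonal to R(A): A'(y - Ax) = 0
print(np.allclose(A.T @ (y - A @ x_ne), 0))  # True
```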
  • Slide 6/10

    The Least Squares Estimation Problem

    Consider again the problem of computing

    min_{x ∈ ℝⁿ} ‖y − Ax‖ = min_{ŷ ∈ R(A)} ‖y − ŷ‖.

    [Figure: y, its projection ŷ onto R(A), and the error e.]

    y can be an infinite-dimensional vector, as long as n is finite.

    We assume that the columns of A = [a₁, a₂, …, aₙ] are independent.

    Lemma (Gram matrix)

    The columns of a matrix A are independent ⇔ the Gram matrix ⟨A, A⟩, with entries ⟨aᵢ, aⱼ⟩, is invertible.

    Proof.

    If the columns are dependent, then there is α ≠ 0 such that Aα = Σⱼ aⱼαⱼ = 0. But then Σⱼ ⟨aᵢ, aⱼ⟩αⱼ = 0 by the linearity of the inner product. That is, ⟨A, A⟩α = 0, and hence ⟨A, A⟩ is not invertible.

    Conversely, if ⟨A, A⟩ is not invertible, then ⟨A, A⟩α = 0 for some α ≠ 0. In other words, ⟨Aα, Aα⟩ = α′⟨A, A⟩α = 0, and hence Aα = 0, i.e., the columns are dependent.

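A quick numerical illustration of the lemma (a sketch, assuming NumPy): with independent columns the Gram matrix A′A has full rank; making one column a linear combination of the others produces an α with ⟨A, A⟩α = 0.

```python
import numpy as np

rng = np.random.default_rng(2)

# Independent columns: the Gram matrix A'A is invertible
A = rng.standard_normal((6, 3))
print(np.linalg.matrix_rank(A.T @ A))     # 3: full rank

# Force a dependency: a3 = 2*a1 - a2
A_dep = A.copy()
A_dep[:, 2] = 2.0 * A_dep[:, 0] - A_dep[:, 1]
G = A_dep.T @ A_dep
print(np.linalg.matrix_rank(G))           # 2: singular, as the lemma predicts

# alpha = (2, -1, -1)' satisfies A_dep @ alpha = 0, hence G @ alpha = 0
alpha = np.array([2.0, -1.0, -1.0])
print(np.allclose(G @ alpha, 0))          # True
```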
  • Slide 7/10

    The Projection theorem and least squares estimation (1)

    y has a unique decomposition y = y₁ + y₂, where y₁ ∈ R(A) and y₂ ∈ R(A)⊥.

    To find this decomposition, let y₁ = Aα, for some α ∈ ℝⁿ. Then, ensure that y₂ = y − y₁ ⊥ R(A). For this to be true,

    ⟨aᵢ, y − Aα⟩ = 0, i = 1, …, n,  i.e.,  ⟨A, y − Aα⟩ = 0.

    Rearranging, we get ⟨A, A⟩α = ⟨A, y⟩; if the columns of A are independent,

    α = ⟨A, A⟩⁻¹⟨A, y⟩.

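The decomposition can be verified numerically (a sketch, assuming NumPy and the standard inner product on ℝᵐ, so that ⟨A, A⟩ = A′A and ⟨A, y⟩ = A′y):

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((8, 3))
y = rng.standard_normal(8)

# alpha = <A,A>^{-1} <A,y>
alpha = np.linalg.solve(A.T @ A, A.T @ y)

y1 = A @ alpha   # component of y in R(A)
y2 = y - y1      # component of y in R(A)-perp

print(np.allclose(A.T @ y2, 0))  # y2 is orthogonal to every column of A
print(np.isclose(y1 @ y2, 0))    # hence y1 and y2 are orthogonal
print(np.isclose(y @ y, y1 @ y1 + y2 @ y2))  # ||y||^2 = ||y1||^2 + ||y2||^2
```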
  • Slide 8/10

    The Projection theorem and least squares estimation (2)

    Decompose e = e₁ + e₂ similarly (e₁ ∈ R(A), and e₂ ∈ R(A)⊥). Note that

    ‖e‖² = ‖e₁‖² + ‖e₂‖².

    Rewrite e = y − Ax as

    e₁ + e₂ = y₁ + y₂ − Ax,  i.e.,  e₂ − y₂ = y₁ − e₁ − Ax.

    Each side must be 0, since the two sides lie in orthogonal subspaces!

    e₂ = y₂: we can't do anything about it.

    e₁ = y₁ − Ax = A(α − x): minimize by choosing x̂ = α. In other words,

    x̂ = ⟨A, A⟩⁻¹⟨A, y⟩.

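Consistent with this (another NumPy sketch, not from the slides): the least-squares estimate coincides with the projection coefficients α, and the remaining error is exactly y₂, the part of y outside R(A).

```python
import numpy as np

rng = np.random.default_rng(4)
A = rng.standard_normal((8, 3))
y = rng.standard_normal(8)

alpha = np.linalg.solve(A.T @ A, A.T @ y)       # projection coefficients
x_hat, *_ = np.linalg.lstsq(A, y, rcond=None)   # least squares estimate
print(np.allclose(x_hat, alpha))                # True: x_hat = alpha

# With x = x_hat, e1 = A(alpha - x_hat) = 0 and the error reduces to y2
e = y - A @ x_hat
y2 = y - A @ alpha
print(np.allclose(e, y2))                       # True
```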
  • Slide 9/10

    Examples

    If y, e ∈ ℝᵐ, and it is desired to minimize ‖e‖² = e′e = Σᵢ |eᵢ|², then

    x̂ = (A′A)⁻¹A′y.

    (If the columns of A are mutually orthogonal, A′A is diagonal, and inversion is easy.)

    If y, e ∈ ℝᵐ, and it is desired to minimize e′Se, where S is a Hermitian, positive-definite matrix, then

    x̂ = (A′SA)⁻¹A′Sy.

    Note that if S is diagonal, then e′Se = Σᵢ sᵢᵢ|eᵢ|², i.e., we are minimizing a weighted least squares criterion. A large sᵢᵢ penalizes the i-th component of the error more relative to the others.

    In a general stochastic setting, the weight matrix S should be related to the noise covariance, i.e.,

    S = (E[ee′])⁻¹.

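For the weighted case, a short sketch (assuming NumPy; the diagonal weights below are made-up illustration values): x̂ = (A′SA)⁻¹A′Sy minimizes e′Se, and a large sᵢᵢ pushes the i-th error component toward zero.

```python
import numpy as np

rng = np.random.default_rng(5)
A = rng.standard_normal((8, 3))
y = rng.standard_normal(8)

# Diagonal S: penalize the first error component heavily
S = np.diag([100.0, 1, 1, 1, 1, 1, 1, 1])

# Weighted least squares estimate: x = (A'SA)^{-1} A'S y
x_w = np.linalg.solve(A.T @ S @ A, A.T @ S @ y)
e_w = y - A @ x_w

# Compare with the unweighted estimate (S = I)
x_u = np.linalg.solve(A.T @ A, A.T @ y)
e_u = y - A @ x_u

print(e_w @ S @ e_w <= e_u @ S @ e_u)  # True: x_w minimizes e'Se
print(abs(e_w[0]), abs(e_u[0]))        # first component typically much smaller
```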