Systems Identification

Ali Karimpour, Assistant Professor, Ferdowsi University of Mashhad

Reference: "System Identification: Theory for the User", Lennart Ljung (1999)


Page 1: SYSTEMS Identification

Systems Identification

Ali Karimpour

Assistant Professor

Ferdowsi University of Mashhad

Reference: "System Identification: Theory for the User", Lennart Ljung (1999)

Page 2: SYSTEMS Identification

lecture 3

Ali Karimpour Oct 2010

2

Lecture 3

Simulation and prediction

Topics to be covered include:

Simulation.

Prediction.

Observers.

Page 3: SYSTEMS Identification


Simulation

Suppose the system description is given by

y(t) = G(q)u(t) + H(q)e(t)

Choose an input

u(t), \quad t = 1, 2, 3, \ldots

The undisturbed output is

y(t) = G(q)u(t)

Generate (by a computer) a noise sequence e(t), t = 1, 2, 3, \ldots. The disturbance is

v(t) = H(q)e(t)

The output is

y(t) = G(q)u(t) + H(q)e(t)
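The simulation recipe above can be sketched numerically. The particular filters below, G(q) = 0.5 q^{-1}/(1 - 0.7 q^{-1}) and H(q) = 1 + 0.9 q^{-1}, are illustrative assumptions of mine, not taken from the lecture:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200

# Assumed example filters: G(q) = 0.5 q^-1 / (1 - 0.7 q^-1), H(q) = 1 + 0.9 q^-1
u = rng.standard_normal(N)   # chosen input u(t)
e = rng.standard_normal(N)   # computer-generated noise e(t)

y_u = np.zeros(N)            # undisturbed output y(t) = G(q)u(t)
for t in range(1, N):
    y_u[t] = 0.7 * y_u[t - 1] + 0.5 * u[t - 1]

v = e.copy()                 # disturbance v(t) = H(q)e(t) = e(t) + 0.9 e(t-1)
v[1:] += 0.9 * e[:-1]

y = y_u + v                  # simulated output
```

Any stable G and monic H can be substituted; only the two difference equations change.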

Page 4: SYSTEMS Identification


Prediction (Invertibility of noise model)

The disturbance is

v(t) = H(q)e(t) = \sum_{k=0}^\infty h(k) e(t-k)

H must be stable, so

\sum_{k=0}^\infty |h(k)| < \infty

Invertibility of the noise model: if v(t) = H(q)e(t) and v(s) is known for s \le t, can e(t) be computed?

e(t) = \tilde{H}(q) v(t)

with

\sum_{k=0}^\infty |\tilde{h}(k)| < \infty

Is \tilde{H}(q) = H^{-1}(q)?

Page 5: SYSTEMS Identification


Prediction (Invertibility of noise model)

Lemma 1: Consider v(t) defined by

v(t) = H(q)e(t) = \sum_{k=0}^\infty h(k) e(t-k)

Assume that the filter H is stable and let

H(z) = \sum_{k=0}^\infty h(k) z^{-k}

Assume that 1/H(z) is stable and

\frac{1}{H(z)} = \sum_{k=0}^\infty \tilde{h}(k) z^{-k}

Define H^{-1}(q) by

H^{-1}(q) = \sum_{k=0}^\infty \tilde{h}(k) q^{-k}

Then H^{-1}(q) is the inverse of H(q) and

e(t) = H^{-1}(q) v(t) = \tilde{H}(q) v(t)

Exercise 1: Prove Lemma 1.

Page 6: SYSTEMS Identification


Prediction (Invertibility of noise model)

Example 1: A moving average process. Let

v(t) = e(t) + c e(t-1)

that is,

H(q) = 1 + c q^{-1}

This is a moving average of order 1, MA(1). Then

H(z) = 1 + c z^{-1} = \frac{z + c}{z}

If |c| < 1, the inverse filter is determined as

\frac{1}{H(z)} = \frac{z}{z + c} = \sum_{k=0}^\infty (-c)^k z^{-k}

So e(t) is

e(t) = \sum_{k=0}^\infty (-c)^k v(t-k)

Exercise 2: Show the validity of Example 1 by a numerical example, for c = 0.2, c = 0.9 and c = 1.2.

Exercise 3: Exercise 3T.1
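Exercise 2 can be explored with a quick numerical check; the perturbation size and horizon below are arbitrary choices of mine. A tiny error injected into the observed v(t) is damped by the inverse filter when |c| < 1 and amplified when |c| > 1:

```python
import numpy as np

def inverse_error(c, t_end=100, seed=1):
    """Reconstruct e(t_end) via e(t) = sum_k (-c)^k v(t-k),
    after injecting a tiny disturbance into v(0)."""
    rng = np.random.default_rng(seed)
    e = rng.standard_normal(t_end + 1)
    v = e.copy()
    v[1:] += c * e[:-1]          # v(t) = e(t) + c e(t-1)
    v[0] += 1e-6                 # tiny error in the observed v
    ks = np.arange(t_end + 1)
    e_hat = np.sum((-c) ** ks * v[t_end - ks])
    return abs(e_hat - e[t_end])     # ~ |c|**t_end * 1e-6

print(inverse_error(0.2))   # negligible: inverse filter is stable
print(inverse_error(0.9))   # still small
print(inverse_error(1.2))   # enormous: |c| > 1, inverse filter unstable
```

This illustrates why 1/H(z) is required to be stable: the reconstruction error propagates through the inverse filter as (-c)^k.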

Page 7: SYSTEMS Identification


Prediction (One-step-ahead prediction of v)

Now we want to predict v(t) based on the previous observations:

v(t) = \sum_{k=0}^\infty h(k) e(t-k) = e(t) + \sum_{k=1}^\infty h(k) e(t-k)

The knowledge of v(s), s \le t-1, implies the knowledge of e(s), s \le t-1, by invertibility. Also we have

v(t) = e(t) + \sum_{k=1}^\infty h(k) e(t-k) = e(t) + m(t-1)

Suppose that the PDF of e(t) is denoted by f_e(x), so

P(x \le e(t) < x + \Delta x) \approx f_e(x) \Delta x

Now we want to know f_v(x):

f_v(x) \Delta x = P(x \le v(t) < x + \Delta x \mid v^{t-1})
= P(x \le e(t) + m(t-1) < x + \Delta x \mid v^{t-1})
= P(x - m(t-1) \le e(t) < x - m(t-1) + \Delta x)
= f_e(x - m(t-1)) \Delta x

Page 8: SYSTEMS Identification


Prediction (One-step-ahead prediction of v)

So the (posterior) probability density function of v(t), given observations up to time t-1, is

f_v(x) = f_e(x - m(t-1))

1. Maximum a posteriori prediction (MAP): use the value for which the PDF has its maximum:

\hat{v}(t) = \arg\max_x f_v(x)

2. Conditional expectation: use the mean value of the distribution in question:

\hat{v}(t) = \hat{v}(t|t-1)

We mostly use the latter one. So what is \hat{v}(t)?

Exercise 4: Exercise 3D.3

Exercise 5: Exercise 3E.4

Page 9: SYSTEMS Identification


Prediction (One-step-ahead prediction of v)

Since v(t) = e(t) + m(t-1), what is \hat{v}(t|t-1)?

\hat{v}(t|t-1) = m(t-1) = \sum_{k=1}^\infty h(k) e(t-k) = [H(q) - 1] e(t)
= [H(q) - 1] H^{-1}(q) v(t) = [1 - H^{-1}(q)] v(t) = -\sum_{k=1}^\infty \tilde{h}(k) v(t-k)

Conditional expectation:

\hat{v}(t|t-1) = -\sum_{k=1}^\infty \tilde{h}(k) v(t-k)

Alternative formula:

H(q) \hat{v}(t|t-1) = [H(q) - 1] v(t) = \sum_{k=1}^\infty h(k) v(t-k)

Exercise 6: Exercise 3T.1. Suppose that H(q) is inversely stable and monic; what about H^{-1}(q)?

Page 10: SYSTEMS Identification


Prediction (One-step-ahead prediction of v)

Example 3.2: A moving average process. Let

v(t) = e(t) + c e(t-1)

that is,

H(q) = 1 + c q^{-1}

Conditional expectation:

\hat{v}(t|t-1) = -\sum_{k=1}^\infty \tilde{h}(k) v(t-k) = -\sum_{k=1}^\infty (-c)^k v(t-k)

Alternative formula:

H(q) \hat{v}(t|t-1) = \sum_{k=1}^\infty h(k) v(t-k)

which here gives the recursion

\hat{v}(t|t-1) + c \hat{v}(t-1|t-2) = c v(t-1)
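The recursion of Example 3.2 is easy to verify numerically; c = 0.5 below is an arbitrary stable choice of mine. The residual v(t) - v̂(t|t-1) should reproduce the driving noise e(t):

```python
import numpy as np

rng = np.random.default_rng(2)
c = 0.5                    # assumed |c| < 1 so the predictor is stable
N = 300
e = rng.standard_normal(N)

v = e.copy()
v[1:] += c * e[:-1]        # v(t) = e(t) + c e(t-1)

# recursion from H(q) v̂(t|t-1) = [H(q) - 1] v(t):
#   v̂(t|t-1) = c v(t-1) - c v̂(t-1|t-2),  with v̂(0|-1) = 0
v_hat = np.zeros(N)
for t in range(1, N):
    v_hat[t] = c * v[t - 1] - c * v_hat[t - 1]

residual = v - v_hat       # one-step prediction error
print(np.max(np.abs(residual - e)))   # ~ 0: the error is exactly e(t)
```

With zero initial conditions the identity is exact: the prediction error of the optimal one-step predictor is the white source e(t).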

Page 11: SYSTEMS Identification


Prediction (One-step-ahead prediction of v)

Example 3.3: Let

v(t) = \sum_{k=0}^\infty a^k e(t-k), \quad |a| < 1

that is,

H(z) = \sum_{k=0}^\infty a^k z^{-k} = \frac{1}{1 - a z^{-1}}

so

H^{-1}(z) = 1 - a z^{-1}

Conditional expectation:

\hat{v}(t|t-1) = -\sum_{k=1}^\infty \tilde{h}(k) v(t-k) = a v(t-1)

Page 12: SYSTEMS Identification


Prediction (One-step-ahead prediction of y)

Let

y(t) = G(q)u(t) + v(t) = G(q)u(t) + H(q)e(t)

Suppose v(s) is known for s \le t-1 and u(s) is known for s \le t. Since y(t) = G(q)u(t) + v(t),

\hat{y}(t|t-1) = G(q)u(t) + \hat{v}(t|t-1)
= G(q)u(t) + [1 - H^{-1}(q)] v(t)
= G(q)u(t) + [1 - H^{-1}(q)] [y(t) - G(q)u(t)]
= H^{-1}(q) G(q) u(t) + [1 - H^{-1}(q)] y(t)

Equivalently,

H(q) \hat{y}(t|t-1) = G(q)u(t) + [H(q) - 1] y(t)

Explicitly, with H^{-1}(q) G(q) = \sum_{k=1}^\infty l(k) q^{-k},

\hat{y}(t|t-1) = \sum_{k=1}^\infty l(k) u(t-k) - \sum_{k=1}^\infty \tilde{h}(k) y(t-k)

Page 13: SYSTEMS Identification


Prediction (One-step-ahead prediction of y)

Unknown initial conditions. Since

\hat{y}(t|t-1) = \sum_{k=1}^\infty l(k) u(t-k) - \sum_{k=1}^\infty \tilde{h}(k) y(t-k)

and only data over the interval [0, t-1] exist,

\hat{y}(t|t-1) \approx \sum_{k=1}^{t} l(k) u(t-k) - \sum_{k=1}^{t} \tilde{h}(k) y(t-k)

The exact prediction involves time-varying filter coefficients and can be computed using the Kalman filter.

The prediction error:

y(t) - \hat{y}(t|t-1) = H^{-1}(q) [y(t) - G(q)u(t)] = e(t)

So the variable e(t) is the part of y(t) that cannot be predicted from past data. It is also called the innovation at time t.
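A minimal numerical sketch of the one-step predictor of y, assuming the simple illustrative model G(q) = b q^{-1} and H(q) = 1 + c q^{-1} (these choices are mine, not from the lecture). The recursion comes from H(q)ŷ(t|t-1) = G(q)u(t) + [H(q) - 1]y(t), and the prediction error recovers the innovation e(t):

```python
import numpy as np

rng = np.random.default_rng(3)
b, c = 0.5, 0.8            # assumed G(q) = b q^-1, H(q) = 1 + c q^-1
N = 300
u = rng.standard_normal(N)
e = rng.standard_normal(N)

# simulate y(t) = b u(t-1) + e(t) + c e(t-1)
y = e.copy()
y[1:] += b * u[:-1] + c * e[:-1]

# H(q) ŷ(t|t-1) = G(q)u(t) + [H(q) - 1] y(t) gives the recursion
#   ŷ(t|t-1) = b u(t-1) + c y(t-1) - c ŷ(t-1|t-2)
y_hat = np.zeros(N)
for t in range(1, N):
    y_hat[t] = b * u[t - 1] + c * y[t - 1] - c * y_hat[t - 1]

innovation = y - y_hat
print(np.max(np.abs(innovation - e)))   # ~ 0: e(t) is the unpredictable part
```

The predictor only ever uses u(s), s ≤ t-1, and y(s), s ≤ t-1, yet its error is exactly the white noise e(t).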

Page 14: SYSTEMS Identification


Prediction (k-step-ahead prediction of v)

First of all we need the k-step-ahead prediction of v. Since

v(t) = \sum_{l=0}^\infty h(l) e(t-l)

we have

v(t+k) = \sum_{l=0}^\infty h(l) e(t+k-l)
= \sum_{l=0}^{k-1} h(l) e(t+k-l) + \sum_{l=k}^\infty h(l) e(t+k-l)

The first sum is unknown at time t; the second sum is known at time t. So the k-step-ahead predictor of v is

\hat{v}(t+k|t) = \sum_{l=k}^\infty h(l) e(t+k-l) = \bar{H}_k(q) e(t) = \bar{H}_k(q) H^{-1}(q) v(t)

where

\bar{H}_k(q) = \sum_{l=k}^\infty h(l) q^{k-l}, \quad
H(q) = \sum_{l=0}^{k-1} h(l) q^{-l} + q^{-k} \bar{H}_k(q)
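As a sanity check, take the noise model of Example 3.3, H(q) = 1/(1 - a q^{-1}), for which h(l) = a^l; the "known at t" part of the split then collapses to v̂(t+k|t) = a^k v(t). The values a = 0.7, k = 3 are arbitrary choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(4)
a, k = 0.7, 3              # assumed |a| < 1 and a 3-step horizon
N = 200
e = rng.standard_normal(N)

# v(t) = sum_l a^l e(t-l), i.e. v(t) = a v(t-1) + e(t)
v = np.zeros(N)
v[0] = e[0]
for t in range(1, N):
    v[t] = a * v[t - 1] + e[t]

# k-step predictor: keep only the terms known at time t,
#   v̂(t+k|t) = sum_{l >= k} h(l) e(t+k-l),  with h(l) = a^l
t0 = 100
ls = np.arange(k, t0 + k + 1)
v_hat = np.sum(a ** ls * e[t0 + k - ls])

print(v_hat, a ** k * v[t0])   # the two agree: v̂(t+k|t) = a^k v(t)
```

For this model the k-step prediction simply decays the last known disturbance value by a^k, consistent with the unpredictable part being the k most recent noise samples.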

Page 15: SYSTEMS Identification


Prediction (k-step-ahead prediction of y)

The k-step-ahead prediction of v is

\hat{v}(t+k|t) = \sum_{l=k}^\infty h(l) e(t+k-l) = \bar{H}_k(q) e(t) = \bar{H}_k(q) H^{-1}(q) v(t)

Suppose we have measured y(s) for s \le t, and u(s) is known for s \le t+k-1. Since

y(t+k) = G(q)u(t+k) + v(t+k)

let

\hat{y}(t+k|t) = G(q)u(t+k) + \hat{v}(t+k|t)
= G(q)u(t+k) + \bar{H}_k(q) H^{-1}(q) v(t)
= G(q)u(t+k) + \bar{H}_k(q) H^{-1}(q) [y(t) - G(q)u(t)]
= [1 - q^{-k} \bar{H}_k(q) H^{-1}(q)] G(q) u(t+k) + \bar{H}_k(q) H^{-1}(q) y(t)

Page 16: SYSTEMS Identification


Prediction (k-step-ahead prediction of y)

\hat{y}(t+k|t) = [1 - q^{-k} \bar{H}_k(q) H^{-1}(q)] G(q) u(t+k) + \bar{H}_k(q) H^{-1}(q) y(t)

So the k-step-ahead prediction of y is

\hat{y}(t+k|t) = \bar{W}_k(q) G(q) u(t+k) + W_k(q) y(t)

where

W_k(q) = \bar{H}_k(q) H^{-1}(q), \quad \bar{W}_k(q) = 1 - q^{-k} W_k(q)

Define the prediction error of the k-step-ahead prediction as

e_k(t+k) = y(t+k) - \hat{y}(t+k|t)

Exercise 7: Show that the k-step-ahead prediction of y(t) = G(q)u(t) + v(t) can also be viewed as a one-step-ahead prediction associated with the model

y(t) = G(q)u(t) + W_k^{-1}(q) v(t)

Exercise 8: Show that the prediction error of the k-step-ahead prediction is a moving average of e(t+k), ..., e(t+1).

Exercise 9: Exercise 3E.2

Page 17: SYSTEMS Identification


Observer

In many cases we ignore noise, so the deterministic model

y(t) = G(q)u(t)

is used. This description is used for "computing," "guessing," or "predicting," so we need the concept of an observer.

As an example let

G(q) = \frac{b q^{-1}}{1 + a q^{-1}} \quad \left( G(z) = \frac{b}{z + a} \right)

so that

y(t) = \frac{b q^{-1}}{1 + a q^{-1}} u(t)

This means that

y(t) = \sum_{k=1}^\infty b (-a)^{k-1} u(t-k)

or, multiplying through by 1 + a q^{-1},

(1 + a q^{-1}) y(t) = b q^{-1} u(t), \quad \text{i.e.} \quad y(t) = -a y(t-1) + b u(t-1)

So we have the two predictors

\hat{y}(t|t-1) = \sum_{k=1}^\infty b (-a)^{k-1} u(t-k)

\hat{y}(t|t-1) = -a y(t-1) + b u(t-1)

Page 18: SYSTEMS Identification


Observer

So we have

\hat{y}(t|t-1) = \sum_{k=1}^\infty b (-a)^{k-1} u(t-k) \quad (I)

\hat{y}(t|t-1) = -a y(t-1) + b u(t-1) \quad (II)

If input-output data are lacking prior to time t = 0, predictor (I) suffers from an error, while (II) is still correct for t > 0. On the other hand, (I) is unaffected by measurement errors in the output, while (II) is affected by them.

So the choice of predictor can be seen as a trade-off between sensitivity with respect to output measurement errors and rapidly decaying effects of erroneous initial conditions.
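The trade-off can be illustrated numerically; the values a = 0.8, b = 1 and the noise level 0.1 are arbitrary assumptions. With the full input history available, predictor (I) is exact even when the output measurements are noisy, while predictor (II) passes the measurement noise straight through:

```python
import numpy as np

rng = np.random.default_rng(5)
a, b = 0.8, 1.0            # assumed |a| < 1 so old-input effects decay
N = 200
u = rng.standard_normal(N)

y = np.zeros(N)            # noise-free system y(t) = -a y(t-1) + b u(t-1)
for t in range(1, N):
    y[t] = -a * y[t - 1] + b * u[t - 1]

y_meas = y + 0.1 * rng.standard_normal(N)   # noisy output measurements

# (I): predict from past inputs only -- immune to measurement noise
yhat_I = np.zeros(N)
for t in range(1, N):
    ks = np.arange(1, t + 1)
    yhat_I[t] = np.sum(b * (-a) ** (ks - 1) * u[t - ks])

# (II): predict from the measured output -- noise feeds through
yhat_II = np.zeros(N)
yhat_II[1:] = -a * y_meas[:-1] + b * u[:-1]

err_I = np.max(np.abs(yhat_I[1:] - y[1:]))
err_II = np.max(np.abs(yhat_II[1:] - y[1:]))
print(err_I, err_II)       # err_I ~ 0; err_II reflects the measurement noise
```

Conversely, truncating the input history of (I) would leave a transient error decaying like (-a)^t, which is exactly the other side of the trade-off described above.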

Exercise 10 (Exercise 3E.3): Show that if

y(t) = \frac{b q^{-1}}{1 + a q^{-1}} u(t) + H(q) e(t)

then for the noise model H(q) = 1, (I) is the natural predictor, whereas the noise model

H(q) = \sum_{k=0}^\infty (-a)^k q^{-k}

leads to the predictor (II).

Page 19: SYSTEMS Identification


Observer

A family of predictors for y(t) = G(q)u(t).

To introduce design variables for the trade-off between sensitivity to output measurement errors and decaying effects of erroneous initial conditions, choose a filter W(q) such that

W(q) = 1 + \sum_{l=k}^\infty w(l) q^{-l}

Applying it to both sides of y(t) = G(q)u(t), we have

W(q) y(t) = W(q) G(q) u(t)

which means that

y(t) = [1 - W(q)] y(t) + W(q) G(q) u(t)

The right-hand side of this expression depends only on y(s), s \le t-k, and u(s), s \le t-1. So

\hat{y}(t|t-1) = [1 - W(q)] y(t) + W(q) G(q) u(t)

Page 20: SYSTEMS Identification


Observer

For the family of predictors

\hat{y}(t|t-1) = [1 - W(q)] y(t) + W(q) G(q) u(t)

for y(t) = G(q)u(t), the trade-off considerations for the choice of W are:

1. Select W(q) so that both W and WG have rapidly decaying filter coefficients, in order to minimize the influence of erroneous initial conditions.

2. Select W(q) so that measurement imperfections in y(t) are maximally attenuated.

The latter issue can be studied in the frequency domain. Suppose that the measured output is

y_M(t) = y(t) + v(t)

Then the prediction error is

e(t) = y_M(t) - \hat{y}(t|t-1) = W(q) v(t)

Page 21: SYSTEMS Identification


Observer

With the predictor

\hat{y}(t|t-1) = [1 - W(q)] y(t) + W(q) G(q) u(t)

for y(t) = G(q)u(t), and measured output y_M(t) = y(t) + v(t), the prediction error is

e(t) = y_M(t) - \hat{y}(t|t-1) = W(q) v(t)

The spectrum of this error is, according to Theorem 2.2,

\Phi(\omega) = |W(e^{i\omega})|^2 \Phi_v(\omega)

The problem is thus to select W such that the error spectrum has an acceptable size and a suitable shape.

Page 22: SYSTEMS Identification


Observer

Fundamental role of the predictor filter. Let

y(t) = G(q)u(t) + H(q)e(t) \quad \text{or} \quad y(t) = G(q)u(t)

Then y is predicted as

\hat{y}(t|t-1) = H^{-1}(q) G(q) u(t) + [1 - H^{-1}(q)] y(t)

or

\hat{y}(t|t-1) = [1 - W(q)] y(t) + W(q) G(q) u(t)

In both cases the predictor is a linear filter: u(t) and y(t) enter a linear filter whose output is \hat{y}(t|t-1).

[Block diagram: u(t) and y(t) feed a linear filter that outputs \hat{y}(t|t-1).]