Review and Summary: Box-Jenkins Models for Stationary Time Series, AR(p), MA(q), ARMA(p,q)


Models for Stationary Time Series

The Moving Average Time Series of Order q, MA(q)

$\{x_t : t \in T\}$ is a moving average time series of order q, MA(q), if it satisfies the equation

$$x_t = \mu + u_t + \alpha_1 u_{t-1} + \alpha_2 u_{t-2} + \cdots + \alpha_q u_{t-q}$$

or, equivalently, $x_t = \mu + \alpha(B)u_t$, where $\{u_t : t \in T\}$ is a white noise time series with variance $\sigma^2$ and

$$\alpha(B) = I + \alpha_1 B + \alpha_2 B^2 + \cdots + \alpha_q B^q.$$

The autocovariance function for an MA(q) time series:

$$\gamma(h) = \begin{cases} \sigma^2 \sum_{i=0}^{q-h} \alpha_i \alpha_{i+h} & \text{if } 0 \le h \le q \\ 0 & \text{if } h > q \end{cases} \qquad (\alpha_0 = 1)$$

The autocorrelation function for an MA(q) time series:

$$\rho(h) = \frac{\gamma(h)}{\gamma(0)} = \begin{cases} \dfrac{\sum_{i=0}^{q-h} \alpha_i \alpha_{i+h}}{\sum_{i=0}^{q} \alpha_i^2} & \text{if } 0 \le h \le q \\ 0 & \text{if } h > q \end{cases}$$

The mean: $E(x_t) = \mu$.
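As a quick illustration of these two formulas, here is a minimal sketch (plain NumPy; the function name is ours, not from the notes) that computes the theoretical autocovariance and autocorrelation of an MA(q) from the coefficients α1, ..., αq:

```python
import numpy as np

def ma_acf(alpha, sigma2=1.0, max_lag=10):
    """Theoretical autocovariance/autocorrelation of an MA(q).

    alpha: [alpha_1, ..., alpha_q]; alpha_0 = 1 is implicit.
    """
    a = np.r_[1.0, np.asarray(alpha, dtype=float)]   # (alpha_0, ..., alpha_q)
    q = len(a) - 1
    gamma = np.zeros(max_lag + 1)
    for h in range(max_lag + 1):
        if h <= q:
            gamma[h] = sigma2 * np.sum(a[: q - h + 1] * a[h:])
        # gamma[h] stays 0 for h > q: the ACF cuts off after lag q
    return gamma, gamma / gamma[0]

gamma, rho = ma_acf([0.6, 0.4], sigma2=2.56, max_lag=5)
print(np.round(rho, 3))   # rho(h) = 0 for all h > 2
```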

The Autoregressive Time Series of Order p, AR(p)

$\{x_t : t \in T\}$ is called an autoregressive time series of order p, AR(p), if it satisfies the equation

$$x_t = \beta_1 x_{t-1} + \beta_2 x_{t-2} + \cdots + \beta_p x_{t-p} + \delta + u_t$$

or $\beta(B)x_t = \delta + u_t$, where $\{u_t : t \in T\}$ is a white noise time series with variance $\sigma^2$ and

$$\beta(B) = I - \beta_1 B - \beta_2 B^2 - \cdots - \beta_p B^p.$$

The mean value of a stationary AR(p) series:

$$\mu = E(x_t) = \frac{\delta}{1 - \beta_1 - \beta_2 - \cdots - \beta_p}$$

The autocovariance function γ(h) of a stationary AR(p) series satisfies the equations:

$$\gamma(0) = \beta_1\gamma(1) + \beta_2\gamma(2) + \cdots + \beta_p\gamma(p) + \sigma^2$$

$$\gamma(1) = \beta_1\gamma(0) + \beta_2\gamma(1) + \cdots + \beta_p\gamma(p-1)$$

$$\gamma(2) = \beta_1\gamma(1) + \beta_2\gamma(0) + \cdots + \beta_p\gamma(p-2)$$

$$\vdots$$

$$\gamma(p) = \beta_1\gamma(p-1) + \beta_2\gamma(p-2) + \cdots + \beta_p\gamma(0)$$

and

$$\gamma(h) = \beta_1\gamma(h-1) + \beta_2\gamma(h-2) + \cdots + \beta_p\gamma(h-p) \quad \text{for } h > p.$$

The autocorrelation function ρ(h) of a stationary AR(p) series satisfies the Yule-Walker equations:

$$\rho(1) = \beta_1 + \beta_2\rho(1) + \cdots + \beta_p\rho(p-1)$$

$$\rho(2) = \beta_1\rho(1) + \beta_2 + \cdots + \beta_p\rho(p-2)$$

$$\vdots$$

$$\rho(p) = \beta_1\rho(p-1) + \beta_2\rho(p-2) + \cdots + \beta_p$$

with

$$\rho(h) = \beta_1\rho(h-1) + \beta_2\rho(h-2) + \cdots + \beta_p\rho(h-p) \quad \text{for } h > p$$

and

$$\sigma^2 = \gamma(0)\left[1 - \beta_1\rho(1) - \beta_2\rho(2) - \cdots - \beta_p\rho(p)\right].$$

Or, in closed form:

$$\rho(h) = c_1\left(\tfrac{1}{r_1}\right)^h + c_2\left(\tfrac{1}{r_2}\right)^h + \cdots + c_p\left(\tfrac{1}{r_p}\right)^h$$

where $r_1, r_2, \ldots, r_p$ are the roots of the polynomial

$$\beta(x) = 1 - \beta_1 x - \beta_2 x^2 - \cdots - \beta_p x^p = \left(1 - \tfrac{x}{r_1}\right)\left(1 - \tfrac{x}{r_2}\right)\cdots\left(1 - \tfrac{x}{r_p}\right)$$

and $c_1, c_2, \ldots, c_p$ are determined by using the starting values of the sequence ρ(h).

Stationarity of an AR(p) time series: consider the polynomial

$$\beta(x) = 1 - \beta_1 x - \beta_2 x^2 - \cdots - \beta_p x^p = \left(1 - \tfrac{x}{r_1}\right)\left(1 - \tfrac{x}{r_2}\right)\cdots\left(1 - \tfrac{x}{r_p}\right)$$

with roots $r_1, r_2, \ldots, r_p$. Then:

1. $\{x_t : t \in T\}$ is stationary if $|r_i| > 1$ for all $i$.

2. If $|r_i| < 1$ for at least one $i$, then $\{x_t : t \in T\}$ exhibits deterministic behaviour.

3. If $|r_i| \ge 1$ for all $i$ and $|r_i| = 1$ for at least one $i$, then $\{x_t : t \in T\}$ exhibits non-stationary random behaviour.
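Both computations above are easy to mechanize. Here is a minimal sketch (NumPy; the helper name is ours) that solves the Yule-Walker system for ρ(1), ..., ρ(p) and checks stationarity through the roots of β(x):

```python
import numpy as np

def ar_acf_and_roots(beta):
    """Yule-Walker solution rho(1..p) and the roots of beta(x) for an AR(p)."""
    beta = np.asarray(beta, dtype=float)
    p = len(beta)
    # Equation k: rho(k) - sum_{i != k} beta_i * rho(|k-i|) = beta_k
    # (the i = k term uses rho(0) = 1 and moves to the right-hand side).
    A = np.eye(p)
    b = beta.copy()
    for k in range(1, p + 1):
        for i in range(1, p + 1):
            lag = abs(k - i)
            if lag == 0:
                continue
            A[k - 1, lag - 1] -= beta[i - 1]
    rho = np.linalg.solve(A, b)
    # beta(x) = 1 - beta_1 x - ... - beta_p x^p; np.roots wants highest power first
    roots = np.roots(np.r_[-beta[::-1], 1.0])
    return rho, roots

rho, roots = ar_acf_and_roots([0.5, -0.3])
print(np.round(rho, 4), np.abs(roots))   # stationary iff all |r_i| > 1
```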

The Mixed Autoregressive Moving Average Time Series of Order p, q: ARMA(p,q)

A mixed autoregressive moving average time series, ARMA(p,q), $\{x_t : t \in T\}$ satisfies the equation:

$$x_t = \beta_1 x_{t-1} + \beta_2 x_{t-2} + \cdots + \beta_p x_{t-p} + \delta + u_t + \alpha_1 u_{t-1} + \alpha_2 u_{t-2} + \cdots + \alpha_q u_{t-q}$$

or $\beta(B)x_t = \delta + \alpha(B)u_t$, where $\{u_t : t \in T\}$ is a white noise time series with variance $\sigma^2$,

$$\beta(B) = I - \beta_1 B - \beta_2 B^2 - \cdots - \beta_p B^p \quad \text{and} \quad \alpha(B) = I + \alpha_1 B + \alpha_2 B^2 + \cdots + \alpha_q B^q.$$

The mean value of a stationary ARMA(p,q) series:

$$\mu = E(x_t) = \frac{\delta}{1 - \beta_1 - \beta_2 - \cdots - \beta_p}$$

Stationarity of an ARMA(p,q) series: consider the polynomial

$$\beta(x) = 1 - \beta_1 x - \beta_2 x^2 - \cdots - \beta_p x^p = \left(1 - \tfrac{x}{r_1}\right)\left(1 - \tfrac{x}{r_2}\right)\cdots\left(1 - \tfrac{x}{r_p}\right)$$

with roots $r_1, r_2, \ldots, r_p$. Then:

1. $\{x_t : t \in T\}$ is stationary if $|r_i| > 1$ for all $i$.

2. If $|r_i| < 1$ for at least one $i$, then $\{x_t : t \in T\}$ exhibits deterministic behaviour.

3. If $|r_i| \ge 1$ for all $i$ and $|r_i| = 1$ for at least one $i$, then $\{x_t : t \in T\}$ exhibits non-stationary random behaviour.

The autocovariance function γ(h) of a stationary ARMA(p,q) series satisfies:

$$\gamma(h) = \beta_1\gamma(h-1) + \cdots + \beta_p\gamma(h-p) + \gamma_{ux}(h) + \alpha_1\gamma_{ux}(h-1) + \cdots + \alpha_q\gamma_{ux}(h-q)$$

where $\gamma_{ux}(h) = E(u_{t+h}\,x_t)$ denotes the cross-covariance function of $u$ and $x$. Note that $\gamma_{ux}(h) = 0$ for $h > 0$, and that

$$\gamma_{ux}(h) = \beta_1\gamma_{ux}(h+1) + \cdots + \beta_p\gamma_{ux}(h+p) + \gamma_{uu}(h) + \alpha_1\gamma_{uu}(h+1) + \cdots + \alpha_q\gamma_{uu}(h+q)$$

where $\gamma_{uu}(0) = \sigma^2$ and $\gamma_{uu}(h) = 0$ for $h \ne 0$. Working downward from $h = 0$:

$$\gamma_{ux}(0) = \sigma^2, \qquad \gamma_{ux}(-1) = \sigma^2(\beta_1 + \alpha_1), \qquad \gamma_{ux}(-2) = \sigma^2(\beta_1^2 + \beta_1\alpha_1 + \beta_2 + \alpha_2), \qquad \ldots$$

For h = 0, 1, ..., q:

$$\gamma(0) = \beta_1\gamma(1) + \cdots + \beta_p\gamma(p) + \gamma_{ux}(0) + \alpha_1\gamma_{ux}(-1) + \cdots + \alpha_q\gamma_{ux}(-q)$$

$$\gamma(1) = \beta_1\gamma(0) + \cdots + \beta_p\gamma(p-1) + \alpha_1\gamma_{ux}(0) + \cdots + \alpha_q\gamma_{ux}(1-q)$$

$$\vdots$$

$$\gamma(q) = \beta_1\gamma(q-1) + \cdots + \beta_p\gamma(q-p) + \alpha_q\gamma_{ux}(0)$$

and for h > q:

$$\gamma(h) = \beta_1\gamma(h-1) + \beta_2\gamma(h-2) + \cdots + \beta_p\gamma(h-p).$$

The Partial Autocorrelation Function

The partial autocorrelation function at lag k, $\phi_{kk}$, is defined (using Cramer's rule) to be the ratio of determinants

$$\phi_{kk} = \frac{\begin{vmatrix} 1 & \rho_1 & \cdots & \rho_{k-2} & \rho_1 \\ \rho_1 & 1 & \cdots & \rho_{k-3} & \rho_2 \\ \vdots & \vdots & & \vdots & \vdots \\ \rho_{k-1} & \rho_{k-2} & \cdots & \rho_1 & \rho_k \end{vmatrix}}{\begin{vmatrix} 1 & \rho_1 & \cdots & \rho_{k-1} \\ \rho_1 & 1 & \cdots & \rho_{k-2} \\ \vdots & \vdots & & \vdots \\ \rho_{k-1} & \rho_{k-2} & \cdots & 1 \end{vmatrix}}$$

where the numerator is the k × k autocorrelation matrix with its last column replaced by $(\rho_1, \rho_2, \ldots, \rho_k)'$. In particular,

$$\phi_{11} = \rho_1, \qquad \phi_{22} = \frac{\rho_2 - \rho_1^2}{1 - \rho_1^2}.$$

A recursive formula for $\phi_{kk}$: starting with $\phi_{11} = \rho_1$,

$$\phi_{k+1,k+1} = \frac{\rho_{k+1} - \sum_{j=1}^{k} \phi_{kj}\,\rho_{k+1-j}}{1 - \sum_{j=1}^{k} \phi_{kj}\,\rho_j}$$

and

$$\phi_{k+1,j} = \phi_{kj} - \phi_{k+1,k+1}\,\phi_{k,k+1-j}, \qquad j = 1, 2, \ldots, k.$$
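A direct transcription of this recursion (a minimal NumPy sketch; the function name is ours, and the recursion is commonly known as the Durbin-Levinson algorithm) returns $\phi_{11}, \phi_{22}, \ldots$ from a sequence of autocorrelations:

```python
import numpy as np

def pacf_from_acf(rho):
    """Partial autocorrelations phi_{kk} from rho = [rho_1, rho_2, ...]."""
    rho = np.asarray(rho, dtype=float)
    K = len(rho)
    pacf = np.zeros(K)
    phi = np.zeros((K + 1, K + 1))        # phi[k, j] holds phi_{kj}
    pacf[0] = phi[1, 1] = rho[0]          # phi_11 = rho_1
    for k in range(1, K):
        num = rho[k] - np.sum(phi[k, 1:k + 1] * rho[k - 1::-1])
        den = 1.0 - np.sum(phi[k, 1:k + 1] * rho[:k])
        phi[k + 1, k + 1] = num / den
        # update the intermediate coefficients phi_{k+1, j}
        for j in range(1, k + 1):
            phi[k + 1, j] = phi[k, j] - phi[k + 1, k + 1] * phi[k, k + 1 - j]
        pacf[k] = phi[k + 1, k + 1]
    return pacf

# AR(1) with beta_1 = 0.6: rho_h = 0.6**h, so the PACF cuts off after lag 1
rho = 0.6 ** np.arange(1, 6)
print(np.round(pacf_from_acf(rho), 4))   # [0.6, 0, 0, 0, 0]
```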

Spectral Density Function f(λ)

Let $\{x_t : t \in T\}$ denote a time series with autocovariance function γ(h), and let $f(\lambda)$ satisfy:

$$\gamma(h) = \int_0^{\pi} \cos(\lambda h)\, f(\lambda)\, d\lambda.$$

Then $f(\lambda)$ is called the spectral density function of the time series $\{x_t : t \in T\}$. Also,

$$f(\lambda) = \frac{1}{\pi}\left[\gamma(0) + 2\sum_{h=1}^{\infty} \gamma(h)\cos(\lambda h)\right].$$

Linear Filters

Let $\{x_t : t \in T\}$ be any time series and suppose that the time series $\{y_t : t \in T\}$ is constructed as follows:

$$y_t = \sum_{s=-\infty}^{\infty} a_s x_{t-s}.$$

Then $\{y_t : t \in T\}$ is said to be constructed from $\{x_t : t \in T\}$ by means of a linear filter, with input $x_t$ and output $y_t$.

Spectral Theory for Linear Filters

If $\{y_t : t \in T\}$ is obtained from $\{x_t : t \in T\}$ by the linear filter

$$y_t = \sum_{s} a_s x_{t-s},$$

then

$$f_y(\lambda) = \left|A(e^{i\lambda})\right|^2 f_x(\lambda) = \left|\sum_{s} a_s e^{-is\lambda}\right|^2 f_x(\lambda).$$

Applications

Since $\{u_t : t \in T\}$ is white noise with variance $\sigma^2$, its spectral density is $f_u(\lambda) = \sigma^2/\pi$.

1. The moving average time series of order q, MA(q):

$$x_t = \mu + u_t + \alpha_1 u_{t-1} + \alpha_2 u_{t-2} + \cdots + \alpha_q u_{t-q}$$

is a linear filter of the white noise series, hence (with $\alpha_0 = 1$)

$$f_x(\lambda) = \left|\sum_{s=0}^{q} \alpha_s e^{-is\lambda}\right|^2 f_u(\lambda) = \frac{\sigma^2}{\pi}\left|1 + \alpha_1 e^{-i\lambda} + \cdots + \alpha_q e^{-iq\lambda}\right|^2.$$

2. The autoregressive time series of order p, AR(p):

$$x_t = \beta_1 x_{t-1} + \beta_2 x_{t-2} + \cdots + \beta_p x_{t-p} + \delta + u_t, \quad \text{or} \quad u_t = x_t - \beta_1 x_{t-1} - \beta_2 x_{t-2} - \cdots - \beta_p x_{t-p} - \delta,$$

so $\{u_t\}$ is a linear filter of $\{x_t\}$ and

$$f_u(\lambda) = \frac{\sigma^2}{\pi} = \left|1 - \beta_1 e^{-i\lambda} - \cdots - \beta_p e^{-ip\lambda}\right|^2 f_x(\lambda),$$

hence

$$f_x(\lambda) = \frac{\sigma^2}{\pi\left|1 - \beta_1 e^{-i\lambda} - \cdots - \beta_p e^{-ip\lambda}\right|^2}.$$

3. The ARMA(p,q) time series: $\beta(B)x_t = \delta + \alpha(B)u_t$. Write $z_t = \alpha(B)u_t$, where $\{z_t : t \in T\}$ is an MA(q) time series; then

$$x_t = \beta_1 x_{t-1} + \beta_2 x_{t-2} + \cdots + \beta_p x_{t-p} + \delta + z_t, \quad \text{or} \quad z_t = x_t - \beta_1 x_{t-1} - \cdots - \beta_p x_{t-p} - \delta.$$

Combining the two previous results,

$$\left|1 + \alpha_1 e^{-i\lambda} + \cdots + \alpha_q e^{-iq\lambda}\right|^2 f_u(\lambda) = f_z(\lambda) = \left|1 - \beta_1 e^{-i\lambda} - \cdots - \beta_p e^{-ip\lambda}\right|^2 f_x(\lambda),$$

hence

$$f_x(\lambda) = \frac{\sigma^2}{\pi}\,\frac{\left|1 + \alpha_1 e^{-i\lambda} + \cdots + \alpha_q e^{-iq\lambda}\right|^2}{\left|1 - \beta_1 e^{-i\lambda} - \cdots - \beta_p e^{-ip\lambda}\right|^2}.$$
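All three formulas can be evaluated numerically. A minimal sketch (NumPy; the function name is ours) for the ARMA(p,q) spectral density, which reduces to the MA and AR cases when beta or alpha is empty:

```python
import numpy as np

def arma_spectral_density(lam, beta=(), alpha=(), sigma2=1.0):
    """f(lam) = (sigma2/pi) |alpha(e^{-i lam})|^2 / |beta(e^{-i lam})|^2.

    beta  = (beta_1, ..., beta_p):   beta(z)  = 1 - beta_1 z - ... - beta_p z^p
    alpha = (alpha_1, ..., alpha_q): alpha(z) = 1 + alpha_1 z + ... + alpha_q z^q
    Empty beta gives the MA(q) density, empty alpha the AR(p) density.
    """
    z = np.exp(-1j * np.asarray(lam, dtype=float))
    a = np.polyval(np.r_[1.0, np.asarray(alpha)][::-1], z)   # alpha(e^{-i lam})
    b = np.polyval(np.r_[1.0, -np.asarray(beta)][::-1], z)   # beta(e^{-i lam})
    return (sigma2 / np.pi) * np.abs(a) ** 2 / np.abs(b) ** 2

lam = np.linspace(0.0, np.pi, 5)
print(arma_spectral_density(lam, beta=[0.8], alpha=[0.6, 0.4], sigma2=1.0))
```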

Summary of Properties of MA, AR, ARMA Processes

MA(q)
Equation: $x_t = \mu + \alpha(B)u_t$
Stationarity: always stationary
Invertibility: invertible if $|r_i| > 1$ for all $i$, where the $r_i$ are the roots of $\alpha(x) = 0$
Autocorrelation function: cuts off after lag q
Partial autocorrelation function: infinite; tails off as damped exponentials and/or cosine waves
Spectral density function: $f(\lambda) = \frac{\sigma^2}{\pi}\left|\alpha(e^{-i\lambda})\right|^2$

AR(p)
Equation: $\beta(B)x_t = \delta + u_t$
Stationarity: stationary if $|r_i| > 1$ for all $i$, where the $r_i$ are the roots of $\beta(x) = 0$
Invertibility: always invertible
Autocorrelation function: infinite; tails off as damped exponentials and/or cosine waves
Partial autocorrelation function: cuts off after lag p
Spectral density function: $f(\lambda) = \dfrac{\sigma^2}{\pi\left|\beta(e^{-i\lambda})\right|^2}$

ARMA(p,q)
Equation: $\beta(B)x_t = \delta + \alpha(B)u_t$
Stationarity: stationary if $|r_i| > 1$ for all $i$, where the $r_i$ are the roots of $\beta(x) = 0$
Invertibility: invertible if $|r_i| > 1$ for all $i$, where the $r_i$ are the roots of $\alpha(x) = 0$
Autocorrelation function: infinite; tails off, dominated by damped exponentials and/or cosine waves after q - p
Partial autocorrelation function: infinite; tails off, dominated by damped exponentials and/or cosine waves after p - q
Spectral density function: $f(\lambda) = \dfrac{\sigma^2}{\pi}\,\dfrac{\left|\alpha(e^{-i\lambda})\right|^2}{\left|\beta(e^{-i\lambda})\right|^2}$

Three Important Forms of a Stationary Time Series

The Difference Equation Form:

$$\beta(B)x_t = \delta + \alpha(B)u_t$$

or

$$x_t = \beta_1 x_{t-1} + \beta_2 x_{t-2} + \cdots + \beta_p x_{t-p} + \delta + u_t + \alpha_1 u_{t-1} + \alpha_2 u_{t-2} + \cdots + \alpha_q u_{t-q}$$

The Random Shock Form:

$$x_t = \mu + \psi(B)u_t \quad \text{or} \quad x_t = \mu + u_t + \psi_1 u_{t-1} + \psi_2 u_{t-2} + \psi_3 u_{t-3} + \cdots$$

where

$$\psi(B) = \beta(B)^{-1}\alpha(B) = I + \psi_1 B + \psi_2 B^2 + \psi_3 B^3 + \cdots$$

and

$$\mu = \beta(B)^{-1}\delta = \delta/(1 - \beta_1 - \beta_2 - \cdots - \beta_p).$$

The Inverted Form:

$$\pi(B)x_t = \delta + u_t \quad \text{or} \quad x_t = \pi_1 x_{t-1} + \pi_2 x_{t-2} + \pi_3 x_{t-3} + \cdots + \delta + u_t$$

where

$$\pi(B) = [\alpha(B)]^{-1}[\beta(B)] = I - \pi_1 B - \pi_2 B^2 - \pi_3 B^3 - \cdots$$
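The ψ weights can be computed by equating coefficients in β(B)ψ(B) = α(B), which gives ψ_j = α_j + β_1ψ_{j-1} + ... + β_pψ_{j-p} with ψ_0 = 1 and α_j = 0 for j > q. A minimal sketch (plain Python/NumPy; the helper name is ours):

```python
import numpy as np

def psi_weights(beta, alpha, n=10):
    """psi_j from beta(B) psi(B) = alpha(B): psi_j = alpha_j + sum_i beta_i psi_{j-i}."""
    beta, alpha = np.asarray(beta, float), np.asarray(alpha, float)
    psi = np.zeros(n + 1)
    psi[0] = 1.0
    for j in range(1, n + 1):
        a_j = alpha[j - 1] if j <= len(alpha) else 0.0
        s = sum(beta[i - 1] * psi[j - i] for i in range(1, min(j, len(beta)) + 1))
        psi[j] = a_j + s
    return psi

# ARMA(1,1): x_t = 0.5 x_{t-1} + u_t + 0.3 u_{t-1}
print(np.round(psi_weights([0.5], [0.3], n=5), 4))
# [1.  0.8  0.4  0.2  0.1  0.05]
```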

Models for Non-Stationary Time Series

The ARIMA(p,d,q) time series

An important fact: most non-stationary time series have changes that are stationary. Recall the time series $\{x_t : t \in T\}$ defined by the following equation:

$$x_t = \beta_1 x_{t-1} + u_t$$

Then:

1) if $|\beta_1| < 1$, the time series $\{x_t : t \in T\}$ is a stationary time series;

2) if $|\beta_1| = 1$, the time series $\{x_t : t \in T\}$ is a non-stationary time series;

3) if $|\beta_1| > 1$, the time series $\{x_t : t \in T\}$ is deterministic in nature.

In fact, if $\beta_1 = 1$ then this equation becomes:

$$x_t = x_{t-1} + u_t$$

This is the equation of a well-known non-stationary time series, called a Random Walk. Note:

$$x_t - x_{t-1} = (I - B)x_t = \nabla x_t = u_t, \quad \text{where } \nabla = I - B.$$

Thus, by the simple transformation of computing first differences, we can convert the time series $\{x_t : t \in T\}$ into a stationary time series.
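A quick simulation of this (a sketch with NumPy; the seed and series length are arbitrary choices): the random walk wanders, while its first differences are just the white noise series again:

```python
import numpy as np

rng = np.random.default_rng(0)
u = rng.normal(0.0, 1.0, size=200)   # white noise
x = np.cumsum(u)                     # random walk: x_t = x_{t-1} + u_t
dx = np.diff(x)                      # first differences: (I - B) x_t

print(np.var(x[:100]), np.var(x[100:]))  # variance drifts over time (non-stationary)
print(np.allclose(dx, u[1:]))            # True: differencing recovers u_t
```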

Now consider the time series $\{x_t : t \in T\}$ defined by the equation:

$$\varphi(B)x_t = \delta + \alpha(B)u_t$$

where

$$\varphi(B) = I - \varphi_1 B - \varphi_2 B^2 - \cdots - \varphi_{p+d} B^{p+d}.$$

Let $r_1, r_2, \ldots, r_{p+d}$ be the roots of the polynomial $\varphi(x)$, where

$$\varphi(x) = 1 - \varphi_1 x - \varphi_2 x^2 - \cdots - \varphi_{p+d} x^{p+d}.$$

Then:

1) if $|r_i| > 1$ for $i = 1, 2, \ldots, p+d$, the time series $\{x_t : t \in T\}$ is a stationary time series;

2) if $|r_i| = 1$ for at least one $i$ and $|r_i| > 1$ for the remaining values of $i$, the time series $\{x_t : t \in T\}$ is a non-stationary time series;

3) if $|r_i| < 1$ for at least one $i$, the time series $\{x_t : t \in T\}$ is deterministic in nature.

Suppose that d roots of the polynomial $\varphi(x)$ are equal to unity. Then $\varphi(x)$ can be written:

$$\varphi(x) = (1 - \beta_1 x - \beta_2 x^2 - \cdots - \beta_p x^p)(1 - x)^d$$

and $\varphi(B)$ can be written:

$$\varphi(B) = (I - \beta_1 B - \beta_2 B^2 - \cdots - \beta_p B^p)(I - B)^d = \beta(B)\nabla^d.$$

In this case the equation for the time series becomes:

$$\varphi(B)x_t = \delta + \alpha(B)u_t \quad \text{or} \quad \beta(B)\nabla^d x_t = \delta + \alpha(B)u_t.$$

Thus, if we let $w_t = \nabla^d x_t$, then the equation for $\{w_t : t \in T\}$ becomes:

$$\beta(B)w_t = \delta + \alpha(B)u_t$$

Since the roots of $\beta(x)$ are all greater than 1 in absolute value, the time series $\{w_t : t \in T\}$ is a stationary ARMA(p,q) time series. The original time series $\{x_t : t \in T\}$ is called an ARIMA(p,d,q), or autoregressive integrated moving average, time series.

The reason for this terminology is that in the case d = 1, $\{x_t : t \in T\}$ can be expressed in terms of $\{w_t : t \in T\}$ as follows:

$$x_t = \nabla^{-1}w_t = (I - B)^{-1}w_t = (I + B + B^2 + B^3 + B^4 + \cdots)w_t = w_t + w_{t-1} + w_{t-2} + w_{t-3} + w_{t-4} + \cdots$$

Comments:

1. The operator $\varphi(B) = \beta(B)\nabla^d$ is called the generalized autoregressive operator.

2. The operator $\beta(B)$ is called the autoregressive operator.

3. The operator $\alpha(B)$ is called the moving average operator.

4. If d = 0 then the process is stationary and the level of the process is constant.

5. If d = 1 then the level of the process is randomly changing.

6. If d = 2 then the slope of the process is randomly changing.

[Three simulated series, t = 0 to 200, illustrating increasing d: $\beta(B)x_t = \delta + \alpha(B)u_t$, $\beta(B)\nabla x_t = \delta + \alpha(B)u_t$, and $\beta(B)\nabla^2 x_t = \delta + \alpha(B)u_t$.]

Forecasting for ARIMA(p,d,q) Time Series

Consider the m + n random variables $x_1, x_2, \ldots, x_m, y_1, y_2, \ldots, y_n$ with joint density function $f(x_1, \ldots, x_m, y_1, \ldots, y_n) = f(\mathbf{x}, \mathbf{y})$, where $\mathbf{x} = (x_1, \ldots, x_m)$ and $\mathbf{y} = (y_1, \ldots, y_n)$.

Then the conditional density of $\mathbf{x}$ given $\mathbf{y}$ is defined to be:

$$f(\mathbf{x} \mid \mathbf{y}) = \frac{f(\mathbf{x}, \mathbf{y})}{f(\mathbf{y})} = \frac{f(x_1, \ldots, x_m, y_1, \ldots, y_n)}{f(y_1, \ldots, y_n)}$$

In addition, the conditional expectation of $g(\mathbf{x}) = g(x_1, \ldots, x_m)$ given $\mathbf{y} = (y_1, \ldots, y_n)$ is defined to be:

$$E[g(\mathbf{x}) \mid \mathbf{y}] = E[g(x_1, \ldots, x_m) \mid y_1, \ldots, y_n] = \int \cdots \int g(x_1, \ldots, x_m)\, f(\mathbf{x} \mid \mathbf{y})\, dx_1 \cdots dx_m$$

Prediction

Again consider the m + n random variables $x_1, \ldots, x_m, y_1, \ldots, y_n$. Suppose we are interested in predicting $g(x_1, \ldots, x_m) = g(\mathbf{x})$ given $\mathbf{y} = (y_1, \ldots, y_n)$. Let $t(y_1, \ldots, y_n) = t(\mathbf{y})$ denote any predictor of $g(\mathbf{x})$ given the information in the observations $\mathbf{y}$. Then the mean square error of $t(\mathbf{y})$ in predicting $g(\mathbf{x})$ is defined to be:

$$\mathrm{MSE}[t(\mathbf{y})] = E\left[\{t(\mathbf{y}) - g(\mathbf{x})\}^2 \mid \mathbf{y}\right]$$

It can be shown that the choice of $t(\mathbf{y})$ that minimizes $\mathrm{MSE}[t(\mathbf{y})]$ is $t(\mathbf{y}) = E[g(\mathbf{x}) \mid \mathbf{y}]$.

Proof: Let

$$v(t) = E\{[t - g(\mathbf{x})]^2 \mid \mathbf{y}\} = E[t^2 - 2t\,g(\mathbf{x}) + g^2(\mathbf{x}) \mid \mathbf{y}] = t^2 - 2t\,E[g(\mathbf{x}) \mid \mathbf{y}] + E[g^2(\mathbf{x}) \mid \mathbf{y}].$$

Then $v'(t) = 2t - 2E[g(\mathbf{x}) \mid \mathbf{y}] = 0$ when $t = E[g(\mathbf{x}) \mid \mathbf{y}]$.
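A quick numeric illustration of this fact (a sketch; the bivariate normal setup and the competing predictor are our choices, not from the notes). For standardized jointly normal (x, y) with correlation ρ, E[x | y] = ρy, and it beats any other predictor in mean square error:

```python
import numpy as np

rng = np.random.default_rng(1)
rho, n = 0.8, 100_000

# standardized bivariate normal with correlation rho
y = rng.normal(size=n)
x = rho * y + np.sqrt(1 - rho**2) * rng.normal(size=n)

best = rho * y            # t(y) = E[x | y]
other = 0.5 * y + 0.1     # an arbitrary competing predictor

print(np.mean((best - x) ** 2))    # ~ 1 - rho^2 = 0.36
print(np.mean((other - x) ** 2))   # larger (~ 0.46)
```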

Three Important Forms of a Non-Stationary Time Series

The Difference Equation Form:

$$\beta(B)\nabla^d x_t = \delta + \alpha(B)u_t \quad \text{or} \quad \varphi(B)x_t = \delta + \alpha(B)u_t$$

or

$$x_t = \varphi_1 x_{t-1} + \varphi_2 x_{t-2} + \cdots + \varphi_{p+d} x_{t-p-d} + \delta + u_t + \alpha_1 u_{t-1} + \alpha_2 u_{t-2} + \cdots + \alpha_q u_{t-q}$$

The Random Shock Form:

$$x_t = \mu(t) + \psi(B)u_t \quad \text{or} \quad x_t = \mu(t) + u_t + \psi_1 u_{t-1} + \psi_2 u_{t-2} + \psi_3 u_{t-3} + \cdots$$

where

$$\psi(B) = \varphi(B)^{-1}\alpha(B) = [\beta(B)\nabla^d]^{-1}\alpha(B) = I + \psi_1 B + \psi_2 B^2 + \psi_3 B^3 + \cdots,$$

$$\mu = \beta(B)^{-1}\delta = \delta/(1 - \beta_1 - \beta_2 - \cdots - \beta_p) \quad \text{and} \quad \mu(t) = \nabla^{-d}\mu.$$

Note: $\nabla^d\mu(t) = \mu$, i.e. the d-th order differences of $\mu(t)$ are constant. This implies that $\mu(t)$ is a polynomial of degree d.

Consider the difference equation form:

$$\beta(B)\nabla^d x_t = \delta + \alpha(B)u_t \quad \text{or} \quad \varphi(B)x_t = \delta + \alpha(B)u_t.$$

Multiply both sides by $\varphi(B)^{-1}$ to get

$$\varphi(B)^{-1}\varphi(B)x_t = \varphi(B)^{-1}\delta + \varphi(B)^{-1}\alpha(B)u_t$$

or

$$x_t = \varphi(B)^{-1}\delta + \varphi(B)^{-1}\alpha(B)u_t = \mu(t) + \psi(B)u_t.$$

The Inverted Form:

$$\pi(B)x_t = \delta + u_t \quad \text{or} \quad x_t = \pi_1 x_{t-1} + \pi_2 x_{t-2} + \pi_3 x_{t-3} + \cdots + \delta + u_t$$

where

$$\pi(B) = [\alpha(B)]^{-1}\varphi(B) = [\alpha(B)]^{-1}[\beta(B)\nabla^d] = I - \pi_1 B - \pi_2 B^2 - \pi_3 B^3 - \cdots$$

Again consider the difference equation form:

$$\varphi(B)x_t = \delta + \alpha(B)u_t.$$

Multiply both sides by $\alpha(B)^{-1}$ to get

$$\alpha(B)^{-1}\varphi(B)x_t = \alpha(B)^{-1}\delta + \alpha(B)^{-1}\alpha(B)u_t$$

or

$$\pi(B)x_t = \delta + u_t$$

(absorbing the constant $\alpha(B)^{-1}\delta$ into $\delta$).

Forecasting an ARIMA(p,d,q) Time Series

Let $P_T$ denote $\{\ldots, x_{T-2}, x_{T-1}, x_T\}$, the "past" up to time T. Then the optimal forecast of $x_{T+l}$ given $P_T$ is:

$$\hat{x}_T(l) = E[x_{T+l} \mid P_T].$$

This forecast minimizes the mean square error.

Three different forms of the forecast:

1. Random Shock Form
2. Inverted Form
3. Difference Equation Form

Note:

$$\hat{x}_T(l) = E[x_{T+l} \mid P_T] = x_{T+l} \quad \text{if } l \le 0$$

$$\hat{u}_T(l) = E[u_{T+l} \mid P_T] = \begin{cases} u_{T+l} & \text{if } l \le 0 \\ 0 & \text{if } l > 0 \end{cases}$$

Random Shock Form of the Forecast

Recall

$$x_t = \mu(t) + u_t + \psi_1 u_{t-1} + \psi_2 u_{t-2} + \psi_3 u_{t-3} + \cdots$$

or, for $t = T + l$,

$$x_{T+l} = \mu(T+l) + u_{T+l} + \psi_1 u_{T+l-1} + \psi_2 u_{T+l-2} + \psi_3 u_{T+l-3} + \cdots$$

Taking expectations of both sides and using $\hat{u}_T(l) = u_{T+l}$ for $l \le 0$ and $\hat{u}_T(l) = 0$ for $l > 0$:

$$\hat{x}_T(l) = \mu(T+l) + \hat{u}_T(l) + \psi_1 \hat{u}_T(l-1) + \psi_2 \hat{u}_T(l-2) + \cdots$$

To compute this forecast we need to compute $\{\ldots, u_{T-2}, u_{T-1}, u_T\}$ from $\{\ldots, x_{T-2}, x_{T-1}, x_T\}$. Note:

$$\hat{x}_{t-1}(1) = \mu(t) + \psi_1 u_{t-1} + \psi_2 u_{t-2} + \psi_3 u_{t-3} + \cdots$$

Thus

$$u_t = x_t - \hat{x}_{t-1}(1),$$

which can be calculated recursively.

The Error in the Forecast:

$$e_T(l) = x_{T+l} - \hat{x}_T(l) = u_{T+l} + \psi_1 u_{T+l-1} + \psi_2 u_{T+l-2} + \cdots + \psi_{l-1} u_{T+1}$$

The Mean Square Error in the Forecast:

$$\sigma_T^2(l) = \mathrm{MSE}[e_T(l)] = E[e_T^2(l) \mid P_T] = E\left[(u_{T+l} + \psi_1 u_{T+l-1} + \cdots + \psi_{l-1} u_{T+1})^2\right]$$

Hence

$$\sigma_T^2(l) = \sigma^2\left(1 + \psi_1^2 + \psi_2^2 + \cdots + \psi_{l-1}^2\right).$$

Prediction Limits for Forecasts

(1 - α)100% prediction limits for $x_{T+l}$:

$$\hat{x}_T(l) \pm z_{\alpha/2}\,\sigma_T(l)$$

The Inverted Form of the Forecast

Recall the inverted form $\pi(B)x_t = \delta + u_t$, where $\pi(B) = [\alpha(B)]^{-1}\varphi(B) = I - \pi_1 B - \pi_2 B^2 - \pi_3 B^3 - \cdots$, i.e.

$$x_t = \pi_1 x_{t-1} + \pi_2 x_{t-2} + \cdots + \delta + u_t$$

and, for $t = T + l$,

$$x_{T+l} = \pi_1 x_{T+l-1} + \pi_2 x_{T+l-2} + \cdots + \delta + u_{T+l}.$$

Taking conditional expectations:

$$\hat{x}_T(l) = \pi_1 \hat{x}_T(l-1) + \pi_2 \hat{x}_T(l-2) + \cdots + \delta$$

Note: $\hat{x}_T(l-j) = x_{T+l-j}$ whenever $l - j \le 0$, and $\hat{u}_T(l) = 0$ for $l > 0$.

The Difference Equation Form of the Forecast

$$x_{T+l} = \varphi_1 x_{T+l-1} + \varphi_2 x_{T+l-2} + \cdots + \varphi_{p+d} x_{T+l-p-d} + \delta + u_{T+l} + \alpha_1 u_{T+l-1} + \alpha_2 u_{T+l-2} + \cdots + \alpha_q u_{T+l-q}$$

Taking conditional expectations:

$$\hat{x}_T(l) = \varphi_1 \hat{x}_T(l-1) + \varphi_2 \hat{x}_T(l-2) + \cdots + \varphi_{p+d} \hat{x}_T(l-p-d) + \delta + \hat{u}_T(l) + \alpha_1 \hat{u}_T(l-1) + \cdots + \alpha_q \hat{u}_T(l-q)$$
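A minimal sketch of this recursion (NumPy; the function and argument names are ours): given the last few observations, the last residuals, and the coefficients, it rolls the difference equation forward, replacing future shocks by zero:

```python
import numpy as np

def forecast(x_hist, u_hist, phi, alpha, delta=0.0, steps=12):
    """Difference-equation forecasts x_hat_T(l), l = 1..steps.

    x_hist: (..., x_{T-1}, x_T), at least len(phi) values;
    u_hist: (..., u_{T-1}, u_T), at least len(alpha) values;
    phi = (phi_1, ..., phi_{p+d}), alpha = (alpha_1, ..., alpha_q).
    """
    x = list(x_hist)          # extended by the forecasts as we go
    u = list(u_hist)          # extended by zeros: u_hat_T(l) = 0 for l > 0
    for _ in range(steps):
        nxt = delta
        nxt += sum(p * x[-1 - i] for i, p in enumerate(phi))
        nxt += sum(a * u[-1 - j] for j, a in enumerate(alpha))
        x.append(nxt)
        u.append(0.0)
    return np.array(x[len(x_hist):])

# e.g. with the worked example's coefficients below: beta_1 = 0.8
# gives phi = (1.8, -0.8), and alpha = (0.6, 0.4)
print(forecast([100.0, 102.0], [0.5, -0.3], phi=(1.8, -0.8), alpha=(0.6, 0.4), steps=3))
```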

Example: The Model

$$x_t - x_{t-1} = \beta_1(x_{t-1} - x_{t-2}) + u_t + \alpha_1 u_{t-1} + \alpha_2 u_{t-2}$$

or

$$x_t = (1 + \beta_1)x_{t-1} - \beta_1 x_{t-2} + u_t + \alpha_1 u_{t-1} + \alpha_2 u_{t-2}$$

or

$$\varphi(B)x_t = \beta(B)(I - B)x_t = \alpha(B)u_t$$

where

$$\varphi(x) = 1 - (1 + \beta_1)x + \beta_1 x^2 = (1 - \beta_1 x)(1 - x) \quad \text{and} \quad \alpha(x) = 1 + \alpha_1 x + \alpha_2 x^2.$$

This is an ARIMA(1,1,2) model.

The Random Shock Form of the model: $x_t = \psi(B)u_t$, where

$$\psi(B) = [\beta(B)(I - B)]^{-1}\alpha(B) = [\varphi(B)]^{-1}\alpha(B), \quad \text{i.e.} \quad \psi(B)\varphi(B) = \alpha(B).$$

Thus

$$(I + \psi_1 B + \psi_2 B^2 + \psi_3 B^3 + \psi_4 B^4 + \cdots)(I - (1 + \beta_1)B + \beta_1 B^2) = I + \alpha_1 B + \alpha_2 B^2.$$

Hence

$$\alpha_1 = \psi_1 - (1 + \beta_1) \quad \text{or} \quad \psi_1 = 1 + \beta_1 + \alpha_1,$$

$$\alpha_2 = \psi_2 - \psi_1(1 + \beta_1) + \beta_1 \quad \text{or} \quad \psi_2 = \psi_1(1 + \beta_1) - \beta_1 + \alpha_2,$$

$$0 = \psi_h - \psi_{h-1}(1 + \beta_1) + \psi_{h-2}\beta_1 \quad \text{or} \quad \psi_h = \psi_{h-1}(1 + \beta_1) - \psi_{h-2}\beta_1 \quad \text{for } h \ge 3.$$

The Inverted Form of the model: $\pi(B)x_t = u_t$, where

$$\pi(B) = [\alpha(B)]^{-1}\beta(B)(I - B) = [\alpha(B)]^{-1}\varphi(B), \quad \text{i.e.} \quad \alpha(B)\pi(B) = \varphi(B).$$

Thus

$$(I - \pi_1 B - \pi_2 B^2 - \pi_3 B^3 - \pi_4 B^4 - \cdots)(I + \alpha_1 B + \alpha_2 B^2) = I - (1 + \beta_1)B + \beta_1 B^2.$$

Hence

$$-(1 + \beta_1) = \alpha_1 - \pi_1 \quad \text{or} \quad \pi_1 = 1 + \beta_1 + \alpha_1,$$

$$\beta_1 = -\pi_2 - \pi_1\alpha_1 + \alpha_2 \quad \text{or} \quad \pi_2 = -\pi_1\alpha_1 - \beta_1 + \alpha_2,$$

$$0 = \pi_h + \pi_{h-1}\alpha_1 + \pi_{h-2}\alpha_2 \quad \text{or} \quad \pi_h = -(\pi_{h-1}\alpha_1 + \pi_{h-2}\alpha_2) \quad \text{for } h \ge 3.$$

Now suppose that $\beta_1 = 0.80$, $\alpha_1 = 0.60$ and $\alpha_2 = 0.40$. Then the random shock form coefficients $\psi_h$ and the inverted form coefficients $\pi_h$ can easily be computed; they are tabled below:

h:       1      2      3      4      5      6      7      8      9     10
ψ_h:  2.40   2.32   2.26   2.20   2.16   2.13   2.10   2.08   2.07   2.05
π_h:  2.40  -1.84   0.14   0.65  -0.45   0.01   0.17  -0.11   0.00   0.05

h:      11     12     13     14     15     16     17     18     19     20
ψ_h:  2.04   2.03   2.03   2.02   2.02   2.01   2.01   2.01   2.01   2.01
π_h: -0.03   0.00   0.01  -0.01   0.00   0.00   0.00   0.00   0.00   0.00

h:      21     22     23     24     25     26     27     28     29     30
ψ_h:  2.00   2.00   2.00   2.00   2.00   2.00   2.00   2.00   2.00   2.00
π_h:  0.00   0.00   0.00   0.00   0.00   0.00   0.00   0.00   0.00   0.00
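A small sketch that reproduces the π_h row above from the recursion just derived (plain Python; the function name is ours):

```python
def pi_weights(beta1, alpha1, alpha2, n=10):
    """Inverted-form coefficients pi_h for the ARIMA(1,1,2) example."""
    pi = [0.0] * (n + 1)
    pi[1] = 1 + beta1 + alpha1                      # pi_1
    if n >= 2:
        pi[2] = -pi[1] * alpha1 - beta1 + alpha2    # pi_2
    for h in range(3, n + 1):
        pi[h] = -(pi[h - 1] * alpha1 + pi[h - 2] * alpha2)
    return pi[1:]

print([round(p, 2) for p in pi_weights(0.80, 0.60, 0.40)])
# [2.4, -1.84, 0.14, 0.65, -0.45, 0.01, 0.17, -0.11, -0.0, 0.05]
```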

The Forecast Equations

The Difference Form of the Forecast Equation:

$$\hat{x}_T(l) = (1 + \beta_1)\hat{x}_T(l-1) - \beta_1\hat{x}_T(l-2) + \hat{u}_T(l) + \alpha_1\hat{u}_T(l-1) + \alpha_2\hat{u}_T(l-2)$$

where

$$\hat{x}_T(l) = x_{T+l} \text{ if } l \le 0 \quad \text{and} \quad \hat{u}_T(l) = \begin{cases} u_{T+l} & \text{if } l \le 0 \\ 0 & \text{if } l > 0 \end{cases}$$

Computation of the Random Shock Series and One-Step Forecasts

$$\hat{x}_{t-1}(1) = (1 + \beta_1)x_{t-1} - \beta_1 x_{t-2} + \alpha_1 u_{t-1} + \alpha_2 u_{t-2}$$

so the random shock series is computed recursively as

$$u_t = x_t - \hat{x}_{t-1}(1).$$
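In code, this recursion produces the one-step forecasts and the estimated shocks in one pass over the data (a sketch; taking zero starting values for the pre-sample observations and shocks is our assumption, not the notes' start-up convention):

```python
def shocks_and_forecasts(x, beta1=0.80, alpha1=0.60, alpha2=0.40):
    """One-step forecasts x_hat_{t-1}(1) and shocks u_t = x_t - x_hat_{t-1}(1)."""
    u = [0.0, 0.0]                         # u_{t-2}, u_{t-1} start at zero
    xhat, uhat = [], []
    for t in range(len(x)):
        x1 = x[t - 1] if t >= 1 else 0.0   # x_{t-1}
        x2 = x[t - 2] if t >= 2 else 0.0   # x_{t-2}
        f = (1 + beta1) * x1 - beta1 * x2 + alpha1 * u[-1] + alpha2 * u[-2]
        xhat.append(f)                     # x_hat_{t-1}(1)
        u.append(x[t] - f)                 # u_t
        uhat.append(u[-1])
    return xhat, uhat
```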

Computation of the Mean Square Error of the Forecasts and Prediction Limits

Mean square error of the forecasts:

$$\sigma_T^2(l) = \mathrm{MSE}[\hat{x}_T(l)] = E\left[(x_{T+l} - \hat{x}_T(l))^2\right] = \sigma^2\left(1 + \psi_1^2 + \psi_2^2 + \psi_3^2 + \cdots + \psi_{l-1}^2\right)$$

Prediction limits:

$$\hat{x}_T(l) \pm z_{\alpha/2}\,\sigma_T(l)$$

Table: forecast standard errors $\sigma_T(l)$ to lead time l = 12 (with $\hat{\sigma}^2 = 2.56$):

l:        1     2     3     4     5     6     7     8     9    10    11    12
σ_T(l): 1.60  4.16  5.58  6.64  7.52  8.28  8.95  9.57 10.13 10.66 11.15 11.62
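These values follow directly from the ψ_h table and $\hat{\sigma}^2 = 2.56$. A quick check (a sketch; the ψ values are the rounded ones listed earlier, so later lead times agree only up to rounding):

```python
import numpy as np

def forecast_se(psi, sigma2):
    """sigma_T(l) = sqrt(sigma2 * (1 + psi_1^2 + ... + psi_{l-1}^2))."""
    cum = np.cumsum(np.r_[1.0, np.square(psi)])
    return np.sqrt(sigma2 * cum)

psi = [2.40, 2.32, 2.26, 2.20, 2.16, 2.13, 2.10, 2.08, 2.07, 2.05, 2.04]
se = forecast_se(psi, sigma2=2.56)
print(np.round(se[:3], 2))           # [1.6  4.16 5.58], as in the table
# 95% prediction limits: x_hat_T(l) +/- 1.96 * se[l - 1]
```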

[Table: raw observations $x_t$, one-step-ahead forecasts $\hat{x}_{t-1}(1)$, estimated errors $\hat{u}_t$, and errors $u_t$, for t = 1, ..., 150.]

Forecasts with 95% and 66.7% Prediction Limits

t     lower 95%   lower 66.7%   Forecast   upper 66.7%   upper 95%
151     283.93       285.47       287.07       288.67       290.21
152     282.14       286.13       290.29       294.45       298.44
153     281.94       287.29       292.87       298.45       303.80
154     281.91       288.29       294.93       301.57       307.95
155     281.84       289.06       296.58       304.10       311.32
156     280.35       288.95       297.90       306.85       315.45
157     280.21       289.39       298.96       308.53       317.71
158     279.94       289.67       299.80       309.93       319.66
159     279.59       289.82       300.48       311.14       321.37
160     279.16       289.87       301.02       312.17       322.88
161     278.67       289.83       301.45       313.07       324.23
162     278.15       289.73       301.80       313.87       325.45

[Graph: forecasts for t = 151 to 162 with 95% and 66.7% prediction limits plotted against time.]

Next Topic: Modelling Seasonal Time Series
