TRANSCRIPT
Stationary Time Series
AMS 586
The Moving Average Time series of order q, MA(q)
Let {Z_t | t ∈ T} denote a white noise time series with variance σ².
Let {X_t | t ∈ T} be defined by the equation

X_t = μ + Z_t + α_1 Z_{t−1} + α_2 Z_{t−2} + ⋯ + α_q Z_{t−q}.

Then {X_t | t ∈ T} is called a Moving Average time series of order q, denoted MA(q).
The autocovariance function for an MA(q) time series

γ(h) = Cov(X_t, X_{t−h}) = σ² Σ_{i=|h|}^{q} α_i α_{i−|h|}   if |h| ≤ q
       0                                                    if |h| > q

(with α_0 = 1).
The mean value for an MA(q) time series

E(X_t) = μ
The autocorrelation function for an MA(q) time series

ρ(h) = γ(h)/γ(0) = Σ_{i=|h|}^{q} α_i α_{i−|h|} / Σ_{i=0}^{q} α_i²   if |h| ≤ q
       0                                                            if |h| > q
Comment
The autocorrelation function ρ(h) of an MA(q) time series “cuts off” to zero after lag q.

[Figure: plot of ρ(h) against lag h = 0, 1, …, 30, showing ρ(h) = 0 for all h > q.]
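The cutoff property can be checked numerically. Below is a minimal sketch (not part of the original slides; the coefficients are illustrative) that evaluates the MA(q) autocorrelation formula, reindexed as Σ_{i=0}^{q−|h|} α_i α_{i+|h|}, in plain Python:

```python
# A minimal sketch (illustrative coefficients): the ACF of an MA(q) series,
#   rho(h) = sum_{i=0}^{q-|h|} a_i a_{i+|h|} / sum_{i=0}^{q} a_i^2  for |h| <= q,
# with a_0 = 1, and rho(h) = 0 for |h| > q.

def ma_acf(alpha, max_lag):
    """ACF of an MA(q) series with coefficients alpha = [a_1, ..., a_q]."""
    a = [1.0] + list(alpha)           # a_0 = 1
    q = len(alpha)
    denom = sum(ai * ai for ai in a)  # sum of squared coefficients
    rho = []
    for h in range(max_lag + 1):
        if h > q:
            rho.append(0.0)           # the ACF cuts off after lag q
        else:
            rho.append(sum(a[i] * a[i + h] for i in range(q - h + 1)) / denom)
    return rho

print(ma_acf([0.5], 4))  # MA(1) with a_1 = 0.5: rho(1) = 0.5/1.25 = 0.4
```

For an MA(1) with α_1 = 0.5 this reproduces ρ(1) = 0.4, the value worked out in Example 1 later in the notes.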
The Autoregressive Time series of order p, AR(p)
Let {Z_t | t ∈ T} be a white noise time series with variance σ².
Let {X_t | t ∈ T} be defined by the equation

X_t = β_1 X_{t−1} + β_2 X_{t−2} + ⋯ + β_p X_{t−p} + δ + Z_t.

Then {X_t | t ∈ T} is called an Autoregressive time series of order p, denoted AR(p).
The mean of a stationary AR(p)
Assuming {X_t | t ∈ T} is stationary, take expectations of the defining equation to obtain the mean μ:

E(X_t) = β_1 E(X_{t−1}) + β_2 E(X_{t−2}) + ⋯ + β_p E(X_{t−p}) + δ + E(Z_t)

so μ = β_1 μ + β_2 μ + ⋯ + β_p μ + δ, and hence

μ = E(X_t) = δ / (1 − β_1 − β_2 − ⋯ − β_p).

Now we can center the time series (remove its mean) as follows:

X_t − μ = β_1 (X_{t−1} − μ) + β_2 (X_{t−2} − μ) + ⋯ + β_p (X_{t−p} − μ) + Z_t
Computing the autocovariance of a stationary AR(p)
Now assume {X_t | t ∈ T} is stationary with mean zero.
Multiply the defining equation by X_{t−h}, h ≥ 0, and take expectations to obtain the Yule-Walker equations for the autocovariance. Note that for a zero-mean sequence

γ(h) = Cov(X_t, X_{t−h}) = E(X_t X_{t−h}) − E(X_t) E(X_{t−h}) = E(X_t X_{t−h}),

so multiplying X_t = β_1 X_{t−1} + β_2 X_{t−2} + ⋯ + β_p X_{t−p} + Z_t by X_{t−h} and taking expectations gives

E(X_t X_{t−h}) = β_1 E(X_{t−1} X_{t−h}) + β_2 E(X_{t−2} X_{t−h}) + ⋯ + β_p E(X_{t−p} X_{t−h}) + E(Z_t X_{t−h}).

Note: for h > 0 we have E(Z_t X_{t−h}) = 0, and for h = 0 we have E(Z_t X_t) = Var(Z_t) = σ².

The autocovariance function γ(h) of a stationary AR(p) series therefore satisfies the Yule-Walker equations:

γ(0) = β_1 γ(1) + β_2 γ(2) + ⋯ + β_p γ(p) + σ²
γ(1) = β_1 γ(0) + β_2 γ(1) + ⋯ + β_p γ(p−1)
γ(2) = β_1 γ(1) + β_2 γ(0) + ⋯ + β_p γ(p−2)
⋮
γ(p) = β_1 γ(p−1) + β_2 γ(p−2) + ⋯ + β_p γ(0)

and, for h > p,

γ(h) = β_1 γ(h−1) + β_2 γ(h−2) + ⋯ + β_p γ(h−p).
The autocorrelation function ρ(h) = γ(h)/γ(0) of a stationary AR(p) series satisfies the equations:

ρ(1) = β_1 + β_2 ρ(1) + ⋯ + β_p ρ(p−1)
ρ(2) = β_1 ρ(1) + β_2 + ⋯ + β_p ρ(p−2)
⋮
ρ(p) = β_1 ρ(p−1) + β_2 ρ(p−2) + ⋯ + β_p

and, for h > p,

ρ(h) = β_1 ρ(h−1) + β_2 ρ(h−2) + ⋯ + β_p ρ(h−p),

with

γ(0) = σ² / (1 − β_1 ρ(1) − β_2 ρ(2) − ⋯ − β_p ρ(p)).

The general solution of this difference equation is

ρ(h) = c_1 r_1^{−h} + c_2 r_2^{−h} + ⋯ + c_p r_p^{−h}

where r_1, r_2, …, r_p are the roots of the polynomial

β(x) = 1 − β_1 x − β_2 x² − ⋯ − β_p x^p = (1 − x/r_1)(1 − x/r_2)⋯(1 − x/r_p)

and c_1, c_2, …, c_p are determined by using the starting values of the sequence ρ(h).
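As a sketch of how these equations are used in practice (my own helper, assuming NumPy is available), one can solve the Yule-Walker system for ρ(1), …, ρ(p) as a linear system and then extend by the recursion for h > p:

```python
# A small sketch (assumed parameters, not from the slides): solving the
# Yule-Walker equations for the first p autocorrelations of an AR(p) series,
# then extending with rho(h) = sum_j beta_j rho(h-j).

import numpy as np

def ar_acf(beta, max_lag):
    """ACF of a stationary AR(p) series with coefficients beta = [b_1..b_p]."""
    p = len(beta)
    # Build the linear system for the unknowns rho(1..p):
    # rho(h) = sum_{j=1}^p beta_j * rho(|h-j|), with rho(0) = 1 moved to the RHS.
    A = np.zeros((p, p))
    b = np.zeros(p)
    for h in range(1, p + 1):
        for j in range(1, p + 1):
            k = abs(h - j)
            if k == 0:
                b[h - 1] += beta[j - 1]        # rho(0) = 1 term
            else:
                A[h - 1, k - 1] += beta[j - 1]
    rho = [1.0] + list(np.linalg.solve(np.eye(p) - A, b))
    for h in range(p + 1, max_lag + 1):        # recursion for h > p
        rho.append(sum(beta[j] * rho[h - 1 - j] for j in range(p)))
    return rho

print([round(v, 4) for v in ar_acf([0.4, 0.1], 6)])
# [1.0, 0.4444, 0.2778, 0.1556, 0.09, 0.0516, 0.0296]
```

With β = (0.4, 0.1) this reproduces the AR(2) autocorrelation table of Example 2 later in the notes.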
Conditions for stationarity
Autoregressive Time series of order p, AR(p)
Consider the AR(1) case, X_t = β_1 X_{t−1} + Z_t. If |β_1| > 1, the value of X_t increases in magnitude and Z_t eventually becomes negligible: the time series {X_t | t ∈ T} exhibits deterministic behavior, approximately satisfying X_t = β_1 X_{t−1}.

If β_1 = 1 and δ = 0, the time series {X_t | t ∈ T} satisfies the equation X_t = X_{t−1} + Z_t (a random walk) and exhibits non-stationary random behavior.
For an AR(p) time series, consider the polynomial

β(x) = 1 − β_1 x − β_2 x² − ⋯ − β_p x^p = (1 − x/r_1)(1 − x/r_2)⋯(1 − x/r_p)

with roots r_1, r_2, …, r_p. Then:

1. {X_t | t ∈ T} is stationary if |r_i| > 1 for all i.
2. If |r_i| < 1 for at least one i, then {X_t | t ∈ T} exhibits deterministic behavior.
3. If |r_i| ≥ 1 for all i and |r_i| = 1 for at least one i, then {X_t | t ∈ T} exhibits non-stationary random behavior.
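These conditions can be checked numerically with numpy.roots; the following sketch (parameters are illustrative, not from the slides) classifies an AR(p) series accordingly:

```python
# A brief sketch (illustrative parameters): classifying an AR(p) series by
# the roots of beta(x) = 1 - b_1 x - ... - b_p x^p.

import numpy as np

def ar_behavior(beta):
    """Return 'stationary', 'deterministic', or 'non-stationary random'."""
    # numpy.roots wants coefficients from the highest power down:
    # -b_p x^p - ... - b_1 x + 1
    coeffs = [-b for b in reversed(beta)] + [1.0]
    moduli = np.abs(np.roots(coeffs))
    if np.all(moduli > 1):
        return "stationary"          # all roots outside the unit circle
    if np.any(moduli < 1):
        return "deterministic"       # at least one root inside
    return "non-stationary random"   # a root on the unit circle

print(ar_behavior([0.4, 0.1]))  # roots 1.74 and -5.74 -> stationary
print(ar_behavior([1.0]))       # root exactly 1 -> non-stationary random
```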
Since

ρ(h) = c_1 r_1^{−h} + c_2 r_2^{−h} + ⋯ + c_p r_p^{−h}

and |r_1| > 1, |r_2| > 1, …, |r_p| > 1 for a stationary AR(p) series, we have

lim_{h→∞} ρ(h) = 0,

i.e. the autocorrelation function ρ(h) of a stationary AR(p) series “tails off” to zero.

[Figure: plot of ρ(h) against lag h = 0, 1, …, 30, decaying gradually toward zero.]
Special Cases: The AR(1) time series

Let {X_t | t ∈ T} be defined by the equation

X_t = β_1 X_{t−1} + δ + Z_t.

Consider the polynomial

β(x) = 1 − β_1 x = 1 − x/r_1

with root r_1 = 1/β_1. Then:

1. {X_t | t ∈ T} is stationary if |r_1| > 1, i.e. |β_1| < 1.
2. If |r_1| < 1, i.e. |β_1| > 1, then {X_t | t ∈ T} exhibits deterministic behavior.
3. If |r_1| = 1, i.e. |β_1| = 1, then {X_t | t ∈ T} exhibits non-stationary random behavior.
Special Cases: The AR(2) time series

Let {X_t | t ∈ T} be defined by the equation

X_t = β_1 X_{t−1} + β_2 X_{t−2} + δ + Z_t.

Consider the polynomial

β(x) = 1 − β_1 x − β_2 x² = (1 − x/r_1)(1 − x/r_2)

where r_1 and r_2 are the roots of β(x). Then:

1. {X_t | t ∈ T} is stationary if |r_1| > 1 and |r_2| > 1. This is true if β_1 + β_2 < 1, β_2 − β_1 < 1 and β_2 > −1. These inequalities define a triangular region for β_1 and β_2.
2. If |r_i| < 1 for at least one i, then {X_t | t ∈ T} exhibits deterministic behavior.
3. If |r_i| ≥ 1 for i = 1, 2 and |r_i| = 1 for at least one i, then {X_t | t ∈ T} exhibits non-stationary random behavior.
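As a sanity check (my own, not in the slides), the triangle inequalities can be compared against the root condition on randomly drawn (β_1, β_2) pairs:

```python
# Checking that the AR(2) triangle b1 + b2 < 1, b2 - b1 < 1, b2 > -1
# agrees with the root condition |r_1|, |r_2| > 1 (my own verification).

import numpy as np

def stationary_by_roots(b1, b2):
    roots = np.roots([-b2, -b1, 1.0])   # -b2 x^2 - b1 x + 1
    return bool(np.all(np.abs(roots) > 1))

def stationary_by_triangle(b1, b2):
    return b1 + b2 < 1 and b2 - b1 < 1 and b2 > -1

rng = np.random.default_rng(0)
pts = rng.uniform(-2, 2, size=(500, 2))  # random (b1, b2) pairs
agree = all(stationary_by_roots(b1, b2) == stationary_by_triangle(b1, b2)
            for b1, b2 in pts)
print(agree)
```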
Patterns of the ACF and PACF of AR(2) Time Series

[Figure: the stationarity triangle in the (β_1, β_2) plane divided into four regions, I-IV, each showing a typical ACF/PACF pattern. In the shaded region (β_1² + 4β_2 < 0) the roots of the AR operator are complex.]
The Mixed Autoregressive Moving Average Time Series of order p, q: the ARMA(p,q) series

Let β_1, β_2, …, β_p, α_1, α_2, …, α_q, δ denote p + q + 1 numbers (parameters).
Let {Z_t | t ∈ T} denote a white noise time series with variance σ²:
– uncorrelated
– mean 0, variance σ².
Let {X_t | t ∈ T} be defined by the equation

X_t = β_1 X_{t−1} + β_2 X_{t−2} + ⋯ + β_p X_{t−p} + δ + Z_t + α_1 Z_{t−1} + α_2 Z_{t−2} + ⋯ + α_q Z_{t−q}.

Then {X_t | t ∈ T} is called a Mixed Autoregressive-Moving Average time series, denoted ARMA(p,q).
Mean value, variance, autocovariance function, and autocorrelation function of an ARMA(p,q) series

Similar to an AR(p) time series, for certain values of the parameters β_1, …, β_p an ARMA(p,q) time series may not be stationary.

An ARMA(p,q) time series is stationary if the roots (r_1, r_2, …, r_p) of the polynomial

β(x) = 1 − β_1 x − β_2 x² − ⋯ − β_p x^p

satisfy |r_i| > 1 for all i.
Assume that the ARMA(p,q) time series {X_t | t ∈ T} is stationary, and let μ = E(X_t). Taking expectations of the defining equation:

E(X_t) = β_1 E(X_{t−1}) + ⋯ + β_p E(X_{t−p}) + δ + E(Z_t) + α_1 E(Z_{t−1}) + ⋯ + α_q E(Z_{t−q})

μ = β_1 μ + β_2 μ + ⋯ + β_p μ + δ + 0 + 0 + ⋯ + 0

or

μ = E(X_t) = δ / (1 − β_1 − β_2 − ⋯ − β_p).
The autocovariance function γ(h) of a stationary mixed autoregressive-moving average time series {X_t | t ∈ T} can be determined from the (centered, mean-zero) defining equation

X_t = β_1 X_{t−1} + β_2 X_{t−2} + ⋯ + β_p X_{t−p} + Z_t + α_1 Z_{t−1} + α_2 Z_{t−2} + ⋯ + α_q Z_{t−q}.

Multiplying the equation for X_{t+h} by X_t and taking expectations:

γ(h) = E(X_{t+h} X_t)
     = β_1 E(X_{t+h−1} X_t) + ⋯ + β_p E(X_{t+h−p} X_t)
       + E(Z_{t+h} X_t) + α_1 E(Z_{t+h−1} X_t) + ⋯ + α_q E(Z_{t+h−q} X_t)
     = β_1 γ(h−1) + ⋯ + β_p γ(h−p) + γ_{ZX}(−h) + α_1 γ_{ZX}(1−h) + ⋯ + α_q γ_{ZX}(q−h)

where γ_{ZX}(h) = E(Z_t X_{t+h}).
To find γ_{ZX}(h) = E(Z_t X_{t+h}), multiply the equation for X_{t+h} by Z_t and take expectations:

E(Z_t X_{t+h}) = β_1 E(Z_t X_{t+h−1}) + ⋯ + β_p E(Z_t X_{t+h−p})
                 + E(Z_t Z_{t+h}) + α_1 E(Z_t Z_{t+h−1}) + ⋯ + α_q E(Z_t Z_{t+h−q})

i.e.

γ_{ZX}(h) = β_1 γ_{ZX}(h−1) + ⋯ + β_p γ_{ZX}(h−p) + γ_{ZZ}(h) + α_1 γ_{ZZ}(h−1) + ⋯ + α_q γ_{ZZ}(h−q)

where γ_{ZZ}(h) = E(Z_t Z_{t+h}) = σ² if h = 0 and 0 if h ≠ 0, and where γ_{ZX}(h) = E(Z_t X_{t+h}) = 0 if h < 0 (X_{t+h} cannot depend on the future noise Z_t).
We need to calculate γ_{ZX}(0), γ_{ZX}(1), …, γ_{ZX}(q). Using the recursion above together with γ_{ZX}(h) = 0 for h < 0:

γ_{ZX}(0) = γ_{ZZ}(0) = σ²
γ_{ZX}(1) = β_1 γ_{ZX}(0) + α_1 σ² = (β_1 + α_1) σ²
γ_{ZX}(2) = β_1 γ_{ZX}(1) + β_2 γ_{ZX}(0) + α_2 σ² = (β_1² + α_1 β_1 + β_2 + α_2) σ²

etc.
The autocovariance function γ(h) therefore satisfies

γ(h) = β_1 γ(h−1) + ⋯ + β_p γ(h−p) + γ_{ZX}(−h) + α_1 γ_{ZX}(1−h) + ⋯ + α_q γ_{ZX}(q−h).

For h = 0, 1, …, q (using γ(−j) = γ(j) and γ_{ZX}(h) = 0 for h < 0):

γ(0) = β_1 γ(1) + ⋯ + β_p γ(p) + γ_{ZX}(0) + α_1 γ_{ZX}(1) + ⋯ + α_q γ_{ZX}(q)
γ(1) = β_1 γ(0) + ⋯ + β_p γ(p−1) + α_1 γ_{ZX}(0) + ⋯ + α_q γ_{ZX}(q−1)
⋮
γ(q) = β_1 γ(q−1) + ⋯ + β_p γ(q−p) + α_q γ_{ZX}(0)

and for h > q:

γ(h) = β_1 γ(h−1) + ⋯ + β_p γ(h−p).

We then use the first (p + 1) equations to determine γ(0), γ(1), …, γ(p), and the subsequent equations to determine γ(h) for h > p.
Example: the autocovariance function γ(h) for an ARMA(1,1) time series.

For p = q = 1 the equations become

γ(h) = β_1 γ(h−1) + γ_{ZX}(−h) + α_1 γ_{ZX}(1−h)

with γ_{ZX}(0) = σ² and γ_{ZX}(1) = (β_1 + α_1) σ². For h = 0, 1:

γ(0) = β_1 γ(1) + γ_{ZX}(0) + α_1 γ_{ZX}(1) = β_1 γ(1) + σ² (1 + α_1 β_1 + α_1²)
γ(1) = β_1 γ(0) + α_1 γ_{ZX}(0) = β_1 γ(0) + α_1 σ²

and for h > 1:

γ(h) = β_1 γ(h−1).

Substituting γ(1) into the first equation:

γ(0) = β_1 (β_1 γ(0) + α_1 σ²) + σ² (1 + α_1 β_1 + α_1²)

so

γ(0) = σ² (1 + 2 α_1 β_1 + α_1²) / (1 − β_1²).

Substituting γ(0) into the second equation:

γ(1) = σ² (1 + α_1 β_1)(α_1 + β_1) / (1 − β_1²).

Finally, for h > 1:

γ(2) = β_1 γ(1),   γ(3) = β_1 γ(2) = β_1² γ(1),   and in general γ(h) = β_1^{h−1} γ(1).
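These closed forms can be verified numerically (a check of my own, with assumed values β_1 = 0.4, α_1 = 0.5, σ² = 1) against the MA(∞) representation X_t = Σ_j ψ_j Z_{t−j}, where ψ_0 = 1 and ψ_j = β_1^{j−1}(β_1 + α_1) for j ≥ 1:

```python
# Numerical check of the ARMA(1,1) closed forms for gamma(0), gamma(1)
# against a truncated MA(infinity) representation (assumed values).

b1, a1, sigma2 = 0.4, 0.5, 1.0

g0 = sigma2 * (1 + 2 * a1 * b1 + a1**2) / (1 - b1**2)
g1 = sigma2 * (1 + a1 * b1) * (a1 + b1) / (1 - b1**2)

# Truncated MA(infinity) weights: psi_0 = 1, psi_j = b1**(j-1) * (b1 + a1)
psi = [1.0] + [b1**(j - 1) * (b1 + a1) for j in range(1, 200)]
g0_ma = sigma2 * sum(p * p for p in psi)
g1_ma = sigma2 * sum(psi[j] * psi[j + 1] for j in range(len(psi) - 1))

print(abs(g0 - g0_ma) < 1e-10, abs(g1 - g1_ma) < 1e-10)  # True True
```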
The Backshift Operator B
Consider the time series {X_t | t ∈ T} and let M denote the linear space spanned by the set of random variables {X_t | t ∈ T} (i.e. all linear combinations of elements of {X_t | t ∈ T} and their limits in mean square). M is a vector space.
Let B be an operator on M defined by:

BX_t = X_{t−1}.

B is called the backshift operator.
Note:

1. B is a linear operator:

B(c_1 X_{t_1} + c_2 X_{t_2} + ⋯ + c_k X_{t_k}) = c_1 BX_{t_1} + c_2 BX_{t_2} + ⋯ + c_k BX_{t_k}
                                              = c_1 X_{t_1 − 1} + c_2 X_{t_2 − 1} + ⋯ + c_k X_{t_k − 1}.

2. We can also define the operator B^k with B^k X_t = B(B(⋯BX_t)) = X_{t−k}.

3. The polynomial operator p(B) = c_0 I + c_1 B + c_2 B² + ⋯ + c_k B^k can also be defined by the equation

p(B)X_t = (c_0 I + c_1 B + c_2 B² + ⋯ + c_k B^k)X_t
        = c_0 X_t + c_1 X_{t−1} + c_2 X_{t−2} + ⋯ + c_k X_{t−k}.
4. The power series operator p(B) = c_0 I + c_1 B + c_2 B² + ⋯ can also be defined by the equation

p(B)X_t = (c_0 I + c_1 B + c_2 B² + ⋯)X_t = c_0 X_t + c_1 X_{t−1} + c_2 X_{t−2} + ⋯

5. If p(B) = c_0 I + c_1 B + c_2 B² + ⋯ and q(B) = b_0 I + b_1 B + b_2 B² + ⋯ are such that p(B)q(B) = I, i.e. p(B)q(B)X_t = IX_t = X_t, then q(B) is denoted by [p(B)]^{−1}.
Other operators closely related to B:
1. F = B^{−1}, the forward shift operator, defined by FX_t = B^{−1}X_t = X_{t+1}, and
2. ∇ = I − B, the first difference operator, defined by ∇X_t = (I − B)X_t = X_t − X_{t−1}.
The Equation for an MA(q) time series

X_t = α_0 Z_t + α_1 Z_{t−1} + α_2 Z_{t−2} + ⋯ + α_q Z_{t−q} + μ can be written

X_t = α(B) Z_t + μ   where   α(B) = α_0 I + α_1 B + α_2 B² + ⋯ + α_q B^q.

The Equation for an AR(p) time series

X_t = β_1 X_{t−1} + β_2 X_{t−2} + ⋯ + β_p X_{t−p} + δ + Z_t can be written

β(B) X_t = δ + Z_t   where   β(B) = I − β_1 B − β_2 B² − ⋯ − β_p B^p.

The Equation for an ARMA(p,q) time series

X_t = β_1 X_{t−1} + ⋯ + β_p X_{t−p} + δ + Z_t + α_1 Z_{t−1} + ⋯ + α_q Z_{t−q} can be written

β(B) X_t = α(B) Z_t + δ   where   α(B) = α_0 I + α_1 B + ⋯ + α_q B^q   and   β(B) = I − β_1 B − ⋯ − β_p B^p.
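A small illustration (the helper name is my own) of applying a polynomial in B to a finite realization, i.e. (p(B)x)_t = c_0 x_t + c_1 x_{t−1} + ⋯ + c_k x_{t−k}:

```python
# Applying a polynomial operator p(B) = c_0 I + c_1 B + ... + c_k B^k
# to a finite series (illustrative helper, not from the slides).

def apply_poly_B(c, x):
    """Apply p(B) with coefficients c = [c_0, ..., c_k] to the series x.

    Returns values for t = k, ..., len(x)-1, where all lags are available.
    """
    k = len(c) - 1
    return [sum(c[j] * x[t - j] for j in range(k + 1)) for t in range(k, len(x))]

x = [1.0, 4.0, 9.0, 16.0, 25.0]
print(apply_poly_B([1.0, -1.0], x))  # first difference (I - B): [3.0, 5.0, 7.0, 9.0]
```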
Some comments about the Backshift operator B
1. It is a useful notational device, allowing us to write the equations for MA(q), AR(p) and ARMA(p, q) in a very compact form;
2. It is also useful for making certain computations related to the time series described above;
The partial autocorrelation function
A useful tool in time series analysis
The partial autocorrelation function
Recall that the autocorrelation function of an AR(p) process satisfies the equation:

ρ_x(h) = β_1 ρ_x(h−1) + β_2 ρ_x(h−2) + ⋯ + β_p ρ_x(h−p)

For 1 ≤ h ≤ p, these equations (Yule-Walker) become:

ρ_x(1) = β_1 + β_2 ρ_x(1) + ⋯ + β_p ρ_x(p−1)
ρ_x(2) = β_1 ρ_x(1) + β_2 + ⋯ + β_p ρ_x(p−2)
⋮
ρ_x(p) = β_1 ρ_x(p−1) + β_2 ρ_x(p−2) + ⋯ + β_p.
In matrix notation:

[ρ_x(1)]   [1         ρ_x(1)    ⋯  ρ_x(p−1)] [β_1]
[ρ_x(2)] = [ρ_x(1)    1         ⋯  ρ_x(p−2)] [β_2]
[  ⋮   ]   [  ⋮         ⋮             ⋮    ] [ ⋮ ]
[ρ_x(p)]   [ρ_x(p−1)  ρ_x(p−2)  ⋯  1       ] [β_p]

These equations can be used to find β_1, β_2, …, β_p if the time series is known to be AR(p) and the autocorrelation function ρ_x(h) is known.
If the time series is not autoregressive, the equations can still be used to solve for β_1, β_2, …, β_p for any value of p ≥ 1. In this case

β_1^{(p)}, β_2^{(p)}, …, β_p^{(p)}

are the values that minimize the mean square error:

MSE = E[(X_t − β_1^{(p)} X_{t−1} − β_2^{(p)} X_{t−2} − ⋯ − β_p^{(p)} X_{t−p})²]
Definition: The partial autocorrelation function at lag k is defined to be

φ_kk = det[R_k*] / det[R_k]

where R_k is the k × k autocorrelation matrix

      [1          ρ_x(1)     ⋯  ρ_x(k−1)]
R_k = [ρ_x(1)     1          ⋯  ρ_x(k−2)]
      [  ⋮          ⋮              ⋮    ]
      [ρ_x(k−1)   ρ_x(k−2)   ⋯  1       ]

and R_k* is R_k with its last column replaced by (ρ_x(1), ρ_x(2), …, ρ_x(k))ᵀ.

Comment:

The partial autocorrelation function φ_kk is determined from the autocorrelation function ρ_x(h).
Some more comments:

1. The partial autocorrelation function at lag k, φ_kk, can be interpreted as a corrected autocorrelation between X_t and X_{t−k}, conditioning on the intervening variables X_{t−1}, X_{t−2}, …, X_{t−k+1}.
2. If the time series is an AR(p) time series, then φ_kk = 0 for k > p.
3. If the time series is an MA(q) time series, then ρ_x(h) = 0 for h > q.
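The determinant definition can be evaluated directly; the sketch below (my own helper, using NumPy) reproduces the MA(1) partial autocorrelation values that appear in Example 1 later in the notes:

```python
# Direct evaluation of phi_kk = det(R_k*) / det(R_k), where R_k is the
# autocorrelation matrix and R_k* has its last column replaced by
# (rho(1), ..., rho(k)). Helper of my own, not from the slides.

import numpy as np

def pacf_det(rho, k):
    """Partial autocorrelation at lag k from rho = [rho(0)=1, rho(1), ...]."""
    R = np.array([[rho[abs(i - j)] for j in range(k)] for i in range(k)])
    Rstar = R.copy()
    Rstar[:, -1] = [rho[i + 1] for i in range(k)]
    return np.linalg.det(Rstar) / np.linalg.det(R)

# MA(1) with alpha_1 = 0.5: rho(1) = 0.4, rho(h) = 0 for h > 1
rho = [1.0, 0.4] + [0.0] * 10
print(round(pacf_det(rho, 1), 4), round(pacf_det(rho, 2), 4))  # 0.4 -0.1905
```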
A General Recursive Formula for Autoregressive Parameters and the
Partial Autocorrelation function (PACF)
Let φ_{k1}, φ_{k2}, φ_{k3}, …, φ_{kk} denote the autoregressive parameters of order k satisfying the Yule-Walker equations:

ρ(1) = φ_{k1} + φ_{k2} ρ(1) + ⋯ + φ_{kk} ρ(k−1)
ρ(2) = φ_{k1} ρ(1) + φ_{k2} + ⋯ + φ_{kk} ρ(k−2)
⋮
ρ(k) = φ_{k1} ρ(k−1) + φ_{k2} ρ(k−2) + ⋯ + φ_{kk}

Then it can be shown that

φ_{k+1,k+1} = [ρ(k+1) − Σ_{j=1}^{k} φ_{kj} ρ(k+1−j)] / [1 − Σ_{j=1}^{k} φ_{kj} ρ(j)]

and

φ_{k+1,j} = φ_{kj} − φ_{k+1,k+1} φ_{k,k+1−j}   for j = 1, 2, …, k.
Proof:
The Yule-Walker equations of order k can be written in matrix form as

P_k β_k = ρ_k,   i.e.   β_k = P_k^{−1} ρ_k,

where

β_k = (φ_{k1}, φ_{k2}, …, φ_{kk})ᵀ,   ρ_k = (ρ(1), ρ(2), …, ρ(k))ᵀ

and P_k is the k × k matrix

      [1        ρ(1)     ⋯  ρ(k−1)]
P_k = [ρ(1)     1        ⋯  ρ(k−2)]
      [  ⋮       ⋮             ⋮  ]
      [ρ(k−1)   ρ(k−2)   ⋯  1     ]

The equations for order k + 1, with β_{k+1} = (φ_{k+1,1}, …, φ_{k+1,k}, φ_{k+1,k+1})ᵀ, can be partitioned as

P_k β*_{k+1} + φ_{k+1,k+1} A ρ_k = ρ_k
(A ρ_k)ᵀ β*_{k+1} + φ_{k+1,k+1} = ρ(k+1)

where β*_{k+1} = (φ_{k+1,1}, …, φ_{k+1,k})ᵀ and A is the k × k matrix that reverses order:

    [0  0  ⋯  0  1]
A = [0  0  ⋯  1  0]
    [⋮  ⋮      ⋮  ⋮]
    [1  0  ⋯  0  0]

Multiplying the first equation by P_k^{−1}, and using A P_k^{−1} = P_k^{−1} A (so that P_k^{−1} A ρ_k = A P_k^{−1} ρ_k = A β_k):

β*_{k+1} = P_k^{−1} ρ_k − φ_{k+1,k+1} P_k^{−1} A ρ_k = β_k − φ_{k+1,k+1} A β_k.

Substituting this into the second equation:

(A ρ_k)ᵀ (β_k − φ_{k+1,k+1} A β_k) + φ_{k+1,k+1} = ρ(k+1)

and, since (A ρ_k)ᵀ A β_k = ρ_kᵀ β_k,

φ_{k+1,k+1} = [ρ(k+1) − ρ_kᵀ A β_k] / [1 − ρ_kᵀ β_k].

Hence

φ_{k+1,k+1} = [ρ(k+1) − Σ_{j=1}^{k} φ_{kj} ρ(k+1−j)] / [1 − Σ_{j=1}^{k} φ_{kj} ρ(j)]

and, from β*_{k+1} = β_k − φ_{k+1,k+1} A β_k,

φ_{k+1,j} = φ_{kj} − φ_{k+1,k+1} φ_{k,k+1−j}   for j = 1, 2, …, k.
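The recursion just proved can be implemented in a few lines of plain Python (a sketch of my own; the input is the ACF ρ(0), ρ(1), …, ρ(m)):

```python
# The recursive (Durbin-Levinson-type) computation of the PACF from the ACF.
# Given rho = [rho(0)=1, rho(1), ..., rho(m)], return phi_11, ..., phi_mm.

def pacf_recursive(rho):
    m = len(rho) - 1
    pacf = []
    beta = []                                   # current [phi_k1, ..., phi_kk]
    for k in range(1, m + 1):
        if k == 1:
            phi = rho[1]                        # start with phi_11 = rho(1)
            beta = [phi]
        else:
            num = rho[k] - sum(beta[j] * rho[k - 1 - j] for j in range(k - 1))
            den = 1.0 - sum(beta[j] * rho[j + 1] for j in range(k - 1))
            phi = num / den                     # phi_kk
            # phi_{k,j} = phi_{k-1,j} - phi_kk * phi_{k-1,k-j}
            beta = [beta[j] - phi * beta[k - 2 - j] for j in range(k - 1)] + [phi]
        pacf.append(phi)
    return pacf

print([round(p, 5) for p in pacf_recursive([1.0, 0.4, 0.0, 0.0, 0.0])])
# [0.4, -0.19048, 0.09412, -0.04692]
```

With the MA(1) autocorrelations ρ(1) = 0.4, ρ(h) = 0 for h > 1, this reproduces the partial autocorrelations worked out in Example 1.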
Some Examples
Example 1: MA(1) time series

Suppose that {X_t | t ∈ T} satisfies the following equation:

X_t = 12.0 + Z_t + 0.5 Z_{t−1}

where {Z_t | t ∈ T} is white noise with σ = 1.1. Find:
1. The mean of the series.
2. The variance of the series.
3. The autocorrelation function.
4. The partial autocorrelation function.
Solution

Now {X_t | t ∈ T} satisfies X_t = 12.0 + Z_t + 0.5 Z_{t−1}. Thus:

1. The mean of the series: μ = 12.0.

The autocovariance function for an MA(1) is

γ(h) = (1 + α_1²) σ² = (1 + 0.5²)(1.1)² = 1.5125   if h = 0
       α_1 σ² = 0.5 (1.1)² = 0.605                 if |h| = 1
       0                                           if |h| > 1

2. The variance of the series: γ(0) = 1.5125.

3. The autocorrelation function is

ρ(h) = 1                        if h = 0
       0.605/1.5125 = 0.4       if |h| = 1
       0                        if |h| > 1
4. The partial autocorrelation function, φ_kk = det[R_k*]/det[R_k], gives:

φ_11 = ρ(1) = 0.4

φ_22 = det[[1, 0.4], [0.4, 0]] / det[[1, 0.4], [0.4, 1]] = −0.16/0.84 = −0.19048

φ_33 = 0.064/0.68 = 0.0941

φ_44 = −0.0256/0.5456 = −0.0469

φ_55 = 0.01024/0.4368 = 0.0234

and further: φ_66 = −0.0117, φ_77 = 0.0059, φ_88 = −0.0029, φ_99 = 0.0015, φ_{10,10} = −0.0007, φ_{11,11} = 0.0004, φ_{12,12} = −0.0002.
[Graph: partial autocorrelation function φ_kk for k = 0, 1, …, 11, alternating in sign and decaying rapidly toward zero.]
Exercise: use the recursive method to calculate φ_kk:

φ_{k+1,k+1} = [ρ(k+1) − Σ_{j=1}^{k} φ_{kj} ρ(k+1−j)] / [1 − Σ_{j=1}^{k} φ_{kj} ρ(j)]

and

φ_{k+1,j} = φ_{kj} − φ_{k+1,k+1} φ_{k,k+1−j},   j = 1, 2, …, k.

We start with φ_11 = ρ(1) = 0.4. Then

φ_22 = [ρ(2) − φ_11 ρ(1)] / [1 − φ_11 ρ(1)] = (0 − 0.4²)/(1 − 0.4²) = −0.19048

φ_21 = φ_11 − φ_22 φ_11 = 0.4 − (−0.19048)(0.4) = 0.47619

φ_33 = [ρ(3) − φ_21 ρ(2) − φ_22 ρ(1)] / [1 − φ_21 ρ(1) − φ_22 ρ(2)]
     = [0 − 0 − (−0.19048)(0.4)] / [1 − (0.47619)(0.4) − 0] = 0.0941,   etc.
Example 2: AR(2) time series

Suppose that {X_t | t ∈ T} satisfies the following equation:

X_t = 0.4 X_{t−1} + 0.1 X_{t−2} + 1.2 + Z_t

where {Z_t | t ∈ T} is white noise with σ = 2.1. Is the time series stationary? Find:
1. The mean of the series.
2. The variance of the series.
3. The autocorrelation function.
4. The partial autocorrelation function.
Stationarity: the roots of β(x) = 1 − 0.4x − 0.1x² are x ≈ 1.7417 and x ≈ −5.7417. Both exceed 1 in absolute value, so the series is stationary.

1. The mean of the series:

μ = δ/(1 − β_1 − β_2) = 1.2/(1 − 0.4 − 0.1) = 2.4

3. The autocorrelation function satisfies the Yule-Walker equations

ρ(1) = β_1 + β_2 ρ(1) = 0.4 + 0.1 ρ(1)
ρ(2) = β_1 ρ(1) + β_2 = 0.4 ρ(1) + 0.1

hence

ρ(1) = 0.4/0.9 = 0.4444
ρ(2) = 0.4 (0.4444) + 0.1 = 0.2778

and then ρ(h) = 0.4 ρ(h−1) + 0.1 ρ(h−2) for h > 2:

h      0       1       2       3       4       5       6
ρ(h)   1.0000  0.4444  0.2778  0.1556  0.0900  0.0516  0.0296

h      7       8       9       10      11      12      13
ρ(h)   0.0170  0.0098  0.0056  0.0032  0.0018  0.0011  0.0006
2. The variance of the series:

γ(0) = σ² / (1 − β_1 ρ(1) − β_2 ρ(2)) = (2.1)² / (1 − 0.4 (0.4444) − 0.1 (0.2778)) = 4.41/0.7944 = 5.551

4. The partial autocorrelation function:

φ_11 = ρ(1) = 0.4444

φ_22 = det[[1, 0.4444], [0.4444, 0.2778]] / det[[1, 0.4444], [0.4444, 1]]
     = (0.2778 − 0.4444²)/(1 − 0.4444²) = 0.1000
φ_33 = det[[1, 0.4444, 0.4444], [0.4444, 1, 0.2778], [0.2778, 0.4444, 0.1556]]
       / det[[1, 0.4444, 0.2778], [0.4444, 1, 0.4444], [0.2778, 0.4444, 1]] = 0

and in fact φ_kk = 0 for all k ≥ 3.

The partial autocorrelation function of an AR(p) time series “cuts off” after lag p.
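Example 2 can be verified end to end with a short script (my own check, assuming NumPy; it recomputes the mean, γ(0), and the PACF cutoff from the Yule-Walker relations):

```python
# Verification of Example 2: X_t = 0.4 X_{t-1} + 0.1 X_{t-2} + 1.2 + Z_t,
# sigma = 2.1 (my own computation, not part of the slides).

import numpy as np

b1, b2, delta, sigma = 0.4, 0.1, 1.2, 2.1

mu = delta / (1 - b1 - b2)                  # mean: 1.2/0.5 = 2.4

rho = [1.0, b1 / (1 - b2)]                  # rho(1) = 0.4/0.9
for h in range(2, 14):                      # Yule-Walker recursion
    rho.append(b1 * rho[h - 1] + b2 * rho[h - 2])

gamma0 = sigma**2 / (1 - b1 * rho[1] - b2 * rho[2])

def pacf_det(rho, k):
    """phi_kk as det(R_k*)/det(R_k) from the autocorrelations."""
    R = np.array([[rho[abs(i - j)] for j in range(k)] for i in range(k)])
    Rstar = R.copy()
    Rstar[:, -1] = [rho[i + 1] for i in range(k)]
    return np.linalg.det(Rstar) / np.linalg.det(R)

print(round(mu, 4), round(gamma0, 4),
      round(pacf_det(rho, 1), 4), round(pacf_det(rho, 2), 4))
```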
Example 3: ARMA(1,2) time series

Suppose that {X_t | t ∈ T} satisfies the following equation:

X_t = 0.4 X_{t−1} + 3.2 + Z_t + 0.3 Z_{t−1} + 0.2 Z_{t−2}

where {Z_t | t ∈ T} is white noise with σ = 1.6. Is the time series stationary? Find:
1. The mean of the series.
2. The variance of the series.
3. The autocorrelation function.
4. The partial autocorrelation function.
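The slides leave Example 3 unworked; the following sketch (my own, applying the γ_{ZX} recipe from the ARMA section with the stated parameters) computes the mean and γ(0), and cross-checks γ(0) against a truncated MA(∞) representation:

```python
# A sketch for Example 3 (my own computation): mean and gamma(0) of
# X_t = 0.4 X_{t-1} + 3.2 + Z_t + 0.3 Z_{t-1} + 0.2 Z_{t-2}, sigma = 1.6.
# Stationarity: the root of 1 - 0.4x is 2.5 > 1, so the series is stationary.

b1, a1, a2, delta, sigma2 = 0.4, 0.3, 0.2, 3.2, 1.6**2

mu = delta / (1 - b1)                       # mean: 3.2/0.6 = 5.333...

# gamma_ZX(0..2) from the recursion with gamma_ZX(h) = 0 for h < 0
gzx0 = sigma2
gzx1 = (b1 + a1) * sigma2
gzx2 = (b1 * (b1 + a1) + a2) * sigma2

# h = 0, 1 equations:
#   g0 = b1*g1 + gzx0 + a1*gzx1 + a2*gzx2
#   g1 = b1*g0 + a1*gzx0 + a2*gzx1
c = gzx0 + a1 * gzx1 + a2 * gzx2
g1_rhs = a1 * gzx0 + a2 * gzx1
g0 = (c + b1 * g1_rhs) / (1 - b1**2)
g1 = b1 * g0 + g1_rhs

# Cross-check via psi-weights: psi_0 = 1, psi_1 = b1 + a1,
# psi_2 = b1*psi_1 + a2, then psi_j = b1*psi_{j-1} for j > 2.
psi = [1.0, b1 + a1, b1 * (b1 + a1) + a2]
for _ in range(200):
    psi.append(b1 * psi[-1])
g0_ma = sigma2 * sum(p * p for p in psi)

print(round(mu, 4), round(g0, 4), abs(g0 - g0_ma) < 1e-9)  # 5.3333 4.5166 True
```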
Theoretical Patterns of ACF and PACF

Type of model   Typical pattern of ACF                      Typical pattern of PACF
AR(p)           Decays exponentially, with a damped sine    Cuts off after lag p
                wave pattern, or both
MA(q)           Cuts off after lag q                        Declines exponentially
ARMA(p,q)       Exponential decay                           Exponential decay
Reference

• Box, G. E. P., Jenkins, G. M. and Reinsel, G. C. (1994). Time Series Analysis: Forecasting and Control. Prentice-Hall.
• Brockwell, P. J. and Davis, R. A. (1991). Time Series: Theory and Methods. Springer-Verlag.
• We also thank colleagues who posted their notes as on-line open resources for time series analysis.