Moving-average (MA) model


Model with finite time lags of memory!

Some daily stock returns have minor serial correlations. These can be modeled as MA or AR models.

MA(1) model

1. Form: r_t = µ + a_t − θ a_{t−1}.

2. Stationarity: always stationary.

3. Mean (or expectation): E(r_t) = µ.

4. Variance: Var(r_t) = (1 + θ²)σ_a².

5. Autocovariance:

(a). Lag 1: Cov(r_t, r_{t−1}) = −θσ_a².

(b). Lag l: Cov(r_t, r_{t−l}) = 0 for l > 1.

Thus, r_t is not related to r_{t−2}, r_{t−3}, · · · .


ACF: ρ_1 = −θ/(1 + θ²), ρ_l = 0 for l > 1.

Finite memory! MA(1) models do not remember what happened two or more time periods ago.
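The lag-1 value and the cutoff of the ACF can be checked by simulation; a minimal numpy sketch (θ = 0.5 and the sample size are illustrative choices, not from the notes):

```python
import numpy as np

# Simulate an MA(1) series and compare its sample ACF with the theory:
# rho_1 = -theta/(1 + theta^2), rho_l = 0 for l > 1.
rng = np.random.default_rng(42)
theta, sigma, n = 0.5, 1.0, 100_000
a = rng.normal(0.0, sigma, n + 1)
r = a[1:] - theta * a[:-1]          # r_t = a_t - theta * a_{t-1} (mu = 0)

def sample_acf(x, lag):
    x = x - x.mean()
    return np.dot(x[lag:], x[:-lag]) / np.dot(x, x)

rho1_theory = -theta / (1 + theta**2)      # = -0.4 here
print(sample_acf(r, 1), rho1_theory)       # sample value close to -0.4
print(sample_acf(r, 2), sample_acf(r, 5))  # both near 0: the cutoff
```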

(Figure: ACF and PACF of MA(1) models with θ = 0.5 and θ = −0.5.)


6. Forecast (at origin t = n):

(a). 1-step ahead: r_n(1) = µ − θa_n. Why? Because at time n, a_n is known, but a_{n+1} is not.

(b). 1-step ahead forecast error: e_n(1) = a_{n+1}, with variance σ_a².

(c). Multi-step ahead: r_n(l) = µ for l > 1. Thus, for an MA(1) model, the multi-step ahead forecasts are just the mean of the series. Why? Because the model has a memory of 1 time period.

(d). Multi-step ahead forecast error: e_n(l) = a_{n+l} − θ a_{n+l−1}.

(e). Variance of multi-step ahead forecast error: (1 + θ²)σ_a² = variance of r_t.
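The forecast formulas above reduce to simple arithmetic; a small sketch (µ, θ, σ_a and the last shock a_n are illustrative values, not estimates from any data):

```python
# MA(1) forecasting at origin n, following items (a)-(e) above.
mu, theta, sigma_a = 0.02, 0.5, 1.0
a_n = 0.3                      # last shock, assumed known at time n

r_hat_1 = mu - theta * a_n     # (a) 1-step ahead forecast
r_hat_l = mu                   # (c) l-step ahead forecast, l > 1

var_e1 = sigma_a**2                      # (b) 1-step forecast-error variance
var_el = (1 + theta**2) * sigma_a**2     # (e) l > 1: the variance of r_t itself

print(r_hat_1, r_hat_l, var_e1, var_el)
```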


7. Invertibility:

Concept: r_t is a proper linear combination of a_t and the past observations {r_{t−1}, r_{t−2}, · · · }.

Why is it important? It provides a simple way to obtain the shock a_t.

For an invertible model, the dependence of r_t on r_{t−l} converges to zero as l increases.

MA(1) model with condition |θ| < 1:

a_t = (r_t − µ) + Σ_{i=1}^∞ θ^i (r_{t−i} − µ),

or the AR(∞) model

r_t = µ/(1 − θ) + a_t − Σ_{i=1}^∞ θ^i r_{t−i}.

Invertibility of MA models is the dual property of stationarity for AR models.
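The AR(∞) expansion is equivalent to the one-step recursion a_t = (r_t − µ) + θ a_{t−1}, which is how the shocks are recovered in practice; a sketch with simulated data (parameter values illustrative):

```python
import numpy as np

# With |theta| < 1, recover the shocks from the returns by
# a_t = (r_t - mu) + theta * a_{t-1}, starting from a_0 = 0.
rng = np.random.default_rng(0)
mu, theta, n = 0.0, 0.6, 500
a_true = rng.normal(size=n + 1)
r = mu + a_true[1:] - theta * a_true[:-1]

a_hat = np.zeros(n)
prev = 0.0                      # a_0 set to 0; its effect dies out like theta^t
for t in range(n):
    prev = (r[t] - mu) + theta * prev
    a_hat[t] = prev

# after a short burn-in, the recovered shocks match the true ones
print(np.max(np.abs(a_hat[50:] - a_true[1:][50:])))
```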


MA(2) model

1. Form: r_t = µ + a_t − θ_1 a_{t−1} − θ_2 a_{t−2}, or

r_t = µ + (1 − θ_1 B − θ_2 B²) a_t.

2. Stationary with E(r_t) = µ.

3. Invertibility: all the roots of θ(z) = 1 − θ_1 z − θ_2 z² = 0 lie outside the unit circle.

Decompose 1 − θ_1 z − θ_2 z² = (1 − α_1 z)(1 − α_2 z). Then |α_1| < 1 and |α_2| < 1, and

r_t = µ + (1 − α_1 B)(1 − α_2 B) a_t.

Let u_t = (1 − α_2 B) a_t. Then

r_t = µ + u_t − α_1 u_{t−1} and u_t = a_t − α_2 a_{t−1}.

Given {r_t}, we can invert for {u_t} and then for {a_t}.

4. Variance: Var(r_t) = (1 + θ_1² + θ_2²)σ_a².

ACF: ρ_1 ≠ 0 and ρ_2 ≠ 0, but ρ_l = 0 for l > 2.

5. Forecasts: go to the mean after 2 periods.
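The invertibility condition can be verified numerically by finding the roots of θ(z); a sketch (the θ values are illustrative):

```python
import numpy as np

# Check invertibility of an MA(2) model: all roots of
# theta(z) = 1 - theta1*z - theta2*z^2 must lie outside the unit circle.
theta1, theta2 = 0.5, -0.3

# np.roots takes coefficients from the highest degree down: -theta2, -theta1, 1
roots = np.roots([-theta2, -theta1, 1.0])
print(roots, np.abs(roots))

invertible = np.all(np.abs(roots) > 1.0)
print(invertible)
```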


MA(q) model

Form: r_t = µ + a_t − θ_1 a_{t−1} − θ_2 a_{t−2} − · · · − θ_q a_{t−q}, or

r_t = µ + (1 − θ_1 B − θ_2 B² − · · · − θ_q B^q) a_t.

Invertibility: all the roots of θ(z) = 1 − θ_1 z − θ_2 z² − · · · − θ_q z^q = 0 lie outside the unit circle.

Building an MA model

1. Specification of the order q: use the sample ACF.

Sample ACFs are all small after lag q for an MA(q) series. (See test of ACF.)


(Figure: ACF and PACF of MA(2) with θ = (0.5, −0.3) and of MA(4) with θ = (0.5, −0.3, 0.2, 0.3).)

Constant term? Check the sample mean.


2. Estimation:

Assume that {r_1, · · · , r_T} is from the MA(1) model:

r_t = −θ_0 a_{t−1} + a_t with |θ_0| < 1.

Conditional LSE: the minimizer of

S_n(θ) = Σ_{t=1}^n [r_t − (−θ a_{t−1})]²,

where θ denotes a generic value of the unknown parameter θ_0. Note that

a_1 = r_1 + θ_0 a_0

a_2 = r_2 + θ_0 a_1

· · ·

a_t = r_t + θ_0 a_{t−1}.

Let a_0 = 0 and replace θ_0 by θ. Denote

a_1(θ) = r_1 + θ × 0

a_2(θ) = r_2 + θ a_1(θ)

· · ·

a_t(θ) = r_t + θ a_{t−1}(θ).


Thus, the conditional LSE is the minimizer of

S_n(θ) = Σ_{t=1}^n [a_t(θ)]².

Note that

a_t = r_t + Σ_{i=1}^{t−1} θ_0^i r_{t−i} + θ_0^t a_0,

a_t(θ) = r_t + Σ_{i=1}^{t−1} θ^i r_{t−i} + θ^t a_0.

The initial value a_0 does not affect the estimator, asymptotically.

Similarly, for an MA(q) model:

a_t(θ) = r_t − µ + θ_1 a_{t−1}(θ) + θ_2 a_{t−2}(θ) + · · · + θ_q a_{t−q}(θ).
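The conditional LSE for the MA(1) case can be sketched directly from the recursion; here a crude grid search stands in for a proper numerical optimizer, and the simulated series and true value θ_0 = 0.4 are illustrative:

```python
import numpy as np

# Conditional least squares for r_t = -theta0 * a_{t-1} + a_t:
# build a_t(theta) recursively with a_0 = 0 and minimize S_n(theta).
rng = np.random.default_rng(1)
theta0, n = 0.4, 2000
a = rng.normal(size=n + 1)
r = -theta0 * a[:-1] + a[1:]

def S_n(theta, r):
    prev, s = 0.0, 0.0
    for rt in r:
        prev = rt + theta * prev   # a_t(theta) = r_t + theta * a_{t-1}(theta)
        s += prev**2
    return s

grid = np.linspace(-0.9, 0.9, 181)
theta_hat = grid[np.argmin([S_n(th, r) for th in grid])]
print(theta_hat)   # should be near the true value 0.4
```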


The conditional LSE is the minimizer of

S_n(θ) = Σ_{t=1}^n [a_t(θ)]²,

where θ = (µ, θ_1, · · · , θ_q)′, conditional on r_t = 0 and a_t = 0 for t ≤ 0. Denote the minimizer by θ̂. Then the a_t(θ̂) are called residuals.

3. Model checking: Ljung–Box test to examine the residuals (they should be white noise),

a_t(θ̂) = r_t − µ̂ + θ̂_1 a_{t−1}(θ̂) + θ̂_2 a_{t−2}(θ̂) + · · · + θ̂_q a_{t−q}(θ̂),

where θ̂ is the MLE.
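The Ljung–Box statistic Q(m) = T(T+2) Σ_{l=1}^m ρ̂_l²/(T−l) is easy to compute by hand; a sketch on a simulated white-noise "residual" series (sample size and m are illustrative):

```python
import numpy as np

# Ljung-Box statistic Q(m) on a residual series, to check whether the
# residuals behave like white noise (here they do by construction).
rng = np.random.default_rng(7)
e = rng.normal(size=500)
T, m = len(e), 10

x = e - e.mean()
denom = np.dot(x, x)
rho = np.array([np.dot(x[l:], x[:-l]) / denom for l in range(1, m + 1)])

Q = T * (T + 2) * np.sum(rho**2 / (T - np.arange(1, m + 1)))
print(Q)   # compare with the chi-square(10) 95% critical value, 18.307
```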

4. Forecast: use the residuals as {a_t} (which can be obtained from the data and fitted parameters) to perform forecasts.


ARMA(1,1) model

1. Form: (1 − ϕ_1 B) r_t = ϕ_0 + (1 − θ_1 B) a_t, or

r_t = ϕ_1 r_{t−1} + ϕ_0 + a_t − θ_1 a_{t−1}.

A combination of an AR(1) on the LHS and an MA(1) on the RHS.

2. Stationarity: same as AR(1).

3. Invertibility: same as MA(1).

4. Mean: as AR(1), i.e. E(r_t) = ϕ_0/(1 − ϕ_1).

5. Variance: given in the text.

6. ACF: satisfies ρ_k = ϕ_1 ρ_{k−1} for k > 1, but

ρ_1 = ϕ_1 − [θ_1 σ_a²/Var(r_t)] ≠ ϕ_1.

This is the difference between the AR(1) and ARMA(1,1) models.

PACF: does not cut off at finite lags.


7. Forecast: the MA(1) part affects the 1-step ahead forecast. Others are similar to those of AR(1) models.

(Figure: ACF and PACF of ARMA(1,1) models with ϕ = 0.9, θ = −0.5 and with ϕ = 0.9, θ = 0.5.)


ARMA(p,q) model:

Assume that r_1, r_2, · · · , r_n follow a stationary and invertible ARMA(p, q) model:

r_t = ϕ_0 + ϕ_1 r_{t−1} + ϕ_2 r_{t−2} + · · · + ϕ_p r_{t−p} + a_t − θ_1 a_{t−1} − · · · − θ_q a_{t−q},

where all the roots of

ϕ_p(z) = 1 − Σ_{i=1}^p ϕ_i z^i and θ_q(z) = 1 − Σ_{i=1}^q θ_i z^i

lie outside the unit circle, and they have no common roots.

Representations:

ϕ_p(z) θ_q^{−1}(z) = 1 − Σ_{i=1}^∞ π_i z^i,

with π_i = O(ρ^i) for some ρ ∈ (0, 1). AR representation:

r_t = ϕ_0 + a_t + π_1 r_{t−1} + π_2 r_{t−2} + · · · .

It tells how r_t depends on its past values.

ϕ_p^{−1}(z) θ_q(z) = 1 + Σ_{i=1}^∞ ψ_i z^i,

with ψ_i = O(ρ^i) for some ρ ∈ (0, 1). MA representation:

r_t = µ + a_t + ψ_1 a_{t−1} + ψ_2 a_{t−2} + · · · .

It tells how r_t depends on the past shocks.
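For the ARMA(1,1) case the ψ-weights of the MA representation have a closed recursion, ψ_1 = ϕ − θ and ψ_j = ϕψ_{j−1} for j ≥ 2; a sketch that cross-checks it against the series expansion of (1 − θz)/(1 − ϕz) (ϕ, θ values illustrative):

```python
import numpy as np

# psi-weights of the MA representation for (1 - phi*B) r_t = (1 - theta*B) a_t.
phi, theta = 0.9, 0.5
L = 10

psi = np.empty(L + 1)
psi[0] = 1.0
psi[1] = phi - theta
for j in range(2, L + 1):
    psi[j] = phi * psi[j - 1]
print(psi[:5])

# cross-check: (1 - theta*z)*(1 + phi*z + phi^2*z^2 + ...) term by term
check = np.empty(L + 1)
check[0] = 1.0
for j in range(1, L + 1):
    check[j] = phi**j - theta * phi**(j - 1)
print(np.allclose(psi, check))
```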


(Figure: ACF and PACF of ARMA(2,1) with ϕ = (0.9, −0.5), θ = (0.5), and of ARMA(3,2) with ϕ = (0.9, −0.5, 0.2), θ = (0.5, −0.3).)

Specification: use AIC to determine p and q.

Estimation: conditional or exact likelihood method

Model checking: as before

Forecast: as before.

The MA representation is particularly useful in computing variances of forecast errors.


Procedure for Building an ARMA Model

We have log-returns r_1, r_2, · · · , r_n which follow a stationary and invertible ARMA model:

r_t = ϕ_0 + ϕ_1 r_{t−1} + ϕ_2 r_{t−2} + · · · + ϕ_p r_{t−p} + a_t − θ_1 a_{t−1} − · · · − θ_q a_{t−q},

where a_t ∼ i.i.d. N(0, σ_a²). We need to find p, q, ϕ_0, ϕ_1, · · · , ϕ_p, θ_1, · · · , θ_q and σ_a², which are called the unknown parameters.

How to find?


Step 1. Determine (p, q) by ACF and PACF, or by AIC and BIC. If it is not easy to find p and q, you can try several different (p, q).

Step 2. Estimate ϕ_1, · · · , ϕ_p, θ_1, · · · , θ_q, µ and σ_a². The methods are LSE or MLE.

Step 3. Check whether or not (p, q) is correct. If it is not correct, try some different (p, q) and then go to Step 2. Even if it is correct, we still need to try some different (p, q) in practice.

Step 4. In general, the correct (p, q) is not unique. We can select the best one by the AIC and BIC criteria.

The above procedure is called building a model, fitting a model, model fitting, or modeling.


Forecasting Intervals

Assume that r_1, r_2, · · · , r_n follow a stationary and invertible ARMA model:

r_t = ϕ_0 + ϕ_1 r_{t−1} + ϕ_2 r_{t−2} + · · · + ϕ_p r_{t−p} + a_t − θ_1 a_{t−1} − · · · − θ_q a_{t−q}.

Then

r_n(l) = E(r_{n+l}|r_n, r_{n−1}, · · · ).

(Recall the AR(1) model

r_{n+l} = ϕ_0 + ϕ_1 r_{n+l−1} + a_{n+l}.

Starting at the origin n, the forecast of r_{n+l} is r_n(l) = ϕ_0 + ϕ_1 r_n(l − 1).)

Let r_t have the MA representation:

r_t = µ + a_t + ψ_1 a_{t−1} + ψ_2 a_{t−2} + · · · .

Forecasting error:

For an l-step ahead forecast, the forecast error is

e_n(l) = a_{n+l} + ψ_1 a_{n+l−1} + · · · + ψ_{l−1} a_{n+1}.


Forecasting variance:

Var[e_n(l)] = (1 + ψ_1² + · · · + ψ_{l−1}²)σ_a².

Forecast interval (limit) (FI):

[ r_n(l) − N_{α/2} σ_a √(Σ_{j=0}^{l−1} ψ_j²), r_n(l) + N_{α/2} σ_a √(Σ_{j=0}^{l−1} ψ_j²) ],

where N_{α/2} is the upper α/2-quantile of the standard normal distribution, i.e., P(N > N_{α/2}) = α/2. When α = 0.05, N_{α/2} = 1.96.

See simulation example.
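The interval formula can be sketched directly given the ψ-weights; the ψ values, σ_a, and the point forecast below are illustrative stand-ins, not output from any fitted model:

```python
import numpy as np

# 95% forecast interval for an l-step ahead forecast, l = 3 here.
sigma_a, alpha = 1.0, 0.05
N_half_alpha = 1.96                # P(N > 1.96) = 0.025 for standard normal
psi = np.array([1.0, 0.4, 0.36])   # psi_0, psi_1, psi_2 (e.g. an ARMA(1,1))
r_fore = 0.01                      # point forecast r_n(3)

half_width = N_half_alpha * sigma_a * np.sqrt(np.sum(psi**2))
lo, hi = r_fore - half_width, r_fore + half_width
print(lo, hi)
```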


Non-stationary Time Series Models

Random walk

Form: p_t = p_{t−1} + a_t.

Unit root? It is an AR(1) model with coefficient ϕ_1 = 1.

Nonstationary: Why? Because the variance of p_t diverges to infinity as t increases.

Strong memory: the sample ACF approaches 1 at any finite lag.

Random walk with drift

Form: p_t = µ + p_{t−1} + a_t, µ ≠ 0.

Has a unit root.

Nonstationary.

Strong memory.


Has a time trend with slope µ. Why?

Differencing

1st difference: r_t = p_t − p_{t−1}.

If p_t is the log price, then the 1st difference is simply the log return. Typically, the 1st difference means the change or increment of the original series.

Example: Simulate 100 values from

(1 − B) p_t = a_t

and

(1 − B) p_t = 4 + a_t.

Show the sample ACF and PACF.


(Figure: sample ACF and PACF of the random walk and of the random walk with drift.)


Unit-root test

Let p_t be the log price of an asset. To test whether p_t is not predictable (i.e. has a unit root), we consider the AR(1) model:

p_t = ϕ_1 p_{t−1} + e_t.

The hypothesis of interest is

H_0 : ϕ_1 = 1 vs H_a : ϕ_1 < 1.

The Dickey–Fuller test is the usual t-ratio of the OLS estimate ϕ̂_1 against ϕ_1 = 1:

ρ_n = n(ϕ̂_1 − 1) = n Σ_{t=1}^n p_{t−1} e_t / Σ_{t=1}^n p_{t−1}²,

τ_n = n(ϕ̂_1 − 1) √(Σ_{t=1}^n p_{t−1}²/n²).


In SAS:

ZM: zero mean or no intercept case:

p_t = ϕ_1 p_{t−d} + e_t.

When d = 1, the test statistics satisfy

ρ_n →_d ∫_0^1 B(τ) dB(τ) / ∫_0^1 B²(τ) dτ,

τ_n →_d ∫_0^1 B(τ) dB(τ) / √(∫_0^1 B²(τ) dτ),

where B(τ) is the standard Brownian motion on C[0, 1].

SM: single mean or intercept case:

p_t = ϕ_0 + ϕ_1 p_{t−d} + e_t.

TR: intercept and deterministic time trend case:

p_t = ϕ_0 + γt + ϕ_1 p_{t−d} + e_t.


In R:

Let ∆p_t = p_t − p_{t−1} and rewrite the AR(1) model as

∆p_t = δ p_{t−1} + e_t,

where δ = ϕ_1 − 1. The hypothesis of interest is

H_0 : δ = 0 vs H_a : δ < 0.

Test statistic:

τ_n →_d ∫_0^1 B(τ) dB(τ) / √(∫_0^1 B²(τ) dτ).

The general form of the Augmented Dickey–Fuller test:

∆p_t = ϕ_0 + γt + δ p_{t−1} + Σ_{i=1}^p ϕ*_i ∆p_{t−i} + e_t,

where ϕ_0 = γ = 0. Using LSE, a test statistic like τ_n is constructed for H_0 vs H_a.
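The Dickey–Fuller t-ratio in the no-intercept, no-lag case is plain OLS; a sketch that contrasts a simulated random walk with a stationary AR(1) (series and parameters illustrative; the 5% critical value for this zero-mean case is about −1.95):

```python
import numpy as np

# Dickey-Fuller t-ratio for H0: delta = 0 in Delta p_t = delta*p_{t-1} + e_t.
rng = np.random.default_rng(3)

def df_tstat(p):
    dp, lag = np.diff(p), p[:-1]
    delta = np.dot(lag, dp) / np.dot(lag, lag)   # OLS slope
    resid = dp - delta * lag
    s2 = np.dot(resid, resid) / (len(dp) - 1)    # residual variance
    return delta / np.sqrt(s2 / np.dot(lag, lag))

rw = np.cumsum(rng.normal(size=500))             # random walk: unit root
ar = np.empty(500)                               # stationary AR(1), phi = 0.5
ar[0] = 0.0
for t in range(1, 500):
    ar[t] = 0.5 * ar[t - 1] + rng.normal()

print(df_tstat(rw))   # typically not significantly negative
print(df_tstat(ar))   # strongly negative: reject the unit root
```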


Autoregressive Integrated Moving-average (ARIMA) Model

The General ARIMA Model

Let p_t be a general stochastic trend model:

(1 − B)^d p_t = r_t, d ≥ 1.

If r_t is a weakly stationary and invertible ARMA model,

ϕ_p(B) r_t = θ_q(B) a_t,

then

ϕ_p(B)(1 − B)^d p_t = θ_q(B) a_t,

and p_t is called an ARIMA(p, d, q) model.

If r_t is the following ARMA model,

ϕ_p(B) r_t = θ_0 + θ_q(B) a_t,

then

ϕ_p(B)(1 − B)^d p_t = θ_0 + θ_q(B) a_t,

and p_t is again called an ARIMA(p, d, q) model.


ARIMA(0, 1, 0) model:

(1 − B) p_t = a_t, or p_t = p_{t−1} + a_t (random walk);

with drift: p_t = θ_0 + p_{t−1} + a_t.

The ARIMA(0, 1, 1) or IMA(1, 1) model:

(1 − B) p_t = (1 − θB) a_t,

or

p_t = p_{t−1} − θ a_{t−1} + a_t,

where |θ| < 1.

Expansion:

a_t = p_t − (1 − θ) Σ_{j=1}^∞ θ^{j−1} p_{t−j},

or

p_t = (1 − θ) Σ_{j=1}^∞ θ^{j−1} p_{t−j} + a_t.
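The expansion says p_t is an exponentially weighted average of its own past plus the shock, which can be checked numerically; a sketch on a simulated IMA(1,1) series (θ = 0.75 and the sample size are illustrative):

```python
import numpy as np

# Verify a_t = p_t - (1 - theta) * sum_{j>=1} theta^(j-1) * p_{t-j}
# on a simulated IMA(1,1) path (sum truncated; remainder is ~theta^250).
rng = np.random.default_rng(5)
theta = 0.75
a = rng.normal(size=301)
p = np.cumsum(a[1:] - theta * a[:-1])    # (1-B)p_t = (1-theta*B)a_t, p_0 = 0

# p[299] is p_300; its EWMA of past values uses p_{300-j} = p[299-j]
j = np.arange(1, 251)
p_hat = (1 - theta) * np.sum(theta**(j - 1) * p[299 - j])

print(p[299] - p_hat, a[300])   # recovered shock vs true shock
```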


Example: Simulate 100 values from three models:

ARIMA(1, 1, 0) model: (1 − 0.8B)(1 − B) p_t = a_t,

ARIMA(0, 1, 1) model: (1 − B) p_t = (1 − 0.75B) a_t,

ARIMA(1, 1, 1) model: (1 − 0.9B)(1 − B) p_t = (1 − 0.5B) a_t.

a. Show the sample ACF and PACF.

b. Let r_t = (1 − B) p_t. Show the sample ACF and PACF of r_t.


(Figure: sample ACF and PACF of an ARIMA(1,1,0) series with ϕ_1 = 0.8, before and after differencing.)


(Figure: sample ACF and PACF of an ARIMA(0,1,1) series with θ_1 = 0.75, before and after differencing.)


(Figure: sample ACF and PACF of an ARIMA(1,1,1) series with ϕ_1 = 0.9 and θ_1 = 0.5, before and after differencing.)


Minimum Mean Square Error Forecasts for ARIMA Models

A. Model:

Let p_t be an ARIMA(p, d, q) model with d ≠ 0,

ϕ_p(B)(1 − B)^d p_t = θ_q(B) a_t,

where all the roots of ϕ_p(z) = 0 and θ_q(z) = 0 lie outside the unit circle.

Given the observations p_n, p_{n−1}, · · · , how do we forecast p_{n+1}, · · · , p_{n+l}, · · · ?

B. Minimum mean square error forecasts:

p_n(l) = E(p_{n+l}|p_n, p_{n−1}, · · · ).

C. Computation of the forecast:

Denote

Ψ(B) = ϕ_p(B)(1 − B)^d = 1 − Ψ_1 B − Ψ_2 B² − · · · − Ψ_{p+d} B^{p+d}.

Then

p_t = Ψ_1 p_{t−1} + Ψ_2 p_{t−2} + · · · + Ψ_{p+d} p_{t−p−d} + a_t − θ_1 a_{t−1} − θ_2 a_{t−2} − · · · − θ_q a_{t−q}.


Formulas:

p_n(l) = Ψ_1 p_n(l − 1) + · · · + Ψ_{p+d} p_n(l − p − d) + a_n(l) − θ_1 a_n(l − 1) − · · · − θ_q a_n(l − q),

where

p_n(j) = E(p_{n+j}|p_n, p_{n−1}, · · · ) if j = 1, 2, · · · , l, and p_n(j) = p_{n+j} if j = 0, −1, −2, · · · ;

a_n(j) = 0 if j = 1, 2, · · · , l, and a_n(j) = a_{n+j} if j = 0, −1, −2, · · · .

D. Forecast error:

e_n(l) = p_{n+l} − p_n(l) = Σ_{j=0}^{l−1} ψ_j a_{n+l−j},

where the ψ_j can be calculated recursively (with ψ_0 = 1):

ψ_j = Σ_{i=0}^{j−1} π_{j−i} ψ_i, j = 1, 2, · · · , l − 1.

The π_j are the coefficients of the expansion

π(B) = ϕ_p(B)(1 − B)^d / θ_q(B) = 1 − Σ_{j=1}^∞ π_j B^j,

so that

p_{t+l} = Σ_{j=1}^∞ π_j p_{t+l−j} + a_{t+l}.


E. Forecast variance:

Var[e_n(l)] = σ_a² Σ_{j=0}^{l−1} ψ_j².

F. Forecast interval (limit) (FI):

[ p_n(l) − N_{α/2} σ_a √(Σ_{j=0}^{l−1} ψ_j²), p_n(l) + N_{α/2} σ_a √(Σ_{j=0}^{l−1} ψ_j²) ],

where N_{α/2} is the upper α/2-quantile of the standard normal distribution.

In SAS, the output gives the forecasting intervals for AR(p), MA(q), and ARMA(p, q) models.

Note that r_t = p_t − p_{t−1}, where p_t = ln P_t. SAS outputs give the forecast value of p_{n+l}, i.e., p_n(l), and the forecasting interval of p_{n+l}, i.e., [L_α(l), U_α(l)].

Then the forecast value of P_{n+l} is

P_n(l) = E(P_{n+l}|F_n) = E(e^{p_{n+l}}|F_n) = e^{p_n(l)} E(e^{p_{n+l} − p_n(l)}|F_n) = e^{p_n(l)} E(e^{e_n(l)}|F_n) = e^{p_n(l)} E(e^{e_n(l)}) = exp{p_n(l) + (1/2) Var(e_n(l))}.


Math: normal and lognormal distributions

If X ∼ N(µ, σ²), then Y = exp(X) is a lognormal r.v. with mean and variance

E(Y) = exp(µ + σ²/2),

V(Y) = exp(2µ + σ²)[exp(σ²) − 1].

The forecasting interval of P_{n+l} is [e^{L_α(l)}, e^{U_α(l)}].
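The lognormal moment formulas, which justify the bias correction exp{p_n(l) + Var(e_n(l))/2} above, can be checked by Monte Carlo; a sketch with illustrative µ and σ:

```python
import numpy as np

# Check E(Y) = exp(mu + sigma^2/2) and V(Y) = exp(2mu+sigma^2)(exp(sigma^2)-1)
# for Y = exp(X), X ~ N(mu, sigma^2), by simulation.
rng = np.random.default_rng(9)
mu, sigma = 0.01, 0.2
x = rng.normal(mu, sigma, 1_000_000)
y = np.exp(x)

mean_theory = np.exp(mu + sigma**2 / 2)
var_theory = np.exp(2 * mu + sigma**2) * (np.exp(sigma**2) - 1)
print(y.mean(), mean_theory)
print(y.var(), var_theory)
```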

Example 1. Consider ARIMA(0, 1, 1) model:

(1−B)pt = (1− θB)at.

Example 2. Consider ARIMA(1, 1, 1) model:

(1− ϕB)(1−B)pt = (1− θB)at.