TRANSCRIPT
Econometría 2: Análisis de Series de Tiempo (Econometrics 2: Time Series Analysis)
Karoll [email protected]
http://karollgomez.wordpress.com
Second semester, 2016
III. Stationary models
Time Series Models: Stationary Models
1 Purely random process
2 Random walk (non-stationary)
3 Moving average process MA(q)
4 Autoregressive process AR(p)
5 Mixed ARMA(p,q)
For the stationary stochastic process
R code to generate a random walk
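The R listing itself did not survive the transcript. As an illustration of the same idea, here is a minimal Python sketch (the function name and defaults are my own, not from the slides):

```python
import random

def random_walk(n, sigma=1.0, seed=0):
    """Simulate X_t = X_{t-1} + Z_t with Z_t ~ N(0, sigma^2) and X_0 = 0."""
    rng = random.Random(seed)
    x = [0.0]
    for _ in range(n):
        x.append(x[-1] + rng.gauss(0.0, sigma))
    return x

walk = random_walk(100)
# Var(X_t) = t * sigma^2 grows with t, which is why
# the random walk is non-stationary.
```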
NOTE: Moving average processes were introduced by Slutsky (1937).
Alternative notation for MA(q) processes is:
Yt = εt + θ1εt−1 + ...+ θqεt−q
with εt white noise.
Autocorrelation function (ACF):
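The ACF formula on this slide did not survive extraction. For an MA(q) process the standard result is ρk = Σ θiθi+k / Σ θi², with θ0 = 1, for k ≤ q, and ρk = 0 for k > q. A sketch in Python (the helper name is mine):

```python
def ma_acf(thetas, max_lag):
    """Theoretical ACF of Y_t = e_t + theta_1 e_{t-1} + ... + theta_q e_{t-q}:
    rho_k = sum_{i=0}^{q-k} theta_i*theta_{i+k} / sum_i theta_i^2 for k <= q,
    and rho_k = 0 for k > q (the ACF 'cuts off' after lag q)."""
    th = [1.0] + list(thetas)          # theta_0 = 1
    q = len(thetas)
    gamma0 = sum(t * t for t in th)
    acf = [1.0]
    for k in range(1, max_lag + 1):
        if k > q:
            acf.append(0.0)
        else:
            acf.append(sum(th[i] * th[i + k] for i in range(q - k + 1)) / gamma0)
    return acf

# MA(1) with theta = 0.9: rho_1 = 0.9 / (1 + 0.81), zero from lag 2 on.
print(ma_acf([0.9], 3))
```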
R code to generate an MA(1)
Other example: Yt = εt + 0.9εt−1
x = arima.sim(model = list(ma = .9), n = 100)
plot(x)
plot(0:14, ARMAacf(ma = .9, lag.max = 14), type = "h", xlab = "Lag", ylab = "ACF")
abline(h = 0)
Partial autocorrelation function (PACF):
In addition to the autocorrelation between Xt and Xt−k, we may want to investigate the conditional correlation:
corr(Xt, Xt−k | Xt−1, ..., Xt−(k−1))
which is referred to as the PACF in time series analysis.
REMARKS:
I This represents the correlation between Xt and Xt−k after their linear dependency on the intervening variables Xt−1, ..., Xt−(k−1) has been removed.
I In terms of a regression model, where the dependent variable is Xt−k, we have:
Xt−k = φk,1Xt−(k−1) + φk,2Xt−(k−2) + · · · + φk,kXt + et−k
where φk,i is the i-th regression parameter and et−k is an error term with mean 0 and variance σ²e, uncorrelated with Xt−(k−j) for j = 1, 2, 3, ..., k.
Multiplying both sides by Xt−(k−j) and taking expectations:
γj = φk,1γj−1 + φk,2γj−2 + · · · + φk,kγj−k
hence,
ρj = φk,1ρj−1 + φk,2ρj−2 + · · · + φk,kρj−k
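The last coefficient φk,k of each such system is the PACF at lag k. The equations in ρ can be solved recursively with the Durbin-Levinson algorithm; a sketch in Python (the function name is mine):

```python
def pacf_from_acf(rho, kmax):
    """Solve rho_j = sum_i phi_{k,i} rho_{j-i} for phi_{k,k} (the PACF at
    lag k) via the Durbin-Levinson recursion. rho[0] must equal 1."""
    pacf = []
    phi_prev = []                      # phi_{k-1, 1..k-1}
    for k in range(1, kmax + 1):
        if k == 1:
            phi_kk = rho[1]
            phi_prev = [phi_kk]
        else:
            num = rho[k] - sum(phi_prev[i] * rho[k - 1 - i] for i in range(k - 1))
            den = 1.0 - sum(phi_prev[i] * rho[i + 1] for i in range(k - 1))
            phi_kk = num / den
            # phi_{k,j} = phi_{k-1,j} - phi_{k,k} * phi_{k-1,k-j}
            phi_prev = [phi_prev[i] - phi_kk * phi_prev[k - 2 - i]
                        for i in range(k - 1)] + [phi_kk]
        pacf.append(phi_kk)
    return pacf
```

For an AR(1) with φ = 0.6 (so ρk = 0.6^k), the PACF is 0.6 at lag 1 and zero afterwards, exactly the "cut-off at lag p" behaviour described later for AR processes.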
ACF and PACF of MA(1): Yt = εt + θ1εt−1
ACF and PACF of MA(2): Yt = εt + θ1εt−1 + θ2εt−2
ACF and PACF of MA(2): continued
Example 1: Table, ACF and PACF for a simulated MA(1): Yt = εt + 0.5εt−1
Example 2: Table, ACF and PACF for a simulated MA(2): Yt = εt + 0.65εt−1 + 0.24εt−2
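The simulated tables did not survive the transcript. A small Python sketch of the same exercise for Example 1, assuming Gaussian noise: simulate the MA(1) and check that the sample ACF at lag 1 is close to the theoretical ρ1 = θ/(1 + θ²) = 0.5/1.25 = 0.4.

```python
import random

def sample_acf(x, k):
    """Sample autocorrelation at lag k (biased covariance estimator)."""
    n = len(x)
    m = sum(x) / n
    c0 = sum((v - m) ** 2 for v in x) / n
    ck = sum((x[t] - m) * (x[t + k] - m) for t in range(n - k)) / n
    return ck / c0

rng = random.Random(1)
e = [rng.gauss(0, 1) for _ in range(50001)]
y = [e[t] + 0.5 * e[t - 1] for t in range(1, 50001)]   # MA(1), theta = 0.5

r1 = sample_acf(y, 1)
# r1 should be close to the theoretical rho_1 = 0.4 for large samples.
```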
Alternative notation for AR(p) processes is:
Yt = φ1Yt−1 + ...+ φpYt−p + εt
with εt white noise.
Autoregressive series are important because:
I They have a natural interpretation: the next value observed is a slight perturbation of a simple function of the most recent observations.
I It is easy to estimate their parameters and it can be done with standardregression software.
I They are easy to forecast and standard regression software will do the job.
I They were introduced by Yule (1926).
4.0 Mean and Variance of AR processes
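The derivations on this slide did not survive the transcript. As a reminder, for a zero-mean stationary AR(1), Yt = φYt−1 + εt with |φ| < 1, the standard results are E[Yt] = 0 and Var(Yt) = σ²/(1 − φ²). A quick simulation check, assuming Gaussian noise:

```python
import random

# AR(1): Y_t = phi * Y_{t-1} + e_t with |phi| < 1.
# Then E[Y_t] = 0 and Var(Y_t) = sigma^2 / (1 - phi^2).
phi, sigma, n = 0.5, 1.0, 200000
rng = random.Random(42)
y, ys = 0.0, []
for _ in range(n):
    y = phi * y + rng.gauss(0.0, sigma)
    ys.append(y)

m = sum(ys) / n
var = sum((v - m) ** 2 for v in ys) / n
# m should be near 0 and var near 1 / (1 - 0.25) = 1.333...
```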
4.1 Yule-Walker equations (autocorrelation function)
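The Yule-Walker derivation on this slide did not survive extraction. For the AR(2) case the equations reduce to the recursion ρk = φ1ρk−1 + φ2ρk−2 for k ≥ 2, with ρ0 = 1 and ρ1 = φ1/(1 − φ2). A sketch in Python (the helper name is mine):

```python
def ar2_acf(phi1, phi2, max_lag):
    """Theoretical ACF of a stationary AR(2) from the Yule-Walker recursion:
    rho_k = phi1*rho_{k-1} + phi2*rho_{k-2}, rho_0 = 1, rho_1 = phi1/(1-phi2)."""
    rho = [1.0, phi1 / (1.0 - phi2)]
    for _ in range(2, max_lag + 1):
        rho.append(phi1 * rho[-1] + phi2 * rho[-2])
    return rho[:max_lag + 1]

# For the AR(2) used later in the examples, Y_t = 0.5*Y_{t-1} + 0.3*Y_{t-2} + e_t:
print(ar2_acf(0.5, 0.3, 5))
```

Unlike the MA case, this ACF tails off geometrically rather than cutting off at a finite lag.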
ACF and PACF of AR(1): Yt = φ1Yt−1 + εt
ACF and PACF of AR(2): Yt = φ1Yt−1 + φ2Yt−2 + εt
ACF and PACF of AR(2): continued
R code to generate an AR(1)
Example 1: Table, ACF and PACF for a simulated AR(1): Yt = 0.65Yt−1 + εt
Example 2: Table, ACF and PACF for a simulated AR(2): Yt = 0.5Yt−1 + 0.3Yt−2 + εt
Summary I
Summary I (continued)
4 Autoregressive processes AR(p):
4.2 Stationarity for AR(p) processes
4.2.1 Auxiliary vs Characteristic equation
I An alternative solution is possible using the lag (or backshift) operator:
ρ(k)− α1ρ(k − 1)− · · · − αpρ(k − p) = 0
I The general solution is
ρ(k) = A1·π1^|k| + · · · + Ap·πp^|k|
where the πi are the solutions of the auxiliary equation, i.e. the reciprocals of the roots of the characteristic polynomial in
(1 − α1B − · · · − αpB^p)ρ(k) = 0
or
φ(B)ρ(k) = 0
I The necessary and sufficient condition for Xt to be stationary is that all |πi| < 1, i.e. the modulus of every root of the characteristic polynomial is greater than 1.
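This condition can be checked numerically. A sketch for the AR(2) special case, solving the characteristic polynomial 1 − φ1B − φ2B² = 0 with the quadratic formula (the function name is mine; the AR(2) coefficients used below match the R example later in these slides):

```python
import cmath

def ar2_is_stationary(phi1, phi2):
    """AR(2) stationarity check: every root of the characteristic polynomial
    1 - phi1*B - phi2*B^2 = 0 must lie outside the unit circle (|B| > 1)."""
    if phi2 == 0:                      # degenerates to AR(1): root B = 1/phi1
        return abs(phi1) < 1
    disc = cmath.sqrt(phi1 ** 2 + 4 * phi2)
    b1 = (phi1 + disc) / (-2 * phi2)   # roots of -phi2*B^2 - phi1*B + 1 = 0
    b2 = (phi1 - disc) / (-2 * phi2)
    return abs(b1) > 1 and abs(b2) > 1

print(ar2_is_stationary(1.5, -0.75))   # complex roots with modulus ~1.155
print(ar2_is_stationary(0.5, 0.6))     # phi1 + phi2 > 1: not stationary
```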
4.2.2 Backshift operator B (or L)
4.2.2.1 Alternative view of the models using the operator B
Remarks:
4.2.2.2 Autocovariance and autocorrelation using the operator B
4.2.3 Stationarity Conditions
4.2.3.1 Stationary conditions for AR(1):
Xt = αXt−1 + Zt
(1 − αB)Xt = Zt
Xt = Σ_{i=0}^∞ α^i·Zt−i
I If |α| < 1, then there is a stationary solution to the AR(1) process.
I If |α| = 1, there is no stationary solution to the AR(1) process.
I If |α| > 1, there is no stationary solution to the AR(1) process.
I This means that for the purposes of modelling and forecasting stationary time series, we must restrict our attention to series for which |α| < 1.
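The back-substitution can be checked numerically: started at zero, the AR(1) recursion reproduces the truncated MA(∞) sum exactly, and that sum only converges because |α| < 1. A sketch assuming Gaussian noise:

```python
import random

# Iterating X_t = alpha*X_{t-1} + Z_t backwards gives the MA(infinity) form
# X_t = sum_{i>=0} alpha^i * Z_{t-i}; the weights alpha^i are summable
# only when |alpha| < 1, which is the stationarity condition above.
alpha = 0.7
rng = random.Random(7)
z = [rng.gauss(0, 1) for _ in range(500)]

x = 0.0
for zt in z:                     # AR(1) recursion started at X_0 = 0
    x = alpha * x + zt

x_ma = sum(alpha ** i * z[-1 - i] for i in range(len(z)))  # truncated MA form
# x and x_ma agree to floating-point precision, since the omitted
# remainder is alpha^500 * X_0 = 0.
```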
Stationary conditions for AR(1) using operator B:
I An equivalent condition for stationarity is that the root of the equation 1 − αB = 0 lies outside the unit circle in the complex plane.
I This means that for the purposes of modelling and forecastingstationary time series, we must restrict our attention to seriesfor which |α| < 1 or, equivalently, to series for which the rootof the polynomial 1− αB = 0 lies outside the unit circle inthe complex plane.
Example AR(1):
Example AR(2):
Two cases are distinguished: real roots and complex roots of the characteristic equation.
REMARK: The pattern shown by the ACF and PACF also depends on the solution of the characteristic polynomial.
R code for an AR(2) process
To generate the process (arima.sim() is in the stats package, loaded by default):
x = arima.sim(model = list(ar = c(1.5, -.75)), n = 100)
plot(x)
To find the roots of the characteristic polynomial 1 − 1.5B + 0.75B²:
Mod(polyroot(c(1, -1.5, .75)))
To compute and plot the ACF
4.2.3.2 Stationarity for AR(p) processes
4.3 Invertible processes: duality between AR and MA
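The details of this slide are missing from the transcript. As an illustration of the duality, an invertible MA(1), Yt = εt + θεt−1 with |θ| < 1, can be rewritten as the AR(∞) form εt = Σ_{j≥0} (−θ)^j Yt−j; a numerical sketch assuming Gaussian noise:

```python
import random

# Invertibility sketch: for Y_t = e_t + theta*e_{t-1} with |theta| < 1,
# the innovation can be recovered as e_t = sum_{j>=0} (-theta)^j * Y_{t-j}.
theta = 0.5
rng = random.Random(3)
e = [rng.gauss(0, 1) for _ in range(400)]
y = [e[t] + theta * e[t - 1] for t in range(1, 400)]

# Recover the last innovation from the observed series alone:
e_hat = sum((-theta) ** j * y[-1 - j] for j in range(len(y)))
# e_hat matches e[-1]; the telescoping remainder is of order theta^len(y).
```

This is the sense in which a finite MA has an infinite AR representation (and, dually, a stationary AR has an infinite MA representation).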
Summary II
5. Mixed ARMA(p,q)
Mixed ARMA processes were introduced by Wold (1938).
Example
ACF and PACF of ARMA(1,1): Yt = φ1Yt−1 + εt + θ1εt−1
ACF and PACF of ARMA(1,1): continued
R code for an ARMA(1,1) process
Consider the model: Yt = −0.5Yt−1 + εt + 0.3εt−1
x = arima.sim(model = list(ar = -.5, ma = .3), n = 100)
plot(x)
plot(0:14, ARMAacf(ar = -.5, ma = .3, lag.max = 14), type = "h", xlab = "Lag", ylab = "ACF")
abline(h = 0)
Example 1: Table, ACF and PACF for a simulated ARMA(1,1): Yt = 0.9Yt−1 + εt + 0.5εt−1
Example 2: Table, ACF and PACF for a simulated ARMA(1,1): Yt = 0.6Yt−1 + εt + 0.5εt−1
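The tables themselves are missing from the transcript. For reference, the theoretical ACF of a stationary ARMA(1,1) is ρ1 = (1 + φθ)(φ + θ)/(1 + 2φθ + θ²) and ρk = φρk−1 for k ≥ 2; a sketch in Python (the helper name is mine), applied to Example 2:

```python
def arma11_acf(phi, theta, max_lag):
    """Theoretical ACF of a stationary ARMA(1,1):
    rho_1 = (1 + phi*theta)(phi + theta) / (1 + 2*phi*theta + theta^2),
    then rho_k = phi * rho_{k-1} for k >= 2 (geometric decay after lag 1)."""
    rho = [1.0,
           (1 + phi * theta) * (phi + theta) / (1 + 2 * phi * theta + theta ** 2)]
    for _ in range(2, max_lag + 1):
        rho.append(phi * rho[-1])
    return rho[:max_lag + 1]

# Example 2: phi = 0.6, theta = 0.5
print(arma11_acf(0.6, 0.5, 5))
```

Both the ACF and PACF tail off, which is the signature that distinguishes ARMA from pure AR (PACF cuts off) and pure MA (ACF cuts off) processes.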
5.1 Causality
Summary III
Characteristics of time series processes
Characteristics of theoretical ACF and PACF for stationary processes