12.3 Correcting for Serial Correlation w/ Strictly Exogenous Regressors
The following autocorrelation correction requires all of our regressors to be strictly exogenous; in particular, the model should contain no lagged dependent variables.
Assume that our error terms follow AR(1) SERIAL CORRELATION:

(12.26)  u_t = ρu_{t−1} + e_t,   |ρ| < 1

Assuming from here on that everything is conditional on X, we can calculate the variance as:

(12.27)  Var(u_t) = σ_e² / (1 − ρ²)
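As a quick sanity check on (12.27), the variance of a simulated AR(1) error series should approach σ_e²/(1 − ρ²). A minimal sketch, where the values of ρ, σ_e, and the sample size are illustrative assumptions, not from the text:

```python
import numpy as np

# Simulate u_t = rho*u_{t-1} + e_t and compare the sample variance with
# the theoretical value sigma_e^2 / (1 - rho^2) from (12.27).
rng = np.random.default_rng(0)
rho, sigma_e, n = 0.5, 1.0, 200_000   # illustrative assumptions

e = rng.normal(0.0, sigma_e, n)
u = np.empty(n)
u[0] = e[0]
for t in range(1, n):
    u[t] = rho * u[t - 1] + e[t]

theoretical_var = sigma_e**2 / (1 - rho**2)   # = 4/3 here
sample_var = u[n // 10:].var()                # drop burn-in observations
```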
If we consider a single explanatory variable, we can eliminate the serial correlation in the error term as follows:

y_t = β₀ + β₁x_t + u_t
y_t − ρy_{t−1} = (1 − ρ)β₀ + β₁(x_t − ρx_{t−1}) + (u_t − ρu_{t−1})
ỹ_t = (1 − ρ)β₀ + β₁x̃_t + e_t,   t ≥ 2

This provides us with new error terms that are serially uncorrelated
-note that ỹ_t and x̃_t are called QUASI-DIFFERENCED DATA
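Quasi-differencing can be checked directly by simulation: ũ_t = u_t − ρu_{t−1} recovers the serially uncorrelated innovations e_t. A minimal sketch; the value of ρ and the sample size are illustrative assumptions:

```python
import numpy as np

# Show that quasi-differencing AR(1) errors removes the serial correlation:
# the raw series has first-order autocorrelation near rho, the
# quasi-differenced series near zero.
rng = np.random.default_rng(1)
rho, n = 0.8, 100_000   # illustrative assumptions
e = rng.normal(size=n)
u = np.empty(n)
u[0] = e[0]
for t in range(1, n):
    u[t] = rho * u[t - 1] + e[t]

def autocorr1(z):
    """First-order sample autocorrelation."""
    z = z - z.mean()
    return (z[1:] * z[:-1]).sum() / (z * z).sum()

u_tilde = u[1:] - rho * u[:-1]   # quasi-differenced errors (= e_t)
corr_raw = autocorr1(u)          # close to rho
corr_qd = autocorr1(u_tilde)     # close to 0
```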
Note that OLS is not yet BLUE, since ỹ_1 is undefined (there is no y₀ to quasi-difference against)
-to make OLS BLUE and ensure the first observation's error has the same variance as the others, we set:

(1 − ρ²)^{1/2}y_1 = (1 − ρ²)^{1/2}β₀ + β₁(1 − ρ²)^{1/2}x_1 + (1 − ρ²)^{1/2}u_1
ỹ_1 = (1 − ρ²)^{1/2}β₀ + β₁x̃_1 + ũ_1

-note that the first observation's quasi-differenced data is calculated differently than all other observations'
-note also that this is another example of GLS estimation
Given multiple explanatory variables, we have:

y_t − ρy_{t−1} = (1 − ρ)β₀ + β₁(x_{t1} − ρx_{t−1,1}) + … + β_k(x_{tk} − ρx_{t−1,k}) + e_t
ỹ_t = (1 − ρ)β₀ + β₁x̃_{t1} + … + β_kx̃_{tk} + e_t,   t ≥ 2
ỹ_1 = (1 − ρ²)^{1/2}β₀ + β₁x̃_{11} + … + β_kx̃_{1k} + ũ_1

-note that this GLS estimator is BLUE and will generally differ from OLS
-note also that the t and F statistics are now valid, so testing can be done
Unfortunately, ρ is rarely known, but it can be estimated from:

û_t = ρû_{t−1} + error_t

We then use ρ̂ to estimate:

ỹ_t = β₀x̃_{t0} + β₁x̃_{t1} + … + β_kx̃_{tk} + error_t

where x̃_{t0} = (1 − ρ̂) for t ≥ 2 and x̃_{10} = (1 − ρ̂²)^{1/2}

Note that in this FEASIBLE GLS (FGLS), the estimation error in ρ̂ does not affect the FGLS estimator's asymptotic distribution.
Feasible GLS Estimation of the AR(1) Model

1) Regress y on all x's to obtain residuals û
2) Regress û_t on û_{t−1} and obtain the OLS estimate ρ̂
3) Use ρ̂ to estimate

ỹ_t = β₀x̃_{t0} + β₁x̃_{t1} + … + β_kx̃_{tk} + error_t,   t ≥ 2

We now have adjusted slope estimates with valid standard errors for testing.
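The three steps above can be sketched in code. This is a minimal Cochrane-Orcutt-style version (it drops t = 1 rather than applying the Prais-Winsten first-observation transform); the simulated data and all parameter values are illustrative assumptions:

```python
import numpy as np

# Three-step FGLS for AR(1) errors, Cochrane-Orcutt style, using plain
# numpy least squares. True values: beta0 = 1, beta1 = 2, rho = 0.6.
rng = np.random.default_rng(2)
n, rho_true = 500, 0.6   # illustrative assumptions
x = rng.normal(size=n)
e = rng.normal(size=n)
u = np.empty(n)
u[0] = e[0]
for t in range(1, n):
    u[t] = rho_true * u[t - 1] + e[t]
y = 1.0 + 2.0 * x + u

def ols(X, y):
    return np.linalg.lstsq(X, y, rcond=None)[0]

# Step 1: OLS of y on x; keep the residuals uhat
X = np.column_stack([np.ones(n), x])
uhat = y - X @ ols(X, y)

# Step 2: regress uhat_t on uhat_{t-1} (no intercept) to get rhohat
rho_hat = ols(uhat[:-1, None], uhat[1:])[0]

# Step 3: quasi-difference (dropping t = 1) and re-run OLS;
# the intercept column becomes (1 - rhohat)
y_qd = y[1:] - rho_hat * y[:-1]
x_qd = x[1:] - rho_hat * x[:-1]
Xq = np.column_stack([np.full(n - 1, 1 - rho_hat), x_qd])
b0_fgls, b1_fgls = ols(Xq, y_qd)
```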
12.3 FGLS Notes
-Using ρ̂ is not valid for small sample sizes. This is because FGLS is not unbiased; it is only consistent if the data are weakly dependent
-while FGLS is not BLUE, it is asymptotically more efficient than OLS (again, in large samples)
-two examples of FGLS are COCHRANE-ORCUTT (CO) ESTIMATION and PRAIS-WINSTEN (PW) ESTIMATION
-these estimators are similar and differ only in their treatment of the first observation
12.3 Iterated FGLS
-Practically, FGLS is often iterated: once FGLS has been estimated, its residuals are used to recalculate ρ̂, and FGLS is estimated again
-this is generally repeated until ρ̂ converges
-regression programs can perform this iteration automatically
-theoretically, the first iteration already satisfies all the large-sample properties needed for tests
-Note: regression programs can also correct for AR(q) errors using a more complicated FGLS
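The iteration loop can be sketched as follows: after each Cochrane-Orcutt fit, ρ̂ is recomputed from the updated residuals until it stops changing. Simulated data; the true ρ, sample size, iteration cap, and tolerance are all illustrative assumptions:

```python
import numpy as np

# Iterated Cochrane-Orcutt FGLS: alternate between estimating rhohat from
# residuals and re-fitting on quasi-differenced data until rhohat converges.
rng = np.random.default_rng(3)
n, rho_true = 400, 0.5   # illustrative assumptions
x = rng.normal(size=n)
u = np.empty(n)
u[0] = rng.normal()
for t in range(1, n):
    u[t] = rho_true * u[t - 1] + rng.normal()
y = 1.0 + 2.0 * x + u

def ols(X, y):
    return np.linalg.lstsq(X, y, rcond=None)[0]

X = np.column_stack([np.ones(n), x])
b = ols(X, y)            # start from plain OLS
rho_hat, n_iter = 0.0, 0
for _ in range(100):     # cap iterations as a safeguard
    uhat = y - X @ b                            # residuals of the levels equation
    rho_new = ols(uhat[:-1, None], uhat[1:])[0]
    if abs(rho_new - rho_hat) < 1e-8:           # rhohat has converged
        break
    rho_hat = rho_new
    n_iter += 1
    # Cochrane-Orcutt step on quasi-differenced data; the intercept
    # column becomes (1 - rhohat)
    Xq = np.column_stack([np.full(n - 1, 1 - rho_hat),
                          x[1:] - rho_hat * x[:-1]])
    b = ols(Xq, y[1:] - rho_hat * y[:-1])       # (beta0, beta1)
```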
12.3 FGLS vs. OLS
-In certain cases (such as in the presence of unit roots), FGLS can fail to obtain accurate estimates; its estimates can then differ greatly from OLS
-when FGLS and OLS give similar estimates, FGLS is always preferred if autocorrelation exists
-if FGLS and OLS estimates differ greatly, more complicated statistical estimation is needed
12.5 Serial Correlation-Robust Inference
-FGLS can fail for a variety of reasons:
-explanatory variables are not strictly exogenous
-the sample size is too small
-the form of autocorrelation is unknown and more complicated than AR(1)
-in these cases, OLS standard errors can be corrected for arbitrary autocorrelation
-the estimates themselves are not adjusted, so OLS remains inefficient (much like the het-robust correction of simple OLS)
12.5 Autocorrelation-Robust Inference
-To correct standard errors for arbitrary autocorrelation, choose an integer g > 0 (generally 1 to 2 for annual data, and proportionally larger for data observed more frequently than annually), then compute:

v̂ = Σ_{t=1}^{n} (r̂_t û_t)² + 2·Σ_{h=1}^{g} [1 − h/(g+1)]·Σ_{t=h+1}^{n} (r̂_t û_t)(r̂_{t−h} û_{t−h})

-where r̂_t is the residual from regressing x₁ on all other x's and û_t is the residual from the usual OLS estimation
-After obtaining v̂, the standard errors are adjusted using:

se(β̂₁) = [se_OLS(β̂₁)/σ̂]²·√v̂

-note that this transformation is applied to each variable in turn (any variable can be listed as x₁)
-these standard errors are also robust to arbitrary heteroskedasticity
-this transformation is done using the OLS subcommand /autcov=1 in SHAZAM, but can also be done step by step:
Serial Correlation-Robust Standard Error for β̂₁

1) Regress y on all x's to obtain the residuals û, the OLS standard errors, and σ̂
2) Regress x₁ on all other x's and obtain the residuals r̂
3) Use these estimates to compute v̂ as shown previously
4) Using v̂, obtain new standard errors through:

se(β̂₁) = [se_OLS(β̂₁)/σ̂]²·√v̂
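The four steps above can be sketched in code. The two-regressor setup, the AR(1) errors, and the choice g = 2 are all illustrative assumptions:

```python
import numpy as np

# SC-robust standard error for beta1hat: full OLS, auxiliary regression of
# x1 on the other x's, vhat with Bartlett-type weights, then the rescaling.
rng = np.random.default_rng(4)
n, g = 300, 2   # illustrative assumptions
x1 = rng.normal(size=n)
x2 = 0.5 * x1 + rng.normal(size=n)
u = np.empty(n)
u[0] = rng.normal()
for t in range(1, n):
    u[t] = 0.4 * u[t - 1] + rng.normal()
y = 1.0 + 2.0 * x1 + 1.0 * x2 + u

def ols_resid(X, y):
    b = np.linalg.lstsq(X, y, rcond=None)[0]
    return b, y - X @ b

# Step 1: full OLS; keep uhat, the usual se(beta1hat), and sigmahat
X = np.column_stack([np.ones(n), x1, x2])
b, uhat = ols_resid(X, y)
sigma2 = uhat @ uhat / (n - X.shape[1])            # sigmahat^2
se_ols_b1 = np.sqrt(sigma2 * np.linalg.inv(X.T @ X)[1, 1])

# Step 2: regress x1 on the other regressors; keep rhat
_, rhat = ols_resid(np.column_stack([np.ones(n), x2]), x1)

# Step 3: vhat, with a_t = rhat_t * uhat_t
a = rhat * uhat
vhat = a @ a
for h in range(1, g + 1):
    vhat += 2 * (1 - h / (g + 1)) * (a[h:] @ a[:-h])

# Step 4: se(beta1hat) = [se_OLS(beta1hat) / sigmahat]^2 * sqrt(vhat)
se_robust_b1 = (se_ols_b1 / np.sqrt(sigma2)) ** 2 * np.sqrt(vhat)
```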
12.5 SC-Robust Notes
-Note that these serial correlation (SC) robust standard errors are poorly behaved in small samples (even samples as large as 100)
-note that g must be chosen, making this correction less than automatic
-if serial correlation is severe, this correction leaves OLS very inefficient, especially in small samples
-use this correction only if forced to (e.g., some variables are not strictly exogenous, or there are lagged dependent variables)
-the correction is like a hand grenade, not a sniper rifle
12.6 Het in Time Series
-As in cross-sectional studies, heteroskedasticity in time series studies does not cause bias or inconsistency
-it does invalidate standard errors and tests
-while robust solutions for autocorrelation may also correct for Het, the opposite is NOT true: heteroskedasticity-robust statistics do NOT correct for autocorrelation
-note also that autocorrelation is often more damaging to a model than Het (depending on the amount of autocorrelation (ρ) and the amount of Het)
12.6 Testing and Fixing Het in Time Series
In order to test for Het:
1) Serial correlation must be tested for and corrected first
2) Dynamic heteroskedasticity (see next section) must not exist

Fixing Het is the same as in the cross-sectional case:
1) WLS is BLUE if correctly specified
2) FGLS is asymptotically valid in large samples
3) Het-robust corrections are better than nothing (they don't correct the estimates, only the s.e.'s)
12.6 Dynamic Het
-Time series adds the complication that the variance of the error term may depend on explanatory variables of other periods (and thus on errors of other periods)
-Engle (1982) suggested the AUTOREGRESSIVE CONDITIONAL HETEROSKEDASTICITY (ARCH) model. A first-order ARCH (ARCH(1)) model looks like:

E(u_t² | u_{t−1}, u_{t−2}, …, X) = E(u_t² | u_{t−1}, X) = α₀ + α₁u_{t−1}²
12.6 ARCH
-The ARCH(1) model can be rewritten as:

(12.50)  u_t² = α₀ + α₁u_{t−1}² + v_t

-which is similar to the autoregressive model and has the analogous stability condition α₁ < 1
-while ARCH does not make OLS biased or inconsistent, if ARCH is present then WLS or maximum likelihood (ML) estimation is asymptotically more efficient (better estimates)
-note that the usual het-robust standard errors and test statistics remain valid under ARCH
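The rewritten form (12.50) suggests a simple way to detect ARCH(1): regress squared OLS residuals on their own lag. A minimal sketch; the data-generating values α₀ = 1.0, α₁ = 0.5, and the sample size are illustrative assumptions:

```python
import numpy as np

# Detect ARCH(1) via the regression form of (12.50): the slope in a
# regression of uhat_t^2 on uhat_{t-1}^2 estimates alpha1.
rng = np.random.default_rng(5)
n, a0, a1 = 2000, 1.0, 0.5   # illustrative assumptions
u = np.empty(n)
u[0] = rng.normal()
for t in range(1, n):
    u[t] = rng.normal() * np.sqrt(a0 + a1 * u[t - 1] ** 2)   # ARCH(1) errors
x = rng.normal(size=n)
y = 1.0 + 2.0 * x + u

# OLS residuals of the mean equation
X = np.column_stack([np.ones(n), x])
b = np.linalg.lstsq(X, y, rcond=None)[0]
uhat = y - X @ b

# Regress uhat_t^2 on a constant and uhat_{t-1}^2; the slope estimates alpha1
Z = np.column_stack([np.ones(n - 1), uhat[:-1] ** 2])
alpha_hat = np.linalg.lstsq(Z, uhat[1:] ** 2, rcond=None)[0]
```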
12.6 Het and Autocorrelation…end of the world?
-typically, serial correlation is a more serious issue than Het, as it affects standard errors and estimation efficiency more
-however, a low ρ value may make Het the more serious problem
-we've already seen that het-robust autocorrelation tests are straightforward
12.6 Het and Autocorrelation…is there hope?
If Het and autocorrelation are found, one can:
1) Fix the autocorrelation using the CO or PW method (the Auto command in SHAZAM)
2) Apply heteroskedasticity-robust standard errors to the regression (not possible through a simple SHAZAM command)

As a last resort, SC-robust standard errors are also heteroskedasticity-robust.
Alternately, Het can be corrected through a combined WLS AR(1) procedure:

y_t = β₀ + β₁x_{t1} + … + β_kx_{tk} + u_t
Var(u_t) = σ²h_t   (suppressing conditioning)
u_t = √(h_t)·v_t
v_t = ρv_{t−1} + e_t,   |ρ| < 1

Dividing through by √(h_t):

y_t/√(h_t) = β₀/√(h_t) + β₁x_{t1}/√(h_t) + … + β_kx_{tk}/√(h_t) + u_t/√(h_t)

-since u_t/√(h_t) = v_t is homoskedastic, the above equation can be estimated using CO or PW
FGLS with Heteroskedasticity and AR(1) Serial Correlation:

1) Regress y on all x's to obtain residuals û
2) Regress log(û_t²) on all x_t's (or on ŷ_t and ŷ_t²) and obtain the fitted values ĝ_t
3) Estimate h_t: ĥ_t = exp(ĝ_t)
4) Estimate the equation

y_t/√(ĥ_t) = β₀/√(ĥ_t) + β₁x_{t1}/√(ĥ_t) + … + β_kx_{tk}/√(ĥ_t) + error_t

by the Cochrane-Orcutt (CO) or Prais-Winsten (PW) method. (This corrects for the serial correlation.)
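The four steps above can be sketched in code: estimate ĥ_t from a regression of log(û²) on the x's, weight the equation by 1/√ĥ_t, then run one Cochrane-Orcutt pass on the weighted data. Everything below is simulated; the variance function and all parameter values are illustrative assumptions:

```python
import numpy as np

# Combined FGLS: WLS weights from hhat = exp(fitted log(uhat^2)), then a
# Cochrane-Orcutt pass on the weighted equation. True values: beta0 = 1,
# beta1 = 2, rho = 0.5, h_t = exp(0.5 * x_t).
rng = np.random.default_rng(6)
n = 1000   # illustrative assumption
x = rng.normal(size=n)
h = np.exp(0.5 * x)                        # true variance function h_t
v = np.empty(n)
v[0] = rng.normal()
for t in range(1, n):
    v[t] = 0.5 * v[t - 1] + rng.normal()   # AR(1) part, rho = 0.5
u = np.sqrt(h) * v                         # u_t = sqrt(h_t) * v_t
y = 1.0 + 2.0 * x + u

def ols(X, y):
    return np.linalg.lstsq(X, y, rcond=None)[0]

# Steps 1-3: OLS residuals, regress log(uhat^2) on the x's, hhat = exp(fitted)
X = np.column_stack([np.ones(n), x])
uhat = y - X @ ols(X, y)
ghat = X @ ols(X, np.log(uhat**2))
hhat = np.exp(ghat)

# Step 4: weight every column (including the constant) by 1/sqrt(hhat) ...
w = 1.0 / np.sqrt(hhat)
yw, Xw = y * w, X * w[:, None]
# ... then a Cochrane-Orcutt pass: estimate rho from the weighted residuals
# and re-fit on quasi-differenced weighted data
bw = ols(Xw, yw)
vw = yw - Xw @ bw
rho_hat = ols(vw[:-1, None], vw[1:])[0]
b_final = ols(Xw[1:] - rho_hat * Xw[:-1], yw[1:] - rho_hat * yw[:-1])
```

Note that after weighting, the "intercept" column 1/√ĥ_t is no longer constant, so it is quasi-differenced like any other column.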