Some estimates. University of California, Berkeley, Stat 153, chap3.2.pdf, 2015-09-23.
TRANSCRIPT
Some estimates.
Question. What’s an example? A sporting event?
estimate of the mean level
We will see how to estimate in Chapter 14
3.3 Regression methods.
Unbiased – proof by substitution
= 0
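A sketch of the substitution argument for unbiasedness (standard OLS reasoning; the matrix notation is an assumption, chosen to match the model Yt = β0 + β1 X1t + β2 X2t + Zt with E{Zt} = 0):

```latex
\hat{\beta} = (X^{\top}X)^{-1}X^{\top}Y
            = (X^{\top}X)^{-1}X^{\top}(X\beta + Z)
            = \beta + (X^{\top}X)^{-1}X^{\top}Z,
\qquad
E\{\hat{\beta}\} = \beta + (X^{\top}X)^{-1}X^{\top}E\{Z\} = \beta .
```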
Analysis of variance: random walk + drift
std. errors: errors assumed iid Gaussian here
betas are constants
Reminder. The betas are fixed (and the error is stationary).
Cosine trend.
NB. PHI is assumed constant
NB. −π < atan2(y, x) ≤ π, or map into [0, 2π)
Yt = β0 + β1 X1t + β2 X2t + Zt
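The cosine trend β cos(2πft + Φ) expands to β1 cos(2πft) + β2 sin(2πft) with β1 = β cos Φ and β2 = −β sin Φ, so amplitude and phase can be recovered from fitted coefficients via atan2, as the NB above suggests. A minimal sketch (the coefficient values are illustrative, not from the slides):

```python
import math

# Hypothetical fitted coefficients of cos(2*pi*f*t) and sin(2*pi*f*t).
beta1, beta2 = 3.0, -4.0

# beta1 = A*cos(phi), beta2 = -A*sin(phi), so:
amplitude = math.hypot(beta1, beta2)   # A = sqrt(beta1^2 + beta2^2)
phase = math.atan2(-beta2, beta1)      # phi, returned in (-pi, pi]
```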
More later
p-value: probability of getting a result as or more extreme than the observed test statistic (|t-value| here) under the null model
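For a two-sided test, that probability is 2·(1 − Φ(|t|)) when the null distribution of the statistic is approximated by a standard normal (an assumption here; for small samples the t distribution with the residual degrees of freedom should be used instead):

```python
import math

def two_sided_p_normal(t):
    """Two-sided p-value for an observed t statistic, using the standard
    normal as a large-sample approximation to the null distribution."""
    # Phi(|t|) via the error function, then P(|T| >= |t|) = 2*(1 - Phi(|t|))
    phi = 0.5 * (1.0 + math.erf(abs(t) / math.sqrt(2.0)))
    return 2.0 * (1.0 - phi)

p = two_sided_p_normal(1.96)   # close to the conventional 0.05 cutoff
```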
Fitted curve
3.5 Interpreting output
3.6 Residual analysis.
Yt = μt + Xt
X̂t = Yt − μ̂t
estimate μt(θ) by μ̂t = μt(θ̂)
“most statistics software will produce standardized residuals using a
more complicated standard error in the denominator that takes into
account the specific regression model being fit”
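One version of that "more complicated standard error": for simple linear regression the i-th leverage is hᵢ = 1/n + (xᵢ − x̄)²/Sxx, and the standardized residual is eᵢ / (s·√(1 − hᵢ)). A sketch with toy data (the numbers are illustrative, not from the slides):

```python
import math

x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 7.8, 10.1]   # toy data

# Ordinary least squares fit of y on x.
n = len(x)
xbar = sum(x) / n
ybar = sum(y) / n
sxx = sum((xi - xbar) ** 2 for xi in x)
b1 = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / sxx
b0 = ybar - b1 * xbar

# Raw residuals, residual standard error, leverages, standardized residuals.
resid = [yi - (b0 + b1 * xi) for xi, yi in zip(x, y)]
s = math.sqrt(sum(e ** 2 for e in resid) / (n - 2))
lev = [1.0 / n + (xi - xbar) ** 2 / sxx for xi in x]
std_resid = [e / (s * math.sqrt(1.0 - h)) for e, h in zip(resid, lev)]
```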
EDA tools.
There is a broad variety of important residual plots, including:
residuals versus fitted values μ̂t: transform the response?
residuals versus omitted explanatory variables (other X's): include them?
residuals versus index t: add a time term to the model?
residuals versus the preceding value X̂[t−1]: autocorrelation?
qqnorm: infer normality?
qqnorm(): displays the quantiles of the data versus the theoretical
quantiles of a normal distribution
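The points qqnorm() plots can be sketched directly: sort the data and pair each sample quantile with a theoretical normal quantile. The (i − 0.5)/n plotting positions below are an assumption (R uses a slightly different rule for small n):

```python
import statistics

def qqnorm_points(data):
    """Return (theoretical, sample) quantile pairs, as a normal Q-Q plot
    would display them."""
    nd = statistics.NormalDist()   # standard normal
    xs = sorted(data)
    n = len(xs)
    # Theoretical quantile for the i-th order statistic.
    theo = [nd.inv_cdf((i + 0.5) / n) for i in range(n)]
    return list(zip(theo, xs))

pts = qqnorm_points([0.3, -1.2, 0.8, 2.0, -0.5])
```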
Next. transform response?
(Figure: fitted seasonal curve; labels: winter, spring/fall, summer)
rstudent(): standardized residuals.
Could add a line
Sample autocorrelation function.
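The sample autocorrelation at lag k is rₖ = cₖ/c₀ with cₖ = (1/n)·Σₜ (xₜ − x̄)(xₜ₊ₖ − x̄). A minimal sketch (the divisor-n convention matches R's acf(); the data are illustrative):

```python
def sample_acf(x, max_lag):
    """Sample autocorrelations r_0..r_max_lag, using the divisor-n
    (biased) autocovariance estimator."""
    n = len(x)
    xbar = sum(x) / n
    c0 = sum((v - xbar) ** 2 for v in x) / n
    acf = []
    for k in range(max_lag + 1):
        ck = sum((x[t] - xbar) * (x[t + k] - xbar)
                 for t in range(n - k)) / n
        acf.append(ck / c0)
    return acf

r = sample_acf([1.0, 2.0, 3.0, 4.0, 3.0, 2.0, 1.0, 2.0, 3.0, 4.0], 3)
```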
larain correlogram example
larain is skewed: stem-and-leaf display
The decimal point is at the |

 4 | 1168958
 6 | 235544566
 8 | 01357789991368
10 | 5677790236899
12 | 01367889012677
14 | 133457801334
16 | 02577702455559
18 | 00266778929
20 | 3895
22 | 0672379
24 | 0
26 | 3382
28 |
30 | 63
32 | 3
34 | 0
36 |
38 |
40 | 3
A stochastic (regression) model
Yt = β0 + β1 X1t + β2 X2t + Zt
Zt: independent, zero-mean; white noise, or AR, MA, or ARMA(p, q)
harmonic() creates a [1 X1t X2t] matrix
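A sketch of such a design matrix, with rows [1, cos(2πt/period), sin(2πt/period)]. Whether the intercept column is included is an assumption here; in R, TSA's harmonic() returns only the cos/sin columns and lm() supplies the intercept:

```python
import math

def harmonic_matrix(n, period):
    """Design matrix for a one-harmonic cosine trend: one row per time
    t = 1..n, columns [1, cos(2*pi*t/period), sin(2*pi*t/period)]."""
    rows = []
    for t in range(1, n + 1):
        w = 2.0 * math.pi * t / period
        rows.append([1.0, math.cos(w), math.sin(w)])
    return rows

X = harmonic_matrix(12, 12)   # e.g. one year of monthly data
```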
classical approach to regression / ordinary least squares
Interpreting regression coefficients.
E{Yt | X1t , X2t} = β0 + β1 X1t + β2 X2t
If X1t increases by 1 with X2t held fixed, the expected value increases by β1.
Compare the partial derivative ∂E{Yt | X1t, X2t}/∂X1t = β1.
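A numeric illustration of this interpretation (the coefficient values are arbitrary, chosen only for the example):

```python
# E{Yt | X1t, X2t} = b0 + b1*X1t + b2*X2t: raising X1t by 1 while holding
# X2t fixed changes the expected value by exactly b1.
b0, b1, b2 = 1.0, 2.5, -0.7   # illustrative coefficients

def mean_response(x1, x2):
    return b0 + b1 * x1 + b2 * x2

delta = mean_response(4.0, 9.0) - mean_response(3.0, 9.0)   # equals b1
```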