Lecture-4: Multiple Linear Regression-Estimation


Page 1: Lecture-4: Multiple Linear Regression-Estimation

Lecture-4: Multiple Linear Regression-Estimation

In Today's Class

Recap

Simple regression model estimation

Gauss-Markov Theorem

Hand calculation of regression estimates

Multiple Regression Analysis (MLR)

Allows us to explicitly control for many factors that simultaneously affect the dependent variable

This is important for

examining theories

evaluating the effects of policy variables

MLR can accommodate many independent variables, even when they are correlated with one another, so the ceteris paribus effect of each on the dependent variable can be estimated and causal inference becomes more credible.

In such instances, simple regression analysis may be misleading or may understate the model's explanatory power

MLR Motivation

Incorporate more explanatory factors into the model

Explicitly hold fixed other factors that otherwise would be in u

Allow for more flexible functional forms

Can include as many independent variables as needed

MLR Notation

The model y = b0 + b1x1 + b2x2 + … + bkxk + u explains y in terms of x1, x2, …, xk

y: dependent variable, explained variable, response variable, …

x1, …, xk: independent variables, explanatory variables, regressors, …

u: error term, disturbance, unobservables, …

b0 is the intercept; b1, …, bk are the slope parameters

MLR Example-1

Wage equation: wage = b0 + b1educ + b2exper + u

wage: hourly wage; educ: years of education; exper: labor market experience; u: all other factors

b1 now measures the effect of education explicitly holding experience fixed

MLR Example-2

Average test score, student spending, and income: avgscore = b0 + b1expend + b2avginc + u

avgscore: average standardized test score of a school; expend: per-student spending at that school; avginc: average family income of students at that school; u: other factors

MLR Example-3

Example: family income and family consumption

The model has two explanatory variables, income and income squared: cons = b0 + b1inc + b2inc² + u

Consumption is explained as a quadratic function of income, so one has to be very careful when interpreting the coefficients: by how much does consumption increase if income is increased by one unit? It depends on how much income is already there, since Δcons = (b1 + 2b2inc)Δinc.
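As a quick numeric illustration of this quadratic marginal effect, here is a minimal sketch in Python; the coefficient values b1 and b2 are invented purely for illustration (a positive marginal effect that diminishes with income):

```python
# Hypothetical coefficients for cons = b0 + b1*inc + b2*inc^2 + u.
# The values below are invented for illustration only.
b1, b2 = 0.80, -0.002

def marginal_effect(inc):
    """d(cons)/d(inc) = b1 + 2*b2*inc: the effect of one more unit of income."""
    return b1 + 2 * b2 * inc

print(marginal_effect(10))   # ~0.76: effect of one more unit at income = 10
print(marginal_effect(100))  # ~0.40: smaller effect at higher income
```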

MLR Example-4

Example: CEO salary, sales, and CEO tenure: log(salary) = b0 + b1log(sales) + b2tenure + b3tenure² + u

The model assumes a constant-elasticity relationship between CEO salary and the sales of his or her firm, and a quadratic relationship between CEO salary and his or her tenure with the firm.

Meaning of "linear" regression: the model has to be linear in the parameters (not in the variables).

Parallels with Simple Regression

b0 is still the intercept

b1 to bk are all called slope parameters

u is still the error term (or disturbance)

Still need to make a zero conditional mean assumption, so now assume that

E(u|x1,x2, …,xk) = 0

Still minimizing the sum of squared residuals, so have k+1 first order conditions
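As a sketch of what minimizing the sum of squared residuals produces: the k+1 first order conditions are equivalent to the normal equations X'Xb = X'y. The snippet below (simulated data; the variable names are ours, not from the slides) solves them directly with NumPy:

```python
import numpy as np

# Sketch: the k+1 first order conditions are the normal equations X'X b = X'y,
# where X includes a column of ones for the intercept. Simulated data below.
rng = np.random.default_rng(0)
n = 200
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 1.0 + 2.0 * x1 - 0.5 * x2 + rng.normal(size=n)  # "population" model

X = np.column_stack([np.ones(n), x1, x2])  # n x (k+1) design matrix
b = np.linalg.solve(X.T @ X, X.T @ y)      # OLS estimates (b0, b1, b2)
print(b)  # close to (1.0, 2.0, -0.5) in a sample this size
```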

Interpreting Multiple Regression (1)

$\hat{y} = \hat{b}_0 + \hat{b}_1 x_1 + \hat{b}_2 x_2 + \dots + \hat{b}_k x_k$, so

$\Delta\hat{y} = \hat{b}_1 \Delta x_1 + \hat{b}_2 \Delta x_2 + \dots + \hat{b}_k \Delta x_k$,

so holding $x_2, \dots, x_k$ fixed implies that $\Delta\hat{y} = \hat{b}_1 \Delta x_1$, that is, each $\hat{b}$ has a ceteris paribus interpretation.

Interpreting Multiple Regression (2)

Interpretation of the multiple regression model

The multiple linear regression model manages to hold the values of other explanatory variables fixed even if, in reality, they are correlated with the explanatory variable under consideration.

Ceteris paribus interpretation: it still has to be assumed that unobserved factors do not change when the explanatory variables are changed.

By how much does the dependent variable change if the j-th independent variable is increased by one unit, holding all other independent variables and the error term constant?

MLR Estimation (1)

OLS Estimation of the multiple regression model

Random sample

Regression residuals

Minimize sum of squared residuals

Minimization will be carried out by computer

MLR Estimation (2)

Estimates can be derived from the first order conditions

Properties of OLS on any sample of data

Fitted (or predicted) values $\hat{y}_i$ and residuals $\hat{u}_i$

Algebraic properties of OLS regression:

Deviations from the regression line sum up to zero

Correlations between deviations and regressors are zero

Sample averages of y and of the regressors lie on the regression line
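The three algebraic properties listed above hold on any sample. A minimal numerical check with simulated data (names and coefficients are arbitrary):

```python
import numpy as np

# Numerical check of the three algebraic properties of OLS, on simulated data.
rng = np.random.default_rng(1)
n = 100
x1, x2 = rng.normal(size=n), rng.normal(size=n)
y = 3.0 + 1.5 * x1 - 2.0 * x2 + rng.normal(size=n)

X = np.column_stack([np.ones(n), x1, x2])
b, *_ = np.linalg.lstsq(X, y, rcond=None)
yhat = X @ b          # fitted values
uhat = y - yhat       # residuals

print(uhat.sum())                                    # ~0: residuals sum to zero
print(x1 @ uhat, x2 @ uhat)                          # ~0: uncorrelated with regressors
print(y.mean() - (b @ [1.0, x1.mean(), x2.mean()]))  # ~0: sample means lie on the line
```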

Goodness-of-fit (1)

We can think of each observation as being made up of an explained part and an unexplained part, $y_i = \hat{y}_i + \hat{u}_i$. We then define the following:

$\sum (y_i - \bar{y})^2$ is the total sum of squares (SST)

$\sum (\hat{y}_i - \bar{y})^2$ is the explained sum of squares (SSE)

$\sum \hat{u}_i^2$ is the residual sum of squares (SSR)

Then SST = SSE + SSR

Goodness-of-fit (2)

How do we think about how well our sample regression line fits our sample data?

Can compute the fraction of the total sum of squares (SST) that is explained by the model, call this the R-squared of regression

R2 = SSE/SST = 1 – SSR/SST
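A short check, on simulated data, that the two expressions SSE/SST and 1 − SSR/SST give the same R-squared:

```python
import numpy as np

# Both expressions for R-squared agree on any OLS fit (simulated data below).
rng = np.random.default_rng(2)
n = 150
x1, x2 = rng.normal(size=n), rng.normal(size=n)
y = 1.0 + x1 + 0.5 * x2 + rng.normal(size=n)

X = np.column_stack([np.ones(n), x1, x2])
b, *_ = np.linalg.lstsq(X, y, rcond=None)
yhat = X @ b
uhat = y - yhat

SST = ((y - y.mean()) ** 2).sum()
SSE = ((yhat - y.mean()) ** 2).sum()
SSR = (uhat ** 2).sum()

print(SSE / SST, 1 - SSR / SST)  # identical up to rounding
```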

Goodness-of-Fit (3)

We can also think of $R^2$ as being equal to the squared correlation coefficient between the actual $y_i$ and the fitted $\hat{y}_i$ values:

$R^2 = \dfrac{\left[\sum (y_i - \bar{y})(\hat{y}_i - \bar{\hat{y}})\right]^2}{\left[\sum (y_i - \bar{y})^2\right]\left[\sum (\hat{y}_i - \bar{\hat{y}})^2\right]}$
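A numerical sketch (simulated data) confirming that the sums-of-squares R-squared equals the squared correlation between actual and fitted values:

```python
import numpy as np

# Sketch: R-squared equals the squared correlation between actual and fitted y.
rng = np.random.default_rng(3)
n = 120
x = rng.normal(size=(n, 2))
y = 0.5 + x @ np.array([1.0, -1.0]) + rng.normal(size=n)

X = np.column_stack([np.ones(n), x])
b, *_ = np.linalg.lstsq(X, y, rcond=None)
yhat = X @ b

r2_sums = ((yhat - y.mean()) ** 2).sum() / ((y - y.mean()) ** 2).sum()
r2_corr = np.corrcoef(y, yhat)[0, 1] ** 2
print(r2_sums, r2_corr)  # equal up to rounding
```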

More about R-squared

R2 can never decrease when another independent variable is added to a regression, and usually will increase

Because R2 will usually increase with the number of independent variables, it is not a good way to compare models
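A small demonstration of the first claim: appending a pure-noise regressor to a fitted model cannot lower R-squared (data and seed are arbitrary):

```python
import numpy as np

# Demo: adding an irrelevant (pure noise) regressor cannot lower R-squared.
rng = np.random.default_rng(4)
n = 80
x1 = rng.normal(size=n)
y = 2.0 + x1 + rng.normal(size=n)

def r_squared(X, y):
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ b
    return 1 - (resid ** 2).sum() / ((y - y.mean()) ** 2).sum()

X_small = np.column_stack([np.ones(n), x1])
X_big = np.column_stack([X_small, rng.normal(size=n)])  # append pure noise
print(r_squared(X_small, y), r_squared(X_big, y))  # second value is >= the first
```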

Assumptions on MLR (1)

Standard assumptions for the multiple regression model

Assumption MLR.1 (Linear in parameters)

Assumption MLR.2 (Random sampling)

In the population, the relationship between y and the explanatory variables is linear

The data is a random sample drawn from the population

Each data point therefore follows the population equation

Assumptions on MLR (2)

Standard assumptions for the multiple regression model (cont.)

Assumption MLR.3 (No perfect collinearity): "In the sample (and therefore in the population), none of the independent variables is constant and there are no exact relationships among the independent variables"

Remarks on MLR.3

The assumption only rules out perfect collinearity/correlation between explanatory variables; imperfect correlation is allowed

If an explanatory variable is a perfect linear combination of other explanatory variables, it is superfluous and may be eliminated

Constant variables are also ruled out (collinear with intercept)

Assumptions on MLR (3)

Standard assumptions for the multiple regression model (cont.)

Assumption MLR.4 (Zero conditional mean): E(u|x1, x2, …, xk) = 0

The values of the explanatory variables must contain no information about the mean of the unobserved factors

In a multiple regression model, the zero conditional mean assumption is much more likely to hold, because fewer things end up in the error

Assumptions on MLR (4)

Discussion of the zero conditional mean assumption

Explanatory variables that are correlated with the error term are called endogenous; endogeneity is a violation of assumption MLR.4

Explanatory variables that are uncorrelated with the error term are called exogenous; MLR.4 holds if all explanatory variables are exogenous

Exogeneity is the key assumption for a causal interpretation of the regression, and for unbiasedness of the OLS estimators

Theorem 3.1 (Unbiasedness of OLS): under assumptions MLR.1 through MLR.4, $E(\hat{b}_j) = b_j$ for $j = 0, 1, \dots, k$

Unbiasedness is an average property in repeated samples; in a given sample, the estimates may still be far away from the true values

MLR Unbiasedness

Population model is linear in parameters: y = b0 + b1x1 + b2x2 +…+ bkxk + u

We can use a random sample of size n, {(xi1, xi2,…, xik, yi): i=1, 2, …, n}, from the population model, so that the sample model is yi = b0 + b1xi1 + b2xi2 +…+ bkxik + ui

E(u|x1, x2,… xk) = 0, implying that all of the explanatory variables are exogenous

None of the x’s is constant, and there are no exact linear relationships among them

A "Partialling Out" Interpretation

Consider the case where k = 2, i.e. $\hat{y} = \hat{b}_0 + \hat{b}_1 x_1 + \hat{b}_2 x_2$. Then

$\hat{b}_1 = \dfrac{\sum \hat{r}_{i1}\, y_i}{\sum \hat{r}_{i1}^2}$,

where the $\hat{r}_{i1}$ are the residuals from the estimated regression $\hat{x}_1 = \hat{\gamma}_0 + \hat{\gamma}_2 x_2$.


Economics 20 - Prof. Anderson

"Partialling Out" continued

The previous equation implies that regressing y on x1 and x2 gives the same estimated effect of x1 as regressing y on the residuals from a regression of x1 on x2.

This means only the part of xi1 that is uncorrelated with xi2 is being related to yi, so we're estimating the effect of x1 on y after x2 has been "partialled out".
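This "partialling out" equivalence is easy to verify numerically. In the sketch below (simulated data, k = 2), the slope on x1 from the full regression matches the slope from a simple regression of y on the residuals of x1 on x2:

```python
import numpy as np

# Numerical check of the "partialling out" result for k = 2 (simulated data).
rng = np.random.default_rng(5)
n = 300
x2 = rng.normal(size=n)
x1 = 0.6 * x2 + rng.normal(size=n)       # x1 correlated with x2
y = 1.0 + 2.0 * x1 - 1.0 * x2 + rng.normal(size=n)

# Full multiple regression of y on (1, x1, x2):
X = np.column_stack([np.ones(n), x1, x2])
b_full, *_ = np.linalg.lstsq(X, y, rcond=None)

# Step 1: residuals of x1 after regressing it on (1, x2)
Z = np.column_stack([np.ones(n), x2])
g, *_ = np.linalg.lstsq(Z, x1, rcond=None)
r1 = x1 - Z @ g

# Step 2: simple regression of y on those residuals
b1_partial = (r1 @ y) / (r1 @ r1)
print(b_full[1], b1_partial)  # identical up to rounding
```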

Simple vs. Multiple Regression Estimates

Compare the simple regression $\tilde{y} = \tilde{b}_0 + \tilde{b}_1 x_1$ with the multiple regression $\hat{y} = \hat{b}_0 + \hat{b}_1 x_1 + \hat{b}_2 x_2$.

Generally, $\tilde{b}_1 \neq \hat{b}_1$ unless:

$\hat{b}_2 = 0$ (i.e. no partial effect of $x_2$), OR

$x_1$ and $x_2$ are uncorrelated in the sample

Too Many or Too Few Variables

What happens if we include variables in our specification that don’t belong?

There is no effect on our parameter estimate, and OLS remains unbiased

What if we exclude a variable from our specification that does belong?

OLS will usually be biased

Omitted Variable Bias

Suppose the true model is given as $y = b_0 + b_1 x_1 + b_2 x_2 + u$, but we estimate $\tilde{y} = \tilde{b}_0 + \tilde{b}_1 x_1$. Then

$\tilde{b}_1 = \dfrac{\sum (x_{i1} - \bar{x}_1)\, y_i}{\sum (x_{i1} - \bar{x}_1)^2}$

Omitted Variable Bias (cont)

Recall the true model, so that $y_i = b_0 + b_1 x_{i1} + b_2 x_{i2} + u_i$, so the numerator becomes

$\sum (x_{i1} - \bar{x}_1)(b_0 + b_1 x_{i1} + b_2 x_{i2} + u_i) = b_1 \sum (x_{i1} - \bar{x}_1)^2 + b_2 \sum (x_{i1} - \bar{x}_1)\, x_{i2} + \sum (x_{i1} - \bar{x}_1)\, u_i$

Omitted Variable Bias (cont)

$\tilde{b}_1 = b_1 + b_2 \dfrac{\sum (x_{i1} - \bar{x}_1)\, x_{i2}}{\sum (x_{i1} - \bar{x}_1)^2} + \dfrac{\sum (x_{i1} - \bar{x}_1)\, u_i}{\sum (x_{i1} - \bar{x}_1)^2}$

Since $E(u_i) = 0$, taking expectations we have

$E(\tilde{b}_1) = b_1 + b_2 \dfrac{\sum (x_{i1} - \bar{x}_1)\, x_{i2}}{\sum (x_{i1} - \bar{x}_1)^2}$

Omitted Variable Bias (cont)

Consider the regression of $x_2$ on $x_1$: $\tilde{x}_2 = \tilde{\delta}_0 + \tilde{\delta}_1 x_1$. Then

$\tilde{\delta}_1 = \dfrac{\sum (x_{i1} - \bar{x}_1)\, x_{i2}}{\sum (x_{i1} - \bar{x}_1)^2}$

so $E(\tilde{b}_1) = b_1 + b_2 \tilde{\delta}_1$

Summary of Direction of Bias

         | Corr(x1, x2) > 0 | Corr(x1, x2) < 0
b2 > 0   | Positive bias    | Negative bias
b2 < 0   | Negative bias    | Positive bias

Omitted Variable Bias Summary

Two cases where the bias is equal to zero:

b2 = 0, that is, x2 doesn't really belong in the model

x1 and x2 are uncorrelated in the sample

If the correlations between x2, x1 and x2, y run in the same direction, the bias will be positive

If the correlations between x2, x1 and x2, y run in opposite directions, the bias will be negative
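A simulation sketch of the first cell of the direction-of-bias table: with b2 > 0 and Corr(x1, x2) > 0, the simple regression of y on x1 alone should be biased upward (all coefficients below are invented; large n keeps sampling noise small):

```python
import numpy as np

# Simulated illustration of omitted variable bias (all numbers invented):
# true model y = 1 + 1*x1 + 2*x2 + u with b2 = 2 > 0 and Corr(x1, x2) > 0,
# so the short regression of y on x1 alone should be biased upward.
rng = np.random.default_rng(6)
n = 100_000  # large n so estimates sit near their expectations
x1 = rng.normal(size=n)
x2 = 0.5 * x1 + rng.normal(size=n)                  # Corr(x1, x2) > 0
y = 1.0 + 1.0 * x1 + 2.0 * x2 + rng.normal(size=n)

b1_short = np.cov(x1, y)[0, 1] / np.var(x1, ddof=1)  # slope of y on x1 alone
delta1 = np.cov(x1, x2)[0, 1] / np.var(x1, ddof=1)   # slope of x2 on x1
print(b1_short)            # near b1 + b2*delta1 = 1 + 2*0.5 = 2, not near 1
print(1.0 + 2.0 * delta1)  # the bias formula's prediction
```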

The More General Case

Technically, we can only sign the bias in the more general case if all of the included x's are uncorrelated

Typically, then, we work through the bias assuming the x’s are uncorrelated, as a useful guide even if this assumption is not strictly true

Variance of the OLS Estimators

Now we know that the sampling distribution of our estimate is centered around the true parameter

Want to think about how spread out this distribution is

Much easier to think about this variance under an additional assumption, so assume Var(u|x1, x2, …, xk) = σ² (homoskedasticity)

Variance of OLS (cont)

Let x stand for (x1, x2,…xk)

Assuming that Var(u|x) = σ² also implies that Var(y|x) = σ²

The 4 assumptions for unbiasedness, plus this homoskedasticity assumption are known as the Gauss-Markov assumptions

Variance of OLS (cont)

Given the Gauss-Markov assumptions,

$\mathrm{Var}(\hat{b}_j) = \dfrac{\sigma^2}{SST_j\,(1 - R_j^2)}$,

where $SST_j = \sum (x_{ij} - \bar{x}_j)^2$ and $R_j^2$ is the $R^2$ from regressing $x_j$ on all other x's.
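A numerical cross-check of this formula (simulated data; σ² is set to 1 by assumption): the expression σ²/(SSTj(1 − Rj²)) agrees with the corresponding diagonal element of σ²(X'X)⁻¹, the matrix form of the same variance:

```python
import numpy as np

# Two routes to Var(b1_hat) under homoskedasticity (sigma^2 assumed known = 1):
# sigma^2 / (SST_1 * (1 - R_1^2))  versus  a diagonal entry of sigma^2 (X'X)^-1.
rng = np.random.default_rng(7)
n = 500
x1 = rng.normal(size=n)
x2 = 0.7 * x1 + rng.normal(size=n)   # correlated regressors, so R_1^2 > 0
X = np.column_stack([np.ones(n), x1, x2])
sigma2 = 1.0                          # assumed error variance

var_matrix = sigma2 * np.linalg.inv(X.T @ X)[1, 1]   # matrix form, Var(b1_hat)

Z = np.column_stack([np.ones(n), x2])                # regress x1 on the other x's
g, *_ = np.linalg.lstsq(Z, x1, rcond=None)
resid = x1 - Z @ g
SST1 = ((x1 - x1.mean()) ** 2).sum()
R2_1 = 1 - (resid ** 2).sum() / SST1
var_formula = sigma2 / (SST1 * (1 - R2_1))

print(var_matrix, var_formula)  # identical up to rounding
```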

Components of OLS Variances

The error variance: a larger s2 implies a larger variance for the OLS estimators

The total sample variation: a larger SSTj implies a smaller variance for the estimators

Linear relationships among the independent variables: a larger $R_j^2$ implies a larger variance for the estimators

Misspecified Models

Consider again the misspecified model $\tilde{y} = \tilde{b}_0 + \tilde{b}_1 x_1$, so that

$\mathrm{Var}(\tilde{b}_1) = \dfrac{\sigma^2}{SST_1}$

Thus, $\mathrm{Var}(\tilde{b}_1) < \mathrm{Var}(\hat{b}_1)$ unless $x_1$ and $x_2$ are uncorrelated, in which case they're the same.

Misspecified Models (cont)

While the variance of the estimator is smaller for the misspecified model, unless b2 = 0 the misspecified model is biased

As the sample size grows, the variance of each estimator shrinks to zero, making the variance difference less important

Estimating the Error Variance

We don't know what the error variance, σ², is, because we don't observe the errors, ui; what we observe are the residuals, ûi. We can use the residuals to form an estimate of the error variance.

Error Variance Estimate (cont)

$\hat{\sigma}^2 = \dfrac{\sum \hat{u}_i^2}{n - k - 1} = \dfrac{SSR}{df}$

thus, $se(\hat{b}_j) = \dfrac{\hat{\sigma}}{\left[SST_j\,(1 - R_j^2)\right]^{1/2}}$

df = n – (k + 1), or df = n – k – 1

df (i.e. degrees of freedom) is the (number of observations) – (number of estimated parameters)
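Putting the estimate together in code (simulated data; the true error variance is 1, so σ̂² should land near 1):

```python
import numpy as np

# Sketch: estimate sigma^2 by SSR/df and form standard errors (simulated data).
rng = np.random.default_rng(8)
n, k = 200, 2
x1, x2 = rng.normal(size=n), rng.normal(size=n)
y = 1.0 + x1 + x2 + rng.normal(size=n)   # true error variance is 1

X = np.column_stack([np.ones(n), x1, x2])
b, *_ = np.linalg.lstsq(X, y, rcond=None)
uhat = y - X @ b

df = n - (k + 1)                        # degrees of freedom
sigma2_hat = (uhat ** 2).sum() / df     # SSR / df, unbiased for sigma^2
se = np.sqrt(sigma2_hat * np.diag(np.linalg.inv(X.T @ X)))
print(sigma2_hat)  # should land near 1
print(se)          # standard errors of (b0, b1, b2)
```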

The Gauss-Markov Theorem

Given our 5 Gauss-Markov Assumptions it can be shown that OLS is “BLUE”

Best

Linear

Unbiased

Estimator

Thus, if the assumptions hold, use OLS