Multiple Regression

Page 1:

Multiple Regression

Page 2:

Introduction

• Describe some of the differences between multiple regression and bi-variate regression.

• Assess the importance of the R-squared statistic.

• Examine the F-test and the F-distribution.

• Show how we can use the F-test to determine joint significance.

Page 3:

Multiple Regression

• In general, the regression estimates are more reliable if (a short simulation sketch follows this list):

i) n is large (a large dataset)

ii) the sample variance of the explanatory variables is high

iii) the variance of the error term is small

iv) the explanatory variables are not closely related to one another
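A minimal simulation sketch of points (i) to (iii), using made-up data and a single regressor (numpy); point (iv) only arises with more than one regressor, and all numbers here are purely illustrative:

```python
# Illustrative sketch (not from the slides): how sample size, the spread of x and
# the error variance affect the precision of an OLS slope estimate. Data are simulated.
import numpy as np

rng = np.random.default_rng(0)

def slope_std(n, x_scale, sigma_u=1.0, reps=2000):
    """Monte Carlo standard deviation of the OLS slope in y = 1 + 0.5*x + u."""
    slopes = []
    for _ in range(reps):
        x = rng.normal(0, x_scale, n)          # explanatory variable
        u = rng.normal(0, sigma_u, n)          # error term
        y = 1.0 + 0.5 * x + u
        slopes.append(np.polyfit(x, y, 1)[0])  # fitted slope
    return np.std(slopes)

print(slope_std(n=30,  x_scale=1))                # baseline
print(slope_std(n=300, x_scale=1))                # (i) larger n -> more precise
print(slope_std(n=30,  x_scale=3))                # (ii) larger variance of x -> more precise
print(slope_std(n=30,  x_scale=1, sigma_u=0.3))   # (iii) smaller error variance -> more precise
```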

Page 4:

Multiple Regression

• The constant and slope parameters are derived in the same way as in the bi-variate model, by minimising the sum of the squared error terms. The equation for the slope parameters (α) contains an expression for the covariance between the explanatory variables.

• When a new variable is added, it affects the coefficients of the existing variables (see the sketch below).
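A small illustration of that last point, assuming simulated data and statsmodels' OLS routine; the variable names and coefficient values are made up:

```python
# Minimal sketch (simulated data) of how adding a correlated regressor changes
# the existing coefficient estimates.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 200
z = rng.normal(size=n)
x = 0.8 * z + rng.normal(scale=0.6, size=n)      # x and z are correlated
y = 1.0 + 0.5 * x + 0.7 * z + rng.normal(size=n)

short = sm.OLS(y, sm.add_constant(x)).fit()                       # y on x only
full  = sm.OLS(y, sm.add_constant(np.column_stack([x, z]))).fit() # y on x and z

print(short.params[1])  # slope on x is biased upwards: it picks up part of z's effect
print(full.params[1])   # closer to the true 0.5 once z is included
```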

Page 5:

Regression

$\hat{y}_t = \underset{(0.3)}{0.6} + \underset{(0.4)}{0.4}\,x_t + \underset{(0.1)}{0.9}\,z_t$

$R^2 = 0.3, \quad DW = 1.56$

(45 observations, standard errors in brackets)

Page 6:

Regression

• In the previous slide, a unit rise in x produces 0.4 of a unit rise in y, with z held constant.

• Interpretation of the t-statistics remains the same, i.e. (0.4 - 0)/0.4 = 1 against a critical value of 2.02, so we fail to reject the null and x is not significant (this is checked numerically below).

• The R-squared statistic indicates 30% of the variance of y is explained

• The DW statistic indicates we are not sure whether there is autocorrelation, as it lies in the zone of indecision ($d_L$ = 1.43, $d_U$ = 1.62)
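A quick scipy check of the t-test above; the 2.02 critical value corresponds to 45 - 3 = 42 degrees of freedom (inferred from the reported figures):

```python
# Check of the t-test reported above: coefficient 0.4, standard error 0.4,
# 45 observations and 3 estimated parameters.
from scipy import stats

coef_x, se_x = 0.4, 0.4
t_stat = (coef_x - 0) / se_x             # = 1.0
t_crit = stats.t.ppf(0.975, df=45 - 3)   # two-sided 5% critical value, ~2.02
print(t_stat, round(t_crit, 2))          # 1.0 < 2.02 -> fail to reject H0

# Given the fitted residuals, the DW statistic itself could be obtained with
# statsmodels.stats.stattools.durbin_watson(residuals).
```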

Page 7:

Adjusted R-squared Statistic

• This statistic is used in a multiple regression analysis, because it does not automatically rise when an extra explanatory variable is added.

• Its value depends on the number of explanatory variables

• It is usually written as (R-bar squared):

$\bar{R}^2$

Page 8:

Adjusted R-squared

• It generally rises when the t-statistic of an extra variable exceeds unity (1), so a rise does not necessarily imply the extra variable is significant.

• It has the following formula (n = number of observations, k = number of parameters); a numerical check follows the formula:

$\bar{R}^2 = 1 - \dfrac{n-1}{n-k}\,(1 - R^2)$
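A direct numerical check of this formula, using the figures from the regression on slide 5 (R² = 0.3, n = 45, k = 3):

```python
# Adjusted R-squared computed from the formula above.
def adjusted_r2(r2, n, k):
    """n = number of observations, k = number of parameters (incl. the constant)."""
    return 1 - (1 - r2) * (n - 1) / (n - k)

print(round(adjusted_r2(0.3, 45, 3), 3))   # ~0.267, slightly below the raw R^2 of 0.3
```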

Page 9:

The F-test

• The F-test is an analysis of the variance of a regression

• It can be used to test for the significance of a group of variables or for a restriction

• It has a different distribution to the t-test, but can be used to test at different levels of significance

• When determining the F-statistic we need to collect either the residual sum of squares (RSS) or the R-squared statistic

• The formula for the F-test of a group of variables can be expressed in terms of either the residual sum of squares (RSS) or explained sum of squares (ESS)

Page 10:

F-test of explanatory power

• This is the F-test for the goodness of fit of a regression and in effect tests for the joint significance of the explanatory variables.

• It is based on the R-squared statistic

• It is routinely produced by most computer software packages

• It follows the F-distribution, which is quite different to the t-distribution

Page 11:

F-test formula

• The formula for the F-test of the goodness of fit is:

$F = \dfrac{R^2/(k-1)}{(1-R^2)/(n-k)} \sim F_{k-1,\,n-k}$
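As a check, plugging the slide 5 regression (R² = 0.3, n = 45, k = 3) into this formula, with scipy supplying the 5% critical value:

```python
# Goodness-of-fit F-statistic computed from R-squared, as in the formula above.
from scipy import stats

def f_from_r2(r2, n, k):
    """k = number of parameters, including the constant."""
    return (r2 / (k - 1)) / ((1 - r2) / (n - k))

f_stat = f_from_r2(0.3, 45, 3)
f_crit = stats.f.ppf(0.95, dfn=2, dfd=42)   # 5% critical value of F(2, 42)
print(round(f_stat, 1), round(f_crit, 2))   # 9.0 against roughly 3.2 -> jointly significant
```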

Page 12:

F-distribution

• To find the critical value of the F-distribution, in general you need to know the number of parameters (the numerator degrees of freedom) and the remaining degrees of freedom (the denominator)

• The number of parameters is read across the top of the table and the degrees of freedom down the side. Where these two values intersect, we find the critical value.

Page 13:

F-test critical value

5% critical values of the F-distribution (numerator degrees of freedom across the top, denominator degrees of freedom down the side):

          1        2        3        4        5
  1     161.4    199.5    215.7    224.6    230.2
  2      18.5     19.0     19.2     19.3     19.3
  3      10.1      9.6      9.3      9.1      9.0
  4       7.7      7.0      6.6      6.4      6.3
  5       6.6      5.8      5.4      5.2      5.1
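These tabulated critical values can be checked against the F-distribution directly; a short scipy sketch (the output agrees with the table up to rounding):

```python
# Reproduce the 5% F critical values shown in the table above.
from scipy import stats

for denom_df in range(1, 6):
    row = [round(stats.f.ppf(0.95, dfn=num_df, dfd=denom_df), 1)
           for num_df in range(1, 6)]
    print(denom_df, row)

# e.g. stats.f.ppf(0.95, dfn=3, dfd=4) ~ 6.59, the F(3,4) value used on the next slide
```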

Page 14:

F-distribution

• Both the numerator and denominator degrees of freedom go up to infinity

• If we wanted to find the critical value for F(3,4), it would be 6.6

• The first value (3) is often termed the numerator, whilst the second (4) the denominator.

• It is often written as $F_{3,4}$

Page 15:

F-statistic

• When testing for the significance of the goodness of fit, our null hypothesis is that the coefficients on the explanatory variables are jointly equal to 0.

• If our F-statistic is below the critical value, we fail to reject the null and therefore we say the goodness of fit is not significant.

Page 16:

Joint Significance

• The F-test is useful for testing a number of hypotheses and is often used to test for the joint significance of a group of variables

• In this type of test, we often refer to ‘testing a restriction’

• This restriction is that the coefficients on a group of explanatory variables are jointly equal to 0

Page 17:

F-test for joint significance

• The formula for this test can be viewed as:

$F = \dfrac{\text{Improvement in fit}\,/\,\text{Extra degrees of freedom used up}}{\text{Residual sum of squares remaining}\,/\,\text{Degrees of freedom remaining}}$

Page 18:

F-tests

• The test for joint significance has its own formula, which takes the following form:

$F = \dfrac{(RSS_R - RSS_U)/m}{RSS_U/(n-k)}$

where:

$m$ = number of restrictions

$k$ = number of parameters in the unrestricted model

$RSS_R$ = residual sum of squares of the restricted model

$RSS_U$ = residual sum of squares of the unrestricted model
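A minimal helper implementing this statistic, assuming the definitions above (the function name is hypothetical); the same calculation appears in the worked example on the following slides:

```python
# F-statistic for testing m restrictions, as defined above.
def f_restriction(rss_r, rss_u, m, n, k):
    """rss_r/rss_u: restricted and unrestricted RSS; m: number of restrictions;
    n: observations; k: parameters in the unrestricted model."""
    return ((rss_r - rss_u) / m) / (rss_u / (n - k))
```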

Page 19:

Joint Significance of a group of variables

• To carry out this test you need to run two separate OLS regressions: one with all the explanatory variables included (the unrestricted equation), the other with the variables whose joint significance is being tested removed (the restricted equation).

• Then collect the RSS from both equations.

• Put the values into the formula.

• Find the critical value and compare it with the test statistic. The null hypothesis is that the coefficients on the variables being tested are jointly equal to 0 (a code sketch of this procedure follows).
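A sketch of that procedure on simulated data, assuming statsmodels' OLS; the variable names, coefficients and seed are made up, and `compare_f_test` is used only as a cross-check on the manual formula:

```python
# Joint-significance F-test: estimate unrestricted and restricted equations,
# then test whether the removed variables (x and z here) matter jointly.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 60
w, x, z = rng.normal(size=(3, n))
y = 0.5 + 0.8 * w + 0.4 * x + 0.3 * z + rng.normal(size=n)

unrestricted = sm.OLS(y, sm.add_constant(np.column_stack([w, x, z]))).fit()
restricted   = sm.OLS(y, sm.add_constant(w)).fit()   # x and z removed

# Plug the two RSS values into the formula (m = 2 restrictions, k = 4 parameters)...
rss_u, rss_r = unrestricted.ssr, restricted.ssr
f_manual = ((rss_r - rss_u) / 2) / (rss_u / (n - 4))

# ...or let statsmodels compare the two fitted models directly.
f_auto, p_value, df_diff = unrestricted.compare_f_test(restricted)
print(round(f_manual, 3), round(f_auto, 3))   # the two F statistics agree
```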

Page 20:

Joint Significance

• If we have a model with three explanatory variables and wish to test for the joint significance of two of them (x and z), we need to run the following unrestricted and restricted models:

Unrestricted: $y_t = \beta_0 + \beta_1 w_t + \beta_2 x_t + \beta_3 z_t + u_t$

Restricted: $y_t = \beta_0 + \beta_1 w_t + u_t$

Page 21:

Example of the F-test for joint significance

• Given the following two models, we wish to test the joint significance of w and z. Having estimated both, we collect their respective RSSs (n = 60).

Unrestricted: $y_t = \beta_0 + \beta_1 x_t + \beta_2 w_t + \beta_3 z_t + u_t, \quad RSS_U = 0.75$

Restricted: $y_t = \alpha_0 + \alpha_1 x_t + v_t, \quad RSS_R = 1.5$

Page 22:

Joint significance

where:

$\alpha_0, \beta_0$ are constants

$\alpha_1, \beta_1, \ldots, \beta_3$ are slope parameters

$u_t, v_t$ are error terms

$x_t, w_t, z_t$ are explanatory variables

Page 23:

Joint significance

• Having obtained the RSSs, we need to input the values into the earlier formula (slide 18):

$F_{2,56} = \dfrac{(1.5 - 0.75)/2}{0.75/(60-4)} = \dfrac{0.375}{0.0134} \approx 28$

critical value: 3.15
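The same arithmetic in code as a quick check; scipy's exact 5% critical value for F(2, 56) is about 3.16, consistent with the 3.15 read from tables:

```python
# Worked example: RSS_R = 1.5, RSS_U = 0.75, m = 2 restrictions, n = 60, k = 4.
from scipy import stats

rss_r, rss_u, m, n, k = 1.5, 0.75, 2, 60, 4
f_stat = ((rss_r - rss_u) / m) / (rss_u / (n - k))
f_crit = stats.f.ppf(0.95, dfn=m, dfd=n - k)
print(round(f_stat, 1), round(f_crit, 2))   # 28.0 and ~3.16
```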

Page 24:

Joint significance

• As the F statistic is greater than the critical value (28>3.15), we reject the null hypothesis and conclude that the variables w and z are jointly significant and should remain in the model.

$H_0: \beta_2 = \beta_3 = 0$

$H_1: \beta_2 \neq 0 \text{ and/or } \beta_3 \neq 0$

Page 25:

Conclusion

• Multiple regression analysis is similar to bi-variate analysis; however, correlation between the x variables needs to be taken into account

• The adjusted R-squared statistic tends to be used in this case

• The F-test is used to test for joint explanatory power of the whole regression or a sub-set of the variables

• We often use the F-test when testing for things like seasonal effects in the data.