Multiple Regression Analysis Chapter 14 McGraw-Hill/Irwin Copyright © 2012 by The McGraw-Hill Companies, Inc. All rights reserved.


Page 1:

Multiple Regression Analysis

Chapter 14

Page 2:

Topics
1. Multiple Regression: Estimation, Global Test, Individual Coefficient Test
2. Regression Assumptions and Regression Diagnostics: Error Term Distribution, Multicollinearity, Heteroscedasticity, Autocorrelation
3. Dummy Variables
4. Stepwise Regression

Page 3:

Multiple Regression Analysis

Multiple Linear Regression Model: Y = α + β1X1 + β2X2 + ··· + βkXk + ε

Estimated Regression Equation: Ŷ = a + b1X1 + b2X2 + ··· + bkXk

• Y is the dependent variable and X1, X2, …, Xk are the independent variables.
• α, β1, β2, …, βk are population coefficients that need to be estimated using sample data.
• ε is the error term.
• The model represents the linear relationship between the dependent variable and the independent variables in the population.
• a and b1, b2, …, bk are estimated coefficients from the sample.
• bi is the net change in Y for each unit change in Xi, holding the other X's constant.
• The least squares criterion is used to develop this equation.
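The least squares criterion above can be sketched numerically with NumPy. The data below are made up purely for illustration (they are not the Salsberry sample), and the model is shortened to two predictors:

```python
import numpy as np

# Hypothetical data generated from Y = 3 + 2*X1 + 0.5*X2 (no noise),
# so least squares should recover a = 3, b1 = 2, b2 = 0.5 exactly.
X1 = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
X2 = np.array([2.0, 1.0, 4.0, 3.0, 5.0])
Y = 3 + 2 * X1 + 0.5 * X2

# Design matrix with a leading column of ones for the intercept a.
X = np.column_stack([np.ones_like(X1), X1, X2])

# Minimize ||Y - X @ coeffs||^2; coeffs = [a, b1, b2].
coeffs, *_ = np.linalg.lstsq(X, Y, rcond=None)
print(coeffs)  # approximately [3.0, 2.0, 0.5]
```

In practice the same estimates come from Excel's Regression tool; the sketch only shows what "least squares" computes.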

Page 4:

Multiple Linear Regression – Example

Salsberry Realty sells homes along the east coast of the United States. One of the questions most frequently asked by prospective buyers is: If we purchase this home, how much can we expect to pay to heat it during the winter? The research department at Salsberry has been asked to develop some guidelines regarding heating costs for single-family homes.

Three variables are thought to relate to the heating costs: (1) the mean daily outside temperature, (2) the number of inches of insulation in the attic, and (3) the age in years of the furnace.

To investigate, Salsberry’s research department selected a random sample of 20 recently sold homes. It determined the cost to heat each home last January, as well as the January outside temperature in the region, the number of inches of insulation in the attic, and the age of the furnace.

[Data table: heating cost (Y) and the three predictors X1 (temperature), X2 (insulation), and X3 (furnace age) for the 20 sampled homes.]

Page 5:

Multiple Linear Regression – Excel Output

SUMMARY OUTPUT

Regression Statistics
  Multiple R          0.896755
  R Square            0.80417
  Adjusted R Square   0.767452
  Standard Error      51.04855
  Observations        20

ANOVA
            df        SS        MS         F    Significance F
Regression   3  171220.5  57073.49  21.90118   6.56E-06
Residual    16  41695.28  2605.955
Total       19  212915.8

            Coefficients  Standard Error    t Stat   P-value  Lower 95%  Upper 95%
Intercept       427.1938       59.60143  7.167509  2.24E-06   300.8444   553.5432
Temp            -4.58266       0.772319  -5.93364   2.1E-05   -6.21991   -2.94542
Insul           -14.8309       4.754412  -3.11939  0.006606   -24.9098   -4.75196
Age             6.101032        4.01212   1.52065  0.147862   -2.40428   14.60635

(The intercept a and the slopes b1, b2, and b3 appear in the Coefficients column.)

See the Excel instructions in the textbook, p. 566, #2.

Page 6:

Estimating the Multiple Regression Equation

Interpreting the Regression Coefficients

The regression coefficient for mean outside temperature, X1, is -4.583. For every unit increase in temperature, holding the other two independent variables constant, monthly heating cost is expected to decrease by $4.583.

The attic insulation variable, X2, also shows a negative relationship. For each additional inch of insulation, the cost to heat the home is expected to decline by $14.83 per month.

The age of the furnace variable shows a positive relationship. For each additional year older the furnace is, the cost is expected to increase by $6.10 per month.

Page 7:

Using the Multiple Regression Equation

Applying the Model for Estimation

What is the estimated heating cost for a home if the mean outside temperature is 30 degrees, there are 5 inches of insulation in the attic, and the furnace is 10 years old?

Ŷ = 427.194 − 4.583X1 − 14.831X2 + 6.101X3
Ŷ = 427.194 − 4.583(30) − 14.831(5) + 6.101(10)
Ŷ = 276.56
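The arithmetic above can be checked directly (coefficients rounded as in the slide):

```python
# Plug the three predictor values into the estimated equation
# Y-hat = 427.194 - 4.583*X1 - 14.831*X2 + 6.101*X3.
a, b1, b2, b3 = 427.194, -4.583, -14.831, 6.101
temp, insul, age = 30, 5, 10
y_hat = a + b1 * temp + b2 * insul + b3 * age
print(round(y_hat, 2))  # 276.56
```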

Page 8:

Fitness of the Model: The Adjusted R2

1. R2 is inflated by the number of independent variables.
2. In multiple regression analysis, the adjusted R2 is a better measure of the fit of the model.
3. It ranges from 0 to 1.
4. The adjusted R2 adjusts for the number of independent variables and the sample size.
5. It measures the percentage of total variation in Y that is explained by all the independent variables, that is, by the regression model.

SUMMARY OUTPUT

Regression Statistics
  Multiple R          0.896755
  R Square            0.80417
  Adjusted R Square   0.767452
  Standard Error      51.04855
  Observations        20

About 76.7% of the variation in heating cost is explained by the mean outside temperature, attic insulation, and the age of the furnace.
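The slide does not show the adjustment formula itself; the standard form, adjusted R2 = 1 − (1 − R2)(n − 1)/(n − k − 1), reproduces the value in the output from R2 = 0.80417, n = 20 observations, and k = 3 independent variables:

```python
# Reproduce the Adjusted R Square in the summary output from
# R Square, the sample size n, and the number of predictors k.
r2, n, k = 0.80417, 20, 3
adj_r2 = 1 - (1 - r2) * (n - 1) / (n - k - 1)
print(round(adj_r2, 6))  # 0.767452
```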

Page 9:

Global Test: Testing the Multiple Regression Model

The global test is used to investigate whether any of the independent variables have coefficients that are significantly different from zero. This is also a test of the validity of the model.

The hypotheses are:

H0: β1 = β2 = ··· = βk = 0 (the model is invalid)
H1: Not all βs equal 0 (the model is valid)

Decision rules:
(1) Reject H0 if F > Fα,k,n−k−1, or
(2) Reject H0 if p-value < α

Page 10:

F-distribution
• The distribution takes nonnegative values only.
• It is asymmetric, skewed to the right.
• The shape of the distribution is controlled by two degrees of freedom, denoted v1 and v2.
• The degrees of freedom are usually reported in the ANOVA table in the output.

ANOVA
            df        SS        MS         F    Significance F
Regression   3  171220.5  57073.49  21.90118   6.56E-06
Residual    16  41695.28  2605.955
Total       19  212915.8

Excel function: =FINV(α, k, n-k-1)
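The same lookup can be done outside Excel; a sketch assuming SciPy is available, using the inverse CDF of the F distribution:

```python
from scipy.stats import f

# Python counterpart of Excel's =FINV(0.05, 3, 16): the upper
# 5% point of the F distribution with (3, 16) degrees of freedom.
alpha, k, df_resid = 0.05, 3, 16
f_crit = f.ppf(1 - alpha, k, df_resid)
print(round(f_crit, 2))  # 3.24
```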

Page 11:

Global Test – Example

ANOVA
            df        SS        MS         F    Significance F
Regression   3  171220.5  57073.49  21.90118   6.56E-06
Residual    16  41695.28  2605.955
Total       19  212915.8

1. Hypotheses:
   H0: β1 = β2 = β3 = 0
   H1: Not all βs equal 0
2. Significance level: α = 0.05
3. Test statistic: F = 21.90

Page 12:

Global Test – Example

ANOVA
            df        SS        MS         F    Significance F
Regression   3  171220.5  57073.49  21.90118   6.56E-06
Residual    16  41695.28  2605.955
Total       19  212915.8

4. Rule (1): Rejection region: Reject H0 if F > 3.24, where 3.24 = FINV(.05, 3, 16). According to step 3, F = 21.90, which falls in the rejection region.
   Rule (2): Reject H0 if p-value < α; here p-value = 0.00, less than 0.05.
5. Decision: reject the null hypothesis.

Page 13:

Interpretation

The null hypothesis that all the multiple regression coefficients are zero is rejected.

Interpretation:
• Some of the independent variables are useful in predicting the dependent variable (heating cost).
• Some of the independent variables are linearly related to the dependent variable.
• The model is valid.

Logical question: which ones?

Page 14:

Evaluating Individual Regression Coefficients (βi)

This test is used to determine which independent variables have nonzero regression coefficients.

The variables that have nonzero regression coefficients are said to have significant coefficients (significantly different from zero).

The variables that have zero regression coefficients can be dropped from the analysis.

The test statistic follows the t distribution. The hypotheses tested are:

H0: βi = 0
H1: βi ≠ 0

Instead of comparing the test statistic with the rejection region for each independent variable (which is tedious), we rely on the p-values. If p-value < α, we reject the null hypothesis.
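A sketch of where those p-values come from, assuming SciPy is available; the t statistic for Temp and df = n − k − 1 = 16 are taken from the Excel output shown earlier:

```python
from scipy.stats import t

# Two-sided p-value for an individual coefficient test,
# using the t statistic for Temp from the Excel output.
t_stat, df = -5.93364, 16
p_value = 2 * (1 - t.cdf(abs(t_stat), df))
print(p_value)  # about 2.1e-05, matching the output
```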

Page 15:

P-values for the Slopes

            Coefficients  Standard Error    t Stat   P-value  Lower 95%  Upper 95%
Intercept       427.1938       59.60143  7.167509  2.24E-06   300.8444   553.5432
Temp            -4.58266       0.772319  -5.93364   2.1E-05   -6.21991   -2.94542
Insul           -14.8309       4.754412  -3.11939  0.006606   -24.9098   -4.75196
Age             6.101032        4.01212   1.52065  0.147862   -2.40428   14.60635

For temperature: H0: β1 = 0, H1: β1 ≠ 0, p-value = .00 < .05
For insulation:  H0: β2 = 0, H1: β2 ≠ 0, p-value = .007 < .05
For furnace age: H0: β3 = 0, H1: β3 ≠ 0, p-value = .148 > .05

Conclusions: For temperature and insulation, reject the null hypothesis.
(1) The coefficients are significant (significantly different from zero);
(2) The variables are linearly related to heating cost;
(3) The variables are useful in predicting heating cost.

For furnace age, do not reject the null hypothesis.
(4) The coefficient is insignificant, and thus the variable can be dropped from the model;
(5) The variable is not linearly related to heating cost;
(6) The variable is not useful in predicting heating cost.

Page 16:

New Regression without Variable "Age"

SUMMARY OUTPUT

Regression Statistics
  Multiple R          0.880834
  R Square            0.775868
  Adjusted R Square   0.7495
  Standard Error      52.98237
  Observations        20

ANOVA
            df        SS        MS         F    Significance F
Regression   2  165194.5  82597.26  29.42408   3.01E-06
Residual    17  47721.23  2807.131
Total       19  212915.8

            Coefficients  Standard Error    t Stat   P-value  Lower 95%  Upper 95%
Intercept       490.2859       44.40984  11.04003  3.56E-09   396.5893   583.9825
Temp            -5.14988       0.701887   -7.3372  1.16E-06   -6.63074   -3.66903
Insul           -14.7181       4.933918  -2.98305  0.008351   -25.1278   -4.30849

Page 17:

New Regression Model without Variable "Age"

ANOVA
            df        SS        MS         F    Significance F
Regression   2  165194.5  82597.26  29.42408   3.01E-06
Residual    17  47721.23  2807.131
Total       19  212915.8

1. Hypotheses:
   H0: β1 = β2 = 0
   H1: Not all βs equal 0
2. Significance level: α = 0.05
3. Test statistic: F = 29.42
4. Rejection region: Reject H0 if F > 3.59, where 3.59 = FINV(.05, 2, 17) with d.f. (2, 17). The test statistic falls in the rejection region; the p-value = 0.00, less than 0.05.
5. Decision: reject the null hypothesis.

Page 18:

Individual t-tests on the New Coefficients

            Coefficients  Standard Error    t Stat   P-value  Lower 95%  Upper 95%
Intercept       490.2859       44.40984  11.04003  3.56E-09   396.5893   583.9825
Temp            -5.14988       0.701887   -7.3372  1.16E-06   -6.63074   -3.66903
Insul           -14.7181       4.933918  -2.98305  0.008351   -25.1278   -4.30849

For temperature: H0: β1 = 0, H1: β1 ≠ 0, p-value = .00 < .05
For insulation:  H0: β2 = 0, H1: β2 ≠ 0, p-value = .008 < .05

Conclusions: For temperature and insulation, reject the null hypothesis.
(1) The coefficients are significant (significantly different from zero);
(2) The variables are linearly related to heating cost;
(3) The variables are useful in predicting heating cost.

Page 19:

Multiple Regression Assumptions

I. Each of the independent variables has a linear relationship with the dependent variable.
II. The independent variables are not correlated. When this assumption is violated, we call the condition multicollinearity.
III. The probability distribution of ε is normal.
IV. The variance of ε is constant regardless of the value of Ŷ. This condition is called homoscedasticity. When the requirement is violated, we say heteroscedasticity is observed in the regression.
V. The error terms are independent of each other. This assumption is often violated when time is involved, and we call this condition autocorrelation.

Page 20:

Evaluating the Assumptions of Multiple Regression

I. There is a linear relationship. We use scatter plots to examine this assumption.
II. The independent variables are not correlated. We examine the correlation coefficients among the independent variables.
III. The error term follows the normal probability distribution. We use a histogram of the residuals or a normal probability plot to examine normality.
IV. The variance of ε is constant regardless of the value of Ŷ.
V. The error terms are independent of each other.

We plot the residuals against the predicted Y to examine the last two assumptions.

Page 21:

Assumption I: Linear Relationship

A scatter plot of each independent variable against the dependent variable is used.

[Scatter plots of heating cost against Temp, Insul, and Age.]

In practice, we can skip this check, since the test on each individual coefficient serves the same purpose.

Page 22:

Assumption II: Multicollinearity

Multicollinearity exists when independent variables (X's) are correlated.

Effects of Multicollinearity on the Model:

1. An independent variable known to be an important predictor ends up having an insignificant coefficient.

2. A regression coefficient that should have a positive sign turns out to be negative, or vice versa.

3. Multicollinearity adds difficulty to the interpretation of the coefficients. When one variable changes by 1 unit, other correlated variables will also change (but we require them to be held constant in order to correctly interpret the coefficient).

However, correlated independent variables do not affect a multiple regression equation’s ability to predict the dependent variable (Y).

Minimizing the effect of multicollinearity is often easier than correcting it:

1. Try to include explanatory variables that are independent of each other.

2. Remove variables that cause multicollinearity in the model.

Page 23:

Multicollinearity: Detection

A general rule is that if the correlation between two independent variables is between -0.70 and 0.70, there is likely no problem in using both of the independent variables.

A more precise test is to use the variance inflation factor (VIF). A VIF > 10 is unsatisfactory; remove that independent variable from the analysis. The value of the VIF is found as follows:

VIF = 1 / (1 − Rj²)

The term Rj² refers to the coefficient of determination from the regression in which the selected independent variable is used as the dependent variable and the remaining independent variables are used as the independent variables.
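As a quick check of the formula, using Rj² = 0.241403 (the R Square from regressing Temp on the other two predictors, as in the worked example that follows):

```python
# VIF for one predictor from the R-square of the auxiliary
# regression of that predictor on the remaining predictors.
r2_j = 0.241403
vif = 1 / (1 - r2_j)
print(round(vif, 2))  # 1.32
```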

Page 24:

Multicollinearity – Example

Refer to the heating cost example, in which heating cost is related to the independent variables outside temperature, amount of insulation, and age of furnace. Develop a correlation matrix for all the independent variables. Does it appear there is a problem with multicollinearity?

Correlation Matrix (Excel: Data -> Data Analysis -> Correlation)

       Temp  Insul   Age
Temp   1.00
Insul -0.10   1.00
Age   -0.49   0.06  1.00

None of the correlations among the independent variables exceed -.70 or .70, so we do not suspect problems with multicollinearity.

Page 25:

VIF – Example

Find and interpret the variance inflation factor for each of the independent variables.

We consider the variable temperature first. We run a multiple regression with temperature as the dependent variable and the other two as the independent variables.

SUMMARY OUTPUT

Regression Statistics
  Multiple R          0.491328
  R Square            0.241403
  Adjusted R Square   0.152157
  Standard Error      16.03105
  Observations        20

            Coefficients  Standard Error    t Stat   P-value
Intercept       57.99449       12.34827  4.696567  0.000208
Insul           -0.50888       1.487944    -0.342  0.736541
Age             -2.50902       1.103252   -2.2742  0.036201

The R Square (coefficient of determination) of this auxiliary regression gives VIF = 1/(1 − 0.241403) ≈ 1.32. The VIF value of 1.32 is less than the upper limit of 10. This indicates that the independent variable temperature is not strongly correlated with the other independent variables.

Page 26:

VIF – Example

Calculating the VIF for each variable using Excel can be tedious. Minitab generates the VIF values for each independent variable in its output.

None of the VIFs is higher than 10; hence, we conclude there is not a problem with multicollinearity in this example.

Note: for your project, first obtain the correlation matrix. For variables associated with correlation coefficients exceeding -.70 or .70, calculate the corresponding VIFs to further determine whether multicollinearity is an issue.

Page 27:

Assumption III: Normality of the Error Term

A histogram (discussed in review) of the residuals is used to visually determine whether the assumption of normality is satisfied.

Excel offers another graph, the normal probability plot, that helps to evaluate this assumption. Basically, if the plotted points are fairly close to a straight line drawn from the lower left to the upper right, the normality assumption is satisfied.

[Normal probability plot: Cost against sample percentile.]

Page 28:

Assumptions IV & V

[Scatter plot of residuals against predicted cost.]

As we can see from the scatter plot, the residuals are randomly distributed across the horizontal axis and show no obvious pattern. Therefore, there is no sign of heteroscedasticity or autocorrelation.

Page 29:

Residual Plot versus Fitted Values: Testing the Heteroscedasticity Assumption

When the variance of the error term changes across different values of Ŷ, we refer to this condition as heteroscedasticity.

In the plot of the residuals against the predicted values of Y, we look for a change in the spread of the plotted points.

[Example residual plot in which the spread of the points increases as the predicted value of Y increases. A scatter plot such as this would indicate possible heteroscedasticity.]

Page 30:

Residual Plot versus Fitted Values: Testing the Independence Assumption

When successive residuals are correlated, we refer to this condition as autocorrelation, which frequently occurs when the data are collected over a period of time.

[Example residual plot: note the run of residuals above the mean of the residuals, followed by a run below the mean. A scatter plot such as this would indicate possible autocorrelation.]

Page 31:

Dummy Variable

A dummy variable is a variable that can assume only one of two values (usually 1 and 0), where 1 represents the existence of a certain condition and 0 indicates that the condition does not hold.

Usually, categorical or nominal data cannot be included in the analysis directly. Instead, we need to use dummy variables to denote the categories.

Notation: I = 1 if the condition holds, 0 otherwise.

Page 32:

Dummy Variable – Example

Suppose in the Salsberry Realty example that the independent variable "garage" is added, which indicates whether a house comes with an attached garage or not. To include this variable in our analysis, we define a dummy variable as follows: for homes without an attached garage, 0 is used; for homes with an attached garage, 1 is used.

I = 1 if the home has an attached garage, 0 otherwise.

Page 33:

Dummy Variable – Example

SUMMARY OUTPUT

Regression Statistics
  Multiple R          0.932651
  R Square            0.869838
  Adjusted R Square   0.845433
  Standard Error      41.61842
  Observations        20

ANOVA
            df        SS        MS         F    Significance F
Regression   3  185202.3  61734.09  35.64133   2.59E-07
Residual    16  27713.48  1732.093
Total       19  212915.8

            Coefficients  Standard Error    t Stat   P-value  Lower 95%  Upper 95%
Intercept       393.6657       45.00128  8.747876  1.71E-07   298.2672   489.0641
Temp            -3.96285       0.652657  -6.07186  1.62E-05   -5.34642   -2.57928
Insul            -11.334       4.001531   -2.8324   0.01201   -19.8168   -2.85109
Garage           77.4321       22.78282  3.398706   0.00367   29.13468   125.7295

New estimated regression equation:

Ŷ = a + b1X1 + b2X2 + b3I
Ŷ = 393.7 − 3.96X1 − 11.3X2 + 77.4I

Page 34:

Dummy Variable – Example

Interpretation:

Ŷ = 393.7 − 3.96X1 − 11.3X2 + 77.4I

b3 = 77.4: the heating cost for homes with an attached garage is on average $77.40 higher than for homes without an attached garage, other conditions being the same.

Page 35:

Dummy Variable – Another Example

What determines the value of a used car? To examine this issue, a used-car dealer randomly selected 100 3-year-old Toyota Camrys that were sold at auction during the past month. Each car was in top condition and equipped with all the features that come standard with this model. The dealer recorded the price ($1,000), the number of miles (thousands) on the odometer, and the color of the car. When recording the color, the dealer uses 1 to denote white, 2 to denote silver, and 3 to denote other colors.

Page 36:

Dummy Variable – Another Example

Although the variable color contains the numbers 1, 2, and 3, it cannot be included in the analysis directly. Instead, we need to generate dummy variables to denote the different categories.

Rule of assigning dummy variables: if there are m different categories in the data, generate m-1 dummy variables. The last category is represented by I1 = I2 = … = Im-1 = 0 , and is called the omitted category.

Since there are three categories in variable color, we generate two dummy variables defined as follows:

“Other colors” is the omitted category and is represented by

I1 = I2 = 0

I1 = 1 if white, 0 otherwise
I2 = 1 if silver, 0 otherwise
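The m − 1 rule can be sketched in plain Python; the color codes below are made up for illustration:

```python
# Color codes: 1 = white, 2 = silver, 3 = other. With m = 3
# categories we create m - 1 = 2 dummies; "other" is the
# omitted category, represented by I1 = I2 = 0.
colors = [1, 2, 3, 1, 3, 2]
I1 = [1 if c == 1 else 0 for c in colors]  # white
I2 = [1 if c == 2 else 0 for c in colors]  # silver
print(I1)  # [1, 0, 0, 1, 0, 0]
print(I2)  # [0, 1, 0, 0, 0, 1]
```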

Page 37:

Dummy Variable – Excel

• Open the data file Toyota Camry.
• In the column next to "color," type "I1" to create the dummy variable for "white." In the cell below it, type =IF(C2=1, 1, 0) and hit Enter. (Excel function: IF(logical_test, [value_if_true], [value_if_false]).)
• Copy the cell and paste it into the rest of the cells in the column until the cell in the previous column is empty.
• Similarly, generate the dummy variable for "silver" in the next column by typing =IF(C2=2, 1, 0) and following the same procedure.
• To run the regression, we need to put the explanatory variables together. Copy the Odometer column and paste it into the column next to the second dummy variable.
• Run the multiple regression using the two dummy variables and Odometer.

Page 38:

Dummy Variable – Excel

[Worksheet screenshot: =IF(C2=1, 1, 0) fills the I1 column and =IF(C2=2, 1, 0) fills the I2 column.]

Page 39:

Dummy Variable – Excel

SUMMARY OUTPUT

Regression Statistics
  Multiple R          0.837135
  R Square            0.700794
  Adjusted R Square   0.691444
  Standard Error      0.304258
  Observations        100

ANOVA
            df        SS        MS         F    Significance F
Regression   3  20.81492  6.938306   74.9498   4.65E-25
Residual    96  8.886981  0.092573
Total       99   29.7019

            Coefficients  Standard Error    t Stat   P-value  Lower 95%  Upper 95%
Intercept       16.83725       0.197105  85.42255  2.28E-92     16.446    17.2285
I1              0.091131       0.072892  1.250224  0.214257   -0.05356   0.235819
I2              0.330368        0.08165  4.046157  0.000105   0.168294   0.492442
Odometer        -0.05912       0.005065  -11.6722  4.04E-20   -0.06918   -0.04907

Estimated regression equation:

Ŷ = a + b1I1 + b2I2 + b3X
Ŷ = 16.84 + 0.09I1 + 0.33I2 − 0.059X

Page 40:

Dummy Variable – Interpretation

The coefficient of I1, b1 = 0.0911: a white Camry sells on average for 0.0911 thousand, or $91.10, more than other colors (nonwhite, nonsilver) with the same odometer reading.

The coefficient of I2, b2 = 0.3304: a silver Camry sells on average for 0.3304 thousand, or $330.40, more than other colors (nonwhite, nonsilver) with the same odometer reading.
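A small worked prediction from the fitted equation; the odometer reading below is made up for illustration:

```python
# Predicted price (in $1,000) of a silver Camry (I1 = 0, I2 = 1)
# with 40 (thousand) miles on the odometer, from
# Y-hat = 16.84 + 0.09*I1 + 0.33*I2 - 0.059*X.
a, b1, b2, b3 = 16.84, 0.09, 0.33, -0.059
price = a + b1 * 0 + b2 * 1 + b3 * 40
print(round(price, 2))  # 14.81
```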

Page 41:

Stepwise Regression

The advantages of the stepwise method are:
1. Only independent variables with significant regression coefficients are entered into the equation.
2. The steps involved in building the regression equation are clear.
3. It is efficient in finding the regression equation with only significant regression coefficients.
4. The changes in the multiple standard error of estimate and the coefficient of determination are shown.

Page 42:

Stepwise Regression – Minitab Example

The stepwise Minitab output for the heating cost problem follows. Temperature is selected first; this variable explains more of the variation in heating cost than any other proposed independent variable. Garage is selected next, followed by Insulation. The variable Age is not selected.