Ch 11: Correlations (pt. 2) and Ch 12: Regression (pt.1) Nov. 13, 2014


Page 1:

Ch 11: Correlations (pt. 2) and Ch 12: Regression (pt. 1)

Nov. 13, 2014

Page 2:

Hypothesis Testing for Corr

• Same hypothesis testing process as before:

• 1) State research & null hypotheses
  – Null hypothesis states there is no relationship between variables (correlation in pop = 0)
  – Notation for population corr is rho (ρ)
  – Null: ρ = 0 (no relationship betw gender & ach)
  – Research hyp: ρ ≠ 0 (there is a signif relationship betw gender & ach)

Page 3:

(cont.)

• The appropriate statistic for testing the signif of a correlation (r) is a t statistic

• Formula changes slightly to calculate t for a correlation:

  t = r √(N − 2) / √(1 − r²)

• Need to know r and sample size

Page 4:

• Find the critical value to use for your comparison distribution – it will be a t value from your t table, with N-2 df

• Use same decision rule as with t-tests:
  – If (abs value of) t obtained > (abs value of) t critical, reject null hypothesis and conclude correlation is significantly different from 0.

Page 5:

Example

• For sample of 35 employees, correlation between job dissatisfaction & stress = .48

• Is that significantly greater than 0?

• Research hyp: job dissat & stress are significantly positively correlated (ρ > 0)

• Null hyp: job dissat & stress are not correlated (ρ = 0)

• Note 1-tailed test, use alpha = .05
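The example can be checked numerically. A minimal Python sketch (variable names are my own) using the standard t formula for a correlation, t = r √(N − 2) / √(1 − r²):

```python
import math

r = 0.48   # sample correlation between job dissatisfaction and stress
N = 35     # sample size
df = N - 2 # degrees of freedom = 33

# t statistic for testing whether r differs significantly from 0
t_obtained = r * math.sqrt(df) / math.sqrt(1 - r**2)
print(round(t_obtained, 3))  # 3.143

# One-tailed critical t for alpha = .05, df = 33, is about 1.69 (from a t table),
# so t obtained > t critical: reject the null; the correlation is signif > 0.
```

The decision rule is the same comparison of |t obtained| to |t critical| described on the previous slide.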

Page 6:

Regression

• Predictor and Criterion Variables

• Predictor variable (X) – variable used to predict something (the criterion)

• Criterion variable (Y) – variable being predicted (from the predictor!)
  – Ex: Use GRE scores (predictor) to predict your success in grad school (criterion)

Page 7:

Prediction Model

• Direct raw-score prediction model
  – Predicted raw score (on criterion variable) = regression constant plus the result of multiplying a raw-score regression coefficient by the raw score on the predictor variable
  – Formula:

    Ŷ = a + (b)(X)

    b = regression coefficient (not standardized)
    a = regression constant

Page 8:

• The regression constant (a)
  – Predicted raw score on criterion variable when raw score on predictor variable is 0 (where regression line crosses the y axis)
• Raw-score regression coefficient (b)
  – How much the predicted criterion variable increases for every increase of 1 on the predictor variable (slope of the reg line)

Page 9:

Correlation Example: Info needed to compute Pearson's r correlation

 x   y   (x-Mx)  (x-Mx)²  (y-My)  (y-My)²  (x-Mx)(y-My)
 6   6     2.4     5.76      2       4         4.8
 1   2    -2.6     6.76     -2       4         5.2
 5   6     1.4     1.96      2       4         2.8
 3   4     -.6      .36      0       0          0
 3   2     -.6      .36     -2       4         1.2

Mx=3.6  My=4.0  (sum=0)  SSx=15.2  (sum=0)  SSy=16  SP=14.0

Refer to this total as SP (sum of products)
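The table's column totals can be reproduced with a short Python sketch over the five (x, y) pairs from the slide:

```python
# The five raw-score pairs from the slide's table
x = [6, 1, 5, 3, 3]
y = [6, 2, 6, 4, 2]

Mx = sum(x) / len(x)  # mean of x = 3.6
My = sum(y) / len(y)  # mean of y = 4.0

SSx = sum((xi - Mx) ** 2 for xi in x)                     # sum of squares for x
SSy = sum((yi - My) ** 2 for yi in y)                     # sum of squares for y
SP  = sum((xi - Mx) * (yi - My) for xi, yi in zip(x, y))  # sum of products

print(round(SSx, 1), round(SSy, 1), round(SP, 1))  # 15.2 16.0 14.0
```

These three totals (SSx, SSy, SP) are all that the next slide's a and b formulas need.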

Page 10:

Formulas for a and b

• First, start by finding the regression coefficient (b):

  b = SP / SSx  (slope)

• Next, find the regression constant or intercept, (a):

  a = My − (b)(Mx)  (intercept)

This is known as the "Least Squares Solution" or 'least squares regression'

Page 11:

Computing regression line (with raw scores)

 X   Y
 6   6
 1   2
 5   6
 3   4
 3   2
mean 3.6  4.0

SSX = 15.20,  SSY = 16.0,  SP = 14.0

slope b = SP / SSX = 14 / 15.20 = 0.92

intercept a = My − (b)(Mx) = 4.0 − (0.92)(3.6) = 0.688

Ŷ = .688 + .92(x)
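The whole least-squares computation above fits in a few lines of Python (the `predict` helper is my own addition; b is rounded to two decimals before computing a, matching the slide's arithmetic):

```python
# Same five raw-score pairs as in the correlation table
x = [6, 1, 5, 3, 3]
y = [6, 2, 6, 4, 2]

Mx, My = sum(x) / len(x), sum(y) / len(y)
SSx = sum((xi - Mx) ** 2 for xi in x)
SP  = sum((xi - Mx) * (yi - My) for xi, yi in zip(x, y))

b = round(SP / SSx, 2)  # slope: 14 / 15.20 = 0.92 (rounded as on the slide)
a = My - b * Mx         # intercept: 4.0 - (0.92)(3.6) = 0.688

def predict(x_new):
    """Predicted raw score on the criterion: Y-hat = a + b * x_new."""
    return a + b * x_new

print(b, round(a, 3))  # 0.92 0.688
```

For example, `predict(2)` plugs x = 2 into Ŷ = .688 + .92(x), giving about 2.53.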

Page 12:

Interpreting ‘a’ and ‘b’

• Let’s say that x = # hrs studied and y = test score (on 0-10 scale)
• Interpreting ‘a’:
  – when x=0 (study 0 hrs), expect a test score of .688
• Interpreting ‘b’:
  – for each extra hour you study, expect an increase of .92 pts

Page 13:

Correlation in SPSS

• Analyze → Correlate → Bivariate
  – Choose as many variables as you’d like in your correlation matrix → OK
  – Will get matrix with 3 rows of output for each combination of variables
• Notice that the diagonal contains corr of variable with itself; we’re not interested in this…
• 1st row reports the actual correlation
• 2nd row reports the significance value (compare to alpha – if < alpha, reject the null and conclude the correlation differs significantly from 0)
• 3rd row reports sample size used to calculate the correlation

Page 14:

Simple Regression in SPSS

– Analyze → Regression → Linear
– Note that terms used in SPSS are “Independent Variable” (this is x or predictor) and “Dependent Variable” (this is y or criterion)
– Class handout of output – what to look for:
  • “Model Summary” section – shows R2
  • ANOVA section – 1st line gives ‘sig value’; if < .05, signif
    – This tests the significance of the R2 for the regression (i.e., does x significantly predict y? If yes, it does predict y)
  • Coefficients section – 1st line gives ‘constant’ = a (listed under ‘B’ column)
    – Other line gives ‘unstandardized coefficient’ = b
    – Can write the regression/prediction equation from this info…
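As a rough check on what the “Model Summary” section would show for the class example, R2 can be computed by hand from the same SP, SSx, and SSy totals, since r = SP / √(SSx · SSy) and R2 = r2 (a sketch, not SPSS output):

```python
import math

# Same five (x, y) pairs as the class correlation/regression example
x = [6, 1, 5, 3, 3]
y = [6, 2, 6, 4, 2]

Mx, My = sum(x) / len(x), sum(y) / len(y)
SSx = sum((xi - Mx) ** 2 for xi in x)
SSy = sum((yi - My) ** 2 for yi in y)
SP  = sum((xi - Mx) * (yi - My) for xi, yi in zip(x, y))

r = SP / math.sqrt(SSx * SSy)  # Pearson's r
r_squared = r ** 2             # proportion of variance in y explained by x

print(round(r, 3), round(r_squared, 3))  # 0.898 0.806
```

So about 81% of the variance in y is explained by x in this example; the ANOVA 'sig value' in SPSS would test whether that R2 differs significantly from 0.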