
Page 1:

Overview of Techniques

Case 1 Independent Variable is Groups, or Conditions

Dependent Variable is continuous (X)

One sample: Z-test or t-test

Two samples: t-test (independent or paired)

Three samples: One-way ANOVA F-test

Factorial design: Two-Way ANOVA F-test
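As a quick software illustration (not from the slides), here is a minimal sketch of the Case 1 tests using Python's scipy.stats; the group arrays and the hypothesized mean of 10 are made up.

```python
# Minimal sketch of the Case 1 tests with made-up data (illustrative only).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
group_a = rng.normal(loc=10, scale=2, size=30)
group_b = rng.normal(loc=12, scale=2, size=30)
group_c = rng.normal(loc=11, scale=2, size=30)

# One sample: t-test of group_a's mean against a hypothesized value of 10
print(stats.ttest_1samp(group_a, popmean=10))

# Two samples: independent-samples t-test (stats.ttest_rel for paired scores)
print(stats.ttest_ind(group_a, group_b))

# Three (or more) samples: one-way ANOVA F-test
print(stats.f_oneway(group_a, group_b, group_c))

# A factorial (two-way ANOVA) design needs a model-based tool such as
# statsmodels' ols() plus anova_lm(); scipy alone only covers the cases above.
```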

Page 2:

Overview of Techniques

Case 2 Independent Variable is continuous (x)

Dependent Variable is continuous (y)

One DV, one predictor: correlation, simple linear regression

One DV, multiple predictors: partial correlation, multiple correlation, multiple regression
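A parallel sketch for Case 2, again with made-up arrays: pearsonr and linregress handle the one-predictor tools, and an ordinary least squares fit from statsmodels handles multiple predictors.

```python
# Minimal sketch of the Case 2 tools with made-up data (illustrative only).
import numpy as np
from scipy import stats
import statsmodels.api as sm

rng = np.random.default_rng(1)
x1 = rng.normal(size=100)
x2 = 0.5 * x1 + rng.normal(size=100)
y = 0.4 * x1 + 0.3 * x2 + rng.normal(size=100)

# One DV, one predictor: Pearson correlation and simple linear regression
r, p = stats.pearsonr(x1, y)
simple = stats.linregress(x1, y)          # slope, intercept, rvalue, pvalue, stderr

# One DV, multiple predictors: multiple regression
X = sm.add_constant(np.column_stack([x1, x2]))
multiple = sm.OLS(y, X).fit()
print(r, simple.slope, multiple.params, multiple.rsquared)
```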

Page 3:

What if we have two predictor variables?

We want to predict depression. We have measured stress and loneliness.

We can ask several questions:

1) which is the stronger predictor?

2) how well do they predict depression together?

3) what is the effect of loneliness on depression, controlling for stress?

Page 4:

What if we have two predictor variables?

Regressing depression on stress

Predictor   Unstandardized Coefficient   Standard error   Standardized Coefficient   t       sig
Stress      .50                          .16              .25                        3.125   <.05

R2 = .0625

Regressing depression on loneliness

Predictor    Unstandardized Coefficient   Standard error   Standardized Coefficient   t      sig
Loneliness   .80                          .28              .20                        2.86   <.05

R2 = .04

Which is the better predictor? How well do they predict depression together?
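With a single predictor, the standardized coefficient is just Pearson's r, so each R2 above is its square: .25² = .0625 for stress and .20² = .04 for loneliness. Taken alone, stress is therefore the slightly stronger predictor; the next slides turn to how much the two explain together.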

Page 5:

Multiple Correlation

Page 6:

How well do they predict depression together?

[Venn diagram: the depression and loneliness circles overlap; the overlap is R2]

Page 7:

How well do they predict depression together?

[Venn diagram: the depression and stress circles overlap; the overlap is R2]

Page 8:

How well do they predict depression together?

[Venn diagram: the loneliness and stress circles both overlap the depression circle; loneliness's unique overlap with depression is region (a), the overlap shared by both predictors is region (b), and stress's unique overlap is region (c)]

Multiple R2: (a) + (b) + (c)

Pearson’s R2 for loneliness: (a) + (b)

Pearson’s R2 for stress: (c) + (b)
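A sketch of this decomposition in code, using hypothetical stress, loneliness, and depression arrays (not real data): each Pearson r² double-counts the shared region (b), while the multiple R2 from a joint regression counts each region once.

```python
# Sketch of the Venn-style decomposition with simulated (hypothetical) data.
import numpy as np
import statsmodels.api as sm
from scipy import stats

rng = np.random.default_rng(2)
stress = rng.normal(size=200)
loneliness = 0.6 * stress + rng.normal(size=200)          # correlated predictors
depression = 0.5 * stress + 0.3 * loneliness + rng.normal(size=200)

r2_loneliness = stats.pearsonr(loneliness, depression)[0] ** 2   # regions (a) + (b)
r2_stress = stats.pearsonr(stress, depression)[0] ** 2           # regions (c) + (b)

X = sm.add_constant(np.column_stack([loneliness, stress]))
multiple_r2 = sm.OLS(depression, X).fit().rsquared               # regions (a) + (b) + (c)

# With overlapping predictors like these, the joint R2 is smaller than the
# sum of the two separate r2 values, because region (b) is counted only once.
print(r2_loneliness, r2_stress, multiple_r2)
```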

Page 9:

Partial Correlation

Page 10:

[Venn diagram as on the previous slides: loneliness's unique overlap with depression is (a), the shared overlap is (b), stress's unique overlap is (c)]

What is the effect of loneliness controlling for stress?

Pearson’s R2 for loneliness: (a) + (b)

Pearson’s R2 for stress: (c) + (b)

Partial R2 for loneliness: (a)

Partial R2 for stress: (c)
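One common way to get the partial correlation in code is to residualize: regress both loneliness and depression on stress and correlate what is left over. The variable names and data below are hypothetical.

```python
# Partial correlation of loneliness with depression, controlling for stress,
# computed by residualizing both variables on stress (hypothetical data).
import numpy as np
import statsmodels.api as sm
from scipy import stats

rng = np.random.default_rng(2)
stress = rng.normal(size=200)
loneliness = 0.6 * stress + rng.normal(size=200)
depression = 0.5 * stress + 0.3 * loneliness + rng.normal(size=200)

def residualize(outcome, control):
    """Return the part of `outcome` that `control` cannot explain."""
    X = sm.add_constant(control)
    return sm.OLS(outcome, X).fit().resid

lonely_left = residualize(loneliness, stress)
depress_left = residualize(depression, stress)

partial_r = stats.pearsonr(lonely_left, depress_left)[0]
partial_r2 = partial_r ** 2    # loneliness's share of what stress leaves unexplained
print(partial_r, partial_r2)
```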

Page 11:

Multiple Regression

Page 12:

Types of effects

Total effect of stress: (b) + (c)

[Venn diagram as before: regions (a), (b), and (c)]

Shared effect of stress and loneliness: (b)

Unique effect of stress: (c)

Slope coefficients in simple regression capture total effects

Slope coefficients in multiple regression capture unique effects
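A sketch of the same point in code, with made-up data: the simple-regression slope for stress carries the total effect (b) + (c), while the slope from a multiple regression that also includes loneliness carries only the unique effect (c).

```python
# Total vs. unique effect of stress, illustrated with simulated data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
stress = rng.normal(size=200)
loneliness = 0.6 * stress + rng.normal(size=200)
depression = 0.5 * stress + 0.3 * loneliness + rng.normal(size=200)

total_fit = sm.OLS(depression, sm.add_constant(stress)).fit()
joint_fit = sm.OLS(depression,
                   sm.add_constant(np.column_stack([stress, loneliness]))).fit()

print(total_fit.params[1])   # total effect: stress plus what it shares with loneliness
print(joint_fit.params[1])   # unique (partial) effect: loneliness is held constant
```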

Page 13:

Reasons for Multiple Regression

1) It allows you to directly compare the effect sizes for different predictor variables

2) Adding additional predictors that are related to your Y variable (we call them covariates) allows you to explain more of the residual variance. This makes MS error smaller and increases your power.

3) If you are worried that your key predictor is confounded with other variables, you can “partial them out” or “control for them” in your multiple regression by including them in the analysis.


Page 14:

Two separate regressions

Regressing depression on stress

Predictor   Unstandardized Coefficient   Standard error   Standardized Coefficient   t       sig
Stress      .50                          .16              .25                        3.125   <.05

R2 = .0625

Regressing depression on loneliness

Predictor    Unstandardized Coefficient   Standard error   Standardized Coefficient   t      sig
Loneliness   .80                          .28              .20                        2.86   <.05

R2 = .04

Page 15:

A multiple regression: regressing depression on loneliness and stress

Predictor    Unstandardized Partial Coefficient   Standard error   Standardized Partial Coefficient   t      sig
Intercept    1.8                                  1.1              -                                   1.64   .08
Stress       .34                                  .11              .17                                 3.09   <.05
Loneliness   .10                                  .05              .05                                 2.00   .06

Multiple R2 = .0625

Y' = a + b1X1 + b2X2

Y' = 1.8 + .34(Stress) + .10(Lonely)

df = n - p - 1
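In software this is a single model fit. The sketch below uses statsmodels' formula interface with hypothetical numbers (not the data behind the slide's table), just to show where the partial coefficients, their t-tests, and the multiple R2 come from.

```python
# Fitting a two-predictor multiple regression; the data frame is hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "depression": [8, 4, 10, 15, 8, 14, 9, 12],
    "stress":     [3, 1,  5,  9, 4,  8, 5,  7],
    "loneliness": [4, 2,  6,  8, 3,  7, 5,  6],
})

fit = smf.ols("depression ~ stress + loneliness", data=df).fit()
print(fit.params)      # intercept a and unstandardized partial slopes b1, b2
print(fit.rsquared)    # multiple R2
print(fit.summary())   # standard errors, t values, and p values for each predictor
```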

Page 16:

Multiple Regression ANOVA

Source   SS             df           s2
Model    Σ(Y' − Ȳ)²     p            s2_model
Error    Σ(Y − Y')²     n − p − 1    s2_resid
Total    Σ(Y − Ȳ)²      n − 1        s2_Y

F = s2_model / s2_resid

Multiple R2 = SS_model / SS_total

The F-test is for the whole model; it doesn't tell you about individual predictors.
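The same quantities can be computed by hand from a fitted model, as in this sketch with simulated data; the hand-built F and R2 match what statsmodels reports.

```python
# Building the regression ANOVA table by hand from simulated data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
X = sm.add_constant(rng.normal(size=(50, 2)))      # constant plus p = 2 predictors
y = X @ np.array([1.0, 0.5, 0.3]) + rng.normal(size=50)

fit = sm.OLS(y, X).fit()
y_hat = fit.fittedvalues
n, p = len(y), X.shape[1] - 1

ss_model = np.sum((y_hat - y.mean()) ** 2)         # sum of (Y' - Ybar)^2, df = p
ss_error = np.sum((y - y_hat) ** 2)                # sum of (Y - Y')^2,  df = n - p - 1
ss_total = np.sum((y - y.mean()) ** 2)             # sum of (Y - Ybar)^2, df = n - 1

F = (ss_model / p) / (ss_error / (n - p - 1))      # whole-model F-test
r2 = ss_model / ss_total                           # multiple R2
print(F, fit.fvalue)                               # matches statsmodels' F
print(r2, fit.rsquared)                            # matches statsmodels' R2
```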

Page 17:

Categorical Predictors in Multiple Regression

Page 18:

Regressing depression on gender (0=female, 1=male)

A dichotomous 0/1 predictor

Gender depression

0 8

1 4

1 10

0 15

1 8

0 14

Page 19:

Regressing depression on gender (0=female, 1=male)

Predictor   Unstandardized Partial Coefficient   Standard error   Standardized Partial Coefficient   t      sig
Intercept   12.33                                1.99             -                                   6.2    <.01
Gender      -5.00                                2.81             -.66                                -1.8   .15

A dichotomous 0/1 predictor

The intercept coefficient tells you the mean depression of the 0 (female) group. The gender coefficient tells you what to add to get the mean depression of the 1 (male) group.

If the gender coefficient is significant, the groups significantly differ
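This equivalence is easy to check in code. The sketch below regresses the six depression scores from the slide on the 0/1 gender code and compares the slope's t to an independent-samples t-test (equal variances assumed).

```python
# Regression on a 0/1 predictor vs. an independent-samples t-test (slide's data).
import numpy as np
import statsmodels.api as sm
from scipy import stats

gender = np.array([0, 1, 1, 0, 1, 0])
depression = np.array([8, 4, 10, 15, 8, 14])

fit = sm.OLS(depression, sm.add_constant(gender)).fit()
print(fit.params)        # intercept = female mean (12.33); slope = male minus female (-5.00)
print(fit.tvalues[1])    # t for the gender slope ...

t, p = stats.ttest_ind(depression[gender == 1], depression[gender == 0])
print(t)                 # ... equals the pooled-variance independent-samples t
```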

Page 20:

Categorical and Continuous Predictors in Multiple Regression

Page 21:

Combining Types of Predictors

T-tests and ANOVAs use group variables to predict continuous outcomes

Correlations and simple regressions use continuous variables to predict continuous outcomes

Multiple regression allows you to use 1) information about group membership and 2) information from other continuous measurements in the same analysis

Page 22:

Combining Types of Predictors

WHY would we want this?

Imagine that we have a control group and a highly-provoked group, and we also measure the “TypeA-ness” of each participant.

We noticed that because of streaky random sampling, we got more TypeA people in the control group than in the provoked group.

Multiple regression allows us to see if there was an effect of our manipulation, controlling for individual differences in TypeA-ness.

Basically, it allows us to put a situational manipulation and a personality scale measurement into the same study.

Page 23:

Group Provoke TypeA aggression

Control 0 3 3

Control 0 6 4

Control 0 10 6

Control 0 8 5

High 1 2 9

High 1 4 10

High 1 11 24

High 1 12 20

Page 24:

Predictor   Unstandardized Partial Coefficient   Standard error   Standardized Partial Coefficient   t       sig
Intercept   -3.26                                2.23             -                                   -1.46   .20
Provoke     10.675                               1.89             .734                                5.65    <.01
Type A      1.15                                 .26              .565                                4.35    <.01

There is a significant effect of experimental condition and a significant effect of TypeA-ness
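As a sketch, the slide's eight aggression scores can be fit this way in statsmodels; the coefficients reported above (intercept of about -3.26, Provoke of about 10.675, Type A of about 1.15) come out of one formula call.

```python
# Regressing aggression on the manipulation (0/1) and the TypeA score together.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "provoke":    [0, 0, 0, 0, 1, 1, 1, 1],
    "type_a":     [3, 6, 10, 8, 2, 4, 11, 12],
    "aggression": [3, 4, 6, 5, 9, 10, 24, 20],
})

fit = smf.ols("aggression ~ provoke + type_a", data=df).fit()
print(fit.params)     # effect of provocation controlling for TypeA, and vice versa
print(fit.pvalues)    # both predictors come out significant, as on the slide
```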

Page 25:

General Linear Model

Page 26:

General Linear Model

All of the techniques we’ve covered so far can be expressed as special cases of multiple regression

If you run a multiple regression with an intercept and no slopes, the t-test for the intercept is the same as a one-sample t-test of the mean against zero.

If you put in a dichotomous (0/1) predictor, the t-test for your slope will be the same as an independent samples t-test.

If you put in dummy variables for multiple groups, your regression ANOVA will be the same as your one-way ANOVA or two-way ANOVA.

If you put in one continuous predictor, your β will be the same as your r.
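Two of these equivalences in a quick sketch with made-up data: an intercept-only regression reproduces the one-sample t-test, and with one standardized predictor the slope reproduces Pearson's r. The variable names are illustrative.

```python
# Checking two general-linear-model equivalences with simulated data.
import numpy as np
import statsmodels.api as sm
from scipy import stats

rng = np.random.default_rng(5)
y = rng.normal(loc=3, scale=2, size=40)
x = rng.normal(size=40)
y2 = 0.5 * x + rng.normal(size=40)

# Intercept-only regression: the intercept's t equals a one-sample t-test against 0
print(sm.OLS(y, np.ones_like(y)).fit().tvalues[0],
      stats.ttest_1samp(y, 0).statistic)

# One standardized continuous predictor: the slope equals Pearson's r
zx = (x - x.mean()) / x.std(ddof=1)
zy = (y2 - y2.mean()) / y2.std(ddof=1)
print(sm.OLS(zy, sm.add_constant(zx)).fit().params[1],
      stats.pearsonr(x, y2)[0])
```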

Page 27:

General Linear Model

Plus multiple regression can do so much more!

Looking at several continuous predictors together in one model.

Controlling for confounds.

Using covariates to “soak up” residual variance.

Looking at categorical and continuous predictors together in one model.

Looking at interactions between categorical and continuous variables.