
Statistics for the Social Sciences Psychology 340 Spring 2009 Analysis of Variance (ANOVA)


Page 1: Statistics for the Social Sciences Psychology 340 Spring 2009 Analysis of Variance (ANOVA)

Statistics for the Social Sciences
Psychology 340

Spring 2009

Analysis of Variance (ANOVA)

Page 2: Statistics for the Social Sciences Psychology 340 Spring 2009 Analysis of Variance (ANOVA)

Statistics for the Social Sciences

Outline

• Basics of ANOVA
• Why
• Computations
• ANOVA in SPSS
• Post-hoc and planned comparisons
• Assumptions
• Power and effect size for ANOVA

Page 3: Statistics for the Social Sciences Psychology 340 Spring 2009 Analysis of Variance (ANOVA)

Statistics for the Social Sciences

Outline

• Basics of ANOVA
• Why
• Computations
• Post-hoc and planned comparisons
• Power and effect size for ANOVA
• Assumptions
• SPSS
  – 1 factor between groups ANOVA
  – Post-hoc and planned comparisons

Page 4: Statistics for the Social Sciences Psychology 340 Spring 2009 Analysis of Variance (ANOVA)

Statistics for the Social Sciences

Example

• Effect of knowledge of prior behavior on jury decisions
  – Dependent variable: rate how innocent/guilty
  – Independent variable: 3 levels

[Diagram: Jurors are divided into three groups (Criminal record, Clean record, No information); each group provides guilt ratings, yielding group means X̄A, X̄B, and X̄C. Compare the means of these three groups.]

Page 5: Statistics for the Social Sciences Psychology 340 Spring 2009 Analysis of Variance (ANOVA)

Statistics for the Social Sciences

Analysis of Variance

Criminal record    Clean record    No information
      10                 5                 4
       7                 1                 6
       5                 3                 9
      10                 7                 3
       8                 4                 3
X̄A = 8.0           X̄B = 4.0           X̄C = 5.0

• More than two groups
  – Need a measure that describes several difference scores
    • Variance
• Test statistic:
  F-ratio = observed variance / variance from chance
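To make the F-ratio concrete, here is a minimal sketch (not part of the original slides) that computes the one-way ANOVA for the jury-rating data above using Python and SciPy; the variable names are my own.

```python
# Minimal sketch: one-way ANOVA on the jury-rating data from the slide,
# using scipy.stats.f_oneway.
from scipy import stats

criminal_record = [10, 7, 5, 10, 8]   # mean 8.0
clean_record    = [5, 1, 3, 7, 4]     # mean 4.0
no_information  = [4, 6, 9, 3, 3]     # mean 5.0

# f_oneway returns the F statistic (observed variance / variance from chance)
# and its p-value. For these data, F = MS_Between / MS_Within = 21.67 / 5.33 ≈ 4.06.
f_stat, p_value = stats.f_oneway(criminal_record, clean_record, no_information)
print(f"F = {f_stat:.2f}, p = {p_value:.3f}")
```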

Page 6: Statistics for the Social Sciences Psychology 340 Spring 2009 Analysis of Variance (ANOVA)

Statistics for the Social Sciences

Testing Hypotheses with ANOVA

• Hypothesis testing: a five step program
  – Step 1: State your hypotheses
  – Step 2: Set your decision criteria
  – Step 3: Collect your data
  – Step 4: Compute your test statistics
    • Compute your estimated variances
    • Compute your F-ratio
    • Compute your degrees of freedom (there are several)
  – Step 5: Make a decision about your null hypothesis
  – Additional tests: Planned comparisons & post hoc tests
    • Reconciling our multiple alternative hypotheses

Page 7: Statistics for the Social Sciences Psychology 340 Spring 2009 Analysis of Variance (ANOVA)

Statistics for the Social Sciences

Testing Hypotheses with ANOVA

• Hypothesis testing: a five step program
  – Step 1: State your hypotheses
• Null hypothesis (H0): all the groups are equal
  H0: μA = μB = μC (the ANOVA tests this one!!)
• Alternative hypotheses (HA)
  – Not all of the populations have the same mean
  HA: μA ≠ μB ≠ μC
  HA: μA = μB ≠ μC
  HA: μA ≠ μB = μC
  HA: μA = μC ≠ μB
  Choosing between these requires additional tests

Page 8: Statistics for the Social Sciences Psychology 340 Spring 2009 Analysis of Variance (ANOVA)

Statistics for the Social Sciences

1 factor ANOVA

• Alternative hypotheses (HA)
  – Not all of the populations have the same mean
  HA: μA ≠ μB ≠ μC
  HA: μA = μB ≠ μC
  HA: μA ≠ μB = μC
  HA: μA = μC ≠ μB
• Planned contrasts and post-hoc tests:
  – Further tests used to rule out the different alternative hypotheses
  Test 1: H0: μA = μB – reject
  Test 2: H0: μA = μC – reject
  Test 3: H0: μB = μC – fail to reject

Page 9: Statistics for the Social Sciences Psychology 340 Spring 2009 Analysis of Variance (ANOVA)

Statistics for the Social Sciences

Why do the ANOVA?

• What's the big deal? Why not just run a bunch of t-tests instead of doing an ANOVA?
  – Experiment-wise error
    – The type I error rate of the family (the entire set) of comparisons
      » αEW = 1 - (1 - α)^c, where c = # of comparisons
      » e.g., if you conduct two t-tests, each with an alpha level of 0.05, the combined chance of making a type I error is nearly 10 in 100 (rather than 5 in 100)
  – Planned comparisons and post hoc tests are procedures designed to reduce experiment-wise error
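As a quick check of this formula, here is a small Python sketch (mine, not from the slides) that computes the experiment-wise error rate for a given number of comparisons.

```python
# Experiment-wise (family-wise) type I error rate: alpha_EW = 1 - (1 - alpha)^c
def experimentwise_alpha(alpha: float, c: int) -> float:
    """Probability of at least one type I error across c independent comparisons."""
    return 1 - (1 - alpha) ** c

print(experimentwise_alpha(0.05, 2))   # ~0.0975, i.e., nearly 10 in 100
print(experimentwise_alpha(0.05, 3))   # ~0.1426 if all three pairwise tests are run
```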

Page 10: Statistics for the Social Sciences Psychology 340 Spring 2009 Analysis of Variance (ANOVA)

Statistics for the Social Sciences

Which follow-up test?

• Planned comparisons
  – A set of specific comparisons that you "planned" to do in advance of conducting the overall ANOVA
• Post-hoc tests
  – A set of comparisons that you decided to examine only after you find a significant (reject H0) ANOVA
  – Often end up looking at all possible pair-wise comparisons

Page 11: Statistics for the Social Sciences Psychology 340 Spring 2009 Analysis of Variance (ANOVA)

Statistics for the Social Sciences

Planned Comparisons

• General rule of thumb
  – Don't plan more contrasts than (# of conditions – 1)
• Different types
  – Simple comparisons - testing two groups
  – Complex comparisons - testing combined groups
  – Bonferroni procedure (Dunn's test)
    • Use a more stringent significance level for each comparison
      – Divide your desired α-level by the number of planned contrasts
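For concreteness, a tiny sketch (not from the slides) of the Bonferroni adjustment; the α-level and the number of planned contrasts here are illustrative.

```python
# Bonferroni (Dunn's test): test each planned contrast at alpha / (number of contrasts)
desired_alpha = 0.05
n_contrasts = 2        # e.g., two planned contrasts (at most # conditions - 1 = 2 here)

alpha_per_contrast = desired_alpha / n_contrasts
print(alpha_per_contrast)  # 0.025: each contrast must reach p < .025 to count as significant
```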

Page 12: Statistics for the Social Sciences Psychology 340 Spring 2009 Analysis of Variance (ANOVA)

Statistics for the Social Sciences

Planned Comparisons

• Basic procedure:
  1. Within-groups population variance estimate (denominator)
  2. Between-groups population variance estimate of the two groups of interest (numerator)
  3. Figure F in the usual way

Page 13: Statistics for the Social Sciences Psychology 340 Spring 2009 Analysis of Variance (ANOVA)

Statistics for the Social Sciences

Planned Comparisons

• Example: compare the criminal record & no information groups

Criminal record    Clean record    No information
      10                 5                 4
       7                 1                 6
       5                 3                 9
      10                 7                 3
       8                 4                 3
X̄A = 8.0           X̄B = 4.0           X̄C = 5.0
SSA = 18.0         SSB = 20.0          SSC = 26.0

Overall ANOVA (all three groups): SS_Within = 64, df_Within = 12; SS_Between = 43.3, df_Between = 2, MS_Between = 43.3 / 2 = 21.67

1) Within-groups population variance estimate (denominator), pooled across all three groups:
   MS_Within = SS_Within / df_Within = 64 / 12 = 5.33

2) Between-groups population variance estimate of the two groups of interest (numerator):
   GM = ΣX / N = 65 / 10 = 6.5 (grand mean of the two groups being compared)
   SS_Between = Σ n(X̄ - GM)² = 5(8 - 6.5)² + 5(5 - 6.5)² = 22.5
   df_Between = # groups - 1 = 2 - 1 = 1
   MS_Between = SS_Between / df_Between = 22.5 / 1 = 22.5

Page 14: Statistics for the Social Sciences Psychology 340 Spring 2009 Analysis of Variance (ANOVA)

Statistics for the Social Sciences

Planned Comparisons

• Example: compare the criminal record & no information groups

Criminal record    Clean record    No information
      10                 5                 4
       7                 1                 6
       5                 3                 9
      10                 7                 3
       8                 4                 3
X̄A = 8.0           X̄B = 4.0           X̄C = 5.0
SSA = 18.0         SSB = 20.0          SSC = 26.0

Overall ANOVA (all three groups): SS_Within = 64, df_Within = 12; SS_Between = 43.3, df_Between = 2, MS_Between = 43.3 / 2 = 21.67

1) Within-groups population variance estimate (denominator):
   MS_Within = 64 / 12 = 5.33

2) Between-groups population variance estimate of the two groups of interest (numerator):
   GM = ΣX / N = 65 / 10 = 6.5
   MS_Between = SS_Between / df_Between = 22.5 / 1 = 22.5

3) Figure F in the usual way:
   F = MS_Between / MS_Within = 22.5 / 5.33 = 4.22
   Fcrit(1, 12) = 4.75 at α = 0.05

   Fail to reject H0: Criminal record and no information are not statistically different
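The same planned comparison can be reproduced with a short Python sketch (my own, assuming equal group sizes), pooling the within-groups variance across all three groups and building the numerator from only the two groups being compared.

```python
# Planned comparison: criminal record vs. no information,
# using the pooled within-groups variance from all three groups.
import numpy as np
from scipy import stats

groups = {
    "criminal": np.array([10, 7, 5, 10, 8]),
    "clean":    np.array([5, 1, 3, 7, 4]),
    "noinfo":   np.array([4, 6, 9, 3, 3]),
}

# 1) Within-groups estimate (denominator), pooled over all groups
ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups.values())  # 64
df_within = sum(len(g) - 1 for g in groups.values())                   # 12
ms_within = ss_within / df_within                                      # 5.33

# 2) Between-groups estimate for the two groups of interest (numerator)
a, c = groups["criminal"], groups["noinfo"]
gm = np.concatenate([a, c]).mean()                                     # 6.5
ss_between = len(a) * (a.mean() - gm) ** 2 + len(c) * (c.mean() - gm) ** 2  # 22.5
df_between = 2 - 1                                                     # 1
ms_between = ss_between / df_between

# 3) Figure F in the usual way and compare with the critical value
f = ms_between / ms_within                                             # ~4.22
f_crit = stats.f.ppf(0.95, df_between, df_within)                      # ~4.75
print(f"F = {f:.2f}, F_crit = {f_crit:.2f}")  # F < F_crit -> fail to reject H0
```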

Page 15: Statistics for the Social Sciences Psychology 340 Spring 2009 Analysis of Variance (ANOVA)

Statistics for the Social Sciences

Post-hoc tests

• Generally, you are testing all of the possible comparisons (rather than just a specific few)
  – Different types
    • Tukey's HSD test (only with equal sample sizes)
    • Scheffé test (unequal sample sizes okay, very conservative)
    • Others (Fisher's LSD, Newman-Keuls test, Duncan test)
  – Generally they differ with respect to how conservative they are.
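As an illustration, here is a hedged sketch (not part of the slides) of a Tukey HSD post-hoc test on the jury data using SciPy; `scipy.stats.tukey_hsd` assumes a reasonably recent SciPy version.

```python
# Post-hoc: all pairwise comparisons with Tukey's HSD (equal sample sizes here).
from scipy.stats import tukey_hsd

criminal_record = [10, 7, 5, 10, 8]
clean_record    = [5, 1, 3, 7, 4]
no_information  = [4, 6, 9, 3, 3]

result = tukey_hsd(criminal_record, clean_record, no_information)
print(result)          # table of pairwise mean differences with confidence intervals
print(result.pvalue)   # matrix of adjusted p-values for each pair of groups
```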

Page 16: Statistics for the Social Sciences Psychology 340 Spring 2009 Analysis of Variance (ANOVA)

Statistics for the Social Sciences

Effect sizes in ANOVA

• The effect size for ANOVA is r²
  – Sometimes called η² ("eta squared")
  – The percent of the variance in the dependent variable that is accounted for by the independent variable

  Recall: S² = MS = SS / df, and SS_Total = SS_Between + SS_Within

  r² = SS_Between / SS_Total
     = (MS_Between)(df_Between) / [(MS_Between)(df_Between) + (MS_Within)(df_Within)]
     = (F)(df_Between) / [(F)(df_Between) + (df_Within)]
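A small sketch (mine) showing that these forms of r² (η²) agree, using the overall ANOVA values from the earlier jury example.

```python
# Effect size for the overall ANOVA: r^2 (eta squared) computed two equivalent ways.
ss_between, df_between = 43.3, 2
ss_within,  df_within  = 64.0, 12

ms_between = ss_between / df_between
ms_within  = ss_within / df_within
f = ms_between / ms_within

r2_from_ss = ss_between / (ss_between + ss_within)            # SS_Between / SS_Total
r2_from_f  = (f * df_between) / (f * df_between + df_within)  # (F)(df_B) / [(F)(df_B) + df_W]
print(round(r2_from_ss, 3), round(r2_from_f, 3))              # both ~0.40
```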

Page 17: Statistics for the Social Sciences Psychology 340 Spring 2009 Analysis of Variance (ANOVA)

Statistics for the Social Sciences

Effect sizes in ANOVA

• The effect size for ANOVA is r²
  – Sometimes called η² ("eta squared")
  – The percent of the variance in the dependent variable that is accounted for by the independent variable
  – Size of effect depends, in part, on degrees of freedom
    • See Tables 9-9 & 9-10 in the textbook for what is considered "small," "medium," and "large"

Page 18: Statistics for the Social Sciences Psychology 340 Spring 2009 Analysis of Variance (ANOVA)

Statistics for the Social Sciences

ANOVA Assumptions

• Basically the same as with t-tests
  – Assumes that the distributions are normal
  – Assumes that the distributions have equal variances
  – In both cases ANOVA analyses are generally robust against violations of these assumptions
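A hedged sketch (not from the slides) of how one might check these two assumptions in Python, using the Shapiro-Wilk test for normality and Levene's test for equal variances.

```python
# Quick checks of the ANOVA assumptions on the jury data (illustrative only).
from scipy import stats

criminal_record = [10, 7, 5, 10, 8]
clean_record    = [5, 1, 3, 7, 4]
no_information  = [4, 6, 9, 3, 3]

# Normality within each group (Shapiro-Wilk; with n = 5 the test has little power)
for name, data in [("criminal", criminal_record),
                   ("clean", clean_record),
                   ("no info", no_information)]:
    stat, p = stats.shapiro(data)
    print(f"{name}: Shapiro-Wilk p = {p:.2f}")

# Homogeneity of variances across groups (Levene's test)
stat, p = stats.levene(criminal_record, clean_record, no_information)
print(f"Levene's test p = {p:.2f}")
```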

Page 19: Statistics for the Social Sciences Psychology 340 Spring 2009 Analysis of Variance (ANOVA)

Statistics for the Social Sciences

ANOVA in SPSS

• Let's see how to do a between-groups 1-factor ANOVA in SPSS (and the other tests too)
  – Enter the data: similar to an independent samples t-test, observations in one column, a second column for group assignment
  – Analyze: Compare Means, 1-Way ANOVA
    • Observations -> Dependent List
    • Group assignment -> Factor
  – Specify any comparisons or post hocs at this time too
    • Comparisons are entered with 1, 0, & -1
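For readers following along outside SPSS, here is a sketch (my own) of what the 1, 0, -1 coefficients compute: a linear contrast comparing the criminal record and no information groups, which reproduces the planned comparison worked above (assumes equal group sizes; SPSS reports such a contrast as a t statistic, and for a 1-df contrast t² equals this F).

```python
# The contrast coefficients (1, 0, -1) define a linear contrast:
# L = sum(c_i * group_mean_i), here criminal record minus no information.
import numpy as np

data = {
    "criminal": np.array([10, 7, 5, 10, 8]),
    "clean":    np.array([5, 1, 3, 7, 4]),
    "noinfo":   np.array([4, 6, 9, 3, 3]),
}
coeffs = {"criminal": 1, "clean": 0, "noinfo": -1}

# Pooled within-groups variance (same denominator as the overall ANOVA)
ms_within = sum(((g - g.mean()) ** 2).sum() for g in data.values()) / \
            sum(len(g) - 1 for g in data.values())                       # 5.33

L = sum(coeffs[k] * data[k].mean() for k in data)                        # 8.0 - 5.0 = 3.0
ss_contrast = L ** 2 / sum(coeffs[k] ** 2 / len(data[k]) for k in data)  # 22.5
f = ss_contrast / ms_within                                              # ~4.22, df = (1, 12)
print(f"Contrast F = {f:.2f}")
```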