
Design and Analysis of Experiments

Dr. Tai-Yue Wang Department of Industrial and Information Management

National Cheng Kung University, Tainan, TAIWAN, ROC


Analysis of Variance

Dr. Tai-Yue Wang Department of Industrial and Information Management

National Cheng Kung University, Tainan, TAIWAN, ROC


Outline (1/2)

Example
The ANOVA
Analysis of the Fixed Effects Model
Model Adequacy Checking
Practical Interpretation of Results
Determining Sample Size


Outline (2/2)

Discovering Dispersion Effects
Nonparametric Methods in the ANOVA



What If There Are More Than Two Factor Levels?

The t-test does not directly apply.

There are lots of practical situations where there are either more than two levels of interest, or several factors of simultaneous interest.

The ANalysis Of VAriance (ANOVA) is the appropriate analysis “engine” for these types of experiments

The ANOVA was developed by Fisher in the early 1920s, and initially applied to agricultural experiments

Used extensively today for industrial experiments


An Example (1/6)


An Example (2/6)

An engineer is interested in investigating the relationship between the RF power setting and the etch rate for this tool. The objective of an experiment like this is to model the relationship between etch rate and RF power, and to specify the power setting that will give a desired target etch rate.

The response variable is etch rate.


An Example (3/6)

She is interested in a particular gas (C2F6) and gap (0.80 cm), and wants to test four levels of RF power: 160W, 180W, 200W, and 220W. She decided to test five wafers at each level of RF power.

The experimenter chooses 4 levels of RF power 160W, 180W, 200W, and 220W

The experiment is replicated 5 times – runs made in random order
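The random run order mentioned above can be generated with a short script. The sketch below is illustrative only: the level labels and replicate count follow the slides, but the seed and the resulting order are arbitrary, not the order actually used in the experiment.

import random

levels = [160, 180, 200, 220]   # RF power settings (W)
replicates = 5                  # wafers tested at each power setting

# Build the 20 planned runs, then shuffle them into a random run order (CRD).
runs = [(power, rep) for power in levels for rep in range(1, replicates + 1)]
random.seed(1)                  # arbitrary seed, only so the listing is reproducible
random.shuffle(runs)

for order, (power, rep) in enumerate(runs, start=1):
    print(f"Run {order:2d}: power = {power} W (replicate {rep})")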


An Example -- Data


An Example – Data Plot


An Example -- Questions

Does changing the power change the mean etch rate?

Is there an optimum level for power?

We would like to have an objective way to answer these questions.

The t-test really doesn't apply here – more than two factor levels.

The Analysis of Variance

In general, there will be a levels of the factor, or a treatments, and n replicates of the experiment, run in random order…a completely randomized design (CRD).

N = an total runs.

We consider the fixed effects case…the random effects case will be discussed later.

The objective is to test hypotheses about the equality of the a treatment means.

The Analysis of Variance


The Analysis of Variance

The name “analysis of variance” stems from a partitioning of the total variability in the response variable into components that are consistent with a model for the experiment


The Analysis of Variance

The basic single-factor ANOVA model is

$$y_{ij} = \mu + \tau_i + \varepsilon_{ij}, \qquad i = 1, 2, \ldots, a, \qquad j = 1, 2, \ldots, n$$

where
$\mu$ = an overall mean,
$\tau_i$ = the ith treatment effect,
$\varepsilon_{ij}$ = experimental error, with $\varepsilon_{ij} \sim NID(0, \sigma^2)$.

Models for the Data

There are several ways to write a model for the data

Mean model

Also known as one-way or single-factor ANOVA

$$y_{ij} = \mu_i + \varepsilon_{ij}$$

Models for the Data

Fixed or random factor?

The a treatments could have been specifically chosen by the experimenter. In this case, the results may apply only to the levels considered in the analysis: fixed effects models.

Models for the Data

The a treatments could be a random sample from a larger population of treatments. In this case, we should be able to extend the conclusions to all treatments in the population: random effects models.

Analysis of the Fixed Effects Model

Recall the single-factor ANOVA for the fixed effect model

$$y_{ij} = \mu + \tau_i + \varepsilon_{ij}$$

Define

$$y_{i\cdot} = \sum_{j=1}^{n} y_{ij}, \qquad \bar{y}_{i\cdot} = y_{i\cdot}/n, \qquad i = 1, 2, \ldots, a$$

$$y_{\cdot\cdot} = \sum_{i=1}^{a}\sum_{j=1}^{n} y_{ij}, \qquad \bar{y}_{\cdot\cdot} = y_{\cdot\cdot}/N, \qquad N = an$$

Analysis of the Fixed Effects Model

Hypothesis

$$H_0: \mu_1 = \mu_2 = \cdots = \mu_a$$
$$H_1: \mu_i \neq \mu_j \ \text{for at least one pair } (i, j)$$

where $\mu_i = \mu + \tau_i$, $i = 1, 2, \ldots, a$, and $\sum_{i=1}^{a} \tau_i = 0$.

Analysis of the Fixed Effects Model

Thus, the equivalent Hypothesis

$$H_0: \tau_1 = \tau_2 = \cdots = \tau_a = 0$$
$$H_1: \tau_i \neq 0 \ \text{for at least one } i$$

Analysis of the Fixed Effects Model-Decomposition

Total variability is measured by the total sum of squares:

$$SS_T = \sum_{i=1}^{a}\sum_{j=1}^{n} (y_{ij} - \bar{y}_{\cdot\cdot})^2$$

The basic ANOVA partitioning is:

$$\sum_{i=1}^{a}\sum_{j=1}^{n} (y_{ij} - \bar{y}_{\cdot\cdot})^2 = \sum_{i=1}^{a}\sum_{j=1}^{n} \left[(\bar{y}_{i\cdot} - \bar{y}_{\cdot\cdot}) + (y_{ij} - \bar{y}_{i\cdot})\right]^2 = n\sum_{i=1}^{a} (\bar{y}_{i\cdot} - \bar{y}_{\cdot\cdot})^2 + \sum_{i=1}^{a}\sum_{j=1}^{n} (y_{ij} - \bar{y}_{i\cdot})^2$$

$$SS_T = SS_{Treatments} + SS_E$$

Analysis of the Fixed Effects Model-Decomposition

In detail,

$$\sum_{i=1}^{a}\sum_{j=1}^{n} (y_{ij} - \bar{y}_{\cdot\cdot})^2 = n\sum_{i=1}^{a} (\bar{y}_{i\cdot} - \bar{y}_{\cdot\cdot})^2 + \sum_{i=1}^{a}\sum_{j=1}^{n} (y_{ij} - \bar{y}_{i\cdot})^2 + 2\sum_{i=1}^{a}\sum_{j=1}^{n} (\bar{y}_{i\cdot} - \bar{y}_{\cdot\cdot})(y_{ij} - \bar{y}_{i\cdot})$$

where the cross-product term equals zero, since $\sum_{j=1}^{n} (y_{ij} - \bar{y}_{i\cdot}) = y_{i\cdot} - n\bar{y}_{i\cdot} = 0$.

Analysis of the Fixed Effects Model-Decomposition

Thus

$$\underbrace{\sum_{i=1}^{a}\sum_{j=1}^{n} (y_{ij} - \bar{y}_{\cdot\cdot})^2}_{SS_T} = \underbrace{n\sum_{i=1}^{a} (\bar{y}_{i\cdot} - \bar{y}_{\cdot\cdot})^2}_{SS_{Treatments}} + \underbrace{\sum_{i=1}^{a}\sum_{j=1}^{n} (y_{ij} - \bar{y}_{i\cdot})^2}_{SS_E}$$
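As a sanity check on this partition, the sketch below computes SS_T, SS_Treatments, and SS_E directly. The individual etch-rate observations are not reproduced in this transcript, so the array holds the standard textbook values for this example; they reproduce the treatment means (551.2, 587.4, 625.4, 707.0) and the ANOVA table reported later in the slides, but treat them as illustrative.

import numpy as np

# Etch rate (Å/min) at each RF power level; textbook values used for illustration.
y = np.array([
    [575, 542, 530, 539, 570],   # 160 W
    [565, 593, 590, 579, 610],   # 180 W
    [600, 651, 610, 637, 629],   # 200 W
    [725, 700, 715, 685, 710],   # 220 W
], dtype=float)

a, n = y.shape
grand_mean = y.mean()
treatment_means = y.mean(axis=1)

ss_total = ((y - grand_mean) ** 2).sum()
ss_treatments = n * ((treatment_means - grand_mean) ** 2).sum()
ss_error = ((y - treatment_means[:, None]) ** 2).sum()

print(f"SS_T          = {ss_total:.2f}")
print(f"SS_Treatments = {ss_treatments:.2f}")
print(f"SS_E          = {ss_error:.2f}")
print(f"SS_Treatments + SS_E = {ss_treatments + ss_error:.2f}")   # equals SS_T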


Analysis of the Fixed Effects Model-Decomposition

A large value of SS_Treatments reflects large differences in treatment means.

A small value of SS_Treatments likely indicates no differences in treatment means.

$$SS_T = SS_{Treatments} + SS_E$$

Formal statistical hypotheses are:

$$H_0: \mu_1 = \mu_2 = \cdots = \mu_a$$
$$H_1: \text{At least one mean is different}$$

Analysis of the Fixed Effects Model-Decomposition

For $SS_E$:

$$SS_E = \sum_{i=1}^{a}\sum_{j=1}^{n} (y_{ij} - \bar{y}_{i\cdot})^2 = \sum_{i=1}^{a}\left[\sum_{j=1}^{n} (y_{ij} - \bar{y}_{i\cdot})^2\right]$$

Recall that the sample variance in the ith treatment is

$$S_i^2 = \frac{\sum_{j=1}^{n} (y_{ij} - \bar{y}_{i\cdot})^2}{n-1}, \qquad i = 1, 2, \ldots, a$$

so that $\sum_{j=1}^{n} (y_{ij} - \bar{y}_{i\cdot})^2 = (n-1)S_i^2$.

Analysis of the Fixed Effects Model-Decomposition

Combining the a sample variances:

$$\frac{SS_E}{N-a} = \frac{(n-1)S_1^2 + (n-1)S_2^2 + \cdots + (n-1)S_a^2}{(n-1) + (n-1) + \cdots + (n-1)}$$

The above formula is a pooled estimate of the common variance within each of the a treatments.

Analysis of the Fixed Effects Model-Mean Squares

Define

$$MS_{Treatments} = \frac{SS_{Treatments}}{a-1} \qquad (df = a - 1)$$

and

$$MS_E = \frac{SS_E}{N-a} \qquad (df = N - a)$$

Analysis of the Fixed Effects Model-Mean Squares

By mathematics,

$$E(MS_E) = \sigma^2$$
$$E(MS_{Treatments}) = \sigma^2 + \frac{n\sum_{i=1}^{a}\tau_i^2}{a-1}$$

That is, $MS_E$ estimates $\sigma^2$. If there are no differences in treatment means, $MS_{Treatments}$ also estimates $\sigma^2$.

Analysis of the Fixed Effects Model-Statistical Analysis

Cochran's Theorem

Let $Z_i$ be $NID(0, 1)$ for $i = 1, 2, \ldots, \nu$ and

$$\sum_{i=1}^{\nu} Z_i^2 = Q_1 + Q_2 + \cdots + Q_s$$

where $s \le \nu$ and $Q_i$ has $\nu_i$ degrees of freedom ($i = 1, 2, \ldots, s$). Then $Q_1, Q_2, \ldots, Q_s$ are independent chi-square random variables with $\nu_1, \nu_2, \ldots, \nu_s$ degrees of freedom, respectively, if and only if

$$\nu = \nu_1 + \nu_2 + \cdots + \nu_s$$

Analysis of the Fixed Effects Model-Statistical Analysis

Cochran's Theorem implies that

$$\frac{SS_{Treatments}}{\sigma^2} \quad \text{and} \quad \frac{SS_E}{\sigma^2}$$

are independently distributed chi-square random variables.

Thus, if the null hypothesis is true, the ratio

$$F_0 = \frac{SS_{Treatments}/(a-1)}{SS_E/(N-a)} = \frac{MS_{Treatments}}{MS_E}$$

is distributed as F with $a-1$ and $N-a$ degrees of freedom.


Analysis of the Fixed Effects Models-- Summary Table

The reference distribution for $F_0$ is the $F_{a-1,\, a(n-1)}$ distribution.

Reject the null hypothesis (equal treatment means) if

$$F_0 > F_{\alpha,\, a-1,\, a(n-1)}$$
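A minimal sketch of this rejection rule, assuming SciPy is available and using the etch-rate dimensions (a = 4, n = 5) with an assumed α = 0.05:

from scipy import stats

a, n = 4, 5          # treatments and replicates, as in the etch-rate example
alpha = 0.05         # assumed significance level

f_crit = stats.f.ppf(1 - alpha, dfn=a - 1, dfd=a * (n - 1))
print(f"Reject H0 (equal treatment means) if F0 > {f_crit:.2f}")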


Analysis of the Fixed Effects Models-- Example

Recall the example of etch rate,

Hypothesis

$$H_0: \mu_1 = \mu_2 = \mu_3 = \mu_4$$
$$H_1: \text{some means are different}$$

Analysis of the Fixed Effects Models-- Example

ANOVA table


Analysis of the Fixed Effects Models-- Example

Rejection region


Analysis of the Fixed Effects Models-- Example

P-value


Analysis of the Fixed Effects Models-- Example

Minitab: One-way ANOVA: Etch Rate versus Power

Source  DF     SS     MS      F      P
Power    3  66871  22290  66.80  0.000
Error   16   5339    334
Total   19  72210

S = 18.27   R-Sq = 92.61%   R-Sq(adj) = 91.22%

Individual 95% CIs For Mean Based on Pooled StDev
Level  N    Mean  StDev
160    5  551.20  20.02
180    5  587.40  16.74
200    5  625.40  20.53
220    5  707.00  15.25

Pooled StDev = 18.27

Coding the observations will not change the results

Without the assumption of randomization, the ANOVA F test can be viewed as an approximation to the randomization test.


Analysis of the Fixed Effects Model-Statistical Analysis
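The one-way ANOVA above can be reproduced with scipy.stats.f_oneway. A minimal sketch follows, again using the textbook observations (the data slide is not reproduced in this transcript) as illustrative input.

from scipy import stats

etch = {
    160: [575, 542, 530, 539, 570],
    180: [565, 593, 590, 579, 610],
    200: [600, 651, 610, 637, 629],
    220: [725, 700, 715, 685, 710],
}

# Single-factor (one-way) ANOVA across the four power levels.
f_stat, p_value = stats.f_oneway(*etch.values())
print(f"F0 = {f_stat:.2f}, p-value = {p_value:.4g}")   # compare with F = 66.80, P = 0.000 above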

Analysis of the Fixed Effects Model-Estimation of the Model Parameters

Reasonable estimates of the overall mean and the treatment effects for the single-factor model

$$y_{ij} = \mu + \tau_i + \varepsilon_{ij}$$

are given by

$$\hat{\mu} = \bar{y}_{\cdot\cdot}, \qquad \hat{\tau}_i = \bar{y}_{i\cdot} - \bar{y}_{\cdot\cdot}, \qquad i = 1, 2, \ldots, a$$

Analysis of the Fixed Effects Model-Estimation of the Model Parameters

Confidence interval for $\mu_i$:

$$\bar{y}_{i\cdot} - t_{\alpha/2,\, N-a}\sqrt{\frac{MS_E}{n}} \;\le\; \mu_i \;\le\; \bar{y}_{i\cdot} + t_{\alpha/2,\, N-a}\sqrt{\frac{MS_E}{n}}$$

Confidence interval for $\mu_i - \mu_j$:

$$\bar{y}_{i\cdot} - \bar{y}_{j\cdot} - t_{\alpha/2,\, N-a}\sqrt{\frac{2MS_E}{n}} \;\le\; \mu_i - \mu_j \;\le\; \bar{y}_{i\cdot} - \bar{y}_{j\cdot} + t_{\alpha/2,\, N-a}\sqrt{\frac{2MS_E}{n}}$$
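A sketch of these two intervals, plugging in MS_E = 334, n = 5, and N - a = 16 from the Minitab output; the choice of levels (160 W and 220 W) and α = 0.05 are illustrative assumptions.

import math
from scipy import stats

ms_e, n, df_e, alpha = 334.0, 5, 16, 0.05
t_crit = stats.t.ppf(1 - alpha / 2, df_e)

ybar_160, ybar_220 = 551.20, 707.00            # treatment averages from the output

half_mean = t_crit * math.sqrt(ms_e / n)       # half-width for a single treatment mean
half_diff = t_crit * math.sqrt(2 * ms_e / n)   # half-width for a difference of two means

print(f"95% CI for mu(160): {ybar_160 - half_mean:.2f} to {ybar_160 + half_mean:.2f}")
print(f"95% CI for mu(220) - mu(160): {ybar_220 - ybar_160 - half_diff:.2f} "
      f"to {ybar_220 - ybar_160 + half_diff:.2f}")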

Analysis of the Fixed Effects Model-Unbalanced Data

For unbalanced data,

$$SS_T = \sum_{i=1}^{a}\sum_{j=1}^{n_i} y_{ij}^2 - \frac{y_{\cdot\cdot}^2}{N}$$

$$SS_{Treatments} = \sum_{i=1}^{a} \frac{y_{i\cdot}^2}{n_i} - \frac{y_{\cdot\cdot}^2}{N}$$


A little (very little) humor…

Model Adequacy Checking

Assumptions on the model

$$y_{ij} = \mu + \tau_i + \varepsilon_{ij}$$

Errors are normally and independently distributed with mean zero and constant but unknown variance $\sigma^2$.

Define the residual

$$e_{ij} = y_{ij} - \hat{y}_{ij}$$

where $\hat{y}_{ij}$ is an estimate of $y_{ij}$.
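A sketch of the residual computation and the normal probability plot described on the following slides; matplotlib is assumed to be available, and the observations are again the illustrative textbook values.

import numpy as np
from scipy import stats
import matplotlib.pyplot as plt

y = np.array([
    [575, 542, 530, 539, 570],   # 160 W
    [565, 593, 590, 579, 610],   # 180 W
    [600, 651, 610, 637, 629],   # 200 W
    [725, 700, 715, 685, 710],   # 220 W
], dtype=float)

fitted = y.mean(axis=1, keepdims=True)    # y-hat_ij is the treatment average
residuals = (y - fitted).ravel()          # e_ij = y_ij - y-hat_ij

# Normal probability plot of the residuals (one panel of Minitab's four-in-one plot).
stats.probplot(residuals, dist="norm", plot=plt)
plt.title("Normal probability plot of residuals")
plt.show()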

Normal probability plot


Model Adequacy Checking--Normality

Four-in-one


Model Adequacy Checking--Normality

Residuals vs run order


Model Adequacy Checking--Plot of residuals in time sequence

Residuals vs fitted


Model Adequacy Checking--Residuals vs fitted

Defects: horn shape, moon type

Test for equal variances Bartlett’s test


Model Adequacy Checking--Residuals vs fitted

$$H_0: \sigma_1^2 = \sigma_2^2 = \cdots = \sigma_a^2$$
$$H_1: \text{the above is not true for at least one } \sigma_i^2$$

Test for equal variances Bartlett’s test


Model Adequacy Checking--Residuals vs fitted

Test for Equal Variances: Etch Rate versus Power

95% Bonferroni confidence intervals for standard deviations

Power  N    Lower    StDev    Upper
160    5  10.5675  20.0175  83.0477
180    5   8.8384  16.7422  69.4591
200    5  10.8357  20.5256  85.1557
220    5   8.0496  15.2480  63.2600

Bartlett's Test (Normal Distribution)
Test statistic = 0.43, p-value = 0.933

Levene's Test (Any Continuous Distribution)
Test statistic = 0.20, p-value = 0.898
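Both equal-variance tests can be reproduced with SciPy; a sketch follows, using the illustrative data as before. Minitab's Levene implementation may differ in detail, so small numerical differences are possible.

from scipy import stats

groups = [
    [575, 542, 530, 539, 570],   # 160 W
    [565, 593, 590, 579, 610],   # 180 W
    [600, 651, 610, 637, 629],   # 200 W
    [725, 700, 715, 685, 710],   # 220 W
]

bart_stat, bart_p = stats.bartlett(*groups)
lev_stat, lev_p = stats.levene(*groups, center="median")

print(f"Bartlett: statistic = {bart_stat:.2f}, p-value = {bart_p:.3f}")
print(f"Levene:   statistic = {lev_stat:.2f}, p-value = {lev_p:.3f}")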

Test for equal variances Bartlett’s test


Model Adequacy Checking--Residuals vs fitted

Variance-stabilizing transformations deal with nonconstant variance.

If observations follow a Poisson distribution: square root transformation.

If observations follow a lognormal distribution: logarithmic transformation.

Model Adequacy Checking--Residuals vs fitted

$$y_{ij}^* = \sqrt{y_{ij}} \quad \text{or} \quad y_{ij}^* = \sqrt{1 + y_{ij}}$$

$$y_{ij}^* = \log y_{ij}$$

If observations are binomial data: arcsine transformation.

Other transformations: check the relationship between the observations and the mean.


Model Adequacy Checking--Residuals vs fitted

$$y_{ij}^* = \arcsin\sqrt{y_{ij}}$$
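A sketch of the three variance-stabilizing transformations; the small input arrays are hypothetical, purely to show the mechanics.

import numpy as np

counts = np.array([0.0, 3.0, 7.0, 12.0])           # hypothetical Poisson-like counts
skewed = np.array([1.2, 3.5, 10.0, 40.0])          # hypothetical lognormal-like values
proportions = np.array([0.05, 0.20, 0.55, 0.90])   # hypothetical binomial proportions

sqrt_transformed = np.sqrt(counts)                 # or np.sqrt(1.0 + counts)
log_transformed = np.log(skewed)
arcsine_transformed = np.arcsin(np.sqrt(proportions))

print(sqrt_transformed)
print(log_transformed)
print(arcsine_transformed)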

Practical Interpretation of Results – Regression Model

The one-way ANOVA model

$$y_{ij} = \mu + \tau_i + \varepsilon_{ij}$$

is a regression model and is similar to

$$y_{ij} = \beta_0 + \beta_1 x_{ij} + \varepsilon_{ij}$$

Computer Results


Practical Interpretation of Results – Regression Model

Regression Analysis: Etch Rate versus Power

The regression equation is
Etch Rate = 138 + 2.53 Power

Predictor  Coef    SE Coef  T      P
Constant   137.62  41.21    3.34   0.004
Power      2.5270  0.2154   11.73  0.000

S = 21.5413   R-Sq = 88.4%   R-Sq(adj) = 87.8%

Analysis of Variance
Source          DF     SS     MS       F      P
Regression       1  63857  63857  137.62  0.000
Residual Error  18   8352    464
Total           19  72210

Unusual Observations
Obs  Power  Etch Rate  Fit     SE Fit  Residual  St Resid
11   200    600.00     643.02  5.28    -43.02    -2.06R

R denotes an observation with a large standardized residual.


The Regression Model


Practical Interpretation of Results – Comparison of Means

The analysis of variance tests the hypothesis of equal treatment means.

Assume that residual analysis is satisfactory.

If that hypothesis is rejected, we don't know which specific means are different.

Determining which specific means differ following an ANOVA is called the multiple comparisons problem.


Practical Interpretation of Results – Comparison of Means

There are lots of ways to do this. We will use pairwise t-tests on means, sometimes called Fisher's Least Significant Difference (or Fisher's LSD) method.
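A sketch of Fisher's LSD using the quantities reported in the Minitab output (MS_E = 334, n = 5 per level, 16 error degrees of freedom); α = 0.05 is assumed. Pairs whose mean difference exceeds the LSD are declared different.

import math
from itertools import combinations
from scipy import stats

ms_e, n, df_e, alpha = 334.0, 5, 16, 0.05
means = {160: 551.20, 180: 587.40, 200: 625.40, 220: 707.00}

lsd = stats.t.ppf(1 - alpha / 2, df_e) * math.sqrt(2 * ms_e / n)
print(f"LSD = {lsd:.2f}")

for low, high in combinations(sorted(means), 2):
    diff = abs(means[high] - means[low])
    verdict = "different" if diff > lsd else "not different"
    print(f"{low} W vs {high} W: |difference| = {diff:.2f} -> {verdict}")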


Practical Interpretation of Results – Comparison of Means

Fisher 95% Individual Confidence Intervals
All Pairwise Comparisons among Levels of Power
Simultaneous confidence level = 81.11%

Power = 160 subtracted from:

Power  Lower   Center  Upper
180     11.71   36.20   60.69
200     49.71   74.20   98.69
220    131.31  155.80  180.29


Practical Interpretation of Results – Comparison of Means

Fisher 95% Individual Confidence Intervals
All Pairwise Comparisons among Levels of Power
Simultaneous confidence level = 81.11%

Power = 180 subtracted from:

Power  Lower  Center  Upper
200    13.51   38.00   62.49
220    95.11  119.60  144.09

Power = 200 subtracted from:

Power  Lower  Center  Upper
220    57.11   81.60  106.09


Practical Interpretation of Results – Graphical Comparison of Means


Practical Interpretation of Results – Contrasts

A linear combination of parameters

$$\Gamma = \sum_{i=1}^{a} c_i \mu_i, \qquad \text{with } \sum_{i=1}^{a} c_i = 0$$

So the hypothesis becomes

$$H_0: \sum_{i=1}^{a} c_i \mu_i = 0 \qquad H_1: \sum_{i=1}^{a} c_i \mu_i \neq 0$$

Practical Interpretation of Results – Contrasts

Examples

$$H_0: \mu_3 - \mu_4 = 0 \qquad H_1: \mu_3 - \mu_4 \neq 0$$

$$H_0: \mu_1 + \mu_2 - \mu_3 - \mu_4 = 0 \qquad H_1: \mu_1 + \mu_2 - \mu_3 - \mu_4 \neq 0$$

$$H_0: 3\mu_1 - \mu_2 - \mu_3 - \mu_4 = 0 \qquad H_1: 3\mu_1 - \mu_2 - \mu_3 - \mu_4 \neq 0$$

Practical Interpretation of Results – Contrasts Testing

t-test

Contrast average:

$$C = \sum_{i=1}^{a} c_i \bar{y}_{i\cdot}$$

Contrast variance:

$$V(C) = \frac{\sigma^2}{n}\sum_{i=1}^{a} c_i^2$$

Test statistic:

$$t_0 = \frac{\sum_{i=1}^{a} c_i \bar{y}_{i\cdot}}{\sqrt{\dfrac{MS_E}{n}\displaystyle\sum_{i=1}^{a} c_i^2}}$$
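A sketch of this contrast t-test for the contrast μ1 + μ2 - μ3 - μ4 (coefficients chosen for illustration), using the treatment averages and MS_E from the Minitab output.

import math
from scipy import stats

means = [551.20, 587.40, 625.40, 707.00]   # ybar_i. for 160, 180, 200, 220 W
c = [1, 1, -1, -1]                         # contrast coefficients, sum to zero
ms_e, n, df_e = 334.0, 5, 16

contrast = sum(ci * yi for ci, yi in zip(c, means))
std_err = math.sqrt(ms_e / n * sum(ci ** 2 for ci in c))
t0 = contrast / std_err
p_value = 2 * stats.t.sf(abs(t0), df_e)

print(f"C = {contrast:.2f}, t0 = {t0:.2f}, p-value = {p_value:.4g}")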


Practical Interpretation of Results – Contrasts

Testing: F-test

Test statistic:

$$F_0 = t_0^2 = \frac{\left(\displaystyle\sum_{i=1}^{a} c_i \bar{y}_{i\cdot}\right)^2}{\dfrac{MS_E}{n}\displaystyle\sum_{i=1}^{a} c_i^2} = \frac{MS_C}{MS_E} = \frac{SS_C/1}{MS_E}$$

where the contrast sum of squares is

$$SS_C = \frac{\left(\displaystyle\sum_{i=1}^{a} c_i \bar{y}_{i\cdot}\right)^2}{\dfrac{1}{n}\displaystyle\sum_{i=1}^{a} c_i^2}$$

Practical Interpretation of Results – Contrasts

Testing: F-test

Reject the hypothesis if

$$F_0 > F_{\alpha,\, 1,\, N-a}$$

Practical Interpretation of Results – Contrasts

Confidence interval

$$\sum_{i=1}^{a} c_i \bar{y}_{i\cdot} - t_{\alpha/2,\, N-a}\sqrt{\frac{MS_E}{n}\sum_{i=1}^{a} c_i^2} \;\le\; \sum_{i=1}^{a} c_i \mu_i \;\le\; \sum_{i=1}^{a} c_i \bar{y}_{i\cdot} + t_{\alpha/2,\, N-a}\sqrt{\frac{MS_E}{n}\sum_{i=1}^{a} c_i^2}$$

Practical Interpretation of Results – Contrasts

Standardized contrasts

Standardize the contrasts when more than one contrast is of interest.

Standardized contrast:

$$C = \sum_{i=1}^{a} c_i^* \bar{y}_{i\cdot}, \qquad \text{where } c_i^* = \frac{c_i}{\sqrt{\dfrac{1}{n}\displaystyle\sum_{i=1}^{a} c_i^2}}$$

Practical Interpretation of Results – Contrasts

Unequal sample sizes

Contrast:

$$\sum_{i=1}^{a} n_i c_i = 0$$

t-statistic:

$$t_0 = \frac{\sum_{i=1}^{a} c_i \bar{y}_{i\cdot}}{\sqrt{MS_E \displaystyle\sum_{i=1}^{a} \frac{c_i^2}{n_i}}}$$

Practical Interpretation of Results – Contrasts

Unequal sample sizes

Contrast sum of squares:

$$SS_C = \frac{\left(\displaystyle\sum_{i=1}^{a} c_i \bar{y}_{i\cdot}\right)^2}{\displaystyle\sum_{i=1}^{a} \frac{c_i^2}{n_i}}$$

Practical Interpretation of Results – Orthogonal Contrasts

Two contrasts with coefficients {c_i} and {d_i} are orthogonal contrasts if

$$\sum_{i=1}^{a} c_i d_i = 0$$

or, for an unbalanced design,

$$\sum_{i=1}^{a} n_i c_i d_i = 0$$

Practical Interpretation of Results – Orthogonal Contrasts

Why use orthogonal contrasts?

For a treatments, a set of a-1 orthogonal contrasts partitions the sum of squares due to treatments into a-1 independent single-degree-of-freedom components.

Tests performed on orthogonal contrasts are independent.

Practical Interpretation of Results – Orthogonal Contrasts

example


Practical Interpretation of Results – Orthogonal Contrasts

Example for contrast


Practical Interpretation of Results – Scheffe’s method for comparing all contrasts

Comparing any and all possible contrasts between treatment means.

Suppose that a set of m contrasts in the treatment means

$$\Gamma_u = c_{1u}\mu_1 + c_{2u}\mu_2 + \cdots + c_{au}\mu_a, \qquad u = 1, 2, \ldots, m$$

of interest have been determined. The corresponding contrast in the treatment averages $\bar{y}_{i\cdot}$ is

$$C_u = c_{1u}\bar{y}_{1\cdot} + c_{2u}\bar{y}_{2\cdot} + \cdots + c_{au}\bar{y}_{a\cdot}, \qquad u = 1, 2, \ldots, m$$

Practical Interpretation of Results – Scheffe’s method for comparing all contrasts

The standard error of this contrast is

$$S_{C_u} = \sqrt{MS_E \sum_{i=1}^{a} \frac{c_{iu}^2}{n_i}}$$

The critical value against which $C_u$ should be compared is

$$S_{\alpha, u} = S_{C_u}\sqrt{(a-1)F_{\alpha,\, a-1,\, N-a}}$$

If $|C_u| > S_{\alpha, u}$, the hypothesis that the contrast $\Gamma_u$ equals zero is rejected.

Practical Interpretation of Results – Scheffe’s method for comparing all contrasts

The simultaneous confidence intervals with type I error α

$$C_u - S_{\alpha, u} \;\le\; \Gamma_u \;\le\; C_u + S_{\alpha, u}$$

Practical Interpretation of Results – example for Scheffe’s method

Contrasts of interest:

$$\Gamma_1 = \mu_1 + \mu_2 - \mu_3 - \mu_4 \qquad \Gamma_2 = \mu_1 - \mu_4$$

The numerical values of these contrasts are

$$C_1 = \bar{y}_{1\cdot} + \bar{y}_{2\cdot} - \bar{y}_{3\cdot} - \bar{y}_{4\cdot} = 551.2 + 587.4 - 625.4 - 707.0 = -193.8$$

$$C_2 = \bar{y}_{1\cdot} - \bar{y}_{4\cdot} = 551.2 - 707.0 = -155.8$$

Practical Interpretation of Results – example for Scheffe’s method

Standard errors:

$$S_{C_1} = 16.34 \qquad S_{C_2} = 11.55$$

One percent critical values are

$$S_{0.01, 1} = 65.09 \qquad S_{0.01, 2} = 45.97$$

Since $|C_1| > S_{0.01,1}$ and $|C_2| > S_{0.01,2}$, both contrast hypotheses should be rejected.
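The Scheffé quantities above can be reproduced with a few lines; a sketch, assuming α = 0.01, MS_E = 334, n = 5 per level, and the two contrasts listed on the previous slide.

import math
from scipy import stats

means = [551.20, 587.40, 625.40, 707.00]
ms_e, n, a, N, alpha = 334.0, 5, 4, 20, 0.01
contrasts = {1: [1, 1, -1, -1], 2: [1, 0, 0, -1]}

f_crit = stats.f.ppf(1 - alpha, a - 1, N - a)
for u, c in contrasts.items():
    C_u = sum(ci * yi for ci, yi in zip(c, means))
    S_Cu = math.sqrt(ms_e * sum(ci ** 2 / n for ci in c))
    S_alpha_u = S_Cu * math.sqrt((a - 1) * f_crit)
    print(f"C{u} = {C_u:7.1f}, S_C{u} = {S_Cu:5.2f}, "
          f"S_0.01,{u} = {S_alpha_u:5.2f}, reject H0: {abs(C_u) > S_alpha_u}")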


Practical Interpretation of Results –comparing pairs of treatment means

Tukey's test
Fisher's Least Significant Difference (LSD) method
Hsu's method


Practical Interpretation of Results –comparing pairs of treatment means—Computer output

One-way ANOVA: Etch Rate versus Power

Source  DF     SS     MS      F      P
Power    3  66871  22290  66.80  0.000
Error   16   5339    334
Total   19  72210

S = 18.27   R-Sq = 92.61%   R-Sq(adj) = 91.22%

Individual 95% CIs For Mean Based on Pooled StDev

Level  N    Mean  StDev
160    5  551.20  20.02
180    5  587.40  16.74
200    5  625.40  20.53
220    5  707.00  15.25

Pooled StDev = 18.27


Practical Interpretation of Results –comparing pairs of treatment means—Computer output

Grouping Information Using Tukey Method

Power  N    Mean  Grouping
220    5  707.00  A
200    5  625.40  B
180    5  587.40  C
160    5  551.20  D

Means that do not share a letter are significantly different.

Tukey 95% Simultaneous Confidence Intervals
All Pairwise Comparisons among Levels of Power
Individual confidence level = 98.87%

Power = 160 subtracted from:

Power  Lower   Center  Upper
180      3.11   36.20   69.29
200     41.11   74.20  107.29
220    122.71  155.80  188.89


Practical Interpretation of Results –comparing pairs of treatment means—Computer output

Power = 180 subtracted from:

Power  Lower  Center  Upper
200     4.91   38.00   71.09
220    86.51  119.60  152.69

Power = 200 subtracted from:

Power  Lower  Center  Upper
220    48.51   81.60  114.69
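Tukey's procedure is also available in SciPy (scipy.stats.tukey_hsd, SciPy 1.8 or later); a sketch on the illustrative data follows. The printout differs from Minitab's grouping-letter format, but the pairwise conclusions should agree.

from scipy import stats

g160 = [575, 542, 530, 539, 570]
g180 = [565, 593, 590, 579, 610]
g200 = [600, 651, 610, 637, 629]
g220 = [725, 700, 715, 685, 710]

# Tukey's HSD: all pairwise comparisons with a simultaneous error rate.
result = stats.tukey_hsd(g160, g180, g200, g220)
print(result)                                # pairwise differences and p-values
print(result.confidence_interval(0.95))      # simultaneous 95% confidence intervals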


Practical Interpretation of Results –comparing pairs of treatment means—Computer output

Hsu's MCB (Multiple Comparisons with the Best)

Family error rate = 0.05
Critical value = 2.23
Intervals for level mean minus largest of other level means

Level  Lower    Center   Upper
160    -181.53  -155.80    0.00
180    -145.33  -119.60    0.00
200    -107.33   -81.60    0.00
220       0.00    81.60  107.33


Practical Interpretation of Results –comparing pairs of treatment means—Computer output

Grouping Information Using Fisher Method

Power  N    Mean  Grouping
220    5  707.00  A
200    5  625.40  B
180    5  587.40  C
160    5  551.20  D

Means that do not share a letter are significantly different.

Fisher 95% Individual Confidence Intervals
All Pairwise Comparisons among Levels of Power
Simultaneous confidence level = 81.11%

Power = 160 subtracted from:

Power  Lower   Center  Upper
180     11.71   36.20   60.69
200     49.71   74.20   98.69
220    131.31  155.80  180.29


Practical Interpretation of Results –comparing pairs of treatment means—Computer output

Power = 180 subtracted from:

Power  Lower  Center  Upper
200    13.51   38.00   62.49
220    95.11  119.60  144.09

Power = 200 subtracted from:

Power  Lower  Center  Upper
220    57.11   81.60  106.09


Practical Interpretation of Results –comparing treatment means with a control

Dunnett's method
Control: the one to be compared against
Totally a-1 comparisons

Dunnett's comparisons with a control

Family error rate = 0.05
Individual error rate = 0.0196
Critical value = 2.59
Control = level (160) of Power

Intervals for treatment mean minus control mean
Level  Lower   Center  Upper
180      6.25   36.20   66.15
200     44.25   74.20  104.15
220    125.85  155.80  185.75


Determining Sample size -- Minitab

One-way ANOVA

Alpha = 0.01   Assumed standard deviation = 25
Factors: 1   Number of levels: 4

Maximum     Sample  Target
Difference  Size    Power  Actual Power
75          6       0.9    0.915384

The sample size is for each level.

Minitab: Stat > Power and Sample Size > One-way ANOVA
Number of levels -> 4; Sample size -> (blank); Max. difference -> 75; Power value -> 0.9; SD -> 25
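The power figure can be checked against the noncentral F distribution. The sketch below assumes that "maximum difference = 75" means two means sit at the extremes, 75 apart, with the remaining means at their midpoint (my reading of Minitab's convention); under that assumption the computed power should be close to the 0.915384 reported above.

from scipy import stats

a, n, sd, max_diff, alpha = 4, 6, 25.0, 75.0, 0.01

# Noncentrality parameter for two extreme means max_diff apart, others at the midpoint.
lam = n * (max_diff ** 2 / 2.0) / sd ** 2

df1, df2 = a - 1, a * (n - 1)
f_crit = stats.f.ppf(1 - alpha, df1, df2)
power = stats.ncf.sf(f_crit, df1, df2, lam)
print(f"noncentrality = {lam:.1f}, power with n = {n} per level: {power:.4f}")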


Dispersion Effects

ANOVA tests for location effects; when different factor levels affect variability, we have dispersion effects.

Example: the average and standard deviation are measured for a response variable.


Dispersion Effects

ANOVA found no location effects. Transform the standard deviation to

$$y = \ln(s)$$

Ratio Control            Observation
Algorithm        1         2         3         4         5         6
1         -2.99573  -3.21888  -2.99573  -2.81341  -3.50656  -2.99573
2         -3.21888  -3.91202  -3.50656  -2.99573  -3.50656  -2.40795
3         -2.40795  -2.04022  -2.20727  -1.89712  -2.52573  -2.12026
4         -3.50656  -3.21888  -2.99573  -2.99573  -3.50656  -3.91202

Dispersion Effects

ANOVA found dispersion effects

One-way ANOVA: y = ln(s) versus Algorithm

Source     DF      SS      MS      F      P
Algorithm   3  6.1661  2.0554  21.96  0.000
Error      20  1.8716  0.0936
Total      23  8.0377

S = 0.3059   R-Sq = 76.71%   R-Sq(adj) = 73.22%
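This dispersion-effects ANOVA is just a one-way ANOVA with y = ln(s) as the response; a sketch using the table values above.

from scipy import stats

# y = ln(s) for each ratio control algorithm (rows of the table above).
alg1 = [-2.99573, -3.21888, -2.99573, -2.81341, -3.50656, -2.99573]
alg2 = [-3.21888, -3.91202, -3.50656, -2.99573, -3.50656, -2.40795]
alg3 = [-2.40795, -2.04022, -2.20727, -1.89712, -2.52573, -2.12026]
alg4 = [-3.50656, -3.21888, -2.99573, -2.99573, -3.50656, -3.91202]

f_stat, p_value = stats.f_oneway(alg1, alg2, alg3, alg4)
print(f"F = {f_stat:.2f}, p-value = {p_value:.4g}")   # the slide reports F = 21.96, P = 0.000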


Nonparametric methods in the ANOVA

When normality is invalid, use the Kruskal-Wallis test.

Rank the observations y_ij in ascending order and replace each observation by its rank, R_ij.

In case of ties, assign the average rank to the tied observations.

Test statistic:

$$H = \frac{1}{S^2}\left[\sum_{i=1}^{a}\frac{R_{i\cdot}^2}{n_i} - \frac{N(N+1)^2}{4}\right]$$

Regression and ANOVA

Regression Analysis: Etch Rate versus Power

The regression equation is
Etch Rate = 138 + 2.53 Power

Predictor  Coef    SE Coef  T      P
Constant   137.62  41.21    3.34   0.004
Power      2.5270  0.2154   11.73  0.000

S = 21.5413   R-Sq = 88.4%   R-Sq(adj) = 87.8%

Analysis of Variance
Source          DF     SS     MS       F      P
Regression       1  63857  63857  137.62  0.000
Residual Error  18   8352    464
Total           19  72210

Unusual Observations
Obs  Power  Etch Rate  Fit     SE Fit  Residual  St Resid
11   200    600.00     643.02  5.28    -43.02    -2.06R

R denotes an observation with a large standardized residual.


Nonparametric methods in the ANOVA

Kruskal-Wallis test statistic:

$$H = \frac{1}{S^2}\left[\sum_{i=1}^{a}\frac{R_{i\cdot}^2}{n_i} - \frac{N(N+1)^2}{4}\right]$$

where

$$S^2 = \frac{1}{N-1}\left[\sum_{i=1}^{a}\sum_{j=1}^{n_i} R_{ij}^2 - \frac{N(N+1)^2}{4}\right]$$
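SciPy's Kruskal-Wallis implementation handles the ranking and the tie adjustment internally; a sketch on the illustrative etch-rate data follows.

from scipy import stats

g160 = [575, 542, 530, 539, 570]
g180 = [565, 593, 590, 579, 610]
g200 = [600, 651, 610, 637, 629]
g220 = [725, 700, 715, 685, 710]

h_stat, p_value = stats.kruskal(g160, g180, g200, g220)
print(f"H = {h_stat:.2f}, p-value = {p_value:.4g}")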


Nonparametric methods in the ANOVA

Kruskal-Wallis test: if

$$H > \chi^2_{\alpha,\, a-1}$$

the null hypothesis is rejected.