
The development of analysis of variance techniques for angular data.

HARRISON, David.

Available from Sheffield Hallam University Research Archive (SHURA) at:

http://shura.shu.ac.uk/19759/

This document is the author deposited version. You are advised to consult the publisher's version if you wish to cite from it.

Published version

HARRISON, David. (1987). The development of analysis of variance techniques for angular data. Doctoral, Sheffield Hallam University (United Kingdom).

Copyright and re-use policy

See http://shura.shu.ac.uk/information.html

Sheffield Hallam University Research Archive
http://shura.shu.ac.uk


ProQuest Number: 10697061

All rights reserved

INFORMATION TO ALL USERS
The quality of this reproduction is dependent upon the quality of the copy submitted.

In the unlikely event that the author did not send a complete manuscript and there are missing pages, these will be noted. Also, if material had to be removed, a note will indicate the deletion.

Published by ProQuest LLC (2017). Copyright of the Dissertation is held by the Author.

All rights reserved. This work is protected against unauthorized copying under Title 17, United States Code.

Microform Edition © ProQuest LLC.

ProQuest LLC
789 East Eisenhower Parkway
P.O. Box 1346
Ann Arbor, MI 48106-1346


THE DEVELOPMENT OF ANALYSIS OF VARIANCE TECHNIQUES

FOR ANGULAR DATA

DAVID HARRISON

A thesis submitted to the Council for National Academic Awards in partial fulfilment of the requirements for the degree of Doctor of Philosophy

Sponsoring Establishment: Department of Applied Statistics and Operational Research

Sheffield City Polytechnic

September 1987


THE DEVELOPMENT OF ANALYSIS OF VARIANCE TECHNIQUES FOR ANGULAR DATA

D HARRISON

ABSTRACT

In many areas of research, such as medical statistics, biology and geostatistics, problems arise requiring the analysis of angular (or directional) data. Many of these problems involve experimental designs and so require analysis of variance techniques for suitable analysis of the angular data. Such techniques have been developed only for very limited cases, and the sensitivity of these techniques to violation of the assumptions made, together with their possible extension to larger experimental models, has yet to be investigated.

The general aim of this project is therefore to develop suitable experimental design models and analysis of variance type techniques for the analysis of directional data.

Initially a generalised linear modelling approach is used to derive parameter estimates for one-way classification designs, leading to maximum likelihood methods. This approach, however, when applied to larger experimental designs, is shown to be intractable due to optimization problems.

The limited analysis of variance techniques presently available for angular data are reviewed and extended to take account of the possible addition of further factors within an experimental design. These are shown to break down under varying conditions, calling into question basic underlying assumptions regarding the components within the original approach.

A new analysis of variance approach is developed which possesses many desirable properties held in standard 'linear' statistical analysis of variance.

Finally several data sets are analysed to support the validity of the new techniques.


ACKNOWLEDGEMENTS

I would like to express my gratitude to Dr G K Kanji, acting Head of Department,

and Dr R J Gadsden, Principal Lecturer, for introducing me to this field and under

whose supervision and constant encouragement this work has been carried out. I

would also like to thank the members of the Departments of Mathematics and

Statistics for their assistance on specific points.

My indebtedness to the work of Professor K V Mardia will be evident; I should

here like to thank him, in addition, for his valuable comments and suggestions.

I wish to thank Ms Ann Moon for the excellent typing of this thesis produced from

the demanding material with which she had to work.

Finally, and most of all, I would like to express my love and gratitude to my wife

Carole, for her patience and understanding.


CONTENTS

CHAPTER 1  INTRODUCTION
1.1 Directional Data
1.2 A General Thesis Review
1.3 Probability Distributions on the Circle
1.4 Statistics of a Circular Distribution

CHAPTER 2  THE VON MISES DISTRIBUTION
2.1 Derivation
2.2 Properties of the von Mises Distribution and its Parameters
2.3 The Distribution of R for the Uniform and von Mises Distributions
2.3.1 Preliminary Results
2.3.2 Distribution of R for the Uniform Distribution
2.3.3 Distribution of R for the von Mises Distribution
2.4 Combining von Mises Distributions
2.5 A Summary of Exact and Approximate Moments of R

CHAPTER 3  LIKELIHOOD RATIO TESTS AND APPROXIMATIONS TO THE MAXIMUM LIKELIHOOD ESTIMATES
3.1 The Method of Maximum Likelihood and Likelihood Ratio Tests
3.2 Approximations for the von Mises Concentration Statistic, k, when k is Small
3.3 Approximations for the von Mises Concentration Statistic, k, when k is Large
3.4 The Best Approximation for the von Mises Concentration Statistic, k, in the Range 1 < k < 2.5
3.5 Summary of Approximations
3.6 Expansion of [I₀(A)/I₀(B)]^N

CHAPTER 4  THE DEVELOPMENT OF CIRCULAR ANALYSIS OF VARIANCE TECHNIQUES
4.1 Introduction
4.2 Exact and Approximate Tests of Significance from Watson and Williams
4.2.1 An Exact Test
4.2.2 Approximate Tests
4.3 Hypothesis Testing Concerning the Mean Direction
4.4 Multi-sample Tests Concerning the Mean Direction
4.4.1 For Small Concentration Parameter, k
4.4.2 For Large Concentration Parameter, k
4.5 Multi-sample Tests for the Equality of Concentration Parameters, kj
4.5.1 Rj/Nj < 0.45
4.5.2 0.45 ≤ Rj/Nj ≤ 0.70
4.5.3 Rj/Nj > 0.70

CHAPTER 5  PARAMETER ESTIMATION LEADING TO MAXIMUM LIKELIHOOD METHODS FOR LARGER EXPERIMENTAL DESIGNS
5.1 Introduction
5.2 Constraints for Circular Models
Example 5.2.1
5.3 One-way Classification
Example 5.3.1
5.4 Randomised Complete Block and Larger Experimental Designs
Example 5.4.1
5.5 Summary

CHAPTER 6  EXTENDING TO LARGER EXPERIMENTAL DESIGNS
6.1 Introduction
6.2 Nested or Hierarchical Design
6.3 The Randomised Complete Block Design and Two-way Classification with Interaction Design
6.4 Accuracy of the Associated χ² Distributions for the Randomised Block Design and the Two-way Classification and their Corresponding F Statistics
6.5 Summary

CHAPTER 7  THE ROBUSTNESS AND POSSIBLE COLLAPSE OF THE EXTENDED TECHNIQUES
7.1 Introduction
7.2 Robustness of Assumptions
Example 7.2.1
7.3 A Further Approach to Watson and Williams Design Using a Regression Model
7.3.1 Total Measure of Variation
7.3.2 Residual or Within Measure of Variation
7.3.3 Between Measure of Variation
7.3.3.1 Combining Mean Angular Directions and Resultant Lengths
7.4 Cross Product Terms and the Analysis of Cross-Classification
7.5 Summary

CHAPTER 8  DEVELOPMENT OF A NEW ANALYSIS OF VARIANCE PROCEDURE
8.1 Introduction
8.2 Minimising Chord Distances to Build New Model Components
8.2.1 Total Measure of Variation
8.2.2 Residual Measure of Variation
8.2.3 Between Measure of Variation
8.3 Cross Product Terms
8.4 Analysis of Cross-Classification and Interaction
8.4.1 The Interpretation of Interaction on the Circle
8.4.2 The Calculation of Interaction on the Circle
8.5 Other Design Models
8.5.1 The Randomised Complete Block Design
8.5.2 Latin Square Design
8.5.3 Nested or Hierarchical Design
8.5.4 Three-way Classification
8.6 Summary

CHAPTER 9  THE DISTRIBUTION OF THE NEW COMPONENTS AND TEST STATISTICS
9.1 Introduction
9.2 Small Concentration Parameter, k
9.3 Large Concentration Parameter, k
9.3.1 Distribution of k(N − (C²/N))
9.3.2 Distribution of k((R²/N) − (C²/N))
9.3.3 Distribution of k(N − (R²/N))
9.4 The Variance of the Component Chi-Squared Approximations
9.4.1 Variance of (N − (R²/N))
9.4.2 Variance of (N − Σ(Rj²/Nj))
9.4.3 Variance of (Σ(Rj²/Nj) − (R²/N))
9.5 The Adequacy of the New Procedure, for Large k
9.6 Comparing the Power of the Tests, for Large k
9.7 The Adequacy of the New Procedure, for Small k
9.8 Comparing the Power of the Tests, for Small k
9.9 Summary

CHAPTER 10  THE RANDOMISED COMPLETE BLOCK AND TWO-WAY DESIGNS VIA THE NEW APPROACH
10.1 Introduction
10.2 Randomised Complete Block and Two-way Classification with Interaction Designs, for Large k
10.3 Accuracy of the Associated χ² Approximations for the Randomised Complete Block and the Two-way Classification Designs with their Corresponding F Statistics, for Large k
10.3.1 The Randomised Complete Block Design
10.3.2 Two-way Classification with Interaction
10.4 Randomised Complete Block and Two-way Classification with Interaction Designs, for Small k
10.5 Accuracy of the Associated Chi-squared Approximations for Small k
10.6 Summary

CHAPTER 11  ANALYSIS OF VARIANCE EXAMPLES FOR CIRCULAR STATISTICS
11.1 Introduction
11.2 One-way Analysis
Example 11.2.1 for Large k
Example 11.2.2 for Small k
11.3 Randomised Complete Block Design
Example 11.3.1 for Large k
Example 11.3.2
11.4 Two-way Classification with Interaction
Example 11.4.1 for Large k
Example 11.4.2 for Small k
11.5 The Latin or Graeco-Latin Square
Example 11.5.1
11.6 Split Plot Design
Example 11.6.1
11.7 Summary

CHAPTER 12  SYNOPSIS OF RESULTS AND CONCLUSIONS

Appendix A  Index of Notation
Appendix B  The Simulation and Accuracy of Numerical Results

References


CHAPTER 1

INTRODUCTION

1.1 Directional Data

In many scientific fields the experimenter is interested primarily in the direction of a

measured variable. These observations will be bearings from some central point, or

origin, ending on a sphere or circumference of a circle, and may be regarded as

vectors. Since only the direction matters, each observation can be represented as a unit vector; the length or magnitude of the vector is not important. Directions may be thought of in any

number of dimensions but in practice they are invariably collected in two or three

dimensional space. The tests and new results presented in this thesis are solely

concerned with directions in two dimensions. Directions are measured by angles ranging from 0° to 360°, or, equivalently, from 0 to 2π radians. Circular or

directional data is the name given to data which arise when the observations are

angles.

There are many examples of circular data originating from various disciplines. For

example, geologists study the orientation of fractures in deformed rocks to interpret

structural changes, and the orientation of cross-bedding or particles in undisturbed sediments to infer the direction of depositing currents of wind and water (Pincus (1953),

Curray (1956), Sengupta and Rao (1966), and Sanderson (1976)). The classic example

of directional data is from the study of bird orientation in homing or migration

which involves observing the birds' vanishing angles from their release point.

Zoologists use such data to investigate consistency of bird migration under certain

conditions. Many examples from this field of study are well cited and illustrated by

Batschelet (1965, 1981).


Directional data is not confined to observations directly measured in degrees or

radians, but may also occur in the area of biological rhythms. A period of 24 hours

corresponds to a full turn of 360 degrees. Similarly, a month, a year or any other

period of a cyclic event may be represented by a rotation of 360 degrees. The

number of deaths due to a disease or the number of onsets of a disease in each

month over years fall in this category and can be treated as directional or circular

data. Other examples of this type can be seen in Gumbel (1954).

It is tempting to use the conventional measures of location and spread used in linear

analysis to analyse directional data. For example, suppose our data are the four

values 5, 14, 351 and 10; a simple arithmetic mean would give a value of 95. For

linear analysis this is understandable as the value of 351 has a large influence and

draws the mean away from the other data points. If these values are now regarded

as angles the spread of the whole sample is reduced, since in angular terms the

sample value of 351° is now situated close to the other data points. Similarly the

point of central location will have changed considerably and can now be seen to be

around zero degrees. Then the simple arithmetic mean would not, in general, give a

meaningful mean direction of the sample; similarly, the standard deviation would not

give a good measure of dispersion. If, however, the zero direction was taken at a

different position on the circle such as the y-axis in place of the x-axis then the

linear measure may give a sensible result. For example, if the above sample values

were rotated by 90°, to become 95°, 104°, 81° and 100°, the arithmetic mean would give a sensible result of 95°. It is therefore not possible to define an arithmetic

mean or standard deviation in such a way that it is invariant under a rotation of the

circle. This heavy dependence on the zero direction shows the inappropriate use of

basic linear methods for circular statistics. Simple examples of such problems are

given by Batschelet (1965, 1981), Mardia (1972) and Watson (1983). Distribution

functions, characteristic functions and moments all suffer from the same drawback

and must in some way take account of the natural periodicity of the circle.
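The effect described above can be checked numerically. The following minimal sketch, assuming NumPy is available, illustrates the point with the same four values; the vector-mean calculation anticipates the definitions formalised in Section 1.4.

```python
import numpy as np

# Angles from the worked example above, in degrees.
angles_deg = np.array([5.0, 14.0, 351.0, 10.0])

# Naive arithmetic mean: 95, pulled away from the data by the value 351.
arithmetic_mean = angles_deg.mean()

# Circular (vector) mean: average the unit vectors and take the resulting angle.
theta = np.deg2rad(angles_deg)
C, S = np.cos(theta).sum(), np.sin(theta).sum()
circular_mean = np.degrees(np.arctan2(S, C)) % 360.0

print(arithmetic_mean)   # 95.0
print(circular_mean)     # about 5 degrees, i.e. close to the zero direction
```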


1.2 A General Thesis Review

Having introduced the study of directional data, this section gives a brief review of

the work discussed within each of the following chapters, whilst indicating the

structure and progression of the thesis as a whole. The following two sections within

this chapter give further background to the subject of directional data. The first

discusses different types of probability distributions that may exist on the circle, whilst

the second states the elementary statistics of angular data required for further use in

later chapters.

In Chapter 2 the von Mises distribution is discussed from estimation and distributional

viewpoints. The maximum likelihood estimates of the von Mises parameters are

seen to be asymptotically independent so that construction of simple large-sample

tests, for differing hypotheses regarding the parameters, may be carried out. Interest

is focused on the distributional form of the resultant length on the random (Uniform)

distribution, k = 0, and the von Mises distribution. The results from the special

case of the random distribution are required since terms in its solution arise again in

the general distribution theory. A summary of exact and approximate moments of

the resultant length, R, is given in preparation for the derivation of further circular

statistical tests.

Chapter 3 is a review of the maximum likelihood results and tests for the von Mises

distribution. Extensive investigation of the many approximations for the von Mises

concentration parameter, k, for both small and large k, has been undertaken. Seven

approximations for the maximum likelihood estimate, k̂, for small k, which have been cited by various authors, are reviewed. Their corresponding values, residuals and relative residuals are plotted to enable comparison and evaluation of their accuracy. Similar work is carried out for nine approximations to k̂ for large k. A summary of the 'best' and 'best simple' approximations is given in Section 3.5 against varying


ranges of the concentration parameter.

Chapter 4 gives an historical review of the development of analysis of variance

techniques. The work covers the exact and approximate tests for differing mean

directions derived by Watson (1956), Watson and Williams (1956) and Stephens

(1962a, b and c), to multi-sample tests for the equality of concentration parameters.

The homogeneity tests for varying ranges of concentration parameter are cited for

later use with new design tests.

Chapter 5 discusses the use of the generalised linear modelling approach for circular

statistics to derive parameter estimates leading to maximum likelihood methods. For

circular statistics it is shown to be desirable to choose the constraint on the angles

specifying the factor parameters so that their sines sum to zero. Section 5.3 shows

how parameter estimates for the one-way classification design may be found and

therefore assist in further understanding of the underlying structure under

investigation. The approach, however, when applied to larger experimental designs is

seen, at this time, to be intractable since the optimization procedures cannot be

solved due to the numerous local maxima found within the constrained equations.

Chapters 6 and 7 examine the possibility of extending the original procedure for the

one-way analysis, derived by Watson and Williams (1956), to larger designs, for large

k. Chapter 6 shows the construction of the nested or hierarchical design, the

randomised complete block and two-way classification design with interaction together

with a comparison of their accuracy to the chi-squared and F distributions. Chapter

7, however, shows the possible collapse of these test statistics under particular

circumstances. The problems associated with the combining of circular mean

directions are shown to be influential in this collapse whilst the cross-product terms

are seen to be non-zero, requiring a correction factor to eliminate them.


Chapter 8 develops a new analysis of variance approach by taking account of the

resultant lengths together with their corresponding mean directions to eliminate the

possible collapse discussed in Chapter 7. The method is still based on maximum

likelihood techniques but requires the user to test for equality of concentration

parameters prior to testing for any difference between mean directions. The

cross-product terms are examined and found to equal their desired combined value of

zero. An investigation of the interpretation and representation of interaction on the

circle is given in Section 8.4 prior to its calculation via the new approach. For the

two-way design the cross-product terms are again shown to equal zero. Further

designs are then constructed in the same manner.

Following the development of the procedures in Chapter 8, Chapter 9 examines the

statistical theory and distributions behind the new design components and test

statistics. The exact theoretical distributions are seen to be intractable, and therefore

distribution approximations are used to examine the theory whilst simulation

techniques reproduce the distributions of the test statistics for comparison with their

assumed expected distributions. The comparisons are carried out for both large and

small k and test statistic improvements are made using the component moments.

The power of the new tests is also compared with that of existing tests for the multi-sample case and is seen to compare favourably for both large and small k.

Chapter 10 reproduces the components within the new procedure for the randomised

complete block and two-way designs together with their improvement factor derived

in Chapter 9. The component statistics and test statistics are compared to their

respective exact chi-squared and F distributions. These two designs are used to

illustrate the validity of the approach for larger, more complex design situations.


Chapter 11 gives several examples where the new approach is applied to real data

sets with varying sizes of concentration parameter. The examples vary from the

one-way design to the Graeco-Latin square and split plot designs.

Finally Chapter 12 summarises the development of the new analysis of variance

techniques. The adequacy of the new procedures, produced for both large and small

concentration parameter, is discussed together with their respective components and

test statistics.

Appendix A gives a list of notations used throughout the thesis together with the

design notations set out in tabular form. Appendix B reviews the techniques used to

simulate the von Mises distribution and the required experimental designs. The size

and accuracy of the numerical results are also discussed.

1.3 Probability Distributions on the Circle

There is no single distribution on the circle which has all the desirable properties

which the Normal distribution possesses on the line. Most of the distributions on the

circle have been derived either from transformations of the standard univariate (or

bivariate) distributions or as circular analogues of important univariate characteristics.

Linear distributions may have a finite range, may range to infinity, or may even extend

over the whole straight line. Circular distributions, however, are always finite

ranging from 0° to 360° (or, equivalently, 0 to 2π), or are fractions within this

range.

In general, circular distributions are continuous over the circumference of the circle

and may be specified by a probability density function f(θ), which is a periodic function satisfying

\int_0^{2\pi} f(\theta)\,d\theta = 1 \qquad (1.3.1)

Although no circular distribution holds all the desirable properties seen in the Normal

distribution, the von Mises distribution (originally referred to as the Circular Normal

distribution) is the most generally used distribution in statistical inference on the

circle. The importance of the von Mises distribution on the circle is often compared

to that of the Normal distribution on the line.

The distribution has probability density function

f(\theta) = \frac{1}{2\pi I_0(k)}\exp[k\cos(\theta - \mu_0)], \qquad 0 \le \theta < 2\pi \qquad (1.3.2)

where I₀(k) is a modified Bessel function, and k is a parameter of concentration of the data about a mean direction μ₀. A complete discussion of the von Mises distribution and its properties can be seen in Chapter 2.
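As an informal check of (1.3.2), the sketch below, assuming NumPy and SciPy, evaluates the von Mises density through the modified Bessel function I₀ and verifies numerically that it satisfies (1.3.1).

```python
import numpy as np
from scipy.special import i0  # modified Bessel function I_0

def von_mises_pdf(theta, mu0, k):
    """Density of M(mu0, k) as in equation (1.3.2)."""
    return np.exp(k * np.cos(theta - mu0)) / (2.0 * np.pi * i0(k))

# Check that the density integrates to one over (0, 2*pi), as required by (1.3.1).
grid, dx = np.linspace(0.0, 2.0 * np.pi, 200_000, endpoint=False, retstep=True)
print(np.sum(von_mises_pdf(grid, mu0=np.pi / 3, k=2.0) * dx))   # approximately 1.0
```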

There are two limiting cases of circular distributions. The first case occurs when all

the angles are the same, i.e. concentrated at one single point θ = μ₀. The 'point'
distribution has little if any practical or theoretical interest here, but has been used

for the analysis of Brownian movement and the paths of beta rays.

The second limiting case is the Uniform distribution, where every angle on the unit

circle has an equal chance of occurring or no sector is preferred to any other sector.

The probability density of θ is constant over the whole circumference and is defined by

f(\theta) = \frac{1}{2\pi}, \qquad 0 \le \theta < 2\pi \qquad (1.3.3)

As there is no concentration of points about any given direction on the circle, then

no mean direction exists. Polya (1935) used an analogue of this when he investigated

whether the stars are distributed at random over the celestial sphere.


As in the linear case an infinite number of circular distributions exist. Among these,

a few, possessing some desirable properties, have received attention. After the von

Mises, probably the most important is the wrapped Normal distribution. This

distribution is a natural conversion of the Normal distribution and is obtained, as its

name suggests, by wrapping the Normal distribution around the circumference of the

circle and adding those probabilities that fall into the same sector of the circle. The

addition of the overlapping tails leads to a rather complicated density function, though when σ is small the distribution will be approximately Gaussian in (0, 2π). Like the Circular Uniform distribution, the wrapped Normal distribution possesses the additive property, i.e. the sum of two or more wrapped Normal distributions produces another wrapped Normal distribution with related parameters. The probability density of a random variable θ, with mean angle μ₀ = 0°, from the wrapped Normal distribution is

f(\theta) = \frac{1}{\sigma\sqrt{2\pi}}\sum_{n=-\infty}^{\infty}\exp\left[-\frac{(\theta + 2\pi n)^2}{2\sigma^2}\right], \qquad 0 \le \theta < 2\pi \qquad (1.3.4)

where the mean vector length is

\rho = \exp(-\sigma^2/2)

As ρ tends to zero, the wrapped Normal distribution approaches the Uniform distribution, and as ρ tends to one it is concentrated at a single point. The

distribution has applications in the study of diffusion processes, and amongst others

has been examined by Stephens (1963) and Bingham (1971).
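Because the sum in (1.3.4) converges rapidly, the wrapped Normal density can be evaluated by truncating it. The sketch below is illustrative only; the function name and the truncation length n_wraps are choices of convenience, and NumPy is assumed.

```python
import numpy as np

def wrapped_normal_pdf(theta, sigma, n_wraps=20):
    """Wrapped Normal density with mean direction 0, equation (1.3.4),
    with the infinite sum truncated to |n| <= n_wraps."""
    n = np.arange(-n_wraps, n_wraps + 1)
    terms = np.exp(-(np.atleast_1d(theta)[:, None] + 2.0 * np.pi * n) ** 2
                   / (2.0 * sigma ** 2))
    return terms.sum(axis=1) / (sigma * np.sqrt(2.0 * np.pi))

sigma = 1.0
rho = np.exp(-sigma ** 2 / 2.0)      # mean vector length of the wrapped Normal
grid, dx = np.linspace(0.0, 2.0 * np.pi, 100_000, endpoint=False, retstep=True)
print(np.sum(wrapped_normal_pdf(grid, sigma) * dx))   # approximately 1.0
print(rho)                                            # about 0.607
```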

Another distribution which has been wrapped around the circle in a similar manner to the Normal is the Cauchy distribution. The result is again a unimodal and symmetric circular distribution possessing the additive property. With mean angle μ₀ and mean vector length ρ, the probability density function for the wrapped Cauchy is

f(\theta) = \frac{1}{2\pi}\,\frac{1 - \rho^2}{1 + \rho^2 - 2\rho\cos(\theta - \mu_0)}, \qquad 0 \le \theta < 2\pi \qquad (1.3.5)

introduced by Levy (1939) and studied by Wintner (1947).

Other circular distributions of less importance include the cosine distribution, also called the sine wave distribution, which has mean vector length ρ and mean angle μ₀, and density function

f(\theta) = \frac{1}{2\pi} + \frac{\rho}{\pi}\cos(\theta - \mu_0) \qquad (1.3.6)

and the cardioid distribution with density function

f(\theta) = \frac{1}{2\pi}\left[1 + 2\rho\cos(\theta - \mu_0)\right] \qquad (1.3.7)

introduced by Jeffreys (1948).

All the circular distributions discussed so far have been unimodal distributions, with a

single preferred direction μ₀. There are, however, circular distributions with multimodal directions. Equation (1.3.8) gives the density function of a multimodal von Mises distribution, where v denotes the number of modes:

f(\theta) = \frac{1}{2\pi I_0(k)}\exp[k\cos v(\theta - \mu_0)] \qquad (1.3.8)

This was suggested by Breitenberger (1963) and further investigated by Stephens

(1965).

Many examples of bimodal data can be found, particularly in scientific fields where

orientation is measurable but direction is not. Batschelet (1965, 1981) gives many

examples of bimodal data from animal orientation and navigation. A similar situation

occurs if we observe the position of undirected straight lines or undirected axes.

Gadsden and Kanji (1983) collected this type of data on clay particles following the


removal of their electric charge. These particles do not have a 'head' or 'tail' and

so the observations lie in the range 0 to π radians.

Other unimodal circular distributions were discussed by Mardia (1972, pp. 48-61). Batschelet (1981, pp. 275-290) also reviews skewed, flat-topped and sharply peaked

circular distributions.

The most widely used circular distribution, however, is the von Mises distribution and

it is on this distribution that this thesis is based. The following chapters investigate

and extend the theory and uses of this distribution.

1.4 Statistics of a Circular Distribution

As we noted in Section 1.1, the simple arithmetic mean would not, in general, give a meaningful mean direction of a sample of angles θ₁, θ₂, ..., θ_N. The mean direction in circular statistics is determined by applying trigonometric functions.

Let P_i be one of the N observed angles θ_i, i = 1, 2, ..., N, with origin O, and let c_i and s_i be the rectangular components of P_i. Then, by definition of sine and cosine,

c_i = \cos\theta_i, \qquad s_i = \sin\theta_i \qquad (1.4.1)

where the components of the sample resultant are

C = \sum_{i=1}^{N}\cos\theta_i, \qquad S = \sum_{i=1}^{N}\sin\theta_i \qquad (1.4.2)

Therefore, if R is the length of the resultant vector and r the length of the mean vector in the mean direction, with components C̄ = C/N and S̄ = S/N, then

r = (\bar{C}^2 + \bar{S}^2)^{1/2} \qquad (1.4.3)

R = N r \qquad (1.4.4)


Applying basic trigonometry to calculate θ̄, the mean direction of the sample,

\bar{\theta} = \begin{cases} \arctan(S/C) & \text{if } C > 0 \\ 180^\circ + \arctan(S/C) & \text{if } C < 0 \end{cases} \qquad (1.4.5)

with the exceptional cases of

\bar{\theta} = \begin{cases} 90^\circ & \text{if } C = 0 \text{ and } S > 0 \\ 270^\circ & \text{if } C = 0 \text{ and } S < 0 \\ \text{undetermined} & \text{if } C = 0 \text{ and } S = 0 \end{cases} \qquad (1.4.6)

Figure 1.4.1 illustrates these circular measures.

[Figure 1.4.1: The Circular Statistics, showing the unit circle with c_i = cos θ_i and s_i = sin θ_i as the rectangular components of an observation.]


It is clear from Figure 1.4.1 that the length of the resultant cannot exceed N and

similarly, from (1.4.4), that r cannot exceed 1.

Hence,

0 \le R \le N \qquad (1.4.7)

and

0 \le r \le 1 \qquad (1.4.8)

In the extreme case when all sample points fall onto the same point, the length of

the mean vector, r, equals 1. When points are close together, concentrated over a

small arc, the centre of mass is still very close to the circumference of the unit

circle, and r is close to 1. Less concentration leads to smaller values of r. At the

lower end r = 0, with no concentration around a single direction. Hence, in

unimodal samples, the mean vector length, r, serves as a measure of concentration.

From the above results we may now see that θ̄ has some desirable properties as a measure of location. One such property is that the mean vector does not depend on the zero direction of the sample. If a rotation of ψ is applied to each angle, then the sample values θ_i turn into θ'_i = θ_i − ψ. Similarly the new mean angle is θ̄' = θ̄ − ψ, but the mean vector length, r, remains invariant.

Examining the sine of the difference between the mean angular direction and the sample angles, it is easily shown that

\sum_{i=1}^{N}\sin(\theta_i - \bar{\theta}) = 0 \qquad (1.4.9)

which is analogous to

\sum_{i=1}^{N}(x_i - \bar{x}) = 0 \qquad (1.4.10)

in linear statistical analysis.


Similarly, for the cosine of the difference,

\sum_{i=1}^{N}\cos(\theta_i - \bar{\theta}) = R \qquad (1.4.11)

and therefore

\frac{1}{N}\sum_{i=1}^{N} 2[1 - \cos(\theta_i - \bar{\theta})] = 2(1 - r) \qquad (1.4.12)

For small deviations, θ_i − θ̄,

2[1 - \cos(\theta_i - \bar{\theta})] \approx (\theta_i - \bar{\theta})^2 \qquad (1.4.13)

hence,

\frac{1}{N}\sum_{i=1}^{N}(\theta_i - \bar{\theta})^2 \approx 2(1 - r) \qquad (1.4.14)

which is analogous to

\frac{1}{N}\sum_{i=1}^{N}(x_i - \bar{x})^2 = s^2 \qquad (1.4.15)

in linear statistics.

Equation (1.4.12) may be defined as the angular variance and, from (1.4.14), is asymptotically equivalent to the variance in linear statistics.

Taking the square root of (1.4.12) gives a measure of dispersion, equivalent to the standard deviation,

s = [2(1 - r)]^{1/2} \qquad (1.4.16)

called the mean angular deviation.
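The quantities of this section are straightforward to compute. The following sketch, assuming NumPy, collects the sample statistics C, S, R, r, the mean direction and the mean angular deviation for the four angles used in Section 1.1; np.arctan2 handles the quadrant cases of (1.4.5) and (1.4.6) automatically.

```python
import numpy as np

def circular_summary(theta):
    """Sample circular statistics of Section 1.4 for angles theta (radians)."""
    N = len(theta)
    C, S = np.cos(theta).sum(), np.sin(theta).sum()      # components, (1.4.2)
    R = np.hypot(C, S)                                   # resultant length
    r = R / N                                            # mean vector length, (1.4.4)
    mean_dir = np.arctan2(S, C) % (2.0 * np.pi)          # mean direction
    ang_var = 2.0 * (1.0 - r)                            # angular variance, (1.4.12)
    s = np.sqrt(ang_var)                                 # mean angular deviation, (1.4.16)
    return {"R": R, "r": r, "mean_direction": mean_dir,
            "angular_variance": ang_var, "mean_angular_deviation": s}

theta = np.deg2rad([5.0, 14.0, 351.0, 10.0])
print(circular_summary(theta))
```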


The basic results of this section were adapted from Batschelet (1981) where further

discussion of the properties is given. The analogies have been reiterated here so

further use may be made of them in later chapters.


CHAPTER 2

THE VON MISES DISTRIBUTION

2.1 Derivation

Gauss showed that the Normal distribution can be derived by the method of

maximum likelihood with the single assumption that the mean is the most probable

value. Von Mises (1918) applied this to a circular variate, and for this reason

Gumbel, Greenwood and Durand (1953) referred to the distribution as the Circular

Normal Distribution. Von Mises procedure was for a distribution f(0| - n 0), such

that the direction f iQ upon N observations 0 lt 0 2, ...... 0^ is a maximum given by

the constraints

where f(0) is the required distribution and f'(0 ) is the first derivative of f(0) with

respect to n Q. Since the equations (2.1.1) and (2.1.2) are identical for each 0j,

therefore

N

(2 . 1 . 1)i= l

and

(2 . 1 .2 )

f ' ( * - Mo)-------------------------s in (0 - /t0)f (0 - H0)

( 2 . 1 . 3 )

The equation has the solution

f (0 - j i0) - U exp[/c cos(0 - / i0) ] ( 2 . 1 . 4 )

where the two variables U and k are linked by the condition (1.3.1).


Hence

U^{-1} = \int_0^{2\pi}\exp[k\cos(\theta - \mu_0)]\,d(\theta - \mu_0), \qquad U = \frac{1}{2\pi I_0(k)} \qquad (2.1.5)

where I₀(k) is the modified Bessel function of the first kind and order zero. A proof of (2.1.5) can be seen in Mardia (1972, p.58).

The von Mises distribution, denoted as M(μ₀, k), is then given by

f(\theta) = \frac{1}{2\pi I_0(k)}\exp[k\cos(\theta - \mu_0)] \qquad (2.1.6)

2.2 Properties of the von Mises Distribution and its Parameters

The von Mises distribution is unimodal and symmetric with its mode at μ₀ and anti-mode at μ₀ + π. For k = 0, the von Mises degenerates into the Uniform distribution, and for large k the distribution concentrates around the mean direction. Therefore, k is called the parameter of concentration. The concentration parameter k is analogous to the inverse of the variance parameter σ² of the Normal distribution in its effect on the shape of the distribution. For sufficiently large k we may approximate the von Mises by the Normal distribution. Using an approximation quoted by Bickley (1957), for large k,

I_0(k) \approx \frac{\exp(k)}{(2\pi k)^{1/2}\,c(k)} \qquad (2.2.1)

where the factor c(k) tends to one as k becomes large.


Extending the limits of this approximation would be reasonable since, when k is large, the additional area is negligible. Replacing this into (2.2.2), a von Mises distribution with mean at zero,

f(\theta) = \frac{1}{2\pi I_0(k)}\exp(k\cos\theta), \qquad -\pi < \theta \le \pi \qquad (2.2.2)

gives

f(\theta) \approx c(k)\left(\frac{k}{2\pi}\right)^{1/2}\exp\{k(\cos\theta - 1)\}, \qquad -\infty < \theta < \infty \qquad (2.2.3)

Gumbel, Greenwood and Durand (1953) simplified (2.2.3) to obtain

M(0, k) \approx N(0, 1/k), \qquad \text{or alternatively} \qquad \theta\sqrt{k} \approx N(0, 1) \qquad (2.2.4)

Upton (1974) considered more accurate approximations by taking further terms in the power series of (cos θ − 1), and produced two new approximations, (2.2.5) and, more accurately, (2.2.6), each transforming θ to an approximately N(0, 1) variate. A fourth approximation, (2.2.7), was considered by Upton (1974), given by Mardia (1972, p.64) without proof.

Upton tested the power of all the approximations, finding that all four consistently overestimated the upper tail probability. Approximation (2.2.6) was found to be the best, and this was later confirmed by further work by Hill (1978).
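The simplest of these results, M(0, k) ≈ N(0, 1/k) in (2.2.4), can be examined by simulation. The sketch below is an informal check rather than a reproduction of Upton's study; it assumes SciPy's von Mises sampler and an arbitrary choice of k and sample size.

```python
import numpy as np
from scipy.stats import vonmises, norm

rng = np.random.default_rng(0)
k = 50.0                                                   # a large concentration parameter
theta = vonmises.rvs(k, size=200_000, random_state=rng)    # angles in (-pi, pi], mean 0

z = np.sqrt(k) * theta                   # should be roughly N(0, 1) by (2.2.4)
print(z.mean(), z.std())                 # close to 0 and 1

# Empirical upper tail probability versus the N(0, 1) value.
print((z > 1.645).mean(), 1.0 - norm.cdf(1.645))
```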


Stephens (1962c) gave another approach to the equality of the two distributions using the

moment generating function.

Estimates of the parameters k and μ₀ may be obtained by means of the maximum likelihood method.

A sample of N angles θ₁, θ₂, ..., θ_N is collected from a von Mises distribution with unknown population parameters k and μ₀ which we wish to estimate. The probability density for these angles is

C^N\exp[k\cos(\theta_1 - \mu_0)]\exp[k\cos(\theta_2 - \mu_0)]\cdots = C^N\exp\{k[\cos(\theta_1 - \mu_0) + \cos(\theta_2 - \mu_0) + \cdots]\} \qquad (2.2.8)

where

C = \frac{1}{2\pi I_0(k)}

The log likelihood function is

\log L(\mu_0, k) = -N\log I_0(k) + k\sum_{i=1}^{N}\cos(\theta_i - \mu_0) + \text{const} \qquad (2.2.9)

For the maximum likelihood estimate of the parameter μ₀,

\frac{\partial\log L}{\partial\mu_0} = k[\sin(\theta_1 - \mu_0) + \sin(\theta_2 - \mu_0) + \cdots] = k\sum_{i=1}^{N}\sin(\theta_i - \mu_0) \qquad (2.2.10)

which vanishes for the particular value μ̂₀ with

\sum_{i=1}^{N}\sin(\theta_i - \hat{\mu}_0) = 0


From (1.4.9) we know that this equation is satisfied for the sample mean angle θ̄ which we calculated in (1.4.5). Hence, the maximum likelihood estimate of the parameter μ₀ of a von Mises distribution is the sample mean angle θ̄. Bingham and Mardia (1975) showed that there exists only one circular distribution for which the sample mean angle is the maximum likelihood estimate of the population mean angle, namely the von Mises distribution.

For the maximum likelihood estimate of the parameter k,

\frac{\partial\log L}{\partial k} = -N\,\frac{I_1(k)}{I_0(k)} + \sum_{i=1}^{N}\cos(\theta_i - \mu_0) \qquad (2.2.11)

Therefore ∂log L/∂k is zero if

\frac{I_1(k)}{I_0(k)} = \frac{1}{N}\sum_{i=1}^{N}\cos(\theta_i - \bar{\theta}) \qquad (2.2.12)

The right hand side is the mean vector length of the sample, r, as indicated by equation (1.4.11). Hence, the maximum likelihood estimate of k, k̂, is the solution of

A(\hat{k}) = \frac{I_1(\hat{k})}{I_0(\hat{k})} = \frac{R}{N} = r \qquad (2.2.13)

If the mean direction is known to be μ₀, then the maximum likelihood estimate of k, k̂, is no longer given by equation (2.2.13), but instead by equation (2.2.14),

\frac{I_1(\hat{k})}{I_0(\hat{k})} = \frac{X}{N} \qquad (2.2.14)

where X is the component of R along the known direction μ₀.

The solution of (2.2.14) is obtained numerically. Tables are not provided here since adequate tables have been produced by Upton (1970, Appendix G), Mardia (1972,


Appendix 2.2) and Batschelet (1981, Table B). For the extreme cases, there are

approximate solutions to (2.2.13) which will be discussed in Chapter 3.
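Equation (2.2.13) can also be solved numerically without tables. The sketch below is illustrative; it assumes SciPy, uses the exponentially scaled Bessel functions i0e and i1e so that the ratio I₁(k)/I₀(k) does not overflow for large k, and the bracketing interval for the root-finder is an arbitrary choice.

```python
import numpy as np
from scipy.special import i0e, i1e     # exponentially scaled I_0, I_1
from scipy.optimize import brentq

def A(k):
    """A(k) = I_1(k) / I_0(k); the exponential scaling factors cancel."""
    return i1e(k) / i0e(k)

def kappa_mle(rbar):
    """Solve A(k_hat) = rbar, equation (2.2.13), by bracketing the root."""
    if rbar <= 0.0:
        return 0.0
    return brentq(lambda k: A(k) - rbar, 1e-9, 1e6)

theta = np.deg2rad([5.0, 14.0, 351.0, 10.0])
rbar = np.hypot(np.cos(theta).sum(), np.sin(theta).sum()) / len(theta)
print(rbar, kappa_mle(rbar))    # r is close to 1 here, so k_hat is large
```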

Figure 2.2.1, taken from Upton (1970), illustrates the measures R and X with their relationship to C and S from (1.4.2); δ is the angle between R and X.

[Figure 2.2.1: Statistics R, X, C and S]

Clearly from Figure 2.2.1,

C = R\cos\bar{\theta}, \qquad S = R\sin\bar{\theta}, \qquad R^2 = C^2 + S^2

The estimates of μ₀ and k by the method of moments are the solutions of

\frac{C}{N} = A(k)\cos\mu_0, \qquad \frac{S}{N} = A(k)\sin\mu_0 \qquad (2.2.15)


which give the same results as the maximum likelihood estimates. Hogg and Craig (1965) have shown that θ̄ and R are jointly complete sufficient statistics for μ₀ and k. However, if k is known then R cos θ̄ and R sin θ̄ are minimal sufficient statistics for μ₀, which implies that θ̄ itself does not contain all the information about μ₀. This underlines the difficulty in constructing an optimal criterion for estimating the circular mean direction. Yet if μ₀ is known then C is a complete sufficient statistic for k, and C/N an unbiased estimate of A(k).

2.3 The Distribution of R for the Uniform and von Mises Distributions

The distribution of R for the Uniform and von Mises distributions was derived and

discussed by Stephens (1962a, 62b), Upton (1970) and Mardia (1972) and therefore

will not be fully reiterated here. A brief summary of results, however, will be given

in order that they may be utilised in later chapters.

2.3.1 Preliminary Results

Using the notation of Mardia (1972):

(a) Let ψ(ρ, Φ) be the characteristic function of a continuous two-dimensional random variable (x, y), where

x = r\cos\theta, \qquad y = r\sin\theta \qquad (2.3.1)

\psi(\rho, \Phi) = E[\exp\{i\rho r\cos(\theta - \Phi)\}] \qquad (2.3.2)

The joint density of r and θ is given by

g(r, \theta) = \frac{r}{(2\pi)^2}\int_0^{\infty}\!\int_0^{2\pi}\exp[-i\rho r\cos(\theta - \Phi)]\,\rho\,\psi(\rho, \Phi)\,d\rho\,d\Phi \qquad (2.3.3)


Integrating over θ gives the density of r,

p(r) = \frac{r}{2\pi}\int_0^{\infty}\!\int_0^{2\pi} J_0(\rho r)\,\psi(\rho, \Phi)\,\rho\,d\rho\,d\Phi \qquad (2.3.4)

where J_m(x) is the standard Bessel function of order m and real argument. Equation (2.3.4) is described as an inversion formula for the distribution of r.

(b) Let θ_j, j = 1, 2, ..., N, be distributed independently with probability density function f_j(θ), j = 1, 2, ..., N, and of unit length, and

C = \sum_{j=1}^{N}\cos\theta_j, \qquad S = \sum_{j=1}^{N}\sin\theta_j

The joint characteristic function of (C, S) is given by

\prod_{j=1}^{N}\psi_j(\rho, \Phi) \qquad (2.3.5)

where ψ_j(ρ, Φ) is the joint characteristic function of (cos θ_j, sin θ_j).

Hence, from equation (2.3.5), the probability density function of R is given by

p(R) = \frac{R}{2\pi}\int_0^{\infty}\!\int_0^{2\pi} J_0(\rho R)\left(\prod_{j=1}^{N}\psi_j(\rho, \Phi)\right)\rho\,d\rho\,d\Phi \qquad (2.3.6)

2.3.2 Distribution of R for the Uniform Distribution

To enable the construction of the distribution of R when the observations are taken

from the von Mises we shall initially consider the special case of the von Mises when

k = 0 and the distribution is Uniform.


The problem of finding the density of R is analogous to the problem of random

walk. Pearson (1905) required the probability that after N steps a man is at a

distance between R and R + δR from his starting point, O. Here his steps, l, are

regarded as of unit length.

Using (1.3.3), the Uniform distribution, in (2.3.2), the characteristic function of (cos θ_j, sin θ_j) is given by

\psi_j(\rho, \Phi) = \frac{1}{2\pi}\int_0^{2\pi}\exp[i\rho\cos(\theta - \Phi)]\,d\theta = J_0(\rho) \qquad (2.3.7)

From (2.3.5) the characteristic function of (C, S) is given by

J_0^N(\rho) \qquad (2.3.8)

Then, substituting (2.3.8) into the inversion formula (2.3.6) for R, the probability density function of R for the Uniform distribution is

p_u(R) = R\int_0^{\infty} u\,J_0(Ru)\,J_0^N(u)\,du \qquad (2.3.9)

where

p_u(R) = 0 \quad \text{for } R > N

The integral (2.3.9) is often referred to as Kluyver’s integral (1906). The asymptotic

solution of Pearson had been obtained already by Lord Rayleigh (1880). Pearson

(1906) gave another proof of Kluyver's result. Rayleigh (1919) used Kluyver's

technique to obtain the solution to the problem in three dimensions, or random

flights. Tables of pu(R) for differing N are given by Greenwood and Durand (1955)

and updated and extended by Durand and Greenwood (1957). Asymptotic

approximations will be discussed in Chapter 4.
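Although the integral (2.3.9) is awkward to evaluate directly, the distribution it describes is easy to explore through its random-walk interpretation. The sketch below, assuming NumPy, simulates the resultant length R of N unit steps with uniform directions; it is an illustration, not an evaluation of Kluyver's integral.

```python
import numpy as np

rng = np.random.default_rng(2)
N, reps = 6, 100_000

theta = rng.uniform(0.0, 2.0 * np.pi, size=(reps, N))       # uniform angles
R = np.hypot(np.cos(theta).sum(axis=1), np.sin(theta).sum(axis=1))

print(R.max() <= N)                 # R never exceeds N, as p_u(R) = 0 for R > N
print((R ** 2).mean())              # approximately N, since E(R^2) = N when k = 0
print(np.quantile(R, [0.5, 0.95]))  # sample quantiles of the random-walk resultant
```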


2.3.3 Distribution of R for the von Mises Distribution

The joint probability density function of C and S is given by

g(C, S) = \frac{1}{2\pi I_0^N(k)}\exp[k\mu C + k\nu S]\,\frac{p_u(R)}{R} \qquad (2.3.10)

where

\mu = \cos\mu_0, \qquad \nu = \sin\mu_0

On transforming C and S to θ̄ and R by C = R cos θ̄ and S = R sin θ̄ in (2.3.10), the joint p.d.f. of θ̄ and R is seen to be

g(\bar{\theta}, R) = \frac{1}{2\pi I_0^N(k)}\exp[kR\cos(\bar{\theta} - \mu_0)]\,p_u(R), \qquad 0 \le \bar{\theta} < 2\pi, \quad 0 \le R \le N \qquad (2.3.11)

Integrating with respect to θ̄, the p.d.f. of R for the von Mises distribution is given by

p_v(R) = \frac{I_0(kR)}{[I_0(k)]^N}\,p_u(R), \qquad 0 \le R \le N \qquad (2.3.12)

where p_u(R) denotes the p.d.f. for the Uniform distribution given by (2.3.9).

Equation (2.3.12) is due to Greenwood and Durand (1955). Asymptotic

approximations will be discussed in Chapter 4.

2.4 Combining von Mises Distributions

Let θ₁, θ₂ be independently distributed as von Mises M(μ₀, k₁), M(ν₀, k₂) respectively. The probability density function of θ = θ₁ + θ₂, using the convolution formula, is given by

\frac{1}{4\pi^2 I_0(k_1)\,I_0(k_2)}\int_0^{2\pi}\exp[r\cos(\xi - \beta)]\,d\xi \qquad (2.4.1)


where

r\cos\beta = k_1 + k_2\cos(\theta - \alpha), \qquad r\sin\beta = k_2\sin(\theta - \alpha), \qquad \alpha = \mu_0 + \nu_0 \qquad (2.4.2)

Proof of Equation (2.4.1), Outlined by Mardia (1972)

The convolution formula is given by

\int_0^{2\pi} f_1(\phi)\,f_2(\theta - \phi)\,d\phi \qquad (2.4.3)

= \int_0^{2\pi}\frac{1}{2\pi I_0(k_1)}\exp[k_1\cos(\phi - \mu_0)]\;\frac{1}{2\pi I_0(k_2)}\exp[k_2\cos(\theta - \phi - \nu_0)]\,d\phi

= \frac{1}{4\pi^2 I_0(k_1)\,I_0(k_2)}\int_0^{2\pi}\exp[k_1\cos(\phi - \mu_0) + k_2\cos(\theta - \phi - \nu_0)]\,d\phi \qquad (2.4.4)

Taking the exponential term of the integral,

k_1\cos(\phi - \mu_0) + k_2\cos(\theta - \phi - \nu_0) \qquad (2.4.5)

Let ξ = φ − μ₀. Then (2.4.5) becomes

k_1\cos\xi + k_2\cos(\theta - \xi - \mu_0 - \nu_0) \qquad (2.4.6)

Expanding and using the sine and cosine rules produces

[k_1 + k_2\cos(\theta - \mu_0 - \nu_0)]\cos\xi + [k_2\sin(\theta - \mu_0 - \nu_0)]\sin\xi \qquad (2.4.7)

Using the equalities (2.4.2), this is

[r\cos\beta]\cos\xi + [r\sin\beta]\sin\xi = r\cos(\xi - \beta) \qquad (2.4.8)


Replacing (2.4.8) into (2.4.4) gives equation (2.4.1).

As

\int_0^{2\pi}\exp[k\cos\theta]\,d\theta = 2\pi I_0(k) \qquad (2.4.9)

(2.4.1) may be reduced, i.e. the integral \int_0^{2\pi}\exp[r\cos(\xi - \beta)]\,d\xi is of the form \int_0^{2\pi}\exp[k\cos\theta]\,d\theta with k = r, where

r = \left\{[k_1 + k_2\cos(\theta - \alpha)]^2 + [k_2\sin(\theta - \alpha)]^2\right\}^{1/2} = \left[k_1^2 + k_2^2 + 2k_1 k_2\cos(\theta - \alpha)\right]^{1/2} \qquad (2.4.10)

From (2.4.9), (2.4.4) now becomes

\frac{2\pi I_0(r)}{4\pi^2 I_0(k_1)\,I_0(k_2)} = \frac{1}{2\pi I_0(k_1)\,I_0(k_2)}\,I_0\!\left(\left[k_1^2 + k_2^2 + 2k_1 k_2\cos(\theta - \alpha)\right]^{1/2}\right) \qquad (2.4.11)

If k₁ = k₂ = k, and using cos 2θ = 2cos²θ − 1, (2.4.11) becomes

\frac{1}{2\pi I_0^2(k)}\,I_0\!\left[2k\cos\tfrac{1}{2}(\theta - \alpha)\right] \qquad (2.4.12)

The expression (2.4.12) is not the density of a von Mises distribution, i.e. the convolution of von Mises distributions is not a von Mises distribution. However, expression (2.4.12) may be approximated by a von Mises distribution. Without loss of generality, let μ₀ and ν₀ equal zero; the distributions M(0, k₁) and M(0, k₂) may now be approximated by the wrapped Normal distribution. The wrapped Normal, as discussed in Section 1.3, holds the additive property, therefore two wrapped Normals give another wrapped Normal with parameter σ₃² = σ₁² + σ₂². This distribution can then be approximated by M(0, k₃) where k₃ is the solution of

A(k_3) = A(k_1)\,A(k_2) \qquad (2.4.13)


Hence,

(θ₁ + θ₂) mod 2π is approximately M(μ₀ + ν₀, k₃)

and    (2.4.14)

(θ₁ − θ₂) mod 2π is approximately M(μ₀ − ν₀, k₃)

Stephens (1963) has shown numerically that this approximation is satisfactory,

although reducing in accuracy as k decreases.
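The value k₃ in (2.4.13) is likewise found numerically. The following sketch, again assuming SciPy and using the same Bessel-ratio idea as above, solves A(k₃) = A(k₁)A(k₂) for given k₁ and k₂.

```python
from scipy.special import i0e, i1e
from scipy.optimize import brentq

def A(k):
    """A(k) = I_1(k) / I_0(k), computed with scaled Bessel functions."""
    return i1e(k) / i0e(k)

def combined_k(k1, k2):
    """Concentration k3 of the approximating von Mises distribution for
    theta_1 +/- theta_2, solving A(k3) = A(k1) * A(k2), equation (2.4.13)."""
    target = A(k1) * A(k2)
    if target <= 0.0:
        return 0.0
    return brentq(lambda k: A(k) - target, 1e-9, 1e6)

print(combined_k(2.0, 3.0))   # noticeably smaller than either k1 or k2
```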

2.5 A Summary of Exact and Approximate Moments of R

Full details of the exact and approximate moments of R may be found in Upton (1970) and Mardia (1972). Here only those necessary for the improvement and examination of tests discussed in later chapters will be given.

The distribution (2.3.12) cannot be used directly to obtain the expected values of R; however, for large or small values of k, (2.3.12) may be replaced by approximating expressions from which the expectation of R for differing k may be calculated.

Stephens (1969) was the first to suggest and undertake the method of repeated differentiation of the probability density function to obtain the exact even moments of R. Upton (1970), having defined S and C by

S = \sum_{i=1}^{N}\sin\theta_i, \qquad C = \sum_{i=1}^{N}\cos\theta_i, \qquad \text{where } R^2 = S^2 + C^2 \qquad (2.5.1)

utilised the moment generating functions of S and C to obtain their exact expectations and in turn produce the expectation of R², as did Stephens, as

E(R^2) = N + N(N - 1)\rho^2 \qquad (2.5.2)

where ρ = A(k) is the population mean vector length.


As k → 0, then ρ → 0 and E(R²) → N, a result expected for the Uniform distribution. As the distribution becomes more concentrated about its angular mean direction (k → ∞), then ρ → 1 and E(R²) → N².
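The exact moment (2.5.2) lends itself to a quick Monte Carlo check. In the sketch below, which is illustrative only and assumes SciPy, samples are drawn from M(0, k), ρ is taken as A(k), and the average of R² is compared with N + N(N − 1)ρ².

```python
import numpy as np
from scipy.stats import vonmises
from scipy.special import i0e, i1e

rng = np.random.default_rng(1)
N, k, reps = 10, 2.0, 50_000
rho = i1e(k) / i0e(k)                         # population mean vector length A(k)

theta = vonmises.rvs(k, size=(reps, N), random_state=rng)
R2 = np.cos(theta).sum(axis=1) ** 2 + np.sin(theta).sum(axis=1) ** 2

print(R2.mean())                              # Monte Carlo estimate of E(R^2)
print(N + N * (N - 1) * rho ** 2)             # exact value from (2.5.2)
```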

Using both these approaches, the exact expectation of R⁴ has also been calculated and is given as equation (2.5.3).

Upton (1970) gives several approximations to the expectation of R by equating distribution approximations given by Watson and Williams (1956) and Stephens (1969) to their associated expected chi-squared values for large and small values of k. The expectation of R for the von Mises distribution, as N → ∞, is given by

E(R) = N\rho + \frac{1 - \rho^2}{2\rho} + O(N^{-1}) \qquad (2.5.4)

From the Watson and Williams approximation an expression for E(R) for large k follows as equation (2.5.5), and from the Stephens approximation, also for large k, as equation (2.5.6).


If k and N are large, by substituting the approximations to the Bessel functions discussed later in Chapter 3,

E(R) \approx N - \frac{N}{2k} \qquad (2.5.7)

which is in close agreement with the result (2.5.5).

If k is small, without being too small for the approximation to the ratio of Bessel functions to be invalid.


CHAPTER 3

LIKELIHOOD RATIO TESTS AND APPROXIMATIONS TO THE MAXIMUM LIKELIHOOD ESTIMATES

3.1 The Method of Maximum Likelihood and Likelihood Ratio Tests

In the early 1920's, R A Fisher proposed a general method of estimation, called the method of maximum likelihood. Fisher demonstrated the advantage of this method by showing that (1) it yields sufficient estimates whenever they exist, and (2) it yields estimates which are asymptotically (when N → ∞) minimum variance unbiased estimators. In principle, the method of maximum likelihood consists of selecting that value of the parameter θ under consideration for which f(x₁, x₂, ..., x_N; θ), the probability of obtaining the sample values, is a maximum.

The joint likelihood of the N observations θ₁, θ₂, ..., θ_N from a von Mises distribution with parameters k and μ₀ is

L(\mu_0, k) = \frac{1}{[2\pi I_0(k)]^N}\exp\left[k\sum_{i=1}^{N}\cos(\theta_i - \mu_0)\right] \qquad (3.1.1)

as was given and used in Section 2.2.

Likelihood ratio tests utilise maximum likelihood estimates to test whether a particular set of data is consistent with some hypothesis about its underlying distribution. The likelihood ratio test is a uniformly most powerful test. A detailed discussion of these tests, originally formulated by Neyman and Pearson, can be seen in Kendall and Stuart (1967).


The likelihood ratio test provides a means by which a null hypothesis can be tested

against an alternative hypothesis. A null hypothesis k = k₀, μ₀ = ξ₀ may be tested against an alternative hypothesis k = k₁, μ₀ = ξ₁, parameters of the population given by f(θ; k, μ₀). Let L₀ and L₁ denote the likelihoods of (k₀, ξ₀) and (k₁, ξ₁) given the population with its parameters k and μ₀. Symbolically,

L_0 = \prod_{i=1}^{N} f(\theta_i;\, k_0, \xi_0) \qquad \text{and} \qquad L_1 = \prod_{i=1}^{N} f(\theta_i;\, k_1, \xi_1) \qquad (3.1.2)

These quantities are both values of random variables; they depend on the observed sample values θ₁, θ₂, ..., θ_N, and so does their ratio,

\lambda = \frac{\max\{L_0\}\ \text{under the null hypothesis}}{\max\{L_1\}\ \text{under the alternative hypothesis}} \qquad (3.1.3)

which is referred to as a value of the likelihood ratio statistic λ.

Since max L₀ is apt to be small compared to max L₁ when the null hypothesis is false, the null hypothesis should be rejected when λ is small.

Usually the natural logarithm of the ratio (3.1.3) is taken since, for large N, the distribution of −2 log λ approaches, under very general conditions, the chi-squared distribution with its degrees of freedom given by the number of parameters which are constrained by the null hypothesis. Let, under the null hypothesis, the best estimates of k and μ₀ be k̂₀ and μ̂₀ respectively, where these are either given values specified by the hypothesis or the maximum likelihood estimates of the parameters under the null hypothesis.

- 31 -

Page 46: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

Similarly, let k y and be the corresponding estimates under the alternative

hypothesis. Then, using (3.1.3) the likelihood ratio statistic is

( 3 . 1 . 4 )X -

For observations from the von Mises distribution

X -

It is, therefore, not necessary to rely on being able to derive the distribution of our

test statistic theoretically since a good approximation to the distribution, using

-21og X, is available.

Since this, and as we will see later, several other successive approximations to the

likelihood ratio test statistics are used, invariably the tests are biased. However, by

equating the expectation of the test statistic to its associated chi-square expectation

(given by its degrees of freedom), we may attempt to remove this bias, and hence

obtain a more effective test.

Upton (1973, 76) utilizes this method extensively to improve his statistics for

single-sample and multi-sample tests of the von Mises distribution. (These tests will

be discussed and summarised in Chapter 4.) Many of the test statistics resulting

from the likelihood ratio method could be used without simplication. By using the

various approximations, test statistics which are simpler, both in form and use, may

be derived.

- 32 -

Page 47: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

3.2 Approximations for the von Mises Concentration Statistic, k, when k is

Small

In Chapter 2, we have seen that if 0N are a random sample from M ( f iQik)

then the maximum likelihood estimate of k, k, is the solution of

M * ) RA ( k ) ---------------------- ( c 2 + s 2) *

I „ ( k ) N( 3 . 2 . 1 )

thus

k = A" 1 ( 3 . 2 . 2 )

where the ratio of the Bessel functions I ,(k)/10(k) will be denoted as A (k).

Limited tables of A -1 are given by Mardia (1972, p 298) and Batschelet (1980,

Tables B, C and D) based on those in Gumbel, Greenwood and Durand (1953).

Mardia and Zemroch (1975) gave a computer algorithm for calculating k and other

circular statistics by an iterative process.

In this and section 3.3 several approximations to A ” 1, which do not need tables or

large computing equipment, are given. From these functions the statistic k can be

obtained fairly accurately. The approximations stated here are taken from Dobson

(1978) and Upton (1970). In a similar manner to Upton we shall denote R/N and

X/N by x. This causes no problems since the choice is determined by whether or

not the mean direction is known or not.

Dobson initially states four approximations to A -1 which are global approximations

for all values of k. The first uses Amos (1974) equation that

Page 48: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

and hence that A" 1 (x) is approximately

A 7 ' (x ) [ l - x 2l f l *'

: f +C (1 - x 2) + - h ( 3 . 2 . 4 )

where c = 1.46 to minimize the maximum relative error, and so k can be estimated

using k =

The other three global approximations use a noticed feature that A(x) behaves like

(2/x)tan~1x and so A_ 1(x) is like tan(xx/2). From this Dobson states the

approximation

a ; 1( x ) i + x 2| - 4X | 4 XXXtan 2 ( 3 . 2 . 5 )

Improvements are found by replacing terms in (3.2.5) by minimax values, giving

1A ; 1( x ) 1.32 + X‘ 1.32 - 1.32 tan xx

2~ ( 3 . 2 . 6 )

and

A"1(x) = (1 .28 - 0 .5 3 x 2) t a n xx ( 3 . 2 . 7 )

Compared to these global approximations, approximations to A - 1 (x) for particular

parts of its range were studied by Upton. The power series for the Bessel functions

I 0(x) and I , (x ) for small x gives

A(/c) " 2 ^ ( 3 . 2 . 8 )

On inverting the series and taking the first three terms an approximation of A *00

is given by

a ; 1(x ) - x^2 + x 2 + ] ( 3 . 2 . 9 )

- 34 -

Page 49: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

Using the first two terms

A ^ C x ) - x [2 + x 2 ] - k 6 ( 3 . 2 . 1 0 )

Finally if k is expected to be very small, then we may simply use the first term

approximation of the power series, which gives

A7 1(x ) - 2x - k 7 ( 3 . 2 . 1 1 )

To examine these approximations further Table 3.1 lists the true value of k against

the approximations k , to k 7 between 0 and 1. For quicker and easier appreciation

of the accuracy of the approximations Figures 3.1(a) to 3.1(g) plot the residuals,

k-k'v against the true k. Figures 3.2(a) to 3.2(g) plot the relative percentage errors,

\ k-k{ \ lk, for each of the approximations.

A A A A

From the global approximations, k , , . . . , for values of k less than 1, k 2 and k 4

are clearly the best, with maximum relative errors, illustrated in Figures 3.2(b) and

3.2(d), of 0.71% and 0.84%, respectively at k=l .

A

By far the best approximation obtained from the power series is k 5, for all values of

k < 1, with maximum relative error of 0.35%. For very small values of k (lessA

than 0.2 or R/N less than 0.1), however, k 7 is by far the simplest and quickest

method of estimating. From Figures 3.1(g) and 3.2(g) we can see that, outside the

range of 0 < /: < 0 .2 , k 7 deteriorates rapidly.

A

A disappointing function approximation is k , with maximum relative error of 9.61%

for k=0.05 (the minimum value of k tested). For values of x (ie R/N) near toA A

zero, the global approximations k A are not as good as those obtained from the

power series.

- 35 -

Page 50: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

T A B L E 3 . 1

A P P R O X I M A T I O N S T O S M A L L K

T r u e k X N

A

K 2 * 3 * 4 * 5 * 6 *70 0 0 0 0 0 0 0 0

0 . 0 5 0 . 0 2 5 0 0 . 0 4 5 2 0 . 0 5 0 0 0 . 0 5 1 8 0 . 0 5 0 3 0 . 0 5 0 0 0 . 0 5 0 0 0 . 0 5 0 0

0 . 1 0 0 . 0 4 9 9 0 . 0 9 0 4 0 . 1 0 0 0 0 . 1 0 3 6 0 . 1 0 0 5 0 . 1 0 0 0 0 . 1 0 0 0 0 . 0 9 9 9

0 . 1 5 0 . 0 7 4 8 0 . 1 3 5 7 0 . 1 4 9 9 0 . 1 5 5 4 0 . 1 5 0 7 0 . 1 5 0 0 0 . 1 5 0 0 0 . 1 4 9 6

0 . 2 0 0 . 0 9 9 5 0 . 1 8 1 1 0 . 1 9 9 9 0 . 2 0 7 1 0 . 2 0 0 9 0 . 2 0 0 0 0 . 2 0 0 0 0 . 1 9 9 0

0 . 2 5 0 . 1 2 4 0 0 . 2 2 6 6 0 . 2 4 9 8 0 . 2 5 8 8 0 . 2 5 1 0 0 . 2 5 0 0 0 . 2 5 0 0 0 . 2 4 8 1

0 . 3 0 0 . 1 4 8 3 0 . 2 7 2 3 0 . 2 9 9 6 0 . 3 1 0 3 0 . 3 0 1 0 0 . 3 0 0 0 0 . 2 9 9 9 0 . 2 9 6 7

0 . 3 5 0 . 1 7 2 4 0 . 3 1 8 2 0 . 3 4 9 4 0 . 3 6 1 8 0 . 3 5 0 9 0 . 3 5 0 0 0 . 3 4 9 9 0 . 3 4 4 7

0 . 4 0 0 . 1 9 6 1 0 . 3 6 4 3 0 . 3 9 9 1 0 . 4 1 3 1 0 . 4 0 0 8 0 . 4 0 0 0 0 . 3 9 9 7 0 . 3 9 2 2

0 . 4 5 0 . 2 1 9 5 0 . 4 1 0 6 0 . 4 4 8 8 0 . 4 6 4 3 0 . 4 5 0 5 0 . 4 5 0 0 0 . 4 4 9 6 0 . 4 3 9 0

0 . 5 0 0 . 2 4 2 5 0 . 4 5 7 2 0 . 4 9 8 4 0 . 5 1 5 4 0 . 5 0 0 1 0 . 5 0 0 0 0 . 4 9 9 3 0 . 4 8 5 0

0 . 5 5 0 . 2 6 5 1 0 . 5 0 4 1 ' 0 . 5 4 8 0 0 . 5 6 6 3 0 . 5 4 9 6 0 . 5 4 9 9 0 . 5 4 8 8 0 . 5 3 0 2

0 . 6 0 0 . 2 8 7 3 0 . 5 5 1 3 0 . 5 9 7 5 0 . 6 1 7 1 0 . 5 9 9 1 0 . 5 9 9 9 0 . 5 9 8 2 0 . 5 7 4 2

0 . 6 5 0 . 3 0 9 0 0 . 5 9 8 9 0 . 6 4 6 9 0 . 6 6 7 8 0 . 6 4 8 4 0 . 6 4 9 8 0 . 6 4 7 4 0 . 6 1 7 9

0 . 7 0 0 . 3 3 0 2 0 . 6 4 6 8 0 . 6 9 6 4 0 . 7 1 8 4 0 . 6 9 7 6 0 . 6 9 9 6 0 . 6 9 6 3 0 . 6 6 0 4

0 . 7 5 0 . 3 5 0 9 0 . 6 9 5 1 0 . 7 4 5 8 0 . 7 6 8 9 0 . 7 4 6 7 0 . 7 4 9 4 0 . 7 4 5 0 0 . 7 0 1 8

0 . 8 0 0 . 3 7 1 1 0 . 7 4 3 8 0 . 7 9 5 2 0 . 8 1 9 2 0 . 7 9 5 8 0 . 7 9 9 1 0 . 7 9 3 2 0 . 7 4 2 1

0 . 8 5 0 . 3 9 0 7 0 . 7 9 2 9 0 . 8 4 4 6 0 . 8 6 9 5 0 . 8 4 4 8 0 . 8 4 8 7 0 . 8 4 1 1 0 . 7 8 1 5

0 . 9 0 0 . 4 0 9 8 0 . 8 4 2 4 0 . 8 9 4 0 0 . 9 1 9 7 0 . 8 9 3 7 0 . 8 9 8 1 0 . 8 8 8 5 0 . 8 1 9 7

0 . 9 5 0 . 4 2 8 4 0 . 8 9 2 4 0 . 9 4 3 4 0 . 9 6 9 8 0 . 9 4 2 7 0 . 9 4 7 4 0 . 9 3 5 4 0 . 8 5 6 8

1 . 0 0 0 . 4 4 6 4 0 . 9 4 2 8 0 . 9 9 2 9 1 . 0 1 9 9 0 . 9 9 1 6 0 . 9 9 6 5 0 . 9 8 1 7 0 . 8 9 2 8

- 36 "

Page 51: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

approximations .for small k against k.

0.03

0.01

0 .40 .2 0. 8 1.0-o.oi

F i g u r e 3 . l C a )

0.05 •

0 .03

0.01

-0.010 .2 0 .4 0-5 o.s 1.0

-0 .0 3 '

-0 .0 5 '

F i g u r e 3 . 1 ( b )

0.05

0.03

0 . 01

"r1.32 tan1.32

-0 .0 5

Figure 3.1(c)

- 37 -

Page 52: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

0 .05

0 .03

0.01-i<=3a

1 .00.801IL Itr -0.01

0 .40.2

0 .03

1.28 - 0 .5 3 r a ) tan

F i g u r e 3 . 1 ( d )

0 .05

0.03

J 0 .0 1

1.00 .80.60 .4£ - 0.01

■0.03'

-0 .0 5 '

F i g u r e 3 . l C e )

0 .05

< 0.01 CM ___cnliic - 0 . 0 1

1 .00 . 80.60 .4

•0.03

-0.05-

Figure 3.lCf)- 38 -

Page 53: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

0.05

£ - 0.01

-0 .03

-0 .0 5

F i g u r e 3 . 1 C g )

F i g u r e s 3 . 2 ( a ) t o C g ) . P l o t s o f t h e r e l a t i v e p e r c e n t a g e e r r o r o f

t h e a p p r o x i m a t i o n s f o r s m a l l k a g a i n s t k .

*3*

7 .0

S .C -

5.0'

0 .4 0.5 0 .3

F i g u r e 3 . 2 ( a )

2.0

tan

0 .2

Figure 3.2Cb)_ 39 _

Page 54: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

4 .0

1.32 tan

3 .0

(Eatr<rUJ

tu>t-it-<UJtr

00 .4 0.S 0.5 1 .0

F i g u r e 3 t. 2 C c ]

2 . 0'

(1 .25 - O.SSr1 ) tan

00 .80.50 .40 . 2

F i g u r e 3 . 2 C d 3

o0 .2 0 .4 o . s o . a

Figure 3.2Ce)

- 40 -

Page 55: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

2.0

0 .4 0.6 0 .6 1.0

F i g u r e 3 . 2 C f )

3.0-

Ui

0.60.60 .40 .2

F i g u r e 3 . 2 C g )

- 41 -

Page 56: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

3.3 Approximations for the von Mises Concentration Statistic, k, when k is

Large

For large values of k the power series expansion of Bessel functions does not provide

a simple approximation to (3.2.1), however, the asymptotic expansion quoted by

Abramowitz and Stegun (1965) may be used

exp[lc]I PW *

( 21c) i

m - 1 (m - l ) ( m - 9) (m - l ) ( m - 9 ) (m - 25) 1 + --------------------------------------------------------------------------

81c 2! (81c)2 3! (81c) 3

( 3 . 3 . 1 )

where m = 4p2. Using this relation

1 1 1A(k) * 1 ------------------------------+ 0 ( k “ 4) ( 3 . 3 . 2 )

2k 8 k 2 8 k 3

Denoting R/N and X/N by x, using Maclaurins theorem the solution of (3.3.2) is

1- = 2(1 - x ) - (1 - x ) 2 - (1 - x ) 3 ( 3 . 3 . 3 )k

On inverting we obtain, as the first three terms

^ - 2 ( r - h o + ? + § (1 + x) ( 3 -3 -4)

9 — 8x — 3x2 aA; 1(*> - 8 ( r - x) = ( 3 - 3 -5)

Using the first two terms of (3.3.3) only

- 4(1 I x) - (3-3-6)If k is expected to be very large, then we may simply use the first term

approximation of the power series, giving

A-1 (x ) - 75-7-=—"— r “ K o ( 3 . 3 . 7 )i o 2 ( l - x ) 10

Upton (1970) gives his solution of (3.3.2) as

- 42 -

Page 57: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

1 3- = 2(1 - x) - (1 - x ) 2 - - ( 1 - x ) 3 ( 3 . 3 . 8 )k 2

On inverting

k = * + — + ^ (3 3 9)2(1 - x) 4 2

5 — 5v + 2v2 a

A71<*> <3 - 3 - 10)

Upton does not explain how he derives equation (3.3.8) and therefore does not state

why the third term is different to that of equation (3.3.3), however, as we shall seeA /V

later in the chapter, the approximation is better than k Q.

Upton (1973) states an approximation suggested by M A Stephens where the variance

of the maximum likelihood estimator is considered, giving

AT2( x ) “ T - ^ - k , 2 ( 3 . 3 . 1 1 )

A A A A

To examine the approximations k 8, . . . .k12 and the global approximations k 1 k A,

given in Section 3.2 for large values of k, Table 3.2 lists the true values of k

between 1 and 20 against these approximations.

A

As with small values of k , the residuals, k-k'v have been plotted for each

approximation for easier appreciation of their accuracy, given in Figures 3.3(a) to

3.3(i). Similarly Figures 3.4(a) to 3.4(i) plot the relative percentage errors for each

of the approximations.

Figures 3.3(a) to 3.3(d) show the residuals for the global approximations, k 1, . . . .k4.

Due to the large residuals shown within these approximations the graph scales, for

analysis of large k t have been increased compared to those of small k (Figures 3.1

(a) to 3.1(g)). Although these scales have increased, k , equation (3.2.4), has such

large residuals and relative errors, compared to the other approximations, that the

- 43 -

Page 58: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

plots 3.3(a) and 3.4(a) still leave the graph. A residual of -0.8183 would be seen

at k=20, and a maximum relative error of 9.6 at approximately k - 4.5. Function

approximation, is unexpectedly poor for both small and large k especially

considering its complex form.

From the remaining global approximations, k 3 has the smallest maximum actual

residual, in the range k - 1 to 20, of 0.2634 at k - 20, in comparison k A has maximum

actual residual of 0.4944 at k=20. However, k A may be seen as the best global

function approximation since it has a maximum relative error of 2.47% at k=20,

while k 3 has a maximum relative error of 3.42 at k=3.85.

O f the five power series functions, from examination of Figures 3.3(e) to 3.3(i), k 8

and k 1 n are the better two approximations, with maximum residuals of 0.048 at

k=3.1, and 0.025 at k - 3.2, respectively, for values of k\ > 2.5. On examination of

the relative percentage errors k , 1 is the best approximation with maximum relative

error of 0.79% at k - 3.1, compared to 1.61% at k - 2.85 for k Q, for values of

k^> 2.5.

A

It is interesting to note the adequacy of k 10, equation (3.3.7), as this is often

quoted as a good approximation for large k. Form Figure 3.3(g) we can see that a

far greater improvement would be seen if 0.25 was added to the approximation,

producing k g.

A A

For almost all k , ......... k , 2 approximations the actual and relative errors fluctuate

most in the range 1 < k < 2.5.

- 44 -

Page 59: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

T A B L E 3 . 2

A P P R O X I M A T I O N S T O L A R G E K

T r u e k XS k 3 *4 "a

CD<

K _ 1 0 k 1 1 K 1 2

1 . 0 0 . 4 4 6 4 0 . 9 4 2 8 0 . 9 9 2 9 1 . 0 1 9 9 0 . 9 9 1 6 1 . 3 6 0 8 1 . 1 5 3 2 0 . 9 0 3 2 1 . 4 3 0 0 1 . 2 4 8 9

1 . 5 0 . 5 9 6 1 1 . 4 7 1 7 1 . 4 9 4 7 1 . 5 2 2 2 1 . 4 8 3 5 1 . 6 3 9 5 1 . 4 8 8 0 1 . 2 3 8 0 1 . 6 9 0 0 1 . 5 5 1 3

2 . 0 0 . 6 9 7 8 2 . 0 3 9 2 2 . 0 1 5 3 2 . 0 3 5 6 1 . 9 8 8 5 2 . 0 1 7 7 1 . 9 0 4 4 1 . 6 5 4 4 2 . 0 5 5 5 1 . 9 4 8 9

2 . 5 0 . 7 6 5 0 2 . 6 2 8 1 2 . 5 5 3 1 2 . 5 6 1 2 2 . 5 0 6 8 2 . 4 6 5 8 2 . 3 7 7 6 2 . 1 2 7 6 2 . 4 9 5 1 2 . 4 1 0 9

3 . 0 0 . 8 1 0 0 3 . 2 2 0 0 3 . 0 9 8 1 3 . 0 9 1 0 3 . 0 3 0 2 2 . 9 5 2 6 2 . 8 8 1 4 2 . 6 3 1 4 2 . 9 7 6 4 2 . 9 0 7 6

3 . 5 0 . 8 4 1 1 3 . 8 0 3 4 3 . 6 4 0 9 3 . 6 1 7 4 3 . 5 5 0 5 3 . 4 5 6 3 3 . 3 9 6 7 3 , 1 4 6 7 3 . 4 7 6 2 3 , 4 1 8 3

4 . 0 0 . 8 6 3 5 4 , 3 7 4 0 4 . 1 7 7 2 4 . 1 3 6 5 1 . 0 6 3 8 3 . 9 6 4 8 3 . 9 1 3 6 3 . 6 6 3 6 3 . 9 8 1 8 3 . 9 3 1 9

4 . 5 0 . 8 8 0 3 4 . 9 3 1 9 4 . 7 0 5 9 4 . 6 4 7 9 4 . 5 6 9 7 4 . 4 7 3 1 4 . 4 2 8 2 4 . 1 7 8 2 4 . 4 8 8 0 4 . 4 4 4 1

5 . 0 0 . 8 9 3 4 5 . 4 7 9 1 5 . 2 2 8 2 5 . 1 5 2 7 5 . 0 6 9 2 4 . 9 7 9 6 4 . 9 3 9 6 4 . 6 8 9 7 4 . 9 9 3 0 4 . 9 5 3 7

5 . 5 0 . 9 0 3 8 6 . 0 1 8 0 5 . 7 4 5 5 5 . 6 5 2 5 5 . 5 6 3 7 5 . 4 8 4 5 5 . 4 4 8 4 5 . 1 9 8 4 5 . 4 9 6 5 5 , 4 6 1 0

6 . 0 0 . 9 1 2 4 6 . 5 5 0 6 6 . 2 5 9 2 6 . 1 4 8 6 6 . 0 5 4 7 5 . 9 8 8 0 5 . 9 5 5 1 5 . 7 0 5 1 5 . 9 9 8 9 5 . 9 6 6 6

6 . 5 0 . 9 1 9 5 7 . 0 7 8 1 6 . 7 7 0 0 6 . 6 4 1 8 6 . 5 4 2 8 6 . 4 9 0 4 6 . 4 9 0 2 6 . 2 1 0 2 6 . 5 0 0 5 6 . 4 7 0 7

7 . 0 0 . 9 2 5 5 7 . 6 0 2 0 7 . 2 7 8 9 7 . 1 3 3 2 7 . 0 2 9 1 6 . 9 9 2 3 6 . 9 6 4 2 6 . 7 1 4 3 7 . 0 0 1 6 6 . 9 7 4 0

7 . 5 0 . 9 3 0 7 8 . 1 2 2 7 7 . 7 8 6 3 7 . 6 2 2 9 7 . 5 1 3 9 7 . 4 9 2 5 7 . 4 6 7 5 7 . 2 1 7 5 7 . 5 0 2 2 7 . 4 7 6 5

8 . 0 0 . 9 3 5 2 8 . 6 4 1 0 8 . 2 9 2 4 8 . 1 1 1 5 7 . 9 9 7 5 7 . 9 9 4 5 7 . 9 7 0 2 7 . 7 2 0 2 8 . 0 0 2 6 7 . 9 7 8 6

8 . 5 0 . 9 3 9 2 9 . 1 5 7 2 8 . 7 9 7 8 8 . 5 9 9 2 8 . 4 8 0 2 8 . 4 9 5 2 8 . 4 7 2 4 8 . 2 2 2 4 8 . 5 0 2 8 8 . 4 8 0 3

9 . 0 0 . 9 4 2 7 9 . 6 7 2 0 9 . 3 0 2 5 0 . 0 8 6 3 8 . 9 6 2 4 8 . 9 9 6 0 8 . 9 7 4 5 8 . 7 2 4 5 9 . 0 0 3 1 8 . 9 8 1 8

9 . 5 0 . 9 4 5 8 1 0 . 1 8 5 2 9 . 8 0 6 6 9 . 5 7 2 7 9 . 4 4 3 9 9 . 4 9 6 5 9 . 4 7 6 2 9 . 2 2 6 2 9 . 5 0 3 3 9 . 4 8 3 1

1 0 . 0 0 . 9 4 8 6 1 0 . 6 9 7 2 1 0 . 3 1 0 2 1 0 . 0 5 8 7 9 . 9 2 4 9 9 . 9 9 6 8 9 . 9 7 7 6 9 . 7 2 7 6 1 0 . 0 0 3 3 9 . 9 8 4 1

1 1 . 0 0 . 9 5 3 4 1 1 . 7 1 8 0 1 1 . 3 1 6 1 1 1 . 0 2 9 3 1 0 . 8 8 5 7 1 0 . 9 9 7 3 1 0 . 9 7 9 8 1 0 . 7 2 9 8 1 1 . 0 0 3 1 1 0 . 9 8 5 9

1 2 . 0 0 . 9 5 7 4 1 2 . 7 3 5 9 1 2 . 3 2 1 2 1 1 . 9 9 9 0 1 1 . 8 4 5 7 1 1 . 9 9 7 8 1 1 . 9 8 1 8 1 1 . 7 3 1 8 1 2 . 0 0 3 1 1 1 . 9 8 7 3

1 3 . 0 0 . 9 6 0 7 1 3 . 7 5 1 4 1 3 . 3 2 5 4 1 2 . 9 6 7 9 1 2 . 8 0 4 8 1 2 . 9 9 8 3 1 2 . 9 8 3 6 1 2 . 7 3 3 6 1 3 . 0 0 3 2 1 2 . 9 8 8 6

1 4 . 0 0 . 9 6 3 6 1 4 . 7 6 4 7 1 4 . 3 2 8 9 1 3 . 9 3 6 0 1 3 . 7 6 3 1 1 3 . 9 9 8 5 1 3 . 9 8 4 9 1 3 . 7 3 4 9 1 4 . 0 0 3 1 1 3 . 9 8 9 5

1 5 . 0 0 . 9 6 6 1 1 5 . 7 7 6 6 1 5 . 3 3 1 9 1 4 . 9 0 3 7 1 4 . 7 2 1 1 1 4 . 9 9 8 9 1 4 . 9 8 6 2 1 4 . 7 3 6 2 1 5 . 0 0 3 2 1 4 . 9 9 0 5

1 6 . 0 0 . 9 6 8 2 1 6 . 7 8 6 7 1 6 . 3 3 4 3 1 5 . 8 7 0 7 1 5 . 6 7 8 4 1 5 . 9 9 8 8 1 5 . 9 8 6 9 1 5 . 7 3 6 9 1 6 . 0 0 2 8 1 5 . 9 9 0 9

1 7 . 0 0 . 9 7 0 1 1 7 . 7 9 5 9 1 7 . 3 3 6 4 1 6 . 8 3 7 4 1 6 . 6 3 5 5 1 6 . 9 9 8 9 1 6 . 9 8 7 7 1 6 . 7 3 7 7 1 7 . 0 0 2 7 1 6 . 9 9 1 5

1 8 . 0 0 . 9 7 1 8 1 8 . 8 0 4 4 1 8 . 3 3 8 8 1 7 . 8 0 4 4 1 7 . 5 9 2 9 1 7 . 9 9 9 3 1 7 . 9 8 8 7 1 7 . 7 3 8 7 1 8 . 0 0 2 8 1 7 . 9 9 2 2

1 9 . 0 0 . 9 7 3 3 1 9 . 8 1 1 4 1 9 . 3 4 0 1 1 8 . 7 7 0 3 1 8 . 5 4 9 0 1 8 . 9 9 9 0 1 8 . 9 8 9 0 1 8 . 7 3 9 0 1 9 . 0 0 2 4 1 8 . 9 9 2 4

2 0 . 0 0 . 9 7 4 7 2 0 . 8 1 8 2 2 0 . 3 4 1 8 1 9 . 7 3 6 6 1 9 . 5 0 5 6 1 9 . 9 9 9 0 1 9 . 9 8 9 5 1 9 . 7 3 9 5 2 0 . 0 0 2 1 1 9 . 9 9 2 7

- 45 -

Page 60: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

0.05

15

-0.05oiUJ<r

- 0.10

-0 .1 5

- 0.20

-0 .2 5

- 0 . 2 0

F i g u r e 3 . 3 ( a )

0.05

13 14 15

-0 .0 5

« - 0.10

-0 .3 5

- 0 . 2 0

-0 .2 5

F i g u r e 3 . 3 C b 3

0.15

0.10

0.C5

0

-0 .0 5

1.32 1 . 2 2 :an(—-

-Q.1S

Figure 3.3Cc)- 46 -

Page 61: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

0.20

0-25

0 .20

(1 -2 8 - O .S jr1 ] tan (-4r

0 .15

ce 0.10

0.05

-0 .0 5

F i g u r e 3 . 3 C d 3

0.05

«c= 3ah*UiUicr

-0 .0 5

a c 1 - r J

F i g u r e 3 . 3 C e )

0.15-

0.10

0.05

0

-Q.05-

- 0 . 10'

-0 .1 5 '

Figure 3.3(f]- 47 -

Page 62: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

0.40

0.35

0.30

0 .2 5

0 . 20-

0 .1 5

0.10

0.05

1 2 3 4 5 6 7 6 9 10 11 12 13 14 15

F i g u r e 3 . 3 C g )

12

-0 .0 5

- 0 .1 0

- 0 .2 0

-0 .2 5

F i g u r e 3 . 3 ( h )

0.10

0.05

o12

-0 .0 5

12 1 - r- 0 . 1 0

-0 .1 5

- 0 . 2 0

-0 .2 5

Figure 3.3(i) 48 -

Page 63: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

m e a p p r a x im c j uxu iib T u r i d r g t ; r\ a g a x i i& o c\.

g . n

7 .0

6 . 0

5 .0

CT§ 4 .0crtuUJ>HHt—<_]UJcr

1 -r -

2 .0

0

F i g u r e 3 . 4 C a )

5.0

4 .0

3 .0

2 .0tan

F i g u r e 3 . 4 C b )

4.0-

irr22 tancracrcrw 2.0

1.32

UJ>Mt—<- IUJcr

015

Figure 3.4Cc)- 49 -

Page 64: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

2 .0a:ocrcrUJ

(1 .28 - 0 .5 3 ra J tan

ui> 1 .01-4»—<_JUJcr 0

F i g u r e 3 . 4 ( d )

3 .0ccacr£ 2.0tu><tucr

014

F i g u r e 3 . 4 C e )

6.0

s.o

~ 4 .0CCacrcr" 3 .0tu >HHt—<:

j 2 . 0UJcr

0

Figure 3.4Cf)

- 5 0 -

Page 65: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

E.O

5 .0

4 .0

CCOCECCUI

3 .0

UJ>l-tt—<—1tuCC

2.0

01512 13

F i g u r e 3 . 4 C g )

3 . O'

ccaccccUJ

4 ( 1 - r J2 . 0-

tu>*-4>—<UJcc

Q14

F i g u r e 3 . 4 ( h )

4 .0

3 . 0cracccctu 1 - r

2 .0tu>t—<

010

Figure 3.4(i)- 51 -

Page 66: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

3.4 The Best Approximation for the von Mises Concentration Statistic, k, in

the Range 1 < k < 2.5

From examining the results and graphs for the approximations for large and small k t

all the power series functions are very poor estimators in the range 1 < k < 2 .5 .

From the global approximations, Figures 3.3(a) to 3.3(d) and 3.4(a) to 3.4(d), k , is

a poor approximation and may be removed. Taking a closer examination of k 2, k 3

and k A in the range 1 < k < 2.5, the residuals for these are given in Figures 3.5(a)

to (c) respectively. Here we can see that all three are good approximations,

however, k A is the best with maximum residual of 0.0168 at k - 1.6. The maximum

percentage relative errors for k v k z and k 4 in this range are, 2.125%, 2.446% and

1 .12% respectively.

- 52 -

Page 67: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

approximations to k in tne range

0 .0 5 .

0.03

0-01cS

2.1 2 .3 2 .5

•0.03

-0 .0 5

F i g u r e 3 . 5 ( a )

0 .05.

irrtan

0.01

02 .3 2 .5

- 0.01

-0 .0 3

-0.05

F i g u r e 3 . 5 ( b )

0.05

(1 .2 8 - 0 .5 3 ra ) tan0 .03

0.01

02.3

- 0 . 0 2

-0 .0 5

Figure 3.5Cc)- 53 -

Page 68: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

3.5 Summary of Approximations

For the twelve approximations examined, Table 3.5.1 gives the best three

approximations of k with their respective ranges. Also shown are the best 'simple'

approximations of k.

TABLE 3.5.1

Range o f k

Best Function

Approx im ation Range o f k

Best 'S imple'

Funct ion

Approximat ion

0 < k < 1.25

or

0 < r < 0.528

5 r 4r [2+ r 2 + ^ i - ]

0 < k < 0 . 2

or

0 < r < 0 .1

2 r

0 .2 < k < 1.45

or

0.1 < r < 0.584

r [ 2+ r 2]1.25 < k « 2.45

or

0.528 < r < 0.759

[ 1 . 2 8 - 0 . 5 3 r 2] t a n ™

k > 1.45

or

r > 0.584

1

1 - r 2k > 2 .45

or

r > 0.759

5 -5 r+ 2 r 24 ( l - r )

Figures 3.6(a) and (b) plot the absolute residuals for the best and best 'simple'

approximations in their respective ranges as continuous functions.

- 54 -

Page 69: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

ABSO

LUTE

R

ESIO

UA

t. AO

SOLU

TE

RES

IDU

AL

F i g u r e 3 . 6 ( a ) . P l o t o f t h e a b s o l u t e r e s i d u a l s f o r t h e b e s t

a p p r o x i m a t i o n s t o c o n c e n t r a t i o n s t a t i s t i c , k

0.05

0.04

0.03

[1 .2 8 - O .S S r^ tan i

0 .02

0.01

0 .5 1.0 2 .0 2 .5 3 .0 4 .0

F i g u r e 3 . 6 ( b ) . P l o t o f t h e a b s o l u t e r e s i d u a l s f o r t h e b e s t ' s i m p l e '

a p p r o x i m a t i o n t o c o n c e n t r a t i o n s t a t i s t i c , k

0 .09

o.oa

0.07

1-r-0 .06

0 .0 5

0 .04

0.03

0.02

02. 0 4 .0

- 55 -

Page 70: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

Naturally we obtain better and better approximations using further terms in the power

series (3.3.4) and (3.3.9), however we wish to find and use fairly simple

approximations to (3.2.2) in order that tables or large computing equipment are not

required.

3.6 Expansion of [ I0(A )/IQ(B)]N

When calculating the likelihood ratio test for large or small values of k an expansion

and approximation of (3.6.1) is required.

N logIn (A)

I 0( B>( 3 . 6 . 1 )

If A and B are both small we may use the standard series expansion of Bessel

functions quoted by Bickley (1957) to obtain

i „<a >

I n ( B )

A2 A41 + — + +

4 64

1 + (A2 - B2) + 164

B2 B4 1 + — + — +

4 64

(A4 - 4A2B2 + 3B4) ( 3 . 6 . 2 )

For (3.6.1) the power series for log(l+x) is used to produce

N logI o (A) N

1 o ( B) 4(A 2 - B2) -

N

64(A4 - B4) ( 3 . 6 . 3 )

For very small values of A and B the second term may be neglected to give

N log' lo (A) ' N

1 o (B) 4(A2 - B2) ( 3 . 6 . 4 )

- 56 -

Page 71: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

For large values of A and B the Bessel functions are expanded using the asymptotic

expansion (3.3.1) to obtain

M A> B i 1 1 1 1— exp[A - B] • 1 + - — ------- + --------

I 0<B> A 8 A B 128

9 2 7

A2 AB B2

For (3.6.1) we again expand the logorithm as a power series.

1 (A + B)’l 0(A) N BN log- , « - log -

I 0(B) 2 A+ N(A - B) 1 -

8AB 16A2B2

( 3 . 6 . 5 )

( 3 . 6 . 7 )

For very large A and B we may neglect the higher powers of 1/A and 1/B to give

’ l 0(A> N BN log . « - log -

I 0 (B) 2 A+ N(A - B) ( 3 . 6 . 7 )

- 57 -

Page 72: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

CHAPTER 4

TH E DEVELOPMENT OF CIRCULAR ANALYSIS OF VARIANCE TECHNIQUES

4.1 Introduction

The analysis of data of a circular or spherical nature began when Lord Rayleigh

(1880) developed a one-sample test for uniformly distributed random vectors using

the resultant length. Investigation into circular distributions other than the uniform

distribution began in the early part of this century. The most important of which,

and the assumed circular distribution for much of this thesis, was the von Mises

distribution. True development of significance tests, however, did not appear until

Fisher (1953), whilst investigating the remanent magnetism of a sample of rock

specimens, considered the spherical analogue of the von Mises distribution where

observations are regarded as points on a sphere. Fisher derived the maximum

likelihood estimates of the concentration parameter and the mean direction and

provided the basic distribution theory in order to test a prescribed mean direction

when k is unknown. Watson (1956) gave a significance table for the test of k - 0

i.e. uniformity, and approximate tests for the equality of concentration parameters

and mean directions. As discussed in Chapter 2.3 Greenwood and Durand (1955)

utilised Fishers work to produce a similar distribution theory for the circular case.

In 1956 Watson and Williams derived tests for the direction and homogeneity in both

the two- and three-dimensional cases. Their exact test for the two-dimensional case

is summarised in the following section.

- 58 -

Page 73: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

4.2 Exact and Approximate Tests of Significance from Watson and Williams

4.2.1 An Exact Test

Stephens (1972) produced an exact two-sample test for the null hypothesis that two

samples, size N 1 and N 2, have identical modal vectors, assuming they have the same

unknown value of k. For each sample the resultant lengths are found and the test

statistic Z calculated

R, + R2Z -------- ( 4 . 2 . 1 )

N

The resultant R of R , and R 2 is also found and W = R/N calculated.

The test consists of finding a critical value z satisfying Pr(Z > z/W ) = a, for

appropriate significance level a. Tables for the critical value z are presented by

Stephens (1972) and the null hypothesis is rejected at level a if Z = z. The exact

test is a conditional test based on the joint distribution of R 1t R 2 given R i.e.

P u ( R , ) P u ( R 2 )f ( R , , R 2 lR)--------------------------------- '■-------------1------------------------- ( 4 . 2 . 2 )

*Pu<R>{[(R, + R2) 2 - r2HR2 - (R, - R2>2]}*

where

0 < R 1 < n, 0 < R 2 < n 2 |R , - R 2 I < R < R , + R 2

and pu(R) is given by equation (2.3.9). Equation (4.2.2) was derived by Watson and

Williams (1956) for the circle following the derivation for the sphere by Fisher

(1953).

4.2.2 Approximate Tests

If 0 is an observation from the von Mises distribution with mode at zero then for

large k we have shown, (2.2.4), that 0 J~k is approximately N(0,1) and hence k 6 2

- 59 -

Page 74: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

to be chi-squared distribution with 1 degree of freedom. Using this result we may

approximate for 0 2 to obtain the result that 2fc(l-cos0) is chi-squared distributed

with 1 degree of freedom.

Watson and Williams used this result and the additive properties of x 2 distributions

to produce the approximations

2 /c(l - cos 0 ) »

2/c(N - X) ( 4 . 2 . 3 )

2k(N - R)

for single sample tests.

In the two sample case, if 0 1 ( the mean direction for sample 1) equals 0 2 (the

mean direction for sample 2) then

R 1 + R 2 = R

However, if 0 1 * 0 2

then R 1 + R 2 > R

Therefore the greater the difference between 0 1 and 0 2 the greater the value of

R 1 + R 2 - R. Via equations (4.2.3), Watson and Williams showed

2/c(R1 + R2 - R) « x 2

Equations (4.2.4) are independently distributed. (Further proof of (4.2.4) and (4.2.3)

can be seen in Mardia (1972 p.114)).

- 60 -

2k(N - (R, + R2) ) ( 4 . 2 . 4 )

qwhere

J -l

Page 75: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

Watson and Williams then stated that to test the equality of mean vectors for several

samples (for large k ), generalising Watson (1956), we should use

q

In 1962 two papers by M A Stephens produced exact and approximate tests for the

null hypothesis, concerning single sample tests, that the mean vector is a given

vector when k is unknown. For the exact test, given in section 4.2.1, Stephens

produced nomograms for differing significance levels to find R 0 given N and X,

where Pr(P > R 0 |X) = a . X being the component of R on 0, when /x 0 is known

(Figure 2.2.1).

Stephens' three approximate tests were also for the above null hypothesis, for

different ranges of N and X. For large concentration parameter the approximate test

was given by

calculating R 0 from (4.3.1). If R > R 0 , reject the null hypothesis.

Equation (4.3.1) was first suggested by Watson and Williams using the equations of

(4.2.3). Using these equations the chi-squared approximations may also be shown as;

( 4 . 2 . 5 )q

where q is the number of samples.

4.3 Hypothesis Testing Concerning the Mean Direction

<R0 - *>(N - 1) ( 4 . 3 . 1 )

2/c(N - X) = 2/c(R - X) + 2k(N - R) ( 4 . 3 . 2 )

- 61 -

Page 76: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

For large k, this obeys the x 2 decomposition property parallel to

( 4 . 3 . 3 )

where x iy...,xN is a random sample from N(/t,cr) for linear statistics.

Equation (4.3.1) is therefore similar to the F-statistic for linear statistics, which is

the ratio square-root of the first and second terms on the right-hand side of

In 1969 Stephens stated that in practice the asympototic results of (4.2.3) are not

adequate for moderately large values of k. Illustrating this via Pearson curve

approximations and Monte Carlo studies, an improvement in (4.2.3) was found by

examining the expansion of A (k) for large k (equation (3.2.2)). This suggested that

the chi-squared form would be improved by replacing k by y in (4.2.3), where

y k 8k2

and that this should be used for tests when k > 2 .

In 1974 Upton considered further improvements of (4.2.3) by investigating the

distributions of 0, X and R. Taking the expectation of these distributions and

approximating the Bessel functions involved, Upton derived 0, to replace k in

(4.3.3).

1 1 3( 4 . 3 . 4 )

(4.2.3).

1 30 - k 1 ( 4 . 3 . 5 )

4k 16k2

Upton concluded that (1) both y and 0 approximations improve as k increases (2)

both 7 and 0 are considerable improvements over the original, and (3) there is little

to choose between the 0 and y approximations

Page 77: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

Assuming equal concentration parameters k , and k 2 for the two sample case, we are

now interested in testing

Ho : Voi “ 02 “ ( 4 . 3 . 6 )

against

Hi : * 02 ( 4 . 3 . 7 )

where fi0 and k are unknown.

Using Stephens approximation improvement 7 , for large k , (4.2.5) from Watson and

Williams becomes

3•2 “ 1 + — (N -2 )

8k

(R, + R2 - R)

(N 1 JO 1 to

f 1 ,N -2 - 1 + — (N -2 ) ( 4 . 3 . 8 )

where for unknown k , k can be replaced by its maximum likelihood estimate,

k - A" 1

given by (3.3.11) or (3.3.10).

For value of k > 10 (R/N > 0.95) the improvement factor 7 is negligible.

Using the likelihood ratio test procedure, discussed in Chapter 3.1, to test H 0 and

H 1 defined by (4.3.6) and (4.3.7) we may obtain a test statistic for small k, assumed

equal, given by

log X - N logI 0( * i )

M * o >

p 2

+ k Q £ £ cos(0 i j - 0)

i-1 j-1

P 2

" k ' I I cos(0 i j - 0j )

i-1 j-1

( 4 . 3 . 9 )

- 63 -

Page 78: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

From (1.4.11)

P 2^ ^ c o s ( 0 j j - 0) - R

i - 1 j - 1

P 2

^ ^ c o s (0 } j - 0j ) - R, +

i - 1 j - 1

Using (3.2.11) for small k

k n - —2R

N

2(R, + R2)

N

( 4 . 3 . 1 0 )

where

N - Nt + N2

Using approximation (3.6.4) we obtain the test statistic

“2 log X - jj (R, + R2) 2 - R2 = X] ( 4 . 3 . 1 1 )

In the same manner the likelihood ratio test of the equality of the mean directions

of two samples having unknown concentration parameters, assumed to be equal and

large, may be obtained from (4.3.9).

With k assumed large we may use the approximation (3.3.7) to give

N0 2(N - R)

N2(N - R, - R j)

( 4 . 3 . 1 2 )

- 64 -

Page 79: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

Using the approximation (3.6.7) for large k we obtain our test statistic

Nlog ( 4 . 3 . 1 3 )

Mardia (1972) shows this to be a monotonic function of the F-statistic given by

For both the single and two sample cases Upton (1970, 74), using likelihood ratio

techniques, produced significance tests concerning the mean direction and the

concentration parameter for all permutations of k known or unknown, k large or

small, and mean direction known or unknown. Upton also improved these test

statistics using their expectations and associated degrees of freedom, as discussed in

Chapter 3.1

Many single and two sample tests have not been fully reviewed and reproduced here

as this thesis is concerned with the further development of analysis of variance

techniques and only those having a bearing on such development have been

introduced. An excellent review of single and two sample testing can be seen in

Mardia (1972, p 132-).

4.4 Multi-sample Tests Concerning the Mean Direction

Let 6\j (i= l,2 ......p ,j= l,2 ,...... ,q) be q independent random samples of sizes Nj from

M(/x0j,A:j). Let Rj be the length of the resultant of the jth sample, and R be the

length of the resultant of the whole or combined sample.

We wish to test

against the alternative that at least one of the equalities does not hold. We assume

that k y = ... kq = k , where k is unknown.

(4.3.8).

H0 • ^ 0,1 “ , 2 = ^o,q ( 4 . 4 . 1 )

- 65 -

Page 80: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

4.4.1 For Small Concentration Parameter, k

For values of k in the range 0 to 1 the likelihood ratio test for q samples extended

from (4.3.11) gives

q- 2 Io g X - ^ ( ( £ R j ) 2 - R2> - X2_ j ( 4 . 4 . 2 )

j - 1

Since this test uses the approximation (3.2.11) for small k, which was shown in

Chapter 3 to be unreliable as k-41, the test may be improved by using the

expectation approximations from (2.5.2) and (2.5.4)

E(Rj ) " n jp + k k > 0

E (R j ) - Nj + N j (N j - l ) p 2

in (4.4.2) to produce the test statistic

q

I « ( ( £ R j ) 2 - R2) ) ( 4 . 4 . 3 )

j - 1

where

k 2 q5 - 1 = 1 ---------+ ( 4 . 4 . 4 )

8 2Nk2

4.4.2 For Large Concentration Parameter, k

Using Stephens improvement (4.3.4) and extending Watson and Williams test statistic

of (4.2.5), the new test statistic, under the null hypothesis (4.4.1), becomes

31 + — -

8 k

(N - q) ( J Rj - R)

j - 1

(q - 1) (N - Y Rj )

j - 1

Fq - 1 ,N-q ( 4 . 4 . 5 )

- 66 -

Page 81: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

where k is the m .l.e. of k given in (3.2.2).

Mardia (1972) states that from Monte Carlo trials this approximation is adequate for

k > 1, i.e R/N > 0.45 Stephens (1969, 72), however, shows that the test is adequate

for k > 2, i.e. R/N > 0.7, and requires a further improvement to be satisfactory for

k > 1. Stephens (1972) produced an approximate multi-sample test based on the

exact test given in section 4.2.1, where

R, + R2 + ................ + Rq RZ - ---------------------------------------------- and W — — ( 4 . 4 . 6 )

N N

With test statistic;

z - ( 4 . 4 . 7 )

where

f _ g p (q - * )(N - q)

Let g be the upper percentage point, at level ot, of the F distribution with q-1 and

N -q degrees of freedom, and let D be a parameter taking the following values, for

W between 0.45 and 1.0.

W : 0 .45 0 .50 0.55 0 .60 0 .65 0 .70 0 .80 0 .90 1.00

D : 0 .92 0 .87 0 .84 0.83 0 .82 0 .84 0 .88 0 .93 1.00

If Z > z, reject H 0 at significance level ct

This test is equivalent to (4.4.5) for k > 2, but as (4.4.5) deviates from the

F-distribution below k-2 the factor D is introduced. For values of k > 2 D is the

reciprocal of the factor 1 + (318k).

- 67 -

Page 82: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

Upton (1976) produced an approximate test for q samples, known as the G-test,

related to the joint distribution of Z and W, where Z and W are given by (4.4.6).

The G-test is derived from the method of maximum likelihood which gave (4.3.13)

for the two-sample case. Upton states the test statistic

2N2 logi 1 - w1 - z

N(1 + W) + q Xq-1 ( 4 . 4 . 8 )

suitable when all Nj 10 and R/N > 0.6.

In exactly the same procedure as the two-sample case, k 0 and k , are produced

from (3.3.7) to give

kr, =0 2 (1 - W)

k - 11 2(1 - Z)

Using the approximation (3.6.7) for large k and the log likelihood ratio

( 4 . 4 . 9 )

<-r

'w'O

HH

A A

N log + k 0W - k , ZI 0 (£o>.

we obtain the test statistic

-2 logX = N logj ( 1 - W)( 1 i N

( 4 . 4 . 1 0 )

This may be improved by equating the test statistic to its associated chi-squared

expectation. On expanding the logarithm as a power series we may neglect terms

beyond the first two since W and Z will be smaller than 1, to produce

N - | - w 2 + z + § !

- 68 -

Page 83: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

using the expectations E(R) and E (R 2) given by equations (2.5.4) and (2.5.2).

E(G) “ (q2k 1 “ A(/c) + 4En

On equating with (q-1) and simplifying we obtain the correction

2NN + R + q

to produce the test statistic (4.4.8).

Throughout Uptons paper, conditions, ranges and tables for the statistic W/N are

given, however, this is a notation error and should be read as R/N or simply W.

The above multi-sample tests will be investigated further in Chapter 9 when suitable

comparisons to an alternative test will be given.

4.5 Multi-Sample Tests for the Equality of Concentration Parameters, ^

In this section three tests for the homogeneity of concentration parameters for

differing values of Rj/Nj are given. The construction of tests (4.5.2), (4.5.4) and

(4.5.5) can be seen in Mardia (1972, p 165-). The composite hypothesis under

consideration is

Ho : - .............- *q - k ( 4 . 5 . 1 )

where j i , , iq and k are not specified.

- 69 -

Page 84: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

4.5.1 Rj/Nj < 0.45

Test statistic

q 1 I WJg ’ ( * P

u - “ I - - T 1---------------j - i

q

I wj

j - i

q - i

where

R2R

Ng , ( R j ) = s in - 1 (aRj ) a

1wj 4 (N j - 4)

The test statistic (4.5.2) is based on the approximation

2Rk,

2(1 - a 2k 2)N

which is obtained from the approximation

A(fc) « |

from (3.2.8).

1 - 8

( 4 . 5 . 2 )

( 4 . 5 . 3 )

The functional form of the transformation to normality is

g , ( k ) - s i n " 1 (ak)

under H 0 (4.5.1) the g,(R j) are approximately distributed as independent N(g1(A:),o’j)

where

2 _ 3 _ 1_° j " 4 (N j - 4) “ wj

- 70 -

Page 85: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

4.5.2 0.45 < Rj/Nj < 0.70

Test Statistic

U . - I wjS j - 1

R

iNj

f 3v [R j|L WJS2

U - i kJ ■ *= x,

I -Jj -1

q-1

where

( 4 . 5 . 4 )

1 0.7979

wj <Nj - 3>

R 90 1 O

g 2 - = s inh“ 1 -N C 2

c, = 1.0894

c 2 = 0.25789

Test Statistic (4.5.4) is built in the same manner as for (4.5.2).

4.5.3 Rj/Nj > 0.70

This test is the analogue of Bartlett's test for homogeneity of variance and was given

by Stephens (1972) and Mardia (1972).

Let d j = N j - 1

Test Statistic

D = N-q = ^ d j

J-1

Z, - D{ loge £ (Nj - R j ) } - DlogeD - £ d j lo g e (Nj - R j ) + J d j l o g edj

j - 1 j - 1 J- 1( 4 . 5 . 5 )

- 71 -

Page 86: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

then U 3 = Z y C where

C = 1 +

n

iJ - i

3 (q - 1)

IL is d is t r i b u t e d as y 2 3 ^n-1

The above tests will be used when techniques in later chapters assume that k is

equal in value for all the subpopulations being tested under the composite hypothesis.

- 72 -

Page 87: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

CHAPTER 5

PARAMETER ESTIM ATION LEADING TO M A XIM U M LIKELIHO OD METHODS

FOR LARGER EXPERIMENTAL DESIGNS

In the first four chapters of this thesis, a general review and critique of the

problems, theory, approximations and tests associated with circular statistics have been

discussed. The thesis will now begin to extend these techniques and discuss alternative

methods to construct new tests for experimental designs and their required analysis of

variance.

Gould (1969) gave a regression analysis procedure for use when the dependent

variable is the position of a point on the circumference of a circle or the surface of

a sphere. Gould used an analogue of the normal theory of linear regression for the

circular variable problem, where

0j, i= l,...,N is independently distributed as M ( / i0 + @t'v k) where t j , . . . , tN are known

numbers while f i0, 0 and k are unknown parameters. The maximum likelihood

method, as discussed in Chapter 2, is used to estimate the parameters f iQ and j3.

The logarithm of the likelihood function is

5.1 Introduction

( 5 . 1 . 1 )

N

- constant + k cos( 0 j - / i0 - 0 t i )

i - 1

The maximum likelihood solutions, fiQ and 0 for the parameters are then obtained as

the solutions to the two equations

N

( 5 . 1 . 2 )

i - 1- 73 -

Page 88: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

and

N

^ t | s i n ( 0 j - Jt0 - - 0 ( 5 . 1 . 3 )

i - 1

From (5.1.2)

N

£ s i n ( 0 j - ^ t j )

tan fiQ - ^ ----------------------- ( 5 . 1 . 4 )

^ cos( 0 j - 3 t i )

i - 1

P is obtained by a straightforward iterative procedure where P is an initial estimate

of P and fi0 the corresponding value of /z0 from (5 .1.4), for the iteration

N

J t j s i n ( 0 j - ? 0 - ^ t j )

P - P + ^ -------------------------------------- ( 5 . 1 . 5 )

£ t?cos( 0 j “ ?o ~ ^ i )i - 1

From these estimates is developed an appropriate test for p=0. For this model,

maximum likelihood estimation coincides with least squares estimation.

Johnson and Wehrly (1978) showed that the most serious drawback to Goulds

approach is that the likelihood function has infinitely many large peaks.

In this section a similar modelling approach to Gould will be discussed, however, for

experimental design models, as in linear modelling, constraints will be placed on the

estimating parameters. The maximum likelihood estimates of the model parameters

will be produced for the required hypothesis test for the first time. The attainment

of these not only enables hypothesis analysis but greater data appreciation, as will be

shown in the following sections.

- 74 -

Page 89: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

5.2 Constraints for Circular Models

For the development of experimental models each angular variate in a one-way

classification may be made up of some overall mean direction f iQ, some effect due to

a particular treatment, /3j, and some random variable representing unassignable

(residual) effect, q j. Hence /3j = (fij - / i0) where /ij is the mean direction of the

jth population.

The form of the model which is used can be expressed as;

As in linear analysis of variance the assumptions on the treatment effects are:

a) the treatment terms add on to the mean direction term rather than, for

example, multiplying

b) the treatment effects are constant

c) the observation on one block or unit is unaffected by the treatment applied to

other units.

In linear analysis it is usually convenient to choose the constraint on the treatment

parameters so that they sum to zero; in circular analysis it is correspondingly

convenient to choose the constraint on the angles specifying the treatment parameters

so that their sines sum to zero. A proof and simple example shows the need for

this constraint; using expression (5.2.1) and assuming zero residual

0i j “ + + 6i j ( 5 . 2 . 1 )

P q p q

i - i j - i i-1 j-1( 5 . 2 . 2 )

- 75 -

Page 90: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

ta n 0

q qcos /i0 ^ s in f i j + s in f iQ ^ cos 0j

j-1________________ j ° lq q

cos /i0 ^ cos 0j - s in /i0 I s i n

j- i j - i

under the constraint I s l n h - 0j- i

qs in M0 \ cos 0j

tan 0 - --------------i —5--------------- - tan (iQ ( 5 . 2 . 3 )

COS /l0 I cos /3j

j -1

Example 5.2.1

4

i ) Let fiQ - 100* £ - 0

j -1

and

0, = 48.5* 02 - 19.38* ft - -54 .04*

therefore 0 4 = -13.84* with zero residual

0 M - 148.5* 012 - 119.38* 013 - 45.96* q 4 - 86.16*

Using (1.4.5) to calculate the mean direction gives

0 - 100.582*

4

i i ) However under the co n stra in t ^ s In 0 j — 0

j-1

then 0 4 = -15.745* and 0 14 = 84.255*

to give 0 = 100*

- 76 -

Page 91: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

In the first example the sample mean direction differs from the stated population

mean direction when zero residual effect is assumed. When the constraint is that

the sine of the treatments sum to zero, the sample mean equals the stated population

mean direction, i.e. 0 = fiQ. This must hold for all factor effects in larger circular

experimental designs.

Using this notation each observation 0jj is an independent observation from a von

Mises distributed population with a mean direction of / i0 + /3j and whose

concentration parameter is k , i.e.

0 i j * IVM( ^ 0 + 0 j , k ) ( 5 . 2 . 4 )

where q is the number of treatments, and p is the number of observations on each

treatment, with the constraint

q^ s in /3j - 0 ( 5 . 2 . 5 )

J-1

and IV M is read as 'independently von Mises distributed1.

5.3 One-way Classification

Assuming the fly's (i= l,2...p , j= l,2 ...q ) are independently distributed as

M (fi0 + (3j,k), let us construct a test of:

H0 : (3, - P:

H, : at least one |3j / 0( 5 . 3 . 1 )

Let fiQ and ( / i0, j3j) be the maximum likelihood estimates of ( iQ and / i 0, /?j under

H 0 and H 1 respectively. The logarithm of the likelihood function is given by:

P qlog L - constant + ^ cos( f l j j - /*0 - /3j)

i-1 j-1

( 5 . 3 . 2 )

- 77 -

Page 92: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

a) Under H 0 the maximum likelihood estimate fiQ of nQ is the solution ot:

P q^ J s i n ( 0 j j - J 0) - 0 ( 5 . 3 . 3 )

i -1 j -1

Therefore

P q

I l sin i j

tan un - - tan 8 . . ( 5 . 3 . 4 )P q

I I 008 9iji -1 j -1

Hence Jt0, under H 0, is the overall sample mean direction, 8.

b) Under H 1 the maximum likelihood estimates fiQ and /3j are the solution of:

P q£ £ s in (0 j j - - j3j) - 0 ( 5 . 3 . 5 )

i-1 j-1

and

P^ s in (0 j j - n 0 - /3j) - 0 'q' equations ( 5 . 3 . 6 )

i=*l

qunder the constraint ^ s in |3j — 0

J-1

Let Aj - £ 0 + 0j

Substituting Aj into (5.3.6)

PJ s in (0 j j - Aj) =* 0 ( 5 . 3 . 7 )

i = l

From (1.4.9) Aj is the mean direction of the jth block i.e. 8 j.

- 78 -

Page 93: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

Therefore we may solve for £ 0> 0 ,, (32......... 0q from the q equations:

0- i “ M0 +

0-2 " Mo + h

0 -q “ M0 + 0q

under the c onstra int I s in 3j - 0

j -1

Taking the sine and summing the q equations of (5.3.8) produces

qP q0 . j - s in fL0 2 cos fij

A

+ COS fL0 2 s i n 0j

j -1 j-1J-1

under the given constraint

q^ s in 0.j - s in n0

J-1

^ COS |3j

j-1

Taking the cosine and summing the q equations of (5.3.8) produces

J cos 0.j — cos f l Q

j - 1

^ COS /3j

j -1

Dividing equation (5.3.9) by (5.3.10) gives

qI s in » . j

j -1 s in )t0

q cos /*0COS 0. {2

j - 1- 79 -

( 5 . 3 . 8 )

( 5 . 3 . 9 )

( 5 . 3 . 1 0 )

Page 94: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

Therefore

J-1fiQ — tan ” 1 ( 5 . 3 . 1 1 )q

Hence, under H lt jz0 is the circular mean of the q equations, 0 .j

The 3j treatment effects are calculated directly from the q equations of (5.3.8), given

H0 found by (5.3.11) i.e.

Unlike linear analysis the maximum likelihood of the overall mean direction, n Q,

alters depending on the hypothesis test. Example 5.3.1 shows the parameter

estimation and the following hypothesis test at work for a hypothetical data set

concerning animal orientations.

Example 5.3.1

In an orientation experiment four samples of animals were observed: one was a

control group, the other three were experimental groups. Following treatment, their

direction of movement was noted and reproduced in Table 5.3.1.

( 5 . 3 . 1 2 )

- 80 -

Page 95: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

Table 5.3.1

ControlGroup

Experim ental Group 1

Experim ental Group 2

Exper imental Group 3

156* 168* 137* 214*

111*

•VOC" 184* 155’

174* 102* 222 * 129’

140* 137* 163* 153*

213* 184* 236* 125’

121 * 112* 133* 228*

2 0 0 * 62* 161* 185’

166* 133* 193* 176*

Statistic:

Resultant

R. j

Sample Mean D i re ct ion

' .J k

Control Group 6.7031 159.9322’ 3.415

Exper imental Group 1 6.231 121.5483* 2 .622

Exper imental Group 2 6.5938 177.9278* 3 .182

Exper imental Group 3 6.5938 169.9278* 3.182

Overall sample mean direction, 0.. = 158.3153*

Overall resultant length, R.. = 24.362

Concentration parameter estimate k - 2.464

Circular mean of the individual _group mean directions 6 - 157.8013*

Each of the sample populations have been tested by Watsons statistic (1961),

using the critical values supplied by Stephens (1964), to show von Mises distributed

data sets. Similarly, the homogeneity of the concentration parameters have been

tested and validated via test statistic (4.5.5).

- 81 -

Page 96: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

Parameter Estimates:

Under H 0 of (5.3.1) /t0 = 158.3153* (from (5.3.4))

Under H , of (5.3.1) £ 0 = 157.8013* (from (5.3.11))

0 ,A

^2

e 3

h

. l

. 2

. 3

. A

M0A

M0A

^0y\

Mo

2.1309

-36.253*

20.1265*

12.1265*

I sin 3jj - i

Hypothesis Testing:

The likelihood ratio test of H₀ against H₁ gives the test statistic

$$-2\log_e\lambda \;=\; 2\left\{\, N\log_e\frac{I_0(\hat k_0)}{I_0(\hat k)} \;+\; \hat k\sum_{i=1}^{p}\sum_{j=1}^{q}\cos(\theta_{ij}-\hat\mu_0-\hat\beta_j) \;-\; \hat k_0\sum_{i=1}^{p}\sum_{j=1}^{q}\cos(\theta_{ij}-\hat\mu_0)\right\} , \qquad (5.3.13)$$

where k̂₀ and k̂ denote the maximum likelihood estimates of the concentration parameter under H₀ and H₁, and μ̂₀ and β̂ⱼ are taken under the corresponding hypothesis. This produces the same test statistic as that given by Watson and Williams and built in Chapter 4.3, since

$$\sum_{i=1}^{p}\sum_{j=1}^{q}\cos(\theta_{ij}-\hat\mu_0) \;=\; R_{..}
\qquad\text{and}\qquad
\sum_{i=1}^{p}\sum_{j=1}^{q}\cos(\theta_{ij}-\hat\mu_0-\hat\beta_j) \;=\; \sum_{j=1}^{q}R_{.j} .$$

Using the approximations (3.3.7) and (3.6.7) for large k, the one-way classification test statistic is seen as


$$N\log_e\!\left[\frac{N-R_{..}}{\,N-\sum_{j=1}^{q}R_{.j}\,}\right] \;\sim\; \chi^2_{q-1} , \qquad (5.3.14)$$

which is a monotonic function of the F statistic given by (4.4.5). For Example

5.3.1, applying test statistic (5.3.14) gives:

$$32\log_e\!\left[\frac{7.638}{5.8783}\right] \;=\; 8.38 \;\sim\; \chi^2_3 ; \qquad \chi^2_3(5\%) = 7.81 , \quad \chi^2_3(1\%) = 11.34 ,$$

indicating a significant difference, at the 5% level, between the four treatments shown in Table 5.3.1. Having rejected H₀, the above method has shown the means by which the parameter estimates of a one-way classification design may be obtained and examined further. Figure 5.3.1 illustrates the angular difference between the four effects or treatments, given in Example 5.3.1, against the model-derived mean direction estimate, μ̂₀. Treatment 1 is seen to possess the greatest divergence from μ̂₀.
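The whole test can likewise be reproduced from the raw samples. The sketch below implements the large-k statistic (5.3.14); the helper names are invented here and this is an illustration rather than the thesis' own program. Applied to the four samples of Table 5.3.1 it returns a value of approximately 8.4, as found above.

```python
import numpy as np
from scipy.stats import chi2

def resultant(angles_deg):
    t = np.radians(np.asarray(angles_deg, dtype=float))
    return np.hypot(np.cos(t).sum(), np.sin(t).sum())

def one_way_test(samples):
    """Large-k one-way statistic (5.3.14): N ln[(N - R..)/(N - sum_j R.j)],
    referred to a chi-squared distribution with q - 1 degrees of freedom."""
    N = sum(len(s) for s in samples)
    R_all = resultant([a for s in samples for a in s])    # overall resultant R..
    R_sum = sum(resultant(s) for s in samples)            # sum of sample resultants
    stat = N * np.log((N - R_all) / (N - R_sum))
    return stat, chi2.sf(stat, len(samples) - 1)
```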


Figure 5.3.1 Angular differences between the treatment mean directions and the

overall sample mean direction found in Example 5.3.1 .

5.4 Randomised Complete Block and Larger Experimental Designs

If the maximum likelihood estimates for other circular experimental design models can

be found in a similar manner to the one-way classification shown above, the testing

of differing hypotheses for 'larger' experimental designs may be undertaken. However,

unlike the one-way classification, the parameter estimates for these designs may not

be found by simple algebraic manipulation.

Let us investigate, for example, parameter estimation for the randomised complete

block design. Here the form of the model may be expressed as

$$\theta_{ij} \;=\; \mu_0 + \alpha_i + \beta_j + \varepsilon_{ij} . \qquad (5.4.1)$$


Assuming the θᵢⱼ (i = 1, 2, …, p; j = 1, 2, …, q) are independently distributed as M(μ₀ + αᵢ + βⱼ, k), let us test the 'column' effects for the hypothesis given in (5.3.1)

under the constraints:

$$\sum_{i=1}^{p}\sin\alpha_i \;=\; 0 , \qquad (5.4.2)$$

$$\sum_{j=1}^{q}\sin\beta_j \;=\; 0 . \qquad (5.4.3)$$

Let (μ̂₀, α̂ᵢ) and (μ̂₀*, α̂ᵢ*, β̂ⱼ*) be the maximum likelihood estimates under H₀ and H₁ respectively (the starred parameters represent parameter estimates found under H₁).

Under H₀ the maximum likelihood estimates μ̂₀ and α̂ᵢ of μ₀ and αᵢ are the solutions of

$$\sum_{i=1}^{p}\sum_{j=1}^{q}\sin(\theta_{ij}-\hat\mu_0-\hat\alpha_i) \;=\; 0 , \qquad (5.4.4)$$

$$\sum_{j=1}^{q}\sin(\theta_{ij}-\hat\mu_0-\hat\alpha_i) \;=\; 0 , \quad i = 1, \ldots, p \quad (\text{'p' equations}) , \qquad (5.4.5)$$

under the constraint $\sum_{i=1}^{p}\sin\hat\alpha_i = 0$.

Equations (5.4.4) and (5.4.5) are in precisely the same form as (5.3.5) and (5.3.6), and therefore give the same parameter estimates as (5.3.12) and (5.3.11) for α̂ᵢ and μ̂₀ respectively.

Under H₁ the maximum likelihood estimates μ̂₀*, α̂ᵢ* and β̂ⱼ* of μ₀, αᵢ and βⱼ are the solutions of


$$\sum_{i=1}^{p}\sum_{j=1}^{q}\sin(\theta_{ij}-\hat\mu_0^{*}-\hat\alpha_i^{*}-\hat\beta_j^{*}) \;=\; 0 , \qquad (5.4.6)$$

$$\sum_{i=1}^{p}\sin(\theta_{ij}-\hat\mu_0^{*}-\hat\alpha_i^{*}-\hat\beta_j^{*}) \;=\; 0 , \quad j = 1, \ldots, q \quad (\text{'q' equations}) , \qquad (5.4.7)$$

under the constraint $\sum_{i=1}^{p}\sin\hat\alpha_i^{*} = 0$, and

$$\sum_{j=1}^{q}\sin(\theta_{ij}-\hat\mu_0^{*}-\hat\alpha_i^{*}-\hat\beta_j^{*}) \;=\; 0 , \quad i = 1, \ldots, p \quad (\text{'p' equations}) , \qquad (5.4.8)$$

under the constraint $\sum_{j=1}^{q}\sin\hat\beta_j^{*} = 0$.

Unlike the one-way classification of section 5.3, these constrained equations may not

be simplified in the same manner. In 'linear' statistical analysis the equivalent

expressions would simplify to their respective 'column' or 'row' means. This

produces a simple statement where some overall mean plus a particular 'row' effect gives the respective row mean, and similarly for column effects. In circular analysis these expressions will not simplify, and Lagrange multipliers are required to take account of the given constraints.

For the solution of the system of equations (5.4.6), (5.4.7) and (5.4.8), under the

constraints (5.4.2) and (5.4.3), computer programs have been utilised. Several

algorithms were applied to the constrained optimisation problems in an attempt to

find a global maximum subject to equality constraints. Of main use were sequential Lagrangian methods, with the maximisation being solved by quasi-Newton procedures.

The functions, however, are not unimodal in nature and are indeed very heavily

multimodal. Therefore unless the initial estimates are very good approximations,


the algorithms simply find local constrained maxima and not the required global maximum.

In order to solve even small randomised block designs, a lengthy and inefficient program has been written using a direct search method. Example 5.4.1 gives a

small hypothetical data set on which the direct search program has been applied.
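Purely as an illustration of the constrained problem (5.4.6)–(5.4.8), and not a reproduction of the direct-search program used here, the fit can be sketched with a general-purpose constrained optimiser and repeated random starts. For fixed k, maximising the likelihood is equivalent to maximising ΣΣ cos(θᵢⱼ − μ − αᵢ − βⱼ) subject to Σ sin αᵢ = 0 and Σ sin βⱼ = 0. All function names below are illustrative, and, because of the multimodality described above, there is no guarantee that the best local solution found is the global maximum.

```python
import numpy as np
from scipy.optimize import minimize

def fit_rcb(theta_deg, n_starts=200, seed=0):
    """Constrained ML sketch for a p x q randomised complete block layout:
    maximise sum cos(theta_ij - mu - a_i - b_j) subject to sum sin(a_i) = 0
    and sum sin(b_j) = 0, using repeated random starting points."""
    th = np.radians(np.asarray(theta_deg, dtype=float))
    p, q = th.shape
    rng = np.random.default_rng(seed)

    unpack = lambda x: (x[0], x[1:1 + p], x[1 + p:])

    def negative_kernel(x):
        mu, a, b = unpack(x)
        return -np.cos(th - mu - a[:, None] - b[None, :]).sum()

    constraints = [{'type': 'eq', 'fun': lambda x: np.sin(unpack(x)[1]).sum()},
                   {'type': 'eq', 'fun': lambda x: np.sin(unpack(x)[2]).sum()}]

    best = None
    for _ in range(n_starts):
        x0 = rng.uniform(-np.pi, np.pi, size=1 + p + q)
        res = minimize(negative_kernel, x0, method='SLSQP', constraints=constraints)
        if res.success and (best is None or res.fun < best.fun):
            best = res
    if best is None:
        raise RuntimeError("no successful start")
    mu, a, b = unpack(best.x)
    return np.degrees(mu) % 360.0, np.degrees(a), np.degrees(b)
```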

Example 5.4.1

                          'Column' Effects                  Resultant   Mean Direction
                      β₁          β₂          β₃              Rᵢ.          θ̄ᵢ.
            α₁       115°        105°         20°             2.229        83.347°
 'Row'      α₂       170°        180°         65°             1.899       145.343°
 Effects    α₃        75°         90°        325°             1.761        52.253°
            α₄       120°        150°         45°             2.175       107.632°

 Resultant R.ⱼ       3.346       3.255       3.202
 Mean Direction θ̄.ⱼ  119.517°    130.751°    25.566°

 N = 12        R.. = 6.817        θ̄.. = 97.458°

 H₀ : β₁ = β₂ = β₃ = 0
 H₁ : at least one βⱼ ≠ 0


Parameter Estimation:

Under H₀:

$$\hat\mu_0 \;=\; \arctan\!\left[\frac{\sum_{i=1}^{p}\sin\bar\theta_{i.}}{\sum_{i=1}^{p}\cos\bar\theta_{i.}}\right] \;=\; 96.856° ,$$

    α̂₁ = θ̄₁. − μ̂₀ = −13.508°
    α̂₂ = θ̄₂. − μ̂₀ =  48.487°
    α̂₃ = θ̄₃. − μ̂₀ = −44.603°
    α̂₄ = θ̄₄. − μ̂₀ =  10.776°

with

$$\sum_{i=1}^{p}\sin\hat\alpha_i \;=\; 0
\qquad\text{and}\qquad
\sum_{i=1}^{p}\sum_{j=1}^{q}\cos(\theta_{ij}-\hat\mu_0-\hat\alpha_i) \;=\; \sum_{i=1}^{p}R_{i.} \;=\; 8.0646 .$$

k̂, using approximation (3.2.7), equals 1.838.

Under H₁, via the direct search method:

    μ̂₀* = 97.065°
    α̂₁* = −11.78°      β̂₁* =  23.13°
    α̂₂* =  46.49°      β̂₂* =  34.35°
    α̂₃* = −48.49°      β̂₃* = −73.15°
    α̂₄* =  13.164°

with

$$\sum_{i=1}^{p}\sin\hat\alpha_i^{*} \;=\; 0 , \qquad \sum_{j=1}^{q}\sin\hat\beta_j^{*} \;=\; 0 ,
\qquad
\sum_{i=1}^{p}\sum_{j=1}^{q}\cos(\theta_{ij}-\hat\mu_0^{*}-\hat\alpha_i^{*}-\hat\beta_j^{*}) \;=\; 11.878 .$$


The maximum likelihood estimate k̂₁ is given by

$$A(\hat k_1) \;=\; \frac{1}{N}\sum_{i=1}^{p}\sum_{j=1}^{q}\cos(\theta_{ij}-\hat\mu_0^{*}-\hat\alpha_i^{*}-\hat\beta_j^{*}) .$$

Using approximation (3.3.11), k̂₁ = 49.4.

Replacing the parameter estimates under H₀ and H₁ into the likelihood ratio test statistic (5.3.13), and using the approximation (3.6.7), produces a chi-squared value of 41.88. Comparing this to a χ²₂ indicates that there is a highly significant difference between the column or block effects.

Under a null hypothesis testing the row or treatment effects, the maximum likelihood estimates of μ₀ and βⱼ are

$$\hat\mu_0 \;=\; \arctan\!\left[\frac{\sum_{j=1}^{q}\sin\bar\theta_{.j}}{\sum_{j=1}^{q}\cos\bar\theta_{.j}}\right] \;=\; 96.74° ,$$

    β̂₁ = θ̄.₁ − μ̂₀ =  22.777°
    β̂₂ = θ̄.₂ − μ̂₀ =  34.011°
    β̂₃ = θ̄.₃ − μ̂₀ = −71.174°

with

$$\sum_{j=1}^{q}\sin\hat\beta_j \;=\; 0 ,
\qquad
\sum_{i=1}^{p}\sum_{j=1}^{q}\cos(\theta_{ij}-\hat\mu_0-\hat\beta_j) \;=\; \sum_{j=1}^{q}R_{.j} \;=\; 9.8034 ,
\qquad
\hat k_0 \;=\; 3.01 .$$

The alternative hypothesis estimates are unchanged, producing a chi-squared value of 34.74. Comparing this to a χ²₃ indicates a highly significant difference between the row or treatment effects.
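As a check of the arithmetic, both chi-squared values quoted in this example follow from (5.3.13) together with approximation (3.6.7), i.e. from N log_e of the ratio of (N minus the fitted cosine sum under H₀) to (N minus the fitted cosine sum under H₁). To the rounding of the fitted cosine sums,

$$12\,\log_e\!\frac{12 - 8.0646}{12 - 11.878} \;\approx\; 41.9
\qquad\text{and}\qquad
12\,\log_e\!\frac{12 - 9.8034}{12 - 11.878} \;\approx\; 34.7 ,$$

in agreement with the values 41.88 and 34.74 quoted above.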


5.5 Summary

In linear statistical analysis the generalised linear model approach may be used to

find the factor parameter estimates for a particular design. These are then utilized

within a maximum likelihood testing procedure to examine for any significant

difference between the factor effects. A similar approach has been used within this

section for circular statistics. In linear analysis the factor effects sum to zero; here we have seen that, for circular analysis, it is convenient to choose the constraint on

the angles specifying the treatment parameters so that their sines sum to zero. From

this, section 5.3 has shown how parameter estimation for the design model of a

one-way classification may be produced, and how these estimates can assist in further understanding the underlying structure of the data sample under investigation.

For larger designs, however, the solution of the likelihood function under the given

constraints may not be found by the same simple algebraic manipulation. Due to

the complexity of the maximising problems involved, many local maxima may be found, making the discovery of the global maximum extremely difficult. Many computer

programs for the optimisation of constrained equations have been tried with little

success and further investigation into improved computer algorithms will be necessary.

If the null hypothesis for a particular test is rejected, the neatness of this approach

helps us to appreciate and to look at the contrasts between the effects within a

factor. Such procedures, derived by Tukey or Scheffé and known as methods of multiple comparison, exist for linear analysis. If the approach of this section can be extended

to larger design methods using computer algorithms to help find and understand the

parameter estimates, similar procedures of multiple comparison may be derived for

the circular case.


CHAPTER 6

THE EXTENSION OF EXISTING TECHNIQUES TO LARGER EXPERIMENTAL

DESIGNS

6.1 Introduction

From Chapter 4 we have seen how, for tightly clustered populations, Watson and

Williams (1956) developed a one-way classification technique which has been widely

used for 2 and 3 dimensional vectors. This original technique has been refined and

used to produce tests for several samples having common mean direction. Chapter 5

has shown how the parameter estimates may be found for the one-way classification

design. In Section 6.2 this analysis of variance is extended from the one-way layout

to the nested or hierarchical design as an initial step to the analysis of larger more

complex experimental situations. Section 6.3 extends the design further to enable

analysis of randomised complete block designs and the two-way analysis of variance

design, for large k.

It should be noted that within linear analysis of variance the factor components

within an experimental design are correctly referred to as sums of squares; however, in circular analysis of variance the measures of each factor are not calculated in the same manner and will be referred to as measures of variation. For example, the one-way analysis of variance procedure will be analysed using a total, between and

residual measure of variation.


6.2 Nested or Hierarchical Design

There are many occasions when a researcher wishes to study the effect of a single

factor such as light intensity or magnetic charge on the displacement of

micro-organisms, but because of the manner in which such observations are obtained a more

complex one-way design is required. The nested or hierarchical model is such a

design.

It is often necessary to measure the response to a treatment on each individual of a subsample of a unit, rather than on the entire unit to which the treatment is applied.

In a drug experiment, for example, the treatments may have been applied to a

particular animal type within different groups, from each group several animals may

be picked at random and their response (perhaps their angle of movement) measured.

To this end a model for the nested design may be built, under the general

assumptions;

i) The samples are drawn from populations with a von Mises distribution

ii) The parameter of concentration has the same value in each population,

that is,

k₁ = k₂ = ⋯ = k_q = k

iii) The overall and individual parameters of concentration are sufficiently

large, namely

k > 2

Let p be the number of treatments, given by i = 1, 2, …, p; qᵢ the number of groups or experimental units within treatment i, given by j = 1, 2, …, qᵢ; and n the number of observations within each group. Let N be the total number of observations

and U be the number of cells. Extending the expression given by Watson and

Williams (1956), that


$$2k(N-R) \;=\; 2k\Bigl(N-\sum_{i=1}^{p}R_i\Bigr) \;+\; 2k\Bigl(\sum_{i=1}^{p}R_i-R\Bigr) , \qquad (6.2.1)$$

produces:

$$2k(N-R_{...}) \;=\; 2k\Bigl(\sum_{i=1}^{p}R_{i..}-R_{...}\Bigr) \;+\; 2k\Bigl(\sum_{j=1}^{q_1}R_{1j.}-R_{1..}\Bigr) \;+\; 2k\Bigl(\sum_{j=1}^{q_2}R_{2j.}-R_{2..}\Bigr) \;+\;\cdots\;+\; 2k\Bigl(\sum_{j=1}^{q_p}R_{pj.}-R_{p..}\Bigr) \;+\; 2k\Bigl(N-\sum_{i=1}^{p}\sum_{j=1}^{q_i}R_{ij.}\Bigr) , \qquad (6.2.2)$$

with associated independent chi-squared distributions with known degrees of freedom, for large k:

$$\chi^2_{N-1} \;=\; \chi^2_{p-1} \;+\; \sum_{i=1}^{p}\chi^2_{q_i-1} \;+\; \chi^2_{N-U} .$$

The first term on the right hand side of the expression (6.2.2) is essentially the measure of variation among treatments; the next p terms of similar form are the measures of variation within treatment i but among its qᵢ groups. The final term represents the within or residual variation, that is, the variation within treatment and within group among the individual observations. As a nested design we may consider differences between rows, or differences between columns within any one row.

As in the one-way analysis, a test statistic for examining the differences between rows may be given as

$$\frac{\Bigl(\sum_{i=1}^{p}R_{i..}-R_{...}\Bigr)\big/(p-1)}{\Bigl(N-\sum_{i=1}^{p}\sum_{j=1}^{q_i}R_{ij.}\Bigr)\big/(N-U)} , \qquad (6.2.3)$$


which has an F distribution with (p−1) and (N−U) degrees of freedom. Similarly, a test statistic for the differences between columns within row i is calculated in the same manner:

$$\frac{\Bigl(\sum_{j=1}^{q_i}R_{ij.}-R_{i..}\Bigr)\big/(q_i-1)}{\Bigl(N-\sum_{i=1}^{p}\sum_{j=1}^{q_i}R_{ij.}\Bigr)\big/(N-U)} , \qquad (6.2.4)$$

which has an F distribution with (qᵢ−1) and (N−U) degrees of freedom.
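A compact sketch of the quotients (6.2.3) and (6.2.4), computed from raw angles grouped by treatment and by group within treatment, is given below; the data structure and function names are illustrative only.

```python
import numpy as np
from scipy.stats import f as f_dist

def _resultant(angles_deg):
    t = np.radians(np.asarray(angles_deg, dtype=float))
    return np.hypot(np.cos(t).sum(), np.sin(t).sum())

def nested_design_tests(cells):
    """Sketch of (6.2.3) and (6.2.4).  cells[i][j] holds the angles (degrees)
    of group j within treatment i; all groups must contain at least 2 angles."""
    p = len(cells)
    q = [len(row) for row in cells]                       # q_i groups in treatment i
    N = sum(len(g) for row in cells for g in row)
    U = sum(q)                                            # number of cells
    R_cell = [[_resultant(g) for g in row] for row in cells]
    R_trt = [_resultant([a for g in row for a in g]) for row in cells]   # R_i..
    R_all = _resultant([a for row in cells for g in row for a in g])     # R...
    resid = N - sum(sum(row) for row in R_cell)

    # Between-treatment quotient (6.2.3): F with (p-1, N-U) degrees of freedom
    F_rows = ((sum(R_trt) - R_all) / (p - 1)) / (resid / (N - U))
    p_rows = f_dist.sf(F_rows, p - 1, N - U)

    # Within treatment i, between its q_i groups (6.2.4): F with (q_i-1, N-U) d.f.
    F_within = [((sum(R_cell[i]) - R_trt[i]) / (q[i] - 1)) / (resid / (N - U))
                for i in range(p)]
    return F_rows, p_rows, F_within
```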

Stephens (1982) produced the same expression (6.2.2) and quotients (6.2.3) and

(6.2.4) in m dimensions for the analysis of data which are proportions of a

continuum such as time or volume. Stephens also gives a good example of the

methodology when studying the proportion of time spent in various activities by 130

students.

6.3 The Randomised Complete Block Design and Two-way Classification with Interaction Design

The randomised block design is a widely used method of dealing with factors that are

known to be important and which the researcher wishes to eliminate rather than to

study. Here the factor is blocked so that each block is as homogeneous as possible and

the treatments under study are each used exactly once in each block for the design

to be balanced. The observed differences among the treatments should be largely

unaffected by the factor that has been blocked.

There are many situations where a randomised block plan can be profitably utilised.

For example, a testing scheme may take several days to complete. If we expect

some systematic differences between days, we might plan to observe each item on


each day; a day would then represent a block. Since each treatment occurs exactly

once in every block, the treatment totals or means are directly comparable without

adjustment.

By again extending the simple expression (6.2.1) and its associated results concerning

the chi-squared decomposition we may now take account of a possible block effect, i,

as well as the treatment effect, j, to produce the randomised complete block model;

$$2k(N-R) \;=\; 2k\Bigl(\sum_{i=1}^{p}R_{i.}-R\Bigr) \;+\; 2k\Bigl(\sum_{j=1}^{q}R_{.j}-R\Bigr) \;+\; 2k\Bigl(N-\sum_{i=1}^{p}R_{i.}-\sum_{j=1}^{q}R_{.j}+R\Bigr) , \qquad (6.3.1)$$

with associated independent chi-squared distributions, for large k,

$$\chi^2_{N-1} \;=\; \chi^2_{p-1} \;+\; \chi^2_{q-1} \;+\; \chi^2_{(p-1)(q-1)} .$$

Expression (6.3.1) obeys the x 2 decomposition property parallel to the linear form of

the randomised complete block design. The first term on the right hand side of the

expression is essentially a measure of variation due to treatments, the second term of

similar form being a measure of variation due to blocks. The final term represents

the residual variation after variation due to treatments and blocks have been

removed.

From (6.3.1) the test statistic (6.3.2) is produced to test the null hypothesis that there is no difference between the treatments:

$$Z_1 \;=\; \frac{\Bigl(\sum_{i=1}^{p}R_{i.}-R\Bigr)\big/(p-1)}{\Bigl(N-\sum_{i=1}^{p}R_{i.}-\sum_{j=1}^{q}R_{.j}+R\Bigr)\big/\bigl((p-1)(q-1)\bigr)} , \qquad (6.3.2)$$

which has an F distribution with (p−1) and (p−1)(q−1) degrees of freedom. Similarly, the test statistic (6.3.3) is produced to test the null hypothesis that there is no difference between the blocks:

$$Z_2 \;=\; \frac{\Bigl(\sum_{j=1}^{q}R_{.j}-R\Bigr)\big/(q-1)}{\Bigl(N-\sum_{i=1}^{p}R_{i.}-\sum_{j=1}^{q}R_{.j}+R\Bigr)\big/\bigl((p-1)(q-1)\bigr)} , \qquad (6.3.3)$$

which has an F distribution with (q−1) and (p−1)(q−1) degrees of freedom.

In the case of significance, we may only state that the mean directions are or are

not equal. The test does not allow for discrimination among single mean directions.
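For a balanced p × q randomised complete block layout with one observation per cell, Z₁ and Z₂ reduce to simple functions of the row, column and overall resultant lengths. A minimal sketch follows, assuming the angles are supplied in degrees as a p × q array; the function name is illustrative.

```python
import numpy as np

def rcb_statistics(theta_deg):
    """Sketch of Z1 and Z2 of (6.3.2)-(6.3.3) computed from resultant lengths."""
    t = np.radians(np.asarray(theta_deg, dtype=float))
    p, q = t.shape
    N = p * q
    C, S = np.cos(t), np.sin(t)
    R_row = np.hypot(C.sum(axis=1), S.sum(axis=1))   # R_i.
    R_col = np.hypot(C.sum(axis=0), S.sum(axis=0))   # R_.j
    R = np.hypot(C.sum(), S.sum())                   # R
    resid = N - R_row.sum() - R_col.sum() + R
    Z1 = ((R_row.sum() - R) / (p - 1)) / (resid / ((p - 1) * (q - 1)))
    Z2 = ((R_col.sum() - R) / (q - 1)) / (resid / ((p - 1) * (q - 1)))
    return Z1, Z2
```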

Before the accuracy of the approximations to the distributions of the components of

(6.3.1) and the F distribution approximations Z₁ and Z₂ are examined, the

associated model for the two-way analysis with interaction will be discussed. A

discussion of interaction in directional data analysis will be given in Chapter 8 .

Using the same approach as for the nested and randomised complete block designs a

two-way classification with interaction may be built;

$$2k(N-R_{...}) \;=\; 2k\Bigl(\sum_{i=1}^{p}R_{i..}-R_{...}\Bigr) \;+\; 2k\Bigl(\sum_{j=1}^{q}R_{.j.}-R_{...}\Bigr) \;+\; 2k\Bigl(N-\sum_{i=1}^{p}\sum_{j=1}^{q}R_{ij.}\Bigr) \;+\; 2k\Bigl(\sum_{i=1}^{p}\sum_{j=1}^{q}R_{ij.}-\sum_{i=1}^{p}R_{i..}-\sum_{j=1}^{q}R_{.j.}+R_{...}\Bigr) . \qquad (6.3.4)$$


The first three terms on the right hand side of (6.3.4) have been shown to be

chi-squared distributed by Watson and Williams (1956) and Stephens (1963)

representing measures of variation due to row i and column j effects and a measure

of residual variation. The final term represents the possible interaction within the

experiment. F test statistics may be produced in the same manner as for the nested

and randomised block designs. (Here l denotes the number of observations on each treatment combination.)

Testing row effects:

$$Z_3 \;=\; \frac{\Bigl(\sum_{i=1}^{p}R_{i..}-R_{...}\Bigr)\big/(p-1)}{\Bigl(N-\sum_{i=1}^{p}\sum_{j=1}^{q}R_{ij.}\Bigr)\big/\bigl(pq(l-1)\bigr)} \;\sim\; F_{(p-1),\,pq(l-1)} . \qquad (6.3.5)$$

Testing column effects:

$$Z_4 \;=\; \frac{\Bigl(\sum_{j=1}^{q}R_{.j.}-R_{...}\Bigr)\big/(q-1)}{\Bigl(N-\sum_{i=1}^{p}\sum_{j=1}^{q}R_{ij.}\Bigr)\big/\bigl(pq(l-1)\bigr)} \;\sim\; F_{(q-1),\,pq(l-1)} . \qquad (6.3.6)$$

Testing interaction effects:

$$Z_5 \;=\; \frac{\Bigl(\sum_{i=1}^{p}\sum_{j=1}^{q}R_{ij.}-\sum_{i=1}^{p}R_{i..}-\sum_{j=1}^{q}R_{.j.}+R_{...}\Bigr)\big/\bigl((p-1)(q-1)\bigr)}{\Bigl(N-\sum_{i=1}^{p}\sum_{j=1}^{q}R_{ij.}\Bigr)\big/\bigl(pq(l-1)\bigr)} \;\sim\; F_{(p-1)(q-1),\,pq(l-1)} . \qquad (6.3.7)$$
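The four measures of variation in (6.3.4) and the quotients (6.3.5)–(6.3.7) can be computed in the same way from a p × q × l array of angles. The sketch below uses illustrative names and assumes degrees.

```python
import numpy as np

def two_way_components(theta_deg):
    """Sketch of the measures of variation in (6.3.4) and the ratios
    (6.3.5)-(6.3.7); theta_deg has shape (p, q, l)."""
    t = np.radians(np.asarray(theta_deg, dtype=float))
    p, q, l = t.shape
    N = t.size
    C, S = np.cos(t), np.sin(t)
    res = lambda c, s: np.hypot(c, s)
    R_cell = res(C.sum(axis=2), S.sum(axis=2))            # R_ij.
    R_row = res(C.sum(axis=(1, 2)), S.sum(axis=(1, 2)))   # R_i..
    R_col = res(C.sum(axis=(0, 2)), S.sum(axis=(0, 2)))   # R_.j.
    R = res(C.sum(), S.sum())                             # R...
    rows = R_row.sum() - R
    cols = R_col.sum() - R
    resid = N - R_cell.sum()
    inter = R_cell.sum() - R_row.sum() - R_col.sum() + R
    d_err = p * q * (l - 1)
    Z3 = (rows / (p - 1)) / (resid / d_err)
    Z4 = (cols / (q - 1)) / (resid / d_err)
    Z5 = (inter / ((p - 1) * (q - 1))) / (resid / d_err)
    return dict(rows=rows, cols=cols, interaction=inter, residual=resid,
                Z3=Z3, Z4=Z4, Z5=Z5)
```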


6.4 Accuracy of the Associated Component χ² Approximations for the Randomised Block Design and Two-way Classification and their Corresponding F Statistics

The expressions (6.3.1) and (6.3.4) are based on a sequence of approximations, and

the accuracy of their chi-squared approximations may best be determined by

simulation. An examination may be made by taking Monte Carlo samples from a

von Mises distribution specified with fixed k. Observations from the distribution

specified by the null hypothesis were generated by computer methods outlined in

Appendix B and were grouped into samples of size N. This has been carried out

for 10,000 sets of samples of various size and were drawn from von Mises

distributions with k = 2, 3, 4, 5 and 10. For the testing of the randomised block

model three designs were investigated varying in sizes of N. Tables 6.4.1 to 6.4.5

examine the chi-squared approximations for each of the components within each of

the models. Table 6.4.1 shows the accuracy for the total measure of variation

component, 2k(N-R ), and Stephens improved approximation, 2y(N -R ), where y is

given by (4.3.4). It is clearly seen that cq, the simulated proportion of the

component, approaches a, the x 2 value theoretical proportion or significance level,

when y is applied. This is further illustrated when the first two moments for both

approximations are given in Table 6.4.2. When Stephens improvement is used both

moments approach their expected values and with increased accuracy as k increases.

Accuracy for both approximations increases as k increases.

Similar results are also found when the component of error or residual is adjusted by Stephens' improvement, γ. Table 6.4.3 compares the accuracy of the χ²_{(p−1)(q−1)} approximations, and shows that the accuracy of both approximations improves as k increases. Table 6.4.4 shows the effect on the two approximations' first two moments when Stephens' improvement is applied. As with the total measure of variation component, when γ is applied both moments approach their expected values, with increased accuracy as k increases.

Table 6.4.5 gives the accuracy of the two components measuring block and treatment effects. Comparing the tables of the two effects shows how the number of observations, as well as the size of the concentration parameter, affects the accuracy of the chi-squared approximation. As N and k increase the accuracy of the chi-squared approximation increases.
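The Monte Carlo check described in this section can be sketched as follows for the total measure of variation. The generator call and default values are illustrative, and Stephens' improved factor γ of (4.3.4) is not reproduced here since its definition lies outside this extract.

```python
import numpy as np
from scipy.stats import chi2

def simulate_total_component(N, k, n_sets=10_000, alpha=0.95, seed=0):
    """Sketch of the Monte Carlo check of Section 6.4: the proportion of
    simulated values of 2k(N - R) falling below the chi-squared(N-1)
    percentage point chi2.ppf(alpha, N-1), which should approach alpha."""
    rng = np.random.default_rng(seed)
    crit = chi2.ppf(alpha, N - 1)
    hits = 0
    for _ in range(n_sets):
        t = rng.vonmises(0.0, k, size=N)     # von Mises sample, mean direction 0
        R = np.hypot(np.cos(t).sum(), np.sin(t).sum())
        hits += 2 * k * (N - R) < crit
    return hits / n_sets
```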


Table 6.4.1  Comparing the accuracy of the χ² approximations with N − 1 degrees of freedom for the total measure of variation. The table gives the proportion of 10,000 Monte Carlo samples for which the two approximations fell below the χ² value for which the theoretical proportion should be α. [Tabulated simulation results not reproduced.]

Table 6.4.2  Comparing the mean and variance of the two approximations with N − 1 degrees of freedom to their expected values. [Tabulated simulation results not reproduced.]

Table 6.4.3  Comparing the accuracy of the χ² approximations with (p − 1)(q − 1) degrees of freedom for the residual measure of variation. The table gives the proportion of 10,000 Monte Carlo samples for which the two approximations fell below the χ² value for which the theoretical proportion should be α. [Tabulated simulation results not reproduced.]

Table 6.4.4  Comparing the mean and variance of the approximations with (p − 1)(q − 1) degrees of freedom to their expected values. [Tabulated simulation results not reproduced.]

Table 6.4.5  Examining the accuracy of the χ² approximations for the block and treatment measures of variation. [Tabulated simulation results not reproduced.]


Tables 6.4.6 and 6.4.7, probably the most important, show the results for the old and new F-approximations. The tables give the proportion α₁ of Monte Carlo results less than the F-value for the appropriate statistics when the theoretical proportion should be α. The old F-approximations are Z₁ and Z₂, given by (6.3.2) and (6.3.3) respectively, and the new F-approximations are mZ₁ and mZ₂, where m is given by (6.4.1) and is the statistic due to Stephens' improvement of the residual measure of variation.

The conclusions here are the same as for the F-statistic for one-way analysis given by Stephens (1972). The new tests are clearly very good F-statistics and increase in accuracy as k and N increase. The approximations are very good, even for N as low as 10, and improve quickly for larger values of N.

The chi-squared approximations of (6.3.4) are examined in the same manner as above. Tables 6.4.8, 6.4.9 and 6.4.10 show the chi-squared approximations for all five components of (6.3.4). The total and residual measures have been adjusted by Stephens' improvement, γ; the original approximations of the total and residual measures were of similar form to those already seen in the randomised block design. All five approximations, including the interaction measure built in Section 6.3, show excellent chi-squared approximations for k > 2. Accuracy is seen to improve as k increases.

Tables 6.4.11, 6.4.12 and 6.4.13 give the results for the old and new F-approximations for testing the two main effects and the interaction component. The old F-approximations are Z₃, Z₄ and Z₅, given by (6.3.5), (6.3.6) and (6.3.7), and the new F-approximations are mZ₃, mZ₄ and mZ₅. Clearly all three F-statistics improve when Stephens' improvement is used and, as for the randomised block design, increased accuracy is seen as k and N increase.

Table 6.4.6  Comparison of the two F approximations Z₁ and mZ₁. [Tabulated simulation results not reproduced.]

Table 6.4.7  Comparison of the two F approximations Z₂ and mZ₂. [Tabulated simulation results not reproduced.]

Table 6.4.8  Examining the accuracy of the χ² approximations for the total and first main effect measures of variation. [Tabulated simulation results not reproduced.]

Table 6.4.9  Examining the accuracy of the χ² approximations for the second main effect and interaction measures of variation. [Tabulated simulation results not reproduced.]

Table 6.4.10  Examining the accuracy of the χ² approximation for the residual measure of variation. [Tabulated simulation results not reproduced.]

Table 6.4.11  Comparison of the F approximations Z₃ and mZ₃ (first main effect of the two-way classification with interaction). [Tabulated simulation results not reproduced.]

Table 6.4.12  Comparison of the F approximations Z₄ and mZ₄ (second main effect of the two-way classification with interaction). [Tabulated simulation results not reproduced.]

Table 6.4.13  Comparison of the F approximations Z₅ and mZ₅ (interaction component of the two-way classification). [Tabulated simulation results not reproduced.]

6.5 Summary

As discussed in Chapter 5 the likelihood ratio function has infinitely many equally

large peaks and may not be broken down under the set hypotheses using the

parameter constraints, as is the case for linear analysis. Because of this the tests

derived initially by Watson and Williams and improved by Stephens have been

extended, for large k , to enable analysis of further experimental designs. The

components within each of the model expressions are shown to be very good

chi-squared approximations and their appropriate quotients to be excellent

F-distributed approximations, both increasing in accuracy as N and k increase.

Simple extensions to the likelihood criterion for the one-way classification with small

concentration parameter showed the test components to be poor chi-squared

approximations with little justification for their construction.


CHAPTER 7

THE ROBUSTNESS AND POSSIBLE COLLAPSE OF THE EXTENDED TECHNIQUES

7.1 Introduction

Chapter 6 has shown how we may extend the original techniques for the one-way

classification with large k, to examine further experimental designs. This chapter

examines the robustness of the test statistics and shows the sensitivity of the

assumptions and possible flaws or breakdowns within the new techniques. Section 7.3

gives further understanding of the construction of the Watson and Williams test statistic and the reasons for its possible failure when larger designs are considered.

7.2 Robustness of Assumptions

Section 6.2 gave the assumptions which must be observed in order that the new

extended techniques may be applied. The third assumption, that the overall concentration parameter is sufficiently large, namely k > 2, is an extremely restrictive assumption when analysing larger designs. Equation (1.4.16) gave an angular measure equivalent to the standard deviation in linear statistics. In order that the assumption of a large overall concentration parameter be satisfied, the measure of angular deviation must not exceed 44°, i.e. the overall sample data set must be closely packed. This will not be the case if substantial differences result between or within factors.
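If the angular deviation of (1.4.16) is taken to be the usual quantity √(2(1 − A(k))), expressed in degrees (an assumption here, since (1.4.16) itself is not reproduced in this chapter), then the 44° bound quoted above corresponds, to the nearest degree, to k = 2:

```python
import numpy as np
from scipy.special import i0, i1

def angular_deviation_deg(k):
    """Angular deviation sqrt(2*(1 - A(k))) in degrees, with A(k) = I1(k)/I0(k);
    assumed here to coincide with the measure of (1.4.16)."""
    return np.degrees(np.sqrt(2.0 * (1.0 - i1(k) / i0(k))))

print(angular_deviation_deg(2.0))   # approximately 44.5 degrees
```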

Further and more important problems are noted as the size of the concentration

parameter decreases towards and below 2. During the many thousands of simulation

runs discussed in Section 6.4, many other statistics were collected for each of the varying hypothesised models. These included the distributions' maximum and

minimum values. The simulations for the two-way classification procedure produced

several occurrences within each hypothesised model of a negative interaction

component. The frequency and magnitude of these negative components increased as

the size of the concentration parameter, k, decreased. When simulations were run

to further examine the power of the tests and specified differences were given

between treatments, blocks or cells, (i.e. separate von Mises distributions with equal

concentration parameters but differing mean directions) the frequency of the negative

interaction components increased.

An example of the two-way classification with interaction and its analysis is given in

Tables 7.2.1 and 7.2.2. It demonstrates the situation where the individual factor

concentration parameters (i.e. k₁, k₂, …) are not significantly different and are

large, but the overall concentration parameter k is small. Other less dramatic data

sets may be shown with a larger overall concentration parameter but producing

similar, although less pronounced, results.


Example 7.2.1

Table 7.2.1  A Two-way Classification

                                   Factor B
                      Level 1                         Level 2
 Factor A    61°  48°  87°  102°  73°        177°  139°  155°  191°  164°
  Level 1    R₁₁. = 4.7303, N₁₁. = 5         R₁₂. = 4.7604, N₁₂. = 5          R₁.. = 6.65,   N₁.. = 10

 Factor A    318°  353°  358°  328°  344°    240°  281°  229°  252°  240°
  Level 2    R₂₁. = 4.8284, N₂₁. = 5         R₂₂. = 4.7623, N₂₂. = 5          R₂.. = 6.6548, N₂.. = 10

             R.₁. = 6.5247, N.₁. = 10        R.₂. = 7.1363, N.₂. = 10         R... = 0.6117, N = 20

 Σᵢ Rᵢ.. = 13.3058        Σⱼ R.ⱼ. = 13.661        Σᵢ Σⱼ Rᵢⱼ. = 19.0814

 p = q = 2,  l = 5


Table 7.2.2  Two-Way Analysis of Variance Table

 Variation           Component                                              Value       Degrees of Freedom
 Due to Factor A     Σᵢ Rᵢ.. − R...                                         12.6931     (p − 1) = 1
 Due to Factor B     Σⱼ R.ⱼ. − R...                                         13.0493     (q − 1) = 1
 Interaction         Σᵢ Σⱼ Rᵢⱼ. − Σᵢ Rᵢ.. − Σⱼ R.ⱼ. + R...                  −7.2727     (p − 1)(q − 1) = 1
 Residual            N − Σᵢ Σⱼ Rᵢⱼ.                                          0.9186     pq(l − 1) = 16
 Total               N − R...                                               19.3883     N − 1 = 19

It should be noted from Table 7.2.2 that the sum of the two main effects' measures of variation is greater than the total measure of variation and therefore the interaction measure is negative.

As noted the above example does not satisfy the assumption of large overall

concentration parameter k but has been used to illustrate the associated problems

when k approaches and decreases below 2. This type of data set questions the

independence of the test components in less dramatic cases. [The presence of a

negative interaction measure of variation does not occur if the data is axial in nature

and the general assumptions are upheld.]

- 118 -

Page 133: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

7.3 A Further Approach to Watson and Williams Design Using a Regression

Model

Section 7.2 has given an example of where the extended techniques for the

randomised and two-way analysis designs may breakdown. In this section we will

briefly reiterate how the one-way analysis design was constructed and then

investigate, via a regression approach, the structure of the individual components.

From this approach the possible reasons for the breakdown of the present statistics in

larger designs may then be seen.

When testing the hypothesis that there is no difference between a number of

treatments, a basic result given in all texts, for the simple linear one-way analysis, is

P q P P q

where Xjj represents the individual observations, xj the treatment sample mean values

and x the overall sample mean. The left hand side measures the dispersion or

scatter of the whole sample about the true mean. The first term on the right hand

side measures the dispersion of the sample about the estimated mean x, whilst the

last term measures the dispersion of the sample means about the true mean.

It is easily shown that N -R represents the dispersion of the sample of directions

about the estimated mean direction in circular statistics. Equally N -X represents the

dispersion of the sample about the true mean direction, giving the expression

( 7 . 3 . 1 )

(N - X) ■= (N - R ) + (R - X) ( 7 . 3 . 2 )

in our analogue of (7.3.1).

- 119 -

Page 134: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

Using the results of (4.2.3)

2k(N - X) = 2k(N - R _ ) + 2k(R - X) ( 7 .3 .3 )

has the associated chi-squared distributions

“ *N -1 + *1

This was developed further by Watson (1956) to examine 2 or more samples with

assumed equal concentration parameter. It may be seen that if the mean direction

of two samples differ greatly the sum of their resultants R , and R 2, would be

much greater than the overall resultant R . Similarly, if the mean directions are

equal R , + R 2 = R . This is then used as a measure of variation between

samples. The variation within the samples is measured by comparing the maximum

length of a sample resultant, N j, to its actual resultant, R j. For a two sample test

the within variation is therefore

(N tl - Re l ) + (N . 2 - R . 2)

This suggests the analysis of variance expression

2k(N - R ) - 2k( N ' t - R .2 + N# 2 - R. 2) + 2k (R. i + R . 2 " R . . >

( 7 . 3 . 4 )

It is easy to see from these analogies how the test for the one way analysis is

derived. Nevertheless, it is not until we investigate the individual components of the

analysis of variance expression that we see an underlying problem that is not fully

appreciated until larger, more complex designs are constructed.

A simple alternative way of building the analysis of variance expression is via a

regression approach using basic vector analysis. Figure 7.3.1 gives the notation that

we will use to construct the components.

- 120 -

Page 135: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

j icon oj* sample, neon dii'e.ch’on

Overall sample\ f^ean direohor

.22

JZ

Angular Notation for the Construction of ComponentsFigure 7.3.1

Assuming the 0jj's (i= l,2 ,...,p ; j= l ,2, ,q) are independently distributed as

M(/<i0+/3j,/:), where /*0 is some mean direction and fij is the possible effect due to

treatment j. Let us test

H0 : 0 , - (32 - .........

H1 : at least one /3j * 0

- 0 q " 0

A A A

Let fi0 and (pi0, /3j) be the maximum likelihood estimates of ( iQ and / i0; under

H 0 and H , . Let

P q

S0 - N " I I C0S( 0iJ " " 3j)i -1 j -1

- 121 -

Page 136: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

From the statistical results of (4.2.3), 2A:S0 has a x 2 distribution with N degrees of

freedom, for large k. Using approximation (1.4.13), 2fcS0 is equivalent to

P q

k I Ii= l j -1

A 2 ( 7 . 3 . 6 )

Using this approximation we may construct the individual components of the

expression.

7.3.1 Total Measure of Variation

From equation (7.3.6) the component for the total measure of variation is calculated

from the sum of the squared distances between each sample point, 0 jj and the

overall sample mean direction, 6 , as illustrated in Figure 7.3.2.

2.1

22.

12

32

Figure 7.3.2 Distances Between the Sample Points, and the Overall Sample

Mean Direction, 0

- 122 -

Page 137: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

From vector algebra and basic directional data properties we have, denoting 10jj | as

the vector length;

P q

k l I ( £ i j - I..)2 ( 7 . 3 . 7 )

i - 1 j -1

p q

- k ^ ^ ( ' i i j * 2 + ^ . J 2 " 2cos(0 j j - 0 m' ) )

i - 1 j -1

= /c(N + N - 2R )

= 2/c(N - R ) ( 7 . 3 . 8 )

Producing the same total measure of variation as in (7.3.4).

7.3.2 Residual or Within Measure of Variation

The residual measure of variation is calculated from the sum of the squared distances

between each sample point, j, and its own sample mean direction, 6 j

Figure 7.3.3 Distances Between Sample Points, and Sample Mean

Directions, 6j

- 123 -

Page 138: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

p q

* 1 I < £ ij - £ . j > 2 ( 7 .3 .9 )

i -1 j-1

p q

“ k \ \ j 1 2 + 1 .. j 1 2 - 2cos(0 j j - 0 . j ) )

i -1 j -1

qk{ N + N - 2 ^ R j )

q- 2k(N - £ R j )

J-1

Producing the same residual measure as in (7.3.4).

( 7 . 3 . 1 0 )

7.3.3 Between Measure of Variation

If, as is shown in 'linear' statistical analysis, the total measure of variation is split

into two parts, a residual or within measure and a between measure, then the

between measure may be found from

However, equation (7.3.11) assumes, as in 'linear' analysis, that when circular mean

directions are combined the same overall mean direction is produced. It was first

observed in Example 5.3.1 that this property does not exist for circular statistical

analysis, the following sub-section investigates this prior to deriving the 'true' between

measure of variation.

q q( 7 . 3 . 1 1 )

- 124 -

Page 139: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

7.3.3.1 Combining Mean Angular Directions and Resultant Lengths

Expressed algebraically by (7.3.12) a simple proof shows that when angular mean

directions are combined the overall angular mean direction may not be produced, and

is dependent on the resultant lengths found within each of the combined samples.

q P q

I s in e . j 2 I s i n # i j

i —!--------------- ^ 1 y-1--------------- ( 7 . 3 . 1 2 )q p q2 COS e . j I 2 cos 0 j j

j-1 i-1 j-1

An angular mean direction, 0 is given by

cos 0 = ^ s in 0 = ^ ( 7 . 3 . 1 3 )R R

Similarly for 0 j

C j _ S jcos 0 j = -- s in 0 | = ——

s .j 5 -j

Therefore

q q p p q

I s i n ? . j I v T ] l s i n "i j I I s in # i j/ — U — ----------- ( 7 . 3 . 1 4 )

q q p p q

I c o s 9 -j I F T l c o s e i j 2 2 c o s ej-1 j-1 i -1 i -1 j-1

Figure 7.3.1 gave the mean direction of the sample means as T with resultant length

Rfl. This result gives further understanding and an alternative calculation of the

overall mean direction and its associated resultant length. By utilising the resultant

lengths associated with each sample mean direction, the mean directions may be

- 125 -

Page 140: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

combined, as in standard statistical analysis, to give the overall mean direction.

Using a rectangular co-ordinate system with X and Y axes and origin 0, let 6 j be

one of the q mean angles with corresponding mean resultant length r y Let x j and

y j be the rectangular components of r j, as seen in Figure 7.3.4. Then by

definition

x.j - r jcos e.j y.j " r.jsin 0.j

Then

( 7 . 3 . 1 5 )

x = — (r,cos 6 + r 2cos 6 2 + . . + r #qCos 0 q) ( 7 . 3 . 1 6 )

y = — (r#1sin 6 ^ + r 2sin 6 2 + .. .. + r qSin 6 q ) ( 7 . 3 . 1 7 )

Therefore

M *1

= [(i I r.jcos ? j)’ + (I I r.jsin ?.j)2]i ( 7 . 3 . 1 8 )

R = Nr

Similarly for p row mean directions

P[(i ^ rj cos )2 + (I ^ri SinOj ) ^

i=l i=l( 7 . 3 . 1 9 )

Giving the overall mean direction as

6 = arc tan

q1q I r - j s in

j= iq

i Y —

q I r . j cos

( 7 . 3 . 2 0 )

- 126 -

Page 141: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

Figure 7.3.4 The Rectangular Components of a Non-Unit Vector

This may be further understood by a simple illustration given in Figure 7.3.5. Here

two samples have four observations within each, by taking account of the sample

mean resultant lengths the true overall mean direction is obtained.

- 127 -

Page 142: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

Uni^ lenqfil

Figure 7.3.5 Combining Samples Using Angular Mean Directions and their Mean

Resultant Lengths

Note: (a) Fisher and Lewis (1983) considered the problem of forming a pooled

estimate of the common mean direction of several circular samples with possibly

differing concentration parameters. In brief their work discussed the introduction of

some arbitrary non-random weighting factor to each sample defining

'Wi -1 i -1

P2 = + C 2 ( W j > 0 J Wj - 1 )

i - 1

The general pooled estimate for ’ p, given by defined by

- 128 -

Page 143: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

cw swcos Aw - — s in Aw = Z“

Rw Rw

Using the central limit theorem Fisher and Lewis produced a confidence interval of

the pooled estimate comparing this to the approximate confidence cone for the single

sample case. (Further approximate confidence intervals for a mean direction of a

von Mises distribution were later given by Upton (1986)). Fisher and Lewis showed

the consequences of specific choices of weights, considering equal weighting of 1/q

and proportional weighting of Nj/N.

(b) An alternative calculation of the overall resultant length from the sample

angular mean directions and resultant lengths is via the dot product rule for vectors

Using the q column statistics

q q qR2 . + 2 ) ) R . R cos( 0* . - 0 J ( 7 . 3 . 2 1 )

.J L L . J . t . j . tj=l j^i t - l

j 7* t

j < t

Similarly for the p row statistics

P P P

R?."IRi. + 2l1=1 i= l t= l

i ? t

i < t

For example, if the overall resultant was to be found from three sets of sample

statistics

R2 = R2 + R2 + R2 + 2R. R , cos(0. - )1 . 2 . 3 . 1 . 2 . ' 1 . 2 .

+ 2R1 R3 cos( 0 t - 0 3>)

+ 2 R 2>R 3 > c o s ( 0 2> - 0 3 > )

R. R cos(0 . - 0. ) ( 7 . 3 . 2 2 )l . t . l . t .

- 129 -

Page 144: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

Returning to the calculation of the between measure of variation, it is evident that

this should be calculated from the sum of squared distances between each sample

mean direction 6 j and their combined mean direction T. As an identity this will

leave a further component measuring the difference between the combined mean

direction, 6, and the overall mean direction, 6 , as in (7.3.23)

( 0 i j - * . . > ” < * i j “ * . j > + < * . j - « ) + ( * - 7 . ) ( 7 . 3 . 2 3 )

t o t a l res idua l between d i f fe re n c e betweenthe combined mean d i r e c t i o n and the o v e r a l l mean d i r e c t ion.

Therefore the 'true' between measure of variation is given by;

P q kli - i j - i

p q~ ^ ^ ^ ( l£ . j 1 2 + \8_\2 - 2cos (0 j - 6 ) )

i -1 j -1

- /c(N + N - 2Rd)

- 2k(N - Re) ( 7 . 3 . 2 4 )

7.4 Cross Product Terms and the Analysis of Cross-Classification

In 'linear' statistics the corresponding expression to (7.3.23) will produce cross

product terms equal to zero. On the circle these terms are found to be non-zero;

P q P q p q

I I (£ij - I . ) * - I I<lu + 1 I<Z.j-Z>2i -1 j -1 i -1 j -1 i - 1 j -1

p q p q+ l + J <£i j - I.jXZ.j - I>

i -1 j - 1 i -1 j -1

- 130 -

Page 145: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

p q _ p q

+ 2 1 1 + 2 1 1 (- Jj ■i -1 j -1 i - 1 j -1

( 7 .4 .1 )

The left hand side and the first two terms on the right hand side of (7.4.1) have

been simplified in Section 7.3, giving

P q

k ]j ^ ( .^ i j ~ Z . . ) 2 " 2/c<N “ R. . ) from ( 7 . 3 . 8 )i - 1 j - 1

p q q

k £ 2 (- i J " ^ * j )2 " 2/c(N ” 1 R- j ) from ( 7 *3 * 10)i - i j - i j - i

p q

k 1 1 ( - - J " " 2/c(N " R0) from ( 7 - 4 -24)i - i j - i

The following term is the measure of the difference between the combined means

and the overall mean direction.

P qk J J (0 - 0 ) 2 - /cN(|0 | 2 + | 0 _ l 2 _ 2cos(0 - 0 )

i - 1 j - 1

- 2kN (l - cos(0 - 0 ) ) ( 7 . 4 . 2 )

The final three terms are the corresponding cross product terms.

P q _ q R .

2 k l 1 ( - ! J ' Z . j H l . j - £> “ 2fcP I (— - £ . j ) < Z . j - £>i - 1 j - 1 j - 1 p

P q = u _ _

2k I J ( £ . j - £ ) < £ - £ . . ) - 2feN< £ ) < £ - £ . . )i - i j - i N

- 131 -

Page 146: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

P q R R q _2k J J ( 0 ; : - 0 0 ( 0 - 0 ) - 2/cN(— * ------- ) (0 - 0 ) ( 7 . 4 . 3 )

. - . - " N Ni - l j - 1

Summing the cross product terms of (7.4.3) equals

q- 2k (^ R j - 2N + Rfl - R + Ncos(0 - # . . ) ) ( 7 . 4 . 4 )

j - 1

Producing the circular model

q2k(N - R . . ) - 2/c(N - J Rmj ) + 2k(N - R0) + 2/cN(l - cos(0 - 0 . . ) )

j “ l

(T o t a l ) (Res idual) (Between) (D i f fe re n c e betweencombined and o v e r a l l mean.)

q+ 2k(N - J R j - 2N + Re - R + Ncos(0 - 0 ) )

j - 1

(Cross product terms)

Reducing

2k(N - R ) - 2k(N - y R j ) + 2ic(N - R0)

j - 1

(T o t a l ) (Res idual) (Between)

q+ 2 * ( J R . j + Re - R . . - N) ( 7 . 4 . 5 )

j - 1

(C orrect ion)

Unless the residual term does not exist the correction will always be negative since

the between measure in (7.4.5) will always be greater than the between measure in

(7.3.4) from Watson and Williams.

- 132 -

Page 147: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

When larger designs are considered in 'linear' statistics the between sum of squares

can be broken down into three components measuring differences between rows,

differences between columns and the interaction within the design. In directional

analysis, from extending Watson and Williams, this splitting produces

P q P q

( I 2 Rij . - R. . .> - ( J r i . . - R . . ) + c 2 r j . i -1 j-1 i-1 j-1

(Between rows) (Between columns)

P q P q

+ ( I I Ri j . - - ] r . j . + R . . . )i -1 j-1 i -1 j-1

( I n t e r a c t i o n ) ( 7 . 4 . 6 )

If the first and second terms on the right hand side of (7.4.6) are calculated their

sum may be greater than the total measure of variation, as was illustrated in

Example 5.3.1. In this situation the value of the interaction will be negative,

whether interaction exists or not. It is only when the between measure is broken

down in this manner for larger designs is the problem within its derivation fully

realised.

- 133 -

Page 148: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

7.5 Summary

Chapter 7 has shown via a simple alternative approach to Watson and Williams

how,as k decreases, the model components begin to breakdown. The most obvious

consequence is seen when analysing two-way classification designs when a negative

component may be produced. These are shown to be a consequence of the way in

which circular mean directions combine. Unlike linear statistics, if the overall mean

direction of a sample is found, 6 , and then the same sample is split into equally

weighted samples and the overall mean direction re-calculated from the resulting

combined means, a different overall direction, 0, may be produced.

For the one-way analysis this does not cause any major problems. However, when

the technique is extended to larger analyses the sum of the main effect components

may give a result greater than the total measure of variation.

It is important to re-emphasise that the above only occurs as the overall

concentration parameter k decreases towards and below 2. The new extended

procedures, developed in Chapter 6, are satisfactory for highly clustered data sets

since the larger the value of the overall concentration parameter the smaller the

correction value of Section 7.4, and similarly the closer the two mean directions, 0

and 0, become.

- 134 -

Page 149: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

CHAPTER 8

DEVELOPMENT OF A NEW ANALYSIS OF VARIANCE PROCEDURE

8.1 Introduction

Chapters 6 and 7 have shown (1) how the modified components for the one-way

classification have excellent chi-squared approximations for large k (2) how the

extended components for further design models also follow good chi-squared

approximations, for large k (3) how the underlying structure of the components may

breakdown as k decreases (4) the dependence of the components as k varies and (5)

since the overall concentration parameter must be large as well as the sample

concentration parameters, the tests may be good approximations only when applied to

highly clustered data sets.

This chapter is concerned with developing new test statistics which may be

generalised across all values of k and are based on likelihood ratio statistics for

testing both the mean direction and the concentration parameter, k.

The main reason for the breakdown of the extended models as k diminishes is

concerned with the combining of the angular mean directions, discussed in Section

7.3. A new procedure is shown to overcome this problem and therefore enable

components such as interaction to be investigated as k decreases.

Section 7.33 showed how the overall mean direction will remain unaltered if the

sample resultant lengths are retained when combining sample mean directions. Using

this fact, Section 8.2 constructs new components via a regression approximation

approach previously utilised to show the construction of Watson and Williams original

test statistic.

- 135 -

Page 150: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

Following the discussion of the cross product terms in Chapter 7, Section 8.4

discusses the interpretation and calculation of interaction for the procedures.

8.2 Minimising Chord Distances to Build New Test Statistic Components

Using exactly the same approach as Section 7.3 a design procedure may be built via

a regression approach using basic vector analysis but also taking account of the mean

resultant lengths. This method will minimise the chord lengths between points within

the circle.

Assume the 0jj's (i = 1, 2 ,...., p; j = 1, 2 ,...., q) are independently distributed as

M (/t0 + /3j, k) where /*0 is some overall mean direction and 0j is the possible effect

due to treatment j. The null and alternative hypotheses are

H0 'Pi @ 2 ** ......... “ Pq — k 2 — .............— kq

against

H1 * 0 2 * ......... * /3q k , * k 2 * .............* kq

or the testing of different populations. To overcome this, the equality of the

concentration parameters must be examined prior to any examination of the main

effects, in a similar manner to that undertaken in standard 'linear' analysis of

variance.

8.2.1 Total Measure of Variation (TM V)

It was shown in Section 7.3 that 2kS0 is equivalent to

P q

k I 1 (6U " Vo ~ Pj>2 (8.2.1)i= l j - 1

- 136 -

Page 151: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

Hence under H 0:

P q P q

kl I <#u -f» -ep2 = kl Ii - 1 j - 1 i -1 j - 1

Using the above expression the total measure of variation will be calculated from the

sum of the squared distances between each sample point, 0jj, and the overall sample

mean direction, 0 , as illustrated in Figure 8.2.1.

22

3Z

Figure 8.2.1 Vector Lengths for the Total Measure of Variation

Here the mean resultant length of 0 _ is utilised rather than effectively extending it

to the circumference of the circle. From vector algebra and basic directional data

properties we have, denoting 10jj i as the vector length:

- 137 -

Page 152: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

p q

kli -1 J- l

(8 .2 .2 )

p q

[ *^.ij 1 2 + !£. . ,2 “ 2 *£.ij 1 *£... 1 c o s i j “ 0. . ) ] i -1 j - l

p q

‘ 2 2i -1 j - l

R2 2R1 + —— cos(0js - 0 )

N2 N

R2 2R2N + -

N N

TMV - kR2

N -N

( 8 . 2 . 3 )

8.2.2 Residual Measure of Variation (RMV)

The residual measure of variation will be calculated from the sum of the squared

distances between each sample point, 0jj, and its own sample mean direction, 0 j, as

illustrated in Figure 8.2.2.

P q

k l h i a - t . p 2i - i j - i

( 8 . 2 . 4 )

p q

k J J [ *£.i j 1 2 + ' i . j ' 2 ■ 2 ,£ i j l lL. j 1 c°s i j - 0 j ) ]

i - 1 j - l

p q

‘ 2 2i - 1 j - l

RJ.i 1 + - JR .(N1

Nf j N . j .

c o s ( 0 j j - 0 j )

r q R2 . • J

q R2 . • Jk N + ) - 2 yL

L J - l N - j .L a

J - l

- 138 -

Page 153: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

r q R2 . • JRMV - k n - j

LL J-i N - i

22,

32.

Figure 8.2.2 Vector Lengths for the Residual Measure of Variation

8.2.3 Between Measure of Variation (BMV)

By incorporating the resultant length with its respective angular mean, as discussed in

Section 7.3.3, we may produce the same overall resultant length and angular mean

when they are combined.

Let us now construct the between measure of variation in the same manner as for

the residual and total measures. The between component will be calculated from the

sum of the squared distances between the sample angular means and the overall

angular mean. This measure is illustrated in Figure 8.2.3.

- 139 -

Page 154: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

Figure 8.2.3 Vector Length for the Between Measure of Variation

P Q

k l hi-i - i j - i

j( 8 .2 .6 )

p q

i - 1 j - l

p q

* 2 2i - 1 j = l

R2 . R2• J

N2 . N 2 • J

R . •J

N Jj

Rcos(0 #j - 6 ' )

3 R2 . • J

R2 R2y + — - 2 — ( 8 . 2 . 7 )L

U - 1 N - j . N. . N ' * 4

f 3 R2 . •J

R2BMV - k I _ _J_L

LLj-i N - j . N. .

( 8 . 2 . 8)

- 140 -

Page 155: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

Lemma. To show ^ Rjcos(0j - 8 ) - Nr -j “ l

fo r equat ion ( 8 . 2 . 7 )

Proof

J R^ jcos(0 j - 0 ' ' )

j “ l

q* J N j r j (cos 0 cos 8 j + s in 0 s in 0 . j )

j - l

■ ij - i

N . j r . j

X X • J

r r •Jj

y . . y . j

lr,- r-jJ

X y . .

r * * -

Nx +r

Ny

N[x2 + y 2 J - N r

8.3 Cross Product Terms

As discussed in Section 7.4, in standard 'linear' analysis the cross product terms sum

to zero, however, with the extended models for circular statistics a non-zero value is

obtained. By taking account of the mean resultant lengths this property is

re-established. Expression (8.3.1) shows the components within the design

- 141 -

Page 156: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

p q p q

k l h e n - I - » . j > +i - i j - i

( T o t a l )

i - 1 j - l

(Res idual )

p q p q

k 1 1 ( $ -J ‘ 6- - )2 ‘ 2k 1 1 ( 9 iJ ‘ ( 8 . 3 . 1 )i - 1 j - l i - 1 j - l

(Between) (Cross-Product)

From (8.3.1) the cross product term may be broken down as follows;

P q

2kl I <*IJ - ‘i - i j - l

p q

“ i ii= i j - i

! (£ j j I l£ . j |cos(0 j j - 0 j )

( 8 . 3 . 2 )

- i£ i j I 10 |cos(0 j j - 0 ) - l £ . j M 0 . j l + 10. j I l£ . . lcos(0 j - 0 _ )

p q

“ i ii - 1 j - l

R• J

N JJ

c o s ( 0 j j - 0 . j ) “R

Nc o s ( 0 j j - 0 )

R2 . • J +

R . •J

R

N2 . •J-l ln j J N . ^

cos(0 < j - 0. . )

2kr ^

yR .

• JR2 q

- IR2 .

• JR2

LU - i N J . N. .

Lj - l N - j . N. .

i.e. The cross product terms sum to zero.

- 142 -

Page 157: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

Hence the one-way classification design model can be decomposed as:

R2 r qr» R2 . •J

R2 r qr* R2 . •Jk N ------ — - k i - — — + k n - y

N LL j - i N . i N . # L

L j - i N J .

(TMV) (BMV) (RMV)

8.4 Analysis of Cross-Classification and Interaction

8.4.1 The Interpretation of Interaction on the Circle

The extent of the problem within the original approach was not fully appreciated

until extensions were made to larger designs. Before constructing the statistical test

components for larger designs using the approach of Section 8.2 the understanding

and interpretation of interaction on the circle will be discussed.

To illustrate the meaning of interaction on the circle a simple example giving

population angular means for a particular design may be show. (Table 8.4.1).

Table 8.4.1

Treatment Level

A B C

1 356* 6* 357*F a c to r

2 0* 14* 1*

Unlike standard 'linear' statistics a simple subtraction of level mean directions may

not be used to indicate the presence of interaction since here periodic values are

given and not quantities. The true angle between each of the directions must be

calculated in order to observe any differences. (Table 8.4.2).

- 143 -

Page 158: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

Table 8.4.2

Treatment Level

A D i f f B D i f f C

F a c to r

1

D i f f

2

356*

( 4 * )

0*

( 1 0 * )

( 1 4 * )

6*

( 8 * )

14*

( 9 * )

( 1 3 * )

357*

( 4 * )

1*

We note that the difference in direction between the two Factor types is larger for

the middle level of treatment than for the low and high levels. Similar differences

are seen between any two treatment levels and the two Factor types. With unequal

differences in sample angular means we may state that interaction exists between

factor and treatment levels.

As in linear statistics graphing the cell mean directions can be an aid in interpreting

the interaction. For example, consider an experimental design that involves three

levels of treatment and two levels of a factor. Lines are used within a unit circle to

represent the cell mean directions for the experiment. Table 8.4.3 gives the sample

means for the design, with equal concentration parameter within each cell.

Table 8.4.3

Treatment Level

A B C

1 270* 320 ’ 40*F ac to r

2 170* 220 ’ 300*

- 144 -

Page 159: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

Figure 8.4.1(a) indicates no difference in the length of the line segments between

level A and level B of the treatment. Figure 8.4.1(b) shows no difference in the

length of line segments between level B and C of the treatment. Similarly Figure

8.4.1(c) shows no difference in the length of the line segments between level 1 and

level 2 of the factor. Here Figure 8.4.1 illustrates an experimental design where no

interaction is present.

(a ) D i f ferenc e between (b) D i f fere n c e betweenlev e ls A and B o f treatment leve ls B and C o f treatment

2Z

(c) D i f ference between leve ls 1 and 2 o f f a c t o r

Figure 8.4.1 Mean Responses without Interaction

- 145 -

Page 160: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

Alternatively Table 8.4.4 and Figure 8.4.2 illustrates the same design layout but with

interaction present.

Tab1e 8 . 4 . 4

Treatment Levels

A B C

1 270* 240’ 340*Factor

2 280* 360’ 60*

23

(a) D i f ference between (b) D i f feren ce betweenlev e ls A and B o f treatment lev e ls B and C o f treatment

22.

23

(c) D i f ference between leve ls 1 and 2 o f f a c to r

Figure 8.4.2 Mean Responses with Interaction

- 146 -

Page 161: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

As in 'linear' statistics when the data indicates that large interactions exist, it is

important to consider whether large interactions actually are present in the sample

means or whether there may be some other explanation for the occurrence of the

interactions in the data.

Unexpected interactions may be caused by a problem in the data, there may be an

outlier or an erroneous response. Possibly another effect may be taking place which

has not been accounted for. In experiments involving animals, for example, where

room or time of day may give an apparent interaction when none exists. In such a

case, the errors can no longer be said to be random or independent. Thus an

unexpected interaction may be a clue to a failure in meeting the assumptions of the

model being used.

8.4.2 The Calculation of Interaction on the Circle

The between cells sum of squares in 'linear' statistics can be split up to produce a

between rows, between columns and interaction sum of squares. For a two way

analysis the individual cell interaction values are made up of three components, the

distance from the overall mean to the cell mean (x - xjj ), from the row mean to

to the cell mean (xj - xjj ), and from the column mean to the cell mean

( \ j . - % ) •

(x - x j j ) - (Xj - X i j ) - ( x j i - x i j . )

Giving

X i j - x j - x j + x ( 8 . 4 . 1 )

For the overall interaction this distance is squared and summed over all observations.

For interaction on the circle Figure 8.4.3 illustrates the sample mean direction for

the three vector lengths involved.

- 147 -

Page 162: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

Figure 8.4.3 The Components of Interaction

Interaction on the circle may then be given by;

p q m

* 1 I I < £ i j . - £ i . . - i . j . - L . Ji = l j = l Z=1

( 8 . 4 . 2 )

p q m

- * I I Ii - 1 j - l Z- l

R? . R? + - 1

N? . N?. i j . l

R2 . R2+ - 1I 1 + —

N2 . N2

- 2R. .

i jNi j

R.1 cos(0 j j^ - 0 i m' ) - 2

‘i j

L« i j

R

NcosC^jj^ - 8 j )

- 2R.

1

Nicos( 0 j - 8' ' ) - 2 VJ_

LN. j - j Ncos (6 j - 8 )

- 148 -

Page 163: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

+ 2R

NL L

i J . J

cosN

‘i . .

N i . .

•J

Ncos(0j - 0 s )

j .

( 8 . 4 . 3 )

R.i j

Njj

R .J

It can be shown that

p q in

1 1 1i - 1 j —1 Z—1

p q m

2 2 2i -1 j - l Z -l

p q m

2 2 2i -1 j - l Z -l

p q m

2 2 2i -1 j - l Z -l

p q m

2 2 2i -1 j - l Z -l

p q m

2 2 2i -1 j - l Z -l

R, , R* Pn R2i j . i . . c o s ( 6 j j ^ - « i . . > - )

l . .

Ni j * . Ni . . .Li

i - 1 Ni . .

R

lNU

R.l . . R .• J.

_N i . . NJ ,

c o s (0 i j . - 0 . j . ) " \

j - l

R2

C O S ( 0 j # # - 0 5 )

N

R2

N

R2

N

R2

N

R2

N

Replacing these into (8.4.3) gives

r p q

y y R2 .IJ .

p_ y R?l . .z zLi-1 j - l Ni j . .

zi - 1 Ni . . .

Ij “ l

RJN

R2 + —-

N( 8 . 4 . 4 )

- 149 -

Page 164: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

A comparable breakdown of the between measure of variation for the circle can now

be seen as

r p q

y y N j .] R2 r p y R? . . ' R2+ kr qy fRJ . l R2L Il i - i j - i Ni j . . N. . .

zLi - l Ni . . . N. . .

zLj-i NJ . . N. . .

(Between Factor 1 (Between F acto r 2 or Rows) or Columns)

+ k

r p q

I I fR?J.lp

- IR?

i .. q

- IR2 .

•JR2

+ ■ - - -La La

L i - l j - l [ " t J . JL

i - 1 [ N i . JL

J - l k j j N( 8 . 4 . 5 )

( In t e r a c t ion)

In the same manner as Section 7.4 the cross product terms may be found for the

two-way classification with interaction design. Following lengthy vector algebra we

may show that the cross product terms listed below all equal zero.

p q m

2k 1 1 1 " Z - - - ) ( I-J* " I . . . * “ 0i-1 j - l Z -l

p q m

2k I I I (I i . . " Z. . . } (Z i j . - Z i . . “ Z. j . + Z. . . > * 0i -1 j - l Z - l

p q m

2k I I I (Ii.. “ " l i j^ "°i - 1 j - l Z- l

p q m

2k 1 1 1 (^ -J - " Z . . . > < i i j . ’ I * - . " I . J . + " 0i -1 j - l Z- l

p q m

2k 1 1 2 (Z . j . - Z . . . ) (i . j z - Z i j . ) - oi - 1 j - l Z- l

p q m

2kI I I (*ij* - Zij.xZij. -Zi.. -Z.j. +Z...> - 0i - 1 j - l Z- l

- 150 -

Page 165: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

Finally the two-way analysis of variance model with interaction may be given as

R2XT * * * — U

r p y R24- k

r qy fRJ . l R2

N. # # z. i -1 Ni . . N. . .

zL j - i N J . . N . . .

+ kr p q

I I Ri j -p

- 1

1 i

q

- 1R2 .

•J-R2

+Li Li

L i - l j - i LNi j J Lii - 1 [Ni . . J

UJ - l k jJ N . . .

p q R2 .+ k n ■ I 1 i j -

Lt uL i - i j - i Ni j * .

8.5 Other Design Models

8.5.1 The Randomised Complete Block Design

Section 6.3 discussed the requirement and structure of the randomised complete block

design where the blocks are formed so that each is as homogeneous as possible.

Within this design no interaction exists. The vector difference (0jj - 0 ) i.e. the

total measure of variation, may be expressed as the sum of three terms

<®ij “ " (Z i . " I . .> + (Z . j - + (i i j " Z i . ~ Z . j + Z. .>

( 8 . 5 . 1 )

Both the first two terms on the right hand side of (8.5.1) have been seen in Section

7.3.3 and give

P q

k l I <Iti - 1 j - l

p q

k l I < * .i - i j - i

r p R2 R2- £ _ ) 2 - k y l . - —l-l

Li. i -1 Ni . N . .

r qn R2 : R2 'j - Z . . ) 2 “ k I

• j

L j - i N - j . N . .

( 8 . 5 . 2 )

( 8 . 5 . 3 )

- 151 -

Page 166: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

Note that (8.5.2) provides an estimate of the measure of variation between p

treatments and that (8.5.3) provides an estimate of the measure of variation between

q blocks. The third term on the right hand side of (8.5.1) is an estimate of the

residual measure within the design. Using the same procedure as Sections 8.2 and

8.4 it can be shown that

i -1 j - l

r pv qV R2+ 0 ) 2 - k N - ) - / + — —

Li -1 Ni .

j - l N-j. N . .

( 8 . 5 . 4 )

In the same manner as for the two way classification with interaction the cross

product terms all equal zero. Hence the randomised complete block design may be

given as

R2 r pV’ Ri . ‘ R2 r[R’ j]

R2k N -----— - k y - --— + k y - ——

N LLi-i N‘ -. N. .

LLj-i ."•J. N. .

PN - I

P2Ki.

q- I

R2R-j

R2

Li-1 Ni*.

Lj - l N-j. N_ _

8.5.2

+ k

Latin Squares Design

( 8 . 5 . 5 )

In the randomised complete block design, the effect of a single factor was removed.

It is occasionally possible to eliminate two sources of non-homogeneity simultaneously

in the same experiment by using the Latin square design. Such designs were

originally applied in agricultural experimentation when the two directional sources of

non-homogeneity were simply the two directions on the field, and the "square" was

literally a square plot of land. Its usage has been extended to many other

applications where there are two sources of non-homogeneity that may affect

experimental results, for example, machines, positions, operators, runs, days. A third

variable, the experimental treatment, is then associated with the two source variables

- 152 -

Page 167: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

in a prescribed fashion. The use of Latin squares is restricted by two conditions:

(i) the number of rows, columns and treatments must all be the same

(ii) there must be no interaction between row and column factors

The analysis of Latin squares is based on essentially the same assumptions as the

analysis of randomised blocks. The essential difference is that in the case of

randomised blocks we allow for one source of non-homogeneity (represented by

blocks) while in the case of Latin squares we are simultaneously allowing for two

kinds of non-homogeneity (represented by rows and columns). As with the

randomised complete block design the relatively simple but lengthy proof of

construction via vector algebra has not been reiterated, however, all cross product

terms can be shown to equal zero, producing the model expression:-

R2N -

N

if

Ii -1

Ri . .

Nl . .

R2

N+ k i

■J-l

R2 . •J

N J-J

R2

N

+ /cr p

yR 2R. . z R2

zlZ-1 N. . .

+ kP

N - Ji - 1

R? l ..Py R2 .

•J*Py r : . z' R2

+ ? * * *

Ni . . .Z

j- i NJ . .z

Z- l n . . * . N. #( 8 . 5 . 6 )

- 153 -

Page 168: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

8.5.3 Nested or Hierarchical Design

Section 6.2 discussed the structure of the nested design; as with the randomised

complete block and Latin square designs the nested design may also be constructed in

the same manner to produce the model expression;

Page 169: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

8.5.4 Three-Way Classification

Clearly larger and larger designs can be constructed in the above manner, here a

three-way classification design has been built to illustrate the generalised nature of

the approach;

R2N -

N

+ k

r p y M*

K) R2Ll i - i Ni . . N . . .

' my R! . z R2

zlz - 1 N. . .

+ k iLj-1

R2 . •J

N

R2

N

+ kr p q

I I

R? . i j -

P

- 1 R?l ..q

- 1 R2 .• J.

R2 + — ■La La

Li-l J- i iNijJ Li -1 [Ni . . j

LJ- l k j . J N. . .

+ k

p m

i i ^ - i=1 (_1 i . I

+ k

+ k

i - 1 Z- l

q m

i i .

P m q

i i ii - 1 Z- l j - l

i - 1

q

ij - i

R?1

Nj

R2

N

m

- I

J-J

Z- l

m

R? . i

N . . Z

R2

- 1Z- l

r : . z

N . . Z

N

R2

N

R? . . i j *

p q

- y yR? .

i j -q m

- 1 1.Ni j I.

La La

i - 1 j - l k j . JL L

j - l Z - l k j / J

p m

I Ii - 1 Z- l Ni A i - 1

R?l * IJ - l

R2 . •J

N ■J.J

m

iZ -l

R?.z' R2

N. . * . N. . .

+ k

p q. m

- i l li —1 j —1 Z—1

R2 . .i j *

lNi j / J

- 155 -

( 8 . 5 . 8 )

Page 170: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

Chapter 11 shows further use of this generalised method with the implimentation of a

split plot design to real data arising in an angular form.

8.6 Summary

It was seen in Chapter 7 how the combining of angular mean directions may give a

false overall mean direction. Taking account of the mean resultant lengths together

with their corresponding mean directions has been shown to eliminate this problem

and has helped to indicate a new approach to the circular analysis of variance. It

has been emphasised that the approach is, in the first instance, an analysis of

differing populations, via the maximum likelihood ratio test, rather than differing

mean directions. The requirement for the testing of the equality of concentration

parameters is essential in order that a true test of mean directions may be

undertaken.

Using the knowledge of angular mean combinations, the use of vector algebra and

directional data properties, new components for the total, residual and between

measures of variation have been built. The significance of this method has shown

how the cross product terms, found to be non-zero for the original approach in

Chapter 7 and requiring a correction factor, are now zero.

This method has then been extended to discuss, explain and illustrate the construction

of interaction on the circle and hence build the new procedure for the two-way

classification design, showing zero cross product terms.

Finally this generalised method has been further extended to construct other larger

designs such as the Latin square and three-way classification design.

- 156 -

Page 171: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

CHAPTER 9

TH E DISTRIBUTION OF TH E NEW COMPONENTS AND TEST STATISTICS

Having developed the generalised procedures in Chapter 8 it is necessary to examine

the theory of the associated statistical tests for all experimental situations which may

occur. Some attempt was made to evaluate theoretically the exact distributions of

the test statistics. However, as was found by Upton (1972) and Stephens (1969, 72),

even for simple single sample tests the numerical integration involved was extremely

tedious. R does not have a simple density function and a direct evaluation of the

significance points is not straightforward. Stephens (1969) gave the upper and lower

1% and 5% points for several values of k and N for R/N and X/N. Stephens

(1972) has also evaluated the exact two-sample test given by equation (4.2.2) for

differing values of k and N. In 1969 Stephens discussed the problem of obtaining

the exact theoretical distribution of

for the sphere and circle developed by Watson (1956) and Watson and Williams

(1956). Stephens also stated that the analysis is not so straightforward as for the

Normal distribution; with the distribution of the test statistic being intractable. This,

unfortunately, is also true for the distribution of the test statistic

from (8.3.3). However, the asymptotic results, as for the test statistic by Watson

and Williams, may be investigated.

9.1 Introduction

9

u

- 157 -

Page 172: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

Sections 9.2 and 9.3 will examine the component statistics for small and large k

respectively. In addition the first two moments for each of the components are

found and used to improve the associated approximation. The accuracy of the

chi-squared distributions and their corresponding F statistics are examined via the

same simulation techniques used in Chapter 6.

As with standard 'linear' analysis of variance, it is important to reiterate that because

of the way in which the new procedures are constructed, the equality of the

concentration parameters are examined prior to any analysis of variance.

9.2 Small Concentration Parameter, k

As given in Chapter 8 the new procedure is based on the likelihood ratio test for

the null hypothesis /31 = ...... = /3q, k 1 = .... = kq against its general alternative

hypothesis. Let X be the likelihood ratio criterion for this problem, as in (3.1.5),

giving the test criterion

-2 logX = 2 J f c j R . j - i j l + l N . j l o g

j - l J- l

10( ^ 0)

M * J ) .

A A

where k 0 and kj are given by

R RJA (k 0) = - A ( k j ) =

• J

NN-j.

j = 1 , 2 q

( 9 . 2 . 1 )

(9 .2 .2 )

The power series for the ratio of Bessel functions I Q(k) and I,(A:) for small k are

given in (3.2.8). The first term approximation, A (k) = kl2, has been shown to be

tolerable for k z 1. Using this in (9.2.2) gives

^0

2R

N

2R

N Jj

j = 1 ,2 , . . . , q ( 9 . 2 . 3 )

- 158 -

Page 173: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

also given in (3.2.11) as approximation k 7. Hence for small k (9.2.1) reduces to

( 9 . 2 . 4 )

This same procedure is given by Mardia (1972) for his S statistic. Identity (9.2.4) is

seen as the between measure of variation (8.2.8), for small k.

From Rayleigh (1919) and Lord (1954) it has been shown that 2R 2/N has a

chi-squared distribution with 2 degrees of freedom. This result was given by

Rayleigh after finding the form of the p.d.f. of R, equation (2.3.14), for large N.

This result has been used to test whether the population from which a sample is

drawn differs significantly from uniformity. We cannot, however, show 2 (N -R 2/N ) to

be chi-squared in a similar manner. Following preliminary investigations and

simulation routines both 2 (N -R 2/N) and

are found to be what may be termed as negative or reflected chi-squared

distributions (or random variables), negatively rather than positively skewed. With

the inclusion of N the chi-squared is effectively transformed to a reflected

chi-squared with its variance decreasing as k increases.

For small k we may only investigate the between measure as produced from

maximum likelihood. This chi-squared may be improved following the equating of

expectations, using

E(R2j ) - N . j + N j ( N j - l ) p 2 from ( 2 . 5 . 2 )

M*>where p = A(k) = and

I 0W

E< * I ( q - l ) > "- 159 -

Page 174: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

Producing the improvement factor

1( 9 . 2 . 5 )

1 - p 2

Therefore we may approximate the distribution

2 r 9 y [R‘jl R2

1 - p2 zLj-i N

( 9 . 2 . 6 )

by a x 2 variable with 2 (q -l) degrees of freedom. When k is unknown, the

maximum likelihood estimator k , given by (3.2.2), will be used.

9.3 Large Concentration Parameter, k

Watson and Williams showed that 2k(N-R) is approximately distributed as a

chi-squared with (N - l) degrees of freedom. For the procedure discussed in Chapter

8 we are required to show that fc(N-R2/N) is distributed as chi-squared with (N - l)

degrees of freedom, for large k. As a simple initial proof we may show that

n - N k(N - R)

For large k, R/N—>1, therefore

2k(N - R)

1 + I

R2N - — N

Alternatively we may adapt the approach of Mardia (1972, p 114) where the

distributions of 2fc(N-C), 2fc(R-C) and 2fc(N-R) are found. Here the distributions of

& (N -C 2/N )), fc((R2/N )- (C 2/N )) and fc(N-(R2/N )) are required.

- 160 -

Page 175: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

9.3.1 Distribution of k {N - C 2/N)

Let 0 be distributed as M(0,fc), then for large k we wish to show that

~(N - C)k(N + C) - * 2 ( 9 . 3 . 1 )

Let

N

€ — J /c(l - COS 0 j ) ( l + COS 0 . j )

j - l

N

i -j=l

N

i -j “ l

1 -

e 2 . e A .i +

2! 4!1 +

®?j 6U 1 1 + —

2! 4!

2 e A . e A. 0 G.2 l i _ • J _ -j• j 4! 212! 4!2!

For large k, 0 j is small

N

e * J

j“ l

From (2.2.2) 0 j may be approximated by N(0 ,k~%) and therefore kd j 2 will be

distributed as the square of the standard normal variate which is approximately a

chi-squared distribution with 1 degree of freedom. Hence, by the additive property

of chi-squared

N

ij “ l

kd2 . « k • J

C2N " N~ * *N ( 9 . 3 . 2 )

- 161 -

Page 176: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

9.3.2 Distribution of fc((R2/N) - (C2/N))

It can be seen that

£ (R + c )(R - C) - k (1 - cos0) (1 + cos0) ( 9 .3 .3 )

It has been shown by Mardia (1972, p 98) that the conditional distribution of 0

(sample mean direction) given R is M (0 ,kR). On using (9.3.1) the conditional

distribution of (9.3.3) for given R is x? which does not depend on R.

9.3.3 Distribution of k{N - (R2/N ))

Following the identity

c 2 R2 C2 R2N =■ k N " N + k N - — N

and using (9.3.2) and (9.3.3), by the additive property of chi-squared it can be

shown that

where k ((R 2/N )- (C 2/N )) and k (N -(R 2/N )) are independently distributed.

In the same manner as Watson and Williams original expression, (9.3.4) behaves like

the similar form found in 'linear' analysis of variance.

Using a similar approach, it follows that for large k

( 9 . 3 . 5 )

k N ( 9 . 3 . 6 )

- 162 -

Page 177: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

Therefore

H - I

j “ l

R2.J

N • j jXN-q k

r v R?j' R2) - — fSL

U-i N.j. N» V 2

Lq - i( 9 .3 .7 )

As for small k these approximations may be improved following the use of

expectation (2.5.2). For large k , equating expectations gives an improvement factor

1( 9 . 3 . 8 )

k ( l - P2)

Therefore an improvement, when k is unknown, is made by replacing k by (9.2.5)

1N -

R2

N N -l ( 9 . 3 . 9 )1 - p 2

The remaining components also require the same improvement factor, giving;

1 R2N -------

N

1 r *y R?j' R2

- P2 1 - p 2 LL j- i N . j . N

1

1 - p » - 1j -1

R2 . •J

N.JJ

( 9 . 3 . 1 0 )

with associated chi-squared distributions

*N -1 " Xq-1 + XN-q

9.4 The Variance of the Component Chi-Squared Approximations

9 . 4 . 1 Variance o f

Var R2 R22

R2N - —— N - E N - ——

N E N - — N

= E(N2) - 2E(R2) + ER4

N2- [ ( N - l ) 2 - 2(N - l ) 2p 2 + (N - l ) 2p4 ]

- 163 -

Page 178: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

Using expectations E (R 2) and E (R 4) given by (2.5.2) and (2.5.3)

- 2(1 - N ) p ’ - 1 - (N - 1 ) V + 575 ■(N -" 4 ) |P/I

+ (T ^ 3 ) T p2(2 + + ( N ^ 2 ) I (2 + 4p2 + p V + N ( 9 .4 .1 )

Let (9.4.1) be represented by T . For large k, equating T to the variance of its

associated chi-squared

k 2T - 2(N - 1)

Let S be the improvement for variance, therefore

S2 2(N - 1) k 2T

Re-equating expectations gives

( 9 . 4 . 2 )

( 9 . 4 . 3 )

N - N + (N - 1 ) [1 - Sk( l - p 2) ] « ( 9 . 4 . 4 )

This lengthy expression has been tested on simulations for large k (> 2) and

excellent chi-squared approximations produced. However, with the adjustment for the

expectation given in (9.4.4) it is possible for negative values to be obtained.

Similarly, as part of an analysis of variance procedure (9.4.4) would be impracticable.

- 164 -

Page 179: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

In order to study the variance (9.4.1) further, the asymptotic expansion of p has

been used. For reasonably accurate approximations the first four terms of the series

A(k) are required; for large k

1 -

1 1 1

2k 8k 2 8k 3

1

from ( 3 . 3 . 2 )

1 9 1 11 + + + -------

k 8k 3 64k4 32k s 64k6

2 1 1 1

k k 2 4 k 3 4k4

( 9 . 4 . 5 )

When these equations and those of p i and p 4 are substituted into (9.4.1), the

variance of the chi-squared approximation,

Var R2 1 1 -53 89 5NM - __N ~ *£ t(2N 2 ) + N " N + k 4 32N + 32 “ 32

Therefore

1 -9 ^ 7 5N 1 -1 7 9N 1k 5 m + 8 ~ T T + k 6 T3n " 32 32 + 0 k 7

( 9 . 4 . 6 )

VarD 2

N - — N 2(N - 1) + £ ( 9 . 4 . 7 )

This interesting result shows that the variance of the component is equal to the

variance of the chi-squared approximation plus further smaller terms. These terms

are found to be negative and heavily dependent on the size of k and N. Higher

terms may only be neglected when k and N are relatively large. Further proof of

this is discussed in Section 9.5.

- 165 -

Page 180: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

r q[R f j l9 . 4 . 2 V ar iance o f n - yLa

j - 1 N -j .

Following the same procedure as Section 9.4.1, let q be the number of samples and

c the number of observations within each sample and of equal size

Var ■ - 1 j - i

R2

lN-jJ-q - p 2[ 2 (c - l ) q ] - p4[ q ( c - l ) 2 ]

c! 2c!P4 + --------------- P2( 2 + p 2)

(c - 4 ) ! (c - 3 ) !

c!

(c - 2 ) !

Using the series equations of (9.4.5), for large k

(2 + 4p2 + p 2) + c

Varr qN - I [Rj ]

1------- ( 2qc

1- 2q) + —

q-qc + —

Uj -1 N -j . k 2 /c3 c

( 9 . 4 . 8 )

1 -5qc 89q 53q 1 -5qc 7q 9q '+ — ------------------ + -------------- - -------------- + — ------- + — - -----

32 32 32c /c5 16 8 16c

1 9qc 7q lq 1+ — ----- + — _ ----- + 0

k 6 32 32 16c k 7( 9 . 4 . 9 )

Therefore

r q v 1 q

Var< k N - ) - 2q(c - 1 ) + - -qc + —La

J-1 N . j . k c- - ■

( 9 . 4 . 1 0 )

As with the total component of variation the variance of the residual component is

equal to the variance of the chi-squared approximation plus smaller negative terms

dependent on the size of k and N.

- 166 -

Page 181: The development of analysis of variance techniques for angular …shura.shu.ac.uk/19759/1/10697061.pdf · 2018-05-14 · 7.1 Introduction 115 7.2 Robustness of Assumptions 115 Example

9.4.3  Variance of k[ Σ_{j=1}^{q} R²_{.j}/N_{.j} - R²/N ]

Let q be the number of samples and c the number of observations within each sample, all samples being of equal size; the exact variance of Σ_{j} R²_{.j}/N_{.j} - R²/N again follows as in Section 9.4.1. Using the series equations of (9.4.5), for large k,

    Var[ Σ_{j=1}^{q} R²_{.j}/N_{.j} - R²/N ] = (2q - 2)/k² + O(1/k³)                      (9.4.12)

Therefore

    Var[ k( Σ_{j=1}^{q} R²_{.j}/N_{.j} - R²/N ) ] ≈ 2(q - 1) + O(1/k)

Again it is important to note that the variance of the between component is equal to the variance of the chi-squared approximation plus smaller further terms. However, Section 9.5 shows how the sizes of k and N within the components have a substantial effect on the size and accuracy of the variance, particularly so for the between measure of variation.
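Before turning to the simulation results of Section 9.5, the behaviour described in Sections 9.4.1 to 9.4.3 can be illustrated numerically. The following Python sketch is a minimal simulation under stated assumptions: it uses NumPy's built-in von Mises generator rather than the Appendix B method, takes k as known, and simply compares the simulated means and variances of the three components with the chi-squared values N - 1, q - 1 and N - q and twice those values.

    import numpy as np

    rng = np.random.default_rng(0)

    def R_squared(theta):
        # squared resultant length of a set of angles
        return np.sum(np.cos(theta))**2 + np.sum(np.sin(theta))**2

    def components(samples, k):
        # total, between and residual components of the one-way decomposition
        N = sum(len(s) for s in samples)
        R2_N = R_squared(np.concatenate(samples)) / N
        S = sum(R_squared(s) / len(s) for s in samples)
        return k * (N - R2_N), k * (S - R2_N), k * (N - S)

    k, q, c, reps = 5.0, 3, 10, 10000
    sims = np.array([components([rng.vonmises(0.0, k, c) for _ in range(q)], k)
                     for _ in range(reps)])
    N = q * c
    for name, col, df in [("total", 0, N - 1), ("between", 1, q - 1), ("residual", 2, N - q)]:
        print(name, sims[:, col].mean(), sims[:, col].var(), "expected", df, 2 * df)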

9.5 The Adequacy of the New Procedure, for Large k

As discussed in Chapter 4 and shown in Chapter 6, the approach of Watson and Williams, as adapted by Stephens, is the principal approach for testing differences between the mean directions of different samples with an assumed equal concentration parameter. The procedure developed in Chapter 8 is ultimately designed to analyse larger experimental situations where extensions of Watson and Williams have been shown to break down. However, it is necessary to compare the accuracy and power of the new test statistics with the alternative tests for the one-way classification before investigating the procedure's suitability for larger designs.

As in Chapter 5, for testing the adequacy of the extended techniques of Watson and Williams, simulation techniques have been used to examine the accuracy of the approximations to the distributions of the components of (8.3.3). The observations from the von Mises distribution specified by the null hypothesis were generated by the computer method outlined in Appendix B. 10,000 sets of samples of various sizes were drawn from the von Mises distribution with k = 2, 3, 4, 5 and 10. For the sake of conformity, and in order to check further that the simulation techniques


being used were satisfactory, the same sample sizes are analysed as in Upton (1974).

Five multi-sample experimental designs are examined varying in size, with N ranging

from 10 to 60.

To verify the findings of Section 9.4, the first two moments of the components for the total and residual measures of variation are given in Table 9.5.1. The mean of the chi-squared approximation for each component is seen to be an excellent fit, increasing in accuracy as k increases. If k is unknown and (9.2.5) is used in its place, the accuracy of the chi-squared approximation mean value is maintained. As shown in Section 9.4, the variance of each component is seen to be below its chi-squared approximation value and dependent on the size of N and k. As N and, predominantly, k increase, the accuracy of the variance increases.

This deficiency within both components is reflected in Tables 9.5.2 and 9.5.3, showing the accuracy of the upper percentage points against the χ² distribution with associated degrees of freedom. It is clearly seen that the simulated proportion of the components approaches α, the theoretical χ² significance level, as N and k increase. Similarly, owing to the size of the chi-squared approximation variance values, the probability of accepting the null hypothesis when it is in fact false, a type II error, is increased, and unacceptably so for k = 2 and small N. Tables 9.5.2 and 9.5.3 also give the comparable accuracy of the Watson and Williams component as improved by Stephens. For ease of comparison the same 10,000 sets of observations for each set of samples were used. It is seen for both components that while the new procedure components overestimate the theoretical proportion, following Stephens' improvement the Watson and Williams components slightly underestimate the theoretical proportion or significance level of the associated χ².
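A check of the kind reported in Tables 9.5.2 and 9.5.3 can be sketched as follows. This is an illustrative outline only: it again assumes NumPy's von Mises generator in place of the Appendix B method, uses SciPy's chi-squared quantiles, and the sample layout (three samples of ten observations) is chosen arbitrarily.

    import numpy as np
    from scipy.stats import chi2

    rng = np.random.default_rng(1)

    def total_and_residual(samples, k):
        # total and residual components of the one-way decomposition
        N = sum(len(s) for s in samples)
        R2 = lambda t: np.sum(np.cos(t))**2 + np.sum(np.sin(t))**2
        R2_N = R2(np.concatenate(samples)) / N
        S = sum(R2(s) / len(s) for s in samples)
        return k * (N - R2_N), k * (N - S)

    k, q, c, reps, alpha = 5.0, 3, 10, 10000, 0.05
    N = q * c
    crit_total = chi2.ppf(1 - alpha, N - 1)      # upper chi-squared percentage point
    crit_resid = chi2.ppf(1 - alpha, N - q)
    below_t = below_r = 0
    for _ in range(reps):
        t, r = total_and_residual([rng.vonmises(0.0, k, c) for _ in range(q)], k)
        below_t += t < crit_total
        below_r += r < crit_resid
    # simulated proportions falling below the critical value; the theoretical value is 1 - alpha
    print(below_t / reps, below_r / reps, 1 - alpha)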


TABLE 9.5.1  Comparing the mean and variance of the χ² approximations for the total and residual measures of variation against their expected values

TABLE 9.5.2  Comparing the accuracy of the two χ² approximations with N - 1 degrees of freedom representing the total measure of variation
The table gives the proportion of 10,000 Monte Carlo samples for which the two approximations fell below the χ² value for which the theoretical proportion should be α.

TABLE 9.5.3  Comparing the accuracy of the two χ² approximations with N - q degrees of freedom representing the residual measure of variation
The table gives the proportion of 10,000 Monte Carlo samples for which the two approximations fell below the χ² value for which the theoretical proportion should be α.

Table 9.5.4 gives the first two moments for the between measure of variation component. Once again the mean value of the component chi-squared approximation is a very good fit, increasing in accuracy as k increases. The deficiency in the accuracy of the chi-squared approximation variance is again shown, and is slightly worse than for the total and residual components.

Following examination of the fitted components' percentiles against the upper 50% of the theoretical χ² distribution, several improvement factors were tested to increase the accuracy of the chi-squared approximation variance without greatly impairing the accuracy of the mean. The improvement factor β, (9.5.1), was chosen as a balance between the two moments, acceptable for all large k:

    β = 1 - 1/(5k) - 1/(10k²)                                                   (9.5.1)
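For orientation, the size of this correction for the concentration parameters used in the simulations is easily tabulated (an illustrative calculation only; kβ is the value that replaces k in the between component).

    # improvement factor beta of (9.5.1) for the simulated values of k
    for k in [2, 3, 4, 5, 10]:
        beta = 1 - 1/(5*k) - 1/(10*k**2)
        print(k, round(beta, 4), round(k*beta, 4))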

Table 9.5.4 gives the comparable moments for each set of sample sizes when k is multiplied by β, and shows how the mean of the chi-squared approximation has increased slightly from its desired value whilst the associated variance has improved appreciably. Table 9.5.5 gives the accuracy of the improved between measure of variation at the upper percentage points of the associated χ² distribution, with comparable statistics for Stephens' measure. By introducing β the accuracy of the simulated proportion is greatly improved, increasing in accuracy as k increases.

It is important to note that acquiring this accuracy while the component distribution variances remain below the desired theoretical values implies that the accuracy of the lower percentage points has deteriorated. However, the accuracy of the whole component distribution is still very good, although Stephens' measure has the superior fit across all percentage points.


TABLE 9.5.4  Comparing the mean and variance of the χ² approximations for the between measure of variation against its expected values

TABLE 9.5.5  Comparing the accuracy of the two χ² approximations with q - 1 degrees of freedom representing the between measure of variation
The table gives the proportion of 10,000 Monte Carlo samples for which the two approximations fell below the χ² value for which the theoretical proportion should be α.

Following the introduction of β the test statistic components can now be expressed as;

    k( N - R²/N ) ≈ kβ( Σ_{j=1}^{q} R²_{.j}/N_{.j} - R²/N ) + k( N - Σ_{j=1}^{q} R²_{.j}/N_{.j} )          (9.5.2)

The associated test statistic, Q₁, for the null hypothesis of no difference between the q treatments or samples is given as;

    Q₁ = [ kβ (N - q) ( Σ_{j=1}^{q} R²_{.j}/N_{.j} - R²/N ) ] / [ k (q - 1) ( N - Σ_{j=1}^{q} R²_{.j}/N_{.j} ) ]          (9.5.3)

which has an approximate F-distribution with (q-1) and (N-q) degrees of freedom.
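A computational sketch of Q₁ is given below. It is illustrative only: the function names are assumptions, k is taken as supplied (in practice estimated as described earlier), and SciPy is used solely for the F critical value.

    import numpy as np
    from scipy.stats import f as f_dist

    def Q1_test(samples, k, alpha=0.05):
        # test statistic (9.5.3): improved between measure over residual measure
        q = len(samples)
        N = sum(len(s) for s in samples)
        beta = 1 - 1/(5*k) - 1/(10*k**2)                      # improvement factor (9.5.1)
        R2 = lambda t: np.sum(np.cos(t))**2 + np.sum(np.sin(t))**2
        R2_N = R2(np.concatenate(samples)) / N
        S = sum(R2(np.asarray(s)) / len(s) for s in samples)  # sum of R.j^2 / N.j
        Q1 = (k * beta * (N - q) * (S - R2_N)) / (k * (q - 1) * (N - S))
        return Q1, f_dist.ppf(1 - alpha, q - 1, N - q)

    # illustrative use: three samples of 20 angles with a common mean direction
    rng = np.random.default_rng(2)
    print(Q1_test([rng.vonmises(0.0, 5.0, 20) for _ in range(3)], k=5.0))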

This may now be examined and compared with the associated test statistics from Stephens (4.4.7) and Upton's G-test (4.4.8); see Table 9.5.6. Test statistics (9.5.3) and (4.4.7) are compared to the theoretical significance levels of an F-distribution, while (4.4.8) is compared to those of the χ² distribution.

Test statistic (9.5.3) is seen to be accurate for large concentration parameter, increasing slightly in accuracy as N increases. This dependence on the size of N is more pronounced for k equal to 2. As stated by Upton (1974), the G-test is greatly affected by the size of N and is inappropriate for small sample sizes. Stephens' test statistic is again seen as the 'best fitting' statistic when examined across all large k and varying sizes of N.


TABLE 9.5.6  Comparison of the three test statistics; Q₁ and Stephens' test compared to the F distribution, and Upton's G-test compared to the χ² distribution

9.6 Comparing the Power of the Tests, for Large k

In previous sections the components and test statistics introduced have been compared

with each other principally on the basis of their apparent goodness of fit to their

respective distributions. All these tests involve the use of many approximations in

their construction;

a) The approximation of the ratio of Bessel functions in the likelihood ratio

test as a short power series (Chapter 3.6).

b) The approximation to obtain an explicit value for the maximum likelihood

estimate of the concentration parameter (Chapter 3).

c) The use of the asymptotic result about the distribution of -2 log λ

(Chapter 3.1).

d) Occasionally an approximation is introduced to simplify the test statistic.

e) The modification of the statistic by correcting its expected value

(Chapter 2.5).

It was similarly noted by Upton (1970) that, because of all these approximations, it is often surprising that so many of the tests derived using this procedure are good fits to their respective χ² and F distributions. It is for these reasons that it is important to examine the statistics' relative goodness of fit and establish their range of validity.

What is also important is to examine the relative power of the statistics against any available alternatives. In order to compare the test statistic (9.5.3) against the alternative multi-sample tests given by Stephens (4.4.7) and Upton (4.4.8) it is unnecessary to conduct a complete investigation of all the different sample situations we have already used. Table 9.6.1 gives the relative power of the three tests by studying two sets of samples, with one sample mean direction set approximately 30° from the true value


of the other samples. It is clear from the table that Upton's and Stephens' tests are virtually identical in power, with the Q-test slightly less powerful, improving in comparison as k increases. The power of all three tests naturally increases as the underlying distribution becomes more peaked.
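Power entries of this kind can be approximated by simulation. The sketch below is an assumption of how such a study might be coded for the Q-test alone: the first sample's mean direction is displaced by 30° and the proportion of rejections at the 5% level is recorded.

    import numpy as np
    from scipy.stats import f as f_dist

    rng = np.random.default_rng(3)

    def q1_statistic(samples, k):
        # Q1 of (9.5.3) with the improvement factor beta of (9.5.1); k cancels except in beta
        q = len(samples)
        N = sum(len(s) for s in samples)
        beta = 1 - 1/(5*k) - 1/(10*k**2)
        R2 = lambda t: np.sum(np.cos(t))**2 + np.sum(np.sin(t))**2
        R2_N = R2(np.concatenate(samples)) / N
        S = sum(R2(s) / len(s) for s in samples)
        return (beta * (N - q) * (S - R2_N)) / ((q - 1) * (N - S))

    def power(k, sizes, shift, reps=2000, alpha=0.05):
        # proportion of simulations rejecting the null of equal mean directions
        q, N = len(sizes), sum(sizes)
        crit = f_dist.ppf(1 - alpha, q - 1, N - q)
        means = [shift] + [0.0] * (q - 1)          # first sample displaced by `shift`
        hits = sum(q1_statistic([rng.vonmises(m, k, n) for m, n in zip(means, sizes)], k) > crit
                   for _ in range(reps))
        return hits / reps

    print(power(k=5.0, sizes=[20, 20, 20], shift=np.deg2rad(30.0)))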

As has previously been noted, likelihood ratio theory produces tests that are asymptotically uniformly most powerful; the tests derived in this manner for the required hypothesis will therefore be at least as powerful as any other test. The test statistic (9.5.3), initially constructed by likelihood methods but requiring a further test to eliminate an assumption of unequal concentration parameters, would not be expected to be quite as powerful as one derived without this restriction. Nevertheless this loss of power is not very large and is compensated by its increased range of application as a generalised approach for larger experimental designs.


TABLE 9.6.1  Comparison of the power of three alternative tests, for large concentration parameter

9.7 The Adequacy of the New Procedure, for Small k

As previously stated, the new procedure for small k is based on the likelihood ratio test for the null hypothesis β₁ = … = β_q, k₁ = … = k_q against its general alternative hypothesis. For one-way analysis problems Mardia's likelihood ratio test (4.4.3) for the null hypothesis β₁ = … = β_q with assumed equal k will be asymptotically uniformly most powerful as a direct test of the mean directions. As with test statistic (4.4.3), the comparable test statistic (9.2.6) involves only the main factor effect and cannot take account of the total or, more importantly, the residual or error effect. However, unlike (4.4.3), the loss of power of test statistic (9.2.6) may be compensated by its possible application in larger experimental situations, to be discussed in Chapter 10. Table 9.7.1 compares the accuracy of the two χ² approximations (4.4.3) and (9.2.6) following 10,000 simulations of varying concentration parameter and sample size. The sample sizes range from 10 to 60, using the same sample sets as for large k. Before the tests are applied the equality of concentration parameters is checked using the appropriate tests from Section 4.5. Ranges of small k are given, as its accuracy cannot be assured for such small values.

From Table 9.7.1, test statistic (9.2.6) is seen to be less susceptible to smaller sample sizes than (4.4.3). As the sample sizes increase the accuracy of the two tests becomes more comparable, with both showing good fits to the upper percentage points of the χ² distribution with associated degrees of freedom. For both tests the simulated proportion of the component approaches α, the theoretical χ² proportion or significance level, as N and k increase.


TABLE 9.7.1  Comparing the accuracy of test statistics (9.2.6) and (4.4.3) as χ² approximations with 2(q - 1) degrees of freedom

Table 9.7.2 takes test statistic (9.2.6) out of its proven range and compares it to equivalent tests in the concentration parameter range 1 to 2. The comparable tests are those given by Stephens and Upton in Section 4.4, and used previously when examining tests for large k. Stephens' test is shown to be suitable for k as small as 1, while Upton's test is suitable for R/N as small as 0.6 (k ≈ 1.5). Table 9.7.2 shows test statistic (9.2.6) to be comparable to the equivalent tests, whilst all three tests show good fits to the upper percentage points. Once again it is important to note that the equality of concentration parameters has been tested prior to the examination of mean directions.


TABLE 9.7.2  Comparison of the three test statistics; (9.2.6) and Upton's G-test compared to the χ² distribution, and Stephens' test compared to the F distribution

9.8 Comparing the Power of the Tests, for Small k

As for large k, it is important to examine the relative power of the test statistic against any available alternatives. To compare test statistic (9.2.6) against Mardia's test statistic (4.4.3) for k < 1, and against Stephens' (4.4.7) and Upton's (4.4.8) for k > 1, it is unnecessary to investigate all the different sample situations of Section 9.7. Tables 9.8.1 and 9.8.2 show the relative power of the four tests by studying two sets of samples, one with equal sample sizes of 20, the other with differing sample sizes of 10, 20 and 30. Table 9.8.1 examines the power of the tests with one sample mean direction set approximately 30° from the true value of the other samples, as in the power examination for large k. Table 9.8.2, however, gives a further test in which one sample mean direction is set 90° from the other samples. A larger displacement has been used for the examination of small k, since acquiring a significant difference between samples when simulating almost random data is difficult and larger fluctuations in the test statistics would be expected. In both tables it is clearly seen how the detection of a significant difference increases as k increases. When a displacement of 90° is used with k = 1.75, all three test statistics indicate the presence of a displacement on almost 100 percent of occasions.

Examining the power of the tests for k < 1 shows Mardia's statistic (4.4.3) to be slightly more powerful than test statistic (9.2.6) in both situations and across all percentage points. For k between 1 and 2, Stephens' test statistic dominates close to 1, while all three tests appear to be identical in power as k increases towards 2. Test statistic (9.2.6), constructed using maximum likelihood techniques but requiring examination of the sample concentration parameters, is seen to be almost as powerful as test statistics (4.4.3) and (4.4.7), but with the possible added application to larger experimental situations.


TABLE 9.8.1  Comparison of the power of three alternative tests, for small concentration parameter

TABLE 9.8.2  Comparison of the power of three alternative tests, for small concentration parameter

9.9 Summary

As the exact distributions of the new test statistic components have been seen to be intractable by an exact theoretical approach for both large and small k, other methods of examining their distributions have been used. A comprehensive stepwise approach was initiated for both situations of large and small concentration parameter:

(1) to test the components' validity by the use of approximations in a theoretical manner;
(2) to obtain the expectation and variance of each chi-squared approximation component via Bessel function approximations;
(3) to test the components by simulation, examining their accuracy against the upper percentage points of their associated chi-squared approximations and obtaining and examining their first two moments;
(4) to compare the new F or chi-squared test statistic against available alternatives; and finally
(5) to examine the power and robustness of the test statistics against these alternatives.

Following the production of the first two moments, via the Bessel function approximation and simulation, an improvement factor β was derived to increase the accuracy of the distribution approximations for large k and therefore improve the final test statistic. The comparison of the new test statistic (9.5.3) with the associated one-way classification tests was favourable, although, as it requires the prior testing of sample concentration parameters, it is not a uniformly most powerful test. Nevertheless the new approach lends itself to an increased range of applications, unlike the alternative techniques.


For small k only the associated component chi-squared approximation can be used as a test of difference, since the corresponding total and residual components for small concentration parameter do not form chi-squared distributions. The chi-squared test statistic (9.2.6) also compared favourably to the available alternatives for small k. As for large k, the tests of (4.4.3) and (4.4.7) are uniformly most powerful, although generalisation of these tests to alternative experimental designs is not feasible.


CHAPTER 10

THE RANDOMISED COMPLETE BLOCK AND TWO-WAY DESIGNS VIA THE NEW APPROACH

10.1 Introduction

A comprehensive analysis of the distribution functions for the generalised approach

has been carried out in Chapter 9. Here the new test statistics for the randomised

complete block and two-way classification designs, together with their associated

improvement factors, are produced. The relevant component statistics and test

statistics are compared with their respective chi-squared and F distributions to

ascertain their reliability and robustness for larger designs. In order not to repeat

this analysis for every possibility the randomised complete block and two-way

classification designs are used to represent and examine the adequacy of other, larger and more complex, design situations. Although their components and test statistics are not investigated here, Chapter 11 extends the approach and analyses experimental situations using such methods as Latin-square and split-plot designs.

10.2 Randomised Complete Block and Two-way Classification with Interaction

Designs, for Large k

In Section 8.5 the randomised complete block design for the new approach was

constructed via vector analysis and was shown to produce zero cross product terms.

We may now test for any possible treatment effect, i, as well as any block effect, j,

to produce the randomised complete block expression;


    k( N - R²/N ) = k( Σ_{i=1}^{p} R²_{i.}/N_{i.} - R²/N ) + k( Σ_{j=1}^{q} R²_{.j}/N_{.j} - R²/N )

                    + k( N - Σ_{i=1}^{p} R²_{i.}/N_{i.} - Σ_{j=1}^{q} R²_{.j}/N_{.j} + R²/N )          (10.2.1)

with associated independent chi-squared distributions

    χ²_{N-1} = χ²_{p-1} + χ²_{q-1} + χ²_{(p-1)(q-1)}

As noted in Section 8.5 the first term on the right hand side of the expression is

the measure of variation due to p treatments, the second term of similar form being

a measure of variation due to q blocks. The final term represents the residual

variation, where we assume that the experimental errors are independent and von

Mises distributed.

Equation (10.2.1) produces the test statistic Z₆ to examine the null hypothesis that there is no difference between the p treatments,

    Z₆ = [ (p - 1)(q - 1) ( Σ_{i=1}^{p} R²_{i.}/N_{i.} - R²/N ) ] / [ (p - 1) ( N - Σ_{i=1}^{p} R²_{i.}/N_{i.} - Σ_{j=1}^{q} R²_{.j}/N_{.j} + R²/N ) ]          (10.2.2)

which has an F distribution with (p - 1) and (p - 1)(q - 1) degrees of freedom.

Similarly, the test statistic Z₇ is produced to test the null hypothesis that there is no difference between the q blocks,

    Z₇ = [ (p - 1)(q - 1) ( Σ_{j=1}^{q} R²_{.j}/N_{.j} - R²/N ) ] / [ (q - 1) ( N - Σ_{i=1}^{p} R²_{i.}/N_{i.} - Σ_{j=1}^{q} R²_{.j}/N_{.j} + R²/N ) ]          (10.2.3)


which has an F distribution with (q - 1) and (p - 1)(q - 1) degrees of freedom.

Test statistics Z₆ and Z₇ indicate whether or not the mean directions are equal; they do not allow for discrimination amongst individual mean directions, which we were able to achieve for the one-way analysis in Chapter 5.
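A computational sketch of Z₆ and Z₇ for a p × q table of angles with one observation per cell is given below; the layout convention (rows indexed by i for treatments, columns by j for blocks) and the helper names are illustrative assumptions.

    import numpy as np

    def rcb_statistics(theta):
        # theta: p x q array of angles, rows = treatments (i), columns = blocks (j)
        p, q = theta.shape
        N = p * q
        C, S = np.cos(theta), np.sin(theta)
        R2 = lambda c, s: c**2 + s**2
        R2_total = R2(C.sum(), S.sum())
        R2_rows = R2(C.sum(axis=1), S.sum(axis=1))        # R_i.^2, each from q angles
        R2_cols = R2(C.sum(axis=0), S.sum(axis=0))        # R_.j^2, each from p angles
        treat = R2_rows.sum() / q - R2_total / N
        block = R2_cols.sum() / p - R2_total / N
        resid = N - R2_rows.sum() / q - R2_cols.sum() / p + R2_total / N
        Z6 = (p - 1) * (q - 1) * treat / ((p - 1) * resid)   # compare with F(p-1, (p-1)(q-1))
        Z7 = (p - 1) * (q - 1) * block / ((q - 1) * resid)   # compare with F(q-1, (p-1)(q-1))
        return Z6, Z7

    rng = np.random.default_rng(4)
    print(rcb_statistics(rng.vonmises(0.0, 5.0, size=(4, 5))))   # 4 treatments x 5 blocks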

Prior to examining the accuracy of the component chi-squared approximations of (10.2.1) and the associated F approximations Z₆ and Z₇, the test statistics for the two-way analysis with interaction may be constructed.

Using the same approach as above, the two-way classification with interaction may be seen as;

    k( N - R²/N ) = k( Σ_{i=1}^{p} R²_{i..}/N_{i..} - R²/N ) + k( Σ_{j=1}^{q} R²_{.j.}/N_{.j.} - R²/N )

                    + k( Σ_{i=1}^{p} Σ_{j=1}^{q} R²_{ij.}/N_{ij.} - Σ_{i=1}^{p} R²_{i..}/N_{i..} - Σ_{j=1}^{q} R²_{.j.}/N_{.j.} + R²/N )

                    + k( N - Σ_{i=1}^{p} Σ_{j=1}^{q} R²_{ij.}/N_{ij.} )          (10.2.4)

with associated independent chi-squared distributions

    χ²_{N-1} = χ²_{p-1} + χ²_{q-1} + χ²_{(p-1)(q-1)} + χ²_{pq(m-1)}

where m represents the number of observations within each cell. F test statistics are

built as for the randomised complete block design;


Testing for differences between row effects (i = 1, 2, …, p),

    Z₈ = [ pq(m - 1) ( Σ_{i=1}^{p} R²_{i..}/N_{i..} - R²/N ) ] / [ (p - 1) ( N - Σ_{i=1}^{p} Σ_{j=1}^{q} R²_{ij.}/N_{ij.} ) ]  ~  F_{(p-1), pq(m-1)}          (10.2.5)

Testing for differences between column effects (j = 1, 2, …, q),

    Z₉ = [ pq(m - 1) ( Σ_{j=1}^{q} R²_{.j.}/N_{.j.} - R²/N ) ] / [ (q - 1) ( N - Σ_{i=1}^{p} Σ_{j=1}^{q} R²_{ij.}/N_{ij.} ) ]  ~  F_{(q-1), pq(m-1)}          (10.2.6)

Testing for differences between interaction effects,

    Z₁₀ = [ pq(m - 1) ( Σ_{i=1}^{p} Σ_{j=1}^{q} R²_{ij.}/N_{ij.} - Σ_{i=1}^{p} R²_{i..}/N_{i..} - Σ_{j=1}^{q} R²_{.j.}/N_{.j.} + R²/N ) ] / [ (p - 1)(q - 1) ( N - Σ_{i=1}^{p} Σ_{j=1}^{q} R²_{ij.}/N_{ij.} ) ]  ~  F_{(p-1)(q-1), pq(m-1)}          (10.2.7)
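As a simple numerical check on the additivity of the decomposition (10.2.4) (a sketch under the assumption that the m replicate angles per cell are stored in a p × q × m array), the four components can be computed and seen to sum to the total measure k(N - R²/N).

    import numpy as np

    def two_way_components(theta, k):
        # theta: p x q x m array of angles; returns (rows, columns, interaction, error)
        p, q, m = theta.shape
        N = p * q * m
        C, S = np.cos(theta), np.sin(theta)
        R2 = lambda c, s: c**2 + s**2
        R2_total = R2(C.sum(), S.sum())
        A = R2(C.sum(axis=(1, 2)), S.sum(axis=(1, 2))).sum() / (q * m)   # sum_i R_i..^2 / N_i..
        B = R2(C.sum(axis=(0, 2)), S.sum(axis=(0, 2))).sum() / (p * m)   # sum_j R_.j.^2 / N_.j.
        W = R2(C.sum(axis=2), S.sum(axis=2)).sum() / m                   # sum_ij R_ij.^2 / N_ij.
        rows = k * (A - R2_total / N)
        cols = k * (B - R2_total / N)
        interaction = k * (W - A - B + R2_total / N)
        error = k * (N - W)
        return rows, cols, interaction, error

    rng = np.random.default_rng(5)
    theta = rng.vonmises(0.0, 5.0, size=(3, 4, 5))
    parts = two_way_components(theta, k=5.0)
    total = 5.0 * (theta.size - (np.cos(theta).sum()**2 + np.sin(theta).sum()**2) / theta.size)
    print(parts, sum(parts), total)   # the four components sum to the total measure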


10.3 Accuracy of the Associated χ² Approximations for the Randomised Complete Block and the Two-way Classification Designs with their Corresponding F Statistics, for Large k

The accuracy of the expressions (10.2.1) and (10.2.4) is determined by simulation methods. Monte Carlo samples from a von Mises distribution with fixed k are drawn for the distribution specified by the null hypothesis. The computer method used to generate the observations is outlined in Appendix B. As for the testing of the one-way classification and the extended techniques of Watson and Williams in Chapter 6, 10,000 sets of samples of various sizes were drawn from the von Mises distribution with k = 2, 3, 4, 5 and 10. The same experimental designs investigated in Chapter 6 are used here to enable comparison between tests. For the randomised block, three designs are examined, varying in the size of N.
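The sketch below illustrates this kind of Monte Carlo check for a single component (the treatment measure of variation in a randomised block layout). It uses numpy's built-in von Mises generator in place of the Appendix B routine, and the design sizes shown are illustrative only.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def rsq_over_n(theta):
    """Squared resultant length over sample size, R^2/N, for angles in radians."""
    C, S = np.cos(theta).sum(), np.sin(theta).sum()
    return (C ** 2 + S ** 2) / theta.size

def check_treatment_component(p=5, q=6, kappa=5.0, n_sets=10_000, alpha=0.05):
    """Simulate k[sum_i R_i.^2/N_i. - R^2/N] under the null hypothesis of a common mean
    direction and estimate how often it exceeds the chi-squared (p-1) critical value."""
    crit = stats.chi2.ppf(1 - alpha, df=p - 1)
    exceed = 0
    for _ in range(n_sets):
        theta = rng.vonmises(0.0, kappa, size=(p, q))   # common mean direction, fixed kappa
        total = rsq_over_n(theta.ravel())
        treatments = sum(rsq_over_n(theta[i]) for i in range(p))
        exceed += kappa * (treatments - total) > crit
    return exceed / n_sets   # close to alpha if the chi-squared approximation is adequate

print(check_treatment_component())
```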

10.3.1 The Randomised Complete Block Design

Tables 10.3.1 to 10.3.6 examine the chi-squared approximations for each component

within the randomised complete block design. Table 10.3.1 gives the first two

moments of the components for total and residual measures of variation. The

simulated mean value of the chi-squared approximation for each component is seen

as a good fit, although slightly over-estimated, and increasing in accuracy as k

increases. This follows the findings of Section 9.5. When k is unknown and

equation (9.2.5) is used as its replacement the accuracy of the simulated mean value

is increased. Again, as in Section 9.4, the simulated variance of the chi-squared

approximation of each component is seen to be below its expected value and

dependent on the size of N and k. As N and, predominantly, k increase, the accuracy

of the variance increases.


Table 10.3.2 shows the accuracy of the total and residual measures of variation

components to their associated chi-squared. The goodness of fit in the upper 10

percent significance levels, for concentration parameter greater than 2, is seen to be

very good, and increasing in accuracy with increasing size of k. For concentration

parameter equal to 2 the approximations under-estimate their associated significance

levels. Although not shown within the tables, the fit of the simulated distributions in

the lower percentile levels is not as good as in the upper tails, but this would be

expected given the simulated means and variances of Table 10.3.1.

Tables 10.3.3 and 10.3.4 give the first two moments for the treatment and block

measures of variation components, respectively. As for the one-way analysis the

simulated mean value of the component is approximately equal to the mean of the

chi-squared approximation and increasing in accuracy as k increases. Similarly the

variance is seen to be rather poor, most noticeably for small k. As in Section 9.5

the improvement factor β is used and multiplies k in order to increase the accuracy

of the chi-squared approximation variance and therefore the distribution fit. The

comparable moments are given in adjacent columns in Tables 10.3.3 and 10.3.4 and

show a slight increase in the simulated mean value but an important improvement in

the accuracy of the variance. This is reflected in the accuracy of the improved

treatment and block components to the upper percentage points of their associated

chi-squared distributions given in Tables 10.3.5 and 10.3.6.


TABLE 10.3.1  Comparing the mean and variance of the χ² approximations for the total and residual measures of variation against their expected values

TABLE 10.3.2  Examining the accuracy of the χ² approximations for the total and residual measures of variation

TABLE 10.3.3  Comparing the mean and variance of the χ² approximation for the treatment measure of variation against its expected values

TABLE 10.3.4  Comparing the mean and variance of the χ² approximation for the block measure of variation against its expected values

TABLE 10.3.5  Examining the accuracy of the χ² approximations for the treatment measure of variation

TABLE 10.3.6  Examining the accuracy of the χ² approximations for the block measure of variation

The accuracy of the F distribution approximations is examined in Tables 10.3.7 and 10.3.8. Table 10.3.7 examines the F distribution statistic for the testing of the p treatments, Z6, against its improved F distribution where the term β has been included. Table 10.3.8 examines the corresponding test for the q blocks, Z7. The improved tests show better fits for both factors across all significance levels, although it is important to note that they still slightly under-estimate the proportion α, leading to an increased probability of a Type II error.

In Chapters 6 and 7 the extension of the Watson and Williams and Stephens approaches to larger experimental designs was shown to break down due to the combination of sample means. Examination of the chi-squared approximations for the randomised complete block and two-way classification designs was, however, carried out, since the combination problem was negligible for very large k and did not affect the associated test statistics. On the assumption that the extended techniques were valid, direct comparison may be made to the corresponding tables in this section. The associated tables for the randomised complete block design are Tables 6.4.1 to 6.4.7. All the components within (10.2.1), incorporating the correction factor β, show a comparable fit for large k and an improved fit for small k, for both the chi-squared approximation and the moment calculations.


TABLE 10.3.7  Comparing the accuracy of the test statistic Z6 against its improved statistic for examining treatment effects

TABLE 10.3.8  Comparing the accuracy of the test statistic Z7 against its improved statistic for examining block effects

10.3.2 Two-way Classification with Interaction

As for the randomised complete block design, comparable statistics for the two-way classification with interaction design, examining the component chi-squared approximations, are produced in the six tables 10.3.9 to 10.3.14. Here the chi-squared approximation moments are not reproduced, as similar terms from other models have already indicated their adequacy. Similarly, the correction factor β has been included for the main effect and interaction terms.

There are four two-way designs simulated, each varying in size from N=30 to N=90

and varying in concentration parameter from k=2 to k=10. Table 10.3.9 indicates

the accuracy of the total measure of variation and the first main effect to their

respective chi-squareds for 10,000 simulations. Table 10.3.10 for the second main

effect and interaction terms and finally Table 10.3.11 for the residual term. The

main effect and interaction terms show excellent fits across all values of k, while the

total and residual terms are relatively poor fits for concentration parameter as low as

2.

The accuracy of the three test statistics Z8, Z9 and Z10 is shown in Tables 10.3.12, 10.3.13 and 10.3.14 respectively, with the corresponding improvement factor β applied. The F distribution approximations for the main effects show similarly good fits to those in the one-way classification and randomised block designs, with the accuracy decreasing at k = 2. The final Table, 10.3.14, indicates the accuracy of the

interaction test statistic to the F distribution and shows an excellent fit across all

concentration parameters.

Comparing the Tables 10.3.9 to 10.3.14 with the corresponding Tables 6.4.8 to

6.4.13 of Chapter 6, and on the assumption that the extended techniques are valid

for large k, shows once again an improvement in the chi-squared accuracy for the

new approach.


TABLE 10.3.9  Examining the accuracy of the total and first main effect components to their respective χ² distributions

TABLE 10.3.10  Examining the accuracy of the second main effect and interaction components to their respective χ² distributions

TABLE 10.3.11  Examining the accuracy of the residual component to its respective χ² distribution

TABLE 10.3.12  Comparing the accuracy of the test statistic against its improved statistic for examining the first main effect, p

TABLE 10.3.13  Comparing the accuracy of the test statistic against its improved statistic for examining the second main effect, q

TABLE 10.3.14  Comparing the accuracy of the test statistic against its improved statistic for examining the interaction effect

10.4 Randomised Complete Block and Two-way Classification with Interaction

Designs for Small k

As discussed in Sections 9.7 and 9.8 only the associated component chi-squared

approximation may be used as a test of difference, since the corresponding total and

residual components, for small concentration parameter, do not form chi-squared

distributions. For the randomised complete block and two-way analysis designs with

small k the models may be seen as given in (10.2.1) and (10.2.4) but the total and

residual terms may not be assumed to be chi-squared distributed. From (10.2.1),

(10.2.4) and Section 9.2 the test statistic to examine the null hypothesis that there is

no difference between the p treatments gives:

Z_{11} = \frac{2}{1-\hat{\rho}^{2}}\left[\sum_{i=1}^{p} \frac{R_{i.}^{2}}{N_{i.}} - \frac{R^{2}}{N}\right]          (10.4.1)

which has a chi-squared distribution with 2(p-1) degrees of freedom.

Similarly, the test statistic Z12 will be produced to test the null hypothesis that there is no difference between the q blocks within the randomised complete block, or q treatments within the two-way design:

Z_{12} = \frac{2}{1-\hat{\rho}^{2}}\left[\sum_{j=1}^{q} \frac{R_{.j}^{2}}{N_{.j}} - \frac{R^{2}}{N}\right]          (10.4.2)

which has a chi-squared distribution with 2(q-1) degrees of freedom. Testing for differences between interaction terms within the two-way analysis gives test statistic Z13:

Z_{13} = \frac{2}{1-\hat{\rho}^{2}}\left[\sum_{i=1}^{p}\sum_{j=1}^{q} \frac{R_{ij.}^{2}}{N_{ij.}} - \sum_{i=1}^{p} \frac{R_{i..}^{2}}{N_{i..}} - \sum_{j=1}^{q} \frac{R_{.j.}^{2}}{N_{.j.}} + \frac{R^{2}}{N}\right]          (10.4.3)

which has a chi-squared distribution with 2(p-1)(q-1) degrees of freedom.


As for large concentration parameter, test statistics (10.4.1), (10.4.2) and (10.4.3) indicate whether or not the mean directions are equal; they do not allow for discrimination amongst single mean directions.
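A sketch of how the small-k statistic for a main effect might be computed is given below. It assumes the form reconstructed above, 2/(1 - ρ̂²) times the measure of variation, with ρ̂ = I₁(k̂)/I₀(k̂) and k̂ taken from the 'simple' small-k approximation used later in Example 11.2.2; all function names are illustrative.

```python
import numpy as np
from scipy import special, stats

def rsq_over_n(theta):
    """Squared resultant length over sample size, R^2/N, for angles in radians."""
    C, S = np.cos(theta).sum(), np.sin(theta).sum()
    return (C ** 2 + S ** 2) / theta.size

def small_kappa_main_effect_test(groups, alpha=0.05):
    """groups: list of 1-D arrays of angles (radians), one per level of the factor.
    Returns the statistic, its degrees of freedom 2(q-1), and the decision at level alpha."""
    pooled = np.concatenate(groups)
    N = pooled.size
    Rbar = np.hypot(np.cos(pooled).sum(), np.sin(pooled).sum()) / N
    kappa = 2 * Rbar + Rbar ** 3 + 5 * Rbar ** 5 / 6     # 'simple' small-kappa estimate
    rho = special.i1(kappa) / special.i0(kappa)          # rho_hat = I1(k)/I0(k)
    between = sum(rsq_over_n(g) for g in groups)         # sum_j R_.j^2/N_.j
    total = rsq_over_n(pooled)                           # R^2/N
    stat = 2.0 * (between - total) / (1.0 - rho ** 2)
    df = 2 * (len(groups) - 1)
    return stat, df, stat > stats.chi2.ppf(1 - alpha, df)
```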

10.5 Accuracy of the Associated Chi-Squared Approximations for Small k

The accuracy of the test statistics (10.4.1), (10.4.2) and (10.4.3) is determined by

simulation methods. The Monte Carlo samples from a von Mises distribution are

generated by the computer method described in Appendix B. As for the testing of

large k, 10,000 sets of samples of various size were drawn with varying concentration

parameter values. Before the tests are applied the equality of concentration

parameter is checked using test statistic (4.5.2). Ranges of small k are given as its

accuracy cannot be assumed for such small values. The test statistics are examined

with equivalent sample sizes and components within the two-way classification, since

the main effects for both the randomised block and two-way designs are similar.

Table 10.5.1 gives the first two moments of the chi-squared approximations for the

two main effects within the two-way design. The simulated mean value for each

component is seen to be a very good fit to the mean of the chi-squared

approximation. The variances are also good fits, increasing in accuracy as the size

of sample and k increase. Table 10.5.2 shows the accuracy of the first two

moments for the interaction term to the expected values. As for the main effects

the simulated means and variances are comparable to their associated chi-squared

approximations notably increasing in accuracy as N and predominantly k increase.

Tables 10.5.3 and 10.5.4 compare the accuracy of the test statistics Z11, Z12 and Z13 to their corresponding chi-squared distributions in the upper 10 percent significance levels. The goodness of fit for the tests with concentration parameter less


than 1 are seen to be very good for all the test statistics, and as for all previous

goodness of fit examinations, the accuracy is seen to increase as the size of sample

increases.

Tables 10.5.5 and 10.5.6 take the test statistics out of their proven range and

examine their accuracy for concentration parameter values of 1.25 and 1.75. The

test statistics for both main effects and the interaction term are tested. The tables

clearly indicate that the test statistics show very good fits to the upper 10 percent

significance levels and are reliable tests for experimental designs with concentration

parameters in the range 1 to 2.
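A simulation check along these lines can be sketched as follows, reusing the illustrative small_kappa_main_effect_test function from the sketch in Section 10.4; the design sizes and the value of k are arbitrary choices within the range examined here.

```python
import numpy as np

rng = np.random.default_rng(1)

def check_small_kappa_fit(p=3, q=4, m=3, kappa=1.5, n_sets=10_000, alpha=0.05):
    """Estimate the rejection rate of the small-k main-effect statistic under the null
    hypothesis; it should be close to alpha if the chi-squared approximation is adequate."""
    rejections = 0
    for _ in range(n_sets):
        theta = rng.vonmises(0.0, kappa, size=(p, q, m))     # common mean direction
        groups = [theta[:, j, :].ravel() for j in range(q)]  # observations at each level j
        _, _, reject = small_kappa_main_effect_test(groups, alpha)
        rejections += reject
    return rejections / n_sets

print(check_small_kappa_fit())
```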


TABLE 10.5.1  Comparing the mean and variance of the χ² approximations with (p-1) and (q-1) degrees of freedom to their expected values, for small k

TABLE 10.5.2  Comparing the mean and variance of the χ² approximation with (p-1)(q-1) degrees of freedom to their expected values, for small k

TABLE 10.5.3  Examining the accuracy of the χ² approximations for the two main effect components, for small k

TABLE 10.5.4  Examining the accuracy of the χ² approximation for the interaction component, for small k

TABLE 10.5.5  Examining the accuracy of the χ² approximations for the two main effect components, for k in the range 1 to 2

TABLE 10.5.6  Examining the accuracy of the χ² approximations for the interaction component, for k in the range 1 to 2

10.6 Summary

This Chapter has investigated the adequacy of the generalised approach for the

randomised complete block and two-way classification designs, with particular

reference to their accuracy compared to the associated chi-squared and F

approximations. The randomised complete block and two-way classification designs

have been used to illustrate the validity of the approach for larger, more complex

design situations.

Section 10.3 has shown that for large k (k>2), and with the inclusion of the correction factor β, the new test statistics are reliable and show excellent approximations to their associated F distributions. For small k only the main factors may be tested against their corresponding chi-squared distributions; nevertheless, the tests show excellent fits for a concentration parameter range of 0 to 1.

Of most interest is the examination of the test statistics in the concentration

parameter range 1 to 2. Ideally the test statistics for large k would be desirable

since measures of total and residual variation may be found and utilised. However,

the test statistics for small k show very good distribution fits and indicate that these

tests should be used for all experimental situations where the concentration parameter

is in the range 0 to 2.


CHAPTER 11

ANALYSIS OF VARIANCE EXAMPLES FOR CIRCULAR STATISTICS

11.1 Introduction

Having discounted the possibility of extending the previous one-way analysis of

variance approach to larger designs, we will now proceed to test the new approaches

and their validity with real data sets. The designs range from the simple one-way

design to the Graeco-Latin square and split plot designs. This section not only

demonstrates the application of the new procedures but also indicates how they may

be used to analyse other designs not discussed in this thesis.

11.2 One-way Analysis

Example 11.2.1 for Large k

For a one-way classification analysis with large k an example given by Gadsden and

Kanji (1983) will be used. The examination concerns the orientation of particles in

clay strata observed from photographs for various magnifications. The data, given in

Table 11.2.1, has been reproduced from Gadsden and Kanji (1983).


Table 11.2.1  Raw Data From Photographs

Magnification 100:      82, 71, 85, 89, 78, 77, 74, 71, 68, 83, 72, 73, 81, 65, 62, 90, 92, 80, 77, 93, 75, 80, 69, 74, 77, 75, 71, 82, 84, 79, 78, 81, 89, 79, 82, 81, 85, 76, 71, 80, 94, 68, 72, 70, 59, 80, 86, 98, 82, 73

Magnification 200:      75, 74, 71, 63, 83, 74, 82, 78, 87, 87, 82, 71, 60, 66, 63, 85, 81, 78, 80, 89, 82, 82, 92, 80, 81, 74, 90, 78, 73, 72, 80, 59, 64, 78, 73, 70, 79, 79, 77, 81, 72, 76, 69, 73, 75, 84, 81, 51, 76, 88

Magnification 400:      70, 76, 79, 86, 77, 86, 77, 90, 88, 82, 84, 70, 87, 61, 71, 89, 72, 90, 74, 88, 82, 68, 83, 75, 90, 79, 89, 78, 74, 73, 71, 80, 83, 89, 68, 81, 47, 88, 69, 76, 71, 67, 76, 90, 84, 70, 80, 77, 93, 89

Magnification 1200:     73, 90, 72, 91, 73, 79, 82, 87, 78, 83, 74, 82, 85, 75, 67, 72, 78, 88, 89, 71, 73, 77, 90, 82, 80, 81, 89, 87, 78, 73, 78, 86, 73, 84, 68, 75, 70, 89, 54, 80, 90, 88, 81, 82, 88, 82, 75, 79, 83, 82

Magnification 400x1.3:  88, 69, 64, 78, 71, 68, 54, 80, 73, 72, 65, 73, 93, 84, 80, 49, 78, 82, 95, 69, 87, 83, 52, 79, 85, 67, 82, 84, 87, 83, 88, 79, 83, 77, 78, 89, 75, 72, 88, 78, 62, 68, 89, 74, 71, 73, 84, 56, 77, 71

For the photograph magnifications we wish to test the null hypothesis

H_{0} : \mu_{0,1} = \mu_{0,2} = \cdots = \mu_{0,q}

against the alternative hypothesis that at least one of the equalities does not hold.

Prior to testing the hypothesis it is necessary to test the assumptions that (a) the

samples are drawn from a von Mises population and (b) the concentration parameter

k has the same value in each sample. Each of the sample populations have been

tested by Watsons U 2 statistic (1961), using the critical values supplied by Stephens

(1964), to show von Mises distributed data sets. Similarly, the homogeneity of the

concentration parameters have been tested and validated via test statistic (4.5.5)

(U 3= 5.584 distributed as x |)*


The one-way analysis components of variation are given by:

k\left[N - \frac{R^{2}}{N}\right] = k\left[\sum_{j=1}^{q} \frac{R_{.j}^{2}}{N_{.j}} - \frac{R^{2}}{N}\right] + k\left[N - \sum_{j=1}^{q} \frac{R_{.j}^{2}}{N_{.j}}\right]

and the modified test for the null hypothesis will provide an F-ratio:

F_{(q-1),(N-q)} = \beta \, \frac{(N-q)\left[\sum_{j=1}^{q} R_{.j}^{2}/N_{.j} - R^{2}/N\right]}{(q-1)\left[N - \sum_{j=1}^{q} R_{.j}^{2}/N_{.j}\right]}          (11.2.2)

The analysis of variance for Gadsden and Kanji's data is given in Table 11.2.2.

Table 11.2.2  Analysis of Variance Table

Source of Variation     d.f.    Measure of Variation
Between Photographs     q-1     Σ_j R.j²/N.j - R²/N
Within Photographs      N-q     N - Σ_j R.j²/N.j
Total                   N-1     N - R²/N

Statistics:  Σ_j R.j²/N.j = 228.1931,   R²/N = 227.6244,   q = 5,   N.1 = N.2 = ... = N.q = 50,   N = 250

which gives the following ANOVA table:

Table 11.2.3  Analysis of Variance Table

Source of Variation     d.f.    Measure of Variation    Mean MV     F
Between Photographs       4           0.5687             0.14217    1.5974
Within Photographs      245          21.8069             0.089
Total                   249          22.3756

k = 11.02,    1/β = 1 - 1/(5k) - 1/(10k²),    or    β = 1.01959

The value of k is found via approximation (3.3.11) for large k. The modified F′ value is β × F_{4,245} = 1.628; and as the table value of F_{4,245}(0.05) = 2.37, the

result is not significant and we can conclude (like Gadsden and Kanji) that there is

no observed significant difference between the orientation of the clay particles under

differing photographic magnifications.
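The whole calculation can be sketched in a few lines; the function below is illustrative, and in place of approximation (3.3.11), which is not reproduced here, k̂ is obtained by solving the maximum-likelihood relation I₁(k)/I₀(k) = R̄ numerically. Applied to the data of Table 11.2.1 it should reproduce the measures of variation of Tables 11.2.2 and 11.2.3 up to rounding, with small differences in k̂, β and the modified F depending on how k is estimated.

```python
import numpy as np
from scipy import optimize, special, stats

def rsq_over_n(theta):
    """Squared resultant length over sample size, R^2/N, for angles in radians."""
    C, S = np.cos(theta).sum(), np.sin(theta).sum()
    return (C ** 2 + S ** 2) / theta.size

def one_way_circular_anova(groups_deg, alpha=0.05):
    """groups_deg: list of 1-D sequences of angles in degrees, one per sample."""
    groups = [np.radians(np.asarray(g, dtype=float)) for g in groups_deg]
    pooled = np.concatenate(groups)
    N, q = pooled.size, len(groups)
    between = sum(rsq_over_n(g) for g in groups)       # sum_j R_.j^2/N_.j
    total = rsq_over_n(pooled)                         # R^2/N
    Rbar = np.sqrt(total / N)                          # mean resultant length R/N
    kappa = optimize.brentq(lambda k: special.i1e(k) / special.i0e(k) - Rbar, 1e-6, 500.0)
    beta = 1.0 / (1.0 - 1.0 / (5 * kappa) - 1.0 / (10 * kappa ** 2))
    F = (N - q) * (between - total) / ((q - 1) * (N - between))
    modified_F = beta * F
    critical = stats.f.ppf(1 - alpha, q - 1, N - q)
    return {"between": between - total, "within": N - between, "total": N - total,
            "kappa": kappa, "beta": beta, "modified F": modified_F,
            "significant": modified_F > critical}
```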

Example 11.2.2 for Small k

An example of the test procedure for small k is given by using Mardia's (1972)

example on wind directions in degrees at Gorleston, England, at 11 hr - 12hr on

Sundays in 1968 classified according to the four seasons. The data has been

reproduced from Mardia (1972).


Table 11.2.4  Wind Directions in Degrees at Gorleston on Sundays in 1968 According to the Four Seasons

Season    Wind Directions in Degrees
Winter    50, 120, 190, 210, 220, 250, 260, 290, 290, 320, 320, 340
Spring    0, 20, 40, 60, 160, 170, 200, 220, 270, 290, 340, 350
Summer    10, 10, 20, 20, 30, 30, 40, 150, 150, 150, 170, 190, 290
Autumn    30, 70, 110, 170, 180, 190, 240, 250, 260, 260, 290, 350

Do the wind directions for the four seasons differ significantly within the given data

set? In this case the test statistic to be used is given by:

\frac{2}{1-\hat{\rho}^{2}}\left[\sum_{j=1}^{q} \frac{R_{.j}^{2}}{N_{.j}} - \frac{R^{2}}{N}\right]          (11.2.3)

The associated circular statistics are given in Table 11.2.5

Table 11.2.5  Statistics for the Wind Directions

Season             N.j    R.j       θ̂₀,j     k̂.j
Winter             12     5.1185     272°     0.94
Spring             12     2.1321     330°     0.36
Summer             13     3.8680      57°     0.62
Autumn             12     3.1878     232°     0.55
Combined Sample    49     5.8771     292°     0.24

Σ_j R.j²/N.j = 4.5598,    R²/N = 0.7049


The values of k.j are found via the 'simple' approximation (3.2.11):

\hat{k}_{.j} = 2\bar{R}_{.j} + \bar{R}_{.j}^{3} + \tfrac{5}{6}\bar{R}_{.j}^{5}, \qquad \bar{R}_{.j} = R_{.j}/N_{.j}

Prior to examining any difference between the four seasons the assumption that the

concentration parameters are equal must be tested. Test statistic (4.5.2) is used to

test their homogeneity and was also used by Mardia (1972). A statistic value of

U₁ = 0.6013 (distributed as χ² with 3 degrees of freedom) indicates that the concentration parameters for the

wind directions may be regarded as homogeneous.

Applying the test statistic leads to

    \sum_{j=1}^{q} \frac{R_{.j}^2}{N_{.j}} - \frac{R^2}{N} = 3.8549     and     \frac{2}{1 - \hat{\rho}^2} = 2.0288,   where  \hat{\rho} = I_1(\hat{\kappa})/I_0(\hat{\kappa}).

Hence

    \frac{2}{1 - \hat{\rho}^2} \left[ \sum_{j=1}^{q} \frac{R_{.j}^2}{N_{.j}} - \frac{R^2}{N} \right] = 7.82

Table value \chi^2_6(0.05) = 12.59

Therefore the wind directions for the four seasons are seen not to be significantly

different (the same solution as given by Mardia (1972)).
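The short Python sketch below (not part of the thesis) recomputes statistic (11.2.3) directly from the raw wind directions of Table 11.2.4. The function name is illustrative, and \hat{\rho} is taken here as the combined-sample mean resultant length, which is what A(\hat{\kappa}) reduces to under the simple estimate of \kappa.

```python
import numpy as np

# Wind directions (degrees) at Gorleston, reproduced from Table 11.2.4 (Mardia, 1972)
seasons = {
    "Winter": [50, 120, 190, 210, 220, 250, 260, 290, 290, 320, 320, 340],
    "Spring": [0, 20, 40, 60, 160, 170, 200, 220, 270, 290, 340, 350],
    "Summer": [10, 10, 20, 20, 30, 30, 40, 150, 150, 150, 170, 190, 290],
    "Autumn": [30, 70, 110, 170, 180, 190, 240, 250, 260, 260, 290, 350],
}

def resultant(deg):
    """Resultant length R of a sample of angles given in degrees."""
    t = np.radians(deg)
    return np.hypot(np.cos(t).sum(), np.sin(t).sum())

N = sum(len(v) for v in seasons.values())                      # 49
between = sum(resultant(v) ** 2 / len(v) for v in seasons.values())
R = resultant([a for v in seasons.values() for a in v])
rho = R / N        # combined mean resultant length, standing in for A(kappa_hat)

T = 2.0 / (1.0 - rho ** 2) * (between - R ** 2 / N)            # statistic (11.2.3)
print(round(between, 4), round(R ** 2 / N, 4), round(T, 2))
# roughly 4.56, 0.70 and 7.8; compare with chi-square_6(0.05) = 12.59
```

The output matches the values in Table 11.2.5 and the statistic value of 7.82 quoted above, up to rounding.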


11.3 Randomised Complete Block Design

Example 11.3.1 for Large k

This example is taken from the field of clinical psychology and is an example given

by Ramano (1977). A psychiatrist was aware that certain organic compounds in the

fluid surrounding the brain will, when purified and separately placed into solution,

rotate the plane of a polarized light source. The psychiatrist was interested in

determining whether the optical activity of a specific compound was measurably

different for various degrees of schizophrenia. Five distinct levels of schizophrenic

behaviour were recognised and for each level 4 patients were selected. Measurements

were taken as to the extent to which each sample rotated the plane of polarized

light under specified conditions.

In the original example a one-way analysis of the difference between schizophrenic

levels of behaviour was examined. Here an added factor of blocking will be

introduced as a 'row' effect where we may assume a blocking by, for example, age

of patient.

In the example given by Ramano, although the angle of rotation had been measured,

a standard arithmetic mean had been calculated and a 'linear' analysis of variance

carried out. However, although this is an incorrect approach to the analysis, the

estimated concentration parameters are shown to be very large and therefore the

Normal approximation to the von Mises distribution may be used. Table 11.3.1

reproduces the data given in Ramano (1977):


Table 11.3.1 Optical Activity Measurements of a Specific Compound Contained in

the Brain Fluid of Persons Classified According to Levels of

Schizophrenic Behaviour

                               Levels
Blocks          1        2        3        4        5        Angular Mean
1               11.51°   12.80°   14.98°   15.71°   20.45°   15.09°
2               11.74°   12.49°   12.90°   15.42°   19.42°   14.39°
3               12.07°   12.01°   14.25°   15.77°   20.25°   14.87°
4               13.15°   13.97°   15.27°   15.07°   17.17°   14.92°
Angular Mean    12.12°   12.82°   14.35°   15.49°   19.32°

Testing the homogeneity of concentration parameters utilises test statistic (4.5.5). A

test statistic value of U_3 = 4.7477 (distributed as \chi^2_4) for the 5 levels and U_3 = 2.5107 (distributed as \chi^2_3) for the 4 blocks indicates that the concentration parameters may be regarded as homogeneous for both factors.

The randomised complete block components of variation are given by:

    \kappa \left[ N - \frac{R^2}{N} \right]
      = \kappa \left[ \sum_{i=1}^{p} \frac{R_{i.}^2}{N_{i.}} - \frac{R^2}{N} \right]
      + \kappa \left[ \sum_{j=1}^{q} \frac{R_{.j}^2}{N_{.j}} - \frac{R^2}{N} \right]
      + \kappa \left[ N - \sum_{i=1}^{p} \frac{R_{i.}^2}{N_{i.}} - \sum_{j=1}^{q} \frac{R_{.j}^2}{N_{.j}} + \frac{R^2}{N} \right]                (11.3.1)

and the modified test for the null hypothesis of no difference between the q levels

will provide an F-ratio:


    F_{(q-1),(p-1)(q-1)} = \beta \, \frac{(p-1)(q-1) \left[ \sum_{j=1}^{q} R_{.j}^2/N_{.j} - R^2/N \right]}{(q-1) \left[ N - \sum_{i=1}^{p} R_{i.}^2/N_{i.} - \sum_{j=1}^{q} R_{.j}^2/N_{.j} + R^2/N \right]}                (11.3.2)

For the null hypothesis of no difference between the p blocks,

    F_{(p-1),(p-1)(q-1)} = \beta \, \frac{(p-1)(q-1) \left[ \sum_{i=1}^{p} R_{i.}^2/N_{i.} - R^2/N \right]}{(p-1) \left[ N - \sum_{i=1}^{p} R_{i.}^2/N_{i.} - \sum_{j=1}^{q} R_{.j}^2/N_{.j} + R^2/N \right]}                (11.3.3)

The analysis of variance for the optical activity measurements is given in Table 11.3.2.

Table 11.3.2 Analysis of Variance Table

Source of Variation           d.f.          Measure of Variation
Due to schizophrenic level    q - 1         \sum_{j=1}^{q} R_{.j}^2/N_{.j} - R^2/N
Due to Block                  p - 1         \sum_{i=1}^{p} R_{i.}^2/N_{i.} - R^2/N
Residual                      (q-1)(p-1)    N - \sum_{i=1}^{p} R_{i.}^2/N_{i.} - \sum_{j=1}^{q} R_{.j}^2/N_{.j} + R^2/N
Total                         N - 1         N - R^2/N


\sum_{j=1}^{q} R_{.j}^2/N_{.j} = 19.9957        \sum_{i=1}^{p} R_{i.}^2/N_{i.} = 19.9568        R^2/N = 19.9564

p = 4,    q = 5,    N = 20

which gives the following ANOVA table.

Table 11.3.3 Analysis of Variance Table

Source of Variation           d.f.   Measure of Variation   Mean MV     F
Due to schizophrenic level    4      0.039263               0.009816    30.311
Due to Block                  3      0.000414               0.000138    0.426
Residual                      12     0.003886               0.0003288
Total                         19     0.043563

From concentration parameter approximation (3.3.11), \hat{\kappa} \approx 459; as \hat{\kappa} is so large the correction factor \beta may be neglected (\beta = 1.00043). Hence F_{4,12} = 30.311 and as the table values are F_{4,12}(0.05) = 3.26 and F_{4,12}(0.01) = 5.41, the analysis shows that the optical activity of the compound differs significantly between the levels of schizophrenic behaviour recognised by the psychiatrist.

Testing the difference between the blocks gives F_{3,12} = 0.426; as the table value of F_{3,12}(0.05) = 3.49, the analysis indicates no significant difference between the 4

blocks under investigation.
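The following minimal sketch (not from the thesis) recomputes the F-ratios of Table 11.3.3 from the summary statistics quoted above; the variable names are illustrative and the \beta formula is the reconstruction given earlier.

```python
# Randomised complete block Example 11.3.1: F-ratios (11.3.2)-(11.3.3)
sum_levels = 19.9957    # sum over levels of R_.j^2 / N_.j
sum_blocks = 19.9568    # sum over blocks of R_i.^2 / N_i.
R2_over_N = 19.9564
N, p, q = 20, 4, 5      # p blocks, q schizophrenic levels

level_mv = sum_levels - R2_over_N                      # q - 1 d.f.
block_mv = sum_blocks - R2_over_N                      # p - 1 d.f.
residual = N - sum_levels - sum_blocks + R2_over_N     # (p - 1)(q - 1) d.f.

F_levels = (level_mv / (q - 1)) / (residual / ((p - 1) * (q - 1)))
F_blocks = (block_mv / (p - 1)) / (residual / ((p - 1) * (q - 1)))

kappa_hat = 459.0       # approximation (3.3.11); beta is then negligibly above 1
beta = 1.0 / (1.0 - 1.0 / (5 * kappa_hat) - 1.0 / (10 * kappa_hat ** 2))
print(round(F_levels, 2), round(F_blocks, 2), round(beta, 5))
# about 30.2, 0.41 and 1.00044; the thesis quotes 30.311, 0.426 and 1.00043
# from the unrounded sums
```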


Example 11.3.2

This example of a randomised complete block analyses a hypothetical data set where

the concentration parameter is close to two. As was discussed in Chapter 6, a data

set of this nature may not be analysed via extension of the original techniques.

Table 11.3.4 shows the hypothetical data set which will be used to illustrate the

application and robustness of the new approach.

Table 11.3.4

                               Treatments
Block           1         2        3         4         5        6         Angular Mean
1               311°      299°     338°      298°      286°     305°      305.966°
2               326°      39°      354°      47°       10°      354°      8.283°
3               10°       45°      309°      319°      25°      339°      345.974°
4               55°       48°      54°       17°       29°      69°       45.440°
Angular Mean    353.98°   7.657°   351.89°   349.795°  2.256°   353.053°

As with all the previous examples the homogeneity of the concentration parameters

must be tested prior to the analysis of variance. Test statistic values of

U_3 = 0.0821 (distributed as \chi^2_5) for the 6 treatments, and U_3 = 2.7967 (distributed as \chi^2_3) for the 4 blocks, indicates that the concentration parameters may be regarded

as homogeneous for both factors.


Using test statistics (11.3.2) and (11.3.3) and the Analysis of Variance Table 11.3.2 from the previous example, the resulting statistics are obtained:

\sum_{i=1}^{p} R_{i.}^2/N_{i.} = 20.519145        R^2/N = 13.360056        \sum_{j=1}^{q} R_{.j}^2/N_{.j} = 13.539216

N = pq = 24

which gives the following ANOVA table

Table 11.3.5 Analysis of Variance

Source of Variation   d.f.   Measure of Variation   Mean MV    F
Due to Treatments     3      0.178658               0.05955    0.2705
Due to Blocks         5      7.158587               1.43172    6.5035
Residual              15     3.302197               0.22015
Total                 23     10.639442

From concentration parameter approximation (3.2.7), \hat{\kappa} = 2.34, hence \beta = 1.1159.

The modified F ' value for the testing of the differing blocks becomes 1.1159 x

6.5035 = 7.257 and as the table values of F_{5,15}(0.05) = 2.9 and F_{5,15}(0.01) =

4.56, the analysis shows that there is a significant difference between the blocks.

The modified F ' value for the testing of the differing treatments becomes 1.1159 x

0.2705 = 0.3019. The table value of F_{3,15}(0.05) = 3.29 indicates that there is no

significant difference between the observed values within the treatments. Figures

11.3.1 and 11.3.2 illustrate the differences between treatments and blocks, and help


to understand and confirm the results obtained. Figure 11.3.1 shows the large spread between the 4 blocks, whilst Figure 11.3.2 shows little spread between the 6 treatments, both confirmed by the analysis of variance.

Figure 11.3.1 Angular Mean Responses for the Block Effects

Figure 11.3.2 Angular Mean Responses for the Treatment Effects


11.4 Two-way Classification with Interaction

Example 11.4.1 for large k

Table 11.4.1, overleaf, gives a hypothetical example data set for the two-way

classification design with large k. The data may, for example, be representative of

the time of onset of an illness in relation to new drugs and differing groups of

people.

The two-way analysis components of variation are given by:

    \kappa \left[ N - \frac{R^2}{N} \right]
      = \kappa \left[ \sum_{i=1}^{p} \frac{R_{i..}^2}{N_{i..}} - \frac{R^2}{N} \right]
      + \kappa \left[ \sum_{j=1}^{q} \frac{R_{.j.}^2}{N_{.j.}} - \frac{R^2}{N} \right]
      + \kappa \left[ \sum_{i=1}^{p} \sum_{j=1}^{q} \frac{R_{ij.}^2}{N_{ij.}} - \sum_{i=1}^{p} \frac{R_{i..}^2}{N_{i..}} - \sum_{j=1}^{q} \frac{R_{.j.}^2}{N_{.j.}} + \frac{R^2}{N} \right]
      + \kappa \left[ N - \sum_{i=1}^{p} \sum_{j=1}^{q} \frac{R_{ij.}^2}{N_{ij.}} \right]                (11.4.1)

Testing the homogeneity of concentration parameters between cells produces a test

statistic of U_3 = 0.682 (distributed as \chi^2_5), indicating the equality of concentration

parameters. Table 11.4.2 provides the analysis of variance statistics required.


Table 11.4.1 Two-way Classification
(Hypothetical angular observations for the p levels of Factor B and q levels of Factor A, with cell resultants R_{ij} and cell sizes N_{ij}; the derived summary statistics are given with Table 11.4.2.)


Table 11.4.2 Analysis of Variance Table

Source of Variation   d.f.          Measure of Variation
Due to Factor A       q - 1         \sum_{j=1}^{q} R_{.j.}^2/N_{.j.} - R^2/N
Due to Factor B       p - 1         \sum_{i=1}^{p} R_{i..}^2/N_{i..} - R^2/N
Interaction AB        (p-1)(q-1)    \sum_{i=1}^{p} \sum_{j=1}^{q} R_{ij.}^2/N_{ij.} - \sum_{i=1}^{p} R_{i..}^2/N_{i..} - \sum_{j=1}^{q} R_{.j.}^2/N_{.j.} + R^2/N
Residual              pq(l-1)       N - \sum_{i=1}^{p} \sum_{j=1}^{q} R_{ij.}^2/N_{ij.}
Total                 N - 1         N - R^2/N

\sum_{i=1}^{p} R_{i..}^2/N_{i..} = 21.601414        R^2/N = 17.935976
\sum_{j=1}^{q} R_{.j.}^2/N_{.j.} = 22.44784        \sum_{i=1}^{p} \sum_{j=1}^{q} R_{ij.}^2/N_{ij.} = 28.550744

p = 2,    q = 3,    l = 5,    N = pql = 30


which gives the following ANOVA table:

Table 11.4.3 Analysis of Variance Table

Source of Variation   d.f.   Measure of Variation   Mean MV     F
Due to Factor A       2      4.51186                2.25593     37.35877
Due to Factor B       1      3.66544                3.66544     60.7005
Interaction AB        2      2.43747                1.218735    20.1825
Residual              24     1.44925                0.060385
Total                 29     12.06402

From concentration parameter approximation (3.3.10), \hat{\kappa} = 2.57, hence \beta = 1.1025.

The modified test for the null hypothesis of no significant difference between the q

levels of Factor A provides the F-ratio:

    F_{(q-1),pq(l-1)} = \beta \, \frac{pq(l-1) \left[ \sum_{j=1}^{q} R_{.j.}^2/N_{.j.} - R^2/N \right]}{(q-1) \left[ N - \sum_{i=1}^{p} \sum_{j=1}^{q} R_{ij.}^2/N_{ij.} \right]}                (11.4.2)

Therefore F'_{2,24} = 1.1025 x 37.35877 = 41.188 and as the table values of F_{2,24}(0.05) = 3.4 and F_{2,24}(0.01) = 5.61, the analysis indicates a very large significant

difference between the q levels of Factor A.


Similarly, F-ratios (11.4.3) and (11.4.4) provide the null hypothesis tests with regards

to the p levels of Factor B and the pq levels of interaction AB respectively.

    F_{(p-1),pq(l-1)} = \beta \, \frac{pq(l-1) \left[ \sum_{i=1}^{p} R_{i..}^2/N_{i..} - R^2/N \right]}{(p-1) \left[ N - \sum_{i=1}^{p} \sum_{j=1}^{q} R_{ij.}^2/N_{ij.} \right]}                (11.4.3)

    F_{(p-1)(q-1),pq(l-1)} = \beta \, \frac{pq(l-1) \left[ \sum_{i=1}^{p} \sum_{j=1}^{q} R_{ij.}^2/N_{ij.} - \sum_{i=1}^{p} R_{i..}^2/N_{i..} - \sum_{j=1}^{q} R_{.j.}^2/N_{.j.} + R^2/N \right]}{(p-1)(q-1) \left[ N - \sum_{i=1}^{p} \sum_{j=1}^{q} R_{ij.}^2/N_{ij.} \right]}                (11.4.4)

F'_{1,24} = 1.1025 x 60.7005 = 66.922 and as the table values of F_{1,24}(0.05) = 4.26 and F_{1,24}(0.01) = 7.82, the analysis shows there to be a very large significant difference between the observed results for Factor B. Testing the difference between the interaction terms AB shows a similarly large significant difference, F'_{2,24} = 22.25.
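A short sketch (not from the thesis) that recomputes the three modified F-ratios (11.4.2)-(11.4.4) from the summary statistics listed under Table 11.4.2; variable names are illustrative and \beta follows the reconstructed correction formula.

```python
# Two-way Example 11.4.1: modified F-ratios from the quoted summary statistics
sum_A = 22.44784       # sum over columns of R_.j.^2 / N_.j.
sum_B = 21.601414      # sum over rows    of R_i..^2 / N_i..
sum_cells = 28.550744  # sum over cells   of R_ij.^2 / N_ij.
R2_over_N = 17.935976
p, q, l = 2, 3, 5
N = p * q * l          # 30

mv_A = sum_A - R2_over_N
mv_B = sum_B - R2_over_N
mv_AB = sum_cells - sum_A - sum_B + R2_over_N
residual = N - sum_cells
df_err = p * q * (l - 1)                       # 24

kappa_hat = 2.57
beta = 1.0 / (1.0 - 1.0 / (5 * kappa_hat) - 1.0 / (10 * kappa_hat ** 2))
F_A = beta * (mv_A / (q - 1)) / (residual / df_err)
F_B = beta * (mv_B / (p - 1)) / (residual / df_err)
F_AB = beta * (mv_AB / ((p - 1) * (q - 1))) / (residual / df_err)
print(round(beta, 4), round(F_A, 1), round(F_B, 1), round(F_AB, 2))
# about 1.1025, 41.2, 66.9 and 22.25, in agreement with the values above
```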

Figures 11.4.1, 11.4.2 and 11.4.3 emphasize the observed differences found between

the q levels of Factor A, the p levels of Factor B, and the pq levels of interaction

respectively.



Figure 11.4.1 Angular Differences between the q Levels of Factor A


Figure 11.4.2 Angular Differences between the p levels of Factor B

Figures 11.4.1 and 11.4.2 illustrate the angular mean differences between the q and p

levels of Factors A and B; the solid lines represent the angular mean for each level,

while the dotted line indicates the associated range of values accounting for that

mean. The larger significant differences between levels within both factors are

clearly seen.


(a) Difference between levels A_0 and A_1 of Factor A

(b) Difference between levels A_1 and A_2 of Factor A

(c) Difference between levels B_0 and B_1 of Factor B

Figure 11.4.3 Mean Responses Indicating Interaction

Figures (a)-(c) indicate the presence of interaction between the two Factors,

represented by the length of the line segments between the differing levels.


The hypothetical example above shows a design data set where all the differing

groupings (cells, rows, columns and the overall sample) produce a large value of concentration

parameter. The problems arise, in the majority of cases, where the concentration

parameters for the differing groupings do not remain large and equal. As the

complexity of designs grow the chance that all the cells, rows, columns etc have the

same large and equal concentration parameter becomes highly unlikely. Confusion

arises in the original 'simple' one-way analysis when the q samples all have large

equal k, but their combined sample gives a small k. Batschelet (1981) assumes that

the parameter of concentration has the same value in each population and that k is

found from the average of the sample resultant lengths, and therefore ignores the

combined overall sample. The test statistic derived by Watson and Williams (1956) is

based, however, on the combined overall sample concentration parameter k, and it is

this value that should be used. This problem limits the usage of the original test

statistic and was highlighted in the data set of Example 7.2.1 where a two-way

design was examined. The new generalised approach was constructed in such a

manner as not to breakdown under such conditions. The following example

re-analyses the data set of Example 7.2.1, this time using the new approach.

Example 11.4.2 for small k

In Example 7.2.1 the possible breakdown of the original extended techniques was

discussed; here this data set is re-analysed via the new approach. Using the

component statistics of (11.4.1) and the analysis of variance Table 11.4.2 the

following ANOVA Table is produced:


Table 11.4.4 Analysis of Variance Table

Source of Variation   d.f.   Measure of Variation
Due to Factor A       1      9.33114
Due to Factor B       1      8.83213
Interaction AB        1      0.02437
Residual              16     1.79360
Total                 19     19.98129

\sum_{i=1}^{p} R_{i..}^2/N_{i..} = 8.85089        R^2/N = 0.01871        \sum_{j=1}^{q} R_{.j.}^2/N_{.j.} = 9.34985

\sum_{i=1}^{p} \sum_{j=1}^{q} R_{ij.}^2/N_{ij.} = 18.20640        p = 2,    q = 2,    l = 5,    N = pql = 20

In Example 7.2.1 the sum of the two mean effects measure of variation was greater

than the total measure of variation and consequently the interaction term was

negative. The first, and most important, property shown in Table 11.4.4 is that the

component measures of variation for the new approach remain positive and sum to

the associated total measure of variation.

The concentration parameters of Table 7.2.1 may be shown to be equal within cells

and between factors; however, the overall k is very small. Chapter 10 has shown

the residual and total measures are not chi-squared distributed when k is small and

therefore the F-ratio cannot be computed, and examination of the main effects and


interaction must be carried out using the associated chi-squared test statistic for small k.

Test Statistic

    \hat{\kappa} = 0.0612,     \frac{2}{1 - \hat{\rho}^2} = 2.00187,     where  \hat{\rho} = A(\hat{\kappa}) = I_1(\hat{\kappa})/I_0(\hat{\kappa})

Between measure of variation for Factor A:

    \frac{2}{1 - \hat{\rho}^2} \left[ \sum_{j=1}^{q} \frac{R_{.j.}^2}{N_{.j.}} - \frac{R^2}{N} \right] = 2.00187 x 9.33114 = 18.68

This is distributed as \chi^2_{2(q-1)}; from tables \chi^2_2(0.05) = 5.991 and \chi^2_2(0.10) = 4.605,

indicating a significant difference between the observed responses of Factor A.

Between measure of variation for Factor B:

    \frac{2}{1 - \hat{\rho}^2} \left[ \sum_{i=1}^{p} \frac{R_{i..}^2}{N_{i..}} - \frac{R^2}{N} \right] = 2.00187 x 8.83213 = 17.68

This is distributed as \chi^2_{2(p-1)}, once again indicating a significant difference between the observed responses of Factor B.

Between measure of variation for the interaction term AB:

    \frac{2}{1 - \hat{\rho}^2} \left[ \sum_{i=1}^{p} \sum_{j=1}^{q} \frac{R_{ij.}^2}{N_{ij.}} - \sum_{i=1}^{p} \frac{R_{i..}^2}{N_{i..}} - \sum_{j=1}^{q} \frac{R_{.j.}^2}{N_{.j.}} + \frac{R^2}{N} \right] = 2.00187 x 0.02437 = 0.0488

This is distributed as \chi^2_{2(p-1)(q-1)}, indicating that no interaction exists between Factors A

and B.
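A minimal sketch (not from the thesis) of the small-\kappa chi-squared tests above, computed from the measures of variation in Table 11.4.4; the value of \hat{\rho} is the one implied by the quoted \hat{\kappa} = 0.0612.

```python
# Example 11.4.2 (small kappa): chi-squared statistics from Table 11.4.4
mv_A, mv_B, mv_AB = 9.33114, 8.83213, 0.02437
rho_hat = 0.0306                        # A(0.0612), approximately kappa_hat / 2
scale = 2.0 / (1.0 - rho_hat ** 2)      # about 2.0019

chi_A = scale * mv_A      # distributed as chi-square with 2(q - 1) d.f.
chi_B = scale * mv_B      # distributed as chi-square with 2(p - 1) d.f.
chi_AB = scale * mv_AB    # distributed as chi-square with 2(p - 1)(q - 1) d.f.
print(round(chi_A, 2), round(chi_B, 2), round(chi_AB, 3))
# about 18.68, 17.68 and 0.049; compare each with chi-square_2(0.05) = 5.991
# since p = q = 2 here
```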


This hypothetical data set was designed solely to emphasize the possible breakdown of

the original extended techniques. The main effects were purposely set with maximum

distance from each other to produce a large significant difference within both

factors.

11.5 The Latin or Graeco-Latin Square

Example 11.5.1

The object of the experiment was to ascertain whether a modified annealing

procedure (to heat and then cool slowly to prevent brittleness) could be introduced

into the production of light gauge domestic copper tube. The original experiment

measured the subsequent tensile strength of the tube (Davies (1963)); here the results

will be taken as the breaking angle of the tube. In deciding the form of the tests it

is necessary to consider possible causes of variation in the results, including variation

in the material itself and variations in temperature over the annealing furnace.

Material variations are studied by taking samples of eight tubes at random on each

of eight days spread over a period of three weeks, thus allowing normal process

variations to be covered adequately. The 64 tubes were held in the furnace in a jig

having eight horizontal rows and eight vertical columns of holes, i.e. an 8 x 8

square. The construction of the furnace indicated that temperature variations, if

present, would be horizontal or vertical, and no appreciable interaction was expected

between rows and columns.

The absence of appreciable interaction enabled factors to be examined by means of a

Latin or Graeco-Latin Square. For this example the Graeco-Latin arrangement is

employed in the experiment, so that two factors each of eight treatments could be

allotted to the 64 cells in such a way that the two factors and row and column

effects could be determined independently. The two factors were:


1. Day of Manufacture

2. Number allotted to an individual tube within the sample of eight.

From the point of view of this example the second is a dummy factor, since the

numbering of the tubes did not correspond to any physical reality.

A separate square selected at random was used for each temperature, and the rows,

columns, and letters were themselves randomised. The results for a nominal

temperature of 300 °C are shown in Table 11.5.1, in which the rows and columns

represent positions in the furnace, the letters A to H the day of manufacture, and

the numbers 1 to 8 the designation within the sample. The figures in brackets are

the associated breaking angles.

The Latin or Graeco-Latin square components are given by:

    \kappa \left[ N - \frac{R^2}{N} \right]
      = \kappa \left[ \sum_{i=1}^{p} \frac{R_{i...}^2}{N_{i...}} - \frac{R^2}{N} \right]
      + \kappa \left[ \sum_{j=1}^{p} \frac{R_{.j..}^2}{N_{.j..}} - \frac{R^2}{N} \right]
      + \kappa \left[ \sum_{l=1}^{p} \frac{R_{..l.}^2}{N_{..l.}} - \frac{R^2}{N} \right]
      + \kappa \left[ \sum_{a=1}^{p} \frac{R_{...a}^2}{N_{...a}} - \frac{R^2}{N} \right]
      + \kappa \left[ N - \sum_{i=1}^{p} \frac{R_{i...}^2}{N_{i...}} - \sum_{j=1}^{p} \frac{R_{.j..}^2}{N_{.j..}} - \sum_{l=1}^{p} \frac{R_{..l.}^2}{N_{..l.}} - \sum_{a=1}^{p} \frac{R_{...a}^2}{N_{...a}} + \frac{3R^2}{N} \right]                (11.5.1)

The left-hand side is the total measure of variation; the successive terms on the right measure the variation due to the rows of the jig, the columns of the jig, the day of manufacture (letters), the number designation within the sample, and the residual.
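Before turning to the worked table, the sketch below (not from the thesis) shows how the grouping sums appearing in (11.5.1) can be computed; the function names `measure` and `graeco_latin_components` are illustrative only.

```python
import numpy as np

def measure(deg, labels):
    """Sum over the groups defined by `labels` of R_g^2 / N_g (angles in degrees)."""
    deg = np.asarray(deg, float)
    labels = np.asarray(labels)
    total = 0.0
    for g in np.unique(labels):
        t = np.radians(deg[labels == g])
        total += (np.cos(t).sum() ** 2 + np.sin(t).sum() ** 2) / t.size
    return total

def graeco_latin_components(deg, rows, cols, letters, numbers):
    """Measures of variation corresponding to the terms of (11.5.1)."""
    N = len(deg)
    overall = measure(deg, np.zeros(N))      # R^2 / N for the combined sample
    sums = {name: measure(deg, f) for name, f in
            [("rows", rows), ("columns", cols), ("letters", letters), ("numbers", numbers)]}
    components = {name: s - overall for name, s in sums.items()}
    components["residual"] = N - sum(sums.values()) + 3 * overall
    components["total"] = N - overall
    return components
```

Feeding in the 64 breaking angles together with their row, column, letter and number labels should reproduce the measures of variation of Table 11.5.2, up to rounding.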

Carrying out the analysis of variance we obtain Table 11.5.2.


Table 11.5.1 Tests on Annealing Light Gauge Domestic Copper Tubes
(8 x 8 Graeco-Latin square of breaking angles: rows and columns give the position in the furnace jig, letters A to H the day of manufacture, and numbers 1 to 8 the designation within the sample; the derived summary statistics are given with Table 11.5.2.)


Table 11.5.2 Analysis of Variance of Data from Table 11.5.1

Source of Variation       d.f.   Measure of Variation   Mean MV          F
Between rows of jig       7      1.657 x 10^-3          2.367 x 10^-4    1.5
Between columns of jig    7      2.347 x 10^-3          3.353 x 10^-4    2.12 *
Between positions         14     4.004 x 10^-3          2.860 x 10^-4    1.81 *
Between lots (letters)    7      1.4717 x 10^-2         2.102 x 10^-3    13.32 **
Between numbers           7      9.83 x 10^-4           1.404 x 10^-4
Residual                  35     5.646 x 10^-3          1.613 x 10^-4
Numbers + Residual        42     6.629 x 10^-3          1.578 x 10^-4
Total                     63     2.535 x 10^-2

\sum_{i=1}^{p} R_{i...}^2/N_{i...} = 63.976307        \sum_{j=1}^{p} R_{.j..}^2/N_{.j..} = 63.976997
\sum_{l=1}^{p} R_{..l.}^2/N_{..l.} = 63.989367        \sum_{a=1}^{p} R_{...a}^2/N_{...a} = 63.975633

R^2/N = 63.974650        N = 64,    p = 8


The concentration parameter approximation is so large that the correction factor \beta may be neglected (\beta = 1).

* = significant at the 10% level          ** = highly significant (1% level)

As expected, the mean measure of variation of the factor representing number

designation within the sample does not differ significantly from the residual mean measure of

variation. The number factor may therefore be combined with that for the residual

to provide a new estimate of error with 42 degrees of freedom.

Variation between columns (i.e. across the furnace) and total variation among

positions (variation over all positions in the furnace) may be judged significantly

different at the 10 percent level. Clearly the major factor causing variation in the

breaking angle was variation in the lots of tubes; position in the furnace is of little

or no importance.

F_{7,42}(0.10) = 1.87        F_{14,42}(0.10) = 1.66
F_{7,42}(0.05) = 2.25        F_{14,42}(0.05) = 1.92
F_{7,42}(0.01) = 3.12        F_{14,42}(0.01) = 2.52


11.6 The Split Plot Design

Although the theory behind a design such as the split plot has not been derived and

fully checked, the new generalised approach is robust enough (particularly for large

k) to analyse any experimental design if the components are adapted appropriately

and the assumptions are checked sufficiently.

Example 11.6.1

In an experiment on the preparation of chocolate cakes three recipes for preparing

the mixture were compared. Recipes I and II differed in that the chocolate was added at 40 °C and 60 °C, respectively, while Recipe III contained extra sugar. For

each recipe enough mixture was made for six cakes. These 18 cakes were then

placed in an oven which was then heated slowly. When the temperature had

reached 175 °C, three cakes, one from each recipe, were selected at random for removal, another three at 185 °C, and so on until the last three cakes were removed at 225 °C. In this manner the recipes are representative of the "whole-unit"

treatments, while the baking temperatures are representative of the "sub-unit"

treatments. There were 7 replications; one replication was completed before starting

the next, so that differences among replicates represented a time difference.

A number of measurements were made on the cakes. Table 11.6.1 presents the

measurements of the breaking angle. One half of the cake was held fixed, while the

other half was pivoted about the middle until breakage occurred.


The split plot components of variation are given by:

    \kappa \left[ N - \frac{R^2}{N} \right]
      = \kappa \left[ \sum_{i=1}^{p} \frac{R_{i..}^2}{N_{i..}} - \frac{R^2}{N} \right]
      + \kappa \left[ \sum_{j=1}^{q} \frac{R_{.j.}^2}{N_{.j.}} - \frac{R^2}{N} \right]
      + \kappa \left[ \sum_{l=1}^{m} \frac{R_{..l}^2}{N_{..l}} - \frac{R^2}{N} \right]
      + \kappa \left[ \sum_{i=1}^{p} \sum_{j=1}^{q} \frac{R_{ij.}^2}{N_{ij.}} - \sum_{i=1}^{p} \frac{R_{i..}^2}{N_{i..}} - \sum_{j=1}^{q} \frac{R_{.j.}^2}{N_{.j.}} + \frac{R^2}{N} \right]
      + \kappa \left[ \sum_{i=1}^{p} \sum_{l=1}^{m} \frac{R_{i.l}^2}{N_{i.l}} - \sum_{i=1}^{p} \frac{R_{i..}^2}{N_{i..}} - \sum_{l=1}^{m} \frac{R_{..l}^2}{N_{..l}} + \frac{R^2}{N} \right]
      + \kappa \left[ \sum_{j=1}^{q} \sum_{l=1}^{m} \frac{R_{.jl}^2}{N_{.jl}} - \sum_{j=1}^{q} \frac{R_{.j.}^2}{N_{.j.}} - \sum_{l=1}^{m} \frac{R_{..l}^2}{N_{..l}} + \frac{R^2}{N} \right]
      + \kappa \left[ N - \sum_{i=1}^{p} \sum_{j=1}^{q} \frac{R_{ij.}^2}{N_{ij.}} - \sum_{i=1}^{p} \sum_{l=1}^{m} \frac{R_{i.l}^2}{N_{i.l}} - \sum_{j=1}^{q} \sum_{l=1}^{m} \frac{R_{.jl}^2}{N_{.jl}} + \sum_{i=1}^{p} \frac{R_{i..}^2}{N_{i..}} + \sum_{j=1}^{q} \frac{R_{.j.}^2}{N_{.j.}} + \sum_{l=1}^{m} \frac{R_{..l}^2}{N_{..l}} - \frac{R^2}{N} \right]                (11.6.1)
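As a computational aid (not part of the thesis), the sketch below assembles the grouping sums appearing in (11.6.1) from arrays of angles and factor labels; the function names and the label arguments `recipe`, `temp` and `rep` are illustrative.

```python
import numpy as np

def group_measure(deg, labels):
    """Sum over the groups defined by `labels` of R_g^2 / N_g (angles in degrees)."""
    deg = np.asarray(deg, float)
    labels = np.asarray(labels)
    out = 0.0
    for g in np.unique(labels):
        t = np.radians(deg[labels == g])
        out += (np.cos(t).sum() ** 2 + np.sin(t).sum() ** 2) / t.size
    return out

def split_plot_components(deg, recipe, temp, rep):
    """Measures of variation corresponding to the terms of (11.6.1)."""
    deg = np.asarray(deg, float)
    N = deg.size
    overall = group_measure(deg, np.zeros(N, int))       # R^2 / N
    s_i = group_measure(deg, recipe)
    s_j = group_measure(deg, temp)
    s_l = group_measure(deg, rep)
    s_ij = group_measure(deg, [f"{a}|{b}" for a, b in zip(recipe, temp)])
    s_il = group_measure(deg, [f"{a}|{b}" for a, b in zip(recipe, rep)])
    s_jl = group_measure(deg, [f"{a}|{b}" for a, b in zip(temp, rep)])
    return {
        "recipes": s_i - overall,
        "temperatures": s_j - overall,
        "replications": s_l - overall,
        "recipe x temperature": s_ij - s_i - s_j + overall,
        "recipe x replication": s_il - s_i - s_l + overall,
        "temperature x replication": s_jl - s_j - s_l + overall,
        "residual": N - s_ij - s_il - s_jl + s_i + s_j + s_l - overall,
        "total": N - overall,
    }
```

Applied to the 126 breaking angles of Table 11.6.1 with their recipe, temperature and replication labels, this should reproduce the measures of variation summarised in Table 11.6.2, up to rounding.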

Carrying out the analysis of variance we obtain Table 11.6.2.


Table 11.6.1 Examination of Breaking Angles

                                    Temperature
              Replications   175°   185°   195°   205°   215°   225°   Angular Mean

Recipe I           1          42     46     47     39     53     42
                   2          47     29     35     47     57     45
                   3          32     32     37     43     45     45
                   4          26     32     35     24     39     26     35.37°
                   5          28     30     31     37     41     47
                   6          24     22     22     29     35     26
                   7          26     23     25     27     33     35

Recipe II          1          39     46     51     49     55     42
                   2          35     46     47     39     52     61
                   3          34     30     42     35     42     35
                   4          25     26     28     46     37     37     36.02°
                   5          31     30     29     35     40     36
                   6          24     29     29     29     24     35
                   7          22     25     26     26     29     36

Recipe III         1          46     44     45     46     48     63
                   2          43     43     43     46     47     58
                   3          33     24     40     37     41     38
                   4          38     41     38     30     36     35     36.14°
                   5          21     25     31     35     33     23
                   6          24     33     30     30     37     35
                   7          20     21     31     24     30     33

Angular Mean                 31.42°  32.22°  35.33°  35.85°  40.66°  39.62°   35.84°


Table 11.6.1 Continued

Replication      1        2        3        4        5        6        7
Angular Mean     46.82°   45.55°   36.95°   33.28°   32.38°   28.72°   27.33°

Table 11.6.2 Analysis of Variance of Data from Table 11.6.1

Source of Variation                       d.f.   Measure of Variation   Mean MV       F
Between Recipes                           2      0.00207                0.001035      0.081
Between Replications                      6      1.92835                0.3213916     25.169 *
Interaction Recipe/Replication            12     0.15323                0.0127691
Between Temperatures                      5      0.44041                0.088082      13.47 *
Interaction Recipe/Temperature            10     0.06539                0.006539
Interaction Replication/Temperature       30     0.17493                0.005831      0.677
Residual                                  60     0.51635                0.00860583
Total                                     125    3.28073


\sum_{i=1}^{p} R_{i..}^2/N_{i..} = 122.72134        \sum_{j=1}^{q} R_{.j.}^2/N_{.j.} = 123.15968        \sum_{l=1}^{m} R_{..l}^2/N_{..l} = 124.64762

\sum_{i=1}^{p} \sum_{j=1}^{q} R_{ij.}^2/N_{ij.} = 123.22714        \sum_{i=1}^{p} \sum_{l=1}^{m} R_{i.l}^2/N_{i.l} = 124.80292        \sum_{j=1}^{q} \sum_{l=1}^{m} R_{.jl}^2/N_{.jl} = 125.26296

R^2/N = 122.71927        p = 3,    q = 6,    m = 7,    N = pqm = 126

* = Highly significant (1% level)

The concentration parameter approximation is large and therefore the correction factor may be neglected (\beta = 1).

From tables:    F_{2,12}(0.05) = 3.89
                F_{6,12}(0.05) = 3.00        F_{6,12}(0.01) = 4.82
                F_{5,10}(0.05) = 3.33        F_{5,10}(0.01) = 5.64
                F_{30,60}(0.05) = 1.65

The analysis of variance Table 11.6.2 indicates that there was no significant

difference observed between the recipes. This is shown more clearly from

examination of the angular mean breaking angles associated with the three recipes,

given in Table 11.6.1, where little difference may be noted. Variation between

replicates and temperatures, however, is significantly large, and these are the dominating

factors causing variation in the breaking angle. The angular mean breaking angle is

seen to increase as the temperature of the oven increases, which is an

understandable effect. The angular mean breaking angle of the replications, however,


show an even clearer decrease in value as the experiments are carried out. This

heavy dependence on time is less understandable and may indicate, for example, that another factor or condition is affecting the oven state.

11.7 Summary

This chapter has shown how the new generalised approach may be applied to real

situations where directional data values are measured. The effect of large and small

concentration parameters have been emphasized together with the requirement to test

for the homogeneity of concentration parameters. The Graeco-Latin square and split

plot examples have helped to show the suitability of the approach for many

experimental design situations and indicate the method by which further test statistics

may be built.


CHAPTER 12

SYNOPSIS OF RESULTS AND CONCLUSIONS

The aims of this study have been to extend the knowledge of the present methods of

analysis of directional data and to develop suitable analysis of variance techniques for

differing experimental design models. These objectives have been achieved, although

it has not been possible to produce a single unified approach for both large and

small concentration parameter. However, separate techniques were produced under a

generalised approach which has been shown to be applicable to many experimental

design problems. To obtain this approach the work has followed several steps

individually discussed within the thesis.

Before discussing the development of the new techniques an understanding of the

development, constructions and distributional form of the original methods was

required. Within the first four chapters a simple yet informative review of the many

approximations for the concentration parameter k , both large and small, was given.

This work included the plotting of the residuals and relative residuals for each

approximation to indicate the accuracy and range of application for the statistics.

Several were shown to be inappropriate despite their complex form. The analysis

culminated in a summary table of the 'best' and 'best simple' approximations for

both large and small k (Table 3.5.1) and indicated the required approximations for

use with the techniques to be developed.

Chapter 5 discussed a generalised linear modelling approach, analogous to the normal

theory of linear regression, in order to estimate the individual parameter values for

application to the maximum likelihood method. The work showed how the

observations may be 'added together' under the constraint that the sum of the sines

of the factor effects equals zero. Using this fact parameter estimates may be found


for the one-way analysis of variance giving further understanding to the underlying

structure of the data. The approach, under the above constraint, was found to

produce the original one-way analysis of variance technique developed by Watson and

Williams. When applied to larger experimental design problems, however, the

optimum of the circular constrained simultaneous equations could not be found,

without very good initial estimates, due to numerous local maxima. Further work in

this particular area, using improved techniques for convergence and ever increasing

improvements in computing methods, is worthy of investigation.

Although it would have been useful to have produced a computer optimisation

program to solve the constrained equations, the constant need for a computer

program as a general method for analysing circular experimental designs, would be

rather restrictive. Therefore, a simple construction of an analysis of variance was

still required. Chapter 6 investigated the possibility of extending the original

approach, with large k , for other designs such as the nested or hierarchical, the

randomised complete block and the two-way design with interaction. The methods

were seen to extend for these designs (k > 2 ) and with good chi-squared

approximations. However the assumption of equal and large k must hold, not only

for the cell, column and row observations, but for the overall sample. This, under

larger and larger designs i.e. with an increasing number of factors, is extremely

restrictive. For example, in situations where the individual row and column factors

have large concentration parameters and the mean directions differ, the overall

concentration parameter may be small. Under these conditions within a two-way

design the 'sum of squares' for the factor effects may add to a value greater than

the total 'sum of squares'. A full investigation followed in Chapter 7 where the

one-way analysis was reconstructed via a regression approach using basic vector

analysis to examine the make-up of the individual components. This brought to light

the possible lack of independence between the model components and the existence


of a non-zero cross-product term. This questioned not only the use of the original

techniques for larger designs but the adequacy of the original one-way analysis.

The development of the new approach needed to overcome the faults of the original

techniques but be relatively simple and, if possible, be capable of generalising across

all designs. The vector approach used to investigate the original techniques was used

again to construct the new test statistics. The method utilizes the resultant lengths

associated with each sample mean direction, in order that when sample means are

combined the overall mean direction is still obtained. The vector approach

minimised the chord distance between mean directions to construct the test statistics.

Nevertheless, this is not only testing the difference between mean directions but the

associated concentration parameters as well. Hence, a separate examination of the

individual concentration parameters must also be carried out for a valid test of the

mean directions to be feasible.

The beauty of the construction of components in this manner is its simplicity and, in

comparison to the original techniques, the independence of the individual components

and the zero cross product terms produced. The interpretation and calculation of

interaction is discussed and the generalised nature of the technique is illustrated in

the construction and proof of the interaction component.

As with Stephens (1969) and Upton (1970) an attempt was made to evaluate

numerically the exact distribution of the old and new test statistics. However, even

for the simplest single sample test statistics the numerical integration involved was

found to be very tedious.

R does not have a simply stated density function, and indeed a direct computation of

the significance points of R/N or R^2/N is not simple.


The majority of the tests are complicated by involving R_1, R_2 and R, and, since

these statistics are not for the most part independent of one another, either one or

another of the conditional distributions needs to be employed. These distributions are

exceedingly complicated as are the relevant bounds for integration, and no results of

any use were obtained by attempting to integrate numerically.

Alternative investigations were carried out using the power series expansion of \rho to

examine the moments of the test statistics to compare with their associated

chi-squared distributions. This showed good approximations to the first moment but

worsening accuracy to higher moments, depending on the size of k. In a similar

manner to the adjustment advocated by Stephens, an improvement factor is produced to increase the accuracy of the test statistics.

The simulation techniques, discussed in Appendix B, have been successful in obtaining

the characteristics of the von Mises distribution and hence the various test statistics.

Elaborate examination of the components for varying designs, concentration parameter

and sample size have been carried out together with tests of the statistics' power and

robustness in comparison to any available alternative techniques. Compared to the

alternatives in the one-way analysis the new technique is seen as slightly less accurate

and powerful, although it does possess desirable properties as previously described.

For small concentration parameter the new approach, as with alternative tests, can

only examine the between measure of variation, seen to be chi-squared distributed.

No account can therefore be taken of the residual variation.

Following further justification for the new approach via examination of the

randomised complete block and two-way analysis designs, the test statistics were

successfully applied to differing problems with varying concentration parameter.


APPENDIX A

INDEX OF NOTATION

A(\kappa) or \rho : ratio of I_1(\kappa) to I_0(\kappa)
C : sum of cosines of angles
\bar{C} : mean of cosines of angles
F(\theta) : distribution function
f(\theta) : circular density
H_0, H_1 : null and alternative hypotheses
I_p(\kappa) : modified Bessel function of the first kind and order p
i : subscript ranging from 1, 2, ..., p
j : subscript ranging from 1, 2, ..., q
l : subscript ranging from 1, 2, ..., m
m : number of levels for factor 3
N : size of sample
p : number of levels for factor 1
q : number of levels for factor 2
R : the resultant length
R_i : R for the ith sample
r : mean resultant length of sample
S : sum of sines of angles
\bar{S} : mean of sines of angles
VM(\mu_0, \kappa) : von Mises distribution with mean direction \mu_0 and concentration parameter \kappa
\beta : improvement factor for the new approach
\gamma : Stephens improvement factor
\kappa : parameter of concentration


\mu_0 : population mean direction
\sigma : standard deviation
\theta : circular random variable
\bar{x}_0 or \bar{\theta}_0 : overall mean direction of sample

General Tabular Notation

                                 FACTOR A
                     A_0                A_1                A_2
            x_111               x_121               x_131
            x_112   R_11        x_122   R_12        x_132   R_13        R_1.
    B_0     x_113   N_11        x_123   N_12        x_133   N_13        N_1.
            x_114               x_124               x_134
FACTOR B
            x_211               x_221               x_231
            x_212   R_21        x_222   R_22        x_232   R_23        R_2.
    B_1     x_213   N_21        x_223   N_22        x_233   N_23        N_2.
            x_214               x_224               x_234

                     R_.1               R_.2               R_.3         R
                     N_.1               N_.2               N_.3         N


where
    R_{ij}  :  the resultant length of the cell observations in row i (i = 1, 2, ..., p) and column j (j = 1, 2, ..., q)
    R_{i.}  :  the resultant length of all observations in row i (i = 1, 2, ..., p)
    R_{.j}  :  the resultant length of all observations in column j (j = 1, 2, ..., q)
    R       :  the resultant length of all observations in the sample
    N       :  the total number of observations in the sample
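The following small sketch (not from the thesis) illustrates the dot notation above on a randomly generated 2 x 3 layout with four observations per cell; the data and names are purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(0, 360, size=(2, 3, 4))          # x[i, j, l]: angles in degrees

def R(deg):
    """Resultant length of a set of angles given in degrees."""
    t = np.radians(np.asarray(deg, float).ravel())
    return np.hypot(np.cos(t).sum(), np.sin(t).sum())

R_cells = np.array([[R(x[i, j]) for j in range(3)] for i in range(2)])  # R_ij
R_rows = np.array([R(x[i]) for i in range(2)])                          # R_i.
R_cols = np.array([R(x[:, j]) for j in range(3)])                       # R_.j
R_all = R(x)                                                            # R
N = x.size                                                              # N = 24
```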


APPENDIX B

THE SIMULATION AND ACCURACY OF NUMERICAL RESULTS

All the numerical results presented in this thesis are the product of simulations; all

simulations carried out had a minimum of 10,000 samples. For standard simulation a

random number z is generated in the range 0 to 1; for a distribution function F(\theta), the number z corresponds to an observation \theta from this distribution, which is the solution of the equation

F(\theta) = z

For many distributions this equation can be inverted directly to obtain

\theta = F^{-1}(z)

but this is not possible for the von Mises distribution. In addition, generation of a pseudo-random observation from VM(0, \kappa) cannot be obtained by a simple transformation of VM(0, 1) to VM(0, \kappa), and an alternative procedure was used.

The approach used was to fit a probability density function as an envelope around

the von Mises distribution to give an acceptance - rejection method which is both

simple to program and fast for all values of the concentration parameter. Initially

the simplest p.d.f. used was the Uniform function, simple but very slow. Best and

Fisher (1979) produced an algorithm to simulate samples from the von Mises

distribution using the wrapped Cauchy density (Equation 1.3.5) as the p.d.f. for the

envelope. Let f(x) be the p.d.f. of a random variable x which is to be sampled.

Let Y be a random variable with p.d.f. proportional to g(x), an upper envelope for

f(x) (i.e. g(x) \geq f(x)), and let U be a U(0,1) random variable. If (y,u) is a realization

of (Y,U), y is accepted as a realization of x if f(y)/g(y)>u. The distribution of the

accepted values is then exactly the required distribution. Once the observations from

the distribution specified by the null hypothesis are generated, 10,000 sets of samples


of various size and various concentration parameters may be grouped and analysed as

required. The homogeneity of concentration parameters was tested prior to all

analyses.
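For illustration only, the sketch below gives an acceptance-rejection sampler of the kind described above, in the spirit of Best and Fisher (1979); the constants follow the commonly quoted form of their algorithm, the function name is an assumption, and \kappa is assumed strictly positive. It is a sketch, not the thesis's own simulation program.

```python
import numpy as np

def sample_von_mises(mu, kappa, size, rng=None):
    """Acceptance-rejection sampling of VM(mu, kappa) using a wrapped-Cauchy-based
    envelope (Best-Fisher type); kappa > 0, angles returned in radians."""
    rng = np.random.default_rng() if rng is None else rng
    a = 1.0 + np.sqrt(1.0 + 4.0 * kappa ** 2)
    b = (a - np.sqrt(2.0 * a)) / (2.0 * kappa)
    r = (1.0 + b ** 2) / (2.0 * b)
    out = np.empty(size)
    for n in range(size):
        while True:
            u1, u2, u3 = rng.uniform(size=3)
            z = np.cos(np.pi * u1)
            f = (1.0 + r * z) / (r + z)
            c = kappa * (r - f)
            # quick acceptance test first, then the exact logarithmic test
            if c * (2.0 - c) - u2 > 0.0 or np.log(c / u2) + 1.0 - c >= 0.0:
                break
        out[n] = (mu + np.sign(u3 - 0.5) * np.arccos(np.clip(f, -1.0, 1.0))) % (2.0 * np.pi)
    return out

# e.g. theta = sample_von_mises(0.0, 2.0, 10_000); the sample mean resultant
# length should then be close to A(2.0) = I_1(2)/I_0(2), about 0.70.
```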


REFERENCES

ABRAMOWITZ, M. and STEGUN, I.A. (1965) "Handbook of Mathematical Functions". Dover, New York.

AMOS, D.E. (1974) Computation of Modified Bessel Functions and their Ratios. Math. Comp., 28, p 239-51.

BATSCHELET, E. (1965) Statistical Methods for the Analysis of Problems in Animal Orientation and Certain Biological Rhythms. Amer. Inst. Biol. Sciences, Washington.

BATSCHELET, E. (1981) Circular Statistics in Biology. Academic Press.

BEST, D.J. and FISHER, N .I. (1979) Efficient Simulation of the von Mises Distribution. Appl. Statist., 28, No 2, p 152-157.

BICKLEY, W.G. (1957) Bessel Functions and Formula. Cambridge University Press.

BINGHAM, M.S. (1971) Stochastic Processes with Independent Increments taking Values in an Abelian Group. Proc. Lond. Math. Soc. 22, p 507-30.

BINGHAM, M.S. and MARDIA, K.V. (1975) Maximum Likelihood Characterisation of the von Mises Distribution. In "Statistical Distributions in Scientific Work" (G.P. Patil et al., eds), Vol 3, p 387-398. Reidel, Dordrecht.

BREITENBERGER, E. (1963) Analogues of the Normal Distribution on the Circle and the Sphere. Biometrika 50, p 81-8.

CURRAY, J.R. (1956) The Analysis of Two-Dimensional Orientation Data. J. Geol. 64, p 117-31.

DAVIES, O.L. (1963) The Design and Analysis of Industrial Experiments. Second Edition, Oliver and Boyd.

DOBSON, A.J. (1978) Simple Approximations for the von Mises Concentration Statistic. Appl. Statist., 27, p 345-7.

DURAND, D. and GREENWOOD, J.A. (1957) Random Unit Vectors II: Usefulness of Gram-Charlier and Related Series in Approximating Distributions. Ann. Math. Statist. 28, p 978-85.

FISHER, N.I. and LEWIS, T. (1983) Estimating the Common Mean Direction of Several Circular or Spherical Distributions with Differing Dispersions. Biometrika 70, 2, p 333-41.

FISHER, R.A. (1953) Dispersion on a Sphere. Proc. Roy. Soc. Lond. A217, p 295-305.

GADSDEN, R.J. and KANJI, G.K. (1983) Analysis of Clay Strata Orientations. The Statistician 32, p 289-96.

GOULD, A.L. (1969) A Regression Technique for Angular Variates. Biometrics, Dec 69, p 683-700.

GREENWOOD, J.A. and DURAND, D. (1955) The Distribution of Length and Components of the Sum of N Random Unit Vectors. Ann. Math. Statist. 26, p 233-46.


GUMBEL, E.J., GREENWOOD, J.A. and DURAND, D. (1953) The Circular Normal Distribution: Theory and Tables. J. Amer. Statist. Ass. 48, p 131-52.

GUMBEL, E.J. (1954) Applications of the Circular Normal Distribution. J. Amer. Statist. Ass. 49, p 267-97.

HILL, G.W. (1978) New Approximations to the von Mises Distribution. Biometrika, 63, 3, p 673-6.

HOGG, R.V. and CRAIG, A.T. (1965) "Introduction to Mathematical Statistics" 2nd Edn. Macmillan, New York.

JOHNSON, R.A. and WEHRLY, T.E. (1978) Some Angular-Linear Distributions and Related Regression Models. J. Amer. Statist. Ass. 73, p 602-6.

JEFFREYS, H. (1948) Theory of Probability. Oxford Univ. Press

KENDALL, M. and STUART, A. (1967) "The Advanced Theory of Statistics". Vol.2.

KLUYVER, J.C. (1906) A Local Probability Theorem. Ned. Akad. Wet. Proc. Vol. 8, p 341-50.

LEVY, P. (1939) L'Addition des Variables Aléatoires Définies sur une Circonférence. Bull. Soc. Math. France 67, p 1-41.

LORD, R.D. (1954) The Use of the Hankel Transform in Statistics. 1. General Theory and Examples. Biometrika 41, p 44-55.

MARDIA, K.V. (1972) Statistics of Directional Data. London Academic Press.

MARDIA, K.V. and ZEMROCH, P.J. (1975) Algorithm AS 81. Circular Statistics. Appl. Statist., 24, p 147-150.

PEARSON, K. (1905) The Problem of the Random Walk. Nature 72, p 294-342.

PEARSON, K. (1906) "A Mathematical Theory of Random Migration". Draper's Company Research Memoirs. Biometric Series, III, No. 15.

PINCUS, H.J. (1953) The Analysis of Aggregates of Orientation Data in the Earth Sciences. J. Geol. 61, p 482-509.

POLYA, G. (1935) Zwei Aufgaben aus der Wahrscheinlichkeitsrechnung. Vierteljahrsschrift der Naturforschenden Gesellschaft (Zürich), p 123-30.

RAMANO, A. (1977) Applied Statistics for Science and Industry. Allyn and Bacon, Boston, USA.

RAYLEIGH, LORD (1880) On the Resultant of a Large Number of Vibrations of the Same Pitch and of Arbitrary Phase. Phil. Mag. 10, p 73-78.

RAYLEIGH, LORD (1919) On the Problem of Random Vibrations, and of RandomFlights in One, Two or Three Dimensions. Phil. Mag. 6, 37, p 321-47.

SANDERSON, H.C. (1976) Paleocurrent Analysis of Large Scale Cross-Stratification in the Brampton Esker, Ontario. J. Sed. Pet. Vol 46, No 4, p 761-69.


SENGUPTA, S. and RAO, J.S. (1966) Statistical Analysis of Crossbedding Azimuths from the Kamthi Formation Around Bheemaram, Pranhita-Godavari Valley. Sankhya B28, p 165-74.

STEPHENS, M.A. (1962a) Exact and Approximate Tests for Directions I. Biometrika 49, p 463-77.

STEPHENS, M.A. (1962b) Exact and Approximate Tests for Directions II. Biometrika 49, p 547-52.

STEPHENS, M.A. (1962c) "The Statistics of Directions". PhD Thesis. Univ. of Toronto.

STEPHENS, M .A. (1963) Random Walk on a Circle. Biometrika 50, p 385-90.

STEPHENS, M.A. (1964) The Distribution of the Goodness-of-Fit Statistic, U^2. Biometrika 51, p 393-397.

STEPHENS, M.A. (1965) The Goodness-of-Fit Statistic V: Distribution and Significance Points. Biometrika 52, p 309-21.

STEPHENS, M.A. (1969) Tests for the von Mises Distribution. Biometrika 56, p 149-60.

STEPHENS, M.A. (1972) Multisample Tests for the von Mises Distribution. J. Amer. Statist. Ass. 67, No 338, p 456-61.

STEPHENS, M.A. (1982) Use of the von Mises Distribution to Analyse Continuous Proportions. Biometrika 69, p 197-203.

UPTON, G.J.G. (1970) "Significance Tests for Directional Data" PhD Thesis. Birmingham Univ.

UPTON, G.J.G. (1973) Single-Sample Tests for the von Mises Distribution. Biometrika 60, 1, p 87-99.

UPTON, G.J.G. (1974) New Approximations to the Distribution of Certain Angular Statistics. Biometrika 61, 2, p 369-73.

UPTON, G.J.G. (1976) More Multisample Tests for the von Mises Distribution. J. Amer. Statist. Ass. 71, No. 355 p 675-8.

UPTON, G.J.G. (1986) Approximate Confidence Intervals for the Mean Direction of a von Mises Distribution. Biometrika 73, 2, p 525-7.

VON MISES, R. (1918) Uber die "Ganzzahligkeit" der Atomgewicht und Verwandte Fragen. Physikal. Z. 19, p 490-500.

WATSON, G.S. and WILLIAMS E.J. (1956) On the Construction of Significance Tests on the Circle and the Sphere. Biometrika, 43, p 344-52.

WATSON, G.S. (1956) Analysis of Dispersion on a Sphere. Monthly Notices Roy. Astr. Soc., Geophys. Suppl. 7, p 153-9.

WATSON, G.S. (1961) Goodness-of-Fit Tests on a Circle. Biometrika 48, p 109-114.

WATSON, G.S. (1983) Statistics on Spheres. A Wiley-Interscience Publication.


WINTNER, A. (1947) On the Shape of the Angular Case of Cauchy's Distribution Curves. Ann. Math. Statist. 18, p 589-93.
