Gibbons & Chakraborti - Nonparametric Statistical Inference - Fourth Edition

8/15/2019 Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition http://slidepdf.com/reader/full/gibbons-chakraboti-nonparametric-statistical-inference-fourth-edition 1/675



Nonparametric Statistical Inference

Fourth Edition, Revised and Expanded

Jean Dickinson Gibbons
Subhabrata Chakraborti
The University of Alabama
Tuscaloosa, Alabama, U.S.A.

Marcel Dekker, Inc.   New York • Basel


STATISTICS: Textbooks and Monographs

D. B. Owen
Founding Editor, 1972-1991

Associate Editors

Statistical Computing/Nonparametric Statistics
Professor William R. Schucany, Southern Methodist University

Probability
Professor Marcel F. Neuts, University of Arizona

Multivariate Analysis
Professor Anant M. Kshirsagar, University of Michigan

Quality Control/Reliability
Professor Edward G. Schilling, Rochester Institute of Technology

Editorial Board

Applied Probability
Dr. Paul R. Garvey, The MITRE Corporation

Economic Statistics
Professor David E. A. Giles, University of Victoria

Experimental Designs
Mr. Thomas B. Barker, Rochester Institute of Technology

Multivariate Analysis
Professor Subir Ghosh, University of California-Riverside

Statistical Distributions
Professor N. Balakrishnan, McMaster University

Statistical Process Improvement
Professor G. Geoffrey Vining, Virginia Polytechnic Institute

Stochastic Processes
Professor V. Lakshmikantham, Florida Institute of Technology

Survey Sampling
Professor Lynne Stokes, Southern Methodist University

Time Series
Sastry G. Pantula, North Carolina State University


1. The Generalized Jackknife Statistic, H. L. Gray and W. R. Schucany
2. Multivariate Analysis, Anant M. Kshirsagar
3. Statistics and Society, Walter T. Federer
4. Multivariate Analysis: A Selected and Abstracted Bibliography, 1957-1972, Kocherlakota Subrahmaniam and Kathleen Subrahmaniam
5. Design of Experiments: A Realistic Approach, Virgil L. Anderson and Robert A. McLean
6. Statistical and Mathematical Aspects of Pollution Problems, John W. Pratt
7. Introduction to Probability and Statistics (in two parts), Part I: Probability; Part II: Statistics, Narayan C. Giri
8. Statistical Theory of the Analysis of Experimental Designs, J. Ogawa
9. Statistical Techniques in Simulation (in two parts), Jack P. C. Kleijnen
10. Data Quality Control and Editing, Joseph I. Naus
11. Cost of Living Index Numbers: Practice, Precision, and Theory, Kali S. Banerjee
12. Weighing Designs: For Chemistry, Medicine, Economics, Operations Research, Statistics, Kali S. Banerjee
13. The Search for Oil: Some Statistical Methods and Techniques, edited by D. B. Owen
14. Sample Size Choice: Charts for Experiments with Linear Models, Robert E. Odeh and Martin Fox
15. Statistical Methods for Engineers and Scientists, Robert M. Bethea, Benjamin S. Duran, and Thomas L. Boullion
16. Statistical Quality Control Methods, Irving W. Burr
17. On the History of Statistics and Probability, edited by D. B. Owen
18. Econometrics, Peter Schmidt
19. Sufficient Statistics: Selected Contributions, Vasant S. Huzurbazar (edited by Anant M. Kshirsagar)
20. Handbook of Statistical Distributions, Jagdish K. Patel, C. H. Kapadia, and D. B. Owen
21. Case Studies in Sample Design, A. C. Rosander
22. Pocket Book of Statistical Tables, compiled by R. E. Odeh, D. B. Owen, Z. W. Birnbaum, and L. Fisher
23. The Information in Contingency Tables, D. V. Gokhale and Solomon Kullback
24. Statistical Analysis of Reliability and Life-Testing Models: Theory and Methods, Lee J. Bain
25. Elementary Statistical Quality Control, Irving W. Burr
26. An Introduction to Probability and Statistics Using BASIC, Richard A. Groeneveld
27. Basic Applied Statistics, B. L. Raktoe and J. J. Hubert
28. A Primer in Probability, Kathleen Subrahmaniam
29. Random Processes: A First Look, R. Syski
30. Regression Methods: A Tool for Data Analysis, Rudolf J. Freund and Paul D. Minton
31. Randomization Tests, Eugene S. Edgington
32. Tables for Normal Tolerance Limits, Sampling Plans and Screening, Robert E. Odeh and D. B. Owen
33. Statistical Computing, William J. Kennedy, Jr., and James E. Gentle
34. Regression Analysis and Its Application: A Data-Oriented Approach, Richard F. Gunst and Robert L. Mason
35. Scientific Strategies to Save Your Life, I. D. J. Bross
36. Statistics in the Pharmaceutical Industry, edited by C. Ralph Buncher and Jia-Yeong Tsay
37. Sampling from a Finite Population, J. Hajek
38. Statistical Modeling Techniques, S. S. Shapiro and A. J. Gross
39. Statistical Theory and Inference in Research, T. A. Bancroft and C.-P. Han
40. Handbook of the Normal Distribution, Jagdish K. Patel and Campbell B. Read
41. Recent Advances in Regression Methods, Hrishikesh D. Vinod and Aman Ullah
42. Acceptance Sampling in Quality Control, Edward G. Schilling
43. The Randomized Clinical Trial and Therapeutic Decisions, edited by Niels Tygstrup, John M. Lachin, and Erik Juhl


44. Regression Analysis of Survival Data in Cancer Chemotherapy, Walter H. Carter, Jr., Galen L. Wampler, and Donald M. Stablein
45. A Course in Linear Models, Anant M. Kshirsagar
46. Clinical Trials: Issues and Approaches, edited by Stanley H. Shapiro and Thomas H. Louis
47. Statistical Analysis of DNA Sequence Data, edited by B. S. Weir
48. Nonlinear Regression Modeling: A Unified Practical Approach, David A. Ratkowsky
49. Attribute Sampling Plans, Tables of Tests and Confidence Limits for Proportions, Robert E. Odeh and D. B. Owen
50. Experimental Design, Statistical Models, and Genetic Statistics, edited by Klaus Hinkelmann
51. Statistical Methods for Cancer Studies, edited by Richard G. Cornell
52. Practical Statistical Sampling for Auditors, Arthur J. Wilburn
53. Statistical Methods for Cancer Studies, edited by Edward J. Wegman and James G. Smith
54. Self-Organizing Methods in Modeling: GMDH Type Algorithms, edited by Stanley J. Farlow
55. Applied Factorial and Fractional Designs, Robert A. McLean and Virgil L. Anderson
56. Design of Experiments: Ranking and Selection, edited by Thomas J. Santner and Ajit C. Tamhane
57. Statistical Methods for Engineers and Scientists: Second Edition, Revised and Expanded, Robert M. Bethea, Benjamin S. Duran, and Thomas L. Boullion
58. Ensemble Modeling: Inference from Small-Scale Properties to Large-Scale Systems, Alan E. Gelfand and Crayton C. Walker
59. Computer Modeling for Business and Industry, Bruce L. Bowerman and Richard T. O'Connell
60. Bayesian Analysis of Linear Models, Lyle D. Broemeling
61. Methodological Issues for Health Care Surveys, Brenda Cox and Steven Cohen
62. Applied Regression Analysis and Experimental Design, Richard J. Brook and Gregory C. Arnold
63. Statpal: A Statistical Package for Microcomputers—PC-DOS Version for the IBM PC and Compatibles, Bruce J. Chalmer and David G. Whitmore
64. Statpal: A Statistical Package for Microcomputers—Apple Version for the II, II+, and IIe, David G. Whitmore and Bruce J. Chalmer
65. Nonparametric Statistical Inference: Second Edition, Revised and Expanded, Jean Dickinson Gibbons
66. Design and Analysis of Experiments, Roger G. Petersen
67. Statistical Methods for Pharmaceutical Research Planning, Sten W. Bergman and John C. Gittins
68. Goodness-of-Fit Techniques, edited by Ralph B. D'Agostino and Michael A. Stephens
69. Statistical Methods in Discrimination Litigation, edited by D. H. Kaye and Mikel Aickin
70. Truncated and Censored Samples from Normal Populations, Helmut Schneider
71. Robust Inference, M. L. Tiku, W. Y. Tan, and N. Balakrishnan
72. Statistical Image Processing and Graphics, edited by Edward J. Wegman and Douglas J. DePriest
73. Assignment Methods in Combinatorial Data Analysis, Lawrence J. Hubert
74. Econometrics and Structural Change, Lyle D. Broemeling and Hiroki Tsurumi
75. Multivariate Interpretation of Clinical Laboratory Data, Adelin Albert and Eugene K. Harris
76. Statistical Tools for Simulation Practitioners, Jack P. C. Kleijnen
77. Randomization Tests: Second Edition, Eugene S. Edgington
78. A Folio of Distributions: A Collection of Theoretical Quantile-Quantile Plots, Edward B. Fowlkes
79. Applied Categorical Data Analysis, Daniel H. Freeman, Jr.
80. Seemingly Unrelated Regression Equations Models: Estimation and Inference, Virendra K. Srivastava and David E. A. Giles


81. Response Surfaces: Designs and Analyses, Andre I. Khuri and John A. Cornell
82. Nonlinear Parameter Estimation: An Integrated System in BASIC, John C. Nash and Mary Walker-Smith
83. Cancer Modeling, edited by James R. Thompson and Barry W. Brown
84. Mixture Models: Inference and Applications to Clustering, Geoffrey J. McLachlan and Kaye E. Basford
85. Randomized Response: Theory and Techniques, Arijit Chaudhuri and Rahul Mukerjee
86. Biopharmaceutical Statistics for Drug Development, edited by Karl E. Peace
87. Parts per Million Values for Estimating Quality Levels, Robert E. Odeh and D. B. Owen
88. Lognormal Distributions: Theory and Applications, edited by Edwin L. Crow and Kunio Shimizu
89. Properties of Estimators for the Gamma Distribution, K. O. Bowman and L. R. Shenton
90. Spline Smoothing and Nonparametric Regression, Randall L. Eubank
91. Linear Least Squares Computations, R. W. Farebrother
92. Exploring Statistics, Damaraju Raghavarao
93. Applied Time Series Analysis for Business and Economic Forecasting, Sufi M. Nazem
94. Bayesian Analysis of Time Series and Dynamic Models, edited by James C. Spall
95. The Inverse Gaussian Distribution: Theory, Methodology, and Applications, Raj S. Chhikara and J. Leroy Folks
96. Parameter Estimation in Reliability and Life Span Models, A. Clifford Cohen and Betty Jones Whitten
97. Pooled Cross-Sectional and Time Series Data Analysis, Terry E. Dielman
98. Random Processes: A First Look, Second Edition, Revised and Expanded, R. Syski
99. Generalized Poisson Distributions: Properties and Applications, P. C. Consul
100. Nonlinear Lp-Norm Estimation, Rene Gonin and Arthur H. Money
101. Model Discrimination for Nonlinear Regression Models, Dale S. Borowiak
102. Applied Regression Analysis in Econometrics, Howard E. Doran
103. Continued Fractions in Statistical Applications, K. O. Bowman and L. R. Shenton
104. Statistical Methodology in the Pharmaceutical Sciences, Donald A. Berry
105. Experimental Design in Biotechnology, Perry D. Haaland
106. Statistical Issues in Drug Research and Development, edited by Karl E. Peace
107. Handbook of Nonlinear Regression Models, David A. Ratkowsky
108. Robust Regression: Analysis and Applications, edited by Kenneth D. Lawrence and Jeffrey L. Arthur
109. Statistical Design and Analysis of Industrial Experiments, edited by Subir Ghosh
110. U-Statistics: Theory and Practice, A. J. Lee
111. A Primer in Probability: Second Edition, Revised and Expanded, Kathleen Subrahmaniam
112. Data Quality Control: Theory and Pragmatics, edited by Gunar E. Liepins and V. R. R. Uppuluri
113. Engineering Quality by Design: Interpreting the Taguchi Approach, Thomas B. Barker
114. Survivorship Analysis for Clinical Studies, Eugene K. Harris and Adelin Albert
115. Statistical Analysis of Reliability and Life-Testing Models: Second Edition, Lee J. Bain and Max Engelhardt
116. Stochastic Models of Carcinogenesis, Wai-Yuan Tan
117. Statistics and Society: Data Collection and Interpretation, Second Edition, Revised and Expanded, Walter T. Federer
118. Handbook of Sequential Analysis, B. K. Ghosh and P. K. Sen
119. Truncated and Censored Samples: Theory and Applications, A. Clifford Cohen
120. Survey Sampling Principles, E. K. Foreman
121. Applied Engineering Statistics, Robert M. Bethea and R. Russell Rhinehart
122. Sample Size Choice: Charts for Experiments with Linear Models: Second Edition, Robert E. Odeh and Martin Fox
123. Handbook of the Logistic Distribution, edited by N. Balakrishnan
124. Fundamentals of Biostatistical Inference, Chap T. Le
125. Correspondence Analysis Handbook, J.-P. Benzécri


126. Quadratic Forms in Random Variables: Theory and Applications, A. M. Mathai and Serge B. Provost
127. Confidence Intervals on Variance Components, Richard K. Burdick and Franklin A. Graybill
128. Biopharmaceutical Sequential Statistical Applications, edited by Karl E. Peace
129. Item Response Theory: Parameter Estimation Techniques, Frank B. Baker
130. Survey Sampling: Theory and Methods, Arijit Chaudhuri and Horst Stenger
131. Nonparametric Statistical Inference: Third Edition, Revised and Expanded, Jean Dickinson Gibbons and Subhabrata Chakraborti
132. Bivariate Discrete Distribution, Subrahmaniam Kocherlakota and Kathleen Kocherlakota
133. Design and Analysis of Bioavailability and Bioequivalence Studies, Shein-Chung Chow and Jen-pei Liu
134. Multiple Comparisons, Selection, and Applications in Biometry, edited by Fred M. Hoppe
135. Cross-Over Experiments: Design, Analysis, and Application, David A. Ratkowsky, Marc A. Evans, and J. Richard Alldredge
136. Introduction to Probability and Statistics: Second Edition, Revised and Expanded, Narayan C. Giri
137. Applied Analysis of Variance in Behavioral Science, edited by Lynne K. Edwards
138. Drug Safety Assessment in Clinical Trials, edited by Gene S. Gilbert
139. Design of Experiments: A No-Name Approach, Thomas J. Lorenzen and Virgil L. Anderson
140. Statistics in the Pharmaceutical Industry: Second Edition, Revised and Expanded, edited by C. Ralph Buncher and Jia-Yeong Tsay
141. Advanced Linear Models: Theory and Applications, Song-Gui Wang and Shein-Chung Chow
142. Multistage Selection and Ranking Procedures: Second-Order Asymptotics, Nitis Mukhopadhyay and Tumulesh K. S. Solanky
143. Statistical Design and Analysis in Pharmaceutical Science: Validation, Process Controls, and Stability, Shein-Chung Chow and Jen-pei Liu
144. Statistical Methods for Engineers and Scientists: Third Edition, Revised and Expanded, Robert M. Bethea, Benjamin S. Duran, and Thomas L. Boullion
145. Growth Curves, Anant M. Kshirsagar and William Boyce Smith
146. Statistical Bases of Reference Values in Laboratory Medicine, Eugene K. Harris and James C. Boyd
147. Randomization Tests: Third Edition, Revised and Expanded, Eugene S. Edgington
148. Practical Sampling Techniques: Second Edition, Revised and Expanded, Ranjan K. Som
149. Multivariate Statistical Analysis, Narayan C. Giri
150. Handbook of the Normal Distribution: Second Edition, Revised and Expanded, Jagdish K. Patel and Campbell B. Read
151. Bayesian Biostatistics, edited by Donald A. Berry and Dalene K. Stangl
152. Response Surfaces: Designs and Analyses, Second Edition, Revised and Expanded, Andre I. Khuri and John A. Cornell
153. Statistics of Quality, edited by Subir Ghosh, William R. Schucany, and William B. Smith
154. Linear and Nonlinear Models for the Analysis of Repeated Measurements, Edward F. Vonesh and Vernon M. Chinchilli
155. Handbook of Applied Economic Statistics, Aman Ullah and David E. A. Giles
156. Improving Efficiency by Shrinkage: The James-Stein and Ridge Regression Estimators, Marvin H. J. Gruber
157. Nonparametric Regression and Spline Smoothing: Second Edition, Randall L. Eubank
158. Asymptotics, Nonparametrics, and Time Series, edited by Subir Ghosh
159. Multivariate Analysis, Design of Experiments, and Survey Sampling, edited by Subir Ghosh


160. Statistical Process Monitoring and Control, edited by Sung H. Park and G. Geoffrey Vining
161. Statistics for the 21st Century: Methodologies for Applications of the Future, edited by C. R. Rao and Gabor J. Szekely
162. Probability and Statistical Inference, Nitis Mukhopadhyay
163. Handbook of Stochastic Analysis and Applications, edited by D. Kannan and V. Lakshmikantham
164. Testing for Normality, Henry C. Thode, Jr.
165. Handbook of Applied Econometrics and Statistical Inference, edited by Aman Ullah, Alan T. K. Wan, and Anoop Chaturvedi
166. Visualizing Statistical Models and Concepts, R. W. Farebrother
167. Financial and Actuarial Statistics: An Introduction, Dale S. Borowiak
168. Nonparametric Statistical Inference: Fourth Edition, Revised and Expanded, Jean Dickinson Gibbons and Subhabrata Chakraborti
169. Computer-Aided Econometrics, edited by David E. A. Giles

Additional Volumes in Preparation

The EM Algorithm and Related Statistical Models, edited by Michiko Watanabe and Kazunori Yamaguchi

Multivariate Statistical Analysis, Narayan C. Giri


Contents

1 Introduction and Fundamentals 1
2 Order Statistics, Quantiles, and Coverages 32
3 Tests of Randomness 76
4 Tests of Goodness of Fit 103
5 One-Sample and Paired-Sample Procedures 156
6 The General Two-Sample Problem 231
7 Linear Rank Statistics and the General Two-Sample Problem 283
8 Linear Rank Tests for the Location Problem 296
9 Linear Rank Tests for the Scale Problem 319
10 Tests of the Equality of k Independent Samples 353
11 Measures of Association for Bivariate Samples 399
12 Measures of Association in Multiple Classifications 450
13 Asymptotic Relative Efficiency 494
14 Analysis of Count Data 520

Appendix of Tables 552


1 Introduction and Fundamentals

1.1 INTRODUCTION

In many elementary statistics courses, the subject matter is somewhat arbitrarily divided into two categories, called descriptive and inductive statistics. Descriptive statistics usually relates only to the calculation or presentation of figures (visual or conceptual) to summarize or characterize a set of data. For such procedures, no assumptions are made or implied, and there is no question of legitimacy of techniques. The descriptive figures may be a mean, median, variance, range, histogram, etc. Each of these figures summarizes a set of numbers in its own unique way; each is a distinguishable and well-defined characterization of data. If such data constitute a random sample from a certain population, the sample represents the population in miniature and any set of descriptive statistics provides some information regarding this universe. The term parameter is generally employed to connote a characteristic of the population. A parameter is often an unspecified constant appearing in a family of probability distributions, but the word can also be interpreted in a broader sense to include almost all descriptions of population characteristics within a family.

When sample descriptions are used to infer some information about the population, the subject is called inductive statistics or statistical inference. The two types of problems most frequently encountered here are estimation and tests of hypotheses. The factor which makes inference a scientific method, thereby differentiating it from mere guessing, is the ability to make evaluations or probability statements concerning the accuracy of an estimate or reliability of a decision. Unfortunately, such scientific evaluations cannot be made without some information regarding the probability distribution of the random variable relating to the sample description used in the inference procedure. This means that certain types of sample descriptions will be more popular than others, because of their distribution properties or mathematical tractability. The sample arithmetic mean is a popular figure for describing the characteristic of central tendency for many reasons but perhaps least of all because it is a mean. The unique position of the mean in inference stems largely from its "almost normal" distribution properties. If some other measure, say the sample median, had a property as useful as the central-limit theorem, surely it would share the spotlight as a favorite description of location.
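The "almost normal" behavior of the sample mean is easy to see in a short simulation. The sketch below is not part of the original text; the exponential population and the sample size are arbitrary choices made only to illustrate the central-limit effect for a markedly non-normal parent.

```python
import random
import statistics

random.seed(1)

# Draw many samples of size n from a skewed (exponential, mean 1) population
# and record the sample mean of each.
n, replications = 50, 2000
means = [
    statistics.fmean(random.expovariate(1.0) for _ in range(n))
    for _ in range(replications)
]

# Although the parent distribution is strongly skewed, the sample means
# cluster symmetrically around the population mean 1, with standard
# deviation close to 1/sqrt(n), as the central limit theorem predicts.
print(statistics.fmean(means))   # close to 1.0
print(statistics.stdev(means))   # close to 1/sqrt(50), about 0.14
```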

The entire body of classical statistical-inference techniques is based on fairly specific assumptions regarding the nature of the underlying population distribution; usually its form and some parameter values must be stated. Given the right set of assumptions, certain test statistics can be developed using mathematics which is frequently elegant and beautiful. The derived distribution theory is qualified by certain prerequisite conditions, and therefore all conclusions reached using these techniques are exactly valid only so long as the assumptions themselves can be substantiated. In textbook problems, the requisite postulates are frequently just stated and the student practices applying the appropriate technique. However, in a real-world problem, everything does not come packaged with labels of population of origin. A decision must be made as to what population properties may judiciously be assumed for the model. If the reasonable assumptions are not such that the traditional techniques are applicable, the classical methods may be used and inference conclusions stated only with the appropriate qualifiers, e.g., "If the population is normal, then . . . ."

The mathematical statistician may claim that it is the users'problem to decide on the legitimacy of the postulates. Frequently inpractice, those assumptions which are deemed reasonable by empirical

2 CHAPTER 1


evidence or past experience are not the desired ones, i.e., those for which a set of standard statistical techniques has been developed. Alternatively, the sample may be too small or previous experience too limited to determine what is a reasonable assumption. Or, if the researcher is a product of the "cookbook school" of statistics, his particular expertise being in the area of application, he may not understand or even be aware of the preconditions implicit in the derivation of the statistical technique. In any of these three situations, the result often is a substitution of blind faith for scientific method, either because of ignorance or with the rationalization that an approximately accurate inference based on recognized and accepted scientific techniques is better than no answer at all or a conclusion based on common sense or

intuition.

An alternative set of techniques is available, and the mathematical bases for these procedures are the subject of this book. They may be classified as distribution-free and nonparametric procedures. In a distribution-free inference, whether for testing or estimation, the methods are based on functions of the sample observations whose corresponding random variable has a distribution which does not depend on the specific distribution function of the population from which the sample was drawn. Therefore, assumptions regarding the underlying population are not necessary. On the other hand, strictly speaking, the term nonparametric test implies a test for a hypothesis which is not a statement about parameter values. The type of statement permissible then depends on the definition accepted for the term

parameter. If parameter is interpreted in the broader sense, the hypothesis can be concerned only with the form of the population, as in goodness-of-fit tests, or with some characteristic of the probability distribution of the sample data, as in tests of randomness and trend. Needless to say, distribution-free tests and nonparametric tests are not synonymous labels or even in the same spirit, since one relates to the distribution of the test statistic and the other to the type of hypothesis to be tested. A distribution-free test may be for a hypothesis concerning the median, which is certainly a population parameter within our broad definition of the term.

In spite of the inconsistency in nomenclature, we shall follow the customary practice and consider both types of tests as procedures in nonparametric inference, making no distinction between the two classifications. For the purpose of differentiation, the classical statistical techniques, whose justification in probability is based on specific assumptions about the population sampled, may be called parametric methods. This implies a definition of nonparametric statistics then as

INTRODUCTION AND FUNDAMENTALS 3


the treatment of either nonparametric types of inferences or analogies to standard statistical problems when specific distribution assumptions are replaced by very general assumptions and the analysis is based on some function of the sample observations whose sampling distribution can be determined without knowledge of the specific distribution function of the underlying population. The assumption most frequently required is simply that the population be continuous. More restrictive assumptions are sometimes made, e.g., that the population is symmetrical, but not to the extent that the distribution is specifically postulated. The information used in making nonparametric inferences generally relates to some function of the actual magnitudes of the random variables in the sample. For example, if the actual observations are replaced by their relative rankings within the sample and the probability distribution of some function of these sample ranks can be determined by postulating only very general assumptions about the basic population sampled, this function will provide a distribution-free technique for estimation or hypothesis testing. Inferences based on descriptions of these derived sample data may relate to whatever parameters are relevant and adaptable, such as the median for a location parameter. The nonparametric and parametric hypotheses are analogous, both relating to location, and identical in the case of a continuous and symmetrical population.

Tests of hypotheses which are not statements about parameter values have no counterpart in parametric statistics, and thus here nonparametric statistics provides techniques for solving new kinds of

problems. On the other hand, a distribution-free test simply relates to a different approach to solving standard statistical problems, and therefore comparisons of the merits of the two types of techniques are relevant. Some of the more obvious general advantages of nonparametric-inference procedures can be appreciated even before our systematic study begins. Nonparametric methods generally are quick and easy to apply, since they involve extremely simple arithmetic. The theory of nonparametric inference relates to properties of the statistic used in the inductive procedure. Discussion of these properties requires derivation of the random sampling distribution of the pertinent statistic, but this generally involves much less sophisticated mathematics than classical statistics. The test statistic in most cases is a discrete random variable with nonzero probabilities assigned to only a finite number of values, and its exact sampling distribution can often be determined by enumeration or simple combinatorial formulas. The asymptotic distributions are usually normal, chi-square, or other well-known functions. The derivations are easier to understand, especially


for non-mathematically trained users of statistics. A cookbook approach to learning techniques is then not necessary, which lessens the danger of misuse of procedures. This advantage also minimizes the opportunities for inappropriate and indiscriminate applications, because the assumptions are so general. When no stringent postulations regarding the basic population are needed, there is little problem of violation of assumptions, with the result that conclusions reached in nonparametric methods usually need not be tempered by many qualifiers. The types of assumptions made in nonparametric statistics are generally easily satisfied, and decisions regarding their legitimacy almost obvious. Besides, in many cases the assumptions are sufficient, but not necessary, for the test's validity. Assumptions regarding the

sampling process, usually that it is a random sample, are not relaxed with nonparametric methods, but a careful experimenter can generally adopt sampling techniques which render this problem academic. With so-called "dirty data," most nonparametric techniques are, relatively speaking, much more appropriate than parametric methods. The basic data available need not be actual measurements in many cases; if the test is to be based on ranks, for example, only the ranks are needed. The process of collecting and compiling sample data then may be less expensive and time consuming. Some new types of problems relating to sample-distribution characteristics are soluble with nonparametric tests. The scope of application is also wider because the techniques may be legitimately applied to phenomena for which it is impractical or impossible to obtain quantitative measurements. When

information about actual observed sample magnitudes is provided but not used as such in drawing an inference, it might seem that some of the available information is being discarded, for which one usually pays a price in efficiency. This is really not true, however. The information embodied in these actual magnitudes, which is not directly employed in the inference procedure, really relates to the underlying distribution, information which is not relevant for distribution-free tests. On the other hand, if the underlying distribution is known, a classical approach to testing may legitimately be used, and so this would not be a situation requiring nonparametric methods. The information of course may be consciously ignored, say for the purpose of speed or simplicity.

This discussion of relative merits has so far been concerned mainly with the application of nonparametric techniques. Performance is certainly a matter of concern to the experimenter, but generalizations about reliability are always difficult because of varying factors like sample size, significance levels or confidence coefficients, evaluation


of power, and the type of alternative. Power comparisons of parametric and nonparametric tests in comparable situations, together with the related concept of asymptotic relative efficiency treated in Chapter 13, generally show that the loss of efficiency from using a nonparametric procedure is often surprisingly small, while the gain in generality and in protection against violated assumptions can be substantial.

The organization of this book is as follows. Chapter 2 covers order statistics, quantiles, and coverages. Chapter 3 presents tests based on runs for the hypothesis of randomness, and Chapter 4 tests of goodness of fit. Chapter 5 treats one-sample and paired-sample procedures, and Chapter 6 the general two-sample problem. Chapter 7 introduces the theory of linear rank statistics, which is applied to the location and scale problems in Chapters 8 and 9, respectively, and extended to the problem of k independent samples in Chapter 10. Chapters 11 and 12 cover measures of association for bivariate samples and for multiple classifications. Chapter 13 discusses asymptotic relative efficiency, and Chapter 14 the analysis of count data. Many numerical examples are solved in the chapters; these

serve to solidify the reader's understanding of proper uses of nonparametric methods. All of the solutions show the calculations clearly. In addition, many of the solutions are then repeated using one or more statistical computer packages.

Problems are given at the end of each chapter. The theoretical problems serve to amplify or solidify the explanations of theory given in the text. The applied problems give the reader practice in applications of the methods. Answers to selected problems are given at the end of the book.

1.2 FUNDAMENTAL STATISTICAL CONCEPTS

In this section a few of the basic definitions and concepts of classical statistics are reviewed, but only very briefly, since the main purpose is to explain notation and terms taken for granted later on. A few of the new fundamentals needed for the development of nonparametric inference will also be introduced here.

BASIC DEFINITIONS

A sample space is the set of all possible outcomes of a random experiment.

A random variable is a set function whose domain is the elements of a sample space on which a probability function has been defined and whose range is the set of all real numbers. Alternatively, X is a random variable if for every real number x there exists a probability that the value assumed by the random variable does not exceed x, denoted by P(X ≤ x) or F_X(x), and called the cumulative distribution function (cdf) of X.

The customary practice is to denote the random variable by a capital letter like X and the actual value assumed (the value observed in the experiment) by the corresponding lowercase letter x. This practice will generally be adhered to in this book. However, it is not always possible, strictly speaking, to make such a distinction. Occasional inconsistencies will therefore be unavoidable, but the statistically sophisticated reader is no doubt already accustomed to this type of conventional confusion in notation.

The mathematical properties of any function F_X which is a cdf of a random variable X are as follows:

1. F_X(x_1) ≤ F_X(x_2) for all x_1 ≤ x_2, so that F_X is nondecreasing.
2. lim_{x→−∞} F_X(x) = 0 and lim_{x→∞} F_X(x) = 1.
3. F_X(x) is continuous from the right, or, symbolically, as ε → 0 through positive values, lim_{ε→0} F_X(x + ε) = F_X(x).


A random variable X is called continuous if its cdf is continuous. Every continuous cdf in this book will be assumed differentiable everywhere with the possible exception of a finite number of points. The derivative of the cdf will be denoted by f_X(x), a nonnegative function called the probability density function (pdf) of X. Thus when X is continuous,

F_X(x) = ∫_{−∞}^{x} f_X(t) dt        f_X(x) = (d/dx) F_X(x) = F′_X(x) ≥ 0

and

∫_{−∞}^{∞} f_X(x) dx = 1

A random variable is called discrete if it can take on only a finite or a countably infinite number of values, called mass points. The probability mass function (pmf) of a discrete random variable X is defined as

f_X(x) = P(X = x) = F_X(x) − lim_{ε→0} F_X(x − ε)

where ε → 0 through positive values. For a discrete random variable, f_X(x) ≥ 0 and ∑_{all x} f_X(x) = 1, where the expression "all x" is to be interpreted as meaning all x at which F_X(x) is not continuous; in other words, the summation is over all the mass points. Thus for a discrete random variable there is a nonzero probability for any mass point, whereas the probability that a continuous random variable takes on any specific fixed value is zero.

The term probability function (pf) or probability distribution will be used to denote either a pdf or a pmf. For notation, capital letters will always be reserved for the cdf, while the corresponding lowercase letter denotes the pf.

The expected value of a function g(X) of a random variable X, denoted by E[g(X)], is

E[g(X)] = ∫_{−∞}^{∞} g(x) f_X(x) dx    if X is continuous
E[g(X)] = ∑_{all x} g(x) f_X(x)        if X is discrete

Joint probability functions and expectations for functions of more than one random variable are similarly defined and denoted by replacing single symbols by vectors, sometimes abbreviated to

X_n = (X_1, X_2, …, X_n)
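As a quick numerical illustration of the two forms of E[g(X)] (the fair-die pmf and the exponential pdf here are illustrative choices, not examples from the text):

```python
# E[g(X)] as a sum for discrete X and as an integral for continuous X.
import math

# Discrete: X is the face of a fair die, g(x) = x^2; E(X^2) = 91/6.
pmf = {x: 1 / 6 for x in range(1, 7)}
e_g_discrete = sum(x**2 * p for x, p in pmf.items())

# Continuous: X ~ exponential(1), g(x) = x; E(X) = 1.
def f_X(x):
    return math.exp(-x)

a, b, n = 0.0, 40.0, 200000          # truncate the infinite range at 40
h = (b - a) / n
e_g_continuous = sum((a + (i + 0.5) * h) * f_X(a + (i + 0.5) * h)
                     for i in range(n)) * h   # midpoint rule

print(round(e_g_discrete, 6), round(e_g_continuous, 4))  # prints: 15.166667 1.0
```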


A set of n random variables (X_1, X_2, …, X_n) is independent if and only if their joint probability function equals the product of the n individual marginal probability functions.

A set of n random variables (X_1, X_2, …, X_n) is called a random sample of the random variable X (or from the population F_X or f_X) if they are independent and identically distributed (i.i.d.), so that their joint probability density function is given by

f_{X_n}(x_1, x_2, …, x_n) = f_{X_1, X_2, …, X_n}(x_1, x_2, …, x_n) = ∏_{i=1}^{n} f_X(x_i)

A statistic is any function of observable or sample random variables which does not involve unknown parameters.

A moment is a particular type of population parameter. The kth moment of X about the origin is μ′_k = E(X^k), where μ′_1 = E(X) = μ is the mean, and the kth central moment about the mean is

μ_k = E[(X − μ)^k]

The second central moment about the mean is the variance of X,

μ_2 = var(X) = σ²_X = E[(X − μ)²] = μ′_2 − (μ′_1)²

The kth factorial moment is E[X(X − 1) ⋯ (X − k + 1)].

For two random variables X and Y, their covariance and correlation, respectively, are

cov(X, Y) = E[(X − μ_X)(Y − μ_Y)] = E(XY) − μ_X μ_Y

corr(X, Y) = ρ_{X,Y} = cov(X, Y) / (σ_X σ_Y)

The moment-generating function (mgf) of a function g(X) of X is

M_{g(X)}(t) = E{exp[t g(X)]}

Some special properties of the mgf are

M_{a+bX}(t) = e^{at} M_X(bt)    for a and b constant

μ′_k = (d^k/dt^k) M_X(t) |_{t=0} = M_X^{(k)}(0)
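These mgf properties can be checked numerically; a sketch differentiating the binomial mgf (p e^t + 1 − p)^n by finite differences (n = 10 and p = 0.3 are arbitrary illustrative values):

```python
# mu'_k = d^k/dt^k M_X(t) at t = 0, checked for the binomial mgf
# M(t) = (p e^t + 1 - p)^n using central finite differences.
import math

n, p = 10, 0.3

def M(t):
    return (p * math.exp(t) + 1 - p) ** n

h = 1e-4
mu1 = (M(h) - M(-h)) / (2 * h)              # ~ E(X) = n*p = 3.0
mu2 = (M(h) - 2 * M(0.0) + M(-h)) / h**2    # ~ E(X^2) = n*p*(1-p) + (n*p)^2 = 11.1
print(round(mu1, 3), round(mu2, 3))  # prints: 3.0 11.1
```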

MOMENTS OF LINEAR COMBINATIONS OF RANDOM VARIABLES

Let X_1, X_2, …, X_n be n random variables and a_i, b_i, i = 1, 2, …, n, be any constants. Then


E(∑_{i=1}^{n} a_i X_i) = ∑_{i=1}^{n} a_i E(X_i)

var(∑_{i=1}^{n} a_i X_i) = ∑_{i=1}^{n} a_i² var(X_i) + 2 ∑∑_{i<j} a_i a_j cov(X_i, X_j)

cov(∑_{i=1}^{n} a_i X_i, ∑_{j=1}^{n} b_j X_j) = ∑_{i=1}^{n} a_i b_i var(X_i) + ∑∑_{i≠j} a_i b_j cov(X_i, X_j)
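These moment identities can be verified by exact enumeration over a small joint distribution; the joint pmf below is invented purely for illustration:

```python
# Check var(a1 X1 + a2 X2) = a1^2 var(X1) + a2^2 var(X2) + 2 a1 a2 cov(X1, X2)
# by exact enumeration over a small (dependent) joint pmf on {0,1} x {0,1}.
pmf = {(0, 0): 0.3, (0, 1): 0.2, (1, 0): 0.1, (1, 1): 0.4}
a1, a2 = 2.0, -3.0

def E(g):
    return sum(g(x1, x2) * p for (x1, x2), p in pmf.items())

m1, m2 = E(lambda x, y: x), E(lambda x, y: y)
v1 = E(lambda x, y: (x - m1) ** 2)
v2 = E(lambda x, y: (y - m2) ** 2)
c12 = E(lambda x, y: (x - m1) * (y - m2))

lhs = E(lambda x, y: (a1 * x + a2 * y - (a1 * m1 + a2 * m2)) ** 2)
rhs = a1**2 * v1 + a2**2 * v2 + 2 * a1 * a2 * c12
print(abs(lhs - rhs) < 1e-12)  # prints: True
```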

Some special probability functions that arise frequently, together with their mgfs, means, and variances, are summarized in Table 2.1. Several sampling distributions derived from the normal are also used repeatedly. Student's t distribution with ν degrees of freedom has pdf

f_T(t) = Γ[(ν + 1)/2] / [√(νπ) Γ(ν/2)] (1 + t²/ν)^{−(ν+1)/2}    −∞ < t < ∞, ν > 0

Snedecor's F distribution with (ν_1, ν_2) degrees of freedom has pdf

f_F(x) = Γ[(ν_1 + ν_2)/2] / [Γ(ν_1/2) Γ(ν_2/2)] (ν_1/ν_2)^{ν_1/2} x^{ν_1/2 − 1} (1 + ν_1 x/ν_2)^{−(ν_1+ν_2)/2}    x > 0, ν_1, ν_2 > 0

Fisher's z distribution, the distribution of Z = (1/2) ln F, has pdf

f_Z(z) = [2 (ν_1/ν_2)^{ν_1/2} / B(ν_1/2, ν_2/2)] e^{ν_1 z} (1 + ν_1 e^{2z}/ν_2)^{−(ν_1+ν_2)/2}    −∞ < z < ∞, ν_1, ν_2 > 0

These pdfs are written in terms of the gamma function. The gamma function, denoted by Γ(a), is defined as

Γ(a) = ∫_0^∞ x^{a−1} e^{−x} dx    a > 0


Table 2.1 Some special probability functions

Discrete distributions:

Bernoulli:  f_X(x) = p^x (1 − p)^{1−x},  x = 0, 1;  0 ≤ p ≤ 1
  mgf: p e^t + 1 − p;  E(X) = p;  var(X) = p(1 − p)

Binomial:  f_X(x) = C(n, x) p^x (1 − p)^{n−x},  x = 0, 1, …, n;  0 ≤ p ≤ 1
  mgf: (p e^t + 1 − p)^n;  E(X) = np;  var(X) = np(1 − p)

Multinomial:  f_X(x_1, …, x_k) = [N!/(x_1! x_2! ⋯ x_k!)] p_1^{x_1} p_2^{x_2} ⋯ p_k^{x_k},  x_i = 0, 1, …, N with ∑ x_i = N;  0 ≤ p_i ≤ 1 with ∑ p_i = 1
  mgf: (p_1 e^{t_1} + ⋯ + p_{k−1} e^{t_{k−1}} + p_k)^N;  E(X_i) = N p_i;  var(X_i) = N p_i (1 − p_i);  cov(X_i, X_j) = −N p_i p_j

Hypergeometric:  f_X(x) = C(Np, x) C(N − Np, n − x) / C(N, n),  x = 0, 1, …, n (within the admissible range);  0 ≤ p ≤ 1
  E(X) = np;  var(X) = np(1 − p)(N − n)/(N − 1)

Geometric:  f_X(x) = (1 − p)^{x−1} p,  x = 1, 2, …;  0 ≤ p ≤ 1
  mgf: p e^t / [1 − (1 − p) e^t];  E(X) = 1/p;  var(X) = (1 − p)/p²

Uniform on 1, 2, …, N:  f_X(x) = 1/N,  x = 1, 2, …, N
  mgf: ∑_{x=1}^{N} e^{tx}/N;  E(X) = (N + 1)/2;  var(X) = (N² − 1)/12

Continuous distributions:

Uniform on (a, b):  f_X(x) = 1/(b − a),  a < x < b
  mgf: (e^{tb} − e^{ta}) / [t(b − a)];  E(X) = (b + a)/2;  var(X) = (b − a)²/12

Normal:  f_X(x) = e^{−(x−μ)²/(2σ²)} / √(2πσ²),  −∞ < x, μ < ∞, σ > 0
  mgf: e^{tμ + t²σ²/2};  E(X) = μ;  var(X) = σ²
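Rows of the table can be spot-checked by direct computation; a sketch for the binomial entry (n = 8, p = 0.25, and t = 0.7 are arbitrary illustrative values):

```python
# Spot-check the binomial row of Table 2.1: mean n*p and variance n*p*(1-p)
# from the pmf C(n,x) p^x (1-p)^(n-x), and the mgf at one sample point t.
import math

n, p, t = 8, 0.25, 0.7

pmf = [math.comb(n, x) * p**x * (1 - p) ** (n - x) for x in range(n + 1)]
mean = sum(x * f for x, f in enumerate(pmf))
var = sum((x - mean) ** 2 * f for x, f in enumerate(pmf))
mgf_sum = sum(math.exp(t * x) * f for x, f in enumerate(pmf))
mgf_formula = (p * math.exp(t) + 1 - p) ** n

print(round(mean, 10), round(var, 10), abs(mgf_sum - mgf_formula) < 1e-12)
```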


and has the properties

Γ(a) = (a − 1)!    for any positive integer a
Γ(a) = (a − 1) Γ(a − 1)    for any positive a, not necessarily an integer
Γ(1/2) = √π

For other fractional values of a, the gamma function can be found from special tables; for example, the recursion together with Γ(1/2) = √π gives Γ(3/2) = 0.8862 and Γ(5/2) = 1.3293 (see the tables in Abramowitz and Stegun).

The beta function, denoted by B(a, b), is defined as

B(a, b) = ∫_0^1 x^{a−1} (1 − x)^{b−1} dx    for a > 0, b > 0

The beta and the gamma functions have the following relationship:

B(a, b) = Γ(a) Γ(b) / Γ(a + b)

The gamma and the beta functions can be very helpful in evaluating some complicated integrals. For example, suppose we wish to evaluate the integral I_1 = ∫_0^∞ x^5 e^{−x} dx. We identify I_1 as a gamma function with a = 6, and then I_1 = Γ(6) = 5! = 120. Similarly, the integral I_2 = ∫_0^1 x^3 (1 − x)^2 dx is a beta function B(4, 3) with a = 4 and b = 3, and thus, using the relationship with the gamma function and simplifying, we easily get I_2 = 3! 2! / 6! = 1/60.

DISTRIBUTIONS OF FUNCTIONS OF RANDOM VARIABLES

USING THE METHOD OF JACOBIANS

Let X_1, X_2, …, X_n be n continuous random variables with joint pdf f(x_1, x_2, …, x_n) which is nonzero for an n-dimensional region S_x. Define the transformation

y_1 = u_1(x_1, x_2, …, x_n),  y_2 = u_2(x_1, x_2, …, x_n),  …,  y_n = u_n(x_1, x_2, …, x_n)

which maps S_x onto S_y, where S_x can be written as the union of a finite number m of disjoint spaces S_1, S_2, …, S_m such that the transformation from S_k onto S_y is one to one, for all k = 1, 2, …, m. Then for each k there exists a unique inverse transformation, denoted by

x_1 = w_{1k}(y_1, y_2, …, y_n),  x_2 = w_{2k}(y_1, y_2, …, y_n),  …,  x_n = w_{nk}(y_1, y_2, …, y_n)


Assume that for each of these m sets of inverse transformations, the Jacobian

J_k(y_1, y_2, …, y_n) = ∂(w_{1k}, w_{2k}, …, w_{nk}) / ∂(y_1, y_2, …, y_n) = det(∂w_{ik}/∂y_j)

exists and is continuous and nonzero in S_y, where det(a_{ij}) denotes the determinant of the n × n matrix with entry a_{ij} in the ith row and jth column. Then the joint pdf of the n random variables Y_1, Y_2, …, Y_n, where Y_i = u_i(X_1, X_2, …, X_n), is

f(y_1, y_2, …, y_n) = ∑_{k=1}^{m} |J_k(y_1, y_2, …, y_n)| f[w_{1k}(y_1, y_2, …, y_n), w_{2k}(y_1, y_2, …, y_n), …, w_{nk}(y_1, y_2, …, y_n)]

for all (y_1, y_2, …, y_n) ∈ S_y, and zero otherwise. The Jacobian of the inverse transformation is the reciprocal of the Jacobian of the direct transformation,

∂(w_{1k}, w_{2k}, …, w_{nk}) / ∂(y_1, y_2, …, y_n) = [∂(u_1, u_2, …, u_n) / ∂(x_1, x_2, …, x_n)]^{−1}    evaluated at x_i = w_{ik}(y_1, y_2, …, y_n)

or

J_k(y_1, y_2, …, y_n) = [J_k(x_1, x_2, …, x_n)]^{−1}

Thus the pdf above can also be written as

f(y_1, y_2, …, y_n) = ∑_{k=1}^{m} |J_k(x_1, x_2, …, x_n)|^{−1} f(x_1, x_2, …, x_n)

where the right-hand side is evaluated at x_i = w_{ik}(y_1, y_2, …, y_n) for i = 1, 2, …, n. If m = 1, so that the transformation from S_x onto S_y is one to one, the subscript k and the summation sign may be dropped. When m = 1 and n = 1, this reduces to the familiar result

f_Y(y) = f_X(x) |dx/dy|    evaluated at x = u^{−1}(y)
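As an illustration of the m = 2 case, take Y = X² with X standard normal; the two inverse branches x = ±√y each contribute |dx/dy| = 1/(2√y), and summing over the branches gives the chi-square pdf with 1 degree of freedom. A sketch comparing the derived pdf with a Monte Carlo estimate (the sample size and grid are arbitrary choices):

```python
# Y = X^2 for X ~ N(0,1): the Jacobian method with m = 2 branches gives
# f_Y(y) = 2 * (1/(2*sqrt(y))) * phi(sqrt(y)) = exp(-y/2)/sqrt(2*pi*y),
# the chi-square(1) pdf. Check P(Y <= 1) against simulation.
import math
import random

def f_Y(y):
    return math.exp(-y / 2) / math.sqrt(2 * math.pi * y)

# P(Y <= 1) by midpoint integration of the derived pdf.
n_grid = 200000
h = 1.0 / n_grid
p_exact = sum(f_Y((i + 0.5) * h) for i in range(n_grid)) * h

random.seed(1)
n_sim = 200000
p_sim = sum(random.gauss(0, 1) ** 2 <= 1.0 for _ in range(n_sim)) / n_sim

print(round(p_exact, 3), abs(p_exact - p_sim) < 0.01)
```

Both estimates land near P(|X| ≤ 1) ≈ 0.6827, as they must, since {Y ≤ 1} = {−1 ≤ X ≤ 1}.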

CHEBYSHEV'S INEQUALITY

Let X be any random variable with mean μ and a finite variance σ². Then for every k > 0, Chebyshev's inequality states that

P(|X − μ| ≥ kσ) ≤ 1/k²
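A quick empirical check of the bound, using an exponential(1) population (an arbitrary choice, for which μ = σ = 1) and k = 2:

```python
# Chebyshev: P(|X - mu| >= k*sigma) <= 1/k^2, checked by simulation
# for X ~ exponential(1), where mu = sigma = 1 and the true probability
# is P(X >= 3) = exp(-3) ~ 0.0498, well under the bound 1/4.
import random

random.seed(7)
k = 2.0
mu = sigma = 1.0
n = 100000
hits = sum(abs(random.expovariate(1.0) - mu) >= k * sigma for _ in range(n))
p_emp = hits / n
print(p_emp <= 1 / k**2)  # prints: True
```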

CENTRAL LIMIT THEOREM (CLT)

Let X_1, X_2, …, X_n be a random sample from a population with mean μ and variance σ² > 0. Then the distribution of

Z_n = √n (X̄_n − μ)/σ

approaches the standard normal distribution, N(0, 1), as n → ∞. Generalizations of the CLT relax the assumption that the X's are identically distributed.

POINT ESTIMATION

A point estimate of a parameter is a single number, computed from the sample observations, that is used as an estimate of the parameter; the corresponding statistic T = T(X_1, X_2, …, X_n) is called a point estimator. Some desirable properties of a point estimator T for a parameter θ are the following:

1. Unbiasedness: T is an unbiased estimator of θ if E(T) = θ for all θ.
2. Sufficiency: T is a sufficient statistic for θ if the joint distribution of the sample can be factored as f(x_1, x_2, …, x_n; θ) = g(T; θ) h(x_1, x_2, …, x_n), where h does not depend on θ; loosely speaking, T summarizes all the information in the sample that is relevant to θ.
3. Consistency: T is a consistent estimator of θ if, for every ε > 0,

lim_{n→∞} P(|T − θ| ≥ ε) = 0

By Chebyshev's inequality, a sufficient condition for consistency is that the mean squared error E[(T − θ)²] approach zero as n → ∞.
4. Mean squared error: an estimator T is better than a competitor T* in the mean-squared-error sense if E[(T − θ)²] ≤ E[(T* − θ)²] for all θ, with strict inequality for at least one θ.
5. Minimum variance unbiased: an unbiased estimator T is a minimum variance unbiased estimator of θ if var(T) ≤ var(T*) for every other unbiased estimator T*.

An interval (L, U), where the endpoints L and U are statistics, is a confidence interval for θ with confidence coefficient 1 − α, also called a 100(1 − α)% confidence interval, if P(L < θ < U) = 1 − α. For example, a confidence interval for the mean μ of a normal population can be based on the fact that √n (X̄ − μ)/S follows Student's t distribution with (n − 1) degrees of freedom.

The likelihood function of a random sample of size n from the population f_X(·; θ) is the joint probability function of the sample variables regarded as a function of θ,

L(x_1, x_2, …, x_n; θ) = ∏_{i=1}^{n} f_X(x_i; θ)

A maximum-likelihood estimate (MLE) of θ is a value θ̂ = θ̂(x_1, x_2, …, x_n) of θ that maximizes L(x_1, x_2, …, x_n; θ).

Subject to certain regularity conditions, MLEs are sufficient and consistent and are asymptotically unbiased, minimum variance, and normally distributed. Here, as elsewhere in this book, the term asymptotic is interpreted as meaning large sample sizes. Another useful property is invariance, which says that if g(θ) is a smooth function of θ and θ̂ is the MLE of θ, then the MLE of g(θ) is g(θ̂). If more than one parameter is involved, θ above may be interpreted as a vector.
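A minimal numerical illustration of maximum likelihood and the invariance property, assuming a normal population with known variance 1 (the data values and the grid search are invented for demonstration; the closed-form MLE of μ is x̄):

```python
# MLE of mu for N(mu, 1) data by direct maximization of the log likelihood,
# compared with the closed-form answer xbar; by invariance, the MLE of
# g(mu) = mu^2 is (muhat)^2.
data = [4.9, 5.6, 5.1, 4.7, 5.3]  # invented data
xbar = sum(data) / len(data)

def loglik(mu):
    return -0.5 * sum((x - mu) ** 2 for x in data)  # up to an additive constant

# Crude grid search over mu in [4.0, 7.0] with step 0.0001.
grid = [i / 10000 for i in range(40000, 70001)]
muhat = max(grid, key=loglik)

print(round(muhat, 3), round(xbar, 3), round(muhat**2, 3))  # prints: 5.12 5.12 26.214
```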

HYPOTHESIS TESTING

A statistical hypothesis is a claim or an assertion about the probability function of one or more random variables or a statement about the populations from which one or more samples are drawn, e.g., its form, shape, or parameter values. A hypothesis is called simple if the statement completely specifies the population. Otherwise it is called composite. The null hypothesis H_0 is the hypothesis under test. The alternative hypothesis, H_1 or H_A, is the conclusion reached if the null hypothesis is rejected.

A test of a statistical hypothesis is a rule which enables one to make a decision whether or not H_0 should be rejected on the basis of the observed value of a test statistic, which is some function of a set of observable random variables. The probability distribution of the test statistic when H_0 holds is sometimes referred to as the null distribution of the test statistic.

A critical region or rejection region R for a test is that subset of values assumed by the test statistic which, in accordance with the test, leads to rejection of H_0. The critical values of a test statistic are the bounds of R. For example, if a test statistic T prescribes rejection of H_0 for T ≥ t_α, then t_α is the critical value and R is written symbolically as

T ∈ R    for T ≥ t_α

A type I error is committed if the null hypothesis is rejected when it is true. A type II error is failure to reject a false H_0. For a test statistic T of H_0: θ ∈ ω versus H_1: θ ∈ Ω − ω, the probabilities of these errors are, respectively,

α(θ) = P(T ∈ R | θ ∈ ω)    and    β(θ) = P(T ∉ R | θ ∈ Ω − ω)

The least upper bound, or supremum, of α(θ) for all θ ∈ ω is often called the size of the test. The significance level is a preselected nominal bound for α(θ), which may not be attained if the relevant probability function is discrete. Since this is usually the case in


nonparametric inference, the exact size of the test is reported along with, or instead of, the nominal significance level. The power of a test is the probability that the test statistic leads to rejection of H_0 when the alternative is true,

Pw(θ) = P(T ∈ R | θ ∈ Ω − ω) = 1 − β(θ)

so that power is a function of θ. The power of a test can be estimated by simulation. Suppose, for example, that we want to study the power of a test of H_0: μ = 10 for a normal population against an alternative with mean larger than 10. We can generate, say, 1000 random samples from a normal distribution with mean 10.5, apply the test to each generated sample, and record the number of times H_0 is rejected. The proportion of rejections (successes) among the 1000


simulations gives an empirical estimate of the power (simulated power) of the test for the normal distribution when the mean is 10.5, and so on. The simulation technique is particularly useful when a new test is developed with an analytically complicated null and/or alternative distribution and we would like to learn about the test's performance. The number of rejections follows a binomial distribution with n = 1000 and success probability p equal to the true rejection probability, and this fact can be used to give an idea about the simulation error associated with the proportion of successes in terms of its standard error, √[p(1 − p)/n].
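The simulation recipe just described can be sketched in a few lines; the known-σ z test, n = 25 observations per sample, and level 0.05 below are illustrative assumptions, not specifics from the text:

```python
# Simulated power of the upper-tailed z test of H0: mu = 10 at level 0.05,
# when data are normal with mean 10.5, sigma = 1, n = 25: 1000 replications.
import math
import random

random.seed(123)
mu0, mu_true, sigma, n, reps = 10.0, 10.5, 1.0, 25, 1000
z_crit = 1.645  # upper 5% point of N(0,1)

rejections = 0
for _ in range(reps):
    xbar = sum(random.gauss(mu_true, sigma) for _ in range(n)) / n
    z = math.sqrt(n) * (xbar - mu0) / sigma
    rejections += z >= z_crit
power_hat = rejections / reps
se = math.sqrt(power_hat * (1 - power_hat) / reps)  # binomial standard error
print(power_hat, round(se, 3))
```

The estimate should land near the true power, about 0.80 for these settings, with a standard error of roughly 0.013.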

A test is said to be most powerful for a specified alternative hypothesis if no other test of the same size has greater power against the same alternative.

A test is uniformly most powerful against a class of alternative hypotheses if it is most powerful with respect to each specific simple alternative hypothesis within the class of alternative hypotheses.

A "good" test statistic is one which is reasonably successful in distinguishing correctly between the conditions as stated in the null and alternative hypotheses. A method of constructing tests which often have good properties is the likelihood-ratio principle. A random sample of size n is drawn from the population f_X(·; θ) with likelihood function L(x_1, x_2, …, x_n; θ), where θ is to be interpreted as a vector if more than one parameter is involved. Suppose that f_X(·; θ) is a specified family of functions for every θ ∈ Ω, and ω is a subset of Ω. The likelihood-ratio test of

H_0: θ ∈ ω    versus    H_1: θ ∈ Ω − ω

has the rejection region

T ∈ R    for T ≤ c,  0 ≤ c ≤ 1

where T is the ratio

T = L(ω̂) / L(Ω̂)

and L(ω̂) and L(Ω̂) are the maximums of the likelihood function with respect to θ for θ ∈ ω and θ ∈ Ω, respectively. For an exact size α test of a simple H_0, the number c which defines R is chosen such that P(T ≤ c | H_0) = α. Any monotonic function of T, say g(T), can also be employed for the test statistic as long as the rejection region is stated in terms of the corresponding values of g(T); the natural


choice is g(T) = −2 ln T, since under certain regularity conditions −2 ln T is asymptotically chi-square distributed when H_0 is true as n → ∞.

Another approach to reaching a conclusion about the null hypothesis is to compute a P value. Suppose, for example, that we test H_0: μ = 50 against the upper-sided alternative H_1: μ > 50, and the observed value of the sample mean is 52. The P value is P(X̄ ≥ 52 | μ = 50), the probability, calculated under H_0, of obtaining a sample result at least as extreme as the one actually observed. In general, the P value (also called the observed or exact level of significance) is the smallest level of significance at which the null hypothesis would be rejected for the given observations.

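For a normal mean with known variance, the likelihood-ratio statistic works out in closed form and illustrates both −2 ln T and a one-sided P value; a sketch (H_0: μ = 50, σ = 4, and the data values are invented for illustration):

```python
# Normal mean, known sigma: the likelihood ratio T = L(mu0)/L(muhat)
# satisfies -2 ln T = n*(xbar - mu0)^2 / sigma^2 = z^2, which is exactly
# chi-square(1) under H0. One-sided P value = P(Xbar >= xbar | mu0).
import math

def loglik(data, mu, sigma):
    n = len(data)
    return (-n / 2 * math.log(2 * math.pi * sigma**2)
            - sum((x - mu) ** 2 for x in data) / (2 * sigma**2))

mu0, sigma = 50.0, 4.0
data = [52.1, 49.8, 55.0, 51.2, 48.7, 53.4, 50.9, 52.9]  # invented data
n, xbar = len(data), sum(data) / len(data)

neg2lnT = -2 * (loglik(data, mu0, sigma) - loglik(data, xbar, sigma))
z = math.sqrt(n) * (xbar - mu0) / sigma
p_value = 1 - 0.5 * (1 + math.erf(z / math.sqrt(2)))  # P(Xbar >= xbar | H0)

print(math.isclose(neg2lnT, z * z), round(p_value, 4))
```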

The P value provides not only a means of making a decision about the null hypothesis, but also some idea about how strong the evidence is against the null hypothesis. For example, suppose data set 1 analyzed with test T_1 results in a much smaller P value than data set 2 analyzed with test T_1 (or T_2). The evidence against the null hypothesis is then much stronger for data set 1 than for data set 2, because the observed sample outcome is much less likely under H_0 in data set 1.

Most of the tables in the Appendix of this book give exact P values for the nonparametric test statistics with small sample sizes. In some books, tables of critical values are given for selected α values. Since the usual α values, 0.05, 0.01, and the like, are seldom attainable exactly for nonparametric tests with small sample sizes, we prefer reporting P values to selecting a level α. If the asymptotic distribution of a test statistic is used to find a P value, this may be called an asymptotic or approximate P value.

If a test has a two-sided alternative, there is no specific direction for calculating the P value. One approach is simply to report the smaller of the two one-tailed P values, indicating that it is one-tailed. If the distribution is symmetric, it makes sense to double this one-tailed P value, and this is frequently done in practice. This procedure is sometimes used even if the distribution is not symmetric.
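Exact one-tailed and doubled P values are easy to compute for a discrete test statistic; a sketch for the sign-test situation, where the statistic is binomial(n, 1/2) under H_0 (n = 12 and an observed count of 10 are illustrative values):

```python
# Exact P values for a binomial(n, 1/2) test statistic (sign-test setting):
# upper-tailed P = P(S >= s_obs | H0) and the doubled two-sided P value.
import math

n, s_obs = 12, 10

def binom_pmf(s):
    return math.comb(n, s) * 0.5**n

p_upper = sum(binom_pmf(s) for s in range(s_obs, n + 1))  # 79/4096
p_two_sided = min(1.0, 2 * p_upper)  # doubling, symmetric null distribution
print(round(p_upper, 5), round(p_two_sided, 5))  # prints: 0.01929 0.03857
```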

Finally, note that the P value can be viewed as a random variable. For example, suppose that the test statistic T has a cdf F under H_0 and a cdf G under a one-sided upper-tailed alternative H_1. The P value is the probability of observing a more extreme value than the present random T, so the P value is just the random variable P = 1 − F(T). For a discussion of various properties and ramifications, the reader is referred to Sackrowitz and Samuel-Cahn (1999) and Donahue (1999).

CONSISTENCY

A test is consistent for a specified alternative if the power of the test, when that alternative is true, approaches 1 as the sample size approaches infinity. A test is consistent for a class (or subclass) of alternatives if the power of the test when any member of the class (subclass) of alternatives is true approaches 1 as the sample size approaches infinity.

Consistency is a ``good'' test criterion relevant to both parametric and nonparametric methods, and all the standard test procedures clearly share this property. However, in nonparametric statistics the


alternatives are often extremely general, and a wide selection of tests may be available for any one experimental situation. The consistency criterion provides an objective method of choosing among these tests (or at least eliminating some from consideration) when a less general subclass of alternatives is of major interest to the experimenter. A test which is known to be consistent against a specified subclass is said to be especially sensitive to that type of alternative and can generally be recommended for use when the experimenter wishes particularly to detect differences of the type expressed in that subclass.

Consistency of a test can often be shown by investigating whether or not the test statistic converges in probability to the parameter of interest. An especially useful method of investigating consistency is described as follows. A random sample of size n is drawn from the family F_X(x; θ), θ ∈ Ω. Let T be a test statistic for the general hypothesis θ ∈ ω versus θ ∈ Ω − ω, and let g(θ) be some function of θ such that

g(θ) = θ0   if θ ∈ ω

and

g(θ) ≠ θ0   if θ ∈ Δ, for Δ ⊂ Ω − ω

If for all θ we have

E(T) = g(θ)   and   lim_{n→∞} var(T) = 0

then the size-α test with rejection region

T ∈ R   for |T − θ0| > c_α

is consistent for the subclass Δ. Similarly, for a one-sided subclass of alternatives where

g(θ) = θ0   if θ ∈ ω

and

g(θ) > θ0   if θ ∈ Δ, for Δ ⊂ Ω − ω

the consistent test of size α has rejection region

T ∈ R   for T − θ0 > c′_α

The results follow directly from Chebyshev's inequality. (For a proof, see Fraser, 1957.) It may be noted that the unbiasedness condition E(T) = g(θ) may be relaxed to asymptotic (n → ∞) unbiasedness.
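The defining property of consistency, power tending to 1 as n grows, is easy to see by simulation. A sketch for the one-sided test of H0: θ = 0 based on the sample mean of normal data; the rule "reject when √n·x̄ > 1.645" has size about 0.05 here, and all of the numbers are illustrative choices rather than values from the text:

```python
import random

def mc_power(n, theta, reps=2000, seed=1):
    """Monte Carlo power of the test that rejects H0: theta = 0
    when sqrt(n) * sample_mean > 1.645, for N(theta, 1) data."""
    rng = random.Random(seed)
    rejections = 0
    for _ in range(reps):
        xbar = sum(rng.gauss(theta, 1.0) for _ in range(n)) / n
        if n ** 0.5 * xbar > 1.645:
            rejections += 1
    return rejections / reps

p_small = mc_power(10, 0.5)    # moderate power at n = 10
p_large = mc_power(100, 0.5)   # power close to 1 at n = 100
```

Here E(x̄) = θ and var(x̄) = 1/n → 0, exactly the two conditions in the Chebyshev argument above, so the power against θ = 0.5 climbs toward 1 as n increases.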


PITMAN EFFICIENCY

Another sort of objective criterion may be useful in choosing between two or more tests which are comparable in a well-defined way, namely the concept of Pitman efficiency. In the theory of point estimation, the efficiency of two unbiased estimators for a parameter is defined as the ratio of their variances. In some situations, the limiting value of this ratio may be interpreted as the relative number of additional observations needed using the less efficient estimator to obtain the same accuracy. The idea of efficiency of two test statistics is closely related, where power is regarded as a measure of accuracy, but the tests must be compared under equivalent conditions (as both estimators were specified to be unbiased), and there are many variables in hypothesis

testing. The most common way to compare two tests is to make all factors equivalent except sample size.

The power efficiency of a test A relative to a test B, where both tests are for the same simple null and alternative hypotheses, the same type of rejection region, and the same significance level, is the ratio n_b/n_a, where n_a is the number of observations required by test A for the power of test A to equal the power of test B when n_b observations are employed. Since power efficiency generally depends on the selected significance level, hypotheses, and n_b, it is difficult to calculate and interpret. The problem can be avoided in many cases by defining a type of limiting power efficiency.

Let A and B be two consistent tests of a null hypothesis H0 and alternative hypothesis H1, at significance level α. The asymptotic relative efficiency (ARE) of test A relative to test B is the limiting value of the ratio n_b/n_a, where n_a is the number of observations required by test A for the power of test A to equal the power of test B based on n_b observations while simultaneously n_b → ∞ and H1 → H0.

In many applications of this definition, the ratio is the same for all choices of α, so that the ARE is a single number with a well-defined interpretation for large samples. The requirement that both tests be consistent against H1 is not a limitation in application, since most tests under consideration for a particular type of alternative will be consistent anyway. But with two consistent tests, their powers both approach 1 with increasing sample sizes. Therefore, we must let H1 approach H0 so that the power of each test lies on the open interval (α, 1) for finite sample sizes, and the limiting ratio will generally be some number other than 1. The ARE is sometimes also called local asymptotic efficiency since it relates to large-sample power in the vicinity of the null hypothesis. A few studies have been conducted which


seem to indicate that in several important cases the ARE is a reasonably close approximation to the exact efficiency for moderate-sized samples and alternatives not too far from the null case. Especially in the case of small samples, however, the implications of the ARE value cannot be considered particularly meaningful. Methods of calculating the ARE for comparisons of particular tests will be treated fully in a later chapter.

The problem of evaluating the relative merits of two or more comparable test statistics is by no means solved by introducing the criteria of consistency and asymptotic relative efficiency. Both are large-sample properties and may not have much import for small- or even moderate-sized samples. As discussed earlier, exact power calculations are tedious and often too specific to shed much light on the problem as it relates to nonparametric tests, which may explain the general acceptance of asymptotic criteria in the field of nonparametric inference.

The asymptotic relative efficiency of two tests is also defined as the ratio of the limits of the efficacies of the respective tests as the sample sizes approach infinity. The efficacy of a test for H0: θ = θ0 based on a sample of size n is defined as the derivative of the mean of the test statistic with respect to θ, divided by the variance of the test statistic, both evaluated at the hypothesized value θ = θ0. Thus, for large n the efficacy measures the rate of change of the mean (expressed in standard units) of a test statistic at the null hypothesis value of θ. A test with a relatively large efficacy is especially sensitive to alternative values of θ close to θ0 and therefore should have good power in the vicinity of θ0. Details will be given in a later chapter.
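As a concrete instance of the efficacy route, consider the classical ARE of the ordinary sign test relative to the t test under normal sampling. Using the squared-derivative form of the efficacy, e = [dE(T)/dθ]²/var(T) evaluated at θ0, the sign test has efficacy 4nf(θ0)² and the t test n/σ², so the ratio is 4σ²f(θ0)². This is a standard result (derived in detail later in the book, not in this section); a quick numerical check that it equals 2/π ≈ 0.637 for the normal distribution:

```python
import math

# ARE of the sign test relative to the t test under N(mu, sigma^2) sampling,
# via the efficacy-ratio formula ARE = 4 * sigma^2 * f(median)^2.
sigma = 1.0
f_at_median = 1.0 / (sigma * math.sqrt(2 * math.pi))  # normal density at its median
are_sign_vs_t = 4 * sigma**2 * f_at_median**2         # reduces to 2 / pi
```

Note that σ cancels, so the ARE is a single number for the whole normal family, illustrating the "single number with a well-defined interpretation" remark above.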

RANDOMIZED TESTS

We now turn to a different problem which, although not limited to nonparametric inference, is of particular concern in this area. For most classical test procedures, the experimenter chooses a ``reasonable'' significance level a in advance and determines the rejection-region boundary such that the probability of a type I error is exactly a for a simple hypothesis and does not exceed a for a composite hypothesis. When the null probability distribution of the test statistic is continuous, any real number between 0 and 1 may be chosen as the significance level. Let us call this preselected number the nominal a. If the test statistic T can take on only a countable number of values, i.e., if the sampling distribution of T is discrete, the number of possible exact probabilities of a type I error is limited to the


attainable values of the cdf of T under H0, and a preselected nominal a generally falls between two of these attainable values. One way to achieve any nominal a exactly is to use a randomized decision rule. Let R1 denote the rejection region consisting of the most extreme values of T, with P(T ∈ R1 | H0) < a, and let t1 denote the next most extreme value of T. The randomized rule rejects H0 with probability 1 if the observed t ∈ R1, rejects with probability p if t = t1, and accepts otherwise, where p is chosen so that

P(T ∈ R1 | H0) + p·P(T = t1 | H0) = a

The power of this randomized test against an alternative θ1 is computed in the same way:

Pw(θ1) = P(T ∈ R1 | θ1) + p·P(T = t1 | θ1)

As an example, suppose we have a random sample of n = 5 observations from the Bernoulli distribution and wish to test H0: θ = 0.5 against H1: θ > 0.5 at nominal level a = 0.05, rejecting for large values of T, the number of successes in the sample.


Under H0 the statistic T has the binomial distribution with n = 5 and θ = 0.5. For a nonrandomized test the largest attainable size not exceeding 0.05 is P(T = 5 | H0) = 1/32 = 0.03125, while P(T ≥ 4 | H0) = 6/32 = 0.1875 already exceeds 0.05. With the randomized rule we reject with probability 1 when t = 5 and with probability p when t = 4, where p satisfies

P(T = 5 | θ = 0.5) + p·P(T = 4 | θ = 0.5) = 1/32 + p(5/32) = 0.05

so that p = 0.12. The exact size of this randomized test is 1/32 + 0.12(5/32) = 0.05, and its power against the alternative θ1 = 0.6 is

Pw(0.6) = P(T = 5 | θ = 0.6) + 0.12·P(T = 4 | θ = 0.6) = 0.0778 + 0.12(0.2592) = 0.1089

The null tail probabilities needed for such calculations are:

t                     5      4      3      2      1      0
P(T ≥ t | θ = 0.5)   1/32   6/32   16/32  26/32  31/32   1
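The randomized test for this binomial case, n = 5, θ0 = 0.5, nominal a = 0.05, can be constructed and checked numerically (a sketch; `binom_pmf` is a small helper defined here, not from the text):

```python
from math import comb

def binom_pmf(k, n, theta):
    """P(T = k) for T ~ Binomial(n, theta)."""
    return comb(n, k) * theta**k * (1 - theta)**(n - k)

# Randomized test of H0: theta = 0.5 vs H1: theta > 0.5 with n = 5, alpha = 0.05:
# reject outright if T = 5; if T = 4, reject with probability p.
alpha = 0.05
p = (alpha - binom_pmf(5, 5, 0.5)) / binom_pmf(4, 5, 0.5)  # randomization probability
size = binom_pmf(5, 5, 0.5) + p * binom_pmf(4, 5, 0.5)     # exact size, equals alpha
power = binom_pmf(5, 5, 0.6) + p * binom_pmf(4, 5, 0.6)    # power against theta = 0.6
```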

CONTINUITY CORRECTION

When the null distribution of a discrete test statistic T is approximated by a continuous distribution such as the normal, the approximation to tail probabilities is generally improved by a continuity correction of ±0.5. For a two-sided test based on T with an approximately normal null distribution, the corrected critical values t_R and t_L satisfy

[t_R − 0.5 − E(T | H0)] / σ(T | H0) = z_{a/2}   and   [t_L + 0.5 − E(T | H0)] / σ(T | H0) = −z_{a/2}

where Φ(z_{a/2}) = 1 − a/2, so that the rejection region is

T ≥ E(T | H0) + 0.5 + z_{a/2} σ(T | H0)   or   T ≤ E(T | H0) − 0.5 − z_{a/2} σ(T | H0)

As an example, suppose T has the binomial distribution with n = 20 and θ0 = 0.5, the alternative is θ > 0.5, and the observed value is t = 13. Since E(T | H0) = 10 and var(T | H0) = 5, the approximate P value with a continuity correction is

P(T ≥ 13 | H0) = P(T ≥ 12.5 | H0) ≈ P(Z ≥ (12.5 − 10)/√5) = P(Z ≥ 1.12) = 1 − Φ(1.12) = 0.1314

which agrees well with the exact P value 0.1316 found from the binomial distribution; the normal approximation without the continuity correction gives 0.0901. In general, the approximate P value with a continuity correction is

1 − Φ[(t0 − E(T | H0) − 0.5) / σ(T | H0)]

for an upper-tailed test, and

Φ[(t0 − E(T | H0) + 0.5) / σ(T | H0)]

for a lower-tailed test.
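A continuity-corrected normal approximation to a binomial upper-tail P value, with n = 20, θ0 = 0.5, and observed t0 = 13, can be checked against the exact tail probability (a sketch; `norm_sf` is a hypothetical helper built on `math.erf`):

```python
from math import comb, erf, sqrt

def norm_sf(z):
    """Upper-tail probability 1 - Phi(z) of the standard normal."""
    return 0.5 * (1.0 - erf(z / sqrt(2.0)))

n, theta0, t0 = 20, 0.5, 13
mean = n * theta0                      # E(T | H0) = 10
sd = sqrt(n * theta0 * (1 - theta0))   # sigma(T | H0) = sqrt(5)

exact = sum(comb(n, k) for k in range(t0, n + 1)) / 2 ** n  # exact P(T >= 13 | H0)
with_cc = norm_sf((t0 - 0.5 - mean) / sd)   # continuity-corrected approximation
without_cc = norm_sf((t0 - mean) / sd)      # uncorrected approximation
```

The corrected value lands within a few ten-thousandths of the exact tail probability, while the uncorrected one is off by roughly a third of its size, which is the whole point of the ±0.5 adjustment.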


2
Order Statistics, Quantiles, and Coverages

[The remainder of this chapter — covering the quantile function, the empirical distribution and quantile functions, the probability-integral transformation, the joint and marginal distributions of order statistics, the distributions of the median and the range, exact and large-sample moments of order statistics, their asymptotic distributions, tolerance limits, coverages, and the chapter problems — survives in this extraction only as unreadable residue and is omitted.]

2

ðÞ S 1 2

ð2 ÞÀð ÀÞ ÀP¼1 þð ÀÞ 2 ! 0 1 ÁÁÁ 1

ðÞ S À1½P¼1 þ ð ÀÞ 2

L ð1Þ

ð ¼1 2 Þ ð1Þ

ðÞ S

¼ ðÞ

ðÞ ¼ ðþ À1Þð þ ÞðÞ S

ðÀ1Þ ðÞ ¼ þ À1

7 2

Page 106: Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

8/15/2019 Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

http://slidepdf.com/reader/full/gibbons-chakraboti-nonparametric-statistical-inference-fourth-edition 106/675

ðÞðÞ S

ð1Þ

¼ ð1 À Þ

D

ð Þ ¼10& 0

0

S

ð

Þ ¼¼1

ð À Þ

1 2

ð Þ ¼ð Þ

P ½ð Þ ð Þ ¼½ ð Þ ð Þ

ð Þ ¼ ð Þ À ¼ ð1 ÀÞð1 ÀÞ&

ðÞ

L ð Þ 0 1 D

ðÞ ¼ ffiffiffi ðÞÀ

ðÞ ¼ ðþ1Þ þ1 0 1

½ ðÞ ½ ðÞ ½ ðÞ ½ ðÞ

½ ðÞ ½ ðÞ 0 1

, , V 7

Page 107: Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

8/15/2019 Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

http://slidepdf.com/reader/full/gibbons-chakraboti-nonparametric-statistical-inference-fourth-edition 107/675

3 Tests of Randomness

3.1 INTRODUCTION

Passing a line of ten persons waiting to buy a ticket at a movie theater on a Saturday afternoon, suppose we observe the arrangement of five males and five females in the line to be M, F, M, F, M, F, M, F, M, F. Would this be considered a random arrangement by gender? Intuitively, the answer is no, since the alternation of the two types of symbols suggests intentional mixing by pairs. This arrangement is an extreme case, as is the configuration M, M, M, M, M, F, F, F, F, F, with intentional clustering. In the less extreme situations, the randomness of an arrangement can be tested statistically using the theory of runs.

Given an ordered sequence of one or more types of symbols, a run is defined to be a succession of one or more identical symbols which are followed and preceded by a different symbol or no symbol at all. Clues to lack of randomness are provided by any tendency of the symbols to exhibit a definite pattern in the sequence. Both the number of runs and the lengths of the runs, which are of course interrelated, should reflect the existence of some sort of pattern. Tests for randomness can therefore be based on either criterion or some combination thereof. Too few runs, too many runs, a run of excessive length, too many runs of excessive length, etc., can be used as statistical criteria for rejection of the null hypothesis of randomness, since these situations should occur rarely in a truly random sequence.
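The run counting itself is mechanical; a minimal sketch in Python (our own illustration, not code from the text, using the standard-library `itertools.groupby`) makes the two extreme movie-line arrangements concrete:

```python
from itertools import groupby

def runs(sequence):
    """Return (symbol, length) pairs, one per run, in order of occurrence."""
    return [(sym, len(list(grp))) for sym, grp in groupby(sequence)]

# The two extreme arrangements by gender from the example above:
print(len(runs("MFMFMFMFMF")))   # 10 runs: maximal alternation (mixing)
print(len(runs("MMMMMFFFFF")))   # 2 runs: maximal clustering
```

Either extreme count of runs, very large or very small, is evidence against randomness.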

The alternative to randomness is often simply nonrandomness. In a test based on the total number of runs, both too few and too many runs suggest lack of randomness. A null hypothesis of randomness would consequently be rejected if the total number of runs is either too large or too small. However, the two situations may indicate different types of lack of randomness. In the movie theater example, a sequence with too many runs, tending toward the genders alternating, might suggest that the movie is popular with teenagers and young adults, whereas the other extreme arrangement may result if the movie is more popular with younger children.

Tests of randomness are an important addition to statistical theory, because the theoretical bases for almost all the classical techniques, as well as distribution-free procedures, begin with the assumption of a random sample. If this assumption is valid, every sequential order is of no consequence. However, if the randomness of the observations is suspect, the information about order, which is almost always available, can be used to test a hypothesis of randomness. This kind of analysis is also useful in time-series and quality-control studies.

The symbols studied for pattern may arise naturally, as with the theater example, or may be artificially imposed according to some dichotomizing criterion. Thus the runs tests are applicable to either qualitative or quantitative data. In the latter case, the dichotomy is usually effected by comparing the magnitude of each number with a focal point, commonly the median or mean of the sample, and noting whether each observation exceeds or is exceeded by this value. When the data consist of numerical observations, two other types of runs analysis can be used to reach a conclusion about randomness. Both of these techniques use the information about relative magnitudes of adjacent numbers in the time-ordered sequence. These techniques, called the runs up and down test and the rank von Neumann test, use more of the available information and are especially effective when the alternative to randomness is either a trend or autocorrelation.


3.2 TESTS BASED ON THE TOTAL NUMBER OF RUNS

Assume an ordered sequence of n elements of two types, n_1 of the first type and n_2 of the second type, where n_1 + n_2 = n. If r_1 is the number of runs of type 1 elements and r_2 is the number of runs of type 2, the total number of runs in the sequence is r = r_1 + r_2. In order to derive a test for randomness based on the variable R, the total number of runs, we need the probability distribution of R when the null hypothesis of randomness is true.

EXACT NULL DISTRIBUTION OF R

The distribution of R will be found by first determining the joint probability distribution of R_1 and R_2 and then the distribution of their sum. Since under the null hypothesis every arrangement of the n_1 + n_2 objects is equiprobable, the probability that R_1 = r_1 and R_2 = r_2 is the number of distinguishable arrangements of n_1 + n_2 objects with r_1 runs of type 1 and r_2 runs of type 2 objects divided by the total number of distinguishable arrangements, which is n!/(n_1! n_2!). For the numerator quantity, the following counting lemma can be used.

Lemma 1 The number of distinguishable ways of distributing n like objects into r distinguishable cells with no cell empty is

\binom{n-1}{r-1},  n \ge r

Proof Suppose that the n like objects are all white balls. Place these n balls in a row and effect the division into r cells by inserting each of r − 1 black balls between two white balls in the line, no two black balls adjacent. Since there are n − 1 positions in which the black balls can be placed, the total number of arrangements is \binom{n-1}{r-1}.

In order to obtain a sequence with r_1 runs of the objects of type 1, the n_1 like objects must be placed into r_1 cells, which can be done in \binom{n_1-1}{r_1-1} different ways. The same reasoning applies to obtain r_2 runs of the other n_2 objects. The total number of distinguishable arrangements starting with a run of type 1 then is the product \binom{n_1-1}{r_1-1}\binom{n_2-1}{r_2-1}, and similarly for a sequence starting with a run of type 2. The blocks of objects of type 1 and type 2 must alternate, and consequently either r_1 = r_2 ± 1 or r_1 = r_2. If r_1 = r_2 + 1, the sequence must begin with a run of type 1; if r_1 = r_2 − 1, a type 2 run must come first. But if r_1 = r_2, the sequence can begin with a run of either type, so the number of distinguishable arrangements must be doubled. We have thus proved the following result.


Theorem 2.1 Let R_1 and R_2 denote the respective numbers of runs of n_1 objects of type 1 and n_2 objects of type 2 in a random sample of size n = n_1 + n_2. The joint probability distribution of R_1 and R_2 is

f_{R_1,R_2}(r_1, r_2) = c \binom{n_1-1}{r_1-1} \binom{n_2-1}{r_2-1} \Big/ \binom{n}{n_1}    (2.1)

for r_1 = 1, 2, ..., n_1; r_2 = 1, 2, ..., n_2; and r_1 = r_2 or r_1 = r_2 ± 1,

where c = 2 if r_1 = r_2 and c = 1 if r_1 = r_2 ± 1.

Corollary 2.1 The marginal probability distribution of R_1 is

f_{R_1}(r_1) = \binom{n_1-1}{r_1-1} \binom{n_2+1}{r_1} \Big/ \binom{n}{n_1}    r_1 = 1, 2, ..., n_1    (2.2)

Similarly for R_2 with n_1 and n_2 interchanged.

Proof From (2.1), the only possible values of r_2 are r_2 = r_1, r_1 − 1, and r_1 + 1, for any r_1. Therefore we have

f_{R_1}(r_1) = \sum_{r_2} f_{R_1,R_2}(r_1, r_2)
= \binom{n_1-1}{r_1-1} \left[ \binom{n_2-1}{r_1-2} + 2\binom{n_2-1}{r_1-1} + \binom{n_2-1}{r_1} \right] \Big/ \binom{n}{n_1}
= \binom{n_1-1}{r_1-1} \left[ \binom{n_2}{r_1-1} + \binom{n_2}{r_1} \right] \Big/ \binom{n}{n_1}
= \binom{n_1-1}{r_1-1} \binom{n_2+1}{r_1} \Big/ \binom{n}{n_1}


Theorem 2.2 The probability distribution of R, the total number of runs of n = n_1 + n_2 objects, n_1 of type 1 and n_2 of type 2, in a random sample is

f_R(r) = 2 \binom{n_1-1}{r/2-1} \binom{n_2-1}{r/2-1} \Big/ \binom{n}{n_1}    if r is even

f_R(r) = \left[ \binom{n_1-1}{(r-1)/2} \binom{n_2-1}{(r-3)/2} + \binom{n_1-1}{(r-3)/2} \binom{n_2-1}{(r-1)/2} \right] \Big/ \binom{n}{n_1}    if r is odd

for r = 2, 3, ..., n_1 + n_2.    (2.3)

Proof For r even, there must be the same number of runs of both types. Thus the only possible values of r_1 and r_2 are r_1 = r_2 = r/2, and (2.1) is summed over this pair. If r_1 = r_2 ± 1, r is odd. In this case, (2.1) is summed over the two pairs of values r_1 = (r − 1)/2, r_2 = (r + 1)/2 and r_1 = (r + 1)/2, r_2 = (r − 1)/2, obtaining the given result. Note that the binomial coefficient \binom{a}{b} is defined to be zero if a < b.

Using the result of Theorem 2.2, tables can be prepared for tests of significance of the null hypothesis of randomness. For example, if n_1 = 4 and n_2 = 5, we have

f_R(2) = 2/126 = 0.016

f_R(3) = 7/126 = 0.056

f_R(8) = 8/126 = 0.063

f_R(9) = 1/126 = 0.008


For a two-sided test which rejects the null hypothesis for R ≤ 2 or R ≥ 9, the exact significance level α would be 3/126 = 0.024. For the critical region defined by R ≤ 3 or R ≥ 8, α = 18/126 = 0.143. As for all tests based on discrete probability distributions, there are a finite number of possible values of α. For a test with significance level at most 0.05, say, the first critical region above would be used even though the actual value of α is only 0.024. Tables which can be used to find rejection regions for the runs test are available in many sources. Swed and Eisenhart (1943) give the probability distribution of R for n_1 ≤ n_2 ≤ 20; it is given in Table D of the Appendix for all n_1 ≤ n_2 such that n_1 + n_2 ≤ 20 and for certain other combinations of larger n_1 and n_2; note that left-tail and right-tail probabilities are given separately.
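Theorem 2.2 translates directly into code. The sketch below is our own (not from the text); it relies on Python's `math.comb`, which conveniently returns 0 whenever the lower index exceeds the upper, matching the convention \binom{a}{b} = 0 for a < b:

```python
from math import comb

def f_R(r, n1, n2):
    """Exact null probability of r total runs (Theorem 2.2)."""
    denom = comb(n1 + n2, n1)
    if r % 2 == 0:
        k = r // 2                     # r1 = r2 = r/2
        return 2 * comb(n1 - 1, k - 1) * comb(n2 - 1, k - 1) / denom
    k = (r - 1) // 2                   # r odd: run counts are k and k + 1 in either order
    return (comb(n1 - 1, k) * comb(n2 - 1, k - 1)
            + comb(n1 - 1, k - 1) * comb(n2 - 1, k)) / denom

dist = {r: f_R(r, 4, 5) for r in range(2, 10)}
print(round(dist[2] + dist[9], 3))     # exact level of the test R <= 2 or R >= 9: 0.024
```

Summing `dist.values()` gives 1, a quick sanity check that the two parity branches together exhaust all arrangements.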

MOMENTS OF THE NULL DISTRIBUTION OF R

The kth moment of R is

E(R^k) = \sum_r r^k f_R(r)
= \left\{ \sum_{r\ even} 2 r^k \binom{n_1-1}{r/2-1} \binom{n_2-1}{r/2-1} + \sum_{r\ odd} r^k \left[ \binom{n_1-1}{(r-1)/2} \binom{n_2-1}{(r-3)/2} + \binom{n_1-1}{(r-3)/2} \binom{n_2-1}{(r-1)/2} \right] \right\} \Big/ \binom{n}{n_1}    (2.4)

The smallest value of r is always 2. If n_1 = n_2, the largest number of runs occurs when the symbols alternate, in which case r = 2n_1. If n_1 < n_2, the maximum value of r is 2n_1 + 1, since the sequence can both begin and end with a type 2 symbol. Assuming without loss of generality that n_1 ≤ n_2, the range of summation for r is 2 ≤ r ≤ 2n_1 + 1. Setting r = 2i for r even and r = 2i + 1 for r odd, the range of i is 1 ≤ i ≤ n_1.

For example, the mean of R is expressed as follows using (2.4):

\binom{n}{n_1} E(R) = \sum_{i=1}^{n_1} 4i \binom{n_1-1}{i-1} \binom{n_2-1}{i-1}
+ \sum_{i=1}^{n_1} (2i+1) \binom{n_1-1}{i} \binom{n_2-1}{i-1}
+ \sum_{i=1}^{n_1} (2i+1) \binom{n_1-1}{i-1} \binom{n_2-1}{i}

To evaluate these three sums, the following lemmas are useful.


Lemma 2

\sum_{r=0}^{c} \binom{m}{r} \binom{n}{r} = \binom{m+n}{m}    where c = min(m, n)

Proof (1 + x)^{m+n} = (1 + x)^m (1 + x)^n for all x. Expanding both sides,

\sum_{i=0}^{m+n} \binom{m+n}{i} x^i = \sum_{j=0}^{m} \binom{m}{j} x^j \sum_{k=0}^{n} \binom{n}{k} x^k

Assuming without loss of generality that c = m and equating the two coefficients of x^m on both sides of this equation, we obtain

\binom{m+n}{m} = \sum_{r=0}^{m} \binom{m}{m-r} \binom{n}{r} = \sum_{r=0}^{m} \binom{m}{r} \binom{n}{r}

Lemma 3

\sum_{r=0}^{c} \binom{m}{r} \binom{n}{r+1} = \binom{m+n}{m+1}    where c = min(m, n − 1)

Proof The proof follows as in Lemma 2, equating coefficients of x^{m+1}.
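Both identities are easy to spot-check numerically; a small sketch of ours (not from the text):

```python
from math import comb

def lemma2(m, n):
    # sum_{r=0}^{min(m,n)} C(m,r) C(n,r)
    return sum(comb(m, r) * comb(n, r) for r in range(min(m, n) + 1))

def lemma3(m, n):
    # sum_{r=0}^{min(m,n-1)} C(m,r) C(n,r+1)
    return sum(comb(m, r) * comb(n, r + 1) for r in range(min(m, n - 1) + 1))

print(lemma2(5, 7) == comb(12, 5))   # True
print(lemma3(5, 7) == comb(12, 6))   # True
```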

The algebraic process of obtaining E(R) from (2.4) is tedious and will be left as an exercise for the reader. The variance of R can be found by evaluating the factorial moment E[R(R − 1)] in a similar manner.

A much simpler approach to finding the moments of R is provided by considering R as the sum of indicator variables as follows, for n = n_1 + n_2. Let

R = 1 + I_2 + I_3 + ··· + I_n

where, in an ordered sequence of the two types of symbols, we define

I_k = 1 if the kth element ≠ the (k − 1)th element, and I_k = 0 otherwise

Then I_k is a Bernoulli random variable with parameter p = 2 n_1 n_2 / [n(n − 1)], so

E(I_k) = E(I_k^2) = 2 n_1 n_2 / [n(n − 1)]


Since R is a linear combination of these I_k, we have

E(R) = 1 + \sum_{k=2}^{n} E(I_k) = 1 + \frac{2 n_1 n_2}{n}    (2.5)

var(R) = var\left( \sum_{k=2}^{n} I_k \right) = (n − 1)\, var(I_k) + \sum\sum_{j \ne k} cov(I_j, I_k)
= (n − 1) E(I_k) + \sum\sum_{j \ne k} E(I_j I_k) − [(n − 1) E(I_k)]^2    (2.6)

To evaluate the (n − 1)(n − 2) joint moments of the type E(I_j I_k) for j ≠ k, the subscript choices can be classified as follows:

1. For the 2(n − 2) selections where j = k − 1 or j = k + 1,

E(I_j I_k) = \frac{n_1 n_2 (n_1 − 1) + n_1 n_2 (n_2 − 1)}{n(n−1)(n−2)} = \frac{n_1 n_2 (n − 2)}{n(n−1)(n−2)} = \frac{n_1 n_2}{n(n−1)}

2. For the remaining (n − 1)(n − 2) − 2(n − 2) = (n − 2)(n − 3) selections of j ≠ k,

E(I_j I_k) = \frac{4 n_1 n_2 (n_1 − 1)(n_2 − 1)}{n(n−1)(n−2)(n−3)}

Substitution of these moments in the appropriate parts of (2.6) gives

var(R) = \frac{2 n_1 n_2}{n} + \frac{2(n−2) n_1 n_2}{n(n−1)} + \frac{4 n_1 n_2 (n_1−1)(n_2−1)}{n(n−1)} − \left( \frac{2 n_1 n_2}{n} \right)^2 = \frac{2 n_1 n_2 (2 n_1 n_2 − n)}{n^2 (n−1)}    (2.7)
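The closed forms for the mean and variance can be verified by brute-force enumeration of all \binom{n}{n_1} arrangements, which is feasible for small n. This check is our own sketch, not code from the text:

```python
from itertools import combinations
from math import comb

def moments_by_enumeration(n1, n2):
    """Exact mean and variance of R over all equally likely arrangements."""
    n = n1 + n2
    s1 = s2 = 0
    for pos in combinations(range(n), n1):      # positions of the type-1 symbols
        pset = set(pos)
        seq = [1 if i in pset else 2 for i in range(n)]
        r = 1 + sum(seq[k] != seq[k - 1] for k in range(1, n))  # total runs
        s1 += r
        s2 += r * r
    total = comb(n, n1)
    mean = s1 / total
    return mean, s2 / total - mean ** 2

n1, n2 = 4, 5
n = n1 + n2
mean, var = moments_by_enumeration(n1, n2)
print(abs(mean - (1 + 2 * n1 * n2 / n)) < 1e-9)                                # True
print(abs(var - 2 * n1 * n2 * (2 * n1 * n2 - n) / (n ** 2 * (n - 1))) < 1e-9)  # True
```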

ASYMPTOTIC NULL DISTRIBUTION

Although (2.3) can be used to find the exact distribution of R for any values of n_1 and n_2, the calculations are laborious unless n_1 and n_2 are both small. For large samples an approximation to the null distribution can be used which gives reasonably good results as long as n_1 and n_2 are both reasonably large.

In order to find the asymptotic distribution, we assume that the total sample size n tends to infinity in such a way that n_1/n → λ and n_2/n → 1 − λ, with λ fixed, 0 < λ < 1. For large samples then, the mean and variance of R from (2.5) and (2.7) are approximately 2nλ(1 − λ) and 4nλ^2(1 − λ)^2, respectively.


alternatives. If the alternative hypothesis is simply nonrandomness, a two-sided test should be used. Since the presence of a trend would usually be indicated by a clustering of like objects, which is reflected by an unusually small number of runs, a one-sided test is more appropriate for trend alternatives.

Because of the generality of alternatives to randomness, no statement can be made concerning the overall performance of this runs test. However, its versatility should not be underrated. Other tests for randomness have been proposed which are especially useful for trend alternatives. The best known of these are tests based on the length of the longest run and tests based on the runs up and down theory. These two types of test criteria will be discussed in the following sections.

APPLICATIONS

This test of randomness is based on the total number of runs R in an ordered sequence of n elements of two types, n_1 of type 1 and n_2 of type 2. Values of the null distribution of R computed from (2.3) are given in Table D as left-tail probabilities for R small and right-tail probabilities for R large. If the alternative is simply nonrandomness and the desired level is α, we should reject for R ≤ r_{α/2} or R ≥ r'_{α/2}, where P(R ≤ r_{α/2}) ≤ α/2 and P(R ≥ r'_{α/2}) ≤ α/2; the exact level of this two-tailed test is P(R ≤ r_{α/2}) + P(R ≥ r'_{α/2}). If the desired alternative is a tendency for like elements to cluster, we should reject only for too few runs and therefore only for R ≤ r_α. On the other hand, if the appropriate alternative is a tendency for the elements to mix, we should reject only for too many runs and therefore only for R ≥ r'_α. Because Table D covers only n_1 ≤ n_2, the type of element which occurs less frequently in the n observations should be called the type 1 element.

For n_1 and n_2 beyond the range of the table, the critical values r and r' can be found from the normal approximation to the null distribution of the total number of runs. If we use the exact mean and variance of R given in (2.5) and (2.7) and a continuity correction of 0.5, the left-tail and right-tail critical regions are

\frac{R + 0.5 − 1 − 2 n_1 n_2 / n}{\sqrt{2 n_1 n_2 (2 n_1 n_2 − n) / [n^2 (n − 1)]}} \le −z_{\alpha}

\frac{R − 0.5 − 1 − 2 n_1 n_2 / n}{\sqrt{2 n_1 n_2 (2 n_1 n_2 − n) / [n^2 (n − 1)]}} \ge z_{\alpha}    (2.11)
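In code, the approximate test statistic can be sketched as follows (our own illustration; the function name `runs_z` is not from the text, and the resulting value is compared with the standard normal critical value):

```python
from math import sqrt

def runs_z(r, n1, n2):
    """Normal-approximation statistic for the total-runs test,
    with a continuity correction of 0.5 toward the mean."""
    n = n1 + n2
    mean = 1 + 2 * n1 * n2 / n
    sd = sqrt(2 * n1 * n2 * (2 * n1 * n2 - n) / (n ** 2 * (n - 1)))
    corr = 0.5 if r < mean else -0.5
    return (r + corr - mean) / sd

# e.g. 12 runs in a sequence of 20 + 20 symbols: pronounced clustering
print(round(runs_z(12, 20, 20), 2))    # -2.72, beyond -z_0.05 = -1.645
```

A value this far in the left tail rejects randomness in favor of a clustering (too-few-runs) alternative.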


As an illustration, suppose an observed sequence contains n_1 = 4 elements of one type and n_2 = 6 of the other, and we wish to test randomness at level 0.05 against the two-sided alternative. From Table D, P(R ≤ 2) = 0.010 ≤ 0.025 and P(R ≥ 9) = 0.024 ≤ 0.025, so the rejection region is R ≤ 2 or R ≥ 9, with exact significance level 0.034. An observed value of R = 6 therefore does not lead to rejection; the exact right-tail probability is P(R ≥ 6) = 0.5952, while the approximation (2.11) gives z = −0.2107 and an approximate right-tail probability of 0.583.

Page 118: Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

8/15/2019 Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

http://slidepdf.com/reader/full/gibbons-chakraboti-nonparametric-statistical-inference-fourth-edition 118/675

3.3 TESTS BASED ON THE LENGTH OF THE LONGEST RUN

A test based on the total number of runs in an ordered sequence of n_1 objects of type 1 and n_2 objects of type 2 is only one way of using information about runs to detect patterns in the arrangement. Other statistics of interest are provided by considering the lengths of these runs. Since a run which is unusually long reflects a tendency for like objects to cluster, and therefore possibly a trend, Mosteller (1941) has suggested a test for randomness based on the length of the longest run. Exact and asymptotic probability distributions of the numbers of runs of given length are discussed in Mood (1940).

The joint probability distribution of R_1 and R_2 derived in the previous section, for the numbers of runs of the two types of objects, disregards the individual lengths and is therefore of no use in this problem. We now need the joint probability distribution for a total of n_1 + n_2 random variables, representing the numbers of runs of all possible lengths for each of the two types of elements in the dichotomy. Let the random variables R_{ij}, i = 1, 2; j = 1, 2, ..., n_i, denote, respectively, the numbers of runs of objects of type i which are of length j. Then the following obvious relationships hold:

\sum_{j=1}^{n_i} j\, r_{ij} = n_i  and  \sum_{j=1}^{n_i} r_{ij} = r_i  for i = 1, 2

The total number of arrangements of the n_1 + n_2 symbols is still \binom{n}{n_1}, and each is equally likely under the null hypothesis of randomness. We must compute the number of arrangements in which there are exactly r_{ij} runs of type i and length j, for all i and j. Assume that r_1 and r_2 are held fixed. The number of arrangements of the r_1 runs of type 1 which are composed of r_{1j} runs of length j, for j = 1, 2, ..., n_1, is simply the number of permutations of the r_1 runs with r_{11} runs of length 1, r_{12} runs of length 2, ..., r_{1n_1} runs of length n_1, where within each category the runs cannot be distinguished. This number is r_1! / \prod_{j=1}^{n_1} r_{1j}!. The number of arrangements for the r_2 runs of type 2 objects is similarly r_2! / \prod_{j=1}^{n_2} r_{2j}!. If r_1 = r_2 ± 1, the total number of permutations of the runs of both types of objects is the product of these two expressions, but if r_1 = r_2, since the sequence can begin with either type of object, this number must be doubled. Therefore the following theorem is proved.


Theorem 3.1 Under the null hypothesis of randomness, the probability that the r_1 runs of n_1 objects of type 1 and r_2 runs of n_2 objects of type 2 consist of exactly r_{1j}, j = 1, 2, ..., n_1, and r_{2j}, j = 1, 2, ..., n_2, runs of length j, respectively, is

f(r_{11}, ..., r_{1n_1}; r_{21}, ..., r_{2n_2}) = \frac{c\, r_1!\, r_2!}{\prod_i \prod_j r_{ij}!\, \binom{n}{n_1}}    (3.1)

where c = 2 if r_1 = r_2 and c = 1 if r_1 = r_2 ± 1.

Combining the reasoning for Theorem 3.1 with that of Theorem 2.1, the joint distribution of the n_1 + 1 random variables R_2 and R_{1j}, j = 1, 2, ..., n_1, can be obtained as follows:

f(r_{11}, ..., r_{1n_1}; r_2) = \frac{c\, r_1! \binom{n_2-1}{r_2-1}}{\prod_{j=1}^{n_1} r_{1j}!\, \binom{n}{n_1}}    (3.2)

The result is useful when only the total number, not the lengths, of runs of type 2 objects is of interest. This joint distribution, when summed over the values for r_2, gives the marginal probability distribution of the lengths of the r_1 runs of objects of type 1.

Theorem 3.2 The probability that the r_1 runs of n_1 objects of type 1 consist of exactly r_{1j}, j = 1, 2, ..., n_1, runs of length j, respectively, is

f(r_{11}, ..., r_{1n_1}) = \frac{r_1! \binom{n_2+1}{r_1}}{\prod_{j=1}^{n_1} r_{1j}!\, \binom{n}{n_1}}    (3.3)

The proof is exactly the same as for the corollary to Theorem 2.1. The distribution of lengths of runs of type 2 objects is similar.

In the probability distributions given in Theorems 3.1 and 3.2, both r_1 and r_2, and r_1, respectively, were assumed to be fixed numbers. In other words, these are conditional probability distributions given the fixed values. If these are not to be considered as fixed, the conditional distributions are simply summed over the possible fixed values, since these are mutually exclusive.


Theorem 3.2 can be used to find the null probability distribution of a test for randomness based on K_1, the length of the longest run of type 1. For example, the probability that the longest in any number of runs of type 1 is of length k is

P(K_1 = k) = \sum_{r_1} \sum_{r_{11}, ..., r_{1k}} \frac{r_1! \binom{n_2+1}{r_1}}{\prod_{j=1}^{k} r_{1j}!\, \binom{n}{n_1}}    (3.4)

where the sums are extended over all sets of nonnegative integers satisfying \sum_{j=1}^{k} r_{1j} = r_1, \sum_{j=1}^{k} j\, r_{1j} = n_1, r_{1k} ≥ 1, r_1 ≤ n_1 − k + 1, and r_1 ≤ n_2 + 1. For example, if n_1 = 5 and n_2 = 5, the longest possible run is of length 5. There can be no other runs, so that r_{15} = 1 and r_{11} = r_{12} = r_{13} = r_{14} = 0, and

P(K_1 = 5) = \frac{1!\,\binom{6}{1}}{1!\,\binom{10}{5}} = \frac{6}{252}

Similarly, we can obtain

P(K_1 = 4) = \frac{2!\,\binom{6}{2}}{1!\,1!\,\binom{10}{5}} = \frac{30}{252}

P(K_1 = 3) = \frac{2!\,\binom{6}{2}}{1!\,1!\,\binom{10}{5}} + \frac{3!\,\binom{6}{3}}{2!\,1!\,\binom{10}{5}} = \frac{90}{252}

P(K_1 = 2) = \frac{3!\,\binom{6}{3}}{1!\,2!\,\binom{10}{5}} + \frac{4!\,\binom{6}{4}}{3!\,1!\,\binom{10}{5}} = \frac{120}{252}

P(K_1 = 1) = \frac{5!\,\binom{6}{5}}{5!\,\binom{10}{5}} = \frac{6}{252}


For a significance level of at most 0.05 when n_1 = 5 and n_2 = 5, the null hypothesis of randomness is rejected when there is a run of type 1 elements of length 5, since P(K_1 = 5) = 6/252 = 0.024. In general, the critical region would be the arrangements with at least one run of length t or more.
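For small n_1 and n_2 the null distribution of the longest run of type 1 can also be obtained by direct enumeration of all arrangements, which provides an independent check on values computed from (3.4). This is our own sketch, not code from the text:

```python
from itertools import combinations, groupby
from math import comb

def longest_run_dist(n1, n2):
    """Null distribution of K1, the longest run of type-1 symbols."""
    n = n1 + n2
    counts = {}
    for pos in combinations(range(n), n1):
        pset = set(pos)
        seq = [1 if i in pset else 2 for i in range(n)]
        k1 = max(len(list(g)) for sym, g in groupby(seq) if sym == 1)
        counts[k1] = counts.get(k1, 0) + 1
    return {k: c / comb(n, n1) for k, c in sorted(counts.items())}

dist = longest_run_dist(5, 5)        # 252 equally likely arrangements
print(round(dist[5], 3))             # P(K1 = 5) = 6/252 ~ 0.024
```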

Theorem 3.1 must be used if the test is to be based on the length of the longest run of either type of element in the dichotomy. These two theorems are tedious to apply unless n_1 and n_2 are both quite small. Tables are available in Bateman (1948) and Mosteller (1941).

Tests based on the length of the longest run may or may not be more powerful than a test based on the total number of runs, depending on the basis for comparison. Both tests use only a portion of the information available, since the total number of runs, although affected by the lengths of the runs, does not directly make use of information regarding these lengths, and the length of the longest run only partially reflects both the lengths of other runs and the total number of runs. Power functions are discussed in Bateman (1948) and David (1947).

3.4 RUNS UP AND DOWN

When numerical observations are available and the sequence of numbers is analyzed for randomness according to the number or lengths of runs of elements above and below the median, some information is lost which might be useful in identifying a pattern in the time-ordered observations. With runs up and down, instead of using a single focal point for the entire series, the magnitude of each element is compared with that of the immediately preceding element in the time sequence. If the next element is larger, a run up is started; if smaller, a run down is started. We can observe when the sequence increases, and for how long, and when it decreases, and for how long. A decision concerning randomness then might be based on the number and lengths of these runs, whether up or down, since a large number of long runs should not occur in a truly random set of numerical observations. Since an excessive number of long runs is usually indicative of some sort of trend exhibited by the sequence, this type of analysis should be most sensitive to trend alternatives.

If the time-ordered observations yield, for example, the sign sequence (+, −, +, +, +), there is a run up of length 1, followed by a run down of length 1, followed by a run up of length 3. A sequence of six observations is thus represented by five plus and minus signs indicating their relative magnitudes. More generally, suppose there are n numbers,


no two alike, say a_1 < a_2 < ··· < a_n when arranged in order of magnitude. The time-ordered sequence of observations S_n = (x_1, x_2, ..., x_n) represents some permutation of these n numbers. There are n! permutations, each one representing a possible set of sample observations. Under the null hypothesis of randomness, each of these n! arrangements is equally likely to occur. The test for randomness using runs up and down for the sequence S_n of dimension n is based on the derived sequence D_{n−1} of dimension n − 1, whose ith element is the sign of the difference x_{i+1} − x_i, for i = 1, 2, ..., n − 1. Let R_i denote the number of runs, either up or down, of length exactly i in the sequence D_{n−1} or S_n. We have the obvious restrictions 1 ≤ i ≤ n − 1 and \sum_{i=1}^{n-1} i\, r_i = n − 1. The test for randomness will reject the null hypothesis when there are at least r runs of length t or more, where r and t are determined by the desired significance level. Therefore we must find the joint distribution of R_1, R_2, ..., R_{n−1} under the null hypothesis when every arrangement S_n is equally likely. Let f_n(r_{n−1}, r_{n−2}, ..., r_1) denote the probability that there are exactly r_{n−1} runs of length n − 1, ..., r_i runs of length i, ..., r_1 runs of length 1. If u_n(r_{n−1}, ..., r_1) represents the corresponding frequency, then

f_n = u_n / n!

because there are n! possible arrangements of S_n. Since the probability distribution will be derived as a recursive relation, let us first consider the particular case where n = 3 and see how the distribution for n = 4 can be generated from it.

Given three numbers a_1 < a_2 < a_3, only runs of lengths 1 and 2 are possible. The 3! = 6 arrangements and their corresponding values of r_1 and r_2 are given in Table 4.1. Since the probability of at least one run of length 2 or more is 2/6, if this defines the critical region the significance level is 0.33. With this size sample, a smaller significance

Table 4.1 Listing of arrangements when n = 3

S_3                D_2       r_1   r_2
(a_1, a_2, a_3)    (+, +)    0     1
(a_1, a_3, a_2)    (+, −)    2     0
(a_2, a_1, a_3)    (−, +)    2     0
(a_2, a_3, a_1)    (+, −)    2     0
(a_3, a_1, a_2)    (−, +)    2     0
(a_3, a_2, a_1)    (−, −)    0     1

Probability distribution: f(0, 1) = 2/6, f(2, 0) = 4/6, f(r_1, r_2) = 0 otherwise


level cannot be obtained without resorting to a randomized decision rule.

Now consider the addition of a fourth number a_4, larger than all the others. For each of the 6 arrangements in S_3, a_4 can be inserted in four different places. In the particular arrangement (a_1, a_2, a_3), for example, insertion of a_4 at the extreme left or between a_2 and a_3 would leave r_2 unchanged but add a run of length 1. If a_4 is placed at the extreme right, the run of length 2 is increased to a run of length 3. If a_4 is inserted between a_1 and a_2, the one run of length 2 is split into three runs, each of length 1.

Extending this analysis to the general case, the extra observation must either split an existing run, lengthen an existing run, or introduce a new run of length 1. The ways in which the run lengths in S_{n−1} are affected by the insertion of an additional observation a_n to make an arrangement S_n can be classified into the following four mutually exclusive and exhaustive cases:

1. An additional run of length 1 can be added in the arrangement S_n.
2. A run of length i − 1 in S_{n−1} can be changed into a run of length i in S_n, for i = 2, 3, ..., n − 1.
3. A run of length h = 2i in S_{n−1} can be split into a run of length i, followed by a run of length 1, followed by a run of length i, for 1 ≤ i ≤ [(n − 2)/2], where [x] denotes the largest integer not exceeding x.
4. A run of length h = i + j in S_{n−1} can be split up into:
   a. A run of length i, followed by a run of length 1, followed by a run of length j
   b. A run of length j, followed by a run of length 1, followed by a run of length i
   where h > i > j ≥ 1 and 3 ≤ h ≤ n − 2.

For n = 4, the arrangements can be enumerated systematically in a table like Table 4.1 to show how these cases arise. Table 4.2 gives a partial listing. When the table is completed, the number of cases which result in any particular set (r_1, r_2, r_3) can be counted and divided by 4! = 24 to obtain the complete probability distribution. This will be left as an exercise for the reader. The results are

(r_1, r_2, r_3):     (1, 1, 0)   (3, 0, 0)   (0, 0, 1)   Other values
f(r_1, r_2, r_3):    12/24       10/24       2/24        0
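The n = 4 distribution can be confirmed by enumerating all 24 permutations directly; a quick sketch of ours (not code from the text):

```python
from itertools import permutations, groupby

def run_length_config(perm):
    """Tuple (r1, ..., r_{n-1}) of counts of runs up/down of each length."""
    signs = [1 if b > a else -1 for a, b in zip(perm, perm[1:])]
    counts = [0] * len(signs)                 # slot i-1 holds runs of length i
    for _, grp in groupby(signs):
        counts[len(list(grp)) - 1] += 1
    return tuple(counts)

freq = {}
for p in permutations(range(4)):              # the 4! = 24 arrangements
    cfg = run_length_config(p)
    freq[cfg] = freq.get(cfg, 0) + 1
print(freq[(1, 1, 0)], freq[(3, 0, 0)], freq[(0, 0, 1)])   # 12 10 2
```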

92 CHAPTER 3


There is no illustration for case 4 in the completed table of enumerated arrangements since here n is not large enough to permit h >= 3. For n = 5, insertion of a5 in the second position of a1, a2, a3, a4 produces the sequence a1, a5, a2, a3, a4. The one run of length 3 has been split into a run of length 1, followed by another run of length 1, followed by a run of length 2, illustrating case 4b with h = 3, j = 1, i = 2. Similarly, case 4a is illustrated by inserting a5 in the third position. This also illustrates case 3.

More generally, the frequency u_n of cases in S_n having exactly r1 runs of length 1, r2 runs of length 2, ..., r_{n-1} runs of length n - 1 can be generated from the frequencies for S_{n-1} by the following recursive relation:

u_n(r_{n-1}, r_{n-2}, ..., r_h, ..., r_i, ..., r_j, ..., r_1)
    = 2 u_{n-1}(r_{n-2}, ..., r_1 - 1)
    + Σ_{i=2}^{n-1} (r_{i-1} + 1) u_{n-1}(r_{n-2}, ..., r_i - 1, r_{i-1} + 1, ..., r_1)
    + Σ_{i=1}^{[(n-2)/2]}, h = 2i: (r_h + 1) u_{n-1}(r_{n-2}, ..., r_h + 1, ..., r_i - 2, ..., r_1 - 1)
    + 2 Σ_{h = i + j; h > i > j >= 1; 3 <= h <= n-2} (r_h + 1) u_{n-1}(r_{n-2}, ..., r_h + 1, ..., r_i - 1, ..., r_j - 1, ..., r_1 - 1)

The terms in this sum represent cases 1 to 4 in that order. For case 1, u_{n-1} is multiplied by 2 since for every arrangement in S_{n-1}

Table 4.2 Partial listing of arrangements when n = 4

S3            (r1, r2)   S4               (r1, r2, r3)   Case illustrated
(a1, a2, a3)  (0, 1)     (a4, a1, a2, a3)  (1, 1, 0)      1
                         (a1, a4, a2, a3)  (3, 0, 0)      3
                         (a1, a2, a4, a3)  (1, 1, 0)      1
                         (a1, a2, a3, a4)  (0, 0, 1)      2
(a1, a3, a2)  (2, 0)     (a4, a1, a3, a2)  (3, 0, 0)      1
                         (a1, a4, a3, a2)  (1, 1, 0)      2
                         (a1, a3, a4, a2)  (1, 1, 0)      2
                         (a1, a3, a2, a4)  (3, 0, 0)      1


there are exactly two places in which a_n can be inserted to add a run of length 1. These positions are always at an end or next to the end. If the first run is a run up (down), insertion of a_n at the extreme left (next to the extreme left) position adds a run of length 1. A new run of length 1 is also created by inserting a_n at the extreme right (next to the extreme right) in S_{n-1} if the last run in S_{n-1} was a run down (up). In case 4, we multiply by 2 because of the (a) and (b) possibilities.

The result is tricky and tedious to use but is much easier than direct enumeration. The process will be illustrated for n = 5, given

u4(1, 0, 0) = 2     u4(0, 1, 1) = 12     u4(0, 0, 3) = 10     u4(r3, r2, r1) = 0 otherwise

Using the recursive relation, we have

u5(r4, r3, r2, r1) = 2u4(r3, r2, r1 - 1) + (r1 + 1)u4(r3, r2 - 1, r1 + 1)
    + (r2 + 1)u4(r3 - 1, r2 + 1, r1) + (r3 + 1)u4(r3 + 1, r2, r1)
    + (r2 + 1)u4(r3, r2 + 1, r1 - 3) + 2(r3 + 1)u4(r3 + 1, r2 - 1, r1 - 2)

where any term whose arguments do not form a possible configuration is zero. For example,

u5(0, 0, 1, 2) = 2u4(0, 1, 1) + 3u4(0, 0, 3) + 2u4(1, 0, 0) = 24 + 30 + 4 = 58

Proceeding in the same way for every configuration gives

u5(1, 0, 0, 0) = 2     u5(0, 1, 0, 1) = 16     u5(0, 0, 2, 0) = 12
u5(0, 0, 1, 2) = 58    u5(0, 0, 0, 4) = 32     u5(r4, r3, r2, r1) = 0 otherwise
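For small n, these frequencies can be checked by brute-force enumeration of all n! equally likely arrangements. The following Python sketch (our own illustration; the function names are not from the text) tallies the run-length configuration of every permutation:

```python
from collections import Counter
from itertools import permutations

def run_length_counts(seq):
    """Return (r1, ..., r_{n-1}), where r_i is the number of runs of
    length i in the sequence of signs of successive differences."""
    n = len(seq)
    signs = [1 if b > a else -1 for a, b in zip(seq, seq[1:])]
    r = [0] * (n - 1)
    length = 1
    for prev, cur in zip(signs, signs[1:]):
        if cur == prev:
            length += 1          # current run continues
        else:
            r[length - 1] += 1   # a run of this length just ended
            length = 1
    r[length - 1] += 1           # close the final run
    return tuple(r)

def run_length_distribution(n):
    """Tally u_n over all n! equally likely orderings of n distinct values."""
    return Counter(run_length_counts(p) for p in permutations(range(n)))
```

The tuples here list r1 first. For n = 4 this reproduces the frequencies 2, 12, and 10 for the configurations with one run of length 3, one run each of lengths 1 and 2, and three runs of length 1, and it can likewise be used to verify the frequencies for n = 5.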

The means, variances, and covariances of the numbers of runs of length t (or more) are found in Levene and Wolfowitz (1944). Tables of the exact probabilities of at least r runs of length t or more are given in Olmstead (1946) and Owen (1962) for n <= 14, from which appropriate critical regions can be found. Olmstead gives approximate probabilities for larger sample sizes. See Wolfowitz (1944a,b) regarding the asymptotic distribution, which is Poisson.


A test for randomness can also be based on the total number of runs, whether up or down, irrespective of their lengths. Since the total number of runs R is related to the R_i, the number of runs of length i, by

R = Σ_{i=1}^{n-1} R_i

the same recursive relation can be used to find the probability distribution of R. Levene (1952) showed that the asymptotic distribution of the standardized random variable R, with mean (2n - 1)/3 and variance (16n - 29)/90, is the standard normal distribution.

APPLICATIONS

For the test of randomness based on the total number of runs up and down, R, in an ordered sequence of n numerical observations, or equivalently a sequence of n - 1 plus and minus signs, the appropriate sign is determined by comparing the magnitude of each observation with the one immediately preceding it in the sequence. The appropriate rejection regions for each alternative are exactly the same as for the earlier test based on the total number of runs of two types of elements. Specifically, if the alternative is a tendency for like signs to cluster, the appropriate rejection region is small values of R. If the alternative is a tendency for like signs to mix with unlike signs, the appropriate rejection region is large values of R.

The exact distribution of R under the null hypothesis of randomness is given in Table E for n <= 25 as left-tail probabilities for R small and right-tail probabilities for R large. For larger n, the critical values of R can be found from the normal approximation to the null distribution of the number of runs up and down statistic. If we incorporate a continuity correction of 0.5, the left-tail and right-tail critical regions are

[R + 0.5 - (2n - 1)/3] / √[(16n - 29)/90] <= -z_α    and    [R - 0.5 - (2n - 1)/3] / √[(16n - 29)/90] >= z_α

The two-tailed critical region is a combination of the above with z_α replaced by z_{α/2}. One of the primary uses of the runs up and down test is for analysis of time series data. The null hypothesis of randomness is then interpreted as meaning the data can be regarded as independent and identically distributed. The alternative of a tendency to cluster is interpreted as an upward trend if the signs are


predominantly plus, or a downward trend if the signs are predominantly minus, and the alternative of a tendency to mix is interpreted as cyclical variations. The total number of runs test could also be used with time series data if the data are first converted to two types of symbols by comparing the magnitude of each one to some standard for that period or to a single focal point, like the median of the data. The test in this situation is frequently referred to as the runs above and below the median test. The test in the former case was illustrated by an earlier example.
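The count R and the continuity-corrected normal statistic described above can be sketched in a few lines of Python (the function names are ours, not the text's):

```python
import math

def runs_up_down(seq):
    """Total number of runs up and down in a sequence with no ties."""
    signs = [1 if b > a else -1 for a, b in zip(seq, seq[1:])]
    return 1 + sum(s != t for s, t in zip(signs, signs[1:]))

def runs_z(r, n, tail="left"):
    """Continuity-corrected standard normal statistic for R = r runs among
    n observations; compare with -z_alpha (left tail) or z_alpha (right tail)."""
    mean = (2 * n - 1) / 3
    sd = math.sqrt((16 * n - 29) / 90)
    correction = 0.5 if tail == "left" else -0.5
    return (r + correction - mean) / sd
```

A monotone sequence gives R = 1, the smallest possible value, while a perfectly alternating sequence gives the largest value R = n - 1.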

Example 4.1 illustrates an application of runs up and down to time series data.

Example 4.1 Tourism is regarded by all nations as big business because the industry brings needed foreign exchange and helps the balance of payments. The Travel Market Yearbook publishes extensive data on tourism. Analyze the annual data on the total number of tourists to the United States to see if there is evidence of a trend, using the 0.05 level.

Solution We use the runs up and down test of randomness for these n = 13 observations with the alternative of a trend. The sequence of plus and minus signs is obtained by comparing each observation with the one preceding it; the signs are predominantly plus, and the observed total number of runs R is small. The left-tail critical value from Table E is the largest value whose exact level does not exceed 0.05. Since the observed R does not exceed this critical value, we reject the null hypothesis and conclude there is a trend in the number of tourists to the United States, and it is positive because the signs are predominantly plus.

Year     Number of tourists (millions)


3.5 A TEST BASED ON RANKS

Another way to test for randomness by comparing the magnitude of each element with that of the immediately preceding element in the time sequence is to compute the sum of the squares of the deviations of the pairs of successive elements. If the magnitudes of these elements are replaced by their respective ranks in the sequence before computing the sum of squares of successive deviations, we obtain a nonparametric test.

Specifically, let the time-ordered sequence of observations be S_n: X_1, X_2, ..., X_n, as in Section 3.4. The test statistic

NM = Σ_{i=1}^{n-1} [rank(X_{i+1}) - rank(X_i)]²        (5.1)

was proposed by Bartels (1982). A test based on a function of this statistic is the rank version of the ratio test for randomness developed by von Neumann using normal theory, and is a linear transformation of the rank serial correlation coefficient introduced by Wald and Wolfowitz (1943).

It is easy to show that the test statistic NM attains its minimum value n - 1 for a monotone arrangement, and that its maximum value is close to n(n² - 1)/3, the exact maximum differing slightly according to whether n is even or odd. The exact null distribution of NM can be found by enumeration and is given in Bartels (1982) for 4 <= n <= 10. For larger sample sizes, the test statistic

RVN = Σ_{i=1}^{n-1} [rank(X_{i+1}) - rank(X_i)]² / Σ_{i=1}^{n} [rank(X_i) - (n + 1)/2]²        (5.2)

is asymptotically normally distributed with mean 2 and variance 4(n - 2)(5n² - 2n - 9)/[5n(n + 1)(n - 1)²], which is approximately equal to 4/n. If there are no ties, the denominator of RVN is equal to the constant n(n² - 1)/12.
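Assuming no ties, the RVN ratio can be sketched as follows (a hypothetical helper, not from the text):

```python
def rank_von_neumann(seq):
    """Return RVN = NM / [n(n^2 - 1)/12] for an untied sequence."""
    n = len(seq)
    order = sorted(range(n), key=lambda i: seq[i])
    rank = [0] * n
    for r, i in enumerate(order, start=1):
        rank[i] = r                      # rank of each observation in time order
    nm = sum((rank[i + 1] - rank[i]) ** 2 for i in range(n - 1))
    return nm / (n * (n * n - 1) / 12)   # RVN has mean 2 under randomness
```

A monotone sequence gives the minimum NM = n - 1 and hence a very small RVN, signaling trend, while an alternating sequence gives a value well above 2.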

Since a trend in either direction will be reflected by a small value of NM, and therefore of RVN, the appropriate rejection region to test randomness against a trend alternative is small values of NM or RVN. If the alternative is a tendency for the data to alternate small and large values, the appropriate rejection region is large values of the test statistic. Table S of the Appendix gives exact tail probabilities for selected values of NM for n <= 10 and approximate left-tail critical values of RVN (based on the beta distribution) for larger sample sizes. Corresponding right-tail critical values are


found using the fact that this beta approximation is symmetric about 2.

Bartels (1982) used simulation studies to show that this test is superior to the runs up and down test in many cases. Its asymptotic relative efficiency is 0.91 with respect to the ordinary serial correlation coefficient against the alternative of first-order autocorrelation under normality. Autocorrelation is defined as a measure of the dependence between observations that occur a fixed number of time units apart. Positive autocorrelation shows up in time series data that exhibit a trend in either direction, while negative autocorrelation is indicated by fluctuations over time.

Example 5.1 We illustrate this rank von Neumann test using the data from Example 4.1 with n = 13, where the alternative is a positive trend. We first rank the number of tourists from smallest to largest. The value of NM, the numerator of the RVN statistic, is then found from (5.1) as the sum of the squared differences between successive ranks, and the denominator is 13(13² - 1)/12 = 182. The resulting value of RVN is small, and it falls below the left-tail critical value based on the beta approximation, so we reject the null hypothesis of randomness and conclude that there is a significant positive trend. We also use these data to illustrate the test based on the normal approximation to the distribution of RVN. The mean is 2 and the variance for n = 13 is approximately 4/13 = 0.31; the resulting standard normal test statistic falls well inside the lower-tail rejection region found from Table A, so we again conclude that there is a significant positive trend.

3.6 SUMMARY

In this chapter we presented a number of tests that are appropriate for testing the null hypothesis of randomness in a sequence of observations whose order has some meaning. If the observations are simply one of two types of symbols, the total number of runs of symbols is the most appropriate test statistic. Tests based on the lengths of the runs are primarily of theoretical interest. If the


observations are numerical measurements, the number of runs up and down or the rank von Neumann (RVN) statistic provides the best test, because too much information is lost by using the test based on runs above and below some fixed value like the median.

Usually the alternative hypothesis is simply lack of randomness, and then these tests have no analog in parametric statistics. If the data represent a time series, the alternative to randomness can be the existence of a trend. In this case the RVN test is more powerful than the test based on runs up and down. Two additional tests for trend will be presented in a later chapter.

PROBLEMS

3.1. Prove the corollary using a direct combinatorial argument based on the lemma.
3.2. Find the mean and variance of the number of runs R1 of type 1 elements, using the probability distribution of R1. Since E(R) = E(R1) + E(R2), use your result to verify the expression for E(R).
3.3. Use the lemmas to evaluate the sums involved, obtaining the stated result for E(R²).
3.4. Show that the asymptotic distribution of the standardized random variable [R1 - E(R1)]/σ(R1) is the standard normal distribution, using the distribution of R1 and your answer to Problem 3.2.
3.5. Verify that the asymptotic distribution of the standardized total number of runs R is the standard normal distribution.
3.6. By considering the ratios f_R(r)/f_R(r - 2) and f_R(r + 2)/f_R(r), where r is an even positive integer and f_R(r) is the null distribution of the total number of runs, show that if the most probable number of runs is an even integer k, then k satisfies the inequality

2n1n2/(n1 + n2) < k <= 2n1n2/(n1 + n2) + 2

3.7. Show that the probability that a sequence of n1 elements of type 1 and n2 elements of type 2 begins with a type 1 run of length exactly k is

n1^{(k)} n2 / n^{(k+1)}

where n^{(r)} = n!/(n - r)!.
3.8. Find the rejection region with significance level not exceeding 0.05 for a test of randomness based on the length of the longest run when n1 = n2 = 6.
3.9. Find the complete probability distribution of the numbers of runs up and down of various lengths when n = 6, using the recursive relation and the results given for u5(r4, r3, r2, r1).
3.10. Use your answers to Problem 3.9 to obtain the complete probability distribution of the total number of runs up and down when n = 6.


3.11. Verify the statement that the variance of the RVN test statistic is approximately equal to 4/n.
3.12. Analyze the data in Example 4.1 for evidence of trend using the total number of runs above and below

(a) The sample median
(b) The sample mean

3.13. A certain broker noted the following number of bonds sold each month over a 12-month period:

(a) Use the runs up and down test to see if these data show a directional trend and make an appropriate conclusion at the 0.05 level.
(b) Use the runs above and below the sample median test to see if these data show a trend and make an appropriate conclusion at the 0.05 level.
(c) Compare the conclusions reached in (a) and (b) and give an explanation for the difference.
3.14. The following are time lapses in minutes between eruptions of Old Faithful geyser in Yellowstone National Park, recorded between the morning and evening hours on a certain day, and measured from the beginning of one eruption to the beginning of the next:


A researcher wants to use these data for inference purposes but is concerned about whether it is reasonable to treat such data as a random sample. What do you think? Justify your answer.
3.15. In a psychological experiment, the research question of interest is whether a rat ``learned'' its way through a maze over repeated trials. Suppose the time-ordered observations on the number of correct choices made by the rat on each trial are as follows:


(a) Test these data for randomness against the alternative of a tendency to cluster, using the dichotomizing criterion that a low number of correct choices indicates no learning, while a high number of correct choices indicates learning.

(b) Would the runs up and down test be appropriate for these data? Why or why not?
3.16. The data below represent the departure of the actual daily temperature in degrees Fahrenheit from the normal daily temperature at noon at a certain airport on seven consecutive days.

Day     Departure


(a) Give an appropriate P value that reflects whether the pattern of positive and negative departures can be considered a random process or exhibits a tendency to cluster.

(b) Give an appropriate P value that reflects whether the pattern of successive departures (from one day to the next) can be considered a random process or exhibits a trend for these seven days.
3.17. The three graphs in Figure 1 (see below) illustrate some kinds of nonrandom patterns. Time is on the horizontal axis. The data values are indicated by dots, and the horizontal line denotes the median of the data. For each graph, compute the one-tailed P value for nonrandomness using two different nonparametric techniques.
3.18. Bartels (1982) illustrated the rank von Neumann test for randomness using data on annual changes in stock levels of corporate trading enterprises in Australia over ten consecutive fiscal years, with the values (in $A million) deflated by an Australian price index. He tested randomness

against the alternative of autocorrelation. Random stock level changes occur when

Fig. 1 Nonrandom patterns representing (a) cyclical movement, (b) trend movement,(c) clustering.


companies are well managed because future demands are accurately anticipated. ``Negative autocorrelation constitutes evidence for a tendency to overreact to shortfalls or excesses in stock levels, whereas positive autocorrelation suggests there is a long delay in reaching desired stock levels.'' The RVN test statistic is not significant. Compare this result with that of (a) the runs up and down test and (b) the test based on runs above and below the sample median.


4
Tests of Goodness of Fit

4.1 INTRODUCTION

An important problem in statistics relates to obtaining informationabout the form of the population from which a sample is drawn. Theshape of this distribution might be the focus of the investigation. Alternatively, some inference concerning a particular aspect of thepopulation may be of primary interest. In this latter case, in classicalstatistics, information about the form generally must be postulated orincorporated in the null hypothesis to perform an exact parametrictype of inference. For example, suppose we have a small number of observations from an unknown population with unknown variance andthe hypothesis of interest concerns the value of the population mean.

The traditional parametric test, based on Student's t distribution, is derived under the assumption of a normal population. The exact distribution theory and probabilities of both types of errors depend on this population form. Therefore it might be desirable to check on the


In order to apply the chi-square test in this situation, the sample data must first be grouped according to some scheme in order to form a frequency distribution. In the case of count or qualitative data, where the hypothesized distribution would be discrete, the categories would be the relevant verbal or numerical classifications. For example, in tossing a die, the categories would be the numbers of spots; in tossing a coin, the categories would be the numbers of heads; in surveys of brand preferences, the categories would be the brand names considered. When the sample observations are quantitative, the categories would be numerical classes chosen by the experimenter. In this case, the frequency distribution is not unique and some information is necessarily lost. Even though the hypothesized distribution is most likely continuous with measurement data, the data must be categorized for analysis by the chi-square test.

When the population distribution is completely specified by the null hypothesis, one can calculate the probability that a random observation will be classified into each of the chosen or fixed categories. These probabilities multiplied by n give the frequencies for each category which would be expected if the null hypothesis were true. Except for sampling variation, there should be close agreement between these expected and observed frequencies if the sample data are compatible with the specified F(x). The corresponding observed and expected frequencies can be compared visually using a histogram, a frequency polygon, or a bar chart. The chi-square goodness-of-fit test provides a probability basis for effecting the comparison and deciding

whether the lack of agreement is too great to have occurred by chance. Assume that the n observations have been grouped into k mutually exclusive categories, and denote the observed and expected frequencies for the ith class by f_i and e_i, respectively, i = 1, 2, ..., k. The decision regarding fit is to be based on the deviations f_i - e_i. The sum of these k deviations is zero except for rounding. The test criterion suggested by Pearson (1900) is the sum of squares of these deviations, normalized by the expected frequencies, or

Q = Σ_{i=1}^{k} (f_i - e_i)²/e_i        (2.1)

A large value of Q would reflect an incompatibility between the observed and expected relative frequencies, and therefore the null hypothesis used to calculate the e_i should be rejected for Q large.
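Once the expected frequencies are in hand, Q is straightforward to compute; a minimal sketch (the function name is ours):

```python
def chi_square_q(observed, expected):
    """Pearson goodness-of-fit criterion Q = sum of (f_i - e_i)^2 / e_i."""
    return sum((f - e) ** 2 / e for f, e in zip(observed, expected))
```

For example, observed frequencies (18, 22, 20) against equal expected frequencies of 20 give Q = 0.2 + 0.2 + 0.0 = 0.4.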

The exact probability distribution of the random variable Q is quite complicated, but for large samples its distribution is


approximately chi square with k - 1 degrees of freedom, given as Table B of the Appendix. The theoretical basis for this can be argued briefly as follows.

The only random variables of concern are the class frequencies F_1, F_2, ..., F_k, which constitute a set of random variables from the k-variate multinomial distribution with k possible outcomes, the ith outcome being the ith category in the classification system. With θ_1, θ_2, ..., θ_k denoting the probabilities of the respective outcomes and f_1, f_2, ..., f_k denoting the observed outcomes, the likelihood function of the sample then is

L(θ_1, θ_2, ..., θ_k) = Π_{i=1}^{k} θ_i^{f_i}     f_i = 0, 1, ..., n;   Σ_{i=1}^{k} f_i = n;   Σ_{i=1}^{k} θ_i = 1

The null hypothesis was assumed to specify the population distribution completely, from which the θ_i can be calculated. This hypothesis then is actually concerned only with the values of these parameters and can be equivalently stated as

H_0: θ_i = θ_{i0} = e_i/n    for i = 1, 2, ..., k

It is easily shown that the maximum-likelihood estimates of the parameters are θ̂_i = f_i/n. The likelihood-ratio statistic for this hypothesis then is

T = L(ω̂)/L(Ω̂) = L(θ_{10}, θ_{20}, ..., θ_{k0}) / L(θ̂_1, θ̂_2, ..., θ̂_k) = Π_{i=1}^{k} (θ_{i0}/θ̂_i)^{f_i}

As was stated in Chapter 1, the distribution of the random variable -2 ln T can be approximated by the chi-square distribution. The degrees of freedom are k - 1, since the restriction Σ_{i=1}^{k} θ_i = 1 leaves only k - 1 parameters in Ω to be estimated independently. We have here

-2 ln T = -2 Σ_{i=1}^{k} f_i [ln θ_{i0} - ln(f_i/n)]

Some statisticians advocate using -2 ln T itself as a test criterion for goodness of fit. We shall now show that it is asymptotically equivalent to the expression for Q given in (2.1). The Taylor series expansion of ln θ_{i0} about f_i/n = θ̂_i is


ln θ_{i0} = ln θ̂_i + (θ_{i0} - θ̂_i)/θ̂_i - (θ_{i0} - θ̂_i)²/(2θ̂_i²) + ε

so that

ln θ_{i0} - ln(f_i/n) = (θ_{i0} - f_i/n)(n/f_i) - (θ_{i0} - f_i/n)² n²/(2f_i²) + ε
                      = (nθ_{i0} - f_i)/f_i - (nθ_{i0} - f_i)²/(2f_i²) + ε

where ε represents a sum of terms alternating in sign,

ε = Σ_{j=3}^{∞} (-1)^{j-1} (θ_{i0} - f_i/n)^j n^j / (j f_i^j)

Substituting this expansion in -2 ln T, we have

-2 ln T = -2 Σ_{i=1}^{k} (nθ_{i0} - f_i) + Σ_{i=1}^{k} (nθ_{i0} - f_i)²/f_i - 2 Σ_{i=1}^{k} ε′
        = Σ_{i=1}^{k} (f_i - e_i)²/f_i + ε″

since Σ_{i=1}^{k} (nθ_{i0} - f_i) = n - n = 0.

By the law of large numbers, F_i/n is known to be a consistent estimator of θ_{i0} under H_0, that is,

lim_{n→∞} P(|F_i/n - θ_{i0}| > ε) = 0    for every ε > 0

Thus we see that the probability distribution of Q converges to that of -2 ln T, which is chi square with k - 1 degrees of freedom. An approximate α-level test then is obtained by rejecting H_0 when Q exceeds the (1 - α)th quantile of the chi-square distribution, denoted by χ²_{k-1,α}. This approximation can be used with confidence as long as every expected frequency is at least equal to 5. For any e_i smaller than 5, the usual procedure is to combine adjacent groups in the frequency distribution until this restriction is satisfied. The number of degrees of freedom then must be reduced to correspond to the actual number of categories used in the analysis. This rule of 5 should not be considered inflexible, however. It is conservative, and the chi-square approximation is often reasonably accurate for expected cell frequencies as small as 1.5.


Any case where the θ_i are completely specified by the null hypothesis is thus easily handled. The more typical situation, however, is where the null hypothesis is composite, i.e., it states the form of the distribution but not all the relevant parameters. For example, when we wish to test whether a sample is drawn from some normal population, μ and σ would not be given. However, in order to calculate the expected frequencies under H_0, μ and σ must be known. If the expected frequencies are estimated from the data as nθ̂_i for i = 1, 2, ..., k, the random variable for the goodness-of-fit test in (2.1) becomes

Q = Σ_{i=1}^{k} (F_i - nθ̂_i)² / (nθ̂_i)

The asymptotic distribution of Q then may depend on the method employed for estimation. When the estimates are found by the method of maximum likelihood for the grouped data, the L(ω̂) in the likelihood-ratio test statistic is L(θ̂_1, θ̂_2, ..., θ̂_k), where the θ̂_i are the MLEs of the θ_i under H_0. The derivation of the distribution of T, and therefore of Q, goes through exactly as before except that the dimension of the space ω is increased. The degrees of freedom for Q then are k - 1 - s, where s is the number of independent parameters in F(x) which had to be estimated from the grouped data in order to estimate all the θ_i. In the normal goodness-of-fit test, for example, the μ and σ parameter estimates would be calculated from the grouped data and used with tables of the normal distribution to find the nθ̂_i, and the degrees of freedom for k categories would be k - 3. When the original data are ungrouped and the MLEs are based on the likelihood function of all the observations, the theory is different. Chernoff and Lehmann (1954) have shown that the limiting distribution of Q is not the chi square in this case and that P(Q > χ²_α) > α. The test is then anticonservative. Their investigation showed that the error is considerably more serious for the normal distribution than the Poisson. A possible adjustment is discussed in their paper. In practice, however, the statistic is often treated as a chi-square variable anyway.

Example 2.1 A quality control engineer has taken 50 samples of size 13 each from a production process. The numbers of defectives for these samples are recorded below. Test the null hypothesis at level 0.05 that the number of defectives follows


(a) The Poisson distribution
(b) The binomial distribution

Solution Since the data are grouped and both of the hypothesized null distributions are discrete, the chi-square goodness-of-fit test is appropriate. Since no parameters are specified, they must be estimated from the data in order to carry out the test in both (a) and (b).

Solution to (a) The Poisson distribution is f(x) = e^{-m} m^x/x! for x = 0, 1, 2, ..., where m here is the mean number of defectives in a sample of size 13. The maximum-likelihood estimate of m is the observed mean number of defectives in the 50 samples, that is,

m̂ = 65/50 = 1.3

We use this value in f(x) to estimate the probabilities θ̂_i and to compute e_i = 50θ̂_i. The calculations are shown in Table 2.1. Notice that the final θ̂ is not for exactly 5 defects but rather for 5 or more; this is necessary to make Σθ̂ = 1. The final e is less than one, so it is combined with the category above before calculating Q.

Number of defects:   0   1   2   3   4   5 or more
Number of samples:  10  24  10   4   1   1

Table 2.1 Calculation of Q for Example 2.1(a)

Defects      f     θ̂        e = 50θ̂     (f - e)²/e
0           10    0.2725    13.627      0.965
1           24    0.3543    17.715      2.230
2           10    0.2303    11.515      0.199
3            4    0.0998     4.990      0.196
4            1    0.0324     1.622 }  pooled: 2.155      0.011
5 or more    1    0.0107     0.533 }
                                        Q = 3.601


Q = 3.6010. With 5 cells after combining and 1 parameter estimated, the degrees of freedom are 5 − 1 − 1 = 3, and the 0.05-level critical value is χ²(3, 0.05) = 7.81. Since Q = 3.6010 < 7.81, the null hypothesis that the distribution of defectives is Poisson is not rejected. The exact right-tail probability from EXCEL is P(Q ≥ 3.601) = 0.3078, which is between 0.25 and 0.50.

Solution to (b)  Here each sample consists of 13 items, so that 50(13) = 650 items were inspected in all, and the binomial distribution has parameters 13 and θ. The maximum-likelihood estimate of θ is the overall proportion of defectives, θ̂ = 65/650 = 0.1. We use θ̂ = 0.1 in the binomial formula to estimate the probabilities y and to compute e = 50y, as shown in Table 2.2. Again the last two cells are combined because the final e is less than one. The result is Q = 2.9680 with 5 − 1 − 1 = 3 degrees of freedom; since 2.9680 < 7.81 = χ²(3, 0.05), the binomial hypothesis is not rejected either. The exact right-tail probability from EXCEL is 0.3966.

Table 2.2 Calculation of Q for Example 2.1(b)

x            f      y        e        (f − e)²/e
0           10    0.2542   12.710     0.5778
1           24    0.3671   18.355     1.7361
2           10    0.2448   12.240     0.4099
3            4    0.0997    4.986     0.1950
4            1    0.0277    1.385  }
5 or more    1    0.0065    0.325  }  0.0492
                                   Q = 2.9680
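As an aside (this sketch is an editorial illustration, not part of the original text), the arithmetic of Example 2.1(a) is easy to reproduce in a few lines of Python. The function names are invented here, and the hand step of combining the small final expected frequency is assumed to have been done before the call:

```python
import math

def poisson_pmf(k, m):
    """P(X = k) for a Poisson variable with mean m."""
    return math.exp(-m) * m ** k / math.factorial(k)

def poisson_gof_q(counts, m):
    """Chi-square goodness-of-fit statistic Q = sum (f - e)^2 / e for
    grouped counts of 0, 1, 2, ... events against a Poisson(m) null.
    counts[k] is the observed frequency of exactly k events, except
    that the last cell is open ('k or more'), so its probability is
    the upper tail and the hypothesized probabilities sum to 1."""
    n = sum(counts)
    probs = [poisson_pmf(k, m) for k in range(len(counts) - 1)]
    probs.append(1.0 - sum(probs))      # upper tail for the open cell
    expected = [n * p for p in probs]
    return sum((f - e) ** 2 / e for f, e in zip(counts, expected))

# Counts from Example 2.1 with the last two cells already combined
# (f = 10, 24, 10, 4, 2) and the ML estimate m-hat = 65/50 = 1.3:
q = poisson_gof_q([10, 24, 10, 4, 2], 1.3)
```

With these inputs the function returns Q ≈ 3.60, agreeing with the value in the text up to the rounding of intermediate steps.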


The STATXACT solution to Example 2.1 gives the same values of Q.

4.3 THE KOLMOGOROV-SMIRNOV ONE-SAMPLE STATISTIC

A goodness-of-fit test can be based on any measure of the distance between the empirical distribution function S_n(x) of Section 2.3 and the hypothesized cumulative distribution, for example sums of


squares, or absolute values, or the maximum deviation, to name only a few. The best-known test is based on the Kolmogorov-Smirnov one-sample statistic, which will be covered in this section.

The Kolmogorov-Smirnov one-sample statistic is based on the differences between the hypothesized cumulative distribution function F_X(x) and the empirical distribution function of the sample, S_n(x), for all x. The empirical distribution function of the sample was defined in Chapter 2 as S_n(x), the proportion of sample observations that are less than or equal to x, for all real numbers x. We showed there that S_n(x) provides a consistent point estimator of the true distribution F_X(x). Further, by the Glivenko-Cantelli theorem in Chapter 2, we know that as n increases, the step function S_n(x), with jumps occurring at the values of the order statistics X_(1), X_(2), ..., X_(n) of the sample, approaches the true distribution function F_X(x) for all x. Therefore, for large n, the deviations between the true function and its statistical image, |S_n(x) − F_X(x)|, should be small for all values of x. This result suggests that the statistic

    D_n = sup_x |S_n(x) − F_X(x)|

is, for any n, a reasonable measure of the accuracy of our estimate.

This D_n statistic, called the Kolmogorov-Smirnov one-sample statistic, is particularly useful in nonparametric statistical inference because the probability distribution of D_n does not depend on F_X(x) as long as F_X is continuous. Therefore D_n is called a distribution-free statistic.

The directional deviations defined as

    D_n^+ = sup_x [S_n(x) − F_X(x)]     and     D_n^− = sup_x [F_X(x) − S_n(x)]

are called the one-sided Kolmogorov-Smirnov statistics. These measures are also distribution-free, as proved in the following theorem.

Theorem 3.1  The statistics D_n, D_n^+, and D_n^− are completely distribution-free for any continuous F_X.

Proof  Clearly

    D_n = sup_x |S_n(x) − F_X(x)| = max(D_n^+, D_n^−)

Defining the additional order statistics X_(0) = −∞ and X_(n+1) = +∞, we can write

    S_n(x) = i/n     for X_(i) ≤ x < X_(i+1),  i = 0, 1, ..., n


Therefore, we have

    D_n^+ = sup_x [S_n(x) − F_X(x)]
          = max_{0 ≤ i ≤ n}  sup_{X_(i) ≤ x < X_(i+1)} [S_n(x) − F_X(x)]
          = max_{0 ≤ i ≤ n}  sup_{X_(i) ≤ x < X_(i+1)} [i/n − F_X(x)]
          = max_{0 ≤ i ≤ n} [ i/n − inf_{X_(i) ≤ x < X_(i+1)} F_X(x) ]
          = max_{0 ≤ i ≤ n} [ i/n − F_X(X_(i)) ]
          = max{ max_{1 ≤ i ≤ n} [ i/n − F_X(X_(i)) ],  0 }

Similarly,

    D_n^− = max{ max_{1 ≤ i ≤ n} [ F_X(X_(i)) − (i − 1)/n ],  0 }

    D_n = max{ max_{1 ≤ i ≤ n} [ i/n − F_X(X_(i)) ],  max_{1 ≤ i ≤ n} [ F_X(X_(i)) − (i − 1)/n ],  0 }

The probability distributions of D_n, D_n^+, and D_n^− therefore depend only on the random variables F_X(X_(i)), i = 1, 2, ..., n. These are the order statistics from the uniform distribution on (0,1), regardless of the original F_X, as long as it is continuous, because of the probability-integral transformation discussed in Chapter 2. Thus D_n, D_n^+, and D_n^− have distributions that are independent of the particular F_X.

A simpler proof can be given by making the transformation u = F_X(x) in D_n, D_n^+, or D_n^−; this is left to the reader as an exercise. The proof above has the advantage of giving definitions of the Kolmogorov-Smirnov statistics in terms of order statistics.
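The transformation argument is easy to verify numerically. In the sketch below (my own illustration in Python, not from the text), the order-statistics form of D_n gives exactly the same value whether it is computed from an exponential sample against the exponential cdf or from the transformed values u = F_X(x) against the uniform(0,1) cdf:

```python
import math
import random

def ks_Dn(sample, cdf):
    """Two-sided Kolmogorov-Smirnov statistic via the order-statistics
    form: D_n = max_i max( i/n - F(x_(i)), F(x_(i)) - (i-1)/n )."""
    xs = sorted(sample)
    n = len(xs)
    return max(max(i / n - cdf(x), cdf(x) - (i - 1) / n)
               for i, x in enumerate(xs, start=1))

random.seed(1)
# Exponential(1) sample tested against its own continuous cdf ...
sample = [random.expovariate(1.0) for _ in range(50)]
d_exp = ks_Dn(sample, lambda x: 1.0 - math.exp(-x))

# ... equals the statistic for u = F_X(x) tested against the uniform
# cdf, illustrating that D_n is distribution-free.
u_sample = [1.0 - math.exp(-x) for x in sample]
d_unif = ks_Dn(u_sample, lambda u: u)
```

Because the transformation is monotone, the two computations evaluate identical quantities and the statistics coincide exactly.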

In order to use the Kolmogorov-Smirnov statistics for inference, their sampling distributions must be known. Since the distributions are independent of F_X, we can assume without loss of generality that F_X is the uniform distribution on (0,1). The derivation of the distribution of D_n is rather tedious; however, the approach below illustrates a number of properties of order statistics and is therefore included here. For an interesting alternative derivation, see Massey (1950).


Consider any two consecutive values of i. We must have, for any 0 ≤ i ≤ n − 1, both

    A_i:  { (2i − 1)/(2n) − v < x < (2i + 1)/(2n) + v   for X_(i) ≤ x ≤ X_(i+1) }

and

    A_{i+1}:  { (2i + 1)/(2n) − v < x < (2i + 3)/(2n) + v   for X_(i+1) ≤ x ≤ X_(i+2) }

Since X_(i+1) is the random variable common to both events and the common set of x values is (2i + 1)/(2n) − v < x < (2i + 1)/(2n) + v for v ≥ 0, the event A_i ∩ A_{i+1} for any 0 ≤ i ≤ n − 1 requires

    (2i + 1)/(2n) − v < X_(i+1) < (2i + 1)/(2n) + v

In other words,

    (2i − 1)/(2n) − v < x < (2i + 1)/(2n) + v   for X_(i) ≤ x ≤ X_(i+1), for all i = 0, 1, ..., n

if and only if

    (2i − 1)/(2n) − v < X_(i) < (2i − 1)/(2n) + v   for all i = 1, 2, ..., n and v ≥ 0

The joint probability distribution of the n order statistics is

    f_{X_(1), X_(2), ..., X_(n)}(x_1, x_2, ..., x_n) = n!   for 0 < x_1 < x_2 < ··· < x_n < 1

Putting all this together now, we have

    P[D_n < 1/(2n) + v]   for all −1/(2n) < v < (2n − 1)/(2n)

    = P[(2i − 1)/(2n) − v < X_(i) < (2i − 1)/(2n) + v for all i = 1, 2, ..., n]   for all 0 ≤ v < (2n − 1)/(2n)

    = P[1/(2n) − v < X_(1) < 1/(2n) + v,  3/(2n) − v < X_(2) < 3/(2n) + v,  ...,
        (2n − 1)/(2n) − v < X_(n) < (2n − 1)/(2n) + v]   for all 0 ≤ v < (2n − 1)/(2n)


For any given n and v, we can evaluate P[D_n < 1/(2n) + v], or use Table 1 of Birnbaum (1952). The inverse procedure is to find the number D_{n,α} such that P(D_n > D_{n,α}) = α. For a given n with α = 0.05, say, we find v such that

    P[D_n > 1/(2n) + v] = 0.05     or     P[D_n < 1/(2n) + v] = 0.95

and then set D_{n,0.05} = 1/(2n) + v. From the evaluation of the sampling distribution of D_n above, one of the two resulting equations for v has no solution in the permissible range, and the other yields the required value of v and hence D_{n,0.05}.

Numerical values of D_{n,α} are given in Table F of the Appendix for small n and selected tail probabilities α, and approximate values are given for larger n. More extensive tables are given in Dunstan, Nix, and Reynolds.

For large samples, Kolmogorov (1933) derived the following convenient approximation to the sampling distribution of D_n, and Smirnov (1939) gave a simpler proof. The result is given here without proof.

Theorem 3.3  If F_X is any continuous distribution function, then for every d > 0,

    lim_{n→∞} P(D_n ≤ d/√n) = L(d)

where

    L(d) = 1 − 2 Σ_{i=1}^{∞} (−1)^{i−1} e^{−2i²d²}

The function L(d) has been tabulated by Smirnov (1948). Some of the results for the asymptotic approximation D_{n,α} = d_α/√n are

    P(D_n > d_α/√n):   0.20   0.10   0.05   0.02   0.01
    d_α:               1.07   1.22   1.36   1.52   1.63


The approximation has been found to be close enough for practical application as long as n exceeds 35. A comparison of exact and asymptotic values of D_{n,α} for α = 0.01 and 0.05 is given in Table 3.1.
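The limiting function L(d) is straightforward to evaluate by truncating the series, and the asymptotic critical values d_α can then be recovered numerically. The sketch below (an added illustration; the function names are mine) inverts L by bisection; for instance L(1.36) ≈ 0.95, consistent with the tabulated d_0.05 = 1.36:

```python
import math

def ks_limit_cdf(d, terms=500):
    """Kolmogorov's limiting cdf
    L(d) = 1 - 2 * sum_{i>=1} (-1)^(i-1) * exp(-2 i^2 d^2).
    The truncated series is accurate for d >= 0.1 or so; L is
    essentially zero below that anyway."""
    s = sum((-1) ** (i - 1) * math.exp(-2.0 * i * i * d * d)
            for i in range(1, terms + 1))
    return 1.0 - 2.0 * s

def asymptotic_critical_value(alpha, n):
    """Approximate D_{n,alpha} = d_alpha / sqrt(n), where d_alpha
    solves L(d) = 1 - alpha, found by bisection on a bracket that
    contains the root for the usual alpha levels."""
    lo, hi = 0.2, 5.0
    while hi - lo > 1e-10:
        mid = 0.5 * (lo + hi)
        if ks_limit_cdf(mid) < 1.0 - alpha:
            lo = mid
        else:
            hi = mid
    return lo / math.sqrt(n)
```

For example, asymptotic_critical_value(0.05, 100) gives approximately 1.36/√100 = 0.136.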

Since the one-sided Kolmogorov-Smirnov statistics are also distribution-free, knowledge of their sampling distributions would make them useful in nonparametric statistical inference as well. Their exact sampling distributions are considerably easier to derive than that of D_n. Only the statistic D_n^+ is considered in the following theorem, since D_n^+ and D_n^− have identical distributions because of symmetry.

Theorem 3.4  For D_n^+ = sup_x [S_n(x) − F_X(x)], where F_X(x) is any continuous cdf, we have

    P(D_n^+ < c) = 0   for c ≤ 0

    P(D_n^+ < c) = ∫_{1−c}^{1} ∫_{(n−1)/n − c}^{u_n} ··· ∫_{1/n − c}^{u_2} f(u_1, u_2, ..., u_n) du_1 ··· du_n   for 0 < c < 1

    P(D_n^+ < c) = 1   for c ≥ 1

where

    f(u_1, u_2, ..., u_n) = n!   for 0 < u_1 < u_2 < ··· < u_n < 1, and 0 otherwise

Proof  As before, we first assume without loss of generality that F_X is the uniform distribution on (0,1). Then we can write

Table 3.1 Exact and asymptotic values of D_{n,α} such that P(D_n > D_{n,α}) = α, for α = 0.01, 0.05. For each n the columns give the exact value, the asymptotic value, and the ratio A/E.

Source: Z. W. Birnbaum (1952), Numerical tabulation of the distribution of Kolmogorov's statistic for finite sample size, J. Am. Statist. Assoc., 47, Table 1.


    D_n^+ = max{ max_{1 ≤ i ≤ n} [ i/n − X_(i) ],  0 }

the form found in the proof of Theorem 3.1. For all 0 < c < 1, we have

    P(D_n^+ < c) = P( max_{1 ≤ i ≤ n} [ i/n − X_(i) ] < c )
                 = P( i/n − X_(i) < c  for all i = 1, 2, ..., n )
                 = P( X_(i) > i/n − c  for all i = 1, 2, ..., n )
                 = ∫_{1−c}^{1} ∫_{(n−1)/n − c}^{x_n} ··· ∫_{1/n − c}^{x_2} f(x_1, x_2, ..., x_n) dx_1 ··· dx_n

where

    f(x_1, x_2, ..., x_n) = n!   for 0 < x_1 < x_2 < ··· < x_n < 1, and 0 otherwise

which is equivalent to the stated integral.

Another form of this result, due to Birnbaum and Tingey (1951), which is more computationally tractable, is

    P(D_n^+ > c) = (1 − c)^n + c Σ_{j=1}^{⌊n(1−c)⌋} C(n, j) (1 − c − j/n)^{n−j} (c + j/n)^{j−1}

The equivalence of the two forms can be shown by induction. Birnbaum and Tingey give a table of those values D_{n,α}^+ which satisfy P(D_n^+ > D_{n,α}^+) = α for selected values of α and n. For large samples, we have the following theorem, which is given here without proof.
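Because every quantity in the Birnbaum-Tingey form is elementary, the exact one-sided tail probability can be computed directly. The following sketch (an added illustration, assuming Python 3.8+ for math.comb) implements the formula as stated:

```python
import math

def one_sided_ks_pvalue(c, n):
    """Exact P(D_n^+ > c) via the Birnbaum-Tingey formula:
    (1-c)^n + c * sum_{j=1}^{floor(n(1-c))}
                  C(n,j) * (1-c-j/n)^(n-j) * (c+j/n)^(j-1)."""
    if c <= 0.0:
        return 1.0
    if c >= 1.0:
        return 0.0
    total = (1.0 - c) ** n
    jmax = math.floor(n * (1.0 - c))
    for j in range(1, jmax + 1):
        total += (c * math.comb(n, j)
                  * (1.0 - c - j / n) ** (n - j)
                  * (c + j / n) ** (j - 1))
    return total
```

As a check, for n = 1 the sum is empty and the formula reduces to P(D_1^+ > c) = 1 − c, which is easily verified directly.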

Theorem 3.5  If F_X is any continuous distribution function, then for every d ≥ 0,

    lim_{n→∞} P(D_n^+ < d/√n) = 1 − e^{−2d²}

As a result of this theorem, chi-square tables can be used for the distribution of a function of D_n^+, because of the following corollary.


Corollary  If F_X is any continuous distribution function, then for every v ≥ 0, the statistic V = 4n(D_n^+)² has, as n → ∞, a chi-square distribution with 2 degrees of freedom.

Proof  Write V = 4n(D_n^+)², so that D_n^+ < d/√n if and only if V < 4d². Then by Theorem 3.5,

    lim_{n→∞} P(V < 4d²) = 1 − e^{−2d²}

and setting v = 4d², that is, d² = v/4,

    lim_{n→∞} P(V < v) = 1 − e^{−v/2},   v ≥ 0

which is the cdf of the chi-square distribution with 2 degrees of freedom.

For example, with α = 0.05 the chi-square critical value with 2 degrees of freedom is χ²(2, 0.05) = 5.99, and setting P[4n(D_n^+)² > 5.99] = 0.05 gives the large-sample approximation

    D_{n,0.05}^+ = √(5.99/4n) = √(1.4975/n) = 1.22/√n

4.4 APPLICATIONS OF THE KOLMOGOROV-SMIRNOV ONE-SAMPLE STATISTICS

In the goodness-of-fit problem, the null hypothesis H_0: F_X(x) = F_0(x) for all x states that the cdf of the population from which the sample is drawn is a completely specified continuous distribution F_0(x); the two-sided alternative is H_1: F_X(x) ≠ F_0(x) for some x. Since S_n(x) is the statistical image of the true cdf F_X(x), the Kolmogorov-Smirnov statistic with F_0(x) in place of F_X(x) measures the agreement between the data and H_0, and large values of the statistic support H_1.


The value of the Kolmogorov-Smirnov goodness-of-fit statistic D_n can be calculated from the order-statistics form derived in the proof of Theorem 3.1 if all n observations have different numerical values (no ties). However, the expression below is considerably easier for algebraic calculation and applies even when ties are present. The formula is

    D_n = sup_x |S_n(x) − F_0(x)| = max_x { |S_n(x) − F_0(x)|,  |S_n(x − e) − F_0(x)| }

where e denotes any small positive number. Example 4.1 will illustrate this easy algebraic method of calculating D_n. Quantiles of the exact null distribution of D_n are given in Table F in the Appendix for small to moderate n, along with approximate values for larger n. The appropriate critical region is D_n large.
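This two-term formula translates directly into code. The sketch below (an added illustration; the names are mine) evaluates both |S_n(x) − F_0(x)| and |S_n(x − e) − F_0(x)| at each distinct observed value, handling ties by counting multiplicities:

```python
def ks_statistic(sample, cdf):
    """D_n = max over observed x of
    max(|S_n(x) - F_0(x)|, |S_n(x-) - F_0(x)|), where S_n(x-) is the
    empirical cdf just below x.  This is the 'S_n(x - e)' device in
    the text, and it remains valid when the sample contains ties."""
    xs = sorted(set(sample))
    n = len(sample)
    d = 0.0
    below = 0  # observations strictly less than the current x
    for x in xs:
        at_or_below = below + sum(1 for s in sample if s == x)
        s_left, s_right = below / n, at_or_below / n
        d = max(d, abs(s_right - cdf(x)), abs(s_left - cdf(x)))
        below = at_or_below
    return d
```

For instance, for the sample (0.1, 0.4, 0.7) tested against the uniform cdf F_0(x) = x, the largest deviation is |1 − 0.7| = 0.3.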

Example 4.1  Twenty observations were chosen randomly from the continuous uniform distribution over (0,1), recorded to four significant figures, and rearranged in increasing order of magnitude. Determine the value of D_n, and test the null hypothesis that the square roots of these numbers also have the continuous uniform distribution over (0,1).

Solution  The calculations needed to find D_n are shown in Table 4.1. The entries in the first column, labeled x, are not the observations themselves but their respective square roots, because the null hypothesis is concerned with the distribution of these square roots. The S_n(x) are the proportions of observed values less than or equal to each different observed x. The hypothesized distribution here is F_0(x) = x, so the third column is exactly the same as the first. The fourth column is the difference S_n(x) − F_0(x). The fifth column is the difference S_n(x − e) − F_0(x), that is, the difference between the S_n value for a number slightly smaller than an observed x and the F_0 value for that observed x. Finally, the sixth and seventh columns are the absolute values of the differences in the fourth and fifth columns. The supremum D_n is the largest entry in either of the last two columns. Table F shows that the 0.05-level rejection region for n = 20 is D_n ≥ 0.294; the observed D_n exceeds this value, so we reject the null hypothesis that these numbers are uniformly distributed.



The theoretical justification behind this example is as follows. Let Y have the continuous uniform distribution on (0,1), so that f_Y(y) = 1 for 0 ≤ y ≤ 1. Then the pdf of X = √Y can be shown to be (the reader should verify this) f_X(x) = 2x for 0 ≤ x ≤ 1, which is not uniform. In fact, this is a beta distribution with parameters a = 2 and b = 1.

ONE-SIDED TESTS

With the statistics D_n^+ and D_n^−, it is possible to use Kolmogorov-Smirnov statistics for a one-sided goodness-of-fit test which would detect directional differences between S_n(x) and F_0(x). For the alternative

    H_{1,+}: F_X(x) ≥ F_0(x)   for all x

the appropriate rejection region is D_n^+ > D_{n,α}^+, and for the alternative

    H_{1,−}: F_X(x) ≤ F_0(x)   for all x

H_0 is rejected when D_n^− > D_{n,α}^−. Both of these tests are consistent against their respective alternatives.

Table 4.1 Calculation of D_n for Example 4.1

x   S_n(x)   F_0(x)   S_n(x) − F_0(x)   S_n(x − e) − F_0(x)   |S_n(x) − F_0(x)|   |S_n(x − e) − F_0(x)|


Most tests of goodness of fit are two-sided, and so applications of the one-sided Kolmogorov-Smirnov statistics will not be demonstrated here in detail. However, it is useful to know that the tail probabilities for the one-sided statistics are approximately one-half of the corresponding tail probabilities for the two-sided statistic. Therefore, approximate results can be obtained for the one-sided statistics by using the entries in Table F with each tail probability taken as one-half of the value labeled. In our example, D_n^+ is the largest entry in the fourth column of Table 4.1 and D_n^− is the largest entry in the fifth column. Here the observed D_n^+ is smaller than the smallest relevant critical value for n = 20, so the approximate P value for testing H_0 against H_{1,+}: F_X(x) ≥ F_0(x) for all x is large, and we fail to reject H_0 in favor of H_{1,+}. For the alternative H_{1,−}: F_X(x) ≤ F_0(x), the observed D_n^− is so large that the approximate P value is very small, and we reject H_0 in favor of H_{1,−}. When an observed value falls between two tabled critical values, the approximate P value is reported as lying between the two corresponding levels.

The STATXACT solution to Example 4.1 is shown below. The values of all three of the D_n statistics and the P values agree with ours. The STATXACT package also provides Kolmogorov-Smirnov-type tests of goodness of fit for some specific distributions, such as the binomial, Poisson, uniform, exponential, and normal. Some of these will be discussed and illustrated later in this chapter.

MINITAB offers the Kolmogorov-Smirnov test for the normal distribution, and this will be discussed later in this chapter.


Two other useful applications of the Kolmogorov-Smirnov statistics relate to point and interval estimation of the unknown cdf F_X.

CONFIDENCE BANDS

One important statistical use of the D_n statistic is in finding confidence bands on F_X(x) for all x. From Table F in the Appendix, we can find the number D_{n,α} such that

    P(D_n > D_{n,α}) = α

This is equivalent to the statement

    P( sup_x |S_n(x) − F_X(x)| < D_{n,α} ) = 1 − α

which means that

    P[ S_n(x) − D_{n,α} < F_X(x) < S_n(x) + D_{n,α}   for all x ] = 1 − α

But we know that 0 ≤ F_X(x) ≤ 1 for all x, whereas the inequality in this probability statement admits numbers outside this range. Thus we define

    L_n(x) = max[S_n(x) − D_{n,α}, 0]     and     U_n(x) = min[S_n(x) + D_{n,α}, 1]

and call L_n(x) a lower confidence band and U_n(x) an upper confidence band for the cdf F_X, with associated confidence coefficient 1 − α.

The simplest procedure in application is to graph the observed S_n(x) as a step function and draw parallel lines at a vertical distance D_{n,α} in either direction, but always within the unit square. When n is larger than the range of Table F, the value D_{n,α} can be determined from the asymptotic distribution. Of course, this confidence-band procedure can also be used to perform a test of the hypothesis F_X(x) = F_0(x), since F_0(x) lies wholly within the limits L_n(x) and U_n(x) if and only if the hypothesis cannot be rejected at significance level α.
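A minimal sketch of the band construction follows (an added illustration; the critical value D_{n,α} must still be supplied from Table F or the asymptotic formula, since the code does not reproduce those tables):

```python
def confidence_band(sample, d_alpha):
    """(1 - alpha) confidence band for the unknown cdf F_X:
    L_n(x) = max(S_n(x) - d_alpha, 0), U_n(x) = min(S_n(x) + d_alpha, 1),
    evaluated at the jump points of the empirical cdf.  d_alpha is the
    two-sided critical value D_{n,alpha}."""
    xs = sorted(sample)
    n = len(xs)
    band = []
    for i, x in enumerate(xs, start=1):
        s = i / n  # S_n at and just above the i-th order statistic
        band.append((x, max(s - d_alpha, 0.0), min(s + d_alpha, 1.0)))
    return band
```

Each returned triple is (order statistic, lower band value, upper band value); plotting the step function with these limits reproduces the parallel-lines construction described above.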

Similar applications of the D_n^− or D_n^+ statistics are obvious.

One criticism of confidence bands is that they are too wide, particularly in the tails of the distribution, where the order statistics have a lot of variation. Keeping this in mind, other approaches to constructing a confidence band on the cdf have been considered in the literature. One general idea is to base bands on sup_x w{F(x)} |S_n(x) − F(x)|, where w(·) is a suitable weight function. Some authors have


Fig. 4.1  SAS Interactive Data Analysis for Example 4.1.


Fig. 4.2  SAS Interactive Data Analysis output for confidence band for Example 4.1.


Fig. 4.3  SAS Interactive Data Analysis Confidence Bands output for Example 5.1.


For example, suppose we want to take a sample of size n and use the resulting S_n(x) as a point estimate of F_X(x) for each x, with the error in this estimate no more than a specified amount with specified high probability. How large a sample should be taken? We look down the column of Table F of the Appendix for the specified probability until we find the largest critical value that is less than or equal to the specified maximum error; the row where this entry appears gives the required sample size n. If more precision is wanted than the table provides, the required n exceeds the range of the table, and the value is found by solving d_α/√n = (maximum error) for n from the asymptotic critical value and rounding the result up to the next integer.
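Under the large-sample approximation D_{n,α} ≈ d_α/√n, the sample-size calculation is one line of algebra: n must be at least (d_α / maximum error)². A sketch (added illustration; the constant d_0.05 ≈ 1.36 is the familiar large-sample value from the table above):

```python
import math

def sample_size_for_ecdf_error(max_error, d_alpha):
    """Smallest n such that, approximately,
    P( sup_x |S_n(x) - F_X(x)| < max_error ) >= 1 - alpha,
    using the large-sample critical value D_{n,alpha} ~ d_alpha/sqrt(n).
    Requires d_alpha / sqrt(n) <= max_error, i.e.
    n >= (d_alpha / max_error)^2, rounded up."""
    return math.ceil((d_alpha / max_error) ** 2)

# With d_0.05 ~ 1.36, an estimation error of at most 0.05 with
# probability 0.95 requires n = ceil((1.36 / 0.05)^2) = 740.
n_needed = sample_size_for_ecdf_error(0.05, 1.36)
```

Halving the allowed error quadruples the required sample size, a direct consequence of the √n convergence rate of the empirical distribution function.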

It should be noted that all of the theoretical properties of the Kolmogorov-Smirnov statistics require the assumption that F_X be continuous, since this is necessary to guarantee their distribution-free nature. The properties of the empirical distribution function given in Section 2.3, including the Glivenko-Cantelli theorem, do not require this continuity assumption. Furthermore, it is certainly desirable to have a goodness-of-fit test that can be used when the hypothesized distribution is discrete. Noether (1963) and others have shown that if the D_{n,α} values based on a continuous F_X are used in a discrete application, the actual significance level is at most α. However, Slakter (1965) used Monte Carlo techniques to show that this procedure is extremely conservative. Conover (1972) found recursive relationships for the exact distribution of D_n for F_0 discrete.

Pettitt and Stephens (1977) give tables of exact tail probabilities of nD_n that can be used with F_0 discrete, and also for grouped data from a continuous distribution, as long as the expected frequencies are equal. They show that in these two cases the test statistic can be written as

    nD_n = max_{1 ≤ j ≤ k} | Σ_{i=1}^{j} (F_i − e_i) |

because

    S_n(x_j) = Σ_{i=1}^{j} F_i / n     and     F_0(x_j) = Σ_{i=1}^{j} e_i / n

for ordered x_1 ≤ x_2 ≤ ··· ≤ x_n, where F_i and e_i denote the observed and expected frequencies in the ith group. This expression shows that the distribution of D_n depends on the chosen grouping, and it also shows explicitly the relationship between D_n and the chi-square statistic Q.


This exact test has greater power than the chi-square test in the case of grouped data.

4.5 LILLIEFORS'S TEST FOR NORMALITY

In this section we consider the problem of a goodness-of-fit test for the normal distribution with no specified mean and variance. This problem is very important in practice, because the assumption of a general normal distribution with unknown μ and σ underlies many classical statistical test and estimation procedures. In this case, note that the null hypothesis is composite because it states only that the underlying distribution is some normal distribution. In general, K-S tests can be applied in the case of composite goodness-of-fit hypotheses after estimating the unknown parameters; F_0(x) is then replaced by F̂(x). Unfortunately, the null distribution of the K-S test statistic with estimated parameters is far more complicated, and this, of course, affects the P-value calculations. In the absence of any additional information, one approach could be to use the tables of the K-S test to approximate the P value or to find an approximate critical value. For the normal distribution, Lilliefors (1967) showed that using the usual critical points developed for the K-S test gives extremely conservative results. He then used Monte Carlo simulations to develop a table for the Kolmogorov-Smirnov statistic that gives accurate critical values. As before, the Kolmogorov-Smirnov two-sided statistic is defined as

    D_n = sup_x |S_n(x) − F̂(x)|

Here F̂(x) is computed as the cumulative standard normal distribution Φ(z), where z = (x − x̄)/s for each observed x, x̄ is the mean of the sample of n observations, and s is the sample standard deviation computed with n − 1 in the denominator. The appropriate rejection region is in the right tail, and Appendix Table O gives exact tail probabilities computed by Monte Carlo simulations. This table is taken from Edgeman and Scott, who used more samples to improve the accuracy of the original results given by Lilliefors (1967).
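The computation just described can be sketched as follows (an added illustration in Python; note that the resulting D_n must be referred to Lilliefors-type Monte Carlo tables such as Table O, not to the ordinary K-S table):

```python
import math

def normal_cdf(z):
    """Standard normal cdf Phi(z) via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def lilliefors_normal_stat(sample):
    """Lilliefors's statistic for normality: the K-S distance between
    the empirical cdf and Phi((x - xbar)/s), with xbar and s (n - 1
    denominator) estimated from the same sample."""
    n = len(sample)
    xbar = sum(sample) / n
    s = math.sqrt(sum((x - xbar) ** 2 for x in sample) / (n - 1))
    xs = sorted(sample)
    d = 0.0
    for i, x in enumerate(xs, start=1):
        f = normal_cdf((x - xbar) / s)
        d = max(d, abs(i / n - f), abs((i - 1) / n - f))
    return d
```

Because the data are standardized internally, the statistic is invariant under changes of location and scale, which is exactly why a single Monte Carlo table serves for all normal null hypotheses.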

Example 5.1  A random sample of 12 persons is interviewed to estimate the median annual gross income in a certain economically depressed town. Use the most appropriate test of the null hypothesis that the income data below are normally distributed.


9,800   10,200    9,300    8,700   15,200    6,900
8,600    9,600   12,200   15,500   11,600    7,200

Solution  The most appropriate test is Lilliefors's test for normality, since the hypothesized distribution is normal with unspecified mean and variance. From the data we compute Σx = 124,800 and Σ(x − x̄)² = 84,600,000, so that x̄ = 124,800/12 = 10,400 and s = √(84,600,000/11) = 2,773. Each observation is standardized as z = (x − 10,400)/2,773, and the calculations shown in Table 5.1 yield D_n = 0.1946. For n = 12 this does not exceed the 0.10-level critical value in Table O, so the null hypothesis of normality is not rejected (P > 0.10).

The SAS and MINITAB solutions to Example 5.1 are shown below. MINITAB carries out Lilliefors's test under its Kolmogorov-Smirnov label and also displays a normal probability plot of the data; see D'Agostino and Stephens (1986) for additional discussion of such tests and plots.


Table 5.1 Calculations for Lilliefors's statistic in Example 5.1

x   z   S_n(x)   Φ(z)   |S_n(x) − Φ(z)|   |S_n(x − e) − Φ(z)|


this graph resembles that found on normal probability paper. The vertical axis is a probability scale and the horizontal axis is the usual data scale. The plotted points are the S_n(x) values; the straight line is a least-squares line fitted to these points. If the points fall reasonably close to this straight line, the normal distribution hypothesis is supported. The P value given by both packages is an approximation based on linear interpolation in tables that are not the same as our Table O. See the documentation in the packages and D'Agostino and Stephens (1986) for additional details.

The SAS software package also provides a number of excellent interactive ways to study goodness of fit. We have given one illustration already for confidence bands, using the data from Example 4.1. Now we illustrate some of the other possibilities in Figures 5.1 through 5.6, using the data from Example 5.1 and the Interactive Data Analysis and Analyst options, respectively, under SAS. The highlights include a plot of the empirical cdf, a box plot, confidence bands, tests for a specific distribution (the choices for which include the normal, the lognormal, the exponential, and the Weibull), and a Q-Q plot together with a reference line, among other available options. Also interesting is a slider (the bottom panel in the output shown) with which one can ``try out'' various means and standard deviations for compatibility with the data on the basis of the K-S test. For details on these features, the reader is referred to the SAS online documentation.

4.6 LILLIEFORS'S TEST FOR THE EXPONENTIAL DISTRIBUTION

Another important goodness-of-fit problem in practice is to test for the exponential distribution with no specified mean. This problem is important because the assumption of an exponential distribution with an unknown mean μ is made in many applications, particularly where the random variable under study is a waiting time, the time to the occurrence of an event. Lilliefors (1969) developed an analog of the Kolmogorov-Smirnov test for this situation and gave a table of critical values based on Monte Carlo simulations. As in the normal case with unknown parameters, the Kolmogorov-Smirnov two-sided statistic is defined as

    D_n = sup_x |S_n(x) − F̂(x)|

Here F̂(x) is computed as 1 − e^{−x/x̄} = F̂(z) = 1 − e^{−z}, say, where x̄ is the sample mean and z = x/x̄ for each observed x. Thus one forms the differences exactly as in the normal case.
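A sketch of the exponential version of the statistic follows (an added illustration; as in the normal case, the value must be referred to Lilliefors's Monte Carlo critical values, which the code does not reproduce):

```python
import math

def lilliefors_exponential_stat(sample):
    """Lilliefors-type statistic for exponentiality: the K-S distance
    between S_n and Fhat(x) = 1 - exp(-x / xbar), the exponential cdf
    with its mean replaced by the sample mean."""
    n = len(sample)
    xbar = sum(sample) / n
    xs = sorted(sample)
    d = 0.0
    for i, x in enumerate(xs, start=1):
        f = 1.0 - math.exp(-x / xbar)
        d = max(d, abs(i / n - f), abs((i - 1) / n - f))
    return d
```

Since each observation enters only through z = x/x̄, the statistic is scale invariant, so one Monte Carlo table covers every exponential null hypothesis.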


Fig. 5.1  SAS/Interactive Data Analysis goodness of fit for Example 5.1.


Fig. 5.3  SAS/Interactive Data Analysis of goodness of fit for Example 5.1, continued.


Fig. 5.4  SAS/Interactive Data Analysis of goodness of fit for Example 5.1, continued.


Fig. 5.5  SAS/Interactive Data Analysis of goodness of fit for Example 5.1, continued.


Fig. 5.6  SAS/Analyst Q-Q plot for Example 5.1.






Two graphical procedures are useful for a subjective assessment of goodness of fit: the P-P plot (probability versus probability) and the Q-Q plot (quantile versus quantile). The P-P plot graphs the hypothesized cumulative probabilities F_0(x_(i)) against the corresponding empirical probabilities S_n(x_(i)); if the null hypothesis is true, the plotted points should lie close to the 45-degree line through the origin. The Q-Q plot graphs the order statistics x_(i) against the hypothesized quantiles F_0^(-1)(p_i) for suitable plotting positions 0 < p_i < 1. If the hypothesized family is correct, so that F(x) = F_0((x - m)/s) and hence F^(-1)(p) = m + s F_0^(-1)(p), the Q-Q plot should be approximately a straight line whose intercept and slope estimate the location parameter m and the scale parameter s, respectively. Common choices of plotting position are p_i = i/(n + 1), p_i = (i - 0.5)/n, and p_i = (i - 0.375)/(n + 0.25); the last of these, due to Blom (1958), is frequently recommended for the normal Q-Q plot, in which F_0^(-1)(p_i) is the standard normal quantile.


Departures of the plotted points from the 45-degree line (for the P-P plot) or from a straight line (for the Q-Q plot) provide diagnostic information: the pattern of the deviation suggests the way in which the true distribution differs from the hypothesized one. The P-P plot tends to be more sensitive to discrepancies near the center of the distribution, while the Q-Q plot is more sensitive in the tails. Formal tests of fit based on the ''correlation'' between the two sets of plotted coordinates have also been proposed; see, for example, Ryan and Joiner (1976) and D'Agostino and Stephens (1986, Chapter 5).


With n = 10 the plotting positions are p_i = (i - 0.375)/10.25. The quantities needed for the normal and exponential Q-Q plots of the data are shown below.

i    X_(i)   p_i        Phi^(-1)(p_i)   -ln(1 - p_i)
1    1.5     0.060976   -1.54664        0.062914
2    2.3     0.158537   -1.00049        0.172613
3    2.5     0.256098   -0.65542        0.295845
4    4.2     0.353659   -0.37546        0.436427
5    4.5     0.451220   -0.12258        0.600057
6    4.6     0.548780    0.12258        0.795801
7    7.1     0.646341    0.37546        1.039423
8    8.4     0.743902    0.65542        1.362197
9    9.3     0.841463    1.00049        1.841770
10   10.4    0.939024    1.54664        2.797281

Fig. .1 Normal Q-Q plot for Example . .
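The entries of a table like this can be reproduced with a short script. This is a sketch only; it assumes n = 10 and the Blom plotting positions, and uses the standard library's normal quantile function.

```python
import math
from statistics import NormalDist

n = 10
phi_inv = NormalDist().inv_cdf

rows = []
for i in range(1, n + 1):
    p = (i - 0.375) / (n + 0.25)      # Blom plotting position
    # abscissas for the normal and exponential Q-Q plots
    rows.append((i, p, phi_inv(p), -math.log(1 - p)))
    print(f"{i:2d}  {p:.6f}  {phi_inv(p):+.5f}  {-math.log(1 - p):.6f}")
```

Plotting the order statistics against either column of quantiles then gives the normal or exponential Q-Q plot.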


the scale and the location parameter under the respective model. For these data it appears that the normal distribution with mean . and standard deviation . provides a better fit than the exponential distribution with mean . .

4. SUMMARY

In this chapter we presented procedures designed to help identify the population from which a random sample is drawn. The primary test criteria are the normalized sum of squares of deviations, and the difference of the cumulative distribution functions, between the hypothesized and observed (sample) distributions. Chi-square tests are based on the first criterion and the K-S-type tests are based on the second (including the Lilliefors's tests).

The chi-square test is specifically designed for use with categorical data, while the K-S statistics are for random samples from continuous populations. However, when the data are not categorical as collected, these two goodness-of-fit tests can be used interchangeably. The reader is referred to Goodman ( ), Birnbaum ( ), and Massey ( c) for discussions of their relative merits. Only a brief comparison will be made here, which is relevant whenever raw ungrouped measurement data are available.
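When raw measurements are available, both criteria can be computed on the same sample. The sketch below is illustrative only: the data are made up, the null distribution is taken to be a completely specified N(0, 1), and four equal-probability classes are an arbitrary grouping choice.

```python
from statistics import NormalDist

F0 = NormalDist().cdf            # hypothesized, completely specified null cdf
inv = NormalDist().inv_cdf

data = sorted([0.2, -1.1, 0.7, 1.8, -0.4, 0.9, -2.0, 0.3, 1.1, -0.6])
n = len(data)

# K-S statistic: largest vertical distance between the EDF and F0,
# checked just before and just after each jump of the EDF.
dn = max(max((i + 1) / n - F0(x), F0(x) - i / n) for i, x in enumerate(data))

# Chi-square statistic with k = 4 classes of equal null probability 1/4.
k = 4
edges = [inv(j / k) for j in range(1, k)]
counts = [0] * k
for x in data:
    counts[sum(x > e for e in edges)] += 1
q = sum((f - n / k) ** 2 / (n / k) for f in counts)

print(round(dn, 5), round(q, 2))
```

The two statistics summarize different kinds of deviation from the null, as discussed below.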

Fig. .2 Exponential Q-Q plot for Example . .

TESTS OF GOODNESS OF FIT 14


The basic difference between the two tests is that chi square is sensitive to vertical deviations between the observed and expected histograms, whereas K-S-type procedures are based on vertical deviations between the observed and expected cumulative distribution functions. However, both types of deviations are useful in determining goodness of fit and probably are equally informative. Another obvious difference is that chi square requires grouped data whereas K-S does not. Therefore, when the hypothesized distribution is continuous, the K-S test allows us to examine the goodness of fit for each of the n observations, instead of only for the k classes, where k <= n. In this sense, the K-S-type procedures make more complete use of the available data. Further, the chi-square statistic is affected by the number of classes and their widths, which are often chosen arbitrarily by the experimenter.
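The dependence of the chi-square statistic on the grouping is easy to demonstrate. With the illustrative data below (invented for this sketch), k = 4 and k = 5 equal-probability classes under a standard normal null give different values of Q.

```python
from statistics import NormalDist

inv = NormalDist().inv_cdf
data = [0.2, -1.1, 0.7, 1.8, -0.4, 0.9, -2.0, 0.3, 1.1, -0.6,
        0.5, -0.2, 1.4, -0.9, 0.1, 0.8, -1.5, 0.4, 2.1, -0.3]
n = len(data)

def chi_square_stat(k):
    """Q for k equal-probability classes under a standard normal null."""
    edges = [inv(j / k) for j in range(1, k)]
    counts = [0] * k
    for x in data:
        counts[sum(x > e for e in edges)] += 1
    e = n / k
    return sum((f - e) ** 2 / e for f in counts)

# Same data, same null hypothesis -- but Q changes with the grouping.
print(chi_square_stat(4), chi_square_stat(5))
```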

One of the primary advantages of the K-S test is that the exact sampling distribution of Dn is known or can be simulated and tabulated when all parameters are specified, whereas the sampling distribution of Q is only approximately chi square for any finite n. The K-S test can be applied for any size sample, while the chi-square test should be used only for n large and each expected cell frequency not too small. When cells must be combined for chi-square application, the calculated value of Q is no longer unique, as it is affected by the scheme of combination. The K-S statistic is much more flexible than chi square, since it can be used in estimation to find minimum sample size and confidence bands. With the one-sided Dn+ or Dn- statistics, we can test for deviations in a particular direction, whereas the chi-square test is always concerned equally with differences in either direction. In most cases, K-S is easier to apply.

However, the chi-square test also has some advantages over K-S. A hypothesized distribution which is discrete presents no problems for the chi-square test, while the exact properties of Dn are violated by the lack of continuity. As already stated, however, this is a minor problem with Dn which can generally be eliminated by replacing equalities by inequalities in the probabilities. Perhaps the main advantage of the chi-square test is that by simply reducing the number of degrees of freedom and replacing unknown parameters by consistent estimators, a goodness-of-fit test can be performed in the usual manner even when the hypothesized distribution is not completely specified. If the hypothesized F(x) in Dn contains unspecified parameters which are estimated from the



data, we obtain an estimate D̂n whose sampling distribution is different from that of Dn. The test is conservative when the Dn critical values are used.
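The effect of estimating parameters can be seen by simulating the null distribution of the modified statistic. This sketch is illustrative: the sample size, repetition count, and seed are arbitrary choices, and 1.36/sqrt(n) is the familiar asymptotic 5% point for the simple-null Dn.

```python
import math, random
from statistics import NormalDist, mean, stdev

random.seed(1)
n, reps = 20, 2000
phi = NormalDist().cdf

def d_hat(sample):
    """K-S distance between the EDF and a normal cdf with estimated parameters."""
    m, s = mean(sample), stdev(sample)
    z = sorted((x - m) / s for x in sample)
    return max(max((i + 1) / n - phi(v), phi(v) - i / n) for i, v in enumerate(z))

sims = sorted(d_hat([random.gauss(0, 1) for _ in range(n)]) for _ in range(reps))
crit_mc = sims[int(0.95 * reps)]      # Monte Carlo 5% point for the modified statistic
crit_simple = 1.36 / math.sqrt(n)     # asymptotic 5% point for the simple-null Dn
print(crit_mc, crit_simple)
```

The Monte Carlo critical value is considerably smaller than the simple-null value, which is why using the Dn tables with estimated parameters yields a conservative test.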

As for relative performance, the power functions of the two statistics depend on different quantities. If F(x) is the hypothesized cumulative distribution and F_X(x) the true distribution, the power of K-S depends on

sup_x | F_X(x) - F(x) |

while the power of chi square depends on

sum_{i=1}^{k} { [F_X(a_i) - F_X(a_{i-1})] - [F(a_i) - F(a_{i-1})] }^2 / [F(a_i) - F(a_{i-1})]

where the a_i are the class limits in the numerical categories.
The power of the chi-square test can be improved by clever grouping in some situations. In particular, Cochran ( ) and others have shown that a choice of intervals which provides equal expected frequencies for all classes is a good procedure in this respect, besides simplifying the computations. The number of classes k can be chosen such that the power is maximized in the vicinity of the point where power equals . . This procedure also eliminates the arbitrariness of grouping. The expression for Q in ( . ) reduces to (k sum_i F_i^2)/n - n when e_i = n/k for i = 1, 2, ..., k.
Many studies of power comparisons have been reported in the

literature over the years. Kac, Kiefer, and Wolfowitz ( ) showed that the K-S test is asymptotically more powerful than the chi-square test when testing for a completely specified normal distribution. Further, when the sample size is small, the K-S provides an exact test while the chi-square does not.

When the hypothesized distribution is the normal or exponential and the parameters are specified (the null hypothesis is simple), the K-S test based on Table F gives an exact goodness-of-fit test. This test is conservative when parameters need to be estimated (the null hypothesis is composite). In these cases, the modified version of the Dn statistic, sometimes known as Lilliefors's test statistic, should be used. The exact mathematical-statistical derivation of the distribution of this test statistic is often very complicated, but Monte Carlo estimates of the percentiles of the null distribution can be



obtained. The tables given in Lilliefors ( , ) were generated in this manner, as are our Tables O and T. Edgeman and Scott ( ) gave a step-by-step algorithm that included goodness-of-fit testing for the lognormal, the Rayleigh, the Weibull, and the two-parameter exponential distribution.

Iman ( ) provided graphs for performing goodness-of-fit tests for the normal and the exponential distributions with unspecified parameters. These graphs are in the form of confidence bands based on Lilliefors's ( ) critical values for the normal distribution test and Durbin's ( ) critical values for the exponential distribution test. On a graph with these bands, the empirical distribution function of the standardized variable, S_n(z), is plotted on the vertical axis as a step function in order to carry out the test. If this plot lies entirely within the confidence bands, the null hypothesis is not rejected. These graphs can be obtained by running some MINITAB macro programs.

Finally, we have the option of using P-P or Q-Q plots to glean information about the population distribution. The interpretation is usually subjective, however.
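For reference, the coordinates of both kinds of plot are simple to compute. The sketch below uses invented data and takes the hypothesized distribution to be standard normal, with Blom plotting positions for the Q-Q plot.

```python
from statistics import NormalDist

F0, Q0 = NormalDist().cdf, NormalDist().inv_cdf
data = sorted([-1.3, -0.4, 0.1, 0.6, 1.7])   # illustrative sample
n = len(data)

# P-P plot: hypothesized probabilities against empirical probabilities.
pp = [(F0(x), (i + 1) / n) for i, x in enumerate(data)]

# Q-Q plot: hypothesized quantiles (Blom positions) against order statistics.
qq = [(Q0((i + 1 - 0.375) / (n + 0.25)), x) for i, x in enumerate(data)]

print(pp)
print(qq)
```

Either set of points can then be passed to any plotting routine; rough linearity supports the hypothesized family.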

PROBLEMS

4.1. Two types of corn (golden and green-striped) carry recessive genes. When these were crossed, a first generation was obtained which was consistently normal (neither golden nor green-striped). When this generation was allowed to self-fertilize, four distinct types of plants were produced: normal, golden, green-striped, and golden-green-striped. In plants this process produced the following distribution:

Normal Golden Green-striped Golden-green-striped

A monk named Mendel wrote an article theorizing that in a second generation of such hybrids, the distribution of plant types should be in a ratio. Are the above data consistent with the good monk's theory?

4.2. A group of four coins is tossed times, and the following data are obtained:

Do you think the four coins are balanced?

4.3. A certain genetic model suggests that the probabilities for a particular trinomial distribution are, respectively, y1 = p^2, y2 = 2p(1 - p), and y3 = (1 - p)^2, 0 < p < 1.

Number of heads Frequency


Page 182: Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

8/15/2019 Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

http://slidepdf.com/reader/full/gibbons-chakraboti-nonparametric-statistical-inference-fourth-edition 182/675

Assume that X1, X2, and X3 represent the respective frequencies in a sample of n independent trials and that these numbers are known. Derive a chi-square goodness-of-fit test for this trinomial distribution if p is unknown.
4.4. According to a genetic model, the proportions of individuals having the four blood types should be related by

Type O: q^2
Type A: p^2 + 2pq
Type B: r^2 + 2qr
Type AB: 2pr

where p + q + r = 1. Given the blood types of individuals, how would you test the adequacy of the model?
4.5. If individuals are classified according to gender and color blindness, it is hypothesized that the distribution should be as follows:

for some p, q = 1 - p, where p denotes the proportion of defective genes in the relevant population and therefore changes for each problem. How would the chi-square test be used to test the adequacy of the general model?

4.6. Show that in general, for Q defined as in ( . ),

E(Q) = E[ sum_{i=1}^{k} (F_i - e_i)^2 / e_i ] = sum_{i=1}^{k} [ ny_i(1 - y_i)/e_i + (ny_i - e_i)^2/e_i ]

From this we see that if the null hypothesis is true, ny_i = e_i and E(Q) = k - 1, the mean of the chi-square distribution.
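The null-hypothesis identity E(Q) = k - 1 can be checked by simulation. The sketch below is illustrative: it assumes k = 4 equally likely classes and arbitrary choices of sample size, repetition count, and seed.

```python
import random

random.seed(7)
k, n, reps = 4, 60, 4000
e = n / k                       # equal expected frequencies under H0

total = 0.0
for _ in range(reps):
    counts = [0] * k
    for _ in range(n):
        counts[random.randrange(k)] += 1   # draw from 4 equally likely classes
    total += sum((f - e) ** 2 / e for f in counts)

print(total / reps)   # close to k - 1 = 3
```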

4.7. Show algebraically that when e_i = ny_i and k = 2, we have

sum_{i=1}^{2} (F_i - e_i)^2 / e_i = (F_1 - ny_1)^2 / [ny_1(1 - y_1)]

so that when k = 2, sqrt(Q) is the statistic commonly used for testing a hypothesis concerning the parameter of the binomial distribution for large samples. By the central-limit theorem, sqrt(Q) approaches the standard normal distribution as n -> infinity, and the square of any standard normal variable is chi-square-distributed with 1 degree of freedom. Thus we have an entirely different argument for the distribution of Q when k = 2.
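The algebraic identity in this problem is easy to confirm numerically; the null proportion and observed count below are illustrative choices, not data from the text.

```python
import math

n, y1 = 50, 0.3      # illustrative null proportion for class 1
f1 = 19              # illustrative observed count in class 1; class 2 has n - f1
e1, e2 = n * y1, n * (1 - y1)

q = (f1 - e1) ** 2 / e1 + ((n - f1) - e2) ** 2 / e2
z = (f1 - e1) / math.sqrt(n * y1 * (1 - y1))

print(q, z * z)      # the two expressions agree
```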

4.8. Give a simple proof that Dn, Dn+, and Dn- are completely distribution-free for any continuous F_X by appealing to the transformation u = F_X(x) in the initial definitions of Dn, Dn+, and Dn-.

              Male    Female
Normal        p/2     p^2/2 + pq
Color blind   q/2     q^2/2




Consider the Cramér-von Mises type of goodness-of-fit criterion, defined for a hypothesized distribution F_0 as

omega^2 = integral from -infinity to infinity of [S_n(x) - F_0(x)]^2 dF_0(x)

Show that for computational purposes this criterion can be written as

n omega^2 = 1/(12n) + sum_{i=1}^{n} [ F_0(X_(i)) - (2i - 1)/(2n) ]^2

[See Cramér (1928), von Mises (1931), Smirnov (1936), and Darling (1957).]
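The computing form of n omega^2 takes only a few lines. This sketch uses invented data and assumes a standard normal hypothesized distribution.

```python
from statistics import NormalDist

F0 = NormalDist().cdf
data = sorted([-1.2, -0.5, 0.1, 0.4, 0.9, 1.6])   # illustrative sample
n = len(data)

# n*omega^2 = 1/(12n) + sum over i of [F0(X_(i)) - (2i-1)/(2n)]^2
n_omega2 = 1 / (12 * n) + sum(
    (F0(x) - (2 * i - 1) / (2 * n)) ** 2 for i, x in enumerate(data, start=1)
)
print(n_omega2)
```

Large values of n omega^2 indicate lack of fit, since they reflect large average squared distance between the EDF and F_0.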


A random sample of observations is drawn:

3.5  4.1  4.8  5.0  6.3  7.1  7.2  7.8  8.1  8.4  8.6  9.0

Find a 90% confidence band for the population distribution function F_X(x).

For a random sample of 15 observations, test the null hypothesis that the population distribution is F_0(x) = (x + 10)/10 for -10 <= x <= 0.


: ; : ; : ; : ; : ; : ; : ; : ; : ; : ; : ; : ; : ; : ; :

4.19. For the data given in Example . use the most appropriate test to see if the distribution can be assumed to be normal with mean , and standard deviation , .

4.20. The data below represent earnings per share (in dollars) for a random sample of five common stocks listed on the New York Stock Exchange.

: ; : ; : ; : ; :

(a) Use the most appropriate test to see if these data can be regarded as a random sample from a normal distribution.

(b) Use the most appropriate test to see if these data can be regarded as a random sample from a normal distribution with m = . and s = . .

(c) Determine the sample size required to use the empirical distribution function to estimate the unknown cumulative distribution function with confidence such that the error in the estimate is (i) less than . , (ii) less than . .

4.21. It is claimed that the number of errors made by a typesetter is Poisson distributed with an average rate of per words set. One hundred random samples of sets of words from this typesetter's output are examined and the numbers of errors are counted as shown below. Are these data consistent with the claim?

4.22. For the original data in Example . (not the square roots), test the null hypothesis that they come from the continuous uniform distribution, using level . .

4.23. Use the Dn statistic to test the null hypothesis that the data in Example .

(a) Come from the Poisson distribution with m = .
(b) Come from the binomial distribution with n = , p = .

These tests will be conservative because both hypothesized distributions are discrete.

4.24. Each student in a class of is asked to list three people he likes and three he dislikes and label the people 1, 2, 3, 4, 5, 6 according to how much he likes them, with 1 denoting least liked and 6 denoting most liked. From this list each student selects the number assigned to the person he thinks is the wealthiest of the six. The results in the form of an array are as follows:

; ; ; ; ; ; ; ; ; ; ; ; ; ; ; ; ;

Test the null hypothesis that the students are equally likely to select any of the numbers 1, 2, 3, 4, 5, 6, using the most appropriate test and the . level of significance.

4.25. During a -week period, demand for a certain kind of replacement part for TV sets was distributed as shown below. Find the theoretical distribution of weekly demands for a Poisson model with the same mean as the given data and perform an appropriate goodness-of-fit test.

No. of errors No. of samples





















To illustrate, suppose a random sample of 15 observations is drawn:

690  750  680  700  660  710  720  730  650  670
740  730  660  750  690

and we wish to test H0: kappa_0.75 = 693 against the two-sided alternative H1: kappa_0.75 != 693 at significance level 0.05. Let K denote the number of sample observations smaller than 693; under H0, K follows the binomial distribution with n = 15 and p = 0.75. The observed value is K = 7, and from Table C the exact P value for the two-sided test is P(K <= 7) + P(K >= 15) = 0.0173 + 0.0134 = 0.0307, so H0 is rejected at the 0.05 level. A corresponding 95% confidence interval for kappa_0.75 can be obtained from (2.7) using Table C with n = 15 and p = 0.75.
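Tail probabilities of this binomial form are easy to reproduce; the sketch below assumes the binomial(15, 0.75) null distribution for the count K.

```python
from math import comb

n, p = 15, 0.75
pmf = [comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)]

lower_tail = sum(pmf[:8])   # P(K <= 7)
upper_tail = pmf[15]        # P(K >= 15) = P(K = 15)
print(round(lower_tail, 4), round(upper_tail, 4),
      round(lower_tail + upper_tail, 4))
```

Summing the two tails gives the exact two-sided P value without any normal approximation.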













The power of the sign test at alpha = 0.05 against various values of theta (the probability of a plus sign) is shown below.

theta   0.5     0.55    0.60    0.65    0.70    0.75    0.80    0.85    0.90
Power   0.0461  0.0918  0.1629  0.2639  0.3960  0.5546  0.7255  0.8802  0.9773
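Exact powers of the sign test can be computed directly from the binomial distribution. The sketch below is illustrative: it assumes n = 13 observations and a one-sided test at nominal alpha = 0.05, finds the conservative critical value, and evaluates the exact power at several alternatives (exact values can differ somewhat from approximate tabulated ones).

```python
from math import comb

n, alpha = 13, 0.05

def upper_tail(theta, k):
    """P(K >= k) when K ~ binomial(n, theta)."""
    return sum(comb(n, j) * theta**j * (1 - theta)**(n - j)
               for j in range(k, n + 1))

# smallest critical value whose exact size does not exceed alpha
k_alpha = min(k for k in range(n + 1) if upper_tail(0.5, k) <= alpha)

for theta in (0.5, 0.6, 0.7, 0.8, 0.9):
    print(theta, round(upper_tail(theta, k_alpha), 4))
```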















0 95 95%

L

S 1986

K-

R

S 1 2 L ð 1Þ ð 2Þ ð Þ ð Þ ð Þ

ð Þ ð Þ ½ ð Þ ½ ð Þ

½ð Þ ½ð Þ

ð Þ ð Þ

¼1 2 ð ðÞÞ ¼

ð Þ ¼

¼1 ð À Þ ¼1 6¼1 ð À Þ ð5 1Þ


ðÞ ¼0 01 0& ð5 2Þ

ð Þ

½ð Þ ¼ ¼1 ¼1 2

R

S

ð Þ ¼½ð À Þð À Þ ¼ ð Þ À ð Þ ð Þ

ð Þ

B ð Þ 5 1


S 5 3

ð Þ

5 8

:

¼1 ð ðÞÞ¼

¼1

ðÀ1Þð ÀÞ

ÂZ 1

À1 ½ ð ÞÀ1

½1À ð Þ À ð Þ

¼ À1

¼0ð þ1Þ

ð À À1ÞZ 1

À1 ½ ð Þ

½1À ð Þ À À1 ð Þ

¼ À1

¼1

ð À1Þð À À1Þ

Â

Z 1

À1

½ ð Þ

½1À ð Þ À À1 ð Þ

þ À1

¼0

ð À À1ÞZ 1

À1 ½ ð Þ

½1À ð Þ

À À1 ð Þ

¼ ð À1ÞZ 1

À1 ð Þ

 À1

¼1

À2 À1 ½ ð Þ

À1

½1 À ð Þ À À1 ð Þ

þ Z 1

À1

À1

¼0

À1 ½ ð Þ

½1 À ð Þ À À1 ð Þ

¼ ð À1ÞZ 1À1

ð Þ ð Þ þ Z 1À1

ð Þ¼ ð À1Þ ½ ð Þ þ ð Þ ð5 9Þ


5 8

½ ð Þ ¼12

2À1 1 2 1 ð À1Þ ½ ð Þ þ ð ÞÀ

þ12 ð Þ& '

¼12

2À1 1 2 1 ð À1Þ ½ ð Þ À

À12 ð Þ& '

¼12ð À1Þ

þ1 !1 2 1

½ ð Þ À12 ð Þ& ' ð5 10Þ

!1 ½

ð

Þ ¼

2

ffiffiffi

3

½

ð

Þ À

1

2

ð

Þ& ' ð5 11

ÞS 5 11 S 1954

K

5 1

S

P¼ 2

Q


The Wilcoxon signed-rank statistic is the sum of the ranks of the absolute differences that correspond to positive differences,

T+ = Σ_{i=1}^n i Z_i

where Z_i = 1 if the difference whose absolute value has rank i is positive and Z_i = 0 otherwise. Under H0 the Z_i are independent Bernoulli variables with P(Z_i = 1) = P(Z_i = 0) = 1/2, so that E(Z_i) = 1/2 and var(Z_i) = 1/4. Therefore

E(T+ | H0) = Σ_{i=1}^n i/2 = n(n+1)/4    (7.1)

var(T+ | H0) = Σ_{i=1}^n i²/4 = n(n+1)(2n+1)/24    (7.2)

Since the sum of all n ranks is fixed, the statistic T−, the sum of the ranks of the absolute differences corresponding to negative differences, satisfies

T+ + T− = n(n+1)/2    (7.3)

Under an alternative distribution of the differences D_i, the moments of T+ involve the probabilities

p1 = P(D_i > 0)    p2 = P(D_i + D_j > 0)
p3 = P(D_i > 0, D_i + D_j > 0)    p4 = P(D_i + D_j > 0, D_i + D_k > 0)    (7.4)
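The null mean n(n+1)/4 and variance n(n+1)(2n+1)/24 of the Wilcoxon signed-rank statistic T+ can be checked by brute-force enumeration of the 2^n equally likely sign vectors. A minimal Python sketch (the function name is ours, not from the text):

```python
from itertools import product

def tplus_distribution(n):
    """Null distribution of T+: enumerate all 2**n equally likely
    assignments of signs (Z_i = 0 or 1) to the ranks 1..n."""
    counts = {}
    for signs in product((0, 1), repeat=n):
        t = sum(rank * z for rank, z in zip(range(1, n + 1), signs))
        counts[t] = counts.get(t, 0) + 1
    total = 2 ** n
    return {t: c / total for t, c in sorted(counts.items())}

dist = tplus_distribution(4)
mean = sum(t * p for t, p in dist.items())                # n(n+1)/4 = 5
var = sum(t * t * p for t, p in dist.items()) - mean ** 2 # n(n+1)(2n+1)/24 = 7.5
```

Enumeration is only practical for small n; the recursion given later handles larger n.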


Since T+ is determined by which ranks receive positive signs, its null distribution follows by simple counting. If u(t) denotes the number of subsets of the ranks {1, 2, ..., n} whose sum is t (with u(0) = 1 for the empty set), then

f_{T+}(t) = P(T+ = t | H0) = u(t)/2^n    for t = 0, 1, ..., n(n+1)/2    (7.7)

Replacing each Z_i by 1 − Z_i shows that T+ and n(n+1)/2 − T+ have the same null distribution, so the distribution of T+ is symmetric about its mean n(n+1)/4.

For example, with n = 4 there are 2^4 = 16 equally likely assignments of signs to the ranks 1, 2, 3, 4. Enumerating the rank subsets with the larger sums gives Table 7.1; the lower half of the distribution follows by symmetry. The complete null distribution is

f_{T+}(t) = 1/16    for t = 0, 1, 2, 8, 9, 10
f_{T+}(t) = 2/16    for t = 3, 4, 5, 6, 7
f_{T+}(t) = 0       otherwise

Table 7.1 Enumeration for the null distribution of T+ when n = 4

t     Rank sets with sum t      u(t)
10    {1,2,3,4}                 1
9     {2,3,4}                   1
8     {1,3,4}                   1
7     {1,2,4}, {3,4}            2
6     {1,2,3}, {2,4}            2
5     {1,4}, {2,3}              2


Since T+ + T− = n(n+1)/2, we have

T+ − n(n+1)/4 = n(n+1)/2 − T− − n(n+1)/4 = n(n+1)/4 − T−

so that T+ − n(n+1)/4 and n(n+1)/4 − T− are identical random variables, and T+ and T− have the same null distribution. The appropriate rejection regions for tests of H0: M = M0 are as follows: for the alternative M > M0, reject when T− ≤ c_α; for M < M0, reject when T+ ≤ c_α; and for the two-sided alternative M ≠ M0, reject when T+ ≤ c_{α/2} or T− ≤ c_{α/2}.

To illustrate, suppose n = 8 and α = 0.05. The sample space contains 2^8 = 256 equally likely sign assignments, and 256(0.05) = 12.80, so the one-sided rejection region can contain at most 12 sample points. Enumerating the rank subsets with small sums (Table 7.2) gives P(T+ ≤ 6) = 14/256 = 0.055, which exceeds 0.05, while P(T+ ≤ 5) = 10/256 = 0.039. Hence for M < M0 the rejection region is T+ ≤ 5 with exact level 0.039. For the two-sided alternative each tail is limited to 0.025; since P(T+ ≤ 3) = 5/256 = 0.0195, the rejection region is T+ ≤ 3 or T− ≤ 3, with overall exact level 2(0.0195) = 0.039.

Table 7.2 Enumeration for T+ ≤ 6 when n = 8

t    Rank sets with sum t            u(t)
0    (empty set)                     1
1    {1}                             1
2    {2}                             1
3    {3}, {1,2}                      2
4    {4}, {1,3}                      2
5    {5}, {1,4}, {2,3}               3
6    {6}, {1,5}, {2,4}, {1,2,3}      4


For larger n, complete enumeration quickly becomes tedious, but the null distribution can be built up recursively. Write T+_n for the statistic computed from n observations. Since T+_n = T+_{n−1} + nZ_n, where Z_n equals 1 or 0 each with probability 1/2 independently of T+_{n−1}, the counts satisfy

u_n(t) = u_{n−1}(t) + u_{n−1}(t − n)

and conditioning on Z_n gives the corresponding recursion for the probabilities,

f_n(t) = P(T+_n = t) = [f_{n−1}(t − n) + f_{n−1}(t)]/2    (7.8)

where f_{n−1}(s) = 0 for s < 0 and the recursion starts from f_0(0) = 1.
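A recursion of this form translates directly into a dynamic program over the support 0, 1, ..., n(n+1)/2. A sketch in Python (function name is ours):

```python
def tplus_pmf(n):
    """Null pmf of the Wilcoxon signed-rank statistic T+ built up via
    f_k(t) = 0.5*f_{k-1}(t) + 0.5*f_{k-1}(t - k), starting from f_0."""
    f = [1.0]  # zero observations: T+ = 0 with probability 1
    for k in range(1, n + 1):
        g = [0.0] * (len(f) + k)
        for t, p in enumerate(f):
            g[t] += 0.5 * p        # Z_k = 0: rank k not added to T+
            g[t + k] += 0.5 * p    # Z_k = 1: rank k added to T+
        f = g
    return f

pmf = tplus_pmf(8)
p_le_5 = sum(pmf[:6])   # = 10/256 ≈ 0.039 for n = 8
```

Each pass extends the support by k, so the full table for n observations costs O(n³) additions.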

Zero differences have probability zero under a continuous model, but they do occur in practice. One treatment finds the null distribution of T+ by conditioning on Z, the number of zero differences:

P(T+ = t) = Σ_z P(Z = z) P(T+ = t | Z = z)

Tables of the null distribution that incorporate zeros have been published for n up to 50.

For n larger than about 15, the null distribution of T+ is well approximated by the normal distribution with mean and variance given by (7.1) and (7.2). The approximate test statistic is

Z = [T+ − n(n+1)/4] / sqrt[n(n+1)(2n+1)/24] = [4T+ − n(n+1)] / sqrt[2n(n+1)(2n+1)/3]    (7.9)


! 1 1 0 7 9

0 15 0 5

S

O

S 5 4

¼ 6¼ S 5 6

P 1981

7 9

S

þ1 þ2 þ þ ðþ1Þ2

þðþ1Þ2 !2¼ 2 þ ðþ1Þ þðþ1Þ

2

4" #

¼1 ðþ Þ2

¼ 2 þ ðþ1Þ þðþ1Þð2 þ1Þ6

ðþ1Þð2 þ1Þ6 Àðþ1Þ

2

4 ¼ðþ1ÞðÀ1Þ12 ð7 10Þ


When there are ties among the nonzero absolute differences and midranks are used, the corrected version of the null variance in (7.2) is

var(T+ | H0) = n(n+1)(2n+1)/24 − Σ t(t² − 1)/48    (7.11)

where the sum extends over all groups of t tied absolute differences.

The Wilcoxon signed-rank test compares favorably with the t test: its asymptotic relative efficiency with respect to the t test is 3/π = 0.955 for normal distributions and never falls below 0.864 for any continuous symmetric distribution.

Because T+ + T− = n(n+1)/2, tests based on T− are equivalent to tests based on T+; in particular

P(T− = k) = P(T+ = n(n+1)/2 − k)    (7.12)

so that left-tail critical values for one statistic are right-tail critical values for the other.

2 À þ ¼ ! ð7 12Þ W

C

S 5 4 M B M

¼13 ¼0 05 0 ¼0 5 1 ¼0 6256

S

þ 70

¼0 047 1000 ¼13 0 6256 1

þ 70

1000 1000


M CR 1 2 1000

1 À

R

S ¼

¼

ð0 5Þ

¼

¼

ð1

À Þ À 1

À 1À ¼ ð 0 1Þ


Before collecting data we may want to know how many observations are needed so that a level-α test of H0: M = M0 has power 1 − β against a specified alternative. Noether (1987) gave simple approximate solutions based on the normal approximation. For the sign test with a one-sided alternative, the required sample size satisfies

4n(p − 0.5)² = (z_α + z_β)²    (7.13)

where p = P(D > M0) under the alternative. For the Wilcoxon signed-rank test the corresponding equation is

[n(p1 − 0.5) + (n(n − 1)/2)(p2 − 0.5)]² / [n(n + 1)(2n + 1)/24] = (z_α + z_β)²    (7.14)

where p1 = P(D > M0) and p2 = P(D_i + D_j > 2M0). Equation (7.14) has no closed-form solution for n and must be solved numerically. Table 7.3 shows such a solution carried out in EXCEL with α = β = 0.05 one-sided, so that (z_α + z_β)² = (1.645 + 1.645)² = 10.82, for M0 = 0 and alternatives M1 = 0.1, 0.2, ..., 1.0; the Error column verifies that the reported N satisfies (7.14).


Table 7.3 Calculations for sample size determination in EXCEL

M0   M1    p1        p2        N         p1 − 0.5   0.5N(N−1)(p2 − 0.5)   N(N+1)(2N+1)/24   Error
0    0.1   0.539828  0.57926   575.7201  0.039828   13112.63915           15943497          9.83E-11
0    0.2   0.57926   0.655422  150.7868  0.07926    1755.166447           288547.2          −9.6E-09
0    0.3   0.617911  0.725747  72.17494  0.117911   579.8362509           31985.43          −2.1E-07
0    0.4   0.655422  0.788145  44.75747  0.155422   282.1618931           7723.9            −2.7E-08
0    0.5   0.691462  0.841345  32.17465  0.191462   171.190114            2906.364          −1.7E-07
0    0.6   0.725747  0.88493   25.45262  0.225747   119.7870853           1456.134          −4.6E-08
0    0.7   0.758036  0.919243  21.51276  0.258036   92.50310853           888.4195          −5E-08
0    0.8   0.788145  0.945201  19.06405  0.288145   76.65779392           623.6078          −2.1E-08
0    0.9   0.81594   0.96407   17.48421  0.31594    66.87552029           484.3471          −6.1E-09
0    1.0   0.841345  0.97725   16.44019  0.341345   60.57245533           404.7574          −2.6E-09
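The Error column can be reproduced by plugging a row of the table back into the left side of (7.14) and comparing with (z_α + z_β)². A sketch in Python (function name is ours; z values come from the standard normal inverse cdf):

```python
from statistics import NormalDist

def lhs_714(n, p1, p2):
    """Left side of the signed-rank sample-size equation (7.14)."""
    num = (n * (p1 - 0.5) + 0.5 * n * (n - 1) * (p2 - 0.5)) ** 2
    den = n * (n + 1) * (2 * n + 1) / 24
    return num / den

z = NormalDist().inv_cdf(0.95)      # alpha = beta = 0.05, one-sided
target = (2 * z) ** 2               # (z_alpha + z_beta)^2 ≈ 10.82
check = lhs_714(575.7201, 0.539828, 0.57926)   # first row of Table 7.3
```

The value `check` agrees with `target` to the rounding in the tabled p's, which is what the tiny Error entries record.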


:

À1 6 13 4 2 3 5 9 ð7 16Þ ¼8 0 05

7 2 2 ¼3

¼ ð þ 3Þ þ ð À 3Þ ¼10 256 ¼0 039

7 4 þ À À

4

7 2 ¼3

¼0 039 2 S

S

S À À1 1 À¼3 5 1 ¼1 5

þ 4

1 2 C 2 9 À9 96 1

7 - - p

À4 À1 1 À1 5 À9 1 À8 9 À8 95

À1 À5 À2 1 À2 5 À10 1 À9 9 À9 956 2 4 9 4 5 À3 1 À2 9 À2 95

13 9 11 9 11 5 3 9 4 1 4 054 0 2 9 2 5 À5 1 À4 9 À4 952 À2 0 9 0 5 À7 1 À6 9 À6 953

À1 1 9 1 5

À6 1

À5 9

À5 95

5 1 3 9 3 5 À4 1 À3 9 À3 959 5 7 9 7 5 À0 1 0 1 0 05

þ À 3 3 5 3 5 5

5


1 5 9

0 039

[ 1967

57–58] þ 2 À 2

ð À 0 Þ ¼ 0 þ À R 5 1 V 1 V 2 V

ðV Þ ¼

¼1 ðV ÀV Þ ¼ 6¼

ðV ÀV Þ þ1

ðÞ ¼

1 00 0

& 2

2

2 þ ¼ ð þ1Þ2

S 7 1

þ ¼

¼1 ð À 0 Þ

¼

¼1 þ

¼1 6¼ ð À 0 À À 0 Þ ð7 17Þ

þ :


1 À 0 1 þ À 0 02 À 0 À 0 6¼ 1 þ

À 0 À 0 À 0 0 À 0 À 0 À 0 0

À 0 0 þ 2 0 ð þ Þ2 0 B À 0 0 À 0 0 ð þ Þ2 0

C ð þ Þ2 0 1 þ S

ð þ Þ2 0 1 À ð þ1Þ2 ð þ Þ2

W 0 0 ð þ1Þ2

2 þ1Þ 100 ð1 À Þ

7 16 ¼8

À1 2 3 4 5 6 9 13 36 7 5 ¼0 039 2 ¼3 S

1 5 9 0 1 5 9 ¼1À2 0 039 ¼

0 922

7 W v (7 )

À1 0 0 5 1 0 1 5 2 0 2 5 4 0 6 02 0 2 5 3 0 3 5 4 0 5 5 7 53 0 3 5 4 0 4 5 6 0 8 04 0 4 5 5 0 6 5 8 5

5 0 5 5 7 0 9 06 0 7 5 9 59 0 11 0

13 0


½ ð1Þ ð Þ ð1Þ ð Þ ð ð1Þþ ð ÞÞ2

ð þ1Þ2 ð þ Þ2 1 V

ð 2 þ1Þ 7 2

-

ð 1 1Þð 2 2Þ ð Þ

1 À 1 2 À 2 À

7


þ 15 þ ð þ1Þ4 þ ð þ1Þ4

15

:

7 11

ð 2 þ1Þ 2 1 À2

2 S

ð2

þ1

Þ 2

ð þ1Þ2 W ¼ ð þ Þ2 1

W ðÞ W ½ ð þ1Þ2Àþ1

1 À2 > 15

¼ ð þ1Þ

4 þ0 5 À 2 ffiffiffiffiffiffiffiffiffi ð þ1Þð2 þ1Þ

24

0 þ ð þ1Þ4 þ0 5 þ ffiffiffiffiffiffiffiffiffiffiffið þ1Þð2 þ1Þ24 1 À À0 5 À ð þ1Þ4

ffiffiffiffiffiffiffi ð þ1Þð2 þ1Þ24 " #

M < M0 þ

ð

þ1

Þ4 À0 5 À ffiffiffiffiffiffiffiffiffiffiffið

þ1

Þð2

þ1

Þ24 þ0 5

À

ð

þ1

Þ4

ffiffiffiffiffiffiffi ð þ1Þð2 þ1Þ24 " # 6¼0 B 2 2


¼ À À

p 7

?

B

¼B 0 ¼0 1 0

1 51 2 45 82 46 5 41 33 24 1 15 84 10 2 11 15 65 3 58 56 92 1 70 37 30 3 31 68 49 2 35 4

ð Þ1 5 4 5 4 42 5 2 5 2 33 8 3 8 3 64

À0 9 0 9 1

5 6 8 6 8 56 21 8 21 8 87 À1 3 1 3 28 13 8 13 8 7


M B S X C

S S PR C

VR

S S þ À ð þ1Þ4 ¼33 À18 ¼15 S

0 0391 2 0 1955

0 05

p 7 7 1 90% B

¼6 ¼0 047 1 À2ð0 047Þ ¼0 906 0 047

¼3 90 6%


6 ð7Þ2 ¼21 ð þ Þ2

S À1 0 5 5 90 6%

À1 0 5 5 B

M B S X C

M B S X C

93 8%

À2 0 À1 0 1 0 3 0 4 0 8 0

À1 5 0 0 2 0 3 5 6 0

À0 5 1 0 2 5 5 50 5 1 5 4 51 0 3 53 0


0 ¼ ð1 þ0 01 Þ

¼ À ð1 þ0 01 Þ

B

½ð

À

Þ 0 ¼0 5

[ 1949 , ]

S ’ C 1

R 2 ¼0 637 R

3 ¼0 955 C 13 R

R 0 864

1 3 R 2 3

1 3 4 3 R

S

R 1 50 1 09


5 1 ð Þ

: ðÞ 5 2

ðÞ ¼0 01 2 ¼01 0

(

V 4 14 ¼ À2 þ ffiffiffi

3

ðÞ 0 10 0 ¼2 1 2

:

À3 À6 1 9 4 10 12

ðÞ

ðÞ :

0 10 0 ¼2 1 6¼2

¼ À

ðÞ

ð Þ þ À

S

¼0 01 0 05 0 10 :

¼12 ¼15

7 10 1

0 ¼0 1 0 2 8 P P

126 131 153 125 119 102 116 163 120 126 152 129 102 105 100 175


D

8 P þ À À 1 2

1 ð þ Þ

ð Þ ¼1 0

À1 0&L 1 2

D

¼ 0 0& þ ¼ S þ

’ ’ þ À À

’ C 8 S þ

1986 10 1

P 5 10 B À

18 0 262 0 202 18 0 295 0 227 19–20 0 216 0 191

18 0 287 0 209M 18–19 0 277 0 299M 18–20 0 223 0 151M 18 0 512 0 471

19 0 237 0 151 18–19 0 348 0 336

18 0 342 0 307


[ M DD M DD ]

B À 25–29

2 C

M 1986

5

3 14

S 5 13

R

25À29

0 060 À0 025 0 068 À0 023 0 025 0 004

0 078 À0 008

M À0 022 0 061M 0 072 0 015M 0 041 À0 035

0 086 À0 016 0 012 À0 061

0 035 À0 051

20 20 34 19B 21 18 28 13C 23 10 J 20 21D 26 16 K 29 12

32 11 L 22 15 27 20 M 30 14 38 20 25 17


L

0 87

P 4 20 2 0

15 7 2 1

0 95

S

7 4 3 0 ¼0 50 1 0 50

S S

8 7 1 B 0 90

69 12 14

:

C 2 C 251C 63 1220

S 60 461B 3 C 300

40 C 409

1 10 6 02 À8 7 À73 À6 8 54 À2 9 À85 15


D ?

L

D

D

10 ’

D

D ?

0 90

D

L 3 15

1 1 38 1 42 8 2 63 2 692 9 69 10 37 9 2 44 2 683 0 39 0 39 10 0 56 0 534 1 42 1 46 11 0 69 0 725 0 54 0 55 12 0 71 0 726 5 94 6 15 13 0 95 0 907 0 59 0 61 14 0 55 0 52

O

1 41 372 42 393 48 314 38 395 38 346 45 477 21 198 28 309 29 25

10 14 8


0 05 0 50

0 958 24 P 5 27 0 90

L ðÞ 5 :

ð ð1Þ 0 5 ð5ÞÞð ð1Þ 0 25 ð3ÞÞð ð4Þ 0 80 ð5ÞÞ

ð ðÞ ðÀþ1Þ 2Þ 100 ð1 À Þ

1 À ¼1 À2 À1

À1 Z 0 5

0 Àð1 À ÞÀ1

(1) (n) X 0.50

:

ðÞ ð ð1Þ 0 50 ðÞÞ 0 99

ðÞ ½ ð ðÞÞ À ð ð1ÞÞ 0 5 0 95

D 1 À

D 1 À


- p

C 5


1 2 1 2

0

ð

Þ ¼

ð

Þ

P

S ’

ð Þ ¼ ð Þ ¼ ð ÀÞ ¼ ð ÀÞ 6¼0

þ À ¼0 0

0


same shape and the same variance, and the amount of the shift θ must be equal to the difference between the population means, μ_Y − μ_X, the population medians, M_Y − M_X, and in fact the difference between any two respective location parameters or quantiles of the same order.

Another assumption about the form of the underlying populations is called the scale model, which assumes that the X and Y populations are the same except possibly for a positive scale factor θ which is not equal to one. The scale model can be written as

F_Y(x) = P(Y ≤ x) = P(X ≤ θx) = F_X(θx)    for all x and θ > 0, θ ≠ 1

This means that X/θ and Y have the same distribution for any positive θ, or that X is distributed as θY. Also, the variance of X is θ² times the variance of Y, and the mean of X is θ times the mean of Y.

A more general assumption about the form of the underlying populations is called a location-scale model. This model can be written as

P(Y − μ_Y ≤ x) = P(X − μ_X ≤ θx)    for all x

which states that (X − μ_X)/θ and Y − μ_Y are identically distributed (or similarly in terms of M_Y and M_X). Thus the location-scale model incorporates properties of both the location and the scale models. Now the means of X − μ_X and Y − μ_Y are both zero, and the variance of X − μ_X is θ² times the variance of Y − μ_Y.

Regardless of the model assumed, the general two-sample problem is perhaps the most frequently discussed problem in nonparametric statistics. The null hypothesis is almost always formulated as identical populations with the common distribution completely unspecified except for the assumption that it is a continuous distribution function. Thus under the null case, the two random samples can be considered a single random sample of size N = m + n drawn from the common, continuous, but unspecified population. Then the combined ordered configuration of the m X and n Y random variables in the sample is one of the (m + n choose m) possible equally likely arrangements. For example, suppose we have two independent random samples, m = 3 X's and n = 2 Y's. Under the null hypothesis that the X's and the Y's are identically distributed, each of the (5 choose 3) = 10 possible arrangements of the combined sample shown below is equally likely.

1: XXXYY    2: XXYXY    3: YXYXX    4: XXYYX    5: XYXXY
6: XYXYX    7: YXXXY    8: YXXYX    9: XYYXX    10: YYXXX
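The count (5 choose 3) = 10 and the list itself can be generated directly; a minimal Python sketch (function name is ours):

```python
from itertools import combinations

def arrangements(m, n):
    """All distinct orderings of m X's and n Y's.  Under H0 each is
    equally likely, with probability 1 / C(m + n, m)."""
    result = []
    for xpos in combinations(range(m + n), m):
        result.append("".join("X" if i in xpos else "Y"
                              for i in range(m + n)))
    return result

arrs = arrangements(3, 2)   # 10 arrangements, each with probability 1/10
```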

THE GENERAL TWO-SAMPLE PROBLEM 233


’ ’

1 10 ’ ’ M

0

ð Þ 6¼ ð Þ

1 ð Þ ð Þ ð Þ ð Þ

S 1 1 1 2

S S ð Þ ð Þ

S

ð Þ ¼ ð À Þ 6¼0


þ 0

ð0

Þ S

ð Þ ¼ ð Þ 6¼1

1 ð 1Þ 1

L ð Þ ¼ ½ ð Þ

1 ð 1Þ

K S

M C 7 8

W -W W

L 1 2

1 2


¼4 ¼5

0 ð Þ ¼ ð Þ

þ ¼

C 3 6 ’

’ ’

ð Þ 6¼ ð Þ

R

S W W 1940

ð 0Þ


As an illustration, eight observations were generated from a chi-square distribution with 18 degrees of freedom and eight from a standard normal distribution. The chi-square values

4.90  7.25  8.04  14.10  18.30  21.21  23.10  28.12

were standardized by subtracting the mean 18 and dividing by the standard deviation sqrt(2(18)) = sqrt(36) = 6, giving

−2.18  −1.79  −1.66  −0.65  0.05  0.54  0.85  1.69

The standard normal sample values are

−1.91  −1.22  −0.96  −0.72  0.14  0.82  1.45  1.86

and the combined ordered sample is

−2.18  −1.91  −1.79  −1.66  −1.22  −0.96  −0.72  −0.65  0.05  0.14  0.54  0.82  0.85  1.45  1.69  1.86


Let X denote the standardized chi-square sample data and let Y denote the normal sample data. For the solution using the Wald-Wolfowitz runs test, we simply count the number of runs in the ordered combined configuration X, Y, X, X, Y, Y, Y, X, X, Y, X, Y, X, Y, X, Y as R = 12. The null distribution of R shows that the P value, the left-tail probability P(R ≤ 12) for m = 8, n = 8, far exceeds 0.05, and therefore we do not reject the null hypothesis of equal distributions. Using the normal approximation of Section 6.2 with a continuity correction, we get Z = 1.81 with P = 0.965, and z = 1.55 with P = 0.940 without a continuity correction.

The STATXACT solution to this example using the runs test is shown below. Note that the exact P value is 0.9683. This can be verified from the null distribution of R, since P(R ≤ 12) = 1 − P(R ≥ 13) = 1 − 0.0317 = 0.9683. Note that their asymptotic P value is not the same as ours using the normal approximation of Section 6.2.
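The run count and the normal approximation can be verified numerically; a sketch in Python (function names are ours, and the left-tail P value uses the continuity correction):

```python
from math import erf, sqrt

def count_runs(seq):
    """Number of runs: one plus the number of adjacent unequal pairs."""
    return 1 + sum(a != b for a, b in zip(seq, seq[1:]))

config = "XYXXYYYXXYXYXYXY"      # combined ordered sample of the example
R = count_runs(config)           # 12
m, n = config.count("X"), config.count("Y")
mean = 2 * m * n / (m + n) + 1                                   # E(R) = 9
var = 2 * m * n * (2 * m * n - m - n) / ((m + n) ** 2 * (m + n - 1))
z = (R + 0.5 - mean) / sqrt(var)          # continuity-corrected statistic
p_left = 0.5 * (1 + erf(z / sqrt(2)))     # approximate P(R <= 12)
```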

THE PROBLEM OF TIES

Ideally, no ties should occur because of the assumption of continuouspopulations. Ties do not present a problem in counting the number of runs unless the tie is across samples; i.e., two or more observationsfrom different samples have exactly the same magnitude. For aconservative test, we can break all ties in all possible ways and com-pute the total number of runs for each resolution of all ties. The actual R used as the value of the test statistic is the largest computed value,

23 CHAPTER 6


since that is the one least likely to lead to rejection of H0. For each group of ties across samples, where there are s x's and t y's of equal magnitude for some s ≥ 1, t ≥ 1, there are (s + t choose s) ways to break the ties. Thus if there are k groups of ties, the total number of values of R to be computed is the product ∏_{i=1}^{k} (s_i + t_i choose s_i).
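The number of tie resolutions to examine grows quickly with the number and size of tie groups; it can be computed as follows (a sketch; the function name is ours):

```python
from math import comb

def tie_resolutions(groups):
    """groups: (s, t) pairs, one per value tied across samples, with
    s x's and t y's tied there.  Returns the product of C(s + t, s)."""
    total = 1
    for s, t in groups:
        total *= comb(s + t, s)
    return total
```

For example, one x tied with one y gives 2 orderings; a group of (2 x's, 1 y) together with a group of (1 x, 2 y's) gives 3 × 3 = 9.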

DISCUSSION

The Wald-Wolfowitz runs test is extremely general and is consistent against all types of differences in populations [Wald and Wolfowitz (1940)]. The very generality of the test weakens its performance against specific alternatives. Asymptotic power can be evaluated using the normal distribution with appropriate moments under the alternative, which are given in Wolfowitz (1949). Since power, whether exact or asymptotic, can be calculated only for completely specified alternatives, numerical power comparisons should not be the only criteria for this test. Its primary usefulness is in preliminary analyses of data when no particular form of alternative is yet formulated. Then, if the null hypothesis is rejected, further studies can be made with other tests in an attempt to classify the type of difference between populations.

6.3 THE KOLMOGOROV-SMIRNOV TWO-SAMPLE TEST

The Kolmogorov-Smirnov statistic is a one-sample test that can be adapted to the two-sample problem. Recall from Chapter 4 that as a goodness-of-fit criterion, this test compared the empirical distribution function of a random sample with a hypothesized cumulative distribution. In the two-sample case, the comparison is made between the empirical distribution functions of the two samples.

The order statistics corresponding to two random samples of sizes m and n from continuous populations F_X and F_Y are

X_(1), X_(2), ..., X_(m)    and    Y_(1), Y_(2), ..., Y_(n)

Their respective empirical distribution functions, denoted by S_m(x) and S_n(x), are defined as before:

S_m(x) = 0      if x < X_(1)
       = k/m    if X_(k) ≤ x < X_(k+1) for k = 1, 2, ..., m − 1
       = 1      if x ≥ X_(m)


and

S_n(x) = 0      if x < Y_(1)
       = k/n    if Y_(k) ≤ x < Y_(k+1) for k = 1, 2, ..., n − 1
       = 1      if x ≥ Y_(n)

In a combined ordered arrangement of the m + n sample observations, S_m(x) and S_n(x) are the respective proportions of X and Y observations which do not exceed the specified value x.

If the null hypothesis

H0: F_Y(x) = F_X(x)    for all x

is true, the population distributions are identical and we have two samples from the same population. The empirical distribution functions for the X and Y samples are reasonable estimates of their respective population cdfs. Therefore, allowing for sampling variation, there should be reasonable agreement between the two empirical distributions if indeed H0 is true; otherwise the data suggest that H0 is not true and therefore should be rejected. This is the intuitive logic behind most two-sample tests, and the problem is to define what constitutes reasonable agreement between the two empirical cdfs. In other words, how close do the two empirical cdfs have to be so that they could be viewed as not significantly different, taking account of the sampling variability? Note that this approach necessarily requires a definition of closeness. The two-sided Kolmogorov-Smirnov two-sample test criterion, denoted by D_{m,n}, is based on the maximum absolute difference between the two empirical distributions:

D_{m,n} = max_x |S_m(x) − S_n(x)|
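Because the empirical cdfs are step functions, D_{m,n} needs to be evaluated only at the observed sample points. A minimal Python sketch (function name is ours):

```python
from bisect import bisect_right

def ks_two_sample(x, y):
    """D_{m,n}: maximum over the pooled sample of |S_m(v) - S_n(v)|."""
    x, y = sorted(x), sorted(y)
    m, n = len(x), len(y)
    d = 0.0
    for v in x + y:
        sm = bisect_right(x, v) / m   # proportion of x's <= v
        sn = bisect_right(y, v) / n   # proportion of y's <= v
        d = max(d, abs(sm - sn))
    return d
```

Completely separated samples give D = 1, identical samples give D = 0.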

Since here only the magnitudes, and not the directions, of the deviations are considered, D_{m,n} is appropriate for a general two-sided alternative

H_A: F_Y(x) ≠ F_X(x)    for some x

and the rejection region is in the upper tail, defined by

D_{m,n} ≥ c_α

where

P(D_{m,n} ≥ c_α | H0) ≤ α


Under H0, each of the C(m+n, m) distinguishable arrangements of the m X and n Y observations in the combined ordered sample is equally likely, so exact P values can be found by enumeration. Each arrangement can be represented as a path on a grid, starting at the origin, moving one unit to the right for each X observation and one unit up for each Y observation, and ending at the point (m, n). At any intermediate point (u, v) on the path, the values of the two empirical distribution functions are S_m(x) = u/m and S_n(x) = v/n, so that

D_{m,n} = max_{(u,v)} |u/m − v/n|

where the maximum is taken over all points (u, v) on the path. The exact P value is then the proportion of the C(m+n, m) equally likely paths whose maximum deviation is at least as large as the observed value. Tables and discussions of the exact null distribution are given by Massey (1951, 1952), Drion (1952), Gnedenko (1954), and Korolyuk (1961), among others. For all but the smallest sample sizes direct enumeration is tedious, but the required counts can be generated systematically.

Fig. 3.1 Grid representation of a sample arrangement as a path.


In terms of the path representation,

P(D_{m,n} ≥ d | H0) = 1 − P(D_{m,n} < d | H0) = 1 − A(m, n)/C(m+n, m)

where A(u, v) denotes the number of paths from the origin to the point (u, v) that lie entirely within the band |u/m − v/n| < d. The counts A(u, v) can be built up recursively from

A(u, v) = A(u−1, v) + A(u, v−1)

with initial conditions

A(0, v) = A(u, 0) = 1

for points within the band. The count A(m, n) is found by entering at each admissible point of the grid the sum of the entries at the two neighboring points directly below and directly to the left, for all points lying within the
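The recurrence for A(u, v) translates directly into a small dynamic program. A sketch (function name ours) that returns P(D_{m,n} < d | H0), counting only paths that stay strictly inside the band at every lattice point:

```python
from math import comb

def p_below(m, n, d):
    """P(D_{m,n} < d | H0) = A(m, n) / C(m+n, m), by the path-counting recurrence."""
    A = [[0] * (n + 1) for _ in range(m + 1)]
    for u in range(m + 1):
        for v in range(n + 1):
            if abs(u / m - v / n) >= d:       # point outside the band: no admissible path
                continue
            if u == 0 and v == 0:
                A[u][v] = 1                   # A(0, 0) = 1
            else:
                # A(u, v) = A(u-1, v) + A(u, v-1), restricted to in-band points
                A[u][v] = (A[u - 1][v] if u else 0) + (A[u][v - 1] if v else 0)
    return A[m][n] / comb(m + n, m)

# For m = 3, n = 4: 27 of the 35 equally likely paths keep |u/3 - v/4| < 3/4
print(p_below(3, 4, 0.75))
```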


boundaries. This procedure is shown in Figure 3.2 for the arrangement xyyxxyy, where m = 3 and n = 4. The entry at the terminal point is A(3, 4), and dividing by C(7, 3) = 35 gives P(D_{3,4} < d | H0); the P value follows by subtraction from 1.

For the asymptotic null distribution, that is, as m and n approach infinity in such a way that m/n remains constant, Smirnov (1939) proved the result

lim_{m,n→∞} P[ (mn/(m+n))^{1/2} D_{m,n} ≤ d ] = L(d)

where

L(d) = 1 − 2 ∑_{i=1}^{∞} (−1)^{i−1} e^{−2i²d²}

Note that the asymptotic distribution of (mn/(m+n))^{1/2} D_{m,n} is exactly the same as the asymptotic distribution of √N D_N for the one-sample Kolmogorov-Smirnov statistic in Section 4.3. This is not surprising, since we know from the Glivenko-Cantelli theorem that as n → ∞, S_n(x) converges to F_Y(x), which can be relabeled F_X(x) as in that result. Then the only difference here is in the normalizing factor (mn/(m+n))^{1/2}, which replaces √N.

Fig. 3.2 Evaluation of A(u, v) for xyyxxyy.
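The series L(d) converges very quickly, so an asymptotic P value needs only a few terms. A minimal sketch (function names ours):

```python
from math import exp

def smirnov_L(d, terms=100):
    """L(d) = 1 - 2 * sum_{i>=1} (-1)^(i-1) * exp(-2 * i^2 * d^2)."""
    s = sum((-1) ** (i - 1) * exp(-2 * i * i * d * d) for i in range(1, terms + 1))
    return 1 - 2 * s

def ks_asymptotic_pvalue(d_mn, m, n):
    """Approximate P(D_{m,n} >= d_mn | H0) for large m and n."""
    z = (m * n / (m + n)) ** 0.5 * d_mn
    return 1 - smirnov_L(z)

# For m = n = 8 and observed D = 2/8, sqrt(mn/(m+n)) * D = 0.5
print(round(ks_asymptotic_pvalue(0.25, 8, 8), 4))
```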

THE GENERAL TWO-SAMPLE PROBLEM 243


If the alternative of interest is one-sided, say that the Y's are stochastically larger than the X's,

H1: F_Y(x) ≤ F_X(x) for all x, with strict inequality for some x

an appropriate criterion is the one-sided Kolmogorov-Smirnov statistic based on the directed deviations

D+_{m,n} = max_x [S_m(x) − S_n(x)]

with rejection region in the upper tail, D+_{m,n} ≥ c_α. The exact null distribution of D+_{m,n} can be found by the same path-counting argument used for D_{m,n}, with only the one relevant boundary line imposed. The asymptotic null distribution of (mn/(m+n))^{1/2} D+_{m,n} again agrees with that of the corresponding one-sample statistic in Section 4.3, namely

lim_{m,n→∞} P[ (mn/(m+n))^{1/2} D+_{m,n} ≤ d ] = 1 − e^{−2d²}

It follows that 4(D+_{m,n})² mn/(m+n) is asymptotically distributed as chi square with 2 degrees of freedom, a result that can be used to approximate P values (Goodman, 1954).


The Kolmogorov-Smirnov two-sample tests are exact and distribution free for any sample sizes, and they are consistent against their respective alternatives. Because they are sensitive to all types of differences between the two distribution functions, however, they cannot be expected to be as powerful against a specific alternative, such as a shift in location, as tests designed for that particular alternative.

APPLICATIONS

Example 3.1 Two mutually independent random samples, each of size 8, are given in the table below, arranged in a single combined ordered array. Test the null hypothesis that the two samples come from identical populations against the two-sided alternative.

Solution The differences S_m(x) − S_n(x) at each observed value are shown in the last columns of the table. The largest absolute difference is D_{8,8} = 2/8 = 0.25. Since (mn/(m+n))^{1/2} D = 2(0.25) = 0.5, the asymptotic approximate P value is 1 − L(0.5) = 0.9639; the exact P value reported by STATXACT is 0.9801. With P values this large, the null hypothesis of identical distributions cannot be rejected.


STATXACT (see Mehta and Patel, 1994) computes the exact P value by complete enumeration of the null distribution of D_{m,n}. The calculations of S_m(x) and S_n(x) at each observation in the combined ordered arrangement are shown in the following table.

x        #X ≤ x   S_m(x)   #Y ≤ x   S_n(x)   S_m(x) − S_n(x)   |S_m(x) − S_n(x)|
-2.18       1       1/8       0        0           1/8                1/8
-1.91       1       1/8       1       1/8           0                  0
-1.79       2       2/8       1       1/8          1/8                1/8
-1.66       3       3/8       1       1/8          2/8                2/8
-1.22       3       3/8       2       2/8          1/8                1/8
-0.96       3       3/8       3       3/8           0                  0
-0.72       3       3/8       4       4/8         -1/8                1/8
-0.65       4       4/8       4       4/8           0                  0
 0.05       5       5/8       4       4/8          1/8                1/8
 0.14       5       5/8       5       5/8           0                  0
 0.54       6       6/8       5       5/8          1/8                1/8
 0.82       6       6/8       6       6/8           0                  0
 0.85       7       7/8       6       6/8          1/8                1/8
 1.45       7       7/8       7       7/8           0                  0
 1.69       8       8/8       7       7/8          1/8                1/8
 1.86       8       8/8       8       8/8           0                  0
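The table entries can be checked mechanically. The split of the combined array into the two samples below is inferred from which empirical cdf column steps up at each value; a self-contained check:

```python
# The two samples read off the table (labels inferred from the #X and #Y columns)
x = [-2.18, -1.79, -1.66, -0.65, 0.05, 0.54, 0.85, 1.69]
y = [-1.91, -1.22, -0.96, -0.72, 0.14, 0.82, 1.45, 1.86]

combined = sorted(x + y)
# D = max over observed values of |S_m - S_n|
d = max(abs(sum(t <= v for t in x) / len(x) - sum(t <= v for t in y) / len(y))
        for v in combined)
print(d)
```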


6.4 THE MEDIAN TEST

The Kolmogorov-Smirnov statistics compare the two empirical distribution functions over their entire range and are therefore sensitive to differences of any kind. When the alternative of interest is a difference in location only, a test designed specifically for that alternative should have better power. The oldest and simplest such test is the median test, usually attributed to Mood (1950) and Brown and Mood (1948, 1951); see also Westenberg (1948).

The two samples are combined into a single ordered arrangement of the N = m + n observations, and each observation is classified according to whether or not it falls below the median of the combined sample. If the two populations are identical, the m X observations and the n Y observations should be divided between the two classes in roughly the proportions m/N and n/N. A preponderance of X observations below the combined median suggests that the median of the X population is smaller than the median of the Y population, and vice versa. The test statistic is U, the number of X observations that fall below the combined sample median, and its exact null distribution is the hypergeometric-type distribution displayed in (4.3) below.

The combined sample median d is the observation with rank (N + 1)/2 if N is odd, or any number, say the average, of the two observations


with rank N/2 and N/2 + 1 if N is even. A total of t observations are then less than d, where t = (N − 1)/2 or N/2 according as N is odd or even. Let U denote the number of X observations less than d. If the two samples are drawn from identical continuous populations, the probability distribution of U for fixed t is

f_U(u) = C(m, u) C(n, t − u) / C(m + n, t)    u = max(0, t − n), ..., min(m, t);  t = [N/2]    (4.3)

where [x] denotes the largest integer not exceeding the value x. If the null hypothesis is true, then P(X < d) = P(Y < d) for all d, and in particular the two populations have a common median, which is estimated by d.

Since U/m is an estimate of P(X < d), which is approximately one-half under H0, a test based on the value of U will be most sensitive to differences in location. If U is much larger than m/2, most of the X values are less than most of the Y values. This lends credence to the relation P(X < d) > P(Y < d), that the X's are stochastically smaller than the Y's, so that the median of the X population is smaller than the median of the Y population, or that θ > 0. If U is too small relative to m/2, the opposite conclusion is implied. The appropriate rejection regions and P values for nominal significance level α then are as follows, where c_α and c′_α are, respectively, the largest and smallest integers such that P(U ≤ c_α | H0) ≤ α and P(U ≥ c′_α | H0) ≤ α, c and c′ are two integers c < c′ such that P(U ≤ c | H0) + P(U ≥ c′ | H0) ≤ α, and U_O is the observed value of the median test statistic U.

Alternative                          Rejection region        P value
Y >ST X, θ > 0 or M_Y > M_X          U ≥ c′_α                P(U ≥ U_O)
Y <ST X, θ < 0 or M_Y < M_X          U ≤ c_α                 P(U ≤ U_O)
θ ≠ 0 or M_X ≠ M_Y                   U ≤ c or U ≥ c′         (smaller of the above)

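The hypergeometric form of (4.3) makes exact P values easy to compute. A minimal sketch (function names ours):

```python
from math import comb

def median_test_pmf(m, n, t):
    """Null distribution (4.3) of U = number of X's below the combined median."""
    denom = comb(m + n, t)
    lo, hi = max(0, t - n), min(m, t)
    return {u: comb(m, u) * comb(n, t - u) / denom for u in range(lo, hi + 1)}

def left_tail_p(m, n, t, u_obs):
    """P(U <= u_obs | H0), the P value against the alternative M_Y < M_X."""
    pmf = median_test_pmf(m, n, t)
    return sum(p for u, p in pmf.items() if u <= u_obs)

# m = 4, n = 5: N = 9 observations, t = 4 fall below the combined median
print(left_tail_p(4, 5, 4, 2))
```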


For larger sample sizes the exact distribution (4.3) is cumbersome, and a normal approximation can be used. The exact null mean and variance of U follow from the hypergeometric form of (4.3) as

E(U) = mt/N        var(U) = mnt(N − t) / [N²(N − 1)]    (4.5)

For large samples the factor N − 1 in (4.5) can be replaced by N, so that var(U) is approximately mnt(N − t)/N³, and the standardized statistic

Z = (U − mt/N) / [mnt(N − t)/N³]^{1/2}    (4.6)

is approximately standard normal under H0. A continuity correction of 0.5 improves the approximation. For example, an approximate left-tail P value corresponding to an observed value U_O is

P(U ≤ U_O | H0) ≈ Φ( (U_O + 0.5 − mt/N) / [mnt(N − t)/N³]^{1/2} )    (4.7)

The statistic in (4.6) can be written in an instructive form. With p̂ = t/N = t/(m + n), algebra shows that (4.6) is equivalent to

Z = [ U/m − (t − U)/n ] / [ p̂(1 − p̂)(1/m + 1/n) ]^{1/2}

which is precisely the classical normal-theory statistic for testing the equality of two proportions, here the proportions of X and of Y observations that fall below the combined sample median. The square of this statistic is the usual chi-square statistic for the corresponding 2 × 2 contingency table.

APPLICATIONS

Example 4.1 The median test will be illustrated for the following two samples:

Sample 1 (X):  3  4  9  10
Sample 2 (Y):  1  2  5  7  8


Solution Let samples 1 and 2 denote the X and the Y sample, respectively. Here N = 9 and t = (9 − 1)/2 = 4. The combined sample median is the fifth observation in the combined ordered arrangement, which is 5, and the observed value of U is U_O = 2, the number of X observations less than 5. Suppose the appropriate alternative is M_Y < M_X, so that the P value from (4.3) is in the left tail:

P(U ≤ 2 | H0) = [C(4,0)C(5,4) + C(4,1)C(5,3) + C(4,2)C(5,2)] / C(9,4) = 105/126 = 0.8333

so there is no evidence against the null hypothesis. The normal approximation to this P value with a continuity correction, from (4.7), is

Φ( (2 + 0.5 − 16/9) / [4(5)(4)(5)/9³]^{1/2} ) = Φ(0.975) = 0.8352

in close agreement with the exact result.

The MINITAB solution to this example is based on Mood's median test, the chi-square form of the test described above. The reported chi-square statistic is 0.09, the square of the standardized statistic (4.6) without a continuity correction, with a two-sided P value of 0.7642. Since 0.09 is far smaller than 3.84, the 0.05 critical value of chi square with one degree of freedom, the two-sided test is not significant, in agreement with the exact analysis. MINITAB also reports a 95% confidence interval for the difference of the medians, (−4.26, 8.26), related to the confidence-interval procedure described below.

The STATXACT solution gives both exact and asymptotic results for the median test. It computes the null mean of U as E(U | H0) = 4(4)/9 = 1.78 and reports the exact right-tail probability

P(U ≥ 2 | H0) = 1 − P(U ≤ 1 | H0) = 81/126 = 0.6429

together with an asymptotic P value based on the standardized statistic

Z = (U − mt/N) / { mnt(N − t)/[N²(N − 1)] }^{1/2}

which uses the exact null variance from (4.5). These results again show no evidence against the null hypothesis.


CONFIDENCE INTERVAL PROCEDURE

Assuming the shift model, the median-test procedure can be adapted to yield a confidence-interval estimate for the shift parameter in the following manner. Suppose that the two populations are identical in every way except for their medians. Denote these unknown parameters by M_X and M_Y, respectively, and the difference M_Y − M_X by θ. In the shift model F_Y(x) = F_X(x − θ), the parameter θ represents the difference F_Y^{−1}(p) − F_X^{−1}(p) between any two quantiles (of order p) of the two populations. In the present case, however, we assume that θ = M_Y − M_X, the difference between the two medians (p = 0.5). From the original samples, if θ were known we could form the derived random variables

X_1, X_2, ..., X_m    and    Y_1 − θ, Y_2 − θ, ..., Y_n − θ

and these would constitute samples from identical populations or, equivalently, a single sample of size N = m + n from the common population. Then according to the median-test criterion with significance level α, the null hypothesis of identical distributions would be accepted for these derived samples if U, the number of X observations less than the median of the combined sample of derived observations, lies in the interval c + 1 ≤ U ≤ c′ − 1. Recall that the rejection region against the two-sided alternative θ ≠ 0 is U ≤ c or U ≥ c′. The integers c and c′ are chosen such that

∑_{u=0}^{c} C(m, u) C(n, t − u)/C(m + n, t) + ∑_{u=c′}^{t} C(m, u) C(n, t − u)/C(m + n, t) ≤ α
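For given m, n, and t, the attained level for any choice of c and c′ is just a pair of hypergeometric tail sums, so suitable integers can be found by inspection. A sketch (function name ours):

```python
from math import comb

def two_sided_alpha(m, n, t, c, c_prime):
    """alpha = P(U <= c | H0) + P(U >= c_prime | H0) for the median test."""
    denom = comb(m + n, t)
    lo, hi = max(0, t - n), min(m, t)
    pmf = {u: comb(m, u) * comb(n, t - u) / denom for u in range(lo, hi + 1)}
    return (sum(p for u, p in pmf.items() if u <= c)
            + sum(p for u, p in pmf.items() if u >= c_prime))

# m = 4, n = 5, t = 4: c = 0 and c' = 4 give alpha = 6/126 = 0.0476,
# so c + 1 <= U <= c' - 1 is an acceptance region with confidence 0.95238
print(two_sided_alpha(4, 5, 4, 0, 4))
```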



where t = N/2 or (N − 1)/2 according as N is even or odd. The acceptance region c + 1 ≤ U ≤ c′ − 1 then has null probability at least 1 − α, and the set of θ values for which it holds is a confidence interval for θ with confidence coefficient at least 100(1 − α)%.

The acceptance region can be translated into an explicit interval of θ values by using order statistics. Denote the ordered X and Y observations by X_(1) < X_(2) < ... < X_(m) and Y_(1) < Y_(2) < ... < Y_(n), so that the ordered derived Y observations are Y_(1) − θ, Y_(2) − θ, ..., Y_(n) − θ. The event U ≥ c + 1 holds if and only if X_(c+1) < Y_(t−c) − θ, that is, θ < Y_(t−c) − X_(c+1); similarly, U ≤ c′ − 1 holds if and only if X_(c′) > Y_(t−c′+1) − θ, that is, θ > Y_(t−c′+1) − X_(c′). Hence the null hypothesis is accepted if and only if

Y_(t−c′+1) − X_(c′) < θ < Y_(t−c) − X_(c+1)    (4.8)

and this random interval is a confidence interval for θ with confidence coefficient

1 − α = P(c + 1 ≤ U ≤ c′ − 1 | θ = 0) = P(Y_(t−c′+1) − X_(c′) < 0 < Y_(t−c) − X_(c+1) | θ = 0)

For the data of Example 4.1 we have m = 4, n = 5, and t = 4. Taking c = 0 and c′ = 4 gives, from the null distribution of U,

α = P(U ≤ 0 | H0) + P(U ≥ 4 | H0) = (5 + 1)/126 = 0.04762

so that the realized interval from (4.8),

(Y_(1) − X_(4), Y_(4) − X_(1)) = (1 − 10, 7 − 3) = (−9, 4)

is a confidence interval for θ = M_Y − M_X with exact confidence coefficient 0.95238, that is, a 95.238% confidence interval; the corresponding interval for M_X − M_Y is (−4, 9). This may be compared with the MINITAB output quoted earlier, ''95.0% CI for ETA1-ETA2: (−4.26, 8.26).''

Both the median test and the control median test discussed later are members of the class of precedence tests (see, for example, Chakraborti and van der Laan, 1996). A precedence statistic W_(r) counts the number of observations from one sample that precede the rth order statistic of the other sample, and a precedence test is a test based on such a statistic. For example, for m = 4, n = 5, and t = 4, the null distribution of the number of X observations among the t observations below the combined sample median is

w           0          1          2          3          4
P(W = w)  5/126     40/126     60/126     20/126      1/126
         0.039683   0.31746    0.47619    0.15873    0.007937

Whatever the common continuous population,


the null distribution of W_(r) is distribution free (see the chapter problems), so that a precedence test is a distribution-free test. The median test is a special case of a precedence test since, as seen in the arguments for the confidence-interval procedure (see also, for example, Pratt, 1964), we have U ≥ u if and only if X_(u) < Y_(t−u+1). Several precedence tests have been proposed in the literature, and we will refer to some of them in this chapter.

POWER OF THE MEDIAN TEST

The power of the median test can be obtained as a special case of the power of a precedence test, and we sketch the arguments for the more general case. The power of a precedence test can be obtained from the distribution of the precedence statistic W_(r) under the alternative hypothesis. It can be shown that for i = 0, 1, ..., n

P(W_(r) = i) = C(n, i) [1/B(r, m − r + 1)] ∫₀¹ [F_Y(F_X^{−1}(u))]^i [1 − F_Y(F_X^{−1}(u))]^{n−i} u^{r−1} (1 − u)^{m−r} du

Thus, for a one-sided precedence test with rejection region W_(r) < w_α, the power of the test is

Pw(F_X, F_Y, m, n, α) = ∑_{i=0}^{w_α − 1} P(W_(r) = i)

where w_α is obtained from the size of the test; in other words, w_α is the largest integer such that

∑_{s=0}^{w_α − 1} P(W_(r) = s | H0) ≤ α

To obtain the power of the median test, we just need to substitute suitable values in these expressions.
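Under H0 we have F_Y(F_X^{−1}(u)) = u, so the integrand above reduces to a product of Beta kernels and the integral can be evaluated in closed form, confirming that the null distribution of W_(r) is distribution free. A sketch of this check (function names ours; integer-argument Beta function via factorials):

```python
from math import comb, factorial

def beta_int(a, b):
    """B(a, b) for positive integers a and b."""
    return factorial(a - 1) * factorial(b - 1) / factorial(a + b - 1)

def precedence_null_pmf(m, n, r):
    """P(W_r = i | H0): with F_Y(F_X^{-1}(u)) = u the power integral becomes
    C(n, i) * B(r + i, m - r + 1 + n - i) / B(r, m - r + 1), free of F_X and F_Y."""
    return [comb(n, i) * beta_int(r + i, (m - r + 1) + (n - i)) / beta_int(r, m - r + 1)
            for i in range(n + 1)]

pmf = precedence_null_pmf(4, 5, 2)
print(sum(pmf))   # a valid null distribution must sum to 1
```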

As an alternative development of power, note that under the assumption of the location model F_Y(x) = F_X(x − θ), the null hypothesis is that θ equals zero. From the confidence-interval procedure developed above, we know that for the median test a necessary and sufficient condition for accepting this hypothesis is that the number zero be included in the random interval



(Y_(t−c′+1) − X_(c′), Y_(t−c) − X_(c+1)). The power of the two-sided median test against the alternative θ ≠ 0 is therefore the probability that zero is not covered,

Pw(θ) = P(Y_(t−c′+1) − X_(c′) > 0 | θ) + P(Y_(t−c) − X_(c+1) < 0 | θ)

Each term is a probability concerning the difference of two order statistics, one from each sample. The joint and marginal distributions needed to evaluate such probabilities were derived in Chapter 2, so in principle the power can be computed for any specified F_X, F_Y, and θ. In practice a normal approximation is more convenient: each difference can be treated as approximately normally distributed, with mean and variance obtained from the approximate moments of order statistics,

E(X_(r)) ≈ F_X^{−1}( r/(m + 1) )

var(X_(r)) ≈ [ r(m − r + 1) / ((m + 1)²(m + 2)) ] { f_X[ F_X^{−1}( r/(m + 1) ) ] }^{−2}

and similarly for the Y order statistics.

As with any distribution-free procedure, the price paid for the generality of the median test is some loss of power when stronger assumptions actually hold; for normal populations the asymptotic relative efficiency of the median test with respect to Student's t test is 2/π = 0.637, as shown in Chapter 13.

6.5 THE CONTROL MEDIAN TEST

A test closely related to the median test is the control median test, proposed by Mathisen (1943) and studied later by Gart (1963), Gastwirth (1968), and others. Suppose that the Y sample, called the control sample, has an odd number of observations, n = 2r + 1, so that its median is the order statistic Y_(r+1). The test statistic is V, the number of X observations that are less than Y_(r+1), the median of the control sample. For the one-sided alternative that the Y's are stochastically larger than the X's, or under the shift model H1: θ > 0, most of the X observations should fall below Y_(r+1), and the rejection region should consist of large values of V.


In a similar manner it can be seen that for the one-sided alternative that the Y's are stochastically smaller than the X's, or under the shift model H1: θ < 0, the rejection region should consist of small values of V.

Since the control median test is a precedence test, the general precedence distribution (see the chapter problems, with n = 2r + 1) gives the null distribution of the test statistic as

P(V = j | H0) = C(j + r, j) C(m − j + r, m − j) / C(m + n, m)    (5.1)

for j = 0, 1, ..., m. This can be easily evaluated to find the exact P value corresponding to an observed value V_O or to calculate a critical value for a given level of significance α. Further, the null mean and variance of V are

E(V | H0) = m(r + 1)/(n + 1) = m/2

and

var(V | H0) = m(m + n + 1)(r + 1)(n − r) / [(n + 1)²(n + 2)]

In general, V/m is an estimator of F_X(M_Y) = q, say, and in fact is a consistent estimator. Now, q is equal to 0.5 under H0, is less than 0.5 if and only if M_Y < M_X, and is greater than 0.5 if and only if M_Y > M_X. Hence, like the median test, a test based on V is especially sensitive to differences in the medians.

Gastwirth (1968) showed that when m and n tend to infinity in such a way that m/n tends to a constant λ, and some regularity conditions are satisfied, √m (V/m − q) is asymptotically normally distributed with mean zero and variance

q(1 − q) + (λ/4) [ f_X(M_Y) / f_Y(M_Y) ]²

Under H0 we have q = 0.5, f_X = f_Y, and M_X = M_Y, so that the asymptotic null mean and variance of V are m/2 and m(m + n)/4n, respectively.
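The null distribution (5.1) and its moments are easy to verify numerically. A sketch (function name ours):

```python
from math import comb

def control_median_pmf(m, r):
    """Null distribution (5.1) of V = # X's below the control median Y_(r+1), n = 2r+1."""
    n = 2 * r + 1
    denom = comb(m + n, m)
    return [comb(j + r, j) * comb(m - j + r, m - j) / denom for j in range(m + 1)]

pmf = control_median_pmf(4, 2)   # m = 4, n = 5
print(sum(pmf[:3]))              # left-tail probability P(V <= 2 | H0)
```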


Thus, under H0, the standardized statistic

Z = (V − m/2) / [m(m + n)/4n]^{1/2} = √n (2V − m) / [m(m + n)]^{1/2}    (5.2)

is asymptotically standard normal, and an approximate test of H0 can be based on (5.2): against the alternative θ > 0 (q > 0.5) we reject for large values of Z, against θ < 0 (q < 0.5) for small values of Z, and against θ ≠ 0 (q ≠ 0.5) for large values of |Z|. The efficiency properties of the control median test have been investigated by several authors, including Gastwirth (1968). One interesting property is that the asymptotic efficiency of the


control median test is a symmetric function of the two sample sizes, which implies that ''the designation of the numbering of the samples is immaterial as far as asymptotic efficiency is concerned'' (Gart, 1963).

Because the ARE between the control median test and the median test is equal to one, neither one is preferable based on this criterion. However, if some other measure of efficiency is used, interesting distinctions can be made between the two tests. For these and other related results see, for example, Killeen, Hettmansperger, and Sievers and the references therein. As noted earlier, the control median test and the median test are special cases of precedence tests. An overview of precedence tests for various problems can be found in Chakraborti and van der Laan (1996, 1997).

The control median test is based on V, the number of X values that precede the median of the Y's. Writing q = F_X(M_Y), the appropriate rejection regions for a nominal significance level α are shown in the following table, along with expressions for P values, where d_α and d′_α are, respectively, the largest and smallest integers such that P(V ≤ d_α | H0) ≤ α and P(V ≥ d′_α | H0) ≤ α, and d and d′ (d < d′) are two positive integers such that

P(V ≤ d | H0) + P(V ≥ d′ | H0) ≤ α

The exact critical values or P values can be easily found directly using (5.1) or from tables of the hypergeometric distribution (see the chapter problems). In view of the simple form of the null distribution, it might be easier to calculate a P value corresponding to an observed value of V. Note that unlike the median test, there is no difficulty here in assigning probability to the tails with two-tailed tests, since the distribution of V is symmetric under H0, that is, P(V = j | H0) = P(V = m − j | H0) for all j.

Alternative                           Rejection region       P value
Y >ST X, q > 0.5,
θ > 0 or M_Y > M_X                    V ≥ d′_α               P(V ≥ V_O | H0) = ∑_{j=V_O}^{m} P(V = j | H0)
Y <ST X, q < 0.5,
θ < 0 or M_Y < M_X                    V ≤ d_α                P(V ≤ V_O | H0) = ∑_{j=0}^{V_O} P(V = j | H0)
q ≠ 0.5, θ ≠ 0 or M_Y ≠ M_X           V ≤ d or V ≥ d′        (smaller of the above)



In practice, the asymptotic distribution is useful for finding the critical values or for approximating the P value. For example, if the alternative is q < 0.5, an approximation to the P value of the test with a continuity correction is

Φ( √n (2V_O + 1 − m) / [m(m + n)]^{1/2} )

APPLICATIONS

Example 5.1 We illustrate the control median test using the data in Example 4.1.

Solution Again, let samples 1 and 2 denote the X and the Y sample, respectively, and assume that the null hypothesis to be tested is M_X = M_Y (q = 0.5) against the alternative M_Y < M_X (q < 0.5) under the shift model. Thus, the P value for the control median test is also in the left tail. The median of the Y sample is 5, and thus V_O = 2. Using (5.1) with m = 4, n = 5, and r = 2, the exact P value for the control median test is

P(V ≤ 2 | H0) = (15 + 30 + 36)/126 = 81/126 = 0.6429

Hence there is not enough evidence against the null hypothesis in favor of the alternative. Also, the normal approximation to the P value is found to be Φ(0.3727) = 0.6453, and the approximate test leads to the same decision as the exact test.

For the median test, the combined sample median is equal to the Y sample median, and U_O = 2. Now using (4.3), the exact P value for the median test is

P(U ≤ 2 | H0) = 105/126 = 0.8333



and once again not enough evidence is available in favor of the alternative H1. Using (4.7), the normal approximation to the P value is obtained as Φ(0.975) = 0.8352, leading to the same conclusion.

6.6 THE MANN-WHITNEY U TEST

Like the Wald-Wolfowitz runs test in Section 6.2, the Mann-Whitney U test (Mann and Whitney, 1947) is based on the idea that the particular pattern exhibited when the X and Y random variables are arranged together in increasing order of magnitude provides information about the relationship between their populations. However, instead of measuring the tendency to cluster by the total number of runs, the Mann-Whitney criterion is based on the magnitudes of, say, the Y's in relation to the X's, that is, the positions of the Y's in the combined ordered sequence. A sample pattern of arrangement where most of the Y's are greater than most of the X's, or vice versa, or both, would be evidence against a random mixing and thus tend to discredit the null hypothesis of identical distributions.

The Mann-Whitney U test statistic is defined as the number of times a Y precedes an X in the combined ordered arrangement of the two independent random samples

X_1, X_2, ..., X_m    and    Y_1, Y_2, ..., Y_n

into a single sequence of N = m + n variables increasing in magnitude. We assume that the two samples are drawn from continuous distributions, so that the possibility X_i = Y_j for some i and j need not be considered. If the mn indicator random variables are defined as

D_ij = 1 if Y_j < X_i    and    D_ij = 0 if Y_j > X_i
for i = 1, 2, ..., m;  j = 1, 2, ..., n    (6.1)

a symbolic representation of the Mann-Whitney U statistic is

U = ∑_{i=1}^{m} ∑_{j=1}^{n} D_ij    (6.2)

The logical rejection region for the one-sided alternative that the Y's are stochastically larger than the X's,

H1: F_Y(x) ≤ F_X(x)    (strict inequality for some x)

would clearly be small values of U. The fact that this is a consistent test criterion can be shown by investigating the convergence of U/mn



to a certain parameter, where H0 can be written as a statement concerning the value of that parameter.

For this purpose, we define

p = P(Y < X) = ∫_{−∞}^{∞} ∫_{−∞}^{x} f_Y(y) f_X(x) dy dx = ∫_{−∞}^{∞} F_Y(x) dF_X(x)    (6.3)

and the hypothesis-testing problem can be redefined in terms of the parameter p. If H0: F_Y(x) = F_X(x) for all x is true, then

p = ∫_{−∞}^{∞} F_X(x) dF_X(x) = 0.5    (6.4)

If, for example, the alternative hypothesis is H1: F_Y(x) ≤ F_X(x), that is, Y >ST X, then F_Y(x) ≤ F_X(x) for all x with strict inequality for some x, so that p < 0.5. Thus the null hypothesis of identical distributions can be parameterized as H0: p = 0.5 and the alternative hypothesis as H1: p < 0.5.

The mn random variables defined in (6.1) are Bernoulli variables, with moments

E(D_ij) = E(D²_ij) = p        var(D_ij) = p(1 − p)    (6.5)

For the joint moments we note that these random variables are not independent whenever the X subscripts or the Y subscripts are common, so that

cov(D_ij, D_hk) = 0    for i ≠ h and j ≠ k
cov(D_ij, D_ik) = p_1 − p²    for j ≠ k
cov(D_ij, D_hj) = p_2 − p²    for i ≠ h    (6.6)

where the additional parameters introduced are

p_1 = P(Y_j < X_i ∩ Y_k < X_i) = P(Y_j and Y_k < X_i) = ∫_{−∞}^{∞} [F_Y(x)]² dF_X(x)    (6.7)

and

p_2 = P(X_i > Y_j ∩ X_h > Y_j) = P(X_i and X_h > Y_j) = ∫_{−∞}^{∞} [1 − F_X(y)]² dF_Y(y)    (6.8)



Since U = ∑∑ D_ij from (6.2), the mean and variance of U follow from these moments as

E(U) = ∑_{i=1}^{m} ∑_{j=1}^{n} E(D_ij) = mnp    (6.9)

var(U) = ∑_i ∑_j var(D_ij) + ∑_i ∑∑_{j≠k} cov(D_ij, D_ik) + ∑_j ∑∑_{i≠h} cov(D_ij, D_hj) + ∑∑_{i≠h} ∑∑_{j≠k} cov(D_ij, D_hk)    (6.10)

Substituting the results in (6.5) and (6.6) in (6.10) gives

var(U) = mn[ p(1 − p) + (n − 1)(p_1 − p²) + (m − 1)(p_2 − p²) ]    (6.11)

Since var(U/mn) = var(U)/(mn)² → 0 as m, n → ∞, U/mn is a consistent estimator of p, and by the criterion of Chapter 1 a test based on U is consistent against alternatives expressed in terms of p. The rejection regions appropriate for the respective alternatives are

Alternative                       Rejection region
p < 0.5 [F_Y(x) ≤ F_X(x)]         U too small
p > 0.5 [F_Y(x) ≥ F_X(x)]         U too large
p ≠ 0.5 [F_Y(x) ≠ F_X(x)]         U too small or too large    (6.12)

To determine the critical values exactly, we need the null probability distribution of U. Under H0, each of the C(m+n, m) distinguishable arrangements of the m X and n Y random variables in the combined ordered sample is equally likely, and therefore

f_U(u) = P(U = u) = r_{m,n}(u) / C(m+n, m)    (6.13)

where r_{m,n}(u) is the number of distinguishable arrangements in which the number


of times a Y precedes an X is exactly u. The values of u for which f_U(u) is nonzero range between zero and mn, for the two most extreme orderings in which every x precedes every y and every y precedes every x, respectively. We first note that the probability distribution of U is symmetric about the mean mn/2 under the null hypothesis. This property may be argued as follows. For every particular arrangement z of the m x and n y letters, define the conjugate arrangement z′ as the sequence z written backward. In other words, if z denotes a set of numbers written from smallest to largest, z′ denotes the same numbers written from largest to smallest. Every y that precedes an x in z then follows that x in z′, so that if u is the value of the Mann-Whitney statistic for z, mn − u is the value for z′. Therefore under H0 we have

r_{m,n}(u) = r_{m,n}(mn − u)

or, equivalently,

P(U − mn/2 = u) = P(U − mn/2 = −u)

Because of this symmetry property, only lower-tail critical values need be found for either a one- or two-sided test. We define the random variable U′ as the number of times an X precedes a Y or, in the notation of (6.1),

U′ = ∑_{i=1}^{m} ∑_{j=1}^{n} (1 − D_ij) = mn − U

and redefine the rejection regions for size-α tests corresponding to (6.12) as follows:

Alternative                       Rejection region
p < 0.5 [F_Y(x) ≤ F_X(x)]         U ≤ c_α
p > 0.5 [F_Y(x) ≥ F_X(x)]         U′ ≤ c_α
p ≠ 0.5 [F_Y(x) ≠ F_X(x)]         U ≤ c_{α/2} or U′ ≤ c_{α/2}

To determine the number c_α for any m and n, we can enumerate the cases starting with u = 0 and work up until at least α C(m+n, m) cases are counted. For example, for m = 4, n = 5, the arrangements with the smallest values of u, that is, where most of the X's are smaller than most of the Y's, are shown in Table 6.1. The rejection regions for this one-sided test for nominal significance levels of 0.01 and 0.05 would then be U ≤ 0 and U ≤ 2, respectively.

Even though it is relatively easy to guess which orderings will lead to the smallest values of u, C(m+n, m) increases rapidly as m and n increase. Some more systematic method of generating critical values is needed to eliminate the possibility of overlooking some arrangements with u small and to increase the feasible range of sample sizes and significance levels for constructing tables. A particularly simple and useful recurrence relation can be derived for the Mann-Whitney statistic. Consider a sequence of m + n letters being built up by adding a letter to the right of a sequence of m + n − 1 letters. If the m + n − 1 letters consist of m x and n − 1 y letters, the extra letter must be a y. But if a y is added to the right, the number of times a y precedes an x is unchanged. If the additional letter is an x, which would be the case for m − 1 x and n y letters in the original sequence, all of the y's will precede this new x and there are n of them, so that u is increased by n. These two possibilities are mutually exclusive. Using the notation of (6.13) again, this recurrence relation can be expressed as

r_{m,n}(u) = r_{m,n−1}(u) + r_{m−1,n}(u − n)

and

f_U(u) = p_{m,n}(u) = [ r_{m,n−1}(u) + r_{m−1,n}(u − n) ] / C(m+n, m)
       = [n/(m+n)] p_{m,n−1}(u) + [m/(m+n)] p_{m−1,n}(u − n)

or

(m + n) p_{m,n}(u) = n p_{m,n−1}(u) + m p_{m−1,n}(u − n)    (6.14)

Table 6.1 Generation of the left-tail P values of U for m = 4, n = 5

Ordering     u
XXXXYYYYY    0     P(U ≤ 0) = 1/126 = 0.0079
XXXYXYYYY    1     P(U ≤ 1) = 2/126 = 0.0159
XXYXXYYYY    2     P(U ≤ 2) = 4/126 = 0.0317
XXXYYXYYY    2
XYXXXYYYY    3     P(U ≤ 3) = 7/126 = 0.0556
XXYXYXYYY    3
XXXYYYXYY    3
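The recurrence (6.14) is exactly the kind of rule a small memoized program can apply; the sketch below (function names ours) works with the counts r_{m,n}(u) and reproduces the left-tail P values of Table 6.1:

```python
from math import comb
from functools import lru_cache

@lru_cache(maxsize=None)
def r_mn(m, n, u):
    """Number of arrangements of m X's and n Y's with exactly u (Y before X) pairs,
    from the recurrence r_{m,n}(u) = r_{m,n-1}(u) + r_{m-1,n}(u - n)."""
    if u < 0:
        return 0
    if m == 0 or n == 0:
        return 1 if u == 0 else 0   # a one-sample sequence has u = 0
    return r_mn(m, n - 1, u) + r_mn(m - 1, n, u - n)

def null_cdf(m, n, u):
    """P(U <= u | H0) = sum_k r_{m,n}(k) / C(m+n, m)."""
    return sum(r_mn(m, n, k) for k in range(u + 1)) / comb(m + n, m)

# Left-tail P values for m = 4, n = 5, as in Table 6.1
print([round(null_cdf(4, 5, u), 4) for u in range(4)])
```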


where the recurrence holds for all u = 0, 1, 2, ... and all m, n ≥ 1, with the initial conditions

p_{m,0}(0) = p_{0,n}(0) = 1        p_{m,0}(u) = p_{0,n}(u) = 0  for u ≠ 0

and r_{m,n}(u) = p_{m,n}(u) = 0 for u < 0. In this way the null distribution can be built up recursively for successively larger sample sizes, and tables of critical values of U have been constructed (see, e.g., Auble, 1953).

Since U is a sum of identically distributed (though dependent) Bernoulli random variables, it is reasonable to expect that some form of the central limit theorem applies, and Mann and Whitney (1947) proved that the null distribution of U is asymptotically normal. The null moments follow from (6.9) and (6.11): under H0 we have p = 1/2 and, from (6.7) and (6.8), p_1 = p_2 = 1/3, so that

E(U | H0) = mn/2        var(U | H0) = mn(N + 1)/12    (6.15)

and the asymptotic null distribution of

Z = (U − mn/2) / [mn(N + 1)/12]^{1/2}

is standard normal; a continuity correction of 0.5 can be incorporated as before.

The assumption of continuous populations implies that ties occur only with probability zero, but in practice tied observations arise because of rounding. If ties occur within one


or both of the samples, a unique value of U is obtained. However, if oneor more X is tied with one or more Y , our de®nition requires that theties be broken in some way. The conservative approach may be adop-ted, which means that all ties are broken in all possible ways and thelargest resulting value of U (or UH) is used in reaching the decision.When there are many ties (as might be the case when each randomvariable can assume only a few ordered values such as very strong,strong, weak, very weak), an alternative approach may be preferable.

A common de®nition of the ann-Whitney statistic which doesallow for ties (see roblem . ) is

U T

m

i

n

j

Dij

where

Dij

if X i > Y j : if X i Y j

if X i < Y j VX

:

However, it is often more convenient to work with

DÃij if X i > Y j if X i Y j

À if X i < Y j VXIf the two parameters p and pÀare de®ned as

p P X > Y and pÀ P X < Y

U T may be considered as an estimate of its mean

E U T mn p À pÀ A standardized U T is asymptotically normally distributed. ince underthe null hypothesis p pÀ, we have E U T j H whether tiesoccur or not. The presence of ties does affect the variance, however.The variance of U T conditional upon the observed ties can be calcu-lated in a manner similar to the steps leading to ( . ) by introducingsome additional parameters. Then a correction for ties can be incor-porated in the standardized variable used for the test statistic. The

result is

var(U_T | H0) = [mn / (N(N − 1))] [(N³ − N − Σ(t³ − t)) / 3]

2 4 CHAPTER 6


where t denotes the multiplicity of a tie and the sum is extended over all sets of t ties. The details are left to the reader as an exercise (or see Noether).

CONFIDENCE INTERVAL PROCEDURE

If the populations from which the X and Y samples are drawn are identical in every respect except location, say F_Y(x) = F_X(x − θ) for all x and some θ, we say that the Y population is the same as the X population but shifted by an amount θ, which may be either positive or negative, the sign indicating the direction of the shift. We wish to use the Mann-Whitney test procedure to find a confidence interval for θ, the amount of shift. Under the assumption that F_Y(x) = F_X(x − θ) for all x and some θ, the sample observations X_1, X_2, …, X_m and Y_1 − θ, Y_2 − θ, …, Y_n − θ come from identical populations. By a confidence interval for θ with confidence coefficient 1 − α we mean the range of values of θ for which the null hypothesis of identical populations will be accepted at significance level α.

To apply the Mann-Whitney procedure to this problem, the random variable U now denotes the number of times a Y − θ precedes an X, that is, the number of pairs (X_i, Y_j − θ), i = 1, 2, …, m and j = 1, 2, …, n, for which X_i > Y_j − θ, or equivalently, Y_j − X_i < θ. If a table of critical values for a two-sided U test at level α gives a rejection region of U ≤ k, say, we reject H0 when no more than k differences Y_j − X_i are less than the value θ, and accept H0 when more than k differences are less than θ. The total number of differences Y_j − X_i is mn. If these differences are ordered from smallest to largest according to actual (not absolute) magnitude, denoted by D_(1), D_(2), …, D_(mn), there are exactly k differences less than θ if θ is the (k + 1)st ordered difference, D_(k+1). Any number exceeding this (k + 1)st difference will produce more than k differences less than θ. Therefore, the lower limit of the confidence interval for θ is D_(k+1). Similarly, since the probability distribution of U is symmetric, an upper confidence limit is given by that difference which is (k + 1)st from the largest, that is, D_(mn−k). The confidence interval with coefficient 1 − α then is

D_(k+1) < θ < D_(mn−k)

The procedure is illustrated by the following numerical example.

Suppose that m = 3, n = 5, α = 0.10. By simple enumeration, we find P(U ≤ 1) = 2/56 = 0.036 and P(U ≤ 2) = 4/56 = 0.071, and so the critical value when α = 0.10 is k = 1, with the exact probability of a type I error 2(0.036) = 0.072. The confidence interval will then be D_(2) < θ < D_(14).
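The ordered-differences construction can be sketched in Python (the function name is mine; the data are the X and Y samples used in the continuation of this example):

```python
def shift_confidence_interval(x, y, k):
    """Return (D_(k+1), D_(mn-k)), the endpoints of the confidence
    interval D_(k+1) < theta < D_(mn-k) based on the mn ordered
    differences Y_j - X_i."""
    d = sorted(yj - xi for yj in y for xi in x)
    return d[k], d[len(d) - k - 1]  # 0-based: d[k] is D_(k+1)

# m = 3, n = 5, k = 1 gives D_(2) and D_(14)
low, high = shift_confidence_interval([1, 6, 7], [2, 4, 9, 10, 12], k=1)
```

Here `low` and `high` come out to −4 and 9, the interval −4 < θ < 9 with confidence coefficient 0.928.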

THE GENERAL TWO-SAMPLE PROBLEM 2 5


Suppose the X sample observations are 1, 6, 7 and the Y sample observations are 2, 4, 9, 10, 12, so that the confidence-interval endpoints are D_(2) and D_(14). The mn = 15 differences Y_j − X_i are shown in the following table.

Y_j     Y_j − 1    Y_j − 6    Y_j − 7
 2         1         −4         −5
 4         3         −2         −3
 9         8          3          2
10         9          4          3
12        11          6          5

The second smallest difference is D_(2) = −4 and the second largest is D_(14) = 9, so the interval is −4 < θ < 9, with exact confidence coefficient 1 − 0.072 = 0.928.


Noether (1987) developed a simple formula for determining the sample size required when the Mann-Whitney test is to be used. For a one-sided test at significance level α with power 1 − β against the alternative p = P(Y > X), the approximate total sample size is

N = (z_α + z_β)² / [12c(1 − c)(p − 1/2)²]        (6.18)

where c = m/N is the proportion of the observations taken from the X population. When m = n so that c = 0.5, this


formula reduces to the sample size formula in Section 5.7 for the Wilcoxon signed-rank test.

As an example, suppose we want to use a one-sided Mann-Whitney test at significance level α to detect a difference p = P(Y > X) with specified power, when fewer X than Y observations are to be taken, say m = 0.5n. This makes c = m/(m + n) = 1/3. Substituting the corresponding values of z_α and z_β in (6.18) gives the required total N, which then translates into m and n.

Vollandt and Horn compared Noether's sample size formula to an alternative and found that Noether's approximation is sufficiently reliable for both small and large deviations from the null hypothesis.
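Formula (6.18) is easy to evaluate directly; a sketch follows (the helper and the illustrative values of α, power, and p are my own, not the book's example):

```python
from math import ceil
from statistics import NormalDist

def noether_total_n(alpha, power, p, c=0.5):
    """Total sample size N from Noether's approximation (6.18):
    N = (z_alpha + z_beta)^2 / [12 c (1 - c)(p - 1/2)^2],
    for a one-sided test, where c = m/N and p = P(Y > X)."""
    z = NormalDist().inv_cdf
    return ceil((z(1 - alpha) + z(power)) ** 2
                / (12 * c * (1 - c) * (p - 0.5) ** 2))

n_total = noether_total_n(alpha=0.05, power=0.90, p=0.6)  # -> 286
```

Unequal allocation (c ≠ 0.5) inflates the required total N, since c(1 − c) is maximized at c = 0.5.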

DISCUSSION

The Mann-Whitney U test is a frequently used nonparametric test that is equivalent to another well-known test, the Wilcoxon rank-sum test, which will be presented independently in Section 8.2. Because the Wilcoxon rank-sum test is easier to use in practice, we postpone giving numerical examples until then. The discussion here applies equally to both tests.

Only independence and continuous distributions need be assumed to test the null hypothesis of identical populations. The test is simple to use for any size samples, and tables of the exact null distribution are widely available. The large-sample approximation is quite adequate for most practical purposes, and corrections for ties can be incorporated in the test statistic. The test has been found to perform particularly well as a test for equal means (or medians), since it is especially sensitive to differences in location. In order to reduce the generality of the null hypothesis in this way, however, we must feel that we can legitimately assume that the populations are identical except possibly for their locations. A particular advantage of the test procedure in this case is that it can be adapted to confidence-interval estimation of the difference in location.

When the populations are assumed to differ only in location, the Mann-Whitney test is directly comparable with Student's t test for means. The asymptotic relative efficiency of U relative to t is never less than 0.864, and if the populations are normal, the ARE is quite


high at 3/π ≈ 0.955 (see the chapter on asymptotic relative efficiency). The Mann-Whitney test performs better than the t test for some nonnormal distributions. For example, the ARE is 1.50 for the double exponential distribution and 1.10 for the logistic distribution, which are both heavy-tailed distributions.

Many statisticians consider the Mann-Whitney (or equivalently the Wilcoxon rank-sum) test the best nonparametric test for location. Therefore power functions for smaller sample sizes and/or other distributions are of interest. To calculate exact power, we sum the probabilities under the alternative for those arrangements of m X and n Y random variables which are in the rejection region. For any combined arrangement Z where the X random variables occur in the positions r_1, r_2, …, r_m and the Y's in positions s_1, s_2, …, s_n, this probability is

P(Z) = m! n! ∫_{−∞}^{∞} ∫_{−∞}^{u_N} ⋯ ∫_{−∞}^{u_2} ∏_{i=1}^{m} f_X(u_{r_i}) ∏_{j=1}^{n} f_Y(u_{s_j}) du_1 ⋯ du_N

which is generally extremely tedious to evaluate. The asymptotic normality of U holds even in the nonnull case, and the mean and variance of U depend only on the parameters p, p_1, and p_2 if the distributions are continuous. Thus, approximations to power can be found if the corresponding integrals are evaluated. Unfortunately, even under the more specific alternative F_Y(x) = F_X(x − θ) for some θ, these integrals depend on both θ and F_X, so that calculating even the approximation to power requires that the basic parent population be specified.
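Because the exact calculation is so tedious, power is often approximated in practice. The Monte Carlo sketch below is entirely my own illustration (it uses the large-sample one-sided test at α = 0.05) and estimates power against a normal shift alternative F_Y(x) = F_X(x − θ):

```python
import random
from math import sqrt

def mw_power_mc(m, n, theta, reps=2000, seed=1):
    """Estimate the power of the one-sided large-sample Mann-Whitney
    test (reject for small U) for N(theta, 1) versus N(0, 1) samples."""
    rng = random.Random(seed)
    N = m + n
    mu, sd = m * n / 2, sqrt(m * n * (N + 1) / 12)
    rejections = 0
    for _ in range(reps):
        x = [rng.gauss(0, 1) for _ in range(m)]
        y = [rng.gauss(theta, 1) for _ in range(n)]
        u = sum(yj < xi for xi in x for yj in y)  # times a Y precedes an X
        if (u - mu) / sd < -1.645:  # a positive shift theta makes U small
            rejections += 1
    return rejections / reps
```

Under the null (θ = 0) the rejection rate should hover near the nominal 0.05, and it climbs toward 1 as the shift grows.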

6. SUMMARY

In this chapter we have covered four different inference procedures for the null hypothesis that two mutually independent random samples come from identical distributions against the general alternative that they differ in some way. The Wald-Wolfowitz runs test and Kolmogorov-Smirnov two-sample tests are both noted for their generality. Since they are sensitive to any type of difference between the two distributions, they are not very powerful against any specific type of difference that could be stated in the alternative. Their efficiency is very low for location alternatives. They are really useful only in preliminary studies. The median test is primarily sensitive to differences in location and it does have a corresponding confidence


B

C R 1978 LMP

LMP 2000 ‘‘ ’ ’’

K S

1964

M

C 8 M

1 À

ð þ Þ þ ¼ ½ ð Þ À ð Þ

¼6 ¼7 ¼0 01 0 05 0 10

¼6 ¼7 ¼0 10

M 6 14 þ 4

V 6 15 0

ðÞ ðÞ ðÞ M

:


(a) Using the chosen significance level, test H0: θ = 0 versus H1: θ ≠ 0.
(b) Give the exact level of the test in (a).
(c) Give a confidence interval on θ, with an exact confidence coefficient corresponding to the exact level noted in (b).

6. . Represent a sample of m X and n Y random variables by a path of m + n steps, the ith step being one unit up or to the right according as the ith from the smallest observation in the combined sample is an X or a Y, respectively. What is the algebraic relation between the area under the path and the Mann-Whitney statistic?

6. . Can you think of other functions of the difference S_m(x) − S_n(x) (besides the maximum) which could also be used for distribution-free tests of the equality of two population distributions?

6.9. The census statistics for Alabama give the percentage changes in population between two census years for each of the 67 counties. These counties were divided into two mutually independent groups, rural and nonrural, according to population size. Random samples of nine rural and seven nonrural counties gave the following data on percentage population change

Use all of the methods of this chapter to test the null hypothesis of equal distributions. (Hint: Notice that m ≠ n here.)

6.10. (a) Show that the distribution of the precedence statistic P_i under the null hypothesis F_X = F_Y, given in Problem 6. (c), can be expressed as

P P i j j H n

m n

m j n Ài À m n À j i À

i j

m j n

i m n j i j ; ; . . . ; m

These relationships are useful in calculating the null distribution of precedence statistics using tables of a hypergeometric distribution.

(b) Hence show that the null distribution of the control median test statistic V, with n = 2r + 1, can be expressed as

rm r

m

j r

r m r j r j ; ; . . . ; m

(c) Prepare a table of the cumulative probabilities of the null distribution of V for some suitable values of m and n (odd).

X (rural):       . , − . , − . , − . , − . , − . , − . , . , .
Y (nonrural):   − . , . , . , . , . , . , .


6.11. For the control median test statistic V, use Problem 6.10, or otherwise, to show that when F_X = F_Y,

E(V) = m/2    and    var(V) = m(m + 2r + 2)/[4(2r + 3)]

(Hint: Use the fact that E(X) = E_Y[E(X|Y)] and var(X) = var_Y[E(X|Y)] + E_Y[var(X|Y)].)

6.12. Show that when m, n → ∞ such that m/(m + n) → λ, 0 < λ < 1, the null distribution of the precedence statistic P_i given in Problem 6.10 tends to the negative binomial distribution with parameters i and λ.

6.13. In some applications the quantity x_p = F_X(k_p), where k_p is the pth quantile of F_Y, is of interest. Let lim_{n→∞} m/n = λ, where λ is a fixed quantity, and let {r_n} be a sequence of positive integers such that lim_{n→∞} r_n/n = p. Finally, let V_{m,n} be the number of X observations that do not exceed Y_(r_n).

(a) Show that m⁻¹V_{m,n} is a consistent estimator of x_p.
(b) Show that the random variable m^{1/2}(m⁻¹V_{m,n} − x_p) is asymptotically normally distributed with mean zero and variance

x_p(1 − x_p) + λ p(1 − p)[f_X(k_p)/f_Y(k_p)]²

where f_X and f_Y are the density functions corresponding to F_X and F_Y, respectively. (Gastwirth; Chakraborti and Mukerjee)

6.14. A sample of three girls and five boys are given instructions on how to complete a certain task. Then they are asked to perform the task over and over until they complete it correctly. The numbers of repetitions necessary for correct completion are , , and for the girls and , , , , and for the boys. Find the P value for the alternative that on the average the girls learn the task faster than the boys, and find a confidence-interval estimate for the difference θ = M_Y − M_X with a confidence coefficient at least equal to . , using the median test.

6.15. A researcher is interested in learning if a new drug is better than a placebo in treating a certain disease. Because of the nature of the disease, only a limited number of patients can be found. Out of these, are randomly assigned to the placebo and to the new drug. Suppose that the concentration of a certain chemical in blood is measured and smaller measurements are better. The data are as follows:

Drug:      . , . , . , . , .
Placebo:   . , . , . , . , .

(a) Use the median test and the control median test to test the hypothesis. For each test give the null hypothesis, the alternative hypothesis, the value of the test statistic, the exact and the approximate P value, and a conclusion. What assumptions are we making?
(b) Use the median test to calculate a confidence interval for the difference between the medians. What is the highest possible level of confidence? What assumptions are we making for this procedure?


7

Linear Rank Statistics and the General Two-Sample Problem

7.1 INTRODUCTION

C 6

M

M S


P C 8

9

7 K

1 2 1 2

0 ð Þ ¼ ð Þ ¼ ð Þ

þ ¼ 1 2

5 1 S 5 5

ð Þ ¼ ¼1

ð À Þ þ ¼1

ð À Þ ð Þ ¼

¼1ð À Þ þ

¼1ð À Þ

ð2 1Þ

ðÞ ¼0 01 0&

L

¼ ð 1 2 Þ¼1

¼0 1 2 ¼ þ

ð 1 2 3 4Þ ¼ ð2 9 3 4Þ

ð 1 2 3Þ ¼ ð1 6 10Þ 1 2 34 6 9 10 ð 1 1 3 4 2 2 3Þ


0 1 1 1 0 1 0 S 6 ¼1 2 6

M

ð Þ ¼

¼1 ð2 2Þ

7 K

Under H0: F_Y(x) = F_X(x) for all x, the indicator random variables Z_i (where Z_i = 1 if the ith smallest observation in the combined ordered sample is an X and Z_i = 0 otherwise) have, for i = 1, 2, …, N,

E(Z_i) = m/N        var(Z_i) = mn/N²        cov(Z_i, Z_j) = −mn/[N²(N − 1)]        (3.1)

Since each Z_i is an indicator (Bernoulli) variable with

P(Z_i = 1) = m/N

we have E(Z_i) = E(Z_i²) = m/N, and therefore

var(Z_i) = m/N − (m/N)² = mn/N²

For i ≠ j,

E(Z_i Z_j) = P(Z_i = 1 ∩ Z_j = 1) = m(m − 1)/[N(N − 1)]


so that for i ≠ j,

cov(Z_i, Z_j) = m(m − 1)/[N(N − 1)] − (m/N)² = −mn/[N²(N − 1)]

Therefore, under H0: F_Y(x) = F_X(x) for all x, the exact mean and variance of the linear rank statistic T_N = Σ_{i=1}^N a_i Z_i are

E(T_N) = (m/N) Σ_{i=1}^N a_i

var(T_N) = [mn/(N²(N − 1))] [N Σ_{i=1}^N a_i² − (Σ_{i=1}^N a_i)²]        (3.2)

These moments are found as follows:

E(T_N) = Σ_{i=1}^N a_i E(Z_i) = (m/N) Σ_{i=1}^N a_i

var(T_N) = Σ_{i=1}^N a_i² var(Z_i) + Σ Σ_{i≠j} a_i a_j cov(Z_i, Z_j)

        = (mn/N²) Σ_{i=1}^N a_i² − [mn/(N²(N − 1))] Σ Σ_{i≠j} a_i a_j

        = (mn/N²) Σ_{i=1}^N a_i² − [mn/(N²(N − 1))] [(Σ_{i=1}^N a_i)² − Σ_{i=1}^N a_i²]

        = [mn/(N²(N − 1))] [N Σ_{i=1}^N a_i² − (Σ_{i=1}^N a_i)²]
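The moments in (3.2) can be verified by brute force for small N, since under the null hypothesis every placement of the m ones among the N indicator positions is equally likely. The code and the example scores below are my own illustration:

```python
from fractions import Fraction
from itertools import combinations

def tn_exact_moments(scores, m):
    """Exact null mean and variance of T_N = sum a_i Z_i, found by
    enumerating all C(N, m) equally likely positions of the m indicators."""
    N = len(scores)
    values = [sum(scores[i] for i in pos)
              for pos in combinations(range(N), m)]
    mean = Fraction(sum(values), len(values))
    var = Fraction(sum(v * v for v in values), len(values)) - mean**2
    return mean, var

# Wilcoxon-type scores a_i = i with N = 5, m = 2:
mean, var = tn_exact_moments([1, 2, 3, 4, 5], 2)
# (3.2) gives E = (2/5)(15) = 6 and var = 6[5(55) - 225]/[25(4)] = 3
```

Exhaustive enumeration like this is a handy sanity check when implementing new score functions.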

¼P

¼1 ¼P

¼1 , 0 ð Þ ¼ ð Þ ¼ ð Þ ,

ð Þ ¼ 2ð À1Þ

¼1 À

¼1

¼1

ð

Þ ¼

¼1 ð

Þ þ 6¼

ð

Þ¼ 2

¼1 À 2ð À1Þ 6¼


¼ 2ð À1Þ

¼1 À

¼1 À6¼

¼ 2ð À1Þ

¼1 À

¼1

¼1

6 19 S 6 6

¼

¼

ð Þ ¼ Z 1

À1Z

À1ÁÁÁZ 2

À1Y¼1 ð ÞY ¼1

ð Þ 1 ÁÁÁ

1 2 1 2 S

ð Þ ¼ Z 1

À1Z

À1ÁÁÁZ 2

À1Y

¼1 ð Þ 1 ÁÁÁ

¼ ð3 3

Þ ! !

ð Þ S

þÀ Á¼ À Á 3 3

0S 1 À Á

ð Þ

À Á

½ ð Þ ¼ ¼ð Þ0 ð3 4Þ


ð Þ ð Þ ¼ S X C

6¼0 ½ ð Þ À¼ ¼½ ð Þ À¼ À

S 0 ð Þ ¼þ

ð 0Þ ¼À þ À

ð Þ þ ð 0Þ ¼2

ð Þ ð Þ ¼ P

¼1

þ Àþ1 ¼ ¼ ¼1 2

¼ ð 1 2 Þ 0¼ ð 01 02 0 Þ

0¼ Àþ1

ð Þ þ ð 0Þ ¼

¼1 þ

¼1 Àþ1

¼

¼1

þ

¼1

À

þ1

¼

¼1 ð þ Àþ1Þ ¼

¼1 ¼


S ½ ð Þ ¼½ ð 0Þ ¼2

¼2 ¼2P

¼1

¼

ð Þ ¼ ¼ 2.

S ¼ 00¼1 À

ð Þ þ ð 0Þ ¼

¼1 þ

¼1 ð1 À Þ ¼

¼1 ¼2

ð Þ ¼ 2

¼ À þ1 2 .

0 0¼ þ 2 2 0¼ À 2 2

ð Þ þ ð 0Þ ¼ 2

¼1 þ

¼ 2

þ1ð À þ1Þ

þ 2

¼1 2þ þ

¼ 2þ1ð À þ1Þ À 2

¼ 2

¼1 þ

¼ 2þ1ð À þ1Þ

þ

¼ 2þ1 À

2 þ

2

¼1

2 À þ1

¼ 2

¼1

2 þ1 þ

¼ 2þ1

2 þ1

¼ 2 þ1 ¼2

ð Þ


ð Þ

7

P 1:

¼

¼1 0¼

¼1 Àþ1

¼ 0 ¼ Àþ1 ¼1 2

P 2:

¼

¼1 0¼

¼1 ð1 À Þ

þ 0¼

¼1

P 3:

¼

¼1

¼1ð1 À Àþ1Þ

þ 0¼

¼1 ¼ Àþ1 ¼1 2

! 1 ! 1

S

À ð Þ ð Þ

C S


1958

C S S

¼P

¼1

¼

Z 1

À1

½ ð Þ ð Þ :

1 ð Þ ð Þ

2 ! 0 13 ð Þ ¼ ð Þ þ ð1 À Þ ð Þ ð Þ

4 ð Þ ¼ S

S

S :

0 ¼

ð Þ 0

½ ð Þ ð Þ

ð Þ ¼1

0 ( ¼ ½ ð Þ ¼ ð Þ

¼ Z 1

À1 ð Þ ð Þ¼ Z 1

À1½ ð Þþ ð Þ ð Þ


¼ Z 1

À1ð Þ

Âð1 0 Þ

¼1

¼1

ð Þ ¼ ð Þ þ ð1 À Þ ð Þ C S

C S 1958

8 , ð Þ ¼ !1 ð Þ ,

ðÞð Þ¼ ð Þ ð1 À Þ ÀÀ1 2þ ¼0 1 2 0

!1

À

¼ ðÞ

¼Z 1

À1 ½ ð Þ ð Þ

2 ¼2

1 À

Z Z

À1 1 ð Þ½1 À ð Þ 0½ ð Þ 0½ ð Þ8<

:Â ð Þ ð Þ þ ð1 À Þ Z Z À1 1

ð Þ½1 À ð Þ


 0½ ð Þ 0½ ð Þ ð Þ ð Þ ) 6¼0

8 ð Þ ¼ ð Þ ¼ ð Þ ,

¼Z 1

0 ðÞ

2 ¼2ð1 À ÞZ Z

0 1

ð1 À Þ 0ð Þ 0ð Þ

¼2ð1 À ÞZ Z Z Z 0 1

0ð Þ 0ð Þ

¼2ð1 À ÞZ Z Z 0 1

Z 0ð Þ 0ð Þ

¼2ð1 À ÞZ Z Z 0 1

½ ðÞ À ð Þ 0ð Þ

¼2ð1 À Þ

Z Z 0 1

ðÞ ð Þ À 2ð Þ

2

!¼ ð1 À ÞZ Z 0 1

½ 2ðÞ À2 ðÞ ðÞ þ 2ðÞ

¼ ð1 À ÞZ 1

0 2ðÞ þZ 1

0 ð1 À Þ 2ðÞ

ÀZ 1

0 ðÞ Z 1

0 ðÞ !

¼ ð1 À ÞZ 1

0 2ðÞ ÀZ 1

0 ðÞ !2( )

3 2 ¼ ð Þ


7

ð Þ 6¼ ð Þ

ð Þ ð Þ

C 6 KS M

S 1 2

ð Þ 6 19

S 6 6

S X C



8 k

8

S ¼ þ

:

0 ð Þ ¼ ð Þ ð Þ ¼ ð À Þ 6¼0

0 0 1 1


’ ’ 0 ’ ’ 0

0

À ¼0 À 0

¼ À 0

þ À2 :

¼

À

ffiffiffiffiffiffiffiffiffiffiffiffið À1Þ2 þ ð À1Þ2

þ À2 ffiffiffiffiffiffiffiffiffiþ ð

1 1

Þ

M

’ ’

’ ’

ð Þ ¼ ð À Þ 0 0


8 W K-

’ ’

1945

0 ð S Þ ’ 0 ð S Þ

’ 6¼0

The Wilcoxon rank-sum statistic for the two-sample location problem is the linear rank statistic with weights a_i = i, i = 1, 2, …, N:

W_N = Σ_{i=1}^N i Z_i

so that W_N is the sum of the ranks of the X observations in the combined ordered sample. From the moment results for linear rank statistics in (7.3.2), the null mean and variance are

E(W_N) = m(N + 1)/2        var(W_N) = mn(N + 1)/12

V W P¼1 ¼ ð þ1Þ2

P

¼ À þ1 ¼ ð2 À þ1Þ2 7 3 4

þ Àþ1 ¼ þ1 ¼1 2

¼3

¼4

73

À Á¼35 1’ 0’ W

6 18 12 2 1


2 1

P(W_N ≥ 17) = 2/35 = 0.0571

5 7 8

6 6 14 M ð Þ

ð Þ ¼À1 ð À Þ þ À1ð Þ

W ð Þ ¼ ð Þ ¼ÂÀ1 ð À Þ þ À1ð ÞÃþ .ð þ Þ ð Þ ¼ À1 ð À Þ þ À1ð Þ ð2 2Þ

J 10 M

K 1972

7 3 8 W

12

W N

V W ’

Value of W_N    Rank sets                                          Number of sets
18              {5,6,7}                                            1
17              {4,6,7}                                            1
16              {3,6,7}, {4,5,7}                                   2
15              {2,6,7}, {3,5,7}, {4,5,6}                          3
14              {1,6,7}, {2,5,7}, {3,4,7}, {3,5,6}                 4
13              {1,5,7}, {2,4,7}, {2,5,6}, {3,4,6}                 4
12              {1,4,7}, {2,3,7}, {1,5,6}, {2,4,6}, {3,4,5}        5
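Tail probabilities like those tabulated here can be generated by direct enumeration. A sketch (the function name is my own):

```python
from itertools import combinations

def wilcoxon_upper_tail(N, n, w):
    """P(W_N >= w) under H0, enumerating the C(N, n) equally likely
    sets of ranks for the sample whose ranks are summed."""
    sums = [sum(c) for c in combinations(range(1, N + 1), n)]
    return sum(s >= w for s in sums) / len(sums)

p = wilcoxon_upper_tail(7, 3, 17)  # 2/35, about 0.0571
```

Enumeration is feasible only for small samples, since the number of arrangements C(N, n) grows rapidly.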


C 1967

32–35 W

5 7 10

Pð2 À1Þ12

S 7 3 2

2ð À1Þ

ð þ1Þð2 þ1Þ6 ÀPð2 À1Þ

12 !À ð þ1Þ2 !2( )

¼ ð þ1Þ12 À Pð2 À1Þ12 ð À1Þ ð2 3Þ M

C 6 6 6 2

¼¼1 ¼1

¼¼1 ð 1 þ 2 þÁÁÁþ Þ

¼1 0 &P ¼1

¼¼1 ½ð Þ À ¼

¼1 ð Þ À ð1 þ 2 þÁÁÁþÞ

¼

¼1 À ð1 þ2 þÁÁÁþÞ

¼

¼1 À2 ð þ1Þ¼W À2 ð þ1Þ ð2 4Þ


W C 2 P ¼1

P¼1 ð Þ

¼¼1 ¼1 ð Þ ¼

¼1 ¼1 ð ðÞÞ ¼¼1 ½ ð ðÞÞ À

ð2 5Þ ð ðÞÞ

2 5 M

R 0 864

M

P 10 5

C 10

W ¼ À O W

W J10 W ð þ1Þ2

W ð þ1Þ2

0 5 :

0 ð S

Þ W ðW OÞ0 ð

S Þ W 0 ðW OÞ

6¼0 W 2 W 0 2 2


2 3

R S 6 6 ¼ À M

ð þ1Þ À ¼1 2 ¼1 2

2 M L 2 2

2 3 2 ¼ 0 2 À ð þ1Þ2

ð þ1Þ2 W J ¼ 2 ¼4 ¼5 ¼0 032 00 032 ¼12 0 032 ¼12 À10 ¼2 þ1 ¼3 0 936

þ1 0 2

J ð þ1Þ2 W

¼ À

À J 1 À2 J

Z = [W_N + 0.5 − m(N + 1)/2] / √(mn(N + 1)/12)        (2.6)

Z

p C

0 W ð þ1Þ2þ0 5þ ffiffiffiffiffiffiffiffiffiffiffið þ1Þ12 1À O À0 5 À ð þ1Þ2

ffiffiffiffiffiffiffið þ1Þ12Þ 0 W ð þ1Þ2À0 5À ffiffiffiffiffiffiffiffiffiffiffið þ1Þ12 O þ0 5 À ð þ1Þ2

ffiffiffiffiffiffiffið þ1Þ12 6¼0 B 2 2


6 Â4

¼ ¼6

The combined ordered sample is 9.4, 11.2, 11.3, 11.4, 12.0, 12.6, 13.2, 13.4, 14.0, 14.1, 15.4, 16.4, and the observations from the first sample occupy ranks 1, 2, 4, 5, 6, 7, so that W_N = 1 + 2 + 4 + 5 + 6 + 7 = 25. With m = 6, n = 6, the exact left-tail probability is P(W_N ≤ 25) = 0.013. The normal approximation with continuity correction from (2.6) gives Z = (25.5 − 39)/√39 = −2.16, with approximate P value 0.0153.
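Both the exact tail probability and the continuity-corrected approximation of (2.6) can be checked by enumeration for data like these (the code is mine):

```python
from itertools import combinations
from math import erf, sqrt

def left_tail_p(ranks, N):
    """Exact P(W_N <= w) by enumeration, plus the continuity-corrected
    normal approximation, for the sample holding the given ranks."""
    n = len(ranks)
    m = N - n
    w = sum(ranks)
    sums = [sum(c) for c in combinations(range(1, N + 1), n)]
    exact = sum(s <= w for s in sums) / len(sums)
    z = (w + 0.5 - n * (N + 1) / 2) / sqrt(m * n * (N + 1) / 12)
    approx = 0.5 * (1 + erf(z / sqrt(2)))  # standard normal cdf at z
    return exact, approx

exact, approx = left_tail_p([1, 2, 4, 5, 6, 7], 12)
```

For these ranks the exact probability is 12/924 ≈ 0.013 and the approximation is about 0.0153, in line with the values quoted above.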

M B À

P 8 16 S X C

S S

!

Sample 1:  12.6, 11.4, 13.2, 11.2, 9.4, 12.0
Sample 2:  16.4, 14.1, 13.4, 15.4, 14.0, 11.3




p

2 1 6 7

2 4 9 10 12

C D C D 0 90

X ¼3 ¼5 À

À

J ¼3 ¼5 0 90

¼0 036 0 928 2

¼2 3 5 ¼15 À À4 9

À4 À 9 À9 À 4

M B S X C M B

0 926 0 928 M B

S X C

S X C

L

À1 À6 À7

2 1 À4 À54 3 À2 À39 8 3 2

10 9 4 312 11 6 5


8

- ( )

1952 1951 ¼ ððÞÞ ðÞ


c_1 = Σ_{i=1}^N E[ξ_(i)] Z_i        (3.1)

where ξ_(1) < ξ_(2) < ⋯ < ξ_(N) denote the order statistics of a random sample of size N from the standard normal distribution.

100 1961

1952 K

1964

var(c_1 | H0) = [mn/(N(N − 1))] Σ_{i=1}^N {E[ξ_(i)]}²        (3.2)

¼ð À2Þ

1 2

ð1 À 2Þ1 2

¼ 1 ½2ð À1Þ1 2

S ’ À2

R 1 S ’ S ’ R 1 C S

1958 ððÞÞ

‘‘ ’’

1 2

1 S 5 5S


S

À1½ð þ1Þ ð Þ C 2

À1

½ð þ1Þ ððÞÞ ð Þ ¼¼ À1ð Þ ½ð þ1Þ

W

W 1952 1953

X_N = Σ_{i=1}^N Φ⁻¹[i/(N + 1)] Z_i        (3.3)

where Φ denotes the standard normal cumulative distribution function, so that the weight attached to the ith position is the quantile of order i/(N + 1) of the standard normal distribution.

1956 50

var(X_N | H0) = [mn/(N(N − 1))] Σ_{i=1}^N {Φ⁻¹[i/(N + 1)]}²        (3.4)
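The van der Waerden weights and the null variance in (3.3)–(3.4) can be computed directly; a sketch (the function name and interface are mine):

```python
from math import sqrt
from statistics import NormalDist

def van_der_waerden(z):
    """X_N = sum_i invPhi(i/(N+1)) Z_i and its null standard deviation
    sqrt(mn * sum_i invPhi(i/(N+1))^2 / (N(N-1))), as in (3.3)-(3.4).
    z is the 0/1 indicator vector over the combined ordered sample."""
    N = len(z)
    n = sum(z)
    m = N - n
    inv = NormalDist().inv_cdf
    scores = [inv(i / (N + 1)) for i in range(1, N + 1)]
    x_n = sum(s for s, zi in zip(scores, z) if zi)
    sd = sqrt(m * n * sum(s * s for s in scores) / (N * (N - 1)))
    return x_n, sd
```

Because the weights are symmetric about zero, the statistic computed from an indicator vector and from its complement sum to zero, and the null mean of X_N is zero.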


S ððÞÞ !0 !

1 ðÞ ððÞÞ ððÞÞ

2 8 2 2 8 3 ½ððÞÞ ¼ð þ1Þ

½ððÞÞ ¼ð À þ1Þð þ1Þ

2

ð þ2Þ! 0

! 1

ððÞÞ

ð þ1Þ ðÞ À1½ð þ1Þ

! 1 R

S 5 7

ð ðÞÞ

‘‘ ’’

R 1 S ’

C 11

For the data of Example 2.1 with m = 6, n = 6, and N = 12, the six smallest expected values E[ξ_(i)] of the standard normal order statistics are −1.6292, −1.1157, −0.7928, −0.5368, −0.3122, −0.1026; the six largest are these values with the sign reversed. The observed value of the statistic in (3.1) is c_1 = −3.5939, and its null variance from (3.2) is 2.6857, so that the standardized value is Z = −3.5939/√2.6857 = −2.1930 with asymptotic P value Φ(−2.1930) = 0.0142.


Similarly, for the van der Waerden statistic with N = 12, the six smallest weights Φ⁻¹[i/13] are −1.4261, −1.0201, −0.7363, −0.5024, −0.2934, −0.0966. The observed value is X_N = −3.242, the null variance from (3.4) is 2.1624, and the standardized value is Z = −3.242/√2.1624 = −2.2047 with asymptotic P value Φ(−2.2047) = 0.0137.

S S 2 1

S X C

S S S X C À3 242

1 1


K

1965

0 1


S ¼ ½ þ1

¼ ½ þ1 [ ] D

:

¼

¼1 ð À þ1Þ

¼

¼ Àþ1½À ð À Þ

ð3 3Þ :

¼

¼1 À þ1 2ð Þ

¼

¼ Àþ1 À ð À Þ À1 2½

À þ ¼ ¼ 2

À

ð þ1Þ2 Æ

7 3 2 þ ¼0

þ1 À 7 3 2 7 3 3

ð Æ Þ ¼ ð Þ þ ð Þ Æ2 ð Þ ¼

ð À Þ ¼0 ð À Þ ¼ ð4 2 À1Þ6 ð À1Þ ð3 4Þ


B 7 3 4 À ¼

¼ 6 1966

0 025

¼ 6

0 968 ¼ ¼0 42 ¼ ¼0 5 R 0 955

8

C 6

S 8 2 M S 6 6

R


8 ¼1 2 :

¼ 1 2

¼ 1 2 1 2

M 2 4

8 1 2 D þ ¼

¼ 0

¼ 0

1 2 1 2

S W 2 1 þ 5 7 1

W

þ W

þ S /

8 À 3 3 ¼ ¼3 ¼ 3

0 10 W ¼ ¼38 V 3 4 À ¼ ¼ 8 S S 6 4 8 Q 1989

M ’ C 100

36 D 23 7 2

S S

10 18–29 10 50–59 D

18–29:   11, 13, 15, 15, 17, 19, 20, 21, 21, 22
50–59:   8, 9, 10, 11, 12, 13, 5, 17, 19, 23

8 7 1988 18


15

5 1

C

?8 8 M 1986 P 5 12

S

?

S 1:   20, 32, 22, 21, 27, 26, 38
S 2:   34, 20, 30, 28, 25, 23, 29

8

B ’

95 B ’

B 68 75 92 79 95’ 69 76 81 72 75 80

8 S

1972 300 S

1 À2 00 1 14 À17 00 1 1 00 À0 56 11 002 0 00 À2 64 2 00 2 0 50 0 87 5 003 À1 00 À1 96 23 00 3 À0 75 À0 75 1 004 À4 00 0 86 13 00 4 À2 00 À0 60 35 005 À0 75 À2 35 2 00 5 À3 00 0 00 À5 006 À1 75 À2 51 5 00 6 À2 50 À2 54 2 007 À2 75 0 55 8 00 7 0 00 À3 10 3 008 0 00 3 40 3 00 8 0 25 3 48 À7 009 À1 75 0 00 7 00

10 1 00 À4 94 10 00


P QPQ PQ

0–33 18 0 20 3 S 8 10 D

D ’ ?

S 16 18 21 14 25 24 27 1217 15 28 31 30 26 27 20 21 19

8 20

:

C 5 6 7 7 8 8 9 12D 7 8 8 8 9 9 12 13 14 17

S

0 05

8

79 13 138 129 59 76 75 53 96 141 133 107 102 129 110 104

0 10

D

8 8 8

35 50

7 0 90

   :  9.31, 9.57, 10.21, 8.86, 8.52, 10.53, 9.21
M  :  9.14, 9.98, 8.46, 8.93, 10.14, 10.17, 11.04, 9.43


8 0 90 :

¼11 ¼11 ¼5 ¼6

( ) ( ) 8

:

: 62 68 78 92 53 81P : 54 70 79

0 90

8 V À 0 95 M B 2 1


k

C C 8

S C 6 8 M

À

Y ¼ X

À C 6 K S

S

Page 351: Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

8/15/2019 Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

http://slidepdf.com/reader/full/gibbons-chakraboti-nonparametric-statistical-inference-fourth-edition 351/675

0 ¼

À1 À1 ¼ ¼1ð À Þ

2

À1 . ¼1ð À Þ

2

À1

S ’ À1 À1

À ð Þ ¼ À

¼ À ð Þ 0

ð1 1Þ

¼

ð

À

Þ

ð

Þ ¼ð

Þ

0 ¼1 À À

ð À Þ ¼ ð À Þ ð Þ ¼2 ð ÞS

1 1

À ð Þ ¼ À ð Þ 0 ð1 2ÞS

9

Page 352: Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

8/15/2019 Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

http://slidepdf.com/reader/full/gibbons-chakraboti-nonparametric-statistical-inference-fourth-edition 352/675

1 2

0¼ À 0 ¼ À ¼1 2

¼1 2

0 0 0 0

Y ¼ X X Y

X ¼ Y ¼

S

ð Þ ¼ ð Þ 0 6¼1 ð1 3Þ

1 1

1 1 ð Þ ¼ ð Þ

ð Þ ¼ð Þ 1 S ¼ ¼ ¼0¼ 1 1 1 2

ð Þ ¼ ð ÞðÞ 1 ðÞ 1

K

Page 353: Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

8/15/2019 Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

http://slidepdf.com/reader/full/gibbons-chakraboti-nonparametric-statistical-inference-fourth-edition 353/675

1 1 1 3

1 ð Þ ¼À 0 ð Þ ¼ À 1 S

ð Þ ¼1 ð Þ ¼1 2 ð Þ ¼1 ð Þ ¼1 ð Þ ¼ 2 ¼ ð2 Þ 1

1 3

ð Þ ¼ ð Þ ð Þ ¼ ð Þ ¼

¼ ¼0 ¼ ¼0

À ð Þ ¼ À ð Þ 0 6¼1

ð1 4Þ B 1 3 1 4

1 3

¼0

M

— M

B D B S

9

Page 354: Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

8/15/2019 Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

http://slidepdf.com/reader/full/gibbons-chakraboti-nonparametric-statistical-inference-fourth-edition 354/675

K S M S 9 9 D 1976 P

S 9 8 S 9 10

ð þ1Þ2

À ð þ1Þ2

Z ¼ ¼3

P6

¼1 À þ12

À Á ¼ À1 5

1954

¼

¼1 À þ1

2 2

ð2 1Þ ’

S 2 1 2 2

K

Page 355: Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

8/15/2019 Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

http://slidepdf.com/reader/full/gibbons-chakraboti-nonparametric-statistical-inference-fourth-edition 355/675

7 3 2 S 7 3 :

ð Þ ¼

¼1 À þ1

2 2

¼ 2 À ð þ1Þ þ ð þ1Þ

2

4" #¼

ð þ1Þð2 þ1Þ6 À

ð þ1Þ2

2 þ ð þ1Þ

2

4" # 12 ð Þ ¼ ð þ1Þð À1Þ

ð Þ ¼ð 2 À1Þ12 ð2 2Þ

2ð À1Þ ð Þ

¼

¼1 À þ1

2 4

À

¼1 À þ1

2 2" #28<:9=;

¼ 4 À4 þ1

23 þ6 ð þ1Þ

2

42 À4 ð þ1Þ

3

8

"(Â þ ð þ1Þ

4

16 #À ð 2 À1Þ

12 !2)

N v

1 2 3 2 À1

2 À1

2À Á2 À32À Á2 À5

2À Á2 3

2À Á2 12À Á2

2 þ1

2 þ2 À2 À1 12À Á2 3

2À Á2 À5

2À Á2 À32À Á2 À1

2À Á2

N

1 2 3 À12

þ12

À12À Á2 À3

2À Á2 À52À Á2

ð1Þ2 0

þ32 À2 À1

ð1Þ2

À5

2

À Á2

À3

2

À Á2

À1

2

À Á2

9

Page 356: Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

8/15/2019 Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

http://slidepdf.com/reader/full/gibbons-chakraboti-nonparametric-statistical-inference-fourth-edition 356/675

¼1

3 ¼ ð þ1Þ

2 !2

¼1

4 ¼ ð þ1Þð2 þ1Þð3 2 þ3 À1Þ

180

ð

Þ ¼ð þ1Þð 2 À4Þ

180 ð2 3

Þ

¼ Àþ1 7 3 7 7 3 5 ð 2 À1Þ24

¼ L S

D L 1968 2 2 2 3

M

15 2 2 ¼0 76

- - - -

M

¼

¼1 À þ1

2 ¼ ð þ1Þ

¼1 þ1 À12

ð3 1Þ

1957 B1960 D B 1958

K

Page 357: Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

8/15/2019 Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

http://slidepdf.com/reader/full/gibbons-chakraboti-nonparametric-statistical-inference-fourth-edition 357/675

¼

¼1

þ12 À À

þ12 ¼ ð þ1Þ

2 À ð3 2Þ

¼½ð þ1Þ2

¼1 þ

¼½ð þ1Þ2 þ1ð À þ1Þ ð3 3Þ

½

S 1 2

2

ð þ1Þ2 S

M

À ð Þ ¼ À ð Þ 0 6¼1

S 9 7

7 3 6

þ ¼

ð Þ

ð Þ

1 1 ð 0Þ1 2 ð 0Þ6¼1 3 4 2

9

Page 358: Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

8/15/2019 Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

http://slidepdf.com/reader/full/gibbons-chakraboti-nonparametric-statistical-inference-fourth-edition 358/675

À1 À1

ð þ1Þ2 ð þ1Þ2 À1 2

ð Þ ¼À1 À þ1

2 þ À1ð Þ ð Þ ¼À1 À

2 þ À1ð Þ

ð Þ ¼À1 ð À Þ þ À1ð Þ ¼ þ12 !

ð Þ ¼ ð Þ. þ

ð þ Þ ð Þ ¼ À1 ð À Þ þ À1ð Þ 6 6 14 8 2 2 M

20 B

1960

7 3 2 3 3 3 2 ¼ ð þ1Þ2

ð Þ ¼½

¼1 þ

¼½ þ1ð À þ1Þ ¼

½

¼1 þ À½

¼1 " #

ð Þ ¼2 2

¼1 ¼ ð þ2Þ4

ð Þ ¼2Pð À1Þ2

¼1 þ þ12 ¼ ð þ1Þ

2

4

K 7

Page 359: Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

8/15/2019 Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

http://slidepdf.com/reader/full/gibbons-chakraboti-nonparametric-statistical-inference-fourth-edition 359/675

ð Þ ¼ ð Þ

¼ 2ð À1Þ"

¼1 À þ1

2 2

À

¼1 À þ1

2 2#¼ 2ð À1Þ

2ð 2 À1Þ12 À

ð Þ !2( )

¼ 2ð À1Þ 2ð 2 À1Þ

12 À" ð þ1Þ2 À

ð Þ#28<

:9=; ð Þ ¼ 2ð À1Þ

2ð 2 À1Þ12 À

2

4 2" #¼ ð 2 À4Þ

48ð À1Þ

ð Þ ¼ 2ð À1Þ 2ð 2 À1Þ

12 À 2 À1

4 2" #¼ ð

þ1

Þð 2

þ3

Þ48 2

C

ð Þ ¼ð þ2Þ4 ð Þ ¼ð þ1Þ2 4

ð Þ ¼ ð 2 À4Þ48ð À1Þ ð Þ ¼ ð þ1Þð 2 þ3Þ48 2

ð3 4Þ

D B 1958

1 0

8 9

Page 360: Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

8/15/2019 Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

http://slidepdf.com/reader/full/gibbons-chakraboti-nonparametric-statistical-inference-fourth-edition 360/675

¼½ð þ1Þ2

¼1

þ22 !À þ

¼½ð þ1Þ2 þ1À

þ12 !

ð3 5Þ B

‘‘ ’’

0 ð þ1Þ2 7 3 2:

ð Þ ¼ð þ2Þ4 ð Þ ¼ð 2 À1Þ4

ð Þ ¼ ð 2 À4Þ48ð À1Þ ð Þ ¼ ð þ1Þð 2 þ3Þ48 2

ð3 6Þ

þ ¼ ð þ2Þ2½ ð3 7ÞS

D B1958 ¼ 8

S 6 2 ¼0 608

- K

S

W

K

Page 361: Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

8/15/2019 Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

http://slidepdf.com/reader/full/gibbons-chakraboti-nonparametric-statistical-inference-fourth-edition 361/675

S 1960

S

¼

¼1

¼

2 1 2

2 À1 1 22ð ÀÞ þ2 2 2ð ÀÞ þ1 2 8>>><>>>:

ð4 1Þ

S W :

ð Þ ¼ð þ1Þ2 ð Þ ¼ ð þ1Þ

12 ð4 2Þ W

J 10 S

0

4 1 2

1 2 3 4 5 ÁÁÁ 2* ÁÁÁ À4 À3 À2 À1

1 4 5 8 9 ÁÁÁ ÁÁÁ10 7 6 3 2

* 2 ¼ ð 2Þ þ1

9

Page 362: Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

8/15/2019 Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

http://slidepdf.com/reader/full/gibbons-chakraboti-nonparametric-statistical-inference-fourth-edition 362/675

2 2 0 2 2ð Þþ1 0 2ð Þþ1 þ 0þ1À Á4

:

00 ¼ ¼ 2 þ1 À ¼

þ1ð Þ2 À ð4 3Þ

K -

(196 ) M

[ 8 3 2 ] S

¼

¼1

À1

þ1

!2

ð5 1Þ

ð Þ S 0

W - k

W 1 3 4 5 ÁÁÁ 2

1 4 5 8 9 0 0 2 3 6 7 10 À1 þ 0 þ 0 3 7 11 15 19 2 À100 ð þ 0þ1Þ4 1 2 3 4 5 2

W ð 2Þ þ1 ÁÁÁ À4 À3 À2 À1

À1 10 7 6 3 20 0 9 8 5 4 1 þ 0 þ 0 2 À1 19 15 11 7 300 ð þ 0þ1Þ4 2 5 4 3 2 1

K

Page 363: Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

8/15/2019 Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

http://slidepdf.com/reader/full/gibbons-chakraboti-nonparametric-statistical-inference-fourth-edition 363/675

20 K 1962

ð Þ ¼

¼1À1

þ1 !2

ð Þ ¼ ð À1Þ

¼1À1

þ1 !4À ð À1Þ½ ð Þ

2

S R

1

C 1961

¼1 ½ ð2

ðÞÞ ð5 2Þ

ðÞ 8 3 1

1956 S 1962 20 K B 1977

50

K

8 3 3 ¼ ¼ 2 þ

D B ¼

ð þ Þ ¼2

ð þ Þ ¼ ð4 2 À À6 3Þ6 2ð À1Þ

¼ ¼

¼ 6 1966

¼6

¼ 0 50

R 0 850

¼ ¼1 8 R 0 76 M ’

9

Page 364: Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

8/15/2019 Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

http://slidepdf.com/reader/full/gibbons-chakraboti-nonparametric-statistical-inference-fourth-edition 364/675

0 608 S 9 3 9 4 25

S 9 5 R

7 K

S

’ ’

’ ’ ’ ’

’ ’ M 6 6 2

¼1 0 0

0 & S 1957

¼¼1 ¼1

ð7 1Þ

¼ ð 0 0 Þ¼Z 0

À1Z À1 ð Þ ð Þ þZ 1

0 Z 1

ð Þ ð Þ

¼

Z 0

À1

ð Þ ð Þþ

Z 1

0 ½1 À ð Þ ð Þ

¼Z 0

À1½ ð ÞÀ ð Þ ð ÞÞþZ 1

0 ½ ð ÞÀ ð Þ ð Þþ1 4

ð7 2Þ

K

Page 365: Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

8/15/2019 Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

http://slidepdf.com/reader/full/gibbons-chakraboti-nonparametric-statistical-inference-fourth-edition 365/675

0 ¼1 4

ð Þ ¼B 1 2 M

6 6 10 6 6 11

1 ¼ ½ð 0 0 Þ\ð 0 0 Þ¼ ½ð 0 0Þ þ ½ð 0Þ\ð 0Þ¼Z 0

À1½ ð Þ

2 ð ÞþZ 10 ½1 À ð Þ

2 ð Þ ð7 3Þ 2 ¼ ½ð 0 0 Þ\ð 0 0 Þ

¼ ½ð 0Þ\ð 0Þ þ ½ð 0Þ\ð 0Þ¼Z 0

À1½1 2 À ð Þ

2 ð ÞþZ 10 ½ ð ÞÀ1 2 2 ð Þ ð7 4Þ

6 6 11

ð Þ ¼ ½ À 2ð À1Þ þ ðÀ1Þ 1 þð À1Þ 2 ð7 5ÞS ð Þ ¼ ð Þ !0

S 7 2 ¼ À1 4

¼Z 0

À1½ ð Þ À ð Þ ð Þ þZ 1

0 ½ ð Þ À ð Þ ð Þð7 6Þ

1 4 ð 0Þ ð 1Þ À 4 1 ð 0Þ 1 4 ð 0Þ ð 1Þ À 4 2 ð 0Þ 7 7 6¼1 4 ð 6¼0Þ ð6¼1Þ À 4 3 2

9

Page 366: Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

8/15/2019 Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

http://slidepdf.com/reader/full/gibbons-chakraboti-nonparametric-statistical-inference-fourth-edition 366/675

ð Þ ð Þ

7 6

ð Þ ¼ ð Þ

1 1 ð Þ ð Þ 0 ð Þ ð Þ 02 1 ð Þ ð Þ 0 ð Þ ð Þ 0

7 6

¼ ÆZ 1

À1 ð Þ À ð Þ ð Þ ð7 8Þ

1 1 7 7

M

¼W þ ð À ÞðÀW Þ ð7 9Þ W

L 1976

0¼¼1 ¼1

0 ¼ À 0 ¼1 0

0 0

(ð7 10Þ

7 9 2 0 0 2 2 0 2

2 W 22 2

L 1976

K

Page 367: Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

8/15/2019 Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

http://slidepdf.com/reader/full/gibbons-chakraboti-nonparametric-statistical-inference-fourth-edition 367/675

ð Þ ¼ ð Þ ¼1 41 ¼ 2 ¼1 12 S 7 5

ð Þ ¼ 4 ð Þ ¼ ðð þ7ÞÞ48

4 ffiffiffi3 ð À 4Þ ffiffiffiffiffiffiffiffiffiffiffiffið þ7Þ ð7 11Þ

¼ ¼0 ¼0 6 6 16

S

À S

S S 8 2 M

L 7 1 P ¼1 :

1 0 ð Þ À

2 0

À ð Þ þ1 ÀV

’ 0

V ’ 0

9

Page 368: Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

8/15/2019 Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

http://slidepdf.com/reader/full/gibbons-chakraboti-nonparametric-statistical-inference-fourth-edition 368/675

¼1 ¼0

¼¼1

0

½ ð Þ À þ¼1

0

½ À ð Þ þ1 ÀV

¼ 0 þ 0ð À þ1Þ À À V

¼ 0 þ 0ð À þ1Þ À ð þ1Þ

2 ÀV ðV þ1Þ

2

P 0

0 V X

B ! 1

V 2 À ð þ2Þ4 3 3

S 9 3 9 4 R 6 2

8 -

1 2 ¼ ¼

À ð Þ ¼ À ð Þ 0

S

0 0¼

ð 0 Þ ¼ð Þ ¼ ð Þ ¼ ð Þ 0

1 À

K 7

Page 369: Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

8/15/2019 Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

http://slidepdf.com/reader/full/gibbons-chakraboti-nonparametric-statistical-inference-fourth-edition 369/675

¼1 2

¼1 2 S 7 1 ð Þ 0

0 S

0 1 2 1 À

ð Þ ð 0Þ

ð8 1Þ

ð

Þð

Þ ð

Þð 0

Þ 0

0 L

1976 7 9 0 5

¼ 4 þ0 5 À 2 ffiffiffiffiffiffiffiffiffiffið þ7Þ48 ð8 2Þ

0 0¼ 2 À þ1

S 9 10

M M K

S

8 9

Page 370: Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

8/15/2019 Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

http://slidepdf.com/reader/full/gibbons-chakraboti-nonparametric-statistical-inference-fourth-edition 370/675

þ ¼ 4

ðÞ ¼ 2 À

2 0 ð9 1

Þ 1948

R

ðÞ ¼ð À1Þ ð þ À1 À þ2Þ ð9 2Þ

P9 9 R 1953 K

L

¼ À þ ð9 3Þ K 1956

¼ þ R ’

R1965

K

Page 371: Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

8/15/2019 Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

http://slidepdf.com/reader/full/gibbons-chakraboti-nonparametric-statistical-inference-fourth-edition 371/675

0¼ À 0¼ À

0

0 10 1

S 9 10

0¼ À 0¼ À

00

0 0

C 8 10 1

S 8 2

M 1963

2

2 À À

ð

À

À

Þ

V

ð V Þ Q

9

Page 372: Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

8/15/2019 Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

http://slidepdf.com/reader/full/gibbons-chakraboti-nonparametric-statistical-inference-fourth-edition 372/675

R

0 304 0 955

S S 9 4

J

0 ¼

¼1 þ ¼

4 1 10

0 5 :

¼ 1 ð 0Þ¼ 1 0 ð 0Þ¼

6¼1 2 0 2 2

¼

1 ð þ1Þ2 1 À À0 5 À ð þ1Þ2

ffiffiffiffiffiffiffið þ1Þ12 " #þ 0 5 þ ffiffiffiffiffiffiffiffiffiffiffið þ1Þ12

¼

1 ð

þ1

Þ2 þ0 5

À ð

þ1

Þ2

ffiffiffiffiffiffiffið þ1Þ12 " #À0 5 À ffiffiffiffiffiffiffiffiffiffiffið þ1Þ12

¼

6¼1 B 2 2

K

Page 373: Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

8/15/2019 Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

http://slidepdf.com/reader/full/gibbons-chakraboti-nonparametric-statistical-inference-fourth-edition 373/675

p

B 10

?

S : 0 028 0 029 0 011 À0 030 0 017 À0 012 À0 027

À0 018 0 022 À0 023S :

À0 002 0 016 0 005

À0 001 0 000 0 008

À0 005

À0 009 0 001 À0 019

S

S

10 1 ¼60 J ¼10

¼10 ¼0 000 S 0 1 1

S X C S S 10 1

W W

À0 030 1 0 000 19

À0 027 4 0 001 18

À0 023 5 0 005 15

À0 019 8 0 008 14

À0 018 9 0 011 11

À0 012 12 0 016 10

À0 009 13 0 017 7À0 005 16 0 022 6

À0 002 17 0 028 3

À0 001 20 0 029 2

9

Page 374: Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

8/15/2019 Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

http://slidepdf.com/reader/full/gibbons-chakraboti-nonparametric-statistical-inference-fourth-edition 374/675

S X C S S S X C

À3 402 S S À3 3639 M B

K

Page 375: Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

8/15/2019 Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

http://slidepdf.com/reader/full/gibbons-chakraboti-nonparametric-statistical-inference-fourth-edition 375/675

S S

: À0 030 À0 027 À0 023 À0 019 À0 018 À0 012 À0 009

À0 005 À0 002 À0 001P : 0 000 0 001 0 005 0 008 0 011 0 016 0 017 0 022 0 028

0 029

¼2 þ1 ¼3 7 1 0¼23 þ24 ¼47 7 10

¼ À2 93

0 0017 ¼ À2 87¼0 0021 þ 0¼

S

9

Page 376: Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

8/15/2019 Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

http://slidepdf.com/reader/full/gibbons-chakraboti-nonparametric-statistical-inference-fourth-edition 376/675

S S S S X C

À À

À À

1 20

10 2 W ¼149 J 0 000

S S

À À

¼ 1 1 À À À À

v k

0 000 1 0 016 110 001 2 5 0 017 120 001 2 5 0 018 130 002 4 0 019 140 005 5 5 0 022 150 005 5 5 0 023 16

0 008 7 0 027 170 009 8 0 028 180 011 9 0 029 190 012 10 0 030 20

K

Page 377: Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

8/15/2019 Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

http://slidepdf.com/reader/full/gibbons-chakraboti-nonparametric-statistical-inference-fourth-edition 377/675

Page 378: Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

8/15/2019 Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

http://slidepdf.com/reader/full/gibbons-chakraboti-nonparametric-statistical-inference-fourth-edition 378/675

1 75 23 0

M — P

J

8 2 6 10 2

ðY ÀM ÞðX ÀM Þ

À

À À0 019 À0 009 À0 005 À0 002 À0 001

À0 030 0 6333 0 3000 0 1667 0 0667 0 0333

À0 027 0 7037 0 3333 0 1852 0 0741 0 0370

À0 023 0 8261 0 3913 0 2174 0 0870 0 0435

À0 018 1 0556 0 5000 0 2778 0 1111 0 0556

À0 012 1 5833 0 7500 0 4167 0 1667 0 0833

À

À 0 000 0 001 0 005 0 008 0 016

0 011 0 0 0909 04545 0 7273 1 45450 017 0 0 0588 0 2941 0 4706 0 94120 022 0 0 0455 0 2273 0 3636 0 72730 028 0 0 0357 0 1786 0 2857 0 57140 029 0 0 0345 0 1724 0 2759 0 5517

K 7

Page 379: Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

8/15/2019 Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

http://slidepdf.com/reader/full/gibbons-chakraboti-nonparametric-statistical-inference-fourth-edition 379/675

p

: 35 66 58 83 71 : 46 56 60 49

B

S B

¼4

¼5

35 46 49 56 58 60 66 71 83 W ¼15 J ¼0 143 S ¼24 ¼0 206 J

0 95

J ¼4 ¼5 ¼3 0 936 20 10 5

0 648 1 400

¼1

B/A

35 58 66 71 83

46 1 314 0 793 0 697 0 648 0 55449 1 400 0 845 0 742 0 690 0 59056 1 600 0 966 0 848 0 789 0 67560 1 714 1 034 0 909 0 845 0 723

8 9

Page 380: Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

8/15/2019 Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

http://slidepdf.com/reader/full/gibbons-chakraboti-nonparametric-statistical-inference-fourth-edition 380/675

S 9 2 9 6 À

0¼ À ð À Þ 0 0

0¼ À 0¼ À

0 0

R C 1

¼ ¼1 ¼ 0 6¼1

1 ¼ 0¼ 0 0 ¼1 B

0¼ 0 0 ¼1 ¼ ¼0

2 S À 0¼ ½ À ð À Þ 0

0 ¼1 B 0¼ 0 0 ¼1

¼ ¼03

0¼ ð À Þ 0 0¼ À 0 0¼1 0¼ 0¼0 ¼ 0 6¼1

0 6¼1

S

10 1

K

Page 381: Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

8/15/2019 Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

http://slidepdf.com/reader/full/gibbons-chakraboti-nonparametric-statistical-inference-fourth-edition 381/675

¼ 0 6¼1

10 2 B ¼

B D B S 9 3 0 608

0 600 0 94

R M S 9 2 0 76 K C

S 5 R 1 00 R 0 850

¼ ¼1 8

D ¼ ¼3 M ’

D ¼ ¼3 B 3 3

V 2 3 ð Þ 7 3 2

3 1

7 3 2 3 5

V 4 3 7 4 3 3 4

P 9 4 9 5 8 7 3 2 þ 6¼

þ V 9 2 R ’

1988

P ’

9

Page 382: Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

8/15/2019 Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

http://slidepdf.com/reader/full/gibbons-chakraboti-nonparametric-statistical-inference-fourth-edition 382/675

10 ¼ C 1 ’ 2

’ 5 ?

R :

: 1200 1220 1300 1170 1080 1110 1130 : 1210 1180 1000 1010 980 1400 1430 1390 970

S

S 16 0

: 15 7 16 1 15 9 16 2 15 9 16 0 15 8 16 1 16 3 16 5 15 5 : 15 4 16 0 15 6 15 7 16 6 16 3 16 4 16 8 15 2 16 9 15 1

0 95

/

D :

L : 36 36 38 40 41 41 42L : 29 34 37 39 40 43 44

1

7 34 64 75 94 36 2

6 44 83 27 6

K

Page 383: Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

8/15/2019 Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

http://slidepdf.com/reader/full/gibbons-chakraboti-nonparametric-statistical-inference-fourth-edition 383/675

S

M ¼M ¼40

?

95% 1

2

9

Page 384: Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

8/15/2019 Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

http://slidepdf.com/reader/full/gibbons-chakraboti-nonparametric-statistical-inference-fourth-edition 384/675

10
Tests of the Equality of k Independent Samples

10.1 INTRODUCTION

The natural extension of the two-sample problem is the k-sample problem, where observations are taken under a variety of different and independent conditions. Assume that we have k independent sets of observations, one from each of k continuous populations F_1(x), F_2(x), ..., F_k(x), where the ith random sample is of size n_i, i = 1, 2, ..., k, and there are a total of Σ_{i=1}^k n_i = N observations. Note that we are again assuming the independence extends across samples in addition to within samples. The extension of the two-sample hypothesis to the k-sample problem is that all k samples are drawn from identical populations:

    H_0: F_1(x) = F_2(x) = ··· = F_k(x)    for all x

The general alternative is simply that the populations differ in some way.



The best-known parametric procedure in this situation is the one-way analysis-of-variance F test of

    H_0: μ_1 = μ_2 = ··· = μ_k

against the alternative that μ_i ≠ μ_j for at least one pair i ≠ j, where μ_i denotes the mean of the ith population. The test statistic is the ratio of the between-samples mean square to the within-samples mean square,

    F = [ Σ_{i=1}^k n_i (X̄_i − X̄)² / (k − 1) ] / [ Σ_{i=1}^k Σ_{j=1}^{n_i} (X_ij − X̄_i)² / (N − k) ]

which, under the additional assumption of normal populations with equal variances, has the F distribution with k − 1 and N − k degrees of freedom, and H_0 is rejected for large values of F.



when appropriate, it is a useful k-sample technique for the hypothesis of identical populations and therefore is included in this chapter.

10.2 EXTENSION OF THE MEDIAN TEST

Under the hypothesis of identical populations, we have a single random sample of size Σ_{i=1}^k n_i = N from the common population. The grand median δ of the pooled samples is an estimate of the median of this common population. Therefore, an observation from any of the k samples is as likely to be above δ as below it. The set of N observations will support the null hypothesis then if, for each of the k samples, about half of the observations in that sample are less than the grand sample median. A test based on this criterion is attributed to Mood (1950) and Brown and Mood (1948, 1951).

As in the two-sample case, the grand sample median δ will be defined as the observation in the pooled ordered sample which has rank (N + 1)/2 if N is odd, and any number between the two observations with ranks N/2 and (N/2) + 1 if N is even. Then, for each sample separately, the observations are dichotomized according as they are less than δ or not. Define the random variable U_i as the number of observations in sample number i which are less than δ, and let t denote the total number of observations which are less than δ. Then, by the definition of δ, we have

    t = Σ_{i=1}^k u_i = { N/2         if N is even
                        { (N − 1)/2   if N is odd

Letting u_i denote the observed value of U_i, we can present the calculations in the following table.

Under the null hypothesis, each of the C(N, t) possible sets of t observations is equally likely to be in the less-than-δ category, and the number of dichotomizations with this particular sample outcome is Π_{i=1}^k C(n_i, u_i). Therefore the null probability distribution of the random

          Sample 1    Sample 2    ···    Sample k    Total
< δ       u_1         u_2         ···    u_k         t
≥ δ       n_1 − u_1   n_2 − u_2   ···    n_k − u_k   N − t
Total     n_1         n_2         ···    n_k         N

TESTS OF THE EQUALITY OF k INDEPENDENT SAMPLES 355


variables is the multivariate extension of the hypergeometric distribution, or

    f(u_1, u_2, ..., u_k | t) = [ C(n_1, u_1) C(n_2, u_2) ··· C(n_k, u_k) ] / C(N, t)    (2.1)
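The null probability of any particular configuration (u_1, ..., u_k) given t can be evaluated directly from this multivariate hypergeometric form. A small sketch, with illustrative sample sizes and counts:

```python
from math import comb

# Sketch of the multivariate hypergeometric null probability above:
# P(U_1 = u_1, ..., U_k = u_k | t) = prod_i C(n_i, u_i) / C(N, t).
# Sample sizes and counts are illustrative only.
def null_prob(n, u):
    N, t = sum(n), sum(u)
    num = 1
    for ni, ui in zip(n, u):
        num *= comb(ni, ui)
    return num / comb(N, t)

p = null_prob([3, 3, 3], [1, 2, 1])  # three samples of size 3, t = 4
```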

If any or all of the U_i differ too much from their expected value of n_i θ, where θ denotes the probability that an observation from the common population is less than δ, the null hypothesis should be rejected. Generally, it would be impractical to set up joint rejection regions for the test statistics U_1, U_2, ..., U_k, because of the large variety of combinations of the sample sizes n_1, n_2, ..., n_k and the fact that the alternative hypothesis is generally two-sided for k > 2, as in the case of the F test. Fortunately, we can use another test criterion which, although an approximation, is reasonably accurate even for quite small N if each sample consists of at least five observations. This test statistic can be derived by appealing to the analysis of goodness-of-fit tests in the chapter on goodness of fit. Each of the N elements in the pooled sample is classified according to two criteria, sample number and its magnitude relative to δ. Let these 2k categories be denoted by (i, j), where i = 1, 2, ..., k according to the sample number and j = 1 if the observation is less than δ and j = 2 otherwise. Denote the observed and expected frequencies for the (i, j) category by f_ij and e_ij, respectively. Then

    f_i1 = u_i    f_i2 = n_i − u_i    for i = 1, 2, ..., k

and the expected frequencies under H_0 are estimated from the data by

    e_i1 = n_i t / N    e_i2 = n_i (N − t) / N    for i = 1, 2, ..., k

The goodness-of-fit test criterion for these 2k categories is then

    Q = Σ_{i=1}^k Σ_{j=1}^2 (f_ij − e_ij)² / e_ij

      = Σ_{i=1}^k (u_i − n_i t/N)² / (n_i t/N) + Σ_{i=1}^k [n_i − u_i − n_i(N − t)/N]² / [n_i(N − t)/N]

356 CHAPTER 10


      = (N/t) Σ_{i=1}^k (u_i − n_i t/N)² / n_i + [N/(N − t)] Σ_{i=1}^k (n_i t/N − u_i)² / n_i

      = [N² / t(N − t)] Σ_{i=1}^k (u_i − n_i t/N)² / n_i    (2.2)

and has approximately the chi-square distribution under H_0. The parameters estimated from the data are, for each of the k samples, the probability that an observation is less than δ and the probability that it is not less than δ. But for each sample these two probabilities sum to 1, and so there are only k independent parameters estimated. The number of degrees of freedom for Q is then 2k − 1 − k, or k − 1. The chi-square approximation to the distribution of Q is somewhat improved by multiplication of Q by the factor (N − 1)/N. Then the rejection region is

    Q(N − 1)/N ≥ χ²_{k−1, α}
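The statistic Q and its corrected version can be computed in a few lines. This is our own sketch, not the book's code, and the below-median counts used here are hypothetical.

```python
# Our own sketch (hypothetical counts): the median-test statistic and its
# (N - 1)/N corrected version, to be compared with a chi-square critical
# value having k - 1 degrees of freedom.
def median_test_Q(n, u):
    N, t = sum(n), sum(u)
    Q = (N ** 2 / (t * (N - t))) * sum(
        (ui - ni * t / N) ** 2 / ni for ni, ui in zip(n, u)
    )
    return Q, Q * (N - 1) / N

# four samples of 10 observations each, with hypothetical below-median counts
Q, Q_corr = median_test_Q([10, 10, 10, 10], [8, 2, 5, 5])
```

Here each expected count n_i t/N is 5, so Q = 4 × [(8−5)² + (2−5)²]/10 = 7.2 and the corrected value is 7.2 × 39/40 = 7.02.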

As with the two-sample median test, tied observations do not present a problem unless there is more than one observation equal to the median, which can occur only for N odd, or if N is even and the two middle observations are equal. The conservative approach is suggested, whereby the decision is based on that resolution of ties which leads to the smallest value of Q.

Example 2.1 A study has shown that a sizable percentage of normal sleepers snore occasionally, while a smaller percentage snore almost all the time. Many patents have been registered in the U.S. Patent Office for devices purported to stop snoring. Three of these devices are a squeaker sewn into the back of night clothes, a tie to secure the wrists to the sides of the bed, and a chin strap to keep the mouth shut. An experiment was conducted to determine which device is the most effective in stopping snoring, or at least in reducing it. Fifteen men who are habitual snorers were divided randomly into three groups to test the devices. Each man's sleep was monitored for one night by a machine that measures amount of snoring on a numerical point scale while using a device. Analyze the results shown below to determine whether the three devices are equally effective or not.



Solution The overall sample median is the observation with rank (N + 1)/2 = 8 in the pooled array. Since N = 15 is odd, we have t = 7 and the data are dichotomized accordingly. We calculate Q from (2.2) and the corrected value Q(N − 1)/N = 14Q/15. With df = 2 we find the P value from the chi-square table of the Appendix. There is no evidence that the three medians differ.

The STATXACT solution to Example 2.1 is shown below. Note that the results do not agree with ours. This is because STATXACT defines the U statistics as the number of sample observations that are less than or equal to δ, rather than the number strictly less than δ as we did. This means that it defines t = (N + 1)/2 whenever N is odd, while we define t = (N − 1)/2. The U statistics under this definition of δ, for groups 1, 2, and 3, respectively, lead to the different value of Q that the printout shows. The difference in the answers is surprisingly large and the conclusions are not the same. The STATXACT printout also shows an exact P value and a point probability. We discuss these after Example 2.2.

Squeaker    Wrist tie    Chin strap

(The snoring scores for the three groups, and the resulting dichotomized counts for groups 1, 2, and 3, are not legible in this copy.)



Example 2.2 The staff of a mental hospital is concerned with which kind of treatment is most effective for a particular type of mental disorder. A battery of tests administered to all patients delineated a group of 40 patients who were similar as regards diagnosis and also personality, intelligence, and projective and physiological factors. These people were randomly divided into four different groups of 10 each for treatment. For several months the respective groups received (1) electroshock, (2) psychotherapy, (3) electroshock plus psychotherapy, and (4) no type of treatment. At the end of this period the battery of tests was repeated on each patient. The only type of measurement possible for these tests is a ranking of all 40 patients on the basis of their relative degree of improvement at the end of the treatment period; rank 1 indicates the highest level of improvement, rank 2 the second highest, and so forth. On the basis of these data (see Table 2.1), does there seem to be any difference in effectiveness of the types of treatment?

Solution We use the median test to see whether the four groups have the same location. The overall sample median is the observation with rank 20.5 since N = 40, and we note that t = 20 and n_i t/N = 5. The results are

Table 2.1 Ranking of Patients

(The individual ranks for the four treatment groups, and the dichotomized counts, are not legible in this copy.)



We calculate Q from (2.2) and the corrected value Q(N − 1)/N = 39Q/40. From the chi-square table with df = 3, we find a very small P value and we reject the null hypothesis of equal medians for the four groups.

The STATXACT solution for this example is shown below. The results agree with ours and always will when N is even. As with Example 2.1, the STATXACT solution shows the calculation of an exact P value and a point probability; this is exact in the sense that it is calculated using the multivariate hypergeometric distribution given in (2.1). STATXACT also provides a Monte Carlo estimate of the P value. The reader is referred to the STATXACT manual for more details.
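A Monte Carlo estimate of the P value, in the spirit of the one mentioned above, can be sketched as follows. This is our own illustration, not STATXACT's algorithm, and the counts used are hypothetical.

```python
import random

# Our own illustration (not STATXACT's algorithm): a Monte Carlo estimate of
# the median-test P value, obtained by randomly reallocating the t
# below-median labels among the N pooled positions and recomputing Q.
# The counts below are hypothetical.
def mc_pvalue(n, u, reps=2000, seed=1):
    rng = random.Random(seed)
    N, t = sum(n), sum(u)

    def stat(uvec):
        return (N ** 2 / (t * (N - t))) * sum(
            (ui - ni * t / N) ** 2 / ni for ni, ui in zip(n, uvec)
        )

    q_obs = stat(u)
    labels = [1] * t + [0] * (N - t)  # 1 marks an observation below the median
    hits = 0
    for _ in range(reps):
        rng.shuffle(labels)
        uvec, pos = [], 0
        for ni in n:  # split the shuffled labels into samples of size n_i
            uvec.append(sum(labels[pos:pos + ni]))
            pos += ni
        if stat(uvec) >= q_obs - 1e-12:
            hits += 1
    return hits / reps

p_hat = mc_pvalue([10, 10, 10, 10], [8, 2, 5, 5])
```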

10.3 EXTENSION OF THE CONTROL MEDIAN TEST

The control median test was presented in Chapter 6 as an alternative to the median test in the context of the two-sample problem.

We showed that the control median test is simple to use, is as efficient as the median test in large samples, and is advantageous in certain experimental situations. We now present a generalization of the control median test, due to Sen (1962), to the k-sample case,



where the null hypothesis is that the k ≥ 2 populations are identical against the general alternative that the populations are different in some way.

Suppose that independent random samples of sizes n_1, n_2, ..., n_k are available from populations 1 through k. Without any loss of generality let sample 1 be the control sample. First, we choose q ≥ 1 fractions 0 < p_1 < p_2 < ··· < p_q < 1 and find the quantiles X_(1) < X_(2) < ··· < X_(q), corresponding to these fractions, from the first sample. Thus X_(i) is the r_i th-order statistic of the first sample, where r_i = [n_1 p_i] + 1 and [x] denotes the largest integer not exceeding x.

The q quantiles define q + 1 nonoverlapping and contiguous cells or blocks written as

    I_j : (X_(j−1), X_(j))    for j = 1, 2, ..., q + 1

where X_(0) = −∞ and X_(q+1) = +∞. For the two-sample control median test we have k = 2, q = 1, and p_1 = 1/2, and the test is based on the number of observations in sample 2 that belong to I_1. For k > 2 samples and q ≥ 1 quantiles, we count for the ith sample the number of observations V_ij that belong to the block I_j, for j = 1, 2, ..., q + 1 and i = 2, 3, ..., k, so that n_i = Σ_{j=1}^{q+1} V_ij. The generalization of the control median test is based on these counts. It may be noted that in the terminology introduced earlier, the count V_ij is the frequency of the jth block for the ith sample.
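The construction of the quantiles and block counts can be sketched as follows. The order-statistic index r_j = [n_1 p_j] + 1 and the inclusion of the block endpoints are conventions assumed for this sketch, and the data are made up.

```python
# Sketch with assumed conventions: quantiles from the control sample are the
# order statistics with index r_j = floor(n_1 * p_j) + 1 (an assumption of
# this sketch), and blocks are taken as half-open intervals (X_(j-1), X_(j)].
def block_counts(control, sample, fractions):
    ctrl = sorted(control)
    n1 = len(ctrl)
    # 0-based index r_j - 1 = floor(n1 * p_j)
    quantiles = [ctrl[int(n1 * p)] for p in fractions]
    edges = [float("-inf")] + quantiles + [float("inf")]
    counts = [
        sum(1 for x in sample if edges[j] < x <= edges[j + 1])
        for j in range(len(edges) - 1)
    ]
    return quantiles, counts

# q = 1, p_1 = 1/2: the layout of the two-sample control median test
quantiles, counts = block_counts([9, 1, 5, 3, 7], [2, 4, 6, 8], [0.5])
```

Here the control median is 5 and the second sample contributes two observations to each of the two blocks.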

The derivation of the joint distribution of the counts V_ij, i = 2, 3, ..., k; j = 1, 2, ..., q + 1, provides an interesting example of computing probabilities by conditioning. To this end, observe that given (conditional on) the q quantiles from the first sample, X_(1) < X_(2) < ··· < X_(q), the joint distribution of V_i1, V_i2, ..., V_i,q+1, for any i = 2, 3, ..., k, is multinomial, given by

    [n_i! / (v_i1! v_i2! ··· v_i,q+1!)] [F_i(X_(1))]^{v_i1} [F_i(X_(2)) − F_i(X_(1))]^{v_i2} ··· [F_i(X_(q)) − F_i(X_(q−1))]^{v_iq} [1 − F_i(X_(q))]^{v_i,q+1}    (3.1)

where v_i,q+1 = n_i − (v_i1 + v_i2 + ··· + v_iq).
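Given the cell probabilities obtained by evaluating F_i at the control-sample quantiles, the conditional multinomial probability above can be computed directly. The counts and probabilities used here are illustrative only.

```python
from math import factorial

# Sketch: the conditional multinomial probability of one sample's block
# counts, given cell probabilities F_i evaluated at the control-sample
# quantiles.  The counts and probabilities are illustrative.
def multinomial_pmf(counts, probs):
    n = sum(counts)
    coef = factorial(n)
    for c in counts:
        coef //= factorial(c)
    p = float(coef)
    for c, pr in zip(counts, probs):
        p *= pr ** c
    return p

p = multinomial_pmf([1, 2, 1], [0.25, 0.5, 0.25])
```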

The desired (unconditional) joint probability distribution of V_i1, V_i2, ..., V_i,q+1 can be derived by calculating the expectation of the expression in (3.1) with respect to the chosen quantiles from the first sample. The joint distribution of the q quantiles from the first sample is



f(x_1, x_2, ..., x_q) = n_1!/[(r_1 − 1)!(r_2 − r_1 − 1)! ··· (n_1 − r_q)!]
    × [F_1(x_1)]^{r_1−1} [F_1(x_2) − F_1(x_1)]^{r_2−r_1−1} ··· [1 − F_1(x_q)]^{n_1−r_q}
    × f_1(x_1) f_1(x_2) ··· f_1(x_q)    for x_1 < x_2 < ··· < x_q    (3.2)

Multiplying the multinomial probability in (3.1) for i = 2, 3, ..., k by the density in (3.2) and integrating over the region x_1 < x_2 < ··· < x_q gives the desired unconditional joint probability distribution (3.3) of the counts V_ij; the resulting expression is cumbersome and is not reproduced here. Under the null hypothesis that all the populations are identical, this distribution does not depend on the common distribution, so any test based on the block frequencies is distribution free. A large-sample test of the null hypothesis can then be based on a chi-square-type statistic that compares each observed block frequency V_ij with its expected value under H_0.


magnitude of each observation when compared with every otherobservation. This comparison is effected in terms of ranks.

Since under H_0 we have essentially a single sample of size N from the common population, combine the N observations into a single ordered sequence from smallest to largest, keeping track of which observation is from which sample, and assign the ranks 1, 2, ..., N to the sequence. If adjacent ranks are well distributed among the k samples, which would be true for a random sample from a single population, the total sum of ranks, Σ_{i=1}^N i = N(N + 1)/2, would be divided proportionally according to sample size among the k samples. For the ith sample, which contains n_i observations, the expected sum of ranks would be

(n_i/N)[N(N + 1)/2] = n_i(N + 1)/2

Equivalently, since the expected rank for any observation is the average rank (N + 1)/2, the expected sum of ranks for n_i observations is n_i(N + 1)/2. Denote the actual sum of ranks assigned to the elements in the ith sample by R_i. A reasonable test statistic could be based on a function of the deviations between these observed and expected rank sums. Since deviations in either direction indicate disparity between the samples, and absolute values are not particularly tractable mathematically, the sum of squares of these deviations can be employed as

S = Σ_{i=1}^k [R_i − n_i(N + 1)/2]²    (4.1)

The null hypothesis is rejected for large values of S. In order to determine the null probability distribution of S, consider the ranked sample data recorded in a table with k columns, where the entries in the ith column are the n_i ranks assigned to the elements in the ith sample. Then R_i is the ith-column sum. Under H_0, the integers 1, 2, ..., N are assigned at random to the k columns except for the restriction that there be n_i integers in column i. The total number of ways to make the assignment of ranks then is the number of partitions of N distinct elements into k ordered sets, the ith of size n_i, and this is

N!/(n_1! n_2! ··· n_k!)
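For very small samples this enumeration can be carried out directly. The brute-force sketch below (Python; purely illustrative, not from the text) lists all orderings of the ranks; every partition into columns occurs equally often among the permutations, so the relative frequencies are the exact null probabilities of S:

```python
from itertools import permutations
from collections import Counter

def exact_null_S(sizes):
    """Exact null distribution of S = sum_i [R_i - n_i(N+1)/2]^2."""
    N = sum(sizes)
    tally = Counter()
    for perm in permutations(range(1, N + 1)):
        s, start = 0.0, 0
        for n in sizes:
            R = sum(perm[start:start + n])   # column (sample) rank sum
            s += (R - n * (N + 1) / 2) ** 2
            start += n
        # each partition into columns is counted n_1!...n_k! times,
        # which cancels when converting counts to probabilities
        tally[s] += 1
    total = sum(tally.values())
    return {s: c / total for s, c in tally.items()}

dist = exact_null_S([2, 2])   # tiny case: k = 2 samples of size 2
```

For sizes (2, 2) this gives P(S = 0) = P(S = 2) = P(S = 8) = 1/3, in agreement with listing the six possible partitions by hand.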

364 CHAPTER 10


Each of these possibilities must be enumerated and the value of S calculated for each. If t(s) denotes the number of assignments with the particular value s calculated from (4.1), then

f_S(s) = t(s)(n_1! n_2! ··· n_k!)/N!    (4.2)

Obviously, the calculations required are extremely tedious and therefore will not be illustrated here. Tables of exact probabilities for S are available in Rijkoort (1952) for k = 3, 4, and 5, but only for n_i equal and very small. Critical values for some larger equal sample sizes are also given.

A somewhat more useful test criterion is a weighted sum of squares of deviations, with the reciprocals of the respective sample sizes used as weights. This test statistic, due to Kruskal and Wallis (1952), is defined as

H = [12/(N(N + 1))] Σ_{i=1}^k (1/n_i)[R_i − n_i(N + 1)/2]²    (4.3)

The consistency of H is investigated in Kruskal (1952). H and S are equivalent test criteria only for all n_i equal. Exact probabilities for H are given in Table K of the Appendix for k = 3, all n_i ≤ 5. The tables in Iman, Quade, and Alexander (1975) also cover some larger numbers of samples and sample sizes for the upper tail of the exact distribution.

Since there are practical limitations on the range of tables which can be constructed, some reasonable approximation to the null distribution is required if a test based on the rationale of S is to be useful in applications.

Under the null hypothesis, the n_i entries in column i were randomly selected from the set {1, 2, ..., N}. They actually constitute a random sample of size n_i drawn without replacement from the finite population consisting of the first N integers. The mean and variance of this population are

μ = (1/N) Σ_{i=1}^N i = (N + 1)/2

σ² = (1/N) Σ_{i=1}^N [i − (N + 1)/2]² = (N² − 1)/12

The average rank sum for the ith column, R̄_i = R_i/n_i, is the mean of this random sample, and as for any sample mean from a finite population,


E(R̄_i) = μ    var(R̄_i) = (σ²/n_i)(N − n_i)/(N − 1)

Here then we have

E(R̄_i) = (N + 1)/2    var(R̄_i) = (N + 1)(N − n_i)/(12n_i)    cov(R̄_i, R̄_j) = −(N + 1)/12

Since R̄_i is a sample mean, if n_i is large, the Central Limit Theorem allows us to approximate the distribution of

Z_i = [R̄_i − (N + 1)/2]/[(N + 1)(N − n_i)/(12n_i)]^{1/2}    (4.4)

by the standard normal. Consequently Z_i² is distributed approximately as chi square with one degree of freedom. This holds for i = 1, 2, ..., k, but the Z_i are clearly not independent random variables since Σ_{i=1}^k n_i R̄_i = N(N + 1)/2, a constant. Kruskal (1952) showed that under H_0, if no n_i is very small, the random variable

Σ_{i=1}^k [(N − n_i)/N] Z_i² = [12/(N(N + 1))] Σ_{i=1}^k n_i [R̄_i − (N + 1)/2]² = H

is distributed approximately as chi square with k − 1 degrees of freedom. The approximate size α rejection region is H ≥ χ²_{α, k−1}. Some other approximations to the null distribution of H are discussed in Alexander and Quade and in Iman and Davenport (1976). For a discussion of the power of this test, see, for example, Andrews (1954).
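As a quick numerical check of these formulas, H can be computed directly from the pooled ranks. The sketch below (Python, standard library only; the three small samples are hypothetical) computes H from (4.3) and, since k − 1 = 2 here, the chi-square approximation to the P value in the closed form P ≈ exp(−H/2):

```python
import math

def midranks(values):
    """Ranks of the pooled observations, with midranks for ties."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(values):
        j = i
        while j + 1 < len(values) and values[order[j + 1]] == values[order[i]]:
            j += 1
        for t in range(i, j + 1):
            r[order[t]] = (i + j) / 2 + 1   # average of tied positions, 1-based
        i = j + 1
    return r

def kruskal_wallis_H(samples):
    """H = [12/(N(N+1))] sum_i (1/n_i)[R_i - n_i(N+1)/2]^2."""
    N = sum(len(s) for s in samples)
    ranks = midranks([x for s in samples for x in s])
    total, start = 0.0, 0
    for s in samples:
        n = len(s)
        R = sum(ranks[start:start + n])      # rank sum of this sample
        total += (R - n * (N + 1) / 2) ** 2 / n
        start += n
    return 12 * total / (N * (N + 1))

samples = [[68, 70, 80], [74, 58, 60], [73, 78, 88]]  # hypothetical data
H = kruskal_wallis_H(samples)                          # 3.2 for these data
p_approx = math.exp(-H / 2)                            # chi-square sf, 2 df
```

For these data the rank sums are 15, 9, and 21, giving H = 3.2 and an approximate P value of about 0.20, so there is no evidence against homogeneity.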

The assumption made initially was that the populations were continuous, and this of course was to avoid the problem of ties. When two or more observations are tied within a column, the value of H is the same regardless of the method used to resolve the ties since the rank sum is not affected. When ties occur across columns, the midrank method is generally used. Alternatively, for a conservative test the ties can be broken in the way which is least conducive to rejection of H_0.

If ties to the extent t are present and are handled by the midrank method, the variance of the finite population is

σ² = (N² − 1)/12 − Σ t(t² − 1)/(12N)


where the sum is extended over all sets of t observations tied at a given value. Incorporating this correction, the Kruskal-Wallis statistic with ties handled by the midrank method is H divided by the correction factor

1 − Σ t(t² − 1)/(N³ − N)    (4.5)

so that the correction always increases the value of H.

When the null hypothesis is rejected, it is frequently of interest to determine which of the populations differ in location from which others. A multiple comparisons procedure based on the rank sums (Dunn, 1964) declares populations i and j to have different locations if

|R̄_i − R̄_j| ≥ z*{[N(N + 1)/12](1/n_i + 1/n_j)}^{1/2}    (4.6)

where z* = z_{α/k(k−1)} and α is the overall significance level for the set of all k(k − 1)/2 pairwise comparisons. Because so many comparisons are being made, a relatively large overall level, say α = 0.20, is usually recommended. For α = 0.20, the values of z* for 1, 2, ..., 10 comparisons are as follows:

Number of comparisons   1      2      3      4      5      6      7      8      9      10
z*                      1.282  1.645  1.834  1.960  2.054  2.128  2.189  2.241  2.287  2.326
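The pairwise comparison rule (4.6) is easy to automate. The sketch below (Python; the data are hypothetical, and the stored z* values are the α = 0.20 entries tabulated above) returns the pairs declared different:

```python
import math

# z* for overall alpha = 0.20, indexed by the number of comparisons 1..10
Z_STAR_020 = [1.282, 1.645, 1.834, 1.960, 2.054,
              2.128, 2.189, 2.241, 2.287, 2.326]

def different_pairs(rank_means, sizes):
    """Pairs (i, j), 1-based, whose mean rank sums differ by at least the
    cutoff in (4.6) at overall significance level 0.20."""
    k = len(sizes)
    N = sum(sizes)
    z = Z_STAR_020[k * (k - 1) // 2 - 1]
    pairs = []
    for i in range(k):
        for j in range(i + 1, k):
            cutoff = z * math.sqrt(N * (N + 1) / 12
                                   * (1 / sizes[i] + 1 / sizes[j]))
            if abs(rank_means[i] - rank_means[j]) >= cutoff:
                pairs.append((i + 1, j + 1))
    return pairs
```

For example, with four samples of 10 observations each and mean rank sums 26.0, 12.2, 9.0, 34.8, the common cutoff is 2.128[40(41)(0.2)/12]^{1/2} = 11.125 and the pairs (1, 2), (1, 3), (2, 4), and (3, 4) are declared different.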


APPLICATIONS

The Kruskal-Wallis test is the natural extension of the two-sample Wilcoxon rank-sum test to k mutually independent samples. It tests H_0: θ_1 = θ_2 = ··· = θ_k against the alternative that the locations θ_i are not all equal. To carry out the test, all N observations are pooled and ranked, and R_i denotes the sum of the ranks of the ith sample; the identity R_1 + R_2 + ··· + R_k = N(N + 1)/2 provides a check on the computations. For computation the statistic in (4.3) can be written in the equivalent form

H = [12/(N(N + 1))] Σ_{i=1}^k R_i²/n_i − 3(N + 1)    (4.7)

The null hypothesis is rejected for large values of H, using either the exact tables for k = 3 and small sample sizes or the approximate rejection region H ≥ χ²_{α, k−1}. When the multiple comparisons procedure based on (4.6) is carried out with equal sample sizes n_i = n_j = n, the right-hand side of (4.6) reduces to

z*[N(N + 1)/6n]^{1/2}    (4.8)


As an illustration, suppose that N = 40 subjects are divided into k = 4 groups of n_1 = n_2 = n_3 = n_4 = 10 each and that the pooled ranking of their responses gives the rank sums R_1 = 260, R_2 = 122, R_3 = 90, R_4 = 348. From (4.7),

H = [12/(40(41)(10))][260² + 122² + 90² + 348²] − 3(41) = 31.89

which, compared with the chi-square distribution with 3 degrees of freedom, gives a P value smaller than 0.001, so the null hypothesis is soundly rejected. For the multiple comparisons with overall significance level α = 0.20, the mean rank sums are R̄_1 = 26.0, R̄_2 = 12.2, R̄_3 = 9.0, R̄_4 = 34.8; there are k(k − 1)/2 = 6 comparisons, so that z* = 2.128, and the right-hand side of (4.8) equals 2.128[40(41)/60]^{1/2} = 11.125. The pairs of groups whose mean rank sums differ by more than 11.125, namely (1, 2), (1, 3), (2, 4), and (3, 4), are declared to have different locations. The solution is confirmed by the MINITAB, SAS, and STATXACT computer outputs, which report H = 31.89 with an approximate P value of 0.000.
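The arithmetic of this illustration is a one-liner; the following sketch (Python) reproduces the value of H from the rank sums via the computational form (4.7):

```python
R = [260, 122, 90, 348]   # rank sums from the illustration above
n = [10, 10, 10, 10]
N = sum(n)                 # 40

# H = [12/(N(N+1))] * sum R_i^2/n_i - 3(N+1), the shortcut form (4.7)
H = 12 / (N * (N + 1)) * sum(r * r / m for r, m in zip(R, n)) - 3 * (N + 1)
```

Rounded to two decimal places, H = 31.89, agreeing with the hand calculation.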


As another illustration, consider an experiment in which N = 15 subjects are assigned at random to k = 3 treatment groups of five each. The ranks of the pooled observations and the rank sums R_i are shown below.

Treatment 1    Treatment 2    Treatment 3
     6             15              2
     9             13              3
    10             11              4
    12             14              1
     5              7              8
R_1 = 42        R_2 = 60        R_3 = 18

Here Σ R_i²/n_i = (42² + 60² + 18²)/5 = 5688/5 = 1137.6, so that from (4.7), H = 12(1137.6)/[15(16)] − 3(16) = 8.88. From the exact tables for k = 3, n_1 = n_2 = n_3 = 5, the P value lies between 0.0010 and 0.010, and the null hypothesis is rejected. For the multiple comparisons with overall α = 0.20 and 3 comparisons, z* = 1.834, and with n = 5 the right-hand


side of (4.8) is 1.834[15(16)/30]^{1/2} = 5.19. The sample mean rank sums are R̄_1 = 8.4, R̄_2 = 12.0, R̄_3 = 3.6. Our conclusion is that only groups 2 and 3 have significantly different median treatment effects at the overall 0.20 significance level. Recall that our hand calculations did not lead to a rejection of the null hypothesis by the median test earlier in this chapter.

The computer solutions to this example are shown below using the SAS, STATXACT, and MINITAB packages. The results for the value of H agree exactly. The P value using the chi-square approximation is 0.0118, which agrees with the outputs. Note that both STATXACT and SAS allow the user the option of computing what they call an exact P value based on the permutation distribution of the Kruskal-Wallis statistic. This can be very useful when the sample sizes are small, so that the chi-square approximation could be suspect. However, the exact computation is quite time consuming even for moderate sample sizes such as those in this example. For this example, SAS finds an exact P value that makes the results seem far more significant than those inferred from either the exact tables or the chi-square approximation. MINITAB does not have an option to calculate an exact P value, and it does not provide the correction for ties.


10.5 OTHER RANK-TEST STATISTICS

The Kruskal-Wallis statistic is based on the simple Wilcoxon-type scores a_N(i) = i; the same construction can be carried out with a general set of scores a_N(1) ≤ a_N(2) ≤ ··· ≤ a_N(N). Let R_ij denote the rank of the jth observation of the ith sample in the pooled array of all N observations, and let ā = (1/N) Σ_{i=1}^N a_N(i) denote the average score. A general k-sample rank statistic is then

Q = (N − 1) Σ_{i=1}^k (1/n_i)[Σ_{j=1}^{n_i} a_N(R_ij) − n_i ā]² / Σ_{i=1}^N [a_N(i) − ā]²    (5.1)

which under H_0 is distributed approximately as chi square with k − 1 degrees of freedom (see, for example, Hájek and Šidák, 1967, pp. 170–172). With the Wilcoxon scores a_N(i) = i, (5.1) reduces algebraically to the Kruskal-Wallis statistic H of (4.3). With the normal scores a_N(i) = Φ^{−1}[i/(N + 1)], for which ā = 0, (5.1) becomes

Q = (N − 1) Σ_{i=1}^k (1/n_i)[Σ_{j=1}^{n_i} Φ^{−1}(R_ij/(N + 1))]² / Σ_{i=1}^N [Φ^{−1}(i/(N + 1))]²

the k-sample normal-scores test statistic, which is also compared with the chi-square distribution with k − 1 degrees of freedom. Exact P values for statistics of this type can be computed by STATXACT for small sample sizes.
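A sketch of the general statistic (5.1) in Python (standard library only; the sample values are hypothetical, and the helper assumes no tied observations). With the Wilcoxon scores a_N(i) = i it reproduces the Kruskal-Wallis H exactly, and the inverse normal cdf from statistics.NormalDist gives the normal-scores version:

```python
from statistics import NormalDist

def q_statistic(samples, score):
    """General k-sample score statistic Q with scores a_N(i) = score(i, N)."""
    N = sum(len(s) for s in samples)
    a = [score(i, N) for i in range(1, N + 1)]
    abar = sum(a) / N
    denom = sum((x - abar) ** 2 for x in a)
    pooled = sorted(x for s in samples for x in s)
    rank = {x: i + 1 for i, x in enumerate(pooled)}    # assumes no ties
    num = 0.0
    for s in samples:
        T = sum(a[rank[x] - 1] for x in s)             # sample score sum
        num += (T - len(s) * abar) ** 2 / len(s)
    return (N - 1) * num / denom

samples = [[68, 70, 80], [74, 58, 60], [73, 78, 88]]   # hypothetical data
H = q_statistic(samples, lambda i, N: i)               # Wilcoxon scores -> H
Qn = q_statistic(samples, lambda i, N: NormalDist().inv_cdf(i / (N + 1)))
```

For these data the Wilcoxon-score version gives Q = 3.2, the same value as the Kruskal-Wallis statistic computed directly, which illustrates the algebraic reduction noted above.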


10.6 TESTS AGAINST ORDERED ALTERNATIVES

Suppose we are concerned with testing the null hypothesis H_0 that the populations are identical against the alternative hypothesis that the location parameters are in an increasing order,

H_1: θ_1 ≤ θ_2 ≤ ··· ≤ θ_k

where at least one of the inequalities is strict. This alternative would be of interest if, for example, it is expected that increasing the value of a factor would result in an increase in the value of a response. If the expected direction in H_1 is not the natural ordering, we may simply relabel the treatments so that the postulated order agrees with the natural order here. This alternative is also known as the ``simple order.'' It may be noted that under the location model, when θ_i < θ_j the population corresponding to F_j is stochastically larger than the population corresponding to F_i.

A number of distribution-free tests for the problem of simple order are available in the literature. One way to motivate some of these tests is to note that the alternative hypothesis H_1 may be written in an expanded form as

θ_1 ≤ θ_2, θ_1 ≤ θ_3, ..., θ_1 ≤ θ_k
θ_2 ≤ θ_3, θ_2 ≤ θ_4, ..., θ_2 ≤ θ_k
................................
θ_{k−2} ≤ θ_{k−1}, θ_{k−2} ≤ θ_k
θ_{k−1} ≤ θ_k

where at least one of the k(k − 1)/2 inequalities must be strict. Hence the problem of testing H_0 against H_1 may be viewed as a collection of k(k − 1)/2 testing problems, each of which is a two-sample problem. This observation allows us to contemplate several tests for the problem at hand, since we have a number of tests already available for the two-sample problem.

There are two basic questions we need to address. It is clear that all of the k(k − 1)/2 two-sample test statistics must be of the same form, for example, Mann-Whitney, median, or control median. The first question is which of the available two-sample tests should be used and why. This is a problem we first encountered in the two-sample setting, and we can use either the optimal test (from the ARE point of view) or some


test that is more attractive from a practical point of view. Now, after a two-sample test is chosen and k(k − 1)/2 of these tests are performed, the next question is to decide how to combine them into a single final test with desirable properties.

A popular test for the ordered alternatives problem was proposed by Terpstra (1952) and Jonckheere (1954) independently, hereafter called the JT test. The JT test uses the Mann-Whitney statistic U_ij for the two-sample problem comparing samples i and j, where i, j = 1, 2, ..., k with i < j, and an overall test statistic is constructed simply by adding all the U_ij. Thus the test statistic is

B = U_12 + U_13 + ··· + U_1k + U_23 + U_24 + ··· + U_2k + ··· + U_{k−1,k}
  = Σ_{1≤i<j≤k} U_ij = Σ_{i=1}^{k−1} Σ_{j=i+1}^k Σ_{r=1}^{n_i} Σ_{s=1}^{n_j} I(X_ir < X_js)

where X_ir is the rth observation in sample i and X_js is the sth observation in sample j, and I is the usual indicator function. The appropriate rejection region is large values of B, because if the alternative H_1 is true, observations from the jth sample will tend to be larger than the observations from the ith sample. Thus the appropriate rejection region is

B ≥ B_{α,k,n_1,n_2,...,n_k}

where P(B ≥ B_{α,k,n_1,n_2,...,n_k}) ≤ α is satisfied under H_0.

The JT test is distribution free if the cdf's are all continuous. Since all N!/(n_1! n_2! ··· n_k!) rank assignments are equally likely under H_0, the null distribution of the test statistic B can be obtained by computing the associated value of B for each possible ranking and enumerating. The required calculations are bound to be tedious, especially for large n_i, and will not be illustrated here. However, some exact critical values have been tabulated for k = 3 with small ordered sample sizes n_1 ≤ n_2 ≤ n_3, and for k = 4, 5, 6 with small equal sample sizes n_1 = n_2 = ··· = n_k, and are given in the Appendix for selected α.

In practice, for larger sample sizes, it is more convenient to use an approximate test. If each n_i/N tends to some constant between 0 and 1, the distribution of the random vector (U_12, U_13, ..., U_{k−1,k}) under H_0 can be approximated by a k(k − 1)/2-dimensional normal distribution.


From the results in Chapter 6 for the Mann-Whitney test we have E(U_ij) = n_i n_j/2 under H_0, so that

E(B) = Σ_{1≤i<j≤k} n_i n_j/2 = [N² − Σ_{i=1}^k n_i²]/4

The derivation of the variance of B is more involved and is left as an exercise for the reader. The result under H_0 is

var(B) = [N²(2N + 3) − Σ_{i=1}^k n_i²(2n_i + 3)]/72

In view of these results, an approximate level α test based on the JT statistic is to reject H_0 in favor of H_1 if

B ≥ E(B) + z_α[var(B)]^{1/2}

where z_α is the (1 − α)th quantile of the standard normal distribution.
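The JT statistic and its normal approximation can be sketched as follows (Python, standard library only; the three ordered samples are illustrative):

```python
import math

def jt_test(samples):
    """Jonckheere-Terpstra B with its null mean, variance, and Z value."""
    k = len(samples)
    B = sum(1
            for i in range(k) for j in range(i + 1, k)
            for x in samples[i] for y in samples[j]
            if x < y)                       # sum of pairwise Mann-Whitney counts
    n = [len(s) for s in samples]
    N = sum(n)
    EB = (N * N - sum(m * m for m in n)) / 4
    varB = (N * N * (2 * N + 3) - sum(m * m * (2 * m + 3) for m in n)) / 72
    Z = (B - EB) / math.sqrt(varB)
    return B, EB, varB, Z

samples = [[74, 58, 68, 60, 69],
           [70, 72, 75, 80, 71],
           [73, 78, 88, 85, 76]]            # illustrative ordered groups
B, EB, varB, Z = jt_test(samples)           # B = 67, Z close to 3.12
```

For these data B = 67, E(B) = 37.5, var(B) = 89.5833, and Z = 3.12, so the one-sided approximate P value 1 − Φ(3.12) is about 0.0009.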

Because of our assumption of continuity, theoretically there can be no tied observations within or between samples. However, ties do occur in practice, and when the number of ties is large the test should be modified in a suitable manner. When observation(s) from sample i are tied with observation(s) from sample j, we replace U_ij by U*_ij (see the discussion of the problem of ties with the Mann-Whitney statistic in Chapter 6), defined as

U*_ij = Σ_{r=1}^{n_i} Σ_{s=1}^{n_j} D_rs

where

D_rs = 1 if X_ir < X_js,  = 1/2 if X_ir = X_js,  = 0 if X_ir > X_js

This is equivalent to replacing the ranks of the tied observations by their midranks in the combined samples i and j. The JT test in the case of ties is then based on

B* = Σ_{i=1}^{k−1} Σ_{j=i+1}^k U*_ij


Under the null hypothesis, the expectation of B* is the same as that of B given earlier. Also, the variance of B* under H_0 is

var(B*) = [N(N − 1)(2N + 5) − Σ_{i=1}^k n_i(n_i − 1)(2n_i + 5) − Σ t(t − 1)(2t + 5)]/72
  + [Σ_{i=1}^k n_i(n_i − 1)(n_i − 2)][Σ t(t − 1)(t − 2)]/[36N(N − 1)(N − 2)]
  + [Σ_{i=1}^k n_i(n_i − 1)][Σ t(t − 1)]/[8N(N − 1)]

where the sums over t extend over all distinct values among the N observations, and t denotes the number of occurrences (multiplicity) of a particular distinct value. For a proof of this result see, for example, Kendall and Gibbons (1990). When there are no ties in the data, t = 1 for each of the N observations and this expression reduces to the no-ties variance var(B) given earlier. Although tables are apparently not yet available for the exact null distribution of B*, an approximate test for large sample sizes can be based on the statistic [B* − E(B*)]/[var(B*)]^{1/2} and the standard normal distribution. Thus, in the case of ties, an approximately size α JT test is to reject H_0 in favor of H_1 if

B* ≥ E(B*) + z_α[var(B*)]^{1/2}
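The tie-corrected variance above is tedious by hand but mechanical in code. The following sketch (Python; the data in the checks are hypothetical) implements it term by term; with no ties it agrees exactly with the no-ties variance of B:

```python
from collections import Counter

def var_B_star(samples):
    """Null variance of the tie-corrected JT statistic B*."""
    n = [len(s) for s in samples]
    N = sum(n)
    ties = Counter(x for s in samples for x in s).values()  # multiplicities t
    t1 = (N * (N - 1) * (2 * N + 5)
          - sum(m * (m - 1) * (2 * m + 5) for m in n)
          - sum(t * (t - 1) * (2 * t + 5) for t in ties)) / 72
    t2 = (sum(m * (m - 1) * (m - 2) for m in n)
          * sum(t * (t - 1) * (t - 2) for t in ties)
          / (36 * N * (N - 1) * (N - 2)))
    t3 = (sum(m * (m - 1) for m in n)
          * sum(t * (t - 1) for t in ties)
          / (8 * N * (N - 1)))
    return t1 + t2 + t3
```

For three untied samples of size 5 this returns 6450/72 = 89.583..., the same value as the no-ties formula, and introducing ties always reduces the variance.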

The JT test is consistent against ordered alternatives under the assumption that n_i/N tends to some constant between 0 and 1 as N tends to infinity. The asymptotic relative efficiency and some asymptotic power comparisons with competing tests are discussed in Puri (1965).

A vast body of literature is available on related and alternative procedures for this problem. Chacko, Hogg, Puri, Conover, Shorack, Johnson and Mehrotra, Tryon and Hettmansperger, Shirahata, Fairley and Fligner, and Hettmansperger and Norton, among others, consider various extensions. Among these, Puri (1965) studies a generalization of the JT test by introducing a class of linear rank


statistics using an arbitrary score function and any two-sample Chernoff-Savage statistic. When the expected normal scores are used, Puri's procedure is highly recommended (Berenson, 1982) when the samples are from (1) normal populations; (2) light-tailed populations of the beta family, regardless of symmetry; (3) heavy-tailed and moderately skewed populations; and (4) heavy-tailed and very skewed populations. Tryon and Hettmansperger generalize Puri's class of tests by including weighting coefficients to form linear combinations of two-sample Chernoff-Savage statistics and provide some interesting results about how to determine the optimal weighting coefficients. Chakraborti and Desu (1988a) adopt a similar approach using the two-sample control quantile (median) test statistics and show that their optimal test has higher ARE in certain situations.

APPLICATIONS

The JT test rejects H_0 against the ordered alternative H_1 when B is significantly large. Thus, the exact P value is

P(B ≥ b | H_0)

where b is the observed value of the test statistic B. When sample sizes are moderately large, the normal approximation to the P value is given by

1 − Φ([b − E(B)]/[var(B)]^{1/2})

where E(B) and var(B) are given earlier in this section.

Example 6.1 Experts have long claimed that speakers who use some sort of audiovisual aids in their presentations are much more effective in communicating with their audience. A consulting agency would like to test this claim; however, it is very difficult to find many speakers who can be regarded as (virtually) identical in their speaking capabilities, and the agency is successful in locating only 15 such speakers from a nationwide search. The speakers are randomly assigned to one


of three groups of five each, where the groups use increasingly extensive audiovisual aids (group 1 the least, group 3 the most), and each speaker's effectiveness is then scored. The scores are shown below.

Sample 1:  74  58  68  60  69
Sample 2:  70  72  75  80  71
Sample 3:  73  78  88  85  76

We test H_0: θ_1 = θ_2 = θ_3 against the ordered alternative H_1: θ_1 ≤ θ_2 ≤ θ_3 with k = 3 and n_1 = n_2 = n_3 = 5. The Mann-Whitney counts are U_12 = 22, U_13 = 24, and U_23 = 21, so that B = 22 + 24 + 21 = 67. From the tables, P(B ≥ 67.0) = 0.0044, so there is strong evidence in favor of the ordered alternative. For the normal approximation we have E(B) = 37.5 and var(B) = 89.5833, so that Z = (67 − 37.5)/(89.5833)^{1/2} = 3.1168 and the approximate P value is 1 − Φ(3.12) = 0.0009. The computer solutions using SAS and STATXACT are shown below.


10.7 COMPARISONS WITH A CONTROL

The case of ordered alternatives is one example of a situation where the experimenter wishes to detect, a priori, not just any differences among a group of populations, but differences only in some specific directions. We now consider another example where the alternative to the null hypothesis of homogeneity is restricted, a priori, in a specific direction.

Suppose we want to test only a partial ordering of the k − 1 distributions with respect to a common distribution. This will be the situation where only a comparison between each of the distributions and the common distribution is of interest and what happens among the k − 1 distributions is somewhat irrelevant. For example, in a drug screening study, it is often of interest to compare a group of treatments under development to what is currently in use (which may be nothing or a placebo), called the control, and then subject those treatments that are ``better'' than the control to more elaborate studies. Also, in a business environment, for example, people might consider changing their current investment policy to one of a number of newly available comparable policies provided the payoff is higher.

The alternative hypothesis of interest here is another generalization of the one-sided alternatives problem to the case of several samples and constitutes a subset of the ordered alternatives problem discussed earlier. One expects, at least intuitively, to be able to utilize the available pertinent information to construct a test which is more powerful than the tests for either the general or the ordered alternatives problem.

Without any loss of generality let F_1 be the cdf of the control population and let F_i be the cdf of the ith treatment population,


i = 2, 3, ..., k, where F_i(x) = F(x − θ_i) with F_i(θ_i) = p, 0 < p < 1, so that θ_i is the pth quantile of F_i. Our problem is to test, for a specified p, the null hypothesis that the pth treatment quantiles are equal and equal to the pth control quantile,

H_0: θ_2 = θ_3 = ··· = θ_k = θ_1

against the one-sided alternative hypothesis,

H_1: θ_2 ≥ θ_1, θ_3 ≥ θ_1, ..., θ_k ≥ θ_1

where at least one of the inequalities is strict. As pointed out earlier, when θ_i > θ_1, F_i is stochastically larger than F_1. In the literature on hypothesis testing with restricted alternatives, this alternative is known as the simple-tree alternative. Miller (1966), among others, calls this a ``many-one'' problem.

In some cases it might be of interest to test the alternative in the opposite direction, θ_2 ≤ θ_1, θ_3 ≤ θ_1, ..., θ_k ≤ θ_1, where at least one of the inequalities is strict. The tests we discuss can easily be adapted for this case.

Our approach in the many-one problem is basically similar to that in the ordered alternatives problem in that we view the testing problem as a collection of k − 1 subtesting problems H_0i: θ_i = θ_1 against H_1i: θ_i ≥ θ_1, for i = 2, 3, ..., k. Thus, a distribution-free test for testing H_0 against H_1 is obtained in two steps. First an appropriate one-sample test for the ith subtesting problem is selected, and then the k − 1 of these tests are combined in a suitable manner to produce an overall test.

Before considering specific tests we would like to make a distinction between the cases (i) where sufficient prior knowledge about the control is at hand so that the control population may be assumed to be known (in this case the control is often called a standard), at least to the extent of the parameter(s) of interest, and (ii) where no concrete knowledge about the control group is available. These two cases will be treated separately and the test procedures for these cases will be somewhat different.

CASE I: θ_1 KNOWN

First consider the case of testing H_0, where θ_1 is either known or specified in advance, against the alternative H_1. In this case the subtesting problem, for every i = 2, 3, ..., k, is a one-sample problem, and therefore one of several available distribution-free tests can be


used in step one. For example, if the only reasonable assumption about the parent distributions is continuity, we would use the sign test. On the other hand, if it can be assumed that the underlying distributions are symmetric about their respective preselected quantiles, we may want to use the Wilcoxon signed-rank test. For simplicity and ease of presentation we detail only the tests based on the sign test, although one can proceed in a similar manner with some other one-sample distribution-free tests. Part of this discussion is from papers by Chakraborti and Gibbons.

The usual sign test statistic for testing H_0i against H_1i is based on the total number of negative differences X_ij − θ_1 in the ith sample,

V_i = Σ_{j=1}^{n_i} I(X_ij − θ_1 < 0)

i = 2, 3, ..., k, and we reject H_0i in favor of H_1i if V_i is small. With this motivation, a simple overall test of H_0 against H_1 can be based on V, the sum of the V_i, i = 2, 3, ..., k, and the rejection region consists of small values of V. One practical advantage of using V is that under H_0, V has a binomial distribution with parameters N = Σ_{i=2}^k n_i and p. Accordingly, P values or exact critical values can be found using binomial tables for small to moderate sample sizes. For larger sample sizes, the normal approximation to the binomial distribution can be used to construct tests with significance levels approximately equal to the nominal value or to find the approximate P value.

A simple modification of this sum test when the sample sizes are quite different is to use V* = Σ_{i=2}^k V_i/n_i as the test statistic, since the V_i, and hence V, may be quite sensitive to unequal sample sizes. The exact and/or approximate test can be implemented as before, although for the exact test we no longer have the convenience of using tables of the binomial distribution. The details are left as an exercise for the reader (see, for example, Chakraborti and Gibbons).

An alternative test for this problem may be obtained by applying the union-intersection principle (Roy, 1953). Here the null hypothesis H_0 is the intersection of the k − 1 subnull hypotheses H_0i, and the alternative is the union of the subalternative hypotheses H_1i: θ_i ≥ θ_1. Thus, H_0 should be rejected if and only if at least one of the subnull hypotheses H_0i is rejected, and the latter event takes place if the smallest of the statistics V_2, V_3, ..., V_k is too small. In other words, an overall test may be based on

M = min(V_2/n_2, V_3/n_3, ..., V_k/n_k)


and one should reject H_0 in favor of H_1 if M is small. The test based on M is expected to be more sensitive than the sum test, since a rejection of any one of the k − 1 subnull hypotheses here would lead to a rejection of the overall null hypothesis.

The exact distribution of M can be obtained using the fact that V_2, V_3, ..., V_k are mutually independent and under H_0 each V_i has a binomial distribution with parameters n_i and p. Thus

P(M ≤ v | H_0) = 1 − Π_{i=2}^k [1 − Σ_{j=0}^{[n_i v]} (n_i choose j) p^j (1 − p)^{n_i − j}]

where v is a fraction between 0 and 1 and [x] denotes the greatest integer not exceeding x. The exact null distribution can be used to calculate exact P values for small sample sizes. When sample sizes are large, it is more convenient to use the normal approximation to the binomial distribution to calculate approximate P values.
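The exact null cdf of the minimum statistic is a product of independent binomial tail probabilities, as in the display above. A sketch (Python, standard library only):

```python
from math import comb, floor

def min_statistic_cdf(v, sizes, p=0.5):
    """P(M <= v | H0) for M = min_i V_i/n_i, where the V_i are
    independent binomial(n_i, p) under H0."""
    prod = 1.0
    for n in sizes:
        # P(V_i <= [n v]), the binomial cdf at the greatest integer <= n*v
        cdf = sum(comb(n, j) * p ** j * (1 - p) ** (n - j)
                  for j in range(floor(n * v) + 1))
        prod *= 1.0 - cdf
    return 1.0 - prod
```

With two treatment samples of size 3 and p = 1/2, for instance, P(M ≤ 0) = 1 − (1 − 1/8)² = 15/64.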

CASE II: θ_1 UNKNOWN

When y is unknown, we use the same general idea of ®rst choosing asuitable test for the ith subtesting problem (which in the present caseis a two-sample problem), i ; ; . . . ; k, and then combining the k À test statistics to construct an overall test statistic. However, asmight be expected, the details are more involved, because the statisticsto be combined are now dependent.

To study our tests in this case consider, for the ith subtesting problem, the statistics

V_i = Σ_{j=1}^{n_i} I(X_ij < T_1)

where T_1 is a suitable estimate of θ_1. By analogy with the one-sample case the quantity V_i can be called a two-sample sign statistic, which allows us to consider direct extensions of our earlier procedures to the present case. It may be noted that if in fact T_1 is a sample order statistic (for example, when θ_1 is the median of F_1, T_1 should be the median of the sample from F_1), V_i is simply the placement of T_1 among the observations from the ith sample. We have seen that the distribution

of V_i does not depend on F_i under H_0, and therefore any test based on the V's is a distribution-free test.

Now consider some tests based on a combination of the V's. Again, as in case (i), we can use the sum of the V's for a simple overall test.


The exact null distribution of the sum statistic W = V_2 + V_3 + ... + V_k was derived by Chakraborti and Desu (1988), hereafter referred to as CD. When T_1 is the rth order statistic of the sample from F_1, W is the total number of observations from samples 2, 3, ..., k that precede T_1, and conditioning on T_1 gives, under H_0,

P(W = w | H_0) = C(N − n_1, w) ∫_{−∞}^{∞} [F(t)]^w [1 − F(t)]^{N − n_1 − w} dF_(r)(t)    (7.3)

where F_(r) denotes the cdf of the rth order statistic of a sample of size n_1 from the common cdf F. Evaluating the integral yields

P(W = w | H_0) = C(w + r − 1, w) C(N − n_1 − w + n_1 − r, N − n_1 − w) / C(N, n_1),  w = 0, 1, ..., N − n_1    (7.4)

with null mean

E_0(W) = r(N − n_1)/(n_1 + 1)    (7.5)


and, when T_1 is the rth order statistic of the first sample,

var_0(W) = r(n_1 − r + 1)(N − n_1)(N + 1) / [(n_1 + 1)^2 (n_1 + 2)]

The null distribution can be used to determine the exact critical value for a given level of significance α or to find the P value for an observed value of W.

For some practical applications and further generalizations, it is useful to derive the large sample distribution of W. We first find the large sample distribution of the (k − 1)-dimensional random vector (V_2, V_3, ..., V_k). It can be shown (Chakraborti and Desu, 1988a; Gastwirth and Wang, 1988) that the large sample distribution of this random vector can be approximated by a (k − 1)-variate normal distribution. The result is stated below as a theorem.

Theorem 7.1 Let Z_N be the (k − 1)-dimensional vector whose ith element is N^{1/2}[V_i/n_i − F_i(θ_1)], i = 2, 3, ..., k. Suppose that as min(n_1, n_2, ..., n_k) → ∞ we have n_i/N → λ_i, 0 < λ_i < 1, i = 1, 2, ..., k. Also let F′_i(θ_1) = f_i(θ_1) exist and be positive for i = 1, 2, ..., k. Then the random vector Z_N converges in distribution to a (k − 1)-dimensional normal distribution with mean vector 0 and covariance matrix Σ whose (i, j)th element is

σ_ij = β_i β_j p(1 − p)/λ_1 + δ_ij F_i(θ_1)[1 − F_i(θ_1)]/λ_i

where β_i = f_i(θ_1)/f_1(θ_1), i, j = 2, 3, ..., k, and δ_ij is equal to 1 if i = j and is equal to 0 otherwise.

From Theorem 7.1, the null distribution of W can be approximated by a normal distribution with mean (N − n_1)p and variance N(N − n_1)p(1 − p)/n_1. Thus, for case (ii), an approximately size α test based on the sum of the V's is to reject H_0 in favor of H_1 if

W ≤ (N − n_1)p − z_α [N(N − n_1)p(1 − p)/n_1]^{1/2}
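A minimal sketch of this approximate test (all numerical inputs in the check are hypothetical, not from the text's example):

```python
from math import sqrt

def sum_test_z(w, n1, N, p=0.5):
    """Standardized value of the sum statistic W = V_2 + ... + V_k.

    Under H0, W is approximately normal with mean (N - n1)*p and
    variance N*(N - n1)*p*(1 - p)/n1; H0 is rejected in favor of the
    simple-tree alternative when this Z value is a large negative number
    (equivalently, when W falls below the lower alpha-point).
    """
    mean = (N - n1) * p
    var = N * (N - n1) * p * (1 - p) / n1
    return (w - mean) / sqrt(var)
```

The approximate P value is then the standard normal probability below the returned Z.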

As we noted earlier, alternative nonparametric tests are available for our consideration in the first step, where the basic set of test statistics is chosen. For example, when θ_1 is unknown so that we choose a


two-sample test for the ith subtesting problem, we can employ the Mann-Whitney U statistic (or more generally any linear rank statistic) between the ith and the first sample, say U_i, and combine these U's in some suitable manner for an overall test statistic. The resulting tests are distribution-free since the distribution of U_i does not depend on either F_i or F_1 under H_0.

As in the case of the V's, the sum of the U's, say W* = U_2 + U_3 + ... + U_k, can be used for a simple overall test, and such a test has been proposed and studied by Fligner and Wolfe (1982), hereafter referred to as the FW test. The null distribution of W* is the same as that of a two-sample Mann-Whitney statistic (see the earlier treatment of the Mann-Whitney test) with m = n_1 and n = N − n_1. It can be seen that the FW test rejects H_0 in favor of the simple-tree alternative if W* is large, so that the P value is in the upper tail. A choice between the tests based on the sum of the U's and the sum of the V's may be made on the basis of the ARE. Interestingly, when p = 0.5 (that is, when θ_1 is the median of F_1), the ARE between these two tests can be shown to be the same as the ARE between the sign test and the signed-rank test, regardless of the underlying distribution. For example, when the underlying distribution F is normal, the ARE is 0.67, whereas when F is double exponential, the ARE is 1.33.
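A sketch of the FW statistic as a Mann-Whitney count between the control sample and the pooled treatment samples (the data are made up; ties are ignored, as appropriate for continuous data):

```python
def fligner_wolfe(control, treatments):
    """W* = number of (control, treatment) pairs in which the treatment
    observation exceeds the control observation, i.e., the Mann-Whitney
    U count between sample 1 and samples 2, ..., k combined; it equals
    the sum of the pairwise U_i statistics."""
    pooled = [x for sample in treatments for x in sample]
    return sum(1 for c in control for t in pooled if t > c)
```

Because the statistic depends only on the pooled treatment observations, summing the pairwise U_i gives the same value as pooling first.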

APPLICATIONS

The W test rejects H_0 against the simple-tree alternative H_1 when W is significantly small. Thus the P value is P(W ≤ w | H_0), where w is the observed value of the test statistic W. The exact P value can be calculated using the null distribution of W given in (7.4) by adding the individual probabilities under H_0. However, for moderately large sample sizes, the normal approximation to the P value is adequate. This can be calculated from Φ[(w − E(W))/√var(W)], where E(W) and var(W) are the null mean and variance given earlier.

Example 7.1 Consider again the problem in an earlier example, where groups of speakers are compared with respect to their ability to communicate. Since the first group of speakers use no audiovisual aids, we could regard this as the control group. It is reasonable to expect that the use of audiovisual aids would have some beneficial effects on communication, and hence the scores for at least one of the groups 2 and 3


should be greater than those for group 1. Thus we are interested in testing H_0: θ_1 = θ_2 = θ_3 against H_1: θ_2 ≥ θ_1, θ_3 ≥ θ_1, where at least one of the inequalities is strict. In this example, θ_1 is unknown, and therefore we will use the W test with k = 3. From the data we find T_1 (the median of the first sample) and the placements V_2 and V_3, and hence the observed value w of W; the exact P value is then found from (7.4) by adding the null probabilities of all values less than or equal to w.

Therefore, at say α = 0.05, there is evidence that the median score of at least one of groups 2 and 3 is greater than that of group 1. In order to use the normal approximation we find E(W) and var(W) from the null mean and variance given earlier, and the approximate P value from Table A of the Appendix. This is reasonably close to the exact P value even though the sample sizes are small.

For the Fligner-Wolfe test, we find U_2 and U_3 so that the P value is P(W* ≥ w* | H_0). To calculate this probability note that W* is in fact the value of the Mann-Whitney statistic calculated between sample 1 (as the first sample) and samples 2 and 3 combined (as the second sample). Now using the relationship between U statistics and rank-sum statistics, it can be seen that the required probability is in fact equal to the probability that the rank-sum statistic between two samples of sizes m and n is at most a given constant under H_0. This is found from the rank-sum table of the Appendix. Thus, the evidence against the null hypothesis in favor of the simple-tree alternative is stronger on the basis of the FW test.

10.8 THE CHI SQUARE TEST FOR k PROPORTIONS

There is one further k-sample test which should be mentioned, although it is applicable in a situation different from that of the other tests in this chapter. If the k populations are all Bernoulli distributions, the samples consist of count data. The k populations are all equivalent if the Bernoulli parameters θ_1, θ_2, ..., θ_k are all equal. Let X_1, X_2, ..., X_k denote the numbers of successes in each of the k samples, respectively. For i = 1, 2, ..., k, X_i has the binomial distribution with parameter θ_i and number of trials n_i. A test statistic for the null hypothesis

H_0: θ_1 = θ_2 = ⋯ = θ_k = θ, θ unspecified


against the alternative that the θ's are not all equal can be derived from the chi-square goodness-of-fit test exactly as in the case of the k-sample median test earlier in this chapter, except that our classification criterion besides sample number is now simply success or failure for each sample instead of less than δ or not. Using the same notation as before, with θ̂ denoting the estimated common parameter under H_0 (which replaces t/N), we have

f_1i = x_i        f_2i = n_i − x_i
e_1i = n_i θ̂      e_2i = n_i (1 − θ̂)

and the test criterion is

Q = Σ_{i=1}^{k} (x_i − n_i θ̂)^2 / [n_i θ̂(1 − θ̂)]

where θ̂ = Σ_{i=1}^{k} x_i / N, which has approximately the chi-square distribution with k − 1 degrees of freedom. Hence the approximately size α test is to reject H_0 in favor of the alternative if Q ≥ χ^2_{k−1,α}.

A simple form of the test statistic Q, useful for calculations, is

Q = [ Σ_{i=1}^{k} x_i^2/n_i − N θ̂^2 ] / [θ̂(1 − θ̂)]
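The algebraic equivalence of the two forms of Q is easy to check numerically (the success counts and sample sizes below are invented for illustration):

```python
def q_definition(x, n):
    """Q = sum (x_i - n_i*theta)^2 / [n_i*theta*(1-theta)], theta = sum(x)/N."""
    N = sum(n)
    theta = sum(x) / N
    return sum((xi - ni * theta) ** 2 / (ni * theta * (1 - theta))
               for xi, ni in zip(x, n))

def q_simple(x, n):
    """Computational form: [sum x_i^2/n_i - N*theta^2] / [theta*(1-theta)]."""
    N = sum(n)
    theta = sum(x) / N
    return (sum(xi * xi / ni for xi, ni in zip(x, n)) - N * theta * theta) \
           / (theta * (1 - theta))
```

Either value is referred to the chi-square distribution with k − 1 degrees of freedom.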

Example 8.1 In a double-blind study of drugs to treat duodenal peptic ulcers, a large number of patients were divided into groups to compare three different treatments, antacid, antipepsin, and anticholinergic. Antacid has long been considered the major medication for ulcers; the latter two drugs act on different digestive juices. The number of patients in the groups and the percent who benefited from that treatment are shown below. Does there appear to be a difference in beneficial effects of these three treatments?

Solution We must first calculate the numbers of patients benefited by each drug, x_1, x_2, x_3, from the group sizes and the percents benefited;

Treatment          Number    Percent benefited
Antacid
Anticholinergic
Antipepsin


then the estimate of the common probability of benefit is θ̂ = (x_1 + x_2 + x_3)/N, and Q follows from the computational form above, with k − 1 = 2 degrees of freedom. The chi-square table of the Appendix shows a very small P value, so we conclude that the probabilities of benefit from each drug are not equal.

10.9 SUMMARY

In this chapter we have been concerned with data consisting of k mutually independent random samples from k populations where the null hypothesis of interest is that the k populations are identical. Actual measurements are not required to carry out any of the tests.

When the location model is appropriate and the alternative is that the locations are not all the same, the median test extension and the Kruskal-Wallis, Terry, and van der Waerden tests are all appropriate. The median test uses less information than the others and therefore may be less powerful. Further, exact P values require calculations based on the multivariate extension of the hypergeometric distribution and this is quite tedious. As in the two-sample case, the median test is primarily of theoretical interest. On the other hand, the Kruskal-Wallis test is simple to use and quite powerful. Tables of the exact distribution are available and the chi-square approximation is reasonably accurate for moderate sample sizes.

All of the tests are quicker and easier to apply than the F test and may perform better if the F test assumptions are not satisfied. Further, as in the parametric setting, nonparametric methods of multiple comparisons can be used in many cases to determine which pairs of population medians differ significantly; see, e.g., Miller (1966, 1981). The advantage of a multiple comparisons procedure over separate pairwise comparisons is that the significance level is the overall level, the probability of a Type I error over all of the conclusions reached.

If the alternative states a distinct complete ordering of the medians, the Jonckheere-Terpstra test is appropriate and exact P values can be obtained. We also discuss tests where the alternative states an ordering of medians with respect to a control group only, where the control median may be either known or unknown.

The asymptotic relative efficiency of the Kruskal-Wallis test compared to the normal theory test is at least 0.864 for any continuous


distribution and is 0.955 for the normal distribution. The Terry and van der Waerden tests should have an ARE of 1 under these circumstances, since they are asymptotically optimum for normal distributions. The ARE of the median test is only 2/π = 0.637 relative to the F test for normal populations with equal variances, and 2/3 relative to the Kruskal-Wallis test, in each case for normal distributions. For further details, see Andrews (1954). All the ARE results stated for the median test apply equally to the control median test, since these two tests have an ARE of 1, regardless of the parent distribution.

Finally, we discuss the chi-square test for k proportions as an approximate test for identical populations when the populations are all Bernoulli and the only relevant parameters are the probabilities of success.

PROBLEMS

10.1. Generate by enumeration the exact null probability distribution of the k-sample median test statistic for k = 3 and small equal sample sizes. If the rejection region consists of those arrangements which are least likely under the null hypothesis, find this region R and the exact α. Compute the values of the test statistic for all arrangements and compare the critical region for the same value of α with the region R.
10.2. Generate the exact distribution of the Kruskal-Wallis statistic H for the same k and n_i as in Problem 10.1. Find the critical region which consists of those rank sums R_1, R_2, R_3 which have the largest value of H and find the exact α.
10.3. By enumeration, place the median test criterion (U_1, U_2, U_3) and the H test criterion (R_1, R_2, R_3) in one-to-one correspondence for the same k and n_i as in Problems 10.1 and 10.2. If the two tests reject for the largest values of the respective criteria, which
test seems to distinguish better between extreme arrangements?
10.4. Verify that the two displayed forms of H given in this chapter are algebraically equivalent.
10.5. Show that H with k = 2 is exactly equivalent to the large-sample approximation to the two-sample Wilcoxon rank-sum test statistic mentioned earlier, with m = n_1, n = n_2.
10.6. Show that H is equivalent to the F test statistic in the one-way analysis-of-variance problem if applied to the ranks of the observations rather than the actual numbers.

Hint: Express the F ratio as a function of H to show that

F = [(N − k)/(k − 1)]^{-1} ... , i.e., F = (N − k)H / [(k − 1)(N − 1 − H)]

This is an example of what is called a rank transform statistic. For related interesting results, see, for example, Iman and Conover (1981).
10.7. Write the k-sample median test statistic in the corresponding rank-statistic form (cf. Problem 10.4).


10.8. How could the subsampling procedure described earlier be extended to test the equality of variances in k populations?
10.9. In the context of the k-sample control median test, show that for any i = 1, 2, ..., k and j = 1, 2, ..., q the random variable V_ij has a binomial distribution. What are the parameters of the distribution (i) in general and (ii) under H_0?
10.10. Show that under H_0: θ_1 = θ_2 = ⋯ = θ_k, the distribution of W*, the sum test statistic for comparisons with a control when θ_1 is known, in the case of unequal sample sizes, is given by

P(W* = v) = (1/2)^{N − n_1} Σ ∏_{i=2}^{k} C(n_i, v_i)

where N = Σ_{i=1}^{k} n_i and the sum extends over all v_i = 0, 1, ..., n_i, i = 2, ..., k, such that Σ_{i=2}^{k} v_i = v. Enumerate the probability distribution of W* for (i) equal and (ii) unequal small sample sizes.
10.11. In the context of the Jonckheere-Terpstra test, show that under H_0, for i < j < r,

cov(U_ij, U_ir | H_0) = cov(U_ji, U_ri | H_0) = n_i n_j n_r / 12
cov(U_ij, U_ri | H_0) = cov(U_ji, U_ir | H_0) = −n_i n_j n_r / 12

Hint: 2 cov(U_ij, U_ir) = var(U_ij + U_ir) − var(U_ij) − var(U_ir) = var(U_{i,(jr)}) − var(U_ij) − var(U_ir), where U_{i,(jr)} is the Mann-Whitney U statistic computed between the ith sample and the combined jth and rth samples.
10.12. Struckman-Johnson surveyed students in a study to compare the proportions of men and women at a small midwestern university who have been coerced by their date into having sexual intercourse (date rape). A survey of several hundred students produced responses; some of the female respondents and some of the male respondents reported an experience of coercion. Test the null hypothesis that males and females experience coercion at an equal rate.
10.13. Many psychologists have developed theories about how different kinds of brain dominance may affect recall ability of information presented in various formats. Brown and Evans compared recall ability of subjects classified into three groups according to their approach to problem solving as a result of their scores on the Human Information Processing Survey. The three groups are Left (active, verbal, logical), Right (receptive, spatial, intuitive), and Integrative (combination of right and left). Information was presented to these subjects in tabular form about the number of physicians who practice in six different states. Recall was measured by how accurately the subjects were able to rank the states from highest to lowest after the presentation concluded. For the scores in Table 1, determine whether median recall ability is the same for the three groups (higher scores indicate greater recall).


10.14. A matching-to-sample (MTS) task is used by psychologists to understand how other species perceive and use identity relations. A standard MTS task consists of having subjects observe a sample stimulus and then rewarding the subject if it responds to an identical (matching) sample stimulus. Then the psychologist studies the ability of subjects to transfer the matching concept to other sample stimuli. Oden, Thompson, and Premack reported a study in which four infant chimpanzees learned an MTS task with only two training sample stimuli. Then the chimpanzees were tested on their ability to transfer the learning to three kinds of novel items, classified as Objects, Fabrics, and Food. The data were recorded as the number of correct matches in a fixed total number of trials. One purpose of the study was to show that the concept of matching is broadly construed by chimpanzees irrespective of the type of sample stimulus. Determine whether the data in Table 2 support this theory.

10.15. Andrews examines attitudes toward advertising by undergraduate marketing students at universities in six different geographic regions. Attitudes were measured by answers to a questionnaire that allowed responses on a Likert scale (from strongly disagree to strongly agree). Three statements on the questionnaire related to the social dimension were (1) most advertising insults the intelligence of the average consumer; (2) advertising often persuades people to buy things they shouldn't buy; and (3) in general, advertisements present a true picture of the product being advertised. For the mean scores given in Table 3, determine whether there are any regional differences in attitude for the social dimension.

Table 1

Left    Right    Integrative
(data not legible in this copy)

Table 2

Chimp      Training    Object    Fabric    Food
Whiskey
Liza
Opal
Frieda
(data not legible in this copy)


[The statements of Problems 10.16 and 10.17, the beginning of Problem 10.18, and some accompanying data tables are not legible in this copy.]

Table 3 (Problem 10.15: mean scores by region for the three statements; region names only partly legible)

Region        (1)     (2)     (3)
…            3.69    4.48    3.69
M…           4.22    3.75    3.25
…            3.63    4.54    4.09
S…           4.16    4.35    3.61
S… C…        3.96    4.73    3.41
S…           3.78    4.49    3.64

10.18. [Beginning illegible.] ... and did not improve in four weeks of treatment. Test the null hypothesis that the treatments are equally effective.

10.19. Eighteen fish of a comparable size in a particular variety are divided randomly into three groups and each group is prepared by a different chef using the same recipe. Each prepared fish is then rated on each of the criteria of aroma, flavor, texture, and moisture by professional tasters. Use the composite scores below to test the null hypothesis that mean scores for all three chefs are the same.

10.20. An office has three computers, A, B, and C. In a study of computer usage, the firm has kept records on weekly use rates for a number of weeks, except that computer A was out for repairs for part of some weeks. The eventual goal is to decide which computers to put under a service contract because they have a higher usage rate. As a first step in this study, analyze the data below on weekly computer usage rates to determine whether there is a significant difference in average usage. Can you make a preliminary recommendation?

10.21. A company is testing four cereals to determine taste preferences of potential buyers. Four different panels of persons are selected independently; one cereal is

Treatment    Number improved    Number not improved
(data not legible in this copy)

Chef A    Chef B    Chef C
(data not legible in this copy)

A    B    C
(data not legible in this copy)


presented to all members of each panel. After testing, each person is asked if he wouldpurchase the product. The results are shown below. Test the null hypothesis that tastepreference is the same for each cereal.

10.22. elow are four sets of ®ve measurements, each set an array of data (for yourconvenience) on the smoothness of a certain type of paper, each set obtained from adifferent laboratory. Find an approximate P value to test whether the median smooth-ness can be regarded as the same for all laboratories.

10.23. erify the value of the ruskal-Wallis test statistic given in the A solution toExample . in hapter .

Cereal
                              A    B    C    D
Number who would buy
Number who would not buy
(data not legible in this copy)

Laboratory    Data
A
B
C
D
(data not legible in this copy)


[The remaining pages, from the opening of the chapter on measures of association for bivariate samples, are badly garbled in this copy; only the recoverable material is reproduced.]

The introductory discussion reviews the covariance and the product-moment (Pearson) correlation coefficient,

var(X − Y) = var(X) + var(Y) − 2 cov(X, Y)

ρ(X, Y) = cov(X, Y)/[σ(X) σ(Y)]

and lists the properties desirable in a measure of association between X and Y: among them, that the measure lie between −1 and +1, equal +1 for perfect direct (concordant) association and −1 for perfect indirect (discordant) association, increase in magnitude with the strength of association, and equal 0 when X and Y are independent.

For independent pairs (X_i, Y_i) and (X_j, Y_j) from a bivariate population, the probabilities of concordance and discordance are

p_c = P[(X_j − X_i)(Y_j − Y_i) > 0]        p_d = P[(X_j − X_i)(Y_j − Y_i) < 0]

When the marginal distributions are continuous, ties occur with probability zero, so that p_c + p_d = 1 and

τ = p_c − p_d = 2p_c − 1 = 1 − 2p_d

If X and Y are independent, then p_c = p_d, so that τ = 0; the converse does not hold, since τ = 0 does not by itself imply independence. For the bivariate normal distribution with correlation coefficient ρ, the standardized differences U = (X_j − X_i)/(σ_X √2) and V = (Y_j − Y_i)/(σ_Y √2) are standard bivariate normal with the same correlation ρ, so that

p_c = P(U > 0, V > 0) + P(U < 0, V < 0) = 2Φ(0, 0)

where, by the classical orthant-probability result,

Φ(0, 0) = 1/4 + (1/2π) arcsin ρ

Hence p_c = 1/2 + (1/π) arcsin ρ and

τ = (2/π) arcsin ρ
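The bivariate normal relation can be evaluated directly; for example, ρ = 0.5 gives τ = 1/3:

```python
from math import asin, pi

def tau_bivariate_normal(rho):
    """Kendall's tau for the bivariate normal distribution: (2/pi)*arcsin(rho)."""
    return 2.0 * asin(rho) / pi
```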

11.2 KENDALL'S TAU COEFFICIENT

Kendall's tau is the measure of association defined by

τ = p_c − p_d    (2.1)

where

p_c = P[(X_j − X_i)(Y_j − Y_i) > 0]        p_d = P[(X_j − X_i)(Y_j − Y_i) < 0]    (2.2)

For a random sample (X_1, Y_1), (X_2, Y_2), ..., (X_n, Y_n) from a bivariate population, define for each pair of indices i ≠ j the indicator

A_ij = sgn(X_j − X_i) sgn(Y_j − Y_i)    (2.3)

where sgn(u) = −1, 0, or 1 according as u < 0, u = 0, or u > 0. Then

P(A_ij = 1) = p_c        P(A_ij = −1) = p_d        P(A_ij = 0) = 1 − p_c − p_d    (2.4)

E(A_ij) = (1)p_c + (−1)p_d = p_c − p_d = τ    (2.5)

so that

T = Σ_{1≤i<j≤n} A_ij / C(n, 2) = [2/(n(n − 1))] Σ_{1≤i<j≤n} A_ij    (2.6)

is an unbiased estimator of τ. Its variance is

var(T) = [C(n, 2)]^{-2} var( Σ_{1≤i<j≤n} A_ij )    (2.7)
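Kendall's sample statistic T can be computed directly from the signs of all pairwise differences; a sketch, with arbitrary illustrative data:

```python
from itertools import combinations

def sgn(u):
    return (u > 0) - (u < 0)

def kendall_T(x, y):
    """T = [2/(n(n-1))] * sum over pairs i<j of sgn(x_j-x_i)*sgn(y_j-y_i)."""
    n = len(x)
    s = sum(sgn(x[j] - x[i]) * sgn(y[j] - y[i])
            for i, j in combinations(range(n), 2))
    return 2.0 * s / (n * (n - 1))
```

Perfectly concordant samples give T = 1 and perfectly discordant samples give T = −1, as the definition requires.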

In expanding the variance of the sum of the A_ij, covariances cov(A_ij, A_hk) with all four subscripts distinct vanish, since the corresponding pairs are independent; the terms that remain are the C(n, 2) variances var(A_ij) and the covariances between pairs of A's sharing exactly one subscript, of which there are n(n − 1)(n − 2) in all (2.8). By symmetry all of the latter equal cov(A_ij, A_ik), so that

n(n − 1) var(T) = 2 var(A_ij) + 4(n − 2) cov(A_ij, A_ik)    (2.9)

From (2.4), E(A_ij^2) = p_c + p_d, so that

var(A_ij) = p_c + p_d − (p_c − p_d)^2    (2.10)

The product A_ij A_ik takes the value 1 when the two pairs are both concordant or both discordant, the value −1 when one is concordant and the other discordant, and the value 0 otherwise (2.11). Writing p_cc, p_dd, and p_cd for the respective joint probabilities (with p_cd = p_dc by symmetry),

cov(A_ij, A_ik) = p_cc + p_dd − 2p_cd − (p_c − p_d)^2    (2.12)

Substituting (2.10) and (2.12) in (2.9) gives

n(n − 1) var(T) = 2(p_c + p_d) + 4(n − 2)(p_cc + p_dd − 2p_cd) − 2(2n − 3)(p_c − p_d)^2    (2.13)

In the continuous case there are no ties, so p_c + p_d = 1 and p_cc + p_dd = 1 − 2p_cd, and (2.13) reduces to

n(n − 1) var(T) = 2 − 2(2n − 3)(2p_c − 1)^2 + 4(n − 2)(1 − 4p_cd)
                = 8(2n − 3)p_c(1 − p_c) − 16(n − 2)p_cd    (2.14)

Since

p_cd = P(A_ij = 1 ∩ A_ik = −1) = P(A_ij = 1) − P(A_ij = 1 ∩ A_ik = 1) = p_c − p_cc

this can also be written

n(n − 1) var(T) = 8(2n − 3)p_c(1 − p_c) − 16(n − 2)(p_c − p_cc)
                = 8p_c(1 − p_c) + 16(n − 2)(p_cc − p_c^2)    (2.15)

where

p_cc = P[(X_j − X_i)(Y_j − Y_i) > 0 ∩ (X_k − X_i)(Y_k − Y_i) > 0]    (2.16)

For continuous X and Y the concordance probabilities can be expressed in terms of the distribution functions:

p_c = 2 ∫∫ F_{X,Y}(x, y) dF_{X,Y}(x, y)    (2.17)

p_cc = ∫∫ [1 − F_X(x) − F_Y(y) + 2F_{X,Y}(x, y)]^2 dF_{X,Y}(x, y)    (2.18)

For computation, write a_ij = sgn(X_j − X_i) and b_ij = sgn(Y_j − Y_i), so that A_ij = a_ij b_ij and

T = Σ_{i=1}^{n} Σ_{j=1}^{n} a_ij b_ij / [n(n − 1)]    (2.19)

This exhibits T as a special case of a generalized correlation coefficient of the form

Γ = Σ Σ a_ij b_ij / [ Σ Σ a_ij^2 · Σ Σ b_ij^2 ]^{1/2}    (2.20)

since, in the absence of ties, Σ Σ a_ij^2 = Σ Σ b_ij^2 = n(n − 1). In terms of the numbers of concordant and discordant sample pairs, say P and Q with P + Q = C(n, 2) when there are no ties,

T = (P − Q)/C(n, 2)    (2.21)
  = 1 − 2Q/C(n, 2) = 2P/C(n, 2) − 1    (2.22)

so that Q = C(n, 2)(1 − T)/2; see Kendall and Gibbons (1990, pp. 30–31). The treatment of ties is deferred to a later section. We now consider the distribution of T under the null hypothesis that X and Y are independent, so that τ = 0.
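With the X's arranged in increasing order, Q is simply the number of inversions among the corresponding Y ranks, which gives a quick hand (or machine) computation of T; a sketch for untied data:

```python
def T_from_inversions(y_ranks):
    """Q = number of pairs i < j with y_ranks[i] > y_ranks[j];
    T = 1 - 4Q/[n(n-1)] when there are no ties."""
    n = len(y_ranks)
    Q = sum(1 for i in range(n) for j in range(i + 1, n)
            if y_ranks[i] > y_ranks[j])
    return 1.0 - 4.0 * Q / (n * (n - 1))
```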

Under the null hypothesis of independence between X and Y, all n! pairings of the Y observations with the X observations are equally likely, so the distribution of T is free of the marginal distributions and symmetric about zero; it can be generated by enumeration. Arrange the X's in natural order and let s denote the number of concordant pairs for a given permutation of the corresponding Y ranks, so that

S = C(n, 2) T = P − Q = 2s − n(n − 1)/2    (2.26)

If u_n(s) denotes the number of permutations yielding the value s, the u_n(s) satisfy the recurrence relation

u_{n+1}(s) = u_n(s) + u_n(s − 1) + u_n(s − 2) + ⋯ + u_n(s − n)    (2.25)

with u_n(s) = 0 for s < 0, or equivalently, in terms of S,

u_{n+1}(S) = u_n(S + n) + u_n(S + n − 2) + ⋯ + u_n(S − n)    (2.27)

For n = 3, direct enumeration of the 3! permutations gives u_3(s) = 1, 2, 2, 1 for s = 0, 1, 2, 3, that is, S = −3, −1, 1, 3. Applying (2.25) with n = 3,

u_4(0) = u_3(0) = 1
u_4(1) = u_3(1) + u_3(0) = 3
u_4(2) = u_3(2) + u_3(1) + u_3(0) = 5
u_4(3) = u_3(3) + u_3(2) + u_3(1) + u_3(0) = 6
u_4(4) = u_3(3) + u_3(2) + u_3(1) = 5
u_4(5) = u_3(3) + u_3(2) = 3
u_4(6) = u_3(3) = 1

so that for n = 4 the null distribution of T is

s:        0      1      2      3      4      5      6
S:       −6     −4     −2      0      2      4      6
T:       −1    −2/3   −1/3     0     1/3    2/3     1
f(s):   1/24   3/24   5/24   6/24   5/24   3/24   1/24

Tables of the null distribution are given, for example, by Kendall and Gibbons (1990, pp. 91–92).

Since the null distribution of T is symmetric about zero, E(T | H_0) = 0 = τ. The null variance follows from (2.15): under independence with continuous marginals, p_c = 2 ∫_0^1 ∫_0^1 uv du dv = 1/2 and

p_cc = ∫_0^1 ∫_0^1 (1 − u − v + 2uv)^2 du dv = 5/18    (2.28)
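The recurrence (2.25) is easy to implement, and it reproduces the n = 3 and n = 4 frequencies above:

```python
def null_counts(n):
    """u_n(s) for s = 0, ..., n(n-1)/2, built from the recurrence
    u_{m+1}(s) = u_m(s) + u_m(s-1) + ... + u_m(s-m), starting from u_1 = [1]."""
    u = [1]  # one arrangement for n = 1, with s = 0
    for m in range(1, n):
        # adding the (m+1)th observation contributes between 0 and m
        # additional concordances
        u = [sum(u[s - i] for i in range(m + 1) if 0 <= s - i < len(u))
             for s in range(len(u) + m)]
    return u
```

Dividing each count by n! gives the null probability function f(s).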

V

Page 445: Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

8/15/2019 Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

http://slidepdf.com/reader/full/gibbons-chakraboti-nonparametric-statistical-inference-fourth-edition 445/675

S 2 15

ð À1Þ ð Þ ¼2 þ16ð À2Þ

36

ð Þ ¼2ð2 þ5Þ9 ð À1Þ ð2 29Þ

¼3 ffiffiffiffiffiffiffiffiffiffiffið À1Þ

ffiffiffiffiffiffiffiffiffiffiffi

2ð2 þ5Þ

ð2 30Þ

j ð Þ t t 6¼0

- K ’

S ð Þ ¼t

t ð Þ 2 13 2 15

ð Þ

ð Þ

ð Àt Þ ð Þ S

2

À Á
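The null variance (2.29) and the standardized statistic (2.30) are simple to evaluate; a minimal sketch (the numerical values shown are illustrative, not from the text):

```python
# Null variance (2.29) and normal approximation (2.30) for Kendall's T
from math import sqrt

def var_T(n):
    return 2 * (2 * n + 5) / (9 * n * (n - 1))

def z_stat(t, n):
    return 3 * t * sqrt(n * (n - 1)) / sqrt(2 * (2 * n + 5))

print(round(var_T(10), 4))          # 0.0617
print(round(z_stat(0.467, 10), 2))  # 1.88
```

A computed value of T is referred to (2.30) and then to the standard normal table when n is beyond the range of the exact tables.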

ESTIMATION OF τ. For any bivariate population, τ = p_c - p_d, the difference between the probabilities of concordance and discordance. Define, for i ≠ j,

    A_ij = sgn(X_j - X_i) sgn(Y_j - Y_i)

so that E(A_ij) = τ for every pair. The statistic

    T = Σ_{i≠j} A_ij / [n(n - 1)]                                    (2.31)

is therefore an unbiased estimator of τ, and the corresponding quantity computed from the sample observations,

    T = Σ_{i≠j} a_ij / [n(n - 1)]                                    (2.32)

is the Kendall sample coefficient already studied. For a consistent estimate of the variance of T, let C_i denote the number of pairs containing observation i that are concordant, so that Σ C_i counts each concordant pair twice. Evaluating the second moment of T, which involves the probabilities of concordance for pairs of pairs with zero, one, or two common members ((2.33)-(2.35)), leads after considerable algebra to the consistent estimator

    σ̂²(T) = 8 { 2 Σ C_i² - Σ C_i - [(2n - 3)/(n(n - 1))] (Σ C_i)² } / [n(n - 1)]²   (2.36)

which can be used for approximate confidence intervals and tests concerning τ.

Example. The following scores were obtained for n = 6 subjects on two tests, X and Y:

    X:  91  52  69  99  72  78
    Y:  89  72  69  96  66  67

Arranging the pairs in increasing order of X, the corresponding Y ranks are 4, 3, 1, 2, 5, 6, and the concordance counts (pairs before plus pairs after) are

    C_1 = 0 + 2 = 2   C_2 = 0 + 2 = 2   C_3 = 0 + 3 = 3
    C_4 = 1 + 2 = 3   C_5 = 4 + 1 = 5   C_6 = 5 + 0 = 5

so that Σ C_i = 20 and Σ C_i² = 76. The proportion of concordant pairs is 20/[6(5)] = 2/3 and

    T = 2(2/3) - 1 = 1/3

From (2.36),

    σ̂²(T) = 8[2(76) - 20 - (9/30)(20)²] / [6(5)]² = 96/900 = 0.1067

so that σ̂(T) = 0.33.
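The computations above can be reproduced directly from the raw scores; a sketch using only the standard library (illustrative, not the book's code):

```python
# T from concordant pairs, and the variance estimate (2.36),
# for the test-score example (n = 6, no ties)
x = [91, 52, 69, 99, 72, 78]
y = [89, 72, 69, 96, 66, 67]
n = len(x)

# C_i = number of pairs containing observation i that are concordant
c = [sum(1 for j in range(n) if j != i
         and (x[j] - x[i]) * (y[j] - y[i]) > 0) for i in range(n)]
C, C2 = sum(c), sum(ci * ci for ci in c)         # 20 and 76
T = 2 * C / (n * (n - 1)) - 1                    # 2(2/3) - 1 = 1/3
var_T = 8 * (2 * C2 - C - (2 * n - 3) * C**2 / (n * (n - 1))) / (n * (n - 1))**2
print(round(T, 4), round(var_T, 4))              # 0.3333 0.1067
```

Note that Σ C_i counts each concordant pair twice, which is why T = 2C/[n(n - 1)] - 1.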

Since T is asymptotically normally distributed about τ with variance estimated consistently by (2.36), an approximate confidence interval for τ with confidence coefficient 1 - α is

    T - z_{α/2} σ̂(T) < τ < T + z_{α/2} σ̂(T)

Similarly, an approximate test of H₀: τ = τ₀ against the two-sided alternative τ ≠ τ₀ rejects when |T - τ₀|/σ̂(T) exceeds z_{α/2}. For the special case τ₀ = 0 the exact null distribution tables, or the approximation (2.30) with the null variance (2.29), should be used instead, since they do not require estimating the variance.

TIES. When ties occur among the X or among the Y observations, sgn(X_j - X_i) or sgn(Y_j - Y_i) is zero for the tied pairs and T as defined in (2.31) can no longer attain the values ±1. A common remedy is to modify the denominator. If t denotes the size of a tied group among the X observations and u the size of a tied group among the Y observations, the coefficient becomes

    T = Σ_{i≠j} a_ij b_ij / { [n(n - 1) - Σ t(t - 1)] [n(n - 1) - Σ u(u - 1)] }^{1/2}   (2.37)

where each sum in the denominator is over all tied groups in the respective sample.

In the notation of pair counts, (2.37) is the coefficient usually called tau b,

    T_b = (P - Q) / { [C(n,2) - Σ C(t,2)] [C(n,2) - Σ C(u,2)] }^{1/2}   (2.38)

which agrees with (2.37) since n(n - 1) - Σ t(t - 1) = 2[C(n,2) - Σ C(t,2)]. T_b can attain the values ±1 only when the pattern of ties is the same in both variables. Another coefficient simply ignores the tied pairs,

    G = (P - Q)/(P + Q)                                              (2.39)

This is the gamma coefficient, proposed and extensively studied by Goodman and Kruskal (1954, 1959, 1963). G equals +1 whenever Q = 0 and -1 whenever P = 0, so it reaches its extreme values far more easily than T_b, and in two-way contingency tables with many ties G tends to be larger in absolute value than T_b. The choice between them depends on the interpretation desired: T_b is a correlation-type measure corrected for ties, while G is the difference between the conditional probabilities of concordance and discordance given that a pair is untied in both variables. Related coefficients designed specifically for contingency tables are discussed in Chapter 14.

SPEARMAN'S COEFFICIENT OF RANK CORRELATION

For a random sample of n pairs (X_1, Y_1), ..., (X_n, Y_n), let R_i denote the rank of X_i among the X observations and S_i the rank of Y_i among the Y observations. Spearman's coefficient is simply the Pearson product-moment correlation coefficient computed on the ranks:

    R = Σ (R_i - R̄)(S_i - S̄) / { Σ (R_i - R̄)² Σ (S_i - S̄)² }^{1/2}   (3.1)

Because each set of ranks is a permutation of the first n integers, (3.1) reduces to much simpler computational forms.

Since each of R_1, ..., R_n and S_1, ..., S_n is a permutation of 1, 2, ..., n,

    R̄ = S̄ = (n + 1)/2                                               (3.2)
    Σ (R_i - R̄)² = Σ (S_i - S̄)² = Σ i² - n(n + 1)²/4 = n(n² - 1)/12   (3.3)

Substituting (3.2) and (3.3) in (3.1) gives the equivalent forms

    R = 12 Σ (R_i - R̄)(S_i - S̄) / [n(n² - 1)]                        (3.4)
    R = 12 [ Σ R_i S_i - n(n + 1)²/4 ] / [n(n² - 1)]                 (3.5)
    R = 12 Σ R_i S_i / [n(n² - 1)] - 3(n + 1)/(n - 1)                (3.6)

Now let D_i = R_i - S_i = (R_i - R̄) - (S_i - S̄) denote the difference between the two ranks of the ith pair. Squaring and summing,

    Σ D_i² = Σ (R_i - R̄)² + Σ (S_i - S̄)² - 2 Σ (R_i - R̄)(S_i - S̄)

and substituting (3.3) and (3.4) yields the familiar computational formula

    R = 1 - 6 Σ D_i² / [n(n² - 1)]                                   (3.7)

If the two rankings agree perfectly, every D_i = 0 and R = 1. If they are exactly reversed, S_i = n - R_i + 1, so that D_i = 2R_i - (n + 1) and

    Σ D_i² = 4 Σ [R_i - (n + 1)/2]² = n(n² - 1)/3

giving R = -1 from (3.7). Thus -1 ≤ R ≤ 1, with the extreme values attained only for perfect direct or inverse agreement. Under the null hypothesis of independence each of the n! pairings of the two sets of ranks is equally likely, and the resulting null distribution of R is symmetric about zero.
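The equivalence of (3.7) with the defining correlation (3.1) is easy to verify numerically; a sketch using the test-score data from the earlier Kendall example (which have no ties):

```python
# Spearman's R: definition (3.1) with (3.3) versus the
# rank-difference formula (3.7); untied data
x = [91, 52, 69, 99, 72, 78]
y = [89, 72, 69, 96, 66, 67]

def ranks(v):
    return [1 + sum(w < u for w in v) for u in v]   # valid when no ties

r, s = ranks(x), ranks(y)
n = len(x)
R_def = (sum((a - (n + 1) / 2) * (b - (n + 1) / 2) for a, b in zip(r, s))
         / (n * (n * n - 1) / 12))                  # (3.1) using (3.3)
R_d2 = 1 - 6 * sum((a - b) ** 2 for a, b in zip(r, s)) / (n * (n * n - 1))  # (3.7)
print(round(R_d2, 4))                               # 0.4857
```

The two forms agree exactly; (3.7) is simply the faster hand computation.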

For n = 3, for example, the 3! = 6 equally likely pairings give Σ D_i² values of 0, 2, 2, 6, 6, 8, so the complete null distribution of R is

    r:       -1.0  -0.5   0.5   1.0
    f_R(r):   1/6   2/6   2/6   1/6

The distribution is symmetric about zero but, unlike that of T, its possible values are not evenly spaced, and the number of distinct values grows rapidly with n. Tables of the exact null distribution for small n are given in the references cited at the end of the chapter, including Kendall and Gibbons (1990).

The moments of R under the null hypothesis follow from the joint distribution of the ranks. For any i and any j ≠ i,

    P(S_i = k) = 1/n        for k = 1, 2, ..., n                     (3.8)
    P(S_i = k, S_j = m) = 1/[n(n - 1)]   for k ≠ m                   (3.9)

so that E(S_i) = (n + 1)/2 and var(S_i) = (n² - 1)/12 as in (3.2) and (3.3), while a direct calculation with (3.9) gives

    cov(S_i, S_j) = -(n + 1)/12        for i ≠ j                     (3.10)

Under independence the two sets of ranks are mutually independent, so that

    E(R_i S_i) = E(R_i) E(S_i) = (n + 1)²/4                          (3.11)

and the second moment of Σ R_i S_i follows from (3.10) and (3.11) by a similar but lengthier calculation ((3.12)).

From these moments we obtain, under the null hypothesis,

    E(R) = 0     var(R) = 1/(n - 1)                                  (3.13)

As n increases the null distribution of R approaches normality, so that

    Z = R √(n - 1)                                                   (3.14)

is approximately standard normal. A somewhat more accurate approximation for intermediate n treats

    t = R √[(n - 2)/(1 - R²)]                                        (3.15)

as Student's t with n - 2 degrees of freedom, in analogy with the corresponding normal-theory test for the Pearson correlation coefficient.
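Both approximations are one-line computations; a sketch (the numerical inputs are illustrative, not from the text):

```python
# Large-sample approximations for Spearman's R:
# (3.14) z = R*sqrt(n-1)  and  (3.15) t = R*sqrt((n-2)/(1-R^2)), n-2 df
from math import sqrt

def spearman_z(R, n):
    return R * sqrt(n - 1)

def spearman_t(R, n):
    return R * sqrt((n - 2) / (1 - R * R))

print(round(spearman_z(0.6, 10), 3))   # 1.8
print(round(spearman_t(0.6, 10), 3))   # 2.121
```

The z form refers to the standard normal table, the t form to the Student's t table with n - 2 degrees of freedom.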

TIES. When ties are present the usual practice is to assign midranks. The sums of squares in (3.3) must then be corrected: replacing a group of t tied ranks by their common midrank reduces the sum of squares by (t³ - t)/12 ((3.16) and (3.17) give the required sums for a single set of midranks). Writing

    u′ = Σ (t³ - t)/12    and    v′ = Σ (t³ - t)/12

for the tied groups among the X and the Y observations respectively, we have

    Σ (R_i - R̄)² = n(n² - 1)/12 - u′                                 (3.18)

and similarly for the S_i with v′. Substituting the corrected sums of squares in (3.1) gives

    R = [ 12 Σ R_i S_i - 3n(n + 1)² ] / { [n(n² - 1) - 12u′] [n(n² - 1) - 12v′] }^{1/2}   (3.19)

or, in terms of the rank differences,

    R = [ n(n² - 1) - 6 Σ D_i² - 6(u′ + v′) ] / { [n(n² - 1) - 12u′] [n(n² - 1) - 12v′] }^{1/2}   (3.20)

When there are no ties, u′ = v′ = 0 and (3.19) and (3.20) reduce to (3.5) and (3.7).

The corrected coefficient (3.19) is exactly the Pearson correlation computed on the midranks. Conditional on the observed patterns of ties, its null mean is still zero and its null variance is still 1/(n - 1), by the same moment calculations as in (3.8)-(3.12) applied to the midranks. Consequently the approximate tests based on (3.14), and with somewhat less justification (3.15), continue to be applicable with a moderate number of ties.

RELATIONS BETWEEN R AND T

Both coefficients are computed from the same ranks, and both can be written in terms of the sign indicators

    U_ij = sgn(X_j - X_i)      V_ij = sgn(Y_j - Y_i)                 (4.2)

Kendall's coefficient is

    T = Σ_{i≠j} U_ij V_ij / [n(n - 1)]                               (4.1)

while Spearman's R, as we now show, has a similar representation involving triples of observations; comparing the two representations leads to exact relations between the expected values of R and T.

Since the rank of X_i can be written as a sum of indicators, R_i - (n + 1)/2 = (1/2) Σ_j U_ji, and similarly for S_i, substitution in (3.4) gives the triple-sum representation

    R = 3 Σ_i Σ_j Σ_m U_ij V_im / [n(n² - 1)]                        (4.5)

Separating the n(n - 1) terms with m = j, which sum to n(n - 1)T, from the n(n - 1)(n - 2) terms with distinct i, j, m gives

    R = 3T/(n + 1) + 3 Σ U_ij V_im / [n(n² - 1)]    (sum over distinct i, j, m)

Taking expectations, E(U_ij V_ij) = τ, while for distinct i, j, m, E(U_ij V_im) = 2p₂ - 1 where

    p₂ = P[(X_i - X_j)(Y_i - Y_m) > 0]

Hence

    E(R) = 3τ/(n + 1) + 3(n - 2)(2p₂ - 1)/(n + 1)                    (4.6)
         = 3[τ + (n - 2)(2p₂ - 1)]/(n + 1)                           (4.7)

The parameter p₂ can be evaluated by conditioning on (X_i, Y_i) = (x, y): given that pair, the probability that both comparisons are positive or both negative is F_X(x)F_Y(y) + [1 - F_X(x)][1 - F_Y(y)], and since E[F_X(X)] = E[F_Y(Y)] = 1/2 the expectation reduces to ((4.8))

    p₂ = 2 ∫∫ F_X(x) F_Y(y) f_{X,Y}(x, y) dx dy = 2 E[F_X(X) F_Y(Y)]   (4.9)

For any bivariate distribution 1/3 ≤ p₂ ≤ 2/3, the lower bound attained under perfect inverse dependence and the upper bound under perfect direct dependence, while under independence p₂ = 1/2. Letting n → ∞ in (4.7), the weight on τ vanishes and

    lim_{n→∞} E(R) = 3(2p₂ - 1)                                      (4.10)

which lies in [-1, 1] and equals zero under independence. For finite n the exact mean remains

    E(R) = 3[τ + (n - 2)(2p₂ - 1)]/(n + 1)                           (4.11)

The limiting value (4.10) has a direct interpretation. The grades of a pair (X, Y) are U = F_X(X) and V = F_Y(Y), each uniformly distributed on (0, 1) with variance 1/12, so the grade correlation, the population analog of Spearman's coefficient, is

    ρ_g = cov(U, V) / (1/12) = 12 E[F_X(X) F_Y(Y)] - 3

Since p₂ = 2E[F_X(X)F_Y(Y)] by (4.9), we have 3(2p₂ - 1) = 12E[F_X(X)F_Y(Y)] - 3 = ρ_g. In large samples, therefore, R is essentially an estimator of the grade correlation, just as T is an unbiased estimator of τ (Hoeffding, 1948; Daniels, 1950, among others, studied these population parameters).

Exact relations also hold between the two coefficients computed from the same set of n untied pairs. Since T = 2(P - Q)/[n(n - 1)], both coefficients are functions of the same pattern of agreements, and Daniels showed that

    -1 ≤ 3T - 2R ≤ 1

for every sample; sharper bounds depending on n were obtained by Durbin and Stuart. These inequalities show that R and T, though generally unequal, cannot be far apart, and in particular must agree in sign whenever the association is strong.

For testing independence, then, T and R extract essentially the same information from the ranks. The exact null distribution should be used for small n; for larger n, T is referred to the normal approximation (2.30) and R to (3.14) or (3.15). The two tests have the same asymptotic relative efficiency against bivariate normal alternatives, so the choice between them can rest on convenience of computation and on which interpretation, the concordance interpretation of T or the correlation interpretation of R, is more natural for the problem at hand.

Example. Two judges, J1 and J2, each ranked the same n = 9 applicants:

    Applicant:  1  2  3  4  5  6  7  8  9
    Judge 1:    1  5  9  7  4  6  8  2  3
    Judge 2:    4  3  6  8  2  7  9  1  5

Arranging the pairs in the order of Judge 1's ranks, the corresponding Judge 2 ranks are 4, 1, 5, 2, 3, 7, 8, 9, 6. The table below shows, for each position, the numbers of concordant (C_i) and discordant (Q_i) pairs formed with later positions, and the rank differences D_i:

    J1 rank  J2 rank  C_i  Q_i  D_i  D_i²
       1        4      5    3   -3    9
       2        1      7    0    1    1
       3        5      4    2   -2    4
       4        2      5    0    2    4
       5        3      4    0    2    4
       6        7      2    1   -1    1
       7        8      1    1   -1    1
       8        9      0    1   -1    1
       9        6      -    -    3    9
    Totals            28    8    0   34

Thus P = 28, Q = 8, and

    T = (28 - 8)/36 = 0.556      R = 1 - 6(34)/[9(80)] = 0.717

From (2.30), Z = 2.09 with approximate one-tailed P value 0.018; from (3.15), t = 2.72 with 7 degrees of freedom, also significant at the usual levels. Both coefficients indicate substantial positive agreement between the judges, with R, as is typical, larger in absolute value than T.
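The two coefficients for this example can be checked from the definitions; a standard-library sketch:

```python
# T and R for the two-judges example: expect 0.556 and 0.717
judge1 = [1, 5, 9, 7, 4, 6, 8, 2, 3]
judge2 = [4, 3, 6, 8, 2, 7, 9, 1, 5]
n = len(judge1)

pairs = [(i, j) for i in range(n) for j in range(i + 1, n)]
s = (sum((judge1[j] - judge1[i]) * (judge2[j] - judge2[i]) > 0 for i, j in pairs)
     - sum((judge1[j] - judge1[i]) * (judge2[j] - judge2[i]) < 0 for i, j in pairs))
T = 2 * s / (n * (n - 1))                          # (P - Q) / C(n,2)
d2 = sum((a - b) ** 2 for a, b in zip(judge1, judge2))
R = 1 - 6 * d2 / (n * (n ** 2 - 1))                # (3.7)
print(round(T, 3), round(R, 3))                    # 0.556 0.717
```

Library routines such as scipy.stats.kendalltau and scipy.stats.spearmanr give the same values for untied data.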

The value Q = 8 in this example has a direct operational interpretation: it is the minimum number of interchanges of adjacent elements needed to transform Judge 2's ordering, taken in the order of Judge 1's ranks, into the natural order:

    4 1 5 2 3 7 8 9 6
    1 4 5 2 3 7 8 9 6
    1 4 2 5 3 7 8 9 6
    1 2 4 5 3 7 8 9 6
    1 2 4 3 5 7 8 9 6
    1 2 3 4 5 7 8 9 6
    1 2 3 4 5 7 8 6 9
    1 2 3 4 5 7 6 8 9
    1 2 3 4 5 6 7 8 9

Eight successive interchanges are required, so that T = 1 - 4(8)/72 = 0.556 measures the distance between the two rankings on a scale of such interchanges. As descriptive measures of association, T has this simple interpretation and is an unbiased estimator of the population parameter τ, while R is easier to compute by hand and more familiar because of its kinship with the product-moment correlation coefficient; both are standard tools whenever the data consist of rankings or can sensibly be reduced to ranks, as with judges' preferences, ordered classifications, and trend studies.

PROBLEMS

[The exercises for this chapter ask the reader to compute T and R, with the appropriate tie corrections, carry out the corresponding exact or approximate tests of independence, find confidence interval estimates of τ, and compare the two coefficients, for a variety of small paired data sets: test scores, rankings assigned by judges, rates and percentages recorded over time, and related bivariate samples. The accompanying data tables are not recoverable here.]

12
Measures of Association in Multiple Classifications

12.1 INTRODUCTION

In a two-way layout the observations are classified according to two factors, as in a randomized complete block design with n blocks and k treatments. Writing

    X_ij = μ + β_i + θ_j + ε_ij      i = 1, 2, ..., n;  j = 1, 2, ..., k

where β_i is the block effect and θ_j the treatment effect, the hypothesis of interest is

    H₀: θ₁ = θ₂ = ... = θ_k

In the classical analysis of variance the test statistic is the ratio of the treatment mean square, n Σ_j (X̄_{.j} - X̄)²/(k - 1), to the error mean square, Σ_i Σ_j (X_ij - X̄_{i.} - X̄_{.j} + X̄)²/[(n - 1)(k - 1)], which has the F distribution under the normal-theory assumptions. When those assumptions are not tenable, a distribution-free analog is obtained by ranking the observations separately within each block. This device leads to Friedman's test (Friedman, 1937, 1940) for equal treatment effects and, viewed as a problem of agreement among k rankings, to Kendall's coefficient of concordance; both are developed in this chapter, along with Page's test for ordered alternatives and the extension of concordance to incomplete rankings.

12.2 FRIEDMAN'S TWO-WAY ANALYSIS OF VARIANCE BY RANKS

Let r_ij denote the rank of X_ij among the k observations in block i, so that each row of the n × k table of ranks is a permutation of 1, 2, ..., k and every row total equals k(k + 1)/2:

    Block 1:   r_11  r_12  ...  r_1k     k(k + 1)/2
    Block 2:   r_21  r_22  ...  r_2k     k(k + 1)/2
      ...
    Block n:   r_n1  r_n2  ...  r_nk     k(k + 1)/2
    Totals:    R_1   R_2   ...  R_k      nk(k + 1)/2      (2.1)

Under H₀ each column total R_j has expectation n(k + 1)/2, so a natural test criterion is the sum of squared deviations

    S = Σ_{j=1}^{k} [R_j - n(k + 1)/2]²                              (2.2)

with H₀ rejected for large values of S. Under the null hypothesis all (k!)ⁿ assignments of ranks within blocks are equally likely, and this determines the exact distribution of S.

Exact critical values of S obtained by this enumeration are available in the literature for k = 3 with n up to 15 and for k = 4 with n up to 8. For larger tables an approximation based on the moments is used. Under H₀ each r_ij is uniform on {1, ..., k}, so E(r_ij) = (k + 1)/2 and var(r_ij) = (k² - 1)/12, while within a block cov(r_ij, r_im) = -(k + 1)/12 for j ≠ m. Summing over the n independent blocks,

    E(R_j) = n(k + 1)/2    var(R_j) = n(k² - 1)/12    cov(R_j, R_m) = -n(k + 1)/12

from which

    E(S) = nk(k² - 1)/12

and var(S) is given by a similar but lengthier expression ((2.7)). For large n the statistic

    Q = 12S/[nk(k + 1)] = 12 Σ R_j² / [nk(k + 1)] - 3n(k + 1)        (2.8)

is approximately chi-square distributed with k - 1 degrees of freedom; indeed E(Q) = k - 1 exactly and var(Q) = 2(k - 1)(n - 1)/n ≈ 2(k - 1), matching the first two moments of the chi-square distribution. This is Friedman's test: reject H₀ when Q exceeds the chi-square critical value with k - 1 degrees of freedom.
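Computation of (2.8) from a table of within-block ranks is direct; a sketch on hypothetical data (not from the text):

```python
# Friedman's Q (2.8): Q = 12/(n k (k+1)) * sum R_j^2 - 3 n (k+1),
# from an n x k table of within-block ranks
def friedman_Q(ranks):
    n, k = len(ranks), len(ranks[0])
    col = [sum(row[j] for row in ranks) for j in range(k)]
    return 12 * sum(c * c for c in col) / (n * k * (k + 1)) - 3 * n * (k + 1)

ranks = [[1, 2, 3],
         [1, 3, 2],
         [1, 2, 3],
         [2, 1, 3]]          # hypothetical: n = 4 blocks, k = 3 treatments
print(round(friedman_Q(ranks), 2))   # 4.5, referred to chi square with 2 df
```

The computed value is compared with the chi-square table (or the exact tables for small n and k).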

Friedman's Q is equivalent to the classical analysis of variance applied to the within-block ranks. Because every row total is the same, the block sum of squares vanishes, and with R̄_j = R_j/n the usual partition of the total sum of squares about the overall mean (k + 1)/2 reduces to

    Σ_i Σ_j [r_ij - (k + 1)/2]² = Σ_i Σ_j (r_ij - R̄_j)² + S/n        (2.9)

The left-hand side equals nk(k² - 1)/12 identically, so a large treatment component S/n forces a small residual component; any test based on the rank F ratio is therefore a monotone function of S, that is, of Q.

An alternative to the chi-square approximation refers

    F = (n - 1)Q / [n(k - 1) - Q]                                    (2.10)

to the F distribution with k - 1 and (n - 1)(k - 1) degrees of freedom; this form is often more accurate for small n. When ties occur within a block, midranks are assigned, and S may be computed from

    S = Σ R_j² - n²k(k + 1)²/4                                       (2.11)

which is algebraically identical to (2.2) since the column totals still sum to nk(k + 1)/2.

With midranks the null variance of S is reduced, and Q in (2.8) should be replaced by

    Q = 12(k - 1)S / [nk(k² - 1) - Σ_i Σ (t³ - t)]                   (2.12)

where the inner sum is over the tied groups, of sizes t, within block i; (2.12) reduces to (2.8) when there are no ties. When H₀ is rejected, a multiple-comparisons procedure declares treatments j and j′ different whenever

    |R_j - R_j′| ≥ z √[nk(k + 1)/6]                                  (2.13)

where z is the upper α/[k(k - 1)] quantile of the standard normal distribution, which controls the overall error rate at approximately α.

Example. Eight blocks were observed under each of k = 4 treatments:

    Block   1    2    3    4
      1    14   23   26   30
      2    19   25   25   33
      3    17   22   29   28
      4    17   21   28   27
      5    16   24   28   32
      6    15   23   27   36
      7    18   26   27   26
      8    16   22   30   32

Ranking within blocks, with midranks for the ties in blocks 2 and 7:

    Block   1    2     3     4
      1     1    2     3     4
      2     1    2.5   2.5   4
      3     1    2     4     3
      4     1    2     4     3
      5     1    2     3     4
      6     1    2     3     4
      7     1    2.5   4     2.5
      8     1    2     3     4
    Totals  8   17    26.5  28.5
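The column totals, S, and both forms of Q for this example can be reproduced as follows (a sketch, standard library only):

```python
# k = 4, n = 8 example: S = 267.5, Q = 20.06 from (2.8),
# tie-corrected Q = 20.58 from (2.12)
data = [[14, 23, 26, 30], [19, 25, 25, 33], [17, 22, 29, 28], [17, 21, 28, 27],
        [16, 24, 28, 32], [15, 23, 27, 36], [18, 26, 27, 26], [16, 22, 30, 32]]

def midranks(row):
    return [sum(v < u for v in row) + (sum(v == u for v in row) + 1) / 2
            for u in row]

ranks = [midranks(row) for row in data]
n, k = len(ranks), len(ranks[0])
col = [sum(r[j] for r in ranks) for j in range(k)]          # 8, 17, 26.5, 28.5
S = sum((c - n * (k + 1) / 2) ** 2 for c in col)            # 267.5
Q = 12 * S / (n * k * (k + 1))                              # 20.06
ties = sum(sum(t**3 - t for t in (row.count(u) for u in set(row)))
           for row in data)                                  # 6 + 6 = 12
Q_ties = 12 * (k - 1) * S / (n * k * (k * k - 1) - ties)    # 20.58
print(col, S, round(Q, 2), round(Q_ties, 2))
```

Each tied pair of size t = 2 contributes t³ - t = 6 to the correction term.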

Here S = (8 - 20)² + (17 - 20)² + (26.5 - 20)² + (28.5 - 20)² = 267.5, and (2.8) gives Q = 12(267.5)/[8(4)(5)] = 20.1. Correcting for the ties, Σ Σ(t³ - t) = 6 + 6 = 12 and (2.12) gives Q = 12(3)(267.5)/[8(4)(15) - 12] = 20.58. Either value is far beyond the 0.001 critical value of chi square with 3 degrees of freedom, so the treatment effects clearly differ. For the multiple comparisons (2.13) with overall α = 0.15, z = 2.24 and the critical difference is 2.24 √[8(4)(5)/6] = 11.57; comparing the rank sums, treatment 1 differs from treatments 3 and 4 (differences 18.5 and 20.5), while the remaining differences are not significant. Statistical packages such as MINITAB and STATXACT report the tie-corrected Q = 20.58 with the chi-square approximation, and STATXACT can in addition provide an exact or Monte Carlo permutation P value, which is preferable when n and k are small.

12.3 PAGE'S TEST FOR ORDERED ALTERNATIVES

When the alternative specifies a particular ordering of the treatment effects, say θ₁ ≤ θ₂ ≤ ... ≤ θ_k with at least one strict inequality, Friedman's test ignores the ordering. Page (1963) proposed the statistic

    L = Σ_{j=1}^{k} j R_j                                            (3.1)

with the treatments labeled in the hypothesized order, H₀ being rejected for large values of L. Exact critical values are given by Page (1963) for significance levels 0.001, 0.01, and 0.05; beyond the range of the tables,

    Z = [12(L - 0.5) - 3nk(k + 1)²] / [k(k + 1) √(n(k - 1))]         (3.2)

is approximately standard normal, since under H₀

    E(L) = nk(k + 1)²/4      var(L) = nk²(k + 1)²(k - 1)/144

Example. For n = 6 blocks and k = 4 treatments, the within-block ranks gave rank sums of 21, 16, 13, and 10 for the treatments expected a priori to have effects in decreasing order of magnitude. Relabeling in the hypothesized increasing order gives

    L = 1(10) + 2(13) + 3(16) + 4(21) = 168

Since the 0.05 critical value for k = 4, n = 6 is 163, H₀ is rejected in favor of the ordered alternative at the 0.05 level.

The same idea applies to the example of the preceding section if the treatments are expected a priori to have increasing effects. Using the column rank sums 8, 17, 26.5, and 28.5,

    L = 1(8) + 2(17) + 3(26.5) + 4(28.5) = 235.5

which is significant well beyond the 0.001 level, with the qualification that midranks were used for the tied observations.
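The statistic (3.1) is a single weighted sum of the column rank totals; a sketch using the rank sums of the k = 4, n = 8 example:

```python
# Page's L (3.1): L = sum_j j * R_j, with treatments indexed
# in the hypothesized increasing order of effect
col_sums = [8, 17, 26.5, 28.5]      # column rank sums from the earlier example
L = sum((j + 1) * r for j, r in enumerate(col_sums))
print(L)                             # 235.5
```

L is then referred to Page's exact tables or, for larger n, standardized by (3.2).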

12.4 THE COEFFICIENT OF CONCORDANCE FOR COMPLETE RANKINGS

Suppose now that k judges each produce a complete ranking of the same n objects, so that we have k sets of rankings rather than two and the pairwise coefficients of Chapter 11 no longer suffice by themselves. Kendall's coefficient of concordance W measures the overall agreement among the k rankings. Let R_i denote the total of the ranks assigned to object i by the k judges, so that Σ R_i = kn(n + 1)/2 and each R_i has average value k(n + 1)/2. Agreement is measured by the spread of the rank totals, which is largest when all k rankings are identical; in that extreme case the ordered totals are k, 2k, ..., nk and the sum of squared deviations attains its maximum value

    k²n(n² - 1)/12                                                   (4.1)

Accordingly, define

    S = Σ_{i=1}^{n} [R_i - k(n + 1)/2]²                              (4.2)

and divide by the maximum value (4.1) to obtain Kendall's coefficient of concordance

    W = 12S / [k²n(n² - 1)]                                          (4.4)

W = 1 if and only if all k rankings are identical, and W = 0 when the rank totals are all equal. W cannot be negative: for k > 2 simultaneous complete disagreement among all the judges is impossible, so the natural reference point for "no association" is W = 0. W is also directly related to the Spearman coefficients computed from the k(k - 1)/2 pairs of rankings ((4.3)). If R_ave denotes their average,

    R_ave = Σ_{u<v} R_uv / [k(k - 1)/2]                              (4.5)

then

    R_ave = (kW - 1)/(k - 1)                                         (4.6)

or equivalently

    W = [(k - 1) R_ave + 1]/k                                        (4.7)

Since 0 ≤ W ≤ 1, (4.6) shows that R_ave ≥ -1/(k - 1): with many judges the average pairwise rank correlation cannot be very negative.

Under the null hypothesis that the k rankings are independent and each is uniformly distributed over the n! permutations, the exact distribution of W can be enumerated; tables for small k and n are given by Friedman (1940) and Kendall and Gibbons (1990). For larger samples,

    Q = 12S/[kn(n + 1)] = k(n - 1)W

is approximately chi square with n - 1 degrees of freedom; this is formally Friedman's statistic (2.8) with the roles of blocks and treatments interchanged, the k judges playing the part of the blocks and the n objects that of the treatments. The null moments of W are

    E(W) = 1/k      var(W) ≈ 2(k - 1)/[k³(n - 1)]                    (4.8)

An alternative approximation refers

    F = (k - 1)W/(1 - W)

to the F distribution with ν₁ = (n - 1) - 2/k and ν₂ = (k - 1)ν₁ degrees of freedom, which is somewhat more accurate in small samples.
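The computation of W and its chi-square statistic from a matrix of complete rankings is short; a sketch on hypothetical data (not from the text):

```python
# Kendall's W (4.4): W = 12 S / (k^2 n (n^2 - 1)),
# with the approximate test statistic Q = k (n - 1) W, n - 1 df
rankings = [[1, 2, 3, 4, 5],
            [2, 1, 4, 3, 5],
            [1, 3, 2, 5, 4]]        # hypothetical: k = 3 judges, n = 5 objects
k, n = len(rankings), len(rankings[0])
R = [sum(r[i] for r in rankings) for i in range(n)]      # rank totals per object
S = sum((ri - k * (n + 1) / 2) ** 2 for ri in R)
W = 12 * S / (k * k * n * (n * n - 1))
Q = k * (n - 1) * W                                      # chi square, n - 1 df
print(round(W, 3), round(Q, 2))                          # 0.756 9.07
```

Large W (near 1) indicates strong agreement among the judges; the chi-square form tests whether the observed agreement exceeds chance.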

When ties occur within rankings, midranks are used and the maximum attainable value of S is reduced. Writing Σ(t³ - t) for the tie correction summed over all tied groups in all k rankings, the coefficient becomes

    W = 12S / [k²n(n² - 1) - k Σ(t³ - t)]

which reduces to (4.4) when there are no ties. The algebraic identities connecting S, the rank totals, and the sums of squared rank differences between pairs of rankings continue to hold with the corrected sums of squares, so the relations (4.6) and (4.7) between W and the average of the pairwise Spearman coefficients remain valid in their tie-corrected forms.

2 11

Q 2 8 ð À1ÞW B À1

p

D ?

1 8

¼3 ¼8

P 2 ¼1510 2 11 ¼52 W ¼0 138 4 4

Q 2 8

1 3 4 5 6 7 8

Q 90 60 45 48 58 72 25 85 V 62 81 92 76 70 75 95 72R 60 91 85 81 90 76 93 80

1 3 4 5 6 7 8

Q 1 4 7 6 5 3 8 2 V 8 3 2 4 7 5 1 6R 8 12 4 5 3 7 1 6

17 9 13 15 15 15 10 14

7 2

Page 506: Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

8/15/2019 Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

http://slidepdf.com/reader/full/gibbons-chakraboti-nonparametric-statistical-inference-fourth-edition 506/675

Q ¼2 89 7 B 0 50 0 90

Q

P 2 ¼158 ¼ À0 881 ¼0 004 M

SPSSX S X C K

0 8951 S X C 0 9267

1 3 4 5 6 7 8 V 122 171 177 157 160 151 188 152 V 8 3 2 5 4 7 1 6Q 1 4 7 6 5 3 8 2
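The identity (4.6) linking W to the average pairwise Spearman coefficient can be verified numerically; a sketch on hypothetical rankings (illustrative, not the example's data):

```python
# Check of (4.6): average pairwise Spearman R equals (k W - 1)/(k - 1)
from itertools import combinations

rankings = [[1, 2, 3, 4, 5],
            [2, 1, 4, 3, 5],
            [1, 3, 2, 5, 4]]        # hypothetical: k = 3 judges, n = 5 objects
k, n = len(rankings), len(rankings[0])

def spearman(a, b):                 # (3.7), valid since there are no ties
    d2 = sum((x - y) ** 2 for x, y in zip(a, b))
    return 1 - 6 * d2 / (n * (n * n - 1))

R = [sum(r[i] for r in rankings) for i in range(n)]
S = sum((ri - k * (n + 1) / 2) ** 2 for ri in R)
W = 12 * S / (k * k * n * (n * n - 1))
ave = sum(spearman(a, b) for a, b in combinations(rankings, 2)) / (k * (k - 1) / 2)
assert abs(ave - (k * W - 1) / (k - 1)) < 1e-12
```

The identity explains why W cannot detect "negative concordance": as k grows, the smallest possible average correlation -1/(k - 1) approaches zero.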

7

Page 507: Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

8/15/2019 Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

http://slidepdf.com/reader/full/gibbons-chakraboti-nonparametric-statistical-inference-fourth-edition 507/675

K

S 12 4

10

:

1

2 3

1

7 2

Page 508: Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

8/15/2019 Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

http://slidepdf.com/reader/full/gibbons-chakraboti-nonparametric-statistical-inference-fourth-edition 508/675

( , ) 1

0

1 P ¼1 ¼ ¼1 2

2 P

¼1 ¼ ¼1 2

3 P

¼1 ¼ 6¼ ¼1 2

S 1 2

¼1 ¼1 ¼ ¼

¼ 3

¼1 ¼1 2

¼

¼1 ¼1

2 þ ¼1 ¼1

¼ þ ð À1Þ

1 2

¼ ð À1Þð À1Þ ¼ ð À1Þ

À1

S

À1 ð À1Þ D S

C C 1957 520–544 ¼7 ¼1 ¼ ¼3

, , , , , :

O 1 3 4 5 6 7

77

Page 509: Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

8/15/2019 Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

http://slidepdf.com/reader/full/gibbons-chakraboti-nonparametric-statistical-inference-fourth-edition 509/675

1 2

1 1 2 2 1 1

3 3

1 2 2 1 2 1

1 2 À1 2 S

1 ¼1

1 þ À1

¼1 2

¼11 þ

À1

¼1 ¼

¼1 þ

À1

¼1

¼1 þ ð À1Þ

¼1 2

2 3S

½ð þ1

Þ2

½

¼ ð þ1

Þ2

ð þ1Þ2

þ þ2 þ ð À1Þ


À1

¼0 ð þ Þ À ð þ1Þ

2 !2¼ 2 ð 2 À1Þ12

L

W = 12 \sum_{j=1}^{n} [R_j - k(n+1)/2]^2 / [k^2 n(n^2 - 1)]        (5.1)

¼ ¼ 5 1 4 4

0 1

W

W

W 1951

W 15

L P 1972

W 2 7 L ¼1 2

11 3 2 11 3 3 11 3 10

ð Þ ¼þ1

2 ð Þ ¼2

À1

12 ð Þ ¼ Àþ1

12

6¼ D

ð þ1Þ2 W 5 1


W

À1

Q = k(n^2 - 1)W / (n + 1) = k(n - 1)W

ðQÞ ¼À1

ðQÞ ¼2ð À1Þ1 À ð À1Þð À1Þ !%2ð À1Þ 1 À

1

Q 2 Q 2

À1

W S 5 6

D

Q = [12 / (kn(n + 1))] \sum_{j=1}^{n} R_j^2 - 3k(n + 1)        (5.3)
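Expressions (5.1) and (5.3) are easy to check numerically. The sketch below is mine, not the book's code; it uses the three rankings of eight subjects read from the earlier example and computes Q both from W and directly from (5.3):

```python
# Kendall's coefficient of concordance W, eq. (5.1), and the chi-square
# approximation Q, eq. (5.3), for k complete rankings of n objects.

rankings = [                      # k = 3 rankings of n = 8 objects
    [1, 4, 7, 6, 5, 3, 8, 2],
    [8, 3, 2, 4, 7, 5, 1, 6],
    [8, 2, 4, 5, 3, 7, 1, 6],
]
k, n = len(rankings), len(rankings[0])

R = [sum(col) for col in zip(*rankings)]          # column rank sums R_j
S = sum((rj - k * (n + 1) / 2) ** 2 for rj in R)  # sum of squared deviations

W = 12 * S / (k ** 2 * n * (n ** 2 - 1))          # eq. (5.1)
Q_from_W = k * (n - 1) * W                        # Q = k(n^2 - 1)W/(n + 1)
Q_direct = (12 * sum(rj ** 2 for rj in R) / (k * n * (n + 1))
            - 3 * k * (n + 1))                    # eq. (5.3)

print(round(W, 4), round(Q_direct, 4))
```

The two forms of Q agree, and Q is referred to the chi-square distribution with n - 1 degrees of freedom.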

À1

Q

K ’


5 1

W ¼12P ¼1 2

À3 2 ð þ1Þ2

2 ð 2 À1Þ ð5 4ÞQ ¼ ð 2 À1ÞW ð þ1Þ À1

À Ã ffiffiffiffiffiffiffiffiffiffi ð 2 À1Þ

6ð À1Þs ð5 5ÞÃ ½ ð À1Þ

As an example, consider the balanced incomplete block design given by Cochran and Cox (1957, pp. 520-544), with t = 7 treatments in b = 7 blocks of size k = 3, where each treatment appears r = 3 times and each pair of treatments appears together in lambda = 1 block. The observations within each block are ranked from 1 to 3:

Block    Within-block ranks
1        1  2  3
2        1  3  2
3        3  2  1
4        2  3  1
5        1  3  2
6        2  1  3
7        1  3  2

The treatment rank sums are 3, 5, 9, 7, 8, 4, 6, so that \sum R_j^2 = 280.


The test statistic equals 12 with 6 degrees of freedom, and Table B shows that the approximate p value lies between 0.05 and 0.10, so agreement among the rankings is not significant at the 0.05 level.

D

K ’

C

K ’ S 11 2

ð Þ ¼1 2

D

¼ ð À Þ V ¼ ð À Þ W ¼ ð À Þ 1 , , , ¼ ¼ ¼

11 ¼ ð1 1 1Þ 22 ¼ ðÀ1 À1 1Þ 12 ¼ ðÀ1 1 1Þ 21 ¼ ð1 À1 1Þ

11 , 1 22

12 21

2

Â2 6 1


¼ 11 22 À 12 21

ð 1 2 1 2 Þ1 2 ð6 1Þ

À1 þ1

ð 11 þ 21Þð 12 þ 22Þð 11 þ 12Þð 21 þ 22Þ À ð 11 22 À 12 21Þ2

¼0

¼ ¼0 = =

12 ¼ 21 ¼0 XY.Z ¼1 11 ¼ 22 ¼0

¼ À1 M 1975 1981 M1951

K

2 ¼ ð 11 þ 22Þ À ð 12 þ 21Þ

2 ¼ ð 11 þ 21Þ À ð 22 þ 12Þ

2 ¼ ð 11 þ 12Þ Àð 22 þ 21Þ

S 2 ¼ 11 þ 12 þ 12 þ 22 ¼

P C 11 12 1.

P D 21 22 2.

.1 .2 .. ¼


1 À 2 ¼

4ð 11 þ 21Þð 12 þ 22Þ2 ¼4 1 2

2

1 À 2 ¼4ð 11 þ 12Þð 22 þ 21Þ2 ¼

4 1 22

2 ¼ ½ð 11 þ 22Þ À ð 12 þ 21Þ ½ð 11 þ 22Þ þ ð 12 þ 21Þ2ð À Þ ¼4ð 11 22 À 12 21Þ

6 1

¼ À

½ð1 À 2 Þð1 À 2 Þ

1 2 ð6 2Þ

S

Q1967

6 2 P

S 1959

6 2

P

p M 1975 ¼7 6 2


C C

K ’

ð Þð Þ ð Þ ¼3 Q ¼18 ¼ À0 7143 ¼1 Q ¼20

¼ À0 9048 ¼19 Q ¼2 ¼0 8095

¼0 005 L ¼7

6 2

¼0 8095 À ðÀ0 7143ÞðÀ0 9048Þ

ffiffiffiffiffiffiffiffiffiffiffi½1 À ðÀ0 7143 Þ

2

½1 À ðÀ0 9048Þ2

q ¼0 548

P 0 025 0 05

7

S 12 2

p

V

1 2 3 4 5 6 7 6 7 5 3 4 1 2 7 6 5 3 4 2 1


P ’ S 12 3

D S 12 5

S 12 4

K ’

‘‘ ’’

S

ð À1Þ2 S 12 5

S 12 6 K

‘‘ ’’

À1 þ1

:

V

1 45 48 43 412 49 45 42 393 38 39 35 36


:

C K ’ C

av W 4 7

D W

:

C K 6 1

C 6 2

M 1986 P 5 12 8 8

1 14

( ) D

( ) D

1990 S

1 2 1 3 5 4 8 7 62 1 2 4 5 7 6 8 33 3 2 1 4 5 8 7 6

1 3 5 6 4 2 1 2 6 4 3 5

2 1 5 4 6 3


’ 5

2 0

2

20 22 18 21 24 20

23 23 19 26 28 25 32 34 36 27 30 28

38 38 42 34 36 40 28 29 28 20 21 23 29 32 32

22 25 24 30 31 37 25 27 25

O

60

þ55 1 1

C 44 59 1 4M 34 41 1 4MB 25 30 1 8

20 23 2 2


7 0 100 100

K C

8 J 1989 15 20 000

60 60 ¼ 10 D

?

, , , , , , 21

Subject    1    2    3    4    5    6    7    8
V         90   60   45   48   58   72   25   85
Q         62   81   92   76   70   75   95   72
L         60   91   85   81   90   76   93   80

4 5 16

55 40 38 32 43 41 50 50 47 46 41 40

51 54 53 49 39 42 53 51 53 48 38 58

60 54 47 45 44 38S 43 42 45 45 57 45

45 46 43 49 54 45Q 48 48 50 43 44 41

51 46 40 49 53 51 V 53 51 47 53 42 47

509 482 463 459 455 448


3 12

D

S 1 10

?

?

S ¼ ¼ 5 3 5 3 Q ¼12 þ1

V

1 2 3 4 5 6 7

1 2 1 3 8 2 1 3 15 2 1 32 1 2 3 9 2 1 3 16 1 2 33 2 3 1 10 1 3 2 17 3 2 14 3 2 1 11 3 2 1 18 2 1 35 3 1 2 12 3 1 2 19 2 1 36 1 3 2 13 2 3 1 20 2 1 37 1 3 2 14 1 3 2 21 2 1 3

1 3 4 5 6 7 8 9 1

1 5 3 8 9 2 7 6 1 4 102 7 4 6 2 3 9 8 5 1 103 3 5 7 6 4 10 8 2 1 94 4 5 7 8 3 9 6 1 2 10


10 : 1 ¼ 2 ¼ 3 ¼

C 1 2

3

S P 11 15 :

?

?

0 10 10

100

.

(1 2 ) ($1 , )

($1 , )

1 35 305 352 22 130 983 27 189 834 16 175 765 28 101 936 46 269 777 56 421 448 12 195 579 40 282 31

10 32 203 92

8 9 5 6 1 2 7 4 10 3

1 3 4 5 6 7

7 10 7 9 8 8 8 7 6 5 8 7 5 7

3 7 3 5 4 6 3

4 3 2 1 0 1 0


V

100

10

1 ¼ 10 ¼

V V 100

:

C D

V

1 0 0 0 02 0 0 3 13 0 2 4 14 2 2 5 85 3 6 3 106 5 5 1 07 5 5 4 48 7 3 4 19 3 2 1 0

10 0 0 0 0

S 25 25 25 25


p v

C 1 P M

‘‘ ’’

R P 1955


S Ã

C 1

à à Ã

Ã

Ã

0

¼0 1

¼1

0 ¼0 1 ¼1

1 S 0 90

0 05 Ã

ffiffiffiffiffià 1 64 ¼0 05 S 0 90 Ã

:

P(\sqrt{n*} \bar{X} \ge 1.64 | \mu = 1) = \Phi(\sqrt{n*} - 1.64) = 0.90

so that \Phi(1.64 - \sqrt{n*}) = 0.10, i.e., 1.64 - \sqrt{n*} = -1.28, which gives n* = (2.92)^2 = 8.5, or n* = 9 observations for the normal-theory test.

S 5 4

¼ 0 5 ¼ ð1 1Þ

¼ ð1 À ÞÀ

¼ ð1Þ ð1 2Þ


\theta = P(X > 0 | \mu = 1) = 1 - \Phi(-1) = 0.8413

1 1 1 2 ¼0 05 ¼0 90 0 85

S 1 1

1

2

¼1 À ¼13

0 05 ¼10 1 ¼16 0 05 ¼12 2 1

2 ¼0 05

¼0 90

¼0 05 ¼0 90 1 1S


¼0 05

¼1 À

ð1Þ17 13 0 0245 0 9013

12 0 0717 0 968116 12 0 0383 0 9211 1

0 930811 0 1050 0 9766 0 1754

15 12 0 0176 0 8226 10 9151

11 0 0593 0 9382 0 801014 11 0 0288 0 8535 1

0 888110 0 0899 0 9533 0 3470

13 10 0 0461 0 88209 0 1334 0 9650
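The first row of this table (n = 17, reject when the sign-test statistic is at least 13) can be verified by exact binomial computation. A sketch (my own code, assuming as in the text's example that the alternative corresponds to theta = P(X > 0) = 0.85):

```python
from math import comb

def binom_tail(n, k, p):
    """P(S >= k) for S ~ binomial(n, p)."""
    return sum(comb(n, j) * p ** j * (1 - p) ** (n - j)
               for j in range(k, n + 1))

n, k_crit = 17, 13
alpha = binom_tail(n, k_crit, 0.5)    # exact size of the sign test
power = binom_tail(n, k_crit, 0.85)   # power when theta = 0.85

print(round(alpha, 4), round(power, 4))
```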


0 05 0 90

¼ 0 5 þ À1 0 5 ¼0 05

0 05

¼ ð0 85Þ

ð0 85ÞÀ

þ À1 ð0 85Þ

ð0 15ÞÀ

D

0 90 0 90 0 90 1 1 ¼15 ¼14

14 15

0 60 0 64

S

à ðÞ ÃðÞ

S

1 0 ð 0Þ ½ðÞ Ã½ðÞ 0

Ã


S Ã

0 2 1 2 À 2 À

!1P ½ ðÞ ¼1

!1P ½ ÃðÞ ¼1

S 0 1 2 0 1 2 À

!1 ¼ 0 0 1 2

0

R

Ã

P ðÞ P ÃðÞ Ã( Ã ,

), , 0 . Ã .

à ¼ ðÞ , ,

!1P ð Þ ¼!1

P ÃðÞ

1. ( ) Ã

R ð ÃÞ ¼!1Ã


Ã

R Ã

0

¼ Ã

0 ¼ 0 1 0

2 Ã2 Ã Ã Ã

Ã

ð ¼ 0Þ ¼ ð Ã Ã ¼ 0Þ ¼

Ã

1 ð Þ 0 ð Þ ¼2 3

02

!1ð Þ ¼0

ffiffiffi ð Þ¼0

¼3

0

¼ 0 þ ffiffiffi

!1

ð Þ ¼

ð

Þ ¼0

¼1

!1ð Þ¼ð Þ¼0

¼1


4 !1

À ð Þ¼

ð Þ¼ ¼" #¼ ð Þ

5 !1

½ ¼ 0 ¼ 0 1

,

!1P ð ¼ Þ ¼1 À ð À Þ

1 À ð Þ ¼

!1 ð ¼ Þ

¼ !1

À ð Þ¼ð Þ¼

À ð Þ¼ð Þ¼

" #¼1 À ð Þ 4

¼ !1À ð Þ¼ð Þ¼

¼ !1À ð Þ¼ð Þ¼0

ð Þ¼0

ð Þ¼" #¼ !1

À ð Þ¼ð Þ¼0

3

ð Þ¼ ’ 0 1

ð Þ¼ ¼ ð Þ¼0 þ ð À 0Þ ð Þ¼Ã0

0 Ã0

S

¼ !1À ð Þ¼0

ð Þ¼0 Àð À 0Þ½ð Þ ¼Ã0

ð Þ¼0( )


¼ !1À ð Þ¼0

ð Þ¼0 !À 1 2 3

¼ À 5 4

¼ !1 ð ¼ 0Þ

¼ !1

À ð Þ¼0

ð Þ¼0

À ð Þ¼0

ð Þ¼0 !¼1 À ð Þ

à , R Ã

R ð ÃÞ ¼!1ð Þ ¼0

ð ÃÞ ¼0 !2 2ð ÃÞ¼02ð Þ¼0 ð2 1Þ

2 1 Ã

1 À ð À Þ 1 À ð À ÃÃÞ

À ¼ À ÃÃ

ü Ã

3

¼ 0 þ

ffiffiffi

¼ ü 0 þÃ

ffiffiffiffiffi

Ã

ffiffiffi ¼

Ã

ffiffiffiffiffiÃ

ü

à 1 2


S R

R ð ÃÞ ¼Ã¼

à 2

¼ Ã 2

¼ !1ð Þ ¼0

ffiffiffi ð Þ¼0 ffiffiffi

ð ÃÞ¼0

ð ÃÞ ¼0 !2

¼ !1

½ ð 2

2ð Þ ¼0

½ ð ÃÞ2

2

ð ÃÞ ¼0ð2 2Þ

2 1

2 2 R

R ð ÃÞ ¼!1 ð Þ ð ÃÞ ð2 3Þ

ð Þ ¼ 0

ð Þ ¼½ ð

Þ2

¼02ð Þ¼0 ð2 4Þ 2 2

, 1 6¼ 0 ,

2 1 2

, Ã 1 2 .

R 2 2 R


þ

7 3 8

2 2 R 2 1 2 2 R

R

1954 P 1964 P S 1971 C D 1991

R

- -

C 5

ð Þ ¼ ð À Þ ð3 1Þ S

0 ¼0


S ’

2

0 ¼0

Ã

¼ ffiffiffiffiffi

¼ ffiffiffiffiffi

ð À Þþ ffiffiffiffiffi

" #

2 ¼P

¼1 ð À Þ2

ð À1Þ S !1 ð Þ ¼1 Ã

Ã

ð Ã Þ ¼ ffiffiffiffiffi

ð Ã Þ ¼ ð Þ2 ¼1

ð Ã Þ¼0 ¼ ffiffiffiffiffi

2 4 S ’ 2

ð Ã Þ ¼ 2 ð3 2Þ

S 5 4 3 1

0 ¼ ¼0

S

ð Þ ¼ ð Þ ¼ ð1 À Þ ¼ ð 0Þ ¼1 À ð0Þ


ð0Þ 3 1

¼0 ¼ 1 À ð0Þ½ ¼0

¼ 1 À ðÀÞ½ ¼0

¼ ðÞ¼0 ¼ ð0Þ ¼ ðÞ¼0 ¼0 5 ð Þ¼0 ¼ 4

ð Þ ¼4 2 ðÞ ¼2 2ðÞ ¼4 2 À1ð0 5ÞÂ Ã ð3 3Þ

S 5 7 L 1 2 ð Þ ¼ ð À Þ

0 þ

0 ¼ 0 0

V þ ¼ þ ð 2Þ þ

V þ V þ 5 7 5

ðV þ Þ ¼2

À1 ð 0Þ þ ð þ 0Þ

¼2

À1 1 À ðÀ Þ½ þZ 1

À1½1 À ðÀ À Þ ð À Þ

¼2

À1 ½1 À ðÀ Þ þZ 1

À1½1 À ðÀ À2 Þ ð Þ

ðV þ Þ ¼

2

À1

ðÀ Þ þ2

Z 1

À1

ðÀ À2 Þ ð Þ

ð Þ ¼ ð Þ S 0 ð Þ ¼ ðÀ Þ


ðV þ Þ ¼0 ¼2

ð0Þ À1 þ !

¼Z 1

À1 ð Þ ð Þ ¼Z 1

À1 2ð Þ

5 7 6 V þ 0

ð þ1Þð2 þ1Þ6 ð À1Þ

2

2 4

24½ ð0Þð À1Þ þ 2 ð À1Þ2

ð þ1Þð2 þ1Þ ð3 4Þ 3 3 3 2 R S ’

ARE(K, T*) = 4\sigma^2 {f[F^{-1}(0.5)]}^2        (3.5)

3 4 3 2 2 3 R ð þ ÃÞ

¼ !124½ ð0Þð À1Þ þ 2 ð À1Þ

2

ð þ1Þð2 þ1Þ 2

ARE(T+, T*) = 12\sigma^2 [\int_{-\infty}^{\infty} f^2(x) dx]^2        (3.6)

R

1984 R

3 4 3 3 2 3

ARE(K, T+) = {f[F^{-1}(0.5)]}^2 / (3 [\int_{-\infty}^{\infty} f^2(x) dx]^2)        (3.7)


R R Ã

1

ð 2Þ ð Þ ¼ À ð Þ ¼ ð0Þ ¼ ð2 2ÞÀ

1 2 ð Þ ¼2 2

R ð ÃÞ ¼2

2

ð Þ ¼1 À1 2 þ1 2

ð Þ ¼1 À1 2 1 2 ð0Þ ¼1 ð Þ ¼1 12 ð Ã Þ ¼12 ð Þ ¼4 R ð ÃÞ ¼1 3

3 D

ð Þ ¼2 À À ð Þ ¼2 À

ð0Þ ¼2 ð Þ ¼2 2

ð Ã Þ ¼ 2 2 ð Þ ¼ 2

R ð ÃÞ ¼2

R ½ R ð þ ÃÞ

R ½ R ð Þ R

R 3 1

R 3 1


S ’

M

R 1 R ð þ ÃÞ 0 864

L 1956 R 3 1

64%

L 1956 R ð ÃÞ 1 3

3 1

S ’

( T þ, T *), ( K , T *), ( K ,T þ) p

R ð þ ÃÞ R ð ÃÞ R ð þÞ 1 1 3 1 3

3 ¼0 955 2 ¼0 64 2 3L 2 9 ¼1 097 2 12 ¼0 82 3 4D 1 5 2 4 3
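The normal and double exponential rows can be reproduced from (3.5)-(3.7). A small check (my own sketch; the arguments are the population variance, the density at the median, and the integral of f squared):

```python
import math

def are_triple(var, f_at_median, int_f_squared):
    """ARE(T+, T*), ARE(K, T*), ARE(K, T+) from eqs. (3.5)-(3.7)."""
    are_w_t = 12 * var * int_f_squared ** 2                  # Wilcoxon vs t
    are_s_t = 4 * var * f_at_median ** 2                     # sign vs t
    are_s_w = f_at_median ** 2 / (3 * int_f_squared ** 2)    # sign vs Wilcoxon
    return are_w_t, are_s_t, are_s_w

# Standard normal: f(0) = 1/sqrt(2*pi), integral of f^2 = 1/(2*sqrt(pi))
normal = are_triple(1.0, 1 / math.sqrt(2 * math.pi),
                    1 / (2 * math.sqrt(math.pi)))

# Double exponential f(x) = 0.5*exp(-|x|): var = 2, f(0) = 1/2, int f^2 = 1/4
dexp = are_triple(2.0, 0.5, 0.25)

print([round(v, 3) for v in normal], [round(v, 3) for v in dexp])
```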


¼ À

2 3 2

2 ¼ 2

þ 2 À2 ð Þ

ðÞ 3 3 ðÞW -

ð Þ ¼ ð À Þ ð3 8Þ

0 ¼0

2 S ’

à ¼ ffiffiffiffiffiffiffiffiffiffiffiþ À

þ ¼ ffiffiffiffiffiffiffiffiþ À À þ þ

2

þ ¼P¼1ð À Þ2

þP¼1ð À Þ2

þ À2

2 S þ 1 ! 1! Ã ¼ À

ð Ã Þ ¼ ffiffiffiffiffiffiffiffiffiffið þ Þ ð Ã Þ ¼þ

2 þ 2

2 ¼1

ð Ã Þ ¼ ffiffiffiffiffiffiffiffiffiffið þ Þ

S ’


ð Ã Þ ¼2ð þ Þ ð3 9Þ M 6 6 2

ð Þ ¼ ð Þ ¼ ð À 0Þ ¼ 7 6 3

3 4

¼Z 1

À1 ð À Þ ð Þ

ð Þ¼0¼ ¼0 ¼ À Z 1

À1 2 ð Þ

0 ¼0 5 6 6 15

ð Þ ¼ ð þ þ1Þ12

ð Þ ¼12 R 1

À1 2 ð ÞÂ Ã2

þ þ1 ð3 10Þ 3 10 3 9 2 4 R M S

’ R

R M 0 864

0

M R

R M 1954


ð Þ ¼ ðÞ

ðÞ þ ðÞ ¼þ2 ð3 11Þ

ð Þ¼0 ¼

ðÞ¼0 ¼ ðÞ ¼0 ð3 12Þ

3 8 3 11

ðÞ þ ðÀ Þ ¼þ2 ð3 13ÞD 3 13

ðÞ þ ðÀ Þ À1 ¼0

¼0

¼0 ¼ þ ð3 14ÞS 3 14 3 12

ð Þ ¼0 ¼ þ ðÞ¼0 ð3 15Þ 3 13 ¼0

ðÞ þ ðÞ ¼þ2

¼ À1 ð0 5Þ

3 15

ð

¼0 ¼ þ À1

0 5ð ÞÂ Ã ð3 16Þ 6 4 5

4

ð þ Þ 2 4 3 16

ð Þ ¼4 þ À1 0 5ð ÞÈ ÉÂ Ã2

ð3 17Þ


3 10 3 17 2 4 R M

R 3 7

3 1

R M S ’

3 9 3 10 2 S S ’

ð

Z 1

À1 2 ð Þ ¼Z 1

À1ð2 2ÞÀ

1

À À 2 !¼

1

ffiffiffiffiffiffiffiffiffi2 2 ffiffiffi

12 ¼ ð2 ffiffiffi

ÞÀ1

ð Ã Þ ¼2ð þ Þ ð Þ ¼3

2ð þ þ1Þ R ð Ã Þ ¼3

1 3 2 P 13 1

M

7 3 8

S 8 3

8 3 2 7 3 8

¼ À1 þ1 ¼ À1

þ1 ¼ 3 4

ð

Þ ¼

ð

Þ þ ð1

À

Þ

ð

À Þ

½ ð Þ ¼ !1 ð Þ ¼À1½ ð Þ þ ð1 À Þ ð À Þ


7 3 8

¼Z 1

À1À1½ ð Þ þ ð1 À Þ ð À Þ ð Þ

À1½ ðÞ ¼ ðÞ ¼ð Þ

ðÞ ¼j ð Þ ¼ ½ ðÞj ð Þ

¼0 ¼Z 1À1

Àð1 À Þ 2 ð Þj À1½ ð Þ ð3 18Þ

¼0 C 7 3 8

2 ¼ ð1 À ÞZ 1

0 ½À1ðÞ2

ÀZ 1

0À1ðÞ !2( )

¼ ð1 À ÞZ 1

À1 2 j ð Þ ÀZ 1

À1 j ð Þ !2( )

¼1

À

3 18 ð Þ ð 2Þ

ð Þ ¼ À ð Þ ¼1

j À

¼0 ¼ À1 À

2 Z 1

À1

j 2½ð À Þj ½ð À Þ

¼ À1 À Z 1

À1

1j À ¼ À

1 À

ð Þ ¼ ð1 À Þ2 ¼ 2 ð3 19Þ

S ’ Ã 3 9


W -

ð Þ ¼ ð Þ ð3 20Þ

0: ¼1

à ¼ð À1ÞP¼1ð À Þ2

ð À1ÞP¼1ð À Þ2

S

ð Þ ¼2 ð Þ

ð Ã Þ ¼P¼1ð À Þ2

À1" # À1

P¼1ð À Þ

2" #¼ ðÀ1Þ ð Þ

1

P¼1ð À Þ2" #

¼ ðÀ1Þ2 ð ÞP¼1ð À Þ

2" #¼ ðÀ1Þ2

1Q

Q ð Þ ¼ð ÞQ

À1

1

Q ¼1

À12 2ðÀ1Þ2

Z 1

0 À1 À 2 ½ðÀ1Þ2 À1

¼À32

2 À12 ¼1

À3


ð Ã Þ ¼ð À1Þ2

À3 ð Ã Þ¼1¼

2ð À1ÞÀ3

à S ’ À1 À1

S 1 2

2 22ð1 þ 2 À2Þ

1ð2 À4Þð2 À2Þ2

ð Ã Þ¼1 ¼2ð À1Þ

2ð À4Þ

ð À1ÞðÀ5ÞðÀ3Þ2

Ã

ð Ã Þ ¼2ð À1ÞðÀ5Þ

À4 %2 ¼2 ð1 À Þ ð3 21Þ

M B B D S

7 3 8 3 20

ð Þ ¼ ð Þ þ ð1 À Þ ð Þ M S 9 2

0 ¼ À2

¼1 À þ1

2 2

¼ À2

0

¼ À þ1

2 2

¼

!1

ð Þ ¼ ð À0 5Þ2


0

¼Z 1

À1 ð Þ þ ð1 À Þ ðÞ À0 5½

2 ð Þ

¼1¼2ð1 À ÞZ 1

À1½ ð Þ À0 5 2

ð Þ

2 ¼ ð1 À ÞZ

1

0 ð À0 5Þ4

ÀZ 1

0 ð À0 5Þ2

!2

( )¼

1 À

180

ð Þ ¼720 ð1 À ÞZ 1

À1½ ð Þ À1

2 2 ð Þ& '2

ð3 22Þ M

ð Þ ð Þ ¼

ð

ÞZ 1

À1½ð À0 5Þ j 2ð Þ ¼Z 1

À1 ð Þj 2ð Þ À

12 ffiffiffiffiffiffi

2 Z 1

À1 j ð ffiffiffi

2 Þ

¼Z 1

À1Z À1 j ðÞj 2ð Þ

¼Z 1

À1j ðÞZ 1

1

2À 2

¼1

4 Z 1

À1j ðÞ À2

¼1

4 Z 1

À1

1

ffiffiffiffiffiffi2 À

3 2 2

¼ð4 ffiffiffi3 ÞÀ

1


ð Þ ¼15 ð1 À Þ 2 R ð Ã Þ ¼15 2 2

S 9 3 0 ð þ1Þ 0 ¼ 9 3 1

¼ þ1 À12 ¼

þ1 À

12 À

12 ¼

ð Þ ¼ À0 5

0

¼Z 1

À1 ð Þ þ ð1 À Þ ð Þ À0 5 ð Þ

¼1 ¼ ð1 À ÞZ 1

À1 ð Þ ð Þ ¼1

¼ ð1 À ÞZ 1

À1 2 ð Þ

ð Þ

¼1¼2ð1 À ÞZ 10

2 ð Þ ð3 23Þ

¼1

2 ¼ ð1 À ÞZ 1

0 À0 5 2

À Z 1

0 À0 5 2" #¼

1 À

48

ð Þ ¼j ð Þ 3 23

ð Þ ¼12 ð1 À Þ 2

R ð ÃÞ ¼6 2 R ð Þ ¼4 5


R

P 1

2 ‘‘ ’’

B 19601960 1967 B

B

K 1965 R

K 1963 1964 1965 R 1973 R 1979 116 B

1980 C 1991

1995 R

P

L 1970

7 3 8 8 2 1 ð Þ ¼ ð À Þ:

ð 2Þ ð Þ ¼½À Þð


ð Þ ¼0 À1 2 þ1 2 À1 2 1 21 1 2

8<:

ð Þ ¼ð1 2Þ 01 Àð1 2Þ À 0&

C S ’ P 13 1

P 13 1 13 2 R M S ’ :

: 3

: 1

D : 3 2

C S ’ ð Þ ¼ ð À Þ

ð Þ ¼ ð1 À À ÞÀ1

K 9 5 1

P 13 1:

7 P 13 4 R S ’ 2 128 V R

: 1 3

: 2 3

L : 3 4

D : 4 3

S 1 2 3 1 S

1

3 R

R ð 1 3Þ ¼R ð 1 2Þ R ð 2 3Þ ¼ ½ R ð 3 1ÞÀ1

1 1 0 1 0 1


C 5

Â

Â2


2 Â2 ’

M ’

S

1 2

S

S

1 Â 2 Â 3 ÂÁÁÁ 1 2

ð \ \ \ ÁÁ ÁÞ ¼ð Þ ð Þ ð ÞÁÁÁ & & &

ð \ \ \ ÁÁ ÁÞ ¼ð Þ ð \ \ÁÁÁÞ

2

Â3

Â5 30


S

 , \ :

¼

¼1 ¼¼1

11 12

¼ ð \ Þ¼1

¼1 ¼1

Y¼1

Y

¼1ð

Þ

1 2 ÁÁÁ

1 11 12 ÁÁÁ 1 1 2 21 22 ÁÁÁ 2 2

Á Á ÁÁÁ Á ÁÁ Á ÁÁÁ Á ÁÁ Á ÁÁÁ Á Á 1 2 ÁÁÁ

1 2 ÁÁÁ ¼


0 ¼ ¼1 2 ¼1 2

¼

¼1 ¼ ð Þ ¼

¼1 ¼ ð Þ

0

S 4 2

ðÀ1Þ þ ð À1Þ

0

ð1 2 1 2 Þ ¼Y¼1Y

¼1ð Þ ð2 1Þ

¼1 ¼

¼1 ¼1

¼ ¼ ¼1 2 ¼1 2


0

¼

B S 4 2

Q ¼¼1

¼1ð À Þ

2

¼¼1

¼1ð À Þ

2

ð2 2Þ 0 À1 À ðÀ1 þ À1Þ ¼ ðÀ1Þð À1Þ S

Q 2 Q 2

ðÀ1Þð À1Þ 5

1 Â 2 Â 3

0 ¼ ¼1 2 1 ¼1 2 2

¼1 2

3

¼ 2

1

¼1

2

¼1

3

¼1 ð 2 À Þ2 2

1 2 3 À1 À ð1 À1 þ 2 À1 þ 3 À1Þ ¼1 2 3 À 1 À 2 À 3 þ2

0 ¼


¼

1 2 3 À1 À ð1 À1 þ 2 3 À1Þ ¼ ð1 À1Þð2 3 À1Þ

Â

ð \ Þ S

ð \ Þ ¼ð Þ ð Þð Þ ¼

¼ ð Þ P ¼1 ¼1

À1 ð À1Þ ¼ ð Þ ¼ð Þ ¼

¼1 2

P ¼1 ¼1 0 ð Þ ¼

¼1ð À Þ

2

ð2 3Þ À1

¼ ¼1 2 ¼1 2

0 1 ¼ 2 ¼ ÁÁÁ ¼ ¼ ¼1 2

2 3

¼1 2

¼1

¼1ð À Þ

2

ð2 4Þ


0 À1 ð À1Þ 2 4

À1

0

ð1 2 Þ ¼Y¼1Y

¼1

¼Y

¼1

¼ S 2 4

2 2 ð À1Þ À ð À1Þ ¼ðÀ1Þð À1Þ B

P

1904

¼ QQ þ

1 2

ð2 5ÞQ

Q

Q

1

1 Â

¼ À1 1 2

¼ ð Þ


C

¼1 ¼1 2 0 1 ¼1

¼0 À Q 2 2

¼1ðÀ Þð0 À Þ

2

þ ð1 À Þ2

¼

¼1

Á

ðÀ

Á

Þ½

þ ðÀ

Þ ¼ð À1Þ

¼ ð À1ÞÀ þ !1 2

¼ À1 1 2

ð Þ

¼ ffiffiffiffiffiffiffiffiffiffiffiQ ð2 6Þ

Q

Q 2 2

p S 1984 ’

452 2 1


2 1 2 2

Q ¼42 250 6

B 0 001 2 5

¼ ffiffiffiffiffiffiffiffiffiffiffiffi42 250 494 25 ¼0 2924 2 6

ffiffiffiffiffiffiffiffiffiffiffiffiffi42 250 452 ¼0 3057

S X C P ’ CC

p

                0      1-15     16+     Total
0              105       7       11      123
0.01-0.10       58       5       13       76
0.11-0.99       84      37       42      163
1.00            57      16       17       90
Total          304      65       83      452
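The chi-square statistic and both association measures quoted for this table can be reproduced directly (a sketch of my own; only the cell frequencies matter, not the row and column labels):

```python
# Chi-square statistic Q, Pearson's contingency coefficient C = sqrt(Q/(Q+N)),
# and phi = sqrt(Q/N) for the 4 x 3 table of 452 observations.

obs = [
    [105,  7, 11],
    [ 58,  5, 13],
    [ 84, 37, 42],
    [ 57, 16, 17],
]
row = [sum(r) for r in obs]
col = [sum(c) for c in zip(*obs)]
N = sum(row)

# Q = sum over cells of (observed - expected)^2 / expected
Q = sum((obs[i][j] - row[i] * col[j] / N) ** 2 / (row[i] * col[j] / N)
        for i in range(len(row)) for j in range(len(col)))

C = (Q / (Q + N)) ** 0.5
phi = (Q / N) ** 0.5
print(round(Q, 2), round(C, 4), round(phi, 4))
```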


3

Â2

Q ¼

¼1

2

¼1ð À Þ

2

¼

¼1ð À Þ

2

ð1 À Þ ð3 1Þ

¼ 1 À ¼ 2 ¼

¼1

1 2 1 2

1 2 3 1

3 1 S 10 8

3 1

p Q 200

p q

                0              1-15          16+          Total
0              105 (82.7)      7 (17.7)     11 (22.6)     123
0.01-0.10       58 (51.1)      5 (10.9)     13 (14.0)      76
0.11-0.99       84 (109.6)    37 (23.4)     42 (30.0)     163
1.00            57 (60.5)     16 (12.9)     17 (16.5)      90
Total          304            65            83            452

R 125 81 40


Are the three probabilities the same? The three samples of 200 observations each contain 75, 119, and 160 successes, so the pooled estimate of the common success probability is (75 + 119 + 160)/600 = 0.59 and the expected number of successes in each sample is 200(0.59) = 118. From (3.1), Q = 74.70 with 2 degrees of freedom, and Table B shows p < 0.001, so the hypothesis of equal probabilities is rejected.
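The value Q = 74.70 can be checked quickly (a sketch of my own, summing over both the success and failure cells of each sample):

```python
# Chi-square test of homogeneity for three binomial samples of size 200.
successes = [75, 119, 160]
n = 200
p_pooled = sum(successes) / (3 * n)      # pooled estimate: 354/600 = 0.59

Q = sum((x - n * p_pooled) ** 2 / (n * p_pooled)
        + ((n - x) - n * (1 - p_pooled)) ** 2 / (n * (1 - p_pooled))
        for x in successes)
print(round(Q, 2))
```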

¼2 3 1

Q

¼ð 1 1 À 2 2Þ

2

ð1 À Þð1 1 þ1 2Þ ð3 2

Þ 3 2

Q

S 3 2 Q

Q ¼ ð 11 22 À 12 21Þ

2

1 2 1 2 ð3 3Þ K

C 11 S

S

1

2 K C 11

ð Þð Þ À À

1 1 2 2 1 2

2 1 , ¼1 2

11 22 12 21


S 11 2 37 1 2 1 2

¼ 11 22 À 12 21

ð 1 2 1 2 Þ1 2 ¼

Q 1 2

ð3 4ÞQ t 2

K 2 6

p 2 1

S 452 37 3 2 2 1

B 0 11–0 99 B 1–15

K ’ 452 Q

2 3 B 11 2 38 11 2 30

Q

Concordant pairs (each n_ij times the total frequency below and to the right):

105(5 + 13 + 37 + 42 + 16 + 17) = 13,650
7(13 + 42 + 17) = 504
58(37 + 42 + 16 + 17) = 6,496
5(42 + 17) = 295
84(16 + 17) = 2,772
37(17) = 629
Total: 24,346

Discordant pairs (each n_ij times the total frequency below and to the left):

7(58 + 84 + 57) = 1,393
11(58 + 84 + 57 + 5 + 37 + 16) = 2,827
5(84 + 57) = 705
13(84 + 57 + 37 + 16) = 2,522
37(57) = 2,109
42(57 + 16) = 3,066
Total: 12,622


T = (24,346 - 12,622) / \sqrt{[\binom{452}{2} - \binom{304}{2} - \binom{65}{2} - \binom{83}{2}][\binom{452}{2} - \binom{123}{2} - \binom{76}{2} - \binom{163}{2} - \binom{90}{2}]} = 0.1915

and the normal approximation gives

Z = 3(0.1915) \sqrt{452(451)} / \sqrt{2(904 + 5)} = 6.08

with a p value that is 0.000 to three decimal places.
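The concordant/discordant tally and the tie-corrected denominator above can be automated. A sketch of my own (the `comb(m, 2)` calls are the "m choose 2" terms in the displayed denominator):

```python
from math import comb, sqrt

obs = [
    [105,  7, 11],
    [ 58,  5, 13],
    [ 84, 37, 42],
    [ 57, 16, 17],
]
r, c = len(obs), len(obs[0])

# concordant products: partner cells below and to the right
C = sum(obs[i][j] * obs[k][l]
        for i in range(r) for j in range(c)
        for k in range(i + 1, r) for l in range(j + 1, c))
# discordant products: partner cells below and to the left
D = sum(obs[i][j] * obs[k][l]
        for i in range(r) for j in range(c)
        for k in range(i + 1, r) for l in range(j))
S = C - D

N = sum(map(sum, obs))
col = [sum(x) for x in zip(*obs)]
row = [sum(x) for x in obs]
denom = sqrt((comb(N, 2) - sum(comb(m, 2) for m in col)) *
             (comb(N, 2) - sum(comb(m, 2) for m in row)))

tau = S / denom
Z = 3 * tau * sqrt(N * (N - 1)) / sqrt(2 * (2 * N + 5))
print(C, D, round(tau, 4), round(Z, 2))
```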

2

Â2 K ’

12 6 1 Q 3 3

¼ ffiffiffiffiffiffiffiffiffiffiffiQ ¼ 2

Q 6 1 C 12

S 1 2

1 2 1 2 1 2

2 Â2 4 1

S 14 3 Q 3 2

1 1 1À 1 1

2 2 2À 2 2

1 þ 2 À 1 þ 2


¼ 1 þ 2

À ð 1 þ 2Þ ’ ’

2 Â2 1 þ 2 1

0 1 ¼ 2 ¼ 1

1 1 2

2 ð4 1Þ

4 1

1 2 Â2 2 Â2

1

1 1 2 1 ð 1 1O ¼ Þ 1O

1 2 Â2

1 1O 1

p S R


S S 3 4 ?

1 0 1 2 3 4 3 2 Â2

4 1

P = [\binom{4}{3}\binom{4}{1} + \binom{4}{4}\binom{4}{0}] / \binom{8}{4} = 0.2286 + 0.0143 = 0.2429

For comparison, the chi-square statistic (3.3) for the observed table is Q = 2.0 with 1 degree of freedom, and Table B shows that the approximate p value lies between 0.10 and 0.25, leading to the same conclusion for these 2 x 2 data.

The observed table and the more extreme table with the same margins are:

Observed            More extreme
 3   1 | 4           4   0 | 4
 1   3 | 4           0   4 | 4
 4   4 | 8           4   4 | 8
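The one-tailed exact probability is a two-term hypergeometric sum; in code (a sketch, with a helper `hyper` that is my notation, not the book's):

```python
from math import comb

def hyper(x, r1, c1, N):
    """P(first cell = x) in a 2x2 table with fixed margins r1 (row), c1 (col)."""
    return comb(r1, x) * comb(N - r1, c1 - x) / comb(N, c1)

# All margins equal 4, N = 8; observed first cell = 3, more extreme value = 4.
p = hyper(3, 4, 4, 8) + hyper(4, 4, 4, 8)
print(round(p, 4))
```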


S X C S S B

0 2429 1 3 0 2286 S X C

S X C 1 807


S 6 4 ’

1 þ 2 2

À1 2 ’

1982 S X C ’ 1 2 1 ¼ 1

2 ¼ 2 1 ¼ 2 ¼10 0 05 1 ¼0 5 2 ¼0 8 ’ 0 13 4 1

S X C 1 2

S 2 Â2

L 11 22

12 21

S X C ’


12 21

2 Â2

L

ð11 12 21 22Þ 11 þ 12 þ 21 þ 22 ¼1 11

1 ¼ 11 þ 21

1 ¼ 11 þ 12 0 1 ¼ 1 12 ¼ 21

0 1 ¼ 1

¼ ð 1 À 1Þ 1 À 1 S 1 ¼ 11 þ 12 1 ¼ 11 þ 21

¼ ð 12 À 21Þ

ð12 þ 21Þ ’ 0

1 1 6¼ 1

(n_{12} - n_{21})^2 / (n_{12} + n_{21})        (5.1)

¼ ð 12 À 21Þ

12 21 12

21 ð 12Þ ¼ 12 ð 12Þ ¼ 12ð1 À 12Þ

S 11 12 1.

21 22 2.

.1 .2


Similarly, n21 is binomially distributed with parameters n and θ21, so that

E(n21) = nθ21        var(n21) = nθ21(1 − θ21)        E(T) = n(θ12 − θ21)

and

var(T) = var(n12) + var(n21) − 2 cov(n12, n21)        (5.2)

To find cov(n12, n21), note that since (n11, n12, n21, n22) is multinomial with parameters (θ11, θ12, θ21, θ22), the joint moment generating function of n12 and n21 is

M(t1, t2) = [θ12 e^{t1} + θ21 e^{t2} + 1 − (θ12 + θ21)]^n        (5.3)

Differentiating (5.3) with respect to t1 and t2 and evaluating at t1 = t2 = 0 gives

E(n12 n21) = n(n − 1)θ12θ21

so that

cov(n12, n21) = n(n − 1)θ12θ21 − (nθ12)(nθ21) = −nθ12θ21

Substituting these results back in (5.2), we obtain

var(T) = nθ12(1 − θ12) + nθ21(1 − θ21) − 2(−nθ12θ21) = n[(θ12 + θ21) − (θ12 − θ21)²]        (5.4)

The standardized statistic is then

Z = (n12 − n21) / √{n[(θ12 + θ21) − (θ12 − θ21)²]}

Under H0 we have θ12 = θ21, so the second term in brackets vanishes and n(θ12 + θ21) is estimated by n12 + n21; the square of the resulting statistic is McNemar's Q of (5.1).

An exact test is obtained by a conditional argument. Given the total number of discordant pairs s = n12 + n21, the conditional distribution of n12 is binomial with parameters s and p = θ12/(θ12 + θ21), and under H0: θ12 = θ21 this success probability is p = 0.5.


Thus for small s an exact P value for H0 can be found by referring n12 to the binomial distribution with parameters s and 0.5 (Table C applies for s ≤ 20). For larger s the normal approximation to the binomial can be used, giving the test statistic

Z = (n12 − 0.5s) / √(0.25s) = (n12 − n21) / √(n12 + n21)        (5.5)

whose square is again McNemar's Q of (5.1). A continuity correction of 0.5 can be incorporated, as in

Z = (n12 − n21 + 0.5) / √(n12 + n21)        (5.6)

for the lower-tailed case n12 < n21.

Example 5.1  Suppose each of 100 persons is classified as favorable or unfavorable to some proposal on each of two occasions. On the first occasion 35 were favorable and 65 unfavorable; on the second occasion 55 were favorable and 45 unfavorable; 30 persons were unfavorable on both occasions. Is there evidence of a significant change in attitude between the two occasions?

These totals determine the 2×2 table below. The concordant counts are n11 = 20 and n22 = 30; the discordant counts are n12 = 15 (favorable, then unfavorable) and n21 = 35 (unfavorable, then favorable).

                          Second occasion
First occasion     Favorable    Unfavorable    Total
Favorable             20            15           35
Unfavorable           35            30           65
Total                 55            45          100


The number of discordant pairs is s = 15 + 35 = 50, and under H0 the count n12 = 15 is binomially distributed with parameters 50 and 0.5. The normal approximation (5.5) gives

Z = [15 − 50(0.5)] / √[0.25(50)] = −2.83

with one-tailed P value 0.0023. With the continuity correction in (5.6),

Z = [(15 − 35) + 0.5] / √(15 + 35) = −2.76

with one-tailed P value 0.0029. McNemar's statistic (5.1) is

Q = (35 − 15)² / (35 + 15) = 8.0

and Table B with 1 degree of freedom shows 0.001 < P < 0.005 for the corresponding two-sided test. The STATXACT solution for McNemar's test reports the exact P value as 0.0047, in close agreement with these approximations.
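The arithmetic of Example 5.1 can be verified directly. The following stdlib-only Python sketch (illustrative, not part of the text; variable names are the author's notation transliterated) reproduces the statistics of (5.1), (5.5), and (5.6) and the exact one-tailed binomial P value.

```python
from math import comb, sqrt

# Example 5.1: discordant counts from the 2x2 table of 100 paired classifications
n12, n21 = 15, 35
s = n12 + n21                                   # number of discordant pairs

# Normal approximation to the conditional binomial(s, 0.5) test, Eq. (5.5)
z = (n12 - 0.5 * s) / sqrt(0.25 * s)            # -2.83

# Continuity-corrected version, Eq. (5.6)
z_cc = (n12 - n21 + 0.5) / sqrt(n12 + n21)      # -2.76

# McNemar's chi-square statistic, Eq. (5.1), with 1 degree of freedom
q = (n12 - n21) ** 2 / (n12 + n21)              # 8.0

# Exact one-tailed P value: P(X <= 15) for X ~ binomial(50, 0.5)
p_exact = sum(comb(s, k) for k in range(n12 + 1)) / 2 ** s
```

The exact one-tailed value comes out near 0.0033, of the same order as the two-sided exact value 0.0047 quoted for this example.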


M ’

L 1993 L 1992 S X C

M ’

C 1 2 S 4 2

P 4 1 4 3 4 5 4 27 4 31 4 32

Â

Suppose now that observations from two independent populations are each classified into one of k mutually exclusive categories, with 0 < θij < 1 denoting the probability that an observation from population i falls in category j. The null hypothesis of homogeneity is

H0: θ1j = θ2j        for j = 1, 2, …, k

The data form a 2×k contingency table with row totals n1. and n2., column totals n.1, n.2, …, n.k, and total sample size N = n1. + n2.. Under H0 the expected frequency of cell (i, j) is estimated by eij = ni. n.j / N, and the test statistic is

Q = Σ_{i=1}^{2} Σ_{j=1}^{k} (nij − eij)² / eij        (6.1)

which is approximately chi-square distributed with k − 1 degrees of freedom under H0; H0 is rejected for large values of Q. The data layout is as follows:

             1      2     ⋯      k     Total
Sample 1    n11    n12    ⋯     n1k     n1.
Sample 2    n21    n22    ⋯     n2k     n2.
Total       n.1    n.2    ⋯     n.k      N


Example 6.1  Independent random samples of 50 persons were taken in each of two cities, 100 persons in all, and each respondent classified the change in the cost of living over the past year as "Decreased," "Remained the same," or "Rose." Do the two cities have the same distribution of responses?

The null hypothesis is H0: θ1j = θ2j for j = 1, 2, 3. The observed frequencies, the expected frequencies eij = ni. n.j / 100, and the contributions (nij − eij)²/eij to Q are as follows:

City 1   observed 12, 15, 23 (total 50); expected 13.5, 15, 21.5; contributions 0.17, 0, 0.10
City 2   observed 15, 15, 20 (total 50); expected 13.5, 15, 21.5; contributions 0.17, 0, 0.10
Total    27, 30, 43 (total 100)

Adding the contributions gives

Q = 0.17 + 0 + 0.10 + 0.17 + 0 + 0.10 = 0.54

With k − 1 = 2 degrees of freedom, Table B shows P > 0.50, so the data give no reason to reject the hypothesis of homogeneous responses in the two cities.
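The expected frequencies and the value Q = 0.54 can be checked with a short stdlib-only Python sketch (illustrative, not from the text):

```python
# Observed 2x3 table: two samples of 50, three response categories
obs = [[12, 15, 23],
       [15, 15, 20]]

row_tot = [sum(r) for r in obs]                  # [50, 50]
col_tot = [sum(c) for c in zip(*obs)]            # [27, 30, 43]
n = sum(row_tot)                                 # 100

# Expected frequencies under homogeneity: e_ij = (row total)(column total)/n
exp = [[r * c / n for c in col_tot] for r in row_tot]   # each row: [13.5, 15.0, 21.5]

# Chi-square statistic of Eq. (6.1), with k - 1 = 2 degrees of freedom
q = sum((obs[i][j] - exp[i][j]) ** 2 / exp[i][j]
        for i in range(2) for j in range(3))     # about 0.54
```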


When the k categories are ordered, a test that takes the ordering into account, such as the Wilcoxon rank-sum test with midranks (see Section 8.2), is generally preferable to the chi-square test of (6.1).

Example 6.2  Two independent samples of 10 observations each are classified into three ordered categories; the combined counts are 6 in the lowest category, 8 in the middle category, and 6 in the highest:

             Lowest    Middle    Highest    Total
Sample 1        2         3          5        10
Sample 2        4         5          1        10
Total           6         8          6        20

All observations within a category are tied, so each is assigned the midrank of its category:

(1 + 2 + 3 + 4 + 5 + 6)/6 = 21/6 = 3.5 for the lowest category
(7 + 8 + ⋯ + 14)/8 = 84/8 = 10.5 for the middle category
(15 + 16 + ⋯ + 20)/6 = 105/6 = 17.5 for the highest category


The rank sum for sample 1 is then

W = 2(3.5) + 3(10.5) + 5(17.5) = 126

with m = 10, n = 10, N = 20. The null mean of W is m(N + 1)/2 = 105, and the null variance, corrected for the three groups of ties, is 154.74, so the normal approximation gives

Z = (126 − 105)/√154.74 = 1.688

with approximate one-tailed P value 0.046; with a continuity correction, Z = 1.648 with P value 0.050, so the result appears significant at the 0.05 level. The STATXACT exact P value, however, is 0.0785, and H0 cannot be rejected at the 0.05 level [see also Lehmann (1975), p. 384].
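The midranks and the normal approximation can be reproduced with the following stdlib-only Python sketch (illustrative, not from the text); the variance formula is the standard tie-corrected null variance of the rank-sum statistic.

```python
from math import sqrt

t = [6, 8, 6]                  # tied-group (category) sizes; N = 20
m, n2, N = 10, 10, 20          # sample sizes
sample1 = [2, 3, 5]            # sample 1 counts per category

# Midrank of each category: average of the consecutive ranks it occupies
midranks, start = [], 1
for size in t:
    midranks.append((2 * start + size - 1) / 2)
    start += size                                  # midranks = [3.5, 10.5, 17.5]

w = sum(c * r for c, r in zip(sample1, midranks))  # W = 126

mean_w = m * (N + 1) / 2                           # 105
# Null variance of W corrected for ties
var_w = (m * n2 * (N + 1) / 12
         - m * n2 * sum(s ** 3 - s for s in t) / (12 * N * (N - 1)))

z = (w - mean_w) / sqrt(var_w)                     # 1.688
z_cc = (w - mean_w - 0.5) / sqrt(var_w)            # 1.648 with continuity correction
```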

¼P

1

D


K 1987 S X C 5 0

:

ðÞ M

8 : D Q

?

28

1

100 D 20 80

B

5 8 11

B

30 35 40

B

25 17 9

D 70 60 80S 400 300 300

8


100 20 5 80 10

D

200 : 100 ‘‘D

?’’ 200 90

7

1500

69

30

1400

5 4 9 13 6 19

18 10 28

8

($1 )

5 5 – O

D 7 10 8 25D 10 18 9 37

17 28 17 62


8 ’ S

2 C

100

ðÞ

ðÞ C

ðÞ C K ’

ðÞ C K

135 S S

135 S

37 30 37

B 11

6 ?

7

4 135 D

686

(1 ¼ )

1 3

Under 25        8    7    5     20
25–39          12    8   20     40
40 and over    20   15    5     40
Total          40   30   30    100


Z B

P P

P 14 8

D 2 1

S 2 2

Â

7 V 3 1 3 2 3 3

3 6 32 16B 23 30 70 56C 48 51 67 29D 49 45 27 6

71 44 17 2


APPENDIX OF TABLES

Table A  Normal Distribution  554
Table B  Chi-Square Distribution  555
Table C  Cumulative Binomial Distribution  556
Table D  Total Number of Runs Distribution  568
Table E  Runs Up and Down Distribution  573
Table F  Kolmogorov-Smirnov One-Sample Statistic  576
Table G  Binomial Distribution for θ = 0.5  577
Table H  Probabilities for the Wilcoxon Signed-Rank Statistic  578
Table I  Kolmogorov-Smirnov Two-Sample Statistic  581
Table J  Probabilities for the Wilcoxon Rank-Sum Statistic  584
Table K  Kruskal-Wallis Test Statistic  592
Table L  Kendall's Tau Statistic  593
Table M  Spearman's Coefficient of Rank Correlation  595
Table N  Friedman's Analysis-of-Variance Statistic and Kendall's Coefficient of Concordance  598
Table O  Lilliefors's Test for Normal Distribution Critical Values  599
Table P  Significance Points of T_XY.Z for Kendall's Partial Rank-Correlation Coefficient  600
Table Q  Page's L Statistic  601
Table R  Critical Values and Associated Probabilities for the Jonckheere-Terpstra Test  602
Table S  Rank von Neumann Statistic  607
Table T  Lilliefors's Test for Exponential Distribution Critical Values  610

C J L Q J D

# 1976 1985 1997 S P S 13224 2144 P


TABLE B  CHI-SQUARE DISTRIBUTION

Each table entry is the value Q* such that a chi-square random variable with ν degrees of freedom exceeds Q* with the right-tail probability given in the column heading.

ν     .95    .90    .50    .25    .10    .05    .01    .005   .001

1    0.004  0.016   0.45   1.32   2.71   3.84   6.63   7.88  10.83
2     0.10   0.21   1.39   2.77   4.61   5.99   9.21  10.60  13.82
3     0.35   0.58   2.37   4.11   6.25   7.81  11.34  12.84  16.27
4     0.71   1.06   3.36   5.39   7.78   9.49  13.28  14.86  18.47
5     1.15   1.61   4.35   6.63   9.24  11.07  15.09  16.75  20.52
6     1.64   2.20   5.35   7.84  10.64  12.59  16.81  18.55  22.46
7     2.17   2.83   6.35   9.04  12.02  14.07  18.48  20.28  24.32
8     2.73   3.49   7.34  10.22  13.36  15.51  20.09  21.96  26.12
9     3.33   4.17   8.34  11.39  14.68  16.92  21.67  23.59  27.88
10    3.94   4.87   9.34  12.55  15.99  18.31  23.21  25.19  29.59
11    4.57   5.58  10.34  13.70  17.28  19.68  24.72  26.76  31.26
12    5.23   6.30  11.34  14.85  18.55  21.03  26.22  28.30  32.91
13    5.89   7.04  12.34  15.98  19.81  22.36  27.69  29.82  34.53
14    6.57   7.79  13.34  17.12  21.06  23.68  29.14  31.32  36.12
15    7.26   8.55  14.34  18.25  22.31  25.00  30.58  32.80  37.70
16    7.96   9.31  15.34  19.37  23.54  26.30  32.00  34.27  39.25
17    8.67  10.09  16.34  20.49  24.77  27.59  33.41  35.72  40.79
18    9.39  10.86  17.34  21.60  25.99  28.87  34.81  37.16  42.31
19   10.12  11.65  18.34  22.72  27.20  30.14  36.19  38.58  43.82
20   10.85  12.44  19.34  23.83  28.41  31.41  37.57  40.00  45.32
21   11.59  13.24  20.34  24.93  29.62  32.67  38.93  41.40  46.80
22   12.34  14.04  21.34  26.04  30.81  33.92  40.29  42.80  48.27
23   13.09  14.85  22.34  27.14  32.01  35.17  41.64  44.18  49.73
24   13.85  15.66  23.34  28.24  33.20  36.42  42.98  45.56  51.18
25   14.61  16.47  24.34  29.34  34.38  37.65  44.31  46.93  52.62
26   15.38  17.29  25.34  30.43  35.56  38.89  45.64  48.29  54.05
27   16.15  18.11  26.34  31.53  36.74  40.11  46.96  49.64  55.48
28   16.93  18.94  27.34  32.62  37.92  41.34  48.28  50.99  56.89
29   17.71  19.77  28.34  33.71  39.09  42.56  49.59  52.34  58.30
30   18.49  20.60  29.34  34.80  40.26  43.77  50.89  53.67  59.70

For ν > 30, Q′ = √(2Q) − √(2ν − 1) is approximately standard normally distributed, where Q is chi-square distributed with ν degrees of freedom.

Source: Adapted from Table 8 of E. S. Pearson and H. O. Hartley, eds. (1954), Biometrika Tables for Statisticians, Vol. 1, Cambridge University Press, Cambridge, with permission of the Biometrika Trustees.
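The large-ν normal approximation quoted with Table B can be checked numerically; the stdlib-only Python sketch below (illustrative, not part of the table's source) compares the ν = 30, right-tail 0.05 entry with the standard normal 0.05 point, 1.645.

```python
from math import sqrt

nu = 30
q_table = 43.77                # Table B entry for nu = 30, right-tail 0.05

# Q' = sqrt(2Q) - sqrt(2*nu - 1) should lie near the normal 0.05 point, 1.645
q_prime = sqrt(2 * q_table) - sqrt(2 * nu - 1)
```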


TABLE C  CUMULATIVE BINOMIAL DISTRIBUTION

Each entry is P(X ≤ x) for X a binomial random variable with parameters n and θ, given to four decimal places with the leading decimal point omitted.

n  x    .05    .10    .15    .20    .25    .30    .35    .40    .45

1 0 9500 9000 8500 8000 7500 7000 6500 6000 55001 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000

2 0 9025 8100 7225 6400 5625 4900 4225 3600 30251 9975 9900 9775 9600 9375 9100 8775 8400 79752 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000

3 0 8574 7290 6141 5120 4219 3430 2746 2160 16641 9928 9720 9392 8960 8438 7840 7182 6480 57482 9999 9990 9966 9920 9844 9730 9561 9360 90893 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000

4 0 8145 6561 5220 4096 3164 2401 1785 1296 09151 9860 9477 8905 8192 7383 6517 5630 4752 39102 9995 9963 9880 9728 9492 9163 8735 8208 75853 1 0000 9999 9995 9984 9961 9919 9850 9744 95904 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000

5 0 7738 5905 4437 3277 2373 1681 1160 0778 05031 9774 9185 8352 7373 6328 5282 4284 3370 25622 9988 9914 9734 9421 8965 8369 7648 6826 59313 1 0000 9995 9978 9933 9844 9692 9460 9130 86884 1 0000 1 0000 9999 9997 9990 9976 9947 9898 98155 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000

6 0 7351 5314 3771 2621 1780 1176 0754 0467 0277

1 9672 8857 7765 6554 5339 4202 3191 2333 16362 9978 9842 9527 9011 8306 7443 6471 5443 44153 9999 9987 9941 9830 9624 9295 8826 8208 74474 1 0000 9999 9996 9984 9954 9891 9777 9590 93085 1 0000 1 0000 1 0000 9999 9998 9993 9982 9959 99176 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000

7 0 6983 4783 3206 2097 1335 0824 0490 0280 01521 9556 8503 7166 5767 4449 3294 2338 1586 10242 9962 9743 9262 8520 7564 6471 5323 4199 31643 9998 9973 9879 9667 9294 8740 8002 7102 60834 1 0000 9998 9988 9953 9871 9712 9444 9037 84715 1 0000 1 0000 9999 9996 9987 9962 9910 9812 96436 1 0000 1 0000 1 0000 1 0000 9999 9998 9994 9984 99637 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000


TABLE C (Continued)

n  x    .50    .55    .60    .65    .70    .75    .80    .85    .90    .95

1 0 5000 4500 4000 3500 3000 2500 2000 1500 1000 05001 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000

2 0 2500 2025 1600 1225 0900 0625 0400 0225 0100 00251 7500 6975 6400 5775 5100 4375 3600 2775 1900 09752 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000

3 0 1250 0911 0640 0429 0270 0156 0080 0034 0010 00011 5000 4252 3520 2818 2160 1562 1040 0608 0280 00722 8750 8336 7840 7254 6570 5781 4880 3859 2710 14263 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000

4 0 0625 0410 0256 0150 0081 0039 0016 0005 0001 00001 3125 2415 1792 1265 0837 0508 0272 0120 0037 00052 6875 6090 5248 4370 3483 2617 1808 1095 0523 01403 9375 9085 8704 8215 7599 6836 5904 4780 3439 18554 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000

5 0 0312 0185 0102 0053 0024 0010 0003 0001 0000 00001 1875 1312 0870 0540 0308 0156 0067 0022 0005 00002 5000 4069 3174 2352 1631 1035 0579 0266 0086 00123 8125 7438 6630 5716 4718 3672 2627 1648 0815 02264 9688 9497 9222 8840 8319 7627 6723 5563 4095 22625 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000

6 0 0156 0083 0041 0018 0007 0002 0001 0000 0000 00001 1094 0692 0410 0223 0109 0046 0016 0004 0001 00002 3438 2553 1792 1174 0705 0376 0170 0059 0013 0001

3 6562 5585 4557 3529 2557 1694 0989 0473 0158 00224 8906 8364 7667 6809 5798 4661 3446 2235 1143 03285 9844 9723 9533 9246 8824 8220 7379 6229 4686 26496 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000

7 0 0078 0037 0016 0006 0002 0001 0000 0000 0000 00001 0625 0357 0188 0090 0038 0013 0004 0001 0000 00002 2266 1529 0963 0556 0288 0129 0047 0012 0002 00003 5000 3917 2898 1998 1260 0706 0333 0121 0027 00024 7734 6836 5801 4677 3529 2436 1480 0738 0257 00385 9375 8976 8414 7662 6706 5551 4233 2834 1497 04446 9922 9848 9720 9510 9176 8665 7903 6794 5217 30177 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000

7

Page 589: Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

8/15/2019 Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

http://slidepdf.com/reader/full/gibbons-chakraboti-nonparametric-statistical-inference-fourth-edition 589/675

TABLE C (Continued)

n  x    .05    .10    .15    .20    .25    .30    .35    .40    .45

8 0 6634 4305 2725 1678 1001 0576 0319 0168 00841 9428 8131 6572 5033 3671 2553 1691 1064 06322 9942 9619 8948 7969 6785 5518 4278 3154 22013 9996 9950 9786 9437 8862 8059 7064 5941 47704 1 0000 9996 9971 9896 9727 9420 8939 8263 73965 1 0000 1 0000 9998 9988 9958 9887 9747 9502 91156 1 0000 1 0000 1 0000 9999 9996 9987 9964 9915 98197 1 0000 1 0000 1 0000 1 0000 1 0000 9999 9998 9993 99838 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000

9 0 6302 3874 2316 1342 0751 0404 0207 0101 00461 9288 7748 5995 4362 3003 1960 1211 0705 03852 9916 9470 8591 7382 6007 4628 3373 2318 14953 9994 9917 9661 9144 8343 7297 6089 4826 36144 1 0000 9991 9944 9804 9511 9012 8283 7334 62145 1 0000 9999 9994 9969 9900 9747 9464 9006 83426 1 0000 1 0000 1 0000 9997 9987 9957 9888 9750 95027 1 0000 1 0000 1 0000 1 0000 9999 9996 9986 9962 99098 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 9999 9997 99929 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000

10 0 5987 3487 1969 1074 0563 0282 0135 0060 00251 9139 7361 5443 3758 2440 1493 0860 0464 02332 9885 9298 8202 6778 5256 3828 2616 1673 09963 9990 9872 9500 8791 7759 6496 5138 3823 26604 9999 9984 9901 9672 9219 8497 7515 6331 5044

5 1 0000 9999 9986 9936 9803 9527 9051 8338 73846 1 0000 1 0000 9999 9991 9965 9894 9740 9452 89807 1 0000 1 0000 1 0000 9999 9996 9984 9952 9877 97268 1 0000 1 0000 1 0000 1 0000 1 0000 9999 9995 9983 99559 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 9999 9997

10 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 000011 0 5688 3138 1673 0859 0422 0198 0088 0036 0014

1 8981 6974 4922 3221 1971 1130 0606 0302 01392 9848 9104 7788 6174 4552 3127 2001 1189 06523 9984 9815 9306 8389 7133 5696 4256 2963 19114 9999 9972 9841 9496 8854 7897 6683 5328 39715 1 0000 9997 9973 9883 9657 9218 8513 7535 63316 1 0000 1 0000 9997 9980 9924 9784 9499 9006 82627 1 0000 1 0000 1 0000 9998 9988 9957 9878 9707 93908 1 0000 1 0000 1 0000 1 0000 9999 9994 9980 9941 9852

9 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 9998 9993 997810 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 999811 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000

8

Page 590: Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

8/15/2019 Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

http://slidepdf.com/reader/full/gibbons-chakraboti-nonparametric-statistical-inference-fourth-edition 590/675

TABLE C (Continued)

n  x    .50    .55    .60    .65    .70    .75    .80    .85    .90    .95

8 0 0039 0017 0007 0002 0001 0000 0000 0000 0000 00001 0352 0181 0085 0036 0013 0004 0001 0000 0000 00002 1445 0885 0498 0253 0113 0042 0012 0002 0000 00003 3633 2604 1737 1061 0580 0273 0104 0029 0004 00004 6367 5230 4059 2936 1941 1138 0563 0214 0050 00045 8555 7799 6846 5722 4482 3215 2031 1052 0381 00586 9648 9368 8936 8309 7447 6329 4967 3428 1869 05727 9961 9916 9832 9681 9424 8999 8322 7275 5695 33668 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000

9 0 0020 0008 0003 0001 0000 0000 0000 0000 0000 00001 0195 0091 0038 0014 0004 0001 0000 0000 0000 00002 0898 0498 0250 0112 0043 0013 0003 0000 0000 00003 2539 1658 0994 0536 0253 0100 0031 0006 0001 00004 5000 3786 2666 1717 0988 0489 0196 0056 0009 00005 7461 6386 5174 3911 2703 1657 0856 0339 0083 00066 9102 8505 7682 6627 5372 3993 2618 1409 0530 00847 9805 9615 9295 8789 8040 6997 5638 4005 2252 07128 9980 9954 9899 9793 9596 9249 8658 7684 6126 36989 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000

10 0 0010 0003 0001 0000 0000 0000 0000 0000 0000 00001 0107 0045 0017 0005 0001 0000 0000 0000 0000 00002 0547 0274 0123 0048 0016 0004 0001 0000 0000 00003 1719 1020 0548 0260 0106 0035 0009 0001 0000 00004 3770 2616 1662 0949 0473 0197 0064 0014 0001 0000

5 6230 4956 3669 2485 1503 0781 0328 0099 0016 00016 8281 7340 6177 4862 3504 2241 1209 0500 0128 00107 9453 9004 8327 7384 6172 4744 3222 1798 0702 01158 9893 9767 9536 9140 8507 7560 6242 4557 2639 08619 9990 9975 9940 9865 9718 9437 8926 8031 6513 4013

10 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 000011 0 0005 0002 0000 0000 0000 0000 0000 0000 0000 0000

1 0059 0022 0007 0002 0000 0000 0000 0000 0000 00002 0327 0148 0059 0020 0006 0001 0000 0000 0000 00003 1133 0610 0293 0122 0043 0012 0002 0000 0000 00004 2744 1738 0994 0501 0216 0076 0020 0003 0000 00005 5000 3669 2465 1487 0782 0343 0117 0027 0003 00006 7256 6029 4672 3317 2103 1146 0504 0159 0028 00017 8867 8089 7037 5744 4304 2867 1611 0694 0185 00168 9673 9348 8811 7999 6873 5448 3826 2212 0896 0152

9 9941 9861 9698 9394 8870 8029 6779 5078 3026 101910 9995 9986 9964 9912 9802 9578 9141 8327 6862 431211 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000


TABLE C (Continued)

n  x    .05    .10    .15    .20    .25    .30    .35    .40    .45

12 0 5404 2824 1422 0687 0317 0138 0057 0022 00081 8816 6590 4435 2749 1584 0850 0424 0196 00832 9804 8891 7358 5583 3907 2528 1513 0834 04213 9978 9744 9078 7946 6488 4925 3467 2253 13454 9998 9957 9761 9274 8424 7237 5833 4382 30445 1 0000 9995 9954 9806 9456 8822 7873 6652 52696 1 0000 9999 9993 9961 9857 9614 9154 8418 73937 1 0000 1 0000 9999 9994 9972 9905 9745 9427 88838 1 0000 1 0000 1 0000 9999 9996 9983 9944 9847 9644

9 1 0000 1 0000 1 0000 1 0000 1 0000 9998 9992 9972 992110 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 9999 9997 998911 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 999912 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000

13 0 5133 2542 1209 0550 0238 0097 0037 0013 00041 8646 6213 3983 2336 1267 0637 0296 0126 00492 9755 8661 7296 5017 3326 2025 1132 0579 02693 9969 9658 9033 7473 5843 4206 2783 1686 09294 9997 9935 9740 9009 7940 6543 5005 3530 22795 1 0000 9991 9947 9700 9198 8346 7159 5744 42686 1 0000 9999 9987 9930 9757 9376 8705 7712 64377 1 0000 1 0000 9998 9988 9944 9818 9538 9023 82128 1 0000 1 0000 1 0000 9998 9990 9960 9874 9679 93029 1 0000 1 0000 1 0000 1 0000 9999 9993 9975 9922 9797

10 1 0000 1 0000 1 0000 1 0000 1 0000 9999 9997 9987 995911 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 9999 999512 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 000013 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000

14 0 4877 2288 1028 0440 0178 0068 0024 0008 00021 8470 5846 3567 1979 1010 0475 0205 0081 00292 9699 8416 6479 4481 2811 1608 0839 0398 01703 9958 9559 8535 6982 5213 3552 2205 1243 06324 9996 9908 9533 8702 7415 5842 4227 2793 16725 1 0000 9985 9885 9561 8883 7805 6405 4859 33736 1 0000 9998 9978 9884 9617 9067 8164 6925 54617 1 0000 1 0000 9997 9976 9897 9685 9247 8499 74148 1 0000 1 0000 1 0000 9996 9978 9917 9757 9417 88119 1 0000 1 0000 1 0000 1 0000 9997 9983 9940 9825 9574

10 1 0000 1 0000 1 0000 1 0000 1 0000 9998 9989 9961 988611 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 9999 9994 997812 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 9999 999713 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 000014 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000


TABLE C (Continued)

n  x    .50    .55    .60    .65    .70    .75    .80    .85    .90    .95

12 0 0002 0001 0000 0000 0000 0000 0000 0000 0000 00001 0032 0011 0003 0001 0000 0000 0000 0000 0000 00002 0193 0079 0028 0008 0002 0000 0000 0000 0000 00003 0730 0356 0153 0056 0017 0004 0001 0000 0000 00004 1938 1117 0573 0255 0095 0028 0006 0001 0000 00005 3872 2607 1582 0846 0386 0143 0039 0007 0001 00006 6128 4731 3348 2127 1178 0544 0194 0046 0005 00007 8062 6956 5618 4167 2763 1576 0726 0239 0043 00028 9270 8655 7747 6533 5075 3512 2054 0922 0256 0022

9 9807 9579 9166 8487 7472 6093 4417 2642 1109 019610 9968 9917 9804 9576 9150 8416 7251 5565 3410 118411 9998 9992 9978 9943 9862 9683 9313 8578 7176 459612 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000

13 0 0001 0000 0000 0000 0000 0000 0000 0000 0000 00001 0017 0005 0001 0000 0000 0000 0000 0000 0000 00002 0112 0041 0013 0003 0001 0000 0000 0000 0000 00003 0461 0203 0078 0025 0007 0001 0000 0000 0000 00004 1334 0698 0321 0126 0040 0010 0002 0000 0000 00005 2905 1788 0977 0462 0182 0056 0012 0002 0000 00006 5000 3563 2288 1295 0624 0243 0070 0013 0001 00007 7095 5732 4256 2841 1654 0802 0300 0053 0009 00008 8666 7721 6470 4995 3457 2060 0991 0260 0065 00039 9539 9071 8314 7217 5794 4157 2527 0967 0342 0031

10 9888 9731 9421 8868 7975 6674 4983 2704 1339 024511 9983 9951 9874 9704 9363 8733 7664 6017 3787 135412 9999 9996 9987 9963 9903 9762 9450 8791 7458 486713 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000

14 0 0000 0000 0000 0000 0000 0000 0000 0000 0000 00001 0009 0003 0001 0000 0000 0000 0000 0000 0000 00002 0065 0022 0006 0001 0000 0000 0000 0000 0000 00003 0287 0114 0039 0011 0002 0000 0000 0000 0000 00004 0898 0462 0175 0060 0017 0003 0000 0000 0000 00005 2120 1189 0583 0243 0083 0022 0004 0000 0000 00006 3953 2586 1501 0753 0315 0103 0024 0003 0000 00007 6047 4539 3075 1836 0933 0383 0116 0022 0002 00008 7880 6627 5141 3595 2195 1117 0439 0115 0015 00009 9102 8328 7207 5773 4158 2585 1298 0467 0092 0004

10 9713 9368 8757 7795 6448 4787 3018 1465 0441 004211 9935 9830 9602 9161 8392 7189 5519 3521 1584 030112 9991 9971 9919 9795 9525 8990 8021 6433 4154 153013 9999 9998 9992 9976 9932 9822 9560 8972 7712 512314 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000


TABLE C (Continued)

n  x    .05    .10    .15    .20    .25    .30    .35    .40    .45

15 0 4633 2059 0874 0352 0134 0047 0016 0005 00011 8290 5490 3186 1671 0802 0353 0142 0052 00172 9638 8159 6042 3980 2361 1268 0617 0271 01073 9945 9444 8227 6482 4613 2969 1727 0905 04244 9994 9873 9383 8358 6865 5155 3519 2173 12045 9999 9978 9832 9389 8516 7216 5643 4032 26086 1 0000 9997 9964 9819 9434 8689 7548 6098 45227 1 0000 1 0000 9994 9958 9927 9500 8868 7869 65358 1 0000 1 0000 9999 9992 9958 9848 9578 9050 8182

9 1 0000 1 0000 1 0000 9999 9992 9963 9876 9662 923110 1 0000 1 0000 1 0000 1 0000 9999 9993 9972 9907 974511 1 0000 1 0000 1 0000 1 0000 1 0000 9999 9995 9981 993712 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 9999 9997 998913 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 999914 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 000015 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000

16 0 4401 1853 0743 0281 0100 0033 0010 0003 00011 8108 5147 2839 1407 0635 0261 0098 0033 00102 9571 7892 5614 3518 1971 0994 0451 0183 00663 9930 9316 7899 5981 4050 2459 1339 0651 02814 9991 9830 9209 7982 6302 4499 2892 1666 08535 9999 9967 9765 9183 8103 6598 4900 3288 19766 1 0000 9995 9944 9733 9204 8247 6881 5272 36607 1 0000 9999 9989 9930 9729 9256 8406 7161 56298 1 0000 1 0000 9998 9985 9925 9743 9329 8577 74419 1 0000 1 0000 1 0000 9998 9984 9929 9771 9417 8759

10 1 0000 1 0000 1 0000 1 0000 9997 9984 9938 9809 951411 1 0000 1 0000 1 0000 1 0000 1 0000 9997 9987 9951 985112 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 9998 9991 996513 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 9991 999414 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 999915 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 000016 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000


TABLE C (Continued)

n  x    .50    .55    .60    .65    .70    .75    .80    .85    .90    .95

15 0 0000 0000 0000 0000 0000 0000 0000 0000 0000 00001 0005 0001 0000 0000 0000 0000 0000 0000 0000 00002 0037 0011 0003 0001 0000 0000 0000 0000 0000 00003 0176 0063 0019 0005 0001 0000 0000 0000 0000 00004 0592 0255 0093 0028 0007 0001 0000 0000 0000 00005 1509 0769 0338 0124 0037 0008 0001 0000 0000 00006 3036 1818 0950 0422 0152 0042 0008 0001 0000 00007 5000 3465 2131 1132 0500 0173 0042 0006 0000 00008 6964 5478 3902 2452 1311 0566 0181 0036 0003 0000

9 8491 7392 5968 4357 2784 1484 0611 0168 0022 000110 9408 8796 7827 6481 4845 3135 1642 0617 0127 000611 9824 9576 9095 8273 7031 5387 3518 1773 0556 005512 9963 9893 9729 9383 8732 7639 6020 3958 1841 036213 9995 9983 9948 9858 9647 9198 8329 6814 4510 171014 1 0000 9999 9995 9984 9953 9866 9648 9126 7941 536715 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 000 1 0000 1 0000 1 0000

16 0 0000 0000 0000 0000 0000 0000 0000 0000 0000 00001 0003 0001 0000 0000 0000 0000 0000 0000 0000 00002 0021 0006 0001 0000 0000 0000 0000 0000 0000 00003 0106 0035 0009 0002 0000 0000 0000 0000 0000 00004 0384 0149 0049 0013 0003 0000 0000 0000 0000 00005 1051 0486 0191 0062 0016 0003 0000 0000 0000 00006 2272 1241 0583 0229 0071 0016 0002 0000 0000 00007 4018 2559 1423 0671 0257 0075 0015 0002 0000 00008 5982 4371 2839 1594 0744 0271 0070 0011 0001 00009 7728 6340 4728 3119 1753 0796 0267 0056 0005 0000

10 8949 8024 6712 5100 3402 1897 0817 0235 0033 000111 9616 9147 8334 7108 5501 3698 2018 0791 0170 000912 9894 9719 9349 8661 7541 5950 4019 2101 0684 007013 9979 9934 9817 9549 9006 8729 6482 4386 2108 042914 9997 9990 9967 9902 9739 9365 8593 7161 4853 189215 1 0000 9999 9997 9990 9967 9900 9719 9257 8147 559916 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000


TABLE C (Continued)

n  x    .05    .10    .15    .20    .25    .30    .35    .40    .45

17 0 4181 1668 0631 0225 0075 0023 0007 0002 00001 7922 4818 2525 1182 0501 0193 0067 0021 00062 9497 7618 5198 3096 1637 0774 0327 0123 00413 9912 9174 7556 5489 3530 2019 1028 0464 01844 9988 9779 9013 7582 5739 3887 2348 1260 05965 9999 9953 9681 8943 7653 5968 4197 2639 14716 1 0000 9992 9917 9623 8929 7752 6188 4478 29027 1 0000 9999 9983 9891 9598 8954 7872 6405 47438 1 0000 1 0000 9997 9974 9876 9597 9006 8011 6626

9 1 0000 1 0000 1 0000 9995 9969 9873 9617 9081 816610 1 0000 1 0000 1 0000 9999 9994 9968 9880 9652 917411 1 0000 1 0000 1 0000 1 0000 9999 9993 9970 9894 969912 1 0000 1 0000 1 0000 1 0000 1 0000 9999 9994 9975 991413 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 9999 9995 998114 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 9999 999715 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 000016 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 000017 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000

18 0 3972 1501 0536 0180 0056 0016 0004 0001 00001 7735 4503 2241 0991 0395 0142 0046 0013 00032 9419 7338 4797 2713 1353 0600 0236 0082 00253 9891 9018 7202 5010 3057 1646 0783 0328 01204 9985 9718 8794 7164 5187 3327 1886 0942 04115 9998 9936 9581 8671 7175 5344 3550 2088 10776 1 0000 9988 9882 9487 8610 7217 5491 3743 22587 1 0000 9998 9973 9837 9431 8593 7283 5634 39158 1 0000 1 0000 9995 9957 9807 9404 8609 7368 57789 1 0000 1 0000 9999 9991 9946 9790 9403 8653 7473

10 1 0000 1 0000 1 0000 9998 9988 9939 9788 9424 872011 1 0000 1 0000 1 0000 1 0000 9998 9986 9938 9797 946312 1 0000 1 0000 1 0000 1 0000 1 0000 9997 9986 9942 981713 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 9997 9987 995114 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 9998 999015 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 999916 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 000017 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 000018 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000


TABLE C (Continued)

n  x    .50    .55    .60    .65    .70    .75    .80    .85    .90    .95

17 0 0000 0000 0000 0000 0000 0000 0000 0000 0000 00001 0001 0000 0000 0000 0000 0000 0000 0000 0000 00002 0012 0003 0001 0000 0000 0000 0000 0000 0000 00003 0064 0019 0005 0001 0000 0000 0000 0000 0000 00004 0245 0086 0025 0006 0001 0000 0000 0000 0000 00005 0717 0301 0106 0030 0007 0001 0000 0000 0000 00006 1662 0826 0348 0120 0032 0006 0001 0000 0000 00007 3145 1834 0919 0383 0127 0031 0005 0000 0000 00008 5000 3374 1989 0994 0403 0124 0026 0003 0000 0000

9 6855 5257 3595 2128 1046 0402 0109 0017 0001 000010 8338 7098 5522 3812 2248 1071 0377 0083 0008 000011 9283 8529 7361 5803 4032 2347 1057 0319 0047 000112 9755 9404 8740 7652 6113 4261 2418 0987 0221 001213 9936 9816 9536 8972 7981 6470 4511 2444 0826 008814 9988 9959 9877 9673 9226 8363 6904 4802 2382 050315 9999 9994 9979 9933 9807 9499 8818 7475 5182 207816 1 0000 1 0000 9998 9993 9977 9925 9775 9369 8332 5819

18 0 0000 0000 0000 0000 0000 0000 0000 0000 0000 00001 0001 0000 0000 0000 0000 0000 0000 0000 0000 00002 0007 0001 0000 0000 0000 0000 0000 0000 0000 00003 0038 0010 0002 0000 0000 0000 0000 0000 0000 00004 0154 0049 0013 0003 0000 0000 0000 0000 0000 00005 0481 0183 0058 0014 0003 0000 0000 0000 0000 00006 1189 0537 0203 0062 0014 0002 0000 0000 0000 00007 2403 1280 0576 0212 0061 0012 0002 0000 0000 00008 4073 2527 1347 0597 0210 0054 0009 0001 0000 00009 5927 4222 2632 1391 0596 0193 0043 0005 0000 0000

10 7597 6085 4366 2717 1407 0569 0163 0027 0002 000011 8811 7742 6257 4509 2783 1390 0513 0118 0012 000012 9519 8923 7912 6450 4656 2825 1329 0419 0064 000213 9846 9589 9058 8114 6673 4813 2836 1206 0282 001514 9962 9880 9672 9217 8354 6943 4990 2798 0982 010915 9993 9975 9918 9764 9400 8647 7287 5203 2662 058116 9999 9997 9987 9954 9858 9605 9009 7759 5497 226517 1 0000 1 0000 9999 9996 9984 9944 9820 9464 8499 602818 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000


(Continued)

θ     .05   .10   .15   .20   .25   .30   .35   .40   .45

19 0 3774 1351 0456 0144 0042 0011 0003 0001 00001 7547 4203 1985 0829 0310 0104 0031 0008 00022 9335 7054 4413 2369 1113 0462 0170 0055 00153 9868 8850 6841 4551 2631 1332 0591 0230 00774 9980 9648 8556 6733 4654 2822 1500 0696 02805 9998 9914 9463 8369 6678 4739 2968 1629 07776 1 0000 9983 9837 9324 8251 6655 4812 3081 17277 1 0000 9997 9959 9767 9225 8180 6656 4878 31698 1 0000 1 0000 9992 9933 9713 9161 8145 6675 4940

9 1 0000 1 0000 9999 9984 9911 9674 9125 8139 671010 1 0000 1 0000 1 0000 9997 9977 9895 9653 9115 815911 1 0000 1 0000 1 0000 1 0000 9995 9972 9886 9648 912912 1 0000 1 0000 1 0000 1 0000 9999 9994 9969 9884 965813 1 0000 1 0000 1 0000 1 0000 1 0000 9999 9993 9969 989114 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 9999 9994 997215 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 9999 999516 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 999917 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 000018 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 000019 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000

20 0 3585 1216 0388 0115 0032 0008 0002 0000 00001 7358 3917 1756 0692 0243 0076 0021 0005 00012 9245 6769 4049 2061 0913 0355 0121 0036 00093 9841 8670 6477 4114 2252 1071 0444 0160 00494 9974 9568 8298 6296 4148 2375 1182 0510 01895 9997 9887 9327 8042 6172 4164 2454 1256 05536 1 0000 9976 9781 9133 7858 6080 4166 2500 12997 1 0000 9996 9941 9679 8982 7723 6010 4159 25208 1 0000 9999 9987 9900 9591 8867 7624 5956 41439 1 0000 1 0000 9998 9974 9861 9520 8782 7553 5914

10 1 0000 1 0000 1 0000 9994 9961 9829 9468 8725 750711 1 0000 1 0000 1 0000 9999 9991 9949 9804 9435 869212 1 0000 1 0000 1 0000 1 0000 9998 9987 9940 9790 942013 1 0000 1 0000 1 0000 1 0000 1 0000 9997 9985 9935 978614 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 9997 9984 993615 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 9997 998516 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 999717 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 000018 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 000019 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 000020 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000


(Continued)

θ     .50   .55   .60   .65   .70   .75   .80   .85   .90   .95

19 0 0000 0000 0000 0000 0000 0000 0000 0000 0000 00001 0000 0000 0000 0000 0000 0000 0000 0000 0000 00002 0004 0001 0000 0000 0000 0000 0000 0000 0000 00003 0022 0005 0001 0000 0000 0000 0000 0000 0000 00004 0096 0028 0006 0001 0000 0000 0000 0000 0000 00005 0318 0109 0031 0007 0001 0000 0000 0000 0000 00006 0835 0342 0116 0031 0006 0001 0000 0000 0000 00007 1796 0871 0352 0114 0028 0005 0000 0000 0000 00008 3238 1841 0885 0347 0105 0023 0003 0000 0000 0000

9 5000 3290 1861 0875 0326 0089 0016 0001 0000 000010 6762 5060 3325 1855 0839 0287 0067 0008 0000 000011 8204 6831 5122 3344 1820 0775 0233 0041 0003 000012 9165 8273 6919 5188 3345 1749 0676 0163 0017 000013 9682 9223 8371 7032 5261 3322 1631 0537 0086 000214 9904 9720 9304 8500 7178 5346 3267 1444 0352 002015 9978 9923 9770 9409 8668 7369 5449 3159 1150 013216 9996 9985 9945 9830 9538 8887 7631 5587 2946 066517 1 0000 9998 9992 9969 9896 9690 9171 8015 5797 245318 1 0000 1 0000 9999 9997 9989 9958 9856 9544 8649 622619 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000

20 0 0000 0000 0000 0000 0000 0000 0000 0000 0000 00001 0000 0000 0000 0000 0000 0000 0000 0000 0000 00002 0002 0000 0000 0000 0000 0000 0000 0000 0000 00003 0013 0003 0000 0000 0000 0000 0000 0000 0000 00004 0059 0015 0003 0000 0000 0000 0000 0000 0000 00005 0207 0064 0016 0003 0000 0000 0000 0000 0000 00006 0577 0214 0065 0015 0003 0000 0000 0000 0000 00007 1316 0580 0210 0060 0013 0002 0000 0000 0000 00008 2517 1308 0565 0196 0051 0009 0001 0000 0000 00009 4119 2493 1275 0532 0171 0039 0006 0000 0000 0000

10 5881 4086 2447 1218 0480 0139 0026 0002 0000 000011 7483 5857 4044 2376 1133 0409 0100 0013 0001 000012 8684 7480 5841 3990 2277 1018 0321 0059 0004 000013 9423 8701 7500 5834 3920 2142 0867 0219 0024 000014 9793 9447 8744 7546 5836 3828 1958 0673 0113 000315 9941 9811 9490 8818 7625 5852 3704 1702 0432 002616 9987 9951 9840 9556 8929 7748 5886 3523 1330 015917 9998 9991 9964 9879 9645 9087 7939 5951 3231 075518 1 0000 9999 9995 9979 9924 9757 9308 8244 6083 264219 1 0000 1 0000 1 0000 9998 9992 9968 9885 9612 8784 641520 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000 1 0000

Source: Adapted from Table 2 of Tables of the Binomial Probability Distribution (1950), National Bureau of Standards Applied Mathematics Series 6, U.S. Government Printing Office, Washington, D.C.
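Entries in the cumulative binomial table above follow directly from the definition P(X ≤ x) = Σ_{k≤x} C(n, k) θ^k (1 − θ)^{n−k}. The short Python sketch below is illustrative only (the helper name `binom_cdf` is ours, not the book's); it reproduces two table entries to four decimal places.

```python
from math import comb

def binom_cdf(x, n, theta):
    """Cumulative binomial probability P(X <= x) for X ~ Binomial(n, theta)."""
    return sum(comb(n, k) * theta**k * (1 - theta)**(n - k) for k in range(x + 1))

# Spot checks against the table (entries are rounded to 4 decimal places):
print(round(binom_cdf(10, 20, 0.5), 4))  # table entry .5881 for n = 20, x = 10
print(round(binom_cdf(8, 17, 0.5), 4))   # table entry .5000 for n = 17, x = 8
```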


Left-tail probabilities P for R, the total number of runs, N = n1 + n2 (n1 ≤ n2)

n1 n2 R P    n1 n2 R P    n1 n2 R P    n1 n2 R P

2 2 2 333 2 18 2 011 3 14 2 003 4 10 2 0022 3 2 200 3 105 3 025 3 014

3 500 4 284 4 101 4 0682 4 2 133 3 3 2 100 5 350 5 203

3 400 3 300 3 15 2 002 6 4192 5 2 095 3 4 2 057 3 022 4 11 2 001

3 333 3 200 4 091 3 0112 6 2 071 3 5 2 036 5 331 4 055

3 286 3 143 3 16 2 002 5 1762 7 2 056 4 429 3 020 6 374

3 250 3 6 2 024 4 082 4 12 2 0012 8 2 044 3 107 5 314 3 009

3 222 4 345 3 17 2 002 4 0452 9 2 036 3 7 2 017 3 018 5 154

3 200 3 083 4 074 6 3354 491 4 283 5 298 4 13 2 001

2 10 2 030 3 8 2 012 4 4 2 029 3 0073 182 3 067 3 114 4 0374 455 4 236 4 371 5 136

2 11 2 026 3 9 2 009 4 5 2 016 6 3023 167 3 055 3 071 4 14 2 0014 423 4 200 4 262 3 006

2 12 2 022 5 491 5 500 4 0313 154 3 10 2 007 4 6 2 010 5 1214 396 3 045 3 048 6 274

2 13 2 019 4 171 4 190 4 15 2 0013 143 5 455 5 405 3 0054 371 3 11 2 005 4 7 2 006 4 027

2 14 2 017 3 038 3 033 5 1083 133 4 148 4 142 6 2494 350 5 423 5 333 4 16 2 000

2 15 2 015 3 12 2 004 4 8 2 004 3 0043 125 3 033 3 024 4 0234 331 4 130 4 109 5 097

2 16 2 013 5 396 5 279 6 2273 118 3 13 2 004 4 9 2 003 5 5 2 0084 314 3 029 3 018 3 040

2 17 2 012 4 114 4 085 4 167

3 111 5 371 5 236 5 3574 298 6 471


Table D Continued

Left-tail probabilities

n1 n2 R P    n1 n2 R P    n1 n2 R P    n1 n2 R P

[Table entries illegible in the source.]

(Continued)

APPENDIX OF TABLES 569


Table D Continued

Left-tail probabilities

n1 n2 R P    n1 n2 R P    n1 n2 R P    n1 n2 R P

[Table entries illegible in the source.]

(Continued)


Table D (Continued)  Right-tail probabilities

n1 n2 R P    n1 n2 R P    n1 n2 R P    n1 n2 R P

2 2 4 333 4 8 9 071 5 11 11 058 6 12 12 0752 3 5 100 8 212 10 154 11 217

4 500 7 467 9 374 10 3952 4 5 200 4 9 9 098 5 12 11 075 6 13 13 0342 5 5 286 8 255 10 181 12 0922 6 5 357 4 10 9 126 5 12 9 421 11 2572 7 5 417 8 294 5 13 11 092 10 4392 8 5 467 4 11 9 154 10 208 6 14 13 0443 3 6 100 8 330 9 465 12 111

5 300 4 12 9 181 5 14 11 111 11 295

3 4 7 029 8 363 10 234 10 4806 200 4 13 9 208 5 15 11 129 7 7 14 0015 457 8 393 10 258 13 004

3 5 7 071 4 14 9 234 6 6 12 002 12 0256 286 8 421 11 013 11 078

3 6 7 119 4 15 9 258 10 067 10 2096 357 8 446 9 175 9 383

3 7 7 167 4 16 9 282 8 392 7 8 15 0006 417 8 470 6 7 13 001 14 002

3 8 7 212 5 5 10 008 12 008 13 0126 467 9 040 11 034 12 051

3 9 7 255 8 167 10 121 11 1333 10 7 294 7 357 9 267 10 2963 11 7 330 5 6 11 002 8 500 9 4863 12 7 363 10 024 6 8 13 002 7 9 15 0013 13 7 393 9 089 12 016 14 0063 14 7 421 8 262 11 063 13 0253 15 7 446 7 478 10 179 12 0843 16 7 470 5 7 11 008 9 354 11 1943 17 7 491 10 045 6 9 13 006 10 3784 4 8 029 9 146 12 028 7 10 15 002

7 114 8 348 11 098 14 0106 371 5 8 11 016 10 238 13 043

4 5 9 008 10 071 9 434 12 1218 071 9 207 6 10 13 010 11 2577 214 8 424 12 042 10 4516 500 5 9 11 028 11 136 7 11 15 004

4 6 9 024 10 098 10 294 14 0178 119 9 266 6 11 13 017 13 0647 310 8 490 12 058 12 160

4 7 9 045 5 10 11 11 176 11 3188 167 10 126 10 346 7 12 15 0077 394 9 322 6 12 13 025 14 025


Runs up and down: left- and right-tail probabilities for R, the number of runs of signs of first differences in an ordered series of n observations

3 1 3333 2 6667 13 1 00004 3 4167 2 0000

1 0833 2 9167 3 0001 12 00725 1 0167 4 2667 4 0026 11 0568

2 2500 3 7500 5 0213 10 20586 1 0028 6 0964 9 4587

2 0861 5 1694 7 2749 8 72513 4139 4 5861 14 1 0000

7 1 0004 6 1079 2 00002 0250 5 4417 3 00003 1909 4 8091 4 0007 13 0046

8 1 0000 5 0079 12 03912 0063 7 0687 6 0441 11 15363 0749 6 3250 7 1534 10 37224 3124 5 6876 8 3633 9 6367

9 1 0000 15 1 00002 0014 2 00003 0257 8 0437 3 00004 1500 7 2347 4 00025 4347 6 5653 5 0027 14 0029

10 1 0000 6 0186 13 02672 0003 9 0278 7 0782 12 11343 0079 8 1671 8 2216 11 29704 0633 7 4524 9 4520 10 54805 2427 6 7573 16 1 0000

11 1 0000 2 00002 0001 3 00003 0022 10 0177 4 0001 15 00194 0239 9 1177 5 0009 14 01825 1196 8 3540 6 0072 13 08286 3438 7 6562 7 0367 12 2335

12 1 0000 8 1238 11 46312 0000 9 2975 10 70253 00054 0082 11 01135 0529 10 08216 1918 9 27207 4453 8 5547


(Continued)

17 1 0000 21 1 00002 0000 2 00003 0000 3 00004 0000 4 00005 0003 16 0012 5 00006 0026 15 0123 6 00007 0160 14 0600 7 0003 20 00028 0638 13 1812 8 0023 19 00259 1799 12 3850 9 0117 18 0154

10 3770 11 6230 10 0431 17 059118 1 0000 11 1202 16 1602

2 0000 12 2622 15 32933 0000 13 4603 14 53974 0000 22 1 00005 0001 2 00006 0009 17 0008 3 00007 0065 16 0083 4 00008 0306 15 0431 5 00009 1006 14 1389 6 0000 21 0001

10 2443 13 3152 7 0001 20 001711 4568 12 5432 8 0009 19 0108

19 1 0000 9 0050 18 04372 0000 10 0213 17 12513 0000 11 0674 16 27144 0000 12 1661 15 46884 0000 13 3276 14 67245 0000 18 0005 23 1 00006 0003 17 0056 2 00007 0025 16 0308 3 00008 0137 15 1055 4 00009 0523 14 2546 5 0000

10 1467 13 4663 6 000011 3144 12 6856 7 0000 22 0001

20 1 0000 8 0003 21 00112 0000 9 0021 20 00763 0000 10 0099 19 03214 0000 11 0356 18 09685 0000 12 0988 17 22116 0001 19 0003 13 2188 16 40207 0009 18 0038 14 3953 15 60478 0058 17 02189 0255 16 0793

10 0821 15 203111 2012 14 394512 3873 13 6127


(Continued)

24 1 0000 25 1 00002 0000 2 00003 0000 3 00004 0000 4 00005 0000 5 00006 0000 6 00007 0000 7 0000 24 00008 0001 23 0000 8 0000 23 00059 0008 22 0007 9 0003 22 0037

10 0044 21 0053 10 0018 21 017011 0177 20 0235 11 0084 20 0564

12 0554 19 0742 12 0294 19 142313 1374 18 1783 13 0815 18 285214 2768 17 3405 14 1827 17 470815 4631 16 5369 15 3384 16 6616

Source: Adapted from E. S. Edgington (1961), Probability table for number of runs of signs of first differences in ordered series, Journal of the American Statistical Association, 56, 156–159.
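For small n, the distribution of the number of runs up and down tabled above can be reproduced by brute-force enumeration of all n! orderings. The Python sketch below is illustrative only and feasible only for small n.

```python
from itertools import permutations

def runs_updown_pmf(n):
    """Null distribution of R, the number of runs of signs of first
    differences, over all n! equally likely orderings of n distinct values."""
    counts = {}
    for perm in permutations(range(n)):
        signs = [perm[i + 1] > perm[i] for i in range(n - 1)]
        # one run, plus one more for each sign change
        runs = 1 + sum(signs[i] != signs[i + 1] for i in range(n - 2))
        counts[runs] = counts.get(runs, 0) + 1
    total = sum(counts.values())
    return {r: c / total for r, c in sorted(counts.items())}

# n = 3: P(R = 1) = .3333 and P(R = 2) = .6667, matching the first table row.
print(runs_updown_pmf(3))
```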


Kolmogorov–Smirnov one-sample statistic: critical values of D with right-tail probability α

n     .200  .100  .050  .020  .010        n     .200  .100  .050  .020  .010

1 900 950 975 990 995 21 226 259 287 321 3442 684 776 842 900 929 22 221 253 281 314 3373 565 636 780 785 829 23 216 247 275 307 3304 493 565 624 689 734 24 212 242 269 301 3235 447 509 563 627 669 25 208 238 264 295 3176 410 468 519 577 617 26 204 233 259 290 3117 381 436 483 538 576 27 200 229 254 284 3058 358 410 454 507 542 28 197 225 250 279 3009 339 387 430 480 513 29 193 221 246 275 295

10 323 369 409 457 489 30 190 218 242 270 29011 308 352 391 437 468 31 187 214 238 266 28512 296 338 375 419 449 32 184 211 234 262 28113 285 325 361 404 432 33 182 208 231 258 27714 275 314 349 390 418 34 179 205 227 254 27315 266 304 338 377 404 35 177 202 224 251 26916 258 295 327 366 392 36 174 199 221 247 26517 250 286 318 355 381 37 172 196 218 244 26218 244 279 309 346 371 38 170 194 215 241 25819 237 271 301 337 361 39 168 191 213 238 25520 232 265 294 329 352 40 165 189 210 235 252

Approximation for n > 40:

α        .200      .100      .050      .020      .010
        1.07/√n   1.22/√n   1.36/√n   1.52/√n   1.63/√n

Source: Adapted from L. H. Miller (1956), Table of percentage points of Kolmogorov statistics, Journal of the American Statistical Association, 51, 111–121.
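The large-sample approximation above replaces the exact table for n > 40. The one-line helper below (illustrative; names are ours) makes the comparison with the last exact row (n = 40, α = .05, entry .210) concrete.

```python
from math import sqrt

# Coefficients c(alpha) for the large-n approximation c(alpha)/sqrt(n)
KS_COEFF = {0.20: 1.07, 0.10: 1.22, 0.05: 1.36, 0.02: 1.52, 0.01: 1.63}

def ks_critical_approx(n, alpha):
    """Approximate one-sample Kolmogorov-Smirnov critical value for large n."""
    return KS_COEFF[alpha] / sqrt(n)

print(round(ks_critical_approx(40, 0.05), 3))  # .215, vs the exact table value .210
```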


Binomial distribution with θ = 0.5: left-tail probabilities

1 0 5000 1 12 0 0002 12 17 0 0000 172 0 2500 2 1 0032 11 1 0001 16

1 7500 1 2 0193 10 2 0012 153 0 1250 3 3 0730 9 3 0064 14

1 5000 2 4 1938 8 4 0245 134 0 0625 4 5 3872 7 5 0717 12

1 3125 3 6 6128 6 6 1662 112 6875 2 13 0 0001 13 7 3145 10

5 0 0312 5 1 0017 12 8 5000 91 1875 4 2 0112 11 18 0 0000 182 5000 3 3 0461 10 1 0001 17

6 0 0156 6 4 1334 9 2 0007 161 1094 5 5 2905 8 3 0038 152 3438 4 6 5000 7 4 0154 143 6562 3 14 0 0000 14 5 0481 13

7 0 0078 7 1 0009 13 6 1189 121 0625 6 2 0065 12 7 2403 112 2266 5 3 0287 11 8 4073 103 5000 4 4 0898 10 9 5927 9

8 0 0039 8 5 2120 9 19 0 0000 191 0352 7 6 3953 8 1 0000 182 1445 6 7 6047 7 2 0004 173 3633 5 15 0 0000 15 3 0022 164 6367 4 1 0005 14 4 0096 15

9 0 0020 9 2 0037 13 5 0318 141 0195 8 3 0176 12 6 0835 132 0898 7 4 0592 11 7 1796 123 2539 6 5 1509 10 8 3238 114 5000 5 6 3036 9 9 5000 10

10 0 0010 10 7 5000 8 20 0 0000 201 0107 9 16 0 0000 16 1 0000 192 0547 8 1 0003 15 2 0002 183 1719 7 2 0021 14 3 0013 174 3770 6 3 0106 13 4 0059 165 6230 5 4 0384 12 5 0207 15

11 0 0005 11 5 1051 11 6 0577 141 0059 10 6 2272 10 7 1316 132 0327 9 7 4018 9 8 2517 123 1133 8 8 5982 8 9 4119 114 2744 7 10 5881 105 5000 6


Wilcoxon signed-rank statistic: left-tail probabilities for T+ (equivalently, right-tail probabilities for T−)

2 0 250 3 7 0 008 28 9 0 002 451 500 2 1 016 27 1 004 44

3 0 125 6 2 023 26 2 006 431 250 5 3 039 25 3 010 422 375 4 4 055 24 4 014 413 625 3 5 078 23 5 020 40

4 0 062 10 6 109 22 6 027 391 125 9 7 148 21 7 037 38

2 188 8 8 188 20 8 049 373 312 7 9 234 19 9 064 364 438 6 10 289 18 10 082 355 562 5 11 344 17 11 102 34

5 0 031 15 12 406 16 12 125 331 062 14 13 469 15 13 150 322 094 13 14 531 14 14 180 313 156 12 8 0 004 36 15 213 304 219 11 1 008 35 16 248 295 312 10 2 012 34 17 285 286 406 9 3 020 33 18 326 277 500 8 4 027 32 19 367 26

6 0 016 21 5 039 31 20 410 251 031 20 6 055 30 21 455 242 047 19 7 074 29 22 500 233 078 18 8 098 28 10 0 001 554 109 17 9 125 27 1 002 545 156 16 10 156 26 2 003 536 219 15 11 191 25 3 005 527 281 14 12 230 24 4 007 518 344 13 13 273 23 5 010 509 422 12 14 320 22 6 014 49

10 500 11 15 371 21 7 019 4816 422 20 8 024 4717 473 19 9 032 4618 527 18 10 042 45


(Continued)

10 11 053 44 11 28 350 38 13 0 000 9112 065 43 29 382 37 1 000 9013 080 42 30 416 36 2 000 8914 097 41 31 449 35 3 001 8815 116 40 32 483 34 4 001 8716 138 39 33 517 33 5 001 8617 161 38 12 0 000 78 6 002 8518 188 37 1 000 77 7 002 8419 216 36 2 001 76 8 003 8320 246 35 3 001 75 9 004 8221 278 34 4 002 74 10 005 8122 312 33 5 002 73 11 007 8023 348 32 6 003 72 12 009 7924 385 31 7 005 71 13 011 7825 423 30 8 006 70 14 013 7726 461 29 9 008 69 15 016 7627 500 28 10 010 68 16 020 75

11 0 000 66 11 013 67 17 024 741 001 65 12 017 66 18 029 732 001 64 13 021 65 19 034 723 002 63 14 026 64 20 040 714 003 62 15 032 63 21 047 705 005 61 16 039 62 22 055 696 007 60 17 0456 61 23 064 687 009 59 18 055 60 24 073 678 0125 58 19 065 59 25 084 669 016 57 20 076 58 26 095 65

10 021 56 21 088 57 27 108 6411 027 55 22 102 56 28 122 6312 034 54 23 117 55 29 137 6213 042 53 24 133 54 30 153 6114 051 52 25 151 53 31 170 6015 062 51 26 170 52 32 188 5916 074 50 27 190 51 33 207 5817 087 49 28 212 50 34 227 5718 103 48 29 235 49 35 249 5619 120 47 30 259 48 36 271 5520 139 46 31 285 47 37 294 5421 160 45 32 311 46 38 318 5322 183 44 33 339 45 39 342 5223 207 43 34 367 44 40 368 5124 232 42 35 396 43 41 393 50

25 260 41 36 425 42 42 420 4926 289 40 37 455 41 43 446 4827 319 39 38 485 40 44 473 47

39 515 39 45 500 46


(Continued)

14 0 000 105 14 46 357 59 15 39 126 811 000 104 47 380 58 40 138 802 000 103 48 404 57 41 151 793 000 102 49 428 56 42 165 784 000 101 50 452 55 43 180 775 001 100 51 476 54 44 195 766 001 99 52 500 53 45 211 757 001 98 15 0 000 120 46 227 748 002 97 1 000 119 47 244 739 002 96 2 000 118 48 262 72

10 003 95 3 000 117 49 281 7111 003 94 4 000 116 50 300 7012 004 93 5 000 115 51 319 6913 005 92 6 000 114 52 339 6814 007 91 7 001 113 53 360 6715 008 90 8 001 112 54 381 6616 010 89 9 001 111 55 402 6517 012 88 10 001 110 56 423 6418 0158 87 11 002 109 57 445 6319 018 86 12 002 108 58 467 6220 021 85 13 003 107 59 489 6121 025 84 14 003 106 60 511 6022 029 83 15 004 10523 034 82 16 005 10424 039 81 17 006 10325 045 80 18 008 10226 052 79 19 009 10127 059 78 20 011 10028 068 77 21 013 99

29 077 76 22 015 9830 086 75 23 018 9731 097 74 24 021 9632 108 73 25 024 9533 121 72 26 028 9434 134 71 27 032 9335 148 70 28 036 9236 163 69 29 042 9137 179 68 30 047 9038 196 67 31 053 8939 213 66 32 060 8840 232 65 33 068 8741 251 64 34 076 8642 271 63 35 084 8543 292 62 36 094 8444 313 61 37 104 8345 335 60 38 115 82

Source: Adapted from F. Wilcoxon, S. K. Katti, and R. A. Wilcox (1973), Critical values and probability levels for the Wilcoxon rank sum test and the Wilcoxon signed rank test, Selected Tables in Mathematical Statistics, Vol. 1, 171–259, American Mathematical Society, Providence, R.I.
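The signed-rank left-tail probabilities above count subsets of rank totals from {1, …, n}. The dynamic-programming sketch below is illustrative only (names are ours); it reproduces two table entries.

```python
def signed_rank_cdf(t, n):
    """P(T+ <= t) under H0: count subsets of {1,...,n} by their sum,
    then divide by the 2**n equally likely sign configurations."""
    max_sum = n * (n + 1) // 2
    counts = [1] + [0] * max_sum        # counts[s] = number of subsets summing to s
    for k in range(1, n + 1):
        for s in range(max_sum, k - 1, -1):
            counts[s] += counts[s - k]
    return sum(counts[: t + 1]) / 2**n

print(round(signed_rank_cdf(0, 5), 3))  # table entry .031 for n = 5, t = 0
print(round(signed_rank_cdf(5, 8), 3))  # table entry .039 for n = 8, t = 5
```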


Kolmogorov–Smirnov two-sample statistic: right-tail probabilities P for mnD (entries are n1, n2, mnD, P)

2 2 4 333 3 6 18 024 4 5 20 0162 3 6 200 15 095 16 0792 4 8 133 12 333 15 1432 5 10 095 3 7 21 017 4 6 24 010

8 286 18 067 20 0482 6 12 071 15 167 18 095

10 214 3 8 24 012 16 181

2 7 14 056 21 048 4 7 28 00612 167 18 121 24 0302 8 16 044 3 9 27 009 21 067

14 133 24 036 20 1212 9 18 036 21 091 4 8 32 004

16 109 18 236 28 0202 10 20 030 3 10 30 007 24 085

18 091 27 028 20 22216 182 24 070 4 9 36 003

2 11 22 026 21 140 32 01420 077 3 11 33 005 28 04218 154 30 022 27 062

2 12 24 022 27 055 24 11522 066 24 110 4 10 40 00220 132 3 12 36 004 36 010

3 3 9 100 33 018 32 0303 4 12 057 30 044 30 046

9 229 27 088 28 0843 5 15 036 24 189 26 126

12 143 4 4 16 02912 229


(Continued)   m = n

n        .200  .100  .050  .020  .010

9 45 54 54 63 6310 50 60 70 70 8011 66 66 77 88 8812 72 72 84 96 9613 78 91 91 104 11714 84 98 112 112 12615 90 105 120 135 13516 112 112 128 144 16017 119 136 136 153 17018 126 144 162 180 18019 133 152 171 190 19020 140 160 180 200 220

Approximation for larger sample sizes:

α        .200               .100               .050               .020               .010
     1.07√((m+n)/mn)   1.22√((m+n)/mn)   1.36√((m+n)/mn)   1.52√((m+n)/mn)   1.63√((m+n)/mn)

Source: Adapted from P. J. Kim and R. I. Jennrich (1973), Tables of the exact sampling distribution of the two-sample Kolmogorov–Smirnov criterion Dmn (m ≤ n), Selected Tables in Mathematical Statistics, Vol. 1, 79–170, American Mathematical Society, Providence, R.I.
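The right-tail probabilities of mnD in the two-sample table come from the C(m+n, m) equally likely interleavings of the two samples under H0. The brute-force sketch below (illustrative; feasible only for small m, n) reproduces a few entries.

```python
from itertools import combinations
from math import comb

def ks2_right_tail(d, m, n):
    """P(mnD >= d) under H0, by enumerating all C(m+n, m) orderings.
    Empirical-cdf steps are scaled by mn so everything stays in integers."""
    count = 0
    for xpos in combinations(range(m + n), m):
        xset = set(xpos)
        fx = fy = best = 0
        for i in range(m + n):
            if i in xset:
                fx += n          # a step of 1/m, scaled by mn
            else:
                fy += m          # a step of 1/n, scaled by mn
            best = max(best, abs(fx - fy))
        count += best >= d
    return count / comb(m + n, m)

print(round(ks2_right_tail(4, 2, 2), 3))   # table entry .333 for n1 = n2 = 2, mnD = 4
print(round(ks2_right_tail(12, 3, 4), 3))  # table entry .057 for n1 = 3, n2 = 4, mnD = 12
```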


Wilcoxon rank-sum statistic WN: left-tail probabilities

m = 1              m = 2              m = 2

1 1 500 2 2 3 167 7 8 3 022 192 1 333 3 4 333 6 4 044 18

2 667 2 5 667 5 5 089 173 1 250 4 3 3 100 9 6 133 16

2 500 3 4 200 8 7 200 154 1 200 5 5 400 7 8 267 14

2 400 4 6 600 6 9 356 13

3 600 3 4 3 067 11 10 444 125 1 167 6 4 133 10 11 556 112 333 5 5 267 9 9 3 018 213 500 4 6 400 8 4 036 20

6 1 143 7 7 600 7 5 073 192 286 6 5 3 048 13 6 109 183 429 5 4 095 12 7 164 174 571 4 5 190 11 8 218 16

7 1 125 8 6 286 10 9 291 152 250 7 7 429 9 10 364 143 375 6 8 571 8 11 455 134 500 5 6 3 036 15 12 545 12

8 1 111 9 4 071 14 10 3 015 232 222 8 5 143 13 4 030 223 333 7 6 214 12 5 061 214 444 6 7 321 11 6 091 205 556 5 8 429 10 7 136 19

9 1 100 10 9 571 9 8 182 182 200 9 7 3 028 17 9 242 173 300 8 4 056 16 10 303 164 400 7 5 111 15 11 379 155 500 6 6 167 14 12 455 14

10 1 091 11 7 250 13 13 545 132 182 10 8 333 123 273 9 9 444 114 364 8 10 556 105 455 76 545 6


(Continued)

m = 3              m = 3              m = 4

3 6 050 15 8 6 006 30 4 10 014 267 100 14 7 012 29 11 029 258 200 13 8 024 28 12 057 249 350 12 9 042 27 13 100 23

10 500 11 10 067 26 14 171 224 6 029 18 11 097 25 15 243 21

7 057 17 12 139 24 16 343 208 114 16 13 188 23 17 443 199 200 15 14 248 22 18 557 18

10 314 14 15 315 21 5 10 008 3011 429 13 16 388 20 11 016 2912 571 12 17 461 19 12 032 28

5 6 018 21 18 539 18 13 056 277 036 20 9 6 005 33 14 095 268 071 19 7 009 32 15 143 259 125 18 8 018 31 16 206 24

10 196 17 9 032 30 17 278 2311 286 16 10 050 29 18 365 2212 393 15 11 073 28 19 452 2113 500 14 12 105 27 20 548 20

6 6 012 24 13 141 26 6 10 005 347 024 23 14 186 25 11 010 338 048 22 15 241 24 12 019 329 083 21 16 300 23 13 033 31

10 131 20 17 364 22 14 057 3011 190 19 18 432 21 15 086 2912 274 18 19 500 20 16 129 2813 357 17 10 6 003 36 17 176 2714 452 16 7 007 35 18 238 2615 548 15 8 014 34 19 305 25

7 6 008 27 9 024 33 20 381 247 017 26 10 038 32 21 457 238 033 25 11 056 31 22 543 229 058 24 12 080 30

10 092 23 13 108 2911 133 22 14 143 2812 192 21 15 185 2713 258 20 16 234 2614 333 19 17 287 2515 417 18 18 346 24

16 500 17 19 406 2320 469 2221 531 21


(Continued)

m = 4              m = 4              m = 5

7 10 003 38 9 10 001 46 5 15 004 4011 006 37 11 003 45 16 008 3912 012 36 12 006 44 17 016 3813 021 35 13 010 43 18 028 3714 036 34 14 017 42 19 048 3615 055 33 15 025 41 20 075 3516 082 32 16 038 40 21 111 3417 115 31 17 053 39 22 155 3318 158 30 18 074 38 23 210 32

19 206 29 19 099 37 24 274 3120 264 28 20 130 36 25 345 3021 324 27 21 165 35 26 421 2922 394 26 22 207 34 27 500 2823 464 25 23 252 33 6 15 002 4524 536 24 24 302 32 16 004 44

8 10 002 42 25 355 31 17 009 4311 004 41 26 413 30 18 015 4212 008 40 27 470 29 19 026 4113 014 39 28 530 28 20 041 4014 024 38 10 10 001 50 21 063 3915 036 37 11 002 49 22 089 3816 055 36 12 004 48 23 123 3717 077 35 13 007 47 24 165 3618 107 34 14 012 46 25 214 3519 141 33 15 018 45 26 268 3420 184 32 16 027 44 27 331 3321 230 31 17 038 43 28 396 3222 285 30 18 053 42 29 465 3123 341 29 19 071 41 30 535 3024 404 28 20 094 4025 467 27 21 120 3926 533 26 22 152 38

23 187 3724 227 3625 270 3526 318 3427 367 3328 420 3229 473 3130 527 30


(Continued)

m = 5              m = 5              m = 6

7 15 001 50 9 15 000 60 6 21 001 5716 003 49 16 001 59 22 002 5617 005 48 17 002 58 23 004 5518 009 47 18 003 57 24 008 5419 015 46 19 006 56 25 013 5320 024 45 20 009 55 26 021 5221 037 44 21 014 54 27 032 5122 053 43 22 021 53 28 047 5023 074 42 23 030 52 29 066 4924 101 41 24 041 51 30 090 48

25 134 40 25 056 50 31 120 4726 172 39 26 073 49 32 155 4627 216 38 27 095 48 33 197 4528 265 37 28 120 47 34 242 4429 319 36 29 149 46 35 294 4330 378 35 30 182 45 36 350 4231 438 34 31 219 44 37 409 4132 500 33 32 259 43 38 469 40

8 15 001 55 33 303 42 39 531 3916 002 54 34 350 41 7 21 001 6317 003 53 35 399 40 22 001 6218 005 52 36 449 39 23 002 6119 009 51 37 500 38 24 004 6020 015 50 10 15 000 65 25 007 5921 023 49 16 001 64 26 011 5822 033 48 17 001 63 27 017 57

23 047 47 18 002 62 28 026 5624 064 46 19 004 61 29 037 5525 085 45 20 006 60 30 051 5426 111 44 21 010 59 31 069 5327 142 43 22 014 58 32 090 5228 177 42 23 020 57 33 117 5129 218 41 24 028 56 34 147 5030 262 40 25 038 55 35 183 4931 311 39 26 050 54 36 223 4832 362 38 27 065 53 37 267 4733 416 37 28 082 52 38 314 4634 472 36 29 103 51 39 365 4535 528 35 30 127 50 40 418 44

31 155 49 41 473 4332 0185 48 42 527 4233 220 47

34 257 4635 297 4536 339 4437 384 4338 430 4239 477 4140 523 40


(Continued)

m = 6              m = 6              m = 7

8 21 000 69 9 41 228 55 7 28 000 7722 001 68 42 264 54 29 001 7623 001 67 43 303 53 30 001 7524 002 66 44 344 52 31 002 7425 004 65 45 388 51 32 003 7326 006 64 46 432 50 33 006 7227 010 63 47 477 49 34 009 7128 015 62 48 523 48 35 013 7029 021 61 10 21 000 81 36 019 69

30 030 60 22 000 80 37 027 6831 041 59 23 000 79 38 036 6732 054 58 24 001 78 39 049 6633 071 57 25 001 77 40 064 6534 091 56 26 002 76 41 082 6435 114 55 27 004 75 42 104 6336 141 54 28 005 74 43 130 6237 172 53 29 008 73 44 159 6138 207 52 30 011 72 45 191 6039 245 51 31 016 71 46 228 5940 286 50 32 021 70 47 267 5841 331 49 33 028 69 48 310 5742 377 48 34 036 68 49 355 5643 426 47 35 047 67 50 402 5544 475 46 36 059 66 51 451 5445 525 45 37 074 65 52 500 53

9 21 000 75 38 090 64 8 28 000 8422 000 74 39 110 63 29 000 8323 001 73 40 132 62 30 001 8224 001 72 41 157 61 31 001 8125 002 71 42 184 60 32 002 8026 004 70 43 214 59 33 003 7927 006 69 44 246 58 34 005 7828 009 68 45 281 57 35 007 7729 013 67 46 318 56 36 010 7630 018 66 47 356 55 37 014 7531 025 65 48 396 54 38 020 7432 033 64 49 437 53 39 027 7333 044 63 50 479 52 40 036 7234 057 62 51 521 51 41 047 7135 072 61 42 060 70

36 091 60 43 076 6937 112 59 44 095 6838 136 58 45 116 6739 164 57 46 140 6640 194 56 47 168 65


(Continued)

m = 7              m = 7              m = 8

8 48 198 64 10 28 000 98 8 36 000 10049 232 63 29 000 97 37 000 9950 268 62 30 000 96 38 000 9851 306 61 31 000 95 39 001 9752 347 60 32 001 94 40 001 9653 389 59 33 001 93 41 001 9554 433 58 34 002 92 42 002 9455 478 57 35 002 91 43 003 9356 522 56 36 003 90 44 005 92

9 28 000 91 37 005 89 45 007 9129 000 90 38 007 88 46 010 9030 000 89 39 009 87 47 014 8931 001 88 40 012 86 48 019 8832 001 87 41 017 85 49 025 8733 002 86 42 022 84 50 032 8634 003 85 43 028 83 51 041 8535 004 84 44 035 82 52 052 8436 006 83 45 044 81 53 065 8337 008 82 46 054 80 54 080 8238 011 81 47 067 79 55 097 8139 016 80 48 081 78 56 117 8040 021 79 49 097 77 57 139 7941 027 78 50 115 76 58 164 7842 036 77 51 135 75 59 191 7743 045 76 52 157 74 60 221 7644 057 75 53 182 73 61 253 7545 071 74 54 209 72 62 287 7446 087 73 55 237 71 63 323 7347 105 72 56 268 70 64 360 7248 126 71 57 300 69 65 399 7149 150 70 58 335 68 66 439 7050 176 69 59 370 67 67 480 6951 204 68 60 406 66 68 520 6852 235 67 61 443 65 9 36 000 10853 268 66 62 481 64 37 000 10754 303 65 63 519 63 38 000 10655 340 64 39 000 10556 379 63 40 000 10457 419 62 41 001 10358 459 61 42 001 102

59 500 60 43 002 101


(Continued)

m = 8              m = 8              m = 9

9 44 003 100 10 36 000 116 9 45 000 12645 004 99 37 000 115 46 000 12546 006 98 38 000 114 47 000 12447 008 97 39 000 113 48 000 12348 010 96 40 000 112 49 000 12249 014 95 41 000 111 50 000 12150 018 94 42 001 110 51 001 12051 023 93 43 001 109 52 001 11952 030 92 44 002 108 53 001 118

53 037 91 45 002 107 54 002 11754 046 90 46 003 106 55 003 11655 057 89 47 004 105 56 004 11556 069 88 48 006 104 57 005 11457 084 87 49 008 103 58 007 11358 100 86 50 010 102 59 009 11259 118 85 51 013 101 60 012 11160 138 84 52 017 100 61 016 11061 161 83 53 022 99 62 020 10962 185 82 54 027 98 63 025 10863 212 81 55 034 97 64 031 10764 240 80 56 042 96 65 039 10665 271 79 57 051 95 66 047 10566 303 78 58 061 94 67 057 10467 336 77 59 073 93 68 068 10368 371 76 60 086 92 69 081 10269 407 75 61 102 91 70 095 10170 444 74 62 118 90 71 111 10071 481 73 63 137 89 72 129 9972 519 72 64 158 88 73 149 98

65 180 87 74 170 9766 204 86 75 193 9667 230 85 76 218 9568 257 84 77 245 9469 286 83 78 273 9370 317 82 79 302 9271 348 81 80 333 9172 381 80 81 365 9073 414 79 82 398 8974 448 78 83 432 8875 483 77 84 466 87

76 517 76 85 500 86


Table J (Continued)

m = 9                    m = 9                    m = 10

10 45 000 135 10 78 178 102 10 73 007 13746 000 134 79 200 101 74 009 13647 000 133 80 223 100 75 012 13548 000 132 81 248 99 76 014 13449 000 131 82 274 98 77 018 13350 000 130 83 302 97 78 022 13251 000 129 84 330 96 79 026 13152 000 128 85 360 95 80 032 13053 001 127 86 390 94 81 038 129

54 001 126 87 421 93 82 045 12855 001 125 88 452 92 83 053 12756 002 124 89 484 91 84 062 12657 003 123 90 516 90 85 072 12558 004 122 86 083 12459 005 121 87 095 12360 007 120 ¼10 88 109 12261 009 119 89 124 12162 011 118 10 55 000 155 90 140 12063 014 117 56 000 154 91 157 11964 017 116 57 000 153 92 176 11865 022 115 58 000 152 93 197 11766 027 114 59 000 151 94 218 11667 033 113 60 000 150 95 241 11568 039 112 61 000 149 96 264 11469 047 111 62 000 148 97 289 11370 056 110 63 000 147 98 315 11271 067 109 64 001 146 99 342 11172 078 108 65 001 145 100 370 11073 091 107 66 001 144 101 398 10974 106 106 67 001 143 102 427 10875 121 105 68 002 142 103 456 10776 139 104 69 003 141 104 485 10677 158 103 70 003 140 105 515 105

71 004 13972 006 138

Source: Adapted from F. Wilcoxon, S. K. Katti, and R. A. Wilcox (1973), Critical values and probability levels for the Wilcoxon rank sum test and the Wilcoxon signed rank test, Selected Tables in Mathematical Statistics, 1, 172–259, American Mathematical Society, Providence, RI, with permission.


Table K Kruskal-Wallis Test Statistic

Each entry is the smallest value of the Kruskal-Wallis statistic H whose right-tail probability under H0 is less than or equal to the column heading, for k = 3 samples of sizes n1, n2, n3, each ni ≤ 5; — indicates that no such value exists.

n1 n2 n3    .10     .05     .02     .01     .001
2  2  2    4.571     —       —       —       —
3  2  1    4.286     —       —       —       —
3  2  2    4.500   4.714     —       —       —
3  3  1    4.571   5.143     —       —       —
3  3  2    4.556   5.361   6.250     —       —
3  3  3    4.622   5.600   6.489   7.200     —
4  2  1    4.500     —       —       —       —
4  2  2    4.458   5.333   6.000     —       —
4  3  1    4.056   5.208     —       —       —
4  3  2    4.511   5.444   6.144   6.444     —
4  3  3    4.709   5.791   6.564   6.745     —
4  4  1    4.167   4.967   6.667   6.667     —
4  4  2    4.555   5.455   6.600   7.036     —
4  4  3    4.545   5.598   6.712   7.144   8.909
4  4  4    4.654   5.692   6.962   7.654   9.269
5  2  1    4.200   5.000     —       —       —
5  2  2    4.373   5.160   6.000   6.533     —
5  3  1    4.018   4.960   6.044     —       —
5  3  2    4.651   5.251   6.124   6.909     —
5  3  3    4.533   5.648   6.533   7.079   8.727
5  4  1    3.987   4.985   6.431   6.955     —
5  4  2    4.541   5.273   6.505   7.205   8.591
5  4  3    4.549   5.656   6.676   7.445   8.795
5  4  4    4.668   5.657   6.953   7.760   9.168
5  5  1    4.109   5.127   6.145   7.309     —
5  5  2    4.623   5.338   6.446   7.338   8.938
5  5  3    4.545   5.705   6.866   7.578   9.284
5  5  4    4.523   5.666   7.000   7.823   9.606
5  5  5    4.560   5.780   7.220   8.000   9.920

3 B À1

Source: Adapted from R. L. Iman, D. Quade, and D. A. Alexander (1975), Exact probability levels for the Kruskal-Wallis test, Selected Tables in Mathematical Statistics, 3, 329–384, American Mathematical Society, Providence, RI, with permission.
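Entries in this table can be checked by brute-force enumeration of the exact null distribution of H for small samples. The sketch below (the sample sizes 3, 2, 2 are chosen arbitrarily) assigns the ranks 1–7 to the three groups in every possible way and confirms that the tabled .05 entry 4.714 indeed has right-tail probability at most .05:

```python
from itertools import permutations

def kw_h(groups, N):
    # Kruskal-Wallis statistic for tie-free ranks:
    # H = 12/(N(N+1)) * sum(R_i^2 / n_i) - 3(N+1)
    return 12 / (N * (N + 1)) * sum(sum(g) ** 2 / len(g) for g in groups) - 3 * (N + 1)

sizes = (3, 2, 2)
N = sum(sizes)
hs = []
for perm in set(permutations([0] * 3 + [1] * 2 + [2] * 2)):  # 210 rank assignments
    groups = [[r + 1 for r, g in enumerate(perm) if g == i] for i in range(3)]
    hs.append(kw_h(groups, N))

# Tabled entry for (3, 2, 2) at alpha = .05 is 4.714
p = sum(h >= 4.714 - 1e-9 for h in hs) / len(hs)
print(len(hs), p <= 0.05)
```

The same enumeration reproduces any entry of the table, though the number of arrangements grows quickly with the sample sizes.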


Table L Kendall's Tau Statistic

Each entry is the right-tail probability P(T ≥ t) for T, the Kendall tau statistic, n = 3, 4, …, 10; left-tail probabilities follow from symmetry, P(T ≤ −t) = P(T ≥ t). Entries give t followed by its right-tail probability.

n = 3:  1.000 .167;  .333 .500
n = 4:  1.000 .042;  .667 .167;  .333 .375;  .000 .625
n = 5:  1.000 .008;  .800 .042;  .600 .117;  .400 .242;  .200 .408;  .000 .592
n = 6:  1.000 .001;  .867 .008;  .733 .028;  .600 .068;  .467 .136;  .333 .235;  .200 .360;  .067 .500
n = 7:  1.000 .000;  .905 .001;  .810 .005;  .714 .015;  .619 .035;  .524 .068;  .429 .119;  .333 .191;  .238 .281;  .143 .386;  .048 .500
n = 8:  1.000 .000;  .929 .000;  .857 .001;  .786 .003;  .714 .007;  .643 .016;  .571 .031;  .500 .054;  .429 .089;  .357 .138;  .286 .199;  .214 .274;  .143 .360;  .071 .452;  .000 .548
n = 9:  1.000 .000;  .944 .000;  .889 .000;  .833 .000;  .778 .001;  .722 .003;  .667 .006;  .611 .012;  .556 .022;  .500 .038;  .444 .060;  .389 .090;  .333 .130;  .278 .179;  .222 .238;  .167 .306;  .111 .381;  .056 .460;  .000 .540
n = 10: 1.000 .000;  .956 .000;  .911 .000;  .867 .000;  .822 .000;  .778 .000;  .733 .001;  .689 .002;  .644 .005;  .600 .008;  .556 .014;  .511 .023;  .467 .036;  .422 .054;  .378 .078;  .333 .108;  .289 .146;  .244 .190;  .200 .242;  .156 .300;  .111 .364;  .067 .431;  .022 .500


Table L (Continued)

Each entry is the smallest value of t such that P(T ≥ t) ≤ α, for n = 11, 12, …, 30.

n     .100   .050   .025   .010   .005
11    .345   .418   .491   .564   .600
12    .303   .394   .455   .545   .576
13    .308   .359   .436   .513   .564
14    .275   .363   .407   .473   .516
15    .276   .333   .390   .467   .505
16    .250   .317   .383   .433   .483
17    .250   .309   .368   .426   .471
18    .242   .294   .346   .412   .451
19    .228   .287   .333   .392   .439
20    .221   .274   .326   .379   .421
21    .210   .267   .314   .371   .410
22    .203   .264   .307   .359   .394
23    .202   .257   .296   .352   .391
24    .196   .246   .290   .341   .377
25    .193   .240   .287   .333   .367
26    .188   .237   .280   .329   .360
27    .179   .231   .271   .322   .356
28    .180   .228   .265   .312   .344
29    .172   .222   .261   .310   .340
30    .172   .218   .255   .301   .333

Source: For n ≤ 10, adapted from M. G. Kendall (1948, 4th ed. 1970), Rank Correlation Methods, Charles Griffin & Company, Ltd., London, with permission; for 11 ≤ n ≤ 30, adapted from L. Kaarsemaker and A. van Wijngaarden (1953), Tables for use in rank correlation, Statistica Neerlandica, 7, 41–54, with permission.
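For n this small, the exact probabilities above can be reproduced directly by enumerating all n! rankings. A minimal sketch for n = 4 (any n up to about 8 is feasible this way):

```python
from itertools import permutations

def tau(perm):
    # Kendall's tau of a permutation against the identity ranking:
    # (concordant pairs - discordant pairs) / (n choose 2)
    n = len(perm)
    s = sum(1 if perm[j] > perm[i] else -1
            for i in range(n) for j in range(i + 1, n))
    return s / (n * (n - 1) / 2)

n = 4
taus = [tau(p) for p in permutations(range(n))]  # all 24 equally likely rankings
# Right-tail probabilities P(T >= t), matching the n = 4 entries of Table L
for t in (1.0, 2 / 3, 1 / 3, 0.0):
    print(round(t, 3), round(sum(x >= t - 1e-9 for x in taus) / len(taus), 3))
# 1.0 0.042
# 0.667 0.167
# 0.333 0.375
# 0.0 0.625
```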


Table M Spearman's Coefficient of Rank Correlation

Each entry is the right-tail probability P(R ≥ r) for R, the Spearman rank-correlation coefficient, n = 3, 4, …, 10; left-tail probabilities follow from symmetry, P(R ≤ −r) = P(R ≥ r). Entries give r followed by its right-tail probability.

3 1 000 167 7 1 000 000 8 810 011 9 1 000 000500 500 964 001 786 014 983 000

4 1 000 042 929 003 762 018 967 000800 167 893 006 738 023 950 000600 208 857 012 714 029 933 000400 375 821 017 690 035 917 001200 458 786 024 667 042 900 001

000 542 750 033 643 048 883 0025 1 000 008 714 044 619 057 867 002900 042 679 055 595 066 850 003800 067 643 069 571 076 833 004700 117 607 083 548 085 817 005600 175 571 100 524 098 800 007500 225 536 118 500 108 783 009400 258 500 133 476 122 767 011300 342 464 151 452 134 750 013200 392 429 177 429 150 733 016100 475 393 198 405 163 717 018000 525 357 222 381 180 700 022

6 1 000 001 321 249 357 195 683 025943 008 286 278 333 214 667 029886 017 250 297 310 231 650 033829 029 214 331 286 250 633 038771 051 179 357 262 268 617 043714 068 143 391 238 291 600 048657 088 107 420 214 310 583 054600 121 071 453 190 332 567 060543 149 036 482 167 352 550 066486 178 000 518 143 376 533 074429 210 8 1 000 000 119 397 517 081371 249 976 000 095 420 500 089314 282 952 001 071 441 483 097257 329 929 001 048 467 467 106200 357 905 002 024 488 450 115143 401 881 004 000 512 433 125086 460 857 005 417 135029 500 833 008 400 146


Table M (Continued)

9 383 156 10 964 000 10 636 027 10 309 193367 168 952 000 624 030 297 203350 179 939 000 612 033 285 214333 193 927 000 600 037 273 224317 205 915 000 588 040 261 235300 218 903 000 576 044 248 246283 231 891 001 564 048 236 257267 247 879 001 552 052 224 268250 260 867 001 539 057 212 280233 276 855 001 527 062 200 292217 290 842 002 515 067 188 304

200 307 830 002 503 072 176 316183 322 818 003 491 077 164 328167 339 806 004 479 083 152 341150 354 794 004 467 089 139 354133 372 782 005 455 096 127 367117 388 770 007 442 102 115 379100 405 758 008 430 109 103 393083 422 745 009 418 116 091 406067 440 733 010 406 124 079 419050 456 721 012 394 132 067 433033 474 709 013 382 139 055 446017 491 697 015 370 148 042 459000 509 685 017 358 156 030 473

10 1 000 000 673 019 345 165 018 486988 000 661 022 333 174 006 500976 000 648 025 321 184


Table M (Continued)

Each entry is the smallest value of r such that P(R ≥ r) ≤ α, for n = 11, 12, …, 30.

n     .100   .050   .025   .010   .005   .001
11    .427   .536   .618   .709   .764   .855
12    .406   .503   .587   .678   .734   .825
13    .385   .484   .560   .648   .703   .797
14    .367   .464   .538   .626   .679   .771
15    .354   .446   .521   .604   .657   .750
16    .341   .429   .503   .585   .635   .729
17    .329   .414   .488   .566   .618   .711
18    .317   .401   .474   .550   .600   .692
19    .309   .391   .460   .535   .584   .675
20    .299   .380   .447   .522   .570   .660
21    .292   .370   .436   .509   .556   .647
22    .284   .361   .425   .497   .544   .633
23    .278   .353   .416   .486   .532   .620
24    .275   .344   .407   .476   .521   .608
25    .265   .337   .398   .466   .511   .597
26    .260   .331   .390   .457   .501   .586
27    .255   .324   .383   .449   .492   .576
28    .250   .318   .376   .441   .483   .567
29    .245   .312   .369   .433   .475   .557
30    .241   .307   .363   .426   .467   .548

Source: For n ≤ 10, adapted from M. G. Kendall (1948, 4th ed. 1970), Rank Correlation Methods, Charles Griffin & Company, Ltd., London, with permission; for 11 ≤ n ≤ 30, adapted from G. J. Glasser and R. F. Winter (1961), Critical values of the coefficient of rank correlation for testing the hypothesis of independence, Biometrika, 48, 444–448, with permission.



Table N Friedman's Analysis-of-Variance Statistic and Kendall's Coefficient of Concordance

Each entry is the right-tail probability P(S ≥ s) for S, the sum of squares of the deviations of the rank totals from their mean, for n objects ranked in each of k sets of rankings. Entries give s followed by its right-tail probability.

3 2 8 167 3 7 98 000 4 2 20 042 4 4 80 0006 500 96 000 18 167 78 001

3 18 028 86 000 16 208 76 00114 194 78 001 14 375 74 001

8 361 74 003 12 458 72 0024 32 005 72 004 3 45 002 70 003

26 042 62 008 43 002 68 003

24 069 56 016 41 017 66 00618 125 54 021 37 033 64 00714 273 50 027 35 054 62 012

8 431 42 051 33 075 58 0145 50 001 38 085 29 148 56 019

42 008 32 112 27 175 54 03338 024 26 192 25 207 52 03632 039 24 237 21 300 50 05226 093 18 305 19 342 48 05424 124 14 486 17 446 46 06818 182 8 128 000 44 07714 367 126 000 42 094

6 72 000 122 000 40 10562 002 114 000 38 14156 006 104 000 36 15854 008 98 001 34 19050 012 96 001 32 20042 029 86 002 30 24238 052 78 005 26 32432 072 74 008 24 35526 142 72 010 22 38924 184 62 018 20 43218 252 56 03014 430 54 038

50 04742 07938 12032 14926 23624 28518 355

Source: Adapted from M. G. Kendall (1948, 4th ed. 1970), Rank Correlation Methods, Charles Griffin & Company, Ltd., London, with permission.



Table O Lilliefors's Test for Normal Distribution Critical Values

Each entry is the critical value of the Lilliefors test statistic whose right-tail probability is the α given in the column heading.

n      .10    .05    .01    .001
4     .344   .375   .414   .432
5     .320   .344   .398   .427
6     .298   .323   .369   .421
7     .281   .305   .351   .399
8     .266   .289   .334   .383
9     .252   .273   .316   .366
10    .240   .261   .305   .350
11    .231   .251   .291   .331
12    .223   .242   .281   .327
14    .208   .226   .262   .302
16    .195   .213   .249   .291
18    .185   .201   .234   .272
20    .176   .192   .223   .266
25    .159   .173   .202   .236
30    .146   .159   .186   .219
40    .127   .139   .161   .190
50    .114   .125   .145   .173
60    .105   .114   .133   .159
75    .094   .102   .119   .138
100   .082   .089   .104   .121
>100  0.816/√n   0.888/√n   1.038/√n   1.212/√n

Source: Adapted from R. L. Edgeman and R. C. Scott (1987), Lilliefors's tests for transformed variables, Brazilian Journal of Probability and Statistics, 1, 101–112, with permission.


T XY.Z (for Ke dall’s artial a -Correlatio Coef-cie t )

O

. 5 . 1 . 5 . 5

3 1 1 1 14 1 1 1 0 7075 1 0 816 0 802 0 6676 0 866 0 764 0 667 0 6007 0 761 0 712 0 617 0 5278 0 713 0 648 0 565 0 4849 0 660 0 602 0 515 0 443

10 0 614 0 562 0 480 0 413

11 0 581 0 530 0 453 0 38712 0 548 0 505 0 430 0 36513 0 527 0 481 0 410 0 34714 0 503 0 458 0 391 0 33115 0 482 0 439 0 375 0 31716 0 466 0 423 0 361 0 30517 0 450 0 410 0 348 0 29418 0 434 0 395 0 336 0 28419 0 421 0 382 0 326 0 27520 0 410 0 372 0 317 0 26725 0 362 0 328 0 278 0 23530 0 328 0 297 0 251 0 211

Source: Adapted from S. Maghsoodloo (1975), Estimates of the quantiles of Kendall's partial rank correlation coefficient, Journal of Statistical Computation and Simulation, 4, 155–164, and S. Maghsoodloo and C. L. Pallos (1981), Asymptotic behavior of Kendall's partial rank correlation coefficient and additional quantile estimates, Journal of Statistical Computation and Simulation, 13, 41–48, with permission.


Table Q Critical Values of Page's L Statistic

Each entry is the smallest value of L whose right-tail probability under H0 is less than or equal to α, for k treatments and n blocks; — indicates that no value exists.

n    α       k=3    k=4    k=5    k=6    k=7    k=8
2    .001     —      —     109    178    269    388
2    .01      —      60    106    173    261    376
2    .05      28     58    103    166    252    362
3    .001     —      89    160    260    394    567
3    .01      42     87    155    252    382    549
3    .05      41     84    150    244    370    532
4    .001     56    117    210    341    516    743
4    .01      55    114    204    331    501    722
4    .05      54    111    197    321    487    701
5    .001     70    145    259    420    637    917
5    .01      68    141    251    409    620    893
5    .05      66    137    244    397    603    869
6    .001     83    172    307    499    757   1090
6    .01      81    167    299    486    737   1063
6    .05      79    163    291    474    719   1037
7    .001     96    198    355    577    876   1262
7    .01      93    193    346    563    855   1232
7    .05      91    189    338    550    835   1204
8    .001    109    225    403    655    994   1433
8    .01     106    220    393    640    972   1401
8    .05     104    214    384    625    950   1371
9    .001    121    252    451    733   1113   1603
9    .01     119    246    441    717   1088   1569
9    .05     116    240    431    701   1065   1537
10   .001    134    278    499    811   1230   1773
10   .01     131    272    487    793   1205   1736
10   .05     128    266    477    777   1180   1703
11   .001    147    305    546    888   1348   1943
11   .01     144    298    534    869   1321   1905
11   .05     141    292    523    852   1295   1868
12   .001    160    331    593    965   1465   2112
12   .01     156    324    581    946   1437   2072
12   .05     153    317    570    928   1410   2035

Source: Adapted from E. B. Page (1963), Ordered hypotheses for multiple treatments: a significance test for linear ranks, Journal of the American Statistical Association, 58, 216–230, with permission.
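Entries of Table Q for very small designs can be verified by complete enumeration. A minimal sketch for k = 3 treatments and n = 2 blocks (chosen because there are only (3!)² = 36 equally likely outcomes), where L = Σ j·Rj and Rj is the rank total of treatment j:

```python
from itertools import permutations, product

k, n = 3, 2  # treatments, blocks
rankings = list(permutations(range(1, k + 1)))
ls = []
for blocks in product(rankings, repeat=n):  # all 36 pairs of within-block rankings
    rj = [sum(b[j] for b in blocks) for j in range(k)]      # treatment rank totals
    ls.append(sum((j + 1) * rj[j] for j in range(k)))       # Page's L

# Tabled .05 entry for n = 2, k = 3 is L = 28
p = sum(v >= 28 for v in ls) / len(ls)
print(len(ls), round(p, 4))  # 36 0.0278
```

The exact right-tail probability .0278 is indeed at most .05, confirming the tabled critical value.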


Table R Critical Values and Associated Probabilities for the Jonckheere-Terpstra Test

Each entry is the critical value Bα = B(α; k, n1, n2, …, nk) of a Jonckheere-Terpstra statistic B for given α, k, n1, n2, …, nk, such that P(B ≥ Bα | H0) ≤ α. The actual right-tail probability is equal to the value given in parentheses.

k = 3

n1 n2 n3    α=0.5        α=0.2        α=0.1        α=0.05       α=0.025      α=0.01       α=0.005

2 2 2     7(.42222)    9(.16667)   10(.08889)   11(.03333)   12(.01111)       —            —
2 2 3     9(.43810)   12(.13810)   13(.07619)   14(.03810)   15(.01429)   16(.00476)   16(.00476)
2 2 4    11(.44762)   14(.18095)   16(.07143)   17(.03810)   18(.01905)   19(.00714)   20(.00238)
2 2 5    13(.45503)   17(.15344)   19(.06614)   20(.03968)   21(.02116)   23(.00397)   23(.00397)
2 2 6    15(.46032)   19(.18492)   21(.09444)   23(.03968)   24(.02381)   26(.00635)   27(.00238)
2 2 7    17(.46465)   22(.16364)   24(.08788)   26(.04040)   28(.01515)   29(.00808)   30(.00404)
2 2 8    19(.46801)   24(.18855)   27(.08215)   29(.04040)   31(.01684)   33(.00539)   34(.00269)
2 3 3    12(.40000)   15(.15179)   16(.09643)   18(.03036)   19(.01429)   20(.00536)   21(.00179)
2 3 4    14(.45714)   18(.16190)   20(.07381)   21(.04524)   23(.01349)   24(.00635)   25(.00138)
2 3 5    17(.42500)   21(.16944)   23(.08770)   25(.03810)   26(.02302)   28(.00675)   29(.00317)
2 3 6    19(.46645)   24(.17554)   26(.09957)   28(.04957)   30(.02100)   32(.00714)   33(.00368)
2 3 7    22(.44003)   27(.18030)   30(.08232)   32(.04268)   34(.01032)   36(.00732)   37(.00368)
2 3 8    24(.47273)   30(.18430)   33(.09192)   36(.03768)   38(.01810)   40(.00754)   41(.00451)
2 4 4    17(.46254)   21(.19810)   24(.07556)   26(.03206)   27(.01905)   29(.00540)   30(.00254)
2 4 5    20(.46739)   25(.18095)   28(.07662)   30(.03680)   31(.02395)   33(.00880)   34(.00491)
2 4 6    23(.47071)   29(.16797)   32(.07742)   34(.04076)   36(.01898)   38(.00758)   39(.00440)
2 4 7    26(.47366)   32(.19305)   36(.07797)   38(.04406)   40(.02261)   43(.00660)   44(.00408)
2 4 8    29(.47590)   36(.18077)   39(.09879)   42(.04686)   45(.01863)   47(.00892)   49(.00377)
2 5 5    24(.44228)   29(.19000)   32(.09157)   35(.03565)   36(.02453)   39(.00643)   40(.00373)
2 5 6    27(.47403)   33(.19708)   37(.08178)   39(.04715)   41(.02486)   44(.00777)   45(.00491)
2 5 7    31(.45303)   38(.17057)   41(.09355)   44(.04477)   47(.01820)   49(.00894)   51(.00393)
2 5 8    34(.47849)   42(.17764)   46(.08500)   49(.04283)   52(.01885)   54(.00996)   56(.00482)
2 6 6    31(.47662)   38(.18816)   42(.08528)   45(.04027)   47(.02235)   50(.00786)   52(.00343)


2 6 7    35(.47875)   43(.18087)   47(.08803)   50(.04521)   53(.02040)   56(.00789)   58(.00376)
2 6 8    39(.48051)   48(.17491)   52(.09031)   55(.04953)   58(.02449)   62(.00788)   64(.00404)
2 7 7    40(.46130)   48(.18948)   53(.08358)   56(.04543)   59(.02225)   62(.00964)   65(.00360)
2 7 8    44(.48219)   53(.19675)   58(.09468)   62(.04555)   65(.02381)   69(.00857)   71(.00474)
2 8 8    49(.48359)   59(.19248)   64(.09833)   69(.04231)   72(.02319)   76(.00913)   79(.00404)
3 3 3    15(.41548)   18(.19405)   20(.09464)   22(.03690)   23(.02083)   25(.00476)   25(.00476)
3 3 4    18(.42667)   22(.17500)   24(.09310)   26(.04214)   28(.01548)   29(.00857)   30(.00429)
3 3 5    21(.43528)   26(.16147)   28(.09177)   30(.04621)   32(.02002)   34(.00714)   35(.00390)
3 3 6    24(.44210)   29(.18912)   32(.09075)   34(.04946)   36(.02408)   39(.00622)   40(.00357)
3 3 7    27(.44761)   33(.17619)   36(.08989)   39(.03849)   41(.01941)   43(.00868)   45(.00335)
3 3 8    30(.45216)   36(.19843)   40(.08914)   43(.04144)   45(.02259)   48(.00759)   49(.00498)
3 4 4    21(.46779)   26(.18528)   29(.08043)   31(.03974)   33(.01688)   35(.00589)   36(.00320)
3 4 5    25(.44304)   30(.19325)   33(.09481)   36(.03791)   38(.01789)   40(.00732)   41(.00440)
3 4 6    28(.47434)   34(.19973)   38(.08432)   40(.04923)   43(.01865)   45(.00856)   47(.00343)
3 4 7    32(.45344)   39(.17279)   42(.09566)   45(.04644)   48(.01926)   50(.00963)   52(.00435)
3 4 8    35(.47863)   43(.17947)   47(.08672)   50(.04419)   53(.01974)   56(.00752)   58(.00354)
3 5 5    29(.44913)   35(.18365)   38(.09706)   41(.04382)   43(.02324)   46(.00740)   47(.00475)
3 5 6    33(.45405)   40(.17607)   43(.09882)   46(.04890)   49(.02085)   52(.00741)   54(.00328)
3 5 7    37(.45809)   44(.19851)   49(.08220)   52(.04211)   55(.01903)   58(.00740)   60(.00356)
3 5 8    41(.46147)   49(.19057)   54(.08461)   57(.04627)   60(.02284)   64(.00737)   66(.00380)
3 6 6    37(.47913)   45(.18533)   49(.09226)   52(.04855)   55(.02267)   58(.00919)   60(.00459)
3 6 7    42(.46187)   50(.19315)   55(.08704)   58(.04821)   61(.02420)   65(.00807)   67(.00425)

(Continued)


Table R (Continued)

k = 3

n1 n2 n3    α=0.5        α=0.2        α=0.1        α=0.05       α=0.025      α=0.01       α=0.005

3 6 8    46(.48241)   55(.19983)   60(.09770)   64(.04788)   68(.02027)   71(.00950)   74(.00397)
3 7 7    47(.46502)   56(.18868)   61(.09112)   65(.04421)   68(.02337)   72(.00861)   74(.00486)
3 7 8    52(.46769)   62(.18490)   67(.09460)   71(.04917)   75(.02265)   79(.00907)   82(.00410)
3 8 8    57(.48503)   68(.19289)   74(.09197)   79(.04251)   82(.02477)   87(.00869)   90(.00418)
4 4 4    25(.47160)   31(.17558)   34(.08439)   36(.04632)   38(.02286)   40(.00993)   42(.00367)
4 4 5    29(.47465)   36(.16825)   39(.08738)   42(.03873)   44(.02027)   46(.00959)   48(.00402)
4 4 6    33(.47708)   40(.19294)   44(.08984)   47(.04376)   49(.02497)   52(.00931)   54(.00429)
4 4 7    37(.47909)   45(.18488)   49(.09184)   52(.04822)   55(.02244)   58(.00906)   60(.00450)
4 4 8    41(.48077)   50(.17830)   54(.09353)   58(.04204)   61(.02049)   64(.00885)   66(.00468)
4 5 5    34(.45453)   41(.17872)   45(.08177)   48(.03928)   50(.02220)   53(.00815)   55(.00371)
4 5 6    38(.47932)   46(.18750)   50(.09435)   54(.03970)   56(.02382)   59(.00987)   62(.00347)
4 5 7    43(.46215)   51(.19494)   56(.08875)   59(.04959)   63(.01963)   66(.00858)   68(.00459)
4 5 8    47(.48252)   57(.17722)   61(.09919)   65(.04905)   69(.02102)   72(.00998)   75(.00425)
4 6 6    43(.48114)   52(.18307)   56(.09810)   60(.04546)   63(.02287)   67(.00769)   69(.00408)
4 6 7    48(.48267)   58(.17938)   63(.08619)   67(.04174)   70(.02208)   74(.00818)   76(.00463)
4 6 8    53(.48397)   63(.19820)   69(.08972)   73(.04561)   77(.02141)   81(.00859)   84(.00390)
4 7 7    54(.46809)   64(.18800)   69(.09759)   74(.04318)   77(.02426)   82(.00783)   84(.00465)
4 7 8    59(.48519)   70(.19552)   76(.09450)   81(.04441)   85(.02170)   89(.00946)   92(.00466)
4 8 8    65(.48624)   77(.19320)   83(.09869)   88(.04966)   93(.02191)   98(.00831)  101(.00428)
5 5 5    39(.45888)   47(.17478)   51(.08666)   54(.04558)   57(.02136)   60(.00873)   62(.00440)
5 5 6    44(.46248)   52(.19706)   57(.09078)   61(.04151)   64(.02066)   67(.00921)   70(.00360)
5 5 7    49(.46569)   58(.19200)   63(.09430)   67(.04665)   71(.02008)   74(.00960)   77(.00413)
5 5 8    54(.46806)   64(.18774)   69(.09734)   74(.04300)   77(.02413)   81(.00992)   84(.00461)
5 6 6    49(.48280)   59(.18118)   64(.08787)   68(.04301)   71(.02299)   75(.00868)   77(.00498)
5 6 7    55(.46829)   65(.18952)   70(.09906)   75(.04427)   79(.02042)   83(.00824)   85(.00494)
5 6 8    60(.48527)   71(.19681)   77(.09575)   82(.04535)   86(.02235)   90(.00985)   93(.00490)
5 7 7    61(.47066)   72(.18741)   78(.09019)   82(.04981)   86(.02499)   91(.00904)   94(.00447)
5 7 8    67(.47271)   79(.18559)   85(.09438)   90(.04739)   94(.02489)   99(.00977)  103(.00410)


5 8 8    73(.48727)   86(.19344)   93(.09319)   98(.04917)  103(.02322)  108(.00968)  112(.00434)
6 6 6    55(.48418)   66(.17959)   71(.09285)   75(.04897)   79(.02306)   83(.00954)   86(.00447)
6 6 7    61(.48536)   72(.19831)   78(.09721)   83(.04645)   87(.02311)   92(.00827)   95(.00406)
6 6 8    67(.48638)   79(.19561)   86(.08914)   91(.04436)   95(.02313)  100(.00899)  103(.00471)
6 7 7    68(.47285)   80(.18689)   86(.09563)   91(.04835)   96(.02153)  101(.00829)  104(.00432)
6 7 8    74(.48733)   87(.19456)   94(.09426)  100(.04351)  104(.02380)  110(.00829)  113(.00454)
6 8 8    81(.48816)   95(.19364)  102(.09885)  108(.04873)  113(.02437)  119(.00923)  123(.00439)
7 7 7    75(.47473)   88(.18643)   95(.08944)  100(.04711)  105(.02225)  110(.00929)  114(.00418)
7 7 8    82(.47637)   96(.18602)  103(.09414)  109(.04605)  114(.02288)  120(.00859)  123(.00494)
7 8 8    89(.48892)  104(.19380)  112(.09402)  118(.04834)  124(.02209)  130(.00885)  134(.00443)
8 8 8    97(.48960)  113(.19393)  121(.09891)  128(.04798)  134(.02310)  140(.00992)  145(.00445)

k = 4

n1 n2 n3 n4    α=0.5        α=0.2        α=0.1        α=0.05       α=0.025      α=0.01       α=0.005

2 2 2 2     13(.45079)    16(.19286)    18(.08294)    19(.04841)    21(.01230)    22(.00516)    23(.00159)
3 3 3 3     28(.47240)    34(.18229)    37(.09067)    40(.03744)    42(.01834)    44(.00797)    45(.00498)
4 4 4 4     49(.48174)    58(.19096)    63(.08950)    67(.04198)    70(.02150)    73(.00998)    76(.00414)
5 5 5 5     76(.48679)    89(.18455)    95(.09621)   100(.04983)   105(.02296)   110(.00928)   114(.00404)
6 6 6 6    109(.48987)   126(.18631)   134(.09607)   141(.04743)   147(.02336)   154(.00894)   158(.00481)

(Continued)


Table R (Continued)

k = 5

n1 n2 n3 n4 n5    α=0.5        α=0.2        α=0.1        α=0.05       α=0.025      α=0.01       α=0.005

2 2 2 2 2     21(.46466)    26(.16246)    28(.08779)    30(.04116)    32(.01623)    33(.00939)    35(.00257)
3 3 3 3 3     46(.48020)    54(.19822)    59(.08738)    62(.04752)    65(.02335)    69(.00755)    71(.00392)
4 4 4 4 4     81(.48695)    94(.18756)   100(.09910)   106(.04523)   110(.02450)   116(.00839)   119(.00455)
5 5 5 5 5    126(.49058)   144(.19032)   153(.09542)   160(.04970)   167(.02318)   174(.00958)   179(.00470)
6 6 6 6 6    181(.49279)   204(.19719)   216(.09842)   226(.04884)   235(.02292)   244(.00969)   251(.00456)

k = 6

n1 n2 n3 n4 n5 n6    α=0.5        α=0.2        α=0.1        α=0.05       α=0.025      α=0.01       α=0.005

2 2 2 2 2 2     31(.47293)    37(.18713)    40(.09533)    43(.04083)    45(.02071)    47(.00944)    49(.00379)
3 3 3 3 3 3     69(.46981)    80(.18058)    85(.09686)    90(.04524)    94(.02201)    98(.00958)   101(.00473)
4 4 4 4 4 4    121(.49006)   138(.19092)   147(.09181)   154(.04567)   160(.02274)   167(.00888)   171(.00486)
5 5 5 5 5 5    189(.48567)   212(.19377)   224(.09679)   234(.04775)   242(.02477)   252(.00964)   259(.00456)
6 6 6 6 6 6    271(.49452)   302(.19304)   317(.09982)   330(.04490)   342(.02361)   354(.00999)   363(.00485)

Because of the symmetry of the distribution of B under H0, critical values and associated probabilities for all possible combinations of sample sizes from 2 to 8 from 3 populations can be obtained from the table. For example, if n1 = 4, n2 = 5, and n3 = 2, the required values are given by the table entry for n1 = 2, n2 = 4, and n3 = 5.

Source: Adapted from Tables 1 and 2 of R. E. Odeh (1971), On Jonckheere's k-sample test against ordered alternatives, Technometrics, 13, 912–918, with permission.
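Any of the smaller Table R entries can be checked by enumerating the exact null distribution of B. A minimal sketch for k = 3 and n1 = n2 = n3 = 2, where the 90 equally likely assignments of the ranks 1–6 to groups reproduce the tabled probabilities for Bα = 11 and 12:

```python
from itertools import permutations

labels = [0, 0, 1, 1, 2, 2]  # group membership of the ordered ranks 1..6
arrangements = set(permutations(labels))  # 6!/(2!2!2!) = 90 arrangements under H0

def jt_b(arr):
    # B = sum over group pairs g < h of the Mann-Whitney count U_gh,
    # i.e. pairs where the smaller observation belongs to the lower-indexed group
    m = len(arr)
    return sum(1 for i in range(m) for j in range(i + 1, m) if arr[i] < arr[j])

bs = [jt_b(a) for a in arrangements]
for crit in (11, 12):
    print(crit, round(sum(b >= crit for b in bs) / len(bs), 5))
# 11 0.03333
# 12 0.01111
```

These match the (2, 2, 2) entries 11(.03333) and 12(.01111) in the k = 3 table.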


Table S Rank von Neumann Statistic

Exact left-tail and right-tail probabilities of NM for n ≤ 10, and left-tail critical values of the rank von Neumann (RVN) statistic for α = 0.005, 0.01, 0.025, 0.05, 0.10 and 10 ≤ n ≤ 100. Since the null distribution of RVN is approximately symmetric about 2, a right-tail critical value is 4 minus the tabled left-tail value; for example, for n = 40 and α = 0.005, the critical values of RVN are 1.22 and 2.78.

n     NM    P (left tail)     NM    P (right tail)

4      3   .0833     17   .0833
       6   .2500     14   .2500
5      4   .0167     35   .0333
       7   .0500     33   .0667
      10   .1333     30   .1333
6      5   .0028     65   .0028
       8   .0083     63   .0083
      11   .0250     62   .0139
      14   .0472     60   .0194
      16   .0750     59   .0306
      17   .0806     56   .0361
      19   .1306     55   .0694
                     52   .0972
                     51   .1139
7     14   .0048    101   .0040
      15   .0079    100   .0056
      17   .0119     98   .0087
      18   .0151     97   .0103
      20   .0262     93   .0206
      24   .0444     92   .0254
      25   .0563     88   .0464
      31   .0988     87   .0536
      32   .1155     81   .0988
                     80   .1115
8     23   .0049    149   .0043
      24   .0073    148   .0052
      26   .0095    144   .0084
      27   .0111    143   .0105
      32   .0221    136   .0249
      33   .0264    135   .0286
      39   .0481    129   .0481
      40   .0529    128   .0530
      48   .0978    120   .0997

7

Page 639: Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

8/15/2019 Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

http://slidepdf.com/reader/full/gibbons-chakraboti-nonparametric-statistical-inference-fourth-edition 639/675

ð Þ

M M

49 0 1049 119 0 10749 34 0 0045 208 0 0046

35 0 0055 207 0 005340 0 0096 202 0 009141 0 0109 201 0 010449 0 0236 191 0 024550 0 0255 190 0 026259 0 0486 181 0 049960 0 0516 180 0 0528

71 0 0961 169 0 097872 0 1010 168 0 1030

10 51 0 0050 282 0 004659 0 0100 281 0 005172 0 0242 273 0 009773 0 0260 272 0 010385 0 0493 259 0 024086 0 0517 258 0 0252

101 0 0985 246 0 0475102 0 1017 245 0 0504

229 0 0990228 0 1023

RV

. 5 . 1 . 5 . 5 .1

10 0 62 0 72 0 89 1 04 1 2311 0 67 0 77 0 93 1 08 1 26

12 0 71 0 81 0 96 1 11 1 2913 0 74 0 84 1 00 1 14 1 3214 0 78 0 87 1 03 1 17 1 34

15 0 81 0 90 1 05 1 19 1 3616 0 84 0 93 1 08 1 21 1 3817 0 87 0 96 1 10 1 24 1 4018 0 89 0 98 1 13 1 26 1 4119 0 92 1 01 1 15 1 27 1 4320 0 94 1 03 1 17 1 29 1 44

8

Page 640: Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

8/15/2019 Gibbons & Chakraboti - Nonparametric Statistical Inference - Fourth-Edition

http://slidepdf.com/reader/full/gibbons-chakraboti-nonparametric-statistical-inference-fourth-edition 640/675

ð Þ RV

. 5 . 1 . 5 . 5 .1

21 0 96 1 05 1 18 1 31 1 4522 0 98 1 07 1 20 1 32 1 4623 1 00 1 09 1 22 1 33 1 4824 1 02 1 10 1 23 1 35 1 4925 1 04 1 12 1 25 1 36 1 5026 1 05 1 13 1 26 1 37 1 5127 1 07 1 15 1 27 1 38 1 5128 1 08 1 16 1 28 1 39 1 5229 1 10 1 18 1 30 1 40 1 53

30 1 11 1 19 1 31 1 41 1 5432 1 13 1 21 1 33 1 43 1 5534 1 16 1 23 1 35 1 45 1 5736 1 18 1 25 1 36 1 46 1 58

38 1 20 1 27 1 38 1 48 1 5940 1 22 1 29 1 39 1 49 1 6042 1 24 1 30 1 41 1 50 1 6144 1 25 1 32 1 42 1 51 1 6246 1 27 1 33 1 43 1 52 1 6348 1 28 1 35 1 45 1 53 1 6350 1 29 1 36 1 46 1 54 1 6455 1 33 1 39 1 48 1 56 1 6660 1 35 1 41 1 50 1 58 1 6765 1 38 1 43 1 52 1 60 1 6870 1 40 1 45 1 54 1 61 1 70

75 1 42 1 47 1 55 1 62 1 7180 1 44 1 49 1 57 1 64 1 7185 1 45 1 50 1 58 1 65 1 7290 1 47 1 52 1 59 1 66 1 7395 1 48 1 53 1 60 1 66 1 74

100 1 49 1 54 1 61 1 67 1 74

100 a 1 48 1 53 1 61 1 67 1 74100 b 1 49 1 54 1 61 1 67 1 74

a ð2 4 Þb ½2 20 ð5 þ7Þ: R B 1982 ’ 40–46
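The quantities tabled above can be sketched in code (function and variable names are mine, not the book's): NM is the sum of squared differences of successive ranks, RV = NM / [n(n² − 1)/12], and the footnote-b approximation treats RV as normal with mean 2 and variance 20/(5n + 7). Midranks for ties are an assumption here.

```python
import math

def ranks(x):
    """Midranks of x (tied values receive their average rank)."""
    order = sorted(range(len(x)), key=lambda i: x[i])
    r = [0.0] * len(x)
    i = 0
    while i < len(x):
        j = i
        while j + 1 < len(x) and x[order[j + 1]] == x[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average of positions i+1 .. j+1
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def rank_von_neumann(x):
    """NM = sum of squared successive rank differences;
    RV = NM / [n(n^2 - 1)/12], centered near 2 under randomness."""
    r = ranks(x)
    n = len(x)
    nm = sum((r[i + 1] - r[i]) ** 2 for i in range(n - 1))
    rv = nm / (n * (n * n - 1) / 12)
    return nm, rv

def rv_z_stat(rv, n):
    """Standardized RV using footnote b: RV ~ N(2, 20/(5n + 7))."""
    return (rv - 2) / math.sqrt(20 / (5 * n + 7))

nm, rv = rank_von_neumann([3, 1, 4, 1, 5, 9, 2, 6])
print(nm, rv, rv_z_stat(rv, 8))  # NM = 96.0, RV ≈ 2.29
```

Small RV (large positive serial correlation of ranks) falls in the left tail tabled above; large RV falls in the symmetric right tail.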


Table: quantiles of Lilliefors's test statistic. [Table title and column significance-level labels are illegible in this scan; each row gives n followed by four critical values.]

n
4      .444   .483   .556   .626
5      .405   .443   .514   .585
6      .374   .410   .477   .551
7      .347   .381   .444   .509
8      .327   .359   .421   .502
9      .310   .339   .399   .460
10     .296   .325   .379   .444
11     .284   .312   .366   .433
12     .271   .299   .350   .412
14     .252   .277   .325   .388
16     .237   .261   .311   .366
18     .224   .247   .293   .328
20     .213   .234   .279   .329
25     .192   .211   .251   .296
30     .176   .193   .229   .270
40     .153   .168   .201   .241
50     .137   .150   .179   .214
60     .125   .138   .164   .193
75     .113   .124   .146   .173
100    .098   .108   .127   .150
>100   .980/√n   1.077/√n   1.274/√n   1.501/√n

Source: Adapted from a 1987 paper, vol. 1, 101–112, with permission. [Author names and journal title illegible in this scan.]
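My reading of this table (an assumption, since the title is illegible) is that it covers a Lilliefors-type test for the exponential distribution with unknown mean, where the asymptotic critical values c/√n match the last row. A minimal sketch of the statistic, with a hypothetical function name:

```python
import math

def lilliefors_exponential_stat(x):
    """Kolmogorov-Smirnov-type statistic with the exponential mean
    estimated from the sample: D = sup |F_n(t) - F_hat(t)|, where
    F_hat(t) = 1 - exp(-t / xbar) and F_n is the empirical cdf."""
    n = len(x)
    xbar = sum(x) / n
    xs = sorted(x)
    d = 0.0
    for i, t in enumerate(xs):
        f = 1 - math.exp(-t / xbar)
        # compare the fitted cdf against the EDF just after and
        # just before each order statistic
        d = max(d, (i + 1) / n - f, f - i / n)
    return d

x = [0.2, 0.5, 0.9, 1.4, 2.8, 4.1]   # illustrative data
d = lilliefors_exponential_stat(x)   # D ≈ 0.150 for this sample
# large-sample comparison value of the form c / sqrt(n), with c read
# from the last row of the table (1.077 at one of the tabled levels)
print(d, 1.077 / math.sqrt(len(x)))
```

The observed D would be compared with the tabled quantile for the given n, rejecting exponentiality when D is too large.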


[Answers to selected problems for Chapters 2 through 14 appear here in the printed edition; the mathematical expressions are illegible in this scan, as variable names, operators, and decimal points were lost in extraction.]

References

[The reference list is illegible in this scan: author surnames, article titles, and journal names were lost in extraction, leaving only initials, years, and page ranges. Two entries cited in the appendix tables are:

Bartels, R. (1982), The rank version of von Neumann's ratio test for randomness, Journal of the American Statistical Association, 77, 40–46.

Odeh, R. E. (1971), On Jonckheere's k-sample test against ordered alternatives, Technometrics, 13, 912–918.]

Index

Analysis of variance by ranks (see Friedman’s test; Kruskal-Wallis k-sample test)
Ansari-Bradley scale test, 325–329
  ARE, 350
  consistency of, 337
  definition and rationale, 326
  distribution theory, 326–327
  recursive relation, 326–328
  rejection regions, 326
  relation with other scale tests, 329
  table references, 327
ANOVA
  one-way, 354
  two-way, 450–451
ARE (asymptotic relative efficiency)
  calculation, examples of
    Ansari-Bradley test, 514–517
    Mann-Whitney test, 509–513
    median test, 509–513
    Mood test, 514–517
    sign test, 503–509
    van der Waerden test, 512–513
    Wilcoxon signed-rank test, 503–509
  definition, 26
  of control median test, 265–266
  of Freund-Ansari-Bradley-Barton-David tests, 350
  of Klotz normal-scores test, 350
  of Kruskal-Wallis test, 392–393


[ARE]
  of Mann-Whitney test, 278–279
  of median test, 262
  of median test extension, 363
  of Mood test, 350
  of percentile modified rank test:
    for location, 314
    for scale, 350
  of Siegel-Tukey test, 330
  of sign test, 223
  of Sukhatme test, 337
  of Terry-Hoeffding test, 314
  of van der Waerden test, 314
  of Wilcoxon rank-sum test, 301
  of Wilcoxon signed-rank test, 223
  theory for, 494–503
Associated probability, 23
Association
  bivariate population parameters of
    concordance, 402
  contingency table measures of, 526–527
  criteria for, 401
  expected rank correlation, 434–437
  grade correlation, 437
  k-sample measures of, concordance coefficient
    applications, 473–476, 481–483
    complete rankings, 466–476
    incomplete rankings, 476–483
  product-moment correlation, 11
  tau, 402
  two-sample measures of
    applications, 438–443
    Fieller coefficient, 437–438
    product-moment correlation coefficient, 11
    rank correlation coefficient, 422–431
    tau coefficient, 405–420
Asymptotic relative efficiency (see ARE)

Bahadur efficiency, 518
Balanced incomplete block design, 452
Barton-David scale test (see Ansari-Bradley scale test)
Bernoulli distribution, 13
Beta distribution, 15
Beta function, 16
Binomial distribution, 13
Bivariate population, association measures
  concordance, 402
  correlation, product-moment, 11
  covariance, 11
  expected rank correlation, 434–437
  grade correlation, 437
  tau, 402–404
Block frequency, 67
Brown-Mood median test (see Median test extension)

Central Limit Theorem, 18
Chebyshev’s inequality, 17–18
Chernoff-Savage theorem, 290–293
Chi-square distribution, 15
Chi-square goodness-of-fit test, 104–111
  applications, 108–111
  combining cells, 107
  compared with K-S test, 147–150
  for composite hypothesis, 108
  grouping measurement data, 108
Chi-square test
  for goodness of fit, 104–111
  for independence in contingency tables, 521–529
  for proportions, 390–392, 529–530
  table, 555
Coefficient of disarray, 410, 442–443
Comparisons with a control, 383–390
Composite hypothesis, 20
Computer simulations, 21
Concordance, bivariate populations
  definition, 400
  estimate of, 404–405
  probability of, 404
  second order, 434–436


Concordance coefficient
  complete rankings, 466–476
    applications, 473–476
    definition and rationale, 467–468
    distribution theory, 469–473
    inference, 469–471
    relation with average of rank correlations, 469
    table, 598
    ties, 473
  incomplete rankings, 476–483
    applications, 481–483
    definition and rationale, 476
    design specifications, 476–477
    distribution theory, 479–481
    inference applications, 481–483
    multiple comparisons, 482
    ties, 481
Confidence bands for cdf, 124–125
Confidence interval
  for location parameter, two-sample case
    Mann-Whitney test approach, 275–277
    median test approach, 257–259
    Wilcoxon rank-sum test approach, 302
  for median, one sample and paired-sample cases
    sign test approach, 179–180
    Wilcoxon signed-rank test approach, 211–215
  for quantile, 157–163
  for scale parameter, two-sample case, 337–338, 345–348
  for tau, 418
Conservative test, 196
Consistency
  definition, 24
  of Ansari-Bradley test, 337
  of control median test, 263
  of Jonckheere-Terpstra test, 379
  of Kendall’s tau, 414–416
  of Kolmogorov-Smirnov goodness-of-fit test, 120
[Consistency]
  of Kolmogorov-Smirnov two-sample test, 241
  of Mann-Whitney test, 270
  of Siegel-Tukey test, 334
  of sign test, 170
  of Sukhatme test, 334
  of Wald-Wolfowitz runs test, 239
  of Wilcoxon rank-sum test, 301
  of Wilcoxon signed-rank test, 200
Consistent test
  definition, 24
  general criteria for, 25
Contingency coefficient, 526–527
Contingency tables
  applications, 527–529
  association measures, 526–527
  definition, 521–522
  special results for 2 × 2 table, 529–532
  test of independence in, 522–526
Continuity correction, 29–30
Continuous random variable, 10
Control median test
  k samples, 360–363
  two samples, 262–268
Correction for continuity, 29–30
Correlation (see also Association)
  between ranks and variate values, 192–194
  partial, 483–488
  product-moment, 11, 191, 400
  rank, 422–431
Coverages, 66–68
Cramér-von Mises statistics, 152
Critical region, 20
Critical value, 20
Cumulative distribution function, 9
Cumulative distribution function test (see Goodness-of-fit tests)
Curtailed sampling, 264–265

Daniels’ test for trend, 431
David-Barton test (see Barton-David scale test)


Disarray, coefficient of, 410, 442–443
Discordance, bivariate populations
  definition, 400
  estimate of, 404–405
  probability of, 404
Discrete random variable, 10
Discrete uniform distribution, 14
Dispersion alternative (see Scale alternative)
Dispersion tests (see Scale tests)
Distribution-free, definition, 3
Double exponential distribution, 15, 507
Durbin test, 481

Efficacy (see ARE)
Efficiency (see ARE)
Empirical distribution function, 37–39
  consistency of, 39
  definition, 37
  distribution theory, 38–39
  moments, 39
  uniform convergence of, 39
Empirical quantile function, 40
Empty block test, 72
Estimation, 18–19
Exact α, 28
Exact test, 21
Exceedance statistics, 72–73
Expected normal-scores test (Terry-Hoeffding location test), 307–312
Expected value, 10
Exponential distribution, 15

Fieller measure of association, 437–438
Fisher’s exact test, 532–537
Fisher-Yates test, 307–312
Fisher’s F distribution, 12
Freund-Ansari-Bradley scale test (see Ansari-Bradley scale test)
Friedman’s test, 453–462
  applications, 458–462
  definition and rationale, 453–458
  distribution theory, 454–458
[Friedman’s test]
  multiple comparisons, 459
  table, 598
  table references, 455

Gamma distribution, 14
Gamma function, 10, 16
Geometric distribution, 14
Glivenko-Cantelli theorem, 30
Goodman-Kruskal coefficient, 420
Goodness-of-fit tests
  chi-square, 104–111
  Cramér-von Mises, 152
  Kolmogorov-Smirnov, 111–130
  Lilliefors’ tests, 130–143
  relative merits, 147–150
  visual analysis of, 143–147
Grade correlation coefficient, 437

Hypergeometric distribution, 13

Identical population tests (see Chi-square test for proportions; Jonckheere-Terpstra test; Kruskal-Wallis k-sample test; k-sample rank statistic, generalized; Median test extension; Page’s test; Two-sample tests)
Incomplete beta integral, 42
Independence, tests of
  applications, 438–443
  concordance coefficient
    complete rankings, 466–476
    incomplete rankings, 476–483
  in contingency tables, 522–526
  rank correlation, 428–431
  tau, 410–418
Interquartile range test (Westenberg scale test), 338–339
Invariance, 20
Inverse-normal-scores tests
  Klotz scale test, 331–332
  k-sample location test, 373–375
  van der Waerden location test, 309–312


Jacobians, method of, 16–17
Jonckheere-Terpstra test, 376–383

Kamat scale test, 339
Kendall’s coefficient of concordance (see Concordance coefficient)
Kendall’s tau coefficient (see Tau)
Klotz normal-scores scale test, 331–332
Kolmogorov-Smirnov goodness-of-fit test
  applications, 120–130
  compared with chi-square test, 147–150
  consistency of, 120
  discrete populations, 129
Kolmogorov-Smirnov one-sample statistics
  applications, 120–130
  confidence bands, 124–125
  consistency of, 120
  definition, 112
  distribution-free property, 112–113
  distribution theory, 112–120
  goodness-of-fit test, 111–123
  one-sided tests, 112–113, 122–123
  sample size determination, 125–129
  table, 576
Kolmogorov-Smirnov two-sample test, 239–246
  applications, 245–246
  consistency of, 241
  definition and rationale, 240
  distribution theory, 241–243
  recursive relation, 242
  rejection regions, 240–241, 244
  table, 581–583
k-related sample problem, definition, 353–355
Kruskal-Wallis k-sample test, 363–373
  applications, 368–373
  ARE, 392–393
  definition and rationale, 363–366
  distribution theory, 365–366
  multiple comparisons, 367–368
[Kruskal-Wallis k-sample test]
  table, 392
  table references, 366
  ties, 366–367
k-sample median test (see Median test extension)
k-sample rank statistic, generalized, 373–375
k-sample tests (see Chi-square test for proportions; Jonckheere-Terpstra test; Kruskal-Wallis k-sample test; k-sample rank statistic, generalized; Median test extension; Page’s test)

Laplace distribution, 15
Lehmann alternative, 235
Length-of-longest-run test, 87–90
Lengths of runs, distribution of, 88–89
Likelihood function, 19
Likelihood-ratio test, 22
Lilliefors’s test for exponential, 133–143
Lilliefors’s test for normality, 130–133
Linear combinations, moments of, 11–12
Linear rank statistics
  definition, 283–285
  distribution theory
    asymptotic, 290–293
    exact null, 285–290
    moments, null, 285–287
    symmetry properties, 288–290
  usefulness, general, 294–295
Location alternative, two-sample
  definition, 296–297
  distribution model, 296
  tests useful for (see Control median test; Mann-Whitney location test; Median test; Percentile modified rank test for location; Terry-Hoeffding location test; van der Waerden location test; Wilcoxon rank-sum test)


Location modelone-sample, 296–297two-sample, 232–233 -sample, 354

Location-scale model, 233Location tests

-sampleKruskal-Wallis test, 363–373 -sample rank statistic,

generalized, 373–375median test extension, 355–360

one-samplesign test, 168–179Wilcoxonsigned-ranktest,196–221

two-samplecontrol median test, 262–268distribution model, 232–233Mann-Whitney test, 268–279median test, 247–262percentile modied rank test,

312–314Terry-Hoeffding test, 307–309van der Waerden test, 309–311Wilcoxon rank-sum test, 298–307

Logistic distribution, 15

Mann test for trend, 422Mann-Whitney location test, 268–279

ARE, 278–279condence-interval procedure,

275–276consistency of, 270denition and rationale, 268–270distribution theory, 269–272equivalence with Wilcoxon rank-

sum test, 300–301power, 279recursive relation, 272–273rejection regions, 270–271sample size determination,

276–278table references, 273ties, 273–275

Maximum-likelihood estimate, 19–20McNemar’s test, 537–543

Mediandistribution of, 50–52tests for ( Location tests)

Median test ARE, 262condence-interval procedure,

257–259denition and rationale, 247–250distribution theory, 247–250power, 260–262rejection regions, 251–252table references, 252ties, 253

Median test extension, 355–360Midrange, 33Midranks, 195Moment-generating functions

denition, 11table, 13–15

Moments, denition, 11Monte Carlo techniques, 6Mood scale test, 323–325

  ARE, 325
  definition and rationale, 323–325
  distribution theory, 323–325

Multinomial distribution, 13
Multinomial test, 543–545
Multiple comparisons

  one-way ANOVA, 367–368
  two-way ANOVA, 459, 482

Natural significance level, 28
Nominal α, 27
Nonparametric procedures, advantages of, 4–7
Nonparametric statistics, definition, 3–4
Nonrandomness, tests sensitive to

  length-of-longest-run test, 87–90
  number-of-runs test, 78–86
  rank correlation, 428–431
  rank von Neumann, 97–98
  runs up and down, 90–97
  tau, 410–418

Normal distribution, 14


Normal-scores test
  Klotz scale test, 331–332
  k-sample location test, 373–375
  Terry-Hoeffding location test, 309–312
  van der Waerden location test, 309–312
Null distribution, 20
Number-of-runs test, 78–86

  applications, 85–86
  distribution theory, 78–84
  moments, 81–83
  table, 568–572
  table references, 81

Number-of-runs-up-and-down test, 90–95
  applications, 95–96
  table, 573–578

One-sample coverages, 66–67
One-sample test

  for goodness-of-fit (see Chi-square goodness-of-fit test; Kolmogorov-Smirnov one-sample statistics; Lilliefors test)

  for median (see Sign test; Wilcoxon signed-rank test)

  for randomness (see Length-of-longest-run test; Number-of-runs test; Rank correlation test for independence; rank von Neumann test; Runs up and down; Tau test for independence)

Order statistics
  applications, 33
  confidence-interval estimate of quantiles, 157–163
  coverages, 66–68
  definition, 32
  distribution theory
    asymptotic, 57–63
    exact, 40–56
  moments
    asymptotic, 57–60
    exact, 53–57
  tests for quantile value, 163–168
  tolerance limits, 64–65
Ordered alternatives, 376, 462

Page’s test, 463–466
Paired-sample tests for median difference
  sign test, 180–182
  Wilcoxon signed-rank test, 215–216
Paired samples, measures of association
  applications, 438–443
  Fieller coefficient, 437–438
  product-moment correlation coefficient, 422
  rank correlation coefficient, 423–424
  tau coefficient, 405

Parameter, definition, 1
Parametric statistics, definition, 3
Partial correlation coefficient, 483–488
Partial tau, 483–488
Pearson product-moment correlation coefficient, 191, 401
Percentile modified rank test
  for location, 312–314
  for scale, 332–333
Permutation distribution, 288
Phi coefficient, 527
Pitman efficiency, 26
Pivotal statistic, 19
Placement, 69, 73
Point estimate, 18
Positive variables, method of, 347–348
Power, 6, 21
Power efficiency, 26
Precedence statistics, 73
Precedence test, 259–260


Prob value, 23
Probability distribution, 10
Probability functions
  definition, 10
  table of, 13–15
Probability-integral transformation, 42–43
Probability mass function, 10
Proportions, test for equality of, 390–392, 529–530
P value, 23

Quantile function, 34
Quantiles
  confidence interval for, 157–163
  definition, 34
  tests of hypotheses for, 163–168

Quartile, 34–35

r coverage, 67
Random sample, 11
Random variable, definition, 9
Randomized decision rule, 28
Randomized test, 27–29
Randomness, tests of
  length-of-longest-run test, 87–90
  number-of-runs test, 78–86
  rank correlation, 428–431
  rank von Neumann, 97–98
  runs up and down, 90–97
  tau, 410–418

Range, distribution of, 52–53
Rank, definition, 68
Rank correlation coefficient, 422–431
  applications, 438–443
  definition, 423–424
  distribution theory, 424–428
  independence, use in testing, 428–431
  population parameter analogy, 434–437
  properties, 424
  relation with sample tau, 432–433
  table, 595–597
  table references, 426
  ties, 429–431
  trend, use in testing, 431

Rank correlation test for independence
  applications, 438–443
  procedure, 428–431

Ranklike tests, 340–341
Rank-order statistics
  correlation between ranks and variate values, 191–194
  definition, 189
  expected normal scores, 308–309
  inverse normal scores, 309–310
  normal scores, 308
  ranks, 191
  ties, methods of resolving, 194–196
Rank statistic, 191
Rank-sum test (see Wilcoxon rank-sum test)
Rank transformation, 303–304
Rank von Neumann test, 97–98
Rejection region, definition, 20
Repeated measures design, 452
Rho (Pearson product-moment correlation coefficient), 11, 191, 400

Robustness, 6
Rosenbaum scale test, 339
Runs
  definition, 76
  tests based on
    length-of-longest run, 87–90
    number-of-runs, 78–80
    runs up and down, 90–97
    Wald-Wolfowitz runs test, 235–239
Runs up and down
  applications, 96
  definition, 90
  distribution theory, 91–94
  table, 573
  table references, 94


Sample distribution function (see Empirical distribution function)
Sample size determination
  Mann-Whitney test, 276–278
  sign test, 178–179
  signed-rank test, 208–211
  Wilcoxon rank-sum test, 314

Sample space, 9
Scale alternative, two-sample problem
  definition, 321–322
  distribution model, 320–322
  tests useful for (see Ansari-Bradley scale test; Kamat scale test; Klotz normal-scores scale test; Mood scale test; Percentile modified rank test for scale; Rosenbaum scale test; Siegel-Tukey scale test; Sukhatme scale test; Westenberg scale test)

Scale model, 233, 320–322
Scale parameter, 320–322
Scale tests, two-sample
  applications, 340–348
  distribution model, 320–322
  Freund-Ansari-Bradley-Barton-David tests, 325
  Kamat test, 339
  Klotz normal-scores test, 331–332
  locations unequal, 349
  Mood test, 323–325
  percentile modified rank test, 332–333
  Rosenbaum test, 339
  Siegel-Tukey test, 329–331
  Sukhatme test, 333–335
  Westenberg test, 339

Second precedence test, 262
Shift alternative (see Location alternative)
Shift model (see Location model)
Siegel-Tukey scale test, 329–331


  ARE, 223
  consistency of, 334
  definition and rationale, 330–331
  relation with other scale tests, 330–331
Signed-rank test (see Wilcoxon signed-rank test)
Significance level, 20
Significance probability, 23
Sign test, 168–189

  applications, 182–189
  ARE, 223
  compared with Wilcoxon signed-rank test, 222–223
  confidence-interval procedure, 179–180
  consistency of, 170
  definition and rationale, 168–169
  distribution theory, 169–171
  paired-sample procedure, 180–182
  power, 171–178
  rejection regions, 169–170, 182–183
  sample size determination, 178–179
  table, 577
  zero differences, 171

Simple hypothesis, 20
Simulated power
  sign test, 176–178
  signed-rank test, 205–208

Size of a test, 20
Smirnov test (see Kolmogorov-Smirnov two-sample test)
Snedecor’s F distribution, 12
Spearman’s rho (see Rank correlation coefficient)
Standard normal distribution, 12
Statistics
  descriptive, 1
  inductive, 2
Stirling’s formula, 62
Stochastically larger (smaller), 234
Student’s t distribution, 12
Sufficiency, 18


Sukhatme scale test, 333–337
  ARE, 337
  confidence-interval procedure, 337–338
  consistency of, 334
  definition and rationale, 333
  distribution theory, 333–334
  medians unknown, 336
  rejection regions, 334
  relation with other scale tests, 336–337
  table references, 335
  ties, 336
Supremum, 20
Symmetry test, 216

Tau
  population parameter
    confidence-interval estimate of, 418
    definition, 402
    modified for discrete populations, 420–421
    relation with expected rank correlation, 434
    relation with ρ, 403
  sample estimate
    applications, 438–443
    consistency of, 405–407
    contingency table application, 530–532
    definition, 405
    distribution theory, 410–418
    independence, use in testing, 410–418
    moments of, 405–409
    partial correlation, 483–488
    phi coefficient, 531
    recursive relation, 411–413
    relation with chi-square statistic in 2 × 2 contingency tables, 530–532
    relation with rank correlation, 432–433
    table, 593–594
    table references, 412

    ties, 418–420
    trend, use in testing, 421–422
    unbiasedness of, 405
Tau a, 420
Tau b, 420
Tau test for independence
  applications, 438–443
  procedure, 410–418
  table, 593–594

Terpstra test, 376–383
Terry-Hoeffding location test, 307–312
Test-based confidence interval, 258
Ties

  definition, 194
  general resolution of
    average probability, 196
    average statistic, 195
    least favorable statistic, 196
    midranks, 195
    omission, 196
    randomization, 195
    range of probability, 196
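Of the tie-resolution methods listed under this entry, the midrank method (p. 195) is the one most often computed in practice: each member of a tied group receives the average of the ranks the group would occupy. A minimal sketch follows; the function name `midranks` is illustrative, not from the text:

```python
def midranks(values):
    """Assign each observation the average of the ranks (1-based)
    that its tied group would occupy if ties were broken arbitrarily."""
    ordered = sorted(values)
    rank_of = {}
    i = 0
    while i < len(ordered):
        # find the end of the tied group starting at position i
        j = i
        while j < len(ordered) and ordered[j] == ordered[i]:
            j += 1
        # the group occupies ranks i+1, ..., j; its midrank is their mean
        rank_of[ordered[i]] = (i + 1 + j) / 2
        i = j
    return [rank_of[v] for v in values]

print(midranks([3, 1, 4, 1, 5]))  # [3.0, 1.5, 4.0, 1.5, 5.0]
```

The two tied observations at value 1 would occupy ranks 1 and 2, so each receives the midrank 1.5.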

Tolerance coefficient, 64
Tolerance interval, 64
Tolerance limits, 64–65
Trend, tests for
  Daniel’s test, 431
  length-of-longest-run test, 87–90
  Mann test, 422
  number-of-runs test, 78–86
  rank correlation, 431
  rank von Neumann test, 97–98
  runs up and down, 90–97
  tau, 422

Two-sample coverages, 67–68
Two-sample problem
  alternative hypotheses, 234–235
  definition, 232–235
  linear rank statistics, 283–295
  location alternative, 296–297
  location model, 232–233
  scale alternative, 321–322
  scale model, 233, 320–322