References
[Am1] Amari, S.: Differential geometry of curved exponential families - curvatures and information loss. Ann. Stat. 10 (1982), 357-385.
[Am2] Amari, S.: Differential-Geometrical Methods in Statistics. Lecture Notes in Statistics No. 28. Berlin, Springer-Verlag, 1985.
[An] Anděl, J.: Mathematical Statistics (in Czech). Praha, SNTL 1978.
[At] Atkinson, A. C.: Developments in the design of experiments. Int. Stat. Rev. 50 (1982), 161-177.
[AM] Atkinson, C. & Mitchell, A. F.: Rao's distance measure. Sankhyā A 43 (1981), 345-365.
[Bd] Bard, Y.: Nonlinear Parameter Estimation. New York, Academic Press, 1974.
[BN1] Barndorff-Nielsen, O. E.: Information and Exponential Families in Statistical Theory. Chichester, Wiley 1979.
[BN2] Barndorff-Nielsen, O. E.: On a formula for the distribution of the maximum likelihood estimator. Biometrika 70 (1983), 343-365.
[BC] Barndorff-Nielsen, O. E. & Cox, D. R.: Edgeworth and saddlepoint approximations with statistical applications (with discussion). J. Roy. Stat. Soc. B 41 (1979), 279-312.
[Bs] Basu, D.: Statistical Information and Likelihood. A Collection of Critical Essays. Ed.: J. K. Ghosh. Lecture Notes in Statistics No. 45. New York, Springer-Verlag, 1988.
[BW1] Bates, D. M. & Watts, D. G.: Relative curvature measures of nonlinearity. J. Roy. Stat. Soc. B 42 (1980), 1-25.
[BW2] Bates, D. M. & Watts, D. G.: Nonlinear Regression Analysis and its Applications. New York, Wiley 1988.
[Be] Beale, E. M. L.: Confidence regions in nonlinear estimation (with discussion). J. Roy. Stat. Soc. B 22 (1960), 41-88.
[BM] Bird, H. A. & Milliken, G. A.: Estimable functions in the nonlinear model. Commun. Stat. Theor. Methods A 5 (1976), 999-1012.
[Bo] Box, M. J.: Bias in nonlinear estimation. J. Roy. Stat. Soc. B 33 (1971), 171-201.
[Bu] Bunke, H.: Parameter estimation in nonlinear regression. In: P. R. Krishnaiah (Ed.), Handbook of Statistics, Vol. 1, pp. 593-615. Amsterdam, North-Holland, 1980.
[BHS] Bunke, H. & Henschke, K. & Strüby, R. & Wisotzki, C.: Parameter estimation in nonlinear regression models. Math. Operationsforsch. u. Stat., Ser. Statistics 8 (1977), 23-40.
[Ce] Čencov, N. N.: Statistical Decision Rules and Optimal Inference (in Russian). Moscow, Nauka 1972. English transl. (1982), AMS, Rhode Island.
[Cl1] Clarke, G. P. Y.: Moments of the least squares estimators in a nonlinear regression model. J. Roy. Stat. Soc. B 42 (1980), 227-237.
[Cl2] Clarke, G. P. Y.: Approximate confidence limits for a parametric function in nonlinear regression. J. Amer. Statist. Ass. 82 (1987), 221-230.
[CT] Cook, R. D. & Tsai, C. L.: Residuals in nonlinear regression. Biometrika 72 (1985), 23-29.
[Cr] Cramér, H.: Mathematical Methods of Statistics. Princeton University Press, 1963.
[Cs] Csiszár, I.: I-divergence geometry of probability distributions and minimization problems. Ann. Prob. 3 (1975), 146-158.
[Da] Daniels, H. E.: Saddlepoint approximations in statistics. Ann. Math. Stat. 25 (1954), 631-650.
[De] Demidenko, E. Z.: Optimization and Regression (in Russian). Moscow, Nauka 1989.
[Di] Dieudonné, J.: Treatise on Analysis, Vol. III. New York, Academic Press, 1972.
[Dv] Dvoretzky, A.: On stochastic approximation. In: Proc. of the Berkeley Symp. on Math. Stat. and Probability, Vol. 1, 39-45. Berkeley 1956.
[Ef] Efron, B.: The geometry of exponential families. Ann. Stat. 6 (1978), 362-376.
[Eg] Eguchi, S.: A differential geometric approach to statistical inference on the basis of contrast functionals. Hiroshima Mathematical Journal 15 (1985), 341-391.
[Ei] Eisenhart, L. P.: Riemannian Geometry. Princeton University Press, 1960.
[Fe] Fedorov, V. V.: Theory of Optimal Experiments. New York, Academic Press, 1972.
[Fi] Fiedler, M.: Special Matrices and Their Use in Numerical Mathematics (in Czech). Praha, SNTL 1981.
[Fs] Field, C.: Small sample asymptotic expansions for multivariate M-estimates. Ann. Stat. 10 (1982), 672-689.
[FKT] Ford, I. & Kitsos, C. P. & Titterington, D. M.: Recent advances in nonlinear experimental design. Technometrics 31 (1989), 49-60.
[Ga] Gallant, A. R.: Nonlinear Statistical Models. New York, Wiley, 1987.
[Gt] Gantmacher, F. R.: Matrix Theory (in Russian). Moscow, Nauka, 1966.
[GM] Gonin, R. & Money, A. H.: Nonlinear Lp-norm Estimation. New York, Marcel Dekker, 1989.
[Gr] Green, P. J.: Iteratively reweighted least squares for maximum likelihood estimation, and some robust and resistant alternatives (with discussion). J. Roy. Stat. Soc. B 46 (1984), 149-192.
[Ha] Halperin, M.: Confidence interval estimation in nonlinear regression. J. Roy. Stat. Soc. B 25 (1963), 330-333.
[Hn] Hamilton, D. C.: Confidence regions for parameter subsets in nonlinear regression. Biometrika 73 (1986), 57-64.
[HBW] Hamilton, D. C. & Watts, D. G. & Bates, D. M.: Accounting for intrinsic nonlinearity in nonlinear regression parameter inference regions. Ann. Stat. 10 (1982), 386-393.
[Hr1] Hartley, H. O.: The modified Gauss-Newton method for the fitting of nonlinear regression functions by least squares. Technometrics 3 (1961), 269-280.
[Hr2] Hartley, H. O.: Exact confidence regions for the parameters in non-linear regression laws. Biometrika 51 (1964), 347-353.
[Hg1] Hougaard, P.: Parametrization of non-linear models. J. Roy. Stat. Soc. B 44 (1982), 244-252.
[Hg2] Hougaard, P.: Saddlepoint approximations for curved exponential families. Stat. Probability Lett. 3 (1985), 161-166.
[Hg3] Hougaard, P.: The appropriateness of the asymptotic distribution in a nonlinear regression model in relation to curvature. J. Roy. Stat. Soc. B 47 (1985), 103-114.
[IZ] Ivanov, A. V. & Zwanzig, S.: An asymptotic expansion of the distribution of the least squares estimator in the nonlinear regression model. Statistics 14 (1983), 7-27.
[Je] Jennrich, R. I.: Asymptotic properties of nonlinear least squares estimators. Ann. Math. Stat. 40 (1969), 633-643.
[Jo] Johansen, S.: Functional Relations, Random Coefficients and Nonlinear Regression with Application to Kinetic Data. Lecture Notes in Statist. 22; New York, Springer-Verlag 1984.
[Jg] Jørgensen, B.: Exponential dispersion models (with discussion). J. Roy. Stat. Soc. B 49 (1987), 127-162.
[Ka1] Kass, R. E.: Canonical parametrizations and zero parameter effects curvature. J. Roy. Stat. Soc. B 46 (1984), 86-92.
[Ka2] Kass, R. E.: The geometry of asymptotic inference. Statist. Science 4 (1989), 188-219.
[Kl] Kelker, D.: Distribution theory of spherical distributions and a location-scale parameter generalization. Sankhyā A 32 (1970), 419-430.
[KG] Kennedy, W. J., Jr. & Gentle, J. E.: Statistical Computing. New York, Marcel Dekker 1980.
[Ki] Kiefer, J.: Generalized equivalence theory for optimum designs (approximate theory), Ann. Stat. 2 (1974), 849-879.
[KW] Kiefer, J. & Wolfowitz, J.: Optimum designs in regression problems. Ann. Math. Stat. 30 (1959), 271-294.
[Ko1] Koutková, H.: Estimators in the singular regression model (in Czech). PhD thesis, Bratislava 1988.
[Ko2] Koutková, H.: Exponential regression. Fasciculi Mathematici Nr. 20, pp. 111-116. Technical University, Brno 1989.
[Ko3] Koutková, H.: On estimable and locally estimable functions in the nonlinear regression model. Kybernetika 28 (1992), 120-128.
[Kb] Kubáček, L.: Foundations of Estimation Theory. Amsterdam, Elsevier, 1988.
[Kba] Kubáčková, L.: Foundations of Experimental Data Analysis. Boca Raton, CRC Press, 1992.
[Ku] Kullback, S.: Information Theory and Statistics. New York, Wiley, 1959.
[La] Lauritzen, S. L.: Statistical manifolds. Tech. Rep. of Aalborg University Center, 1984.
[Lr] Läuter, H.: Note on the strong consistency of the least squares estimator in nonlinear regression. Statistics 20 (1989), 199-210.
[Ma] Mahalanobis, P. C.: On the generalized distance in statistics. Proc. Nat. Inst. of Sciences of India 2 (1936), 49-55.
[Ml] Malinvaud, E.: Statistical Methods of Econometrics. Amsterdam, North-Holland, 1970.
[NC] Nelder, J. A. & McCullagh, P.: Generalized Linear Models. London, Chapman and Hall, 1983.
[P1] Pázman, A.: Probability distribution of the multivariate nonlinear least squares estimates. Kybernetika 20 (1984), 209-230.
[P2] Pázman, A.: Nonlinear least squares - uniqueness versus ambiguity. Math. Operationsforsch. Stat., Ser. Statistics 15 (1984), 323-336.
[P3] Pázman, A.: Discussion to the paper by P. J. Green. J. Roy. Stat. Soc. B 46 (1984), 182-183.
[P4] Pázman, A.: On the uniqueness of M. L. estimates in curved exponential families. Kybernetika 22 (1986), 124-132.
[P5] Pázman, A.: Foundations of Optimum Experimental Design. Dordrecht, Reidel, 1986.
[P6] Pázman, A.: Flat Gaussian nonlinear regression models. In: Model Oriented Data Analysis, 120-124. Lecture Notes in Economics and Mathem. Systems. Berlin, Springer, 1987.
[P7] Pázman, A.: Discussion to the paper by B. Jørgensen. J. Roy. Stat. Soc. B 49 (1987), 155-156.
[P8] Pázman, A.: On formulas for the distribution of nonlinear L. S. estimates. Statistics 18 (1987), 3-15.
[P9] Pázman, A.: Distribution of the weighted L. S. estimates in nonlinear models with symmetrical errors. Kybernetika 24 (1988), 413-427.
[P10] Pázman, A.: On information matrices in nonlinear experimental design. J. Stat. Planning and Inference 21 (1989), 253-263.
[P11] Pázman, A.: A sufficient statistic and a nonstandard linearization in nonlinear regression models. Kybernetika 25 (1989), 441-452.
[P12] Pázman, A.: Almost exact distribution of estimators I - Low dimensional nonlinear regression. Statistics 21 (1990), 9-19.
[P13] Pázman, A.: Almost exact distribution of estimators II - Flat nonlinear regression models. Statistics 21 (1990), 21-33.
[P14] Pázman, A.: Small-sample distributional properties of nonlinear regression estimators (a geometric approach), with discussion. Statistics 21 (1990), 323-367.
[P15] Pázman, A.: Pivotal variables and confidence regions in flat nonlinear regression models with unknown σ. Statistics 22 (1991), 177-189.
[P16] Pázman, A.: Curvatures and the distribution of the maximum likelihood estimator in nonlinear exponential models. REBRAPE (Brazilian J. Probability and Statistics) 5 (1991), 43-63.
[P17] Pázman, A.: Geometry of the nonlinear regression with prior. Acta Mathem. Univ. Comenianae LXI (1992), 263-276.
[P18] Pázman, A.: Higher dimensional nonlinear regression - a statistical use of the Riemannian curvature tensor. To appear in Statistics, 1993.
[PP] Pázman, A. & Pronzato, L.: Nonlinear experimental design based on the distribution of estimators. J. Stat. Planning and Inference 33 (1992), 385-402.
[Pt] Potocký, R. & To Van Ban: Confidence regions in nonlinear regression models. Appl. Math. 37 (1992), 29-39.
[Pu] Pukelsheim, F.: Optimal Experimental Design. New York, Wiley, 1993.
[R1] Rao, C. R.: On the distance between two populations. Sankhyā 9 (1949), 246-248.
[R2] Rao, C. R.: Linear Statistical Inference and its Applications. New York, Wiley, 1963.
[RM] Rao, C. R. & Mitra, S. K.: Generalized Inverse of Matrices and Its Applications. New York, Wiley, 1971.
[Rt] Ratkowsky, D. A.: Nonlinear Regression Modeling. New York, Marcel Dekker, 1983.
[Rd] Reid, N.: Saddle-point methods and statistical inference. Stat. Sci. 3 (1988), 213-248.
[Rs] Ross, G. J. S.: The efficient use of function minimization in nonlinear maximum likelihood estimation. Applied Statistics 19 (1976), 205-221.
[Ry] Rényi, A.: Probability Theory (in German). Berlin, VEB Deutscher Verlag, 1962; Czech transl.: Praha, Academia 1972.
[Sa] Saridis, G. N.: Stochastic approximation methods for identification and control - a survey. IEEE Trans. Automatic Control 19 (1974), 798-809.
[SZ] Schmidt, W. H. & Zwanzig, S.: Second order asymptotics in nonlinear regression. J. Multivariate Anal. 18 (1986), 187-215.
[Sb] Seber, G. A. F.: Linear Regression Analysis. New York, Wiley 1977.
[SW] Seber, G. A. F. & Wild, C. J.: Nonlinear Regression. New York, Wiley, 1989.
[Se] Seshadri, V.: Exponential models, Brownian motion, and independence. Can. J. Stat. 16 (1988), 209-221.
[Si] Silvey, S. D.: Optimal Design. London, Chapman and Hall, 1980.
[Sk1] Skovgaard, I. M.: Large deviation approximations for maximum likelihood estimators. Probab. Math. Stat. 6 (1985), 89-107.
[Sk2] Skovgaard, I. M.: On the density of minimum contrast estimators. Ann. Stat. 18 (1990), 779-789.
[Sv] Skovgaard, L. T.: A Riemannian geometry of the multivariate normal model. Scand. J. Stat. 11 (1984), 211-222.
[Sp] Spivak, M.: Calculus on Manifolds. Menlo Park, CA, Benjamin, 1965.
[St] Sternberg, S.: Lectures on Differential Geometry. Second printing. Englewood Cliffs, Prentice-Hall, 1965.
[Sj] Štulajter, F.: Mean square error matrix of an approximate least squares estimator in a nonlinear regression model with correlated errors. Acta Mathem. Univ. Comenianae LXI (1992), 251-261.
[Su] Sundararaj, N.: A method for confidence regions for nonlinear models. Austral. J. Stat. 20 (1978), 270-274.
[TB] To Van Ban: Confidence regions in a nonlinear regression model (in Slovak). PhD thesis, Fac. Mathematics & Physics, Comenius University, Bratislava 1990.
[Va] Vajda, I.: Theory of Statistical Inference and Information. Dordrecht, Kluwer, 1989.
[WP] Walter, E. & Pronzato, L.: Qualitative and quantitative experiment design for phenomenological models: a survey. Automatica 26 (1990), 195-213.
[Wi] Wimmer, G.: On the equivalence problem in linear regression models. Part I. BLUE of the mean value. Aplikace matematiky (1980), 417-422.
[We] Wei, Bo-Cheng: Some second order asymptotics in nonlinear regression. Austral. J. Stat. 33 (1991), 75-84. Also: Geometrical approach to nonlinear regression asymptotics. Technical report, Nanjing Institute of Technology, 1987.
[Wo] Woodroofe, M.: Very weak expansions for sequentially designed experiments. Ann. Stat. 17 (1989), 1087-1102.
[Wu1] Wu, C. F.: Asymptotic theory of nonlinear least squares estimation. Ann. Stat. 9 (1981), 501-513.
[Wu2] Wu, C. F.: Asymptotic inference from sequential design in a nonlinear situation. Biometrika 72 (1985), 553-558.
[Zv] Zvára, K.: Regression Analysis (in Czech). Praha, Academia 1989.
Basic symbols
:= - an equality given by a definition
A^T - the transposition of the matrix A
A^{-1} - the inverse of A
A^- - the generalized inverse of A (Section 1.1)
tr(A) - the trace of A
det(A) - the determinant of A
I - the identity matrix
diag(a_1, ..., a_r) - the r x r diagonal matrix with diagonal elements a_1, ..., a_r
ℳ(A), 𝒩(A) - the column space and the kernel space of A (Section 1.1)
ε - the error vector
y - the observed vector
N - usually, the number of observations (the dimension of y)
m - usually, the number of parameters (the dimension of θ)
θ - the vector of unknown parameters
θ̄ - the true value of θ
θ̂, θ̂_C, etc. - estimators of θ
θ̂_W - the Gauss-Markov (maximum likelihood) estimator (see (2.3.1))
η(θ) - the mean value of y when θ is the parameter
J(θ), H(θ) - the matrix of the first-order and the array of the second-order derivatives of η(θ) (see Section 4.1); the same notation is used in exponential families and for the derivatives of the canonical mapping (Section 9.2)
Θ - the parameter space
int(Θ) - the interior of the set Θ
Θ̄ - the closure of the set Θ
Γ - the space of the canonical parameter (Section 9.1)
P, P_W, P(θ), etc. - projectors in the regression model (projection matrices) (see Section 1.1, (1.3.3)-(1.3.5), (2.2.5), (2.5.1), (3.1.1), etc.); analogous symbols denote the projectors in exponential families (see Section 9.2)
Pr, Pr_θ, etc. - probabilities
ℰ - the expectation surface or the expectation plane (see Section 3.1, Section 4.2, (9.2.1))
- the tangent plane and the tangent space to ℰ at the point θ
- the canonical surface (see (9.2.2))
∂ψ(x)/∂x, etc. - different symbols for derivatives (see Section 2.1 for the different notations)
E( ), E_θ( ), etc. - mean values
Var( ) - the variance or the variance matrix
σ²W - the covariance matrix of the error or of the observed vector in regression models
Σ(θ), Σ_γ - the variance matrix of the sufficient statistic in exponential families (see (9.1.4))
M, M_W, M(θ), M_W(θ) - the information matrix (for σ = 1 in regression models) (see Section 1.6, Remark 2.2.1, (9.2.8))
M_C, M_C(θ) - formally like M_W, M_W(θ) but for W = C
( , ), ‖ ‖ - a general inner product and norm
(a, b)_C - a^T C^{-1} b or a^T C^- b
‖ ‖_C - the norm corresponding to ( , )_C
K_int(θ) - the intrinsic curvature (see Proposition 3.1.1 and Sections 4.2, 7.1, 9.2)
K_par(θ) - the parameter-effect curvature (see Sections 3.1, 4.2, 9.2)
ρ_int - = [K_int(θ)]^{-1}, the radius of curvature
C_int(θ), C_par(θ) - curvatures of the canonical surface (Section 9.2)
d(θ) - the ancillary space (see Sections 3.2, 5.4, 7.2)
q(θ̂ | θ) - the global approximation of the probability density of θ̂_W (see (3.3.1))
q(θ̂ | θ), q_C(θ̂ | θ), etc. - the same for other estimators (Chapters 7, 9)
Q(θ̂, θ), Q̄(θ̂, θ) - the modified information matrices (see Proposition 3.3.1, Proposition 7.1.1 and Section 7.2)
D(b, θ̂) - a matrix defined in Proposition 1.1.1
- the set of all (potentially) regular points and the set of singular points of int(Θ) (Section 4.1)
R(θ) - the Riemannian curvature tensor (see (4.2.2), (7.6.2))
w_i(θ) - the i-th vector of the basis of the ancillary space (Sections 4.4, 7.1, 9.4)
Ω(θ) - = (w_1(θ), ..., w_{N-m}(θ)) (see Section 6.2)
N(θ), Z(θ) - the intrinsic and the parameter-effect curvature arrays (Sections 5.5, 6.2)
- the I-divergence (see (9.2.3), (9.2.5))
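Several of the symbols above (η(θ), J(θ), the information matrix M_W, and the projector P) fit together numerically. The following sketch is not from the book: the model η_i(θ) = exp(-θ x_i), the design points, and all variable names are hypothetical, chosen only to illustrate the definitions.

```python
# Illustrative sketch (hypothetical model, not from the book): for a toy
# one-parameter regression eta_i(theta) = exp(-theta * x_i), compute the
# derivative matrix J(theta), the information matrix M_W = J^T W J (sigma = 1),
# and the projector P onto the tangent plane with respect to the W inner product.
import numpy as np

x = np.array([0.0, 1.0, 2.0, 3.0])  # design points (hypothetical)
W = np.eye(4)                        # weight matrix; identity for simplicity

def eta(theta):
    # mean value of the observed vector y under the parameter theta
    return np.exp(-theta * x)

def J(theta):
    # N x m matrix of first-order derivatives of eta; here m = 1
    return (-x * np.exp(-theta * x)).reshape(-1, 1)

theta = 0.5
y_mean = eta(theta)                  # eta(theta), the mean of y
Jt = J(theta)
M_W = Jt.T @ W @ Jt                  # information matrix (for sigma = 1)
P = Jt @ np.linalg.inv(M_W) @ Jt.T @ W  # projector onto the tangent plane

# A projector is idempotent and leaves the columns of J(theta) fixed.
assert np.allclose(P @ P, P)
assert np.allclose(P @ Jt, Jt)
```

The assertions check the two defining properties of a projection matrix; with W the identity, P is the usual orthogonal projector onto the column space of J(θ).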
Subject index
affine connections 243
almost exact approximation 68, 69, 186
ancillary space 125, 173
approximate probability density 163, 174, 178, 179, 184, 186, 236
arc-length 57
asymptotic
  normality 133, 201, 210
  properties 131
bias 44, 140
bounded curvature 156
canonical
  mapping 224
  surface 224
confidence
  interval 72, 74
  region 27, 52, 53, 195, 197, 200
covering exponential family 218, 231
criterion of
  A-optimality 29
  D-optimality 29
  E-optimality 30
curvature 57
  arrays 128
  vector 39
curve
  in the set Θ 38
  on the surface ℰ 39
design 30, 209
  measure 31
diffeomorphism 98
differentiable manifold 95
eigenvalue of a matrix 8
elliptically symmetrical distribution 182
entropy 76, 151
equivariant density 154
error vector 12, 14
estimable parameter function 24
estimators of σ 206
expectation
  curve 56
  mapping 224
  plane 14
  surface 224
experimental design 29, 209
explanatory variable 13
exponential families 215
first-order approximation 131, 140
flat models 92, 192
g-inverse matrices 9
Gauss-Markov estimator 16, 41
Gauss-Newton method 116, 178
geodesic curve 39, 86, 242
global approximations 154
gradient method 122
I-divergence 224
implicit function theorem 103, 143
information matrix 31, 228, 241
intrinsic curvature 58, 60, 88, 178, 229
intrinsically linear models 36, 90, 163
iterative methods 113
L2 estimator 15, 61, 101, 113
least-squares estimator 15
Levenberg-Marquardt method 124
likelihood ratio 203
linear
  approximation 43
  regression model 13
local approximations 131
maximum likelihood estimator 61, 66, 236
mean-square error matrix 149
measure of information 76
metric tensor 241
moments of θ̂_W 74, 149
Newton's method 122
non-overlapping 157
normal equation 17, 63, 145, 173
normed vector of curvature 63
orthogonal projector 12, 40, 53, 60, 228
orthonormal basis 125, 146, 179
overlapping 66
parameter
  -effect curvature 61, 88, 229
  space 36
pivotal variables 195
posterior density 177
potentially regular point 81
prior density 177, 211
projector 11, 69, 86, 206
quantile 27
quasigradient methods 123
radius of curvature 57
Rao distance 241
regular
  linear models 14
  model 81
  point 81
reparametrization 56, 89, 230
residual vector 49, 64, 152, 207
response variable 13
restricted sample space 156, 241
Riemannian curvature tensor 94, 185
saddle-point approximation 230
second-order approximation 140, 142
shift vector 173
singular
  linear models 14
  model 81, 94
  point 81
square root of a matrix 8
step size 116, 118, 214
stopping rules 114
strongly consistent 132
tail product 132
tangent
  plane 44, 97
  space 53
three-dimensional array 34
tube 66
unbiased estimator for σ² 27
unbiased linear estimator 24