

Contents lists available at SciVerse ScienceDirect

Journal of Statistical Planning and Inference 142 (2012) 1849–1854

journal homepage: www.elsevier.com/locate/jspi
doi:10.1016/j.jspi.2012.02.011

Order preserving property of moment estimators

Piotr Nowak

Mathematical Institute, University of Wrocław, Pl. Grunwaldzki 2/4, 50-384 Wrocław, Poland
E-mail address: [email protected]

Article info

Article history:
Received 17 November 2011
Accepted 7 February 2012
Available online 14 February 2012

Keywords:
Moment estimator
Maximum likelihood estimator
Exponential family
Stochastic ordering
Total positivity

Abstract

Balakrishnan and Mi (2001) considered the order preserving property of maximum likelihood estimators. In this paper we give conditions under which moment estimators preserve stochastic orders. The preserving property is considered for the usual stochastic order as well as for the likelihood ratio order. Sufficient conditions are established for several parametric families of distributions.

© 2012 Elsevier B.V. All rights reserved.

1. Introduction and preliminaries

Suppose that $X = (X_1, \ldots, X_n)$ is a sample from a population with density $f(x;\theta)$, where $\theta \in \Theta \subseteq \mathbb{R}$. We consider the estimation of $\theta$ by the method of moments (see, e.g., Borovkov, 1998). Let $\varphi : \mathbb{R} \to \mathbb{R}$ be a function such that the function $m : \Theta \to \mathbb{R}$,

$$m(\theta) = \int_{-\infty}^{\infty} \varphi(x) f(x;\theta)\,dx,$$

the generalized moment, is monotone and continuous on $\Theta$. In particular, if $m$ is strictly monotone, then $m$ is one-to-one and $m^{-1}$ exists. Then for every $t \in m(\Theta)$, the set of values $m(\theta)$, $\theta \in \Theta$, there exists a unique solution of the equation $m(\theta) = t$, i.e. $\theta = m^{-1}(t)$. In the case when $m$ is nondecreasing or nonincreasing we can take $m^{-1}(t) = \inf\{\theta : m(\theta) \ge t\}$ for any $t \in m(\Theta)$. Let

$$\bar{\varphi} = \frac{1}{n} \sum_{i=1}^{n} \varphi(X_i)$$

be the empirical generalized moment based on $X$. If $\bar{\varphi} \in m(\Theta)$, then the estimator obtained by the method of moments (for short, the moment estimator) is of the form

$$\hat{\theta} = m^{-1}(\bar{\varphi}), \qquad (1)$$

provided that $m^{-1}$ exists. Assume for simplicity that for a given function $\varphi$ the value of the empirical generalized moment belongs to $m(\Theta)$ for every resulting set of observations.
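The definition in (1) can be made concrete with a short numeric sketch. The helper below inverts a strictly monotone $m$ by bisection; the exponential density $f(x;\theta) = \theta e^{-\theta x}$ with $\varphi(x) = x$, for which $m(\theta) = 1/\theta$, is used only as an illustrative example (the bracketing interval and function names are assumptions of this sketch, not part of the paper).

```python
import random
from typing import Callable

def moment_estimator(sample, phi: Callable[[float], float],
                     m: Callable[[float], float],
                     lo: float, hi: float, tol: float = 1e-10) -> float:
    """Compute theta_hat = m^{-1}(phi_bar) by bisection, assuming m is
    continuous and strictly monotone on [lo, hi]."""
    phi_bar = sum(phi(x) for x in sample) / len(sample)
    increasing = m(lo) < m(hi)
    a, b = lo, hi
    while b - a > tol:
        mid = (a + b) / 2.0
        # Move the endpoint so the root m(theta) = phi_bar stays bracketed.
        if (m(mid) < phi_bar) == increasing:
            a = mid
        else:
            b = mid
    return (a + b) / 2.0

# Illustration: exponential density f(x; theta) = theta * exp(-theta * x),
# phi(x) = x gives m(theta) = 1/theta (decreasing), so theta_hat = 1/x_bar.
random.seed(0)
theta = 2.0
sample = [random.expovariate(theta) for _ in range(50_000)]
theta_hat = moment_estimator(sample, phi=lambda x: x, m=lambda t: 1.0 / t,
                             lo=1e-3, hi=1e3)
x_bar = sum(sample) / len(sample)
```

For this family the bisection result should agree with the closed form $1/\bar{X}$, which gives a cheap consistency check of the inversion.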



It is well known that these estimators are strongly consistent. One often puts $\varphi(x) = x^k$, $k \ge 1$. Then $\bar{\varphi}$ reduces to

$$m_k = \frac{1}{n} \sum_{i=1}^{n} X_i^k,$$

i.e. the $k$th empirical moment. In particular, $m_1 = \bar{X}$.

In this paper our aim is to give conditions under which moment estimators have the property of preserving stochastic orders. We shall deal with stochastic orders, so we recall their definitions. Let $X$ and $Y$ be two random variables, $F_X$ and $F_Y$ their respective distribution functions, and $f_X$ and $f_Y$ their respective density functions, if they exist. We say that $X$ is stochastically smaller than $Y$ (denoted by $X \le_{st} Y$) if $F_X(x) \ge F_Y(x)$ for all $x \in \mathbb{R}$. A stronger order than the usual stochastic order is the likelihood ratio order. We say that $X$ is smaller than $Y$ in the likelihood ratio order (denoted by $X \le_{lr} Y$) if $f_Y(x)/f_X(x)$ is increasing in $x$. Let $F_X^{-1}$ and $F_Y^{-1}$ be the quantile functions of $F_X$ and $F_Y$, respectively. We say that $X$ is smaller than $Y$ in the dispersive order (denoted $X \le_{disp} Y$) if $F_Y^{-1}(\alpha) - F_X^{-1}(\alpha)$ is an increasing function in $\alpha \in (0,1)$. For further details on stochastic orders we refer to Shaked and Shanthikumar (2007) and Marshall and Olkin (1979).
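Both definitions are easy to probe numerically. As a small illustration (not a proof), the sketch below checks $X \le_{st} Y$ and $X \le_{lr} Y$ on a grid for two exponential distributions with scales $\theta_1 < \theta_2$, a pair chosen here only as an example.

```python
import math

# Exponential densities f(x; theta) = exp(-x/theta)/theta with theta1 < theta2.
theta1, theta2 = 1.0, 2.0
xs = [0.01 * k for k in range(1, 2000)]

F = lambda x, t: 1.0 - math.exp(-x / t)   # distribution function
f = lambda x, t: math.exp(-x / t) / t     # density

# Usual stochastic order X <=_st Y: F_X(x) >= F_Y(x) for all x.
st_holds = all(F(x, theta1) >= F(x, theta2) for x in xs)

# Likelihood ratio order X <=_lr Y: f_Y(x)/f_X(x) increasing in x.
ratio = [f(x, theta2) / f(x, theta1) for x in xs]
lr_holds = all(r2 >= r1 for r1, r2 in zip(ratio, ratio[1:]))
```

Here the ratio is $\tfrac{1}{2}e^{x/2}$, which is increasing, so both checks should pass, consistent with $\le_{lr}$ implying $\le_{st}$.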

We say that the family of distributions $\{F(x;\theta), \theta \in \Theta\}$ is stochastically increasing in $\theta$ if $X(\theta_1) \le_{st} X(\theta_2)$ for all $\theta_1 < \theta_2$, $\theta_1, \theta_2 \in \Theta$, where $X(\theta)$ has distribution function $F(x;\theta)$. It is well known that the location parameter family of densities $\{f(x;\theta) = g(x-\theta), \theta \in \mathbb{R}\}$ is stochastically increasing in $\theta$. The scale parameter family of densities $\{f(x;\theta) = g(x/\theta)/\theta, \theta > 0\}$, where $g(x)$ is a density on $(0,\infty)$, is also stochastically increasing in $\theta$.

Furthermore, we say that the family of densities $\{f(x;\theta), \theta \in \Theta\}$ has monotone likelihood ratio if $f(x;\theta_2)/f(x;\theta_1)$ is an increasing function in $x$ for any $\theta_1 < \theta_2$, $\theta_1, \theta_2 \in \Theta$, i.e. $X(\theta_1) \le_{lr} X(\theta_2)$. It is well known that for some functions $c(\theta)$, $h(x)$, $\eta(\theta)$ and $T(x)$ the following families have monotone likelihood ratio:

(a) the family of distributions with density $f(x;\theta) = c(\theta) h(x) I_{(-\infty,\theta)}(x)$;
(b) the family of distributions with density $f(x;\theta) = c(\theta) h(x) \exp\{\eta(\theta) T(x)\}$, provided that both $\eta$ and $T$ are increasing.

The likelihood ratio order is closely related to total positivity (see Karlin, 1968). Let $k(x,y)$ be a measurable function defined on $A \times B$, where $A$ and $B$ are subsets of $\mathbb{R}$. We say that $k(x,y)$ is totally positive of order $r$ (for short, $k(x,y)$ is TP$_r$) if for all $x_1 < \cdots < x_m$, $x_i \in A$, all $y_1 < \cdots < y_m$, $y_i \in B$, and all $1 \le m \le r$, we have

$$\begin{vmatrix} k(x_1,y_1) & \cdots & k(x_1,y_m) \\ \vdots & & \vdots \\ k(x_m,y_1) & \cdots & k(x_m,y_m) \end{vmatrix} \ge 0.$$

It is clear that the ordering $X(\theta_1) \le_{lr} X(\theta_2)$ whenever $\theta_1 < \theta_2$, $\theta_1, \theta_2 \in \Theta$, is equivalent to the density $f(x;\theta)$ being TP$_2$. We now recall basic facts of the theory of total positivity, referring to Karlin (1968) for their proofs.

From the definition it follows immediately that if $\phi$ and $\psi$ are nonnegative functions and $k(x,y)$ is TP$_r$, then $\phi(x)\psi(y)k(x,y)$ is also TP$_r$. Similarly, if $\phi$ and $\psi$ are increasing functions and $k(x,y)$ is TP$_r$, then $k(\phi(x),\psi(y))$ is again TP$_r$.

A twice differentiable positive function $k(x,y)$ is TP$_2$ if and only if

$$\frac{\partial^2}{\partial x\, \partial y} \log k(x,y) \ge 0.$$
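The differential criterion can be compared with the determinant definition directly. For $k(x,y) = e^{xy}$ we have $\log k = xy$ and $\partial^2 \log k / \partial x\, \partial y = 1 \ge 0$, so $k$ should be TP$_2$; the sketch below verifies the defining $2 \times 2$ determinant inequality on a grid (an illustration, not a proof).

```python
import math

# k(x, y) = exp(x * y): log k = x*y, so the mixed partial of log k is 1 >= 0.
k = lambda x, y: math.exp(x * y)

def tp2_on_grid(k, xs, ys) -> bool:
    """Check k(x1,y1)k(x2,y2) - k(x1,y2)k(x2,y1) >= 0 for all x1<x2, y1<y2."""
    for i, x1 in enumerate(xs):
        for x2 in xs[i + 1:]:
            for j, y1 in enumerate(ys):
                for y2 in ys[j + 1:]:
                    det = k(x1, y1) * k(x2, y2) - k(x1, y2) * k(x2, y1)
                    if det < 0:
                        return False
    return True

grid = [-2.0 + 0.5 * t for t in range(9)]
is_tp2 = tp2_on_grid(k, grid, grid)
```

The determinant equals $e^{x_1 y_1 + x_2 y_2} - e^{x_1 y_2 + x_2 y_1}$, and the exponent difference is $(x_2 - x_1)(y_2 - y_1) > 0$, so every checked determinant is positive; replacing $k$ with $e^{-xy}$ reverses the sign and fails the check.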

Total positivity of many functions that arise in statistics follows from the basic composition formula. If $f_1(x,z)$ is TP$_m$, $f_2(z,y)$ is TP$_n$ and the integral

$$k(x,y) = \int_{-\infty}^{\infty} f_1(x,z) f_2(z,y)\,dz$$

is finite, then $k(x,y)$ is TP$_{\min(m,n)}$.

A particular and important case is when $k(x,y) = \phi(y-x)$. A nonnegative function $\phi$ is said to be PF$_k$ (Polya frequency function of order $k$) if $\phi(y-x)$ is TP$_k$. Recall that a real valued function $\phi$ is said to be logconcave on the interval $A$ if $\phi(x) \ge 0$ and $\log \phi$ is an extended real valued concave function (we put $\log 0 = -\infty$). It is well known (see, e.g., Schoenberg, 1951) that the function $\phi$ is PF$_2$ if and only if $\phi$ is nonnegative and logconcave on $\mathbb{R}$. If $\phi$ and $\psi$ are logconcave functions on $\mathbb{R}$ such that the convolution

$$\omega(x) = \int_{-\infty}^{\infty} \phi(x-z)\psi(z)\,dz$$

is defined for all $x \in \mathbb{R}$, then the function $\omega$ is also logconcave on $\mathbb{R}$.

Another very important property of TP functions is the variation diminishing property.

Lemma 1. Let $\psi$ be given by the absolutely convergent integral

$$\psi(x) = \int_{-\infty}^{\infty} k(x,y)\phi(y)\,dy,$$

where $k(x,y)$ is TP$_r$ and $\phi$ changes sign at most $j \le r-1$ times. Then $\psi$ changes sign at most $j$ times. Moreover, if $\psi$ changes sign $j$ times, then $\phi$ and $\psi$ have the same arrangement of signs.

The basic tool for proving stochastic ordering of estimators is order preserving theorems under functional transformations. For details and proofs we refer to Shaked and Shanthikumar (2007).

Lemma 2. Assume that $\phi$ is an increasing function.

(a) If $X \le_{st} Y$, then $\phi(X) \le_{st} \phi(Y)$.
(b) If $X \le_{lr} Y$, then $\phi(X) \le_{lr} \phi(Y)$.

Lemma 3. Let $X_1, \ldots, X_n$ and $Y_1, \ldots, Y_n$ be independent random variables.

(a) If $X_i \le_{st} Y_i$, $i = 1, \ldots, n$, then $X_1 + \cdots + X_n \le_{st} Y_1 + \cdots + Y_n$.
(b) If $X_i \le_{lr} Y_i$, $i = 1, \ldots, n$, then $X_1 + \cdots + X_n \le_{lr} Y_1 + \cdots + Y_n$, provided these random variables have logconcave densities.

Let $\hat{\theta}$ be a moment estimator of $\theta$ based on a sample from a population with density $f(x;\theta)$, $\theta \in \Theta \subseteq \mathbb{R}$. Let now $\hat{\theta}_1$ and $\hat{\theta}_2$ be the moment estimators obtained on the basis of the sample $X$ from a population with density $f(x;\theta_1)$ and of the sample $Y$ from a population with density $f(x;\theta_2)$, respectively, where $\theta_1 < \theta_2$, $\theta_1, \theta_2 \in \Theta$. We say that $\hat{\theta}$ is stochastically increasing in $\theta$ if the family of its distributions is stochastically increasing in $\theta$, i.e. $\hat{\theta}_1 \le_{st} \hat{\theta}_2$. We shall also be interested in whether the stronger property $\hat{\theta}_1 \le_{lr} \hat{\theta}_2$ holds, i.e. whether $\hat{\theta}$ is increasing in $\theta$ with respect to the likelihood ratio order.

Note that in general the moment estimator may not be stochastically monotone as the following example indicates.

Example 1. Consider a sample $X = (X_1, \ldots, X_n)$ from the uniform distribution on the interval $(-\theta,\theta)$, $\theta > 0$. Then $E(X) = 0$ and the first moment contains no information about $\theta$. We use the second moment. Then we have $E(X^2) = \theta^2/3 = m(\theta)$. Thus the moment estimator of $\theta$ is of the form $\hat{\theta} = \sqrt{(3/n)\sum_{i=1}^{n} X_i^2}$. Since the random variable $X$ is not stochastically increasing in $\theta$ and $m(\theta)$ is increasing for $\theta > 0$, the estimator $\hat{\theta}$ is not stochastically monotone.
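The estimator of Example 1 is straightforward to compute from simulated data. This is a minimal sketch of the computation only; it says nothing about the ordering question discussed above, and the sample size and seed are arbitrary choices.

```python
import math
import random

# Sample from the uniform distribution on (-theta, theta) and form
# theta_hat = sqrt(3 * m_2), where m_2 is the second empirical moment.
random.seed(1)
theta = 3.0
n = 100_000
sample = [random.uniform(-theta, theta) for _ in range(n)]

m2 = sum(x * x for x in sample) / n   # second empirical moment
theta_hat = math.sqrt(3.0 * m2)       # E(X^2) = theta^2 / 3, so invert m
```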

2. Results

Let $X = (X_1, \ldots, X_n)$ be a sample from a population with density $f(x;\theta)$, $\theta \in \Theta$. We always assume that the moment estimator $\hat{\theta}$ is obtained on the basis of $X$.

The following theorem gives sufficient conditions for likelihood ratio ordering of moment estimators for a one-parameter family in the case when the estimator is based on the sample mean.

Theorem 1. Assume that $f(x;\theta)$ is TP$_2$, $f(x;\theta)$ is logconcave in $x$ and $m(\theta) = \int_{-\infty}^{\infty} x f(x;\theta)\,dx < \infty$ for all $\theta$. Then the moment estimator $\hat{\theta}$ is increasing in $\theta$ with respect to the likelihood ratio order.

Proof. Since $f(x;\theta)$ is TP$_2$ we have $X_i(\theta_1) \le_{lr} X_i(\theta_2)$ for all $\theta_1 < \theta_2$, $\theta_1, \theta_2 \in \Theta$, $i = 1, \ldots, n$. From the variation diminishing property (see Lemma 1) we deduce that the function

$$m(\theta) - c = \int_{-\infty}^{\infty} (x-c) f(x;\theta)\,dx$$

changes sign at most once for any $c$, from $-$ to $+$ if a change occurs. This implies that $m(\theta)$ is increasing. Hence the moment estimator of $\theta$ is of the form $\hat{\theta} = m^{-1}(\bar{X})$. Since $f(x;\theta)$ is logconcave in $x$, we conclude from Lemma 3(b) that $\bar{X}$ is increasing in $\theta$ with respect to the likelihood ratio order. The theorem follows from the preserving property of the likelihood ratio order under increasing operations. □

If we weaken the assumptions of Theorem 1, we obtain the following theorem.

Theorem 2. Assume that the function $f(x;\theta)$ is TP$_2$ and $m(\theta) = \int_{-\infty}^{\infty} x f(x;\theta)\,dx < \infty$ for all $\theta$. Then the moment estimator of $\theta$ is stochastically increasing in $\theta$.

In the general case we can formulate the following corollary. We omit the proof because it follows by the same method as in Theorem 1.

Corollary 1. Assume that the function $\varphi$ is increasing and the integral $m(\theta) = \int_{-\infty}^{\infty} \varphi(x) f(x;\theta)\,dx < \infty$ for all $\theta$. If $f(x;\theta)$ is TP$_2$, then the moment estimator $\hat{\theta}$ of the form (1) is stochastically increasing in $\theta$. Moreover, if the random variable $\varphi(X)$ has a logconcave density, then the estimator $\hat{\theta}$ is increasing in $\theta$ with respect to the likelihood ratio order.

The next corollary follows immediately from Corollary 1 and from Proposition 2 of An (1998).


Corollary 2. Let $X$ be a random variable with density $f(x;\theta)$ and let $\varphi$ be a function such that the integral $m(\theta) = \int_{-\infty}^{\infty} \varphi(x) f(x;\theta)\,dx < \infty$ for all $\theta$. Assume that the following conditions are satisfied:

(a) $\varphi$ is strictly increasing, concave and differentiable;
(b) $|(\partial/\partial x)\varphi^{-1}(x)|$ is logconcave;
(c) $f(x;\theta)$ is logconcave and decreasing in $x$ on the support of $X$;
(d) $f(x;\theta)$ is TP$_2$.

Then the moment estimator $\hat{\theta}$ is increasing in $\theta$ with respect to the likelihood ratio order.

In many situations we deal with the one-parameter exponential family with densities of the form

$$f(x;\theta) = c(\theta) h(x) \exp\{\eta(\theta) T(x)\}, \quad x \in (a,b),\ \theta \in \Theta. \qquad (2)$$

Recall the well known formulas for the moments of $T(X)$ (see Casella and Berger, 2002):

$$E_\theta(T(X)) = -\frac{[\log c(\theta)]'}{[\eta(\theta)]'}, \qquad (3)$$

$$\mathrm{Var}_\theta(T(X)) = \frac{-[\log c(\theta)]'' - [\eta(\theta)]''\, E_\theta(T(X))}{([\eta(\theta)]')^2}. \qquad (4)$$
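Formulas (3) and (4) are easy to sanity-check numerically for a concrete family. For the exponential density $f(x;\theta) = \theta e^{-\theta x}$ we have $c(\theta) = \theta$, $\eta(\theta) = -\theta$, $T(x) = x$, and $E(X) = 1/\theta$, $\mathrm{Var}(X) = 1/\theta^2$; the sketch below evaluates (3) and (4) with finite-difference derivatives (the step size is an arbitrary choice of this sketch).

```python
import math

# Exponential family ingredients for f(x; theta) = theta * exp(-theta * x).
theta = 1.5
h = 1e-4
log_c = lambda t: math.log(t)   # log c(theta) = log theta
eta = lambda t: -t              # eta(theta) = -theta

d  = lambda g, t: (g(t + h) - g(t - h)) / (2 * h)            # first derivative
d2 = lambda g, t: (g(t + h) - 2 * g(t) + g(t - h)) / h ** 2  # second derivative

mean_T = -d(log_c, theta) / d(eta, theta)                                # (3)
var_T = (-d2(log_c, theta) - d2(eta, theta) * mean_T) / d(eta, theta) ** 2  # (4)
```

Both values should match the closed forms $1/\theta$ and $1/\theta^2$ up to finite-difference error.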

It is easy to prove that if both $T$ and $\eta$ are increasing (decreasing), then $f(x;\theta)$ is TP$_2$, and from the variation diminishing property it follows that $m(\theta) = E_\theta(T(X))$ is increasing (decreasing). Combining these facts with the order preserving properties of the usual stochastic order we can formulate the following theorem.

Theorem 3. For the one-parameter exponential family with densities of the form (2), where both $\eta$ and $T$ are increasing (decreasing), the moment estimator $\hat{\theta} = m^{-1}((1/n)\sum_{i=1}^{n} T(X_i))$ is stochastically increasing in $\theta$.

Let us make the following observations. Consider the maximum likelihood estimation of $\theta$ for the one-parameter exponential family (2). Let $x = (x_1, \ldots, x_n)$ be a value of a sample $X$. The likelihood function is of the form

$$L(x;\theta) = \prod_{i=1}^{n} h(x_i)\, [c(\theta)]^n \exp\left\{\eta(\theta) \sum_{i=1}^{n} T(x_i)\right\}.$$

So we have

$$\frac{\partial \log L(x;\theta)}{\partial \theta} = n[\log c(\theta)]' + [\eta(\theta)]' \sum_{i=1}^{n} T(x_i).$$

It is clear that $\partial \log L(x;\theta)/\partial\theta = 0$ if and only if

$$\frac{1}{n} \sum_{i=1}^{n} T(x_i) = -\frac{[\log c(\theta)]'}{[\eta(\theta)]'}. \qquad (5)$$

Let $\hat{\theta}$ be a solution of Eq. (5). Then $\hat{\theta}$ is the maximum likelihood estimator (MLE), since using (4) we have

$$\left.\frac{\partial^2 \log L(x;\theta)}{\partial \theta^2}\right|_{\theta=\hat{\theta}} = -n\left.\left([\eta(\theta)]'\right)^2\right|_{\theta=\hat{\theta}} \mathrm{Var}_{\hat{\theta}}(T(X)) < 0.$$

On the other hand $\hat{\theta}$ is the moment estimator, since $E_\theta(T(X)) = -[\log c(\theta)]'/[\eta(\theta)]'$. Thus from Theorem 3 we have the following result.

Theorem 4. For the one-parameter exponential family with densities of the form (2), where both $\eta$ and $T$ are increasing (decreasing), the maximum likelihood estimator $\hat{\theta}$ is stochastically increasing in $\theta$.
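The coincidence of the MLE with the moment estimator can be checked numerically. For the gamma family with known shape $a$ (where $\eta(\lambda) = -1/\lambda$ and $T(x) = x$, treated in Example 3 below), solving (5) gives $\hat{\lambda} = \bar{x}/a$; the sketch verifies that the log-likelihood is locally maximal there (sample size and perturbation size are arbitrary choices of the sketch).

```python
import math
import random

# Gamma family with known shape a and unknown scale lambda.
random.seed(2)
a, lam = 2.0, 1.5
sample = [random.gammavariate(a, lam) for _ in range(10_000)]

def log_lik(l: float) -> float:
    """Gamma log-likelihood with shape a and scale l."""
    return sum(-math.lgamma(a) - a * math.log(l)
               + (a - 1) * math.log(x) - x / l for x in sample)

x_bar = sum(sample) / len(sample)
lam_hat = x_bar / a   # moment estimator = solution of (5)

# The log-likelihood at lam_hat dominates nearby parameter values.
eps = 1e-3
local_max = (log_lik(lam_hat) >= log_lik(lam_hat - eps)
             and log_lik(lam_hat) >= log_lik(lam_hat + eps))
```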

Remark 1. Theorems 2 and 3 of Balakrishnan and Mi (2001) give conditions under which maximum likelihood estimators for the one-parameter exponential family of the form (2) are stochastically increasing. We have proved this in Theorem 4 under weaker assumptions.

Example 2. Let $X = (X_1, \ldots, X_n)$ be a sample from the distribution with density

$$f(x;\theta) = \sqrt{\frac{\theta}{\pi x^3}} \exp\{-\theta/x\}, \quad x > 0,\ \theta > 0.$$

This is a one-parameter exponential family with $T(x) = 1/x$ and $\eta(\theta) = -\theta$. Using (3) we get the moment estimator $\hat{\theta} = ((2/n)\sum_{i=1}^{n} 1/X_i)^{-1}$. By Theorem 3 the estimator $\hat{\theta}$ is stochastically increasing in $\theta$. Of course, $\hat{\theta}$ is also the maximum likelihood estimator.
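A quick simulation of Example 2 is possible because this density is that of a Lévy distribution with parameter $2\theta$, so $X$ can be sampled as $X = 2\theta/Z^2$ with $Z$ standard normal. This sampling identity and the simulation parameters are assumptions of the sketch, not part of the paper.

```python
import random

# Sample X = 2*theta / Z^2 (Levy distribution) and form the moment estimator
# theta_hat = ((2/n) * sum 1/X_i)^(-1) from T(x) = 1/x.
random.seed(3)
theta = 4.0
n = 200_000
sample = [2.0 * theta / random.gauss(0.0, 1.0) ** 2 for _ in range(n)]

t_bar = sum(1.0 / x for x in sample) / n   # empirical mean of T(x) = 1/x
theta_hat = 1.0 / (2.0 * t_bar)            # invert m(theta) = 1/(2*theta)
```

Since $1/X = Z^2/(2\theta)$ has mean $1/(2\theta)$, the estimate should be close to the true $\theta$ for large $n$.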


Example 3. Let $X = (X_1, \ldots, X_n)$ be a sample from the gamma distribution with density

$$f(x;\lambda) = \frac{1}{\Gamma(a)\lambda^a} x^{a-1} \exp\{-x/\lambda\}, \quad x > 0,\ \lambda > 0,$$

where $a > 0$ is known. This is clearly an exponential family with $T(x) = x$ and $\eta(\lambda) = -1/\lambda$, so

$$E_\lambda(X) = -\frac{(\partial/\partial\lambda)(-a\log\lambda - \log\Gamma(a))}{(\partial/\partial\lambda)(-1/\lambda)} = a\lambda.$$

By Theorem 3 the estimator $\hat{\lambda} = \bar{X}/a$ is stochastically increasing in $\lambda$. Moreover, if $a \ge 1$, then the density $f(x;\lambda)$ is logconcave in $x$. Hence for $a \ge 1$, by Theorem 1, the estimator $\hat{\lambda}$ is also increasing with respect to the likelihood ratio order.

Assume now that $\lambda$ is fixed and $a$ is unknown. This is also a one-parameter exponential family, with $T(x) = \log x$ and $\eta(a) = a$. Using (3) we get

$$E_a(\log X) = \frac{\partial}{\partial a}(\log\Gamma(a) + a\log\lambda) = \Psi(a) + \log\lambda = m(a),$$

where $\Psi(a) = (\partial/\partial a)\log\Gamma(a)$ is the digamma function. By Theorem 3 the estimator $\hat{a} = m^{-1}(\bar{T})$ is stochastically increasing in $a$. This estimator, obtained by the method of maximum likelihood, was considered in Example 2 of Balakrishnan and Mi (2001).
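The inversion $\hat{a} = m^{-1}(\bar{T})$ has no closed form, but $\Psi$ is strictly increasing, so bisection suffices. The sketch evaluates the digamma function with the standard recurrence $\Psi(x) = \Psi(x+1) - 1/x$ plus its asymptotic expansion (a self-contained implementation; `scipy.special.digamma` could be used instead, and the sample parameters are arbitrary choices).

```python
import math
import random

def digamma(x: float) -> float:
    """Digamma via recurrence psi(x) = psi(x+1) - 1/x and the asymptotic
    series psi(x) ~ ln x - 1/(2x) - 1/(12x^2) + 1/(120x^4) - 1/(252x^6)."""
    result = 0.0
    while x < 6.0:
        result -= 1.0 / x
        x += 1.0
    inv = 1.0 / x
    inv2 = inv * inv
    return (result + math.log(x) - 0.5 * inv
            - inv2 * (1 / 12 - inv2 * (1 / 120 - inv2 / 252)))

def a_hat(sample, lam: float, lo: float = 1e-3, hi: float = 1e3) -> float:
    """Solve digamma(a) + log(lam) = mean(log X_i) for a by bisection."""
    target = sum(math.log(x) for x in sample) / len(sample) - math.log(lam)
    a, b = lo, hi
    for _ in range(200):          # digamma is strictly increasing
        mid = 0.5 * (a + b)
        if digamma(mid) < target:
            a = mid
        else:
            b = mid
    return 0.5 * (a + b)

random.seed(4)
true_a, lam = 3.0, 2.0
sample = [random.gammavariate(true_a, lam) for _ in range(50_000)]
estimate = a_hat(sample, lam)
```

A useful spot check is $\Psi(1) = -\gamma$, the negative Euler constant.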

Example 4. Let $X = (X_1, \ldots, X_n)$ be a sample from the logistic distribution with density

$$f(x;\theta) = \frac{\theta \exp\{-x\}}{[1+\exp\{-x\}]^{\theta+1}}, \quad x \in \mathbb{R},\ \theta > 0.$$

This is a one-parameter exponential family with $T(x) = \log[1+\exp\{-x\}]$ and $\eta(\theta) = -\theta$. From (3) we get $E_\theta(T(X)) = 1/\theta$, hence from Theorem 3 the estimator $\hat{\theta} = 1/\bar{T}$ is stochastically increasing in $\theta$.

Example 5. Let $X = (X_1, \ldots, X_n)$ be a sample from the uniform distribution on the interval $(0,\theta)$, $\theta > 0$. Then the moment estimator of $\theta$ based on the first empirical moment is of the form $\hat{\theta} = 2\bar{X}$, and by Theorem 1 this estimator is increasing in $\theta$ with respect to the likelihood ratio order. Consider another moment estimator based on the generalized moment $\bar{\varphi} = (1/n)\sum_{i=1}^{n} \log X_i$. Easy calculations show that $E_\theta(\log X) = \log\theta - 1$. Thus from Corollary 2, the moment estimator $\hat{\theta} = \exp\{(1/n)\sum_{i=1}^{n} \log X_i + 1\}$ is increasing in $\theta$ with respect to the likelihood ratio order.
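The two estimators of Example 5 can be put side by side on simulated data; this quick numeric illustration uses arbitrary sample size and seed.

```python
import math
import random

# Uniform(0, theta) sample and the two moment estimators of Example 5.
random.seed(5)
theta = 7.0
n = 100_000
sample = [random.uniform(0.0, theta) for _ in range(n)]

est_mean = 2.0 * sum(sample) / n                                 # 2 * X_bar
est_log = math.exp(sum(math.log(x) for x in sample) / n + 1.0)   # exp(mean log + 1)
```

Both should be close to the true $\theta$ for large $n$, since $E(X) = \theta/2$ and $E(\log X) = \log\theta - 1$.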

Now we consider the case when the family of distributions under consideration is the location family. Then we can formulate the following theorem.

Theorem 5. Let $\mathcal{F} = \{f(x;\theta) = g(x-\theta), \theta \in \mathbb{R}\}$ be the location family of densities. Suppose that $\mu_1 = E(X(0)) = \int_{-\infty}^{\infty} x g(x)\,dx < \infty$. Then the estimator $\hat{\theta} = \bar{X} - \mu_1$ is stochastically increasing in $\theta$. Moreover, if $g$ is logconcave, then $\hat{\theta}$ is also increasing in $\theta$ with respect to the likelihood ratio order.

Proof. The estimator $\hat{\theta}$ is stochastically increasing in $\theta$ since the family $\mathcal{F}$ is stochastically increasing in $\theta$. If we assume that $g$ is logconcave, i.e. $g(x-\theta)$ is TP$_2$, then the family $\mathcal{F}$ has monotone likelihood ratio. Since the convolution of logconcave functions is logconcave, we deduce that the estimator $\hat{\theta}$ is also increasing with respect to the likelihood ratio order. □

Similar results may be obtained for the scale parameter family. The ordering property of the moment estimators of $\theta$ in this case is described by the following theorem.

Theorem 6. Let $\mathcal{F} = \{f(x;\theta) = g(x/\theta)/\theta, \theta > 0\}$, where $g(x)$ is a density on $(0,\infty)$, be the scale parameter family of densities. Suppose that the $k$th moment of the random variable $X(1)$ with density $g(x)$ exists and $E(X^k(1)) = \mu_k$. Then the moment estimator $\hat{\theta} = \sqrt[k]{m_k/\mu_k}$ is stochastically increasing in $\theta$. Moreover, if $\mathrm{Var}(X(1)) = \sigma^2 < \infty$, then the moment estimator for $\theta$ given by $\hat{\theta} = \sqrt{S^2}/\sigma$, $\sigma > 0$, where $S^2$ is the biased sample variance, is also stochastically increasing in $\theta$.

Proof. The first part of the theorem is obvious since the family $\mathcal{F}$ is stochastically increasing in $\theta$. It is also easy to prove that if $\theta_1 < \theta_2$, $\theta_1, \theta_2 \in \Theta$, then $X(\theta_1) \le_{disp} X(\theta_2)$, and then the vector of spacings $U = (U_1, \ldots, U_n)$, where $U_i = X_{i:n} - X_{i-1:n}$, $i = 1, \ldots, n$, $X_{0:n} = 0$, is stochastically increasing in $\theta$ (see Oja, 1981). Since

$$S^2 = \frac{1}{n^2} \sum_{1 \le i < j \le n} (X_{j:n} - X_{i:n})^2 = \frac{1}{n^2} \sum_{1 \le i < j \le n} (U_j + U_{j-1} + \cdots + U_{i+1})^2,$$

the sample variance is an increasing function of the vector $U$. Thus the theorem follows from the preserving property of multivariate stochastically ordered vectors under monotone operations (see Marshall and Olkin, 1979). □
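Theorem 6 can be illustrated with $g(x) = e^{-x}$ on $(0,\infty)$: then $X(\theta)$ is exponential with scale $\theta$, $\mu_1 = 1$ and $\sigma^2 = 1$, so both $m_1/\mu_1$ and $\sqrt{S^2}/\sigma$ estimate $\theta$. This is an illustrative simulation with arbitrary parameters, not part of the proof.

```python
import math
import random

# Scale family with g(x) = exp(-x): X(theta) = theta * Exp(1).
random.seed(6)
theta = 2.5
n = 200_000
sample = [theta * random.expovariate(1.0) for _ in range(n)]

m1 = sum(sample) / n
est_moment = m1 / 1.0                         # (m_1 / mu_1)^(1/1), mu_1 = 1
s2 = sum((x - m1) ** 2 for x in sample) / n   # biased sample variance
est_scale = math.sqrt(s2) / 1.0               # sqrt(S^2) / sigma, sigma = 1
```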


Example 6. Let $X = (X_1, \ldots, X_n)$ be a sample from the logistic distribution with density

$$f(x;\theta) = \frac{\exp\{-(x-\theta)\}}{[1+\exp\{-(x-\theta)\}]^2}, \quad x \in \mathbb{R},\ \theta \in \mathbb{R}.$$

It is not difficult to see that the density $f(x;0)$ is logconcave in $x$ and $E_\theta(X) = \theta$. By Theorem 5 the moment estimator $\hat{\theta} = \bar{X}$ is increasing in $\theta$ with respect to the likelihood ratio order.

Example 7. Let $X = (X_1, \ldots, X_n)$ be a sample from the Weibull distribution with density

$$f(x;\theta) = (1/\theta) x^{1/\theta - 1} \exp\{-x^{1/\theta}\}, \quad x > 0,\ \theta > 0.$$

It is obvious that the family of these distributions is not stochastically ordered in the parameter $\theta$. Let $T_i = -\log X_i$, $i = 1, \ldots, n$. After an easy calculation we have that $T_1 =_{st} \theta W_1$, where $W_1$ has the Gumbel distribution with distribution function $e^{-e^{-x}}$, $x \in \mathbb{R}$. It is known that $E(W_1) = \gamma$, where $\gamma$ is the Euler constant, and $\mathrm{Var}(W_1) = \pi^2/6$. Hence $\hat{\theta} = \sqrt{6 S_T^2}/\pi$ and $\hat{\theta} = \bar{T}/\gamma$ are moment estimators for $\theta$, where $S_T^2 = (1/n)\sum_{i=1}^{n}(T_i - \bar{T})^2$, but we cannot apply Theorem 6 here since the random variable $W_1$ is supported on the whole real line.

So, let us consider $Z_i = |\log X_i|$, $i = 1, \ldots, n$. Then we have $Z_1 =_{st} \theta|W_1|$ and $\mu = E(Z_1) = \gamma - 2\mathrm{Ei}(-1) = 1.01598\ldots$, where

$$\mathrm{Ei}(x) = -\int_{-x}^{\infty} e^{-t}/t\,dt.$$

Similarly, after calculations we have

$$\sigma^2 = \mathrm{Var}(Z_1) = \pi^2/6 + 4(\gamma - \mathrm{Ei}(-1))\mathrm{Ei}(-1) = 0.945889\ldots$$

By Theorem 6 the estimator $\hat{\theta} = \sqrt{S_Z^2}/\sigma$ is stochastically increasing in $\theta$. The same is true for the estimator $\hat{\theta} = \bar{Z}/\mu$.
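The constants $\mu$ and $\sigma^2$ quoted in Example 7 can be recomputed independently. The sketch evaluates $\mathrm{Ei}(-1) = -\int_1^\infty e^{-t}/t\,dt$ by Simpson's rule on a truncated range (the truncation point and step count are numerical choices of this sketch; `scipy.special.expi` would do the same job).

```python
import math

def ei_minus_one(upper: float = 60.0, steps: int = 200_000) -> float:
    """Ei(-1) = -integral from 1 to infinity of exp(-t)/t dt, by Simpson's
    rule on [1, upper]; the tail beyond upper=60 is negligibly small."""
    h = (upper - 1.0) / steps
    f = lambda t: math.exp(-t) / t
    s = f(1.0) + f(upper)
    for k in range(1, steps):
        t = 1.0 + k * h
        s += (4.0 if k % 2 else 2.0) * f(t)
    return -(h / 3.0) * s

gamma = 0.5772156649015329   # Euler constant
ei = ei_minus_one()
mu = gamma - 2.0 * ei                                   # E(Z_1) at theta = 1
sigma2 = math.pi ** 2 / 6.0 + 4.0 * (gamma - ei) * ei   # Var(Z_1) at theta = 1
```

The results should reproduce the quoted values $\mu = 1.01598\ldots$ and $\sigma^2 = 0.945889\ldots$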

References

An, M.Y., 1998. Logconcavity versus logconvexity: a complete characterization. Journal of Economic Theory 80, 350–369.
Balakrishnan, N., Mi, J., 2001. Order-preserving property of maximum likelihood estimator. Journal of Statistical Planning and Inference 98, 89–99.
Borovkov, A.A., 1998. Mathematical Statistics. Gordon and Breach Science Publishers, Amsterdam.
Casella, G., Berger, R.L., 2002. Statistical Inference, 2nd ed. Duxbury Press, Boston.
Karlin, S., 1968. Total Positivity. Stanford University Press, California.
Marshall, A.W., Olkin, I., 1979. Inequalities: Theory of Majorization and Its Applications. Academic Press, New York.
Oja, H., 1981. On location, scale, skewness and kurtosis of univariate distributions. Scandinavian Journal of Statistics 8, 154–168.
Schoenberg, I.J., 1951. On Polya frequency functions, I. The totally positive functions and their Laplace transforms. Journal d'Analyse Mathematique 1, 331–374.
Shaked, M., Shanthikumar, J.G., 2007. Stochastic Orders. Springer Verlag, New York.