
AFOSR-TR-88-0359

CENTER FOR STOCHASTIC PROCESSES

Department of Statistics
University of North Carolina
Chapel Hill, North Carolina

ON STABLE MARKOV PROCESSES

by

Robert J. Adler

Stamatis Cambanis

Gennady Samorodnitsky

Technical Report No. 203

September 1987


REPORT DOCUMENTATION PAGE

1a. Report Security Classification: Unclassified
3. Distribution/Availability of Report: Approved for public release; distribution unlimited.
4. Performing Organization Report Number: Technical Report No. 203
5. Monitoring Organization Report Number: AFOSR-TR-88-0359
6. Performing Organization: University of North Carolina, Statistics Dept., 321-A Phillips Hall 039-A, Chapel Hill, NC 27514
7. Monitoring Organization: AFOSR/NM, Bldg 410, Bolling AFB, DC 20332-6448
9. Procurement Instrument Identification Number: F49620-85-C-0144
10. Source of Funding: Program Element 61102F, Project 2304
11. Title (Include Security Classification): On stable Markov processes
12. Personal Author(s): Adler, R.J., Cambanis, S. and Samorodnitsky, G.
13. Type of Report: preprint; Time Covered: from 9/87 to 8/88; Date of Report: September 1987
16. Supplementary Notation: N/A
18. Subject Terms: Markov and weakly Markov stable processes; time changed Lévy motion; sub-Gaussian and harmonizable processes; moving averages; left and right stable Ornstein-Uhlenbeck processes; stable conditional distributions
19. Abstract: Necessary conditions are given for a symmetric α-stable (SαS) process, 1 < α < 2, to be Markov. These conditions are then applied to find Markov or weakly Markov processes within certain important classes of SαS processes: time changed Lévy motion, sub-Gaussian processes, moving averages and harmonizable processes. Two stationary SαS Markov processes are introduced, the right and the left SαS Ornstein-Uhlenbeck processes. Some of the results are in sharp contrast to the Gaussian case α = 2.
20. Abstract Security Classification: Unclassified/Unlimited
22. Name of Responsible Individual: Maj. Brian Woodruff, (202) 767-5026



ON STABLE MARKOV PROCESSES

by

Robert J. Adler²

Stamatis Cambanis

Gennady Samorodnitsky³

Center for Stochastic Processes
Department of Statistics
University of North Carolina
Chapel Hill, NC 27599-3260

Abstract: Necessary conditions are given for a symmetric α-stable (SαS) process, 1 < α < 2, to be Markov. These conditions are then applied to find Markov or weakly Markov processes within certain important classes of SαS processes: time changed Lévy motion, sub-Gaussian processes, moving averages and harmonizable processes. Two stationary SαS Markov processes are introduced, the right and the left SαS Ornstein-Uhlenbeck processes. Some of the results are in sharp contrast to the Gaussian case α = 2.

AMS 1980 subject classification: Primary 60J25, 60G10.

Key words and phrases: Markov and weakly Markov stable processes, time changed Lévy motion, sub-Gaussian and harmonizable processes, moving averages, left and right stable Ornstein-Uhlenbeck processes, stable conditional distributions.

¹Research supported by the Air Force Office of Scientific Research Contract No. F49620-85-C-0144. G. Samorodnitsky was also supported by the Dr. Chaim Weizman Foundation.

²Faculty of Industrial Engineering & Management, Technion, Haifa 32000, Israel.

³Department of Mathematics, Boston University, Boston, MA 02215.


1. INTRODUCTION

Throughout, X = {X(t), t ∈ T} is a real symmetric α-stable (SαS) process with 0 < α < 2 and T an interval on the real line; i.e. all finite linear combinations $\sum_{n=1}^{N} a_n X(t_n)$ are SαS random variables.

X is Markov if for all s < t in T, the conditional distribution of X(t) given X(u), u ≤ s, "coincides" with the conditional distribution of X(t) given X(s) alone, in the sense that for any $u_i < s$, i = 1,...,n, and any Borel set E the equality $P\{X(t) \in E \mid X(u_1), \ldots, X(u_n), X(s)\} = P\{X(t) \in E \mid X(s)\}$ holds with probability 1. As is well known, the Markovian property is equivalent to the conditional independence of the past σ{X(u), u ≤ t} and future σ{X(u), u ≥ t} σ-fields given the present σ{X(t)}, and thus it is symmetric in time and could be defined by requiring that for all t < s in T the conditional distribution of X(t) given X(u), u ≥ s, "coincides" with the conditional distribution of X(t) given X(s) alone. Conditional distributions of non-Gaussian stable processes are generally very difficult to compute (and generally not stable), and it is thus not easy to check for the Markovian property. For this reason we introduce a weaker Markovian property which is amenable to some analysis and which concentrates on regressions.

For 1 < α < 2 we have $E|X(t)| < \infty$ and we say that X is left weakly Markov if for all s < t in T, with probability 1,

$E\{X(t) \mid X(u),\ u \le s\} = E\{X(t) \mid X(s)\},$

and right weakly Markov if for all s < t, with probability 1,

$E\{X(s) \mid X(u),\ u \ge t\} = E\{X(s) \mid X(t)\}.$

In the Gaussian case α = 2 the left and right weak Markovian properties are equivalent, and they are also equivalent to the Markovian property. Furthermore, there is only one stationary Gaussian process which is Markov, namely the Ornstein-Uhlenbeck process with covariance function $R(t) = R(0)e^{-\lambda|t|}$. In contrast there are stationary non-Gaussian SαS processes with 1 < α < 2 which:

(i) are left weakly Markov, without being right weakly Markov, and vice versa (cf. Section 7);

(ii) are left and right weakly Markov without being Markov, e.g. the sub-Ornstein-Uhlenbeck processes (Corollary 4.2);

(iii) are Markov, namely the SαS Ornstein-Uhlenbeck processes in (2.14) whose covariation function is the nonsymmetric double exponential function in (2.15);

(iv) have the symmetric covariation function $R(t) = R(0)e^{-\lambda|t|}$ but are neither left nor right weakly Markov, namely the harmonizable process in (6.3).

Two distinct SαS stationary Markov processes are identified in this paper. These are the right and the left SαS Ornstein-Uhlenbeck processes, which can be represented respectively as decreasing and increasing time changes of SαS Lévy motion (cf. (2.14) and (3.3)), or as nonanticipating and fully anticipating moving averages of SαS Lévy motion (Theorem 5.1 and (5.4)), and are the stationary solutions of certain first order stochastic differential equations driven by SαS white noise. Even though there might be further SαS stationary Markov processes, none is currently known. Such processes are not sub-Gaussian (Corollary 4.2) or harmonizable (Theorem 6.1); and they are neither nonanticipating nor fully anticipatory invertible moving averages, as the left and the right SαS Ornstein-Uhlenbeck processes are the only such SαS moving averages (Theorem 5.1 and page 33). Finally, neither one of their pairwise conditional distributions can be α-stable and symmetric, since the left and the right SαS Ornstein-Uhlenbeck processes are again the only ones possessing this property (Theorem 3.1). This may explain the difficulties in constructing other SαS stationary Markov processes, if indeed there are any.

Without requiring stationarity, the Gaussian case is still quite simple: all Gaussian Markov processes are essentially time changes of Brownian motion, see Timoszyk (1974), Borisov (1982) and Wong and Hajek (1985). For non-Gaussian SαS processes with 1 < α < 2 the picture is more complex and rich. A necessary condition for left weak Markovianness is given in Theorem 2.1 in terms of the covariation function, and its general solution is found, in the nonstationary case in (2.5) and its generalization, and in the stationary case in (2.9). While all time changes of Lévy motion have covariation function of this form (i.e. (2.5)), and are in fact Markov, they do not exhaust the class of SαS processes with covariation function of this form, e.g. the Lévy bridge (see Example 2.1).

Time changes of Lévy motion are considered in Section 3, where they are shown to be the only SαS Markov processes whose pairwise conditional distributions are stable and symmetric (Theorem 3.1). In the non-Gaussian stable case there is also a marked asymmetry: the SαS Markov processes whose right to left and left to right pairwise conditional distributions are stable and symmetric are few and trivial when 1 < α < 2, whereas in the Gaussian case α = 2 they coincide with the entire class of Gaussian Markov processes.

An auxiliary result of independent interest is given in Proposition 3.1 and Corollary 3.1, characterizing the stability of the conditional distribution(s) of random variables that are jointly SαS.

Sub-Gaussian processes are left (right) weakly Markov if and only if they are essentially time changes of sub-Brownian motion, except for trivial cases, and they are not Markov (Theorem 4.2). In particular, the only weakly Markov stationary SαS sub-Gaussian processes are the sub-Ornstein-Uhlenbeck processes (Corollary 4.2).

Sections 5 and 6 consider two specific classes of stationary SαS processes, moving averages and harmonizable SαS processes. It is shown that in the case of either nonanticipating or fully anticipatory invertible moving averages, the weak Markov property cannot exist without full Markovianness, and it is realized only by the right and the left SαS Ornstein-Uhlenbeck processes correspondingly (Theorem 5.1 and page 33). In sharp contrast to the Gaussian case α = 2, it turns out that for the stable case 1 < α < 2, weak Markovianness never prevails for harmonizable processes (Theorem 6.1). Finally, a family of one-sided weakly Markov (i.e. left weakly Markov but not right weakly Markov, or vice versa) SαS processes is constructed in Section 7.


2. GENERALITIES

A SαS process X can be represented by an integral of the form

(2.1)  $X(t) = \int_U f(t,u)\, dZ(u), \quad t \in T,$

where Z is a SαS random measure on some σ-finite measure space (U, 𝒰, μ) (i.e. Z is an independently scattered σ-additive set function on $\mathcal{U}_0 = \{E \in \mathcal{U} : \mu(E) < \infty\}$ with $E\exp\{irZ(E)\} = \exp\{-\mu(E)|r|^\alpha\}$ for $E \in \mathcal{U}_0$) and $\{f(t,\cdot),\ t \in T\} \subset L^\alpha(U, \mathcal{U}, \mu)$ (Kanter (1972), Kuelbs (1973) and Hardin (1982)). All quantities are real-valued (except in Section 6, where complex-valued processes are discussed), and μ is called the control measure of Z. For every $g \in L^\alpha(\mu)$ the integral $\int g\, dZ$ is a SαS r.v. with $E\exp\{ir\int g\, dZ\} = \exp\{-|r|^\alpha \int |g|^\alpha\, d\mu\}$, and is linear in g. Specific examples of such representations of SαS processes will be considered in Sections 5 and 6.
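Numerical aside (ours, not part of the development): a standard SαS random variable with characteristic function $\exp(-|r|^\alpha)$ can be generated by the Chambers-Mallows-Stuck transformation. The following Python sketch, with function names of our own choosing, is reused in the later numerical asides.

    import numpy as np

    rng = np.random.default_rng(0)

    def sas_rv(alpha, size):
        # Standard symmetric alpha-stable draws, E exp(i r X) = exp(-|r|^alpha),
        # via the Chambers-Mallows-Stuck transformation (0 < alpha <= 2).
        u = rng.uniform(-np.pi / 2, np.pi / 2, size)  # uniform angle
        w = rng.exponential(1.0, size)                # unit exponential
        return (np.sin(alpha * u) / np.cos(u) ** (1 / alpha)
                * (np.cos((1 - alpha) * u) / w) ** ((1 - alpha) / alpha))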

The covariation function of X is

(2.2)  $R(t,s) = \mathrm{Cov}[X(t), X(s)] = \int_U f(t,u)\, f(s,u)^{\langle\alpha-1\rangle}\, d\mu(u),$

where $x^{\langle q\rangle} = |x|^q\,\mathrm{sgn}(x)$, and it does not depend on the specific representation of X (for more on the covariation see Section 3). In the Gaussian case, α = 2, the covariation is a multiple of the covariance, R(t,t) determines the distribution of X(t), and the numbers R(t,t), R(s,s), R(t,s) determine the joint distribution of X(t), X(s); thus knowledge of R on T×T determines the distribution of the (zero-mean) Gaussian process X. In the non-Gaussian SαS case 1 < α < 2, the covariation is not generally a symmetric function of its arguments and is linear only in the first argument, R(t,t) determines the distribution of X(t), but the numbers R(t,t), R(s,s), R(t,s), R(s,t) do not generally determine the joint distribution of X(t), X(s). Thus knowledge of the covariation function R on T×T generally does not determine the bivariate distributions of the SαS process X. Still, as we shall see, the covariation function plays a role partially analogous to the role played by the covariance function in the Gaussian case.

A basic result on the left weak Markovian property is the following.

Theorem 2.1. X is left weakly Markov if and only if

(2.3)  $\mathrm{Cov}\Big[X(t) - \frac{R(t,s)}{R(s,s)}X(s),\ Y\Big] = 0$

for all s < t and all $Y \in \overline{sp}\{X(u),\ u \le s\}$, where the closure is in probability. If X is left weakly Markov then

(2.4)  $R(t_3,t_2)\,R(t_2,t_1) = R(t_3,t_1)\,R(t_2,t_2)$ for all $t_1 \le t_2 \le t_3$.

A covariation function R with R(t,s) ≠ 0 for all s < t in T satisfies (2.4) if and only if it is of the form

(2.5)  $R(t,s) = H(t)\,K(s)^{\langle\alpha-1\rangle}$ for all $s \le t$,

where the functions K, H are unique up to a multiplicative constant, have the same sign, and K(t)/H(t) is positive and nondecreasing on T.

Proof. It is known (see Kanter (1972)) that

$E\{X(t) \mid X(s)\} = \frac{R(t,s)}{R(s,s)}\,X(s).$

Therefore X is left weakly Markov iff

$E\{X(t) \mid X(u),\ u \le s\} = \frac{R(t,s)}{R(s,s)}\,X(s), \quad \forall\ s < t,$

and by [3, Proposition 1.5], a necessary and sufficient condition for this is (2.3) for all s < t and $Y \in \overline{sp}\{X(u),\ u \le s\}$.

Now if X is left weakly Markov, taking Y = X(u), u ≤ s, we obtain

$R(t,u) = \frac{R(t,s)}{R(s,s)}\,R(s,u), \quad \forall\ u \le s \le t,$

which is (2.4). The general form (2.5) of the solution of (2.4) is obtained as in Borisov (1982) by taking, for some interior point $t_0$ of T, $K(t)^{\langle\alpha-1\rangle} = R(t_0,t)$ for $t \le t_0$, $= R(t,t)R(t_0,t_0)/R(t,t_0)$ for $t > t_0$, and $H(t) = R(t,t)/R(t_0,t)$ for $t \le t_0$, $= R(t,t_0)/R(t_0,t_0)$ for $t > t_0$. Since by (2.2), $R(t,t) \ge 0$ and by assumption $R(t,t) \ne 0$, it follows from $0 < R(t,t) = H(t)K(t)^{\langle\alpha-1\rangle}$ that K and H have the same sign at each point. Also from (2.2) and Hölder's inequality we obtain

(2.6)  $|R(t,s)| \le \{R(t,t)\}^{1/\alpha}\{R(s,s)\}^{1-1/\alpha},$

and substituting from (2.5) we have $K(s)/H(s) \le K(t)/H(t)$. Since $KH^{-1}$ is positive, it is nondecreasing on T. Conversely, (2.5) implies (2.4) immediately, and the property $KH^{-1}$ nondecreasing is needed to show that R given by (2.5) is a covariation function. The simplest way of showing this is by constructing a SαS process with covariation (2.5), as was done in the Gaussian case in Wong and Hajek (1985), p. 64. Indeed, using the time change $\tau(t) = \{K(t)H^{-1}(t)\}^{\alpha-1}$ (nondecreasing), and the SαS Lévy motion L = {L(t), t ≥ 0} which has stationary independent increments, L(0) = 0, and $E\exp\{ir[L(t)-L(s)]\} = \exp\{-|r|^\alpha|t-s|\}$, we can introduce the SαS process

(2.7)  $X(t) = H(t)\,L(\tau(t))$

whose covariation function is, for s ≤ t,

$\mathrm{Cov}[X(t),X(s)] = H(t)H(s)^{\langle\alpha-1\rangle}\,\mathrm{Cov}[L(\tau(t)), L(\tau(s))] = H(t)H(s)^{\langle\alpha-1\rangle}\,\tau(s) = H(t)H(s)^{\langle\alpha-1\rangle}\Big\{\frac{K(s)}{H(s)}\Big\}^{\alpha-1}$

(2.8)  $= H(t)K(s)^{\langle\alpha-1\rangle} = R(t,s)$

since K(t)H(t) > 0. □
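Numerical aside: the construction (2.7) amounts to cumulating independent SαS increments along the clock τ. A minimal sketch building on sas_rv above (the particular H and τ below are illustrative choices of ours):

    def levy_motion(alpha, clock):
        # SaS Levy motion sampled at the nondecreasing times `clock` (>= 0):
        # L(0) = 0 and L(t) - L(s) is SaS with scale |t - s|^(1/alpha).
        increments = np.diff(clock, prepend=0.0) ** (1 / alpha) * sas_rv(alpha, len(clock))
        return np.cumsum(increments)

    def time_changed(alpha, t, H, tau):
        # X(t) = H(t) L(tau(t)) as in (2.7), on the grid t, tau nondecreasing.
        return H(t) * levy_motion(alpha, tau(t))

    t = np.linspace(0.0, 1.0, 1001)
    X = time_changed(1.5, t, H=lambda u: 1.0 + u, tau=lambda u: u ** 2)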

In the Gaussian case α = 2, the covariation is linear in its second argument (as well as in its first), and the necessary condition (2.4) is also sufficient; thus when R(t,t) ≠ 0, t ∈ T, conditions (2.3), (2.4), (2.5) and (2.7) are all equivalent, and all Gaussian Markov processes are time changes of Brownian motion. However, in the non-Gaussian SαS case with 1 < α < 2, the covariation is generally not linear in its second argument and the necessary condition (2.4) is not sufficient. Also, while the time changes of SαS Lévy motion (2.7) have covariation function of the form (2.5), they do not exhaust the class of SαS processes with covariation function of the form (2.5).

Example 2.1. Lévy bridge.

Again let L be the Lévy motion, and let B(t) = L(t) − tL(1), 0 ≤ t ≤ 1. This is one of the possible generalizations of the Brownian bridge to the SαS case, α < 2. It is straightforward to check that for this process

$R(t,s) = \mathrm{Cov}[B(t), B(s)] = \begin{cases} s(1-t)\,[(1-s)^{\alpha-1} + s^{\alpha-1}] & \text{if } 0 \le s \le t \le 1, \\ t(1-s)\,[(1-s)^{\alpha-1} + s^{\alpha-1}] & \text{if } 0 \le t \le s \le 1. \end{cases}$

Moreover, B(t) is easily seen to satisfy the condition (2.3) for any $Y = \sum_{i=1}^{k} a_iX(u_i)$, $u_i \le s$, i = 1,2,...,k, and, therefore, for any $Y \in \overline{sp}\{X(u),\ u \le s\}$. This process is, therefore, left weakly Markov and, in fact, two-sided weakly Markov, since its right weak Markovianness can be established similarly.

The Lévy bridge B(t) is an example of a two-sided weakly Markov SαS process which is not a time changed Lévy motion (see (2.12)). Other examples of such processes are furnished by the sub-Gaussian SαS processes (see Section 4).

The process B(t) is probably not Markov. It is interesting to note that another possible generalization of the Brownian bridge, namely B′(t) = (1−t)L[t/(1−t)], 0 ≤ t < 1, which is clearly distinct from B(t) when α < 2, is a Markov process, and its covariation function is given by

$R'(t,s) = \mathrm{Cov}[B'(t), B'(s)] = \begin{cases} s(1-t)(1-s)^{\alpha-2} & \text{if } 0 \le s \le t < 1, \\ t(1-s)^{\alpha-1} & \text{if } 0 \le t \le s < 1. \end{cases}$

Not much seems to be known about the role of these processes (if any) in the weak convergence of empirical processes.
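Numerical aside: sample paths of both bridges follow directly from the earlier sketch (grid and α are arbitrary choices of ours; B′ is evaluated away from t = 1 to keep its clock finite):

    alpha = 1.5
    t = np.linspace(0.0, 1.0, 1001)
    L = levy_motion(alpha, t)                  # SaS Levy motion on [0, 1]
    B = L - t * L[-1]                          # Levy bridge B(t) = L(t) - t L(1)
    s = t[:-1]                                 # stop short of t = 1
    B_prime = (1.0 - s) * levy_motion(alpha, s / (1.0 - s))  # B'(t) = (1-t) L(t/(1-t))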

The solution of (2.4) in the general case, i.e. without the condition R(t,t) ≠ 0 on T, can be obtained as in the Gaussian case (Timoszyk (1974), Borisov (1982)), in the form (2.5) on a finite or denumerable union of disjoint squares around the diagonal of T×T (and zero elsewhere).

When X is stationary (T = R¹) then R(t,s) depends only on t−s and we write R(t,s) = R(t−s). When α = 2 the converse is also true, but this is not generally true when 1 < α < 2. When R(t,s) = R(t−s) for all t,s ∈ R¹, we say that X is covariation stationary. In the presence of stationarity, Theorem 2.1 reduces to the following simpler form.

Corollary 2.1. Let T = R¹. If X is covariation stationary and left weakly Markov, then for some 0 ≤ λ ≤ ∞,

(2.9)  $R(t) = R(0)\,e^{-\lambda t}$ for all t ≥ 0.

If X is stationary, then it is left weakly Markov if and only if

(2.10)  $\mathrm{Cov}[X(t) - e^{-\lambda t}X(0),\ Y] = 0$

for some 0 ≤ λ ≤ ∞ and all t > 0, $Y \in \overline{sp}\{X(u),\ u \le 0\}$.

Proof. If X is covariation stationary and left weakly Markov, then (2.4) is satisfied and can be written in the form

$R(u)R(v) = R(u+v)R(0) \quad \text{for all } u,v \ge 0.$

Since by (2.6), |R(t)| ≤ R(0) for all t, R(t)/R(0) is bounded, and therefore the solutions of the above equation are given by (2.9) (see Feller (1968), p. 459). □
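In other words, with φ(u) = R(u)/R(0) the relation above is the multiplicative Cauchy equation, and |φ| ≤ 1 leaves only the exponential solutions:

    \varphi(u+v) = \varphi(u)\,\varphi(v),\ u,v \ge 0,\quad |\varphi| \le 1
    \;\Longrightarrow\; \varphi(u) = e^{-\lambda u}\ \text{for some } 0 \le \lambda \le \infty,

with λ = ∞ read as φ(u) = 0 for u > 0.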

When λ = 0 in (2.9), R(t) = R(0) for all t ≥ 0, i.e. equality holds in Hölder's inequality (2.6), and thus for each pair s < t we have X(t) = X(s) a.s. Hence X is equal in law to a constant process {C(t) = aZ, −∞ < t < ∞}, a ≥ 0, Z a standard SαS r.v., and every separable modification of X has constant paths. At the other extreme, when λ = +∞, we have R(t) = 0 for t > 0 and R(0) > 0, so that the stationary process X is not continuous in probability and thus its sample functions do not have measurable modifications ([2], p. 3), i.e. it is very irregular. The interesting case then is when 0 < λ < ∞. In the Gaussian case α = 2, the symmetry of R and the fact that it determines the distribution of X imply that the only stationary, Gaussian, left weakly Markov processes are the Ornstein-Uhlenbeck processes with covariance function $R(t) = R(0)e^{-\lambda|t|}$, −∞ < t < ∞, which are in fact Markov. As we shall see, in the non-Gaussian SαS case 1 < α < 2 there exist left weakly Markov stationary processes that are not Markov (see e.g. Section 4).

Results analogous to Theorem 2.1 and Corollary 2.1 are clearly valid for the right weak Markovian property. We will not repeat the details here; we only mention that (2.4) takes the form

$R(t_1,t_2)\,R(t_2,t_3) = R(t_1,t_3)\,R(t_2,t_2) \quad \text{for all } t_1 \le t_2 \le t_3,$

and (2.5) takes the following form:

$R(t,s) = H(s)\,K(t)^{\langle\alpha-1\rangle} \quad \text{for all } t \le s,$

where K(t)/H(t) is positive and nondecreasing on T.

Therefore, if X is two-sided weakly Markov with R(t,t) ≠ 0 on T, then

(2.11)  $R(t,s) = \begin{cases} H_1(t)\,K_1(s)^{\langle\alpha-1\rangle}, & s \le t, \\ H_2(t)\,K_2(s)^{\langle\alpha-1\rangle}, & t \le s. \end{cases}$

When 1 < α < 2 the two pairs of functions K₁, H₁ and K₂, H₂ need not be identical, as is the case with the time changes of SαS Lévy motion defined by (2.7), where

(2.12)  $H_2(t) = H_1(t)^{\langle 2-\alpha\rangle}K_1(t)^{\langle\alpha-1\rangle}, \qquad K_2(t) = H_1(t).$

These Lévy motion time changes are in fact Markov, as follows from

$X(t) = \frac{H(t)}{H(s)}\,X(s) + H(t)\,\{L(\tau(t)) - L(\tau(s))\},$

and they have α-stable conditional distributions symmetric about {H(t)/H(s)}X(s) for s < t:

$E\{\exp[irX(t)] \mid X(u),\ u \le s\} = \exp\Big\{ir\frac{H(t)}{H(s)}X(s)\Big\}\; E\exp\{irH(t)[L(\tau(t)) - L(\tau(s))]\}$

(2.13)  $= \exp\Big\{ir\frac{H(t)}{H(s)}X(s) - |r|^\alpha\,|H(t)|^\alpha[\tau(t)-\tau(s)]\Big\} = E\{\exp[irX(t)] \mid X(s)\}.$

In particular, every two-sided weakly Markov stationary SαS process has

$R(t) = \begin{cases} R(0)\,e^{-\lambda_1 t}, & t \ge 0, \\ R(0)\,e^{\lambda_2 t}, & t \le 0, \end{cases}$

for some 0 ≤ λ₁, λ₂ ≤ ∞. When 1 < α < 2, the exponents λ₁ and λ₂ need not be equal, see (2.15).

Among the time changes (2.7) of SαS Lévy motion the only stationary ones are of the form

(2.14)  $X(t) = a\,e^{-\lambda t}\,L(e^{\alpha\lambda t}), \quad -\infty < t < \infty,$

for some 0 < a < ∞, 0 ≤ λ < ∞; and in fact it can be easily seen that they are the only ones


with stationary bivariate distributions (i.e. for these time changes, bivariate stationarity implies stationarity). When 0 < λ < ∞ the Markov processes (2.14) will be called SαS Ornstein-Uhlenbeck processes with parameters a and λ. Using (2.7), (2.11) and (2.12) we conclude that the covariation function of the SαS Ornstein-Uhlenbeck process (2.14) is given by

(2.15)  $R(t) = \begin{cases} R(0)\,e^{-\lambda t}, & t \ge 0, \\ R(0)\,e^{(\alpha-1)\lambda t}, & t \le 0, \end{cases}$

and is not symmetric unless α = 2.
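Numerical aside: (2.14) is a deterministic damping of a Lévy motion run on an exponential clock; a sketch with illustrative parameters (ours), reusing levy_motion from above:

    a, lam, alpha = 1.0, 1.0, 1.5
    t = np.linspace(0.0, 5.0, 2001)
    X_ou = a * np.exp(-lam * t) * levy_motion(alpha, np.exp(alpha * lam * t))
    # Scale check: the scale of X_ou(t) is a^alpha e^{-alpha lam t} e^{alpha lam t}
    # = a^alpha, constant in t, consistent with stationarity.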


3. MORE ON TIME CHANGED LÉVY MOTION

In Section 2 we saw that time changes of SαS Lévy motion are Markov with the conditional distribution of X(t) given X(s) α-stable and symmetric, for any s < t. Here we show that all SαS Markov processes with right to left conditional distributions (as above) α-stable and symmetric are made up from independent segments of time changed Lévy motion; and in particular the only stationary ones with dependent values are SαS Ornstein-Uhlenbeck processes.

Recall that a SαS Lévy motion L = {L(t), t ≥ 0} is a process with stationary independent increments, L(0) = 0 a.s., and for all real r and t, s ≥ 0,

(3.1)  $E\exp\{ir[L(t) - L(s)]\} = \exp\{-|r|^\alpha\,|t-s|\}.$

If τ(t) is positive and nondecreasing on T, and H(t) is positive on T, then the time change of the SαS Lévy motion

(3.2)  $X(t) = H(t)\,L(\tau(t)), \quad t \in T,$

is Markov and, for s < t, the conditional distribution of X(t) given X(s) is α-stable and symmetric, cf. (2.13). Note that we can regard the above time change as increasing (the new clock τ(t) is an increasing function on T). Similarly, we can define a decreasing time change of SαS Lévy motion by taking the clock τ(t) in (3.2) to be a decreasing nonnegative function on T. Of course, the new class of SαS processes obtained in this way consists of Markov processes. Moreover, they have the following common property: for s > t, the conditional distribution of X(t) given X(s) is α-stable and symmetric. These properties of the time changes of SαS Lévy motion are quite remarkable. We will see in this section that there are not many Markov SαS processes whose conditional distributions are α-stable and symmetric.


The conditional distributions of every Gaussian process are Gaussian and symmetric (around the conditional mean). Non-Gaussian stable processes in general do not have stable conditional distributions. Our aim in this section is to characterize the classes $\mathcal{M}_\alpha^{(\ell)}$ and $\mathcal{M}_\alpha^{(r)}$ of all SαS Markov processes X which have the property that for all s < t (s > t, correspondingly) the conditional distribution of X(t) given X(s) is α-stable and symmetric. Of course, if 1 < α < 2 and $\mathcal{L}\{X(t)\mid X(s)\}$ is symmetric about some point, this point of symmetry is necessarily the conditional mean $E\{X(t)\mid X(s)\}$.

Theorem 3.1 characterizes the processes in $\mathcal{M}_\alpha^{(\ell)}$ and $\mathcal{M}_\alpha^{(r)}$ when 1 < α < 2. Those with covariation function nonvanishing everywhere are time changes of Lévy motion, as in (3.2). The general process in $\mathcal{M}_\alpha^{(\ell)}$ ($\mathcal{M}_\alpha^{(r)}$) is then made up of independent segments of Lévy motion time changes on disjoint intervals. There are two extreme (and uninteresting) cases: the (very smooth) constant process (X(t) = Z a.s. for each t), and the (very rough) process consisting of independent random variables. In the stationary case the latter process would have independent and identically distributed SαS r.v.'s with scale parameter a ≥ 0, and we denote it by $I_a = \{I_a(t),\ -\infty < t < \infty\}$. The only stationary processes in $\mathcal{M}_\alpha^{(\ell)}$ are the SαS Ornstein-Uhlenbeck processes (2.14) and those with iid values: $I_a$. The only stationary processes in $\mathcal{M}_\alpha^{(r)}$ are the inverted SαS Ornstein-Uhlenbeck processes defined by

(3.3)  $X(t) = a\,e^{\lambda t}\,L(e^{-\alpha\lambda t}),$

and the processes $I_a$. The SαS Ornstein-Uhlenbeck processes (2.14) and (3.3) coincide trivially in the Gaussian case α = 2, but not in the case α < 2 (more on this point is said in Theorem 3.2).
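Numerical aside: a decreasing clock can be handled with the same helper by sorting the clock times first; a sketch of (3.3), with the parameters of the previous sketch:

    clock = np.exp(-alpha * lam * t)                    # decreasing clock
    L_at_clock = levy_motion(alpha, clock[::-1])[::-1]  # L at sorted times, re-aligned with t
    X_inv = a * np.exp(lam * t) * L_at_clock            # inverted SaS Ornstein-Uhlenbeck (3.3)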

In the statement of Theorem 3.1, equality in law, $\stackrel{\mathcal{L}}{=}$, means equality of all finite dimensional distributions.


Theorem 3.1. Let X belong to $\mathcal{M}_\alpha^{(\ell)}$ (correspondingly, $\mathcal{M}_\alpha^{(r)}$) for some 1 < α < 2.

(i) If its covariation function satisfies R(t,s) ≠ 0 for all s < t in T, then

$\{X(t),\ t \in T\} \stackrel{\mathcal{L}}{=} \{H(t)L(\tau(t)),\ t \in T\}$

for some positive, nondecreasing (correspondingly, nonincreasing) function τ on T, and some positive function H on T.

(ii) If X is stationary then either

$\{X(t),\ -\infty < t < \infty\} \stackrel{\mathcal{L}}{=} \{a\,e^{-\lambda t}L(e^{\alpha\lambda t}),\ -\infty < t < \infty\}$

for some a > 0 and 0 < λ < ∞ (correspondingly, 0 < −λ < ∞), or else, for some a ≥ 0,

$\{X(t),\ -\infty < t < \infty\} \stackrel{\mathcal{L}}{=} \{I_a(t),\ -\infty < t < \infty\}.$

In order to prove Theorem 3.1 we need the following properties of bivariate SαS distributions, which are of independent interest. Let us recall that the r.v.'s X₁ and X₂ are jointly SαS if their joint characteristic function is of the form

$E\exp\{i(r_1X_1 + r_2X_2)\} = \exp\Big\{-\int_{S_2} |r_1x_1 + r_2x_2|^\alpha\, dF(x_1,x_2)\Big\}$

for all real r₁, r₂, where F is a uniquely determined symmetric finite measure on the unit circle S₂ in R². When 1 < α < 2, the covariation of X₁ with X₂ is given by

$\mathrm{Cov}[X_1, X_2] = \int_{S_2} x_1\,x_2^{\langle\alpha-1\rangle}\, dF(x_1,x_2)$

[5] (which is consistent with (2.2)). We denote by $\|X_i\|_\alpha$ their scale parameters, $\|X_i\|_\alpha^\alpha = \int_{S_2} |x_i|^\alpha\, dF(x_1,x_2) = \mathrm{Cov}[X_i, X_i]$, i = 1,2, and we have by Kanter (1972),

$E(X_2 \mid X_1) = \frac{\mathrm{Cov}[X_2, X_1]}{\mathrm{Cov}[X_1, X_1]}\, X_1 = p_{21}X_1.$

Proposition 3.1. Let X₁ and X₂ be jointly SαS and 1 < α < 2. Then the following are equivalent.

(i) $\mathcal{L}(X_2 \mid X_1)$ is α-stable and symmetric.

(ii) $X_2 - p_{21}X_1$ is independent of X₁.

(iii) F is concentrated on $\pm(0,1)$, $\pm\big((1+p_{21}^2)^{-1/2},\ p_{21}(1+p_{21}^2)^{-1/2}\big)$.

Under any of these conditions we have

(3.4)  $\|X_2 - p_{21}X_1\|_\alpha^\alpha = \|X_2\|_\alpha^\alpha - |p_{21}|^\alpha\|X_1\|_\alpha^\alpha = \|X_2\|_\alpha^\alpha - \frac{|\mathrm{Cov}[X_2,X_1]|^\alpha}{\|X_1\|_\alpha^{\alpha(\alpha-1)}}\,;$

X₁, X₂ are independent iff Cov[X₂,X₁] = 0 iff p₂₁ = 0.

Proof. Assume (i). Then

$E\{\exp(ir_2X_2) \mid X_1\} = \exp\{-|r_2|^\alpha M(X_1) + ir_2N(X_1)\},$

for some real measurable functions M and N with M ≥ 0. It follows that $N(X_1) = E(X_2 \mid X_1) = p_{21}X_1$. Let

$Z = p_{21}X_1 + M(X_1)^{1/\alpha}\, Z_0,$

where Z₀ is independent of X₁ and $E\exp(irZ_0) = \exp(-|r|^\alpha)$, r ∈ R¹. Then clearly $\mathcal{L}(Z \mid X_1) = \mathcal{L}(X_2 \mid X_1)$ and thus $\mathcal{L}(X_1,Z) = \mathcal{L}(X_1,X_2)$. It follows that $Z - p_{21}X_1$ is SαS, and thus for some c ≥ 0 and every real r,

$\exp(-|r|^\alpha c) = E\exp\{ir(Z - p_{21}X_1)\} = E\exp\{irM(X_1)^{1/\alpha}Z_0\} = E\exp\{-|r|^\alpha M(X_1)\}.$

By the uniqueness of the Laplace transform we conclude that M(X₁) = c a.s. Then $Z - p_{21}X_1 = c^{1/\alpha}Z_0$ is independent of X₁, and this implies that $X_2 - p_{21}X_1$ is also independent of X₁, proving (ii).

Conversely, assuming (ii) and writing $X_2 = (X_2 - p_{21}X_1) + p_{21}X_1$ we obtain, since $X_2 - p_{21}X_1$ is SαS,

(3.5)  $E\{\exp(ir_2X_2) \mid X_1\} = \exp\{-|r_2|^\alpha \|X_2 - p_{21}X_1\|_\alpha^\alpha + ir_2p_{21}X_1\},$

so that (i) is satisfied, and, in fact, the constant M(X₁) = c is equal to $\|X_2 - p_{21}X_1\|_\alpha^\alpha$. We also obtain

$\|X_2\|_\alpha^\alpha = \|X_2 - p_{21}X_1\|_\alpha^\alpha + |p_{21}|^\alpha\|X_1\|_\alpha^\alpha,$

from which (3.4) follows.

Therefore (i) or (ii) imply

$E\exp\{i(r_1X_1 + r_2X_2)\} = E\exp\{i(r_1 + r_2p_{21})X_1 - |r_2|^\alpha\|X_2 - p_{21}X_1\|_\alpha^\alpha\} = \exp\{-|r_1 + r_2p_{21}|^\alpha\|X_1\|_\alpha^\alpha - |r_2|^\alpha\|X_2 - p_{21}X_1\|_\alpha^\alpha\},$

from which (iii) is evident. And conversely, assuming (iii) we have

$E\exp\{i(r_1X_1 + r_2X_2)\} = \exp\{-|r_2|^\alpha d_1 - |r_1 + r_2p_{21}|^\alpha d_2\}$

for some d₁, d₂ ≥ 0, and therefore

$E\exp\{i[a(X_2 - p_{21}X_1) + bX_1]\} = E\exp\{i[(b - ap_{21})X_1 + aX_2]\} = \exp\{-|a|^\alpha d_1 - |b|^\alpha d_2\},$

from which (ii) follows. □
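Numerical aside: Proposition 3.1(ii) can be seen by simulation in the simplest nontrivial case X₁ = Z₁, X₂ = pZ₁ + Z₂ with Z₁, Z₂ independent standard SαS, where Cov[X₂,X₁] = p and ‖X₁‖_α^α = 1, hence p₂₁ = p and X₂ − p₂₁X₁ = Z₂ (a sketch of ours; bounded transforms are used because stable laws have infinite variance):

    alpha, p, n = 1.5, 0.7, 200_000
    z1, z2 = sas_rv(alpha, n), sas_rv(alpha, n)
    x1, x2 = z1, p * z1 + z2
    resid = x2 - p * x1                 # = z2, independent of x1 by (ii)
    print(np.corrcoef(np.tanh(x1), np.tanh(resid))[0, 1])  # approximately 0
    print(np.corrcoef(np.tanh(x1), np.tanh(x2))[0, 1])     # clearly nonzero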

Corollary 3.1. Let X₁ and X₂ be jointly SαS and 1 < α < 2. Then the following are equivalent.

(i) $\mathcal{L}(X_2 \mid X_1)$ and $\mathcal{L}(X_1 \mid X_2)$ are α-stable and symmetric.

(ii) X₁ and X₂ are either independent or linearly dependent.

(iii) F is concentrated on $\{\pm(0,1), \pm(1,0)\}$ or on $\pm(c, (1-c^2)^{1/2})$ for some 0 < |c| ≤ 1.

Proof. In view of Proposition 3.1, (i) implies that F is concentrated on the set $\{\pm(0,1),\ \pm((1+p_{21}^2)^{-1/2},\, p_{21}(1+p_{21}^2)^{-1/2})\}$ and also on the set $\{\pm(1,0),\ \pm(p_{12}(1+p_{12}^2)^{-1/2},\, (1+p_{12}^2)^{-1/2})\}$, from which (iii) follows. The converse is clear, as is the equivalence of (ii) and (iii). □

Proof of Theorem 3.1. Because of symmetry we will consider $\mathcal{M}_\alpha^{(\ell)}$ only.

(i) Since for all s < t in T, $\mathcal{L}\{X(t) \mid X(s)\}$ is α-stable and symmetric, it follows from Proposition 3.1 (cf. (3.5)) that

(3.6)  $E\{\exp[irX(t)] \mid X(s)\} = \exp\{-|r|^\alpha\,\|X(t) - p_{ts}X(s)\|_\alpha^\alpha + ir\,p_{ts}X(s)\},$

where $p_{ts} = R(t,s)/R(s,s)$ and $\|X(t) - p_{ts}X(s)\|_\alpha^\alpha = R(t,t) - |R(t,s)|^\alpha/R(s,s)^{\alpha-1}$ (cf. (3.4)). Since X is Markov and R(t,s) ≠ 0 for s < t, by Theorem 2.1, R(t,s) has the representation (2.5), which implies

$p_{ts} = \frac{H(t)}{H(s)},$

$\|X(t) - p_{ts}X(s)\|_\alpha^\alpha = H(t)K(t)^{\langle\alpha-1\rangle} - |H(t)|^\alpha\Big\{\frac{K(s)}{H(s)}\Big\}^{\alpha-1} = |H(t)|^\alpha\{\tau(t) - \tau(s)\},$

where $\tau(t) = \{K(t)/H(t)\}^{\alpha-1}$. Thus

$E\{\exp[irX(t)] \mid X(s)\} = \exp\Big\{-|r|^\alpha|H(t)|^\alpha[\tau(t)-\tau(s)] + ir\frac{H(t)}{H(s)}X(s)\Big\},$

which is the same as the conditional characteristic function for Y(t) = H(t)L(τ(t)), t ∈ T, given in (2.13). It follows that $\mathcal{L}\{X(t) \mid X(s)\} = \mathcal{L}\{Y(t) \mid Y(s)\}$ for all s < t. It is also clear that $\|X(t)\|_\alpha = \|Y(t)\|_\alpha$, so that $\mathcal{L}\{X(t)\} = \mathcal{L}\{Y(t)\}$ for all t, and since both X and Y are Markov, $\mathcal{L}(X) = \mathcal{L}(Y)$. The function H may be taken positive without loss of generality.

(ii) If X is stationary, by Corollary 2.1, $R(t,s) = R(0)e^{-\lambda(t-s)}$ for all s ≤ t, for some 0 ≤ λ ≤ ∞. When λ = +∞, R(t,s) = 0 for all s < t, and by Corollary 3.1 it follows that $\mathcal{L}(X) = \mathcal{L}(I_a)$ for some a ≥ 0. When 0 ≤ λ < ∞, part (i) of the theorem is applicable, necessarily with $H(t) = ae^{-\lambda t}$ and $\tau(t) = be^{\alpha\lambda t}$ for some a, b > 0. This completes the proof. □

It is worth mentioning that in the Gaussian case (α = 2), clearly $\mathcal{M}_2^{(\ell)} = \mathcal{M}_2^{(r)}$. In contrast, when 1 < α < 2 the common processes of $\mathcal{M}_\alpha^{(\ell)}$ and $\mathcal{M}_\alpha^{(r)}$ are few and trivial.

Theorem 3.2. (i) There is a one-to-one correspondence between $\mathcal{M}_\alpha^{(\ell)}$ and $\mathcal{M}_\alpha^{(r)}$ given by X(t) = Y(γ(t)), t ∈ T, $X \in \mathcal{M}_\alpha^{(\ell)}$, for any fixed function γ: T → T, one-to-one, onto, and such that γ(s) > γ(t) if s < t (e.g. γ(t) = −t when T = R¹).

(ii) A process X belongs to $\mathcal{M}_\alpha^{(\ell)} \cap \mathcal{M}_\alpha^{(r)}$, 1 < α < 2, if and only if it is of the following form: for some finite or denumerable set of intervals $\{I_n\}_{n=1}^N$ in T, X(t) = a(t)Xₙ a.s. for each t ∈ Iₙ, n = 1,...,N, where a is a real function, nonvanishing in the interior of Iₙ, and the r.v.'s $\{X_n\}_{n=1}^N$, $\{X(t),\ t \in T \setminus \cup_{n=1}^N I_n\}$ are independent.

(iii) The only stationary processes in $\mathcal{M}_\alpha^{(\ell)} \cap \mathcal{M}_\alpha^{(r)}$, 1 < α < 2, are the processes with iid r.v.'s and the constant processes (X(t) = aZ, −∞ < t < ∞, a ≥ 0, Z a standard SαS r.v.).

Proof. (i) is clear.

(ii) Suppose $X \in \mathcal{M}_\alpha^{(\ell)} \cap \mathcal{M}_\alpha^{(r)}$. Then by Corollary 3.1, for each s, t in T, the r.v.'s X(s) and X(t) are either independent or linearly dependent. But since X is Markov, if X(t) is a multiple of X(s), it will necessarily be a multiple of each X(u) for u in between s and t. Hence for each t ∈ T there is an interval $I_t$ such that for all s ∈ $I_t$, X(s) is a multiple of X(t). Denoting by I₁,...,I_N those intervals among the $I_t$, t ∈ T, with positive Lebesgue measure, we obtain the result.

(iii) In view of stationarity, if two r.v.'s of the process are linearly dependent, they will all be a.s. equal; and if two r.v.'s of the process are independent, they will all be independent. □


4. SUB-GAUSSIAN PROCESSES

Another important class of SαS processes is the class of sub-Gaussian processes, which is defined by

(4.1)  $X(t) = A^{1/2}\,G(t), \quad t \in T,$

where G = {G(t), t ∈ T} is any Gaussian process and A is a totally skewed to the right (α/2)-stable random variable (see Feller (1968)) which is positive with probability 1 and satisfies

(4.2)  $E\exp\{-uA\} = \exp\{-u^{\alpha/2}\}, \quad u \ge 0.$

A sub-Gaussian process (4.1) is easily seen to be a SαS process, and when T = R¹, X(t) is stationary if and only if G(t) is.
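Numerical aside: A can be generated from Kanter's representation of positive (α/2)-stable variables; the sketch below (our construction, with an exact discrete recursion for a Gaussian Ornstein-Uhlenbeck G) produces paths of the sub-Ornstein-Uhlenbeck process discussed next. Note the single draw of A multiplying the whole Gaussian path.

    def positive_stable(beta, size):
        # Totally right-skewed positive beta-stable A, E exp(-u A) = exp(-u^beta),
        # 0 < beta < 1, via Kanter's representation.
        theta = rng.uniform(0.0, np.pi, size)
        w = rng.exponential(1.0, size)
        return ((np.sin((1 - beta) * theta) / w) ** ((1 - beta) / beta)
                * np.sin(beta * theta) / np.sin(theta) ** (1 / beta))

    def sub_gaussian_ou(alpha, lam, t):
        # X = sqrt(A) G as in (4.1), with G a stationary Gaussian
        # Ornstein-Uhlenbeck process, E G(t)G(s) = exp(-lam |t - s|).
        A = positive_stable(alpha / 2.0, 1)[0]
        g = np.empty(t.size)
        g[0] = rng.standard_normal()
        for i in range(1, t.size):
            r = np.exp(-lam * (t[i] - t[i - 1]))   # one-step correlation
            g[i] = r * g[i - 1] + np.sqrt(1.0 - r * r) * rng.standard_normal()
        return np.sqrt(A) * g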

Our task in this section is to investigate whether we can find a Markov, or at least weakly Markov, process among sub-Gaussian processes. The obvious candidates are the sub-Gaussian processes given by (4.1) with G a Gaussian Markov process. For example, G(t) could be a Brownian motion; in this case we call the corresponding process X(t) a sub-Brownian motion. If T = R¹ and G(t) is an Ornstein-Uhlenbeck process, we call X(t) a sub-Ornstein-Uhlenbeck process. The first result characterizes the weakly Markov sub-Gaussian processes.

Theorem 4.1. Let X be a sub-Gaussian SαS process given by (4.1). Then the following are equivalent.

(i) X is left weakly Markov.

(ii) X is right weakly Markov.

(iii) G is Markov.


Proof. We note for future reference the following fact (see [5]): if (Y₁,Y₂) is a zero-mean Gaussian vector in R² with $EY_1^2 = EY_2^2 = 1$, $EY_1Y_2 = p$, and if $X_1 = A^{1/2}Y_1$, $X_2 = A^{1/2}Y_2$, where A is distributed according to (4.2) and is independent of (Y₁,Y₂), then

(4.3)  $\mathrm{Cov}[X_1, X_2] = 2^{-\alpha/2}\,p.$

Assume (i). Then by Theorem 2.1 the covariation function of X satisfies (2.4), and by (4.3) we conclude that the covariance function of the Gaussian process G in (4.1) satisfies the same relation. Thus G must be Markov (since it is Gaussian), i.e. (iii) holds.

Conversely, assume (iii); we will show (i). For any s < t, any u₁,u₂,...,u_k ≤ s and any a₁,a₂,...,a_k, we have by (4.3),

$\mathrm{Cov}\Big[X(t) - \frac{E\{G(t)G(s)\}}{E\{G^2(s)\}}X(s),\ \sum_{i=1}^{k} a_iX(u_i)\Big] = 2^{-\alpha/2}\Big[E\Big\{\sum_{i=1}^{k} a_iG(u_i)\Big\}^2\Big]^{(\alpha-2)/2}\,\mathrm{Cov}\Big[G(t) - \frac{E\{G(t)G(s)\}}{E\{G^2(s)\}}G(s),\ \sum_{i=1}^{k} a_iG(u_i)\Big] = 0$

since G is Markov. Therefore, by Theorem 2.1, X is left weakly Markov, i.e. (i) holds. The equivalence of (ii) and (iii) is now deduced by symmetry. □

Corollary 4.1. Any sub-Gaussian weakly Markov SαS process with covariation function R(t,s) ≠ 0 for any t, s is equivalent to a nondecreasing time change of sub-Brownian motion.

Proof. Formula (4.3) implies that the covariation function of a sub-Gaussian process determines its finite dimensional distributions. Therefore, we only have to demonstrate that the covariation function of a sub-Gaussian weakly Markov SαS process can be realized by a time changed sub-Brownian motion. This is easy. By Theorem 2.1 and (4.3), the covariation function of a sub-Gaussian weakly Markov SαS process can be represented as $R(t,s) = H(t)K(s)^{\langle\alpha-1\rangle}$ for all s ≤ t, where K(t)/H(t) is positive and nondecreasing on T. Let G₁(t) = H(t)B(τ(t)), where B is the standard Brownian motion and $\tau(t) = 2[K(t)/H(t)]^{2(\alpha-1)/\alpha}$. It is straightforward to check, using (4.3), that the process $X_1(t) = A^{1/2}G_1(t)$ has R(t,s) as its covariation function. Since X₁ is a time change of sub-Brownian motion, the proof is complete. □

Among all SαS sub-Gaussian processes only very few, and highly degenerate, ones are Markov. All these processes are characterized in the following simple fashion. We say that a deterministic function a(t), t ∈ T, is born at the level a₁ and dies at the level a₂ if for some s₁ ≤ s₂, which may be boundary points of T, a(t) = a₁ for t < s₁ (or t ≤ s₁), a(t) = a₂ for t > s₂ (or t ≥ s₂), and a(t) ∉ {a₁, a₂} outside of the above intervals; s₁ is the birth time and s₂ is the death time.

Theorem 4.2. A SαS sub-Gaussian process given by (4.1) is Markov if and only if the Gaussian process G has one of the following forms: G(t) = a(t)Y₁, or G(t) = a(t){Y₁1(t ≤ t₀) + Y₂1(t > t₀)}, or G(t) = a(t){Y₁1(t < t₀) + Y₂1(t ≥ t₀)}, t ∈ T, where (Y₁,Y₂) is a jointly Gaussian vector in R² with $EY_1^2 = EY_2^2 = 1$, t₀ ∈ T, and a(t) is a real function that is born and dies at the level zero.

A technical result precedes the proof of the theorem.

Lemma 4.1. Let G₁, G₂, G₃ be i.i.d. standard normal random variables and let W be a positive random variable, not a.s. equal to a constant, and independent of (G₁, G₂, G₃). Then there is a Borel set B such that P(WG₁ ∈ B) > 0 and for any x₁ ∈ B the random variables WG₂ and WG₃ are not independent given WG₁ = x₁.

2. "..''.

... "Z- %

Page 31: CHAPEL HILL CENTER FOR STOCHASTIC PROCESSES ADLER … · 2014-09-28 · FWOF UPiC FILE Gccc I AFOSR-TR. 88-0359-k 00 CENTER FOR STOCHASTIC PROCESSES 04, CM-V Department of Statistics

24

Proof. Suppose on the contrary that for almost every value x₁ of WG₁ the r.v.'s WG₂ and WG₃ are independent given WG₁ = x₁. Then for almost any x₁, x₂, we have

(4.4)  $f_1(y \mid x_1) = f_2(y \mid x_1, x_2)$

for almost any y, where $f_1(\cdot \mid x_1)$ is the conditional density of WG₃ given WG₁ = x₁, and $f_2(\cdot \mid x_1, x_2)$ is the conditional density of WG₃ given WG₁ = x₁, WG₂ = x₂. We have

(4.5)  $f_1(y \mid x_1) = (2\pi)^{-1/2}\,\frac{\int_0^\infty w^{-2}\exp[-(y^2+x_1^2)/(2w^2)]\,dF(w)}{\int_0^\infty w^{-1}\exp[-x_1^2/(2w^2)]\,dF(w)}$

(4.6)  $f_2(y \mid x_1, x_2) = (2\pi)^{-1/2}\,\frac{\int_0^\infty w^{-3}\exp[-(y^2+x_1^2+x_2^2)/(2w^2)]\,dF(w)}{\int_0^\infty w^{-2}\exp[-(x_1^2+x_2^2)/(2w^2)]\,dF(w)}$

where F is the distribution function of W. It follows from (4.5) and (4.6) that $f_1(\cdot \mid \cdot)$ is continuous on R², and $f_2(\cdot \mid \cdot, \cdot)$ is continuous on R³. We conclude that (4.4) is equivalent to

(4.7)  $g_3(y^2+x_1^2+x_2^2)\,g_1(x_1^2) = g_2(y^2+x_1^2)\,g_2(x_1^2+x_2^2),$

where

$g_n(r) = \int_0^\infty w^{-n}\exp[-r/(2w^2)]\,dF(w), \quad r \ge 0.$

By Hölder's inequality we conclude that the left hand side of (4.7) is strictly larger than its right hand side for all triples (y, x₁, x₂) of the kind (0, x₁, 0). By the continuity we conclude that there is an ε > 0 such that (4.7), and therefore (4.4), do not hold for the triples (y, x₁, x₂) ∈ (−ε, ε)³. This contradicts the assumption that for almost any x₁ and x₂, (4.4) is true for almost any y. □

Proof of Theorem 4.2. Suppose that the sub-Gaussian process X given by (4.1) is Markov. If all r.v.'s of the Gaussian process G are linearly dependent, then G(t) = a(t)Y₁ for some standard normal r.v. Y₁. Now assume G has two linearly independent r.v.'s, say G(t₁) and G(t₂), t₁ < t₂. Furthermore, assume E{G(t₁)G(t₂)} ≠ 0, as the argument is even simpler in the case of independence. Then we can write G(t₂) = a₂₁G₁, G(t₁) = a₁₁G₁ + a₁₂G₂, where G₁, G₂ are i.i.d. standard normal r.v.'s and the coefficients a₁₁, a₁₂, a₂₁ are different from zero. Take any t₃ > t₂. Then we can write G(t₃) = a₃₁G₁ + a₃₂G₂ + a₃₃G₃, where G₃ is a standard normal r.v. independent of (G₁,G₂). The process G is Markov by Theorem 4.1; therefore its covariance function satisfies the relation (2.4). Rewriting this relation for the triple (t₁,t₂,t₃) in terms of our particular representation of G(tᵢ), i = 1,2,3, we obtain

$(a_{11}a_{31} + a_{12}a_{32})\,a_{21} = a_{11}a_{21}a_{31},$

and since a₁₁, a₁₂ and a₂₁ are different from zero, we conclude that a₃₂ = 0.

Recall now that by the assumed Markovianness of X(t), for almost any x₂, X(t₁) and X(t₃) are independent given X(t₂) = x₂. Since a₃₂ = 0, it follows that $a_{12}A^{1/2}G_2$ and $a_{33}A^{1/2}G_3$ are independent given $a_{21}A^{1/2}G_1 = x_2$. Suppose first that a₃₃ ≠ 0. Then it follows that for almost any x₁, $A^{1/2}G_2$ and $A^{1/2}G_3$ are independent given $A^{1/2}G_1 = x_1$. But this is impossible by Lemma 4.1. Therefore a₃₃ = 0. It follows that G(t₃) = a₃₁G₁, and thus for any t ≥ t₂, G(t) is linearly dependent on G(t₂). The same argument shows that for any t ≤ t₁, G(t) is linearly dependent on G(t₁), and that there is a t₀ ∈ [t₁,t₂] such that for any t < t₀, G(t) is linearly dependent on G(t₁), while for any t > t₀, G(t) is linearly dependent on G(t₂). G(t₀) itself has to be linearly dependent either on G(t₁) or on G(t₂). This proves the dependence structure of G(t) described in the statement of the theorem.


Suppose now that there are points s₁ < s₂ < s₃ such that a(s₁) ≠ 0, a(s₂) = 0 and a(s₃) ≠ 0. Applying the assumed Markovianness of X(t) to the triple (s₁,s₂,s₃) shows that one of the following pairs of r.v.'s is independent: $(A^{1/2}Y_1, A^{1/2}Y_1)$ or $(A^{1/2}Y_2, A^{1/2}Y_2)$ or $(A^{1/2}Y_1, A^{1/2}Y_2)$. The r.v.'s of the first two pairs are obviously dependent, and those of the third pair are also known to be dependent (see Lemma 2.1 in [6]). Therefore a(t) must be born and die at the level zero. Since it is obvious that for any Gaussian process of the above form the corresponding sub-Gaussian process is Markov, the proof of the theorem is complete. □

In the particular case of a stationary sub-Gaussian SαS process we can draw the following simple conclusion from the above results.

Corollary 4.2. The only weakly Markov stationary sub-Gaussian SαS processes are the sub-Ornstein-Uhlenbeck processes and the constant processes. The only Markov stationary sub-Gaussian SαS processes are the constant processes.

Corollary 4.2 shows that the sub-Ornstein-Uhlenbeck process is the only weakly Markov stationary sub-Gaussian process. However, the class of sub-Gaussian processes is not closed under linear combinations of its independent members. Therefore a natural question arises: could we obtain a new weakly Markov stationary SαS process as a linear combination of independent stationary sub-Gaussian processes?

Let G₁, G₂, ..., Gₙ be independent stationary Gaussian processes such that for any t,

(4.8)  $E\{G_i(t)G_i(0)\} = p_i(t), \quad p_i(0) = 1, \quad i = 1,2,\ldots,n.$

We assume that all the correlation functions p₁, p₂, ..., pₙ are different. Let


A₁, A₂, ..., Aₙ be i.i.d. random variables distributed according to (4.2) and independent of the Gaussian processes G₁, G₂, ..., Gₙ, and let b₁, b₂, ..., bₙ be positive real numbers. We will prove that if n > 1, the stationary SαS process

(4.9)  $X(t) = \sum_{i=1}^{n} b_i A_i^{1/2} G_i(t)$

cannot be weakly Markov. We start with the following lemma.

Lemma 4.2. Let φᵢ: S → R¹, i = 1,2,...,n, be arbitrary distinct functions on a set S. Then there is a finite subset {s₁,s₂,...,s_k} ⊂ S (k ≤ n) and real numbers θ₁,θ₂,...,θ_k such that

$\sum_{j=1}^{k}\theta_j\varphi_1(s_j) \ne \sum_{j=1}^{k}\theta_j\varphi_i(s_j), \quad \text{any } i = 2,\ldots,n.$

Proof. We prove the lemma by induction in n. For n = 2 the claim of the lemma is trivial, and one can choose k = 1 and θ₁ = 1. Suppose that the claim of the lemma is correct for n = n₀. We shall prove this claim for n₀ + 1 functions φ₁, φ₂, ..., φ_{n₀+1}. By the induction hypothesis there are points s₁,s₂,...,s_k ∈ S and real numbers θ₁,θ₂,...,θ_k such that

$\Delta(i) = \sum_{j=1}^{k}\theta_j\varphi_1(s_j) - \sum_{j=1}^{k}\theta_j\varphi_i(s_j) \ne 0, \quad \text{any } i = 2,\ldots,n_0.$

If Δ(n₀+1) ≠ 0, then there is nothing to prove, so we assume that Δ(n₀+1) = 0. Then there is an $s_{k+1} \in S$ such that $\varphi_1(s_{k+1}) \ne \varphi_{n_0+1}(s_{k+1})$. Put

$\gamma_i = \varphi_1(s_{k+1}) - \varphi_i(s_{k+1}), \quad i = 2,\ldots,n_0+1.$

Clearly, we can choose a real number $\theta_{k+1}$ satisfying the following conditions:

(a) $\theta_{k+1} \ne 0$,

(b) $\theta_{k+1} \ne -\Delta(i)/\gamma_i$ for all those i = 2,...,n₀ for which γᵢ ≠ 0.

Then

$\sum_{j=1}^{k+1}\theta_j\varphi_1(s_j) \ne \sum_{j=1}^{k+1}\theta_j\varphi_i(s_j) \quad \text{for any } i = 2,\ldots,n_0,\,n_0+1,$

so that the proof of the lemma is complete. □

Proof of the claim. If the process given by (4.9) were weakly Markov, we would

have for any t, any r> 0, any si >0 , i=l,. . .,k, any real numbers c, 01, Ok,

that for some 0 < A <xo

(4.10) Cov[X(t+r) - e-ArX(t), X(t) + c _j -sj) =0.

Since the sub-Gaussian processes $A_i^{1/2}G_i(t)$, $i=1,\ldots,n$, are independent, we conclude by the properties of covariation (see Weron (1984)) and by (4.3) and (4.10) that

$0 = \sum_{i=1}^{n} b_i^{\alpha}\, \operatorname{Cov}\big[A_i^{1/2}\{G_i(t+\tau) - e^{-\lambda\tau}G_i(t)\},\; A_i^{1/2}\{G_i(t) + c\sum_{j=1}^{k}\theta_j G_i(t-s_j)\}\big]$

(4.11)  $= 2^{-\alpha/2} \sum_{i=1}^{n} b_i^{\alpha} \big[\operatorname{Var}\{G_i(t) + c\sum_{j=1}^{k}\theta_j G_i(t-s_j)\}\big]^{(\alpha-2)/2}\, \operatorname{Cov}\big[G_i(t+\tau) - e^{-\lambda\tau}G_i(t),\; G_i(t) + c\sum_{j=1}^{k}\theta_j G_i(t-s_j)\big]$

$= 2^{-\alpha/2} \sum_{i=1}^{n} b_i^{\alpha} \big[1 + 2c\sum_{j=1}^{k}\theta_j\rho_i(s_j) + c^2\sum_{j=1}^{k}\sum_{\ell=1}^{k}\theta_j\theta_\ell\,\rho_i(s_j-s_\ell)\big]^{(\alpha-2)/2} \big[\{\rho_i(\tau) - e^{-\lambda\tau}\} + c\sum_{j=1}^{k}\theta_j\{\rho_i(s_j+\tau) - e^{-\lambda\tau}\rho_i(s_j)\}\big].$

That is, for any $t$, any $\tau > 0$, any $s_i \ge 0$, $i=1,\ldots,k$, and any real $c, \theta_1, \ldots, \theta_k$, we have


(4.12)  $\sum_{i=1}^{n} b_i^{\alpha}\, c\sum_{j=1}^{k}\theta_j\{\rho_i(s_j+\tau) - e^{-\lambda\tau}\rho_i(s_j)\} \big[1 + 2c\sum_{j=1}^{k}\theta_j\rho_i(s_j) + c^2\sum_{j=1}^{k}\sum_{\ell=1}^{k}\theta_j\theta_\ell\,\rho_i(s_j-s_\ell)\big]^{(\alpha-2)/2}$

$= \sum_{i=1}^{n} b_i^{\alpha}\,\{e^{-\lambda\tau} - \rho_i(\tau)\} \big[1 + 2c\sum_{j=1}^{k}\theta_j\rho_i(s_j) + c^2\sum_{j=1}^{k}\sum_{\ell=1}^{k}\theta_j\theta_\ell\,\rho_i(s_j-s_\ell)\big]^{(\alpha-2)/2}.$

Since the correlation functions $\rho_1,\ldots,\rho_n$ are all different, we can find a $\tau$ such that not all the expressions $e^{-\lambda\tau} - \rho_i(\tau)$, $i=1,\ldots,n$, are equal to zero. We can assume, without loss of generality, that $e^{-\lambda\tau} - \rho_i(\tau) \neq 0$ for $i=1,\ldots,n_0$ and $e^{-\lambda\tau} - \rho_i(\tau) = 0$ for $i=n_0+1,\ldots,n$, for some $1 \le n_0 \le n$. According to Lemma 4.2, there are points $s_i \ge 0$ and real numbers $\theta_i$, $i=1,\ldots,k$, such that

$\sum_{j=1}^{k}\theta_j\rho_1(s_j) \neq \sum_{j=1}^{k}\theta_j\rho_i(s_j), \qquad i=2,\ldots,n.$

Then for these fixed values of $\tau, s_1,\ldots,s_k, \theta_1,\ldots,\theta_k$, we can rewrite (4.12) as an identity in $c$,

(4.13)  $\sum_{i=1}^{n} c\, a_i^{(1)} \big(1 + 2c\mu_i + c^2\sigma_i^2\big)^{(\alpha-2)/2} = \sum_{i=1}^{n_0} a_i^{(2)} \big(1 + 2c\mu_i + c^2\sigma_i^2\big)^{(\alpha-2)/2}$

for certain real numbers $a_i^{(1)}, \mu_i, \sigma_i$, $i=1,\ldots,n$, and $a_i^{(2)}$, $i=1,\ldots,n_0$, with $a_1^{(2)} = b_1^{\alpha}\{e^{-\lambda\tau} - \rho_1(\tau)\}$. Then the fact that for any $0 < d < 1/2$ and any reals $\mu_i, \sigma_i$, the following family of functions of $c$,

$\big(1 + 2c\mu_i + c^2\sigma_i^2\big)^{-d}, \qquad c\,\big(1 + 2c\mu_i + c^2\sigma_i^2\big)^{-d}, \qquad i=1,\ldots,n,$

is linearly independent as long as all the pairs $(\mu_i, \sigma_i^2)$ are different, implies that $a_1^{(2)} = 0$. But we know that $a_1^{(2)} \neq 0$. This contradiction proves that no process of the kind (4.9) can be weakly Markov. □


5. MOVING AVERAGES

A SαS moving average is a process of the form

$X(t) = \int_{-\infty}^{\infty} f(t-s)\, dL(s), \qquad -\infty < t < \infty,$

where $L$ is an independently scattered SαS measure on $(\mathbb{R}^1, \mathcal{B}^1, \mathrm{Leb})$, i.e. $\{L(s),\, -\infty < s < \infty\}$ is a SαS Lévy motion, and $f \in L^{\alpha}(\mathbb{R}^1, \mathcal{B}^1, \mathrm{Leb})$. $X(t)$ is clearly a stationary process. When $f$ vanishes on the negative line, $X$ is a nonanticipating moving average and

(5.1)  $X(t) = \int_{-\infty}^{t} f(t-s)\, dL(s).$
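A direct way to visualize such processes is to discretize the stochastic integral. A minimal sketch (the step size, truncation, kernel and Chambers-Mallows-Stuck generator below are illustrative assumptions, not part of the paper; the kernel anticipates (5.2) below):

```python
import numpy as np

def sas_increments(alpha, scale, size, rng):
    # Chambers-Mallows-Stuck generator for standard SaS variables,
    # characteristic function exp(-|scale*theta|**alpha), 0 < alpha < 2
    u = rng.uniform(-np.pi / 2, np.pi / 2, size)
    w = rng.exponential(1.0, size)
    x = (np.sin(alpha * u) / np.cos(u) ** (1.0 / alpha)
         * (np.cos((1.0 - alpha) * u) / w) ** ((1.0 - alpha) / alpha))
    return scale * x

def moving_average(f, alpha, t_grid, ds, s_min, rng):
    # Riemann-sum approximation of X(t) = int f(t-s) dL(s): the increments
    # L(s+ds) - L(s) are i.i.d. SaS with scale ds**(1/alpha)
    s = np.arange(s_min, t_grid[-1] + ds, ds)
    dL = sas_increments(alpha, ds ** (1.0 / alpha), len(s), rng)
    return np.array([np.sum(f(t - s) * dL) for t in t_grid])

# nonanticipating kernel of (5.2) below: f(u) = a*exp(-lam*u) for u >= 0
rng = np.random.default_rng(1)
a, lam, alpha = 1.0, 1.0, 1.5
f = lambda u: np.where(u >= 0.0, a * np.exp(-lam * np.clip(u, 0.0, None)), 0.0)
t = np.linspace(0.0, 10.0, 201)
x = moving_average(f, alpha, t, ds=0.01, s_min=-20.0, rng=rng)
```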

A nonanticipating moving average is called invertible if $\overline{\mathrm{sp}}\{X(t),\, t \le \tau\} = \overline{\mathrm{sp}}\{\Delta L(t),\, t \le \tau\} = \overline{\mathrm{sp}}\{L(t)-L(s),\, s \le t \le \tau\}$ for all $\tau$, i.e. the increments of $L$ represent the innovations of $X$.

We now determine all nonanticipating, invertible SαS moving averages which are left weakly Markov; these are in fact Markov, and we shall show that they are precisely the SαS Ornstein-Uhlenbeck processes, thus obtaining an integral representation for these processes.

Theorem 5.1. $X$ is a left weakly Markov, nonanticipating, invertible SαS moving average with $1 < \alpha \le 2$, if and only if it is of the form

(5.2)  $X(t) = a\int_{-\infty}^{t} e^{-\lambda(t-s)}\, dL(s), \qquad -\infty < t < \infty,$

for some $0 < a, \lambda < \infty$, if and only if it is a SαS Ornstein-Uhlenbeck process.

Proof. Let $X$ be a nonanticipating, invertible SαS moving average as given by (5.1). Then $X$ is left weakly Markov iff for some $0 < \lambda < \infty$,

$\operatorname{Cov}[X(\tau) - e^{-\lambda\tau}X(0),\, Y] = 0$


for all $\tau > 0$ and all $Y \in \overline{\mathrm{sp}}\{X(t),\, t \le 0\} = \overline{\mathrm{sp}}\{L(t),\, t \le 0\}$, i.e. $Y = \int_{-\infty}^{0} g(s)\, dL(s)$ where $\int_{-\infty}^{0} |g(s)|^{\alpha}\, ds < \infty$; i.e. iff for all $\tau > 0$ and all $g \in L^{\alpha}((-\infty,0), \mathcal{B}(-\infty,0), \mathrm{Leb})$,

$0 = \operatorname{Cov}\Big[\int_{-\infty}^{\tau} f(\tau-s)\, dL(s) - e^{-\lambda\tau}\int_{-\infty}^{0} f(-s)\, dL(s),\; \int_{-\infty}^{0} g(s)\, dL(s)\Big]$

$= \int_{-\infty}^{0} f(\tau-s)\, g(s)^{\langle\alpha-1\rangle}\, ds - e^{-\lambda\tau}\int_{-\infty}^{0} f(-s)\, g(s)^{\langle\alpha-1\rangle}\, ds.$

Now putting $g(s) = 1_{(-x,0)}(s)$, $x > 0$, we obtain that if $X$ is left weakly Markov then for all $\tau > 0$, $x > 0$,

$\int_{-x}^{0} f(\tau-s)\, ds = e^{-\lambda\tau}\int_{-x}^{0} f(-s)\, ds,$

i.e.

$\int_{\tau}^{\tau+x} f(u)\, du = e^{-\lambda\tau}\int_{0}^{x} f(u)\, du,$

and putting $F(x) = \int_{0}^{x} f(u)\, du$,

$F(\tau+x) = F(\tau) + e^{-\lambda\tau}F(x), \qquad \forall\, \tau, x > 0.$

The parameter $\lambda$ cannot take the values $0$ or $+\infty$. Indeed, if $\lambda = +\infty$ then $F(\tau+x) = F(\tau)$, $\forall\, \tau, x > 0$, implies $F(x) = \mathrm{const}$, $x > 0$, hence $f(x) = 0$ a.e. on $(0,\infty)$ and $X(t) = 0$ a.s. for all $t$, i.e. the process $X$ is identically zero. On the other hand, if $\lambda = 0$ then $F(\tau+x) = F(\tau) + F(x)$, $\forall\, \tau, x > 0$, implies $F(x) = cx$, $x > 0$, hence $f(x) = c$ a.e. on $(0,\infty)$, which does not belong to $L^{\alpha}(\mathrm{Leb})$ unless $c = 0$, in which case again the process $X$ is identically zero. It follows that $0 < \lambda < \infty$. Interchanging $\tau$ and $x$ we also have

$F(\tau+x) = F(x) + e^{-\lambda x}F(\tau), \qquad \forall\, \tau, x > 0,$

and thus

$\frac{F(x)}{1-e^{-\lambda x}} = \frac{F(\tau)}{1-e^{-\lambda\tau}}, \qquad \forall\, \tau, x > 0.$


Hence $F(x) = c(1-e^{-\lambda x})$, $x > 0$, and $f(x) = c\lambda e^{-\lambda x}$ a.e. on $(0,\infty)$. In view of the symmetry of the distribution of $X$, the finite constant $a = c\lambda$ may be taken positive. Thus (5.2) is shown.
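As a quick sanity check on the functional equation, the solution found above can be verified symbolically (a sketch using sympy; purely illustrative):

```python
import sympy as sp

r, x, c, lam = sp.symbols('r x c lam', positive=True)
F = lambda y: c * (1 - sp.exp(-lam * y))        # the solution found above

# F(r + x) - F(r) - exp(-lam*r)*F(x) vanishes identically
residual = F(r + x) - F(r) - sp.exp(-lam * r) * F(x)
assert sp.simplify(residual) == 0
```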

Conversely, it is clear that the process (5.2) satisfies the necessary and sufficient condition for being left weakly Markov; is invertible, in fact it is the stationary solution of the stochastic differential equation $X'(t) + \lambda X(t) = aL'(t)$, where $L'(t)$ is SαS white noise; and is Markov, as follows from ($s < t$)

(5.3)  $X(t) = e^{-\lambda(t-s)}X(s) + a\int_{s}^{t} e^{-\lambda(t-u)}\, dL(u),$

where the second term is independent of $\{X(u),\, u \le s\}$.
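Relation (5.3) also yields an exact simulation recipe on a grid of step $\Delta$: the integral term is SαS with scale $a\big((1-e^{-\alpha\lambda\Delta})/(\alpha\lambda)\big)^{1/\alpha}$ and independent across steps. A minimal sketch (reusing the CMS generator sas_increments from the sketch after (5.1); parameter choices illustrative):

```python
import numpy as np

def sas_ou_path(alpha, a, lam, dt, n, rng):
    # exact recursion from (5.3): X_{k+1} = exp(-lam*dt)*X_k + xi_k, where
    # xi_k is SaS with scale a*((1 - exp(-alpha*lam*dt))/(alpha*lam))**(1/alpha)
    scale = a * ((1.0 - np.exp(-alpha * lam * dt)) / (alpha * lam)) ** (1.0 / alpha)
    xi = sas_increments(alpha, scale, n, rng)
    x = np.empty(n + 1)
    # stationary initial value: ||X(0)||_alpha = a*(alpha*lam)**(-1/alpha)
    x[0] = sas_increments(alpha, a * (alpha * lam) ** (-1.0 / alpha), 1, rng)[0]
    for k in range(n):
        x[k + 1] = np.exp(-lam * dt) * x[k] + xi[k]
    return x

path = sas_ou_path(alpha=1.5, a=1.0, lam=1.0, dt=0.05, n=400,
                   rng=np.random.default_rng(3))
```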

We finally show that the process (5.2) is the SαS Ornstein-Uhlenbeck process

$Y(t) = a e^{-\lambda t} L_{\alpha}\big(e^{\alpha\lambda t}/(\alpha\lambda)\big).$

Indeed,

$\|X(t)\|_{\alpha}^{\alpha} = a^{\alpha}\int_{-\infty}^{t} e^{-\alpha\lambda(t-s)}\, ds = a^{\alpha}/(\alpha\lambda), \qquad \|Y(t)\|_{\alpha}^{\alpha} = a^{\alpha} e^{-\alpha\lambda t}\, e^{\alpha\lambda t}/(\alpha\lambda) = a^{\alpha}/(\alpha\lambda),$

and thus $\mathscr{L}\{X(t)\} = \mathscr{L}\{Y(t)\}$. From (5.3), using the independence of the two terms on the right hand side, we have for $s < t$,

$E\{\exp[irX(t)]\,|\,X(s)\} = \exp\Big\{ire^{-\lambda(t-s)}X(s) - |ra|^{\alpha}\int_{s}^{t} e^{-\alpha\lambda(t-u)}\, du\Big\} = \exp\Big\{ire^{-\lambda(t-s)}X(s) - |r|^{\alpha}a^{\alpha}\big(1-e^{-\alpha\lambda(t-s)}\big)/(\alpha\lambda)\Big\}.$

On the other hand, from (2.13) with $\sigma(t) = a e^{-\lambda t}(\alpha\lambda)^{-1/\alpha}$ and $\tau(t) = e^{\alpha\lambda t}$, we have

$E\{\exp[irY(t)]\,|\,Y(s)\} = \exp\Big\{ire^{-\lambda(t-s)}Y(s) - |r|^{\alpha}a^{\alpha}\big(1-e^{-\alpha\lambda(t-s)}\big)/(\alpha\lambda)\Big\}.$


Therefore $\mathscr{L}\{X(t)|X(s)\} = \mathscr{L}\{Y(t)|Y(s)\}$ for all $s < t$. Since both $X$ and $Y$ are Markov it follows that $\mathscr{L}(X) = \mathscr{L}(Y)$. It is clear that the processes $Y$ of this form exhaust the class of all SαS Ornstein-Uhlenbeck processes (cf. (2.14)). □

When $X$ is a SαS moving average and $f$ vanishes on the positive line, we say that $X$ is fully anticipatory and

$X(t) = \int_{t}^{\infty} f(t-s)\, dL(s).$

A fully anticipatory moving average is called invertible if $\overline{\mathrm{sp}}\{X(t),\, t \ge \tau\} = \overline{\mathrm{sp}}\{\Delta L(t),\, t \ge \tau\}$ for all $\tau$, i.e. the increments of $L$ represent the backward innovations of $X$. It is easily seen that $X(t)$ is fully anticipatory and invertible (with kernel $f(\cdot)$) iff $X(-t)$ is nonanticipating and invertible (with kernel $f(-\cdot)$). Thus the only fully anticipatory, invertible SαS moving averages ($1 < \alpha < 2$) which are right weakly Markov are the Markov processes

(5.4)  $Y(t) = a' \int_{t}^{\infty} e^{\lambda'(t-s)}\, dL(s), \qquad -\infty < t < \infty,$

where $0 < a', \lambda' < \infty$, which have covariation function

(5.5)  $R_Y(t) = R_Y(0)\, e^{-\lambda'(\alpha-1)t} \ \text{ for } t \ge 0, \qquad R_Y(t) = R_Y(0)\, e^{\lambda' t} \ \text{ for } t \le 0.$

Theorem 3.2(iii) implies that the two classes of Markov processes introduced in this section, the nonanticipating and the fully anticipatory invertible SαS moving averages, are clearly distinct when $1 < \alpha < 2$.


6. HARMONIZABLE PROCESSES

A complex harmonizable SαS process has the harmonic spectral representation

(6.1)  $X(t) = \int_{-\infty}^{\infty} e^{itu}\, dZ(u), \qquad -\infty < t < \infty,$

where $Z$ is a complex, independently scattered, isotropic SαS measure on $(\mathbb{R}^1, \mathcal{B}^1, \mu)$ with $\mu$ a finite symmetric measure. For every complex-valued function $g \in L^{\alpha}(\mu)$ the r.v. $\int g\, dZ$ is complex isotropic SαS with $E\exp\{i\,\Re(\bar z \int g\, dZ)\} = \exp\{-|z|^{\alpha}\int |g|^{\alpha}\, d\mu\}$ for complex $z$. Covariation is given by

$\operatorname{Cov}\Big[\int g_1\, dZ,\; \int g_2\, dZ\Big] = \int g_1\, g_2^{\langle\alpha-1\rangle}\, d\mu$

for all $g_1, g_2 \in L^{\alpha}(\mu)$, where $z^{\langle q\rangle} = |z|^{q-1}\bar z$. All properties of covariation and regression used here in the real case are also valid in this complex isotropic case (see [4]).
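For intuition, (6.1) can be approximated on a finite frequency grid, using the fact that complex isotropic SαS variables can be realized in sub-Gaussian form $A^{1/2}(G_1+iG_2)$ with $A$ positive α/2-stable. A rough sketch under these assumptions (discretization, truncation and normalization are illustrative; positive_stable is the Kanter generator from the sketch in Section 4):

```python
import numpy as np

def harmonizable_path(alpha, mu_weights, freqs, t_grid, rng):
    # approximate X(t) = int exp(itu) dZ(u): each frequency cell carries an
    # independent complex isotropic SaS variable with scale mu(cell)**(1/alpha),
    # realized as A**(1/2)*(G1 + 1j*G2) with A positive alpha/2-stable
    A = positive_stable(alpha / 2.0, len(freqs), rng)
    g = rng.standard_normal(len(freqs)) + 1j * rng.standard_normal(len(freqs))
    dZ = mu_weights ** (1.0 / alpha) * np.sqrt(A) * g
    return np.array([np.sum(np.exp(1j * t * freqs) * dZ) for t in t_grid])

# example: mu with Cauchy-type density lam/(pi*(lam**2 + u**2)), truncated
rng = np.random.default_rng(2)
u = np.linspace(-50.0, 50.0, 2001)
lam = 1.0
w = lam / (np.pi * (lam ** 2 + u ** 2)) * (u[1] - u[0])
x = harmonizable_path(1.5, w, u, np.linspace(0.0, 10.0, 201), rng)
```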

The harmonizable process (6.1) is stationary and has covariation function

(6.2)  $R(t) = \int_{-\infty}^{\infty} e^{itu}\, d\mu(u).$

Taking its real part would provide a real harmonizable process, but it is more convenient to work with complex quantities when dealing with Fourier-type representations. Note that when $\alpha = 2$ all (continuous in probability) stationary Gaussian processes are harmonizable, while when $1 < \alpha < 2$, the harmonizable processes, the nonanticipating moving averages and the sub-Gaussian SαS processes form disjoint classes [6]. We now show that (nontrivial) harmonizable processes cannot be weakly Markov.

Theorem 6.1. The only harmonizable SαS process with $1 < \alpha < 2$ which is left or right weakly Markov is the constant SαS process.


Proof. Let $X$ be harmonizable SαS as in (6.1). Assume furthermore that $X$ is, say, left weakly Markov. Then $R(t) = R(0)e^{-\lambda t}$ for all $t \ge 0$ and some $0 \le \lambda \le \infty$. The continuity of $R$ in (6.2) excludes the value $\lambda = \infty$. When $\lambda = 0$, $X$ is a constant SαS process with $\mu$ concentrated at $0$. Assume from now on that $0 < \lambda < \infty$. Since $R$ is an even function (cf. (6.2)) it follows that

$R(t) = R(0)\, e^{-\lambda|t|} \qquad \text{for all } t,$

and by (6.2),

$\frac{d\mu(u)}{du} = \frac{\lambda R(0)}{\pi(\lambda^2+u^2)}.$

Since $X$ is left weakly Markov, it satisfies

$\operatorname{Cov}[X(\tau) - e^{-\lambda\tau}X(0),\, Y] = 0, \qquad \forall\, \tau > 0,\ \forall\, Y \in \overline{\mathrm{sp}}\{X(s),\, s \le 0\}.$

Taking $Y = X(0) + X(-v) = \int_{-\infty}^{\infty}(1+e^{-ivu})\, dZ(u)$ with $v > 0$, we obtain for all $\tau, v > 0$,

$0 = \int_{-\infty}^{\infty} (e^{i\tau u} - e^{-\lambda\tau})\,(1+e^{-ivu})^{\langle\alpha-1\rangle}\, d\mu(u) = \int_{-\infty}^{\infty} (e^{i\tau u} - e^{-\lambda\tau})\,(1+e^{ivu})\,[2(1+\cos vu)]^{(\alpha-2)/2}\, \frac{\lambda R(0)}{\pi(\lambda^2+u^2)}\, du.$

Introduce for each $v > 0$ the measure $\mu_v$ by

$\frac{d\mu_v(u)}{du} = \frac{1}{(1+\cos vu)^{1-\alpha/2}\,(\lambda^2+u^2)}.$

Then $\mu_v$ is symmetric and finite, since for $vu \to (2k+1)\pi$ we have $1+\cos vu \sim [vu-(2k+1)\pi]^2/2$ and $2(1-\alpha/2) = 2-\alpha < 1$. We have for all $\tau, v > 0$,

$\int_{-\infty}^{\infty} e^{i\tau u}\,(1+e^{ivu})\, d\mu_v(u) = e^{-\lambda\tau}\int_{-\infty}^{\infty}(1+e^{ivu})\, d\mu_v(u).$


Let $f_v(\cdot)$ be the characteristic function of the probability measure $\mu_v/\mu_v(\mathbb{R})$. Since $\mu_v$ is symmetric, $f_v$ is real and even, and for all $\tau, v > 0$,

$f_v(\tau) + f_v(\tau+v) = e^{-\lambda\tau}\,\frac{1}{\mu_v(\mathbb{R})}\int_{-\infty}^{\infty}(1+\cos vu)\, d\mu_v(u) =: e^{-\lambda\tau}\, b(v).$

It follows that for all $k=1,2,\ldots$, and $v, \tau > 0$,

$f_v(\tau+2kv) = f_v(\tau) - e^{-\lambda\tau} b(v)\big(1 - e^{-\lambda v} + e^{-2\lambda v} - \cdots - e^{-(2k-1)\lambda v}\big) = f_v(\tau) - e^{-\lambda\tau} b(v)\, \frac{1 - e^{-2k\lambda v}}{1 + e^{-\lambda v}},$

and thus

$\lim_{k\to\infty} f_v(\tau+2kv) = f_v(\tau) - \frac{e^{-\lambda\tau}\, b(v)}{1+e^{-\lambda v}}, \qquad \tau, v > 0.$

But since $f_v$ is the Fourier transform of an absolutely continuous, finite measure, by the Riemann-Lebesgue lemma, $f_v(\pm\infty) = 0$. Hence $f_v(\tau) = e^{-\lambda\tau}\, b(v)/(1+e^{-\lambda v})$, $\forall\, \tau, v > 0$, and by the symmetry of $f_v$,

$f_v(\tau) = \frac{b(v)}{1+e^{-\lambda v}}\, e^{-\lambda|\tau|}, \qquad \forall\, \tau,\ \forall\, v > 0.$

The uniqueness of the Fourier transform now implies that for any $v > 0$,

$\frac{1}{\mu_v(\mathbb{R})}\,\frac{1}{(1+\cos vu)^{1-\alpha/2}\,(\lambda^2+u^2)} = \frac{b(v)}{1+e^{-\lambda v}}\,\frac{\lambda}{\pi(\lambda^2+u^2)} \qquad \text{for almost any } u,$

and thus

$(1+\cos vu)^{1-\alpha/2} = \frac{\pi(1+e^{-\lambda v})\,\mu_v(\mathbb{R})}{\lambda\, b(v)}, \qquad u \in \mathbb{R},$


which is a contradiction, as the left hand side depends on $u$ while the right hand side does not. It follows that $X$ with $0 < \lambda < \infty$ cannot be left weakly Markov, and similarly it cannot be right weakly Markov. □

The proof of Theorem 4.1 provides an example of a stationary SαS process with covariation function $R(t) = R(0)e^{-\lambda|t|}$ ($0 < \lambda < \infty$) which is neither right nor left weakly Markov. In fact this is a harmonizable process with representation

(6.3)  $X(t) = [R(0)\lambda/\pi]^{1/\alpha} \int_{-\infty}^{\infty} e^{itu}\, (u^2+\lambda^2)^{-1/\alpha}\, dZ(u),$

where $Z$ is complex isotropic SαS motion.


7. A FAMILY OF ONE-SIDED WEAKLY MARKOV PROCESSES

Every left weakly Markov process constructed in earlier sections is also right weakly Markov, and vice versa. In this section we construct non-Gaussian SαS processes which are left weakly Markov but not right weakly Markov, and vice versa. Thus for non-Gaussian SαS processes weak Markovianness is not a symmetric property. For simplicity we consider the case of stationary processes.

Lemma 7.1. Let $X_1$ and $X_2$ be independent left (correspondingly, right) weakly Markov stationary SαS processes with covariation functions satisfying

$R_i(t) = r_i\, e^{-a|t|}$

for all $t > 0$ (correspondingly, all $t < 0$), $i=1,2$, and some $r_1, r_2, a > 0$. Then for any real numbers $b_1, b_2$, the process

$X(t) = b_1 X_1(t) + b_2 X_2(t), \qquad -\infty < t < \infty,$

is a left (correspondingly, right) weakly Markov stationary SαS process.

Proof. Only the weak Markovianness requires proof. We assume that the original processes are left weakly Markov. For any $\tau \ge 0$, any $u_1,\ldots,u_k \le 0$, and any real $c_1,\ldots,c_k$, we have, using the independence of $X_1$ and $X_2$,

$\operatorname{Cov}\big[X(\tau) - e^{-a\tau}X(0),\; \textstyle\sum_{i=1}^{k} c_i X(u_i)\big]$

$= \operatorname{Cov}\big[b_1\{X_1(\tau) - e^{-a\tau}X_1(0)\} + b_2\{X_2(\tau) - e^{-a\tau}X_2(0)\},\; b_1\textstyle\sum_{i=1}^{k} c_i X_1(u_i) + b_2\textstyle\sum_{i=1}^{k} c_i X_2(u_i)\big]$

$= |b_1|^{\alpha}\operatorname{Cov}\big[X_1(\tau) - e^{-a\tau}X_1(0),\; \textstyle\sum_{i=1}^{k} c_i X_1(u_i)\big] + |b_2|^{\alpha}\operatorname{Cov}\big[X_2(\tau) - e^{-a\tau}X_2(0),\; \textstyle\sum_{i=1}^{k} c_i X_2(u_i)\big] = 0,$


because of the left weak Markovianness of $X_1$ and $X_2$. Theorem 2.1 implies that $X$ is left weakly Markov. □

Recall that we have introduced three families of weakly Markov stationary SαS processes: the SαS Ornstein-Uhlenbeck processes (2.13), the inverted SαS Ornstein-Uhlenbeck processes (3.3), and the sub-Ornstein-Uhlenbeck processes of Section 4.

Fix $a > 0$. Let $X_1$ be a SαS Ornstein-Uhlenbeck process such that

$R_1(t) = e^{-at}, \qquad t \ge 0.$

Then by (2.15) we have

$R_1(t) = e^{(\alpha-1)at}, \qquad t \le 0.$

Similarly, let $X_2$ be an inverted SαS Ornstein-Uhlenbeck process with

$R_2(t) = e^{-at}, \qquad t \ge 0.$

Then

$R_2(t) = e^{at/(\alpha-1)}, \qquad t \le 0.$

Finally, let $X_3$ be a sub-Ornstein-Uhlenbeck process with

$R_3(t) = e^{-at}, \qquad t \ge 0.$

Then (4.3) implies that

$R_3(t) = e^{at}, \qquad t \le 0.$

The three processes $X_1, X_2, X_3$ are assumed to be independent. Let $b_1, b_2, b_3$ be nonnegative numbers, at most one of which is equal to zero. Define a new stochastic process

(7.1)  $X(t) = b_1 X_1(t) + b_2 X_2(t) + b_3 X_3(t), \qquad -\infty < t < \infty.$


By Lemma 7.1, $X$ is a left weakly Markov stationary SαS process. Using the independence of $X_1, X_2, X_3$ we obtain

$R(t) = \operatorname{Cov}[X(t), X(0)] = b_1^{\alpha}\operatorname{Cov}[X_1(t), X_1(0)] + b_2^{\alpha}\operatorname{Cov}[X_2(t), X_2(0)] + b_3^{\alpha}\operatorname{Cov}[X_3(t), X_3(0)]$

$= b_1^{\alpha} R_1(t) + b_2^{\alpha} R_2(t) + b_3^{\alpha} R_3(t)$

$= \begin{cases} (b_1^{\alpha} + b_2^{\alpha} + b_3^{\alpha})\, e^{-at}, & t \ge 0,\\ b_1^{\alpha} e^{(\alpha-1)at} + b_2^{\alpha} e^{at/(\alpha-1)} + b_3^{\alpha} e^{at}, & t \le 0, \end{cases}$

and this is not the covariation function of a right weakly Markov stationary SαS process (cf. Section 2). Therefore, (7.1) defines a whole family (with parameters $a, b_1, b_2, b_3$) of left weakly Markov stationary SαS processes that are not right weakly Markov.

It is clear that in a similar way we can define a family of right weakly Markov stationary SαS processes which are not left weakly Markov.
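A quick numerical illustration of the asymmetry (plain evaluation of the closed-form covariation computed above; parameter values arbitrary). A one-sided exponential satisfies $R(t+s)R(0) = R(t)R(s)$; the mixture does so on the positive side but not on the negative side:

```python
import numpy as np

def R(t, alpha=1.5, a=1.0, b=(1.0, 1.0, 1.0)):
    # covariation function of the mixture (7.1), computed above
    b1, b2, b3 = (x ** alpha for x in b)
    if t >= 0:
        return (b1 + b2 + b3) * np.exp(-a * t)
    return (b1 * np.exp((alpha - 1.0) * a * t)
            + b2 * np.exp(a * t / (alpha - 1.0))
            + b3 * np.exp(a * t))

s, t = 1.0, 2.0
# exponential multiplicativity holds for t, s > 0:
print(np.isclose(R(t + s) * R(0), R(t) * R(s)))      # True
# but fails on the negative side, so R is not exponential for t < 0:
print(np.isclose(R(-t - s) * R(0), R(-t) * R(-s)))   # False
```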


REFERENCES

[1] BORISOV, I.S. (1982). On a criterion for Gaussian random processes to be Markovian. Theor. Probab. Appl. 27, 863-865.

[2] CAMBANIS, S., HARDIN, C.D., JR. and WERON, A. (1987). Ergodic properties of stationary stable processes. Stochastic Proc. Appl. 24, 1-18.

[3] CAMBANIS, S., HARDIN, C.D., JR. and WERON, A. (1985). Innovations and Wold decompositions of stable sequences. Center for Stochastic Processes Tech. Rept. No. 106, Statist. Dept., Univ. of North Carolina.

[4] CAMBANIS, S. and MIAMEE, A.G. (1985). On prediction of harmonizable stable processes. Center for Stochastic Processes Tech. Rept. No. 110, Statist. Dept., Univ. of North Carolina.

[5] CAMBANIS, S. and MILLER, G. (1981). Linear problems in pth order and stable processes. SIAM J. Appl. Math. 41, 43-69.

[6] CAMBANIS, S. and SOLTANI, A.R. (1984). Prediction of stable processes: Spectral and moving average representations. Z. Wahrsch. verw. Geb. 66, 593-612.

[7] FELLER, W. (1968). An Introduction to Probability Theory and Its Applications, Vol. I. Wiley, New York.

[8] HARDIN, C.D., JR. (1982). On the spectral representation of symmetric stable processes. J. Multivariate Anal. 12, 385-401.

[9] KANTER, M. (1972). Linear sample spaces and stable processes. J. Funct. Anal. 9, 441-456.

[10] KUELBS, J. (1973). A representation theorem for symmetric stable processes and stable measures on H. Z. Wahrsch. verw. Geb. 26, 259-271.

[11] TIMOSZYK, W. (1974). A characterization of Gaussian processes that are Markovian. Colloq. Math. 30, 157-167.

[12] WERON, A. (1984). Stable processes and measures: A survey. In Probability Theory on Vector Spaces III, D. Szynal and A. Weron, Eds., Lecture Notes in Mathematics, Vol. 1080, Springer, 306-364.

[13] WONG, E. and HAJEK, B. (1985). Stochastic Processes in Engineering Systems. Springer, New York.
