APPENDIX H
INTRODUCTION TO PROBABILITY AND RANDOM PROCESSES

This appendix is not intended to be a definitive dissertation on the subject of random processes. The major concepts, definitions, and results which are employed in the text are stated here with little discussion and no proof. The reader who requires a more complete presentation of this material is referred to any one of several excellent books on the subject, among them Davenport and Root (Ref. 2), Laning and Battin (Ref. 3), and Lee (Ref. 4). Possibly the most important function served by this appendix is the definition of the notation and of certain conventions used in the text.

PROBABILITY

Consider an event E which is a possible outcome of a random experiment. We denote by P(E) the probability of this event, and think of it intuitively as the limit, as the number of trials becomes large, of the ratio of the number of times E occurred to the number of times the experiment was tried. The joint event that A and B and C, etc., occurred is denoted by ABC..., and the probability of this joint event by P(ABC...). If these events A, B, C, etc., are mutually independent, which means that the occurrence of any one of them bears no relation to the occurrence of any other, the probability of the joint event is the product of the probabilities of the simple events. That is,

$P(ABC \ldots) = P(A)\,P(B)\,P(C)\ldots$   (H-1)
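As a concrete check of Eq. (H-1), the following sketch (an assumed example, not part of the original text) estimates P(A), P(B), and P(AB) by relative frequency for two hypothetical independent events and compares the product of the individual probabilities with the joint probability.

```python
import numpy as np

# Minimal sketch (not from the text): empirically check Eq. (H-1) for two
# independent events using relative frequencies over many trials.
rng = np.random.default_rng(0)
n_trials = 100_000

# Hypothetical independent events: A = "coin shows heads" (p = 0.5),
# B = "die shows a six" (p = 1/6).
A = rng.random(n_trials) < 0.5
B = rng.integers(1, 7, n_trials) == 6

p_A = A.mean()            # relative frequency of A
p_B = B.mean()            # relative frequency of B
p_AB = (A & B).mean()     # relative frequency of the joint event AB

print(f"P(A)P(B) = {p_A * p_B:.4f}, P(AB) = {p_AB:.4f}")  # nearly equal
```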


if the events A, B, C, etc., are mutually independent. Actually, the mathematical definition of independence is the reverse of this statement, but the result of consequence is that independence of events and the multiplicative property of probabilities go together.

RANDOM VARIABLES

A random variable X is in simplest terms a variable which takes on values at random; it may be thought of as a function of the outcomes of some random experiment. The manner of specifying the probability with which different values are taken by the random variable is by the probability distribution function F(x), which is defined by

$F(x) = P(X \le x)$   (H-2)

or by the probability density function f(x), which is defined by

$f(x) = \dfrac{dF(x)}{dx}$   (H-3)

The inverse of the defining relation for the probability density function is

$F(x) = \displaystyle\int_{-\infty}^{x} f(u)\, du$   (H-4)

An evident characteristic of any probability distribution or density function is

$F(\infty) = \displaystyle\int_{-\infty}^{\infty} f(x)\, dx = 1$   (H-5)

From the definition, the interpretation of f(x) as the density of probability of the event that X takes a value in the vicinity of x is clear:

$f(x) = \displaystyle\lim_{dx \to 0} \dfrac{F(x + dx) - F(x)}{dx} = \lim_{dx \to 0} \dfrac{P(x < X \le x + dx)}{dx}$   (H-6)

This function is finite if the probability that X takes a value in the infinitesimal interval between x and x + dx (the interval closed on the right) is an infinitesimal of order dx. This is usually true of random variables which take values over a continuous range. If, however, X takes a set of discrete values x_i with nonzero probabilities p_i, f(x) is infinite at these values of x. This is accommodated by a set of delta functions weighted by the appropriate probabilities:

$f(x) = \displaystyle\sum_i p_i\, \delta(x - x_i)$   (H-7)
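The limiting ratio in Eq. (H-6) suggests a simple numerical experiment. The sketch below (an assumed example using a standard normal sample, not from the text) estimates f(x) by differencing an empirical estimate of F(x) with a small but finite dx and compares it with the exact density.

```python
import numpy as np

# Minimal sketch (assumed example): estimate f(x) for a standard normal sample
# by differencing an empirical estimate of F(x), illustrating the limiting
# ratio in Eq. (H-6) with a small but finite dx.
rng = np.random.default_rng(1)
samples = rng.standard_normal(200_000)

def F_hat(x):
    """Empirical distribution function: fraction of samples <= x."""
    return np.mean(samples <= x)

dx = 0.05
xs = np.linspace(-3.0, 3.0, 13)
f_hat = [(F_hat(x + dx) - F_hat(x)) / dx for x in xs]

# Compare with the exact normal density at the same points.
f_true = np.exp(-xs**2 / 2) / np.sqrt(2 * np.pi)
for x, fe, ft in zip(xs, f_hat, f_true):
    print(f"x = {x:+.2f}   F-difference estimate = {fe:.3f}   exact = {ft:.3f}")
```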


A suitable definition of the delta function, δ(x), for the present purpose is a function which is zero everywhere except at x = 0, and infinite at that point in such a way that the integral of the function across the singularity is unity. An important property of the delta function which follows from this definition is

$\displaystyle\int_{-\infty}^{\infty} G(x)\, \delta(x - x_0)\, dx = G(x_0)$   (H-8)

if G(x) is a finite-valued function which is continuous at x = x_0.
A random variable may take values over a continuous range and, in

addition, take a discrete set of values with nonzero probability. The resulting probability density function includes both a finite function of x and an additive set of probability-weighted delta functions; such a distribution is called mixed.

The simultaneous consideration of more than one random variable is often necessary or useful. In the case of two, the probability of the occurrence of pairs of values in a given range is prescribed by the joint probability distribution function

$F_2(x, y) = P(X \le x \text{ and } Y \le y)$   (H-9)

where X and Y are the two random variables under consideration. The corresponding joint probability density function is

$f_2(x, y) = \dfrac{\partial^2 F_2(x, y)}{\partial x\, \partial y}$   (H-10)

It is clear that the individual probability distribution and density functions for X and Y can be derived from the joint distribution and density functions. For the distribution of X,

$F_X(x) = F_2(x, \infty)$   (H-11)

$f_X(x) = \displaystyle\int_{-\infty}^{\infty} f_2(x, y)\, dy$   (H-12)

Corresponding relations give the distribution of Y. These concepts extend directly to the description of the joint characteristics of more than two random variables.

If X and Y are independent, the event X ≤ x is independent of the event Y ≤ y; thus the probability for the joint occurrence of these events is the product of the probabilities for the individual events. Equation (H-9) then gives

$F_2(x, y) = P(X \le x \text{ and } Y \le y) = P(X \le x)\, P(Y \le y) = F_X(x)\, F_Y(y)$   (H-13)


From Eq. (H-10) the joint probability density function is, then,

$f_2(x, y) = f_X(x)\, f_Y(y)$   (H-14)

Expectations and statistics of random variables  The expectation of a random variable is defined in words to be the sum of all values the random variable may take, each weighted by the probability with which the value is taken. For a random variable which takes values over a continuous range, this summation is done by integration. The probability, in the limit as dx → 0, that X takes a value in the infinitesimal interval of width dx near x is given by Eq. (H-6) to be f(x) dx. Thus the expectation of X, which we denote by $\bar{X}$, is

$\bar{X} = \displaystyle\int_{-\infty}^{\infty} x\, f(x)\, dx$   (H-15)

This is also called the mean value of X, or the mean of the distribution of X. This is a precisely defined number toward which the average of a number of observations of X tends, in the probabilistic sense, as the number of observations becomes large. Equation (H-15) is the analytic definition of the expectation, or mean, of a random variable. This expression is usable for random variables having a continuous, discrete, or mixed distribution if the set of discrete values which the random variable takes is represented by impulses in f(x) according to Eq. (H-7).

It is of frequent importance to find the expectation of a function of a random variable. If Y is defined to be some function of the random variable X, say, Y = g(X), then Y is itself a random variable with a distribution derivable from the distribution of X. The expectation of Y is defined by Eq. (H-15), where the probability density function for Y would be used in the integral. Fortunately, this procedure can be abbreviated. The expectation of any function of X can be calculated directly from the distribution of X by the integral

$\overline{g(X)} = \displaystyle\int_{-\infty}^{\infty} g(x)\, f(x)\, dx$   (H-16)

An important statistical parameter descriptive of the distribution of X is its mean-squared value. Using Eq. (H-16), the expectation of the square of X is written

$\overline{X^2} = \displaystyle\int_{-\infty}^{\infty} x^2 f(x)\, dx$   (H-17)

The variance of a random variable is the mean-squared deviation of the random variable from its mean; it is denoted by σ².

$\sigma^2 = \overline{(X - \bar{X})^2} = \overline{X^2} - \bar{X}^2$   (H-18)
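Equations (H-15) through (H-18) can be evaluated by straightforward quadrature once f(x) is specified. The sketch below is a minimal assumed example using a uniform density on [0, 1].

```python
import numpy as np

# Minimal sketch (assumed example): evaluate Eqs. (H-15)-(H-18) numerically
# for a chosen density f(x), here a uniform density on [0, 1], by quadrature.
x = np.linspace(0.0, 1.0, 10_001)
f = np.ones_like(x)                      # uniform density: f(x) = 1 on [0, 1]

mean = np.trapz(x * f, x)                # Eq. (H-15): mean
mean_sq = np.trapz(x**2 * f, x)          # Eq. (H-17): mean-squared value
variance = mean_sq - mean**2             # Eq. (H-18): variance

print(f"mean = {mean:.4f} (exact 1/2)")
print(f"mean square = {mean_sq:.4f} (exact 1/3)")
print(f"variance = {variance:.4f} (exact 1/12)")
```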


The square root of the variance, σ, is called the standard deviation of the random variable. Other functions whose expectations we shall wish to calculate are sums and products of random variables. It is easily shown that the expectation of the sum of random variables is equal to the sum of the expectations,

$\overline{X + Y} = \bar{X} + \bar{Y}$   (H-19)

whether or not the variables are independent, and that the expectation of the product of random variables is equal to the product of the expectations,

$\overline{XY} = \bar{X}\,\bar{Y}$   (H-20)

if the variables are independent. It is also true that the variance of the sum of random variables is equal to the sum of the variances if the variables are independent.

A very important concept is that of statistical dependence between random variables. A partial indication of the degree to which one variable is related to another is given by the covariance, which is the expectation of the product of the deviations of two random variables from their means:

$\overline{(X - \bar{X})(Y - \bar{Y})}$   (H-21)

This covariance, normalized by the standard deviations of X and Y, is called the correlation coefficient, and is denoted ρ:

$\rho = \dfrac{\overline{(X - \bar{X})(Y - \bar{Y})}}{\sigma_X \sigma_Y}$   (H-22)

The correlation coefficient is a measure of the degree of linear dependence between X and Y. If X and Y are independent, ρ is zero; if Y is a linear function of X, ρ is ±1. If an attempt is made to approximate Y by some linear function of X, the minimum possible mean-squared error in the approximation is $\sigma_Y^2(1 - \rho^2)$. This provides another interpretation of ρ as a measure of the degree of linear dependence between random variables.
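The interpretation of ρ as the degree of linear dependence can be checked numerically. The sketch below (an assumed example; the particular variables and coefficients are hypothetical) computes ρ from Eq. (H-22) for sampled data and verifies that the best linear fit of Y on X leaves a residual mean-squared error of about $\sigma_Y^2(1 - \rho^2)$.

```python
import numpy as np

# Minimal sketch (assumed example): sample a pair of correlated variables,
# compute the correlation coefficient of Eq. (H-22), and check that the best
# linear fit of Y on X leaves a mean-squared error of sigma_Y^2 * (1 - rho^2).
rng = np.random.default_rng(2)
n = 200_000

x = rng.standard_normal(n)
y = 0.8 * x + 0.6 * rng.standard_normal(n)   # Y partially determined by X

rho = np.mean((x - x.mean()) * (y - y.mean())) / (x.std() * y.std())

# Least-squares linear approximation y ~ a*x + b and its residual power.
a, b = np.polyfit(x, y, 1)
mse = np.mean((y - (a * x + b))**2)

print(f"rho = {rho:.3f}")
print(f"residual MSE = {mse:.4f}, sigma_Y^2 (1 - rho^2) = {y.var() * (1 - rho**2):.4f}")
```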

One additional function associated with the distribution of a random variable which should be introduced is the characteristic function. It is defined by

$g(t) = \overline{\exp(jtX)}$   (H-23)

A property of the characteristic function which largely explains its value is that the characteristic function of a sum of independent random variables is the product of the characteristic functions of the individual variables.


If the characteristic function of a random variable is known, the probability density function can be determined from

$f(x) = \dfrac{1}{2\pi} \displaystyle\int_{-\infty}^{\infty} g(t)\, \exp(-jtx)\, dt$   (H-24)

Notice that Eqs. (H-23) and (H-24) are in the form of a Fourier transform pair. Another useful relation is

(H-25)

The uniform and normal probability distributions  Two specific forms of probability distribution which are referred to in the text are the uniform distribution and the normal distribution. The uniform distribution is characterized by a uniform (constant) probability density over some finite interval. The magnitude of the density function in this interval is the reciprocal of the interval width, as required to make the integral of the function unity. This function is pictured in Fig. H-1. The normal probability density function, shown in Fig. H-2, has the analytic form

$f(x) = \dfrac{1}{\sqrt{2\pi}\,\sigma} \exp\!\left[-\dfrac{(x - m)^2}{2\sigma^2}\right]$   (H-26)

where the two parameters which define the distribution are m, the mean, and σ, the standard deviation. By calculating the characteristic function for a normally distributed random variable, one can immediately show that the distribution of the sum of independent normally distributed variables is also normal. Actually, this remarkable property of preservation of form of the distribution is true of the sum of normally distributed random variables whether they are independent or not. Even more remarkable is the fact that under certain circumstances the distribution of the sum of independent random variables, each having an arbitrary distribution, tends toward the normal distribution as the number of variables in the sum tends toward infinity. This statement, together with the conditions under which the result can be proved, is known as the central limit theorem. The conditions are rarely tested in practical situations, but the empirically observed fact is that a great many random variables, and especially those encountered by control-system engineers, display a distribution which closely approximates the normal. The reason for the common occurrence of normally distributed random variables is certainly stated in the central limit theorem.

Reference is made in the text to two random variables which possess a bivariate normal distribution.
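The tendency stated by the central limit theorem is easy to observe numerically. The following sketch (an assumed example, not from the text) sums independent uniformly distributed variables, standardizes the sum, and compares its histogram with the normal density of Eq. (H-26).

```python
import numpy as np

# Minimal sketch (assumed example): illustrate the central limit theorem by
# summing independent uniformly distributed variables and comparing the
# standardized histogram of the sum with the normal density of Eq. (H-26).
rng = np.random.default_rng(3)
n_terms, n_trials = 12, 100_000

# Each row is one trial; sum n_terms independent uniform(0, 1) variables.
sums = rng.random((n_trials, n_terms)).sum(axis=1)
z = (sums - sums.mean()) / sums.std()        # standardize to zero mean, unit sigma

hist, edges = np.histogram(z, bins=21, range=(-3.5, 3.5), density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
normal = np.exp(-centers**2 / 2) / np.sqrt(2 * np.pi)

for c, h, p in zip(centers, hist, normal):
    print(f"z = {c:+.2f}   empirical density = {h:.3f}   normal density = {p:.3f}")
```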


The form of the joint probability density function for such zero-mean variables is

$f_2(x_1, x_2) = \dfrac{1}{2\pi\sqrt{m_{11} m_{22} - m_{12}^2}} \exp\!\left[-\dfrac{m_{22} x_1^2 - 2 m_{12} x_1 x_2 + m_{11} x_2^2}{2(m_{11} m_{22} - m_{12}^2)}\right]$   (H-27)

where $m_{ij} = \overline{X_i X_j}$. This can also be written in terms of statistical parameters previously defined as

$f_2(x_1, x_2) = \dfrac{1}{2\pi \sigma_1 \sigma_2 \sqrt{1 - \rho^2}} \exp\!\left\{-\dfrac{1}{2(1 - \rho^2)}\left[\dfrac{x_1^2}{\sigma_1^2} - \dfrac{2\rho x_1 x_2}{\sigma_1 \sigma_2} + \dfrac{x_2^2}{\sigma_2^2}\right]\right\}$   (H-28)
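A quick numerical check of the role of the $m_{ij}$ in Eq. (H-27): the sketch below (an assumed example) draws samples from a zero-mean bivariate normal distribution with a chosen covariance matrix and confirms that the sample second moments reproduce the specified $m_{ij}$.

```python
import numpy as np

# Minimal sketch (assumed example): draw samples from a zero-mean bivariate
# normal distribution with a chosen covariance matrix and verify that the
# sample moments reproduce the m_ij of Eq. (H-27).
rng = np.random.default_rng(4)

m = np.array([[2.0, 0.6],      # m_11, m_12
              [0.6, 1.0]])     # m_12, m_22
samples = rng.multivariate_normal(mean=[0.0, 0.0], cov=m, size=200_000)

m_hat = samples.T @ samples / len(samples)   # sample estimate of [m_ij]
print("specified covariance matrix:\n", m)
print("sample estimate:\n", np.round(m_hat, 3))
```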

RANDOM PROCESSES

A random process may be thought of as a collection, or ensemble, of functions of time, any one of which might be observed on any trial of an experiment. The ensemble may include a finite number, a countable infinity, or a noncountable infinity of such functions. We shall denote the ensemble of functions by {x(t)}, and any observed member of the ensemble by x(t). The value of the observed member of the ensemble at a particular time, say, $t_1$, as shown in Fig. H-3, is a random variable; on repeated trials of the experiment, $x(t_1)$ takes different values at random. The probability that $x(t_1)$ takes values in a certain range is given by the probability distribution function, as it is for any random variable. In this case we show explicitly in the notation the dependence on the time of observation:

$F_1(x_1, t_1) = P[x(t_1) \le x_1]$

The corresponding probability density function is

$f_1(x_1, t_1) = \dfrac{\partial F_1(x_1, t_1)}{\partial x_1}$

These functions suffice to define, in a probabilistic sense, the range of amplitudes which the random process displays. To gain a sense of how quickly varying the members of the ensemble are likely to be, one has to observe the same member function at more than one time. The probability for the occurrence of a pair of values in certain ranges is given by the second-order joint probability distribution function

$F_2(x_1, t_1; x_2, t_2) = P[x(t_1) \le x_1 \text{ and } x(t_2) \le x_2]$   (H-30)


and the corresponding joint probability density function

$f_2(x_1, t_1; x_2, t_2) = \dfrac{\partial^2 F_2(x_1, t_1; x_2, t_2)}{\partial x_1\, \partial x_2}$   (H-31)

Higher-ordered joint distribution and density functions can be defined following this pattern, but only rarely does one attempt to deal with more than the second-order statistics of random processes. If two random processes are under consideration, the simplest distribution and density functions which give some indication of their joint statistical characteristics are the second-order functions

$F_2(x, t_1; y, t_2) = P[x(t_1) \le x \text{ and } y(t_2) \le y]$

$f_2(x, t_1; y, t_2) = \dfrac{\partial^2 F_2(x, t_1; y, t_2)}{\partial x\, \partial y}$

Actually, the characterization of random processes, in practice, is usually limited to even less information than that given by the second-order distribution or density functions. Only the first moments of these distributions are commonly measured. These moments are called auto- and cross-correlation functions. The autocorrelation function is defined as

$\varphi_{xx}(t_1, t_2) = \overline{x(t_1)\, x(t_2)}$

and the cross-correlation function as

$\varphi_{xy}(t_1, t_2) = \overline{x(t_1)\, y(t_2)}$

In the case where the means of $x(t_1)$, $x(t_2)$, and $y(t_2)$ are all zero, these correlation functions are the covariances of the indicated random variables. If they are then normalized by the corresponding standard deviations, according to Eq. (H-22), they become correlation coefficients which measure on a scale from −1 to +1 the degree of linear dependence between the variables.

A stationary random process is one whose statistical properties are invariant in time. This implies that the first probability density function for the process, $f_1(x_1, t_1)$, is independent of the time of observation $t_1$. Then all the moments of this distribution, such as $\overline{x(t_1)}$ and $\overline{x(t_1)^2}$, are also independent of time; they are constants. The second probability density function is not in this case dependent on the absolute times of observation, $t_1$ and $t_2$, but still depends on the difference between them. So if $t_2$ is written as $t_1 + \tau$,


$f_2(x_1, t_1; x_2, t_2)$ becomes $f_2(x_1, t_1; x_2, t_1 + \tau)$, which is independent of $t_1$ but still a function of τ. The correlation functions are then functions only of the single variable τ:

$\varphi_{xx}(\tau) = \overline{x(t_1)\, x(t_1 + \tau)}$   (H-37)

$\varphi_{xy}(\tau) = \overline{x(t_1)\, y(t_1 + \tau)}$

Both of these are independent of $t_1$ if the random processes are stationary. We note the following properties of these correlation functions:

$\varphi_{xx}(0) = \overline{x^2}$

$\varphi_{xx}(-\tau) = \varphi_{xx}(\tau)$

$\varphi_{xy}(-\tau) = \varphi_{yx}(\tau)$

$|\varphi_{xx}(\tau)| \le \varphi_{xx}(0)$

One further concept associated with stationary random processes is the ergodic hypothesis. This hypothesis claims that any statistic calculated by averaging over all members of an ergodic ensemble at a fixed time can also be calculated by averaging over all time on a single representative member of the ensemble. The key to this notion is the word "representative." If a particular member of the ensemble is to be statistically representative of all, it must display at various points in time the full range of amplitude, rate of change of amplitude, etc., which are to be found among all the members of the ensemble. A classic example of a stationary ensemble which is not ergodic is the ensemble of constant functions. The failing in this case is that no member of the ensemble is representative of all. In practice, almost all empirical results for stationary processes are derived from tests on a single function under the assumption that the ergodic hypothesis holds. In this case the common statistics associated with a random process are written

$\overline{x^2} = \displaystyle\lim_{T \to \infty} \dfrac{1}{2T} \int_{-T}^{T} x(t)^2\, dt$

$\varphi_{xx}(\tau) = \displaystyle\lim_{T \to \infty} \dfrac{1}{2T} \int_{-T}^{T} x(t)\, x(t + \tau)\, dt$   (H-43)
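Under the ergodic hypothesis, these time averages can be checked against ensemble averages. The sketch below (an assumed example using the random-phase sinusoid discussed next) compares the time-average autocorrelation of Eq. (H-43), computed from one long member function, with an ensemble average taken at a fixed time.

```python
import numpy as np

# Minimal sketch (assumed example): check the ergodic hypothesis numerically
# for a random-phase sinusoid by comparing the time average of Eq. (H-43),
# taken over one long member function, with an ensemble average at fixed time.
rng = np.random.default_rng(5)
A, w = 1.0, 2.0 * np.pi          # amplitude and angular frequency
t = np.linspace(0.0, 200.0, 200_001)
tau_steps = 125                  # lag in samples for the autocorrelation check
tau = t[tau_steps]

# Time average over a single member with one random phase.
theta = rng.uniform(0.0, 2.0 * np.pi)
x = A * np.sin(w * t + theta)
phi_time = np.mean(x[:-tau_steps] * x[tau_steps:])

# Ensemble average at a fixed time t1 over many members (many random phases).
t1 = 3.7
thetas = rng.uniform(0.0, 2.0 * np.pi, 100_000)
phi_ens = np.mean(A * np.sin(w * t1 + thetas) * A * np.sin(w * (t1 + tau) + thetas))

print(f"time-average phi_xx(tau)      = {phi_time:.4f}")
print(f"ensemble-average phi_xx(tau)  = {phi_ens:.4f}")
print(f"theoretical (A^2/2) cos(w*tau) = {0.5 * A**2 * np.cos(w * tau):.4f}")
```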

An example of a stationary ergodic random process is the ensemble of sinusoids of given amplitude and frequency with a uniform distribution of phase. The member functions of this ensemble are all of the form

$x(t) = A \sin(\omega t + \theta)$   (H-45)


where θ is a random variable having the uniform distribution over the interval (0, 2π) radians. Any average taken over the members of this ensemble at any fixed time would find all phase angles represented with equal probability density. But the same is true of an average over all time on any one member. For this process, then, all members of the ensemble qualify as "representative." Note that any distribution of the phase angle θ other than the uniform distribution over an integral number of cycles would define a nonstationary process.

Another random process which plays a central role in the text is the gaussian process, which is characterized by the property that its joint probability distribution functions of all orders are multidimensional normal distributions. For a gaussian process, then, the distribution of x(t) for any t is the normal distribution, for which the density function is expressed by Eq. (H-26); the joint distribution of $x(t_1)$ and $x(t_2)$ for any $t_1$ and $t_2$ is the bivariate normal distribution of Eq. (H-27), and so on for the higher-ordered joint distributions. The n-dimensional normal distribution for zero-mean variables is specified by the elements of the nth-order covariance matrix, that is, by the $m_{ij} = \overline{X_i X_j}$ for i, j = 1, 2, ..., n. But in this case

$m_{ij} = \overline{x(t_i)\, x(t_j)} = \varphi_{xx}(t_j - t_i)$

Thus all the statistics of a gaussian process are defined by the autocorrelation function for the process. This property is clearly a great boon to analytic operations.

LINEAR SYSTEMS

The input-output relation for a linear system may be written

$y(t) = \displaystyle\int_{-\infty}^{t} x(\tau)\, w(t, \tau)\, d\tau$

where x(t) = input function
      y(t) = output function
      w(t, τ) = system weighting function, the response at time t to a unit impulse input at time τ

Using this relation, the statistics of the output process can be written in terms of those of the input.


If the input process is stationary and the system time-invariant, the output process is also stationary in the steady state. These expressions then reduce to

$\varphi_{yy}(\tau) = \displaystyle\int_{0}^{\infty}\int_{0}^{\infty} w(\tau_1)\, w(\tau_2)\, \varphi_{xx}(\tau + \tau_1 - \tau_2)\, d\tau_1\, d\tau_2$   (H-54)

$\varphi_{xy}(\tau) = \displaystyle\int_{0}^{\infty} w(\tau_1)\, \varphi_{xx}(\tau - \tau_1)\, d\tau_1$   (H-55)

Analytic operations on linear invariant systems are facilitated by the use of integral transforms which transform the convolution input-output relation of Eq. (H-51) into the algebraic operation of multiplication. Since members of stationary random ensembles must necessarily be visualized as existing for all negative and positive time, the two-sided Fourier transform is the appropriate transformation to employ in this instance. The Fourier transforms of the correlation functions defined above then appear quite naturally in analysis. The Fourier transform of the autocorrelation function,

$\Phi_{xx}(\omega) = \displaystyle\int_{-\infty}^{\infty} \varphi_{xx}(\tau)\, \exp(-j\omega\tau)\, d\tau$   (H-56)

is called the power spectral density function, or power density spectrum, of the random process {x(t)}. The term "power" is here used in a generalized sense, indicating the expected squared value of the members of the ensemble. $\Phi_{xx}(\omega)$ is indeed the spectral distribution of power density for {x(t)} in that integration of $\Phi_{xx}(\omega)$ over frequencies in the band from $\omega_1$ to $\omega_2$ yields the mean-squared value of the process which consists only of those harmonic components of {x(t)} that lie between $\omega_1$ and $\omega_2$. In particular, the mean-squared value of {x(t)} itself is given by integration of the power density spectrum for the random process over the full range of ω:

$\overline{x^2} = \varphi_{xx}(0) = \dfrac{1}{2\pi} \displaystyle\int_{-\infty}^{\infty} \Phi_{xx}(\omega)\, d\omega$


This last result is seen as a specialization of the inverse transform relation corresponding to Eq. (H-56).

The input-output relation for power spectral density functions is derived by calculating the Fourier transform of the autocorrelation function of the output of a linear invariant system as expressed by Eq. (H-54). The result is

$\Phi_{yy}(\omega) = W(j\omega)\, W(-j\omega)\, \Phi_{xx}(\omega) = |W(j\omega)|^2\, \Phi_{xx}(\omega)$

where W(jω) is the steady-state sinusoidal response function for the system, which is also the Fourier transform of the system weighting function:

$W(j\omega) = \displaystyle\int_{-\infty}^{\infty} w(t)\, \exp(-j\omega t)\, dt$
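The relation $\Phi_{yy}(\omega) = |W(j\omega)|^2 \Phi_{xx}(\omega)$ can be verified numerically for a simple system. The sketch below is an assumed example (a first-order discrete low-pass filter and spectral estimates from scipy.signal), not a construction from the text.

```python
import numpy as np
from scipy import signal

# Minimal sketch (assumed example): pass white noise through a simple
# first-order discrete low-pass filter and check that the estimated output
# spectrum matches |W(jw)|^2 times the estimated input spectrum.
rng = np.random.default_rng(6)
x = rng.standard_normal(1_000_000)          # approximately white input

b, a = [0.1], [1.0, -0.9]                   # y[n] = 0.9 y[n-1] + 0.1 x[n]
y = signal.lfilter(b, a, x)

f, Pxx = signal.welch(x, nperseg=4096)      # input power spectral density
_, Pyy = signal.welch(y, nperseg=4096)      # output power spectral density
_, W = signal.freqz(b, a, worN=f, fs=1.0)   # frequency response W(jw)

ratio = Pyy / Pxx
print("estimated |W|^2 at a few frequencies:", np.round(ratio[10:2000:500], 4))
print("theoretical |W|^2 at same points:   ", np.round(np.abs(W[10:2000:500])**2, 4))
```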

A particularly simple form for the power density spectrum is a constant, $\Phi_{xx}(\omega) = \Phi_0$. This implies that power density is distributed uniformly over all frequency components in the full infinite range. By analogy with the corresponding situation in the case of white light, such a random process, usually a noise, is called white noise. The autocorrelation function for white noise is a delta function:

$\varphi_{xx}(\tau) = \Phi_0\, \delta(\tau)$

The mean-squared value of white noise, $\varphi_{xx}(0)$, is infinite, and so the process is not physically realizable. However, it does serve as a very useful approximation to situations in which the noise is wideband compared with the bandwidth of the system, and the concept is useful in analytic operations.

The Fourier transform of the cross-correlation function is called the cross power spectral density function:

$\Phi_{xy}(\omega) = \displaystyle\int_{-\infty}^{\infty} \varphi_{xy}(\tau)\, \exp(-j\omega\tau)\, d\tau$

If x(t) is the input to, and y(t) the output from, a linear invariant system, so that $\varphi_{xy}(\tau)$ is given by Eq. (H-55), then the input-output cross power density spectrum is

$\Phi_{xy}(\omega) = W(j\omega)\, \Phi_{xx}(\omega)$   (H-63)


The calculation of the distribution of amplitudes of the random processes which appear at various points in linear systems is in general a most complicated problem. The only case in which a simple result is known is that of a gaussian process applied to a linear system; in this case it can be proved that the processes appearing at all points in the system are also gaussian. The credibility of this property is perhaps indicated by noting that the input-output relation for a linear system [Eq. (H-46)] is the limit of a sum of the form

$y(t) = \displaystyle\lim_{\substack{\Delta\tau \to 0 \\ N \to \infty}} \sum_{i=-\infty}^{N} x(\tau_i)\, w(t, \tau_i)\, \Delta\tau, \qquad \tau_N = t$   (H-64)

We have already noted that the sum of normally distributed random variables is normally distributed, and since any constant times a normally distributed variable is also normal, we may conclude that any linear combination of normally distributed random variables is normally distributed. Equation (H-64) expresses the output of a linear system as the limit of a linear combination of past values of the input. Thus, if the input at all past times has a normal distribution, the output at any time must also be normal. This property also holds for the higher-ordered joint distributions, with the result that if x(t) is a gaussian process, so is y(t).

Of even greater consequence is the empirically observed fact that nongaussian inputs tend to become more nearly gaussian as a result of linear filtering. If the input were nongaussian white noise, one could refer to Eq. (H-64) and invoke the central limit theorem to argue that, as the filtering bandwidth is decreased, the number of terms in the sum for y(t) which make a significant contribution increases, and thus the distribution of y(t) should approach the normal, regardless of the distribution of the $x(\tau_i)$. In fact, this tendency is observed for nonwhite inputs as well; so the gaussian random process has the singular position of being that process toward which many others tend as a result of linear filtering.

The most evident exception to this rule is a random ensemble in which every member contains a periodic component of the same period. Low-pass linear filtering tends to reduce these periodic signals to their fundamental sinusoidal components, and a sinusoid does not display the normal distribution of amplitudes. But if every member of a random ensemble has a periodic component of the same period, the process contains nonzero power at the fundamental frequency of these periodic components, and perhaps at some of the harmonics of this frequency as well. Nonzero power at any discrete frequency implies infinite power density at that frequency. Thus it might be said that if a random process has a finite power density spectrum, it may be expected to approach a gaussian process as a result of low-pass linear filtering. Unfortunately, it does not seem possible to phrase this general statement in quantitative terms.
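This gaussianization under low-pass filtering is easy to observe numerically. The sketch below (an assumed example; the binary input and Butterworth filter are arbitrary choices) filters a decidedly nongaussian white input and checks how nearly normal the output amplitude distribution is.

```python
import numpy as np
from scipy import signal, stats

# Minimal sketch (assumed example): low-pass filter a decidedly nongaussian
# white input (a random +/-1 sequence) and compare the output amplitude
# distribution with a normal distribution of the same mean and variance.
rng = np.random.default_rng(7)
x = rng.choice([-1.0, 1.0], size=1_000_000)     # binary white noise input

# Narrow low-pass filter: many input samples contribute to each output sample.
b, a = signal.butter(4, 0.02)                   # 4th-order Butterworth, low cutoff
y = signal.lfilter(b, a, x)[10_000:]            # discard the start-up transient

z = (y - y.mean()) / y.std()
# Kolmogorov-Smirnov distance from the standard normal as a rough gaussianity check.
ks = stats.kstest(z[::100], "norm")
print(f"output kurtosis (normal = 3): {stats.kurtosis(z, fisher=False):.3f}")
print(f"KS statistic vs. normal: {ks.statistic:.4f}")
```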


INSTANTANEOUS NONLINEARITIES

The second-order statistics of the output of a static single-valued nonlinearity depend on the form of the nonlinearity,

$y = g(x)$   (H-65)

and the second-order joint probability density function for the input random process. For example, the autocorrelation function for the output process is

$\varphi_{yy}(t_1, t_1 + \tau) = \displaystyle\int_{-\infty}^{\infty}\int_{-\infty}^{\infty} g(x_1)\, g(x_2)\, f_2(x_1, t_1; x_2, t_1 + \tau)\, dx_1\, dx_2$   (H-66)

which is independent of $t_1$ if {x(t)} is stationary. If interest is centered on closed-loop systems which contain a nonlinear device, it is rarely possible, as a practical matter, to determine the second-order probability density function for the input to the nonlinearity, unless one can argue, on the basis of linear filtering, that the process should be approximately gaussian. For a stationary gaussian process, $f_2(x_1, t_1; x_2, t_1 + \tau)$ is given by Eq. (H-27) with

$m_{11} = m_{22} = \sigma^2 \qquad m_{12} = \varphi_{xx}(\tau)$

since we are considering zero-mean variables. Also,

$\rho(\tau) = \dfrac{\varphi_{xx}(\tau)}{\sigma^2}$

Equation (H-27) becomes

$f_2(x_1, t_1; x_2, t_1 + \tau) = \dfrac{1}{2\pi\sigma^2\sqrt{1 - \rho^2}} \exp\!\left[-\dfrac{x_1^2 - 2\rho x_1 x_2 + x_2^2}{2\sigma^2(1 - \rho^2)}\right]$

Thus, for a gaussian input, Eq. (H-66) is written

$\varphi_{yy}(\tau) = \dfrac{1}{2\pi\sigma^2\sqrt{1 - \rho^2}} \displaystyle\int_{-\infty}^{\infty}\int_{-\infty}^{\infty} g(x_1)\, g(x_2)\, \exp\!\left[-\dfrac{x_1^2 - 2\rho x_1 x_2 + x_2^2}{2\sigma^2(1 - \rho^2)}\right] dx_1\, dx_2$   (H-71)

The evaluation of this double integral can be systematized through the expansion of the gaussian joint probability density function into a double


series of Hermite polynomials. These functions are used in varying forms; perhaps the most convenient for the present purpose is

$H_k(u) = (-1)^k \exp\!\left(\dfrac{u^2}{2}\right) \dfrac{d^k}{du^k}\!\left[\exp\!\left(-\dfrac{u^2}{2}\right)\right]$   (H-72)

The first few of these functions are

$H_0(u) = 1$   (H-73a)

$H_1(u) = u$   (H-73b)

$H_2(u) = u^2 - 1$   (H-73c)

$H_3(u) = u^3 - 3u$   (H-73d)

and in general,

$H_{k+1}(u) = u H_k(u) - k H_{k-1}(u)$   (H-74)

These functions form a complete set and are orthogonal over the doubly infinite interval with respect to the weighting function $\exp(-u^2/2)$. The orthogonality conditions are

$\displaystyle\int_{-\infty}^{\infty} \exp\!\left(-\dfrac{u^2}{2}\right) H_k(u)\, H_m(u)\, du = \begin{cases} 0 & m \ne k \\ \sqrt{2\pi}\, k! & m = k \end{cases}$   (H-75)

Expansion of the gaussian probability density function in terms of these Hermite polynomials gives (Ref. 1)

$f_2(u_1, u_2) = \dfrac{1}{2\pi} \exp\!\left(-\dfrac{u_1^2 + u_2^2}{2}\right) \displaystyle\sum_{k=0}^{\infty} \dfrac{\rho^k}{k!}\, H_k(u_1)\, H_k(u_2)$   (H-76)

where $u_1 = x_1/\sigma$ and $u_2 = x_2/\sigma$. With this expansion, Eq. (H-71) becomes

$\varphi_{yy}(\tau) = \displaystyle\sum_{k=0}^{\infty} k!\, a_k^2\, [\rho_{xx}(\tau)]^k$   (H-77)

where

$a_k = \dfrac{1}{\sqrt{2\pi}\, k!} \displaystyle\int_{-\infty}^{\infty} g(\sigma u)\, \exp\!\left(-\dfrac{u^2}{2}\right) H_k(u)\, du$

This expresses the output autocorrelation function as a power series in the normalized input autocorrelation function. Note that $H_k(u)$ is odd for k odd and is even for k even; thus, in the common case of an odd nonlinearity, $a_k = 0$ for k even.
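The expansion of Eq. (H-77) lends itself to direct computation. The sketch below (an assumed example; the unit limiter and the value of ρ are arbitrary choices) evaluates the coefficients $a_k$ by quadrature, sums the series, and compares the result with direct numerical evaluation of the double integral of Eq. (H-71).

```python
import numpy as np
from math import factorial

# Minimal sketch (assumed example): evaluate the series of Eq. (H-77) for a
# unit limiter g(x) = clip(x, -1, 1) driven by a zero-mean, unit-variance
# gaussian input, and compare against direct numerical evaluation of Eq. (H-71).
sigma, rho = 1.0, 0.5
g = lambda x: np.clip(x, -1.0, 1.0)

def hermite(k, u):
    """Probabilists' Hermite polynomials from the recurrence of Eq. (H-74)."""
    h_prev, h = np.ones_like(u), u
    if k == 0:
        return h_prev
    for n in range(1, k):
        h_prev, h = h, u * h - n * h_prev
    return h

# Coefficients a_k by quadrature over the weighting function exp(-u^2/2).
u = np.linspace(-8.0, 8.0, 4001)
w = np.exp(-u**2 / 2)
K = 12
a = [np.trapz(g(sigma * u) * w * hermite(k, u), u) / (np.sqrt(2 * np.pi) * factorial(k))
     for k in range(K)]
phi_series = sum(factorial(k) * a[k]**2 * rho**k for k in range(K))

# Direct evaluation of the double integral of Eq. (H-71) on a coarser grid.
v = np.linspace(-8.0, 8.0, 801)
v1, v2 = np.meshgrid(v, v)
f2 = np.exp(-(v1**2 - 2 * rho * v1 * v2 + v2**2) / (2 * sigma**2 * (1 - rho**2))) \
     / (2 * np.pi * sigma**2 * np.sqrt(1 - rho**2))
phi_direct = np.trapz(np.trapz(g(v1) * g(v2) * f2, v, axis=1), v)

print(f"series (H-77): {phi_series:.5f}   direct (H-71): {phi_direct:.5f}")
```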


It is clear that the output of the nonlinearity contains power at higher frequencies than the input. The normalized autocorrelation function $\rho_{xx}(\tau)$ has a magnitude less than or equal to unity everywhere; so the powers of this function appearing in Eq. (H-77) fall off to zero faster than the function itself. For example, if the input process has the familiar exponential autocorrelation function, $\rho_{xx}(\tau)^3$ will decay three times faster, as indicated in Fig. H-4. But this contribution to the output autocorrelation function has a power spectral density function which is three times wider than that of the input process. The higher-ordered terms in the expansion of the output autocorrelation function make still broader-band contributions to the output power spectral density function. This is in a way analogous to the response of a nonlinear device to a sinusoidal input, which consists of a fundamental frequency component plus higher-frequency harmonics. Other statistics of the output of the nonlinearity can also be expressed in terms of these $a_k$. The mean of the output is

$\bar{y} = a_0 = \dfrac{1}{\sqrt{2\pi}} \displaystyle\int_{-\infty}^{\infty} g(\sigma u)\, \exp\!\left(-\dfrac{u^2}{2}\right) du$

in the case of an unbiased stationary gaussian input. Also, in this case, the input-output cross-correlation function is

$\varphi_{xy}(\tau) = \sigma \displaystyle\sum_{k=0}^{\infty} \dfrac{\rho^k}{k!} \left[\dfrac{1}{\sqrt{2\pi}} \int_{-\infty}^{\infty} u_1 \exp\!\left(-\dfrac{u_1^2}{2}\right) H_k(u_1)\, du_1\right] (k!\, a_k) = a_1 \sigma\, \rho_{xx}(\tau) = \dfrac{a_1}{\sigma}\, \varphi_{xx}(\tau)$

using Eq. (H-73b) and the orthogonality conditions of Eq. (H-75). The input-output cross-correlation function for a static single-valued nonlinearity with an unbiased gaussian input is thus found to be just a constant times the input autocorrelation function, as it is for a static linear gain. In the linear case, the constant is the gain, whereas in the nonlinear case, the constant depends both on the nonlinearity and on the rms value of the input.


REFERENCES

1. Cramer, H.: "Mathematical Methods of Statistics," Princeton University Press, Princeton, N.J., 1946, p. 133.
2. Davenport, W. B., Jr., and W. L. Root: "An Introduction to Random Signals and Noise," McGraw-Hill Book Company, New York, 1958.
3. Laning, J. H., Jr., and R. H. Battin: "Random Processes in Automatic Control," McGraw-Hill Book Company, New York, 1956.
4. Lee, Y. W.: "Statistical Theory of Communication," John Wiley & Sons, Inc., New York, 1960.

Figure H-1 The uniform probability density function.

Figure H-2 The normal probability density function.


Figure H-3 Members of the ensemble {x(t)}.


Figure H-4 Characteristics of the nonlinearity input and output random processes. (a) Autocorrelation functions: input autocorrelation function and output term of order 1; output term of order 3. (b) Power spectral density functions (plotted against ω): input power spectral density function and output term of order 1; output term of order 3.