Clarifying Chaos III: Chaotic and Stochastic Processes, Chaotic Resonance, and Number Theory



  • Tutorials and Reviews

International Journal of Bifurcation and Chaos, Vol. 9, No. 5 (1999) 785-803
© World Scientific Publishing Company



RAY BROWN
4650 N. Washington Blvd., #606, Arlington, VA 22201, USA

LEON O. CHUA
Department of Electrical Engineering and Computer Sciences,
University of California, Berkeley, CA 94720, USA

    Received September 5, 1998; Revised November 18, 1998

In this tutorial we continue our program of clarifying chaos by examining the relationship between chaotic and stochastic processes. To do this, we construct chaotic analogs of stochastic processes and stochastic differential equations, and discuss estimation and prediction models. The conclusion of this section is that from the composition of simple nonlinear periodic dynamical systems arise chaotic dynamical systems, and from the time-series of chaotic solutions of finite-difference and differential equations are formed chaotic processes, the analogs of stochastic processes. Chaotic processes are formed from chaotic dynamical systems in at least two ways. One is by the superposition of a large class of chaotic time-series. The second is through the compression of the time-scale of a chaotic time-series. As stochastic processes that arise from uniform random variables are not constructable, and chaotic processes are constructable, we conclude that chaotic processes are primary and that stochastic processes are idealizations of chaotic processes.

Also, we begin to explore the relationship between the prime numbers and chaos, and the possible role the primes may play in the formation of chaos.

    1. Introduction

In this tutorial we examine the relationship between chaotic and stochastic processes. In Sec. 1 we discuss the definition of stochastic and chaotic processes and the construction of uniform random variables. In Sec. 2 we show how chaotic variables are constructed and how they are related to the more familiar chaotic time-series. In Sec. 3 we construct the chaotic analog of white noise, and the first form of Brownian motion. In Sec. 4 we construct the chaotic analog of conventional Brownian motion, and in Sec. 5 we construct chaotic analogs of wide-sense stationary processes, Poisson, and other processes. In Sec. 6, we show how chaotic analogs of stochastic differential equations are solved and discuss the role of the Ito and Stratonovich stochastic calculus. In Sec. 7, we look at chaotic processes as driving forces. In Sec. 8 we discuss filtering and estimation briefly. In Sec. 9, we review the Theory of Chaos, and in Sec. 10 we touch on the potential role of prime numbers in chaos.

1.1. Definitions of chaotic and stochastic processes

  • 786 R. Brown & L. O. Chua

A stochastic process is defined as a one-parameter family of random variables, and a random variable is defined as a measurable function. The construction of stochastic processes is reducible to the construction of random variables, and the construction of random variables is reducible to the construction of a uniform random variable.

Typically, when a sample of a uniform random variable is needed, one uses a pseudo-random number generator (which, in fact, is a chaotic dynamical system) to make a "random" selection. For example, f(x) = x on the interval [0, 1] is a measurable function that no one is going to want to call a random variable, even though it fits the formal definition. However, by sampling this function randomly, we are able to work with it as a random variable in the sense of probability theory. This is because when dealing with random variables, we only deal with the range of the function, not its domain. It is the frequency with which a function takes on its values that is important to probability theory, not precisely when, in time-order, those values are assumed by the function. Because of this last point, the concept of a probability distribution evolved.

A chaotic process may be defined as a one-parameter family of measurable functions which arise from chaotic dynamical systems. Contrary to probability theory, in dynamics we are interested in the time-order of the values of the functions we encounter, and so the concept of function is more appropriate than the concept of a probability distribution. In order to discuss both subjects without constantly translating from functions to distributions and back, we will discuss both from the point of view of functions, the language more appropriate to the study of dynamics. To do this, we will need to construct a measurable function such that if we were to sample it sequentially, at a very high sampling rate, rather than randomly, we will still obtain a random sample of the function's values.

One more word about random selections. If we are to firmly establish the connection between chaotic and stochastic processes, the role of random selections must be more closely examined. When forming a realization of a stochastic process, for each time t one must make a random selection of a value of a measurable function, or present the measurable function explicitly for evaluation. If we were to insist on the measurable function being presented for evaluation, presumably the inverse of its distribution function would be presented and a random evaluation would be made using a pseudo-random number generator. But this introduces a small measure of mystery into the function evaluation process. For example, how is this process formally defined mathematically? It appears to be a function composition between the inverse of the distribution function and a pseudo-random number generator on a computer, such as RND in Basic. If f is the inverse of the distribution function in question, then f(RND(1)) is the composition. But the exact definition of the function RND is hidden from most of its users. Since RND is a periodic function, it has only a finite number of values, say N. Successive calls to RND are thus equivalent to function evaluations at the successive points 1/N, 2/N, 3/N, . . . , 1, unless we randomize the timer function, in which case the evaluation starts at some offset from 1/N determined by the computer time function in fractions of seconds of a day. After adding back these details, the first evaluation of the random variable in question can be formally expressed as f(RND(1)(1/N + t0)), where t0 is a function of the computer time stamp.

But what if we need another random variable with the same distribution as f to get a realization of the process? For example, what if it is a discrete process of independent identically distributed random variables? Now we must have a denumerable set of them. We cannot formally use the same distribution function f, since this is not a different measurable function. Typically, it is assumed that if the random selection is independent at each step, then we can use f over again without any harm. But we are trying to be exact and formal; thus we must use another function besides f, otherwise everything is imprecise. At this point it is clear that actually presenting the denumerable set of independent, identically distributed random variables poses a practical, and perhaps also a formal, problem. If we do not exhibit the denumerable set, then we are only dealing with stochastic processes in some ideal sense, while burying some interesting formalities in the process of making a random selection. Any formal process that will clear this up will necessarily lead to the use of chaotic dynamical systems to construct random variables, and so lead us to the position that stochastic processes are idealizations of what are, in fact, chaotic processes. We defer the details to the following sections.
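The composition f(RND(·)) just discussed can be made concrete. Below is a minimal sketch (Python; the helper names are ours, purely illustrative) using the Lehmer generator x → 16807 · x mod (2^31 − 1), which this paper later cites as a good simple pseudo-random number generator, composed with a hypothetical inverse distribution function, here the exponential for definiteness.

```python
import math

M = 2147483647  # 2**31 - 1, a Mersenne prime
A = 16807       # 7**5

def rnd(state):
    """One step of the Lehmer generator; returns (uniform sample, new state)."""
    state = (A * state) % M
    return state / M, state

def inv_cdf(u):
    """Hypothetical inverse CDF, standing in for the f of the text.
    Here: the exponential distribution with rate 1."""
    return -math.log(1.0 - u)

def sample(n, seed=12345):
    """n draws of the composition f(RND(...)) discussed in the text."""
    out, state = [], seed
    for _ in range(n):
        u, state = rnd(state)
        out.append(inv_cdf(u))
    return out

xs = sample(100000)
print(sum(xs) / len(xs))  # the mean of an Exp(1) variable is 1, so near 1.0
```

Note that the whole "random" selection is a deterministic orbit of the chaotic map x → Ax mod M; the appearance of randomness is exactly the point the text is making.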

  • Chaotic and Stochastic Processes, Chaotic Resonance, and Number Theory 787

1.2. The problem of constructing random variables as measurable functions

In this section, we review the problem of constructing a measurable function on the interval [0, 1] which looks like a uniform random variable, i.e. the graph of f on the interval [0, 1] must look like a traditional white-noise time series. The problem that we discuss in this construction applies to random variables in general, but we restrict our discussion to uniform random variables so that we are not dealing in generalities. To carry out this discussion some measure theory is required. We have kept this to a minimum with the aid of Professor Morris Hirsch, who greatly simplified this discussion.

Ideally, we seek a measurable function defined on the interval [0, 1] with the following properties:

(1) It is Lebesgue measurable;
(2) It is bounded;
(3) It has a uniform distribution on every subinterval;
(4) It is not constant almost everywhere.

However, no such function exists. The proof is as follows: Suppose f is such a function. Then by properties 1 and 2 it is integrable in the sense of Lebesgue. By property 3 we have

$s^{-1} \int_t^{t+s} f(x)\,dx = \int_0^1 f(x)\,dx = c$    (1)

and so

$\int_t^{t+s} (f(x) - c)\,dx = 0$    (2)

for all s, t with t, t + s ∈ [0, 1], which implies¹ f(x) = c almost everywhere, in violation of property 4. We must conclude that if a function can be constructed which we would agree is a uniform random variable, it cannot be a measurable function.

While a uniform random variable is not mathematically constructable as a measurable function, a reasonable candidate can be proven to exist as a nonmeasurable function on the interval [0, 1]. The proof follows a standard argument using the axiom of choice. Partition [0, 1] by the equivalence relation x ∼ y if x − y ∈ Q, where Q is the set of rationals. Select one point from each equivalence class and form the set P. For x ∈ P, the sets {x + Q} are the mentioned partition. Define f(x + r) = r for x ∈ P. f is uniform in the sense that for every x ∈ P, f(x + Q) = Q, a dense set, while {x + Q} is also a dense set. Thus, f takes on every value of a dense subset of [0, 1] equally often. However, f is not measurable since f⁻¹(0) = P.²

The nonconstructability of a uniform random variable makes better sense than, at first, it seems. If we could construct an idealized white-noise process, then Brownian motion could not be the integral of such a process, since the integral would be a constant times t. It is the imperfections in white-noise models, and nature, that give rise to forms of Brownian motion. These imperfections are the result of our models being chaotic processes rather than stochastic processes. Clearly, by reviewing how stochastic processes are modeled on a computer, we conclude that a chaotic process is the mathematical approximation of a stochastic process. Consequently, one form of Brownian motion can be realized as the integral of a chaotic "noise" process, even though it cannot be realized as the integral of stochastic white noise.

2. The Chaotic Analog of a Uniform Random Variable

In this section we construct the chaotic analog of a uniform random variable. To do this, we need a method of function evaluation that will give results similar to making a random selection. Since a pseudo-random selection method can be based on a modulo 1 multiplication scheme such as x → 2 · x mod (1) [a chaotic dynamical system with Lyapunov exponent log(2)], we might pursue the following line of development. Given a distribution function F which is invertible, construct its inverse G, fix the number n as a large integer, and form the function G(T^n(x)) on the interval [0, 1], where T(x) = 2 · x mod (1). If F(x) = x, a uniform distribution, then this results in the construction of a measurable function on [0, 1] having a large, but finite, number of discontinuities, and nearly satisfies condition 3. But this is clearly unsatisfactory from a theoretical point of view. In particular, if a

¹ See [Royden, 1988, Chap. 5, Lemma 8].
² The inverse image of a measurable set, in this case the point 0, is not a measurable set, i.e. P is not measurable; thus f is not measurable by the definition of a measurable function.


measurable function on [0, 1] is to be considered a random variable, it should not be possible to sample this function at a uniform sampling rate and obtain a constant value. In fact, the average of every large uniform sample should be the average of the random variable. This is not true of G(T^n(x)). We overcome this difficulty with the following construction:

$g(x) = \sum_{n=0}^{\infty} a_n f(T^n(x))$    (3)

where the a_n are the terms of an absolutely convergent series, f is a function on [0, 1] which is not constant almost everywhere, and T is a measure-preserving multiplication mod 1 mapping of [0, 1] onto itself. This series is uniformly and absolutely convergent. If f(x) = x, we have the chaotic analog of a uniform random variable. In the case where T(x) = 2x mod (1), the number of discontinuities of the nth term of this series is greater than 2^n. Figure 1 is the graph for this case where a_n = 0.165^(19−n), T(x) = 16807.123 · x mod (1), and only 20 terms are used. We use 16807.123 in this example so that we do not have to resort to double precision to get a similar result using a large number of terms of x → 2 · x mod (1). The choice of 16807 arises from the fact that x → 16807 · x mod (2147483647) is a good choice for a simple pseudo-random number generator, see [Brown & Chua, 1996]. The red, nearly straight line across the figure is the integral of g as a function of x. As g becomes more uniform, this integral must approach a straight line whose slope is the mean value of g on [0, 1].

By the following example, we conclude that a random variable results from the superposition of the dynamics of a large number of chaotic processes.

Example. Superposition of Chaotic Time-series. The superposition of a discrete set of chaotic time-series, evaluated at the time steps l · h for h a small increment of time, can be written as

$g(l \cdot h) = \sum_{k=1}^{\infty} a_k T^{l}(x_k)$    (4)

where the x_k are a dense set of initial conditions, T is a chaotic map, and the a_k are arbitrary constants. Assuming ergodicity, this could be considered to be of the form

$g(l \cdot h) = \sum_{k=1}^{\infty} a_k T^{l}(T^{k}(x))$    (5)

Fig. 1. The graph of g(x) of Eq. (3), where a_n = 0.165^|n−19| and T(x) = 16807.123 · x mod (1), and only 20 terms are used. Graphing g gives a resulting image similar to making a random selection of g.

for some fixed x. Rearranging this we get

$g(l \cdot h) = \sum_{k=1}^{\infty} a_k T^{k+l}(x)$    (6)

or

$g(l \cdot h) = \sum_{k=1}^{\infty} a_k T^{k}(T^{l}(x))$    (7)

By fixing l, this can be considered a function of the initial condition x,

$g(x) = \sum_{k=1}^{\infty} a_k T^{k+l}(x)$    (8)

Another discrete form is

$g(x) = \sum_{k=1}^{\infty} a_k T^{k}(x)$    (9)

If we choose the logistic map as T we get

$g(x) = \sum_{k=1}^{\infty} a_k \sin^2(2^k\, 2\pi x)$    (10)

With the left unilateral binary shift we get

$g(x) = \sum_{k=1}^{\infty} a_k \{2^k x\}$    (11)


and for the map x_{n+1} = 2x_n^2 − 1 we get

$g(x) = \sum_{k=1}^{\infty} a_k \cos(2^k\, 2\pi x)$    (12)

The coefficients a_k have the role of arbitrary constants, as occur in the theory of ordinary differential equations, to be determined by measurements. For chaotic maps with positive Lyapunov exponents, as these examples have, such series can define functions which are C∞; continuous; continuous but nowhere differentiable; or nowhere continuous.
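These series are easy to evaluate numerically. The truncations below (Python; the choice a_k = 0.5^k is our illustrative choice, not from the paper; any absolutely summable sequence works) implement Eqs. (10)-(12); since the coefficients sum to less than 1, each truncation is bounded by 1 in absolute value, while the 2^k frequencies make the graphs increasingly rough.

```python
import math

def g_logistic(x, terms=25):
    """Eq. (10): series built from the logistic map."""
    return sum(0.5 ** k * math.sin(2 ** k * 2 * math.pi * x) ** 2
               for k in range(1, terms + 1))

def g_shift(x, terms=25):
    """Eq. (11): series built from the left unilateral binary shift."""
    return sum(0.5 ** k * ((2 ** k * x) % 1.0) for k in range(1, terms + 1))

def g_chebyshev(x, terms=25):
    """Eq. (12): series built from the map x -> 2x^2 - 1."""
    return sum(0.5 ** k * math.cos(2 ** k * 2 * math.pi * x)
               for k in range(1, terms + 1))

vals = [g_chebyshev(j / 997.0) for j in range(997)]
print(min(vals), max(vals))  # stays within [-1, 1] since sum |a_k| < 1
```

With coefficients decaying slowly enough relative to the 2^k frequencies, such sums are continuous but nowhere differentiable, as for the classical Weierstrass series the text alludes to.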

Another variation of Eq. (6) is

$g(x, l \cdot h) = \sum_{k=1}^{\infty} a_{k+l} T^{k}(x)$    (13)

which gives a moving average representation to be encountered later.

As implied by the above examples, Eq. (3) cannot be the solution of any finite system of ODEs, since the arbitrary constants a_n form an infinite set.

    2.1. Generalized chaotic variables

The construction of Eq. (3) can be generalized:

$g(x) = \sum_{n=0}^{\infty} a_n f(T_n(x))$    (14)

where T_n : [0, 1] → [0, 1] are different dynamical systems, at least some, but not necessarily all, of which are chaotic, and the a_n are the terms of an absolutely convergent series. If f is the inverse of a Gaussian distribution function, then this gives a generalization of a Gaussianly distributed chaotic variable. A continuous version of this construction is

$g(x) = \int_0^{\infty} a(s)\, f(T_s(x))\, ds$

2.2. Chaotic variables, chaotic time-series, and realizations of chaotic or stochastic processes

One difference between a random variable and a realization of an independent, identically distributed (iid) stochastic process is that a random variable is defined formally on a measure space, such as [0, 1], of measure 1, while a realization of an iid stochastic process is defined for all time. The stochastic properties of a random variable must also exist for realizations of iid stochastic processes. Similar distinctions hold for chaotic variables and realizations of chaotic processes. With these distinctions made, we proceed to the main discussion of this section.

An important distinction must be made between chaotic variables, and later chaotic processes, and solutions of chaotic difference and differential equations. We conventionally refer to the latter as chaotic time-series. Chaotic variables arise from the superposition of a large number of chaotic time-series. We recognize that this distinction is not yet precise, but it is adequately descriptive.

Chaotic time-series are the building blocks, and thus the source, of chaotic variables, chaotic processes, and, we argue, stochastic processes. Chaotic time-series lead to chaotic variables and random variables in at least two ways. The preceding construction of chaotic variables is one route. A second route is a result of the ratio of the sampling rate to the time-scale of the series. To make this point clear, we graph a chaotic time-series using three sampling rates and compare the integral of the squares of the sampled points for each of the three techniques. Figure 2 shows the three graphs. In each graph, the red curve is the plot of the average of the sum of squares of the time-series samples. This is essentially a simple integration scheme. For a highly oscillating function we have the approximate relationship:

$\int_0^t g(x)^k\,dx \approx c_k \cdot t$    (15)

for k = 2, as was found in Fig. 1. In Fig. 2(a), the time-series is sampled at a rate sufficient to reveal its true form, i.e. sampling at any higher rate provides no new information. The integral of the square is, therefore, highly variable. In Fig. 2(b) the time-scale is compressed and the sampling rate held constant, so that the total number of points in the graphs is constant. The integral of the square reflects the increasing degree of complexity in the time-series. In Fig. 2(c), the time-series appears almost as complex as Fig. 1, and the integral reflects this increase in variability of the time-series as it converges toward a straight line.

This example shows that a chaotic time-series may be sampled in such a way as to begin to appear stochastic. Alternatively, if a chaotic time-series has a very high oscillation rate it can approach a realization of a chaotic or stochastic variable in its appearance. For example, this can happen when





Fig. 2. (a) In this figure we graph the time-series of one component of the solution of a twist-and-flip map equation having gyration conductance function log(0.69 · exp(sin(2 · π · r))/r) for r ≤ 1 and 5 for r > 1; in (b) we have compressed the time-scale by a factor of 150, and in (c) the time-scale is compressed by a factor of 700.

the system is time compressed, as when one samples the time-one map of a chaotic time-series in place of the time-series itself. This point of view provides a physical interpretation of the Smale-Birkhoff theorem and illustrates the difficulty in distinguishing between chaotic time-series and realizations of chaotic or stochastic processes. It also explains why, if the function sin(t) is sampled in exponential time, it will produce a chaotic sequence as in the logistic equation.

We note that Eq. (15), for k = 2, is not a sufficient condition to identify a realization of a chaotic or stochastic process, as demonstrated by the following example:

$\int_0^x \sin^2(\lambda s)\,ds \approx c \cdot x$    (16)

for large λ. This function has only one very high frequency, and a uniform sampling scheme will reveal that this function is not a realization of a chaotic or stochastic process.

A function having a large number of frequencies and which satisfies Eq. (15) for k = 2 is also not necessarily a realization of a chaotic or stochastic process. What is needed is a chaotic range of frequencies and/or a chaotic range of amplitudes, combined with satisfying Eq. (15) for k = 2, for a measurable function of x to be a realization of a chaotic or stochastic process. Thus, while not defining chaotic or stochastic phenomena, Eq. (15) for k = 2 can be used to distinguish between chaotic time-series and realizations of chaotic and stochastic processes. However, if Eq. (15) holds for a large number of integer values of k, then it may be sufficient to define a chaotic or stochastic process. This is still an open question.

A conclusion of this example is that the complete distinction between realizations of chaotic or stochastic processes and chaotic time-series is reduced to the ratio of the sampling rate to the time scales: The limit of a chaotic time-series, as we compress the time-scale and hold the sampling rate constant, appears to be a realization of a stochastic or chaotic process.

Also, Fig. 1 illustrates that the limit of a superposition of chaotic time-series, as the number of time-series approaches infinity with the time-scale and sampling rate held constant, is a chaotic variable which cannot be distinguished from a random variable by any measurement process.


This suggests that there is a duality, when considering chaotic dynamical systems, between compression of the time-scale and an increase in the number of systems: The dynamical features of a single chaotic system under time-scale compression can be equivalent to the combined dynamics of a large ensemble of chaotic dynamical systems considered in uncompressed time.

This duality concept can be further illustrated. Consider the logistic equation x → 4x(1 − x). Its solution is x_n = sin²(2π 2^n θ₀). Thus the logistic time-series in real time is equivalent to sampling a periodic process in exponential time, t = 2^n, or the superposition of an uncountable number of periodic time-series, a Fourier transform sampled in real time. A chaotic variable can be equivalent to a chaotic time-series sampled in exponential time or to the superposition of an infinite number of chaotic time-series sampled in real time. Thus the nature of exponential-time sampling is to complexify dynamics in the same way as multiplying the number of dynamical systems in real time.
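The closed-form solution can be checked directly: if x = sin²θ, then 4x(1 − x) = 4 sin²θ cos²θ = sin²(2θ), so each iteration of the map doubles the angle. A short check (Python; since errors grow like 2^n under the map, only a few iterates are compared):

```python
import math

theta0 = 0.1234
x = math.sin(2 * math.pi * theta0) ** 2   # x_0 = sin^2(2*pi*theta0)
for n in range(1, 11):
    x = 4.0 * x * (1.0 - x)                              # iterate the map
    closed = math.sin(2 * math.pi * 2 ** n * theta0) ** 2  # closed form x_n
    assert abs(x - closed) < 1e-6, (n, x, closed)
print("closed form matches for 10 iterates")
```

This is the duality in miniature: iterating the map in real time and sampling the periodic function sin²(2πθ) at the exponential times 2^n θ₀ are the same computation.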

A real-world example provides an interesting consequence of this duality. Using a uniform sampling rate to measure a periodic signal source that is initially accelerating at an exponential rate results in obtaining a chaotic sample of the signal. However, as the velocity of the signal source approaches the speed of light, order is restored.

3. Chaotic Analogs of Stochastic Processes

Starting from Eq. (3) we may form an analog of a stochastic process as follows:

$g(x, t) = g_t(x) = \sum_{n=0}^{\infty} a_n f(T^n(x + t))$    (17)

If f is the inverse of a Gaussian distribution, then the process is the chaotic analog of a Gaussian stochastic process. For any choice of x, this gives a Gaussian process in t, and vice versa, for T a measure-preserving chaotic map of the interval [0, 1]. For any choice of the sequence a_n, so long as the sum of the a_n converges, we can obtain a mean-zero, variance-1 process by normalizing g(x, t) for t = 0.

    As a result, the entire process has these fea-tures. This works because g is a function of x + t.

Fig. 3. The covariance of g_t(x) and g_s(x) of Eq. (18), where a_n = 0.165^n, t = 0.5, 0 < s < 1, and T(x) = {229.1 · x}. 60 terms are used, and f(x) = tan(2 · x − 1). The deviations from white-noise covariance are an important feature in chaotic processes.

By choosing

$g(x, t) = g_t(x) = \sum_{n=0}^{\infty} a_n f(T^n(x \exp(t)))$    (18)

we obtain another chaotic process. Figure 3 is a graph representing the degree of independence of g_{s+t} and g_s for Eq. (18). The lack of correlation between the process at s and t is due to the sensitive dependence on initial conditions of the chaotic dynamical system used in the construction. In particular, the vertical axis is the covariance:

$h(s) = \int_0^1 g_{0.5}(x)\, g_s(x)\,dx$    (19)

where s ranges from 0 to 1. If we could graph an ideal white-noise stochastic process, we would not see the small irregularities that we see in Fig. 3. However, the irregularities in Fig. 3 are to be expected in chaotic processes. They represent chaotic resonance, which can appear when chaotic dynamics reinforce one another in unpredictable ways.

If in Eq. (17), f is the inverse of a Gaussian distribution function, we get the analog of a Gaussian white-noise chaotic process. For any t, and for fixed x, this is a function whose absolute value is measurable, and is thus integrable. The integral

$\int_0^t g_s(x)\,ds$    (20)

could be considered the chaotic analog of Brownian motion, if Brownian motion were actually the


integral of white noise. As such, it is smoother than stochastic Brownian motion in that it is differentiable.

In analogy with stochastic processes, we will refer to Eq. (17) as W(x, t) and Eq. (20) as B0(x, t) when f is the inverse of a Gaussian distribution function. The dependence on x, the "random" variable, is shown explicitly to distinguish it from its stochastic analog.

We do not consider Eq. (20) to be the final word on the chaotic analog of Brownian motion; hence we label it B0(x, t) and have reserved the notation B(x, t) for a form of Brownian motion to be discussed in the next section. There we specifically examine Brownian motion as it is used in stochastic processes and, while Brownian motion models can be constructed by using coarse approximations of the integral of white noise, this approximation does not converge to Brownian motion in the limit. Hence, there is something missing when we start with the discrete theory of Brownian motion and try to pass to the limit to obtain continuous Brownian motion.

We conclude by noting that there are as many chaotic processes as one may imagine. For example, in analogy with Eqs. (17) and (18) we may consider the processes

$g(x, t) = g_t(x) = \sum_{n=0}^{\infty} a_n f(T^n(x(1 + t^2)))$    (21)

and

$g(x, t) = g_t(x) = \sum_{n=0}^{\infty} a_n h_n(t) f(T^n(x))$    (22)

    4. Chaotic Analog of Brownian Motion

Constructing Brownian motion directly, without constructing white noise, still requires constructing discrete white noise, and thus a uniform random variable; hence it is not constructable as a measurable process. However, if we assume that white noise can be constructed, then, as we have seen, Brownian motion cannot be the integral of white noise. In order to understand the conventionally understood relationship of Brownian motion to white noise, we examine a typical construction, which assumes the existence of a uniform random variable, such as that found in [Karlin & Taylor, 1975, Chap. 7]. This construction depends on the construction of a continuous function that is nowhere

Fig. 4. Ten sample paths for the Brownian motion model constructed from the Schauder functions following [Karlin & Taylor, 1975], where a random number generator is used to obtain the Gaussian "random" samples.

differentiable. Such constructions are a subject of classical analysis going back to Weierstrass. The construction of Karlin and Taylor starts with the Haar functions, derives the Schauder functions S(t), and then Brownian motion. Given S(t), the representation from Karlin and Taylor of B(t) is

$B(t) = \sum_{k=1}^{\infty} \xi_k S_k(t)$    (23)

where the ξ_k are independent mean-zero, variance-one, Gaussian random variables. In Fig. 4 we show the paths of ten sample functions using the Schauder-function construction of Brownian motion.
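A minimal sketch of this construction (Python; illustrative, with an arbitrary seed): rather than writing the Schauder functions explicitly, it uses the equivalent Lévy midpoint-refinement form, in which each bisection level adds a tent-shaped (Schauder) correction weighted by an independent Gaussian. Summing level by level is exactly the series of Eq. (23).

```python
import random

def brownian_path(levels, rng):
    """Dyadic Brownian path on [0, 1] via Levy midpoint refinement.
    Each level fills in interval midpoints: the conditional mean of the
    endpoints plus an independent Gaussian of variance (interval length)/4,
    i.e. one more level of the Schauder series of Eq. (23)."""
    pts = {0.0: 0.0, 1.0: rng.gauss(0.0, 1.0)}
    step = 1.0
    for _ in range(levels):
        for a in [i * step for i in range(int(1 / step))]:
            b = a + step
            m = (a + b) / 2.0
            pts[m] = 0.5 * (pts[a] + pts[b]) + rng.gauss(0.0, (step / 4.0) ** 0.5)
        step /= 2.0
    return pts

rng = random.Random(42)
vals = [brownian_path(3, rng)[0.5] for _ in range(4000)]
var = sum(v * v for v in vals) / len(vals)
print(var)  # should be close to Var[B(0.5)] = 0.5
```

With the seed fixed, the sample variance of B(0.5) over 4000 paths should come out close to the Brownian value t = 0.5, and the paths look like the ten samples of Fig. 4.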

We clarify the exposition of Karlin and Taylor with the observation that all of the Schauder functions can be obtained from the single function

$h(t) = 1 - |2\{t\} - 1|$

by iteration of a chaotic map. Here {t} is the fractional part of t. Specifically, by considering the iterates of h under the unilateral shift T : t → {2 · t}, we clarify their presentation to get

$B(x, t) = W(x, t_0) \cdot t + \sum_{k=1}^{\infty} W_k(x, t) \cdot h(T^{k-1}(t))$    (BRN)

where

$W_k(x, t) = \sum_{j=1}^{2^{k-1}} W(x, t_j)\, \chi_j(t)$


W(x, t) is a Gaussian white-noise process, and χ_j(t) is the indicator function for the interval [2j−1, 2j]. We note that in the construction found in [Karlin & Taylor, 1975], W_k → W in the L¹ sense, and that W, in their presentation, must be thought of as ideal white noise.

Our rearrangement of the classical Brownian motion formula suggests that it is possible to draw a closer relationship to chaos if we can construct the indicator functions from dynamical systems. This can be done as follows. By the formula f ∨ g we will mean the maximum of f and g. Next, let f(t) = sgn(sin(2πt)), and f_j(t) = f(T^j(t)) = f_{j−1}(T(t)), where T(t) = {2 · t}. Finally, define iteratively

$f_{j+1,k}(t) = (-1)^{(k-1)} \cdot f_{j+1}(t) \cdot (f_{j,\,1+k\,\mathrm{mod}(2^j)}(t) \vee 0)$    (24)

With this dynamical system formulation we have

$W_k(x, t) = \sum_{j=1}^{2^{k-1}} W(x, T^j(t_0))\, f_{j,k}(t)$    (25)

This rearrangement of B shows that the relationship between W and B is one of neither integral nor derivative.

If we define W in Eq. (25) by Eq. (17), Brownian motion is realized as an infinite-dimensional chaotic process.

In summary, Fig. 1 shows that as g in Eqs. (3) and (17) becomes more uniform, the integral of g approaches a straight line whose slope is the mean value of g. If we use Eq. (17) to construct a mean-zero Gaussian process, then this integral must be nearly zero. The more perfectly we construct our mean-zero Gaussian chaotic variable, the more closely its integral must approach zero. In the limit of this process, Brownian motion, as the integral of white noise, must cease altogether. If we choose to think of Brownian motion as the integral of a Gaussian chaotic white-noise process, then we obtain the function B0(x, t), which is differentiable. As the chaotic processes become more uniform, B0(x, t) → 0, and to the extent they are less uniform, B0(x, t) models conventional Brownian motion. We also find that the process B(x, t) is constructable as a chaotic process and is thus derivable from chaotic dynamics as well. However, B(x, t) and B0(x, t) only coincide in appearance for a limited range of chaotic processes.

Thus we have the chaotic analog of Brownian motion in Eq. (BRN) and Eq. (20) if we use Eq. (17) for our white-noise process.

We emphasize at this point that we are led to this model of chaotic white noise by imposing the restriction that its integral, B0(x, t), must provide a credible candidate for Brownian motion if it is approximated with a coarse numerical integration technique, and that it coincides in appearance with B(x, t) in accordance with conventional practice. Once we have made a commitment to this model, we convey with it the implications that can be derived from using the model. Among these implications is found, see Fig. 3, the phenomenon of chaotic resonance, which may be the chaotic analog of stochastic resonance, and will be discussed in the next paragraph. A second phenomenon, perhaps more unexpected, is what we are calling chaotic catastrophe, which is unrelated to what is commonly known as catastrophe theory. This is discussed in the section on financial markets. Chaotic catastrophe occurs when Brownian motion transitions from B0 to B. A third consequence of this theory is that if we start with a coarse model of a chaotic analog of white noise, B0(x, t) is not constant. If we then add more chaotic noise to our model to force it to approach a uniform Gaussian white noise, then B0(x, t) → 0 and the effects of noise are cancelled out, essentially by adding noise rather than removing it. Fourth, the determination of whether the Ito or Stratonovich theory applies can be made by determining whether the forces involved in a process are primarily frequency chaotic, leading to B0(x, t), or amplitude chaotic, leading to B(x, t). When both are involved, a hybrid theory may be appropriate.

    Stochastic resonance is described as a phenomenon whereby the presence of noise actually enhances a signal rather than obscures it. Chaotic resonance consists of coincidental correlations that appear to be random in occurrence, but which enable complex phenomena to reinforce one another in a manner that might enable the emergence of new species or other dynamical events which have an order to them, i.e. are not formless and “random”. The vanishing of B0 as noise becomes more uniform may connect these two ideas. When B0 exists, it is due to a lack of uniformity in a noise process; thus there must exist chaotic resonance. As B0 → 0 during a period of increasing noise, or decreasing chaotic resonance, a signal hidden in the noise will also appear, just as the integral of a uniform process in Fig. 1 converges to a constant multiplied by time. Thus increasing noise, i.e. making it more uniform, looks like stochastic resonance as described earlier.

  • 794 R. Brown & L. O. Chua

    Thus we see that chaos provides two mechanisms for what might be called emergence. First, through the presence of some level of chaotic resonance, complex order appears or emerges; second, from the total absence of chaotic resonance, B0 → 0 and a hidden dynamical system may emerge from an increase in the level of uniformity of noise.

    The second form of emergence may have a bearing on the natural neural noise in the human brain. For example, particular signals may selectively appear as indications of specific thought or other cognitive processes when the level of neural noise is made more uniform according to some driving force. Thus, cognitive processes may appear as a result of an increase in the uniformity of neural noise. Also, cognitive processes may appear as a result of a decrease in the level of uniformity of neural noise, as this will give rise to chaotic resonance. In both cases, a robust source of neural noise may be the ideal medium for the generation of cognitive processes. Each of these processes is linked to the other through B0. As B0 → 0 the first process is engaged; as B0 → B, the second process is engaged.

    4.1. Amplitude chaos and Brownian motion

    The graphs of the Schauder functions, S_k(t), in Eq. (23) are little tents. Each function used is nonzero on an interval of length 1/2^k and zero elsewhere. By grouping these terms by the length of the domain where the function is nonzero we obtain sums of the form

    ∑_j h(t_j) S_j(t) .   (26)

    If, instead of tent-shaped graphs, we used functions whose graphs were like 1 − cos²(2π · 2^k t) over the same interval, we would obtain a segment of a solution to a differential equation which has amplitude chaos much like Example 2, Sec. 5 of [Brown & Chua, 1998]. Solutions to such chaotic amplitude equations all have the same frequency, but with each segment of the solution over one period of the forcing function having a different amplitude, the sequence of amplitudes being chaotic. The situation is much like the solutions of the chaotic Duffing equation, except that only the amplitude changes chaotically instead of both amplitude and frequency changing.

    The net result of this discussion and that of white noise is that classical Brownian motion and white noise can be realized as the superposition of an infinite set of time-series solutions from chaotic differential equations having amplitude chaos, in the case of Brownian motion, and frequency chaos in the case of white noise.

    5. Other Stochastic Processes

    In this section we construct analogs of continuous, wide-sense stationary, Poisson, and other processes.

    5.1. Continuous chaotic processes

    To this point we have constructed chaotic analogs of stochastic processes which contain the assumed discontinuity features of stochastic processes. In this section, we construct continuous analogs and note that from a measurement point of view, it may be impossible to detect the difference between continuous and nowhere continuous stochastic or chaotic processes.

    As a special case of Eq. (17) we have

    g_t(x) = ∑_{n=0}^{∞} a_n sin(2^n · 2π(x + t))   (27)

where we choose a_n · 2^n > 1. This series converges uniformly and is thus continuous, but is not differentiable term by term. If we only use a finite number of terms, the function has all derivatives and is periodic. However, if we use a very large number of terms, then it begins to resemble white noise. Also, when we use only a finite number of terms, it must be the solution of a homogeneous linear differential equation of very high order. If we use an infinite number of terms, it must solve, in a formal sense, a second-order hyperbolic linear partial differential equation having as a boundary condition a chaotic variable. All of these features demonstrate that there may be a very fine line between periodic and nonchaotic phenomena and chaotic or stochastic processes. If we are given a sampling rate, it is always possible to use a large enough number of terms to assure that the sampling rate will fail to reveal that the equation is periodic and not classical white noise.

    With a small modification, the periodicity can be abolished. For example, we may use:

    g_t(x) = ∑_{n=0}^{∞} a_n sin(2.1^n · 2π(x + t))   (28)

  • Chaotic and Stochastic Processes, Chaotic Resonance, and Number Theory 795

    A further property of these processes is something analogous to sensitive dependence on initial conditions. Ideally this property should read: There exists a number τ such that given a point x0 and a neighborhood of the point, U_{x0}, there is another point x ∈ U_{x0} with |g_t(x) − g_t(x0)| > τ, for almost all t. While this cannot be true for a continuous function, it can be true in practical circumstances when the sampling rate, for a fixed choice of the a_n, cannot confirm or deny the continuity of the function. Thus, relative to a sampling rate, it can happen that the function may as well not be continuous. We conclude that relative to a given measuring frequency it may be impossible to decide if a process is stochastic and discontinuous or chaotic and continuous. Further, given any measurement process, we may always construct a finite-dimensional, infinitely differentiable chaotic process for which it is impossible to distinguish said process from an infinite-dimensional, totally discontinuous stochastic process. For example, it is possible to construct a twist-and-flip map where the integral curves are diamond shaped, and only the amplitude is chaotic, thus allowing us to mimic Brownian motion exactly with a superposition of a finite number of twist-and-flip chaotic processes.

    5.2. Wide-sense stationary processes and chaos

    Every measure-preserving mapping of a measure space defines a stationary stochastic process. Conversely, every stationary process can be formally understood as arising from a measure-preserving dynamical system. This is the subject of ergodic theory. A close review of the proof of this fact in [Doob, 1953] reveals that this line of thought, while accurate, could benefit from a more direct example of how chaotic and stationary processes overlap.

    The most general wide-sense real-valued stationary process can be put into the form

    x(t) = ∑_{j=1}^{k} [u_j cos(2πλ_j t) + v_j sin(2πλ_j t)]

where u_j, v_j are mutually orthogonal real random variables, or can be approximated arbitrarily closely by a process of this form [Doob, 1953], where the λ_j depend on the process.

    If we take u_j = a_j sin(2π·2^j x), v_j = a_j cos(2π·2^j x), and λ_j = 2^j, then we have the chaotic process

    g(x, t) = ∑_{j=1}^{k} a_j sin(2π·2^j (x + t))

discussed earlier. By our choice of the sequence a_j and the frequencies λ_j we can make this process as chaotic as we wish.

    In general, we may construct the wide-sense stationary process

    ∑_j a_j f_j(x) exp(2πiλ_j t)

as a chaotic process by choosing the f_j appropriately.

    5.3. Poisson processes

    The basic construction of a Poisson process from chaotic processes is illustrated by the following example:

    f(x, t) = ∑_{n=0}^{∞} α_n f_n(x, t)   (29)

    f_n(x, t) = 0.5 · (1 + sgn(t − g_n(x)))

    g_n(x) = ∑_{k=1}^{n} …

where T is a chaotic map of the unit interval, α_n, β_n are constants chosen to fit the data, and h is a properly chosen function. For example, one choice for h is h(u) = exp(−λ · u).
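A concrete, purely illustrative reading of this construction (the specific summand of g_n, the logistic map for T, λ = 2, and α_n = 1 are all assumptions here): let the waiting time between events be h applied to successive iterates of T, with h in the inverse-CDF form h(u) = −ln(u)/λ corresponding to the exponential choice named above, so that g_n(x) is the time of the nth event and f(x, t) counts events up to time t.

```python
import math

def T(x):
    """A chaotic map of the unit interval (logistic map at r = 4; an assumption)."""
    return 4.0 * x * (1.0 - x)

def h(u, lam=2.0):
    """Exponential-type waiting time drawn from a chaotic iterate
    (inverse-CDF form of the exp(-lam*u) choice; an assumption)."""
    return -math.log(u) / lam

def event_times(x, n_events=20):
    """g_n(x): cumulative sums of chaotic waiting times."""
    times, total = [], 0.0
    for _ in range(n_events):
        x = T(x)
        total += h(x)
        times.append(total)
    return times

def f(x, t):
    """Chaotic Poisson analog of Eq. (29) with alpha_n = 1:
    the number of event times g_n(x) that are <= t."""
    return sum(1 for g_n in event_times(x) if g_n <= t)

counts = [f(0.123, t) for t in (0.0, 1.0, 2.0, 4.0)]
```

The counting function is nondecreasing in t, as a Poisson analog should be; matching the distribution of a true Poisson process would require tuning λ and the map statistics to the data.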

    5.4. Martingale and Markov processes

    In this section, we will be using conventional terminology found in leading textbooks on stochastic processes, particularly [Doob, 1953].

    In order to define a Martingale we need the notion of conditional expectation. Given two measurable functions, f, g on [0, 1], we define the conditional expectation of f, given that g(x) ∈ [a, b], as the average value of f over the set g^{-1}([a, b]), and we write this as

    E(f | g(x) ∈ [a, b]) = (1 / μ(g^{-1}([a, b]))) ∫_{g^{-1}([a,b])} f(x) dx

where μ(·) may be thought of as Lebesgue measure.


    In ergodic theory, conditional expectation is defined with respect to a partition of the domain of a measurable function as follows: Let P = {E_i} be a partition of [0, 1]. Then

    E(f | P) = ∑_i (1 / μ(E_i)) ∫_{E_i} f(x) dx · χ_{E_i}(x) .

    The conditional expectation of f takes on the average value of f over each element of the partition, E_i. If we start with a partition of the range of g and take its inverse under g, let the granularity of the partition increase indefinitely and apply this definition, we obtain the familiar definition of conditional expectation found in elementary texts on probability theory.
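As a concrete, illustrative rendering of this definition, the following computes E(f | P) on [0, 1] as a step function, approximating each cell average with a midpoint Riemann sum (the partition, test function, and grid size are all choices made for the example):

```python
def cond_expectation(f, partition, n=10000):
    """E(f | P): the step function equal, on each cell of the partition,
    to the average of f over that cell (midpoint-rule approximation)."""
    def Ef(x):
        for lo, hi in partition:
            if lo <= x < hi or (hi == 1.0 and x == 1.0):
                xs = [lo + (hi - lo) * (k + 0.5) / n for k in range(n)]
                return sum(f(xi) for xi in xs) / n
        raise ValueError("x is not covered by the partition")
    return Ef

P = [(0.0, 0.5), (0.5, 1.0)]
Ef = cond_expectation(lambda u: u * u, P)
# the average of x^2 over [0, 0.5] is 1/12; over [0.5, 1] it is 7/12
```

Refining the partition makes the step function converge to f itself, which is the limiting statement in the paragraph above.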

    Given the definition of conditional expectation, a stochastic process ξ_t is a martingale if each random variable has a finite mean value, and if t1 < t2 < · · · < tn < t then

    E(ξ_t | ξ_{t1}, . . . , ξ_{tn}) = ξ_{tn} .   (30)

    In our terminology, the random variables ξ_t are measurable functions, such as g(x, t) of Eq. (17), on [0, 1], and the implications of this condition are just a routine statement about a series of nested partitions of [0, 1]. The most common example of a Martingale is the sum of a sequence of independent random variables having mean zero. This equates to a sum of chaotic functions of the form g(x, t_n) of Eq. (17).

    The definition of a Markov process is parallel to that of a Martingale, with the notion of conditional probability replacing that of conditional expectation. The conditional probability of a set B given a partition, P, is defined as E(χ_B | P). As with a Martingale, there is nothing special about this process that is not already discussed. A sum of independent random variables, or chaotic functions, defines a Markov process.

    5.5. Lévy processes

    A Lévy process can be decomposed into a sum of a compound Poisson process, a translation, and a Brownian motion process. As these processes are constructable from chaotic processes, nothing further is needed to construct a Lévy process from chaotic processes beyond what has already been discussed.

    6. Stochastic Differential Equations and the Stochastic Calculus

    In this section we examine chaotic analogs of stochastic differential equations and the role of the stochastic calculus.

    6.1. Chaotic analog of stochastic differential equations

    In direct analogy with the theory of stochastic dynamical systems, we have the systems, see [Karlin & Taylor, 1981],

    dY(t)/dt = h(t, Y(t), W(t, x))   (31)

    dY(t)/dt = h(t, Y(t)) + W(t, x)   (32)

    dY(t)/dt = −βY(t) + W(t, x) ,   β > 0   (33)

    dY(t) = −µ(Y(t), t)dt + σ(Y(t), t) dB0(t, x)   (34)

    dY(t) = −µ(Y(t), t)dt + σ(Y(t), t) dB(t, x)   (35)
    Equation (33) is the analog of the Ornstein–Uhlenbeck equation, and Eq. (34) is the analog of the standard stochastic differential equation model for a diffusion process. Equation (35) is needed in the theory of chaotic processes, just as in the case of stochastic processes, to be able to handle the chaotic process B(x, t), which is nowhere differentiable.

    For chaotic processes, the integral

    ∫_0^t f(s) dB0(s, x)   (36)

exists and has the representation

    ∫_0^t f(s) W(s, x) ds .   (37)

This implies that Eq. (34), as a chaotic process, can be written as

    dY(t)/dt = −µ(Y(t), t) + σ(Y(t), t)W(t, x) .   (38)


    We may construct g of Eq. (17) so that, to a very close approximation,

    ∫_0^1 W(t, x)W(s, x) dx = ∫_0^1 W(t, x)² dx ,   for s = t ,   (39)

and nearly zero elsewhere, and

    ∫_0^1 W(t, x)² dx ≈ 1 .   (40)

Also,

    ∫_0^t W(s, x)² ds ≈ c · t .   (41)

    The existence and uniqueness theory for Eqs. (31)–(34) can be found in [Coddington & Levinson, 1955]. In general, there exist solutions of these equations. When the right-hand sides are Lipschitz, these solutions are unique.

    Integration of stochastic differential equations may proceed through two separate paths, resulting in very different solutions. The theory of Stratonovich requires that the rules of calculus must hold for these equations. The theory of Ito holds that Martingales should be preserved through integration even if it means that the rules of calculus fail. See [Karlin & Taylor, 1981] for a complete discussion.

    The rules of calculus hold for chaotic processes that involve B0; thus the solutions of Eqs. (31)–(34) are in agreement with the Stratonovich stochastic calculus. However, processes involving B(x, t) are solved using the Ito theory. Thus, in the theory of chaotic processes, the Ito theory and the Stratonovich theory are both required.

    6.2. Example: Solving chaotic analogs of stochastic differential equations

    With the above results, Eq. (33), the Ornstein–Uhlenbeck process, has the same solution as if it were treated as a stochastic ODE, with the exception that the solution involves B0(x, t) instead of B(x, t). All chaotic process equations which do not involve the Ito calculus, and thus involve B0 instead of B, are solved in the same way as their stochastic analogs. For Eq. (33) we have the solution

    Y(x, t) = B0(x, t) − β ∫_0^t exp(−β(t − τ)) B0(x, τ) dτ .   (42)
    Expectations and variances are easily computed by using the formulae

    E[Y(x, t)] = ∫_0^1 Y(x, t) dx   (43)

    E[Y(x, t)Y(x, s)] = ∫_0^1 Y(x, t)Y(x, s) dx   (44)
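A direct numerical rendering of Eqs. (42) and (43) can look as follows (illustrative parameters throughout: β = 0.5, twelve noise terms with a_n = 0.995^n; B0 is computed term by term in closed form, and the τ-integral by the trapezoidal rule):

```python
import math

def B0(x, t, N=12, a=0.995):
    """Closed-form t-integral of the truncated chaotic noise: integrating
    a^n*sin(2*pi*2^n*(x+s)) over s in [0, t], term by term."""
    total = 0.0
    for n in range(N):
        w = 2 * math.pi * 2**n
        total += a**n * (math.cos(w * x) - math.cos(w * (x + t))) / w
    return total

def Y(x, t, beta=0.5, steps=200):
    """Eq. (42): Y = B0(x,t) - beta * int_0^t exp(-beta(t-tau)) B0(x,tau) dtau."""
    if t == 0.0:
        return 0.0
    h = t / steps
    vals = [math.exp(-beta * (t - k * h)) * B0(x, k * h) for k in range(steps + 1)]
    integral = h * (sum(vals) - 0.5 * (vals[0] + vals[-1]))
    return B0(x, t) - beta * integral

def EY(t, m=200):
    """Eq. (43): the expectation as an x-integral over [0, 1]."""
    return sum(Y((k + 0.5) / m, t) for k in range(m)) / m
```

Because each term of B0 integrates to zero over a full period in x, the expectation E[Y(x, t)] comes out numerically zero, as it should for a mean-zero noise source.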

    The Growth Equation is a good example of the differences between the Ito solution and the Stratonovich solution (S-solution) of a stochastic differential equation. Our construction must necessarily agree with the S-solution if we use B0 instead of B. We proceed as follows: We rewrite the growth equation

    dY = Y dB0   (45)

as

    dY(x, t)/dt = Y(x, t) dB0(x, t)/dt = Y(x, t)W(x, t) .   (46)

    By applying the usual rules of calculus we have

    Y(x, t) = C(x) · exp(B0(x, t)) .   (47)

    The Ito solution is

    Y(x, t) = C(x) · exp(B(x, t) − t/2) .   (48)

    These two solutions of Eq. (45) are quite different, depending on the underlying dynamics.
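Because B0 is differentiable, the S-solution (47) can be checked with ordinary calculus and a finite difference. The sketch below uses illustrative choices only (ten terms, a_n = 0.9^n, C(x) = 1):

```python
import math

N, A = 10, 0.9  # illustrative truncation and decay rate

def B0(x, t):
    """Differentiable Brownian-motion analog: term-by-term t-integral of W."""
    return sum(A**n * (math.cos(2*math.pi*2**n * x)
                       - math.cos(2*math.pi*2**n * (x + t))) / (2*math.pi*2**n)
               for n in range(N))

def W(x, t):
    """The chaotic white-noise analog W = dB0/dt."""
    return sum(A**n * math.sin(2*math.pi*2**n * (x + t)) for n in range(N))

def Y(x, t):
    """Solution (47) of the growth equation dY = Y dB0, with C(x) = 1."""
    return math.exp(B0(x, t))

# ordinary calculus applies: dY/dt = Y * W, checked by a centered difference
x, t, h = 0.3, 0.7, 1e-6
lhs = (Y(x, t + h) - Y(x, t - h)) / (2 * h)
rhs = Y(x, t) * W(x, t)
```

No Ito correction term appears here; the extra −t/2 of the Ito solution (48) arises only for the nondifferentiable process B(x, t).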

    6.3. Chaos and the financial markets

    The existence of the chaotic processes B0(x, t) and B(x, t) leads to the possibility of processes being formed which coincide with B0 over some time spans and with B over others. For example, the process defined by

    h(x, t) = 0.5(1 + sgn(sin(ωt))) · B0(x, t) + 0.5(1 − sgn(sin(ωt))) · B(x, t)   (49)

transitions between these two processes on a periodic basis when ω is an integer.

    This possibility may have significant consequences for the financial analysis of derivative securities. Presently, market analysts use the Black–Scholes option pricing model for derivative pricing. Black–Scholes in turn assumes that the underlying processes are B and not B0. This leads to the use of the Ito transformation law in the treatment of derivative security pricing equations. However, if the relevant processes involve B0, then errors in speculating on derivative securities may result that lead to disastrous consequences. The prospect of this happening leads us to define chaotic catastrophe.

    Definition. A chaotic catastrophe occurs when a chaotic process involving B0 transitions to one that also involves B, or vice versa.

    This definition allows for the situation where B0 and B may be simultaneously present, although this may seem ambiguous. This is called a catastrophe because the solution of a given problem, such as the growth equation, Eq. (45), will transition from Eq. (47) to Eq. (48) without warning. The solution given by Eq. (47) is differentiable almost everywhere, whereas the solution given by Eq. (48) is nowhere differentiable as well as having a factor of exp(−t/2) not present in Eq. (47). Betting on the solution of Eq. (45) being Eq. (48), when in fact the solution, over a short period of time, is actually Eq. (47), could result in a significant loss.

    This observation underscores the need for methods of analysis to distinguish between dynamical systems involving B0 and those involving B. This need is made more clear by the fact that over certain approximation ranges, using coarse approximation methods, B and B0 appear to be the same.

    7. Stochastic Chaos as a Driving Force

    Consider the chaotic process given by

    s(x, t) = ∑_{n=0}^{∞} a_n sin(b^n · (x + t))   (50)

where b > 1, and the a_n are chosen so that

    ∫_0^y s(x, t)² dx ≈ c(t) · y .   (51)

    If we truncate this series after any finite number of terms, it is infinitely differentiable, and, if b is an integer, the truncated series is periodic while Eq. (51) remains true. That is,

    s(x, t) = ∑_{n=0}^{N} a_n sin(b^n · (x + t))   (52)

    Fig. 5. The graph of s(0.5, t), Eq. (52), as a function of t, where T(x) = {16807 · x}, a_n = 0.995^n, and ten terms are used.

    is not distinguishable from a stochastic process for large N, say N > 10. This is due to the sensitive dependence on sampling that characterizes chaotic processes, see Fig. 5. We may use this as a driving force in a second-order ODE such as the Duffing equation:

    ÿ + αẏ + y³ = ∑_{n=0}^{N} a_n sin(b^n · (x + t))   (53)

    which can be expressed as an autonomous equation in three dimensions. If α = 0.05, a_0 = 7.5, and a_n = 0 for n > 0, this equation is the well-known Duffing equation having the Ueda Japanese attractor shown in Fig. 6(a).

    When a_n ≠ 0 and b is an integer, this is in a practical sense equivalent to the Duffing equation having a stochastic forcing term. But clearly, this is just a very complex periodic forcing term that resembles a noise process, as Fig. 5 shows. The first return map for the solution of Eq. (53) exists on R² × S¹. A hyperbolic fixed point can be located near (1.04, 0.58) and the first return map generates a strange attractor, Fig. 6(b). Due to monitor resolution limitations, this attractor does not show the fine detail seen in Fig. 6(a). However, the attractor is what we would expect to see from a system having a stochastic forcing term in that the attractor is diffusing under the influence of the chaotic forcing process.
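The experiment is easy to reproduce in outline. The sketch below integrates Eq. (53) with classical RK4 and strobes the flow once per forcing period; everything here is illustrative (six forcing terms rather than ten, so the highest forcing frequency is resolved by the step size; the amplitude is scaled so that the a_n = 0 for n > 0 case reduces to the Ueda forcing 7.5 sin t, which is an assumption; x is fixed at 0.5):

```python
import math

ALPHA, A, B, X0 = 0.05, 0.995, 2.0, 0.5  # illustrative parameters

def forcing(t, N=6):
    """Truncated chaotic forcing of Eq. (53), scaled by 7.5 (an assumption)."""
    return 7.5 * sum(A**n * math.sin(B**n * (X0 + t)) for n in range(N))

def acc(y, v, t):
    """y'' = forcing - alpha*y' - y^3, i.e. the Duffing equation (53)."""
    return forcing(t) - ALPHA * v - y**3

def rk4_step(y, v, t, dt):
    k1y, k1v = v, acc(y, v, t)
    k2y, k2v = v + 0.5*dt*k1v, acc(y + 0.5*dt*k1y, v + 0.5*dt*k1v, t + 0.5*dt)
    k3y, k3v = v + 0.5*dt*k2v, acc(y + 0.5*dt*k2y, v + 0.5*dt*k2v, t + 0.5*dt)
    k4y, k4v = v + dt*k3v, acc(y + dt*k3y, v + dt*k3v, t + dt)
    return (y + dt*(k1y + 2*k2y + 2*k3y + k4y)/6,
            v + dt*(k1v + 2*k2v + 2*k3v + k4v)/6)

def return_map(y, v, periods=10, steps=512):
    """Strobe the flow once per forcing period 2*pi (the first return map)."""
    pts, dt, t = [], 2 * math.pi / steps, 0.0
    for _ in range(periods):
        for _ in range(steps):
            y, v = rk4_step(y, v, t, dt)
            t += dt
        pts.append((y, v))
    return pts

pts = return_map(1.04, 0.58)
```

With b an integer, every forcing component has period dividing 2π, so strobing at 2π samples the genuine first return map; plotting many strobe points would reproduce the diffused attractor of Fig. 6(b) qualitatively.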




    Fig. 6. The strange attractor for Eq. (53) where the forcing term is (a) 7.5 sin(t). A hyperbolic fixed point can be found near (1.04, 0.595); (b) that of Fig. 5, Eq. (52). A hyperbolic fixed point can be found near (1.00067, 0.50099).


    8. Filtering of Chaotic Processes

    The chaotic analog of the conventional Kalman filter setting is

    Y(x, t_{n+1}) = ΦY(x, t_n) + W_1(x, t_n)   (54)

    ψ_n = θY(x, t_n) + W_2(x, t_n)   (55)

    where the W_i are chaotic processes that are analogs of Gaussian white noise, Φ, θ are matrices, and Y(x, t_n) is a vector. Clearly, when these equations are modeled on a computer, the relevant noise processes are chaotic, rather than stochastic. The advantage of approaching this problem as a chaotic process problem is that, in reality, white noise does not exist. Hence, the noise processes must deviate from white noise in some fashion. These deviations may be taken into consideration in the estimation process, thus providing additional information about the dynamics that will lead to a better estimation formula. By assuming Gaussian white noise and proceeding with the use of conditional expectations along conventional lines, valuable information may be omitted. A better approach is as follows. First, perform a spectral analysis of the noise. From this analysis, and the facts of the physical environment, the likely chaotic dynamics involved may be determined. As we have noted, white noise can be constructed from superpositions of a wide range of chaotic dynamics, as was the case for Brownian motion and Poisson processes. Each of these constructions will have a deviation from actual white noise that is peculiar to the situation. If we derive the chaotic dynamics involved and use a superposition of these dynamics to formulate our dynamical system estimation method, we will likely obtain a better estimation process, as it will be much closer to reality than the assumption of white noise will provide.
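As a concrete scalar rendering of Eqs. (54) and (55): the filter below is the standard Kalman recursion, run against noises generated by the chaotic white-noise analog. Φ = 0.9, θ = 1, the noise-series parameters, and the variance guesses q, r are all assumptions made for this sketch.

```python
import math

def chaotic_noise(x, k, N=10, a=0.995, dt=0.137):
    """Chaotic analog of white noise: an Eq. (17)-style series sampled at
    times t_k = k*dt (parameters illustrative)."""
    return sum(a**n * math.sin(2 * math.pi * 2**n * (x + k * dt))
               for n in range(N))

def kalman(measurements, phi=0.9, theta=1.0, q=4.8, r=4.8):
    """Scalar Kalman recursion for Y_{k+1} = phi*Y_k + W1, psi_k = theta*Y_k + W2,
    treating the chaotic noises as if they were white with variances q and r."""
    y_hat, p, out = 0.0, 1.0, []
    for psi in measurements:
        y_hat, p = phi * y_hat, phi * phi * p + q      # predict
        gain = p * theta / (theta * theta * p + r)     # update
        y_hat += gain * (psi - theta * y_hat)
        p *= (1.0 - gain * theta)
        out.append(y_hat)
    return out

# simulate the state with one chaotic noise seed and observe with another
truth, y = [], 0.0
for k in range(100):
    y = 0.9 * y + chaotic_noise(0.31, k)
    truth.append(y)
psi = [y_k + chaotic_noise(0.77, k) for k, y_k in enumerate(truth)]
estimates = kalman(psi)
```

The point of the paragraph above is precisely that this white-noise assumption discards structure: a filter built on the identified chaotic dynamics of W_1, W_2 could, in principle, do better than the q, r guesses used here.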

    This theory does not diminish the significance of stochastic processes. Stochastic process theory is appropriate when the number and variety of chaotic dynamical systems that are present in a superposition process are so large that there is no practical value to be gained from considering the chaotic dynamics separately. It is likely, however, that many problems that have traditionally been treated with stochastic process methods would be better treated with chaotic process methods.

    9. Theory of Chaos

    9.1. Stochastic processes arise from chaos

    Every chaotic process we have discussed formally fits the definition of a stochastic process, i.e. a one-parameter family of measurable functions. Every chaotic variable is, likewise, a random variable. However, without the aid of the definition of random selection, the reverse is not true in any useful sense. The problem still lies with the fact that there is no universally accepted way of defining random selection within the framework of the foundations of mathematics. Chaotic processes need not be sampled randomly, and so we avoid this issue when we are dealing with chaotic processes.

    We now define every chaotic process we have discussed as a stochastic process, and retain a single formalism, stochastic processes, within which to discuss these processes. This requires extending the boundaries of stochastic processes to include these processes more directly. Doing this does not limit our formalism. Rather, it provides new avenues for exploration. Traditional stochastic process theory is bound to consider only the evolution of processes which are idealizations, such as white noise, whereas the extension to chaotic processes calls for an examination of imperfect processes; our objective being to exploit these imperfections to improve interpolation, filtering, and prediction of complex phenomena.

    As noted in Sec. 2, a superposition of time-series of chaotic dynamical systems leads us to chaotic processes, which, based on our discussion, we will now refer to as stochastic processes. An interesting conclusion of our constructions is that every chaotic dynamical system leads us to a stochastic process in precisely the manner of our example. In fact, each chaotic dynamical system leads us to an array of stochastic processes that range from noise processes to diffusion processes. Our example using the logistic map illustrated this. In an upcoming paper we will explore further the formation of stochastic processes from a wide range of chaotic systems. For now, we observe that given a chaotic dynamical system such as is defined by the Hénon map, there is a noise process and a diffusion process that arises from this system. Likewise we have processes based on the Chua circuit, and the Chirikov map. Appropriately enough, we will call these stochastic processes Hénon processes, Chua processes, etc. As these systems define an array of processes, we will single out two for which we will have special names: noise processes and diffusion processes. Thus, for the Chua processes, we have Chua noise and Chua diffusion. For the Hénon processes, we have Hénon noise and Hénon diffusion, and so forth.

    The significance of these processes is that if, while examining a complex phenomenon in a noisy environment, it can be determined that the underlying noise is Chua noise, for example, there may be some advantages obtained in the filtering and prediction of the phenomenon.

    9.2. Six conjectures

    Whether or not a Theory of Chaos is possible is unanswerable at present. However, we set forth six conjectures, based on our examples and counterexamples, in the hope that these conjectures may help lead to a theory eventually.

    Conjecture 1. The spatial and temporal complexity which arises from chaotic dynamics has a special significance in that it is both the source of creative processes and the source of disorder.

    Conjecture 2. Stochastic processes arise in nature from chaotic time-series in at least two ways: One is when the time-series is sampled in exponential time, or is time-compressed exponentially. The formation of stochastic processes also proceeds through the superposition of a large class of chaotic time-series.

    Conjecture 3. The imperfect stochastic processes formed by chaos can correlate with noncomplex processes to create stochastic resonances.

    Conjecture 4. Chaotic time-series arise from nonlinear interactions which can be as simple as the composition of two period-two nonlinear dynamical systems. This is demonstrated by the Hénon map and many other systems.

    Conjecture 5. Pure chaotic systems may be classified into three categories: Demiurgic systems, Bernoulli systems and regenerative systems. We refer to the system illustrated by Example 2 of [Brown & Chua, 1997] as regenerative since, while it does not have a positive Lyapunov exponent, it does have a negative Lyapunov exponent that is generating complexity in a dimension separate from the subspace of the negative Lyapunov exponent. Hybrid systems are combinations of these, or have subseries of these embedded in their time series. These subseries may be spread very thin through the series so long as they form a true infinite subseries. All such systems are examples of chaos.

    Conjecture 6. A chaotic time-series may appear as a stochastic process depending on the sampling frequency used to measure the time-series. Further, periodic processes may appear chaotic depending on the sampling frequency.

    10. Prime Numbers in Dynamics

    This section is unrelated to the first and is included to show that there may be a link between prime numbers and chaos.

    In this section we prove three theorems showing that the prime numbers can be used to approximate complex sequences defined by symbolic dynamics. The first theorem is proven in detail and provides an example of how to prove the later theorems, which are only sketched. All theorems are sufficiently simple and straightforward that they are likely already known, at least informally. We hope that we have not inadvertently omitted credit to the proper authors.

    Theorem 1. The set of rational numbers of the form p/q, where p and q are primes, p > q, is dense in the interval [0, 1].

    Proof. We show that these rationals are dense in the rationals in [0, 1].

    Let p_n be the sequence of prime numbers. From number theory, see [Hasse, 1980], we know that p_{n+1} − p_n < p_n^γ, where 0.5 < γ < 1. Define the function p+(n) as the first prime above n, and p−(n) as the first prime below n. Let a/b be a rational number in [0, 1].

    We have the estimate:

    |a·n/(b·n) − p+(a·n)/p+(b·n)|
        = |a·n·p+(b·n) − b·n·p+(a·n)| / (n·b·p+(b·n))
        ≤ |a·n·p+(b·n) − p+(a·n)·p+(b·n)| / (n·b·p+(b·n)) + |p+(a·n)·p+(b·n) − p+(a·n)·b·n| / (n·b·p+(b·n))
        = p+(b·n)·|a·n − p+(a·n)| / (n·b·p+(b·n)) + p+(a·n)·|p+(b·n) − b·n| / (n·b·p+(b·n))
        ≤ |a·n − p+(a·n)| / (n·b) + |p+(b·n) − b·n| / (n·b)   (since p+(b·n) ≥ p+(a·n))
        ≤ |p+(a·n) − p−(a·n)| / (n·b) + |p+(b·n) − p−(b·n)| / (n·b)
        ≤ p−(a·n)^γ / (n·b) + p−(b·n)^γ / (n·b)
        ≤ (a·n)^γ / (n·b) + (b·n)^γ / (n·b)
        ≤ 2(b·n)^γ / (n·b)
        = 2(b·n)^{γ−1} → 0

as n → ∞, since γ < 1. □

    If n is an integer and we place a decimal point to the left of n, we obtain a number in the interval [0, 1]. For example, if n = 76532 then the number in [0, 1] we get is 0.76532. This number defines many more fractions by considering all of its right shifts, i.e. 0.076532, 0.0076532, 0.00076532, . . . . The set of all such numbers formed from integers in this way is clearly dense in [0, 1]. We may formally describe this set, N, as follows:

    For some integer n, and some integer k > 2,

    N = { x | x = n / 10^{[log10 n]+k} }

where [y] is the integer part of y.

    Using this fact we have the following less obvious theorem:

    Theorem 2. The set

    P = { x | x = p / 10^{[log10 p]+k} } ,

where p is prime and k ≥ 2, is dense in [0, 1].

    The proof of this theorem follows from the following result:

    Theorem 3. Let n be any integer. Then there is a prime number whose leading digits are the same as those of n.

    Proof of Theorem 3. Let k = n · 10^{3[log10 n]}. There exists a prime no further away from k than y = [k^{2/3}]. Any number greater than k and less than k + y will have the digits of n as its first [log10 n] + 1 digits. Since the set N is dense in [0, 1], so is P. □
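Theorem 3 is easy to demonstrate computationally. The sketch below (illustrative only; it appends three zero-blocks per digit of n, a harmless variant of the construction in the proof, and uses a deterministic Miller-Rabin test valid far beyond the sizes involved) searches upward from k for the next prime, whose leading digits are then those of n:

```python
def is_prime(m):
    """Deterministic Miller-Rabin; the first twelve prime bases suffice
    for every m below roughly 3.3e24."""
    if m < 2:
        return False
    small = (2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37)
    for p in small:
        if m % p == 0:
            return m == p
    d, s = m - 1, 0
    while d % 2 == 0:
        d, s = d // 2, s + 1
    for a in small:
        x = pow(a, d, m)
        if x in (1, m - 1):
            continue
        for _ in range(s - 1):
            x = x * x % m
            if x == m - 1:
                break
        else:
            return False
    return True

def prime_with_leading_digits(n):
    """Theorem 3 in action: search upward from k = n * 10^(3 * digits(n));
    prime gaps near k are far smaller than the block of appended zeros,
    so the leading digits of n survive."""
    k = n * 10 ** (3 * len(str(n)))
    p = k + 1
    while not is_prime(p):
        p += 1
    return p

p = prime_with_leading_digits(76532)
assert str(p).startswith("76532")
```

Since any finite digit string occurs as the head of some prime, the decimal shifts of the primes are dense in [0, 1], which is the content of Theorem 2.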

    These results show that prime numbers may play a role in dynamics. First, since the ratios of primes are dense in the rational numbers, demiurgic chaotic systems [Brown & Chua, 1997] can generate complex orbits from the prime ratios. Second, since any irrational number having positive algorithmic complexity can be approximated by the elements of P to any desired degree of accuracy, the complexity inherent in any finite sequence of digits is already present in some prime number.


    Acknowledgments

    We would like to thank Professor Morris Hirsch for his reading of the original manuscript and his many suggestions for improvements, which have been incorporated into the text. All errors that remain are solely the responsibility of the authors. This work is supported in part by ONR Grant No. N00014-97-1-0463.


    References

    Brown, R. & Chua, L. [1998] “Clarifying chaos II: Bernoulli chaos, zero Lyapunov exponents, and strange attractors,” Int. J. Bifurcation and Chaos 8(1), 1–32.
    Brown, R. & Chua, L. [1997] “Chaos: Generating complexity from simplicity,” Int. J. Bifurcation and Chaos 7(11), 2427–2436.
    Coddington, E. & Levinson, N. [1955] Theory of Ordinary Differential Equations (McGraw-Hill, NY).
    Doob, J. [1953] Stochastic Processes (John Wiley, NY).
    Hasse, H. [1980] Number Theory (Springer-Verlag, Berlin).
    Karlin, S. & Taylor, H. [1975] A First Course in Stochastic Processes (Academic Press, NY).
    Karlin, S. & Taylor, H. [1981] A Second Course in Stochastic Processes (Academic Press, NY).