

Page 1: Learning From the Past, Predicting the Statistics for the Future, And Learning an Evolving System (Ni Hao)

Introduction and Motivation
Preliminary of Rough Paths Theory
Our research
Numerical Examples
Conclusion

Learning from the past, predicting the statistics for the future, and learning an evolving system

Ni Hao
Joint work with Prof. Terry Lyons

1 Department of Mathematics, University of Oxford
2 Oxford-Man Institute of Quantitative Finance

Stochastic analysis seminar, June 10, 2013

Ni Hao Learning from the past, predicting the statistics for the future, and learning an evolving system

Page 2: Learning From the Past, Predicting the Statistics for the Future, And Learning an Evolving System (Ni Hao)


Outline

1 Introduction and Motivation

2 Preliminary of Rough Paths Theory

3 Our research
  Encode a time series to its signatures
  Combine Time Series Model into Expected Signature Framework

4 Numerical Examples
  AR models
  Model Misspecification: The mixture of two AR models

5 Conclusion


Page 3: Learning From the Past, Predicting the Statistics for the Future, And Learning an Evolving System (Ni Hao)


Motivation

The setting of the problem

Under the probability space (Ω, F, P), {X_t}_{t∈[0,T]} is an E-valued stochastic process and Y is a real-valued random variable, such that there exists a function f with

Y = f({X_t}_{t∈[0,T]}) + ε,

where E[ε | {X_t}_{t∈[0,T]}] = 0. We observe a sequence of data pairs {(X_i, Y_i)}_{i=1}^N and are interested in recovering the functional relationship f from the data set.


Page 4: Learning From the Past, Predicting the Statistics for the Future, And Learning an Evolving System (Ni Hao)


Useful tools
1 Rough Paths Theory
2 Regression analysis

The plan of my talk
1 Encode a time series into the signature of a two-dimensional path which includes the time axis.
2 Use the properties of the signature to turn the nonlinear problem into a linear problem.
3 Numerical examples


Page 5: Learning From the Past, Predicting the Statistics for the Future, And Learning an Evolving System (Ni Hao)


A Brief Introduction to Rough Paths Theory

1 A non-linear extension of the classical theory of controlled differential equations.
2 The essential object of Rough Path Theory is the signature of the path.


Page 6: Learning From the Past, Predicting the Statistics for the Future, And Learning an Evolving System (Ni Hao)


The signature of the path

Definition (The signature of a path)
Let J denote a compact interval and let X : J → E be a continuous path of finite p-variation for some p < 2. The signature of X is the element of T((E)) defined by

X_J = (1, X^1_J, X^2_J, ...),

where, for each n ≥ 1,

X^n_J = ∫_{u_1<...<u_n, u_1,...,u_n∈J} dX_{u_1} ⊗ ... ⊗ dX_{u_n}.

The signature of X is also denoted by S(X).

Notation: ρ_n denotes the truncation map

ρ_n : S(X) → (1, X^1_J, ..., X^n_J).


Page 7: Learning From the Past, Predicting the Statistics for the Future, And Learning an Evolving System (Ni Hao)


Remarks:
1 Assume that E has finite dimension d and choose a basis (e_1, ..., e_d) of E. Then, for every n, {e_{i_1} ⊗ ... ⊗ e_{i_n} : i_1, ..., i_n ∈ {1, ..., d}} is a basis of E^{⊗n}.
2 The signature of the path X can be regarded as a non-commuting polynomial in the variables e_1, ..., e_d, where the coefficient of the monomial e_{i_1} ⊗ ... ⊗ e_{i_n} is ∫_{u_1<...<u_n, u_1,...,u_n∈J} dX^{(i_1)}_{u_1} ... dX^{(i_n)}_{u_n}. Thus we can also write

X_J = Σ_{n=0}^∞ Σ_{i_1,...,i_n ∈ {1,...,d}} ( ∫_{u_1<...<u_n, u_1,...,u_n∈J} dX^{(i_1)}_{u_1} ... dX^{(i_n)}_{u_n} ) e_{i_1} ⊗ ... ⊗ e_{i_n},

or use the coordinate projection π_I, defined for I = (i_1, ..., i_n) by

π_I(X_J) = ∫_{u_1<...<u_n, u_1,...,u_n∈J} dX^{(i_1)}_{u_1} ... dX^{(i_n)}_{u_n}.


Page 8: Learning From the Past, Predicting the Statistics for the Future, And Learning an Evolving System (Ni Hao)


Shuffle Product Property

Definition
We define the set S_{m,n} of (m,n)-shuffles to be the subset of permutations in the symmetric group S_{m+n} given by

S_{m,n} = {σ ∈ S_{m+n} : σ(1) < ... < σ(m), σ(m+1) < ... < σ(m+n)}.

Definition
Let I = (i_1, ..., i_m), J = (j_1, ..., j_n), and let (k_1, ..., k_{m+n}) = I * J be their concatenation. The shuffle product of π_I and π_J, denoted by π_I ⧢ π_J, is defined so that for every continuous path X : [0,1] → E of bounded variation,

(π_I ⧢ π_J)(S(X)) = Σ_{σ∈S_{m,n}} π_{(k_{σ⁻¹(1)}, ..., k_{σ⁻¹(m+n)})}(S(X)).


Page 9: Learning From the Past, Predicting the Statistics for the Future, And Learning an Evolving System (Ni Hao)


Theorem (Shuffle product property)

For every continuous path X : [0,1] → E of bounded variation,

(π_I ⧢ π_J)(S(X)) = π_I(S(X)) π_J(S(X)).

Remark
Any polynomial in the coordinates of S(X) can therefore be rewritten as a linear functional of S(X).



Page 11: Learning From the Past, Predicting the Statistics for the Future, And Learning an Evolving System (Ni Hao)


Encode a time series to its signatures
Combine Time Series Model into Expected Signature Framework

Embed a time series into its signature

Step 1: Embed the time series {(t_i, r_i)}_{i=m}^n into a two-dimensional axis path R : [2m, 2n+1] → R_+ × R, given by

R(s) =
  t_m e_1 + r_m (s − 2m) e_2,                              if s ∈ [2m, 2m+1),
  [t_i + (t_{i+1} − t_i)(s − 2i − 1)] e_1 + r_i e_2,       if s ∈ [2i+1, 2i+2),
  t_{i+1} e_1 + [r_i + (r_{i+1} − r_i)(s − 2i − 2)] e_2,   if s ∈ [2i+2, 2i+3),

where i = m, m+1, ..., n−1 and {e_1, e_2} is an orthonormal basis of R².

Step 2: Compute the signature of this transformed continuous path and denote it by S({(t_i, r_i)}_{i=m}^n).
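Since the axis path of Step 1 is piecewise linear, it is enough to record its vertices. A minimal sketch (`axis_path` is an illustrative helper name, not from the talk):

```python
import numpy as np

def axis_path(ts, rs):
    """Vertices of the 2-D axis path for the time series {(t_i, r_i)}:
    start at (t_m, 0), rise to (t_m, r_m), then alternate moves along
    the time axis and along the value axis, as in Step 1."""
    pts = [(ts[0], 0.0), (ts[0], rs[0])]
    for i in range(len(ts) - 1):
        pts.append((ts[i + 1], rs[i]))      # move in time at height r_i
        pts.append((ts[i + 1], rs[i + 1]))  # move in value at time t_{i+1}
    return np.array(pts, dtype=float)

# the time series used in the example slide
pts = axis_path([2, 3, 4, 5, 6, 7, 8], [2, 5, 3, 4, 6, 3, 2])
```

Step 2 then amounts to computing the signature of the piecewise-linear path through these vertices, e.g. with Chen's identity or a signature library.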


Page 12: Learning From the Past, Predicting the Statistics for the Future, And Learning an Evolving System (Ni Hao)


Example
Let us consider the time series (2,2), (3,5), (4,3), (5,4), (6,6), (7,3), (8,2), which we embed into the lattice path shown in the following figure.


Page 13: Learning From the Past, Predicting the Statistics for the Future, And Learning an Evolving System (Ni Hao)


Lemma
Let X denote the signature of the time series {(t_i, r_i)}_{i=1}^n and assume that the times {t_i}_{i=1}^n are known. Then

∆R = T⁻¹ A,

where

A = ( 0! π_2(X), 1! π_{12}(X), ..., (n−1)! π_{1...12}(X) )ᵀ,

T = [ 1          1          ...  1
      t_1        t_2        ...  t_n
      ...        ...        ...  ...
      t_1^{n−1}  t_2^{n−1}  ...  t_n^{n−1} ],

∆R := ( r_1, r_2 − r_1, ..., r_n − r_{n−1} )ᵀ.


Page 14: Learning From the Past, Predicting the Statistics for the Future, And Learning an Evolving System (Ni Hao)


Combine linear time series models into the expected signature framework

Classical linear time series models include (a) simple autoregressive (AR) models, (b) simple moving-average (MA) models, (c) mixed autoregressive moving-average (ARMA) models, and (d) seasonal models.

Let {r_t}_t be a time series. A linear time series model attempts to capture the linear relationship between r_t and the information available prior to time t, e.g. the past p values r_{t−i}, i = 1, ..., p.

The foundation of time series analysis is stationarity. Roughly speaking, stationarity of a time series means that the joint distribution of (r_{t_1}, ..., r_{t_k}) is invariant under time shifts, for every integer k.


Page 16: Learning From the Past, Predicting the Statistics for the Future, And Learning an Evolving System (Ni Hao)


Definition (Simple autoregressive models)

Let {r_t} be a time series. The notation AR(p) indicates an autoregressive model of order p. The AR(p) model is defined as follows:

r_t = Φ_0 + Φ_1 r_{t−1} + ... + Φ_p r_{t−p} + a_t,

where p is a non-negative integer and {a_t} is assumed to be white noise with mean zero and variance σ_a².


Page 17: Learning From the Past, Predicting the Statistics for the Future, And Learning an Evolving System (Ni Hao)


Standard estimation method for AR models
1 The AR(p) model has the same form as a multiple linear regression model, with the lagged values serving as the explanatory variables.
2 Given the sequence {(r_{t−p}, ..., r_{t−1}, r_t)}_{t∈Z}, standard linear regression (e.g. the least squares method) yields the estimates {Φ_i}_{i=0}^p, and hence an estimate of the conditional expectation of r_t given the past p values r_{t−i}, i = 1, ..., p.
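The regression in step 2 can be sketched as follows. This is a minimal illustration: `fit_ar` is a hypothetical helper, and ordinary least squares via `numpy.linalg.lstsq` stands in for whatever calibration routine was actually used.

```python
import numpy as np

def fit_ar(r, p):
    """Least-squares AR(p) calibration: regress r_t on (1, r_{t-1}, ..., r_{t-p})."""
    n = len(r)
    # design matrix: an intercept column plus the p lagged columns
    X = np.column_stack([np.ones(n - p)] +
                        [r[p - i:n - i] for i in range(1, p + 1)])
    y = r[p:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef  # [Phi_0, Phi_1, ..., Phi_p]
```

On a long simulated AR(p) sample, the recovered coefficients are close to the true ones.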


Page 18: Learning From the Past, Predicting the Statistics for the Future, And Learning an Evolving System (Ni Hao)


How do classical time series models fit into our setting?

Suppose that {r_t}_t is a time series generated by an AR(p) model or another classical linear time series model. The mean equation is a polynomial in the p lagged values, so by the shuffle product property of signatures there exists a linear functional F such that the mean equation can be written as F applied to S({r_{t−i}}_{i=1}^p).


Page 19: Learning From the Past, Predicting the Statistics for the Future, And Learning an Evolving System (Ni Hao)


Recall our original problem:

Y = f({X_s}_{s∈[0,T]}) + Noise = f(S(X)_{0,T}) + Noise,

assuming that E[Noise | S(X)_{0,T}] = 0. Suppose that f can be well approximated by a polynomial. Then we assume that

Y ≈ A(S(X)_{0,T}) + Noise,

where A is a linear function. Given a sequence of samples {(S(X_i)_{0,T}, Y_i)}, it is natural to use linear regression to estimate the function A, and to obtain

E[Y | {X_s}_{s∈[0,T]}] ≈ A(S(X)_{0,T}).
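Putting the pieces together, the regression can be sketched end to end. Everything here is an illustrative assumption: the helpers `axis_path` and `sig_features`, the level-2 truncation of the signature, and the synthetic data; `numpy.linalg.lstsq` plays the role of the linear regression.

```python
import numpy as np

def axis_path(ts, rs):
    """Vertices of the 2-D axis path for {(t_i, r_i)}."""
    pts = [(ts[0], 0.0), (ts[0], rs[0])]
    for i in range(len(ts) - 1):
        pts += [(ts[i + 1], rs[i]), (ts[i + 1], rs[i + 1])]
    return np.array(pts, dtype=float)

def sig_features(points):
    """Flattened signature of a piecewise-linear 2-D path,
    truncated at level 2 and computed with Chen's identity."""
    s1, s2 = np.zeros(2), np.zeros((2, 2))
    for a, b in zip(points[:-1], points[1:]):
        inc = b - a
        s2 += np.outer(s1, inc) + np.outer(inc, inc) / 2.0
        s1 += inc
    return np.concatenate([[1.0], s1, s2.ravel()])  # (1, X^1, X^2)

# synthetic data: regress r_t on the signature of its p lagged values
rng = np.random.default_rng(0)
p, n = 3, 500
r = rng.normal(size=n)
X = np.array([sig_features(axis_path(np.arange(p), r[t - p:t]))
              for t in range(p, n)])
y = r[p:]
A_hat, *_ = np.linalg.lstsq(X, y, rcond=None)  # estimated linear functional A
```

In practice one would truncate the signature at a higher level and use a signature library rather than the level-2 helper above.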


Page 20: Learning From the Past, Predicting the Statistics for the Future, And Learning an Evolving System (Ni Hao)


Models                 AR(p) Model                      Our Model
Dependent variable     r_t                              r_t
Explanatory variables  r_{t−1}, ..., r_{t−p}            S({r_{t−i}}_{i=1}^p)
Output                 E[r_t | r_{t−1}, ..., r_{t−p}]   E[r_t | S({r_{t−i}}_{i=1}^p)]



Page 22: Learning From the Past, Predicting the Statistics for the Future, And Learning an Evolving System (Ni Hao)


AR models
Model Misspecification: The mixture of two AR models

Example

We generated a time series from an AR(3) model with parameters [φ_0; φ_1; φ_2; φ_3] = [0; 0.5; 0.3; 0.2]. Using the standard calibration method for AR models, the estimated parameters are [φ̂_0; φ̂_1; φ̂_2; φ̂_3] = [0.0331; 0.5052; 0.2747; 0.2222]. On the other hand, we apply our method to regress r_t on S({r_{t−i}}_{i=1}^3).


Page 23: Learning From the Past, Predicting the Statistics for the Future, And Learning an Evolving System (Ni Hao)

Figure: The difference between the true conditional mean and the estimated mean by AR calibration, for the learning set

Page 24: Learning From the Past, Predicting the Statistics for the Future, And Learning an Evolving System (Ni Hao)

Figure: The estimated mean by AR calibration vs. the true conditional mean, for the learning set

Page 25: Learning From the Past, Predicting the Statistics for the Future, And Learning an Evolving System (Ni Hao)

Figure: The difference between the true conditional mean and the estimated mean by AR calibration, for the backtesting set

Page 26: Learning From the Past, Predicting the Statistics for the Future, And Learning an Evolving System (Ni Hao)

Figure: The estimated mean by AR calibration vs. the true conditional mean, for the backtesting set

Page 27: Learning From the Past, Predicting the Statistics for the Future, And Learning an Evolving System (Ni Hao)

Figure: The difference between the true conditional mean and the estimated mean by our approach, for the learning set

Page 28: Learning From the Past, Predicting the Statistics for the Future, And Learning an Evolving System (Ni Hao)

Figure: The estimated mean by our approach vs. the true conditional mean, for the learning set

Page 29: Learning From the Past, Predicting the Statistics for the Future, And Learning an Evolving System (Ni Hao)

Figure: The difference between the true conditional mean and the estimated mean by our approach, for the backtesting set

Page 30: Learning From the Past, Predicting the Statistics for the Future, And Learning an Evolving System (Ni Hao)

Figure: The estimated mean by our approach vs. the true conditional mean, for the backtesting set


Page 32: Learning From the Past, Predicting the Statistics for the Future, And Learning an Evolving System (Ni Hao)

Figure: The difference between the true conditional mean and the estimated mean by AR calibration, for the learning set

Page 33: Learning From the Past, Predicting the Statistics for the Future, And Learning an Evolving System (Ni Hao)

Figure: The estimated mean by AR calibration vs. the true conditional mean, for the learning set

Page 34: Learning From the Past, Predicting the Statistics for the Future, And Learning an Evolving System (Ni Hao)

Figure: The difference between the true conditional mean and the estimated mean by AR calibration, for the backtesting set

Page 35: Learning From the Past, Predicting the Statistics for the Future, And Learning an Evolving System (Ni Hao)

Figure: The estimated mean by AR calibration vs. the true conditional mean, for the backtesting set

Page 36: Learning From the Past, Predicting the Statistics for the Future, And Learning an Evolving System (Ni Hao)

Figure: The difference between the true conditional mean and the estimated mean by our approach, for the learning set

Page 37: Learning From the Past, Predicting the Statistics for the Future, And Learning an Evolving System (Ni Hao)

Figure: The estimated mean by our approach vs. the true conditional mean, for the learning set

Page 38: Learning From the Past, Predicting the Statistics for the Future, And Learning an Evolving System (Ni Hao)

Figure: The difference between the true conditional mean and the estimated mean by our approach, for the backtesting set

Page 39: Learning From the Past, Predicting the Statistics for the Future, And Learning an Evolving System (Ni Hao)

Figure: The estimated mean by our approach vs. the true conditional mean, for the backtesting set

Page 40: Learning From the Past, Predicting the Statistics for the Future, And Learning an Evolving System (Ni Hao)


Conclusion
1 Our method is non-parametric and systematic.
2 There is still room to improve the performance of the regression, e.g. via weighted least squares methods.


Page 41: Learning From the Past, Predicting the Statistics for the Future, And Learning an Evolving System (Ni Hao)


References

Terry J. Lyons, Michael J. Caruana, Thierry Lévy. Differential Equations Driven by Rough Paths. Springer, 2006.

Terry Lyons, Hao Ni, Daniel Levin. Learning from the past, predicting the statistics for the future, learning an evolving system.
