
Post on 27-Dec-2015


Chapter 4

Chaos in the High-dimensional Nonlinear System

CONTENT

4.1 Chaos in the Neural Network

4.2 Symmetric Generalized Lorenz Strange Attractors

4.3 Symmetric Generalized Rössler Strange Attractors

4.1.1 The emergence and development of neural network theory

Artificial neural networks have become an important area of nonlinear scientific research.

The research started as early as 1943. The modern study of artificial neural networks began in 1962, when Rosenblatt put forward the perceptron model.

The renaissance of neural network research came in the 1980s.

Models of linked neurons with parallel and distributed processing seemed to be the only way toward a brand-new development of computing.

With the emergence of supercomputers, the study of neural network theory has made great achievements.

Neural networks have demonstrated potential capacity in a number of areas.

In 1986, chaotic phenomena were discovered in the brain.

From the 1990s, people began to combine the study of neural networks with chaos, that is, to study chaotic phenomena in the brain.

4.1.2 Chaotic neural network research

1. What is the role of chaos in our brain? Babloyantz et al. hold that chaos can enhance the resonance capacity of the brain, allowing a very extensive response to outside stimuli.

Nicolis regards chaos as a generator of self-referential logic.

Amit argues that chaos not only does not hamper the learning of new patterns; on the contrary, without chaos the network might only deepen previously stored patterns rather than learn new ones into memory.

Tsuda has shown that a chaotic neural network built from low-dimensional chaotic dynamics has a strong ability to convey information from outside effectively, especially time-varying external input.

2. As chaotic neural networks have been studied, people have tried to use chaos to achieve particular functions:

Ikeguchi et al. applied chaotic neural networks to associative memory.

Parodi et al. used pseudo-random chaotic behavior to create a noise pattern corresponding to a given input pattern, thereby encoding information.

Nara et al. used the complex dynamics of non-symmetric recurrent neural networks to search for complex patterns, showing that the complex chaotic orbits searching the activation state space are more effective than random search, and that the dynamical structure of the chaotic orbits plays an important role in the effectiveness of the search behavior.

Yoshizawa studied non-monotonic active neural networks and showed that chaos can be used to eliminate spurious memory states: when the network cannot recall a memory correctly, it exhibits chaotic behavior instead of returning a spurious memory.

3. Approaches to generating chaos in neural network systems:

Up to now, it has been found that chaos can be generated through period-doubling bifurcations and through homoclinic and heteroclinic orbits.

Besides, the Ruelle-Takens-Newhouse route is also a common way to generate chaos in a neural network.
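The period-doubling route can be made concrete with a small numerical sketch. The logistic map x -> r x(1 - x) is used here as a stand-in iterative system (an assumption for illustration; the text concerns neural networks, but the route itself is generic):

```python
# A sketch of detecting the period-doubling route to chaos.  The
# logistic map x -> r*x*(1 - x) is a stand-in iterative system
# (assumed for illustration; the route is generic).

def attractor_period(r, x0=0.5, transient=2000, max_period=64, tol=1e-9):
    """Iterate past the transient, then find the smallest period of
    the attracting orbit (None if no period <= max_period is found)."""
    x = x0
    for _ in range(transient):
        x = r * x * (1.0 - x)
    orbit = []
    for _ in range(2 * max_period):
        x = r * x * (1.0 - x)
        orbit.append(x)
    ref = orbit[-1]
    for p in range(1, max_period + 1):
        if abs(orbit[-1 - p] - ref) < tol:
            return p
    return None  # aperiodic within max_period: chaotic or a long cycle

# Periods double along the cascade before chaos sets in.
for r in (2.8, 3.2, 3.5, 3.55):
    print(r, attractor_period(r))   # periods 1, 2, 4, 8
```

As r grows the detected period doubles repeatedly, which is exactly the cascade that precedes chaos.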

4.1.3 Physiological structure of neural network

Neurons are the basic units that compose neural networks. Despite the diversity and intricate microstructure of neurons, from an information-processing point of view a neuron can be seen as a basic information-processing unit composed of a cell body, an axon, and dendrites, as shown in Fig. 4.1.

A neural network is characterized by a large number of simple processing units and their mutual connections; although each unit is simple, nonlinearity can make the collective behavior very complicated, and the network has the capacity for parallel and distributed processing.

From a dynamical point of view, the brain's neural network is a highly complex, super-high-dimensional nonlinear system formed by linking a large number of neurons. The network's main functions, feed-forward learning and feedback associative memory, can both be achieved by its dynamical subsystems (the state-value dynamical subsystem and the weight dynamical subsystem).

These excellent functions of a neural network are achieved through the evolution of the network's dynamics.

Artificial neural networks are based on the study of the physiology of the human brain, with the aim of achieving certain functions by simulating the mechanisms of the human brain.

4.1.4 Three-feedback neural network model

This section studies a three-feedback neural network model with chaotic characteristics: the criterion of the Lyapunov exponent is used to construct strange attractors of the neural network, the characteristics of the moving path are analyzed, the correlation dimension of the strange attractor is calculated, and finally applications based on this research are sought.

The input layer contains D units y_1, y_2, ..., y_D, the hidden layer N units x_1, x_2, ..., x_N, and the output layer D units z_1, z_2, ..., z_D. The transform function of the hidden layer is tanh, and each output of the network is a linear combination of the hidden layer's outputs, that is,

x_i = tanh( sum_{j=1..D} w_ij y_j ),  i = 1, 2, ..., N        (4.1)

z_j = sum_{i=1..N} s_ji x_i,  j = 1, 2, ..., D        (4.2)

In eq. (4.1), tanh is defined by

tanh(u) = 2 / (1 + e^(-2u)) - 1        (4.3)

With every iteration, each output is fed back to the corresponding input, which can be expressed as

y_j = z_j,  j = 1, 2, ..., D        (4.4)

The strange attractor of the neural network. The network can output D numerical sequences of infinite length z_j; because |x_i| <= 1, the author used x_i rather than z_j as the independent-variable time series to construct the attractor.

Parameters are selected as follows: N = 4, so that x_1 and x_2 serve as the transverse and longitudinal coordinates, respectively; x_3 expresses the height (shown in the map projection at the lower right corner); and x_4 is used to map a linear palette with 16 colors. Initially y_j = 0 (j = 1, 2, ..., D) and x_i = 0.5 (i = 1, 2, ..., N), with D = 16 and s = 0.5.
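The feedback iteration of eqs. (4.1)-(4.4) can be sketched as follows. The weights w_ij and s_ji below are random placeholders, since the chapter does not list its actual weight matrices; N, D, and the initial values follow the text (x_i = 0.5, y_j = 0):

```python
import math
import random

# A minimal sketch of the feedback network of eqs. (4.1)-(4.4):
# hidden units x_i = tanh(sum_j w_ij * y_j), outputs
# z_j = sum_i s_ji * x_i, each output fed back as the next input
# y_j = z_j.  The weights are random placeholders (an assumption;
# the chapter's actual w_ij and s_ji are not given).

def iterate_network(N=4, D=16, steps=1000, seed=1):
    rng = random.Random(seed)
    w = [[rng.uniform(-1.0, 1.0) for _ in range(D)] for _ in range(N)]
    s = [[rng.uniform(-0.5, 0.5) for _ in range(N)] for _ in range(D)]
    x = [0.5] * N                      # initial hidden states
    series = []                        # record x_1(t); |x_i| <= 1
    for _ in range(steps):
        z = [sum(s[j][i] * x[i] for i in range(N)) for j in range(D)]
        y = z                          # feedback, eq. (4.4)
        x = [math.tanh(sum(w[i][j] * y[j] for j in range(D)))
             for i in range(N)]
        series.append(x[0])
    return series

xs = iterate_network()
print(len(xs), max(abs(v) for v in xs) <= 1.0)   # 1000 True
```

Because tanh bounds the hidden states, the x_i series is the natural choice for reconstructing the attractor, as the text notes.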

Attempts were made to replace the transform function tanh of the hidden layer with other functions, such as the asymmetric logistic map

L(u) = 4u(1 - u)        (4.5)

and to construct strange attractors by the method mentioned above. Figure 4.3 shows a set of representative results.

Fig. 4.3 panels: in (a)-(d) the transformation function of the hidden layer is equation (8); in (e)-(h) it is equation (10).

4.1.5 Quantitative analysis of strange attractors of neural network

Following the method put forward by Wolf et al., the author programmed the calculation procedure in FORTRAN to compute λ1; to speed up the search for replacement points, the quicksort algorithm is used.

To verify the correctness of the procedure, the author calculated λ1 of the Hénon map:

x_{n+1} = 1 - a x_n^2 + y_n
y_{n+1} = b x_n        (4.6)
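Such a verification can be sketched numerically. The code below estimates λ1 of the Hénon map by propagating a tangent vector with the Jacobian; this tangent-map scheme is a simpler stand-in for the Wolf time-series algorithm mentioned in the text:

```python
import math

# Estimate the largest Lyapunov exponent lambda_1 of the Henon map
# x_{n+1} = 1 - a*x_n^2 + y_n, y_{n+1} = b*x_n.  A tangent vector is
# multiplied by the Jacobian [[-2*a*x, 1], [b, 0]] and renormalized
# each step; the mean log growth converges to lambda_1.  (A stand-in
# for the Wolf time-series algorithm, not a reimplementation of it.)

def henon_lambda1(a=1.4, b=0.3, n=100_000, transient=1000):
    x, y = 0.1, 0.1
    vx, vy = 1.0, 0.0                 # tangent vector
    total = 0.0
    for i in range(transient + n):
        # Apply the Jacobian at the current point, then renormalize.
        vx, vy = -2.0 * a * x * vx + vy, b * vx
        norm = math.hypot(vx, vy)
        vx, vy = vx / norm, vy / norm
        if i >= transient:
            total += math.log(norm)
        x, y = 1.0 - a * x * x + y, b * x
    return total / n

print(round(henon_lambda1(), 2))   # about 0.42 for a=1.4, b=0.3
```

A positive λ1 confirms the chaotic character of the map; the classical value for a = 1.4, b = 0.3 is about 0.419.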

Table 4.1 shows the computed λ1 of the neural network's strange attractors:

Table 4.1  λ1 of the strange attractors

Table 4.2 shows the correlation dimension D2 of the Hénon map.

Fig. 4.4  ln C(r) ~ ln r curves of the Hénon map: 1, m = 4; 2, m = 5.

The network was iterated 90,000 times to obtain the time series of the attractor's abscissa, and 2,000 data points were selected. For the attractors shown in Figure 4.3, the author obtained the ln C(r) ~ ln r curves of the strange attractors, shown in Figure 4.5; the results for D2 are given in Table 4.3.
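The ln C(r) ~ ln r curves rest on the correlation integral: C(r) is the fraction of point pairs closer than r, and D2 is the slope of ln C(r) versus ln r in the scaling region. A minimal sketch, demonstrated on points scattered along a line segment (where the slope should come out close to 1; applying it to an attractor time series works the same way):

```python
import math
import random

# Sketch of the correlation integral behind the ln C(r) ~ ln r
# curves: C(r) = fraction of distinct point pairs with distance < r;
# D2 is the slope of ln C(r) against ln r.  The test set here is a
# line segment, an assumption chosen so the answer is known (~1).

def correlation_sum(points, r):
    """C(r): fraction of distinct pairs with distance < r."""
    n = len(points)
    count = 0
    for i in range(n):
        for j in range(i + 1, n):
            if math.dist(points[i], points[j]) < r:
                count += 1
    return 2.0 * count / (n * (n - 1))

rng = random.Random(0)
pts = [(t, 2.0 * t) for t in (rng.random() for _ in range(400))]
r1, r2 = 0.05, 0.2
c1, c2 = correlation_sum(pts, r1), correlation_sum(pts, r2)
d2 = (math.log(c2) - math.log(c1)) / (math.log(r2) - math.log(r1))
print(round(d2, 2))   # close to 1 for a one-dimensional set
```

In practice the slope is fitted over several r values in the linear region of the curve rather than from just two points.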

Fig. 4.5  ln C(r) ~ ln r curves of the strange attractors of the neural network, panels (a)-(h)

Table 4.3  Correlation dimension D2 of the neural network's strange attractors

The size of the fractal dimension reflects the extent of the space occupied by the fractal structure of the neural network's strange attractors:

The larger the dimension of a strange attractor, the greater the space it occupies, the denser its structure, and the more complex the system;

Conversely, the smaller the dimension, the sparser the structure and the simpler the system.

4.1.6 Conclusion

In this section, the author studied the dynamic behavior of chaotic artificial neural networks. The results showed that, unlike a conventional neural network characterized only by gradient descent, a chaotic neural network has much richer dynamics and operates far from equilibrium. At the same time, various attractors exist: not only fixed points, limit cycles, and tori, but also strange attractors.

Because a neural network is a super-high-dimensional, strongly nonlinear dynamical system, despite the successes obtained so far, current theory and our basic understanding of its dynamic behavior are still insufficient. To develop neural network theory further and to apply it, it is necessary to study its dynamics in greater depth.

4.2 Symmetric Generalized Lorenz Strange Attractors

The forced dissipative system discussed by Lorenz is:

dx/dt = -σx + σy
dy/dt = -xz + rx - y        (4.7)
dz/dt = xy - bz

Lorenz found that when the Rayleigh number r exceeds a critical value r_c ≈ 24.74, the system behaves chaotically.
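A minimal numerical sketch of system (4.7), using the classical parameters σ = 10 and b = 8/3 (an assumption; the text quotes only the critical value r_c ≈ 24.74) and a first-order Euler scheme:

```python
# Sketch of the Lorenz system (4.7) under Euler integration.
# sigma = 10 and b = 8/3 are the classical choices (assumed here).
# For r = 28 > r_c, two trajectories started 1e-8 apart separate
# visibly, the hallmark of chaos; for r = 10 < r_c they stay together.

def lorenz_step(state, r, sigma=10.0, b=8.0 / 3.0, dt=0.005):
    x, y, z = state
    dx = -sigma * x + sigma * y
    dy = -x * z + r * x - y
    dz = x * y - b * z
    return (x + dt * dx, y + dt * dy, z + dt * dz)

def separation(r, steps=4000, eps=1e-8):
    """Distance between two nearby trajectories after `steps` steps."""
    p = (1.0, 1.0, 20.0)
    q = (1.0 + eps, 1.0, 20.0)
    for _ in range(steps):
        p, q = lorenz_step(p, r), lorenz_step(q, r)
    return sum((u - v) ** 2 for u, v in zip(p, q)) ** 0.5

print(separation(28.0) > 1e-3, separation(10.0) < 1e-4)   # True True
```

Below the critical value the nearby trajectories converge toward a fixed point; above it, the initial 1e-8 offset is amplified by many orders of magnitude.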

4.2.1 Results and analysis

Based on the Lorenz equations, the generalized form is as follows:

dx/dt = a1 x + a2 y
dy/dt = a3 xz + a4 x + a5 y        (4.8)
dz/dt = a6 xy + a7 z

The following formulas convert the rectangular coordinates into polar coordinates:

r = (x - x_min) / (x_max - x_min)
φ = (2π/N_s)[ k + f (y - y_min) / (y_max - y_min) ]        (4.9)
c = 14 (z - z_min) / (z_max - z_min)

and then convert the polar coordinates back into rectangular screen coordinates:

x = 0.5 w (1 + r sin φ)
y = 0.5 h (1 + r cos φ)        (4.10)

To construct the strange attractor A of equation (4.8), the method is to choose an appropriate time interval τ. From eq. (4.8) one obtains the nonlinear difference equations

x_{n+1} = x_n + τ(a1 x_n + a2 y_n)
y_{n+1} = y_n + τ(a3 x_n z_n + a4 x_n + a5 y_n)        (4.11)
z_{n+1} = z_n + τ(a6 x_n y_n + a7 z_n)

which, after absorbing τ into the coefficients, can be written as

x_{n+1} = (1 + a1) x_n + a2 y_n
y_{n+1} = a3 x_n z_n + a4 x_n + (1 + a5) y_n        (4.12)
z_{n+1} = a6 x_n y_n + (1 + a7) z_n

Theorem 4.1  Construct the strange attractor A of equation (4.8) and complete the coordinate transformation through (4.9) and (4.10); then

f^(k N_s)(X) = f^k(X) e^(i 2πj / N_s),  k = 1, 2, 3, ...; j = 0, 1, ..., N_s - 1

Table 4.4  Parameter selections for constructing strange attractors of the generalized Lorenz equations

Fig. 4.6  Strange attractors of the generalized Lorenz equations, panels (a1)-(a4), (b1)-(b4), and (c)-(h)

Fig. 4.7  ln C(r) ~ ln r curves of the generalized Lorenz strange attractors, panels (a)-(h)

Table 4.6  Correlation dimension D2 of the generalized Lorenz strange attractors

Figure    Embedding dimension m    Correlation dimension D2    Mean square error     Mean D2
4.6(a)    4, 5                     0.427, 0.477                1.5E-2, 1.2E-2        0.452 ± 0.025
4.6(b)    4, 5                     0.391, 0.475                2.7E-2, 2.1E-2        0.433 ± 0.042
4.6(c)    4, 5                     0.387, 0.464                3.1E-2, 3.7E-2        0.426 ± 0.039
4.6(d)    5, 6                     0.220, 0.256                1.5E-2, 2.1E-2        0.238 ± 0.018
4.6(e)    5, 6                     0.310, 0.360                3.7E-2, 3.4E-2        0.335 ± 0.025
4.6(f)    4, 5                     0.272, 0.328                2.2E-2, 1.5E-2        0.300 ± 0.028
4.6(g)    4, 5                     0.318, 0.411                2.8E-2, 2.3E-2        0.365 ± 0.047
4.6(h)    4, 5                     0.381, 0.424                1.9E-2, 2.7E-2        0.403 ± 0.022

4.2.2 Conclusion

Chaos is a complex form of motion of a dynamical system that obeys a deterministic equation (in differential or discrete form). As May observed in 1976, a simple dynamical system does not necessarily lead to simple dynamical behavior.

Because chaos is generated from repeated stretching and folding, and stretching and folding is not a one-to-one mapping (that is, it is irreversible), chaos is possible only in nonlinearity; that is, chaos can appear only in nonlinear systems.

The existence of chaos relates not only to the nonlinear characteristics of the system (the form of the nonlinear equations) but also to the parameter values in the equations; for example, the Logistic map is chaotic only for certain parameter values. Therefore the existence of chaos is often related to the bifurcations of nonlinear systems.

Because of stretching and folding, the motion of a chaotic system (such as the iterative process of a representative point) is often very sensitive to initial conditions: small differences in initial conditions are likely to cause great differences in the iterative process.
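This sensitivity is easy to demonstrate numerically. The sketch below uses the fully chaotic logistic map x -> 4x(1 - x) as a stand-in iterative system (assumed for illustration; any chaotic iteration behaves similarly):

```python
# A sketch of sensitive dependence on initial conditions, using the
# fully chaotic logistic map x -> 4x(1 - x) as a stand-in iterative
# system (an assumption for illustration).

def orbit(x0, n):
    xs = [x0]
    for _ in range(n):
        xs.append(4.0 * xs[-1] * (1.0 - xs[-1]))
    return xs

a = orbit(0.3, 100)
b = orbit(0.3 + 1e-10, 100)          # differ only in the 10th decimal
gap = max(abs(u - v) for u, v in zip(a, b))
print(gap > 0.5)   # True: the tiny difference grows to order one
```

The offset roughly doubles each iteration, so after a few dozen steps the two orbits bear no resemblance to one another.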

We can see that the above conclusions also hold for the chaotic solutions of differential equations.

4.3 Symmetric generalized Rössler strange attractors

The Rössler equation is a very simple set of nonlinear ordinary differential equations given by Rössler in 1976, when he was studying chemical reactions with intermediate products, obtained through an appropriate scaling transformation. Since then, the chaos generated by the Rössler equation has been studied in order to enrich chaos theory.

4.3.1 Results

The generalized form based on the Rössler equation is given as follows:

dx/dt = a1 y + a2 z
dy/dt = a3 x + a4 y        (4.14)
dz/dt = a5 + a6 xz + a7 z

The generalized Rössler equation is converted into a nonlinear difference equation by the first-order difference scheme.

From eq. (4.14), choosing a time interval τ, one obtains the nonlinear difference equations

x_{n+1} = x_n + τ(a1 y_n + a2 z_n)
y_{n+1} = y_n + τ(a3 x_n + a4 y_n)        (4.15)
z_{n+1} = z_n + τ(a5 + a6 x_n z_n + a7 z_n)

which, after absorbing τ into the coefficients, can be written as

x_{n+1} = x_n + a1 y_n + a2 z_n
y_{n+1} = a3 x_n + (1 + a4) y_n        (4.16)
z_{n+1} = a5 + a6 x_n z_n + (1 + a7) z_n
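A minimal sketch of such a first-order difference scheme applied to the generalized Rössler form dx/dt = a1 y + a2 z, dy/dt = a3 x + a4 y, dz/dt = a5 + a6 xz + a7 z. The coefficients below are chosen to reproduce Rössler's classical 1976 system (dx/dt = -y - z, dy/dt = x + 0.2y, dz/dt = 0.2 + xz - 5.7z); the chapter's own parameter selections are not reproduced here:

```python
# Iterate the generalized Rossler form with an Euler difference
# scheme.  The coefficient tuple below is an assumption that maps the
# classical Rossler system into the a1..a7 notation; the chapter's
# actual parameter tables are not reproduced.

def iterate_generalized_rossler(a, steps=20000, dt=0.005,
                                state=(1.0, 1.0, 1.0)):
    a1, a2, a3, a4, a5, a6, a7 = a
    x, y, z = state
    traj = []
    for _ in range(steps):
        # Euler step: all right-hand sides use the current (x, y, z).
        x, y, z = (x + dt * (a1 * y + a2 * z),
                   y + dt * (a3 * x + a4 * y),
                   z + dt * (a5 + a6 * x * z + a7 * z))
        traj.append((x, y, z))
    return traj

traj = iterate_generalized_rossler((-1.0, -1.0, 1.0, 0.2, 0.2, 1.0, -5.7))
print(max(abs(c) for p in traj for c in p) < 100.0)   # True: bounded orbit
```

With a sufficiently small τ the difference orbit stays on a bounded attractor, which is what the construction of the strange attractor relies on.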

Theorem 4.2  Construct the strange attractor A of equation (4.14) and apply the coordinate transformation above; when w : h = 1 : 1, there is

f^(k N_s)(X) = f^k(X) e^(i 2πj / N_s),  k = 1, 2, 3, ...; j = 0, 1, ..., N_s - 1

Theorem 4.2 explains that if X ∈ A, then X e^(i 2πj / N_s) ∈ A, so the strange attractor has a structure with the characteristics of rotational symmetry.

To construct the strange attractors of the generalized Rössler equations, select the control parameters a1 ~ a7, the total number of sectors N_s, and the nested factor f.

Fig. 4.8  Strange attractors of the generalized Rössler equations, panels (a1)-(a4), (b1)-(b4), and (c)-(h)

The largest Lyapunov exponent λ1 of the generalized Rössler equation's strange attractors:

Fig. 4.9  ln C(r) ~ ln r curves of the generalized Rössler strange attractors, panels (a)-(h)

Table 4.9  Correlation dimension D2 of the generalized Rössler strange attractors

The size of the fractal dimension reflects the space occupied by the generalized Rössler strange attractors, which have a fractal structure:

The greater the dimension, the greater the space the attractor occupies and the more compact its structure;

Conversely, the smaller the dimension, the sparser the structure.

4.3.2 Conclusion

In this section, we constructed generalized Rössler strange attractors with a rotationally symmetric structure and analyzed the characteristics of the attractors' structure.

As can be seen, discrete mappings and the continuous flows determined by differential equations share some common laws, which shows that there is an intrinsic relationship between discrete mappings and differential equations.
