
Research Article: Global $\mu$-Stability of Complex-Valued Neural Networks with Unbounded Time-Varying Delays

Xiaofeng Chen,1 Qiankun Song,1 Xiaohui Liu,2 and Zhenjiang Zhao3

1 Department of Mathematics, Chongqing Jiaotong University, Chongqing 400074, China
2 School of Computer Science and Technology, Nanjing University of Science and Technology, Nanjing 210094, China
3 Department of Mathematics, Huzhou Teachers College, Huzhou 313000, China

Correspondence should be addressed to Qiankun Song; [email protected]

Received 4 January 2014; Revised 23 February 2014; Accepted 25 February 2014; Published 3 April 2014

Academic Editor: Derui Ding

Copyright © 2014 Xiaofeng Chen et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

The complex-valued neural networks with unbounded time-varying delays are considered. By constructing appropriate Lyapunov-Krasovskii functionals and employing the free weighting matrix method, several delay-dependent criteria for checking the global $\mu$-stability of the addressed complex-valued neural networks are established in linear matrix inequalities (LMIs), which can be checked numerically using the effective LMI toolbox in MATLAB. Two examples with simulations are given to show the effectiveness and less conservatism of the proposed criteria.

1. Introduction

Nonlinear systems are everywhere in real life [1–4], and complex networks are one kind of them [5–7]. Neural networks are among the most important complex networks [8, 9]. In recent years, there has been increasing research interest in analyzing the dynamic behaviors of complex-valued neural networks due to their widespread applications in filtering, imaging, optoelectronics, speech synthesis, computer vision, and so on; see [10–16] and the references therein.
In these applications, the stability of complex-valued neural networks plays a very important role. In real-valued neural networks, the activation function is usually chosen to be smooth and bounded. However, in the complex domain, according to Liouville's theorem [17], every bounded entire function must be constant. Thus, if the activation function is entire and bounded in the complex domain, then it is a constant, which is not suitable. Therefore, the choice of activation functions is the main challenge for complex-valued neural networks. In [18–20], the authors considered three kinds of activation functions, respectively. In [18], the authors considered a class of continuous-time recurrent neural networks with two types of activation functions and gave several sufficient conditions for the existence, uniqueness, and global stability of the equilibrium point. In [19], a class of generalized discrete complex-valued neural networks was studied; the existence of a unique equilibrium pattern was discussed, and a sufficient condition for global exponential stability was given. In [20], the authors discussed a class of discrete-time recurrent neural networks with complex-valued linear threshold neurons; the boundedness, global attractivity, and complete stability of such networks were investigated, and some conditions for those properties were derived. In [21], discrete-time delayed neural networks with complex-valued linear threshold neurons were investigated, and several criteria on boundedness and global exponential stability were obtained. In [22–24], the authors considered different kinds of time delays in complex-valued neural networks. In [22], the boundedness and complete stability of complex-valued neural networks with time delay were studied; some conditions guaranteeing the boundedness and complete stability of the considered neural networks were derived by using local inhibition and the energy minimization method.
Hindawi Publishing Corporation, Abstract and Applied Analysis, Volume 2014, Article ID 263847, 9 pages, http://dx.doi.org/10.1155/2014/263847

In [23], the authors investigated the dynamic behaviors of a class of complex-valued neural networks with mixed time delays. Some sufficient conditions for assuring the existence,




uniqueness, and exponential stability of the equilibrium point of the system were derived using the vector Lyapunov function method, the homeomorphism mapping lemma, and matrix theory. In [24], complex-valued neural networks with both leakage time delay and discrete time delay as well as two types of activation functions on time scales were considered. Several delay-dependent criteria for checking the global stability of the addressed complex-valued neural networks were established in linear matrix inequalities by constructing appropriate Lyapunov-Krasovskii functionals and employing the free weighting matrix method. However, the obtained results were based on the assumption that the time delay is bounded. As we know, time delays occur and vary frequently and irregularly in many engineering systems; sometimes they depend heavily on the histories and may be unbounded [25–27].

How can the desirable stability be guaranteed if the time delays are unbounded? Recently, the authors of [28, 29] proposed two new concepts, power stability and $\mu$-stability, which can be applied to dynamical systems with unbounded time-varying delays. In [28], the authors considered dynamical systems with unbounded time-varying delays; two approaches were developed to derive sufficient conditions ensuring the existence and uniqueness of the equilibrium and its global stability. Moreover, under mild conditions, the authors proved that dynamical systems with unbounded time-varying delays are globally power stable. In [29], the authors discussed the $\mu$-stability of dynamical systems with unbounded time-varying delays and showed that, under mild conditions, the delayed system with very large time delays has a unique equilibrium which is globally $\mu$-stable. In [30], the authors investigated the global robust stability of uncertain stochastic neural networks with unbounded time-varying delays and norm-bounded parameter uncertainties; a new concept of global robust $\mu$-stability in the mean square for neural networks was given first, and then stability criteria were presented by means of the linear matrix inequality approach. To the best of our knowledge, however, few authors have considered the problem of $\mu$-stability of complex-valued neural networks with unbounded time-varying delays.

Motivated by the above discussions, the objective of this paper is to study the global $\mu$-stability of complex-valued neural networks with unbounded time-varying delays by employing a combination of Lyapunov-Krasovskii functionals and the free weighting matrix method. The obtained sufficient conditions do not require the existence of the partial derivatives of the activation functions and are expressed in linear matrix inequalities (LMIs), which can be checked numerically using the effective LMI toolbox in MATLAB. Two examples are given to show the effectiveness and less conservatism of the proposed criteria.

In the literature, the usual method for analyzing complex-valued neural networks is to separate them into real and imaginary parts and then to recast them into equivalent real-valued neural networks [18, 24]. However, this separation is not always expressible in an analytical form if the real and imaginary parts of the activation functions cannot be separated. In this paper, we do not separate the complex-valued neural networks and directly consider their properties on $\mathbb{C}^n$.

Notations. The notations are quite standard. Throughout this paper, let $i$ denote the imaginary unit, that is, $i = \sqrt{-1}$. $\mathbb{C}^n$, $\mathbb{R}^{m \times n}$, and $\mathbb{C}^{m \times n}$ denote, respectively, the set of $n$-dimensional complex vectors, $m \times n$ real matrices, and $m \times n$ complex matrices. The superscripts $T$ and $*$ denote matrix transposition and matrix complex conjugate transposition, respectively. For a complex vector $z \in \mathbb{C}^n$, let $|z| = (|z_1|, |z_2|, \ldots, |z_n|)^T$ be the module of the vector $z$ and $\|z\| = \sqrt{\sum_{k=1}^{n} |z_k|^2}$ the norm of the vector $z$. The notation $X \ge Y$ (resp., $X > Y$) means that $X - Y$ is positive semidefinite (resp., positive definite). $\lambda_{\max}(P)$ and $\lambda_{\min}(P)$ are defined as the largest and the smallest eigenvalues of a positive definite matrix $P$, respectively. Sometimes the arguments of a function or a matrix will be omitted in the analysis when no confusion can arise.

2. Problem Formulation and Preliminaries

In this section, we recall some definitions and lemmas which will be used in the proofs of our main results.

In this paper, we consider the following complex-valued neural network with time-varying delays:

$$\dot{u}(t) = -Cu(t) + Df(u(t)) + Ef(u(t - \tau(t))) + H \quad (1)$$

for $t > 0$, where $u(t) = (u_1(t), u_2(t), \ldots, u_n(t))^T \in \mathbb{C}^n$ is the state vector of the neural network with $n$ neurons at time $t$; $C = \operatorname{diag}\{c_1, c_2, \ldots, c_n\} \in \mathbb{R}^{n \times n}$ is the self-feedback connection weight matrix with $c_k > 0$ ($k = 1, 2, \ldots, n$); $D = (d_{kj})_{n \times n} \in \mathbb{C}^{n \times n}$ and $E = (e_{kj})_{n \times n} \in \mathbb{C}^{n \times n}$ are the connection weight matrix and the delayed connection weight matrix, respectively; $f(u(t)) = (f_1(u_1(t)), f_2(u_2(t)), \ldots, f_n(u_n(t)))^T \in \mathbb{C}^n$ denotes the neuron activation at time $t$; and $H = (h_1, h_2, \ldots, h_n)^T \in \mathbb{C}^n$ is the external input vector.

The initial condition associated with neural network (1) is given by

$$u(s) = \psi(s), \quad s \in (-\infty, 0], \quad (2)$$

where $\psi(s) = (\psi_1(s), \psi_2(s), \ldots, \psi_n(s))^T$ is continuous on $(-\infty, 0]$.

As we know, the activation functions play a very important role in the study of the global stability of neural networks. However, in the complex domain the activation functions cannot be both entire and bounded. In this paper, we consider the following two types of activation functions.

(H1) Let $u = x + iy$; $f_j(u)$ can be expressed by separating it into its real and imaginary parts as

$$f_j(u) = f_j^R(x, y) + i f_j^I(x, y) \quad (3)$$

for $j = 1, 2, \ldots, n$, where $f_j^R(\cdot, \cdot): \mathbb{R}^2 \to \mathbb{R}$ and $f_j^I(\cdot, \cdot): \mathbb{R}^2 \to \mathbb{R}$. Suppose that there exist positive constants $\xi_j^{R1}$, $\xi_j^{R2}$, $\xi_j^{I1}$, and $\xi_j^{I2}$ such that

$$\big|f_j^R(x_1, y_1) - f_j^R(x_2, y_2)\big| \le \xi_j^{R1} |x_1 - x_2| + \xi_j^{R2} |y_1 - y_2|,$$
$$\big|f_j^I(x_1, y_1) - f_j^I(x_2, y_2)\big| \le \xi_j^{I1} |x_1 - x_2| + \xi_j^{I2} |y_1 - y_2| \quad (4)$$

for any $x_1, x_2, y_1, y_2 \in \mathbb{R}$, $j = 1, 2, \ldots, n$. Moreover, we define $\Gamma = \operatorname{diag}\{\gamma_1, \gamma_2, \ldots, \gamma_n\}$, where $\gamma_j = 2[(\xi_j^R)^2 + (\xi_j^I)^2]$, $\xi_j^R = \max\{\xi_j^{R1}, \xi_j^{R2}\}$, and $\xi_j^I = \max\{\xi_j^{I1}, \xi_j^{I2}\}$.

(H2) The real and imaginary parts of $f_j(\cdot)$ cannot be separated, but $f_j(\cdot)$ is bounded and satisfies the following condition:

$$\big|f_j(z_1) - f_j(z_2)\big| \le \eta_j |z_1 - z_2| \quad (5)$$

for any $z_1, z_2 \in \mathbb{C}$, $j = 1, 2, \ldots, n$, where $\eta_j$ is a constant. Moreover, we define $\Pi = \operatorname{diag}\{\eta_1^2, \eta_2^2, \ldots, \eta_n^2\}$.
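As a concrete illustration of type (H2) (our own sketch; this function is not one of the paper's examples), the bounded map $f(z) = z/(1 + |z|)$ has real and imaginary parts that do not separate, and it satisfies condition (5) with $\eta_j = 1$, as a quick numerical check confirms:

```python
import random

# Hypothetical type-(H2) activation (not from the paper's examples):
# f(z) = z / (1 + |z|) is bounded by 1 and satisfies (5) with eta_j = 1.
def f(z):
    return z / (1 + abs(z))

random.seed(1)
for _ in range(1000):
    z1 = complex(random.uniform(-10, 10), random.uniform(-10, 10))
    z2 = complex(random.uniform(-10, 10), random.uniform(-10, 10))
    # condition (5): |f(z1) - f(z2)| <= eta * |z1 - z2| with eta = 1
    assert abs(f(z1) - f(z2)) <= abs(z1 - z2) + 1e-12
    assert abs(f(z1)) < 1          # bounded, as (H2) requires
```

With this choice, $\Pi = \operatorname{diag}\{1, \ldots, 1\}$ in the notation above.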

For the time-varying delay $\tau(t)$, we give the following assumption.

(H3) $\tau(t)$ is nonnegative and differentiable and satisfies $\dot{\tau}(t) \le \rho < 1$, where $\rho$ is a nonnegative constant.

As usual, a vector $\tilde{u} = (\tilde{u}_1, \tilde{u}_2, \ldots, \tilde{u}_n)^T$ is said to be an equilibrium point of neural network (1) if it satisfies

$$-C\tilde{u} + (D + E) f(\tilde{u}) + H = 0. \quad (6)$$

Throughout the paper, the equilibrium of neural network (1) is assumed to exist. For notational convenience, we will always shift the intended equilibrium point $\tilde{u}$ of neural network (1) to the origin by letting $z(t) = u(t) - \tilde{u}$. It is easy to transform neural network (1) into the following form:

$$\dot{z}(t) = -Cz(t) + Dg(z(t)) + Eg(z(t - \tau(t))),$$
$$z(s) = \phi(s), \quad s \in (-\infty, 0], \quad (7)$$

for $t > 0$, where $g(z(\cdot)) = f(z(\cdot) + \tilde{u}) - f(\tilde{u})$ and $\phi(s) = \psi(s) - \tilde{u}$.

Next, we introduce some definitions and lemmas to be used in the stability analysis.
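To make the shifted model (7) concrete, the following forward-Euler sketch integrates a single-neuron instance. All parameter values and the tanh-type activation are hypothetical choices of ours, not the paper's simulation examples; with a constant bounded delay and dominant self-feedback, the state decays toward the origin:

```python
import math

# Hypothetical type-(H1) activation, 1-Lipschitz in each part.
def g(z):
    return complex(math.tanh(z.real), math.tanh(z.imag))

# Hypothetical scalar parameters: |d| + |e| < c, so decay is expected.
c, d, e = 2.0, 0.5 + 0.5j, 0.3 - 0.2j
h, tau, steps = 0.01, 1.0, 4000          # step size, delay, step count
hist = [1.0 + 1.0j]                      # phi(s) = 1 + i on (-inf, 0]
lag = int(tau / h)

for k in range(steps):
    z = hist[-1]
    z_del = hist[max(0, len(hist) - 1 - lag)]   # z(t - tau(t))
    dz = -c * z + d * g(z) + e * g(z_del)       # right-hand side of (7)
    hist.append(z + h * dz)

assert abs(hist[-1]) < abs(hist[0])      # |z(t)| has decayed
```

The design choice here mirrors the paper's setting: the delayed term enters only through the activation, and the constant history plays the role of $\phi(s)$ on $(-\infty, 0]$.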

Definition 1. Suppose that $\mu(t)$ is a positive continuous function and satisfies $\mu(t) \to \infty$ as $t \to \infty$. If there exist scalars $M > 0$ and $T \ge 0$ such that

$$\|z(t)\| \le \frac{M}{\mu(t)}, \quad t \ge T, \quad (8)$$

then neural network (1) is said to be $\mu$-stable.

Definition 2. If there exist scalars $M > 0$, $\varepsilon > 0$, and $T \ge 0$ such that

$$\|z(t)\| \le \frac{M}{e^{\varepsilon t}}, \quad t \ge T, \quad (9)$$

then neural network (1) is said to be exponentially stable.

Definition 3. If there exist scalars $M > 0$, $\varepsilon > 0$, and $T \ge 0$ such that

$$\|z(t)\| \le \frac{M}{t^{\varepsilon}}, \quad t \ge T, \quad (10)$$

then neural network (1) is said to be power stable.

Remark 4. It is obvious that both $e^{\varepsilon t}$ and $t^{\varepsilon}$ approach $+\infty$ as $t$ approaches $+\infty$. Therefore, neural network (1) is $\mu$-stable if it is exponentially stable or power stable. In other words, exponential stability and power stability are two special cases of $\mu$-stability.
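For instance (a sketch with hypothetical numbers, not the paper's worked examples), the power-rate choice $\mu(t) = t^{\varepsilon}$ remains usable even under the unbounded delay $\tau(t) = \delta t$: then $\dot{\mu}(t)/\mu(t) = \varepsilon/t \le \varepsilon/T$ for $t \ge T$ and $\mu(t - \tau(t))/\mu(t) = (1-\delta)^{\varepsilon}$, exactly the two ratios bounded in condition (12) of Theorem 6 below:

```python
import math

# Hypothetical parameters: exponent eps, delay slope delta, start time T.
eps, delta, T = 0.5, 0.2, 1.0

mu = lambda t: t ** eps               # mu(t) = t^eps
dmu = lambda t: eps * t ** (eps - 1)  # mu'(t)
tau = lambda t: delta * t             # unbounded delay tau(t) = delta*t

for t in [1.0, 10.0, 1000.0]:
    a = dmu(t) / mu(t)                # = eps/t <= eps/T = alpha for t >= T
    b = mu(t - tau(t)) / mu(t)        # = (1 - delta)**eps = beta
    assert a <= eps / T + 1e-12
    assert abs(b - (1 - delta) ** eps) < 1e-12
```

This is why $\mu$-stability can handle delays proportional to $t$, where a fixed exponential rate cannot.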

Lemma 5 (see [22]). Given a Hermitian matrix $Q$, then $Q < 0$ is equivalent to

$$\begin{pmatrix} Q^R & Q^I \\ -Q^I & Q^R \end{pmatrix} < 0, \quad (11)$$

where $Q^R = \operatorname{Re}(Q)$ and $Q^I = \operatorname{Im}(Q)$.
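A small numerical sanity check of Lemma 5 (our own sketch; the $2 \times 2$ matrix below is hypothetical, not from the paper): for a Hermitian $Q = Q^R + iQ^I$, negative definiteness of $Q$ coincides with negative definiteness of the real block matrix in (11), tested here via Sylvester's criterion:

```python
def det(M):
    # Laplace-expansion determinant for a small real matrix.
    if len(M) == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j] *
               det([row[:j] + row[j + 1:] for row in M[1:]])
               for j in range(len(M)))

def neg_definite(M):
    # M < 0 iff all leading principal minors of -M are positive (Sylvester).
    N = [[-x for x in row] for row in M]
    return all(det([r[:k] for r in N[:k]]) > 0 for k in range(1, len(M) + 1))

# Hypothetical Hermitian Q: QR symmetric, QI antisymmetric.
QR = [[-3.0, 1.0], [1.0, -2.0]]
QI = [[0.0, 0.5], [-0.5, 0.0]]

# Eigenvalues of a 2x2 Hermitian [[a, b], [conj(b), c]] in closed form.
a, c, b = QR[0][0], QR[1][1], complex(QR[0][1], QI[0][1])
disc = ((a - c) ** 2 + 4 * abs(b) ** 2) ** 0.5
lam = [((a + c) + disc) / 2, ((a + c) - disc) / 2]
assert max(lam) < 0                      # Q is negative definite

# Lemma 5: the real block matrix [[QR, QI], [-QI, QR]] is also negative definite.
block = [QR[0] + QI[0],
         QR[1] + QI[1],
         [-QI[0][0], -QI[0][1]] + QR[0],
         [-QI[1][0], -QI[1][1]] + QR[1]]
assert neg_definite(block)
```

This equivalence is what lets the complex LMI (13) below be handed to a real-valued LMI solver.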

3. Main Results

In this section, we give several criteria for checking the global $\mu$-stability of the complex-valued neural network (1).

Theorem 6. Assume that assumptions (H1) and (H3) hold. Neural network (1) is globally $\mu$-stable if there exist two positive definite Hermitian matrices $P_1 = P_1^R + iP_1^I$ and $P_2 = P_2^R + iP_2^I$, two real positive diagonal matrices $R_1 > 0$ and $R_2 > 0$, two complex matrices $Q_1 = Q_1^R + iQ_1^I$ and $Q_2 = Q_2^R + iQ_2^I$, a positive differentiable function $\mu(t)$ defined on $[0, \infty)$, and three constants $\alpha \ge 0$, $\beta > 0$, and $T > 0$ such that

$$\frac{\dot{\mu}(t)}{\mu(t)} \le \alpha, \qquad \frac{\mu(t - \tau(t))}{\mu(t)} \ge \beta, \quad \forall t \in [T, \infty), \quad (12)$$

and the following LMI holds:

$$\begin{pmatrix} \Omega^R & \Omega^I \\ -\Omega^I & \Omega^R \end{pmatrix} < 0, \quad (13)$$

where

$$\Omega^R = \begin{pmatrix}
\Omega_{11}^R & \Omega_{12}^R & 0 & \Omega_{14}^R & \Omega_{15}^R \\
\Omega_{21}^R & \Omega_{22}^R & 0 & \Omega_{24}^R & \Omega_{25}^R \\
0 & 0 & \Omega_{33}^R & 0 & 0 \\
\Omega_{41}^R & \Omega_{42}^R & 0 & -R_1 & 0 \\
\Omega_{51}^R & \Omega_{52}^R & 0 & 0 & -R_2
\end{pmatrix},
\qquad
\Omega^I = \begin{pmatrix}
\Omega_{11}^I & \Omega_{12}^I & 0 & \Omega_{14}^I & \Omega_{15}^I \\
\Omega_{21}^I & \Omega_{22}^I & 0 & \Omega_{24}^I & \Omega_{25}^I \\
0 & 0 & \Omega_{33}^I & 0 & 0 \\
\Omega_{41}^I & \Omega_{42}^I & 0 & 0 & 0 \\
\Omega_{51}^I & \Omega_{52}^I & 0 & 0 & 0
\end{pmatrix}, \quad (14)$$

in which

$$\begin{aligned}
\Omega_{11}^R &= 2\alpha P_1^R + P_2^R + CQ_1^R + (Q_1^R)^T C + R_1\Gamma, \\
\Omega_{12}^R &= P_1^R + (Q_1^R)^T + CQ_2^R, \\
\Omega_{14}^R &= -(Q_1^R)^T D^R - (Q_1^I)^T D^I, \\
\Omega_{15}^R &= -(Q_1^R)^T E^R - (Q_1^I)^T E^I, \\
\Omega_{21}^R &= P_1^R + Q_1^R + (Q_2^R)^T C, \\
\Omega_{22}^R &= Q_2^R + (Q_2^R)^T, \\
\Omega_{24}^R &= -(Q_2^R)^T D^R - (Q_2^I)^T D^I, \\
\Omega_{25}^R &= -(Q_2^R)^T E^R - (Q_2^I)^T E^I, \\
\Omega_{33}^R &= -\beta^2 (1-\rho) P_2^R + R_2\Gamma, \\
\Omega_{41}^R &= -(D^R)^T Q_1^R - (D^I)^T Q_1^I, \\
\Omega_{42}^R &= -(D^R)^T Q_2^R - (D^I)^T Q_2^I, \\
\Omega_{51}^R &= -(E^R)^T Q_1^R - (E^I)^T Q_1^I, \\
\Omega_{52}^R &= -(E^R)^T Q_2^R - (E^I)^T Q_2^I, \\
\Omega_{11}^I &= 2\alpha P_1^I + P_2^I + CQ_1^I - (Q_1^I)^T C, \\
\Omega_{12}^I &= P_1^I - (Q_1^I)^T + CQ_2^I, \\
\Omega_{14}^I &= (Q_1^I)^T D^R - (Q_1^R)^T D^I, \\
\Omega_{15}^I &= (Q_1^I)^T E^R - (Q_1^R)^T E^I, \\
\Omega_{21}^I &= P_1^I + Q_1^I - (Q_2^I)^T C, \\
\Omega_{22}^I &= Q_2^I - (Q_2^I)^T, \\
\Omega_{24}^I &= (Q_2^I)^T D^R - (Q_2^R)^T D^I, \\
\Omega_{25}^I &= (Q_2^I)^T E^R - (Q_2^R)^T E^I, \\
\Omega_{33}^I &= -\beta^2 (1-\rho) P_2^I, \\
\Omega_{41}^I &= (D^I)^T Q_1^R - (D^R)^T Q_1^I, \\
\Omega_{42}^I &= (D^I)^T Q_2^R - (D^R)^T Q_2^I, \\
\Omega_{51}^I &= (E^I)^T Q_1^R - (E^R)^T Q_1^I, \\
\Omega_{52}^I &= (E^I)^T Q_2^R - (E^R)^T Q_2^I. \quad (15)
\end{aligned}$$

Proof. Consider the following Lyapunov-Krasovskii functional candidate:

$$V(t) = \mu^2(t)\, z^*(t) P_1 z(t) + \int_{t-\tau(t)}^{t} \mu^2(s)\, z^*(s) P_2 z(s)\, \mathrm{d}s. \quad (16)$$

Calculating the derivative of $V(t)$ along the trajectories of neural network (7), we obtain

$$\begin{aligned}
\dot{V}(t) = {}& 2\dot{\mu}(t)\mu(t)\, z^*(t) P_1 z(t) + \mu^2(t)\, \dot{z}^*(t) P_1 z(t) \\
& + \mu^2(t)\, z^*(t) P_1 \dot{z}(t) + \mu^2(t)\, z^*(t) P_2 z(t) \\
& - \mu^2(t-\tau(t))\, z^*(t-\tau(t)) P_2 z(t-\tau(t)) \big(1 - \dot{\tau}(t)\big) \\
\le {}& \mu^2(t) \big[ 2\alpha z^*(t) P_1 z(t) + \dot{z}^*(t) P_1 z(t) + z^*(t) P_1 \dot{z}(t) \\
& + z^*(t) P_2 z(t) - \beta^2(1-\rho)\, z^*(t-\tau(t)) P_2 z(t-\tau(t)) \big], \quad t \ge T. \quad (17)
\end{aligned}$$

In deriving inequality (17), we have made use of condition (12) and assumption (H3).

From assumption (H1), we get

$$\big|g_j^R(x_j, y_j)\big| \le \xi_j^{R1} |x_j| + \xi_j^{R2} |y_j|, \qquad \big|g_j^I(x_j, y_j)\big| \le \xi_j^{I1} |x_j| + \xi_j^{I2} |y_j| \quad (18)$$

for all $j = 1, 2, \ldots, n$, where $x_j = \operatorname{Re}(z_j)$ and $y_j = \operatorname{Im}(z_j)$. Hence

$$\big|g_j(z_j(t))\big|^2 \le \big[(\xi_j^R)^2 + (\xi_j^I)^2\big] \big(|x_j| + |y_j|\big)^2 \le 2\big[(\xi_j^R)^2 + (\xi_j^I)^2\big] \big|z_j(t)\big|^2 = \gamma_j \big|z_j(t)\big|^2, \quad (19)$$

where $\xi_j^R = \max\{\xi_j^{R1}, \xi_j^{R2}\}$, $\xi_j^I = \max\{\xi_j^{I1}, \xi_j^{I2}\}$, and $\gamma_j = 2[(\xi_j^R)^2 + (\xi_j^I)^2]$, $j = 1, 2, \ldots, n$. Let $R_1 = \operatorname{diag}(r_1, r_2, \ldots, r_n) > 0$. It follows from (19) that

$$r_j \mu^2(t)\, g_j^*(z_j(t)) g_j(z_j(t)) - r_j \mu^2(t)\, \gamma_j z_j^*(t) z_j(t) \le 0 \quad (20)$$

for $j = 1, 2, \ldots, n$. Thus

$$\mu^2(t)\, g^*(z(t)) R_1 g(z(t)) - \mu^2(t)\, z^*(t) R_1 \Gamma z(t) \le 0. \quad (21)$$

Similarly, we can get

$$\mu^2(t)\, g^*(z(t-\tau(t))) R_2 g(z(t-\tau(t))) - \mu^2(t)\, z^*(t-\tau(t)) R_2 \Gamma z(t-\tau(t)) \le 0. \quad (22)$$
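The bound (19) can be spot-checked numerically for a concrete activation (a hypothetical tanh-type choice of ours, not from the paper's examples): with $g_j(z) = \tanh(\operatorname{Re} z) + i\tanh(\operatorname{Im} z)$ one may take $\xi_j^R = \xi_j^I = 1$, so $\gamma_j = 2(1^2 + 1^2) = 4$:

```python
import math
import random

# Hypothetical type-(H1) activation with xi^R = xi^I = 1, hence gamma = 4.
def g(z):
    return complex(math.tanh(z.real), math.tanh(z.imag))

gamma = 4.0
random.seed(0)
for _ in range(1000):
    z = complex(random.uniform(-5, 5), random.uniform(-5, 5))
    # inequality (19): |g_j(z)|^2 <= gamma_j * |z|^2
    assert abs(g(z)) ** 2 <= gamma * abs(z) ** 2 + 1e-12
```

For this activation the bound is conservative (here $|g(z)| \le |z|$ already holds), which illustrates where the factor 2 in $\gamma_j$ may introduce slack.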

From (7), we have

$$\begin{aligned}
0 = {}& \mu^2(t) \big[\dot{z}(t) + Cz(t) - Dg(z(t)) - Eg(z(t-\tau(t)))\big]^* \big[Q_1 z(t) + Q_2 \dot{z}(t)\big] \\
& + \mu^2(t) \big[Q_1 z(t) + Q_2 \dot{z}(t)\big]^* \big[\dot{z}(t) + Cz(t) - Dg(z(t)) - Eg(z(t-\tau(t)))\big]. \quad (23)
\end{aligned}$$

It follows from (17), (21), (22), and (23) that

$$\dot{V}(t) \le \mu^2(t)\, w^*(t)\, \Omega\, w(t), \quad t \ge T, \quad (24)$$


where $w(t) = \big(z^*(t), \dot{z}^*(t), z^*(t-\tau(t)), g^*(z(t)), g^*(z(t-\tau(t)))\big)^*$ and

$$\Omega = \begin{pmatrix}
\Omega_{11} & \Omega_{12} & 0 & -Q_1^* D & -Q_1^* E \\
\Omega_{21} & Q_2^* + Q_2 & 0 & -Q_2^* D & -Q_2^* E \\
0 & 0 & \Omega_{33} & 0 & 0 \\
-D^* Q_1 & -D^* Q_2 & 0 & -R_1 & 0 \\
-E^* Q_1 & -E^* Q_2 & 0 & 0 & -R_2
\end{pmatrix} \quad (25)$$

with $\Omega_{11} = 2\alpha P_1 + P_2 + CQ_1 + Q_1^* C + R_1\Gamma$, $\Omega_{12} = P_1^* + Q_1^* + CQ_2$, $\Omega_{21} = P_1 + Q_1 + Q_2^* C$, and $\Omega_{33} = -\beta^2(1-\rho)P_2 + R_2\Gamma$. From (14), we have $\Omega = \Omega^R + i\Omega^I$. It follows from (13) and Lemma 5 that $\Omega < 0$. Thus we get from (24) that

$$\dot{V}(t) \le 0, \quad t \ge T, \quad (26)$$

which means that $V(t)$ is monotonically nonincreasing for $t \ge T$. So we have

$$V(t) \le V(T), \quad t \ge T. \quad (27)$$

From the definition of $V(t)$ in (16), we obtain

$$\mu^2(t)\, \lambda_{\min}(P_1) \|z(t)\|^2 \le V(t) \le V_0 < \infty, \quad t \ge T, \quad (28)$$

where $V_0 = \max_{0 \le s \le T} V(s)$. This implies that

$$\|z(t)\| \le \frac{M}{\mu(t)}, \quad t \ge T, \quad (29)$$

where $M = \sqrt{V_0 / \lambda_{\min}(P_1)}$. The proof is completed.

Theorem 7. Assume that assumptions (H2) and (H3) hold. Neural network (1) is globally $\mu$-stable if there exist two positive definite Hermitian matrices $P_1 = P_1^R + iP_1^I$ and $P_2 = P_2^R + iP_2^I$, two real positive diagonal matrices $R_1 > 0$ and $R_2 > 0$, two complex matrices $Q_1 = Q_1^R + iQ_1^I$ and $Q_2 = Q_2^R + iQ_2^I$, a positive differentiable function $\mu(t)$ defined on $[0, \infty)$, and three constants $\alpha \ge 0$, $\beta > 0$, and $T > 0$ such that conditions (12), (13), and (14) in Theorem 6 are satisfied, where $\Omega_{11}^R = 2\alpha P_1^R + P_2^R + CQ_1^R + (Q_1^R)^T C + R_1\Pi$ and $\Omega_{33}^R = -\beta^2(1-\rho)P_2^R + R_2\Pi$.

Proof. From assumption (H2), we get

$$\big|g_j(z_j(t))\big| \le \eta_j \big|z_j(t)\big| \quad (30)$$

for $j = 1, 2, \ldots, n$. Let $R_1 = \operatorname{diag}(r_1, r_2, \ldots, r_n) > 0$. It follows from (30) that

$$r_j \mu^2(t)\, g_j^*(z_j(t)) g_j(z_j(t)) - r_j \mu^2(t)\, \eta_j^2 z_j^*(t) z_j(t) \le 0 \quad (31)$$

for $j = 1, 2, \ldots, n$. Hence

$$\mu^2(t)\, g^*(z(t)) R_1 g(z(t)) - \mu^2(t)\, z^*(t) R_1 \Pi z(t) \le 0. \quad (32)$$

Similarly, we can get

$$\mu^2(t)\, g^*(z(t-\tau(t))) R_2 g(z(t-\tau(t))) - \mu^2(t)\, z^*(t-\tau(t)) R_2 \Pi z(t-\tau(t)) \le 0. \quad (33)$$

Consider the Lyapunov-Krasovskii functional (16). From (17), (23), (32), and (33), we can get

$$\dot{V}(t) \le \mu^2(t)\, w^*(t)\, \Omega\, w(t), \quad t \ge T, \quad (34)$$

where $w(t) = \big(z^*(t), \dot{z}^*(t), z^*(t-\tau(t)), g^*(z(t)), g^*(z(t-\tau(t)))\big)^*$ and

$$\Omega = \begin{pmatrix}
\Omega_{11} & \Omega_{12} & 0 & -Q_1^* D & -Q_1^* E \\
\Omega_{21} & Q_2^* + Q_2 & 0 & -Q_2^* D & -Q_2^* E \\
0 & 0 & \Omega_{33} & 0 & 0 \\
-D^* Q_1 & -D^* Q_2 & 0 & -R_1 & 0 \\
-E^* Q_1 & -E^* Q_2 & 0 & 0 & -R_2
\end{pmatrix} \quad (35)$$

with $\Omega_{11} = 2\alpha P_1 + P_2 + CQ_1 + Q_1^* C + R_1\Pi$, $\Omega_{12} = P_1^* + Q_1^* + CQ_2$, $\Omega_{21} = P_1 + Q_1 + Q_2^* C$, and $\Omega_{33} = -\beta^2(1-\rho)P_2 + R_2\Pi$. From the proof of Theorem 6, we know that neural network (1) is $\mu$-stable. The proof is completed.

Corollary 8. Assume that assumptions (H1) and (H3) hold and $\tau(t) \le \tau$. Neural network (1) is globally exponentially stable if there exist two positive definite Hermitian matrices $P_1 = P_1^R + iP_1^I$ and $P_2 = P_2^R + iP_2^I$, two real positive diagonal matrices $R_1 > 0$ and $R_2 > 0$, two complex matrices $Q_1 = Q_1^R + iQ_1^I$ and $Q_2 = Q_2^R + iQ_2^I$, and a constant $\varepsilon > 0$ such that conditions (13) and (14) in Theorem 6 are satisfied, where $\Omega_{11}^R = 2\varepsilon P_1^R + P_2^R + CQ_1^R + (Q_1^R)^T C + R_1\Gamma$, $\Omega_{11}^I = 2\varepsilon P_1^I + P_2^I + CQ_1^I - (Q_1^I)^T C$, $\Omega_{33}^R = -e^{-2\varepsilon\tau}(1-\rho)P_2^R + R_2\Gamma$, and $\Omega_{33}^I = -e^{-2\varepsilon\tau}(1-\rho)P_2^I$.

Proof. Let $\mu(t)=e^{\varepsilon t}$ ($t\ge 0$); then $\dot\mu(t)/\mu(t)=\varepsilon$ and $\mu(t-\tau(t))/\mu(t)=e^{-\varepsilon\tau(t)}\ge e^{-\varepsilon\tau}$. Take $\alpha=\varepsilon$, $\beta=e^{-\varepsilon\tau}$, and $T=0$. It is obvious that condition (12) in Theorem 6 is satisfied. The proof is completed.
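The two bounds used in this proof can be cross-checked pointwise. A minimal sketch (the values $\varepsilon=0.2$ and $\tau=2$ follow Example 1 below; the particular delay profile is our own choice):

```python
import math

eps, tau = 0.2, 2.0                       # sample values, as in Example 1
alpha, beta = eps, math.exp(-eps * tau)   # the choices made in the proof

ok = True
for k in range(1, 401):
    t = 0.1 * k
    tau_t = tau * abs(math.sin(t))        # any delay with 0 <= tau(t) <= tau
    # mu'(t)/mu(t) = eps <= alpha, and mu(t - tau(t))/mu(t) = exp(-eps*tau(t)) >= beta
    ok = ok and eps <= alpha and math.exp(-eps * tau_t) >= beta - 1e-12
print(ok)
```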

Corollary 9. Assume that assumptions (H1) and (H3) hold and $\tau(t)\le\delta t$ ($0<\delta<1$). Neural network (1) is globally power stable if there exist two positive definite Hermitian matrices $P_1=P_1^R+iP_1^I$ and $P_2=P_2^R+iP_2^I$, two real positive diagonal matrices $R_1>0$ and $R_2>0$, two complex matrices $Q_1=Q_1^R+iQ_1^I$, $Q_2=Q_2^R+iQ_2^I$, and two constants $\varepsilon>0$ and $T>0$ such that conditions (13) and (14) in Theorem 6 are satisfied, where $\Omega_{11}^R=(2\varepsilon/T)P_1^R+P_2^R+CQ_1^R+(Q_1^R)^TC+R_1\Gamma$, $\Omega_{11}^I=(2\varepsilon/T)P_1^I+P_2^I+CQ_1^I-(Q_1^I)^TC$, $\Omega_{33}^R=-(1-\delta)^{2\varepsilon}(1-\rho)P_2^R+R_2\Gamma$, and $\Omega_{33}^I=-(1-\delta)^{2\varepsilon}(1-\rho)P_2^I$.

Proof. Let $\mu(t)=t^{\varepsilon}$ ($t\ge T$); then $\dot\mu(t)/\mu(t)=\varepsilon/t\le\varepsilon/T$ and $\mu(t-\tau(t))/\mu(t)=(1-\tau(t)/t)^{\varepsilon}\ge(1-\delta)^{\varepsilon}$. Take $\alpha=\varepsilon/T$ and $\beta=(1-\delta)^{\varepsilon}$. It is obvious that condition (12) in Theorem 6 is satisfied. The proof is completed.
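The same pointwise check works for the power-rate function $\mu(t)=t^{\varepsilon}$; a sketch using $\varepsilon=1$, $\delta=0.2$, and $T=10$, the values that appear in Example 2 below:

```python
eps, delta, T = 1.0, 0.2, 10.0
alpha, beta = eps / T, (1.0 - delta) ** eps   # the choices made in the proof

ok = True
for k in range(2000):
    t = T + 0.5 * k
    tau_t = delta * t                         # unbounded delay tau(t) = delta*t
    # mu'(t)/mu(t) = eps/t <= eps/T = alpha for all t >= T
    ok = ok and eps / t <= alpha + 1e-15
    # mu(t - tau(t))/mu(t) = (1 - delta)**eps = beta (exactly, for this delay)
    ok = ok and (t - tau_t) ** eps / t ** eps >= beta - 1e-12
print(ok)
```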

Similar to the proofs of Corollaries 8 and 9, we have the following results.

Corollary 10. Assume that assumptions (H2) and (H3) hold and $\tau(t)\le\tau$. Neural network (1) is globally exponentially stable if there exist two positive definite Hermitian matrices $P_1=P_1^R+iP_1^I$ and $P_2=P_2^R+iP_2^I$, two real positive diagonal

6 Abstract and Applied Analysis

matrices $R_1>0$ and $R_2>0$, two complex matrices $Q_1=Q_1^R+iQ_1^I$, $Q_2=Q_2^R+iQ_2^I$, and a constant $\varepsilon>0$ such that conditions (13) and (14) in Theorem 7 are satisfied, where $\Omega_{11}^R=2\varepsilon P_1^R+P_2^R+CQ_1^R+(Q_1^R)^TC+R_1\Pi$, $\Omega_{11}^I=2\varepsilon P_1^I+P_2^I+CQ_1^I-(Q_1^I)^TC$, $\Omega_{33}^R=-e^{-2\varepsilon\tau}(1-\rho)P_2^R+R_2\Pi$, and $\Omega_{33}^I=-e^{-2\varepsilon\tau}(1-\rho)P_2^I$.

Corollary 11. Assume that assumptions (H2) and (H3) hold and $\tau(t)\le\delta t$ ($0<\delta<1$). Neural network (1) is globally power stable if there exist two positive definite Hermitian matrices $P_1=P_1^R+iP_1^I$ and $P_2=P_2^R+iP_2^I$, two real positive diagonal matrices $R_1>0$ and $R_2>0$, two complex matrices $Q_1=Q_1^R+iQ_1^I$, $Q_2=Q_2^R+iQ_2^I$, and two constants $\varepsilon>0$ and $T>0$ such that conditions (13) and (14) in Theorem 7 are satisfied, where $\Omega_{11}^R=(2\varepsilon/T)P_1^R+P_2^R+CQ_1^R+(Q_1^R)^TC+R_1\Pi$, $\Omega_{11}^I=(2\varepsilon/T)P_1^I+P_2^I+CQ_1^I-(Q_1^I)^TC$, $\Omega_{33}^R=-(1-\delta)^{2\varepsilon}(1-\rho)P_2^R+R_2\Pi$, and $\Omega_{33}^I=-(1-\delta)^{2\varepsilon}(1-\rho)P_2^I$.

Remark 12. In [18, 23], the authors studied the global stability of complex-valued neural networks with time-varying delays, and the activation function was supposed to be
$$f_j(z)=f_j^R(x,y)+if_j^I(x,y), \quad (36)$$
where $z=x+iy$ and $f_j^R(\cdot,\cdot)$, $f_j^I(\cdot,\cdot)$ are real-valued for all $j=1,2,\ldots,n$, and the partial derivatives of $f_j^R(\cdot,\cdot)$ and $f_j^I(\cdot,\cdot)$ with respect to $x$ and $y$ were required to exist and be continuous. In this paper, both the real parts and the imaginary parts of the activation functions are no longer assumed to be differentiable.

Remark 13. In [21], the authors investigated boundedness and stability for discrete-time complex-valued neural networks with constant delay, and the activation function was supposed to be
$$f_j(z)=\max\{0,\operatorname{Re}(z)\}+i\max\{0,\operatorname{Im}(z)\}. \quad (37)$$
Also, the authors gave a complex-valued LMI criterion for checking the stability of the considered neural networks, but a feasible way to solve the given complex-valued LMI was not provided. In this paper, the criteria for checking the stability of complex-valued neural networks are established as real LMIs, which can be checked numerically using the effective LMI toolbox in MATLAB.

4. Examples

Example 1. Consider a two-neuron complex-valued neural network with constant delay:

$$\dot u(t)=-Cu(t)+Df(u(t))+Ef(u(t-\tau))+H, \quad (38)$$

where
$$C=\begin{pmatrix}1&0\\0&1\end{pmatrix},\quad D=\begin{pmatrix}1+i&-2+i\\1-i&-1\end{pmatrix},$$
$$E=\begin{pmatrix}1+i&i\\-1+i&2\end{pmatrix},\quad H=\begin{pmatrix}2-i\\-2+i\end{pmatrix},\quad \tau=2,$$
$$f_j^R(u_j)=\frac{1}{32}\bigl(|\operatorname{Re}(u_j)+1|+|\operatorname{Re}(u_j)-1|\bigr),\quad j=1,2,$$
$$f_j^I(u_j)=\frac{1}{32}\bigl(|\operatorname{Im}(u_j)+1|+|\operatorname{Im}(u_j)-1|\bigr),\quad j=1,2. \quad (39)$$

It can be verified that the activation functions $f_j$ satisfy assumption (H1), the time-varying delay $\tau(t)$ satisfies assumption (H3), and $D^R=\begin{pmatrix}1&-2\\1&-1\end{pmatrix}$, $D^I=\begin{pmatrix}1&1\\-1&0\end{pmatrix}$, $E^R=\begin{pmatrix}1&0\\-1&2\end{pmatrix}$, $E^I=\begin{pmatrix}1&1\\1&0\end{pmatrix}$, $\Gamma=\begin{pmatrix}1/64&0\\0&1/64\end{pmatrix}$, and $\rho=0$. There exists a constant $\varepsilon=0.2$, and by employing the MATLAB LMI toolbox, we can find the solutions to the LMIs under conditions (13) and (14) in Corollary 8 as follows:

$$P_1=\begin{pmatrix}41.2118&-6.8812-4.2410i\\-6.8812+4.2410i&40.6762\end{pmatrix},$$
$$P_2=\begin{pmatrix}26.0370&-4.1385-0.7600i\\-4.1385+0.7600i&25.9447\end{pmatrix},$$
$$R_1=\begin{pmatrix}785.9319&0\\0&857.0773\end{pmatrix},\qquad R_2=\begin{pmatrix}560.5160&0\\0&617.1000\end{pmatrix},$$
$$Q_1=\begin{pmatrix}-61.6337+1.1580i&9.2494+0.6392i\\9.1464+0.1434i&-61.4512-1.3717i\end{pmatrix},$$
$$Q_2=\begin{pmatrix}-63.1096-0.5628i&10.2662+2.7895i\\10.1047-3.3169i&-62.0592+0.6291i\end{pmatrix}. \quad (40)$$
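A solver-independent way to double-check such a solution is to confirm that the returned matrices have the required structure, for instance that $P_1$ is Hermitian and positive definite. A sketch (the decimal placement of the entries above is our reconstruction of the garbled source):

```python
import numpy as np

P1 = np.array([[41.2118 + 0.0j, -6.8812 - 4.2410j],
               [-6.8812 + 4.2410j, 40.6762 + 0.0j]])

is_hermitian = bool(np.allclose(P1, P1.conj().T))
eigs = np.linalg.eigvalsh(P1)             # real eigenvalues of a Hermitian matrix
is_positive_definite = bool(np.all(eigs > 0))
print(is_hermitian, is_positive_definite)
```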

Then all conditions in Corollary 8 are satisfied. Therefore, the neural network (38) is globally exponentially stable. Figures 1 and 2 depict the real and imaginary parts of the states of the considered neural network (38), where the initial condition is $z_1(t)=0.5$, $z_2(t)=-1-0.5i$.
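Independently of the LMI test, the claimed convergence can be observed by integrating (38) directly. The following forward-Euler sketch (step size, horizon, and the decimal readings of the initial state are our assumptions) drives the quoted constant history to the equilibrium:

```python
import numpy as np

# Parameters of network (38) as given above.
C = np.eye(2, dtype=complex)
D = np.array([[1 + 1j, -2 + 1j], [1 - 1j, -1 + 0j]])
E = np.array([[1 + 1j, 0 + 1j], [-1 + 1j, 2 + 0j]])
H = np.array([2 - 1j, -2 + 1j])

def f(u):
    # Activation (39): acts on the real and imaginary parts separately.
    x, y = u.real, u.imag
    return (np.abs(x + 1) + np.abs(x - 1)) / 32 + 1j * (np.abs(y + 1) + np.abs(y - 1)) / 32

dt, tau, t_end = 0.01, 2.0, 40.0
d_steps = int(round(tau / dt))
hist = [np.array([0.5 + 0j, -1 - 0.5j])] * (d_steps + 1)  # constant history on [-tau, 0]

for _ in range(int(round(t_end / dt))):
    u, u_del = hist[-1], hist[-1 - d_steps]
    hist.append(u + dt * (-C @ u + D @ f(u) + E @ f(u_del) + H))

# At an equilibrium the right-hand side of (38) vanishes.
residual = float(np.linalg.norm(-C @ hist[-1] + D @ f(hist[-1]) + E @ f(hist[-1 - d_steps]) + H))
print(residual)
```

By $t=40$ the residual of the right-hand side is numerically negligible, consistent with the exponential stability asserted by Corollary 8.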

Example 2. Consider a two-neuron complex-valued neural network with unbounded time-varying delays:
$$\dot u(t)=-Cu(t)+Df(u(t))+Ef(u(t-\tau(t)))+H, \quad (41)$$

where
$$C=\begin{pmatrix}1&0\\0&1\end{pmatrix},\quad D=\begin{pmatrix}1.5+3i&-1+2i\\-2-2i&2.5+0.5i\end{pmatrix},$$
$$E=\begin{pmatrix}0.5+i&i\\-2+0.5i&1+0.5i\end{pmatrix},\quad H=\begin{pmatrix}2+i\\-2+i\end{pmatrix},\quad \tau(t)=0.2t,$$
$$f_1(u_1)=\frac{1}{20}\bigl(|u_1+1|-|u_1-1|\bigr),$$
$$f_2(u_2)=\frac{1}{20}\bigl(|u_2+1|-|u_2-1|\bigr). \quad (42)$$

It can be verified that the activation functions $f_j$ satisfy assumption (H2), the time-varying delay $\tau(t)$ satisfies assumption (H3), and $D^R=\begin{pmatrix}1.5&-1\\-2&2.5\end{pmatrix}$, $D^I=\begin{pmatrix}3&2\\-2&0.5\end{pmatrix}$, $E^R=\begin{pmatrix}0.5&0\\-2&1\end{pmatrix}$,


Figure 1: State trajectories ($\operatorname{Re}(z_1(t))$ and $\operatorname{Re}(z_2(t))$) of neural network (38).

Figure 2: State trajectories ($\operatorname{Im}(z_1(t))$ and $\operatorname{Im}(z_2(t))$) of neural network (38).

$E^I=\begin{pmatrix}1&1\\0.5&0.5\end{pmatrix}$, $\Pi=\begin{pmatrix}0.01&0\\0&0.01\end{pmatrix}$, and $\rho=0.2$. There exist two constants $\varepsilon=1$ and $T=10$, and by employing the MATLAB LMI toolbox, we can find the solutions to the LMIs under conditions (13) and (14) in Corollary 11 as follows:

$$P_1=\begin{pmatrix}39.8524&14.7299-3.2336i\\14.7299+3.2336i&31.5075\end{pmatrix},$$
$$P_2=\begin{pmatrix}23.9471&9.7114-1.3426i\\9.7114+1.3426i&18.8527\end{pmatrix},$$
$$R_1=10^3\times\begin{pmatrix}1.3534&0\\0&1.0135\end{pmatrix},\qquad R_2=\begin{pmatrix}584.6278&0\\0&429.3048\end{pmatrix},$$
$$Q_1=\begin{pmatrix}-51.8770-1.5115i&-16.7003+0.6185i\\-16.8517-0.9537i&-40.7204+1.0687i\end{pmatrix},$$
$$Q_2=\begin{pmatrix}-60.4708+0.2332i&-21.2500+3.6232i\\-21.6247-3.5095i&-48.2459-0.1731i\end{pmatrix}. \quad (43)$$

Figure 3: State trajectories ($\operatorname{Re}(z_1(t))$ and $\operatorname{Re}(z_2(t))$) of neural network (41).

Figure 4: State trajectories ($\operatorname{Im}(z_1(t))$ and $\operatorname{Im}(z_2(t))$) of neural network (41).

Then all conditions in Corollary 11 are satisfied. Thus, the neural network (41) is globally power stable. Figures 3 and 4 depict the real and imaginary parts of the states of the considered network (41), where the initial condition is $x_1(t)=0.5\sin(0.5t)i$, $x_2(t)=-0.5+1.5\cos(0.3t)i$.


5. Conclusion

In this paper, the global $\mu$-stability of complex-valued neural networks with unbounded time delays has been investigated by employing a combination of Lyapunov-Krasovskii functionals and the free weighting matrix method. Several sufficient conditions for checking the global $\mu$-stability of the considered complex-valued neural networks have been given in linear matrix inequalities (LMIs), which can be checked numerically using the effective LMI toolbox in MATLAB. As direct applications, we can obtain exponential stability and power stability with respect to different time-varying delays. Two examples have been provided to demonstrate the effectiveness and less conservatism of the proposed criteria.

Remark 14. Note that the activation functions in this paper are continuous. Recently, neural networks with discontinuous activations have received more and more attention [31–34]. Based on the results of this paper, we will consider the $\mu$-stability of complex-valued neural networks with discontinuous activations in the future.

Conflict of Interests

The authors declare that they have no conflict of interests regarding the publication of this paper.

Acknowledgments

This work was supported by the National Natural Science Foundation of China under Grants 61273021 and 11172247 and in part by the Natural Science Foundation Project of CQ cstc2013jjB40008.

References

[1] H. Dong, Z. Wang, and H. Gao, "Distributed H∞ filtering for a class of Markovian jump nonlinear time-delay systems over lossy sensor networks," IEEE Transactions on Industrial Electronics, vol. 60, no. 10, pp. 4665–4672, 2013.

[2] J. Hu, Z. Wang, B. Shen, and H. Gao, "Quantised recursive filtering for a class of nonlinear systems with multiplicative noises and missing measurements," International Journal of Control, vol. 86, no. 4, pp. 650–663, 2013.

[3] Y. Fang, K. Yan, and K. Li, "Impulsive synchronization of a class of chaotic systems," Systems Science and Control Engineering, vol. 2, no. 1, pp. 55–60, 2014.

[4] M. Darouach and M. Chadli, "Admissibility and control of switched discrete-time singular systems," Systems Science and Control Engineering, vol. 1, no. 1, pp. 43–51, 2013.

[5] B. Shen, Z. Wang, D. Ding, and H. Shu, "H∞ state estimation for complex networks with uncertain inner coupling and incomplete measurements," IEEE Transactions on Neural Networks and Learning Systems, vol. 24, no. 12, pp. 2027–2037, 2013.

[6] B. Shen, Z. Wang, and X. Liu, "Sampled-data synchronization control of dynamical networks with stochastic sampling," IEEE Transactions on Automatic Control, vol. 57, no. 10, pp. 2644–2650, 2012.

[7] J. Liang, Z. Wang, Y. Liu, and X. Liu, "State estimation for two-dimensional complex networks with randomly occurring nonlinearities and randomly varying sensor delays," International Journal of Robust and Nonlinear Control, vol. 24, no. 1, pp. 18–38, 2014.

[8] X. Kan, Z. Wang, and H. Shu, "State estimation for discrete-time delayed neural networks with fractional uncertainties and sensor saturations," Neurocomputing, vol. 117, pp. 64–71, 2013.

[9] Z. Wang, B. Zineddin, J. Liang, et al., "A novel neural network approach to cDNA microarray image segmentation," Computer Methods and Programs in Biomedicine, vol. 111, no. 1, pp. 189–198, 2013.

[10] A. Hirose, Complex-Valued Neural Networks, Springer, Berlin, Germany, 2012.

[11] G. Tanaka and K. Aihara, "Complex-valued multistate associative memory with nonlinear multilevel functions for gray-level image reconstruction," IEEE Transactions on Neural Networks, vol. 20, no. 9, pp. 1463–1473, 2009.

[12] S. Chen, L. Hanzo, and S. Tan, "Symmetric complex-valued RBF receiver for multiple-antenna-aided wireless systems," IEEE Transactions on Neural Networks, vol. 19, no. 9, pp. 1659–1665, 2008.

[13] I. Cha and S. A. Kassam, "Channel equalization using adaptive complex radial basis function networks," IEEE Journal on Selected Areas in Communications, vol. 13, no. 1, pp. 122–131, 1995.

[14] T. Nitta, "Orthogonality of decision boundaries in complex-valued neural networks," Neural Computation, vol. 16, no. 1, pp. 73–97, 2004.

[15] M. Faijul Amin and K. Murase, "Single-layered complex-valued neural network for real-valued classification problems," Neurocomputing, vol. 72, no. 4–6, pp. 945–955, 2009.

[16] B. K. Tripathi and P. K. Kalra, "On efficient learning machine with root-power mean neuron in complex domain," IEEE Transactions on Neural Networks, vol. 22, no. 5, pp. 727–738, 2011.

[17] J. H. Mathews and R. W. Howell, Complex Analysis for Mathematics and Engineering, Jones and Bartlett, 3rd edition, 1997.

[18] J. Hu and J. Wang, "Global stability of complex-valued recurrent neural networks with time-delays," IEEE Transactions on Neural Networks, vol. 23, no. 6, pp. 853–865, 2012.

[19] V. S. H. Rao and G. R. Murthy, "Global dynamics of a class of complex valued neural networks," International Journal of Neural Systems, vol. 18, no. 2, pp. 165–171, 2008.

[20] W. Zhou and J. M. Zurada, "Discrete-time recurrent neural networks with complex-valued linear threshold neurons," IEEE Transactions on Circuits and Systems II: Express Briefs, vol. 56, no. 8, pp. 669–673, 2009.

[21] C. Duan and Q. Song, "Boundedness and stability for discrete-time delayed neural network with complex-valued linear threshold neurons," Discrete Dynamics in Nature and Society, vol. 2010, Article ID 368379, 19 pages, 2010.

[22] B. Zou and Q. Song, "Boundedness and complete stability of complex-valued neural networks with time delay," IEEE Transactions on Neural Networks and Learning Systems, vol. 24, no. 8, pp. 1227–1238, 2013.

[23] X. Xu, J. Zhang, and J. Shi, "Exponential stability of complex-valued neural networks with mixed delays," Neurocomputing, vol. 128, pp. 483–490, 2014.


[24] X. Chen and Q. Song, "Global stability of complex-valued neural networks with both leakage time delay and discrete time delay on time scales," Neurocomputing, vol. 121, pp. 254–264, 2013.

[25] S.-I. Niculescu, Delay Effects on Stability: A Robust Control Approach, Springer, London, UK, 2001.

[26] V. B. Kolmanovskii and V. R. Nosov, Stability of Functional-Differential Equations, Academic Press, London, UK, 1986.

[27] Y. Liu, Z. Wang, J. Liang, and X. Liu, "Synchronization of coupled neutral-type neural networks with jumping-mode-dependent discrete and unbounded distributed delays," IEEE Transactions on Cybernetics, vol. 43, no. 1, pp. 102–114, 2013.

[28] T. Chen and L. Wang, "Power-rate global stability of dynamical systems with unbounded time-varying delays," IEEE Transactions on Circuits and Systems II: Express Briefs, vol. 54, no. 8, pp. 705–709, 2007.

[29] T. Chen and L. Wang, "Global μ-stability of delayed neural networks with unbounded time-varying delays," IEEE Transactions on Neural Networks, vol. 18, no. 6, pp. 1836–1840, 2007.

[30] X. Liu and T. Chen, "Robust μ-stability for uncertain stochastic neural networks with unbounded time-varying delays," Physica A: Statistical Mechanics and Its Applications, vol. 387, no. 12, pp. 2952–2962, 2008.

[31] X. Liu and J. Cao, "On periodic solutions of neural networks via differential inclusions," Neural Networks, vol. 22, no. 4, pp. 329–334, 2009.

[32] X. Liu and J. Cao, "Robust state estimation for neural networks with discontinuous activations," IEEE Transactions on Systems, Man, and Cybernetics B: Cybernetics, vol. 40, no. 6, pp. 1425–1437, 2010.

[33] X. Liu, T. Chen, J. Cao, and W. Lu, "Dissipativity and quasi-synchronization for neural networks with discontinuous activations and parameter mismatches," Neural Networks, vol. 24, no. 10, pp. 1013–1021, 2011.

[34] X. Liu, J. Cao, and W. Yu, "Filippov systems and quasi-synchronization control for switched networks," Chaos, vol. 22, no. 3, Article ID 033110, 2012.




uniqueness and exponential stability of the equilibrium point of the system are derived using the vector Lyapunov function method, the homeomorphism mapping lemma, and matrix theory. In [24], complex-valued neural networks with both leakage time delay and discrete time delay, as well as two types of activation functions, on time scales were considered. Several delay-dependent criteria for checking the global stability of the addressed complex-valued neural networks were established in linear matrix inequalities by constructing appropriate Lyapunov-Krasovskii functionals and employing the free weighting matrix method. However, the obtained results were based on the assumption that the time delay is bounded. As we know, time delays occur and vary frequently and irregularly in many engineering systems; sometimes they depend heavily on the histories and may be unbounded [25–27].

How can the desirable stability be guaranteed if the time delays are unbounded? Recently, the authors of [28, 29] proposed two new concepts, power stability and $\mu$-stability, which can be applied to dynamical systems with unbounded time-varying delays. In [28], the authors considered dynamical systems with unbounded time-varying delays. Two approaches were developed to derive sufficient conditions ensuring the existence and uniqueness of the equilibrium and its global stability. Moreover, under mild conditions, the authors proved that dynamical systems with unbounded time-varying delays are globally power stable. In [29], the authors discussed the $\mu$-stability of dynamical systems with unbounded time-varying delays and showed that, under mild conditions, a delayed system with very large time delays has a unique equilibrium which is globally $\mu$-stable. In [30], the authors investigated global robust stability for uncertain stochastic neural networks with unbounded time-varying delays and norm-bounded parameter uncertainties. A new concept of global robust $\mu$-stability in the mean square for neural networks was given first, and then stability criteria were presented by means of the linear matrix inequality approach. To the best of our knowledge, however, few authors have considered the problem of $\mu$-stability of complex-valued neural networks with unbounded time-varying delays.

Motivated by the above discussions, the objective of this paper is to study the global $\mu$-stability of complex-valued neural networks with unbounded time-varying delays by employing a combination of Lyapunov-Krasovskii functionals and the free weighting matrix method. The obtained sufficient conditions do not require the existence of the partial derivatives of the activation functions and are expressed in linear matrix inequalities (LMIs), which can be checked numerically using the effective LMI toolbox in MATLAB. Two examples are given to show the effectiveness and less conservatism of the proposed criteria.

In the literature, the usual method for analyzing complex-valued neural networks is to separate them into real and imaginary parts and then to recast them into equivalent real-valued neural networks [18, 24]. However, this separation is not always expressible in an analytical form if the real and imaginary parts of the activation functions cannot be separated. In this paper, we do not separate the complex-valued neural networks and directly consider their properties on $\mathbb{C}^n$.

Notations. The notations are quite standard. Throughout this paper, let $i$ denote the imaginary unit, that is, $i=\sqrt{-1}$. $\mathbb{C}^n$, $\mathbb{R}^{m\times n}$, and $\mathbb{C}^{m\times n}$ denote, respectively, the set of $n$-dimensional complex vectors, $m\times n$ real matrices, and $m\times n$ complex matrices. The superscripts $T$ and $*$ denote matrix transposition and matrix complex conjugate transposition, respectively. For a complex vector $z\in\mathbb{C}^n$, let $|z|=(|z_1|,|z_2|,\ldots,|z_n|)^T$ be the module of the vector $z$ and $\|z\|=\sqrt{\sum_{k=1}^{n}|z_k|^2}$ the norm of the vector $z$. The notation $X\ge Y$ (resp., $X>Y$) means that $X-Y$ is positive semidefinite (resp., positive definite). $\lambda_{\max}(P)$ and $\lambda_{\min}(P)$ are defined as the largest and the smallest eigenvalues of a positive definite matrix $P$, respectively. Sometimes, the arguments of a function or a matrix will be omitted in the analysis when no confusion can arise.

2. Problem Formulation and Preliminaries

In this section, we recall some definitions and lemmas which will be used in the proofs of our main results.

In this paper, we consider the following complex-valued neural networks with time-varying delays:
$$\dot u(t)=-Cu(t)+Df(u(t))+Ef(u(t-\tau(t)))+H \quad (1)$$
for $t>0$, where $u(t)=(u_1(t),u_2(t),\ldots,u_n(t))^T\in\mathbb{C}^n$ is the state vector of the neural network with $n$ neurons at time $t$; $C=\operatorname{diag}\{c_1,c_2,\ldots,c_n\}\in\mathbb{R}^{n\times n}$ is the self-feedback connection weight matrix with $c_k>0$ ($k=1,2,\ldots,n$); $D=(d_{kj})_{n\times n}\in\mathbb{C}^{n\times n}$ and $E=(e_{kj})_{n\times n}\in\mathbb{C}^{n\times n}$ are the connection weight matrix and the delayed connection weight matrix, respectively; $f(u(t))=(f_1(u_1(t)),f_2(u_2(t)),\ldots,f_n(u_n(t)))^T\in\mathbb{C}^n$ denotes the neuron activation at time $t$; and $H=(h_1,h_2,\ldots,h_n)^T\in\mathbb{C}^n$ is the external input vector.

The initial condition associated with neural network (1) is given by
$$u(s)=\psi(s),\quad s\in(-\infty,0], \quad (2)$$
where $\psi(s)=(\psi_1(s),\psi_2(s),\ldots,\psi_n(s))^T$ is continuous on $(-\infty,0]$.

As we know, the activation functions play a very important role in the study of the global stability of neural networks. However, in the complex domain, an activation function cannot be both entire and bounded. In this paper, we will consider the following two types of activation functions.

(H1) Let $u=x+iy$; $f_j(u)$ can be expressed by separating it into its real and imaginary parts as
$$f_j(u)=f_j^R(x,y)+if_j^I(x,y) \quad (3)$$
for $j=1,2,\ldots,n$, where $f_j^R(\cdot,\cdot):\mathbb{R}^2\to\mathbb{R}$ and $f_j^I(\cdot,\cdot):\mathbb{R}^2\to\mathbb{R}$. Suppose that there exist positive constants $\xi_j^{R1}$, $\xi_j^{R2}$, $\xi_j^{I1}$, and $\xi_j^{I2}$ such that
$$|f_j^R(x_1,y_1)-f_j^R(x_2,y_2)|\le\xi_j^{R1}|x_1-x_2|+\xi_j^{R2}|y_1-y_2|,$$
$$|f_j^I(x_1,y_1)-f_j^I(x_2,y_2)|\le\xi_j^{I1}|x_1-x_2|+\xi_j^{I2}|y_1-y_2| \quad (4)$$


for any $x_1,x_2,y_1,y_2\in\mathbb{R}$, $j=1,2,\ldots,n$. Moreover, we define $\Gamma=\operatorname{diag}\{\gamma_1,\gamma_2,\ldots,\gamma_n\}$, where $\gamma_j=2[(\xi_j^R)^2+(\xi_j^I)^2]$, $\xi_j^R=\max\{\xi_j^{R1},\xi_j^{R2}\}$, and $\xi_j^I=\max\{\xi_j^{I1},\xi_j^{I2}\}$.

(H2) The real and imaginary parts of $f_j(\cdot)$ cannot be separated, but $f_j(\cdot)$ is bounded and satisfies the following condition:
$$|f_j(z_1)-f_j(z_2)|\le\eta_j|z_1-z_2| \quad (5)$$
for any $z_1,z_2\in\mathbb{C}$, $j=1,2,\ldots,n$, where $\eta_j$ is a constant. Moreover, we define $\Pi=\operatorname{diag}\{\eta_1^2,\eta_2^2,\ldots,\eta_n^2\}$.

For the time-varying delay $\tau(t)$, we give the following assumption:

(H3) $\tau(t)$ is nonnegative and differentiable and satisfies $\dot\tau(t)\le\rho<1$, where $\rho$ is a nonnegative constant.

As usual, a vector $\bar u=(\bar u_1,\bar u_2,\ldots,\bar u_n)^T$ is said to be an equilibrium point of neural network (1) if it satisfies
$$-C\bar u+(D+E)f(\bar u)+H=0. \quad (6)$$

Throughout the paper, the equilibrium of neural network (1) is assumed to exist. For notational convenience, we will always shift an intended equilibrium point $\bar u$ of neural network (1) to the origin by letting $z(t)=u(t)-\bar u$. It is easy to transform neural network (1) into the following form:
$$\dot z(t)=-Cz(t)+Dg(z(t))+Eg(z(t-\tau(t))),$$
$$z(s)=\phi(s),\quad s\in(-\infty,0] \quad (7)$$
for $t>0$, where $g(z(\cdot))=f(z(\cdot)+\bar u)-f(\bar u)$ and $\phi(s)=\psi(s)-\bar u$. Next, we introduce some definitions and lemmas to be used in the stability analysis.

Definition 1. Suppose that $\mu(t)$ is a positive continuous function and satisfies $\mu(t)\to\infty$ as $t\to\infty$. If there exist scalars $M>0$ and $T\ge 0$ such that
$$\|z(t)\|\le\frac{M}{\mu(t)},\quad t\ge T, \quad (8)$$
then neural network (1) is said to be $\mu$-stable.

Definition 2. If there exist scalars $M>0$, $\varepsilon>0$, and $T\ge 0$ such that
$$\|z(t)\|\le\frac{M}{e^{\varepsilon t}},\quad t\ge T, \quad (9)$$
then neural network (1) is said to be exponentially stable.

Definition 3. If there exist scalars $M>0$, $\varepsilon>0$, and $T\ge 0$ such that
$$\|z(t)\|\le\frac{M}{t^{\varepsilon}},\quad t\ge T, \quad (10)$$
then neural network (1) is said to be power stable.

Remark 4. It is obvious that both $e^{\varepsilon t}$ and $t^{\varepsilon}$ approach $+\infty$ as $t$ approaches $+\infty$. Therefore, neural network (1) is $\mu$-stable if it is exponentially stable or power stable. In other words, exponential stability and power stability are two special cases of $\mu$-stability.

Lemma 5 (see [22]). Given a Hermitian matrix $Q$, then $Q<0$ is equivalent to
$$\begin{pmatrix}Q^R&Q^I\\-Q^I&Q^R\end{pmatrix}<0, \quad (11)$$
where $Q^R=\operatorname{Re}(Q)$ and $Q^I=\operatorname{Im}(Q)$.
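Lemma 5 is what lets the complex LMI conditions of this paper be handled by a real-valued solver. A small numerical illustration (the sample Hermitian matrix $Q$ is our own choice):

```python
import numpy as np

Q = np.array([[-3 + 0j, 1 - 2j],
              [1 + 2j, -5 + 0j]])           # Hermitian by construction
QR, QI = Q.real, Q.imag
M = np.block([[QR, QI], [-QI, QR]])         # the real 2n x 2n embedding of (11)

neg_def_complex = bool(np.all(np.linalg.eigvalsh(Q) < 0))
neg_def_real = bool(np.all(np.linalg.eigvalsh(M) < 0))
print(neg_def_complex, neg_def_real)        # the two tests agree, as Lemma 5 asserts
```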

3. Main Results

In this section, we give several criteria for checking the global $\mu$-stability of the complex-valued neural network (1).

Theorem 6. Assume that assumptions (H1) and (H3) hold. Neural network (1) is globally $\mu$-stable if there exist two positive definite Hermitian matrices $P_1=P_1^R+iP_1^I$ and $P_2=P_2^R+iP_2^I$, two real positive diagonal matrices $R_1>0$ and $R_2>0$, two complex matrices $Q_1=Q_1^R+iQ_1^I$, $Q_2=Q_2^R+iQ_2^I$, a positive differentiable function $\mu(t)$ defined on $[0,\infty)$, and three constants $\alpha\ge 0$, $\beta>0$, and $T>0$ such that
$$\frac{\dot\mu(t)}{\mu(t)}\le\alpha,\qquad \frac{\mu(t-\tau(t))}{\mu(t)}\ge\beta,\quad \forall t\in[T,\infty), \quad (12)$$
and the following LMI holds:
$$\begin{pmatrix}\Omega^R&\Omega^I\\-\Omega^I&\Omega^R\end{pmatrix}<0, \quad (13)$$

where
$$\Omega^R = \begin{pmatrix} \Omega^R_{11} & \Omega^R_{12} & 0 & \Omega^R_{14} & \Omega^R_{15} \\ \Omega^R_{21} & \Omega^R_{22} & 0 & \Omega^R_{24} & \Omega^R_{25} \\ 0 & 0 & \Omega^R_{33} & 0 & 0 \\ \Omega^R_{41} & \Omega^R_{42} & 0 & -R_1 & 0 \\ \Omega^R_{51} & \Omega^R_{52} & 0 & 0 & -R_2 \end{pmatrix}, \qquad
\Omega^I = \begin{pmatrix} \Omega^I_{11} & \Omega^I_{12} & 0 & \Omega^I_{14} & \Omega^I_{15} \\ \Omega^I_{21} & \Omega^I_{22} & 0 & \Omega^I_{24} & \Omega^I_{25} \\ 0 & 0 & \Omega^I_{33} & 0 & 0 \\ \Omega^I_{41} & \Omega^I_{42} & 0 & 0 & 0 \\ \Omega^I_{51} & \Omega^I_{52} & 0 & 0 & 0 \end{pmatrix}, \quad (14)$$

in which
$$\begin{aligned}
\Omega^R_{11} &= 2\alpha P_1^R + P_2^R + CQ_1^R + (Q_1^R)^T C + R_1\Gamma, \\
\Omega^R_{12} &= P_1^R + (Q_1^R)^T + CQ_2^R, \\
\Omega^R_{14} &= -(Q_1^R)^T D^R - (Q_1^I)^T D^I, \\
\Omega^R_{15} &= -(Q_1^R)^T E^R - (Q_1^I)^T E^I, \\
\Omega^R_{21} &= P_1^R + Q_1^R + (Q_2^R)^T C, \\
\Omega^R_{22} &= Q_2^R + (Q_2^R)^T, \\
\Omega^R_{24} &= -(Q_2^R)^T D^R - (Q_2^I)^T D^I, \\
\Omega^R_{25} &= -(Q_2^R)^T E^R - (Q_2^I)^T E^I, \\
\Omega^R_{33} &= -\beta^2 (1 - \rho) P_2^R + R_2\Gamma, \\
\Omega^R_{41} &= -(D^R)^T Q_1^R - (D^I)^T Q_1^I, \\
\Omega^R_{42} &= -(D^R)^T Q_2^R - (D^I)^T Q_2^I, \\
\Omega^R_{51} &= -(E^R)^T Q_1^R - (E^I)^T Q_1^I, \\
\Omega^R_{52} &= -(E^R)^T Q_2^R - (E^I)^T Q_2^I, \\
\Omega^I_{11} &= 2\alpha P_1^I + P_2^I + CQ_1^I - (Q_1^I)^T C, \\
\Omega^I_{12} &= P_1^I - (Q_1^I)^T + CQ_2^I, \\
\Omega^I_{14} &= (Q_1^I)^T D^R - (Q_1^R)^T D^I, \\
\Omega^I_{15} &= (Q_1^I)^T E^R - (Q_1^R)^T E^I, \\
\Omega^I_{21} &= P_1^I + Q_1^I - (Q_2^I)^T C, \\
\Omega^I_{22} &= Q_2^I - (Q_2^I)^T, \\
\Omega^I_{24} &= (Q_2^I)^T D^R - (Q_2^R)^T D^I, \\
\Omega^I_{25} &= (Q_2^I)^T E^R - (Q_2^R)^T E^I, \\
\Omega^I_{33} &= -\beta^2 (1 - \rho) P_2^I, \\
\Omega^I_{41} &= (D^I)^T Q_1^R - (D^R)^T Q_1^I, \\
\Omega^I_{42} &= (D^I)^T Q_2^R - (D^R)^T Q_2^I, \\
\Omega^I_{51} &= (E^I)^T Q_1^R - (E^R)^T Q_1^I, \\
\Omega^I_{52} &= (E^I)^T Q_2^R - (E^R)^T Q_2^I. \quad (15)
\end{aligned}$$

Proof. Consider the following Lyapunov-Krasovskii functional candidate:
$$V(t) = \mu^2(t) z^*(t) P_1 z(t) + \int_{t - \tau(t)}^{t} \mu^2(s) z^*(s) P_2 z(s)\, \mathrm{d}s. \quad (16)$$

Calculating the derivative of $V(t)$ along the trajectories of neural network (7), we obtain
$$\begin{aligned}
\dot{V}(t) ={}& 2\dot{\mu}(t)\mu(t) z^*(t) P_1 z(t) + \mu^2(t) \dot{z}^*(t) P_1 z(t) \\
&+ \mu^2(t) z^*(t) P_1 \dot{z}(t) + \mu^2(t) z^*(t) P_2 z(t) \\
&- \mu^2(t - \tau(t)) z^*(t - \tau(t)) P_2 z(t - \tau(t)) (1 - \dot{\tau}(t)) \\
\le{}& \mu^2(t) \big[ 2\alpha z^*(t) P_1 z(t) + \dot{z}^*(t) P_1 z(t) + z^*(t) P_1 \dot{z}(t) \\
&+ z^*(t) P_2 z(t) - \beta^2 (1 - \rho) z^*(t - \tau(t)) P_2 z(t - \tau(t)) \big], \quad t \ge T. \quad (17)
\end{aligned}$$
In deriving inequality (17), we have made use of condition (12) and assumption (H3).

From assumption (H1), we get
$$\big|g^R_j(x_j, y_j)\big| \le \xi^{R1}_j |x_j| + \xi^{R2}_j |y_j|, \qquad \big|g^I_j(x_j, y_j)\big| \le \xi^{I1}_j |x_j| + \xi^{I2}_j |y_j| \quad (18)$$
for all $j = 1, 2, \ldots, n$, where $x_j = \operatorname{Re}(z_j)$ and $y_j = \operatorname{Im}(z_j)$. Hence,
$$\big|g_j(z_j(t))\big|^2 \le \big[(\xi^R_j)^2 + (\xi^I_j)^2\big] \big(|x_j| + |y_j|\big)^2 \le 2\big[(\xi^R_j)^2 + (\xi^I_j)^2\big] \big|z_j(t)\big|^2 = \gamma_j \big|z_j(t)\big|^2, \quad (19)$$

where $\xi^R_j = \max\{\xi^{R1}_j, \xi^{R2}_j\}$, $\xi^I_j = \max\{\xi^{I1}_j, \xi^{I2}_j\}$, and $\gamma_j = 2[(\xi^R_j)^2 + (\xi^I_j)^2]$, $j = 1, 2, \ldots, n$. Let $R_1 = \operatorname{diag}(r_1, r_2, \ldots, r_n) > 0$. It follows from (19) that
$$r_j \mu^2(t) g^*_j(z_j(t)) g_j(z_j(t)) - r_j \mu^2(t) \gamma_j z^*_j(t) z_j(t) \le 0 \quad (20)$$
for $j = 1, 2, \ldots, n$. Thus,
$$\mu^2(t) g^*(z(t)) R_1 g(z(t)) - \mu^2(t) z^*(t) R_1 \Gamma z(t) \le 0. \quad (21)$$

Also, we can get
$$\mu^2(t) g^*(z(t - \tau(t))) R_2 g(z(t - \tau(t))) - \mu^2(t) z^*(t - \tau(t)) R_2 \Gamma z(t - \tau(t)) \le 0. \quad (22)$$

From (7), we have
$$\begin{aligned}
0 ={}& \mu^2(t) \big[\dot{z}(t) + Cz(t) - Dg(z(t)) - Eg(z(t - \tau(t)))\big]^* \big[Q_1 z(t) + Q_2 \dot{z}(t)\big] \\
&+ \mu^2(t) \big[Q_1 z(t) + Q_2 \dot{z}(t)\big]^* \big[\dot{z}(t) + Cz(t) - Dg(z(t)) - Eg(z(t - \tau(t)))\big]. \quad (23)
\end{aligned}$$

It follows from (17), (21), (22), and (23) that
$$\dot{V}(t) \le \mu^2(t) w^*(t) \Omega w(t), \quad t \ge T, \quad (24)$$
where $w(t) = (z(t), \dot{z}(t), z(t - \tau(t)), g(z(t)), g(z(t - \tau(t))))^*$ and
$$\Omega = \begin{pmatrix} \Omega_{11} & \Omega_{12} & 0 & -Q_1^* D & -Q_1^* E \\ \Omega_{21} & Q_2^* + Q_2 & 0 & -Q_2^* D & -Q_2^* E \\ 0 & 0 & \Omega_{33} & 0 & 0 \\ -D^* Q_1 & -D^* Q_2 & 0 & -R_1 & 0 \\ -E^* Q_1 & -E^* Q_2 & 0 & 0 & -R_2 \end{pmatrix} \quad (25)$$
with $\Omega_{11} = 2\alpha P_1 + P_2 + CQ_1 + Q_1^* C + R_1\Gamma$, $\Omega_{12} = P_1^* + Q_1^* + CQ_2$, $\Omega_{21} = P_1 + Q_1 + Q_2^* C$, and $\Omega_{33} = -\beta^2 (1 - \rho) P_2 + R_2\Gamma$. From (14), we have $\Omega = \Omega^R + i\Omega^I$. It follows from (13) and Lemma 5 that $\Omega < 0$. Thus, we get from (24) that
$$\dot{V}(t) \le 0, \quad t \ge T, \quad (26)$$
which means that $V(t)$ is monotonically nonincreasing for $t \ge T$. So we have
$$V(t) \le V(T), \quad t \ge T. \quad (27)$$
From the definition of $V(t)$ in (16), we obtain
$$\mu^2(t) \lambda_{\min}(P_1) \|z(t)\|^2 \le V(t) \le V_0 < \infty, \quad t \ge T, \quad (28)$$
where $V_0 = \max_{0 \le s \le T} V(s)$. This implies that
$$\|z(t)\| \le \frac{M}{\mu(t)}, \quad t \ge T, \quad (29)$$
where $M = \sqrt{V_0 / \lambda_{\min}(P_1)}$. The proof is completed.

Theorem 7. Assume that assumptions (H2) and (H3) hold. Neural network (1) is globally $\mu$-stable if there exist two positive definite Hermitian matrices $P_1 = P_1^R + iP_1^I$ and $P_2 = P_2^R + iP_2^I$, two real positive diagonal matrices $R_1 > 0$ and $R_2 > 0$, two complex matrices $Q_1 = Q_1^R + iQ_1^I$ and $Q_2 = Q_2^R + iQ_2^I$, a positive differentiable function $\mu(t)$ defined on $[0, \infty)$, and three constants $\alpha \ge 0$, $\beta > 0$, and $T > 0$ such that conditions (12), (13), and (14) in Theorem 6 are satisfied, where $\Omega^R_{11} = 2\alpha P_1^R + P_2^R + CQ_1^R + (Q_1^R)^T C + R_1\Pi$ and $\Omega^R_{33} = -\beta^2 (1 - \rho) P_2^R + R_2\Pi$.

Proof. From assumption (H2), we get
$$\big|g_j(z_j(t))\big| \le \eta_j \big|z_j(t)\big| \quad (30)$$
for $j = 1, 2, \ldots, n$. Let $R_1 = \operatorname{diag}(r_1, r_2, \ldots, r_n) > 0$. It follows from (30) that
$$r_j \mu^2(t) g^*_j(z_j(t)) g_j(z_j(t)) - r_j \mu^2(t) \eta^2_j z^*_j(t) z_j(t) \le 0 \quad (31)$$
for $j = 1, 2, \ldots, n$. Hence,
$$\mu^2(t) g^*(z(t)) R_1 g(z(t)) - \mu^2(t) z^*(t) R_1 \Pi z(t) \le 0. \quad (32)$$
Also, we can get
$$\mu^2(t) g^*(z(t - \tau(t))) R_2 g(z(t - \tau(t))) - \mu^2(t) z^*(t - \tau(t)) R_2 \Pi z(t - \tau(t)) \le 0. \quad (33)$$

Consider the Lyapunov-Krasovskii functional (16). From (17), (23), (32), and (33), we can get
$$\dot{V}(t) \le \mu^2(t) w^*(t) \Omega w(t), \quad t \ge T, \quad (34)$$
where $w(t) = (z(t), \dot{z}(t), z(t - \tau(t)), g(z(t)), g(z(t - \tau(t))))^*$ and
$$\Omega = \begin{pmatrix} \Omega_{11} & \Omega_{12} & 0 & -Q_1^* D & -Q_1^* E \\ \Omega_{21} & Q_2^* + Q_2 & 0 & -Q_2^* D & -Q_2^* E \\ 0 & 0 & \Omega_{33} & 0 & 0 \\ -D^* Q_1 & -D^* Q_2 & 0 & -R_1 & 0 \\ -E^* Q_1 & -E^* Q_2 & 0 & 0 & -R_2 \end{pmatrix} \quad (35)$$
with $\Omega_{11} = 2\alpha P_1 + P_2 + CQ_1 + Q_1^* C + R_1\Pi$, $\Omega_{12} = P_1^* + Q_1^* + CQ_2$, $\Omega_{21} = P_1 + Q_1 + Q_2^* C$, and $\Omega_{33} = -\beta^2 (1 - \rho) P_2 + R_2\Pi$. From the proof of Theorem 6, we know that neural network (1) is $\mu$-stable. The proof is completed.

Corollary 8. Assume that assumptions (H1) and (H3) hold and $\tau(t) \le \tau$. Neural network (1) is globally exponentially stable if there exist two positive definite Hermitian matrices $P_1 = P_1^R + iP_1^I$ and $P_2 = P_2^R + iP_2^I$, two real positive diagonal matrices $R_1 > 0$ and $R_2 > 0$, two complex matrices $Q_1 = Q_1^R + iQ_1^I$ and $Q_2 = Q_2^R + iQ_2^I$, and a constant $\varepsilon > 0$ such that conditions (13) and (14) in Theorem 6 are satisfied, where $\Omega^R_{11} = 2\varepsilon P_1^R + P_2^R + CQ_1^R + (Q_1^R)^T C + R_1\Gamma$, $\Omega^I_{11} = 2\varepsilon P_1^I + P_2^I + CQ_1^I - (Q_1^I)^T C$, $\Omega^R_{33} = -e^{-2\varepsilon\tau} (1 - \rho) P_2^R + R_2\Gamma$, and $\Omega^I_{33} = -e^{-2\varepsilon\tau} (1 - \rho) P_2^I$.

Proof. Let $\mu(t) = e^{\varepsilon t}$ $(t \ge 0)$; then $\dot{\mu}(t)/\mu(t) = \varepsilon$ and $\mu(t - \tau(t))/\mu(t) = e^{-\varepsilon\tau(t)} \ge e^{-\varepsilon\tau}$. Take $\alpha = \varepsilon$, $\beta = e^{-\varepsilon\tau}$, and $T = 0$. It is obvious that condition (12) in Theorem 6 is satisfied. The proof is completed.

Corollary 9. Assume that assumptions (H1) and (H3) hold and $\tau(t) \le \delta t$ $(0 < \delta < 1)$. Neural network (1) is globally power stable if there exist two positive definite Hermitian matrices $P_1 = P_1^R + iP_1^I$ and $P_2 = P_2^R + iP_2^I$, two real positive diagonal matrices $R_1 > 0$ and $R_2 > 0$, two complex matrices $Q_1 = Q_1^R + iQ_1^I$ and $Q_2 = Q_2^R + iQ_2^I$, and two constants $\varepsilon > 0$ and $T > 0$ such that conditions (13) and (14) in Theorem 6 are satisfied, where $\Omega^R_{11} = (2\varepsilon/T) P_1^R + P_2^R + CQ_1^R + (Q_1^R)^T C + R_1\Gamma$, $\Omega^I_{11} = (2\varepsilon/T) P_1^I + P_2^I + CQ_1^I - (Q_1^I)^T C$, $\Omega^R_{33} = -(1 - \delta)^{2\varepsilon} (1 - \rho) P_2^R + R_2\Gamma$, and $\Omega^I_{33} = -(1 - \delta)^{2\varepsilon} (1 - \rho) P_2^I$.

Proof. Let $\mu(t) = t^{\varepsilon}$ $(t \ge T)$; then $\dot{\mu}(t)/\mu(t) = \varepsilon/t \le \varepsilon/T$ and $\mu(t - \tau(t))/\mu(t) = (1 - \tau(t)/t)^{\varepsilon} \ge (1 - \delta)^{\varepsilon}$. Take $\alpha = \varepsilon/T$ and $\beta = (1 - \delta)^{\varepsilon}$. It is obvious that condition (12) in Theorem 6 is satisfied. The proof is completed.
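As a quick numerical sanity check (not from the paper), the choice $\mu(t) = t^{\varepsilon}$ in the proof above can be tested against condition (12) on a grid of $t$ values. The constants $\varepsilon$, $\delta$, and $T$ below match those later used in Example 2 but are otherwise illustrative:

```python
# Sketch: checking condition (12) for the power-stability choice
# mu(t) = t**eps with tau(t) = delta*t, as in the proof of Corollary 9.
eps, delta, T = 1.0, 0.2, 10.0
alpha = eps / T            # bound on mu'(t)/mu(t) = eps/t for t >= T
beta = (1 - delta) ** eps  # bound on mu(t - tau(t))/mu(t)

tol = 1e-9  # floating-point slack
ok = all(
    eps / t <= alpha + tol
    and ((t - delta * t) / t) ** eps >= beta - tol
    for t in range(10, 1001)
)
print(ok)  # True: condition (12) holds with this alpha and beta
```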

Similarly to the proofs of Corollaries 8 and 9, we have the following results.

Corollary 10. Assume that assumptions (H2) and (H3) hold and $\tau(t) \le \tau$. Neural network (1) is globally exponentially stable if there exist two positive definite Hermitian matrices $P_1 = P_1^R + iP_1^I$ and $P_2 = P_2^R + iP_2^I$, two real positive diagonal matrices $R_1 > 0$ and $R_2 > 0$, two complex matrices $Q_1 = Q_1^R + iQ_1^I$ and $Q_2 = Q_2^R + iQ_2^I$, and a constant $\varepsilon > 0$ such that conditions (13) and (14) in Theorem 7 are satisfied, where $\Omega^R_{11} = 2\varepsilon P_1^R + P_2^R + CQ_1^R + (Q_1^R)^T C + R_1\Pi$, $\Omega^I_{11} = 2\varepsilon P_1^I + P_2^I + CQ_1^I - (Q_1^I)^T C$, $\Omega^R_{33} = -e^{-2\varepsilon\tau} (1 - \rho) P_2^R + R_2\Pi$, and $\Omega^I_{33} = -e^{-2\varepsilon\tau} (1 - \rho) P_2^I$.

Corollary 11. Assume that assumptions (H2) and (H3) hold and $\tau(t) \le \delta t$ $(0 < \delta < 1)$. Neural network (1) is globally power stable if there exist two positive definite Hermitian matrices $P_1 = P_1^R + iP_1^I$ and $P_2 = P_2^R + iP_2^I$, two real positive diagonal matrices $R_1 > 0$ and $R_2 > 0$, two complex matrices $Q_1 = Q_1^R + iQ_1^I$ and $Q_2 = Q_2^R + iQ_2^I$, and two constants $\varepsilon > 0$ and $T > 0$ such that conditions (13) and (14) in Theorem 7 are satisfied, where $\Omega^R_{11} = (2\varepsilon/T) P_1^R + P_2^R + CQ_1^R + (Q_1^R)^T C + R_1\Pi$, $\Omega^I_{11} = (2\varepsilon/T) P_1^I + P_2^I + CQ_1^I - (Q_1^I)^T C$, $\Omega^R_{33} = -(1 - \delta)^{2\varepsilon} (1 - \rho) P_2^R + R_2\Pi$, and $\Omega^I_{33} = -(1 - \delta)^{2\varepsilon} (1 - \rho) P_2^I$.

Remark 12. In [18, 23], the authors studied the global stability of complex-valued neural networks with time-varying delays, and the activation function was supposed to be
$$f_j(z) = f^R_j(x, y) + i f^I_j(x, y), \quad (36)$$
where $z = x + iy$ and $f^R_j(\cdot, \cdot)$, $f^I_j(\cdot, \cdot)$ are real-valued for all $j = 1, 2, \ldots, n$, and the partial derivatives of $f^R_j(\cdot, \cdot)$ and $f^I_j(\cdot, \cdot)$ with respect to $x$ and $y$ were required to exist and be continuous. In this paper, neither the real parts nor the imaginary parts of the activation functions are assumed to be differentiable.

Remark 13. In [21], the authors investigated boundedness and stability for discrete-time complex-valued neural networks with constant delay, and the activation function was supposed to be
$$f_j(z) = \max\{0, \operatorname{Re}(z)\} + i \max\{0, \operatorname{Im}(z)\}. \quad (37)$$
The authors also gave a complex-valued LMI criterion for checking the stability of the considered neural networks, but a feasible way to solve the given complex-valued LMI was not provided. In this paper, the criteria for checking the stability of complex-valued neural networks are established as real LMIs, which can be checked numerically using the effective LMI toolbox in MATLAB.

4. Examples

Example 1. Consider a two-neuron complex-valued neural network with constant delay:
$$\dot{u}(t) = -Cu(t) + Df(u(t)) + Ef(u(t - \tau)) + H, \quad (38)$$
where
$$C = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}, \quad D = \begin{pmatrix} 1 + i & -2 + i \\ 1 - i & -1 \end{pmatrix}, \quad E = \begin{pmatrix} 1 + i & i \\ -1 + i & 2 \end{pmatrix}, \quad H = \begin{pmatrix} 2 - i \\ -2 + i \end{pmatrix}, \quad \tau = 2,$$
$$f^R_j(u_j) = \frac{1}{32} \Big( \big|\operatorname{Re}(u_j) + 1\big| + \big|\operatorname{Re}(u_j) - 1\big| \Big), \quad f^I_j(u_j) = \frac{1}{32} \Big( \big|\operatorname{Im}(u_j) + 1\big| + \big|\operatorname{Im}(u_j) - 1\big| \Big), \quad j = 1, 2. \quad (39)$$

It can be verified that the activation functions $f_j$ satisfy assumption (H1) and the time-varying delay $\tau(t)$ satisfies assumption (H3), with
$$D^R = \begin{pmatrix} 1 & -2 \\ 1 & -1 \end{pmatrix}, \quad D^I = \begin{pmatrix} 1 & 1 \\ -1 & 0 \end{pmatrix}, \quad E^R = \begin{pmatrix} 1 & 0 \\ -1 & 2 \end{pmatrix}, \quad E^I = \begin{pmatrix} 1 & 1 \\ 1 & 0 \end{pmatrix}, \quad \Gamma = \begin{pmatrix} 1/64 & 0 \\ 0 & 1/64 \end{pmatrix}, \quad \rho = 0.$$
There exists a constant $\varepsilon = 0.2$, and by employing the MATLAB LMI toolbox, we can find the solutions to the LMI under conditions (13) and (14) in Corollary 8 as follows:

$$\begin{aligned}
P_1 &= \begin{pmatrix} 41.2118 & -6.8812 - 4.2410i \\ -6.8812 + 4.2410i & 40.6762 \end{pmatrix}, &
P_2 &= \begin{pmatrix} 26.0370 & -4.1385 - 0.7600i \\ -4.1385 + 0.7600i & 25.9447 \end{pmatrix}, \\
R_1 &= \begin{pmatrix} 785.9319 & 0 \\ 0 & 857.0773 \end{pmatrix}, &
R_2 &= \begin{pmatrix} 560.5160 & 0 \\ 0 & 617.1000 \end{pmatrix}, \\
Q_1 &= \begin{pmatrix} -61.6337 + 1.1580i & 9.2494 + 0.6392i \\ 9.1464 + 0.1434i & -61.4512 - 1.3717i \end{pmatrix}, &
Q_2 &= \begin{pmatrix} -63.1096 - 0.5628i & 10.2662 + 2.7895i \\ 10.1047 - 3.3169i & -62.0592 + 0.6291i \end{pmatrix}. \quad (40)
\end{aligned}$$
Then all conditions in Corollary 8 are satisfied. Therefore, neural network (38) is globally exponentially stable. Figures 1 and 2 depict the real and imaginary parts of the states of the considered neural network (38), where the initial condition is $z_1(t) = 0.5$, $z_2(t) = -1 - 0.5i$.
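As a small sanity check on Example 1 (not part of the original computation), one can recompute the entries of $\Gamma$ from the Lipschitz constants read off from (39), and confirm that the reported $P_1$ is Hermitian positive definite via the closed-form eigenvalues of a $2 \times 2$ Hermitian matrix. The bound $\xi = 1/16$ below is an assumption obtained from the slopes $2/32$ of the activation functions:

```python
# Sketch: sanity checks on Example 1. The Lipschitz bounds xi = 1/16 are
# read off from the slopes of f^R_j, f^I_j in (39); gamma then matches
# the reported Gamma = diag(1/64, 1/64).
xi_R = xi_I = 1 / 16
gamma = 2 * (xi_R ** 2 + xi_I ** 2)
print(gamma == 1 / 64)  # True

# Reported P1 from (40): a 2x2 Hermitian matrix [[a, b], [conj(b), d]].
a, d = 41.2118, 40.6762
b = complex(-6.8812, -4.2410)

# Eigenvalues of a 2x2 Hermitian matrix are real:
# ((a + d) +/- sqrt((a - d)**2 + 4*|b|**2)) / 2
disc = ((a - d) ** 2 + 4 * abs(b) ** 2) ** 0.5
eig_min = (a + d - disc) / 2
eig_max = (a + d + disc) / 2
print(eig_min > 0)  # True: P1 is positive definite, as Corollary 8 requires
```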

Example 2. Consider a two-neuron complex-valued neural network with unbounded time-varying delays:
$$\dot{u}(t) = -Cu(t) + Df(u(t)) + Ef(u(t - \tau(t))) + H, \quad (41)$$
where
$$C = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}, \quad D = \begin{pmatrix} 1.5 + 3i & -1 + 2i \\ -2 - 2i & 2.5 + 0.5i \end{pmatrix}, \quad E = \begin{pmatrix} 0.5 + i & i \\ -2 + 0.5i & 1 + 0.5i \end{pmatrix}, \quad H = \begin{pmatrix} 2 + i \\ -2 + i \end{pmatrix}, \quad \tau(t) = 0.2t,$$
$$f_1(u_1) = \frac{1}{20} \big( |u_1 + 1| - |u_1 - 1| \big), \quad f_2(u_2) = \frac{1}{20} \big( |u_2 + 1| - |u_2 - 1| \big). \quad (42)$$

Figure 1: State trajectories $\operatorname{Re}(z_1(t))$ and $\operatorname{Re}(z_2(t))$ of neural network (38).

Figure 2: State trajectories $\operatorname{Im}(z_1(t))$ and $\operatorname{Im}(z_2(t))$ of neural network (38).

It can be verified that the activation functions $f_j$ satisfy assumption (H2) and the time-varying delay $\tau(t)$ satisfies assumption (H3), with
$$D^R = \begin{pmatrix} 1.5 & -1 \\ -2 & 2.5 \end{pmatrix}, \quad D^I = \begin{pmatrix} 3 & 2 \\ -2 & 0.5 \end{pmatrix}, \quad E^R = \begin{pmatrix} 0.5 & 0 \\ -2 & 1 \end{pmatrix}, \quad E^I = \begin{pmatrix} 1 & 1 \\ 0.5 & 0.5 \end{pmatrix}, \quad \Pi = \begin{pmatrix} 0.01 & 0 \\ 0 & 0.01 \end{pmatrix}, \quad \rho = 0.2.$$
There exist two constants $\varepsilon = 1$ and $T = 10$, and by employing the MATLAB LMI toolbox, we can find the solutions to the LMI under conditions (13) and (14) in Corollary 11 as follows:

$$\begin{aligned}
P_1 &= \begin{pmatrix} 39.8524 & 14.7299 - 3.2336i \\ 14.7299 + 3.2336i & 31.5075 \end{pmatrix}, &
P_2 &= \begin{pmatrix} 23.9471 & 9.7114 - 1.3426i \\ 9.7114 + 1.3426i & 18.8527 \end{pmatrix}, \\
R_1 &= 10^3 \times \begin{pmatrix} 1.3534 & 0 \\ 0 & 1.0135 \end{pmatrix}, &
R_2 &= \begin{pmatrix} 584.6278 & 0 \\ 0 & 429.3048 \end{pmatrix}, \\
Q_1 &= \begin{pmatrix} -51.8770 - 1.5115i & -16.7003 + 0.6185i \\ -16.8517 - 0.9537i & -40.7204 + 1.0687i \end{pmatrix}, &
Q_2 &= \begin{pmatrix} -60.4708 + 0.2332i & -21.2500 + 3.6232i \\ -21.6247 - 3.5095i & -48.2459 - 0.1731i \end{pmatrix}. \quad (43)
\end{aligned}$$
Then all conditions in Corollary 11 are satisfied. Thus, neural network (41) is globally power stable. Figures 3 and 4 depict the real and imaginary parts of the states of the considered network (41), where the initial condition is $x_1(t) = 0.5 \sin(0.5t) i$, $x_2(t) = -0.5 + 1.5 \cos(0.3t) i$.

Figure 3: State trajectories $\operatorname{Re}(z_1(t))$ and $\operatorname{Re}(z_2(t))$ of neural network (41).

Figure 4: State trajectories $\operatorname{Im}(z_1(t))$ and $\operatorname{Im}(z_2(t))$ of neural network (41).


5. Conclusion

In this paper, the global $\mu$-stability of complex-valued neural networks with unbounded time-varying delays has been investigated by employing a combination of Lyapunov-Krasovskii functionals and the free-weighting-matrix method. Several sufficient conditions for checking the global $\mu$-stability of the considered complex-valued neural networks have been given in terms of linear matrix inequalities (LMIs), which can be checked numerically using the effective LMI toolbox in MATLAB. As direct applications, we obtain the exponential stability and power stability with respect to different time-varying delays. Two examples have been provided to demonstrate the effectiveness and reduced conservatism of the proposed criteria.

Remark 14. Note that the activation functions in this paper are continuous. Recently, neural networks with discontinuous activations have received more and more attention [31-34]. Based on the results of this paper, we will consider the $\mu$-stability of complex-valued neural networks with discontinuous activations in future work.

Conflict of Interests

The authors declare that they have no conflict of interests regarding the publication of this paper.

Acknowledgments

This work was supported by the National Natural Science Foundation of China under Grants 61273021 and 11172247, and in part by the Natural Science Foundation Project of CQ cstc2013jjB40008.

References

[1] H. Dong, Z. Wang, and H. Gao, "Distributed H∞ filtering for a class of Markovian jump nonlinear time-delay systems over lossy sensor networks," IEEE Transactions on Industrial Electronics, vol. 60, no. 10, pp. 4665-4672, 2013.
[2] J. Hu, Z. Wang, B. Shen, and H. Gao, "Quantised recursive filtering for a class of nonlinear systems with multiplicative noises and missing measurements," International Journal of Control, vol. 86, no. 4, pp. 650-663, 2013.
[3] Y. Fang, K. Yan, and K. Li, "Impulsive synchronization of a class of chaotic systems," Systems Science and Control Engineering, vol. 2, no. 1, pp. 55-60, 2014.
[4] M. Darouach and M. Chadli, "Admissibility and control of switched discrete-time singular systems," Systems Science and Control Engineering, vol. 1, no. 1, pp. 43-51, 2013.
[5] B. Shen, Z. Wang, D. Ding, and H. Shu, "H∞ state estimation for complex networks with uncertain inner coupling and incomplete measurements," IEEE Transactions on Neural Networks and Learning Systems, vol. 24, no. 12, pp. 2027-2037, 2013.
[6] B. Shen, Z. Wang, and X. Liu, "Sampled-data synchronization control of dynamical networks with stochastic sampling," IEEE Transactions on Automatic Control, vol. 57, no. 10, pp. 2644-2650, 2012.
[7] J. Liang, Z. Wang, Y. Liu, and X. Liu, "State estimation for two-dimensional complex networks with randomly occurring nonlinearities and randomly varying sensor delays," International Journal of Robust and Nonlinear Control, vol. 24, no. 1, pp. 18-38, 2014.
[8] X. Kan, Z. Wang, and H. Shu, "State estimation for discrete-time delayed neural networks with fractional uncertainties and sensor saturations," Neurocomputing, vol. 117, pp. 64-71, 2013.
[9] Z. Wang, B. Zineddin, J. Liang, et al., "A novel neural network approach to cDNA microarray image segmentation," Computer Methods and Programs in Biomedicine, vol. 111, no. 1, pp. 189-198, 2013.
[10] A. Hirose, Complex-Valued Neural Networks, Springer, Berlin, Germany, 2012.
[11] G. Tanaka and K. Aihara, "Complex-valued multistate associative memory with nonlinear multilevel functions for gray-level image reconstruction," IEEE Transactions on Neural Networks, vol. 20, no. 9, pp. 1463-1473, 2009.
[12] S. Chen, L. Hanzo, and S. Tan, "Symmetric complex-valued RBF receiver for multiple-antenna-aided wireless systems," IEEE Transactions on Neural Networks, vol. 19, no. 9, pp. 1659-1665, 2008.
[13] I. Cha and S. A. Kassam, "Channel equalization using adaptive complex radial basis function networks," IEEE Journal on Selected Areas in Communications, vol. 13, no. 1, pp. 122-131, 1995.
[14] T. Nitta, "Orthogonality of decision boundaries in complex-valued neural networks," Neural Computation, vol. 16, no. 1, pp. 73-97, 2004.
[15] M. Faijul Amin and K. Murase, "Single-layered complex-valued neural network for real-valued classification problems," Neurocomputing, vol. 72, no. 4-6, pp. 945-955, 2009.
[16] B. K. Tripathi and P. K. Kalra, "On efficient learning machine with root-power mean neuron in complex domain," IEEE Transactions on Neural Networks, vol. 22, no. 5, pp. 727-738, 2011.
[17] J. H. Mathews and R. W. Howell, Complex Analysis for Mathematics and Engineering, Jones and Bartlett, 3rd edition, 1997.
[18] J. Hu and J. Wang, "Global stability of complex-valued recurrent neural networks with time-delays," IEEE Transactions on Neural Networks, vol. 23, no. 6, pp. 853-865, 2012.
[19] V. S. H. Rao and G. R. Murthy, "Global dynamics of a class of complex valued neural networks," International Journal of Neural Systems, vol. 18, no. 2, pp. 165-171, 2008.
[20] W. Zhou and J. M. Zurada, "Discrete-time recurrent neural networks with complex-valued linear threshold neurons," IEEE Transactions on Circuits and Systems II: Express Briefs, vol. 56, no. 8, pp. 669-673, 2009.
[21] C. Duan and Q. Song, "Boundedness and stability for discrete-time delayed neural network with complex-valued linear threshold neurons," Discrete Dynamics in Nature and Society, vol. 2010, Article ID 368379, 19 pages, 2010.
[22] B. Zou and Q. Song, "Boundedness and complete stability of complex-valued neural networks with time delay," IEEE Transactions on Neural Networks and Learning Systems, vol. 24, no. 8, pp. 1227-1238, 2013.
[23] X. Xu, J. Zhang, and J. Shi, "Exponential stability of complex-valued neural networks with mixed delays," Neurocomputing, vol. 128, pp. 483-490, 2014.
[24] X. Chen and Q. Song, "Global stability of complex-valued neural networks with both leakage time delay and discrete time delay on time scales," Neurocomputing, vol. 121, pp. 254-264, 2013.
[25] S.-I. Niculescu, Delay Effects on Stability: A Robust Control Approach, Springer, London, UK, 2001.
[26] V. B. Kolmanovskii and V. R. Nosov, Stability of Functional-Differential Equations, Academic Press, London, UK, 1986.
[27] Y. Liu, Z. Wang, J. Liang, and X. Liu, "Synchronization of coupled neutral-type neural networks with jumping-mode-dependent discrete and unbounded distributed delays," IEEE Transactions on Cybernetics, vol. 43, no. 1, pp. 102-114, 2013.
[28] T. Chen and L. Wang, "Power-rate global stability of dynamical systems with unbounded time-varying delays," IEEE Transactions on Circuits and Systems II: Express Briefs, vol. 54, no. 8, pp. 705-709, 2007.
[29] T. Chen and L. Wang, "Global μ-stability of delayed neural networks with unbounded time-varying delays," IEEE Transactions on Neural Networks, vol. 18, no. 6, pp. 1836-1840, 2007.
[30] X. Liu and T. Chen, "Robust μ-stability for uncertain stochastic neural networks with unbounded time-varying delays," Physica A: Statistical Mechanics and Its Applications, vol. 387, no. 12, pp. 2952-2962, 2008.
[31] X. Liu and J. Cao, "On periodic solutions of neural networks via differential inclusions," Neural Networks, vol. 22, no. 4, pp. 329-334, 2009.
[32] X. Liu and J. Cao, "Robust state estimation for neural networks with discontinuous activations," IEEE Transactions on Systems, Man, and Cybernetics B: Cybernetics, vol. 40, no. 6, pp. 1425-1437, 2010.
[33] X. Liu, T. Chen, J. Cao, and W. Lu, "Dissipativity and quasi-synchronization for neural networks with discontinuous activations and parameter mismatches," Neural Networks, vol. 24, no. 10, pp. 1013-1021, 2011.
[34] X. Liu, J. Cao, and W. Yu, "Filippov systems and quasi-synchronization control for switched networks," Chaos, vol. 22, no. 3, Article ID 033110, 2012.


Abstract and Applied Analysis 3

for any $x_1, x_2, y_1, y_2 \in \mathbb{R}$, $j = 1, 2, \ldots, n$. Moreover, we define $\Gamma = \mathrm{diag}\{\gamma_1, \gamma_2, \ldots, \gamma_n\}$, where $\gamma_j = 2[(\xi_j^R)^2 + (\xi_j^I)^2]$, $\xi_j^R = \max\{\xi_j^{R1}, \xi_j^{R2}\}$, and $\xi_j^I = \max\{\xi_j^{I1}, \xi_j^{I2}\}$.

(H2) The real and imaginary parts of $f_j(\cdot)$ cannot be separated, but $f_j(\cdot)$ is bounded and satisfies the condition
$$|f_j(z_1) - f_j(z_2)| \le \eta_j |z_1 - z_2| \quad (5)$$
for any $z_1, z_2 \in \mathbb{C}$, $j = 1, 2, \ldots, n$, where $\eta_j$ is a constant. Moreover, we define $\Pi = \mathrm{diag}\{\eta_1^2, \eta_2^2, \ldots, \eta_n^2\}$.

For the time-varying delay $\tau(t)$, we make the following assumption:

(H3) $\tau(t)$ is nonnegative and differentiable and satisfies $\dot\tau(t) \le \rho < 1$, where $\rho$ is a nonnegative constant.

As usual, a vector $\hat u = (\hat u_1, \hat u_2, \ldots, \hat u_n)^T$ is said to be an equilibrium point of neural network (1) if it satisfies
$$-C\hat u + (D + E) f(\hat u) + H = 0. \quad (6)$$

Throughout the paper, the equilibrium point of neural network (1) is assumed to exist. For notational convenience, we shift the intended equilibrium point $\hat u$ of neural network (1) to the origin by letting $z(t) = u(t) - \hat u$, which transforms neural network (1) into
$$\dot z(t) = -Cz(t) + Dg(z(t)) + Eg(z(t - \tau(t))), \qquad z(s) = \phi(s), \quad s \in (-\infty, 0], \quad (7)$$
for $t > 0$, where $g(z(\cdot)) = f(z(\cdot) + \hat u) - f(\hat u)$ and $\phi(s) = \psi(s) - \hat u$.

Next, we introduce some definitions and lemmas to be used in the stability analysis.
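The equilibrium equation (6) can be probed numerically. The following sketch (not part of the paper) runs a simple fixed-point iteration $u \leftarrow C^{-1}[(D+E)f(u) + H]$ using the data of Example 1 below; convergence relies on the small Lipschitz constant of that activation, so this is an illustration for one data set, not a general method.

```python
import numpy as np

# Data of Example 1 (network (38)); the iteration scheme itself is our own sketch.
C = np.eye(2)
D = np.array([[1 + 1j, -2 + 1j], [1 - 1j, -1 + 0j]])
E = np.array([[1 + 1j, 1j], [-1 + 1j, 2 + 0j]])
H = np.array([2 - 1j, -2 + 1j])

def f(u):
    # activation (39): acts separately on real and imaginary parts
    act = lambda v: (np.abs(v + 1) + np.abs(v - 1)) / 32
    return act(u.real) + 1j * act(u.imag)

u = np.zeros(2, dtype=complex)
for _ in range(200):
    u = np.linalg.solve(C, (D + E) @ f(u) + H)   # contraction for this data

residual = -C @ u + (D + E) @ f(u) + H           # left-hand side of (6)
assert np.linalg.norm(residual) < 1e-10
```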

Definition 1. Suppose that $\mu(t)$ is a positive continuous function satisfying $\mu(t) \to \infty$ as $t \to \infty$. If there exist scalars $M > 0$ and $T \ge 0$ such that
$$\|z(t)\| \le \frac{M}{\mu(t)}, \quad t \ge T, \quad (8)$$
then neural network (1) is said to be $\mu$-stable.

Definition 2. If there exist scalars $M > 0$, $\varepsilon > 0$, and $T \ge 0$ such that
$$\|z(t)\| \le \frac{M}{e^{\varepsilon t}}, \quad t \ge T, \quad (9)$$
then neural network (1) is said to be exponentially stable.

Definition 3. If there exist scalars $M > 0$, $\varepsilon > 0$, and $T \ge 0$ such that
$$\|z(t)\| \le \frac{M}{t^{\varepsilon}}, \quad t \ge T, \quad (10)$$
then neural network (1) is said to be power stable.

Remark 4. Obviously, both $e^{\varepsilon t}$ and $t^{\varepsilon}$ approach $+\infty$ as $t$ approaches $+\infty$. Therefore, neural network (1) is $\mu$-stable if it is exponentially stable or power stable; in other words, exponential stability and power stability are two special cases of $\mu$-stability.

Lemma 5 (see [22]). Given a Hermitian matrix $Q$, then $Q < 0$ is equivalent to
$$\begin{pmatrix} Q^R & Q^I \\ -Q^I & Q^R \end{pmatrix} < 0, \quad (11)$$
where $Q^R = \mathrm{Re}(Q)$ and $Q^I = \mathrm{Im}(Q)$.
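Lemma 5 can be illustrated numerically: for a Hermitian $Q = Q^R + iQ^I$, the real symmetric embedding in (11) carries each eigenvalue of $Q$ twice, so $Q < 0$ holds exactly when the embedding is negative definite. A small sanity check (our own, with a random matrix):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
Q = (A + A.conj().T) / 2                  # a random Hermitian matrix
QR, QI = Q.real, Q.imag

# real form from (11); symmetric because QI^T = -QI for Hermitian Q
M = np.block([[QR, QI], [-QI, QR]])

eig_Q = np.linalg.eigvalsh(Q)             # real spectrum of the Hermitian Q
eig_M = np.linalg.eigvalsh(M)
# each eigenvalue of Q appears twice in the real embedding
assert np.allclose(eig_M, np.sort(np.repeat(eig_Q, 2)))
```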

3. Main Results

In this section, we give several criteria for checking the global $\mu$-stability of the complex-valued neural network (1).

Theorem 6. Assume that assumptions (H1) and (H3) hold. Neural network (1) is globally $\mu$-stable if there exist two positive definite Hermitian matrices $P_1 = P_1^R + iP_1^I$ and $P_2 = P_2^R + iP_2^I$, two real positive diagonal matrices $R_1 > 0$ and $R_2 > 0$, two complex matrices $Q_1 = Q_1^R + iQ_1^I$ and $Q_2 = Q_2^R + iQ_2^I$, a positive differentiable function $\mu(t)$ defined on $[0, \infty)$, and three constants $\alpha \ge 0$, $\beta > 0$, and $T > 0$ such that
$$\frac{\dot\mu(t)}{\mu(t)} \le \alpha, \qquad \frac{\mu(t - \tau(t))}{\mu(t)} \ge \beta, \qquad \forall t \in [T, \infty), \quad (12)$$
and the following LMI holds:
$$\begin{pmatrix} \Omega^R & \Omega^I \\ -\Omega^I & \Omega^R \end{pmatrix} < 0, \quad (13)$$
where
$$\Omega^R = \begin{pmatrix}
\Omega_{11}^R & \Omega_{12}^R & 0 & \Omega_{14}^R & \Omega_{15}^R \\
\Omega_{21}^R & \Omega_{22}^R & 0 & \Omega_{24}^R & \Omega_{25}^R \\
0 & 0 & \Omega_{33}^R & 0 & 0 \\
\Omega_{41}^R & \Omega_{42}^R & 0 & -R_1 & 0 \\
\Omega_{51}^R & \Omega_{52}^R & 0 & 0 & -R_2
\end{pmatrix}, \qquad
\Omega^I = \begin{pmatrix}
\Omega_{11}^I & \Omega_{12}^I & 0 & \Omega_{14}^I & \Omega_{15}^I \\
\Omega_{21}^I & \Omega_{22}^I & 0 & \Omega_{24}^I & \Omega_{25}^I \\
0 & 0 & \Omega_{33}^I & 0 & 0 \\
\Omega_{41}^I & \Omega_{42}^I & 0 & 0 & 0 \\
\Omega_{51}^I & \Omega_{52}^I & 0 & 0 & 0
\end{pmatrix}, \quad (14)$$
in which
$$\begin{aligned}
\Omega_{11}^R &= 2\alpha P_1^R + P_2^R + CQ_1^R + (Q_1^R)^T C + R_1\Gamma, \\
\Omega_{12}^R &= P_1^R + (Q_1^R)^T + CQ_2^R, \\
\Omega_{14}^R &= -(Q_1^R)^T D^R - (Q_1^I)^T D^I, \\
\Omega_{15}^R &= -(Q_1^R)^T E^R - (Q_1^I)^T E^I, \\
\Omega_{21}^R &= P_1^R + Q_1^R + (Q_2^R)^T C, \\
\Omega_{22}^R &= Q_2^R + (Q_2^R)^T, \\
\Omega_{24}^R &= -(Q_2^R)^T D^R - (Q_2^I)^T D^I, \\
\Omega_{25}^R &= -(Q_2^R)^T E^R - (Q_2^I)^T E^I, \\
\Omega_{33}^R &= -\beta^2 (1-\rho) P_2^R + R_2\Gamma, \\
\Omega_{41}^R &= -(D^R)^T Q_1^R - (D^I)^T Q_1^I, \\
\Omega_{42}^R &= -(D^R)^T Q_2^R - (D^I)^T Q_2^I, \\
\Omega_{51}^R &= -(E^R)^T Q_1^R - (E^I)^T Q_1^I, \\
\Omega_{52}^R &= -(E^R)^T Q_2^R - (E^I)^T Q_2^I, \\
\Omega_{11}^I &= 2\alpha P_1^I + P_2^I + CQ_1^I - (Q_1^I)^T C, \\
\Omega_{12}^I &= P_1^I - (Q_1^I)^T + CQ_2^I, \\
\Omega_{14}^I &= (Q_1^I)^T D^R - (Q_1^R)^T D^I, \\
\Omega_{15}^I &= (Q_1^I)^T E^R - (Q_1^R)^T E^I, \\
\Omega_{21}^I &= P_1^I + Q_1^I - (Q_2^I)^T C, \\
\Omega_{22}^I &= Q_2^I - (Q_2^I)^T, \\
\Omega_{24}^I &= (Q_2^I)^T D^R - (Q_2^R)^T D^I, \\
\Omega_{25}^I &= (Q_2^I)^T E^R - (Q_2^R)^T E^I, \\
\Omega_{33}^I &= -\beta^2 (1-\rho) P_2^I, \\
\Omega_{41}^I &= (D^I)^T Q_1^R - (D^R)^T Q_1^I, \\
\Omega_{42}^I &= (D^I)^T Q_2^R - (D^R)^T Q_2^I, \\
\Omega_{51}^I &= (E^I)^T Q_1^R - (E^R)^T Q_1^I, \\
\Omega_{52}^I &= (E^I)^T Q_2^R - (E^R)^T Q_2^I.
\end{aligned} \quad (15)$$

Proof. Consider the following Lyapunov-Krasovskii functional candidate:
$$V(t) = \mu^2(t)\, z^*(t) P_1 z(t) + \int_{t-\tau(t)}^{t} \mu^2(s)\, z^*(s) P_2 z(s)\, \mathrm{d}s. \quad (16)$$

Calculating the derivative of $V(t)$ along the trajectories of neural network (7), we obtain
$$\begin{aligned}
\dot V(t) ={}& 2\dot\mu(t)\mu(t)\, z^*(t) P_1 z(t) + \mu^2(t)\, \dot z^*(t) P_1 z(t) \\
&+ \mu^2(t)\, z^*(t) P_1 \dot z(t) + \mu^2(t)\, z^*(t) P_2 z(t) \\
&- \mu^2(t-\tau(t))\, z^*(t-\tau(t)) P_2 z(t-\tau(t)) \left(1 - \dot\tau(t)\right) \\
\le{}& \mu^2(t) \Big[ 2\alpha\, z^*(t) P_1 z(t) + \dot z^*(t) P_1 z(t) + z^*(t) P_1 \dot z(t) + z^*(t) P_2 z(t) \\
&- \beta^2 (1-\rho)\, z^*(t-\tau(t)) P_2 z(t-\tau(t)) \Big], \quad t \ge T. \quad (17)
\end{aligned}$$
In deriving inequality (17), we have made use of condition (12) and assumption (H3).

From assumption (H1), we get
$$\left|g_j^R(x_j, y_j)\right| \le \xi_j^{R1}|x_j| + \xi_j^{R2}|y_j|, \qquad \left|g_j^I(x_j, y_j)\right| \le \xi_j^{I1}|x_j| + \xi_j^{I2}|y_j| \quad (18)$$
for all $j = 1, 2, \ldots, n$, where $x_j = \mathrm{Re}(z_j)$ and $y_j = \mathrm{Im}(z_j)$. Hence
$$\left|g_j(z_j(t))\right|^2 \le \left[(\xi_j^R)^2 + (\xi_j^I)^2\right] \left(|x_j| + |y_j|\right)^2 \le 2\left[(\xi_j^R)^2 + (\xi_j^I)^2\right] \left|z_j(t)\right|^2 = \gamma_j \left|z_j(t)\right|^2, \quad (19)$$
where $\xi_j^R = \max\{\xi_j^{R1}, \xi_j^{R2}\}$, $\xi_j^I = \max\{\xi_j^{I1}, \xi_j^{I2}\}$, and $\gamma_j = 2[(\xi_j^R)^2 + (\xi_j^I)^2]$, $j = 1, 2, \ldots, n$. Let $R_1 = \mathrm{diag}(r_1, r_2, \ldots, r_n) > 0$. It follows from (19) that
$$r_j \mu^2(t)\, g_j^*(z_j(t)) g_j(z_j(t)) - r_j \mu^2(t)\, \gamma_j z_j^*(t) z_j(t) \le 0 \quad (20)$$
for $j = 1, 2, \ldots, n$. Thus
$$\mu^2(t)\, g^*(z(t)) R_1 g(z(t)) - \mu^2(t)\, z^*(t) R_1 \Gamma z(t) \le 0. \quad (21)$$
Also, we can get
$$\mu^2(t)\, g^*(z(t-\tau(t))) R_2 g(z(t-\tau(t))) - \mu^2(t)\, z^*(t-\tau(t)) R_2 \Gamma z(t-\tau(t)) \le 0. \quad (22)$$
From (7), we have
$$\begin{aligned}
0 ={}& \mu^2(t) \left[\dot z(t) + Cz(t) - Dg(z(t)) - Eg(z(t-\tau(t)))\right]^* \left[Q_1 z(t) + Q_2 \dot z(t)\right] \\
&+ \mu^2(t) \left[Q_1 z(t) + Q_2 \dot z(t)\right]^* \left[\dot z(t) + Cz(t) - Dg(z(t)) - Eg(z(t-\tau(t)))\right]. \quad (23)
\end{aligned}$$
It follows from (17), (21), (22), and (23) that
$$\dot V(t) \le \mu^2(t)\, w^*(t)\, \Omega\, w(t), \quad t \ge T, \quad (24)$$
where $w(t) = \left(z(t), \dot z(t), z(t-\tau(t)), g(z(t)), g(z(t-\tau(t)))\right)^{*}$ and
$$\Omega = \begin{pmatrix}
\Omega_{11} & \Omega_{12} & 0 & -Q_1^* D & -Q_1^* E \\
\Omega_{21} & Q_2^* + Q_2 & 0 & -Q_2^* D & -Q_2^* E \\
0 & 0 & \Omega_{33} & 0 & 0 \\
-D^* Q_1 & -D^* Q_2 & 0 & -R_1 & 0 \\
-E^* Q_1 & -E^* Q_2 & 0 & 0 & -R_2
\end{pmatrix} \quad (25)$$
with $\Omega_{11} = 2\alpha P_1 + P_2 + CQ_1 + Q_1^* C + R_1\Gamma$, $\Omega_{12} = P_1^* + Q_1^* + CQ_2$, $\Omega_{21} = P_1 + Q_1 + Q_2^* C$, and $\Omega_{33} = -\beta^2(1-\rho) P_2 + R_2\Gamma$. From (14), we have $\Omega = \Omega^R + i\Omega^I$. It follows from (13) and Lemma 5 that $\Omega < 0$. Thus we get from (24) that
$$\dot V(t) \le 0, \quad t \ge T, \quad (26)$$
which means $V(t)$ is monotonically nonincreasing for $t \ge T$. So we have
$$V(t) \le V(T), \quad t \ge T. \quad (27)$$
From the definition of $V(t)$ in (16), we obtain
$$\mu^2(t)\, \lambda_{\min}(P_1) \|z(t)\|^2 \le V(t) \le V_0 < \infty, \quad t \ge T, \quad (28)$$
where $V_0 = \max_{0 \le s \le T} V(s)$. It implies that
$$\|z(t)\| \le \frac{M}{\mu(t)}, \quad t \ge T, \quad (29)$$
where $M = \sqrt{V_0 / \lambda_{\min}(P_1)}$. The proof is completed.
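The key bound (19) can be spot-checked numerically. Below is our own check for a sample (H1)-type shifted activation with $\xi_j^{R1} = \xi_j^{I2} = 1/16$ and $\xi_j^{R2} = \xi_j^{I1} = 0$ (this hypothetical choice mirrors Example 1), so $\gamma_j = 2[(1/16)^2 + (1/16)^2] = 1/64$:

```python
import numpy as np

gamma = 1.0 / 64.0
# shifted scalar activation with g1(0) = 0 and Lipschitz constant 1/16
g1 = lambda x: (np.abs(x + 1) + np.abs(x - 1) - 2) / 32
g = lambda z: g1(z.real) + 1j * g1(z.imag)    # real/imaginary parts handled separately

rng = np.random.default_rng(1)
z = 5 * rng.standard_normal(1000) + 5j * rng.standard_normal(1000)
# |g_j(z_j)|^2 <= gamma_j |z_j|^2, i.e., inequality (19)
assert np.all(np.abs(g(z)) ** 2 <= gamma * np.abs(z) ** 2 + 1e-12)
```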

Theorem 7. Assume that assumptions (H2) and (H3) hold. Neural network (1) is globally $\mu$-stable if there exist two positive definite Hermitian matrices $P_1 = P_1^R + iP_1^I$ and $P_2 = P_2^R + iP_2^I$, two real positive diagonal matrices $R_1 > 0$ and $R_2 > 0$, two complex matrices $Q_1 = Q_1^R + iQ_1^I$ and $Q_2 = Q_2^R + iQ_2^I$, a positive differentiable function $\mu(t)$ defined on $[0, \infty)$, and three constants $\alpha \ge 0$, $\beta > 0$, and $T > 0$ such that conditions (12), (13), and (14) in Theorem 6 are satisfied, where $\Omega_{11}^R = 2\alpha P_1^R + P_2^R + CQ_1^R + (Q_1^R)^T C + R_1\Pi$ and $\Omega_{33}^R = -\beta^2(1-\rho)P_2^R + R_2\Pi$.

Proof. From assumption (H2), we get
$$\left|g_j(z_j(t))\right| \le \eta_j \left|z_j(t)\right| \quad (30)$$
for $j = 1, 2, \ldots, n$. Let $R_1 = \mathrm{diag}(r_1, r_2, \ldots, r_n) > 0$. It follows from (30) that
$$r_j \mu^2(t)\, g_j^*(z_j(t)) g_j(z_j(t)) - r_j \mu^2(t)\, \eta_j^2 z_j^*(t) z_j(t) \le 0 \quad (31)$$
for $j = 1, 2, \ldots, n$. Hence
$$\mu^2(t)\, g^*(z(t)) R_1 g(z(t)) - \mu^2(t)\, z^*(t) R_1 \Pi z(t) \le 0. \quad (32)$$
Also, we can get
$$\mu^2(t)\, g^*(z(t-\tau(t))) R_2 g(z(t-\tau(t))) - \mu^2(t)\, z^*(t-\tau(t)) R_2 \Pi z(t-\tau(t)) \le 0. \quad (33)$$
Consider the Lyapunov-Krasovskii functional (16). From (17), (23), (32), and (33), we can get
$$\dot V(t) \le \mu^2(t)\, w^*(t)\, \Omega\, w(t), \quad t \ge T, \quad (34)$$
where $w(t) = \left(z(t), \dot z(t), z(t-\tau(t)), g(z(t)), g(z(t-\tau(t)))\right)^{*}$ and
$$\Omega = \begin{pmatrix}
\Omega_{11} & \Omega_{12} & 0 & -Q_1^* D & -Q_1^* E \\
\Omega_{21} & Q_2^* + Q_2 & 0 & -Q_2^* D & -Q_2^* E \\
0 & 0 & \Omega_{33} & 0 & 0 \\
-D^* Q_1 & -D^* Q_2 & 0 & -R_1 & 0 \\
-E^* Q_1 & -E^* Q_2 & 0 & 0 & -R_2
\end{pmatrix} \quad (35)$$
with $\Omega_{11} = 2\alpha P_1 + P_2 + CQ_1 + Q_1^* C + R_1\Pi$, $\Omega_{12} = P_1^* + Q_1^* + CQ_2$, $\Omega_{21} = P_1 + Q_1 + Q_2^* C$, and $\Omega_{33} = -\beta^2(1-\rho)P_2 + R_2\Pi$. From the proof of Theorem 6, we know that neural network (1) is $\mu$-stable. The proof is completed.

Corollary 8. Assume that assumptions (H1) and (H3) hold and $\tau(t) \le \tau$. Neural network (1) is globally exponentially stable if there exist two positive definite Hermitian matrices $P_1 = P_1^R + iP_1^I$ and $P_2 = P_2^R + iP_2^I$, two real positive diagonal matrices $R_1 > 0$ and $R_2 > 0$, two complex matrices $Q_1 = Q_1^R + iQ_1^I$ and $Q_2 = Q_2^R + iQ_2^I$, and a constant $\varepsilon > 0$ such that conditions (13) and (14) in Theorem 6 are satisfied, where $\Omega_{11}^R = 2\varepsilon P_1^R + P_2^R + CQ_1^R + (Q_1^R)^T C + R_1\Gamma$, $\Omega_{11}^I = 2\varepsilon P_1^I + P_2^I + CQ_1^I - (Q_1^I)^T C$, $\Omega_{33}^R = -e^{-2\varepsilon\tau}(1-\rho)P_2^R + R_2\Gamma$, and $\Omega_{33}^I = -e^{-2\varepsilon\tau}(1-\rho)P_2^I$.

Proof. Let $\mu(t) = e^{\varepsilon t}$ ($t \ge 0$); then $\dot\mu(t)/\mu(t) = \varepsilon$ and $\mu(t-\tau(t))/\mu(t) = e^{-\varepsilon\tau(t)} \ge e^{-\varepsilon\tau}$. Take $\alpha = \varepsilon$, $\beta = e^{-\varepsilon\tau}$, and $T = 0$. It is obvious that condition (12) in Theorem 6 is satisfied. The proof is completed.

Corollary 9. Assume that assumptions (H1) and (H3) hold and $\tau(t) \le \delta t$ ($0 < \delta < 1$). Neural network (1) is globally power stable if there exist two positive definite Hermitian matrices $P_1 = P_1^R + iP_1^I$ and $P_2 = P_2^R + iP_2^I$, two real positive diagonal matrices $R_1 > 0$ and $R_2 > 0$, two complex matrices $Q_1 = Q_1^R + iQ_1^I$ and $Q_2 = Q_2^R + iQ_2^I$, and two constants $\varepsilon > 0$ and $T > 0$ such that conditions (13) and (14) in Theorem 6 are satisfied, where $\Omega_{11}^R = (2\varepsilon/T)P_1^R + P_2^R + CQ_1^R + (Q_1^R)^T C + R_1\Gamma$, $\Omega_{11}^I = (2\varepsilon/T)P_1^I + P_2^I + CQ_1^I - (Q_1^I)^T C$, $\Omega_{33}^R = -(1-\delta)^{2\varepsilon}(1-\rho)P_2^R + R_2\Gamma$, and $\Omega_{33}^I = -(1-\delta)^{2\varepsilon}(1-\rho)P_2^I$.

Proof. Let $\mu(t) = t^{\varepsilon}$ ($t \ge T$); then $\dot\mu(t)/\mu(t) = \varepsilon/t \le \varepsilon/T$ and $\mu(t-\tau(t))/\mu(t) = (1 - \tau(t)/t)^{\varepsilon} \ge (1-\delta)^{\varepsilon}$. Take $\alpha = \varepsilon/T$ and $\beta = (1-\delta)^{\varepsilon}$. It is obvious that condition (12) in Theorem 6 is satisfied. The proof is completed.
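As with Corollary 8, the power-stability choice of $\alpha$ and $\beta$ can be checked numerically for the proportional delay $\tau(t) = \delta t$; the constants below match Example 2 ($\varepsilon = 1$, $\delta = 0.2$, $T = 10$):

```python
import numpy as np

eps, delta, T = 1.0, 0.2, 10.0
t = np.linspace(T, 200.0, 1000)

mu = t ** eps
mu_dot = eps * t ** (eps - 1)
alpha, beta = eps / T, (1 - delta) ** eps

# mu_dot/mu = eps/t <= eps/T = alpha for t >= T
assert np.all(mu_dot / mu <= alpha + 1e-12)
# mu(t - tau(t))/mu(t) = (1 - delta)^eps = beta exactly for tau(t) = delta*t
ratio = ((t - delta * t) ** eps) / mu
assert np.allclose(ratio, beta)
```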

Similar to the proofs of Corollaries 8 and 9, we have the following results.

Corollary 10. Assume that assumptions (H2) and (H3) hold and $\tau(t) \le \tau$. Neural network (1) is globally exponentially stable if there exist two positive definite Hermitian matrices $P_1 = P_1^R + iP_1^I$ and $P_2 = P_2^R + iP_2^I$, two real positive diagonal matrices $R_1 > 0$ and $R_2 > 0$, two complex matrices $Q_1 = Q_1^R + iQ_1^I$ and $Q_2 = Q_2^R + iQ_2^I$, and a constant $\varepsilon > 0$ such that conditions (13) and (14) in Theorem 7 are satisfied, where $\Omega_{11}^R = 2\varepsilon P_1^R + P_2^R + CQ_1^R + (Q_1^R)^T C + R_1\Pi$, $\Omega_{11}^I = 2\varepsilon P_1^I + P_2^I + CQ_1^I - (Q_1^I)^T C$, $\Omega_{33}^R = -e^{-2\varepsilon\tau}(1-\rho)P_2^R + R_2\Pi$, and $\Omega_{33}^I = -e^{-2\varepsilon\tau}(1-\rho)P_2^I$.

Corollary 11. Assume that assumptions (H2) and (H3) hold and $\tau(t) \le \delta t$ ($0 < \delta < 1$). Neural network (1) is globally power stable if there exist two positive definite Hermitian matrices $P_1 = P_1^R + iP_1^I$ and $P_2 = P_2^R + iP_2^I$, two real positive diagonal matrices $R_1 > 0$ and $R_2 > 0$, two complex matrices $Q_1 = Q_1^R + iQ_1^I$ and $Q_2 = Q_2^R + iQ_2^I$, and two constants $\varepsilon > 0$ and $T > 0$ such that conditions (13) and (14) in Theorem 7 are satisfied, where $\Omega_{11}^R = (2\varepsilon/T)P_1^R + P_2^R + CQ_1^R + (Q_1^R)^T C + R_1\Pi$, $\Omega_{11}^I = (2\varepsilon/T)P_1^I + P_2^I + CQ_1^I - (Q_1^I)^T C$, $\Omega_{33}^R = -(1-\delta)^{2\varepsilon}(1-\rho)P_2^R + R_2\Pi$, and $\Omega_{33}^I = -(1-\delta)^{2\varepsilon}(1-\rho)P_2^I$.

Remark 12. In [18, 23], the authors studied the global stability of complex-valued neural networks with time-varying delays, where the activation function was supposed to be
$$f_j(z) = f_j^R(x, y) + i f_j^I(x, y), \quad (36)$$
where $z = x + iy$ and $f_j^R(\cdot,\cdot)$, $f_j^I(\cdot,\cdot)$ are real-valued for all $j = 1, 2, \ldots, n$, and the partial derivatives of $f_j^R(\cdot,\cdot)$ and $f_j^I(\cdot,\cdot)$ with respect to $x$ and $y$ were required to exist and be continuous. In this paper, neither the real parts nor the imaginary parts of the activation functions are assumed to be differentiable.

Remark 13. In [21], the authors investigated boundedness and stability for discrete-time complex-valued neural networks with constant delay, where the activation function was supposed to be
$$f_j(z) = \max\{0, \mathrm{Re}(z)\} + i\max\{0, \mathrm{Im}(z)\}. \quad (37)$$
The authors also gave a complex-valued LMI criterion for checking stability of the considered neural networks, but a feasible way to solve the given complex-valued LMI was not provided. In this paper, the criteria for checking stability of the complex-valued neural networks are established as real LMIs, which can be checked numerically using the effective LMI toolbox in MATLAB.

4. Examples

Example 1. Consider a two-neuron complex-valued neural network with constant delay:
$$\dot u(t) = -Cu(t) + Df(u(t)) + Ef(u(t-\tau)) + H, \quad (38)$$
where
$$C = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}, \quad D = \begin{pmatrix} 1+i & -2+i \\ 1-i & -1 \end{pmatrix}, \quad E = \begin{pmatrix} 1+i & i \\ -1+i & 2 \end{pmatrix}, \quad H = \begin{pmatrix} 2-i \\ -2+i \end{pmatrix}, \quad \tau = 2,$$
$$f_j^R(u_j) = \frac{1}{32}\left(\left|\mathrm{Re}(u_j)+1\right| + \left|\mathrm{Re}(u_j)-1\right|\right), \qquad f_j^I(u_j) = \frac{1}{32}\left(\left|\mathrm{Im}(u_j)+1\right| + \left|\mathrm{Im}(u_j)-1\right|\right), \quad j = 1, 2. \quad (39)$$

It can be verified that the activation functions $f_j$ satisfy assumption (H1), the time-varying delay $\tau(t)$ satisfies assumption (H3), and
$$D^R = \begin{pmatrix} 1 & -2 \\ 1 & -1 \end{pmatrix}, \quad D^I = \begin{pmatrix} 1 & 1 \\ -1 & 0 \end{pmatrix}, \quad E^R = \begin{pmatrix} 1 & 0 \\ -1 & 2 \end{pmatrix}, \quad E^I = \begin{pmatrix} 1 & 1 \\ 1 & 0 \end{pmatrix}, \quad \Gamma = \begin{pmatrix} 1/64 & 0 \\ 0 & 1/64 \end{pmatrix}, \quad \rho = 0.$$
There exists a constant $\varepsilon = 0.2$, and by employing the MATLAB LMI toolbox, we can find the solutions to the LMI under conditions (13) and (14) in Corollary 8 as follows:
$$P_1 = \begin{pmatrix} 41.2118 & -6.8812 - 4.2410i \\ -6.8812 + 4.2410i & 40.6762 \end{pmatrix}, \qquad P_2 = \begin{pmatrix} 26.0370 & -4.1385 - 0.7600i \\ -4.1385 + 0.7600i & 25.9447 \end{pmatrix},$$
$$R_1 = \begin{pmatrix} 785.9319 & 0 \\ 0 & 857.0773 \end{pmatrix}, \qquad R_2 = \begin{pmatrix} 560.5160 & 0 \\ 0 & 617.1000 \end{pmatrix},$$
$$Q_1 = \begin{pmatrix} -61.6337 + 1.1580i & 9.2494 + 0.6392i \\ 9.1464 + 0.1434i & -61.4512 - 1.3717i \end{pmatrix}, \qquad Q_2 = \begin{pmatrix} -63.1096 - 0.5628i & 10.2662 + 2.7895i \\ 10.1047 - 3.3169i & -62.0592 + 0.6291i \end{pmatrix}. \quad (40)$$

Then all conditions in Corollary 8 are satisfied. Therefore, the neural network (38) is globally exponentially stable. Figures 1 and 2 depict the real and imaginary parts of the states of the considered neural network (38), where the initial condition is $z_1(t) = 0.5$, $z_2(t) = -1 - 0.5i$.
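A rough forward-Euler run of network (38) (our own sketch, not the paper's simulation) illustrates the exponential stability: trajectories from two different constant initial histories, one being the condition above and one arbitrary, approach the same equilibrium.

```python
import numpy as np

C = np.eye(2)
D = np.array([[1 + 1j, -2 + 1j], [1 - 1j, -1 + 0j]])
E = np.array([[1 + 1j, 1j], [-1 + 1j, 2 + 0j]])
H = np.array([2 - 1j, -2 + 1j])
tau, h, steps = 2.0, 0.01, 6000
d = int(tau / h)                    # delay expressed in Euler steps

def f(u):
    # activation (39): acts separately on real and imaginary parts
    act = lambda v: (np.abs(v + 1) + np.abs(v - 1)) / 32
    return act(u.real) + 1j * act(u.imag)

def simulate(u0):
    hist = [np.array(u0, dtype=complex)] * (d + 1)   # constant initial history
    for _ in range(steps):
        u, ud = hist[-1], hist[-1 - d]
        hist.append(u + h * (-C @ u + D @ f(u) + E @ f(ud) + H))
    return hist[-1]

u_a = simulate([0.5, -1 - 0.5j])    # initial condition used for Figures 1 and 2
u_b = simulate([2 + 2j, 1j])        # a second, arbitrary initial condition
assert np.linalg.norm(u_a - u_b) < 1e-3
```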

Example 2. Consider a two-neuron complex-valued neural network with unbounded time-varying delays:
$$\dot u(t) = -Cu(t) + Df(u(t)) + Ef(u(t-\tau(t))) + H, \quad (41)$$
where
$$C = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}, \quad D = \begin{pmatrix} 1.5+3i & -1+2i \\ -2-2i & 2.5+0.5i \end{pmatrix}, \quad E = \begin{pmatrix} 0.5+i & i \\ -2+0.5i & 1+0.5i \end{pmatrix}, \quad H = \begin{pmatrix} 2+i \\ -2+i \end{pmatrix}, \quad \tau(t) = 0.2t,$$
$$f_1(u_1) = \frac{1}{20}\left(\left|u_1+1\right| - \left|u_1-1\right|\right), \qquad f_2(u_2) = \frac{1}{20}\left(\left|u_2+1\right| - \left|u_2-1\right|\right). \quad (42)$$

Condition (H2) the time-varying delay 120591(119905) satisfies assump-tion (H3) and 119863

119877

= ( 15 minus1minus2 25

) 119863119868 = ( 3 2minus2 05

) 119864119877 = ( 05 0minus2 1

)

Abstract and Applied Analysis 7

0 10 20 30 40minus3

minus2

minus1

0

1

2

3

t

Re(z

1(t))

and

Re(z

2(t))

Re(z1(t))Re(z2(t))

Figure 1 State trajectories of neural network (38)

minus1

minus05

0

05

1

15

0 10 20 30 40t

Im(z1(t))

Im(z2(t))

Im(z

1(t))

and

Im(z

2(t))

Figure 2 State trajectories of neural network (38)

119864119868

= ( 1 105 05

) Π = ( 001 00 001

) 120588 = 02 There exist twoconstants 120576 = 1 and 119879 = 10 and by employing MATLABLMI toolbox we can find the solutions to the LMI underconditions (13) and (14) in Corollary 11 as follows

$$P_1 = \begin{pmatrix} 39.8524 & 14.7299 - 3.2336i \\ 14.7299 + 3.2336i & 31.5075 \end{pmatrix}, \qquad P_2 = \begin{pmatrix} 23.9471 & 9.7114 - 1.3426i \\ 9.7114 + 1.3426i & 18.8527 \end{pmatrix},$$
$$R_1 = 10^3 \times \begin{pmatrix} 1.3534 & 0 \\ 0 & 1.0135 \end{pmatrix}, \qquad R_2 = \begin{pmatrix} 584.6278 & 0 \\ 0 & 429.3048 \end{pmatrix},$$

[Figure 3: State trajectories of neural network (41): Re(z₁(t)) and Re(z₂(t)).]

[Figure 4: State trajectories of neural network (41): Im(z₁(t)) and Im(z₂(t)).]

$$Q_1 = \begin{pmatrix} -51.8770 - 1.5115i & -16.7003 + 0.6185i \\ -16.8517 - 0.9537i & -40.7204 + 1.0687i \end{pmatrix}, \qquad Q_2 = \begin{pmatrix} -60.4708 + 0.2332i & -21.2500 + 3.6232i \\ -21.6247 - 3.5095i & -48.2459 - 0.1731i \end{pmatrix}. \quad (43)$$

Then all conditions in Corollary 11 are satisfied. Thus, the neural network (41) is globally power stable. Figures 3 and 4 depict the real and imaginary parts of the states of the considered network (41), where the initial condition is $x_1(t) = 0.5\sin(0.5t)i$, $x_2(t) = -0.5 + 1.5\cos(0.3t)i$.
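Network (41) can also be integrated directly despite the unbounded delay, because the proportional delay $\tau(t) = 0.2t$ means the delayed state is $u(0.8t)$, which is always a past grid point. A rough forward-Euler sketch of our own (with $u(0)$ taken from the history above; for $t > 0$ only $u(0)$ matters, since $0.8t \ge 0$):

```python
import numpy as np

C = np.eye(2)
D = np.array([[1.5 + 3j, -1 + 2j], [-2 - 2j, 2.5 + 0.5j]])
E = np.array([[0.5 + 1j, 1j], [-2 + 0.5j, 1 + 0.5j]])
H = np.array([2 + 1j, -2 + 1j])
h, steps = 0.01, 8000

# activation (42): real-valued function of the complex modulus
f = lambda u: (np.abs(u + 1) - np.abs(u - 1)) / 20

u = np.zeros((steps + 1, 2), dtype=complex)
u[0] = [0j, -0.5 + 1.5j]            # history evaluated at t = 0
for k in range(steps):
    ud = u[int(0.8 * k)]            # u(t - 0.2t) = u(0.8t) on the grid
    u[k + 1] = u[k] + h * (-C @ u[k] + D @ f(u[k]) + E @ f(ud) + H)

# the trajectory stays bounded and settles toward the equilibrium
assert np.max(np.abs(u)) < 10.0
assert np.linalg.norm(u[steps] - u[steps - 1]) < 1e-2
```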


5. Conclusion

In this paper, the global $\mu$-stability of complex-valued neural networks with unbounded time-varying delays has been investigated by employing a combination of Lyapunov-Krasovskii functionals and the free weighting matrix method. Several sufficient conditions for checking the global $\mu$-stability of the considered complex-valued neural networks have been given in terms of linear matrix inequalities (LMIs), which can be checked numerically using the effective LMI toolbox in MATLAB. As direct applications, exponential stability and power stability follow for the corresponding classes of time-varying delays. Two examples have been provided to demonstrate the effectiveness and reduced conservatism of the proposed criteria.

Remark 14. Note that the activation functions in this paper are continuous. Recently, neural networks with discontinuous activations have received more and more attention [31–34]. Based on the results of this paper, we will consider the $\mu$-stability of complex-valued neural networks with discontinuous activations in future work.

Conflict of Interests

The authors declare that they have no conflict of interests regarding the publication of this paper.

Acknowledgments

This work was supported by the National Natural Science Foundation of China under Grants 61273021 and 11172247 and in part by the Natural Science Foundation Project of CQ cstc2013jjB40008.

References

[1] H. Dong, Z. Wang, and H. Gao, "Distributed $H_\infty$ filtering for a class of Markovian jump nonlinear time-delay systems over lossy sensor networks," IEEE Transactions on Industrial Electronics, vol. 60, no. 10, pp. 4665–4672, 2013.

[2] J. Hu, Z. Wang, B. Shen, and H. Gao, "Quantised recursive filtering for a class of nonlinear systems with multiplicative noises and missing measurements," International Journal of Control, vol. 86, no. 4, pp. 650–663, 2013.

[3] Y. Fang, K. Yan, and K. Li, "Impulsive synchronization of a class of chaotic systems," Systems Science and Control Engineering, vol. 2, no. 1, pp. 55–60, 2014.

[4] M. Darouach and M. Chadli, "Admissibility and control of switched discrete-time singular systems," Systems Science and Control Engineering, vol. 1, no. 1, pp. 43–51, 2013.

[5] B. Shen, Z. Wang, D. Ding, and H. Shu, "$H_\infty$ state estimation for complex networks with uncertain inner coupling and incomplete measurements," IEEE Transactions on Neural Networks and Learning Systems, vol. 24, no. 12, pp. 2027–2037, 2013.

[6] B. Shen, Z. Wang, and X. Liu, "Sampled-data synchronization control of dynamical networks with stochastic sampling," IEEE Transactions on Automatic Control, vol. 57, no. 10, pp. 2644–2650, 2012.

[7] J. Liang, Z. Wang, Y. Liu, and X. Liu, "State estimation for two-dimensional complex networks with randomly occurring nonlinearities and randomly varying sensor delays," International Journal of Robust and Nonlinear Control, vol. 24, no. 1, pp. 18–38, 2014.

[8] X. Kan, Z. Wang, and H. Shu, "State estimation for discrete-time delayed neural networks with fractional uncertainties and sensor saturations," Neurocomputing, vol. 117, pp. 64–71, 2013.

[9] Z. Wang, B. Zineddin, J. Liang, et al., "A novel neural network approach to cDNA microarray image segmentation," Computer Methods and Programs in Biomedicine, vol. 111, no. 1, pp. 189–198, 2013.

[10] A. Hirose, Complex-Valued Neural Networks, Springer, Berlin, Germany, 2012.

[11] G. Tanaka and K. Aihara, "Complex-valued multistate associative memory with nonlinear multilevel functions for gray-level image reconstruction," IEEE Transactions on Neural Networks, vol. 20, no. 9, pp. 1463–1473, 2009.

[12] S. Chen, L. Hanzo, and S. Tan, "Symmetric complex-valued RBF receiver for multiple-antenna-aided wireless systems," IEEE Transactions on Neural Networks, vol. 19, no. 9, pp. 1659–1665, 2008.

[13] I. Cha and S. A. Kassam, "Channel equalization using adaptive complex radial basis function networks," IEEE Journal on Selected Areas in Communications, vol. 13, no. 1, pp. 122–131, 1995.

[14] T. Nitta, "Orthogonality of decision boundaries in complex-valued neural networks," Neural Computation, vol. 16, no. 1, pp. 73–97, 2004.

[15] M. Faijul Amin and K. Murase, "Single-layered complex-valued neural network for real-valued classification problems," Neurocomputing, vol. 72, no. 4–6, pp. 945–955, 2009.

[16] B. K. Tripathi and P. K. Kalra, "On efficient learning machine with root-power mean neuron in complex domain," IEEE Transactions on Neural Networks, vol. 22, no. 5, pp. 727–738, 2011.

[17] J. H. Mathews and R. W. Howell, Complex Analysis for Mathematics and Engineering, Jones and Bartlett Publishers, 3rd edition, 1997.

[18] J. Hu and J. Wang, "Global stability of complex-valued recurrent neural networks with time-delays," IEEE Transactions on Neural Networks, vol. 23, no. 6, pp. 853–865, 2012.

[19] V. S. H. Rao and G. R. Murthy, "Global dynamics of a class of complex valued neural networks," International Journal of Neural Systems, vol. 18, no. 2, pp. 165–171, 2008.

[20] W. Zhou and J. M. Zurada, "Discrete-time recurrent neural networks with complex-valued linear threshold neurons," IEEE Transactions on Circuits and Systems II: Express Briefs, vol. 56, no. 8, pp. 669–673, 2009.

[21] C. Duan and Q. Song, "Boundedness and stability for discrete-time delayed neural network with complex-valued linear threshold neurons," Discrete Dynamics in Nature and Society, vol. 2010, Article ID 368379, 19 pages, 2010.

[22] B. Zou and Q. Song, "Boundedness and complete stability of complex-valued neural networks with time delay," IEEE Transactions on Neural Networks and Learning Systems, vol. 24, no. 8, pp. 1227–1238, 2013.

[23] X. Xu, J. Zhang, and J. Shi, "Exponential stability of complex-valued neural networks with mixed delays," Neurocomputing, vol. 128, pp. 483–490, 2014.

[24] X. Chen and Q. Song, "Global stability of complex-valued neural networks with both leakage time delay and discrete time delay on time scales," Neurocomputing, vol. 121, pp. 254–264, 2013.

[25] S.-I. Niculescu, Delay Effects on Stability: A Robust Control Approach, Springer, London, UK, 2001.

[26] V. B. Kolmanovskii and V. R. Nosov, Stability of Functional-Differential Equations, Academic Press, London, UK, 1986.

[27] Y Liu Z Wang J Liang and X Liu ldquoSynchronization ofcoupled neutral-type neural networks with jumping-mode-dependent discrete and unbounded distributed delaysrdquo IEEETransactions on Cybernetics vol 43 no 1 pp 102ndash114 2013

[28] T Chen and L Wang ldquoPower-rate global stability of dynamicalsystems with unbounded time-varying delaysrdquo IEEE Transac-tions on Circuits and Systems II Express Briefs vol 54 no 8 pp705ndash709 2007

[29] T Chen and L Wang ldquoGlobal 120583-stability of delayed neural net-works with unbounded time-varying delaysrdquo IEEETransactionson Neural Networks vol 18 no 6 pp 1836ndash1840 2007

[30] X Liu and T Chen ldquoRobust 120583-stability for uncertain stochasticneural networks with unbounded time-varying delaysrdquo PhysicaA Statistical Mechanics and Its Applications vol 387 no 12 pp2952ndash2962 2008

[31] X Liu and J Cao ldquoOn periodic solutions of neural networksvia differential inclusionsrdquo Neural Networks vol 22 no 4 pp329ndash334 2009

[32] X Liu and J Cao ldquoRobust state estimation for neural networkswith discontinuous activationsrdquo IEEE Transactions on SystemsMan and Cybernetics B Cybernetics vol 40 no 6 pp 1425ndash1437 2010

[33] X Liu T Chen J Cao and W Lu ldquoDissipativity and quasi-synchronization for neural networks with discontinuous acti-vations and parameter mismatchesrdquo Neural Networks vol 24no 10 pp 1013ndash1021 2011

[34] X Liu J Cao and W Yu ldquoFilippov systems and quasi-sync-hronization control for switched networksrdquo Chaos vol 22 no3 Article ID 033110 2012


4 Abstract and Applied Analysis

$$
\begin{aligned}
\Omega^{R}_{21} &= P^{R}_{1} + Q^{R}_{1} + (Q^{R}_{2})^{T} C, &
\Omega^{R}_{22} &= Q^{R}_{2} + (Q^{R}_{2})^{T}, \\
\Omega^{R}_{24} &= -(Q^{R}_{2})^{T} D^{R} - (Q^{I}_{2})^{T} D^{I}, &
\Omega^{R}_{25} &= -(Q^{R}_{2})^{T} E^{R} - (Q^{I}_{2})^{T} E^{I}, \\
\Omega^{R}_{33} &= -\beta^{2}(1-\rho) P^{R}_{2} + R_{2}\Gamma, &
\Omega^{R}_{41} &= -(D^{R})^{T} Q^{R}_{1} - (D^{I})^{T} Q^{I}_{1}, \\
\Omega^{R}_{42} &= -(D^{R})^{T} Q^{R}_{2} - (D^{I})^{T} Q^{I}_{2}, &
\Omega^{R}_{51} &= -(E^{R})^{T} Q^{R}_{1} - (E^{I})^{T} Q^{I}_{1}, \\
\Omega^{R}_{52} &= -(E^{R})^{T} Q^{R}_{2} - (E^{I})^{T} Q^{I}_{2}, &
\Omega^{I}_{11} &= 2\alpha P^{I}_{1} + P^{I}_{2} + C Q^{I}_{1} - (Q^{I}_{1})^{T} C, \\
\Omega^{I}_{12} &= P^{I}_{1} - (Q^{I}_{1})^{T} + C Q^{I}_{2}, &
\Omega^{I}_{14} &= (Q^{I}_{1})^{T} D^{R} - (Q^{R}_{1})^{T} D^{I}, \\
\Omega^{I}_{15} &= (Q^{I}_{1})^{T} E^{R} - (Q^{R}_{1})^{T} E^{I}, &
\Omega^{I}_{21} &= P^{I}_{1} + Q^{I}_{1} - (Q^{I}_{2})^{T} C, \\
\Omega^{I}_{22} &= Q^{I}_{2} - (Q^{I}_{2})^{T}, &
\Omega^{I}_{24} &= (Q^{I}_{2})^{T} D^{R} - (Q^{R}_{2})^{T} D^{I}, \\
\Omega^{I}_{25} &= (Q^{I}_{2})^{T} E^{R} - (Q^{R}_{2})^{T} E^{I}, &
\Omega^{I}_{33} &= -\beta^{2}(1-\rho) P^{I}_{2}, \\
\Omega^{I}_{41} &= (D^{I})^{T} Q^{R}_{1} - (D^{R})^{T} Q^{I}_{1}, &
\Omega^{I}_{42} &= (D^{I})^{T} Q^{R}_{2} - (D^{R})^{T} Q^{I}_{2}, \\
\Omega^{I}_{51} &= (E^{I})^{T} Q^{R}_{1} - (E^{R})^{T} Q^{I}_{1}, &
\Omega^{I}_{52} &= (E^{I})^{T} Q^{R}_{2} - (E^{R})^{T} Q^{I}_{2}.
\end{aligned}
\tag{15}
$$

Proof. Consider the following Lyapunov-Krasovskii functional candidate:

$$
V(t) = \mu^{2}(t) z^{*}(t) P_{1} z(t) + \int_{t-\tau(t)}^{t} \mu^{2}(s) z^{*}(s) P_{2} z(s)\, \mathrm{d}s.
\tag{16}
$$

Calculating the derivative of V(t) along the trajectories of neural network (7), we obtain

$$
\begin{aligned}
\dot V(t) ={}& 2\dot\mu(t)\mu(t) z^{*}(t) P_{1} z(t) + \mu^{2}(t) \dot z^{*}(t) P_{1} z(t) + \mu^{2}(t) z^{*}(t) P_{1} \dot z(t) \\
&+ \mu^{2}(t) z^{*}(t) P_{2} z(t) - \mu^{2}(t-\tau(t))\, z^{*}(t-\tau(t)) P_{2} z(t-\tau(t)) \left(1 - \dot\tau(t)\right) \\
\le{}& \mu^{2}(t) \Big[ 2\alpha z^{*}(t) P_{1} z(t) + \dot z^{*}(t) P_{1} z(t) + z^{*}(t) P_{1} \dot z(t) + z^{*}(t) P_{2} z(t) \\
&- \beta^{2}(1-\rho)\, z^{*}(t-\tau(t)) P_{2} z(t-\tau(t)) \Big], \qquad t \ge T.
\end{aligned}
\tag{17}
$$

In deriving inequality (17), we have made use of condition (12) and assumption (H3).

From assumption (H1), we get

$$
\left| g^{R}_{j}(x_{j}, y_{j}) \right| \le \xi^{R1}_{j} |x_{j}| + \xi^{R2}_{j} |y_{j}|, \qquad
\left| g^{I}_{j}(x_{j}, y_{j}) \right| \le \xi^{I1}_{j} |x_{j}| + \xi^{I2}_{j} |y_{j}|
\tag{18}
$$

for all j = 1, 2, …, n, where x_j = Re(z_j) and y_j = Im(z_j). Hence

$$
\left| g_{j}(z_{j}(t)) \right|^{2}
\le \left[ (\xi^{R}_{j})^{2} + (\xi^{I}_{j})^{2} \right] \left( |x_{j}| + |y_{j}| \right)^{2}
\le 2 \left[ (\xi^{R}_{j})^{2} + (\xi^{I}_{j})^{2} \right] \left| z_{j}(t) \right|^{2}
= \gamma_{j} \left| z_{j}(t) \right|^{2},
\tag{19}
$$
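The second inequality in (19) uses the elementary bound, valid for any real $x_j$, $y_j$:

```latex
\left( |x_j| + |y_j| \right)^2 \le 2\left( x_j^2 + y_j^2 \right) = 2\left| z_j(t) \right|^2 ,
```

which follows from expanding $(|x_j| - |y_j|)^2 \ge 0$.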

where ξ^R_j = max{ξ^{R1}_j, ξ^{R2}_j}, ξ^I_j = max{ξ^{I1}_j, ξ^{I2}_j}, and γ_j = 2[(ξ^R_j)² + (ξ^I_j)²], j = 1, 2, …, n. Let R₁ = diag(r₁, r₂, …, r_n) > 0. It follows from (19) that

$$
r_{j} \mu^{2}(t) g^{*}_{j}(z_{j}(t)) g_{j}(z_{j}(t)) - r_{j} \mu^{2}(t) \gamma_{j} z^{*}_{j}(t) z_{j}(t) \le 0
\tag{20}
$$

for j = 1, 2, …, n. Thus

$$
\mu^{2}(t) g^{*}(z(t)) R_{1} g(z(t)) - \mu^{2}(t) z^{*}(t) R_{1} \Gamma z(t) \le 0.
\tag{21}
$$

Similarly, we can get

$$
\mu^{2}(t) g^{*}(z(t-\tau(t))) R_{2} g(z(t-\tau(t))) - \mu^{2}(t) z^{*}(t-\tau(t)) R_{2} \Gamma z(t-\tau(t)) \le 0.
\tag{22}
$$

From (7), we have

$$
\begin{aligned}
0 ={}& \mu^{2}(t) \big[ \dot z(t) + C z(t) - D g(z(t)) - E g(z(t-\tau(t))) \big]^{*} \big[ Q_{1} z(t) + Q_{2} \dot z(t) \big] \\
&+ \mu^{2}(t) \big[ Q_{1} z(t) + Q_{2} \dot z(t) \big]^{*} \big[ \dot z(t) + C z(t) - D g(z(t)) - E g(z(t-\tau(t))) \big].
\end{aligned}
\tag{23}
$$

It follows from (17), (21), (22), and (23) that

$$
\dot V(t) \le \mu^{2}(t)\, w^{*}(t)\, \Omega\, w(t), \qquad t \ge T,
\tag{24}
$$


where w(t) = (z(t), ż(t), z(t − τ(t)), g(z(t)), g(z(t − τ(t))))* and

$$
\Omega = \begin{pmatrix}
\Omega_{11} & \Omega_{12} & 0 & -Q^{*}_{1} D & -Q^{*}_{1} E \\
\Omega_{21} & Q^{*}_{2} + Q_{2} & 0 & -Q^{*}_{2} D & -Q^{*}_{2} E \\
0 & 0 & \Omega_{33} & 0 & 0 \\
-D^{*} Q_{1} & -D^{*} Q_{2} & 0 & -R_{1} & 0 \\
-E^{*} Q_{1} & -E^{*} Q_{2} & 0 & 0 & -R_{2}
\end{pmatrix}
\tag{25}
$$

with Ω₁₁ = 2αP₁ + P₂ + CQ₁ + Q*₁C + R₁Γ, Ω₁₂ = P*₁ + Q*₁ + CQ₂, Ω₂₁ = P₁ + Q₁ + Q*₂C, and Ω₃₃ = −β²(1 − ρ)P₂ + R₂Γ. From (14), we have Ω = Ω^R + iΩ^I. It follows from (13) and Lemma 5 that Ω < 0. Thus we get from (24) that
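The passage from the complex inequality Ω < 0 to the real conditions (13) rests on a standard equivalence (presumably the content of Lemma 5, which lies outside this excerpt): a Hermitian matrix Ω = Ω^R + iΩ^I is negative definite if and only if the real symmetric matrix [[Ω^R, −Ω^I], [Ω^I, Ω^R]] is. A quick numerical illustration of this real embedding, a sketch with randomly generated data:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4

# Build a random Hermitian matrix M = MR + i*MI:
# MR must be symmetric and MI skew-symmetric.
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))
MR = (A + A.T) / 2
MI = (B - B.T) / 2
M = MR + 1j * MI

# Real symmetric embedding of the complex Hermitian matrix.
M_real = np.block([[MR, -MI], [MI, MR]])

# The embedding has exactly the eigenvalues of M, each with doubled
# multiplicity, so definiteness of the two matrices always agrees.
eig_c = np.sort(np.linalg.eigvalsh(M))
eig_r = np.sort(np.linalg.eigvalsh(M_real))
print(np.allclose(eig_r, np.repeat(eig_c, 2)))  # True
```

Because the spectra coincide (with doubled multiplicity), checking Ω < 0 reduces to a real LMI that MATLAB's LMI toolbox can handle directly.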

$$
\dot V(t) \le 0, \qquad t \ge T,
\tag{26}
$$

which means that V(t) is monotonically nonincreasing for t ≥ T. So we have

$$
V(t) \le V(T), \qquad t \ge T.
\tag{27}
$$

From the definition of V(t) in (16), we obtain

$$
\mu^{2}(t)\, \lambda_{\min}(P_{1})\, \| z(t) \|^{2} \le V(t) \le V_{0} < \infty, \qquad t \ge T,
\tag{28}
$$

where V₀ = max_{0 ≤ s ≤ T} V(s). It implies that

$$
\| z(t) \| \le \frac{M}{\mu(t)}, \qquad t \ge T,
\tag{29}
$$

where M = √(V₀ / λ_min(P₁)). The proof is completed.

Theorem 7. Assume that assumptions (H2) and (H3) hold. Neural network (1) is globally μ-stable if there exist two positive definite Hermitian matrices P₁ = P^R₁ + iP^I₁ and P₂ = P^R₂ + iP^I₂, two real positive diagonal matrices R₁ > 0 and R₂ > 0, two complex matrices Q₁ = Q^R₁ + iQ^I₁ and Q₂ = Q^R₂ + iQ^I₂, a positive differentiable function μ(t) defined on [0, ∞), and three constants α ≥ 0, β > 0, and T > 0 such that conditions (12), (13), and (14) in Theorem 6 are satisfied, where Ω^R₁₁ = 2αP^R₁ + P^R₂ + CQ^R₁ + (Q^R₁)^T C + R₁Π and Ω^R₃₃ = −β²(1 − ρ)P^R₂ + R₂Π.

Proof. From assumption (H2), we get

$$
\left| g_{j}(z_{j}(t)) \right| \le \eta_{j} \left| z_{j}(t) \right|
\tag{30}
$$

for j = 1, 2, …, n. Let R₁ = diag(r₁, r₂, …, r_n) > 0. It follows from (30) that

$$
r_{j} \mu^{2}(t) g^{*}_{j}(z_{j}(t)) g_{j}(z_{j}(t)) - r_{j} \mu^{2}(t) \eta^{2}_{j} z^{*}_{j}(t) z_{j}(t) \le 0
\tag{31}
$$

for j = 1, 2, …, n. Hence

$$
\mu^{2}(t) g^{*}(z(t)) R_{1} g(z(t)) - \mu^{2}(t) z^{*}(t) R_{1} \Pi z(t) \le 0.
\tag{32}
$$

Similarly, we can get

$$
\mu^{2}(t) g^{*}(z(t-\tau(t))) R_{2} g(z(t-\tau(t))) - \mu^{2}(t) z^{*}(t-\tau(t)) R_{2} \Pi z(t-\tau(t)) \le 0.
\tag{33}
$$

Consider the Lyapunov-Krasovskii functional (16). From (17), (23), (32), and (33), we can get

$$
\dot V(t) \le \mu^{2}(t)\, w^{*}(t)\, \Omega\, w(t), \qquad t \ge T,
\tag{34}
$$

where w(t) = (z(t), ż(t), z(t − τ(t)), g(z(t)), g(z(t − τ(t))))* and

$$
\Omega = \begin{pmatrix}
\Omega_{11} & \Omega_{12} & 0 & -Q^{*}_{1} D & -Q^{*}_{1} E \\
\Omega_{21} & Q^{*}_{2} + Q_{2} & 0 & -Q^{*}_{2} D & -Q^{*}_{2} E \\
0 & 0 & \Omega_{33} & 0 & 0 \\
-D^{*} Q_{1} & -D^{*} Q_{2} & 0 & -R_{1} & 0 \\
-E^{*} Q_{1} & -E^{*} Q_{2} & 0 & 0 & -R_{2}
\end{pmatrix}
\tag{35}
$$

with Ω₁₁ = 2αP₁ + P₂ + CQ₁ + Q*₁C + R₁Π, Ω₁₂ = P*₁ + Q*₁ + CQ₂, Ω₂₁ = P₁ + Q₁ + Q*₂C, and Ω₃₃ = −β²(1 − ρ)P₂ + R₂Π. From the proof of Theorem 6, we know that neural network (1) is μ-stable. The proof is completed.

Corollary 8. Assume that assumptions (H1) and (H3) hold and τ(t) ≤ τ. Neural network (1) is globally exponentially stable if there exist two positive definite Hermitian matrices P₁ = P^R₁ + iP^I₁ and P₂ = P^R₂ + iP^I₂, two real positive diagonal matrices R₁ > 0 and R₂ > 0, two complex matrices Q₁ = Q^R₁ + iQ^I₁ and Q₂ = Q^R₂ + iQ^I₂, and a constant ε > 0 such that conditions (13) and (14) in Theorem 6 are satisfied, where Ω^R₁₁ = 2εP^R₁ + P^R₂ + CQ^R₁ + (Q^R₁)^T C + R₁Γ, Ω^I₁₁ = 2εP^I₁ + P^I₂ + CQ^I₁ − (Q^I₁)^T C, Ω^R₃₃ = −e^{−2ετ}(1 − ρ)P^R₂ + R₂Γ, and Ω^I₃₃ = −e^{−2ετ}(1 − ρ)P^I₂.

Proof. Let μ(t) = e^{εt} (t ≥ 0); then μ̇(t)/μ(t) = ε and μ(t − τ(t))/μ(t) = e^{−ετ(t)} ≥ e^{−ετ}. Take α = ε, β = e^{−ετ}, and T = 0. It is obvious that condition (12) in Theorem 6 is satisfied. The proof is completed.

Corollary 9. Assume that assumptions (H1) and (H3) hold and τ(t) ≤ δt (0 < δ < 1). Neural network (1) is globally power stable if there exist two positive definite Hermitian matrices P₁ = P^R₁ + iP^I₁ and P₂ = P^R₂ + iP^I₂, two real positive diagonal matrices R₁ > 0 and R₂ > 0, two complex matrices Q₁ = Q^R₁ + iQ^I₁ and Q₂ = Q^R₂ + iQ^I₂, and two constants ε > 0 and T > 0 such that conditions (13) and (14) in Theorem 6 are satisfied, where Ω^R₁₁ = (2ε/T)P^R₁ + P^R₂ + CQ^R₁ + (Q^R₁)^T C + R₁Γ, Ω^I₁₁ = (2ε/T)P^I₁ + P^I₂ + CQ^I₁ − (Q^I₁)^T C, Ω^R₃₃ = −(1 − δ)^{2ε}(1 − ρ)P^R₂ + R₂Γ, and Ω^I₃₃ = −(1 − δ)^{2ε}(1 − ρ)P^I₂.

Proof. Let μ(t) = t^ε (t ≥ T); then μ̇(t)/μ(t) = ε/t ≤ ε/T and μ(t − τ(t))/μ(t) = (1 − τ(t)/t)^ε ≥ (1 − δ)^ε. Take α = ε/T and β = (1 − δ)^ε. It is obvious that condition (12) in Theorem 6 is satisfied. The proof is completed.

Similar to the proofs of Corollaries 8 and 9, we have the following results.

Corollary 10. Assume that assumptions (H2) and (H3) hold and τ(t) ≤ τ. Neural network (1) is globally exponentially stable if there exist two positive definite Hermitian matrices P₁ = P^R₁ + iP^I₁ and P₂ = P^R₂ + iP^I₂, two real positive diagonal matrices R₁ > 0 and R₂ > 0, two complex matrices Q₁ = Q^R₁ + iQ^I₁ and Q₂ = Q^R₂ + iQ^I₂, and a constant ε > 0 such that conditions (13) and (14) in Theorem 7 are satisfied, where Ω^R₁₁ = 2εP^R₁ + P^R₂ + CQ^R₁ + (Q^R₁)^T C + R₁Π, Ω^I₁₁ = 2εP^I₁ + P^I₂ + CQ^I₁ − (Q^I₁)^T C, Ω^R₃₃ = −e^{−2ετ}(1 − ρ)P^R₂ + R₂Π, and Ω^I₃₃ = −e^{−2ετ}(1 − ρ)P^I₂.

Corollary 11. Assume that assumptions (H2) and (H3) hold and τ(t) ≤ δt (0 < δ < 1). Neural network (1) is globally power stable if there exist two positive definite Hermitian matrices P₁ = P^R₁ + iP^I₁ and P₂ = P^R₂ + iP^I₂, two real positive diagonal matrices R₁ > 0 and R₂ > 0, two complex matrices Q₁ = Q^R₁ + iQ^I₁ and Q₂ = Q^R₂ + iQ^I₂, and two constants ε > 0 and T > 0 such that conditions (13) and (14) in Theorem 7 are satisfied, where Ω^R₁₁ = (2ε/T)P^R₁ + P^R₂ + CQ^R₁ + (Q^R₁)^T C + R₁Π, Ω^I₁₁ = (2ε/T)P^I₁ + P^I₂ + CQ^I₁ − (Q^I₁)^T C, Ω^R₃₃ = −(1 − δ)^{2ε}(1 − ρ)P^R₂ + R₂Π, and Ω^I₃₃ = −(1 − δ)^{2ε}(1 − ρ)P^I₂.

Remark 12. In [18, 23], the authors studied the global stability of complex-valued neural networks with time-varying delays, and the activation function was supposed to be

$$
f_{j}(z) = f^{R}_{j}(x, y) + i f^{I}_{j}(x, y),
\tag{36}
$$

where z = x + iy, f^R_j(·,·) and f^I_j(·,·) are real-valued for all j = 1, 2, …, n, and the partial derivatives of f^R_j(·,·) and f^I_j(·,·) with respect to x and y were required to exist and be continuous. In this paper, neither the real parts nor the imaginary parts of the activation functions are assumed to be differentiable.

Remark 13. In [21], the authors investigated boundedness and stability for discrete-time complex-valued neural networks with constant delay, and the activation function was supposed to be

$$
f_{j}(z) = \max\{0, \operatorname{Re}(z)\} + i \max\{0, \operatorname{Im}(z)\}.
\tag{37}
$$

Also, the authors gave a complex-valued LMI criterion for checking stability of the considered neural networks, but a feasible way to solve the given complex-valued LMI was not provided. In this paper, the criteria for checking stability of the complex-valued neural networks are established as real LMIs, which can be checked numerically using the effective LMI toolbox in MATLAB.

4. Examples

Example 1. Consider a two-neuron complex-valued neural network with constant delay

$$
\dot u(t) = -C u(t) + D f(u(t)) + E f(u(t - \tau)) + H,
\tag{38}
$$

where

$$
C = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}, \quad
D = \begin{pmatrix} 1+i & -2+i \\ 1-i & -1 \end{pmatrix}, \quad
E = \begin{pmatrix} 1+i & i \\ -1+i & 2 \end{pmatrix}, \quad
H = \begin{pmatrix} 2-i \\ -2+i \end{pmatrix}, \quad
\tau = 2,
$$

$$
f^{R}_{j}(u_{j}) = \frac{1}{32} \left( \left| \operatorname{Re}(u_{j}) + 1 \right| + \left| \operatorname{Re}(u_{j}) - 1 \right| \right), \quad
f^{I}_{j}(u_{j}) = \frac{1}{32} \left( \left| \operatorname{Im}(u_{j}) + 1 \right| + \left| \operatorname{Im}(u_{j}) - 1 \right| \right), \quad j = 1, 2.
\tag{39}
$$

It can be verified that the activation functions f_j satisfy assumption (H1), the time-varying delay τ(t) satisfies assumption (H3), and

$$
D^{R} = \begin{pmatrix} 1 & -2 \\ 1 & -1 \end{pmatrix}, \quad
D^{I} = \begin{pmatrix} 1 & 1 \\ -1 & 0 \end{pmatrix}, \quad
E^{R} = \begin{pmatrix} 1 & 0 \\ -1 & 2 \end{pmatrix}, \quad
E^{I} = \begin{pmatrix} 1 & 1 \\ 1 & 0 \end{pmatrix}, \quad
\Gamma = \begin{pmatrix} 1/64 & 0 \\ 0 & 1/64 \end{pmatrix}, \quad \rho = 0.
$$

There exists a constant ε = 0.2, and by employing the MATLAB LMI toolbox, we can find the solutions to the LMIs under conditions (13) and (14) in Corollary 8 as follows:

$$
P_{1} = \begin{pmatrix} 41.2118 & -6.8812 - 4.2410i \\ -6.8812 + 4.2410i & 40.6762 \end{pmatrix}, \quad
P_{2} = \begin{pmatrix} 26.0370 & -4.1385 - 0.7600i \\ -4.1385 + 0.7600i & 25.9447 \end{pmatrix},
$$

$$
R_{1} = \begin{pmatrix} 785.9319 & 0 \\ 0 & 857.0773 \end{pmatrix}, \quad
R_{2} = \begin{pmatrix} 560.5160 & 0 \\ 0 & 617.1000 \end{pmatrix},
$$

$$
Q_{1} = \begin{pmatrix} -61.6337 + 1.1580i & 9.2494 + 0.6392i \\ 9.1464 + 0.1434i & -61.4512 - 1.3717i \end{pmatrix}, \quad
Q_{2} = \begin{pmatrix} -63.1096 - 0.5628i & 10.2662 + 2.7895i \\ 10.1047 - 3.3169i & -62.0592 + 0.6291i \end{pmatrix}.
\tag{40}
$$

Then all conditions in Corollary 8 are satisfied. Therefore, the neural network (38) is globally exponentially stable. Figures 1 and 2 depict the real and imaginary parts of the states of the considered neural network (38), where the initial condition is z₁(t) = 0.5, z₂(t) = −1 − 0.5i.

Example 2. Consider a two-neuron complex-valued neural network with unbounded time-varying delays

$$
\dot u(t) = -C u(t) + D f(u(t)) + E f(u(t - \tau(t))) + H,
\tag{41}
$$

where

$$
C = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}, \quad
D = \begin{pmatrix} 1.5 + 3i & -1 + 2i \\ -2 - 2i & 2.5 + 0.5i \end{pmatrix}, \quad
E = \begin{pmatrix} 0.5 + i & i \\ -2 + 0.5i & 1 + 0.5i \end{pmatrix}, \quad
H = \begin{pmatrix} 2 + i \\ -2 + i \end{pmatrix}, \quad
\tau(t) = 0.2t,
$$

$$
f_{1}(u_{1}) = \frac{1}{20} \left( \left| u_{1} + 1 \right| - \left| u_{1} - 1 \right| \right), \quad
f_{2}(u_{2}) = \frac{1}{20} \left( \left| u_{2} + 1 \right| - \left| u_{2} - 1 \right| \right).
\tag{42}
$$

Figure 1: State trajectories Re(z₁(t)) and Re(z₂(t)) of neural network (38).

Figure 2: State trajectories Im(z₁(t)) and Im(z₂(t)) of neural network (38).

It can be verified that the activation functions f_j satisfy assumption (H2), the time-varying delay τ(t) satisfies assumption (H3), and

$$
D^{R} = \begin{pmatrix} 1.5 & -1 \\ -2 & 2.5 \end{pmatrix}, \quad
D^{I} = \begin{pmatrix} 3 & 2 \\ -2 & 0.5 \end{pmatrix}, \quad
E^{R} = \begin{pmatrix} 0.5 & 0 \\ -2 & 1 \end{pmatrix}, \quad
E^{I} = \begin{pmatrix} 1 & 1 \\ 0.5 & 0.5 \end{pmatrix}, \quad
\Pi = \begin{pmatrix} 0.01 & 0 \\ 0 & 0.01 \end{pmatrix}, \quad \rho = 0.2.
$$

There exist two constants ε = 1 and T = 10, and by employing the MATLAB LMI toolbox, we can find the solutions to the LMIs under conditions (13) and (14) in Corollary 11 as follows:

$$
P_{1} = \begin{pmatrix} 39.8524 & 14.7299 - 3.2336i \\ 14.7299 + 3.2336i & 31.5075 \end{pmatrix}, \quad
P_{2} = \begin{pmatrix} 23.9471 & 9.7114 - 1.3426i \\ 9.7114 + 1.3426i & 18.8527 \end{pmatrix},
$$

$$
R_{1} = 10^{3} \times \begin{pmatrix} 1.3534 & 0 \\ 0 & 1.0135 \end{pmatrix}, \quad
R_{2} = \begin{pmatrix} 584.6278 & 0 \\ 0 & 429.3048 \end{pmatrix},
$$

$$
Q_{1} = \begin{pmatrix} -51.8770 - 1.5115i & -16.7003 + 0.6185i \\ -16.8517 - 0.9537i & -40.7204 + 1.0687i \end{pmatrix}, \quad
Q_{2} = \begin{pmatrix} -60.4708 + 0.2332i & -21.2500 + 3.6232i \\ -21.6247 - 3.5095i & -48.2459 - 0.1731i \end{pmatrix}.
\tag{43}
$$

Figure 3: State trajectories Re(z₁(t)) and Re(z₂(t)) of neural network (41).

Figure 4: State trajectories Im(z₁(t)) and Im(z₂(t)) of neural network (41).

Then all conditions in Corollary 11 are satisfied. Thus, the neural network (41) is globally power stable. Figures 3 and 4 depict the real and imaginary parts of the states of the considered network (41), where the initial condition is x₁(t) = 0.5 sin(0.5t)i, x₂(t) = −0.5 + 1.5 cos(0.3t)i.
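The settling behavior reported in Figures 3 and 4 can be reproduced with a simple forward-Euler integration of (41); since τ(t) = 0.2t, the delayed argument t − τ(t) = 0.8t always points back into the stored history. This is an illustrative sketch only (the step size and horizon are our own choices, and a constant initial value is used in place of the time-varying initial function):

```python
import numpy as np

# Data of Example 2, network (41).
C = np.eye(2, dtype=complex)
D = np.array([[1.5 + 3j, -1 + 2j], [-2 - 2j, 2.5 + 0.5j]])
E = np.array([[0.5 + 1j, 1j], [-2 + 0.5j, 1 + 0.5j]])
H = np.array([2 + 1j, -2 + 1j])

def f(u):
    # f_j(u_j) = (|u_j + 1| - |u_j - 1|) / 20, elementwise on complex u.
    return (np.abs(u + 1) - np.abs(u - 1)) / 20

dt, t_end = 0.01, 40.0
n = int(t_end / dt)
u = np.zeros((n + 1, 2), dtype=complex)
u[0] = np.array([0.5j, -0.5 + 1.5j])  # starting value taken from the example

for k in range(n):
    kd = int(0.8 * k)  # grid index of the delayed time 0.8 * t_k
    u[k + 1] = u[k] + dt * (-C @ u[k] + D @ f(u[k]) + E @ f(u[kd]) + H)

# The trajectories stay bounded and settle toward an equilibrium,
# consistent with the global power stability asserted by Corollary 11.
print(np.max(np.abs(u)), np.max(np.abs(u[-1] - u[-1000])))
```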


5. Conclusion

In this paper, the global μ-stability of complex-valued neural networks with unbounded time-varying delays has been investigated by employing a combination of Lyapunov-Krasovskii functionals and the free weighting matrix method. Several sufficient conditions for checking the global μ-stability of the considered complex-valued neural networks have been given in terms of linear matrix inequalities (LMIs), which can be checked numerically using the effective LMI toolbox in MATLAB. As direct applications, we can obtain the exponential stability and power stability with respect to different time-varying delays. Two examples have been provided to demonstrate the effectiveness and less conservatism of the proposed criteria.

Remark 14. Note that the activation functions in this paper are continuous. Recently, neural networks with discontinuous activations have received more and more attention [31–34]. Based on the results of this paper, we will consider the μ-stability of complex-valued neural networks with discontinuous activations in the future.

Conflict of Interests

The authors declare that they have no conflict of interests regarding the publication of this paper.

Acknowledgments

This work was supported by the National Natural Science Foundation of China under Grants 61273021 and 11172247, and in part by the Natural Science Foundation Project of CQ cstc2013jjB40008.

References

[1] H. Dong, Z. Wang, and H. Gao, "Distributed H∞ filtering for a class of Markovian jump nonlinear time-delay systems over lossy sensor networks," IEEE Transactions on Industrial Electronics, vol. 60, no. 10, pp. 4665–4672, 2013.

[2] J. Hu, Z. Wang, B. Shen, and H. Gao, "Quantised recursive filtering for a class of nonlinear systems with multiplicative noises and missing measurements," International Journal of Control, vol. 86, no. 4, pp. 650–663, 2013.

[3] Y. Fang, K. Yan, and K. Li, "Impulsive synchronization of a class of chaotic systems," Systems Science and Control Engineering, vol. 2, no. 1, pp. 55–60, 2014.

[4] M. Darouach and M. Chadli, "Admissibility and control of switched discrete-time singular systems," Systems Science and Control Engineering, vol. 1, no. 1, pp. 43–51, 2013.

[5] B. Shen, Z. Wang, D. Ding, and H. Shu, "H∞ state estimation for complex networks with uncertain inner coupling and incomplete measurements," IEEE Transactions on Neural Networks and Learning Systems, vol. 24, no. 12, pp. 2027–2037, 2013.

[6] B. Shen, Z. Wang, and X. Liu, "Sampled-data synchronization control of dynamical networks with stochastic sampling," IEEE Transactions on Automatic Control, vol. 57, no. 10, pp. 2644–2650, 2012.

[7] J. Liang, Z. Wang, Y. Liu, and X. Liu, "State estimation for two-dimensional complex networks with randomly occurring nonlinearities and randomly varying sensor delays," International Journal of Robust and Nonlinear Control, vol. 24, no. 1, pp. 18–38, 2014.

[8] X. Kan, Z. Wang, and H. Shu, "State estimation for discrete-time delayed neural networks with fractional uncertainties and sensor saturations," Neurocomputing, vol. 117, pp. 64–71, 2013.

[9] Z. Wang, B. Zineddin, J. Liang, et al., "A novel neural network approach to cDNA microarray image segmentation," Computer Methods and Programs in Biomedicine, vol. 111, no. 1, pp. 189–198, 2013.

[10] A. Hirose, Complex-Valued Neural Networks, Springer, Berlin, Germany, 2012.

[11] G. Tanaka and K. Aihara, "Complex-valued multistate associative memory with nonlinear multilevel functions for gray-level image reconstruction," IEEE Transactions on Neural Networks, vol. 20, no. 9, pp. 1463–1473, 2009.

[12] S. Chen, L. Hanzo, and S. Tan, "Symmetric complex-valued RBF receiver for multiple-antenna-aided wireless systems," IEEE Transactions on Neural Networks, vol. 19, no. 9, pp. 1659–1665, 2008.

[13] I. Cha and S. A. Kassam, "Channel equalization using adaptive complex radial basis function networks," IEEE Journal on Selected Areas in Communications, vol. 13, no. 1, pp. 122–131, 1995.

[14] T. Nitta, "Orthogonality of decision boundaries in complex-valued neural networks," Neural Computation, vol. 16, no. 1, pp. 73–97, 2004.

[15] M. Faijul Amin and K. Murase, "Single-layered complex-valued neural network for real-valued classification problems," Neurocomputing, vol. 72, no. 4–6, pp. 945–955, 2009.

[16] B. K. Tripathi and P. K. Kalra, "On efficient learning machine with root-power mean neuron in complex domain," IEEE Transactions on Neural Networks, vol. 22, no. 5, pp. 727–738, 2011.

[17] J. H. Mathews and R. W. Howell, Complex Analysis for Mathematics and Engineering, Jones and Bartlett Publishers, 3rd edition, 1997.

[18] J. Hu and J. Wang, "Global stability of complex-valued recurrent neural networks with time-delays," IEEE Transactions on Neural Networks, vol. 23, no. 6, pp. 853–865, 2012.

[19] V. S. H. Rao and G. R. Murthy, "Global dynamics of a class of complex valued neural networks," International Journal of Neural Systems, vol. 18, no. 2, pp. 165–171, 2008.

[20] W. Zhou and J. M. Zurada, "Discrete-time recurrent neural networks with complex-valued linear threshold neurons," IEEE Transactions on Circuits and Systems II: Express Briefs, vol. 56, no. 8, pp. 669–673, 2009.

[21] C. Duan and Q. Song, "Boundedness and stability for discrete-time delayed neural network with complex-valued linear threshold neurons," Discrete Dynamics in Nature and Society, vol. 2010, Article ID 368379, 19 pages, 2010.

[22] B. Zou and Q. Song, "Boundedness and complete stability of complex-valued neural networks with time delay," IEEE Transactions on Neural Networks and Learning Systems, vol. 24, no. 8, pp. 1227–1238, 2013.

[23] X. Xu, J. Zhang, and J. Shi, "Exponential stability of complex-valued neural networks with mixed delays," Neurocomputing, vol. 128, pp. 483–490, 2014.

[24] X. Chen and Q. Song, "Global stability of complex-valued neural networks with both leakage time delay and discrete time delay on time scales," Neurocomputing, vol. 121, pp. 254–264, 2013.

[25] S.-I. Niculescu, Delay Effects on Stability: A Robust Control Approach, Springer, London, UK, 2001.

[26] V. B. Kolmanovskii and V. R. Nosov, Stability of Functional-Differential Equations, Academic Press, London, UK, 1986.

[27] Y. Liu, Z. Wang, J. Liang, and X. Liu, "Synchronization of coupled neutral-type neural networks with jumping-mode-dependent discrete and unbounded distributed delays," IEEE Transactions on Cybernetics, vol. 43, no. 1, pp. 102–114, 2013.

[28] T. Chen and L. Wang, "Power-rate global stability of dynamical systems with unbounded time-varying delays," IEEE Transactions on Circuits and Systems II: Express Briefs, vol. 54, no. 8, pp. 705–709, 2007.

[29] T Chen and L Wang ldquoGlobal 120583-stability of delayed neural net-works with unbounded time-varying delaysrdquo IEEETransactionson Neural Networks vol 18 no 6 pp 1836ndash1840 2007

[30] X Liu and T Chen ldquoRobust 120583-stability for uncertain stochasticneural networks with unbounded time-varying delaysrdquo PhysicaA Statistical Mechanics and Its Applications vol 387 no 12 pp2952ndash2962 2008

[31] X Liu and J Cao ldquoOn periodic solutions of neural networksvia differential inclusionsrdquo Neural Networks vol 22 no 4 pp329ndash334 2009

[32] X Liu and J Cao ldquoRobust state estimation for neural networkswith discontinuous activationsrdquo IEEE Transactions on SystemsMan and Cybernetics B Cybernetics vol 40 no 6 pp 1425ndash1437 2010

[33] X Liu T Chen J Cao and W Lu ldquoDissipativity and quasi-synchronization for neural networks with discontinuous acti-vations and parameter mismatchesrdquo Neural Networks vol 24no 10 pp 1013ndash1021 2011

[34] X Liu J Cao and W Yu ldquoFilippov systems and quasi-sync-hronization control for switched networksrdquo Chaos vol 22 no3 Article ID 033110 2012


Abstract and Applied Analysis 5

where $w(t) = \big(z(t),\ \dot z(t),\ z(t-\tau(t)),\ g(z(t)),\ g(z(t-\tau(t)))\big)^{*}$ and

$$
\Omega=\begin{pmatrix}
\Omega_{11} & \Omega_{12} & 0 & -Q_{1}^{*}D & -Q_{1}^{*}E\\
\Omega_{21} & Q_{2}^{*}+Q_{2} & 0 & -Q_{2}^{*}D & -Q_{2}^{*}E\\
0 & 0 & \Omega_{33} & 0 & 0\\
-D^{*}Q_{1} & -D^{*}Q_{2} & 0 & -R_{1} & 0\\
-E^{*}Q_{1} & -E^{*}Q_{2} & 0 & 0 & -R_{2}
\end{pmatrix}
\tag{25}
$$

with $\Omega_{11}=2\alpha P_{1}+P_{2}+CQ_{1}+Q_{1}^{*}C+R_{1}\Gamma$, $\Omega_{12}=P_{1}^{*}+Q_{1}^{*}+CQ_{2}$, $\Omega_{21}=P_{1}+Q_{1}+Q_{2}^{*}C$, and $\Omega_{33}=-\beta^{2}(1-\rho)P_{2}+R_{2}\Gamma$. From (14) we have $\Omega=\Omega^{R}+i\Omega^{I}$. It follows from (13) and Lemma 5 that $\Omega<0$. Thus we get from (24) that

$$\dot V(t)\le 0,\quad t\ge T, \tag{26}$$

which means that $V(t)$ is monotonically nonincreasing for $t\ge T$. So we have

$$V(t)\le V(T),\quad t\ge T. \tag{27}$$

From the definition of $V(t)$ in (16), we obtain

$$\mu^{2}(t)\,\lambda_{\min}(P_{1})\,\|z(t)\|^{2}\le V(t)\le V_{0}<\infty,\quad t\ge T, \tag{28}$$

where $V_{0}=\max_{0\le s\le T}V(s)$. This implies that

$$\|z(t)\|\le \frac{M}{\mu(t)},\quad t\ge T, \tag{29}$$

where $M=\sqrt{V_{0}/\lambda_{\min}(P_{1})}$. The proof is completed.

Theorem 7. Assume that assumptions (H2) and (H3) hold. Neural network (1) is globally $\mu$-stable if there exist two positive definite Hermitian matrices $P_{1}=P_{1}^{R}+iP_{1}^{I}$ and $P_{2}=P_{2}^{R}+iP_{2}^{I}$, two real positive diagonal matrices $R_{1}>0$ and $R_{2}>0$, two complex matrices $Q_{1}=Q_{1}^{R}+iQ_{1}^{I}$ and $Q_{2}=Q_{2}^{R}+iQ_{2}^{I}$, a positive differentiable function $\mu(t)$ defined on $[0,\infty)$, and three constants $\alpha\ge 0$, $\beta>0$, and $T>0$ such that conditions (12), (13), and (14) in Theorem 6 are satisfied, where $\Omega_{11}^{R}=2\alpha P_{1}^{R}+P_{2}^{R}+CQ_{1}^{R}+(Q_{1}^{R})^{T}C+R_{1}\Pi$ and $\Omega_{33}^{R}=-\beta^{2}(1-\rho)P_{2}^{R}+R_{2}\Pi$.

Proof. From assumption (H2) we get

$$\big|g_{j}(z_{j}(t))\big|\le \eta_{j}\big|z_{j}(t)\big| \tag{30}$$

for $j=1,2,\ldots,n$. Let $R_{1}=\operatorname{diag}(r_{1},r_{2},\ldots,r_{n})>0$. It follows from (30) that

$$r_{j}\mu^{2}(t)\,g_{j}^{*}(z_{j}(t))g_{j}(z_{j}(t))-r_{j}\mu^{2}(t)\,\eta_{j}^{2}z_{j}^{*}(t)z_{j}(t)\le 0 \tag{31}$$

for $j=1,2,\ldots,n$. Hence

$$\mu^{2}(t)\,g^{*}(z(t))R_{1}g(z(t))-\mu^{2}(t)\,z^{*}(t)R_{1}\Pi z(t)\le 0. \tag{32}$$

Also, we can get that

$$\mu^{2}(t)\,g^{*}(z(t-\tau(t)))R_{2}g(z(t-\tau(t)))-\mu^{2}(t)\,z^{*}(t-\tau(t))R_{2}\Pi z(t-\tau(t))\le 0. \tag{33}$$

Consider the Lyapunov-Krasovskii functional (16). From (17), (23), (32), and (33), we can get that

$$\dot V(t)\le \mu^{2}(t)\,w^{*}(t)\Omega w(t),\quad t\ge T, \tag{34}$$

where $w(t)=\big(z(t),\ \dot z(t),\ z(t-\tau(t)),\ g(z(t)),\ g(z(t-\tau(t)))\big)^{*}$ and

$$
\Omega=\begin{pmatrix}
\Omega_{11} & \Omega_{12} & 0 & -Q_{1}^{*}D & -Q_{1}^{*}E\\
\Omega_{21} & Q_{2}^{*}+Q_{2} & 0 & -Q_{2}^{*}D & -Q_{2}^{*}E\\
0 & 0 & \Omega_{33} & 0 & 0\\
-D^{*}Q_{1} & -D^{*}Q_{2} & 0 & -R_{1} & 0\\
-E^{*}Q_{1} & -E^{*}Q_{2} & 0 & 0 & -R_{2}
\end{pmatrix}
\tag{35}
$$

with $\Omega_{11}=2\alpha P_{1}+P_{2}+CQ_{1}+Q_{1}^{*}C+R_{1}\Pi$, $\Omega_{12}=P_{1}^{*}+Q_{1}^{*}+CQ_{2}$, $\Omega_{21}=P_{1}+Q_{1}+Q_{2}^{*}C$, and $\Omega_{33}=-\beta^{2}(1-\rho)P_{2}+R_{2}\Pi$. From the proof of Theorem 6 we know that neural network (1) is $\mu$-stable. The proof is completed.

Corollary 8. Assume that assumptions (H1) and (H3) hold and $\tau(t)\le\tau$. Neural network (1) is globally exponentially stable if there exist two positive definite Hermitian matrices $P_{1}=P_{1}^{R}+iP_{1}^{I}$ and $P_{2}=P_{2}^{R}+iP_{2}^{I}$, two real positive diagonal matrices $R_{1}>0$ and $R_{2}>0$, two complex matrices $Q_{1}=Q_{1}^{R}+iQ_{1}^{I}$ and $Q_{2}=Q_{2}^{R}+iQ_{2}^{I}$, and a constant $\varepsilon>0$ such that conditions (13) and (14) in Theorem 6 are satisfied, where $\Omega_{11}^{R}=2\varepsilon P_{1}^{R}+P_{2}^{R}+CQ_{1}^{R}+(Q_{1}^{R})^{T}C+R_{1}\Gamma$, $\Omega_{11}^{I}=2\varepsilon P_{1}^{I}+P_{2}^{I}+CQ_{1}^{I}-(Q_{1}^{I})^{T}C$, $\Omega_{33}^{R}=-e^{-2\varepsilon\tau}(1-\rho)P_{2}^{R}+R_{2}\Gamma$, and $\Omega_{33}^{I}=-e^{-2\varepsilon\tau}(1-\rho)P_{2}^{I}$.

Proof. Let $\mu(t)=e^{\varepsilon t}$ $(t\ge 0)$; then $\dot\mu(t)/\mu(t)=\varepsilon$ and $\mu(t-\tau(t))/\mu(t)=e^{-\varepsilon\tau(t)}\ge e^{-\varepsilon\tau}$. Take $\alpha=\varepsilon$, $\beta=e^{-\varepsilon\tau}$, and $T=0$. It is obvious that condition (12) in Theorem 6 is satisfied. The proof is completed.

Corollary 9. Assume that assumptions (H1) and (H3) hold and $\tau(t)\le\delta t$ $(0<\delta<1)$. Neural network (1) is globally power stable if there exist two positive definite Hermitian matrices $P_{1}=P_{1}^{R}+iP_{1}^{I}$ and $P_{2}=P_{2}^{R}+iP_{2}^{I}$, two real positive diagonal matrices $R_{1}>0$ and $R_{2}>0$, two complex matrices $Q_{1}=Q_{1}^{R}+iQ_{1}^{I}$ and $Q_{2}=Q_{2}^{R}+iQ_{2}^{I}$, and two constants $\varepsilon>0$ and $T>0$ such that conditions (13) and (14) in Theorem 6 are satisfied, where $\Omega_{11}^{R}=(2\varepsilon/T)P_{1}^{R}+P_{2}^{R}+CQ_{1}^{R}+(Q_{1}^{R})^{T}C+R_{1}\Gamma$, $\Omega_{11}^{I}=(2\varepsilon/T)P_{1}^{I}+P_{2}^{I}+CQ_{1}^{I}-(Q_{1}^{I})^{T}C$, $\Omega_{33}^{R}=-(1-\delta)^{2\varepsilon}(1-\rho)P_{2}^{R}+R_{2}\Gamma$, and $\Omega_{33}^{I}=-(1-\delta)^{2\varepsilon}(1-\rho)P_{2}^{I}$.

Proof. Let $\mu(t)=t^{\varepsilon}$ $(t\ge T)$; then $\dot\mu(t)/\mu(t)=\varepsilon/t\le\varepsilon/T$ and $\mu(t-\tau(t))/\mu(t)=(1-\tau(t)/t)^{\varepsilon}\ge(1-\delta)^{\varepsilon}$. Take $\alpha=\varepsilon/T$ and $\beta=(1-\delta)^{\varepsilon}$. It is obvious that condition (12) in Theorem 6 is satisfied. The proof is completed.
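As a quick numerical sanity check (ours, not part of the paper), the two rate conditions used in this proof for $\mu(t)=t^{\varepsilon}$ can be verified on a time grid. The values $\varepsilon=1$, $\delta=0.2$, and $T=10$ below are illustrative assumptions (they match Example 2 later in the paper):

```python
import numpy as np

# Illustrative constants (assumptions): eps, delta-bound on the delay, and T.
eps, delta, T = 1.0, 0.2, 10.0

t = np.linspace(T, 200.0, 2000)
mu = t ** eps
mu_dot = eps * t ** (eps - 1)

# mu'(t)/mu(t) = eps/t <= eps/T for t >= T, so alpha = eps/T works.
assert np.all(mu_dot / mu <= eps / T + 1e-12)

# With tau(t) <= delta*t, mu(t - tau(t))/mu(t) >= (1 - delta)^eps,
# so beta = (1 - delta)^eps works.
tau = delta * t
assert np.all((t - tau) ** eps / mu >= (1 - delta) ** eps - 1e-12)
print("rate conditions hold on the grid")
```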

Similar to the proofs of Corollaries 8 and 9, we have the following results.

Corollary 10. Assume that assumptions (H2) and (H3) hold and $\tau(t)\le\tau$. Neural network (1) is globally exponentially stable if there exist two positive definite Hermitian matrices $P_{1}=P_{1}^{R}+iP_{1}^{I}$ and $P_{2}=P_{2}^{R}+iP_{2}^{I}$, two real positive diagonal matrices $R_{1}>0$ and $R_{2}>0$, two complex matrices $Q_{1}=Q_{1}^{R}+iQ_{1}^{I}$ and $Q_{2}=Q_{2}^{R}+iQ_{2}^{I}$, and a constant $\varepsilon>0$ such that conditions (13) and (14) in Theorem 7 are satisfied, where $\Omega_{11}^{R}=2\varepsilon P_{1}^{R}+P_{2}^{R}+CQ_{1}^{R}+(Q_{1}^{R})^{T}C+R_{1}\Pi$, $\Omega_{11}^{I}=2\varepsilon P_{1}^{I}+P_{2}^{I}+CQ_{1}^{I}-(Q_{1}^{I})^{T}C$, $\Omega_{33}^{R}=-e^{-2\varepsilon\tau}(1-\rho)P_{2}^{R}+R_{2}\Pi$, and $\Omega_{33}^{I}=-e^{-2\varepsilon\tau}(1-\rho)P_{2}^{I}$.

Corollary 11. Assume that assumptions (H2) and (H3) hold and $\tau(t)\le\delta t$ $(0<\delta<1)$. Neural network (1) is globally power stable if there exist two positive definite Hermitian matrices $P_{1}=P_{1}^{R}+iP_{1}^{I}$ and $P_{2}=P_{2}^{R}+iP_{2}^{I}$, two real positive diagonal matrices $R_{1}>0$ and $R_{2}>0$, two complex matrices $Q_{1}=Q_{1}^{R}+iQ_{1}^{I}$ and $Q_{2}=Q_{2}^{R}+iQ_{2}^{I}$, and two constants $\varepsilon>0$ and $T>0$ such that conditions (13) and (14) in Theorem 7 are satisfied, where $\Omega_{11}^{R}=(2\varepsilon/T)P_{1}^{R}+P_{2}^{R}+CQ_{1}^{R}+(Q_{1}^{R})^{T}C+R_{1}\Pi$, $\Omega_{11}^{I}=(2\varepsilon/T)P_{1}^{I}+P_{2}^{I}+CQ_{1}^{I}-(Q_{1}^{I})^{T}C$, $\Omega_{33}^{R}=-(1-\delta)^{2\varepsilon}(1-\rho)P_{2}^{R}+R_{2}\Pi$, and $\Omega_{33}^{I}=-(1-\delta)^{2\varepsilon}(1-\rho)P_{2}^{I}$.

Remark 12. In [18, 23] the authors studied the global stability of complex-valued neural networks with time-varying delays, and the activation function was supposed to be

$$f_{j}(z)=f_{j}^{R}(x,y)+if_{j}^{I}(x,y), \tag{36}$$

where $z=x+iy$ and $f_{j}^{R}(\cdot,\cdot)$, $f_{j}^{I}(\cdot,\cdot)$ are real-valued for all $j=1,2,\ldots,n$, and the partial derivatives of $f_{j}^{R}(\cdot,\cdot)$ and $f_{j}^{I}(\cdot,\cdot)$ with respect to $x$ and $y$ were required to exist and be continuous. In this paper, neither the real parts nor the imaginary parts of the activation functions are assumed to be differentiable.

Remark 13. In [21] the authors investigated boundedness and stability for discrete-time complex-valued neural networks with constant delay, and the activation function was supposed to be

$$f_{j}(z)=\max\{0,\operatorname{Re}(z)\}+i\max\{0,\operatorname{Im}(z)\}. \tag{37}$$

The authors also gave a complex-valued LMI criterion for checking stability of the considered neural networks, but no feasible way to solve the given complex-valued LMI was provided. In this paper, the criteria for checking stability of the complex-valued neural networks are established as real LMIs, which can be checked numerically using the LMI toolbox in MATLAB.
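The real-LMI reformulation rests on a standard fact (the idea behind the paper's Lemma 5): a Hermitian matrix $M=M^{R}+iM^{I}$ is negative (or positive) definite exactly when the real symmetric embedding $\begin{pmatrix}M^{R}&-M^{I}\\M^{I}&M^{R}\end{pmatrix}$ is. A minimal numpy sketch of this equivalence (illustrative, not the paper's code):

```python
import numpy as np

rng = np.random.default_rng(0)
B = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
M = B + B.conj().T                  # Hermitian: M^R symmetric, M^I antisymmetric
MR, MI = M.real, M.imag

# Real symmetric embedding used to turn a complex LMI into a real one.
W = np.block([[MR, -MI], [MI, MR]])

# Each eigenvalue of M appears twice in the spectrum of W, so M and W
# have the same definiteness.
ev_M = np.linalg.eigvalsh(M)
ev_W = np.linalg.eigvalsh(W)
print(np.allclose(np.sort(np.repeat(ev_M, 2)), np.sort(ev_W)))
```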

4. Examples

Example 1. Consider a two-neuron complex-valued neural network with constant delay

$$\dot u(t)=-Cu(t)+Df(u(t))+Ef(u(t-\tau))+H, \tag{38}$$

where

$$C=\begin{pmatrix}1&0\\0&1\end{pmatrix},\quad D=\begin{pmatrix}1+i&-2+i\\1-i&-1\end{pmatrix},\quad E=\begin{pmatrix}1+i&i\\-1+i&2\end{pmatrix},\quad H=\begin{pmatrix}2-i\\-2+i\end{pmatrix},\quad \tau=2,$$

$$f_{j}^{R}(u_{j})=\frac{1}{32}\Big(\big|\operatorname{Re}(u_{j})+1\big|+\big|\operatorname{Re}(u_{j})-1\big|\Big),\quad
f_{j}^{I}(u_{j})=\frac{1}{32}\Big(\big|\operatorname{Im}(u_{j})+1\big|+\big|\operatorname{Im}(u_{j})-1\big|\Big),\quad j=1,2. \tag{39}$$

It can be verified that the activation functions $f_{j}$ satisfy assumption (H1), the time-varying delay $\tau(t)$ satisfies assumption (H3), and

$$D^{R}=\begin{pmatrix}1&-2\\1&-1\end{pmatrix},\quad D^{I}=\begin{pmatrix}1&1\\-1&0\end{pmatrix},\quad E^{R}=\begin{pmatrix}1&0\\-1&2\end{pmatrix},\quad E^{I}=\begin{pmatrix}1&1\\1&0\end{pmatrix},\quad \Gamma=\begin{pmatrix}1/64&0\\0&1/64\end{pmatrix},\quad \rho=0.$$

There exists a constant $\varepsilon=0.2$, and by employing the MATLAB LMI toolbox we can find the solutions to the LMIs under conditions (13) and (14) in Corollary 8 as follows:

$$P_{1}=\begin{pmatrix}41.2118&-6.8812-4.2410i\\-6.8812+4.2410i&40.6762\end{pmatrix},\quad
P_{2}=\begin{pmatrix}26.0370&-4.1385-0.7600i\\-4.1385+0.7600i&25.9447\end{pmatrix},$$

$$R_{1}=\begin{pmatrix}785.9319&0\\0&857.0773\end{pmatrix},\quad
R_{2}=\begin{pmatrix}560.5160&0\\0&617.1000\end{pmatrix},$$

$$Q_{1}=\begin{pmatrix}-61.6337+1.1580i&9.2494+0.6392i\\9.1464+0.1434i&-61.4512-1.3717i\end{pmatrix},\quad
Q_{2}=\begin{pmatrix}-63.1096-0.5628i&10.2662+2.7895i\\10.1047-3.3169i&-62.0592+0.6291i\end{pmatrix}. \tag{40}$$

Then all conditions in Corollary 8 are satisfied. Therefore, neural network (38) is globally exponentially stable. Figures 1 and 2 depict the real and imaginary parts of the states of the considered neural network (38), where the initial condition is $z_{1}(t)=0.5$, $z_{2}(t)=-1-0.5i$.
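The exponential-stability conclusion can also be checked by direct simulation. The following forward-Euler sketch (the step size, horizon, and second initial history are our illustrative choices, not from the paper) integrates (38) from two different constant initial histories and confirms that the trajectories merge:

```python
import numpy as np

# Data of network (38); the activation follows (39):
# f_j = (1/32)(|Re(u_j)+1| + |Re(u_j)-1|) + (i/32)(|Im(u_j)+1| + |Im(u_j)-1|).
D = np.array([[1 + 1j, -2 + 1j], [1 - 1j, -1 + 0j]])
E = np.array([[1 + 1j, 1j], [-1 + 1j, 2 + 0j]])
H = np.array([2 - 1j, -2 + 1j])
tau, dt, T_end = 2.0, 0.01, 40.0

def f(u):
    g = lambda x: (np.abs(x + 1) + np.abs(x - 1)) / 32.0
    return g(u.real) + 1j * g(u.imag)

def simulate(z0):
    n_tau = int(tau / dt)
    # Constant initial history on [-tau, 0].
    hist = [z0.copy() for _ in range(n_tau + 1)]
    for _ in range(int(T_end / dt)):
        z, z_del = hist[-1], hist[-1 - n_tau]
        hist.append(z + dt * (-z + D @ f(z) + E @ f(z_del) + H))
    return hist[-1]

za = simulate(np.array([0.5 + 0j, -1 - 0.5j]))   # initial condition from the text
zb = simulate(np.array([3 + 2j, 2 - 3j]))        # a second, arbitrary initial condition
print(np.linalg.norm(za - zb))                   # small: trajectories have merged
```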

Example 2. Consider a two-neuron complex-valued neural network with unbounded time-varying delays

$$\dot u(t)=-Cu(t)+Df(u(t))+Ef(u(t-\tau(t)))+H, \tag{41}$$

where

$$C=\begin{pmatrix}1&0\\0&1\end{pmatrix},\quad D=\begin{pmatrix}1.5+3i&-1+2i\\-2-2i&2.5+0.5i\end{pmatrix},\quad E=\begin{pmatrix}0.5+i&i\\-2+0.5i&1+0.5i\end{pmatrix},\quad H=\begin{pmatrix}2+i\\-2+i\end{pmatrix},\quad \tau(t)=0.2t,$$

$$f_{1}(u_{1})=\frac{1}{20}\big(\left|u_{1}+1\right|-\left|u_{1}-1\right|\big),\quad
f_{2}(u_{2})=\frac{1}{20}\big(\left|u_{2}+1\right|-\left|u_{2}-1\right|\big). \tag{42}$$

It can be verified that the activation functions $f_{j}$ satisfy assumption (H2), the time-varying delay $\tau(t)$ satisfies assumption (H3), and

$$D^{R}=\begin{pmatrix}1.5&-1\\-2&2.5\end{pmatrix},\quad D^{I}=\begin{pmatrix}3&2\\-2&0.5\end{pmatrix},\quad E^{R}=\begin{pmatrix}0.5&0\\-2&1\end{pmatrix},$$

Figure 1: State trajectories of neural network (38): real parts Re(z1(t)) and Re(z2(t)).

Figure 2: State trajectories of neural network (38): imaginary parts Im(z1(t)) and Im(z2(t)).

$$E^{I}=\begin{pmatrix}1&1\\0.5&0.5\end{pmatrix},\quad \Pi=\begin{pmatrix}0.01&0\\0&0.01\end{pmatrix},\quad \rho=0.2.$$

There exist two constants $\varepsilon=1$ and $T=10$, and by employing the MATLAB LMI toolbox we can find the solutions to the LMIs under conditions (13) and (14) in Corollary 11 as follows:

$$P_{1}=\begin{pmatrix}39.8524&14.7299-3.2336i\\14.7299+3.2336i&31.5075\end{pmatrix},\quad
P_{2}=\begin{pmatrix}23.9471&9.7114-1.3426i\\9.7114+1.3426i&18.8527\end{pmatrix},$$

$$R_{1}=10^{3}\times\begin{pmatrix}1.3534&0\\0&1.0135\end{pmatrix},\quad
R_{2}=\begin{pmatrix}584.6278&0\\0&429.3048\end{pmatrix},$$

Figure 3: State trajectories of neural network (41): real parts Re(z1(t)) and Re(z2(t)).

Figure 4: State trajectories of neural network (41): imaginary parts Im(z1(t)) and Im(z2(t)).

$$Q_{1}=\begin{pmatrix}-51.8770-1.5115i&-16.7003+0.6185i\\-16.8517-0.9537i&-40.7204+1.0687i\end{pmatrix},\quad
Q_{2}=\begin{pmatrix}-60.4708+0.2332i&-21.2500+3.6232i\\-21.6247-3.5095i&-48.2459-0.1731i\end{pmatrix}. \tag{43}$$

Then all conditions in Corollary 11 are satisfied. Thus, neural network (41) is globally power stable. Figures 3 and 4 depict the real and imaginary parts of the states of the considered network (41), where the initial condition is $x_{1}(t)=0.5\sin(0.5t)i$, $x_{2}(t)=-0.5+1.5\cos(0.3t)i$.
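As a feasibility sanity check (ours, not part of the paper), the matrices $P_{1}$ and $P_{2}$ reported in (43) can be verified to be Hermitian and positive definite with numpy, as Corollary 11 requires:

```python
import numpy as np

# Matrices reported in (43); Corollary 11 requires them to be
# Hermitian positive definite.
P1 = np.array([[39.8524, 14.7299 - 3.2336j],
               [14.7299 + 3.2336j, 31.5075]])
P2 = np.array([[23.9471, 9.7114 - 1.3426j],
               [9.7114 + 1.3426j, 18.8527]])

for P in (P1, P2):
    assert np.allclose(P, P.conj().T)       # Hermitian
    assert np.linalg.eigvalsh(P).min() > 0  # positive definite
print("P1, P2 are Hermitian positive definite")
```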


5. Conclusion

In this paper, the global $\mu$-stability of complex-valued neural networks with unbounded time delays has been investigated by employing a combination of Lyapunov-Krasovskii functionals and the free weighting matrix method. Several sufficient conditions for checking the global $\mu$-stability of the considered complex-valued neural networks have been given in terms of linear matrix inequalities (LMIs), which can be checked numerically using the LMI toolbox in MATLAB. As direct applications, we obtain exponential stability and power stability with respect to different time-varying delays. Two examples have been provided to demonstrate the effectiveness and reduced conservatism of the proposed criteria.

Remark 14. Note that the activation functions in this paper are continuous. Recently, neural networks with discontinuous activations have received more and more attention [31-34]. Based on the results of this paper, we will consider the $\mu$-stability of complex-valued neural networks with discontinuous activations in future work.

Conflict of Interests

The authors declare that they have no conflict of interests regarding the publication of this paper.

Acknowledgments

This work was supported by the National Natural Science Foundation of China under Grants 61273021 and 11172247, and in part by the Natural Science Foundation Project of CQ cstc2013jjB40008.

References

[1] H. Dong, Z. Wang, and H. Gao, "Distributed H-infinity filtering for a class of Markovian jump nonlinear time-delay systems over lossy sensor networks," IEEE Transactions on Industrial Electronics, vol. 60, no. 10, pp. 4665-4672, 2013.
[2] J. Hu, Z. Wang, B. Shen, and H. Gao, "Quantised recursive filtering for a class of nonlinear systems with multiplicative noises and missing measurements," International Journal of Control, vol. 86, no. 4, pp. 650-663, 2013.
[3] Y. Fang, K. Yan, and K. Li, "Impulsive synchronization of a class of chaotic systems," Systems Science and Control Engineering, vol. 2, no. 1, pp. 55-60, 2014.
[4] M. Darouach and M. Chadli, "Admissibility and control of switched discrete-time singular systems," Systems Science and Control Engineering, vol. 1, no. 1, pp. 43-51, 2013.
[5] B. Shen, Z. Wang, D. Ding, and H. Shu, "H-infinity state estimation for complex networks with uncertain inner coupling and incomplete measurements," IEEE Transactions on Neural Networks and Learning Systems, vol. 24, no. 12, pp. 2027-2037, 2013.
[6] B. Shen, Z. Wang, and X. Liu, "Sampled-data synchronization control of dynamical networks with stochastic sampling," IEEE Transactions on Automatic Control, vol. 57, no. 10, pp. 2644-2650, 2012.
[7] J. Liang, Z. Wang, Y. Liu, and X. Liu, "State estimation for two-dimensional complex networks with randomly occurring nonlinearities and randomly varying sensor delays," International Journal of Robust and Nonlinear Control, vol. 24, no. 1, pp. 18-38, 2014.
[8] X. Kan, Z. Wang, and H. Shu, "State estimation for discrete-time delayed neural networks with fractional uncertainties and sensor saturations," Neurocomputing, vol. 117, pp. 64-71, 2013.
[9] Z. Wang, B. Zineddin, J. Liang, et al., "A novel neural network approach to cDNA microarray image segmentation," Computer Methods and Programs in Biomedicine, vol. 111, no. 1, pp. 189-198, 2013.
[10] A. Hirose, Complex-Valued Neural Networks, Springer, Berlin, Germany, 2012.
[11] G. Tanaka and K. Aihara, "Complex-valued multistate associative memory with nonlinear multilevel functions for gray-level image reconstruction," IEEE Transactions on Neural Networks, vol. 20, no. 9, pp. 1463-1473, 2009.
[12] S. Chen, L. Hanzo, and S. Tan, "Symmetric complex-valued RBF receiver for multiple-antenna-aided wireless systems," IEEE Transactions on Neural Networks, vol. 19, no. 9, pp. 1659-1665, 2008.
[13] I. Cha and S. A. Kassam, "Channel equalization using adaptive complex radial basis function networks," IEEE Journal on Selected Areas in Communications, vol. 13, no. 1, pp. 122-131, 1995.
[14] T. Nitta, "Orthogonality of decision boundaries in complex-valued neural networks," Neural Computation, vol. 16, no. 1, pp. 73-97, 2004.
[15] M. Faijul Amin and K. Murase, "Single-layered complex-valued neural network for real-valued classification problems," Neurocomputing, vol. 72, no. 4-6, pp. 945-955, 2009.
[16] B. K. Tripathi and P. K. Kalra, "On efficient learning machine with root-power mean neuron in complex domain," IEEE Transactions on Neural Networks, vol. 22, no. 5, pp. 727-738, 2011.
[17] J. H. Mathews and R. W. Howell, Complex Analysis for Mathematics and Engineering, Jones and Bartlett Publications, 3rd edition, 1997.
[18] J. Hu and J. Wang, "Global stability of complex-valued recurrent neural networks with time-delays," IEEE Transactions on Neural Networks, vol. 23, no. 6, pp. 853-865, 2012.
[19] V. S. H. Rao and G. R. Murthy, "Global dynamics of a class of complex valued neural networks," International Journal of Neural Systems, vol. 18, no. 2, pp. 165-171, 2008.
[20] W. Zhou and J. M. Zurada, "Discrete-time recurrent neural networks with complex-valued linear threshold neurons," IEEE Transactions on Circuits and Systems II: Express Briefs, vol. 56, no. 8, pp. 669-673, 2009.
[21] C. Duan and Q. Song, "Boundedness and stability for discrete-time delayed neural network with complex-valued linear threshold neurons," Discrete Dynamics in Nature and Society, vol. 2010, Article ID 368379, 19 pages, 2010.
[22] B. Zou and Q. Song, "Boundedness and complete stability of complex-valued neural networks with time delay," IEEE Transactions on Neural Networks and Learning Systems, vol. 24, no. 8, pp. 1227-1238, 2013.
[23] X. Xu, J. Zhang, and J. Shi, "Exponential stability of complex-valued neural networks with mixed delays," Neurocomputing, vol. 128, pp. 483-490, 2014.
[24] X. Chen and Q. Song, "Global stability of complex-valued neural networks with both leakage time delay and discrete time delay on time scales," Neurocomputing, vol. 121, pp. 254-264, 2013.
[25] S.-I. Niculescu, Delay Effects on Stability: A Robust Control Approach, Springer, London, UK, 2001.
[26] V. B. Kolmanovskii and V. R. Nosov, Stability of Functional-Differential Equations, Academic Press, London, UK, 1986.
[27] Y. Liu, Z. Wang, J. Liang, and X. Liu, "Synchronization of coupled neutral-type neural networks with jumping-mode-dependent discrete and unbounded distributed delays," IEEE Transactions on Cybernetics, vol. 43, no. 1, pp. 102-114, 2013.
[28] T. Chen and L. Wang, "Power-rate global stability of dynamical systems with unbounded time-varying delays," IEEE Transactions on Circuits and Systems II: Express Briefs, vol. 54, no. 8, pp. 705-709, 2007.
[29] T. Chen and L. Wang, "Global mu-stability of delayed neural networks with unbounded time-varying delays," IEEE Transactions on Neural Networks, vol. 18, no. 6, pp. 1836-1840, 2007.
[30] X. Liu and T. Chen, "Robust mu-stability for uncertain stochastic neural networks with unbounded time-varying delays," Physica A: Statistical Mechanics and Its Applications, vol. 387, no. 12, pp. 2952-2962, 2008.
[31] X. Liu and J. Cao, "On periodic solutions of neural networks via differential inclusions," Neural Networks, vol. 22, no. 4, pp. 329-334, 2009.
[32] X. Liu and J. Cao, "Robust state estimation for neural networks with discontinuous activations," IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, vol. 40, no. 6, pp. 1425-1437, 2010.
[33] X. Liu, T. Chen, J. Cao, and W. Lu, "Dissipativity and quasi-synchronization for neural networks with discontinuous activations and parameter mismatches," Neural Networks, vol. 24, no. 10, pp. 1013-1021, 2011.
[34] X. Liu, J. Cao, and W. Yu, "Filippov systems and quasi-synchronization control for switched networks," Chaos, vol. 22, no. 3, Article ID 033110, 2012.

Submit your manuscripts athttpwwwhindawicom

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Mathematical Problems in Engineering

Hindawi Publishing Corporationhttpwwwhindawicom

Differential EquationsInternational Journal of

Volume 2014

Applied MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Probability and StatisticsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Mathematical PhysicsAdvances in

Complex AnalysisJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

OptimizationJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

CombinatoricsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Operations ResearchAdvances in

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Function Spaces

Abstract and Applied AnalysisHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of Mathematics and Mathematical Sciences

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Algebra

Discrete Dynamics in Nature and Society

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Decision SciencesAdvances in

Discrete MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom

Volume 2014 Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Stochastic AnalysisInternational Journal of

Page 6: Research Article Global -Stability of Complex-Valued Neural Networks …downloads.hindawi.com/journals/aaa/2014/263847.pdf · 2019-07-31 · approach, stability criteria were presented

6 Abstract and Applied Analysis

matrices 1198771gt 0 and 119877

2gt 0 two complex matrices 119876

1= 119876119877

1+

119894119876119868

1 1198762= 119876119877

2+ 119894119876119868

2 and a constant 120576 gt 0 such that conditions

(13) and (14) in Theorem 7 are satisfied where Ω11987711

= 2120576119875119877

1+

119875119877

2+119862119876119877

1+ (119876119877

1)119879

119862+1198771ΠΩ11986811= 2120576119875

119868

1+119875119868

2+119862119876119868

1minus (119876119868

1)119879

119862Ω119877

33= minus119890minus2120576120591

(1 minus 120588)119875119877

2+ 1198772Π Ω11986833= minus119890minus2120576120591

(1 minus 120588)119875119868

2

Corollary 11. Assume that assumptions (H2) and (H3) hold and $\tau(t) \le \delta t$ ($0 < \delta < 1$). Neural network (1) is globally power stable if there exist two positive definite Hermitian matrices $P_1 = P_1^R + iP_1^I$ and $P_2 = P_2^R + iP_2^I$, two real positive diagonal matrices $R_1 > 0$ and $R_2 > 0$, two complex matrices $Q_1 = Q_1^R + iQ_1^I$, $Q_2 = Q_2^R + iQ_2^I$, and two constants $\varepsilon > 0$ and $T > 0$ such that conditions (13) and (14) in Theorem 7 are satisfied, where $\Omega_{11}^R = (2\varepsilon/T)P_1^R + P_2^R + CQ_1^R + (Q_1^R)^T C + R_1\Pi$, $\Omega_{11}^I = (2\varepsilon/T)P_1^I + P_2^I + CQ_1^I - (Q_1^I)^T C$, $\Omega_{33}^R = -(1-\delta)^{2\varepsilon}(1-\rho)P_2^R + R_2\Pi$, and $\Omega_{33}^I = -(1-\delta)^{2\varepsilon}(1-\rho)P_2^I$.

Remark 12. In [18, 23], the authors studied the global stability of complex-valued neural networks with time-varying delays, and the activation function was supposed to be
$$f_j(z) = f_j^R(x, y) + i f_j^I(x, y), \qquad (36)$$
where $z = x + iy$ and $f_j^R(\cdot,\cdot)$, $f_j^I(\cdot,\cdot)$ are real-valued for all $j = 1, 2, \ldots, n$, and the partial derivatives of $f_j^R(\cdot,\cdot)$ and $f_j^I(\cdot,\cdot)$ with respect to $x$ and $y$ were required to exist and be continuous. In this paper, both the real parts and the imaginary parts of the activation functions are no longer assumed to be differentiable.

Remark 13. In [21], the authors investigated boundedness and stability for discrete-time complex-valued neural networks with constant delay, and the activation function was supposed to be
$$f_j(z) = \max\{0, \mathrm{Re}(z)\} + i\max\{0, \mathrm{Im}(z)\}. \qquad (37)$$
The authors also gave a complex-valued LMI criterion for checking the stability of the considered neural networks, but a feasible way to solve the given complex-valued LMI was not provided. In this paper, the criteria for checking the stability of complex-valued neural networks are established as real LMIs, which can be checked numerically using the effective LMI toolbox in MATLAB.
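The conversion noted in the remark above rests on a standard identity: a Hermitian matrix $P = P^R + iP^I$ is positive definite exactly when the real symmetric block matrix $\begin{pmatrix} P^R & -P^I \\ P^I & P^R \end{pmatrix}$ is. A minimal pure-Python sketch of that embedding, applied to the (reconstructed) solution $P_1$ of Example 1 below; the helper names are ours, not the paper's:

```python
def real_embedding(PR, PI):
    """Embed Hermitian P = PR + i*PI into the real symmetric
    block matrix [[PR, -PI], [PI, PR]]; P > 0 iff this matrix is > 0."""
    n = len(PR)
    top = [PR[r] + [-x for x in PI[r]] for r in range(n)]
    bot = [PI[r] + PR[r][:] for r in range(n)]
    return top + bot

def is_positive_definite(A):
    """Cholesky-based test for symmetric positive definiteness."""
    n = len(A)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = A[i][j] - sum(L[i][k] * L[j][k] for k in range(j))
            if i == j:
                if s <= 0.0:          # nonpositive pivot: not PD
                    return False
                L[i][i] = s ** 0.5
            else:
                L[i][j] = s / L[j][j]
    return True

# P1 = P1^R + i*P1^I from Example 1 (PR symmetric, PI skew-symmetric).
P1R = [[41.2118, -6.8812], [-6.8812, 40.6762]]
P1I = [[0.0, -4.2410], [4.2410, 0.0]]
print(is_positive_definite(real_embedding(P1R, P1I)))   # True
```

The same trick is what lets a complex-valued LMI be handed to a real LMI solver such as the MATLAB LMI toolbox.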

4 Examples

Example 1. Consider a two-neuron complex-valued neural network with constant delay
$$\dot{u}(t) = -Cu(t) + Df(u(t)) + Ef(u(t-\tau)) + H, \qquad (38)$$
where
$$C = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}, \quad D = \begin{pmatrix} 1+i & -2+i \\ 1-i & -1 \end{pmatrix}, \quad E = \begin{pmatrix} 1+i & i \\ -1+i & 2 \end{pmatrix}, \quad H = \begin{pmatrix} 2-i \\ -2+i \end{pmatrix}, \quad \tau = 2,$$

$$f_j^R(u_j) = \frac{1}{32}\left(\left|\mathrm{Re}(u_j) + 1\right| + \left|\mathrm{Re}(u_j) - 1\right|\right), \quad j = 1, 2,$$
$$f_j^I(u_j) = \frac{1}{32}\left(\left|\mathrm{Im}(u_j) + 1\right| + \left|\mathrm{Im}(u_j) - 1\right|\right), \quad j = 1, 2. \qquad (39)$$
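As a quick sanity check on the Lipschitz requirement in assumption (H1): each component in (39) is piecewise linear, and the $1/32$ scaling gives a slope of at most $1/16$. An illustrative numerical estimate (ours, not part of the original example):

```python
def fR(x):
    """Real part of the Example 1 activation: (1/32)(|x+1| + |x-1|)."""
    return (abs(x + 1.0) + abs(x - 1.0)) / 32.0

# Estimate the Lipschitz constant over a grid of point pairs in [-5, 5].
pts = [k / 10.0 for k in range(-50, 51)]
lip = max(abs(fR(a) - fR(b)) / abs(a - b)
          for a in pts for b in pts if a != b)
print(lip <= 1.0 / 16.0 + 1e-12)   # True: the slope never exceeds 1/16
```

The imaginary-part component behaves identically, so both halves of each activation satisfy a global Lipschitz condition.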

It can be verified that the activation functions $f_j$ satisfy assumption (H1) and the time-varying delay $\tau(t)$ satisfies assumption (H3), with $D^R = \begin{pmatrix} 1 & -2 \\ 1 & -1 \end{pmatrix}$, $D^I = \begin{pmatrix} 1 & 1 \\ -1 & 0 \end{pmatrix}$, $E^R = \begin{pmatrix} 1 & 0 \\ -1 & 2 \end{pmatrix}$, $E^I = \begin{pmatrix} 1 & 1 \\ 1 & 0 \end{pmatrix}$, $\Gamma = \begin{pmatrix} 1/64 & 0 \\ 0 & 1/64 \end{pmatrix}$, and $\rho = 0$. There exists a constant $\varepsilon = 0.2$, and by employing the MATLAB LMI toolbox we can find the solutions to the LMIs in conditions (13) and (14) of Corollary 8 as follows:

$$P_1 = \begin{pmatrix} 41.2118 & -6.8812 - 4.2410i \\ -6.8812 + 4.2410i & 40.6762 \end{pmatrix}, \quad P_2 = \begin{pmatrix} 26.0370 & -4.1385 - 0.7600i \\ -4.1385 + 0.7600i & 25.9447 \end{pmatrix},$$
$$R_1 = \begin{pmatrix} 785.9319 & 0 \\ 0 & 857.0773 \end{pmatrix}, \quad R_2 = \begin{pmatrix} 560.5160 & 0 \\ 0 & 617.1000 \end{pmatrix},$$
$$Q_1 = \begin{pmatrix} -61.6337 + 1.1580i & 9.2494 + 0.6392i \\ 9.1464 + 0.1434i & -61.4512 - 1.3717i \end{pmatrix}, \quad Q_2 = \begin{pmatrix} -63.1096 - 0.5628i & 10.2662 + 2.7895i \\ 10.1047 - 3.3169i & -62.0592 + 0.6291i \end{pmatrix}. \qquad (40)$$
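That the returned $P_1$ and $P_2$ are indeed Hermitian and positive definite can be checked directly: for a $2\times 2$ Hermitian matrix it suffices that the $(1,1)$ entry and the determinant are both positive. A small sketch using the reconstructed values above (helper name is ours):

```python
def hermitian_pd_2x2(p11, p12, p22):
    """PD test for the 2x2 Hermitian matrix [[p11, p12], [conj(p12), p22]]
    with real diagonal: both leading principal minors must be positive."""
    det = p11 * p22 - abs(p12) ** 2    # |p12|^2 = p12 * conj(p12)
    return p11 > 0 and det > 0

print(hermitian_pd_2x2(41.2118, -6.8812 - 4.2410j, 40.6762))  # P1: True
print(hermitian_pd_2x2(26.0370, -4.1385 - 0.7600j, 25.9447))  # P2: True
```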

Then all conditions in Corollary 8 are satisfied. Therefore, the neural network (38) is globally exponentially stable. Figures 1 and 2 depict the real and imaginary parts of the states of the considered neural network (38), where the initial condition is $z_1(t) = 0.5$, $z_2(t) = -1 - 0.5i$.

Example 2. Consider a two-neuron complex-valued neural network with unbounded time-varying delays
$$\dot{u}(t) = -Cu(t) + Df(u(t)) + Ef(u(t - \tau(t))) + H, \qquad (41)$$
where
$$C = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}, \quad D = \begin{pmatrix} 1.5 + 3i & -1 + 2i \\ -2 - 2i & 2.5 + 0.5i \end{pmatrix}, \quad E = \begin{pmatrix} 0.5 + i & i \\ -2 + 0.5i & 1 + 0.5i \end{pmatrix}, \quad H = \begin{pmatrix} 2 + i \\ -2 + i \end{pmatrix}, \quad \tau(t) = 0.2t,$$
$$f_1(u_1) = \frac{1}{20}\left(\left|u_1 + 1\right| - \left|u_1 - 1\right|\right), \qquad f_2(u_2) = \frac{1}{20}\left(\left|u_2 + 1\right| - \left|u_2 - 1\right|\right). \qquad (42)$$

Figure 1: State trajectories of neural network (38): real parts $\mathrm{Re}(z_1(t))$ and $\mathrm{Re}(z_2(t))$ versus $t$.

Figure 2: State trajectories of neural network (38): imaginary parts $\mathrm{Im}(z_1(t))$ and $\mathrm{Im}(z_2(t))$ versus $t$.

It can be verified that the activation functions $f_j$ satisfy assumption (H2) and the time-varying delay $\tau(t)$ satisfies assumption (H3), with $D^R = \begin{pmatrix} 1.5 & -1 \\ -2 & 2.5 \end{pmatrix}$, $D^I = \begin{pmatrix} 3 & 2 \\ -2 & 0.5 \end{pmatrix}$, $E^R = \begin{pmatrix} 0.5 & 0 \\ -2 & 1 \end{pmatrix}$, $E^I = \begin{pmatrix} 1 & 1 \\ 0.5 & 0.5 \end{pmatrix}$, $\Pi = \begin{pmatrix} 0.01 & 0 \\ 0 & 0.01 \end{pmatrix}$, and $\rho = 0.2$. There exist two constants $\varepsilon = 1$ and $T = 10$, and by employing the MATLAB LMI toolbox we can find the solutions to the LMIs in conditions (13) and (14) of Corollary 11 as follows:

$$P_1 = \begin{pmatrix} 39.8524 & 14.7299 - 3.2336i \\ 14.7299 + 3.2336i & 31.5075 \end{pmatrix}, \quad P_2 = \begin{pmatrix} 23.9471 & 9.7114 - 1.3426i \\ 9.7114 + 1.3426i & 18.8527 \end{pmatrix},$$
$$R_1 = 10^3 \times \begin{pmatrix} 1.3534 & 0 \\ 0 & 1.0135 \end{pmatrix}, \quad R_2 = \begin{pmatrix} 584.6278 & 0 \\ 0 & 429.3048 \end{pmatrix},$$
$$Q_1 = \begin{pmatrix} -51.8770 - 1.5115i & -16.7003 + 0.6185i \\ -16.8517 - 0.9537i & -40.7204 + 1.0687i \end{pmatrix}, \quad Q_2 = \begin{pmatrix} -60.4708 + 0.2332i & -21.2500 + 3.6232i \\ -21.6247 - 3.5095i & -48.2459 - 0.1731i \end{pmatrix}. \qquad (43)$$

Figure 3: State trajectories of neural network (41): real parts $\mathrm{Re}(z_1(t))$ and $\mathrm{Re}(z_2(t))$ versus $t$.

Figure 4: State trajectories of neural network (41): imaginary parts $\mathrm{Im}(z_1(t))$ and $\mathrm{Im}(z_2(t))$ versus $t$.

Then all conditions in Corollary 11 are satisfied. Thus, the neural network (41) is globally power stable. Figures 3 and 4 depict the real and imaginary parts of the states of the considered network (41), where the initial condition is $x_1(t) = 0.5\sin(0.5t)i$, $x_2(t) = -0.5 + 1.5\cos(0.3t)i$.
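The trajectories in Figures 3 and 4 can be reproduced qualitatively with a simple forward-Euler integration. Because $\tau(t) = 0.2t$, the delayed argument $t - \tau(t) = 0.8t$ never leaves $[0, t]$, so the scheme only needs past grid values, not a negative-time history. A rough sketch; the step size, horizon, and discretization choices are ours:

```python
# Forward-Euler sketch of network (41); C = I so -C u(t) is just -u(t).
def f(u):
    """Activation of (42): f_j(u_j) = (1/20)(|u_j + 1| - |u_j - 1|), complex u_j."""
    return [(abs(x + 1) - abs(x - 1)) / 20.0 for x in u]

def matvec(M, v):
    """2x2 complex matrix times a 2-vector."""
    return [M[0][0] * v[0] + M[0][1] * v[1],
            M[1][0] * v[0] + M[1][1] * v[1]]

D = [[1.5 + 3j, -1 + 2j], [-2 - 2j, 2.5 + 0.5j]]
E = [[0.5 + 1j, 1j], [-2 + 0.5j, 1 + 0.5j]]
H = [2 + 1j, -2 + 1j]

h, N = 0.01, 4000                      # integrate over t in [0, 40]
traj = [[0.0j, -0.5 + 1.5j]]           # u(0) from the stated initial condition
for k in range(N):
    u = traj[-1]
    ud = traj[int(0.8 * k)]            # u(t - tau(t)) = u(0.8 t), always in the past
    Dfu, Efd = matvec(D, f(u)), matvec(E, f(ud))
    traj.append([u[i] + h * (-u[i] + Dfu[i] + Efd[i] + H[i]) for i in range(2)])

peak = max(abs(x) for state in traj for x in state)
print(peak < 10.0)                     # the state stays bounded
```

Since the activations are bounded by $0.1$ in modulus and $C = I$, boundedness of the Euler trajectory is expected, consistent with the settling behavior shown in the figures.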


5 Conclusion

In this paper, the global $\mu$-stability of complex-valued neural networks with unbounded time-varying delays has been investigated by employing a combination of Lyapunov-Krasovskii functionals and the free weighting matrix method. Several sufficient conditions for checking the global $\mu$-stability of the considered complex-valued neural networks have been given in terms of linear matrix inequalities (LMIs), which can be checked numerically using the effective LMI toolbox in MATLAB. As direct applications, we can obtain exponential stability and power stability with respect to different time-varying delays. Two examples have been provided to demonstrate the effectiveness and reduced conservatism of the proposed criteria.
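For concreteness, the two special cases mentioned above correspond to different choices of the function $\mu$ in the $\mu$-stability framework of [29]; schematically (the constant $M$ is illustrative):

```latex
% Global mu-stability: for some M > 0 and equilibrium \check{z},
\left\| z(t) - \check{z} \right\| \le \frac{M}{\mu(t)}, \qquad t \ge t_0.
% mu(t) = e^{\varepsilon t}: global exponential stability
%   (bounded delay, as in Example 1 / Corollary 8);
% mu(t) = t^{\varepsilon}: global power stability
%   (tau(t) <= delta t, as in Example 2 / Corollary 11).
```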

Remark 14. Note that the activation functions in this paper are continuous. Recently, neural networks with discontinuous activations have received more and more attention [31-34]. Based on the results of this paper, we will consider the $\mu$-stability of complex-valued neural networks with discontinuous activations in the future.

Conflict of Interests

The authors declare that they have no conflict of interests regarding the publication of this paper.

Acknowledgments

This work was supported by the National Natural Science Foundation of China under Grants 61273021 and 11172247, and in part by the Natural Science Foundation Project of CQ cstc2013jjB40008.

References

[1] H. Dong, Z. Wang, and H. Gao, "Distributed H∞ filtering for a class of Markovian jump nonlinear time-delay systems over lossy sensor networks," IEEE Transactions on Industrial Electronics, vol. 60, no. 10, pp. 4665-4672, 2013.

[2] J. Hu, Z. Wang, B. Shen, and H. Gao, "Quantised recursive filtering for a class of nonlinear systems with multiplicative noises and missing measurements," International Journal of Control, vol. 86, no. 4, pp. 650-663, 2013.

[3] Y. Fang, K. Yan, and K. Li, "Impulsive synchronization of a class of chaotic systems," Systems Science and Control Engineering, vol. 2, no. 1, pp. 55-60, 2014.

[4] M. Darouach and M. Chadli, "Admissibility and control of switched discrete-time singular systems," Systems Science and Control Engineering, vol. 1, no. 1, pp. 43-51, 2013.

[5] B. Shen, Z. Wang, D. Ding, and H. Shu, "H∞ state estimation for complex networks with uncertain inner coupling and incomplete measurements," IEEE Transactions on Neural Networks and Learning Systems, vol. 24, no. 12, pp. 2027-2037, 2013.

[6] B. Shen, Z. Wang, and X. Liu, "Sampled-data synchronization control of dynamical networks with stochastic sampling," IEEE Transactions on Automatic Control, vol. 57, no. 10, pp. 2644-2650, 2012.

[7] J. Liang, Z. Wang, Y. Liu, and X. Liu, "State estimation for two-dimensional complex networks with randomly occurring nonlinearities and randomly varying sensor delays," International Journal of Robust and Nonlinear Control, vol. 24, no. 1, pp. 18-38, 2014.

[8] X. Kan, Z. Wang, and H. Shu, "State estimation for discrete-time delayed neural networks with fractional uncertainties and sensor saturations," Neurocomputing, vol. 117, pp. 64-71, 2013.

[9] Z. Wang, B. Zineddin, J. Liang, et al., "A novel neural network approach to cDNA microarray image segmentation," Computer Methods and Programs in Biomedicine, vol. 111, no. 1, pp. 189-198, 2013.

[10] A. Hirose, Complex-Valued Neural Networks, Springer, Berlin, Germany, 2012.

[11] G. Tanaka and K. Aihara, "Complex-valued multistate associative memory with nonlinear multilevel functions for gray-level image reconstruction," IEEE Transactions on Neural Networks, vol. 20, no. 9, pp. 1463-1473, 2009.

[12] S. Chen, L. Hanzo, and S. Tan, "Symmetric complex-valued RBF receiver for multiple-antenna-aided wireless systems," IEEE Transactions on Neural Networks, vol. 19, no. 9, pp. 1659-1665, 2008.

[13] I. Cha and S. A. Kassam, "Channel equalization using adaptive complex radial basis function networks," IEEE Journal on Selected Areas in Communications, vol. 13, no. 1, pp. 122-131, 1995.

[14] T. Nitta, "Orthogonality of decision boundaries in complex-valued neural networks," Neural Computation, vol. 16, no. 1, pp. 73-97, 2004.

[15] M. Faijul Amin and K. Murase, "Single-layered complex-valued neural network for real-valued classification problems," Neurocomputing, vol. 72, no. 4-6, pp. 945-955, 2009.

[16] B. K. Tripathi and P. K. Kalra, "On efficient learning machine with root-power mean neuron in complex domain," IEEE Transactions on Neural Networks, vol. 22, no. 5, pp. 727-738, 2011.

[17] J. H. Mathews and R. W. Howell, Complex Analysis for Mathematics and Engineering, Jones and Bartlett Publishers, 3rd edition, 1997.

[18] J. Hu and J. Wang, "Global stability of complex-valued recurrent neural networks with time-delays," IEEE Transactions on Neural Networks, vol. 23, no. 6, pp. 853-865, 2012.

[19] V. S. H. Rao and G. R. Murthy, "Global dynamics of a class of complex valued neural networks," International Journal of Neural Systems, vol. 18, no. 2, pp. 165-171, 2008.

[20] W. Zhou and J. M. Zurada, "Discrete-time recurrent neural networks with complex-valued linear threshold neurons," IEEE Transactions on Circuits and Systems II: Express Briefs, vol. 56, no. 8, pp. 669-673, 2009.

[21] C. Duan and Q. Song, "Boundedness and stability for discrete-time delayed neural network with complex-valued linear threshold neurons," Discrete Dynamics in Nature and Society, vol. 2010, Article ID 368379, 19 pages, 2010.

[22] B. Zou and Q. Song, "Boundedness and complete stability of complex-valued neural networks with time delay," IEEE Transactions on Neural Networks and Learning Systems, vol. 24, no. 8, pp. 1227-1238, 2013.

[23] X. Xu, J. Zhang, and J. Shi, "Exponential stability of complex-valued neural networks with mixed delays," Neurocomputing, vol. 128, pp. 483-490, 2014.

[24] X. Chen and Q. Song, "Global stability of complex-valued neural networks with both leakage time delay and discrete time delay on time scales," Neurocomputing, vol. 121, pp. 254-264, 2013.

[25] S.-I. Niculescu, Delay Effects on Stability: A Robust Control Approach, Springer, London, UK, 2001.

[26] V. B. Kolmanovskii and V. R. Nosov, Stability of Functional-Differential Equations, Academic Press, London, UK, 1986.

[27] Y. Liu, Z. Wang, J. Liang, and X. Liu, "Synchronization of coupled neutral-type neural networks with jumping-mode-dependent discrete and unbounded distributed delays," IEEE Transactions on Cybernetics, vol. 43, no. 1, pp. 102-114, 2013.

[28] T. Chen and L. Wang, "Power-rate global stability of dynamical systems with unbounded time-varying delays," IEEE Transactions on Circuits and Systems II: Express Briefs, vol. 54, no. 8, pp. 705-709, 2007.

[29] T. Chen and L. Wang, "Global μ-stability of delayed neural networks with unbounded time-varying delays," IEEE Transactions on Neural Networks, vol. 18, no. 6, pp. 1836-1840, 2007.

[30] X. Liu and T. Chen, "Robust μ-stability for uncertain stochastic neural networks with unbounded time-varying delays," Physica A: Statistical Mechanics and Its Applications, vol. 387, no. 12, pp. 2952-2962, 2008.

[31] X. Liu and J. Cao, "On periodic solutions of neural networks via differential inclusions," Neural Networks, vol. 22, no. 4, pp. 329-334, 2009.

[32] X. Liu and J. Cao, "Robust state estimation for neural networks with discontinuous activations," IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, vol. 40, no. 6, pp. 1425-1437, 2010.

[33] X. Liu, T. Chen, J. Cao, and W. Lu, "Dissipativity and quasi-synchronization for neural networks with discontinuous activations and parameter mismatches," Neural Networks, vol. 24, no. 10, pp. 1013-1021, 2011.

[34] X. Liu, J. Cao, and W. Yu, "Filippov systems and quasi-synchronization control for switched networks," Chaos, vol. 22, no. 3, Article ID 033110, 2012.

Submit your manuscripts athttpwwwhindawicom

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Mathematical Problems in Engineering

Hindawi Publishing Corporationhttpwwwhindawicom

Differential EquationsInternational Journal of

Volume 2014

Applied MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Probability and StatisticsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Mathematical PhysicsAdvances in

Complex AnalysisJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

OptimizationJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

CombinatoricsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Operations ResearchAdvances in

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Function Spaces

Abstract and Applied AnalysisHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of Mathematics and Mathematical Sciences

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Algebra

Discrete Dynamics in Nature and Society

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Decision SciencesAdvances in

Discrete MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom

Volume 2014 Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Stochastic AnalysisInternational Journal of

Page 7: Research Article Global -Stability of Complex-Valued Neural Networks …downloads.hindawi.com/journals/aaa/2014/263847.pdf · 2019-07-31 · approach, stability criteria were presented

Abstract and Applied Analysis 7

0 10 20 30 40minus3

minus2

minus1

0

1

2

3

t

Re(z

1(t))

and

Re(z

2(t))

Re(z1(t))Re(z2(t))

Figure 1 State trajectories of neural network (38)

minus1

minus05

0

05

1

15

0 10 20 30 40t

Im(z1(t))

Im(z2(t))

Im(z

1(t))

and

Im(z

2(t))

Figure 2 State trajectories of neural network (38)

119864119868

= ( 1 105 05

) Π = ( 001 00 001

) 120588 = 02 There exist twoconstants 120576 = 1 and 119879 = 10 and by employing MATLABLMI toolbox we can find the solutions to the LMI underconditions (13) and (14) in Corollary 11 as follows

1198751= (

398524 147299 minus 32336119894

147299 + 32336119894 315075)

1198752= (

239471 97114 minus 13426119894

97114 + 13426119894 188527)

1198771= 103

times (13534 0

0 10135)

1198772= (

5846278 0

0 4293048)

0 10 20 30 40

minus3

minus4

minus2

minus1

0

1

2

3

t

Re(z

1(t))

and

Re(z

2(t))

Re(z1(t))Re(z2(t))

Figure 3 State trajectories of neural network (41)

minus02

0

02

04

06

08

1

12

14

16

0 10 20 30 40t

Im(z1(t))

Im(z2(t))

Im(z

1(t))

and

Im(z

2(t))

Figure 4 State trajectories of neural network (41)

1198761= (

minus518770 minus 15115119894 minus167003 + 06185119894

minus168517 minus 09537119894 minus407204 + 10687119894)

1198762= (

minus604708 + 02332119894 minus212500 + 36232119894

minus216247 minus 35095119894 minus482459 minus 01731119894)

(43)

Then all conditions in Corollary 11 are satisfied Thus theneural network (41) is globally power stable Figures 3 and 4depict the real and imaginary parts of states of the considerednetwork (41) where initial condition is 119909

1(119905) = 05 sin(05119905)119894

1199092(119905) = minus05 + 15 cos(03119905)119894

8 Abstract and Applied Analysis

5 Conclusion

In this paper the global 120583-stability of complex-valued neuralnetworks with unbounded time delays has been investi-gated by employing a combination of Lyapunov-Krasovskiifunctionals and the free weighting matrix method Severalsufficient conditions for checking the global 120583-stability ofthe considered complex-valued neural networks have beengiven in linearmatrix inequality (LMI) which can be checkednumerically using the effective LMI toolbox in MATLABAs direct applications we can get the exponential stabilityand power stability with respect to different time-varyingdelays Two examples have been provided to demonstrate theeffectiveness and less conservatism of the proposed criteria

Remark 14 Note that the activation functions in this paperare continuous Recently neural networks with discontinu-ous activations have received more and more attention [31ndash34] Based on the results of this paper we will consider 120583-stability of complex-valued neural networks with discontin-uous activations in future

Conflict of Interests

The authors declare that they have no conflict of interestsregarding the publication of this paper

Acknowledgments

This work was supported by the National Natural ScienceFoundation of China under Grants 61273021 and 11172247and in part by the Natural Science Foundation Project of CQcstc2013jjB40008

References

[1] H Dong Z Wang and H Gao ldquoDistributed 119867infin

filteringfor a class of Markovian jump nonlinear time-delay systemsover lossy sensor networksrdquo IEEE Transactions on IndustrialElectronics vol 60 no 10 pp 4665ndash4672 2013

[2] J Hu Z Wang B Shen and H Gao ldquoQuantised recursivefiltering for a class of nonlinear systems with multiplicativenoises and missing measurementsrdquo International Journal ofControl vol 86 no 4 pp 650ndash663 2013

[3] Y Fang K Yan and K Li ldquoImpulsive synchronization of a classof chaotic systemsrdquo Systems Science and Control Engineeringvol 2 no 1 pp 55ndash60 2014

[4] M Darouach and M Chadli ldquoAdmissibility and control ofswitched discrete-time singular systemsrdquo Systems Science andControl Engineering vol 1 no 1 pp 43ndash51 2013

[5] B Shen ZWang D Ding andH Shu ldquo119867infinstate estimation for

complex networks with uncertain inner coupling and incom-plete measurementsrdquo IEEE Transactions on Neural Networksand Learning Systems vol 24 no 12 pp 2027ndash2037 2013

[6] B Shen Z Wang and X Liu ldquoSampled-data synchronizationcontrol of dynamical networks with stochastic samplingrdquo IEEETransactions on Automatic Control vol 57 no 10 pp 2644ndash2650 2012

[7] J Liang Z Wang Y Liu and X Liu ldquoState estimation for two-dimensional complex networks with randomly occurring non-linearities and randomly varying sensor delaysrdquo InternationalJournal of Robust and Nonlinear Control vol 24 no 1 pp 18ndash38 2014

[8] X Kan Z Wang and H Shu ldquoState estimation for discrete-time delayed neural networks with fractional uncertainties andsensor saturationsrdquo Neurocomputing vol 117 pp 64ndash71 2013

[9] Z Wang B Zineddin J Liang et al ldquoA novel neural networkapproach to cDNAmicroarray image segmentationrdquo ComputerMethods and Programs in Biomedicine vol 111 no 1 pp 189ndash198 2013

[10] A Hirose Complex-Valued Neural Networks Springer Heidel-berg GmbH and Co K Berlin Germany 2012

[11] G Tanaka and K Aihara ldquoComplex-valued multistate associa-tive memory with nonlinear multilevel functions for gray-levelimage reconstructionrdquo IEEE Transactions on Neural Networksvol 20 no 9 pp 1463ndash1473 2009

[12] S Chen L Hanzo and S Tan ldquoSymmetric complex-valued RBFreceiver for multiple-antenna-aided wireless systemsrdquo IEEEtransactions on Neural Networks vol 19 no 9 pp 1659ndash16652008

[13] I Cha and S A Kassam ldquoChannel equalization using adaptivecomplex radial basis function networksrdquo IEEE Journal onSelected Areas in Communications vol 13 no 1 pp 122ndash1311995

[14] T Nitta ldquoOrthogonality of decision boundaries in complex-valued neural networksrdquo Neural Computation vol 16 no 1 pp73ndash97 2004

[15] M Faijul Amin and K Murase ldquoSingle-layered complex-valued neural network for real-valued classification problemsrdquoNeurocomputing vol 72 no 4ndash6 pp 945ndash955 2009

[16] B K Tripathi and P K Kalra ldquoOn efficient learning machinewith root-power mean neuron in complex domainrdquo IEEETransactions on Neural Networks vol 22 no 5 pp 727ndash7382011

[17] J H Mathews and R W Howell Complex Analysis for Math-ematics and Engineering Jones and Bartlett Publications 3rdedition 1997

[18] J Hu and JWang ldquoGlobal stability of complex-valued recurrentneural networks with time-delaysrdquo IEEETransactions onNeuralNetworks vol 23 no 6 pp 853ndash865 2012

[19] V S H Rao and G R Murthy ldquoGlobal dynamics of a classof complex valued neural networksrdquo International Journal ofNeural Systems vol 18 no 2 pp 165ndash171 2008

[20] W Zhou and J M Zurada ldquoDiscrete-time recurrent neuralnetworks with complex-valued linear threshold neuronsrdquo IEEETransactions on Circuits and Systems II Express Briefs vol 56no 8 pp 669ndash673 2009

[21] C Duan and Q Song ldquoBoundedness and stability for discrete-time delayed neural network with complex-valued linearthreshold neuronsrdquo Discrete Dynamics in Nature and Societyvol 2010 Article ID 368379 19 pages 2010

[22] B Zou and Q Song ldquoBoundedness and complete stabilityof complex-valued neural networks with time delayrdquo IEEETransactions on Neural Networks and Learning Systems vol 24no 8 pp 1227ndash1238 2013

[23] X Xu J Zhang and J Shi ldquoExponential stability of complex-valued neural networks with mixed delaysrdquo Neurocomputingvol 128 pp 483ndash490 2014

Abstract and Applied Analysis 9

[24] X Chen and Q Song ldquoGlobal stability of complex-valuedneural networks with both leakage time delay and discrete timedelay on time scalesrdquo Neurocomputing vol 121 pp 254ndash2642013

[25] S-I Niculescu Delay Effects on Stability A Robust ControlApproach Springer London UK 2001

[26] V B Kolmanovskiı and V R Nosov Stability of Functional-Differential Equations Academic Press London UK 1986

[27] Y Liu Z Wang J Liang and X Liu ldquoSynchronization ofcoupled neutral-type neural networks with jumping-mode-dependent discrete and unbounded distributed delaysrdquo IEEETransactions on Cybernetics vol 43 no 1 pp 102ndash114 2013

[28] T Chen and L Wang ldquoPower-rate global stability of dynamicalsystems with unbounded time-varying delaysrdquo IEEE Transac-tions on Circuits and Systems II Express Briefs vol 54 no 8 pp705ndash709 2007

[29] T Chen and L Wang ldquoGlobal 120583-stability of delayed neural net-works with unbounded time-varying delaysrdquo IEEETransactionson Neural Networks vol 18 no 6 pp 1836ndash1840 2007

[30] X Liu and T Chen ldquoRobust 120583-stability for uncertain stochasticneural networks with unbounded time-varying delaysrdquo PhysicaA Statistical Mechanics and Its Applications vol 387 no 12 pp2952ndash2962 2008

[31] X Liu and J Cao ldquoOn periodic solutions of neural networksvia differential inclusionsrdquo Neural Networks vol 22 no 4 pp329ndash334 2009

[32] X Liu and J Cao ldquoRobust state estimation for neural networkswith discontinuous activationsrdquo IEEE Transactions on SystemsMan and Cybernetics B Cybernetics vol 40 no 6 pp 1425ndash1437 2010

[33] X Liu T Chen J Cao and W Lu ldquoDissipativity and quasi-synchronization for neural networks with discontinuous acti-vations and parameter mismatchesrdquo Neural Networks vol 24no 10 pp 1013ndash1021 2011

[34] X Liu J Cao and W Yu ldquoFilippov systems and quasi-sync-hronization control for switched networksrdquo Chaos vol 22 no3 Article ID 033110 2012

Submit your manuscripts athttpwwwhindawicom

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Mathematical Problems in Engineering

Hindawi Publishing Corporationhttpwwwhindawicom

Differential EquationsInternational Journal of

Volume 2014

Applied MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Probability and StatisticsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Mathematical PhysicsAdvances in

Complex AnalysisJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

OptimizationJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

CombinatoricsHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Operations ResearchAdvances in

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Function Spaces

Abstract and Applied AnalysisHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of Mathematics and Mathematical Sciences

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Algebra

Discrete Dynamics in Nature and Society

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Decision SciencesAdvances in

Discrete MathematicsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom

Volume 2014 Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Stochastic AnalysisInternational Journal of

Page 8: Research Article Global -Stability of Complex-Valued Neural Networks …downloads.hindawi.com/journals/aaa/2014/263847.pdf · 2019-07-31 · approach, stability criteria were presented

8 Abstract and Applied Analysis

5 Conclusion

In this paper the global 120583-stability of complex-valued neuralnetworks with unbounded time delays has been investi-gated by employing a combination of Lyapunov-Krasovskiifunctionals and the free weighting matrix method Severalsufficient conditions for checking the global 120583-stability ofthe considered complex-valued neural networks have beengiven in linearmatrix inequality (LMI) which can be checkednumerically using the effective LMI toolbox in MATLABAs direct applications we can get the exponential stabilityand power stability with respect to different time-varyingdelays Two examples have been provided to demonstrate theeffectiveness and less conservatism of the proposed criteria

Remark 14 Note that the activation functions in this paperare continuous Recently neural networks with discontinu-ous activations have received more and more attention [31ndash34] Based on the results of this paper we will consider 120583-stability of complex-valued neural networks with discontin-uous activations in future

Conflict of Interests

The authors declare that they have no conflict of interestsregarding the publication of this paper

Acknowledgments

This work was supported by the National Natural ScienceFoundation of China under Grants 61273021 and 11172247and in part by the Natural Science Foundation Project of CQcstc2013jjB40008

References

[1] H Dong Z Wang and H Gao ldquoDistributed 119867infin

filteringfor a class of Markovian jump nonlinear time-delay systemsover lossy sensor networksrdquo IEEE Transactions on IndustrialElectronics vol 60 no 10 pp 4665ndash4672 2013

[2] J Hu Z Wang B Shen and H Gao ldquoQuantised recursivefiltering for a class of nonlinear systems with multiplicativenoises and missing measurementsrdquo International Journal ofControl vol 86 no 4 pp 650ndash663 2013

[3] Y Fang K Yan and K Li ldquoImpulsive synchronization of a classof chaotic systemsrdquo Systems Science and Control Engineeringvol 2 no 1 pp 55ndash60 2014

[4] M. Darouach and M. Chadli, "Admissibility and control of switched discrete-time singular systems," Systems Science and Control Engineering, vol. 1, no. 1, pp. 43–51, 2013.

[5] B. Shen, Z. Wang, D. Ding, and H. Shu, "H∞ state estimation for complex networks with uncertain inner coupling and incomplete measurements," IEEE Transactions on Neural Networks and Learning Systems, vol. 24, no. 12, pp. 2027–2037, 2013.

[6] B. Shen, Z. Wang, and X. Liu, "Sampled-data synchronization control of dynamical networks with stochastic sampling," IEEE Transactions on Automatic Control, vol. 57, no. 10, pp. 2644–2650, 2012.

[7] J. Liang, Z. Wang, Y. Liu, and X. Liu, "State estimation for two-dimensional complex networks with randomly occurring nonlinearities and randomly varying sensor delays," International Journal of Robust and Nonlinear Control, vol. 24, no. 1, pp. 18–38, 2014.

[8] X. Kan, Z. Wang, and H. Shu, "State estimation for discrete-time delayed neural networks with fractional uncertainties and sensor saturations," Neurocomputing, vol. 117, pp. 64–71, 2013.

[9] Z. Wang, B. Zineddin, J. Liang, et al., "A novel neural network approach to cDNA microarray image segmentation," Computer Methods and Programs in Biomedicine, vol. 111, no. 1, pp. 189–198, 2013.

[10] A. Hirose, Complex-Valued Neural Networks, Springer, Berlin, Germany, 2012.

[11] G. Tanaka and K. Aihara, "Complex-valued multistate associative memory with nonlinear multilevel functions for gray-level image reconstruction," IEEE Transactions on Neural Networks, vol. 20, no. 9, pp. 1463–1473, 2009.

[12] S. Chen, L. Hanzo, and S. Tan, "Symmetric complex-valued RBF receiver for multiple-antenna-aided wireless systems," IEEE Transactions on Neural Networks, vol. 19, no. 9, pp. 1659–1665, 2008.

[13] I. Cha and S. A. Kassam, "Channel equalization using adaptive complex radial basis function networks," IEEE Journal on Selected Areas in Communications, vol. 13, no. 1, pp. 122–131, 1995.

[14] T. Nitta, "Orthogonality of decision boundaries in complex-valued neural networks," Neural Computation, vol. 16, no. 1, pp. 73–97, 2004.

[15] M. Faijul Amin and K. Murase, "Single-layered complex-valued neural network for real-valued classification problems," Neurocomputing, vol. 72, no. 4–6, pp. 945–955, 2009.

[16] B. K. Tripathi and P. K. Kalra, "On efficient learning machine with root-power mean neuron in complex domain," IEEE Transactions on Neural Networks, vol. 22, no. 5, pp. 727–738, 2011.

[17] J. H. Mathews and R. W. Howell, Complex Analysis for Mathematics and Engineering, Jones and Bartlett Publishers, 3rd edition, 1997.

[18] J. Hu and J. Wang, "Global stability of complex-valued recurrent neural networks with time-delays," IEEE Transactions on Neural Networks, vol. 23, no. 6, pp. 853–865, 2012.

[19] V. S. H. Rao and G. R. Murthy, "Global dynamics of a class of complex valued neural networks," International Journal of Neural Systems, vol. 18, no. 2, pp. 165–171, 2008.

[20] W. Zhou and J. M. Zurada, "Discrete-time recurrent neural networks with complex-valued linear threshold neurons," IEEE Transactions on Circuits and Systems II: Express Briefs, vol. 56, no. 8, pp. 669–673, 2009.

[21] C. Duan and Q. Song, "Boundedness and stability for discrete-time delayed neural network with complex-valued linear threshold neurons," Discrete Dynamics in Nature and Society, vol. 2010, Article ID 368379, 19 pages, 2010.

[22] B. Zou and Q. Song, "Boundedness and complete stability of complex-valued neural networks with time delay," IEEE Transactions on Neural Networks and Learning Systems, vol. 24, no. 8, pp. 1227–1238, 2013.

[23] X. Xu, J. Zhang, and J. Shi, "Exponential stability of complex-valued neural networks with mixed delays," Neurocomputing, vol. 128, pp. 483–490, 2014.

Abstract and Applied Analysis 9

[24] X. Chen and Q. Song, "Global stability of complex-valued neural networks with both leakage time delay and discrete time delay on time scales," Neurocomputing, vol. 121, pp. 254–264, 2013.

[25] S.-I. Niculescu, Delay Effects on Stability: A Robust Control Approach, Springer, London, UK, 2001.

[26] V. B. Kolmanovskii and V. R. Nosov, Stability of Functional-Differential Equations, Academic Press, London, UK, 1986.

[27] Y. Liu, Z. Wang, J. Liang, and X. Liu, "Synchronization of coupled neutral-type neural networks with jumping-mode-dependent discrete and unbounded distributed delays," IEEE Transactions on Cybernetics, vol. 43, no. 1, pp. 102–114, 2013.

[28] T. Chen and L. Wang, "Power-rate global stability of dynamical systems with unbounded time-varying delays," IEEE Transactions on Circuits and Systems II: Express Briefs, vol. 54, no. 8, pp. 705–709, 2007.

[29] T. Chen and L. Wang, "Global μ-stability of delayed neural networks with unbounded time-varying delays," IEEE Transactions on Neural Networks, vol. 18, no. 6, pp. 1836–1840, 2007.

[30] X. Liu and T. Chen, "Robust μ-stability for uncertain stochastic neural networks with unbounded time-varying delays," Physica A: Statistical Mechanics and Its Applications, vol. 387, no. 12, pp. 2952–2962, 2008.

[31] X. Liu and J. Cao, "On periodic solutions of neural networks via differential inclusions," Neural Networks, vol. 22, no. 4, pp. 329–334, 2009.

[32] X. Liu and J. Cao, "Robust state estimation for neural networks with discontinuous activations," IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, vol. 40, no. 6, pp. 1425–1437, 2010.

[33] X. Liu, T. Chen, J. Cao, and W. Lu, "Dissipativity and quasi-synchronization for neural networks with discontinuous activations and parameter mismatches," Neural Networks, vol. 24, no. 10, pp. 1013–1021, 2011.

[34] X. Liu, J. Cao, and W. Yu, "Filippov systems and quasi-synchronization control for switched networks," Chaos, vol. 22, no. 3, Article ID 033110, 2012.
