
Soft Computing, DOI 10.1007/s00500-015-1594-8

METHODOLOGIES AND APPLICATION

A particle swarm inspired cuckoo search algorithm for real parameter optimization

Xiangtao Li · Minghao Yin

© Springer-Verlag Berlin Heidelberg 2015

Abstract The cuckoo search algorithm (CS) is a simple and effective global optimization algorithm. It has been successfully applied to solve a wide range of real-world optimization problems. In this paper, inspired by particle swarm optimization (PSO), the proposed algorithm uses the best individuals among the entire population to enhance the convergence rate of the standard cuckoo search algorithm. While PSO directly uses the global best solution of the population to determine new positions for the particles at each iteration, agents of the CS do not directly use this information; the global best solution in the CS is merely stored at each iteration. Here, the global best solutions are added into the search, and the resulting information flow between the nests helps increase the global and local search abilities of the new approach. Therefore, in the first component, the neighborhood information is added into the new population to enhance the diversity of the algorithm. In the second component, two new search strategies are used to balance the exploitation and exploration of the algorithm through a random probability rule. In addition, our algorithm has a very simple structure and thus is easy to implement. To verify the performance of PSCS, 30 benchmark functions chosen from the literature are employed. The results show that the proposed PSCS algorithm clearly outperforms the basic CS and PSO algorithms. Compared with some evolutionary algorithms (CLPSO, CMA-ES, GL-25, DE, OXDE, ABC, GOABC, FA, FPA, CoDE, BA, BSA, BDS and SDS) from the literature, experimental results indicate that

Communicated by V. Loia.

X. Li (B) · M. Yin
School of Computer Science and Information Technology, Northeast Normal University, Changchun 130117, China
e-mail: [email protected]

M. Yin
e-mail: [email protected]

the proposed algorithm performs better than, or is at least comparable to, state-of-the-art approaches from the literature when considering the quality of the solutions obtained. In the last part, experiments have been conducted on two real-world optimization problems: the spread spectrum radar poly-phase code design problem and a chaotic system. Simulation results demonstrate that the proposed algorithm is very effective.

Keywords Cuckoo search algorithm · Global numerical optimization · Particle swarm optimization · Exploration · Exploitation · Chaotic system

1 Introduction

Optimization methods play an important role in many scientific and engineering fields. In the past decades, with computational cost having been reduced dramatically, researchers all over the world have been proposing new evolutionary algorithms on a regular basis to meet the demands of complex, real-world optimization problems, and these algorithms have attracted more and more attention in recent years. Many kinds of evolutionary algorithms have been advanced to solve optimization problems, such as the genetic algorithm (GA), particle swarm optimization (PSO), estimation of distribution algorithms (EDA), ant colony optimization (ACO), the firefly algorithm (FA), the flower pollination algorithm (FPA), differential evolution (DE), the artificial bee colony (ABC), and the cuckoo search algorithm (CS) (Yildiz and Saitou 2011; Yildiz and Solanki 2012; Yildiz 2012, 2013a, b; Kennedy and Eberhart 1995; Yang and Deb 2009; Yang 2009, 2012; Storn and Price 1997).

Among them, the performance of the cuckoo search algorithm has been carefully studied by many researchers since it was proposed in 2009.


The cuckoo search algorithm is a population-based heuristic evolutionary algorithm inspired by the interesting breeding behavior, such as brood parasitism, of certain species of cuckoos. In CS, each cuckoo lays one egg at a time and dumps it in a randomly chosen nest. The best nests with high-quality eggs will carry over to the next generation. The number of available host nests is fixed, and the egg laid by a cuckoo is discovered by the host bird with a probability; in this case, the host bird can either throw the egg away or abandon the nest and build a completely new one. To accelerate the convergence speed and avoid local optima, several variations of CS have recently been proposed to enhance the performance of the standard CS. Moreover, CS has been proved to be really efficient for solving real-world problems. Civicioglu (2013a, b) compares the performance of CS with that of particle swarm optimization, differential evolution, and the artificial bee colony on many global optimization problems: the performances of the CS and PSO algorithms are statistically closer to the performance of the DE algorithm than to that of the ABC algorithm, and the CS and DE algorithms supply more robust and precise results than the PSO and ABC algorithms. Walton et al. (2011) propose the modified cuckoo search, which can be regarded as a modification of the recently developed cuckoo search; the modification involves the addition of information exchange between the top eggs, or the best solutions. Gandomi et al. (2013) propose CS for solving structural optimization problems and apply it to 13 design problems reported in the specialized literature; the performance of the CS algorithm is further compared with various algorithms representative of the state of the art in the area, and the optimal solutions obtained by CS are better than the best solutions obtained by the existing methods. Layeb (2011) proposes a new algorithm called the quantum-inspired cuckoo search algorithm, a framework relying on quantum computing principles and the cuckoo search algorithm; the contribution consists in defining an appropriate representation scheme in the cuckoo search algorithm that allows it to be applied successfully to combinatorial optimization problems. Tuba et al. (2011) implement a modified version of this algorithm where the stepsize is determined from the sorted, rather than only permuted, fitness matrix; the modified algorithm is tested on eight standard benchmark functions, and a comparison of the pure cuckoo search algorithm with this modified one shows improved results from the modification. Goghrehabadi et al. (2011) propose a hybrid power series and cuckoo search via Lévy flights optimization algorithm (PS-CS) that is applied to solve a system of nonlinear differential equations arising from the distributed parameter model of a micro fixed-fixed switch subject to electrostatic force and fringing field effect; the obtained results are compared with numerical results and found in good agreement. Furthermore, the present method can be

easily extended to solve a wide range of boundary value problems. Yildiz (2013a, b) applies CS to the optimization of machining parameters; the results demonstrate that CS is a very effective and robust approach for machining optimization problems. Durgun and Yildiz (2012) propose to use the cuckoo search (CS) algorithm for solving structural design optimization problems, applying it to the structural design optimization of a vehicle component to illustrate how the approach can be used on structural design problems. Agrawal et al. (2013) use the cuckoo search algorithm to find the optimal thresholds for multi-level thresholding of an image by maximizing the Tsallis entropy; the results are then compared with those of other algorithms. Ouaarab et al. (2014) present an improved and discrete version of the cuckoo search (CS) algorithm to solve the famous traveling salesman problem (TSP), an NP-hard combinatorial optimization problem; the proposed discrete cuckoo search (DCS) is tested against a set of symmetric TSP benchmarks from the well-known TSPLIB library. Burnwal and Deb (2013) propose a new algorithm to solve scheduling optimization of a flexible manufacturing system by minimizing the penalty cost due to delay in manufacturing and maximizing the machine utilization time. Li et al. (2014) use a new search strategy based on orthogonal learning to enhance the exploitation ability of the basic cuckoo search algorithm; experimental results show that the proposed algorithm is very effective. Dey et al. (2013) propose a new approach to design a robust biomedical content authentication system by embedding the logo of a hospital within the electrocardiogram signal by means of both discrete wavelet transformation and the cuckoo search algorithm; an adaptive meta-heuristic cuckoo search algorithm is used to find the optimal scaling factor settings for logo embedding. Ehsan and Saeed (2013) use an improved cuckoo search algorithm that enhances the accuracy and convergence rate of the standard cuckoo search algorithm; the performance of the proposed algorithm is tested on some complex engineering optimization problems, including four well-known reliability optimization problems and a large-scale reliability optimization problem, a 15-unit system reliability optimization problem. However, these methods find it difficult to simultaneously achieve a balance between the exploration and exploitation of CS. Therefore, a large amount of future research is necessary to develop new effective cuckoo search algorithms for optimization problems.

To achieve both of these goals, the proposed algorithm, inspired by particle swarm optimization, makes use of the best individuals among the entire population. While PSO directly uses the global best solution of the population to determine new positions for the particles at each iteration, agents of the CS do not directly use this information;


however, the global best solution in the CS is stored at each iteration. Therefore, in the first component, the neighborhood information is added into the new population to enhance the diversity of the algorithm. In the second component, two new search strategies are used to balance the exploitation and exploration of the algorithm through a random probability rule. In addition, our algorithm has a very simple structure and thus is easy to implement. To verify the performance of the PSCS algorithm, 30 benchmark functions chosen from the literature are employed. Compared with other evolutionary algorithms from the literature, experimental

results indicate that the proposed algorithm performs better than, or is at least comparable to, state-of-the-art approaches from the literature when considering the quality of the solutions obtained. Finally, experiments have been conducted on two real-world problems. Simulation results and comparisons demonstrate that the proposed algorithm is very effective.

The rest of this paper is organized as follows: In Sect. 2 we review the basic CS and the basic PSO. The particle swarm inspired cuckoo search algorithm is presented in Sect. 3. Benchmark problems and corresponding experimental results are given in Sect. 4. Two real-world problems are given in Sect. 5. In the last section we conclude this paper and point out some future research directions.

2 Preliminaries

2.1 The standard cuckoo search algorithm

The cuckoo search algorithm was first proposed by Yang and Deb (2009). It is one of the most recent swarm intelligence-based algorithms and was inspired by the obligate brood parasitism of some cuckoo species, which lay their eggs in the nests of other host birds. The standard cuckoo search algorithm combines three principal rules. First, each cuckoo lays one egg at a time and dumps it in a randomly chosen nest. The second rule is that the best nests will be kept to the next generation. The third rule is that the host bird will discover the egg laid by a cuckoo with a probability; when this happens, the laid egg will be thrown away or the host bird will abandon the nest and build a new one. Based on these rules, the standard cuckoo search algorithm is described as follows:

At the beginning of the cuckoo search algorithm, each solution is generated randomly within the boundary range of the parameters. When generating the i-th solution in generation t + 1, denoted by x_i^(t+1), a Lévy flight is performed as follows:

x_i^(t+1) = x_i^t + α ⊕ Lévy(λ)    (1)

where α > 0 is a real number denoting the stepsize, which is related to the scale of the problem of interest, and ⊕ denotes entry-wise multiplication. A Lévy flight is a random walk in which the step lengths are distributed according to a heavy-tailed probability distribution of the following form:

Lévy ∼ u = t^(−λ), (1 < λ < 3),    (2)

which has an infinite variance and an infinite mean. Accordingly, the consecutive jumps of a cuckoo essentially form a random walk process obeying a power-law step-length distribution with a heavy tail. In this way, the process of generating new solutions can be viewed as a stochastic equation for a random walk,


which also forms a Markov chain whose next location only depends on the current location and the transition probability.
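For illustration only, the heavy-tailed law of Eq. (2) can be drawn by inverse-transform sampling of a Pareto-type distribution; this sketches the distribution itself, not the stepsize rule the algorithm actually uses (which follows Eq. (3) below). The function name and λ = 1.5 are our own choices:

```python
import numpy as np

def levy_step_lengths(n, lam=1.5, rng=None):
    """Draw n step lengths whose density decays like t**(-lam), 1 < lam < 3.

    If U ~ Uniform(0, 1), then U**(-1/(lam - 1)) has a power-law tail with
    exponent lam, so the mean is infinite for lam <= 2 and the variance is
    infinite for lam <= 3, matching the statement above.
    """
    rng = np.random.default_rng() if rng is None else rng
    u = rng.uniform(size=n)
    return u ** (-1.0 / (lam - 1.0))

# A few sample step lengths; the occasional very large jump reflects the heavy tail.
print(levy_step_lengths(5, rng=np.random.default_rng(0)))
```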

The evolution phase of x_i^t begins with the donor vector υ, where υ = x_i^t. After this step, the required stepsize value is computed using Eq. (3):

Stepsize_j = 0.01 · (u_j / v_j)^(1/λ) · (υ − X_best)    (3)

where u = t^(−λ) × randn[D] and v = randn[D], and the randn[D] function generates a Gaussian-distributed vector of dimension D. Then the donor vector υ is perturbed using Eq. (4):

υ = υ + Stepsize_j ∗ randn[D]    (4)

After producing the new solution υ_i, it is evaluated and compared with x_i. If the objective fitness of υ_i is smaller than the objective fitness of x_i, υ_i is accepted as the new base solution; otherwise, x_i is retained.
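As a concrete illustration of Eqs. (3)-(4) and the greedy selection, the sketch below uses the common Mantegna-style realisation of the (u_j/v_j)^(1/λ) term with β = 1.5; the function name, the σ_u formula and the in-place update are our assumptions, not the authors' code:

```python
import numpy as np
from math import gamma, sin, pi

def levy_flight_phase(nests, fitness, fobj, beta=1.5, rng=None):
    """One Lévy-flight pass (Eqs. 3-4) with greedy selection, Mantegna-style."""
    rng = np.random.default_rng() if rng is None else rng
    n, d = nests.shape
    best = nests[np.argmin(fitness)]
    # sigma_u chosen so that u / |v|**(1/beta) follows a Lévy-stable law
    sigma_u = (gamma(1 + beta) * sin(pi * beta / 2) /
               (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    for i in range(n):
        u = rng.normal(0.0, sigma_u, d)
        v = rng.normal(0.0, 1.0, d)
        step = u / np.abs(v) ** (1.0 / beta)
        stepsize = 0.01 * step * (nests[i] - best)          # Eq. (3)
        donor = nests[i] + stepsize * rng.normal(size=d)    # Eq. (4)
        f_new = fobj(donor)
        if f_new < fitness[i]:                              # greedy selection
            nests[i], fitness[i] = donor, f_new
    return nests, fitness
```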

The other part of the cuckoo search algorithm replaces some nests by constructing new solutions. This crossover operator is shown as follows:

υ_i = X_i + rand · (X_r1 − X_r2)    if rand_i < p_a
υ_i = X_i                           otherwise        (5)

After producing the new solution υ_i, it is again evaluated and compared with x_i; if the objective fitness of υ_i is smaller than that of x_i, υ_i is accepted as the new base solution, and otherwise x_i is retained.
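A possible vectorised sketch of the discovery-and-rebuild operator of Eq. (5); the component-wise discovery mask and p_a = 0.25 are illustrative assumptions borrowed from common CS implementations, not values taken from this paper:

```python
import numpy as np

def abandon_nests(nests, pa=0.25, rng=None):
    """Rebuild discovered nests via Eq. (5): X_i + rand * (X_r1 - X_r2)."""
    rng = np.random.default_rng() if rng is None else rng
    n, d = nests.shape
    r1 = rng.permutation(n)                         # random neighbour indices
    r2 = rng.permutation(n)
    mask = rng.uniform(size=(n, d)) < pa            # component-wise discovery test
    step = rng.uniform(size=(n, 1)) * (nests[r1] - nests[r2])
    return np.where(mask, nests + step, nests)
```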

Note that in the real world, a cuckoo's egg is harder to discover when it is more similar to the host's eggs. So the fitness is related to this difference, and that is the main reason for using a random walk in a biased way with some random stepsizes.

2.2 The particle swarm optimization algorithm (PSO)

PSO is fundamentally a stochastic, population-based search algorithm that mimics organisms interacting as a swarm, such as a school of fish or a swarm of bees looking for food. The algorithm was first proposed by Kennedy and Eberhart (1995) and is based on cooperation and competition among individuals to complete the search for the optimal solution in an n-dimensional space. The standard PSO can be described as follows: during the swarm evolution, each particle has a velocity vector V_i = (v_i1, v_i2, ..., v_iD) and a position vector X_i = (x_i1, x_i2, ..., x_iD) to guide itself to a potential optimal solution, where i is a positive integer indexing the particle in the swarm. The personal best position of particle i is denoted as pbest_i = (pbest_i1, pbest_i2, ..., pbest_iD), and the global best position of the swarm is gbest = (gbest_1, gbest_2, ..., gbest_D). The velocity V_i and the position X_i are randomly initialized in the search space and are updated with the following formulas at generation (t + 1):

V_i,j(t + 1) = ω V_i,j(t) + c_1 r_1,j [pbest_i,j(t) − X_i,j(t)] + c_2 r_2,j [gbest_j(t) − X_i,j(t)]
X_i,j(t + 1) = X_i,j(t) + V_i,j(t + 1)    (6)

where i ∈ {1, 2, ..., NP} indexes the i-th particle in the population and j ∈ {1, 2, ..., D} is the j-th dimension of this particle; NP is the population size and D is the dimension of the search space. c_1 and c_2 are acceleration constants, r_1,j and r_2,j are two random numbers uniformly distributed in [0, 1], and ω is the inertia weight used to balance global and local search ability.
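For reference, Eq. (6) translates directly into a few lines of NumPy; the inertia and acceleration values shown are typical textbook settings, not the ones used in this paper:

```python
import numpy as np

def pso_step(X, V, pbest, gbest, w=0.7, c1=1.5, c2=1.5, rng=None):
    """One velocity/position update following Eq. (6)."""
    rng = np.random.default_rng() if rng is None else rng
    r1 = rng.uniform(size=X.shape)
    r2 = rng.uniform(size=X.shape)
    V = w * V + c1 * r1 * (pbest - X) + c2 * r2 * (gbest - X)
    return X + V, V
```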

3 Our approach: particle swarm inspired cuckoo search algorithm (PSCS)

In this section, we will introduce our algorithm PSCS in detail.

3.1 The new search strategy

In the standard PSO algorithm, each particle keeps the best position pbest found by itself. Besides, each particle knows the global best position gbest found by the whole swarm and changes its velocity according to these two best positions. A high convergence speed is an important feature of the original PSO algorithm, because the usage of the global elite "gbest" imposes a strong influence on the whole swarm. The global best solution is used to guide the flight of the particles, which can be called "social learning"; in the social learning part, the individuals' behavior reflects information sharing and cooperation within the swarm. The other learning part is the cognitive learning model, which makes particles tend to return to previously found best positions; this part helps prevent the algorithm from being trapped in local optima. Inspired by social learning and cognitive learning, the two learning parts are used in the standard CS to find the neighborhood of a nest. The main model of the new search strategy can be described as follows:

υ_i,j(t + 1) = X_i,j(t) + ϑ_i,j [pbest_i,j(t) − X_i,j(t)] + ϕ_i,j [gbest_j(t) − X_i,j(t)]    (7)

where ϑ and ϕ are the parameters of the new search method. On the other hand, since the global best solution found early in the searching process may be a poor local optimum, it may attract all individuals toward a bad searching area. In this case, on complex multi-modal problems, the convergence speed of the algorithm is often very high at the beginning, but this only lasts for a few generations; after that, the search will inevitably be trapped. Therefore, on such kinds of problems, it would mislead the search toward local optima, which inhibits the advantages of the new strategy on multi-modal problems. In


this paper, taking these facts into consideration and to overcome the limitation of the fast but less reliable convergence of the above search strategy, we propose a new search strategy that utilizes the best vector of a group of q% randomly selected population members for each target vector, which can be described as follows:

υ_i,j(t + 1) = X_i,j(t) + ϑ_i,j [pbest_i,j(t) − X_i,j(t)] + ϕ_i,j [q_gbest_j(t) − X_i,j(t)]    (8)

where q_gbest is the best of the q% vectors randomly chosen from the current population, none of which is the current target vector. Under this method, the target solutions are not always attracted toward the same best position found so far in the current population, and this feature is helpful in avoiding premature convergence at local optima. In this work, the value of q% is set equal to the top 5 % of the population size.
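A minimal sketch of how the q_gbest term of Eq. (8) can be obtained, assuming minimisation and q = 5 % as stated above; the exact subset rule used in the authors' code may differ:

```python
import numpy as np

def q_gbest(population, fitness, q=0.05, rng=None):
    """Best member of a random q% subset of the population (used in Eq. 8)."""
    rng = np.random.default_rng() if rng is None else rng
    n = len(population)
    k = max(1, int(round(q * n)))                 # q% of the population, at least one
    idx = rng.choice(n, size=k, replace=False)    # random subset without repetition
    return population[idx[np.argmin(fitness[idx])]]
```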

The standard CS algorithm consists of two main components. The first component obtains new cuckoos by a random walk with Lévy flights around the best nest found so far. The required stepsize value is computed as follows:

Stepsize_j = 0.01 · (u_j / v_j)^(1/λ) · (υ − X_best)    (9)

where u = σ_u × randn[D] and v = randn[D], and the randn[D] function generates a Gaussian-distributed vector of dimension D. Then the donor vector υ can be generated as follows:

υ = υ + Stepsize_j ∗ randn[D]    (10)

Inspired by the new search strategy, we can modify the first part as follows:

υ = υ + 0.01 · (u_j / v_j)^(1/λ) · (υ − q_gbest) ∗ randn[D] + ϕ ∗ (X_r1 − q_gbest)    (11)

where r1 is a random integer index selected from {1, ..., NP} and different from i, and ϕ is the parameter of this part. From the new modified search method, we can see that the first part represents the distance between the current individual and the global best individual, while the second part represents the distance between a neighbor of the current individual and the global best individual. This new search strategy can enhance both the convergence rate and the diversity of the population, and it helps the algorithm avoid being trapped in local optima.
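One way to read Eq. (11) in code, with (u_j/v_j)^(1/λ) realised as u/|v|^(1/λ) in the usual Mantegna form and ϕ drawn as N(0.5, 0.5) following the settings reported in Sect. 4.1; this is a sketch under our assumptions, not the authors' implementation:

```python
import numpy as np

def modified_levy_update(x_i, x_r1, q_best, sigma_u, lam=1.5, rng=None):
    """Donor vector following Eq. (11).

    The (u/v)**(1/lam) term is realised as u / |v|**(1/lam), the usual
    Mantegna form, with u scaled by sigma_u as in Eq. (9).
    """
    rng = np.random.default_rng() if rng is None else rng
    d = x_i.shape[0]
    u = sigma_u * rng.normal(size=d)        # Eq. (9): u = sigma_u * randn[D]
    v = rng.normal(size=d)
    step = 0.01 * u / np.abs(v) ** (1.0 / lam)
    phi = rng.normal(0.5, 0.5, size=d)      # phi ~ N(0.5, 0.5), per Sect. 4.1
    return (x_i
            + step * (x_i - q_best) * rng.normal(size=d)
            + phi * (x_r1 - q_best))
```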

In the second component of the cuckoo search algorithm, some nests are replaced by constructing new solutions. This crossover operator is shown as follows:

υ_i = X_i + rand · (X_r1 − X_r2)    if rand_i < p_a
υ_i = X_i                           otherwise        (12)

Inspired by the new search strategy, in this section two improved search strategies are used in the second component of the cuckoo search algorithm.

υ_i,j(t + 1) = X_i,j(t) + ϑ_i,j [X_r1,j(t) − X_i,j(t)]
υ_i,j(t + 1) = X_i,j(t) + ϑ_i,j [X_r1,j(t) − X_i,j(t)] + ϕ_i,j [q_gbest_j(t) − X_i,j(t)]    (13)

The first mutation strategy is able to maintain population diversity and global search capability, but it slows down the convergence of CS algorithms. In the second mutation strategy, the best solution in the current population is very useful information that can be used to explore the region around the best vector; it also favors exploitation, since the new individual is strongly attracted toward the current best vector, and at the same time it enhances the convergence speed. However, it is easily trapped in local minima. Based on these two new search strategies, a new crossover strategy is embedded into the cuckoo search algorithm, combining the two search strategies through a random probability rule as follows:

If rand > 0.5 Then
    υ_i,j(t + 1) = X_i,j(t) + ϑ_i,j [X_r1,j(t) − X_i,j(t)]
Else
    υ_i,j(t + 1) = X_i,j(t) + ϑ_i,j [X_r1,j(t) − X_i,j(t)] + ϕ_i,j [q_gbest_j(t) − X_i,j(t)]
End If    (14)

As can be seen, which of the two strategies is used to produce the current individual depends on a uniformly distributed random value within the range (0, 1). Hence, based on the random probability rule and the two new search methods, the algorithm can balance exploitation and exploration in the search space.
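A sketch of this second component with the random probability rule of Eq. (14); ϑ ~ N(0, 0.5) and ϕ ~ N(0.5, 0.5) follow the settings reported in Sect. 4.1, while keeping the discovery probability p_a from Eq. (12) as a per-individual gate is our own assumption:

```python
import numpy as np

def abandon_phase_pscs(X, q_best, pa=0.25, rng=None):
    """Generate trial nests via Eqs. (13)-(14), gated by the discovery test of Eq. (12)."""
    rng = np.random.default_rng() if rng is None else rng
    n, d = X.shape
    V = X.copy()
    for i in range(n):
        if rng.uniform() >= pa:              # nest not discovered: leave it unchanged
            continue
        r1 = rng.integers(n)
        while r1 == i:                       # pick a neighbour different from i
            r1 = rng.integers(n)
        theta = rng.normal(0.0, 0.5, size=d)
        if rng.uniform() > 0.5:              # first strategy of Eq. (13): diversity
            V[i] = X[i] + theta * (X[r1] - X[i])
        else:                                # second strategy: pull toward q_gbest
            phi = rng.normal(0.5, 0.5, size=d)
            V[i] = X[i] + theta * (X[r1] - X[i]) + phi * (q_best - X[i])
    return V
```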

3.2 Boundary constraints

The PSCS algorithm assumes that the whole population stays in an isolated and finite space. During the search process, if some individuals move out of the bounds of the space, the original algorithm stops them on the boundary; in other words, the nest is assigned a boundary value. The disadvantage is that if there are too many individuals on the boundary, and especially when some local minimum lies on the boundary, the algorithm will lose its population diversity to some extent. To tackle this problem, we propose the following repair rule:

x_i = 2 ∗ l_i − x_i    if x_i < l_i
x_i = 2 ∗ u_i − x_i    if x_i > u_i
x_i = x_i              otherwise        (15)

where l_i and u_i are the lower and upper bounds of the i-th dimension.


3.3 Proposed PSCS algorithm

In this section, we introduce the new proposed particle swarm inspired cuckoo search algorithm, which balances exploitation and exploration. In this modified version, the new search rules are based on the best individuals among the entire population of a particular generation. In addition, PSCS has a very simple structure and thus is easy to implement and does not add any extra complexity. Moreover, this method can overcome the lack of exploration of the standard CS algorithm. The algorithm can be described as follows:
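As a rough guide only, one generation of PSCS could be organised along the following lines; this minimal sketch reuses the illustrative helpers above (q_gbest, modified_levy_update, abandon_phase_pscs, repair), and the update order, the per-individual q_gbest draw and the parameter defaults are our assumptions based on Sects. 3.1 and 3.2:

```python
import numpy as np

def pscs_generation(X, fitness, fobj, lower, upper, sigma_u,
                    pa=0.25, lam=1.5, rng=None):
    """One generation of the PSCS sketch: modified Lévy flight (Eq. 11) with
    greedy selection, then the abandon phase (Eqs. 13-14), with repair (Eq. 15)."""
    rng = np.random.default_rng() if rng is None else rng
    n, _ = X.shape
    # First component: modified Lévy flight guided by q_gbest.
    for i in range(n):
        qb = q_gbest(X, fitness, rng=rng)
        r1 = rng.integers(n)
        while r1 == i:
            r1 = rng.integers(n)
        donor = repair(modified_levy_update(X[i], X[r1], qb, sigma_u, lam, rng=rng),
                       lower, upper)
        f_new = fobj(donor)
        if f_new < fitness[i]:                       # greedy selection
            X[i], fitness[i] = donor, f_new
    # Second component: abandon phase with the random probability rule.
    qb = q_gbest(X, fitness, rng=rng)
    trials = abandon_phase_pscs(X, qb, pa=pa, rng=rng)
    for i in range(n):
        trial = repair(trials[i], lower, upper)
        f_new = fobj(trial)
        if f_new < fitness[i]:
            X[i], fitness[i] = trial, f_new
    return X, fitness
```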

We now analyze the computational complexity of the new proposed particle swarm inspired cuckoo search algorithm. As we know, almost all metaheuristic algorithms are simple in terms of complexity and thus easy to implement. The proposed particle swarm inspired cuckoo search algorithm has two stages that go through the population of size NP with dimension D, and one outer loop over the Gmax iterations. Therefore, the complexity in the extreme case is O(2 · NP · D · Gmax). For the new proposed method, the runtime complexity of finding the top 5 % globally best vector depends only on comparing the objective function value against


the previous function value. Note that the top 5 % values should be updated for each newly generated trial vector; in the worst case, this is done 2 · NP · Gmax times. Thus, the overall runtime remains O(max(2 · NP · Gmax, 2 · NP · D · Gmax)) = O(2 · NP · D · Gmax). Therefore, our algorithm does not impose any serious burden on the runtime complexity of the existing CS variants. From the parameter settings of the algorithm, we can see that 2 · NP · D is less than Gmax. The computational cost is relatively inexpensive because the algorithm complexity is linear in Gmax; the main computational cost lies in the evaluations of the objective functions.

4 Experimental results

To evaluate the performance of our algorithm, in this section the PSCS algorithm is applied to minimize a set of 30 scalable benchmark functions. These functions have been widely used in the literature. The first eight functions are unimodal; among them, the generalized Rosenbrock function ( f 05) is multimodal when D > 3, f 06 is a discontinuous step function, and f 07 is a noisy quartic function. Functions f 09– f 20 are multimodal, and the number of their local minima increases exponentially with the problem dimension. Then, for f 21– f 30, ten multimodal functions with fixed dimension, which have only a few local minima, are used in our experimental study. These problems have been widely used as benchmarks for research with different methods by many researchers. The test functions, their global optima, search ranges and initialization ranges are presented in Table 1.

4.1 Experimental setup

To evaluate the effectiveness and efficiency of the PSCS algorithm, we have chosen a suitable set of parameter values and have not made any effort to find the best parameter settings. In this experiment, we set the number of individuals to 50. The value of ϑ is drawn from a Gaussian distribution with mean 0 and standard deviation 0.5, and the value of ϕ from a Gaussian distribution with mean 0.5 and standard deviation 0.5. The value to reach (VTR) is 10^(−4) for all functions. The algorithm is coded in MATLAB 7.9, and experiments are run on a Pentium 3.0 GHz processor with 4.0 GB of memory. The benchmark functions f 01– f 18 are tested in 30 and 50 dimensions; functions f 19 and f 20 are tested in 100 and 200 dimensions. The maximum number of function evaluations is set to 300,000 for the 30D problems and 500,000 for the 50D problems of f 01– f 18. For f 19 and f 20, the maximum number of function evaluations is set to 300,000 for the 100D problems and 500,000 for the 200D problems. For all test functions, the algorithms carry out 30 independent runs. The performance of the different algorithms is statistically compared with PSCS by a non-parametric statistical test, the Wilcoxon rank-sum test for independent samples, at a significance level of 0.05. The values 1, 0 and −1 denote that the PSCS algorithm is superior to, equal to, or inferior to the compared algorithm, respectively.

Three performance criteria are chosen from the literature to evaluate the performance of the algorithms. These criteria are described as follows.

Error The error of a solution X is defined as f (X) − f (X∗), where X is the best solution found by the algorithm in a run and X∗ is the global optimum of the test function. The minimum error is recorded when the Max_NFFEs is reached in each of the 30 runs, and then the average error and the standard deviation of the error values are calculated.

NFFEs The number of fitness function evaluations (NFFEs) is also recorded when the VTR is reached. The average and standard deviation of the NFFEs values are calculated.

SR A run is considered successful if at least one solution is discovered during the run whose fitness value is not worse than the VTR before the Max_NFFEs condition terminates the trial.
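A small helper showing how these three criteria could be tabulated over the independent runs; the function name, argument layout and the NaN convention for unsuccessful runs are our own choices:

```python
import numpy as np

def summarize_runs(best_values, f_star, nffes_to_vtr):
    """Error / NFFEs / SR statistics over a set of independent runs.

    best_values  : best objective value found in each run
    f_star       : known global optimum of the test function
    nffes_to_vtr : evaluations used to reach the VTR per run (np.nan if never reached)
    """
    errors = np.asarray(best_values, dtype=float) - f_star
    nffes = np.asarray(nffes_to_vtr, dtype=float)
    reached = ~np.isnan(nffes)
    return {
        "mean_error": errors.mean(),
        "std_error": errors.std(),
        "mean_nffes": float(np.nanmean(nffes)) if reached.any() else None,
        "std_nffes": float(np.nanstd(nffes)) if reached.any() else None,
        "SR": reached.mean(),        # fraction of runs that reached the VTR
    }
```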

4.2 Experimental results

In this simulation, to examine our proposed PSCS approach on the optimization problems, we compare it with the CS algorithm and the PSO algorithm in terms of the best, worst, median, mean and standard deviation (SD) of the solutions obtained in the 30 independent runs by each algorithm. The associated results are presented in Table 2. Moreover, the two-tailed Wilcoxon rank-sum test, a well-known nonparametric statistical hypothesis test, is used to assess the significance of the differences between the PSCS algorithm and its competitors at the α = 0.05 significance level. Figures 1 and 2 then graphically present the convergence graphs for the test functions f 01– f 20, so as to show the convergence rate of the PSCS algorithm more clearly.

As can be seen in Table 2, the PSCS algorithm is significantly better than CS on nearly all the test functions. At the same time, the PSCS algorithm is better than the PSO algorithm on almost all the test functions except for functions f 16, f 20, f 21, f 23, f 24 and f 26. For f 16 with 30D and 50D, the solution accuracy obtained by the PSO algorithm is better than that of the PSCS algorithm, and for the fixed-dimension functions f 20, f 21, f 23, f 24 and f 26 the PSO algorithm is better than the other algorithms. In general, our PSCS algorithm is faster than the PSO and CS algorithms on almost all the benchmark problems. It is noted that our algorithm can find the global optima on the


Table 1 Benchmark functions used in our experimental study

Test function    Range    Optimum

f01 = Σ_{i=1}^{D} x_i^2    [−100, 100]    0
f02 = Σ_{i=1}^{D} |x_i| + Π_{i=1}^{D} |x_i|    [−10, 10]    0
f03 = Σ_{i=1}^{D} (Σ_{j=1}^{i} x_j)^2    [−100, 100]    0
f04 = max_i {|x_i|, 1 ≤ i ≤ D}    [−100, 100]    0
f05 = Σ_{i=1}^{D−1} [100(x_{i+1} − x_i^2)^2 + (x_i − 1)^2]    [−30, 30]    0
f06 = Σ_{i=1}^{D} (⌊x_i + 0.5⌋)^2    [−100, 100]    0
f07 = Σ_{i=1}^{D} i·x_i^4 + random[0, 1)    [−1.28, 1.28]    0
f08 = Σ_{i=1}^{D} |x_i|^(i+1)    [−1, 1]    0
f09 = Σ_{i=1}^{D} [x_i^2 − 10 cos(2πx_i) + 10]    [−5.12, 5.12]    0
f10 = Σ_{i=1}^{D} [y_i^2 − 10 cos(2πy_i) + 10],  y_i = x_i if |x_i| < 1/2, y_i = round(2x_i)/2 if |x_i| ≥ 1/2    [−5.12, 5.12]    0
f11 = (1/4000) Σ_{i=1}^{D} x_i^2 − Π_{i=1}^{D} cos(x_i/√i) + 1    [−600, 600]    0
f12 = 418.9828872724338 × D + Σ_{i=1}^{D} −x_i sin(√|x_i|)    [−500, 500]    0
f13 = −20 exp(−0.2 √((1/D) Σ_{i=1}^{D} x_i^2)) − exp((1/D) Σ_{i=1}^{D} cos(2πx_i)) + 20 + e    [−32, 32]    0
f14 = (π/D) {10 sin^2(πy_1) + Σ_{i=1}^{D−1} (y_i − 1)^2 [1 + 10 sin^2(πy_{i+1})] + (y_D − 1)^2} + Σ_{i=1}^{D} u(x_i, 10, 100, 4),  y_i = 1 + (x_i + 1)/4,  u(x_i, a, k, m) = k(x_i − a)^m if x_i > a; 0 if −a ≤ x_i ≤ a; k(−x_i − a)^m if x_i < −a    [−50, 50]    0
f15 = 0.1 {10 sin^2(πy_1) + Σ_{i=1}^{D−1} (y_i − 1)^2 [1 + 10 sin^2(πy_{i+1})] + (y_D − 1)^2} + Σ_{i=1}^{D} u(x_i, 10, 100, 4)    [−50, 50]    0
f16 = Σ_{i=1}^{D} |x_i · sin(x_i) + 0.1 x_i|    [−10, 10]    0
f17 = Σ_{i=1}^{D} (x_i − 1)^2 [1 + sin^2(3πx_{i+1})] + sin^2(3πx_1) + |x_D − 1| [1 + sin^2(3πx_D)]    [−10, 10]    0
f18 = Σ_{i=1}^{D} (Σ_{k=0}^{kmax} [a^k cos(2πb^k (x_i + 0.5))]) − D Σ_{k=0}^{kmax} [a^k cos(2πb^k · 0.5)],  a = 0.5, b = 3, kmax = 20    [−0.5, 0.5]    0
f19 = (1/D) Σ_{i=1}^{D} (x_i^4 − 16x_i^2 + 5x_i)    [−5, 5]    −78.33236
f20 = −Σ_{i=1}^{D} sin(x_i) sin^20(i·x_i^2/π)    [0, π]    −99.2784
f21 = [1/500 + Σ_{j=1}^{25} 1/(j + Σ_{i=1}^{2} (x_i − a_{ij})^6)]^(−1)    [−65.53, 65.53]    0.998004
f22 = Σ_{i=1}^{11} [a_i − x_1(b_i^2 + b_i·x_2)/(b_i^2 + b_i·x_3 + x_4)]^2    [−5, 5]    0.0003075
f23 = 4x_1^2 − 2.1x_1^4 + (1/3)x_1^6 + x_1·x_2 − 4x_2^2 + 4x_2^4    [−5, 5]    −1.0316285
f24 = (x_2 − (5.1/(4π^2))x_1^2 + (5/π)x_1 − 6)^2 + 10(1 − 1/(8π)) cos x_1 + 10    [−5, 10] × [0, 15]    0.398
f25 = [1 + (x_1 + x_2 + 1)^2 (19 − 14x_1 + 3x_1^2 − 14x_2 + 6x_1x_2 + 3x_2^2)] × [30 + (2x_1 − 3x_2)^2 (18 − 32x_1 + 12x_1^2 + 48x_2 − 36x_1x_2 + 27x_2^2)]    [−5, 5]    3
f26 = −Σ_{i=1}^{4} c_i exp(−Σ_{j=1}^{3} a_{ij}(x_j − p_{ij})^2)    [0, 1]    −3.86
f27 = −Σ_{i=1}^{4} c_i exp(−Σ_{j=1}^{6} a_{ij}(x_j − p_{ij})^2)    [0, 1]    −3.32
f28 = −Σ_{i=1}^{5} [(X − a_i)(X − a_i)^T + c_i]^(−1)    [0, 10]    −10.1532
f29 = −Σ_{i=1}^{7} [(X − a_i)(X − a_i)^T + c_i]^(−1)    [0, 10]    −10.4029
f30 = −Σ_{i=1}^{10} [(X − a_i)(X − a_i)^T + c_i]^(−1)    [0, 10]    −10.5364


Table 2 Best, worst, median, mean, standard deviation and success rate values achieved by CS, PSO and PSCS through 30 independent runs

No. Dim MaxFEs Methods Best Worst Median Mean Std Sig.

f 01 30 3e5 CS 3.6230e−016 3.0089e−015 6.4575e−016 1.0963e−015 9.9842e−016 +

PSO 7.8162e−043 5.9514e−040 1.3795e−041 1.2551e−040 2.1199e−040 +

PSCS 2.4574e−051 3.7160e−050 7.2574e−051 9.6819e−051 1.0311e−050

50 5e5 CS 8.0575e−018 7.4868e−017 2.4519e−017 3.2095e−017 2.1260e−017 +

PSO 9.0666e−034 8.1058e−032 1.9245e−032 3.1301e−032 2.8533e−032 +

PSCS 2.1365e−064 2.7862e−063 1.5094e−063 1.5045e−063 9.7749e−064

f 02 30 3e5 CS 4.1116e−007 1.8588e−006 7.1902e−007 8.8030e−007 4.5990e−007 +

PSO 2.4598e−029 1.4278e−026 1.3806e−027 2.5890e−027 4.2782e−027 +

PSCS 4.7960e−029 2.1835e−028 1.2241e−028 1.2865e−028 5.2708e−029

50 5e5 CS 2.6675e−008 8.4987e−008 3.7683e−008 4.2386e−008 1.6996e−008 +

PSO 8.8992e−024 3.4013e−021 1.8752e−022 5.6325e−022 1.0282e−021 +

PSCS 1.3859e−035 5.7439e−035 2.8863e−035 3.0332e−035 1.3784e−035

f 03 30 3e5 CS 0.2604 0.7744 0.5719 0.5339 0.1600 +

PSO 0.4495 3.8075 2.4630 2.2519 1.2702 +

PSCS 3.2692e−010 6.2212e−009 1.4395e−009 2.3503e−009 2.0191e−009

50 5e5 CS 21.1477 36.3775 31.7724 29.9420 5.9115 +

PSO 3.1424e+002 1.6600e+003 9.4404e+002 9.9406e+002 4.1591e+002 +

PSCS 2.1904e−005 7.5976e−005 3.6683e−005 4.0249e−005 1.8619e−005

f 04 30 3e5 CS 0.0429 0.7403 0.1308 0.1901 0.2084 +

PSO 0.1184 0.7855 0.4334 0.4527 0.1734 +

PSCS 2.0676e−009 8.4460e−009 3.6061e−009 4.1096e−009 1.8666e−009

50 5e5 CS 1.4599 5.0701 2.6698 2.7100 1.0238 +

PSO 11.7187 20.5046 13.8819 15.0708 3.1039 +

PSCS 6.8676e−011 4.1681e−010 1.0652e−010 1.5855e−010 1.1521e−010

f 05 30 3e5 CS 14.5319 19.0239 16.9995 17.0632 1.3235 +

PSO 1.3051 88.3707 22.7252 34.2397 32.0204 +

PSCS 5.8717e−007 7.1865 0.4692 1.6879 2.4024

50 5e5 CS 30.5970 38.9580 35.3056 35.2959 2.2178 +

PSO 18.0234 1.9986e+002 85.1484 1.0340e+002 53.1578 +

PSCS 9.1593 13.1634 11.7211 11.5491 1.3832

f 06 30 3e5 CS 0 0 0 0 0 =

PSO 0 0 0 0 0 =

PSCS 0 0 0 0 0

50 5e5 CS 0 0 0 0 0 =

PSO 0 2 0 0.5 0.7071 −
PSCS 0 0 0 0 0

f 07 30 3e5 CS 0.0058 0.0118 0.0089 0.0089 0.0021 +

PSO 0.0052 0.0140 0.0099 0.0098 0.0026 +

PSCS 0.0015 0.0064 0.0036 0.0037 0.0015

50 5e5 CS 0.0076 0.0239 0.0135 0.0151 0.0050 +

PSO 0.0194 0.0491 0.0311 0.0336 0.0106 +

PSCS 0.0028 0.0057 0.0041 0.0042 8.6491e−004

f 08 30 3e5 CS 3.2866e−052 1.3586e−040 1.5621e−043 1.6638e−041 4.2512e−041 +

PSO 1.9991e−084 7.8062e−072 5.0537e−077 7.8491e−073 2.4670e−072 +

PSCS 1.2825e−158 2.7048e−155 3.4185e−157 4.3501e−156 9.0819e−156

50 5e5 CS 5.4142e−049 3.6107e−041 2.2210e−043 5.0245e−042 1.1604e−041 +


PSO 1.9655e−054 3.6171e−045 2.6554e−049 6.2144e−046 1.3105e−045 +

PSCS 1.8730e−240 9.0305e−236 1.3573e−238 1.3546e−236 0

f 09 30 3e5 CS 32.9314 57.2548 51.3889 48.8168 7.4424 +

PSO 11.9395 58.7024 16.9142 23.0830 14.2556 +

PSCS 0 0 0 0 0

50 5e5 CS 58.7076 82.9198 74.2850 72.1058 9.6937 +

PSO 45.7680 1.1641e+002 72.6319 76.4127 22.3652 +

PSCS 0 0 0 0 0

f 10 30 3e5 CS 27.4882 50.6028 47.1317 43.3522 8.0868 +

PSO 2.0000 38.0000 10.5000 13.8005 10.2612 +

PSCS 0 0 0 0 0

50 5e5 CS 49.7897 81.4700 70.5342 69.7437 10.1232 +

PSO 26.0006 1.0900e+002 57.0000 56.9609 24.2640 +

PSCS 3.0419e-004 0.0749 0.0035 0.0185 0.0299

f 11 30 3e5 CS 4.2633e−012 1.4741e−008 4.6941e−010 2.8226e−009 4.9572e−009 +

PSO 0 0.0294 0.0074 0.0091 0.0093 +

PSCS 0 0 0 0 0

50 5e5 CS 0 6.5071e−008 9.0372e−014 6.5821e−009 2.0552e−008 +

PSO 0 0.0295 0.0074 0.0098 0.0102 +

PSCS 0 0 0 0 0

f 12 30 3e5 CS 2.4245e+003 3.2388e+003 2.8785e+003 2.8329e+003 2.2699e+002 +

PSO 5.9219e+002 1.5396e+003 1.0659e+003 1.1014e+003 3.1111e+002 +

PSCS 0 0 0 0 0

50 5e5 CS 4.9195e+003 5.8670e+003 5.4085e+003 5.4358e+003 3.1066e+002 +

PSO 1.4212e+003 2.7240e+003 1.9542e+003 2.0252e+003 4.3048e+002 +

PSCS 1.8190e−011 1.8190e−011 1.8190e−011 1.8190e−011 1.8190e−011

f 13 30 3e5 CS 5.4167e−006 0.04875 2.9841e−004 0.0069 0.0151 +

PSO 1.5099e−014 5.7731e−014 2.2204e−014 2.5401e−014 1.3856e−014 +

PSCS 4.4409e−015 4.4409e−015 4.4409e−015 4.4409e−015 0

50 5e5 CS 1.2864e−006 1.7013 0.0010 0.4505 0.7371 +

PSO 3.9968e−014 8.6153e−014 5.5955e−014 6.0218e−014 1.5625e−014 +

PSCS 4.4409e−015 4.4409e−015 4.4409e−015 4.4409e−015 0

f 14 30 3e5 CS 1.7209e−009 1.1688e−005 7.4907e−008 1.5756e−006 3.6447e−006 +

PSO 1.5705e−032 2.0868e−032 1.5705e−032 1.6738e−032 2.1769e−033 +

PSCS 1.5705e−032 1.5705e−032 1.5705e−032 1.5705e−032 2.8849e−048

50 5e5 CS 1.6738e−012 4.1814e−008 8.9045e−011 4.3105e−009 1.3178e−008 +

PSO 8.4711e−029 0.1243 1.6020e−025 0.0311 0.0439 +

PSCS 9.4233e−033 9.4233e−033 9.4233e−033 9.4233e−033 1.4425e−048

f 15 30 3e5 CS 5.5102e−014 5.4052e−013 1.8423e−013 2.2526e−013 1.5765e−013 +

PSO 1.4730e−032 0.0109 1.8428e−032 0.0011 0.0034 +

PSCS 1.3498e−032 1.3498e−032 1.3498e−032 1.3498e−032 2.8849e−048

50 5e5 CS 2.2501e−016 2.4223e−013 3.6753e−015 2.9364e−014 7.5003e−014 +

PSO 4.1502e−030 0.0109 2.3126e−024 0.0021 0.0046 +

PSCS 1.3498e−032 1.3498e−032 1.3498e−032 1.3498e−032 2.8849e−048

f 16 30 3e5 CS 2.2104 4.8801 2.6040 2.8829 0.8145 +

PSO 7.9936e−015 2.8421e−014 1.6876e−014 1.7130e−014 5.7590e−015 +

PSCS 6.4643e−005 1.2879e−004 1.1112e−004 1.0285e−004 2.4428e−005


50 5e5 CS 4.0940 7.2364 5.6272 5.7024 0.9171 +

PSO 78.6497 94.5041 88.3741 88.4735 4.9525 +

PSCS 0.0012 0.0028 0.0019 0.0021 4.7521e−004

f 17 30 3e5 CS 2.1001e−014 2.7145e−013 5.3615e−014 1.2637e−013 1.0928e−013 +

PSO 1.3498e−031 0.1098 1.6579e−031 0.0109 0.0347 +

PSCS 1.3498e−031 1.3498e−031 1.3498e−031 1.3498e−031 0

50 5e5 CS 5.6925e−017 1.0588e−015 4.7664e−016 4.6282e−016 3.4126e−016 +

PSO 3.4451e−031 0.1098 2.5823e−030 0.0109 0.0347 +

PSCS 1.3498e−031 1.3498e−031 1.3498e−031 1.3498e−031 0

f 18 30 3e5 CS 0.5682 1.1468 0.7483 0.8238 0.2187 +

PSO 0 6.4277e−005 1.0658e−014 1.9383e−005 2.7156e−005 +

PSCS 0 0 0 0 0

50 5e5 CS 0.5196 2.1556 1.1420 1.1565 0.4983 +

PSO 1.4120e−005 3.0001 0.0020 0.6025 1.2636 +

PSCS 0 0 0 0 0

f 19 30 3e5 CS −71.0018 −68.4713 −69.1340 −69.2652 0.7891 +

PSO −71.5467 −67.0229 −69.0020 −69.1151 1.3991 +

PSCS −78.3323 −78.3323 −78.3323 −78.3323 1.7079e−014

50 5e5 CS −69.4216 −67.4188 −68.2892 −68.2891 0.5089 +

PSO −69.9914 −66.0331 −67.1639 −67.4183 1.2711 +

PSCS −78.3323 −78.3323 −78.3323 −78.3323 3.5763e−014

f 20 30 3e5 CS −40.7363 −34.7007 −37.0229 −37.4340 1.9764 +

PSO −77.9249 −67.1982 −73.8676 −73.3498 3.3821 −

PSCS −63.6004 −59.8497 −60.9836 −61.2050 1.1910

50 5e5 CS −63.3153 −57.2110 −60.1343 −60.0120 2.0857 +

PSO −1.4525e+002 −1.3416e+002 −1.405e+002 −1.4022e+002 3.0571 −

PSCS −92.2794 −89.5850 −90.4580 −90.5335 0.8077

six test functions ( f 06, f 09, f 10, f 11, f 18 and f 19). Meanwhile, our algorithm can also find the global optimum on one further test function ( f 12) with D = 30. On test function f 08 with 50D, the objective value obtained by PSCS is smaller than 1e−230, which suggests that the result is close to the global optimal solution. For test function f 09 with 50D, the mean value obtained by PSCS equals zero, whereas those obtained by the CS and PSO algorithms are larger than 70. In Table 3, the experimental results for the fixed-dimension functions f 21– f 30 are shown; from these results, we can see that all algorithms find similar results. On the other hand, from Table 4 we can see that the PSCS algorithm requires fewer NFFEs to reach the VTR than the CS and PSO algorithms on many functions for the 30D problems. For some functions, including f 07, f 20, f 22, f 28, f 29 and f 30, none of the algorithms can reach the VTR within the Max_NFFEs.

In any case, PSCS exhibits extremely good convergence performance on almost all the benchmark functions. The performance of PSCS is highly competitive with the CS and PSO algorithms, especially for the high-dimensional problems.

4.3 Comparison with other population based algorithms

To further test the efficiency of the PSCS algorithm, it is compared with ten other well-known population-based algorithms, i.e., MABC (Akay and Karaboga 2012), GOABC (El-Abd 2012), DE (Storn and Price 1997), OXDE (Wang et al. 2011a, b), CLPSO (Liang et al. 2006), CMA-ES (Hansen and Ostermeier 2001), GL-25 (Garcia-Martinez et al. 2008), FA (Yang 2009), and FPA (Yang 2012). For the artificial bee colony, differential evolution, firefly algorithm, and flower pollination algorithm, the population size is 100; for particle swarm optimization, the population size is 50. For a fair comparison, all algorithms use the same number of function evaluations, set to 3e5 for 30D and 5e5 for 50D. The detailed experimental results are listed in Tables 5 and 7, which show the performance comparison among MABC, GOABC,


Fig. 1 The convergence rate of the function error values on f 01– f 12 (Error vs. FEs; curves for PSCS, CS and PSO)

DE, OXDE, CLPSO, CMA-ES, GL-25, FA, and FPA for f 01– f 18. We also list the rank of every algorithm in Tables 6 and 8 for 30D and 50D. From Tables 5, 6, 7 and 8, it can be observed that PSCS ranks on top for most of the benchmark functions. To be specific, PSCS is far better than OXDE, CMA-ES, FA and FPA on all the test functions.


Fig. 2 The convergence rate of the function error values on f 13– f 20 (Error vs. FEs; curves for PSCS, CS and PSO)

PSCS is superior or equal to GL-25 on some functions; GL-25 is better than PSCS on functions f 01, f 02, f 07, f 08 and f 16 at 30D. For the 50D problems, PSCS is similar to the DE algorithm on some functions; however, the DE algorithm is better than PSCS only on function f 16. As for the results of MABC on the 30D problems, PSCS is similar to it on six test functions, while MABC is better than the PSCS algorithm on one test function, f 02. In the following, we analyse the comparisons with the different algorithms in turn.

First, we compare our algorithm with MABC (Akay and Karaboga 2012) and GOABC (El-Abd 2012). The modified artificial bee colony algorithm, MABC for short, was proposed for and applied to real-parameter optimization problems. GOABC is enhanced by combining the concept of generalized opposition-based learning; this concept is introduced through the initialization step and through generation jumping, and the performance of the proposed generalized opposition-based ABC (GOABC) is compared with that of ABC. The functions were studied at D = 30 and D = 50. The results are listed in Tables 5, 6, 7 and 8 after D × 10,000 NFFEs.

As can be seen in these tables, PSCS is better than MABC on eleven out of eighteen functions in the 30D case; for the remaining functions, PSCS and MABC both find the optimal solution, except on f 02. For the 50D problems, our algorithm gives the best solution for all benchmark functions. Compared with GOABC, PSCS also obtains better solutions at 30D and 50D, except on f 11; for f 10, GOABC obtains the best solution at 50D. For dimension 30, it can be deduced that PSCS is statistically significantly better compared to all


Table 3 Best, worst, median, mean, standard deviation and success rate values achieved by CS, PSO and PSCS through 30 independent runs on fixed dimensions

No. Dim Methods Best Worst Median Mean SD Sig.

f 21 2 CS 0.9980 0.9985 0.9980 0.9981 1.3647e−004 −
PSO 0.9980 0.9980 0.9980 0.9980 1.9119e−016 −
PSCS 0.9980 0.9983 0.9980 0.9980 9.9501e−005

f 22 4 CS 7.1751e−004 0.0018 0.0010 0.0011 3.3925e−004 +

PSO 5.787e−004 0.0214 7.249e−004 0.0035 0.0069 +

PSCS 7.1628e−004 0.0013 8.5650e−004 9.0847e−004 1.9201e−004

f 23 2 CS −1.0316 −1.0316 −1.0316 −1.0316 8.2950e−008 −
PSO −1.0316 −1.0316 −1.0316 −1.0316 6.5454e−016 −
PSCS −1.0316 −1.0316 −1.0316 −1.0316 6.4580e−007

f 24 2 CS 0.3979 0.3979 0.3979 0.3979 1.8927e−006 −
PSO 0.3979 0.3979 0.3979 0.3979 0 −
PSCS 0.3979 0.3985 0.3979 0.3980 2.1092e−004

f 25 2 CS 3 3 3 3 1.0058e−008 +

PSO 3 3.0011 3.0002 3.0003 2.7243e−004 +

PSCS 3 3 3 3 3.7532e−013

f 26 3 CS −3.8628 −3.8628 −3.8628 −3.8628 4.8402e−006 +

PSO −3.8628 −3.8628 −3.8628 −3.8628 2.2035e−015 −
PSCS −3.8628 −3.8628 −3.8628 −3.8628 5.8053e−007

f 27 6 CS −3.3192 −3.3013 −3.3137 −3.3130 0.0049 +

PSO −3.3219 −3.2031 −3.2031 −3.2427 0.0570 +

PSCS −3.3213 −3.3134 −3.3160 −3.3167 0.0025

f 28 4 CS −9.9828 −9.1045 −9.7448 −9.7008 0.2715 −
PSO −10.1531 −2.6304 −5.1007 −6.6281 3.0650 +

PSCS −10.0104 −8.7811 −9.3368 −9.4557 0.4223

f 29 4 CS −10.3143 −8.5449 −10.0948 −9.7808 0.6724 +

PSO −10.4029 −1.8375 −10.4029 −8.0758 3.4499 +

PSCS −10.3814 −9.7080 −10.0669 −10.0592 0.2245

f 30 4 CS −10.3571 −8.9279 −9.7901 −9.7428 0.4723 +

PSO −10.5364 −2.4217 −10.5364 −8.9789 2.9320 +

PSCS −10.5331 −9.7516 −10.3150 −10.2130 0.2574

other algorithms. Obviously, it can be seen that PSCS is superior to all other algorithms.

Second, PSCS is compared with two other state-of-the-art DE variants, i.e., DE and OXDE (Wang et al. 2011a, b). Wang et al. (2011a) propose an orthogonal crossover operator which, based on orthogonal design, can make a systematic and rational search in a region defined by the parent solutions; experimental results show that OXDE is very effective. Tables 5, 6, 7 and 8 summarize the experimental results for 30D and 50D. As can be seen in Table 5, for the 30D problems PSCS obtains better solutions than DE and OXDE. For the 50D problems, our algorithm finds better solutions than the DE algorithm except on f 10 and f 16.

Third, to evaluate the effectiveness and efficiency of PSCS, we compare its performance with CLPSO (Liang et al. 2006), CMA-ES (Hansen and Ostermeier 2001) and GL-25 (Garcia-Martinez et al. 2008). Liang et al. propose CLPSO, a new particle swarm optimization in which a particle uses the personal historical best information of all the particles to update its velocity. Hansen and Ostermeier propose a very efficient and famous evolution strategy. Garcia-Martinez et al. propose a hybrid real-coded genetic algorithm which combines global and local search. Each method was run 30 times on each test function. Tables 5, 6, 7 and 8 summarize the experimental results for 30D and 50D. As can be seen in these tables, PSCS significantly outperforms CLPSO, CMA-ES, and GL-25: PSCS performs better than CLPSO, CMA-ES, and GL-25 on 15, 15, and 13 out of the 18 test functions for 30D, respectively. CLPSO and CMA-ES are superior or equal to PSCS on three test functions, and GL-25 is superior or equal to PSCS on five test functions. For 30D, the results are shown in the tables in terms of the mean and standard deviation


Table 4 Comparison of the NFFEs of CS, PSO and PSCS on the 30-dimension problems

No. Max_NFFEs CS PSO PSCS

Mean SD SR Mean SD SR Mean SD SR

f 01 3e5 128,190 4.0888e+003 30 185,085 2.8981e+003 30 47,580 6.3385e+002 30

f 02 3e5 228,490 5.0498e+003 30 186,520 2.7211e+003 30 60,550 1.0936e+003 30

f 03 3e5 NA NA NA NA NA NA 185,020 8.7575e+003 30

f 04 3e5 NA NA NA NA NA NA 170,780 3.2987e+003 30

f 05 3e5 NA NA NA NA NA NA 295,040 1.5684e+004 3

f 06 3e5 87,880 5.7420e+003 30 165,045 7.3227e+003 30 25,600 1.1728e+003 30

f 07 3e5 NA NA NA NA NA NA NA NA NA

f 08 3e5 12,660 1.4104e+003 30 69,180 8.4894e+003 30 6,300 8.2865e+002 30

f 09 3e5 NA NA NA NA NA NA 161,900 5.3299e+003 30

f 10 3e5 NA NA NA NA NA NA 185,450 4.0749e+003 30

f 11 3e5 184,350 1.8280e+004 30 261,985 5.0812e+004 12 58,620 2.0043e+003 30

f 12 3e5 NA NA NA NA NA NA 143,420 4.4293e+003 30

f 13 3e5 270,620 1.9708e+004 27 202,200 4.9934e+003 30 79,600 6.3133e+003 30

f 14 3e5 245,070 3.8154e+004 30 213,685 4.6077e+004 24 40,790 1.0795e+003 30

f 15 3e5 158,460 5.1055e+003 30 219,180 4.3058e+004 24 45,590 7.5048e+002 30

f 16 3e5 NA NA NA 192,275 5.5779e+003 30 299,290 1.7816e+003 6

f 17 3e5 144,990 3.9761e+003 30 190,010 3.9215e+004 28 41,600 1.1756e+003 30

f 18 3e5 NA NA NA 250,250 4.2935e+004 18 97,770 1.0551e+003 30

f 19 3e5 NA NA NA NA NA NA 141,030 5.8638e+003 30

f 20 3e5 NA NA NA NA NA NA NA NA NA

f 21 1e4 5,800 3.0422e+003 24 5,450 3.0733e+003 27 5,100 2.1155e+003 30

f 22 1e4 NA NA NA NA NA NA NA NA NA

f 23 1e4 3,420 8.0249e+002 30 8,465 2.1612e+003 17 3,820 1.2752e+003 30

f 24 1e4 3,970 9.7758e+002 30 9,455 1.0468e+003 11 7,530 1.9630e+003 24

f 25 1e4 3,330 1.2884e+003 30 9,585 1.4704e+003 5 2,780 5.6529e+002 30

f 26 1e4 3,060 8.4747e+002 30 1,525 5.3812e+002 30 2,350 9.1560e+002 30

f 27 1e4 9,990 31.622 3 NA NA NA 9,030 1.5004e+003 12

f 28 1e4 NA NA NA NA NA NA NA NA NA

f 29 1e4 NA NA NA NA NA NA NA NA NA

f 30 1e4 NA NA NA NA NA NA NA NA NA

of the solutions obtained in the 30 independent runs by each algorithm. From Table 6, we can see that PSCS provides better solutions than the other algorithms on 17, 14, and 14 out of the 18 test functions for 50D, respectively.

Finally, to further show the effectiveness of our algorithm, we increase the number of function evaluations to at least 2,000,000 with dimension 50. Since the problem-solving success of some of the algorithms used in the tests strongly depends on the size of the population, the population size is set to 30. The proposed algorithm is then compared with eight well-known algorithms. Based on the above experiments, CLPSO, GL-25 and CMA-ES are discarded from these experiments, while MABC and GOABC remain among the compared algorithms. For the DE family, we use CoDE (Wang et al. 2011a, b) instead of the standard DE and OXDE because it is very effective compared with other well-known algorithms. Simultaneously, we also add some further well-known algorithms, namely the bat algorithm (BA) (Yang and Gandomi 2012), the backtracking search optimization algorithm (BSA) (Civicioglu 2012, 2013a, b), and the bijective/surjective versions of the differential search algorithm (BDS, SDS) (Civicioglu 2012, 2013a, b). BSA uses three basic genetic operators, selection, mutation and crossover, to generate trial individuals, and has been shown to be better than some well-known algorithms. The DS algorithm simulates the Brownian-like random-walk movement used by an organism to migrate, and its performance has been compared with the performances of classical methods. These two algorithms are high-performance methods, and we therefore add them to our experiments. The statistical results are reported in Tables 9 and 10. As observed in Table 9, the proposed PSCS obtains good results on some benchmark test functions. The analysis and


Table 5 Comparisons with other algorithms on the 30-dimension problems

F f 1 f 2 f 3

Algorithm Mean SD p value Mean SD p value Mean SD p value

MABC 7.2133e−044 4.7557e−044 1 3.6944e−031 1.6797e−031 −1 3.8170e+003 1.0130e+003 1

GOABC 5.4922e−016 1.4663e−016 1 6.5650e−016 3.2782e−016 1 3.3436e+003 1.6035e+003 1

DE 1.8976e−031 2.3621e−031 1 6.7922e−016 3.8931e−016 1 3.5495e−005 3.0922e−005 1

OXDE 5.7407e−005 2.3189e−005 1 0.0089 0.0015 1 2.6084e+003 456.6186 1

CLPSO 1.2815e−023 5.8027e−024 1 1.4293e−014 3.9883e−015 1 6.4358e+002 1.5270e+002 1

CMA-ES 5.9151e−029 1.0673e−029 1 0.0132 0.0594 1 1.5514e−026 3.6118e−027 −1

GL-25 8.2615e−232 0 −1 3.1950e−038 1.3771e−037 −1 3.5100 6.1729 1

FA 9.0507e−004 1.9291e−004 1 0.0162 0.0034 1 0.0060 0.0021 1

FPA 2.9882e−009 4.2199e−009 1 1.5300e−005 6.7334e−006 1 5.4833e−007 1.3205e−006 1

PSCS 9.6819e−051 1.0311e−050 – 1.2865e−028 5.2708e−029 – 2.3503e−009 2.0191e−009 –

F F4 F5 F6

Algorithm Mean SD p value Mean SD p value Mean SD p value

MABC 0.0849 0.0106 1 25.1824 1.3538 1 0 0 0

GOABC 1.2109 3.8285 1 38.6234 24.6906 1 0 0 0

DE 0.0644 0.1704 1 3.0720 0.5762 1 0 0 0

OXDE 0.4925 0.2268 1 23.8439 0.4515 1 0 0 0

CLPSO 2.5647 0.2958 1 5.6052 3.6231 1 0 0 0

CMA-ES 3.9087e−015 4.7777e−016 −1 1.8979 2.4604 1 0 0 0

GL-25 0.3726 0.2910 1 22.0314 1.4487 1 0 0 0

FA 0.0393 0.0134 1 30.9577 16.9374 1 0 0 0

FPA 1.7694 0.6656 1 20.8044 13.2997 1 0 0 0

PSCS 4.1096e−009 1.8666e−009 – 1.6879 2.4024 – 0 0 –

F F7 F8 F9

Algorithm Mean SD p value Mean SD p value Mean SD p value

MABC 0.0114 0.0022 1 4.6951e−093 1.0199e−092 1 60.4535 4.4675 1

GOABC 0.0108 0.0046 1 8.5567e−017 7.5688e−017 1 0 0 0

DE 0.0048 0.0012 1 3.5903e−060 1.1354e−059 1 139.0106 33.9803 1

OXDE 0.0065 0.0014 1 9.4201e−025 1.7803e−024 1 93.9627 8.9225 1

CLPSO 0.0053 0.0010 1 9.2601e−080 1.0938e−079 1 3.1327e−012 5.6853e−012 1

CMA-ES 0.2466 0.0813 1 6.7414e−020 6.7206e−020 1 2.2754e+002 64.3046 1

GL-25 0.0014 5.8267e−004 −1 1.0375e−322 0 −1 19.5817 6.2866 1

FA 0.0203 0.0131 1 1.3939e−008 7.4786e−009 1 34.4259 12.6178 1

FPA 0.0119 0.0065 1 5.0197e−029 1.0228e−028 1 27.7686 5.2689 1

PSCS 0.0037 0.0015 – 4.3501e−156 9.0819e−156 – 0 0 –

F f 10 f 11 f 12

Algorithm Mean SD p value Mean SD p value Mean SD p value

MABC 44.3808 4.8644 1 0 0 0 2.0518e+003 644.3215 1

GOABC 0 0 0 0.0115 0.0178 1 11.8438 37.4534 1

DE 98.3747 27.4538 1 0 0 0 5.1481e−009 1.6278e−008 1

OXDE 70.3559 10.5847 1 0.0029 0.0035 1 1.9799e+003 697.7371 1

CLPSO 1.2276e−010 7.2195e−011 1 4.9404e−015 6.2557e−015 1 0 0 0


CMA-ES 2.4720e+002 45.9514 1 0.0014 0.0036 1 5.5215e+003 8.1119e+002 1

GL-25 34.8904 6.9122 1 2.9753e-015 7.6569e-015 1 3.5905e+003 9.6997e+002 1

FA 43.7334 19.5903 1 0.0021 5.5807e-004 1 5.2300e+003 389.8672 1

FPA 33.0036 6.2419 1 0.0116 0.0114 1 3.2972e+003 2.9941e+002 1

PSCS 0 0 – 0 0 – 0 0 –

F f 13 f 14 f 15

Algorithm Mean SD p value Mean SD p value Mean SD p value

MABC 7.9936e−015 0 1 1.5705e−032 2.8850e−048 0 1.3498e−032 2.8850e−048 0

GOABC 3.0020e−014 1.0296e−014 1 0.0124 0.0393 1 2.9888e−006 9.4515e−006 1

DE 5.1514e−015 1.4980e−015 1 2.1772e−032 7.0712e−033 1 3.8520e−032 3.9614e−032 1

OXDE 0.0026 4.6523e−004 1 2.5482e−006 1.1609e−006 1 1.9809e−005 8.4309e−006 1

CLPSO 1.1306e−012 2.7237e−013 1 1.1760e−024 8.6371e−025 1 7.3255e−024 4.5667e−024 1

CMA-ES 19.5117 0.1664 1 0.0103 0.0319 1 5.4936e−004 0.0024 1

GL-25 8.9173e−014 1.4217e−013 1 2.1809e−031 7.7133e−031 1 2.1243e−031 3.8884e−031 1

FA 0.0073 9.9154e−004 1 0.0114 0.0122 1 6.7341e−004 2.9108e−004 1

FPA 1.5676 1.0199 1 0.0622 0.1347 1 7.3713e−004 0.0028 1

PSCS 4.4409e−015 0 – 1.5705e−032 2.8849e−048 – 1.3498e−032 2.8849e−048 –

F f 16 f 17 f 18

Algorithm Mean SD p value Mean SD p value Mean SD p value

MABC 0.0053 0.0012 1 1.3498e-031 0 0 0 0 0

GOABC 3.5689e−012 8.2904e−012 −1 3.5846e−016 6.6160e−017 1 3.5527e−015 5.0243e−015 1

DE 0.0027 0.0045 1 1.3498e−031 0 0 0 0 0

OXDE 0.0253 0.0018 1 2.6546e−006 7.4999e−007 1 33.7129 2.0843 1

CLPSO 1.1762e−004 3.8038e−005 1 6.5838e−025 3.447e−025 1 0 0 0

CMA-ES 0.1496 0.2721 1 0.3164 1.3381 1 2.7869 1.9945 1

GL-25 9.9252e−006 3.8464e−005 −1 2.2374e−028 9.6446e−028 1 0.0044 0.0020 1

FA 0.0701 0.0557 1 0.2336 0.3232 1 22.1799 1.6093

FPA 0.0775 0.1881 1 0.0073 0.0283 1 2.8863 0.8868 1

PSCS 1.0285e−004 2.4428e−005 – 1.3498e−031 0 – 0 0 –

discussion of the experimental results are given in the following:

1. For MABC and GOABC, the proposed PSCS clearly performs better than these competitors on seven test functions (f 3, f 4, f 5, f 9, f 10, f 13, f 16). MABC offers the best performance on two test functions (f 2 and f 12), and GOABC obtains a better solution on f 7. For the remaining functions, our algorithm provides solutions similar to those of these algorithms. From Table 10, we can conclude that the superiority of the proposed algorithm is attributed to its new update search method. Therefore, PSCS has good exploitation ability on these functions.

2. For CoDE, the experimental results show that the proposed algorithm is better than CoDE on eight test functions, including f 2, f 5, f 9, f 10, f 14, f 15, f 16 and f 17. On f 3, f 4 and f 12, CoDE outperforms our algorithm. For the remaining functions f 1, f 6, f 7, f 8, f 11 and f 18, both algorithms obtain the same results. The reason is that the best solution in the current population is used in our algorithm, which indicates that the proposed algorithm has good exploration ability.


Table 6 Rank of different algorithms on 30D problem

F MABC GOABC DE OXDE CLPSO CMA-ES GL-25 FA FPA PSCS

f 01 3 7 4 9 6 5 1 10 8 2

f 02 2 4 5 8 6 9 1 10 7 3

f 03 10 9 4 8 7 1 6 5 3 2

f 04 5 8 4 7 10 1 6 3 9 2

f 05 8 9 3 7 4 2 6 10 5 1

f 06 1 1 1 1 1 1 1 1 1 1

f 07 7 6 3 5 4 10 1 9 8 2

f 08 3 9 5 7 4 8 1 10 6 2

f 09 7 1 9 8 3 10 4 6 5 1

f 10 7 1 9 8 3 10 5 6 4 1

f 11 1 9 1 8 5 6 4 7 10 1

f 12 6 4 3 5 1 10 8 9 7 1

f 13 3 4 2 7 6 10 5 8 9 1

f 14 1 9 3 6 5 7 4 8 10 1

f 15 1 6 3 7 5 8 4 9 10 1

f 16 6 1 5 7 4 10 2 8 9 3

f 17 1 6 3 7 5 10 4 9 8 1

f 18 1 5 1 10 1 7 6 9 8 1

Average 4.0556 5.5000 3.7778 6.9444 4.4444 6.9444 3.8333 7.6111 7.0556 1.5000

3. For FA, BA, BSA, BDS and SDS, our algorithm obtains the best solutions on all test functions compared with FA and BA. BSA provides a better solution than our algorithm only on test function f 12. BDS and SDS are two different versions of the differential search algorithm, and they produce results very similar to ours. BDS provides better solutions on functions f 12 and f 17, while our algorithm gives the best solutions on f 1, f 6, f 8, f 10, f 11, f 14, f 15, f 17 and f 18. Compared with SDS, our algorithm performs better on the test functions f 2, f 3, f 4, f 5, f 7, f 9, f 11, f 13, f 16 and f 17, whereas SDS gives a better solution on f 12. This is attributed to the fact that our algorithm uses different search methods to enlarge the search space.

Summarizing the above results, PSCS can prevent the nests from being trapped in local optima, reduce the evolution cost significantly, and converge faster.
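For readers who want to reproduce the rank-based comparison in Tables 6, 8 and 10, the ranks are obtained per function from the mean errors (ties share the smallest rank, as in the f 06 and f 11 rows) and then averaged over all functions. The following Python sketch only illustrates this procedure; the algorithm names and error values in the demo are hypothetical placeholders.

```python
# Minimal sketch of the average-rank computation used in Tables 6, 8 and 10.
# Smaller mean error is better; equal values share the smallest applicable rank.

def min_ranks(values):
    """Competition ranking: ties share the smallest rank (1 is best)."""
    sorted_vals = sorted(values)
    return [sorted_vals.index(v) + 1 for v in values]

def average_ranks(results):
    """results: dict mapping function name -> {algorithm: mean error}."""
    algorithms = sorted(next(iter(results.values())))
    totals = {alg: 0.0 for alg in algorithms}
    for per_func in results.values():
        ranks = min_ranks([per_func[alg] for alg in algorithms])
        for alg, r in zip(algorithms, ranks):
            totals[alg] += r
    return {alg: totals[alg] / len(results) for alg in algorithms}

if __name__ == "__main__":
    # Hypothetical mean errors for three algorithms on two functions.
    results = {
        "f01": {"PSCS": 1e-50, "CS": 1e-20, "PSO": 1e-10},
        "f06": {"PSCS": 0.0, "CS": 0.0, "PSO": 0.0},  # a tie: all rank 1
    }
    print(average_ranks(results))  # e.g. {'CS': 1.5, 'PSCS': 1.0, 'PSO': 2.0}
```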

5 Application to real world problems

In this section, we apply the algorithm to two well-known real-world optimization problems to verify the efficacy of the proposed algorithm.

5.1 Chaotic system

The following part of this section describes the chaotic system. Let

$$\dot{X} = F(X, X_0, \theta_0) \qquad (16)$$

be a continuous nonlinear chaotic system, where $X = (x_1, x_2, \ldots, x_N)' \in R^n$ is the state vector of the chaotic system, $\dot{X}$ is the derivative of $X$, and $X_0$ denotes the initial state. $\theta_0 = (\theta_{10}, \theta_{20}, \ldots, \theta_{d0})'$ is the vector of original parameters. Suppose the structure of system (16) is known; then the estimated system can be written as

$$\dot{X} = F(X, X_0, \theta) \qquad (17)$$

where $X = (x_1, x_2, \ldots, x_N)' \in R^n$ denotes the state vector and $\theta = (\theta_1, \theta_2, \ldots, \theta_d)'$ is the set of estimated parameters.

Based on the measurable state vector $X = (x_1, x_2, \ldots, x_N)' \in R^n$, we define the following objective (fitness) function

$$f(\theta_i^n) = \sum_{t=0}^{W} \left[\bigl(x_1(t) - x_{i,1}^n(t)\bigr)^2 + \cdots + \bigl(x_N(t) - x_{i,N}^n(t)\bigr)^2\right] \qquad (18)$$

where $t = 0, 1, \ldots, W$. The goal of estimating the parameters of chaotic system (17) is to find a suitable value of $\theta_i^n$ such that fitness function (18) is globally minimized.

To evaluate the performance of our algorithm, we applied it to a chaotic system as the standard benchmark. The Lorenz


Table 7 Comparisons with other algorithms on 50 dimension problem

F f 1 f 2 f 3

Algorithm Mean SD p value Mean SD p value Mean SD p value

MABC 3.0941e−032 1.3476e−032 1 1.1029e−025 5.4166e−026 1 4.0654e+004 3.8946e+003 1

GOABC 9.6227e−016 4.1880e−016 1 1.8941e−015 5.7908e−016 1 1.8008e+004 1.1428e+004 1

DE 6.4438e−035 9.0934e−035 1 7.6202e−018 4.5051e−018 1 2.1434 1.3166 1

OXDE 4.0583e−006 1.7326e−006 1 0.0016 3.8530e−004 1 1.2537e+004 1.7127e+003 1

CLPSO 6.0841e−011 2.3352e−011 1 1.6721e−007 2.7779e−008 1 9.7209e+003 1.3183e+003 1

CMA-ES 1.1135e−028 1.8896e−029 1 0.0011 0.0052 1 7.2663e−026 1.1403e−026 −1

GL-25 3.6608e−164 0 −1 2.9368e−008 1.2813e−007 1 1.8173e+002 1.8525e+002 1

FA 0.0035 7.2415e−004 1 0.0756 0.0335 1 0.2429 0.0671 1

FPA 2.6443e−005 2.3912e−005 1 5.0326e−005 1.9679e−005 1 0.3083 0.1823 1

PSCS 1.5045e−063 9.7749e−064 – 3.0332e−035 1.3784e−035 – 4.0249e−005 1.8619e−005 –

F F4 F5 F6

Algorithm Mean SD p value Mean SD p value Mean SD p value

MABC 6.3271 0.6280 1 48.4641 10.0716 1 0 0 0

GOABC 2.0933 5.7658 1 46.6914 0.1364 1 0 0 0

DE 4.7399 1.8562 1 21.2158 2.2015 1 0 0 0

OXDE 3.7553 1.3748 1 42.5529 2.6007 1 0 0 0

CLPSO 10.4321 0.5326 1 72.4622 26.3377 1 0 0 0

CMA-ES 5.7282e−015 6.1633e−016 −1 0.1993 0.8914 −1 0 0 0

GL-25 9.5680 1.9727 1 41.0062 0.8413 1 0 0 0

FA 0.0855 0.0071 1 95.9064 72.5433 1 0 0 0

FPA 8.6147 9.0588 1 50.4389 25.1110 1 0.2 0.4472 1

PSCS 1.5855e−010 1.1521e−010 – 11.5491 1.3832 – 0 0 –

F F7 F8 F9

Algorithm Mean SD p value Mean SD p value Mean SD p value

MABC 0.0258 0.0034 1 2.7185e−053 3.0991e−053 1 200.3499 12.3649 1

GOABC 0.01458 0.0047 1 1.0793e−016 9.4372e−017 1 0 0 0

DE 0.0062 0.0011 1 1.0607e−024 3.3519e−024 1 224.8962 54.7317 1

OXDE 0.0103 0.0034 1 4.5549e−024 1.3807e−023 1 146.7573 9.1273 1

CLPSO 0.0158 0.0042 1 1.2511e−057 2.0673e−057 1 3.4997 1.1701 1

CMA-ES 0.2713 0.1054 1 1.8078e−017 1.5782e−017 1 3.8022e+002 79.2564 1

GL-25 0.0050 0.0012 1 1.0745e−274 0 −1 49.0380 9.0639 1

FA 0.0121 0.0054 1 2.2465e−008 9.2175e−009 1 81.9855 26.1505 1

FPA 0.0578 0.0221 1 1.4688e−024 2.7240e−024 1 45.2255 11.8221 1

PSCS 0.0042 8.6491e−004 – 1.3546e−236 0 – 0 0 –

F f 10 f 11 f 12

Algorithm Mean SD p value Mean SD p value Mean SD p value

MABC 163.7688 12.2769 1 0 0 0 7.7211e+003 655.7639 1

GOABC 0 0 −1 0.00591 0.0080 1 11.8913 37.4370 1

DE 194.0885 35.8132 1 0 0 0 189.5013 186.8508 1

OXDE 137.0397 9.4330 1 5.0143e−006 2.3346e−006 1 69.8963 105.5689 1

CLPSO 9.0885 2.3566 1 3.9804e−008 4.7773e−008 1 3.3105e−011 9.1473e−012 1


CMA-ES 3.8490e+002 64.6715 1 8.6266e−004 0.0026 1 9.2754e+003 1.0321e+003 1

GL-25 78.3676 22.9800 1 2.3617e−013 8.7969e−013 1 7.5250e+003 1.1652e+003 1

FA 90.0001 10.9316 1 0.0043 3.6548e−004 1 9.2466e+003 1.0012e+003 1

FPA 49.7813 14.9183 1 0.0049 0.0075 1 6.2738e+003 3.2282e+002 1

PSCS 0.0185 0.0299 – 0 0 – 1.8190e−011 1.8190e−011 –

F f 13 f 14 f 15

Algorithm Mean SD p value Mean SD p value Mean SD p value

MABC 1.9718e−014 2.3979e−015 1 2.3891e−027 3.4561e−027 1 3.0936e−028 2.5372e−028 1

GOABC 5.5244e−014 1.0860e−014 1 9.5123e−016 6.6300e−017 1 0.1303 0.2774 1

DE 6.2172e−015 1.8724e−015 1 9.4233e−033 1.4425e−048 0 1.3498e−032 2.8850e−048 0

OXDE 4.4683e−004 6.1365e−005 1 6.6657e−008 1.8656e−008 1 1.7408e−006 9.1030e−007 1

CLPSO 1.8146e−006 1.7580e−007 1 4.1795e−012 1.3277e−012 1 7.3135e−011 2.0487e−011 1

CMA-ES 19.4765 0.1470 1 0.0062 0.0191 1 0.0016 0.0040 1

GL-25 3.9945e−009 1.7822e−008 1 0.0279 0.0621 1 0.0679 0.1293 1

FA 0.0117 0.0012 1 0.3730 0.3851 1 0.0041 9.0510e−004 1

FPA 1.3134 1.2827 1 0.0769 0.1398 1 10.5293 9.8654 1

PSCS 9.4233e−033 1.4425e−048 – 9.4233e−033 1.4425e−048 – 1.3498e−032 2.8849e−048 –

F f 16 f 17 f 18

Algorithm Mean SD p value Mean SD p value Mean SD p value

MABC 0.0316 0.0024 1 1.3498e−031 0 0 0 0 0

GOABC 3.4754e−010 8.8363e−010 −1 7.5876e−016 1.0618e−016 1 2.9842e−014 1.5639e−014 1

DE 1.7682e−010 5.4624e−010 −1 1.3498e−031 0 0 0 0 0

OXDE 0.0203 0.0092 1 1.7238e−007 1.1368e−007 1 62.9351 1.9283 1

CLPSO 0.0047 0.0011 1 3.0900e−012 9.7408e−013 1 1.3296e−004 1.8833e−005 1

CMA-ES 0.7919 1.0294 1 0.4714 0.9191 1 5.5316 2.7861 1

GL-25 4.7890e−004 0.0011 −1 4.0716e−026 1.3718e−025 1 0.1393 0.0596 1

FA 1.4922 1.0389 1 0.7896 1.0041 1 39.7721 2.1286 1

FPA 6.5539e−005 7.9779e−005 −1 0.0220 0.0491 1 7.7835 1.6791 1

PSCS 0.0021 4.7521e−004 – 1.3498e−031 0 – 0 0 –

system described below was chosen to test the performance of the algorithm. Each algorithm was run 30 times on the chaotic system. The successive W states (W = 30) of both the estimated system and the original system are used to calculate the fitness.

The well-known Lorenz (1963) system is employed as an example in this paper. The general expression of this chaotic system can be described as follows:

$$\begin{cases} \dot{x}_1 = \theta_1 (x_2 - x_1) \\ \dot{x}_2 = (\theta_2 - x_3) x_1 - x_2 \\ \dot{x}_3 = x_1 x_2 - \theta_3 x_3 \end{cases} \qquad (19)$$

where $x_1$, $x_2$ and $x_3$ are the state variables and $\theta_1$, $\theta_2$ and $\theta_3$ are unknown positive constant parameters. In the simulation, the true parameters of the Lorenz system are set to $\theta_1 = 10$, $\theta_2 = 28$ and $\theta_3 = 8/3$.

To simulate this system, the number of successive states W is set to 30, and each algorithm was run 30 times with each run limited to 100 iterations. Table 11 lists the best fitness value, the mean value, the standard deviation and the identified parameters of the Lorenz system. From Table 11, it can be seen that the best fitness value obtained by PSCS is better


Table 8 Rank of different algorithms on 50D problem

F MABC GOABC DE OXDE CLPSO CMA-ES GL-25 FA FPA PSCS

f 01 4 6 3 8 7 5 1 10 9 2

f 02 2 4 3 9 6 8 5 10 7 1

f 03 10 9 5 8 7 1 6 3 4 2

f 04 7 4 6 5 10 1 9 3 8 2

f 05 7 6 3 5 9 1 4 10 8 2

f 06 1 1 1 1 1 1 1 1 10 1

f 07 8 6 3 4 7 10 2 5 9 1

f 08 4 9 5 7 3 8 1 10 6 2

f 09 8 1 9 7 3 10 5 6 4 1

f 10 8 1 9 7 3 10 5 6 4 2

f 11 1 10 1 6 5 7 4 8 9 1

f 12 8 3 5 4 2 10 7 9 6 1

f 13 3 4 2 7 6 10 5 8 9 1

f 14 3 4 1 6 5 7 8 10 9 1

f 15 3 9 1 5 4 6 8 7 10 1

f 16 8 2 1 7 6 9 4 10 3 5

f 17 1 5 1 7 6 9 4 10 8 1

f 18 1 4 1 10 5 7 6 9 8 1

Average 4.8333 4.8889 3.3333 6.2778 5.2778 6.6667 4.7222 7.5000 7.2778 1.5556

than those obtained by CS and PSO. The mean values of the parameters identified by PSCS are also more accurate than those identified by CS and PSO.
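To make the estimation procedure concrete, the sketch below evaluates the fitness (18) for the Lorenz system (19) by integrating both the original system, with $\theta = (10, 28, 8/3)$, and a candidate system from the same initial state and summing the squared state differences over W = 30 successive states. The integrator, step size and initial state are assumptions made only for illustration; the paper does not prescribe these numerical details.

```python
import numpy as np

def lorenz_rhs(x, theta):
    """Right-hand side of the Lorenz system (19)."""
    t1, t2, t3 = theta
    return np.array([t1 * (x[1] - x[0]),
                     (t2 - x[2]) * x[0] - x[1],
                     x[0] * x[1] - t3 * x[2]])

def simulate(theta, x0, steps, dt=0.01):
    """Fixed-step Euler integration (an assumption; any integrator could be used)."""
    states = [np.asarray(x0, dtype=float)]
    for _ in range(steps):
        states.append(states[-1] + dt * lorenz_rhs(states[-1], theta))
    return np.array(states)

def fitness(theta_candidate, theta_true=(10.0, 28.0, 8.0 / 3.0),
            x0=(1.0, 1.0, 1.0), W=30):
    """Objective (18): summed squared error over W successive states."""
    true_states = simulate(theta_true, x0, W)
    est_states = simulate(theta_candidate, x0, W)
    return float(np.sum((true_states - est_states) ** 2))

if __name__ == "__main__":
    print(fitness((10.0, 28.0, 8.0 / 3.0)))  # 0.0 at the true parameters
    print(fitness((9.5, 27.0, 2.5)))         # > 0 away from them
```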

5.2 Application to spread spectrum radar poly-phase code design problem

The spread spectrum radar poly-phase code design problem is a well-known optimal design problem (Das and Suganthan 2010). It can be defined as follows:

$$\operatorname{Global\ min}\ f(\vec{X}) = \max\bigl(\varphi_1(\vec{X}), \varphi_2(\vec{X}), \ldots, \varphi_{2m}(\vec{X})\bigr)$$

where $\vec{X} = \{(x_1, \ldots, x_D) \in R^D \mid 0 \le x_j \le 2\pi,\ j = 1, \ldots, D\}$ and $m = 2D - 1$, with

$$\varphi_{2i-1}(\vec{X}) = \sum_{j=i}^{D} \cos\!\left(\sum_{k=|2i-j-1|+1}^{j} x_k\right), \quad i = 1, 2, \ldots, D$$

$$\varphi_{2i}(\vec{X}) = 0.5 + \sum_{j=i+1}^{D} \cos\!\left(\sum_{k=|2i-j|+1}^{j} x_k\right), \quad i = 1, 2, \ldots, D-1$$

$$\varphi_{m+i}(\vec{X}) = -\varphi_i(\vec{X}), \quad i = 1, 2, \ldots, m.$$
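The objective above can be evaluated directly from its definition. The sketch below is a plain transcription of $f(\vec{X}) = \max_i \varphi_i(\vec{X})$ with $m = 2D - 1$, intended only as an illustrative reference implementation rather than the exact code used in the experiments.

```python
import math

def radar_polyphase_objective(x):
    """f(X) = max(phi_1(X), ..., phi_{2m}(X)) with m = 2D - 1, 0 <= x_j <= 2*pi."""
    D = len(x)
    m = 2 * D - 1
    phi = []
    # phi_{2i-1}(X), i = 1..D
    for i in range(1, D + 1):
        s = 0.0
        for j in range(i, D + 1):
            s += math.cos(sum(x[k - 1] for k in range(abs(2 * i - j - 1) + 1, j + 1)))
        phi.append(s)
    # phi_{2i}(X), i = 1..D-1
    for i in range(1, D):
        s = 0.5
        for j in range(i + 1, D + 1):
            s += math.cos(sum(x[k - 1] for k in range(abs(2 * i - j) + 1, j + 1)))
        phi.append(s)
    # phi_{m+i}(X) = -phi_i(X), i = 1..m: append the negation of the first m values
    phi.extend([-p for p in phi[:m]])
    return max(phi)

if __name__ == "__main__":
    import random
    x = [random.uniform(0.0, 2.0 * math.pi) for _ in range(19)]  # D = 19 as in Table 12
    print(radar_polyphase_objective(x))
```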

Table 12 shows the best, worst, median, mean and standard deviation values obtained by the three algorithms over 30 independent runs. As can be seen from this table, our algorithm achieves superior performance over the other algorithms, which further demonstrates that it is a very effective optimizer for this problem.

6 Conclusions

In this paper, we propose a new particle swarm inspired cuckoo search algorithm (PSCS) to solve global optimization problems with continuous variables. The proposed algorithm modifies the update strategy by incorporating neighborhood individuals and the best individual to balance the exploitation and exploration of the search. In the first component, the neighborhood information is used to enhance the diversity of the population. In the second component, two new search strategies, switched by a random probability rule, are used to balance exploitation and exploration. In addition, our algorithm has a very simple structure and is therefore easy to implement. To verify the performance of PSCS, 30 benchmark functions chosen from the literature are employed. The results show that the proposed PSCS algorithm clearly outperforms the basic CS and PSO algorithms. Compared with several evolutionary algorithms from the literature (CLPSO, CMA-ES, GL-25, DE, OXDE, ABC, GOABC, FA and FPA), our algorithm is superior to, or at least highly competitive with, these methods. Finally, experiments have been conducted on two real-world problems. Simulation results and comparisons demonstrate that the proposed algorithm is very effective.


Table 9 Coherent comparisons with other algorithms on 50 dimension problem

F f 1 f 2 f 3

Algorithm Mean SD p value Mean SD p value Mean SD p value

MABC 0 0 0 0 0 −1 2.1639e+004 3.4719e+003 1

GOABC 4.590e−008 1.026e−007 1 1.2910e−011 2.860e−011 1 4.1824e+004 2.1844e+004 1

CoDE 0 0 0 2.6431e−176 0 1 8.9463e−048 2.2692e−047 −1

FA 7.465e−101 7.142e−102 1 0.0083 0.0163 1 2.0373e−025 3.1539e−026 1

BA 2.7120e−005 3.023e−006 1 1.5945e+004 3.2499e+004 1 3.2420e+002 7.2495e+002 1

BSA 2.201e−261 0 1 3.5564e−148 8.4640e−148 1 1.6969e−005 2.2606e−005 1

BDS 0 0 0 2.6645e-177 0 1 0.0754 0.0705 1

SDS 0 0 0 3.6358e−206 0 1 2.5610e−005 1.9755e−005 1

PSCS 0 0 − 1.1924e−272 0 − 6.7446e−046 4.2013e−046 −

F F4 F5 F6

Algorithm Mean SD p value Mean SD p value Mean SD p value

MABC 8.294e−012 7.675e−012 1 36.2419 31.0792 1 0 0 0

GOABC 0.3459 0.1311 1 4.98838e+002 8.0716e+002 1 0 0 0

CoDE 9.093e−048 2.567e−047 −1 0.3987 1.2271 1 0 0 0

FA 0.0532 0.0251 1 45.8660 0.8307 1 0 0

BA 32.0319 5.5715 1 9.5638 2.4789 1 2.9988e+004 8.0771e+003 1

BSA 0.0309 0.0266 1 0.9966 1.7711 1 0 0 0

BDS 2.293e−013 3.594e−013 1 9.8809 20.8574 1 0 0 0

SDS 1.319e−016 1.755e−016 1 5.2646e−027 1.8936e−026 1 0 0 0

PSCS 5.830e−020 1.301e−019 − 2.5590e−028 2.0639e−028 − 0 0 −

F F7 F8 F9

Algorithm Mean SD p value Mean SD p value Mean SD p value

MABC 0.0113 0.0023 1 0 0 0 27.4042 56.7605 1

GOABC 2.100e−004 1.124e−004 −1 1.4851e−017 1.2626e−017 1 1.1952 2.1599 1

CoDE 0.0013 7.535e−004 0 0 0 0 0.4975 0.9411 1

FA 0.0349 0.0272 1 1.1005e−008 4.3442e−009 1 93.9239 41.6611 1

BA 0.0699 0.0159 1 1.8885e−010 2.4897e−011 1 1.0328e+002 22.6817 1

BSA 0.0044 0.0010 1 0 0 0 0.3482 0.6674 1

BDS 0.0020 6.841e−004 1 0 0 0 0.0497 0.2224 1

SDS 0.0019 3.597e−004 −1 0 0 0 0.8457 1.2616 1

PSCS 0.0013 2.771e−004 0 0 − 0 0 −

F f 10 f 11 f 12

Algorithm Mean SD p value Mean SD p value Mean SD p value

MABC 117.4553 7.6507 1 0 0 0 1.8190e−011 0 −1

GOABC 5.0011e−009 7.2382e−009 1 0.0024 0.0055 1 23.6877 52.9674 −1

CoDE 1.8000 1.3219 1 0 0 0 1.8190e−011 0 −1

FA 1.042e+002 9.859 1 2.220e−017 4.965e−017 1 8.457e+003 3.328e+002 1

BA 4.2235e+002 1.5171e+002 1 18.7555 41.9249 1 1.0584e+004 8.503e+002 1

BSA 0 0 0 0.0013 0.0033 1 5.9219 26.4836 −1

BDS 0 0 0 0 0 0 5.9219 26.4836 −1

SDS 0 0 0 8.6131e−004 0.0038 1 1.8190e−011 0 −1

PSCS 0 0 − 0 0 − 47.3753 64.8713 −

F f 13 f 14 f 15

Algorithm Mean SD p value Mean SD p value Mean SD p value

MABC 1.1191e−014 2.0167e−015 1 9.4233e−033 1.4425e−048 0 1.3498e−032 2.8850e−048 0

GOABC 5.2013e−004 9.2477e−004 1 1.6304e−004 3.6458e−004 1 0.0624 0.1396 1

CoDE 4.4409e−015 0 0 0.0031 0.0139 1 5.4937e−004 0.0025 1

FA 5.3468e−014 1.1621e−014 1 0.0128 0.0137 1 3.3674e−005 3.0565e−005 1

BA 16.7048 0.7936 1 13.8561 19.4668 1 1.3728e+002 13.3091 1

BSA 2.7355e−014 4.5343e−015 1 9.4233e−033 1.4425e−048 0 5.4936e−004 0.0024 1

BDS 1.0302e−014 3.3157e−015 1 9.4233e−033 1.4425e−048 0 1.3498e−032 2.8850e−048 0

SDS 1.3500e−014 2.9330e−015 1 9.4233e−033 1.4425e−048 0 2.3488e−032 4.2383e−032 1

PSCS 4.4409e−015 0 − 9.4233e−033 0 − 1.3498e−032 0 −

F f 16 f 17 f 18

Algorithm Mean SD p value Mean SD p value Mean SD p value

MABC 5.9746e−028 1.7753e−027 1 1.3498e−031 0 0 0 0 0

GOABC 4.2093e−006 8.8854e−006 1 6.0368e−011 1.3498e−010 1 3.1674e−005 6.4256e−005 1

CoDE 1.0894e−014 4.3789e−014 1 0.0055 0.0246 1 0 0 0

FA 0.5996 0.2157 1 0.9538 1.6373 1 39.5324 1.4778 1

BA 4.8754 1.4660 1 1.7640e+002 2.0049e+002 1 62.7109 6.6955 1

BSA 2.8981e−023 8.9652e−023 1 1.8921e−031 4.5315e−032 1 2.8422e−015 5.8320e−015 1

BDS 7.9231e−033 3.5346e−032 −1 1.3498e−031 0 0 0 0 0

SDS 4.1633e−017 1.8619e−016 1 2.4043e−031 4.2263e−031 1 0 0 0

PSCS 4.4980e−030 9.1888e−030 − 1.3498e−031 0 − 0 0 −

Table 10 Rank of different algorithms on 50D problem for a coherent comparison

F MABC GOABC CoDE FA BA BSA BDS SDS PSCS

f 01 1 8 1 7 9 6 1 1 1

f 02 1 7 5 8 9 6 4 3 2

f 03 8 9 1 3 7 4 6 5 2

f 04 5 8 1 7 9 6 4 3 2

f 05 7 9 3 8 5 4 6 2 1

f 06 1 1 1 1 9 1 1 1 1

f 07 8 1 2 9 7 6 5 4 2

f 08 1 7 1 9 8 1 1 1 1

f 09 7 6 4 8 9 3 2 5 1

f 10 8 5 6 7 9 1 1 1 1

f 11 1 8 1 5 9 7 1 6 1

f 12 1 6 1 8 9 4 4 1 7

f 13 4 8 1 7 9 6 3 5 1

f 14 1 6 7 8 9 1 1 1 1

f 15 1 8 6 5 9 6 1 4 1

f 16 3 7 6 8 9 4 1 5 2

f 17 1 6 7 8 9 4 1 5 1

f 18 1 7 1 8 9 6 1 1 1

Average 3.3333 6.5000 3.0556 6.8889 8.5000 4.2222 2.4444 3.0000 1.6111


Table 11 The statistical results of the best fitness value, the mean value, the standard deviation and identified parameters of Lorenz system

Algorithm Means of best fitness SD of best fitness Mean value and best value obtained (in brackets) of identified parameters

θ1 θ2 θ3

PSCS 2.4995e−006 2.93660e−006 10.0000 (10.0002) 28.0000 (28.0000) 2.6667 (2.6667)

CS 1.81E−04 1.66E−04 9.9984 (10.0000) 27.9997 (28.0000) 2.6666 (2.6665)

PSO 0.11788 0.268094 10.1667 (9.9999) 28.0105 (27.9999) 2.6684 (2.6666)

Table 12 The best, worst, median, mean and the standard deviation values obtained by PSCS, CS and PSO through 30 independent runs

Dimension Algorithm Best Worst Median Mean SD

D = 19 PSCS 0.5 0.5133 0.5 0.5037 0.0059

CS 0.6868 0.8987 0.7749 0.7759 0.0872

PSO 0.5594 0.8090 0.5922 0.6477 0.1107

D = 20 PSCS 0.5 0.5982 0.5 0.5288 0.0435

CS 0.7645 0.9133 0.8750 0.8469 0.0710

PSO 0.5 1.0581 0.8274 0.7870 0.2084

In this paper, we consider only unconstrained global optimization. The algorithm can be extended to other problems, such as constrained optimization problems.

Acknowledgments This research is fully supported by the Opening Fund of Top Key Discipline of Computer Software and Theory in Zhejiang Provincial Colleges at Zhejiang Normal University under Grant No. ZSDZZZZXK37, the Fundamental Research Funds for the Central Universities (No. 11CXPY010), the Guangxi Natural Science Foundation (No. 2013GXNSFBA019263), the Science and Technology Research Projects of Guangxi Higher Education (No. 2013YB029), and the Scientific Research Foundation of Guangxi Normal University for Doctors.

References

Agrawal S, Panda R, Bhuyan S, Panigrahi BK (2013) Tsallis entropy based optimal multilevel thresholding using cuckoo search algorithm. Swarm Evol Comput 11:16–30

Akay B, Karaboga D (2012) A modified artificial bee colony algorithm for real-parameter optimization. Inf Sci 192:120–142

Burnwal S, Deb S (2013) Scheduling optimization of flexible manufacturing system using cuckoo search-based approach. Int J Adv Manuf Technol 64(5–8):951–959

Civicioglu P (2012) Transforming geocentric Cartesian coordinates to geodetic coordinates by using differential search algorithm. Comput Geosci 46:229–247

Civicioglu P (2013a) Backtracking search optimization algorithm for numerical optimization problems. Appl Math Comput 219:8121–8144

Civicioglu P (2013b) Circular antenna array design by using evolutionary search algorithms. Progr Electromagn Res B 54:265–284

Civicioglu P, Besdok E (2013) A conceptual comparison of the cuckoo-search, particle swarm optimization, differential evolution and artificial bee colony algorithms. Artif Intell Rev 39(4):315–346

Das S, Suganthan PN (2010) Problem definitions and evaluation criteria for CEC 2011 competition on testing evolutionary algorithms on real world optimization problems. Technical Report, Jadavpur University, India and Nanyang Technological University, Singapore

Dey N, Samanta S, Yang XS et al (2013) Optimisation of scaling factors in electrocardiogram signal watermarking using cuckoo search. Int J Bio Inspir Comput 5(5):315–326

Durgun I, Yildiz AR (2012) Structural design optimization of vehicle components using cuckoo search algorithm. Mater Test 54(3):185–188

Ehsan V, Saeed T (2013) Improved cuckoo search for reliability optimization problems. Comput Ind Eng 64(1):459–468

El-Abd M (2012) Generalized opposition-based artificial bee colony algorithm. IEEE Congr Evol Comput (CEC) 2012:1–4

Gandomi A, Yang X, Alavi A (2013) Cuckoo search algorithm: a metaheuristic approach to solve structural optimization problems. Eng Comput 29:17–35

Garcia-Martinez C, Lozano M, Herrera F, Molina D, Sanchez AM (2008) Global and local real-coded genetic algorithms based on parent-centric crossover operators. Eur J Oper Res 185:1088–1113

Goghrehabadi A, Ghalambaz M, Vosough A (2011) A hybrid power series–cuckoo search optimization algorithm to electrostatic deflection of micro fixed-fixed actuators. Int J Multidiscip Sci Eng 2(4):22–26

Hansen N, Ostermeier A (2001) Completely derandomized self-adaptation in evolution strategies. Evol Comput 9(2):159–195

Kennedy J, Eberhart R (1995) Particle swarm optimization. Proc IEEE Int Conf Neural Netw 4:1942–1948

Layeb A (2011) A novel quantum inspired cuckoo search for knapsack problems. Int J Bio Inspir Comput 3:297–305

Li XT, Wang JN, Yin MH (2014) Enhancing the performance of cuckoo search algorithm using orthogonal learning method. Neural Comput Appl 24(6):1233–1247

Liang JJ, Qin AK, Suganthan PN, Baskar S (2006) Comprehensive learning particle swarm optimizer for global optimization of multimodal functions. IEEE Trans Evol Comput 10(3):281–295

Lorenz EN (1963) Deterministic nonperiodic flow. J Atmos Sci 20:130–141

Ouaarab A, Ahiod B, Yang XS (2014) Discrete cuckoo search algorithm for the travelling salesman problem. Neural Comput Appl 24(7–8):1659–1669

Storn R, Price K (1997) Differential evolution: a simple and efficient heuristic for global optimization over continuous spaces. J Glob Optim 11:341–359


Tuba M, Subotic M, Stanarevic N (2011) Modified cuckoo search algorithm for unconstrained optimization problems. In: Proceedings of the 5th European conference on European computing conference (ECC'11), pp 263–268

Walton S, Hassan O, Morgan K, Brown MR (2011) Modified cuckoo search: a new gradient free optimisation algorithm. Chaos Solitons Fractals 44:710–718

Wang Y, Cai ZX, Zhang QF (2011a) Enhancing the search ability of differential evolution through orthogonal crossover. Inf Sci 185(1):153–177

Wang Y, Cai Z, Zhang Q (2011b) Differential evolution with composite trial vector generation strategies and control parameters. IEEE Trans Evol Comput 15(1):55–66

Yang XS (2009) Firefly algorithms for multimodal optimization. In: Stochastic algorithms: foundations and applications, SAGA 2009. Lecture Notes in Computer Science, vol 5792, pp 169–178

Yang XS (2012) Flower pollination algorithm for global optimization. In: Unconventional computation and natural computation. Springer, Berlin, pp 240–249

Yang XS, Deb S (2009) Cuckoo search via Levy flights. In: World Congress on nature & biologically inspired computing (NaBIC 2009). IEEE Publication, USA, pp 210–214

Yang XS, Gandomi AH (2012) Bat algorithm: a novel approach for global engineering optimization. Eng Comput 29(5):464–483

Yildiz AR, Saitou KN (2011) Topology synthesis of multicomponent structural assemblies in continuum domains. J Mech Des 133(1):011008

Yildiz AR, Solanki KN (2012) Multi-objective optimization of vehicle crashworthiness using a new particle swarm based approach. Int J Adv Manuf Technol 59(1–4):367–376

Yildiz AR (2012) A comparative study of population-based optimization algorithms for turning operations. Inf Sci 210:81–88

Yildiz AR (2013a) A new hybrid artificial bee colony algorithm for robust optimal design and manufacturing. Appl Soft Comput 13(5):2906–2912

Yildiz AR (2013b) Cuckoo search algorithm for the selection of optimal machining parameters in milling operations. Int J Adv Manuf Technol 64(1–4):55–61
