DOI 10.1007/s00521-015-2037-2
ORIGINAL ARTICLE
Received: 6 April 2015 / Accepted: 12 August 2015 / Published online: 4 September 2015
The Natural Computing Applications Forum 2015
Correspondence: Ahmad Rezaee Jordehi, ahmadrezaeejordehi@gmail.com
1 Department of Geomatics and Spatial Information Engineering, College of Engineering, University of Tehran, Tehran, Iran
2 Department of Electrical Engineering, University Putra Malaysia (UPM), Serdang, Selangor, Malaysia

Abstract  Water cycle algorithm (WCA) is a new population-based meta-heuristic technique. It is originally inspired by the idealized hydrological cycle observed in the natural environment. The conventional WCA is capable of demonstrating a superior performance compared to other well-established techniques in solving constrained and unconstrained problems. As with other meta-heuristics, premature convergence to local optima may still occur in some specific optimization tasks. Motivated by the chaos observed in real water cycle behavior, this article incorporates chaotic patterns into the stochastic processes of WCA to improve the performance of the conventional algorithm and to mitigate its premature convergence problem. First, different chaotic signal functions along with various chaotic-enhanced WCA strategies (39 meta-heuristics in total) are implemented, and the best signal is selected as the most appropriate chaotic technique for modifying WCA. Second, the chaotic algorithm is employed to tackle various benchmark problems published in the specialized literature as well as the training of neural networks. The comparative statistical results of the new technique vividly demonstrate that the premature convergence problem is relieved significantly. Chaotic WCA with the sinusoidal map and chaotic-enhanced operators not only exploits high-quality solutions efficiently but also outperforms the WCA optimizer and the other investigated algorithms.

Keywords  Chaos · Meta-heuristic · Global optimization · Water cycle algorithm

1 Introduction

Large-scale nonlinear optimization problems regularly contain multiple regional optima, which demands substantial effort to develop appropriate computation mechanisms for searching near-optimal solutions in high-dimensional and/or non-convex search spaces [1]. In this area, structural design tasks have been regarded as a particularly motivating theme in engineering for determining more effectual configurations [2, 3]. Owing to some drawbacks of classical optimization strategies, several meta-heuristics have been designed to tackle multifarious optimization tasks [4–9]. These algorithms are often inspired by characteristics of natural mechanisms such as evolution and cooperation [10]. These branches of optimizers have gained considerable momentum among researchers due to their exploration and exploitation potentials [11, 12]. To date, various nature-inspired meta-heuristics have been developed, such as the chaos optimization algorithm (COA) [13], particle swarm optimization (PSO) [14–16], the teaching–learning-based algorithm (TLBO) [17, 18], and the water cycle algorithm (WCA) [19].

The water cycle algorithm (WCA) is a recent robust meta-heuristic technique [19]. The core inspiration of the WCA comes from the idealized hydrological cycle in natural environments. Until now, WCA has been utilized in different areas to tackle diverse optimization problems [20–24]. In most cases, WCA demonstrates a superior and efficient performance compared to well-established algorithms in dealing with constrained and unconstrained problems [22, 23].
Neural Comput & Applic (2017) 28:57–85
In addition, it is evident that WCA is able to effectively enhance the quality of solutions. In 2015, an enhanced WCA (ER-WCA) was also established to tackle different constrained tasks [22]. In [22], the ER-WCA approach outperforms WCA and other well-established optimizers in solving constrained and unconstrained tasks. In general, the performance of WCA mainly depends on its exploitation tendency [22]. Therefore, despite the distinctive benefits of the WCA method, stagnation at local optima may still occur in some cases. Comparable to other meta-heuristic optimizers, this problem may be observed in circumstances where the conventional WCA is not consistently capable of managing an appropriate tradeoff between exploitation (intensification) and exploration (diversification) tendencies for some particular test problems.

The significant effect of randomization on meta-heuristic algorithms can be provided by utilizing chaotic patterns within the optimization process. The concept of chaos can be described as a deterministic, random-like process that may be observed in nonlinear, dynamic systems, which is topologically mixing, non-converging, aperiodic, and bounded [25]. It is inspiring that chaotic dynamics can also be observed in hydrological events [26]. Because of the ergodicity of chaos, chaotic maps can efficiently prevent meta-heuristics from premature convergence to local solutions by increasing the diversity of the individuals.

In 2010, Xu et al. [27] offered an enhanced artificial bee colony technique (ABC) integrated with the irregularity of chaotic patterns to tackle the UCAV trajectory planning task. The simulations affirmed that the chaotic ABC optimizer performs well as a feasible path planner. In 2011, Talatahari et al. [28] enriched the results of the imperialist competitive optimizer (ICA) by adding logistic and sinusoidal sequences to the movement steps of orthogonal ICA. In 2013, Jothiprakash [29] used chaotic patterns to create the initial populations of GA and DE for dealing with a water resource system problem. The results showed that the new hybrid methods outperform the basic GA and DE algorithms. In 2014, Gandomi et al. [30] utilized a chaotic krill herd method (KH) to optimize unconstrained problems. The simulations confirmed that employing chaotic maps instead of linearly shifting parameters is a significant adjustment of the KH optimizer. Gandomi et al. [31] showed that the bat algorithm (BA) with a sinusoidal map as its pulse rate outperforms the basic technique based on a comparison of success rate results.

In 2015, Fister et al. [32] reviewed different chaos-based firefly algorithms (FA). Li et al. [33] introduced an efficient classification technique (CGSA-SVM) based on a chaotic gravitational search algorithm (GSA). Results showed that the CGSA-SVM algorithm is capable of outperforming the PSO-SVM and GA-SVM approaches. Up to now, chaotic sequences have been hybridized with different meta-heuristic approaches such as GA [34], the big bang–big crunch algorithm (BB–BC) [35], harmony search (HS) [36], PSO [37], ant colony optimization (ACO) [38], the bat algorithm (BA) [5], the seeker optimization algorithm (SOA) [39], and the artificial immune system algorithm (AIS) [40]. While various chaotic algorithms have already been proposed, most of them have only been utilized to solve unconstrained problems, for example [5, 28, 30, 31, 35, 36, 40, 41]. Presently, no work is available in the specialized literature on enhancing the exploration and/or exploitation capacities of WCA by using the chaos paradigm.

The motivation of this article is as follows: first, the potential of chaos-embedded strategies for alleviating the premature convergence problem of WCA is investigated. Second, with regard to the chaotic features of real hydrological processes [26], this article presents a new improved WCA approach to tackle different optimization problems. In these algorithms, 13 different chaotic signals are sequentially incorporated into the WCA movement phases and raining procedure in order to enhance the performance of the conventional algorithm. These algorithms employ pseudo-random sequences in place of random variables and some fixed parameters of WCA. Finally, comparative simulations were conducted using an extensive set of unconstrained and constrained problems. The diversity of the employed problems was necessary to clearly confirm that the enhancements of the chaotic WCA variant are significant. In order to substantiate the improvements in the chaotic WCA meta-heuristic, statistical results are also provided in detail. The modified WCA can effectively be applied to multifarious optimization tasks in diverse areas of science and technology. It is expected that the new technique obtains high-quality solutions in tackling constrained and unconstrained problems and some applications such as the training of neural networks (NNs). In this regard, the following significant questions can be raised, which will be answered numerically by this article:

• Does the combination of chaotic signals with WCA relieve its premature convergence problem?
• Which of the possible chaos-based strategies are appropriate for WCA?
• Which chaotic technique shows the best performance?
• Which chaotic signal is the most appropriate one for the original algorithm?
• Does the WCA combined with the most proper chaotic strategy and the most appropriate signal outperform other well-established optimization methods?
• Does the best chaotic version also demonstrate an efficient performance for practical applications such as training of NNs?
where rand denotes a uniformly distributed random value within the range [0, 1] and C is a value between 1 and 2 (near 2). When the C value is restricted to be greater than 1, streams can move toward the rivers from various directions.

Step 6: Each river flows into the downhill place or the sea based on Eq. (9):

X_River(t + 1) = X_River(t) + rand × C × (X_Sea(t) − X_River(t))   (9)

Step 7: Exchange the positions of a river with a stream that generates a better solution.

Step 8: Similar to Step 7, if a river explores a better solution than the sea, their positions should be exchanged.

Step 9: Check the evaporation condition for unconstrained problems based on the following pseudo-code [22]:

if ||X_Sea − X_River^i|| < d_max or rand < 0.1,  i = 1, 2, 3, …, N_sr − 1
    perform raining process
end if   (10)

where d_max is a very small number. d_max is determined by the user to control the intensification level in the proximity of the best solutions. To heighten the capability of the WCA technique against constrained problems, the pseudo-code in Eq. (11) should be considered only for the streams which move directly to the sea. This assumption is proposed in order to prevent the searching procedure from premature convergence in tackling constrained tasks.

if ||X_Sea − X_Stream^i|| < d_max,  i = 1, 2, 3, …, N_Sn
    perform raining process
end if   (11)

After evaporation, the raining procedure should be performed; therefore, some new streams are created in different positions.

Step 10: If the evaporation condition is satisfied, the raining process occurs according to the following equation:

X_Stream^new(t + 1) = LB + rand × (UB − LB)   (12)

where LB and UB represent the lower and upper boundaries of the given problem, respectively. Equation (12) is only utilized in solving unconstrained problems. To improve the convergence performance of the WCA optimizer for constrained problems, Eq. (13) is only applied to the streams which flow directly into the location of the sea [with regard to the evaporation operation in Eq. (11)]:

X_Stream^new(t + 1) = X_Sea + sqrt(μ) × randn(1, N)   (13)

where the μ coefficient represents the concept of variance, which shows the range of the searching region around the best solution. In addition, randn is a normally distributed random number. Equation (13) is designed to increase the generation of streams that move straight toward the sea in order to enhance the exploration in the proximity of the best solution.

Step 11: Decrease the value of the controlling parameter d_max via Eq. (14):

d_max(t + 1) = d_max(t) − d_max(t) / Max_iteration   (14)

Step 12: Check the convergence criteria. If the termination condition is fulfilled, terminate; otherwise, return to Step 5.

The pseudo-code of the original WCA can be described as follows:

3 Chaos and chaotic maps

The concept of chaos can be described as a deterministic process observed in nonlinear, dynamic systems that is topologically mixing, non-converging, aperiodic, and bounded [42]. A dynamic system may be described by a set of equations as some unpredictable phenomenon that can gradually evolve through time from a known initial condition. In addition, it is a mathematical fact that chaotic systems exhibit sensitive dependence upon initial conditions and involved parameters [43]. In this regard, the nature of chaotic behaviors is evidently random, unpredictable, and irregular [36]. The theory of chaos has been fundamentally established based upon the observation of complex weather patterns. Furthermore, according to substantial evidence, chaos is an intrinsic characteristic of various natural processes; hence, chaotic behaviors can often be distinguished in neural systems and environmental phenomena such as rainfall patterns, climate changes, and hydrological cycles [26]. Due to these relations, the theories of nature-inspired computing and chaos are two inseparable notions. In this paper, chaos is utilized
to improve the performance of the original water cycle algorithm. The rest of this section briefly reviews non-invertible maps that generate chaotic motions for the later algorithm simulations (Fig. 1).

Fig. 1 Pseudo-code of WCA algorithm (based on [20])

(a) Chebyshev map
The family of Chebyshev maps [43] is represented via:
x_{k+1} = cos(k · cos⁻¹(x_k))   (15)
where k shows the iteration number.

(b) Circle map
The circle map [44] is defined as follows:
x_{k+1} = x_k + b − (a/2π) · sin(2π·x_k) mod (1)   (16)
For a = 0.5 and b = 0.2, the circle map generates chaotic sequences in (0, 1).

(c) Gauss/mouse map
The Gauss map is widely utilized in the literature [45] and is represented via:
x_{k+1} = 0 if x_k = 0;  1/x_k mod (1) otherwise   (17)
1/x_k mod (1) = 1/x_k − ⌊1/x_k⌋   (18)
This map also generates chaotic sequences in (0, 1).

(d) Intermittency map
The intermittency map is expressed as [46]:
x_{k+1} = ε + x_k + c·x_k^n for 0 < x_k ≤ P;  (x_k − P)/(1 − P) for P < x_k < 1   (19)
where the c parameter is calculated as follows:
c = (1 − ε − P) / P^n   (20)

(e) Iterative chaotic map (ICMIC)
The iterative map with infinite collapses (ICMIC) [37] has the following form:
x_{k+1} = sin(a·π / x_k)   (21)
where a ∈ (0, 1) is an appropriate parameter.

(f) Liebovitch map
Liebovitch et al. [43] proposed another chaotic map represented as:
x_{k+1} = α·x_k for 0 < x_k ≤ P1;  (P1 − x_k)/(P2 − P1) for P1 < x_k ≤ P2;  1 − β·(1 − x_k) for P2 < x_k ≤ 1   (22)
where α < β and these parameters are expressed as follows:
α = (P2/P1) · (1 − (P2 − P1))   (23)
β = (1/(P2 − 1)) · ((P2 − 1) − P1·(P2 − P1))   (24)
In this article, three equal intervals are chosen for the Liebovitch map.

(g) Logistic map
This classic map appears in the nonlinear dynamics of biological populations evidencing chaotic behavior [47], and is formulated as:
x_{k+1} = a·x_k·(1 − x_k)   (25)
where x_k represents the kth chaotic number and k is the iteration number. Clearly, x ∈ (0, 1) for an initial condition x_0 ∈ (0, 1) away from the excluded points of the map. In the later implementation, a = 4 is used.

(h) Piecewise map
The piecewise map can be given by [48]:
x_{k+1} = x_k / P for 0 ≤ x_k < P;
         (x_k − P)/(0.5 − P) for P ≤ x_k < 0.5;
         (1 − P − x_k)/(0.5 − P) for 0.5 ≤ x_k < 1 − P;
         (1 − x_k)/P for 1 − P ≤ x_k < 1   (26)

where x ∈ (0, 1) and P denotes the control parameter within the range (0, 0.5).

(i) Sawtooth map
This map [31] can be defined using a mod function:
x_{k+1} = 2·x_k mod (1)   (27)

(j) Sine map
The sine map [49] is a unimodal chaotic sequence represented by:
x_{k+1} = 0.25·a·sin(π·x_k),  0 < a ≤ 4   (28)

(k) Singer map
Singer's map [37] is a one-dimensional chaotic system that can be expressed as:
x_{k+1} = μ·(7.86·x_k − 23.31·x_k² + 28.75·x_k³ − 13.302875·x_k⁴)   (29)
where μ is a parameter in the interval [0.9, 1.08].

(l) Sinusoidal map
This map [31] is formulated as: …

Proposed chaotic operators try to relieve the problem of premature convergence in order to achieve better search performance in the WCA optimizer. In these algorithms, different chaotic maps are incorporated into the movement steps and raining phases of WCA in order to strengthen the downright exploration and exploitation potentials of the conventional technique. Experimental tests also confirm the advantages of combining chaotic sequences with stochastic meta-heuristic search methodologies [43].

On the other hand, decreasing the number of parameters in meta-heuristic algorithms usually improves their exploration and exploitation abilities [35]. For example, the PSO algorithm with a linearly decreasing inertia weight can explore superior solutions with a preferable performance [10], and in the HS algorithm, a chaotically decreasing pitch adjustment rate provides better results [50]. The reason for the preferable results of these mechanisms is the fact that the explorative tendency of the swarm should be at its highest level in the beginning iterations and must decrease with the passing of time. In this regard, it is known that chaotic maps in meta-heuristics can provide such a mechanism in order to improve the overall performance of the basic algorithm. These two strategies are coupled in chaotic WCA. For this purpose, chaotic linearly decreasing strategies are implemented to control the performance of WCA dynamically. The novel chaos-embedded WCA algorithms can clearly be categorized and described as follows:
Fig. 2 (continued): panels (c)–(m)
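The chaotic maps reviewed in Section 3 can be sketched as simple Python update rules (illustrative only; the paper's own experiments were run in MATLAB, and the function and parameter names below are not from the source):

```python
import math

def logistic(x, a=4.0):
    # Eq. (25): chaotic for a = 4 with x0 in (0, 1) away from excluded points
    return a * x * (1.0 - x)

def sine(x, a=4.0):
    # Eq. (28): x_{k+1} = 0.25 * a * sin(pi * x_k), 0 < a <= 4
    return 0.25 * a * math.sin(math.pi * x)

def sawtooth(x):
    # Eq. (27): x_{k+1} = 2 * x_k mod (1)
    return (2.0 * x) % 1.0

def singer(x, mu=1.073):
    # Eq. (29): Singer's map with mu in [0.9, 1.08]
    return mu * (7.86 * x - 23.31 * x ** 2 + 28.75 * x ** 3
                 - 13.302875 * x ** 4)

def iterate(step, x0, n):
    # collect n values of a chaotic sequence from initial condition x0
    seq, x = [], x0
    for _ in range(n):
        x = step(x)
        seq.append(x)
    return seq
```

Any of these single-argument update rules can be passed to `iterate` to produce the chaotic sequence chaos(t) used by the CWCA variants below.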
where C_f and C_i show the final and initial values of the linearly decreasing operator. In order to determine proper values for C_i and C_f, a trial-and-error procedure was performed. Based on these tests, the best values are chosen as 1 and 2, respectively. These values have been recognized as the most suitable values of C_i and C_f for solving all of the studied optimization problems. The symbol t represents the current iteration, t_max shows the maximum number of iterations, and chaos(t) represents the utilized chaotic equation which generates the chaotic patterns. Theoretically, it is expected that the integration of chaotic patterns with the movement operation of WCA emphasizes its exploitation at the last iterations.

4.2 WCA with chaotic evaporation and raining process

In chaotic evaporation, d_max is determined chaotically and should be located in the range [1.00E−17, 1.00E−01]. In order to form the chaotic raining process, the rand vector of Eq. (12) for unconstrained problems is modified by the chosen chaotic maps during iterations as:

X_Stream^new = LB + chaos(t) × (UB − LB)   (38)

In addition, for constrained problems, the evaporation check is based on Eq. (39), and the μ value is computed by multiplying a linearly decreasing operator by a chaotic signal as follows:

if ||X_Sea − X_Stream^i|| < chaos(t),  i = 1, 2, 3, …, N_Sn
    perform chaotic raining process
end if   (39)
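Equations (38) and (39), together with the river movement of Eq. (9), can be sketched in Python (an illustrative sketch, not the paper's MATLAB implementation; function names are assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

def move_river(x_river, x_sea, C=2.0):
    # Eq. (9): a river flows toward the sea with a rand * C step factor
    return x_river + rng.random(x_river.shape) * C * (x_sea - x_river)

def chaotic_rain(lb, ub, chaos_t):
    # Eq. (38): the uniform rand vector of Eq. (12) is replaced by the
    # chaotic value chaos(t) when a new stream is generated
    lb, ub = np.asarray(lb, float), np.asarray(ub, float)
    return lb + chaos_t * (ub - lb)

def chaotic_evaporation_check(x_sea, x_stream, chaos_t):
    # Eq. (39): constrained case -- evaporation (followed by raining) is
    # triggered when the stream lies within a chaotically varying distance
    d = np.asarray(x_sea, float) - np.asarray(x_stream, float)
    return float(np.linalg.norm(d)) < chaos_t
```

Here `chaos_t` is one value of the selected chaotic sequence at iteration t, so the raining position and evaporation threshold both inherit the map's chaotic motion.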
4.3 WCA with chaotic streams, evaporation and raining process

In the CWCA III algorithm, the CWCA I and CWCA II variants are integrated; hence, the C and μ values are not two predefined constants, and the rand vectors are also modified by the selected chaotic sequences during iterations based on the aforementioned equations (see Eqs. 33–41). It should be noted that the chaotic evaporation condition is still checked based on Eq. (37) (for unconstrained tasks) and Eq. (39) (for constrained tasks). In this chaotic algorithm, different chaotic signals may provide different diversification and intensification patterns to improve the standard WCA algorithm.

The following observations help demonstrate how the chaotic operators are theoretically efficient, either individually or cooperatively:

• The chaotic movements permit CWCA to flow streams chaotically, which emphasizes more exploration in the first steps and more exploitation in the last iterations.
• The chaos-enhanced evaporation and raining operators of CWCA linearly decrease its exploration tendency over the iterations.
• The various chaotic signals for the dynamic streams, evaporation, and raining process provide different exploration and exploitation tendencies for CWCA during optimization tasks.
• Since chaotic patterns exhibit chaotic motions, a generation of CWCA can emphasize either exploration or exploitation.
• In case of premature convergence to a regional best, the different chaotic operators help the algorithm to leave it over further iterations.
• In case of exploring an appropriate region(s) of the solution space, the exploitative tendency of CWCA is amplified in order to exploit better solutions.

5 Experimental results and discussion

Here, the performance of the different chaotic versions of the WCA optimizer is investigated in detail. For the experiments of this article, each algorithm is implemented in MATLAB R2012a (7.14) on a T6400 @ 4 GHz Intel Core(TM) 2 Duo desktop PC with 4 GB of Random Access Memory (RAM). The structure of this experiment is organized as follows: first, the best chaotic signal for the modification of CWCA is determined. Second, the best chaos-based strategy and the best signal establish the new chaos-enhanced WCA of this study, which is then used to tackle different optimization problems.

5.1 Selecting the best chaotic variant

In order to find the most appropriate chaos-based strategy for WCA, different criteria can be investigated. Based on the literature, these criteria are usually based on the statistical results, the total number of function evaluations, the success rate, or some combined method. It has been found that the success rate results of an algorithm effectively demonstrate its overall potential in finding high-quality solutions. Hence, this article explores the potential of the different chaotic methods by comparing the success rate results. Regardless of the statistical comparisons, an algorithm with inappropriate success rate results cannot be considered a reliable and/or efficient method. The success rate criterion may be expressed as:

Sr = 100 × (run_successful / run_total)   (42)

where run_total is the total number of runs and run_successful represents the number of runs that have explored the optimal solution. The criterion for a successful run can be determined as follows:

Σ_{i=1}^{Dim} (X_i^gb − X_i*)² ≤ (UB − LB) × 1.00E−08   (43)

where Dim shows the dimension of the optimization problem, X_i^gb represents the ith dimension of the global best solution explored by the algorithm, X_i* shows the recent best solution of the optimizer, and UB and LB show the upper and lower solution space bounds, respectively.

Six standard benchmark problems are employed in 30 dimensions in order to perform the first phase of the implementations [41, 51]. The features of these problems are listed in Table 1. With entirely random initial conditions, the results are obtained over 30 runs of each algorithm. The internal parameters of the chaotic and standard WCA are chosen as: Max. iteration = 1000, Nsr = 4, Npop = 50 (so that the NFEs equal 5.00E+04), and dmax = 1.00E−16 for unconstrained problems and 1.00E−05 for constrained problems. The success rates of the methods are presented in Tables 2, 3 and 4. For reliable comparisons, the overall success rates of the CWCA-based strategies in solving the different test problems are reported in Table 5.

From Table 5, some significant points can be recognized as follows:

• The performance of WCA with chaotic streams, evaporation, and raining process (CWCA III) is significantly superior to the other chaotic strategies.
• Based on the success rate results, the sinusoidal map is the most appropriate map for the CWCA III technique.
• For CWCA I and CWCA II, the most efficacious signals are the ICMIC and sinusoidal maps, respectively.
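The success-rate criterion of Eqs. (42) and (43) can be sketched as follows (an illustrative Python snippet; the tolerance 1.00E−08 follows Eq. (43), and the function names are assumptions):

```python
import numpy as np

def run_successful(x_gb, x_best, lb, ub, tol=1e-8):
    # Eq. (43): a run succeeds when the squared distance between the global
    # best solution and the run's best solution is within (UB - LB) * 1e-8
    x_gb = np.asarray(x_gb, dtype=float)
    x_best = np.asarray(x_best, dtype=float)
    return float(np.sum((x_gb - x_best) ** 2)) <= (ub - lb) * tol

def success_rate(successes, total_runs):
    # Eq. (42): Sr = 100 * (successful runs / total runs)
    return 100.0 * successes / total_runs
```

With 30 independent runs per algorithm, `success_rate(n, 30)` reproduces the entries of Tables 2–4 from per-run outcomes.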
Table 1 Benchmark problems (C characteristic, U unimodal, M multimodal, S separable, N non-separable, GM global minimum)

F01 Ackley (C: MN): f(X) = −20·exp(−0.02·sqrt((1/n)·Σ_{i=1}^{n} x_i²)) − exp((1/n)·Σ_{i=1}^{n} cos(2π·x_i)) + 20 + e; bounds [−31, 32]; GM 0.00
F02 Griewank (C: MN): f(X) = 1 + (1/4000)·Σ_{i=1}^{n} x_i² − Π_{i=1}^{n} cos(x_i/√i); bounds [−150, 150]; GM 0.00
F03 Rosenbrock (C: UN): f(X) = Σ_{i=1}^{n−1} [100·(x_{i+1} − x_i²)² + (x_i − 1)²]; bounds [−5, 5]; GM 0.00
F04 Rastrigin (C: MS): f(X) = Σ_{i=1}^{n} (x_i² − 10·cos(2π·x_i) + 10); bounds [−10, 10]; GM 0.00
F05 Penalized (C: MN): f(X) = (π/n)·{10·sin²(π·y_1) + Σ_{i=1}^{n−1} (y_i − 1)²·[1 + 10·sin²(π·y_{i+1})] + (y_n − 1)²} + Σ_{i=1}^{n} u(x_i, 10, 100, 4), where y_i = 1 + 0.25·(x_i + 1) and u(x_i, a, k, m) = k·(x_i − a)^m if x_i > a; 0 if −a ≤ x_i ≤ a; k·(−x_i − a)^m if x_i < −a; bounds [−50, 50]; GM 0.00
F06 Dixon–Price (C: UN): f(X) = (x_1 − 1)² + Σ_{i=2}^{n} i·(2·x_i² − x_{i−1})²; bounds [−10, 10]; GM 0.00
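Two entries of Table 1 can be written directly from their formulas as a quick sanity check of the stated global minima (a Python sketch; the paper's experiments used MATLAB):

```python
import numpy as np

def rastrigin(x):
    # F04: f(X) = sum(x_i^2 - 10*cos(2*pi*x_i) + 10); global minimum 0 at x = 0
    x = np.asarray(x, dtype=float)
    return float(np.sum(x ** 2 - 10.0 * np.cos(2.0 * np.pi * x) + 10.0))

def rosenbrock(x):
    # F03: f(X) = sum(100*(x_{i+1} - x_i^2)^2 + (x_i - 1)^2);
    # global minimum 0 at x = (1, ..., 1)
    x = np.asarray(x, dtype=float)
    return float(np.sum(100.0 * (x[1:] - x[:-1] ** 2) ** 2
                        + (x[:-1] - 1.0) ** 2))
```

Evaluating these at the known optima returns the GM column of Table 1, which is how the success criterion of Eq. (43) can be verified for a given run.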
Table 2 Success rate of CWCA I with different chaotic maps for benchmark functions

Chaotic map    F01  F02  F03  F04  F05  F06
Chebyshev       04   04   03    0   07   07
Circle          02   18   41   04   12   19
Gauss/mouse     72   24   49   87   17   28
Intermittency   73   60   38   71   34   14
ICMIC           97   71   42   92   27   21
Liebovitch      89   54   21   91   22   17
Logistic        83   44   39   71   27   18
Piecewise       85   47   39   89   29   31
Sawtooth        04   07   01    0    0    0
Sine            85   62   14   15   29   34
Singer          87   70   04   44   32   35
Sinusoidal      92   38   37   71   33   35
Tent            45   30   19   44   24   17

Table 3 Success rate of CWCA II with different chaotic maps for benchmark functions

Chaotic map    F01  F02  F03  F04  F05  F06
Chebyshev       01   10   07   02    0   03
Circle          45   27   04   49   07   21
Gauss/mouse     71   20   31   78   18   27
Intermittency   75   51   36   67   14   27
ICMIC           88   27   07   72   15   35
Liebovitch      89   48   15   84   12   30
Logistic        74   37   17   13   24   25
Piecewise       87   40   25   74   19   15
Sawtooth        01   02    0    0   09   04
Sine            81   63   30   42   17   34
Singer          64   31   28   27   18   32
Sinusoidal      95   61   39   51   20   34
Tent            34   20   17   05   16   14
Table 4 Success rate of CWCA III with different chaotic maps for benchmark functions

Chaotic map    F01  F02  F03  F04  F05  F06
Chebyshev       01   16   04    0   11   24
Circle          54   38   20   57   13   17
Gauss/mouse     77   32   29   51   24   14
Intermittency   92   84   62   84   20   18
ICMIC          100   51   44   98   17   21
Liebovitch      94   62   27   87   28   25
Logistic        74   64   31   44   12   36
Piecewise      100   47   41   17   22   34
Sawtooth        08   10    0   07   10   09
Sine            94   72   34   24   13   05
Singer          81   79   57   72   18   22
Sinusoidal      94   92   59   87   27   35
Tent            61   37   27   17   12   18

Table 5 Comparison of the overall success rates for all benchmark problems

Chaotic map name   CWCA I   CWCA II   CWCA III
Chebyshev              25        23         56
Circle                 96       153        199
Gauss/mouse           277       245        227
Intermittency         290       270        360
ICMIC                350a       244        331
Liebovitch            294       278        323
Logistic              282       190        261
Piecewise             320       260        261
Sawtooth               12        16         44
Sine                  239       267        242
Singer                272       200        329
Sinusoidal            306      300a       394a
Tent                  179       106        172

a The best maps (the overall success rate of the standard method is 371)

…95 % significance level (i.e., T+ and T− are described in [52]). To investigate the preciseness and robustness of the proposed strategy, CWCA III with the sinusoidal map is statistically compared to the other chaotic versions and the standard WCA [19] for all benchmark problems described in Table 1. In all cases, simulations are carried out over 30 independent runs. In Tables 6, 7, 8, 9, 10, and 11, the obtained statistics of the implemented optimizers are presented. In these tables, the better results are highlighted in bold. Note that the optimization results of CWCA I (with the ICMIC map) and CWCA II (with the sinusoidal signal) are also reported in these tables.

Due to the simplicity of these benchmarks, the original optimizer (with Npop = 50, Nsr = 4, dmax = 1.00E−16, Max. iteration = 1000) demonstrates a competent performance in solving the F01–F06 problems. For the F01 problem, the original WCA can provide the best solutions with zero standard deviation (SD) for dimension 30 and even higher dimensions. However, while CWCA III is not statistically better than the original method in solving the F02 or F05 problems, the success ratio and running time of the improved method are still slightly better than those of the WCA technique in all of the tests. For instance, the CPU time spent by CWCA III in solving F01 is 16.21 s, compared to 17.04 s for WCA (with 5.00E+04 NFEs). The overall success rate of CWCA III is 394, while this value for WCA is 371. With respect to these values and the robust performance of the standard WCA, it can be recognized that the conventional WCA with proper chaotic operations is still capable of demonstrating better performance on unconstrained problems. These results also affirm the improved exploration and exploitation capacities of CWCA III for tackling unconstrained tasks.

From Tables 6, 7, 8, 9, 10, and 11, it can be seen that the improved CWCA III demonstrates better performance in terms of CPU running time compared to the other chaotic methods. Simulations verify that the new CWCA III method is also capable of statistically outperforming the CWCA I and CWCA II techniques. The improvement is apparent in the average, minimum, maximum, and standard deviation of the achieved optimal solutions. The results in Table 12 also confirm that CWCA has explored statistically better results compared to the other chaotic techniques, with a statistical significance value α = 0.05.

5.3 Solving constrained benchmark problems

In this section, the new improved algorithm is benchmarked via 13 well-known constrained problems (G1–G13). The equations of these constrained problems are presented in the "Appendix" section. The details of the implemented problems are summarized in Table 13. The objective values are obtained by performing 30 independent runs of the algorithms. The best results computed by CWCA and WCA for G1–G13 are reported in Tables 14 and 15, respectively.

The best solutions in these tables indicate the capability of a strategy to explore the optimal result, while the average results and standard deviation values reflect the robustness of the meta-heuristic. From the results in Table 14, it can be vividly recognized that the new improved algorithm is able to explore the global solutions of all benchmark problems with a robust performance. The results of WCA are presented in Table 15. From Tables 14 and 15, it can be seen that the chaotic algorithm reaches a better or equal average running time compared to the original WCA. In addition, the last column in Table 14 shows the statistical significance
level derived from a paired one-tailed Student's t test [52] with 98 degrees of freedom (df) at the 0.05 level of significance between the conventional and chaotic methods. With regard to the obtained statistical results, the new strategy demonstrates an improved performance on 10 benchmarks compared to the WCA optimizer. Table 16 shows the statistical results of the other investigated algorithms in comparison with the obtained results of CWCA III. Considering the best solutions in Table 16, it can be recognized that the new improved algorithm can obtain practically high-quality or comparable solutions for these problems compared to the previously proposed techniques.
Table 14 Best results obtained by the new method for constrained benchmark problems
(columns: ID; Global optimum; CWCA — Best, Average, Worst, SD, Average run time (s); CWCA versus WCA — p value, Paired t test)
Table 15 Best results obtained by the WCA optimizer for constrained benchmark problems
(columns: ID; Global optimum; WCA — Best, Average, Worst, SD, Average run time (s))
5.4 Solving constrained engineering problems

Due to the nonlinearity of multiconstrained design problems, several algorithms have been developed to solve them efficiently. In this section, CWCA III is evaluated against previously reported robust methods on a set of well-studied engineering problems: pressure vessel design, welded beam design, tension–compression string design, and the Himmelblau problem. These design test cases have previously been optimized by a wide variety of meta-heuristic strategies; hence, they are also appropriate for statistically substantiating the effectiveness of the new improved CWCA III technique. In these implementations, the same constraint-handling methodology as in the conventional WCA meta-heuristic [19] is employed to tackle the constrained problems. The rest of this section is devoted to the application of CWCA III to the following problems.

5.4.1 Pressure vessel design

The first test case is pressure vessel design, a well-known structural optimization task. A schematic of a cylindrical vessel capped on both ends by hemispherical heads is sketched in Fig. 3. This air tank has a minimum volume of 750 ft³ and a working pressure of 3000 psi, and is designed according to the American Society of Mechanical Engineers (ASME) boiler and pressure vessel code. The main objective is to minimize the total cost, which combines the welding, material, and forming costs. The design variables are the thickness of the pressure vessel, Ts = x1, the thickness of the head, Th = x2, the inner radius of the vessel, R = x3, and the length of the vessel without the heads, L = x4. Ts and Th are integer multiples of 0.0625 in., the available thicknesses of rolled steel plates, while R and L are continuous. Therefore, the mathematical model of the pressure vessel problem can be expressed as follows:

min_X f(X) = 0.6224 x1 x3 x4 + 1.7781 x2 x3² + 3.1661 x1² x4 + 19.84 x1² x3    (44)

s.t.
g1(X) = −x1 + 0.0193 x3 ≤ 0
g2(X) = −x2 + 0.00954 x3 ≤ 0
g3(X) = −π x3² x4 − (4/3) π x3³ + 1296000 ≤ 0    (45)
g4(X) = x4 − 240 ≤ 0

In several works, this problem has been solved considering the following variable bounds:

Region I: 1 × 0.0625 ≤ x1, x2 ≤ 99 × 0.0625, 10 ≤ x3, x4 ≤ 200    (46)

Up to now, several techniques have been utilized to solve the described problem, including improved ACO [61], genetic adaptive search [62], the branch-and-bound technique [63], a GA-based co-evolution model [64], the augmented Lagrangian multiplier method [65], co-evolutionary PSO [66], evolution strategies [67], feasibility-based tournament selection [68], CS [69], a hybrid algorithm based on PSO with passive congregation [70], and quantum-behaved PSO [71]. The comparative results under Region I are given in Table 17, while the corresponding statistical results are summarized in Table 18. Referring to the results, the solutions yielded by CWCA III are better compared
Table 16 Statistical features of the best results computed by different approaches (NA means not available). Columns: ID and Optimum; Statistical features; Montes and Coello [55]; Becerra and Coello [56]; Gandomi et al. [54]; Tessema and Yen [57]; Hedar and Fukushima [58]; Koziel and Michalewicz [59]; Cabrera and Coello [60]; Present study.

Fig. 3 Pressure vessel. a Schematic, b stress heat map, c displacement heat map [51]
to the original WCA and the other investigated optimizers. Hence, the improved CWCA III strategy demonstrates a competent performance in tackling this constrained problem.

In order to examine the entire constrained domain, the upper limit of the variable x4 is extended to 240 [82], i.e., the range of decision variables is updated to:

Region II: 1 × 0.0625 ≤ x1, x2 ≤ 99 × 0.0625, 10 ≤ x3 ≤ 200, 10 ≤ x4 ≤ 240    (47)

In this region, Gandomi et al. [81], Dimopoulos [82], Mahdavi et al. [50], and Hedar and Fukushima [58] solved the problem by the previously mentioned methods. Table 17 presents the best solution vectors attained by different methods, including CWCA III, in Region II. From Table 18, it can be seen that the performance of CWCA III is still better than that of some of the other optimizers, and the solutions obtained by the modified WCA are preferable to those of the WCA strategy. The time elapsed for one run of the CWCA III optimizer in Regions I and II is 7.901 s and 8.127 s, respectively.

5.4.2 Welded beam design

Welded beam design, depicted in Fig. 4, is a fundamental engineering problem frequently used throughout the literature to evaluate the performance of different optimizers [83]. The primary objective is to find the minimum fabrication cost of the welded beam structure subject to constraints on shear stress (τ), bending stress (σ), buckling load (Pc), end deflection (δ), and side restrictions. The problem variables are the thickness of the weld h = x1, the length of the welded joint l = x2, the width of the beam t = x3, and the thickness of the beam b = x4; hence, the decision vector is X = (h, l, t, b) = (x1, x2, x3, x4). For this problem, the mathematical formulation is:

min_X f(X) = 1.10471 x1² x2 + 0.04811 x3 x4 (14 + x2)    (48)

s.t.
g1(X) = τ(X) − τmax ≤ 0
g2(X) = σ(X) − σmax ≤ 0
g3(X) = x1 − x4 ≤ 0
g4(X) = 0.125 − x1 ≤ 0    (49)
g5(X) = δ(X) − 0.25 ≤ 0
g6(X) = P − Pc(X) ≤ 0

0.1 ≤ x1 ≤ 2, 0.1 ≤ x2 ≤ 10, 0.1 ≤ x3 ≤ 10, 0.1 ≤ x4 ≤ 2    (50)

where τ is the shear stress in the weld, τmax is the maximum shear stress of the weld, σ is the normal stress on the beam, σmax is the allowable normal stress of the beam material, Pc stands for the bar buckling load, P is the applied load, and δ is the beam end deflection. The shear stress τ has two main components, a primary stress (τ1) and a secondary stress (τ2).
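For reference, the pressure vessel model of Eqs. (44)–(45) can be evaluated directly; a minimal sketch (the candidate vector below is an illustrative feasible point, not a reported optimum):

```python
import math

def pressure_vessel(x):
    """Objective (Eq. 44) and constraints g1..g4 (Eq. 45); feasible iff all g <= 0."""
    x1, x2, x3, x4 = x
    f = (0.6224 * x1 * x3 * x4 + 1.7781 * x2 * x3**2
         + 3.1661 * x1**2 * x4 + 19.84 * x1**2 * x3)
    g = [
        -x1 + 0.0193 * x3,                                         # shell thickness
        -x2 + 0.00954 * x3,                                        # head thickness
        -math.pi * x3**2 * x4 - (4.0 / 3.0) * math.pi * x3**3 + 1296000.0,  # volume
        x4 - 240.0,                                                # length limit
    ]
    return f, g

# Illustrative candidate inside Region I (x1 = 18*0.0625, x2 = 10*0.0625):
f, g = pressure_vessel([1.125, 0.625, 50.0, 100.0])
print(f, all(gi <= 0 for gi in g))
```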
Table 17 Comparison of the best solutions for the pressure vessel design problem found by different methods. Columns: Region; Method; Design variables x1, x2, x3, x4; Cost f(X).

Fig. 4 Welded beam design. a Schematic, b stress heat map, c displacement heat map [51]
simulation results are presented in Table 20. With regard to Table 20, the best, average, worst, standard deviation, and median of the solutions achieved by CWCA III are better than those of WCA and some other optimizers. The consumed time of CWCA III and WCA is 10.142 s and 11.57 s, respectively. Hence, the improved CWCA III establishes an operative strategy to compromise between the quality of the computed solutions and the computational burden.

In the literature, various authors, including [58, 65, 69, 78, 91] and also [62, 67, 68, 71, 73, 76, 81, 90], have utilized different versions of the welded beam optimization problem. In this version, constraints such as the deflection δ(X), buckling load Pc(X), and polar moment of inertia J(X) are considered along with an additional constraint g7(X), as follows:

Table 19 Comparison of the best solutions for the welded beam problem found by different methods. Columns: Version; Method; Design variables x1, x2, x3, x4; Cost f(X).

g7(X) = 0.10471 x1² + 0.04811 x3 x4 (14 + x2) − 5 ≤ 0    (55)

δ(X) = 6PL³ / (E x3³ x4),
Pc(X) = [4.013 E √(x3² x4⁶ / 36) / L²] [1 − (x3 / 2L) √(E / 4G)],
J(X) = 2{√2 x1 x2 [x2²/4 + ((x1 + x3)/2)²]}    (56)

The complete results are presented in Table 19 under the version II section. With regard to these solutions, the proposed CWCA III demonstrates a superior performance compared to the other algorithms. Statistical results of the best, median, average, worst, and standard deviation of the solutions obtained using CWCA III are tabulated in Table 20. Based on this table, the best vector attained by the conventional WCA is preferable to the solutions discovered by the other techniques; however, the results found by CWCA III are in turn superior to those of the WCA strategy. Therefore, the proposed improved algorithm successfully shows an efficient, superior performance in dealing with this constrained problem.

5.4.3 Tension–compression string design

This problem was described by Arora [91] to minimize the weight of a tension–compression spring subject to some constraints.
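The welded-beam auxiliary expressions of Eqs. (55)–(56) above can be sketched numerically; the constants P = 6000 lb, L = 14 in., E = 30×10⁶ psi, and G = 12×10⁶ psi are the values customarily used for this benchmark and are assumptions here, since the excerpt does not state them:

```python
import math

# Constants assumed from the benchmark's customary setting (not stated in this excerpt):
P, L, E, G = 6000.0, 14.0, 30e6, 12e6

def delta(x):
    """End deflection d(X) per Eq. (56)."""
    _, _, x3, x4 = x
    return 6 * P * L**3 / (E * x3**3 * x4)

def p_c(x):
    """Buckling load Pc(X) per Eq. (56)."""
    _, _, x3, x4 = x
    return (4.013 * E * math.sqrt(x3**2 * x4**6 / 36.0) / L**2
            * (1 - (x3 / (2 * L)) * math.sqrt(E / (4 * G))))

def j_polar(x):
    """Polar moment of inertia J(X) per Eq. (56)."""
    x1, x2, x3, _ = x
    return 2 * (math.sqrt(2) * x1 * x2 * (x2**2 / 4.0 + ((x1 + x3) / 2.0)**2))

def g7(x):
    """Extra constraint of Eq. (55); satisfied when the value is <= 0."""
    x1, x2, x3, x4 = x
    return 0.10471 * x1**2 + 0.04811 * x3 * x4 * (14 + x2) - 5.0

x = [0.2, 3.5, 9.0, 0.2]  # illustrative point within the bounds of Eq. (50)
print(delta(x), p_c(x), j_polar(x), g7(x))
```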
Fig. 5 Tension–compression string: a schematic, b stress heat map, c displacement heat map [51]

Table 21 Comparison of the best solutions for the tension–compression string design by various methods. Columns: Method; Design variables x1, x2, x3; Cost f(X).
0.05 ≤ x1 ≤ 2, 0.25 ≤ x2 ≤ 1.3, 2 ≤ x3 ≤ 15    (59)

Arora [91] utilized a numerical optimization technique known as constraint correction at constant cost. Belegundu [92] solved this problem with eight different mathematical optimizers, and the best results are shown here. Montes and Coello [67] employed various evolution strategies to tackle this problem, He and Wang [66] utilized a co-evolutionary PSO algorithm (CPSO), and Coello [64] and Montes [68] solved this problem using GA-based methods. Table 21 shows the best results found by CWCA III in comparison with the solutions reported in the other literature. The statistical simulation results are also given in Table 22.

From Table 21, it can be seen that the best solution obtained by WCA is superior to the best solution found by CWCA III, but it is infeasible. The solutions provided by Kaveh and Talatahari [61], Hu et al. [72], and He et al. [78] are also infeasible, since some of the constraints are violated. From the results, the standard deviation of the results of CWCA III is better than that of WCA, which reveals that the improved CWCA III is capable of providing reasonably feasible, accurate results in solving this problem. Referring to Table 22, the solutions realized by WCA are almost superior; therefore, the competent performance of the conventional optimizer in tackling this problem is also substantiated here. Additionally, the consumed time of CWCA III and WCA for this problem is 10.032 s and 11.76 s, respectively.

5.4.4 Himmelblau nonlinear optimization problem

Finally, CWCA III is benchmarked using the Himmelblau problem, proposed in [98]. This problem includes five positive design variables X = [x1, x2, x3, x4, x5], six nonlinear inequality constraints, and ten boundary conditions. The optimization problem can be described as follows:

min_X f(X) = 5.3578547 x3² + 0.8356891 x1 x5 + 37.293239 x1 − 40792.141    (60)

s.t.
0 ≤ g1(X) ≤ 92
90 ≤ g2(X) ≤ 110    (61)
20 ≤ g3(X) ≤ 25

The corresponding statistical results are reported in Table 24. Based on the results, it can be determined that the CWCA III algorithm demonstrates a comparable performance to WCA and the previous works.
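The double-sided constraints of Eq. (61) are conveniently split into pairs of one-sided inequalities before applying the usual feasibility rules; a minimal sketch (the value 45.0 is hypothetical, and `split_double_sided` is an illustrative helper, not from the paper):

```python
def split_double_sided(g_value, lo, hi):
    """Rewrite lo <= g(X) <= hi as two constraints c <= 0 each."""
    return [lo - g_value, g_value - hi]

# Example with the Eq. (61) limits and a hypothetical g1(X) value of 45.0:
violations = split_double_sided(45.0, 0.0, 92.0)
print(all(v <= 0 for v in violations))  # inside the band -> both satisfied
```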
Table 23 Optimal results of the Himmelblau nonlinear optimization problem. Columns: Version; Method; Design variables x1, x2, x3, x4, x5; Cost f(X); Constraints 0 ≤ g1 ≤ 92, 90 ≤ g2 ≤ 110, 20 ≤ g3 ≤ 25.
has been utilized to enhance the efficiency of the WCA algorithm. The standard algorithm is developed by mimicking the idealized hydrological cycle in natural environments, which can also be explained by deterministic chaos. With respect to the chaotic characteristics of the water cycle in nature, three variants of WCA are proposed by utilizing 13 non-invertible chaotic maps. In these strategies, different deterministic chaotic operations are utilized instead of the constant and/or random values of the conventional optimizer. The overall success rate of these algorithms on several benchmark problems demonstrates that CWCA III with the sinusoidal map has a superior performance compared to the other WCA-based approaches. In addition, the most effective chaotic sequences for the CWCA I and CWCA II strategies are the ICMIC signal and the sinusoidal map, respectively.

For evaluating the reliability of CWCA III, this method was compared to the standard WCA and the other chaotic versions in solving unconstrained tasks. Statistical results reveal that CWCA III can effectively explore the best solution with more precision. From the simulation results, it can be concluded that incorporating the chaos theme into the WCA algorithm can manifestly enhance the feasibility and quality of the results. In addition, the new improved method is utilized to tackle some well-studied constrained problems such as the G1–G13 benchmarks, pressure vessel design, welded beam design, tension–compression string design, and the Himmelblau problem. These test cases are chosen to substantiate the effectiveness and feasibility of CWCA III against previous techniques for constrained tasks. The results also show that the chaotic CWCA III outperforms the conventional WCA and some other methods in training NNs. With regard to the statistical results, CWCA III is able to realize better or comparable solutions compared to previous works. It can be concluded that CWCA III is a competent optimizer for solving unconstrained tasks; the modified algorithm also outperforms the other studied optimizers on constrained test cases. Future studies can be dedicated to extending the WCA strategy and its improved versions to tackle mixed-type problems and complex optimization tasks in discrete spaces. More elaborate simulations may also be carried out using parallel or distributed computation techniques.

Appendix

List of 13 constrained benchmark problems:

Problem G1

min_x f(x) = 5 Σ_{i=1}^{4} xi − 5 Σ_{i=1}^{4} xi² − Σ_{i=5}^{13} xi

s.t.
g1(x) = 2x1 + 2x2 + x10 + x11 − 10 ≤ 0,
g2(x) = 2x1 + 2x3 + x10 + x12 − 10 ≤ 0,
g3(x) = 2x2 + 2x3 + x11 + x12 − 10 ≤ 0,
g4(x) = −8x1 + x10 ≤ 0,
g5(x) = −8x2 + x11 ≤ 0,
g6(x) = −8x3 + x12 ≤ 0,
g7(x) = −2x4 − x5 + x10 ≤ 0,
g8(x) = −2x6 − x7 + x11 ≤ 0,
g9(x) = −2x8 − x9 + x12 ≤ 0,
xi ≥ 0, i = 1, …, 13; xi ≤ 1, i = 1, …, 9, 13.

Bound restrictions: UB = (1, 1, 1, 1, 1, 1, 1, 1, 1, 100, 100, 100, 1), LB = (0, …, 0).
Global minimum: xbest = (1, 1, 1, 1, 1, 1, 1, 1, 1, 3, 3, 3, 1), f(xbest) = −15.

Problem G2

max_x f(x) = ( Σ_{i=1}^{n} cos⁴(xi) − 2 Π_{i=1}^{n} cos²(xi) ) / √( Σ_{i=1}^{n} i·xi² )

s.t.
g1(x) = −Π_{i=1}^{n} xi + 0.75 ≤ 0,
g2(x) = Σ_{i=1}^{n} xi − 7.5n ≤ 0.

Bound restrictions: UB = (10, …, 10) and LB = (0, …, 0).
Best known value: f(xbest) = 0.803619 for n = 20.

Problem G3

max_x f(x) = (√n)ⁿ Π_{i=1}^{n} xi

s.t. h1(x) = Σ_{i=1}^{n} xi² − 1 = 0.

Bound restrictions: UB = (1, …, 1) and LB = (0, …, 0).
Global maximum: xbest = (1/√n, …, 1/√n), f(xbest) = 1.

Problem G4

min_x f(x) = 5.3578547 x3² + 0.8356891 x1 x5 + 37.293239 x1 − 40792.141

s.t.
g1(x) = u(x) − 92 ≤ 0,
g2(x) = −u(x) ≤ 0,
g3(x) = v(x) − 110 ≤ 0,
g4(x) = −v(x) + 90 ≤ 0,
g5(x) = w(x) − 25 ≤ 0,
g6(x) = −w(x) + 20 ≤ 0,
where
u(x) = 85.334407 + 0.0056858 x2 x5 + 0.0006262 x1 x4 − 0.0022053 x3 x5,
v(x) = 80.51249 + 0.0071317 x2 x5 + 0.0029955 x1 x2 + 0.0021813 x3²,
w(x) = 9.300961 + 0.0047026 x3 x5 + 0.0012547 x1 x3 + 0.0019085 x3 x4.

Bound restrictions: UB = (102, 45, 45, 45, 45) and LB = (78, 33, 27, 27, 27).
Global minimum: xbest = (78, 33, 29.995256025682, 45, 36.775812905788), f(xbest) = −30665.539.

Problem G5

min_x f(x) = 3x1 + 10⁻⁶ x1³ + 2x2 + (2 × 10⁻⁶/3) x2³

s.t.
g1(x) = x3 − x4 − 0.55 ≤ 0,
g2(x) = x4 − x3 − 0.55 ≤ 0,
h1(x) = 1000[sin(−x3 − 0.25) + sin(−x4 − 0.25)] + 894.8 − x1 = 0,
h2(x) = 1000[sin(x3 − 0.25) + sin(x3 − x4 − 0.25)] + 894.8 − x2 = 0,
h3(x) = 1000[sin(x4 − 0.25) + sin(x4 − x3 − 0.25)] + 1294.8 = 0.

Bound restrictions: UB = (1200, 1200, 0.55, 0.55) and LB = (0, 0, −0.55, −0.55).
Best known solution: xbest = (679.9453, 1026, 0.118876, −0.3962336), f(xbest) = 5126.4981.

Problem G6

min_x f(x) = (x1 − 10)³ + (x2 − 20)³

s.t.
g1(x) = −(x1 − 5)² − (x2 − 5)² + 100 ≤ 0,
g2(x) = (x1 − 6)² + (x2 − 5)² − 82.81 ≤ 0.

Bound restrictions: UB = (100, 100) and LB = (13, 0).
Global minimum: xbest = (14.095, 0.84296), f(xbest) = −6961.81388.

Problem G7

min_x f(x) = x1² + x2² + x1 x2 − 14x1 − 16x2 + (x3 − 10)² + 4(x4 − 5)² + (x5 − 3)² + 2(x6 − 1)² + 5x7² + 7(x8 − 11)² + 2(x9 − 10)² + (x10 − 7)² + 45

s.t.
g1(x) = 4x1 + 5x2 − 3x7 + 9x8 − 105 ≤ 0,
g2(x) = 10x1 − 8x2 − 17x7 + 2x8 ≤ 0,
g3(x) = −8x1 + 2x2 + 5x9 − 2x10 − 12 ≤ 0,
g4(x) = 3(x1 − 2)² + 4(x2 − 3)² + 2x3² − 7x4 − 120 ≤ 0,
g5(x) = 5x1² + 8x2 + (x3 − 6)² − 2x4 − 40 ≤ 0,
g6(x) = 0.5(x1 − 8)² + 2(x2 − 4)² + 3x5² − x6 − 30 ≤ 0,
g7(x) = x1² + 2(x2 − 2)² − 2x1 x2 + 14x5 − 6x6 ≤ 0,
g8(x) = −3x1 + 6x2 + 12(x9 − 8)² − 7x10 ≤ 0.

Bound restrictions: UB = (10, …, 10) and LB = (−10, …, −10).
Global minimum: xbest = (2.171996, 2.363683, 8.773926, 5.095984, 0.9906548, 1.430574, 1.321644, 9.828726, 8.280092, 8.375927), f(xbest) = 24.3062091.

Problem G8

max_x f(x) = sin³(2πx1) sin(2πx2) / [x1³ (x1 + x2)]

s.t.
g1(x) = x1² − x2 + 1 ≤ 0,
g2(x) = 1 − x1 + (x2 − 4)² ≤ 0.

Bound restrictions: UB = (10, 10) and LB = (0, 0).
Global maximum: xbest = (1.2279713, 4.2453733), f(xbest) = 0.095825.

Problem G9

min_x f(x) = (x1 − 10)² + 5(x2 − 12)² + x3⁴ + 3(x4 − 11)² + 10x5⁶ + 7x6² + x7⁴ − 4x6 x7 − 10x6 − 8x7

s.t.
g1(x) = 2x1² + 3x2⁴ + x3 + 4x4² + 5x5 − 127 ≤ 0,
g2(x) = 7x1 + 3x2 + 10x3² + x4 − x5 − 282 ≤ 0,
g3(x) = 23x1 + x2² + 6x6² − 8x7 − 196 ≤ 0,
g4(x) = 4x1² + x2² − 3x1 x2 + 2x3² + 5x6 − 11x7 ≤ 0.
Bound restrictions: UB = (10, …, 10) and LB = (−10, …, −10).
Global minimum: xbest = (2.330499, 1.951372, −0.4775414, 4.365726, −0.6244870, 1.038131, 1.594227), f(xbest) = 680.6300573.

Problem G10

min_x f(x) = x1 + x2 + x3

s.t.
g1(x) = −1 + 0.0025(x4 + x6) ≤ 0,
g2(x) = −1 + 0.0025(−x4 + x5 + x7) ≤ 0,
g3(x) = −1 + 0.01(−x5 + x8) ≤ 0,
g4(x) = 100x1 − x1 x6 + 833.33252x4 − 83333.333 ≤ 0,
g5(x) = x2 x4 − x2 x7 − 1250x4 + 1250x5 ≤ 0,
g6(x) = x3 x5 − x3 x8 − 2500x5 + 1250000 ≤ 0.

Bound restrictions: UB = (10000, 10000, 10000, 1000, 1000, 1000, 1000, 1000) and LB = (100, 1000, 1000, 10, 10, 10, 10, 10).
Global minimum: xbest = (579.3167, 1359.943, 5110.071, 182.0174, 295.5985, 217.9799, 286.4162, 395.5979), f(xbest) = 7049.3307.

Problem G11

min_x f(x) = x1² + (x2 − 1)²

s.t. h1(x) = x2 − x1² = 0.

Bound restrictions: UB = (1, 1) and LB = (−1, −1).
Global minimum: xbest = (1/√2, 1/2), f(xbest) = 0.75.

Problem G12

max_x f(x) = 1 − 0.01[(x1 − 5)² + (x2 − 5)² + (x3 − 5)²]

s.t. g_{i,j,k}(x) = (x1 − i)² + (x2 − j)² + (x3 − k)² − 0.0625 ≤ 0, i, j, k = 1, 2, …, 9 (a point is feasible if the constraint holds for at least one combination (i, j, k)).

Bound restrictions: UB = (10, 10, 10) and LB = (0, 0, 0).
Global maximum: xbest = (5, 5, 5), f(xbest) = 1.

Problem G13

min_x f(x) = e^{x1 x2 x3 x4 x5}

s.t.
h1(x) = x1² + x2² + x3² + x4² + x5² − 10 = 0,
h2(x) = x2 x3 − 5x4 x5 = 0,
h3(x) = x1³ + x2³ + 1 = 0.

Bound restrictions: UB = (2.3, 2.3, 3.2, 3.2, 3.2) and LB = (−2.3, −2.3, −3.2, −3.2, −3.2).
Global minimum: xbest = (−1.717143, 1.595709, 1.827247, −0.7636413, 0.763645), f(xbest) = 0.0539498.

References

1. Yang XS (2010) Engineering optimization: an introduction with metaheuristic applications. Wiley, Hoboken
2. Yildiz AR (2009) A new design optimization framework based on immune algorithm and Taguchi's method. Comput Ind 60:613–620
3. Yildiz AR (2008) Optimal structural design of vehicle components using topology design and optimization. Mater Test 50:224–228
4. Rahimi S, Roodposhti MS, Abbaspour RA (2014) Using combined AHP–genetic algorithm in artificial groundwater recharge site selection of Gareh Bygone Plain, Iran. Environ Earth Sci 72:1979–1992
5. Jordehi AR (2015) Chaotic bat swarm optimisation (CBSO). Appl Soft Comput 26:523–530
6. Abbaspour RA, Samadzadegan F (2011) Time-dependent personal tour planning and scheduling in metropolises. Expert Syst Appl 38:12439–12452
7. Yildiz AR (2013) Hybrid Taguchi-differential evolution algorithm for optimization of multi-pass turning operations. Appl Soft Comput 13:1433–1439
8. Yildiz AR (2013) Comparison of evolutionary-based optimization algorithms for structural design optimization. Eng Appl Artif Intell 26:327–333
9. Yildiz AR (2012) A comparative study of population-based optimization algorithms for turning operations. Inf Sci 210:81–88
10. Jordehi AR (2014) Particle swarm optimisation for dynamic optimisation problems: a review. Neural Comput Appl 25:1507–1516
11. Jordehi AR (2015) Enhanced leader PSO (ELPSO): a new PSO variant for solving global optimisation problems. Appl Soft Comput 26:401–417
12. Jordehi AR (2015) A review on constraint handling strategies in particle swarm optimisation. Neural Comput Appl 26(6):1265–1275
13. Li B, Jiang W (1998) Optimizing complex functions by chaos search. Cybern Syst 29:409–419
14. Yildiz AR, Solanki KN (2012) Multi-objective optimization of vehicle crashworthiness using a new particle swarm based approach. Int J Adv Manuf Technol 59:367–376
15. Rezaee Jordehi A, Jasni J (2013) Parameter selection in particle swarm optimisation: a survey. J Exp Theor Artif Intell 25:527–542
16. Jordehi AR, Jasni J (2015) Particle swarm optimisation for discrete optimisation problems: a review. Artif Intell Rev 43(2):243–258
17. Yildiz AR (2013) Optimization of multi-pass turning operations using hybrid teaching learning-based approach. Int J Adv Manuf Technol 66:1319–1326
18. Jordehi AR (2015) Optimal setting of TCSC's in power systems using teaching-learning-based optimisation algorithm. Neural Comput Appl 26(5):1249–1256
19. Eskandar H, Sadollah A, Bahreininejad A, Hamdi M (2012) Water cycle algorithm—a novel metaheuristic optimization method for solving constrained engineering optimization problems. Comput Struct 110:151–166
20. Sadollah A, Eskandar H, Bahreininejad A, Kim JH (2015) Water cycle algorithm for solving multi-objective optimization problems. Soft Comput 19(9):2587–2603
21. Sadollah A, Eskandar H, Bahreininejad A, Kim JH (2015) Water cycle, mine blast and improved mine blast algorithms for discrete sizing optimization of truss structures. Comput Struct 149:1–16
22. Sadollah A, Eskandar H, Bahreininejad A, Kim JH (2015) Water cycle algorithm with evaporation rate for solving constrained and unconstrained optimization problems. Appl Soft Comput 30:58–71
23. Sadollah A, Eskandar H, Kim JH (2015) Water cycle algorithm for solving constrained multi-objective optimization problems. Appl Soft Comput 27:279–298
24. Sadollah A, Eskandar H, Yoo DG, Kim JH (2015) Approximate solving of nonlinear ordinary differential equations using least square weight function and metaheuristic algorithms. Eng Appl Artif Intell 40:117–132
25. Ott E (2002) Chaos in dynamical systems. Cambridge University Press, Cambridge
26. Stehlik J (1999) Deterministic chaos in runoff series. J Hydrol Hydromech 47:271–287
27. Xu C, Duan H, Liu F (2010) Chaotic artificial bee colony approach to Uninhabited Combat Air Vehicle (UCAV) path planning. Aerosp Sci Technol 14:535–541
28. Talatahari S, Azar BF, Sheikholeslami R, Gandomi AH (2012) Imperialist competitive algorithm combined with chaos for global optimization. Commun Nonlinear Sci Numer Simul 17:1312–1319
29. Jothiprakash V, Arunkumar R (2013) Optimization of hydropower reservoir using evolutionary algorithms coupled with chaos. Water Resour Manage 27:1963–1979
30. Wang GG, Guo L, Gandomi AH, Hao GS, Wang H (2014) Chaotic krill herd algorithm. Inf Sci 274:17–34
31. Gandomi AH, Yang XS (2014) Chaotic bat algorithm. J Comput Sci 5:224–232
32. Fister I, Perc M, Kamal SM (2015) A review of chaos-based firefly algorithms: perspectives and research challenges. Appl Math Comput 252:155–165
33. Li C, An X, Li R (2015) A chaos embedded GSA-SVM hybrid system for classification. Neural Comput Appl 26:713–721
34. Liao GC, Tsao TP (2006) Application of a fuzzy neural network combined with a chaos genetic algorithm and simulated annealing to short-term load forecasting. IEEE Trans Evolut Comput 10:330–340
35. Jordehi AR (2014) A chaotic-based big bang–big crunch algorithm for solving global optimisation problems. Neural Comput Appl 25:1329–1335
36. Alatas B (2010) Chaotic harmony search algorithms. Appl Math Comput 216:2687–2699
37. Gandomi AH, Yun GJ, Yang XS, Talatahari S (2013) Chaos-enhanced accelerated particle swarm optimization. Commun Nonlinear Sci Numer Simul 18:327–340
38. dos Santos Coelho L, Mariani VC (2008) Use of chaotic sequences in a biologically inspired algorithm for engineering design optimization. Expert Syst Appl 34:1905–1913
39. Jordehi AR (2015) Seeker optimisation (human group optimisation) algorithm with chaos. J Exp Theor Artif Intell 1–10. doi:10.1080/0952813X.2015.1020568
40. Jordehi AR (2015) A chaotic artificial immune system optimisation algorithm for solving global continuous optimisation problems. Neural Comput Appl 26(4):827–833
41. Saremi S, Mirjalili S, Lewis A (2014) Biogeography-based optimisation with chaos. Neural Comput Appl 25:1077–1097
42. Schuster HG, Just W (2006) Deterministic chaos: an introduction. Wiley, Hoboken
43. Tavazoei MS, Haeri M (2007) Comparison of different one-dimensional maps as chaotic search pattern in chaos optimization algorithms. Appl Math Comput 187:1076–1085
44. Hilborn RC (2000) Chaos and nonlinear dynamics: an introduction for scientists and engineers. Oxford University Press, Oxford
45. He D, He C, Jiang LG, Zhu HW, Hu GR (2001) Chaotic characteristics of a one-dimensional iterative map with infinite collapses. IEEE Trans Circuits Syst I Fundam Theory Appl 48:900–906
46. Erramilli A, Singh R, Pruthi P (1995) An application of deterministic chaotic maps to model packet traffic. Queueing Syst 20:171–206
47. Li Y, Deng S, Xiao D (2011) A novel Hash algorithm construction based on chaotic neural network. Neural Comput Appl 20:133–141
48. Xiang T, Liao X, Wong KW (2007) An improved particle swarm optimization algorithm combined with piecewise linear chaotic map. Appl Math Comput 190:1637–1645
49. Devaney RL (1989) An introduction to chaotic dynamical systems. Westview Press, Colorado
50. Mahdavi M, Fesanghary M, Damangir E (2007) An improved harmony search algorithm for solving optimization problems. Appl Math Comput 188:1567–1579
51. Mirjalili S, Mirjalili SM, Lewis A (2014) Grey wolf optimizer. Adv Eng Softw 69:46–61
52. Derrac J, García S, Molina D, Herrera F (2011) A practical tutorial on the use of nonparametric statistical tests as a methodology for comparing evolutionary and swarm intelligence algorithms. Swarm Evolut Comput 1:3–18
53. Wilcoxon F (1945) Individual comparisons by ranking methods. Biom Bull 1(6):80–83
54. Gandomi AH, Yang XS, Alavi AH, Talatahari S (2013) Bat algorithm for constrained optimization tasks. Neural Comput Appl 22:1239–1255
55. Mezura Montes E, Coello CAC (2005) A simple multimembered evolution strategy to solve constrained optimization problems. IEEE Trans Evolut Comput 9:1–17
56. Becerra RL, Coello CAC (2006) Cultured differential evolution for constrained optimization. Comput Methods Appl Mech Eng 195:4303–4322
57. Tessema B, Yen GG (2006) A self adaptive penalty function based algorithm for constrained optimization. In: 2006 IEEE Congress on Evolutionary Computation (CEC 2006), IEEE, pp 246–253
58. Hedar AR, Fukushima M (2006) Derivative-free filter simulated annealing method for constrained continuous global optimization. J Global Optim 35:521–549
59. Koziel S, Michalewicz Z (1999) Evolutionary algorithms, homomorphous mappings, and constrained parameter optimization. Evol Comput 7:19–44
60. Cabrera JCF, Coello CAC (2007) Handling constraints in particle swarm optimization using a small population size. In: Gelbukh A, Kuri Morales AF (eds) MICAI 2007: advances in artificial intelligence. Springer, pp 41–51
61. Kaveh A, Talatahari S (2010) An improved ant colony optimization for constrained engineering design problems. Eng Comput 27:155–182
62. Deb K, Goyal M (1996) A combined genetic adaptive search (GeneAS) for engineering design. Comput Sci Inform 26:30–45
63. Sandgren E (1990) Nonlinear integer and discrete programming in mechanical design optimization. J Mech Des 112:223–229
64. Coello CAC (2000) Use of a self-adaptive penalty approach for engineering optimization problems. Comput Ind 41:113–127
65. Kannan B, Kramer SN (1994) An augmented Lagrange multiplier based method for mixed integer discrete continuous optimization and its applications to mechanical design. J Mech Des 116:405–411
66. He Q, Wang L (2007) An effective co-evolutionary particle swarm optimization for constrained engineering design problems. Eng Appl Artif Intell 20:89–99
67. Mezura Montes E, Coello CAC (2008) An empirical study about the usefulness of evolution strategies to solve constrained optimization problems. Int J Gen Syst 37:443–473
68. Coello CAC, Montes EM (2002) Constraint-handling in genetic algorithms through the use of dominance-based tournament selection. Adv Eng Inform 16:193–203
69. Gandomi AH, Yang XS, Alavi AH (2013) Cuckoo search algorithm: a metaheuristic approach to solve structural optimization problems. Eng Comput 29:17–35
70. Kaveh A, Talatahari S (2009) Engineering optimization with hybrid particle swarm and ant colony optimization. Asian J Civ Eng 10:611–628
71. dos Santos Coelho L (2010) Gaussian quantum-behaved particle swarm optimization approaches for constrained engineering design problems. Expert Syst Appl 37:1676–1683
72. Hu X, Eberhart RC, Shi Y (2003) Engineering optimization with particle swarm. In: Proceedings of the 2003 IEEE Swarm Intelligence Symposium (SIS'03), IEEE, pp 53–57
73. Zhang C, Wang HP (1993) Mixed-discrete nonlinear optimization with simulated annealing. Eng Optim 21:277–291
74. Lee KS, Geem ZW (2005) A new meta-heuristic algorithm for continuous engineering optimization: harmony search theory and practice. Comput Methods Appl Mech Eng 194:3902–3933
75. Mezura Montes E, Coello CAC, Velázquez Reyes J, Muñoz Dávila L (2007) Multiple trial vectors in differential evolution for engineering design. Eng Optim 39:567–589
76. Karaboga D, Akay B (2009) A comparative study of artificial bee colony algorithm. Appl Math Comput 214:108–132
77. Cagnina LC, Esquivel SC, Coello CAC (2008) Solving engineering optimization problems with the simple constrained particle swarm optimizer. Inform (Slov) 32:319–326
78. He S, Prempain E, Wu Q (2004) An improved particle swarm optimizer for mechanical design optimization problems. Eng Optim 36:585–605
79. Akhtar S, Tai K, Ray T (2002) A socio-behavioural simulation model for engineering design optimization. Eng Optim 34:341–354
80. Mirjalili S, Mirjalili SM, Hatamlou A (2015) Multi-verse optimizer: a nature-inspired algorithm for global optimization. Neural Comput Appl 1–19. doi:10.1007/s00521-015-1870-7
81. Gandomi AH, Yang XS, Alavi AH (2011) Mixed variable structural optimization using firefly algorithm. Comput Struct 89:2325–2336
82. Dimopoulos GG (2007) Mixed-variable engineering optimization based on evolutionary and social metaphors. Comput Methods Appl Mech Eng 196:803–817
83. Rao SS, Rao S (2009) Engineering optimization: theory and practice. Wiley, New York
84. Hwang SF, He RS (2006) A hybrid real-parameter genetic algorithm for function optimization. Adv Eng Inform 20:7–21
85. Deb K (1991) Optimal design of a welded beam via genetic algorithms. AIAA J 29:2013–2015
86. Ragsdell K, Phillips D (1976) Optimal design of a class of welded structures using geometric programming. J Manuf Sci Eng 98:1021–1025
87. Deb K (2000) An efficient constraint handling method for genetic algorithms. Comput Methods Appl Mech Eng 186:311–338
88. Ray T, Liew KM (2003) Society and civilization: an optimization algorithm based on the simulation of social behavior. IEEE Trans Evolut Comput 7:386–396
89. Mehta VK, Dasgupta B (2012) A constrained optimization algorithm based on the simplex search method. Eng Optim 44:537–550
90. Fesanghary M, Mahdavi M, Minary-Jolandan M, Alizadeh Y (2008) Hybridizing harmony search algorithm with sequential quadratic programming for engineering optimization problems. Comput Methods Appl Mech Eng 197:3080–3091
91. Arora J (2004) Introduction to optimum design. Academic Press, New York
92. Belegundu AD, Arora JS (1985) A study of mathematical programming methods for structural optimization, Part I: Theory. Int J Numer Methods Eng 21:1583–1599
93. Omran MG, Salman A (2009) Constrained optimization using CODEQ. Chaos Solitons Fractals 42:662–668
94. Ray T, Saini P (2001) Engineering design optimization using a swarm with an intelligent information sharing among individuals. Eng Optim 33:735–748
95. Tsai JF (2005) Global optimization of nonlinear fractional programming problems in engineering design. Eng Optim 37:399–409
96. Zhang M, Luo W, Wang X (2008) Differential evolution with dynamic stochastic selection for constrained optimization. Inf Sci 178:3043–3074
97. Raj KH, Sharma R (2005) An evolutionary computational technique for constrained optimisation in engineering design. IE(I) J—MC 86:121–128
98. Himmelblau DM (1972) Applied nonlinear programming. McGraw-Hill, New York
99. Homaifar A, Qi CX, Lai SH (1994) Constrained optimization via genetic algorithms. Simulation 62:242–253
100. Coello CAC (2000) Treating constraints as objectives for single-objective evolutionary optimization. Eng Optim 32(3):275–308
101. Chen D, Zhao C, Zhang H (2011) An improved cooperative particle swarm optimization and its application. Neural Comput Appl 20:171–182