
Neural Comput & Applic (2017) 28:57–85

DOI 10.1007/s00521-015-2037-2

ORIGINAL ARTICLE

An efficient chaotic water cycle algorithm for optimization tasks


Ali Asghar Heidari1 • Rahim Ali Abbaspour1 • Ahmad Rezaee Jordehi2

Received: 6 April 2015 / Accepted: 12 August 2015 / Published online: 4 September 2015
© The Natural Computing Applications Forum 2015

Abstract Water cycle algorithm (WCA) is a recent population-based meta-heuristic technique. It is inspired by the idealized hydrological cycle observed in the natural environment. The conventional WCA is capable of demonstrating superior performance compared to other well-established techniques in solving both constrained and unconstrained problems. Nevertheless, as with other meta-heuristics, premature convergence to local optima may still occur on some specific optimization tasks. Motivated by the chaos present in real water cycle behavior, this article incorporates chaotic patterns into the stochastic processes of WCA to improve the performance of the conventional algorithm and to mitigate its premature convergence problem. First, different chaotic signal functions along with various chaos-enhanced WCA strategies (39 meta-heuristics in total) are implemented, and the best signal is selected as the most appropriate chaotic technique for modifying WCA. Second, the chaotic algorithm is employed to tackle various benchmark problems published in the specialized literature as well as the training of neural networks. The comparative statistical results of the new technique vividly demonstrate that the premature convergence problem is relieved significantly. Chaotic WCA with sinusoidal map and chaos-enhanced operators not only exploits high-quality solutions efficiently but also outperforms the WCA optimizer and the other investigated algorithms.

Keywords Chaos · Meta-heuristic · Global optimization · Water cycle algorithm

Corresponding author: Ahmad Rezaee Jordehi (ahmadrezaeejordehi@gmail.com)

1 Department of Geomatics and Spatial Information Engineering, College of Engineering, University of Tehran, Tehran, Iran
2 Department of Electrical Engineering, University Putra Malaysia (UPM), Serdang, Selangor, Malaysia

1 Introduction

Large-scale nonlinear optimization problems regularly contain multiple regional optima, which demand substantial effort to develop appropriate computational mechanisms for searching near-optimal solutions in high-dimensional and/or non-convex search spaces [1]. In this area, structural design tasks have been regarded as a particularly motivating theme in engineering for determining more effectual configurations [2, 3]. With regard to some drawbacks of classical optimization strategies, several meta-heuristics have been designed to tackle multifarious optimization tasks [4–9]. These algorithms are often inspired by characteristics of natural mechanisms such as evolution and cooperation [10]. These branches of optimizers have gained considerable momentum among researchers due to their exploration and exploitation potentials [11, 12]. To date, various nature-inspired meta-heuristics have been developed, such as the chaos optimization algorithm (COA) [13], particle swarm optimization (PSO) [14–16], the teaching–learning-based algorithm (TLBO) [17, 18], and the water cycle algorithm (WCA) [19].

Water cycle algorithm (WCA) is a recent robust meta-heuristic technique [19]. The core inspiration of WCA is the idealized hydrological cycle in natural environments. Until now, WCA has been utilized in different areas to tackle diverse optimization problems [20–24]. In most cases, WCA demonstrates a superior and efficient performance compared to well-established algorithms in dealing with constrained and unconstrained problems [22, 23]. In addition, it is evident


that WCA is able to effectively enhance the quality of solutions. In 2015, an enhanced WCA (ER-WCA) was also established in order to tackle different constrained tasks [22]. In [22], the ER-WCA approach outperforms WCA and other well-established optimizers in solving constrained and unconstrained tasks. In general, the performance of WCA mainly depends on its exploitation tendency [22]. Therefore, notwithstanding the distinctive benefits of the WCA method, stagnation in local optima may still happen in some cases. As with other meta-heuristic optimizers, this problem may be observed in circumstances where the conventional WCA is not consistently capable of managing an appropriate tradeoff between exploitation (intensification) and exploration (diversification) tendencies for some particular test problems.

A significant randomization effect in meta-heuristic algorithms can be provided by utilizing chaotic patterns within the optimization process. The concept of chaos can be described as a deterministic, random-like process that may be observed in nonlinear, dynamic systems and that is topologically mixing, non-converging, non-periodic, and bounded [25]. It is inspiring that chaotic dynamics can also be observed in hydrological events [26]. Because of the ergodicity of chaos, chaotic maps can efficiently prevent meta-heuristics from premature convergence to local solutions by increasing the diversity of the individuals.

In 2010, Xu et al. [27] offered an enhanced artificial bee colony technique (ABC) integrated with the irregularity of chaotic patterns to tackle the UCAV trajectory planning task. The simulations affirmed that the chaotic ABC optimizer performs well as a feasible path planner. In 2011, Talatahari et al. [28] enriched the results of the imperialist competitive optimizer (ICA) by adding logistic and sinusoidal sequences to the movement steps of orthogonal ICA. In 2013, Jothiprakash [29] used chaotic patterns to create the initial population of GA and DE for dealing with a water resource system problem. The results showed that the new hybrid methods outperform the basic GA and DE algorithms. In 2014, Gandomi et al. [30] utilized a chaotic krill herd (KH) method to optimize unconstrained problems. The simulations confirmed that employing chaotic maps instead of linearly shifting parameters is a significant adjustment of the KH optimizer. Gandomi et al. [31] showed that the bat algorithm (BA) with a sinusoidal map as its pulse rate outperforms the basic technique based on a comparison of success rate results.

In 2015, Fister et al. [32] reviewed different chaos-based firefly algorithms (FA). Li et al. [33] introduced an efficient classification technique (CGSA-SVM) based on a chaotic gravitational search algorithm (GSA). Results showed that the CGSA-SVM algorithm is capable of outperforming the PSO-SVM and GA-SVM approaches. Up to now, chaotic sequences have been hybridized with different meta-heuristic approaches such as GA [34], the big bang–big crunch algorithm (BB–BC) [35], harmony search (HS) [36], PSO [37], ant colony optimization (ACO) [38], the bat algorithm (BA) [5], the seeker optimization algorithm (SOA) [39], and the artificial immune system algorithm (AIS) [40]. While various chaotic algorithms have already been proposed, most of them have only been utilized to solve unconstrained problems, as in [5, 28, 30, 31, 35, 36, 40, 41]. Presently, no work is available in the specialized literature on enhancing the exploration and/or exploitation capacities of WCA by using the chaos paradigm.

The motivation of this article is as follows. First, the potential of chaos-embedded strategies for alleviating the premature convergence problem of WCA is investigated. Second, with regard to the chaotic features of real hydrological processes [26], this article presents a new improved WCA approach to tackle different optimization problems. In these algorithms, 13 different chaotic signals are sequentially incorporated into the WCA movement phases and raining procedure in order to enhance the performance of the conventional algorithm. These algorithms employ chaotic sequences in place of random variables and some fixed parameters of WCA. Finally, comparative simulations were conducted using an extensive set of unconstrained and constrained problems. The diversity of the employed problems was necessary to clearly confirm that the enhancements of the chaotic WCA variant are significant. In order to substantiate the improvements in the chaotic WCA meta-heuristic, detailed statistical results are also provided. The modified WCA can effectively be applied to multifarious optimization tasks in diverse areas of science and technology. It is expected that the new technique obtains high-quality solutions in tackling constrained and unconstrained problems and applications such as the training of neural networks (NNs). In this regard, the following significant questions are raised and will be answered numerically in this article.

• Does the combination of chaotic signals with WCA relieve its premature convergence problem?
• Which of the possible chaos-based strategies are appropriate for WCA?
• Which chaotic technique shows the best performance?
• Which chaotic signal is the most appropriate one for the original algorithm?
• Does the WCA combined with the most proper chaotic strategy and the most appropriate signal outperform other well-established optimization methods?
• Does the best chaotic version also demonstrate an efficient performance for practical applications such as the training of NNs?


This article is structured as follows. First, the idea of the conventional WCA technique and the detailed steps of its implementation are explained in Sect. 2. The utilized chaotic maps are reviewed in Sect. 3. Section 4 proposes the new chaos-embedded WCA algorithms. In Sects. 5 and 6, the main simulation results for the implemented problems are reported, and the results of the WCA-based method and other investigated optimizers are compared statistically. Finally, Sect. 7 presents the main concluding remarks of this research.

2 An overview of basic WCA

The hydrological cycle can briefly be explained as the uninterrupted motion of water on, above, or below the earth's surface, which is a critical process for the existence of ecosystems [26]. During the water cycle, which initiates from the highest positions in the mountains and terminates in the downhill locations, water continuously flows across the land surface within streams and rivers toward the sea. Streams and rivers collect surface water from the rain and other sub-streams on their journey to the deepest location. River water evaporation often occurs while plants emit water during transpiration. Water vapor is progressively transmitted to the atmosphere to create scattered clouds. Then, condensed clouds generate rain drops that create new stream flows and rivers.

The WCA optimization technique has been developed by mimicking this idealized hydrological cycle observed in natural environments [19]. In the original WCA, some events such as groundwater and plant absorption are not simulated. In addition, in contrast to PSO, WCA has been designed based on the indirect movement concept from streams to the rivers and from the rivers to the sea, which acts as the temporal optimum solution. It is significant to note that during the optimization procedure, the vicinity of these partial solutions is exploited by stream movements, whereas exploring the search space is done using stochastically driven evaporation and raining operations. Consequently, modifying these operators may influence the overall algorithm performance. In order to implement the WCA algorithm, the required steps can be expressed as follows:

Step 1: Determining the initial parameters of the WCA technique: Nsr, Npop, dmax, Max. iteration.

Step 2: Generating the basic population randomly over the search space and forming the initial sea, rivers, and streams based on the following equations:

$$\text{Total population} = \begin{bmatrix} \text{Sea} \\ \text{River}_1 \\ \vdots \\ \text{Stream}_{N_{sr}+1} \\ \vdots \\ \text{Stream}_{N_{pop}} \end{bmatrix} = \begin{bmatrix} x_1^1 & x_2^1 & x_3^1 & \ldots & x_N^1 \\ x_1^2 & x_2^2 & x_3^2 & \ldots & x_N^2 \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ x_1^{N_{pop}} & x_2^{N_{pop}} & x_3^{N_{pop}} & \ldots & x_N^{N_{pop}} \end{bmatrix} \quad (1)$$

$$N_{sr} = \text{Number of rivers} + 1 \quad (2)$$

$$N_{streams} = N_{pop} - N_{sr} \quad (3)$$

where a stream, a river, and the sea are each a 1 × N-dimensional array representing a particular solution defined as stream = [x1, x2, x3, …, xN], Nsr indicates the total number of rivers plus the sea, and Nstreams represents the number of streams which directly or indirectly flow to the rivers and the sea.

Step 3: Calculating the cost of every population member via Eq. (4):

$$C_i = \text{Cost}_i = f\left(x_1^i, x_2^i, \ldots, x_N^i\right), \quad i = 1, 2, 3, \ldots, N_{pop} \quad (4)$$

where Npop indicates the total population of streams, rivers, and sea.

Step 4: Computing the corresponding flow intensity of the rivers and the sea according to the following equations:

$$C_n = \text{Cost}_n - \text{Cost}_{N_{sr}+1}, \quad n = 1, 2, \ldots, N_{sr} \quad (5)$$

$$NS_n = \text{round}\left\{\left|\frac{C_n}{\sum_{i=1}^{N_{sr}} C_i}\right| \times N_{streams}\right\}, \quad n = 1, 2, \ldots, N_{sr} \quad (6)$$

where NSn stands for the number of streams that flow into the corresponding rivers or the sea.
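To make Steps 1–4 concrete, the following is a minimal sketch (in Python, whereas the experiments in this paper were run in MATLAB) of the population initialization and flow-intensity allocation of Eqs. (1)–(6). The function and variable names (initialize_wca, cost_fn, pop_size) are illustrative and not taken from the original code; the final adjustment of the stream counts is one simple choice.

```python
import numpy as np

def initialize_wca(cost_fn, lb, ub, dim, pop_size=50, n_sr=4, rng=None):
    """Sketch of WCA Steps 1-4: random population, costs, stream allocation."""
    rng = np.random.default_rng() if rng is None else rng

    # Step 2: random population over [lb, ub]; each row is one candidate solution (Eq. 1).
    pop = lb + rng.random((pop_size, dim)) * (ub - lb)

    # Step 3: cost of every member (Eq. 4).
    costs = np.array([cost_fn(x) for x in pop])

    # Sort so that row 0 is the sea, rows 1..n_sr-1 are rivers, the rest are streams.
    order = np.argsort(costs)
    pop, costs = pop[order], costs[order]

    # Step 4: flow intensity (Eqs. 5-6) decides how many streams feed the sea and each river.
    n_streams = pop_size - n_sr                                   # Eq. (3)
    c = costs[:n_sr] - costs[n_sr]                                # Eq. (5)
    ns = np.round(np.abs(c / c.sum()) * n_streams).astype(int)   # Eq. (6)
    ns[-1] += n_streams - ns.sum()                                # keep the total consistent
    return pop, costs, ns
```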


Step 5: The streams flow into the rivers and the sea according to:

$$X_{Stream}(t+1) = X_{Stream}(t) + rand \times C \times \left(X_{Sea}(t) - X_{Stream}(t)\right) \quad (7)$$

$$X_{Stream}(t+1) = X_{Stream}(t) + rand \times C \times \left(X_{River}(t) - X_{Stream}(t)\right) \quad (8)$$

where rand denotes a uniformly distributed random value within the range [0, 1] and C is a value between 1 and 2 (close to 2). When the value of C is restricted to be greater than 1, streams can move toward the rivers from various directions.

Step 6: Each river flows into the downhill place or the sea based on Eq. (9):

$$X_{River}(t+1) = X_{River}(t) + rand \times C \times \left(X_{Sea}(t) - X_{River}(t)\right) \quad (9)$$

Step 7: Exchanging the positions of a river with a stream that generates a better solution.

Step 8: Similar to Step 7, if a river explores a better solution than the sea, their positions are exchanged.

Step 9: Checking the evaporation condition for unconstrained problems based on the following pseudo-code [22]:

if $\left\|X_{Sea} - X_{River}^{i}\right\| < d_{max}$ or $rand < 0.1$, $\quad i = 1, 2, 3, \ldots, N_{sr}-1 \quad (10)$
    perform raining process
end if

where dmax is a very small number determined by the user to control the intensification level in the proximity of the best solutions. To heighten the capability of the WCA technique against constrained problems, the pseudo-code in Eq. (11) is considered only for the streams which move directly to the sea. This assumption is proposed in order to prevent the search procedure from premature convergence in tackling constrained tasks.

if $\left\|X_{Sea} - X_{Stream}^{i}\right\| < d_{max}$, $\quad i = 1, 2, 3, \ldots, NS_{n} \quad (11)$
    perform raining process
end if

After evaporation, the raining procedure should be performed, so that new streams are created in different positions.

Step 10: If the evaporation condition is satisfied, the raining process occurs according to the following equation:

$$X_{Stream}^{new}(t+1) = LB + rand \times (UB - LB) \quad (12)$$

where LB and UB represent the lower and upper boundaries of the given problem, respectively. Equation (12) is only utilized in solving unconstrained problems. To improve the convergence performance of the WCA optimizer for constrained problems, Eq. (13) is applied only to the streams which directly flow into the location of the sea [with regard to the evaporation operation in Eq. (11)]:

$$X_{Stream}^{new}(t+1) = X_{Sea} + \sqrt{\mu} \times randn(1, N) \quad (13)$$

where the coefficient μ represents the concept of variance, showing the range of the searching region around the best solution, and randn is a normally distributed random number. Equation (13) is designed to increase the generation of streams that move straight toward the sea in order to enhance the exploration in the proximity of the best solution.

Step 11: Decreasing the value of the controlling parameter dmax via Eq. (14):

$$d_{max}(t+1) = d_{max}(t) - \frac{d_{max}(t)}{\text{Max. iteration}} \quad (14)$$

Step 12: Checking the convergence criteria. If the termination condition is fulfilled, terminate; otherwise, return to Step 5.

The pseudo-code of the original WCA is summarized in Fig. 1.

Fig. 1 Pseudo-code of WCA algorithm (based on [20])
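Continuing the initialization sketch above, the following Python fragment illustrates one iteration of Steps 5–12, i.e., the movement, evaporation, raining, and dmax updates of Eqs. (7)–(14). It is a simplified sketch: the stream-to-river bookkeeping and the promotion of better solutions (Steps 7–8) are handled by re-sorting, and the names are illustrative.

```python
import numpy as np

def wca_iteration(pop, costs, ns, cost_fn, lb, ub, d_max, max_iter, C=2.0, rng=None):
    """Sketch of one WCA iteration (Steps 5-12) for an unconstrained problem."""
    rng = np.random.default_rng() if rng is None else rng
    n_sr, dim = len(ns), pop.shape[1]

    # Streams flow toward their guiding river or the sea (Eqs. 7-8); rivers flow to the sea (Eq. 9).
    guide = np.repeat(np.arange(n_sr), ns)          # which river/sea guides each stream
    for i in range(n_sr, len(pop)):
        g = guide[i - n_sr]
        pop[i] += rng.random(dim) * C * (pop[g] - pop[i])
    for i in range(1, n_sr):
        pop[i] += rng.random(dim) * C * (pop[0] - pop[i])
    pop = np.clip(pop, lb, ub)

    # Steps 7-8 (simplified): re-evaluate and re-sort so the best solution becomes the sea.
    costs = np.array([cost_fn(x) for x in pop])
    order = np.argsort(costs)
    pop, costs = pop[order], costs[order]

    # Steps 9-10: evaporation check (Eq. 10) and raining (Eq. 12), applied here to the
    # evaporated river itself for brevity.
    for i in range(1, n_sr):
        if np.linalg.norm(pop[0] - pop[i]) < d_max or rng.random() < 0.1:
            pop[i] = lb + rng.random(dim) * (ub - lb)
            costs[i] = cost_fn(pop[i])

    # Step 11: shrink d_max to intensify the search around the sea (Eq. 14).
    d_max -= d_max / max_iter
    return pop, costs, d_max
```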



3 Chaos and chaotic maps

The concept of chaos can be described as a deterministic process observed in nonlinear, dynamic systems that is topologically mixing, non-converging, aperiodic, and bounded [42]. A dynamic system may be described, using a set of equations, as an unpredictable phenomenon that gradually evolves through time from a known initial condition. In addition, it is a mathematical fact that chaotic systems exhibit sensitive dependence on initial conditions and on the involved parameters [43]. In this regard, the nature of chaotic behavior is evidently random and unpredictable, yet regular [36]. The theory of chaos was fundamentally established based upon the observation of complex weather patterns. Furthermore, according to substantial evidence, chaos is an intrinsic characteristic of various natural processes; hence, chaotic behaviors can often be distinguished in neural systems and environmental phenomena such as rainfall patterns, climate changes, and hydrological cycles [26]. Due to these relations, the theories of nature-inspired computing and chaos are two inseparable notions. In this paper, chaos is utilized to improve the performance of the original water cycle algorithm. The rest of this section briefly reviews the non-invertible maps that generate chaotic motions for the later algorithm simulations (Fig. 1).

(a) Chebyshev map
The family of Chebyshev maps [43] is represented via:
$$x_{k+1} = \cos\left(k \cos^{-1}(x_k)\right) \quad (15)$$
where k shows the iteration number.

(b) Circle map
The circle map [44] is defined as follows:
$$x_{k+1} = x_k + b - \left(\frac{a}{2\pi}\right)\sin(2\pi x_k) \bmod (1) \quad (16)$$
For a = 0.5 and b = 0.2, the circle map generates a chaotic sequence in (0, 1).

(c) Gauss/mouse map
The Gauss map is widely utilized in the literature [45] and is represented via:
$$x_{k+1} = \begin{cases} 0 & x_k = 0 \\ \dfrac{1}{x_k} \bmod (1) & \text{otherwise} \end{cases} \quad (17)$$
$$\frac{1}{x_k} \bmod(1) = \frac{1}{x_k} - \left\lfloor \frac{1}{x_k} \right\rfloor \quad (18)$$
This map also generates chaotic sequences in (0, 1).

(d) Intermittency map
The intermittency map is expressed as [46]:
$$x_{k+1} = \begin{cases} \varepsilon + x_k + c x_k^{n} & 0 < x_k \le P \\ \dfrac{x_k - P}{1 - P} & P < x_k < 1 \end{cases} \quad (19)$$
where the parameter c is calculated as follows:
$$c = \frac{1 - \varepsilon - P}{P^{n}} \quad (20)$$

(e) Iterative chaotic map (ICMIC)
The iterative map with infinite collapses (ICMIC) [37] has the following form:
$$x_{k+1} = \sin\left(\frac{a\pi}{x_k}\right), \quad a \in (0, 1) \quad (21)$$
where a ∈ (0, 1) is an appropriate parameter.

(f) Liebovitch map
Liebovitch et al. [43] have proposed another chaotic map represented as:
$$x_{k+1} = \begin{cases} \alpha x_k & 0 < x_k \le P_1 \\ \dfrac{P_1 - x_k}{P_2 - P_1} & P_1 < x_k \le P_2 \\ 1 - \beta(1 - x_k) & P_2 < x_k \le 1 \end{cases} \quad (22)$$
where α < β and these parameters are expressed as follows:
$$\alpha = \frac{P_2}{P_1}\left(1 - (P_2 - P_1)\right) \quad (23)$$
$$\beta = \frac{1}{P_2 - 1}\left((P_2 - 1) - P_1 (P_2 - P_1)\right) \quad (24)$$
In this article, three equal limits are chosen for the Liebovitch map.

(g) Logistic map
This classic map appears in the nonlinear dynamics of biological populations evidencing chaotic behavior [47] and is formulated as
$$x_{k+1} = a x_k (1 - x_k) \quad (25)$$
where xk represents the kth chaotic number and k is the iteration number. Clearly, x ∈ (0, 1) for an initial condition x0 ∈ (0, 1) with x0 ∉ {0, 0.25, 0.5, 0.75, 1}. In the later implementation, a = 4 is used.

(h) Piecewise map
The piecewise map can be given by [48]:


$$x_{k+1} = \begin{cases} \dfrac{x_k}{P} & 0 \le x_k < P \\ \dfrac{x_k - P}{0.5 - P} & P \le x_k < 0.5 \\ \dfrac{1 - P - x_k}{0.5 - P} & 0.5 \le x_k < 1 - P \\ \dfrac{1 - x_k}{P} & 1 - P \le x_k < 1 \end{cases} \quad (26)$$
where x ∈ (0, 1) and P denotes the control parameter within the range (0, 0.5).

(i) Sawtooth map
This map [31] can be defined using a mod function:
$$x_{k+1} = 2 x_k \bmod (1) \quad (27)$$

(j) Sine map
The sine map [49] is a unimodal chaotic sequence represented by:
$$x_{k+1} = \frac{a}{4} \sin(\pi x_k), \quad 0 < a \le 4 \quad (28)$$

(k) Singer map
Singer's map [37] is a one-dimensional chaotic system that can be expressed as:
$$x_{k+1} = \mu\left(7.86 x_k - 23.31 x_k^2 + 28.75 x_k^3 - 13.302875 x_k^4\right) \quad (29)$$
where μ is a parameter in the interval [0.9, 1.08].

(l) Sinusoidal map
This map [31] is formulated as:
$$x_{k+1} = a x_k^2 \sin(\pi x_k) \quad (30)$$
For a = 2.3 and x0 = 0.7, the simplified form can be expressed via the following equation:
$$x_{k+1} = \sin(\pi x_k) \quad (31)$$

(m) Tent map
This map resembles the classic logistic map [25]. The tent map generates chaotic sequences inside (0, 1) according to the following formula:
$$x_{k+1} = \begin{cases} \dfrac{x_k}{0.7} & x_k < 0.7 \\ \dfrac{10}{3}(1 - x_k) & x_k \ge 0.7 \end{cases} \quad (32)$$

Based on the described chaotic signals, Fig. 2 illustrates the chaotic distributions over 100 iterations for the utilized chaotic signals with randomly chosen initial conditions.


Fig. 2 Chaotic value distributions during 100 iterations. a Chebyshev map, b circle map, c Gauss/mouse map, d intermittency map, e iterative map, f Liebovitch map, g logistic map, h piecewise map, i sawtooth map, j sine map, k Singer map, l sinusoidal map, m tent map


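The following short sketch shows how a few of the reviewed maps (the logistic, sinusoidal, and tent maps of Eqs. 25, 31 and 32) can be iterated to generate chaotic sequences such as those plotted in Fig. 2. The helper names are illustrative only.

```python
import numpy as np

def logistic(x):                 # Eq. (25), with a = 4
    return 4.0 * x * (1.0 - x)

def sinusoidal(x):               # Eq. (31), simplified form for a = 2.3, x0 = 0.7
    return np.sin(np.pi * x)

def tent(x):                     # Eq. (32)
    return x / 0.7 if x < 0.7 else (10.0 / 3.0) * (1.0 - x)

def chaotic_sequence(chaos_map, x0=0.7, length=100):
    """Iterate a map from x0 and return the resulting chaotic sequence."""
    seq = np.empty(length)
    x = x0
    for k in range(length):
        x = chaos_map(x)
        seq[k] = x
    return seq

# Example: the first values of a 100-step sinusoidal-map sequence.
print(chaotic_sequence(sinusoidal)[:5])
```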

4 Chaotic WCA

In this section, different improved chaotic strategies are developed to boost the performance of the standard WCA. WCA has a robust, competent performance; however, the proposed chaotic operators try to relieve the problem of premature convergence in order to achieve better search performance in the WCA optimizer. In these algorithms, different chaotic maps are incorporated into the movement steps and raining phases of WCA in order to strengthen the exploration and exploitation potentials of the conventional technique. Experimental tests also confirm the advantages of combining chaotic sequences with stochastic meta-heuristic search methodologies [43].

On the other hand, decreasing the number of parameters in meta-heuristic algorithms usually improves their exploration and exploitation abilities [35]. For example, the PSO algorithm with a linearly decreasing inertia weight can explore superior solutions with a preferable performance [10], and in the HS algorithm, a chaotically decreasing pitch adjustment rate provides better results [50]. The reason for the preferable results of these mechanisms is the fact that the explorative tendency of the swarm should be at its highest level in the beginning iterations and must be decreased as time passes. In this regard, it is known that chaotic maps in meta-heuristics can provide such a mechanism in order to improve the overall performance of the basic algorithm. These two strategies are coupled in chaotic WCA. For this purpose, chaotic linearly decreasing strategies are implemented to control the performance of WCA dynamically. The novel chaos-embedded WCA algorithms can be categorized and described as follows.

4.1 WCA with chaotic streams

In the original WCA, the rivers, which are defined as a number of the finest obtained solutions except the most excellent one (the sea), operate as leader points for guiding the streams toward better locations and for keeping the search away from inappropriate areas of the solution space [20]. This process assists WCA in the exploitation (intensification) phase [19]. With this consideration, in the first chaotic WCA (CWCA I), the C value (a predetermined parameter that affects the balance between the exploitative and explorative tendencies) is calculated by multiplying a linearly increasing operator by a chaotic signal. Therefore, the C value and the new locations of streams and rivers are updated using the following equations:

$$X_{Stream}(t+1) = X_{Stream}(t) + C(t) \times \left(X_{Sea}(t) - X_{Stream}(t)\right) \quad (33)$$

$$X_{Stream}(t+1) = X_{Stream}(t) + C(t) \times \left(X_{River}(t) - X_{Stream}(t)\right) \quad (34)$$

$$X_{River}(t+1) = X_{River}(t) + C(t) \times \left(X_{Sea}(t) - X_{River}(t)\right) \quad (35)$$

$$C(t) = \left[C_i + \left(C_f - C_i\right) \times \frac{t}{t_{max}}\right] \times chaos(t) \quad (36)$$

where Cf and Ci show the final and initial values of the linearly increasing operator. In order to determine proper values for Ci and Cf, a trial and error procedure has been performed. Based on these tests, the best values are suggested and chosen as 1 and 2, respectively. These values were recognized as the most suitable values of Ci and Cf for solving all studied optimization problems. The symbol t represents the current iteration, tmax shows the maximum number of iterations, and chaos(t) represents the utilized chaotic equation, which generates the chaotic patterns. Theoretically, it is expected that the integration of chaotic patterns with the movement operation of WCA emphasizes its exploitation in the last iterations.

4.2 WCA with chaotic evaporation and raining process

The evaporation condition resembles the mutation operator in GAs, protecting the algorithm from immature convergence to local optima [19]. Once the evaporation condition is satisfied, the raining process is performed and fresh streams are formed in diverse positions. The evaporation condition and raining process mutually provide an escape mechanism for WCA in the exploration phase. In this regard, in the new algorithm, which is called CWCA II, the chaos-based evaporation condition is checked based on the following pseudo-code:

if $\left\|X_{Sea}(t) - X_{River}^{i}(t)\right\| < chaos(t)$ or $chaos(t) < 0.1$, $\quad i = 1, 2, 3, \ldots, N_{sr}-1 \quad (37)$
    perform raining process
end if

In chaotic evaporation, dmax is determined chaotically and should be located in the range [1.00E-17, 1.00E-01]. In order to form the chaotic raining process, the rand vector of Eq. (12) for unconstrained problems is modified by the chosen chaotic maps during the iterations as:

$$X_{Stream}^{new} = LB + chaos(t) \times (UB - LB) \quad (38)$$

In addition, for constrained problems, the evaporation check is based on Eq. (39), and the μ value is computed by multiplying a linearly decreasing operator by a chaotic signal as follows:

if $\left\|X_{Sea} - X_{Stream}^{i}\right\| < chaos(t)$, $\quad i = 1, 2, 3, \ldots, NS_{n} \quad (39)$
    perform chaotic raining process
end if

$$X_{Stream}^{new} = X_{Sea} + \left[\mu(t)\right]^{0.5} \times chaos(t) \quad (40)$$

$$\mu(t) = \mu_i + \left(\mu_f - \mu_i\right) \times \frac{t}{t_{max}} \quad (41)$$

where μf and μi show the final and initial values of the linearly decreasing operator. Based on a trial and error process, their values are recommended and set as 0.1 and 0.8, respectively. The chaotic WCA algorithm with these values demonstrates an appropriate performance in solving the implemented constrained optimization tasks. In this method, the chaotic evaporation and raining operators can improve the exploration (diversification) capacity of WCA in the first iterations in order to keep the algorithm from premature convergence to partial solutions.
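As an illustration of the chaos-enhanced operators of Sects. 4.1 and 4.2, the sketch below implements the chaotic coefficient C(t) of Eq. (36) and the chaotic raining steps of Eqs. (38) and (40)–(41). Here chaos_t denotes one value drawn from the chosen chaotic map at iteration t; the parameter defaults follow the text (Ci = 1, Cf = 2, mu_i = 0.8, mu_f = 0.1), while the function names are illustrative.

```python
import numpy as np

def chaotic_C(t, t_max, chaos_t, c_i=1.0, c_f=2.0):
    """Eq. (36): linearly increasing coefficient modulated by a chaotic signal."""
    return (c_i + (c_f - c_i) * t / t_max) * chaos_t

def chaotic_raining_unconstrained(lb, ub, chaos_t):
    """Eq. (38): regenerate a stream with a chaotic (instead of uniform) factor."""
    return lb + chaos_t * (ub - lb)

def chaotic_raining_constrained(x_sea, t, t_max, chaos_t, mu_i=0.8, mu_f=0.1):
    """Eqs. (40)-(41): chaotic raining around the sea for constrained problems."""
    mu_t = mu_i + (mu_f - mu_i) * t / t_max
    return x_sea + np.sqrt(mu_t) * chaos_t
```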


4.3 WCA with chaotic streams, evaporation and raining process

In the CWCA III algorithm, the CWCA I and CWCA II variants are integrated; hence, the C and μ values are no longer two predefined constants, and the rand vectors are also modified by the selected chaotic sequences during the iterations based on the aforementioned equations (see Eqs. 33–41). It should be noted that the chaotic evaporation condition is still checked based on Eq. (37) (for unconstrained tasks) and Eq. (39) (for constrained tasks). In this chaotic algorithm, different chaotic signals may provide different diversification and intensification patterns to improve the standard WCA algorithm.

The following observations can assist in demonstrating how the chaotic operators are theoretically efficient, either individually or cooperatively:

• The chaotic movements permit CWCA to flow streams chaotically, which emphasizes more exploration in the first steps and more exploitation in the last iterations.
• The chaos-enhanced evaporation and raining operators of CWCA linearly decrease its exploration tendency as the iterations proceed.
• Various chaotic signals for the dynamic streams, evaporation, and raining process provide different exploration and exploitation tendencies for CWCA during optimization tasks.
• Since chaotic patterns exhibit chaotic motions, a generation of CWCA can emphasize either exploration or exploitation.
• In case of premature convergence to a regional best, the different chaotic operators help the algorithm leave it in later iterations.
• In case an appropriate region of the solution space has been explored, the exploitative tendency of CWCA is amplified in order to exploit better solutions.

5 Experimental results and discussion

Here, the performance of the different chaotic versions of the WCA optimizer is investigated in detail. For the experiments of this article, each algorithm is implemented using MATLAB R2012a (7.14) on a T6400@4 GHz Intel Core (TM) 2 Duo desktop PC with 4 GB of Random Access Memory (RAM). The structure of this experiment is organized as follows: First, the best chaotic signal for the modification of CWCA is determined. Second, the best chaotic strategy and the best signal establish the new chaos-enhanced WCA of this study, which is then used to tackle different optimization problems.

5.1 Selecting the best chaotic variant

In order to find the most appropriate chaos-based strategy for WCA, different criteria can be investigated. Based on the literature, these criteria are usually based on the statistical results, the total number of function evaluations, the success rate, or some combined measures. Experience shows that the success rate results of an algorithm effectively demonstrate its overall potential in finding high-quality solutions. Hence, this article assesses the potential of the different chaotic methods by comparing their success rate results. Regardless of the statistical comparisons, an algorithm with inappropriate success rate results cannot be considered a reliable and/or efficient method. The success rate criterion may be expressed as:

$$S_r = 100 \times \left(run_{successful} / run_{total}\right) \quad (42)$$

where run_total is the total number of runs and run_successful represents the number of runs that have explored the optimal solution. The criterion for a successful run can be determined as follows:

$$\sum_{i=1}^{Dim} \left(X_i^{gb} - X_i^{*}\right)^2 \le (UB - LB) \times 10^{-8} \quad (43)$$

where Dim shows the dimension of the optimization problem, X_i^{gb} represents the ith dimension of the global best solution explored by the algorithm, X_i^{*} shows the recent best solution of the optimizer, and UB and LB show the upper and lower solution space bounds, respectively.

Six standard benchmark problems are employed in 30 dimensions in order to perform the first phase of the implementations [41, 51]. The features of these problems are listed in Table 1. With entirely random initial conditions, the results are obtained over 30 runs of each algorithm. The internal parameters of the chaotic and standard WCA are chosen as: Max. iteration = 1000, Nsr = 4, Npop = 50 (so the number of function evaluations (NFEs) is equal to 5.00E+04), dmax = 1.00E-16 for unconstrained problems and 1.00E-05 for constrained problems. The success rates of the methods are presented in Tables 2, 3 and 4. For reliable comparisons, the overall success rates of the CWCA-based strategies in solving the different test problems are reflected in Table 5.
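For clarity, the success-rate criterion of Eqs. (42)–(43) can be computed as in the following sketch; the helper names are illustrative.

```python
import numpy as np

def is_successful(x_best, x_global, lb, ub):
    """Eq. (43): squared distance to the global optimum below (UB - LB) * 1e-8."""
    return np.sum((np.asarray(x_global) - np.asarray(x_best)) ** 2) <= (ub - lb) * 1e-8

def success_rate(run_results, x_global, lb, ub):
    """Eq. (42): percentage of successful runs, given one best solution per run."""
    wins = sum(is_successful(x, x_global, lb, ub) for x in run_results)
    return 100.0 * wins / len(run_results)
```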


Table 1 Benchmark problems (C characteristic, U unimodal, M multimodal, S separable, N non-separable, GM global minimum)

ID Name C Function Bounds GM
F01 Ackley MN $f(X) = -20\exp\left(-0.02\sqrt{n^{-1}\sum_{i=1}^{n} x_i^2}\right) - \exp\left(n^{-1}\sum_{i=1}^{n}\cos(2\pi x_i)\right) + 20 + e$ [-31, 32] 0.00
F02 Griewank MN $f(X) = 1 + \frac{1}{4000}\sum_{i=1}^{n} x_i^2 - \prod_{i=1}^{n}\cos\left(\frac{x_i}{\sqrt{i}}\right)$ [-150, 150] 0.00
F03 Rosenbrock UN $f(X) = \sum_{i=1}^{n-1}\left[100\left(x_{i+1} - x_i^2\right)^2 + (x_i - 1)^2\right]$ [-05, 05] 0.00
F04 Rastrigin MS $f(X) = \sum_{i=1}^{n}\left(x_i^2 - 10\cos(2\pi x_i) + 10\right)$ [-10, 10] 0.00
F05 Penalized MN $f(X) = \frac{\pi}{n}\left\{10\sin^2(\pi y_1) + \sum_{i=1}^{n-1}(y_i - 1)^2\left[1 + 10\sin^2(\pi y_{i+1})\right] + (y_n - 1)^2\right\} + \sum_{i=1}^{n} u(x_i, 10, 100, 4)$, $y_i = 1 + 0.25(x_i + 1)$, $u(x_i, a, k, m) = \begin{cases} k(x_i - a)^m & x_i > a \\ 0 & -a \le x_i \le a \\ k(-x_i - a)^m & x_i < -a \end{cases}$ [-50, 50] 0.00
F06 Dixon–Price UN $f(X) = (x_1 - 1)^2 + \sum_{i=2}^{n} i\left(2x_i^2 - x_{i-1}\right)^2$ [-10, 10] 0.00
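As a concrete example of the benchmark definitions in Table 1, the following sketch implements the Ackley function (F01) for an n-dimensional input, using the 0.02 coefficient listed in the table; the function name is illustrative.

```python
import numpy as np

def ackley(x):
    """F01 (Ackley) as listed in Table 1; global minimum f(0, ..., 0) = 0."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    term1 = -20.0 * np.exp(-0.02 * np.sqrt(np.sum(x**2) / n))
    term2 = -np.exp(np.sum(np.cos(2.0 * np.pi * x)) / n)
    return term1 + term2 + 20.0 + np.e

print(ackley(np.zeros(30)))   # approximately 0.0
```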

Table 2 Success rate of CWCA I with different chaotic maps for benchmark functions

Chaotic map F01 F02 F03 F04 F05 F06
Chebyshev 04 04 03 0 07 07
Circle 02 18 41 04 12 19
Gauss/mouse 72 24 49 87 17 28
Intermittency 73 60 38 71 34 14
ICMIC 97 71 42 92 27 21
Liebovitch 89 54 21 91 22 17
Logistic 83 44 39 71 27 18
Piecewise 85 47 39 89 29 31
Sawtooth 04 07 01 0 0 0
Sine 85 62 14 15 29 34
Singer 87 70 04 44 32 35
Sinusoidal 92 38 37 71 33 35
Tent 45 30 19 44 24 17

Table 3 Success rate of CWCA II with different chaotic maps for benchmark functions

Chaotic map F01 F02 F03 F04 F05 F06
Chebyshev 01 10 07 02 0 03
Circle 45 27 04 49 07 21
Gauss/mouse 71 20 31 78 18 27
Intermittency 75 51 36 67 14 27
ICMIC 88 27 07 72 15 35
Liebovitch 89 48 15 84 12 30
Logistic 74 37 17 13 24 25
Piecewise 87 40 25 74 19 15
Sawtooth 01 02 0 0 09 04
Sine 81 63 30 42 17 34
Singer 64 31 28 27 18 32
Sinusoidal 95 61 39 51 20 34
Tent 34 20 17 05 16 14

From Table 5, some significant points can be recognized as follows:

• The performance of WCA with chaotic streams, evaporation, and raining process (CWCA III) is significantly superior to the other chaotic strategies.
• Based on the success rate results, the sinusoidal map is the most appropriate map for the CWCA III technique.
• For CWCA I and CWCA II, the most efficacious signals are the ICMIC and sinusoidal maps, respectively.
• CWCA I with the piecewise and sinusoidal maps outperforms CWCA II with the sinusoidal signal.
• CWCA III combined with the intermittency map outperforms the best version of CWCA I with the ICMIC signal.
• The performance of the CWCA I version is overall better than that of the CWCA II algorithm.
• Whereas some chaotic signals can enhance the performance of the WCA optimizer, some of the maps, such as the sawtooth and Chebyshev maps, render the chaotic method ineffective.

Based on the success rate evaluations, the sinusoidal map is picked out as the most satisfactory pattern among the compared chaos-embedded WCA variants, rendering CWCA III the best derivative of the formal algorithm.

5.2 Statistical analysis on unconstrained problems

Following Derrac et al. [52], appropriate statistical tests should be performed to compare the performance of meta-heuristic algorithms and to substantiate the meaningful improvements of a new method over other techniques. In order to statistically evaluate CWCA, Wilcoxon's signed-rank test [53] at the 5 % significance level is conducted as a nonparametric statistical test. The p, T+, and T- values are reported in Table 12. The general consideration is that p values <0.05 can be regarded as satisfactory evidence against the null hypothesis [41].
123
Neural Comput & Applic (2017) 28:57–85 67

Note that "+" in Table 12 marks cases in which the null hypothesis was rejected and CWCA III demonstrated a statistically better performance in the problem-oriented statistical comparison tests at the 95 % significance level (T+ and T- are described in [52]). To investigate the preciseness and robustness of the proposed strategy, CWCA III with the sinusoidal map is statistically compared to the other chaotic versions and the standard WCA [19] for all benchmark problems described in Table 1. In all cases, the simulations are carried out over 30 independent runs. Tables 6, 7, 8, 9, 10, and 11 present the obtained statistics of the implemented optimizers. In these tables, the better results are highlighted in bold. Note that the optimization results of CWCA I (with the ICMIC map) and CWCA II (with the sinusoidal signal) are also reported in these tables.

Due to the simplicity of these benchmarks, the original optimizer (with Npop = 50, Nsr = 4, dmax = 1.00E-16, Max. iteration = 1000) demonstrates a competent performance in solving the F01–F06 problems. For the F01 problem, the original WCA can provide the best solutions with zero standard deviation (SD) for dimension 30 and even higher dimensions. CWCA III is not statistically better than the original method in solving the F02 or F05 problems, but the success ratio and running time of the improved method are still slightly better than those of the WCA technique in all of the tests. For instance, the CPU time spent by CWCA III in solving F01 is 16.21 s compared to 17.04 s for WCA (with 5.00E+04 NFEs). The overall success rate of CWCA III is 394, while this value for WCA is 371. With respect to these values and the robust performance of the standard WCA, it can be recognized that the conventional WCA with proper chaotic operations is still capable of demonstrating better performance on unconstrained problems. These results also affirm the improved exploration and exploitation capacities of CWCA III for tackling unconstrained tasks.

From Tables 6, 7, 8, 9, 10, and 11, it can be seen that the improved CWCA III demonstrates better performance in terms of CPU running time compared to the other chaotic methods. The simulations verify that the new CWCA III method is also capable of statistically outperforming the CWCA I and CWCA II techniques. The improvement is apparent in the average, minimum, maximum, and standard deviation of the achieved optimal solutions. The results in Table 12 also confirm that CWCA has explored statistically better results compared to the other chaotic techniques, with a statistical significance value α = 0.05.

Table 4 Success rate of CWCA III with different chaotic maps for benchmark functions

Chaotic map F01 F02 F03 F04 F05 F06
Chebyshev 01 16 04 0 11 24
Circle 54 38 20 57 13 17
Gauss/mouse 77 32 29 51 24 14
Intermittency 92 84 62 84 20 18
ICMIC 100 51 44 98 17 21
Liebovitch 94 62 27 87 28 25
Logistic 74 64 31 44 12 36
Piecewise 100 47 41 17 22 34
Sawtooth 08 10 0 07 10 09
Sine 94 72 34 24 13 05
Singer 81 79 57 72 18 22
Sinusoidal 94 92 59 87 27 35
Tent 61 37 27 17 12 18

Table 5 Comparison of the overall success rates for all benchmark problems

Chaotic map name CWCA I CWCA II CWCA III
Chebyshev 25 23 56
Circle 96 153 199
Gauss/mouse 277 245 227
Intermittency 290 270 360
ICMIC 350^a 244 331
Liebovitch 294 278 323
Logistic 282 190 261
Piecewise 320 260 261
Sawtooth 12 16 44
Sine 239 267 242
Singer 272 200 329
Sinusoidal 306 300^a 394^a
Tent 179 106 172
^a The best maps (overall success rate of the standard method is 371)

5.3 Solving constrained benchmark problems

In this section, the new improved algorithm is benchmarked on 13 well-known constrained problems (G1–G13). The equations of these constrained problems are presented in the "Appendix". The details of the implemented problems are summarized in Table 13. The objective values are obtained by performing 30 independent runs of the algorithms. The best results computed by CWCA and WCA for G1–G13 are reported in Tables 14 and 15, respectively.

The best solutions in these tables indicate the capability of a strategy to explore the optimal result, while the average results and standard deviation values reflect the robustness of the meta-heuristic. From the results in Table 14, it can be vividly recognized that the new improved algorithm is able to explore the global solutions in all benchmark problems with a robust performance. The results of WCA are presented in Table 15. From Tables 14 and 15, it can be seen that the chaotic algorithm reached a better or equal average running time than the original WCA. In addition, the last column in Table 14 shows the statistical significance


Table 6 Results of different methods for F01 function

Method Min (best) Average Max (worst) SD^a Time (s)
WCA 8.8817841E-16 8.8817841E-16 8.8817841E-16 0.00E+00 17.04
CWCA I 6.1107470E-11 6.1414771E-11 6.5217841E-11 1.00E-12 18.41
CWCA II 7.3121467E-10 8.8817841E-10 8.8817841E-10 1.12E-10 19.21
CWCA III 4.0017841E-16 7.8817841E-16 9.8817841E-16 0.00E+00 16.21
^a SD means standard deviation

Table 7 Results of different methods for F02 function

Method Min (best) Average Max (worst) SD Time (s)
WCA 0.00E+00 1.369275E-16 4.107825E-15 7.499825E-16 09.69
CWCA I 1.00E+00 1.749307E+01 3.774882E+01 5.700014E+01 10.87
CWCA II 4.81E+00 4.401999E+01 8.710300E+01 4.391023E+01 14.87
CWCA III 0.00E+00 1.982558E-04 5.456772E-04 8.741476E-04 09.11

Table 8 Results of different methods for F03 function

Method Min (best) Average Max (worst) SD Time (s)
WCA 0.00E+00 7.98794E-01 2.396396E+01 4.375200E+00 12.05
CWCA I 1.001E-08 4.40573E-01 1.921420E+02 5.179984E+01 15.06
CWCA II 1.970E+00 1.79421E+00 8.170024E+03 1.145122E+02 17.01
CWCA III 0.00E+00 2.78101E-01 1.006187E+01 2.485411E+00 11.70

Table 9 Results of different methods for F04 function

Method Min (best) Average Max (worst) SD Time (s)
WCA 0.00E+00 3.946670E+00 2.984877E+01 1.023530E+01 08.41
CWCA I 1.44E-17 1.258171E+01 4.893281E+02 8.941258E+01 12.14
CWCA II 1.01E+00 2.124885E+01 1.147415E+01 1.114753E+01 13.41
CWCA III 0.00E+00 1.761741E+00 3.147422E+00 1.198623E+00 07.19

Table 10 Results of different methods for F05 function

Method Min (best) Average Max (worst) SD Time (s)
WCA 0.00E+00 9.238908E-20 1.943671E-18 3.808960E-19 09.32
CWCA I 1.18E-14 1.963145E+01 2.445075E+01 7.110445E+00 12.41
CWCA II 4.22E-08 3.881472E+01 5.158978E+01 2.433325E+01 12.22
CWCA III 0.00E+00 1.991701E-16 2.102863E-16 1.954012E-17 09.00

Table 11 Results of different methods for F06 function

Method Min (best) Average Max (worst) SD Time (s)
WCA 2.451359E-09 8.888895E-02 6.666666E-01 2.304972E-01 06.60
CWCA I 1.417771E-03 1.548852E+01 2.114742E+02 2.741117E+01 09.18
CWCA II 3.419985E+01 4.774112E+01 7.217741E+01 3.110477E+01 11.74
CWCA III 0.00E+00 1.314921E-02 3.116433E-01 1.014366E-01 06.21

level derived from a paired one-tailed Student's t test [52] with 98 degrees of freedom (df) at the 0.05 level of significance between the conventional and the chaotic method. With regard to the obtained statistical results, the new strategy demonstrates an improved performance on 10 benchmarks compared to the WCA optimizer. Table 16 shows the statistical results of the other investigated algorithms in comparison with the obtained results of CWCA III. Considering the best solutions in Table 16, it can be recognized that the new improved algorithm can obtain practically high-quality or comparable solutions for these problems compared to the previously proposed techniques.
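A sketch of the paired Student's t test mentioned above is given below, applied to the per-run objective values of the conventional and chaotic methods. SciPy's ttest_rel returns a two-sided p value, which is halved here for the one-tailed comparison; the array and function names are illustrative.

```python
from scipy.stats import ttest_rel

def paired_one_tailed(wca_results, cwca_results, alpha=0.05):
    """True if the chaotic method is significantly better (lower objective) than WCA."""
    stat, p_two_sided = ttest_rel(cwca_results, wca_results)
    p_one_tailed = p_two_sided / 2.0
    better = stat < 0          # negative statistic: chaotic results are lower (better)
    return (better and p_one_tailed < alpha), p_one_tailed
```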


Table 12 Identifying the method that is capable of statistically computing the best results, using the two-sided Wilcoxon signed-rank test with α = 0.05 (W: winner)

Problem | CWCA III versus WCA (p value, T+, T-, W) | CWCA III versus CWCA I (p value, T+, T-, W) | CWCA III versus CWCA II (p value, T+, T-, W)
F01 | 0.202, 12, 9, = | 0.031, 17, 4, + | 0.0003, 21, 0, +
F02 | 0.177, 6, 15, = | 0.022, 21, 0, + | 0.035, 20, 1, +
F03 | 0.723, 8, 13, = | 0.000183, 20, 1, + | 0.002, 21, 0, +
F04 | 0.812, 14, 7, = | 0.0032, 21, 0, + | 0.001, 21, 0, +
F05 | 0.042, 4, 17, - | 0.000183, 21, 0, + | 0.000183, 21, 0, +
F06 | 0.721, 3, 18, = | 0.000183, 21, 0, + | 0.0002, 20, 1, +

Table 13 Characteristics of the implemented problems [54]

ID Function type Problem type No. of variables No. of constraints No. of active constraints q (%)^a Global optimum
G1 Quadratic Min 13 9 [I] + 0 [E] 6 0.0003 -15
G2 Nonlinear Max 20 2 [I] + 0 [E] 1 99.9962 -0.803619
G3 Polynomial Max 10 0 [I] + 1 [E] 1 0.0002 1
G4 Quadratic Min 5 6 [I] + 0 [E] 2 26.9089 -30,665.539
G5 Cubic Min 4 2 [I] + 3 [E] 3 0 5126.4981
G6 Cubic Min 2 2 [I] + 0 [E] 2 0.0065 -6961.81388
G7 Quadratic Min 10 8 [I] + 0 [E] 6 0.0001 24.30621
G8 Nonlinear Max 2 2 [I] + 0 [E] 0 0.8488 0.095825
G9 Polynomial Min 7 4 [I] + 0 [E] 2 0.5319 680.6300573
G10 Linear Min 8 6 [I] + 0 [E] 3 0.0005 7049.3307
G11 Quadratic Min 2 0 [I] + 1 [E] 1 0.0099 0.75
G12 Quadratic Min 3 93 [I] + 0 [E] 0 4.7452 1
G13 Exponential Min 5 0 [I] + 3 [E] 3 0 0.0539498
^a The ratio of the size of the feasible solution space to the size of the entire solution space

Table 14 Best results obtained by new method for constrained benchmark problems

ID Global optimum Best Average Worst SD Average run time (s) p value (CWCA vs. WCA) Paired t test
G1 -15 -15.00000102 -14.67319699 -12.81320103 1.40E+00 13.4180237 <0.001 Extremely significant
G2 -0.803619 -0.770041722 -0.512411047 -0.254140793 1.80E-01 12.5922403 3.10E-01 Not quite significant
G3 1 1.004091113 0.923079924 0.934110741 2.31E-01 13.8730114 2.34E-02 Very significant
G4 -30,665.539 -30,665.5760 -30,665.5212 -30,665.5010 2.01E-04 13.4907522 <0.001 Extremely significant
G5 5126.4981 5127.871027 5401.044243 6113.100476 4.97E+02 10.0019772 2.48E-02 Very significant
G6 -6961.81388 -6963.914303 -6963.617273 -6963.501223 1.83E-04 08.9427411 <0.001 Extremely significant
G7 24.30621 24.39271859 27.998076241 31.41437829 1.97E+00 13.4767477 <0.001 Extremely significant
G8 0.095825 0.095825037 0.095825037 0.095825037 7.17E-14 09.9974280 4.71E-02 Very significant
G9 680.6300573 680.6430774 680.7710412 680.9607224 7.87E-02 10.5020937 4.73E-02 Very significant
G10 7049.3307 7102.710279 9512.110245 18,321.00114 2.37E+03 09.5141023 <0.001 Extremely significant
G11 0.75 0.748819932 0.748900043 0.748943712 1.12E+00 09.9862918 2.01E-01 Not quite significant
G12 1 1.000000000 1.000000000 1.000000000 1.01E-17 27.3101762 2.43E-01 Not quite significant
G13 0.0539498 0.512074938 1.513117413 10.41396776 3.77E+00 09.5235221 <0.001 Extremely significant


Table 15 Best results obtained by WCA optimizer for constrained benchmark problems
ID Global optimum Best Average Worst SD Average run time (s)

G1 -15 -15.00149388 -14.6742245 -12.7613484 1.62135775 13.4275671


G2 -0.803619 -0.77028617 -0.469827578 -0.25306993 0.13710059 13.6898369
G3 1 1.005032741 0.936548742 0.93632842 0.25458992 13.9045188
G4 -30,665.539 -30,666.75111 -30,666.74930 -30,666.7374 0.00355035 13.5006919
G5 5126.4981 5128.879769 5442.659978 6112.216177 527.812787 10.4776145
G6 -6961.81388 -6964.139666 -6964.136649 -6964.129661 0.00242975 09.3291432
G7 24.30621 24.47779949 28.105921756 32.79760011 2.38927372 13.5131228
G8 0.095825 0.095825041 0.095825041 0.095825041 3.5229E-13 10.1286982
G9 680.6300573 680.6517860 680.7722062 680.9591332 0.07991533 10.5107772
G10 7049.3307 7109.138173 9531.775360 18,342.00464 2443.65135 09.5257118
G11 0.75 0.748900000 0.748912936 0.748954221 1.52349079 10.3258060
G12 1 1.000000000 1.000000000 1.000000000 1.2711E-16 27.3580122
G13 0.0539498 0.557585296 1.784293687 21.09655666 3.76118175 09.5632694

5.4 Solving constrained engineering problems

Due to the nonlinearity of multi-constrained design problems, several algorithms have been developed to solve them in an efficient way. In this section, CWCA III is evaluated against previous robust methods by using a set of well-studied engineering problems, including pressure vessel design, welded beam design, compression spring design, and the Himmelblau problem. These design test cases have previously been optimized by a wide variety of meta-heuristic strategies. Hence, these types of problems are also appropriate for statistically substantiating the effectiveness of the new improved CWCA III technique. In these implementations, the same constraint handling methodology as utilized in the conventional WCA meta-heuristic [19] is employed to tackle the constrained problems. The rest of this section is devoted to the implementation of CWCA III for solving the following problems.

5.4.1 Pressure vessel design

The first test case is the pressure vessel design challenge, which is regarded as a well-known structural optimization task. A schematic of a cylindrical vessel capped at both ends by hemispherical heads is sketched in Fig. 3. This air tank has a minimum volume of 750 ft³ and a working pressure of 3000 psi, and is designed according to the American Society of Mechanical Engineers (ASME) boiler and pressure vessel code. The main objective of this problem is to minimize the total cost, involving the cost of welding, material, and forming. The design variables associated with this problem are the thickness of the pressure vessel, Ts = x1, the thickness of the head, Th = x2, the inner radius of the vessel, R = x3, and the length of the vessel without the heads, L = x4. Ts and Th are integer multiples of 0.0625 in., the available thickness of rolled steel plates, while R and L are continuous. Therefore, the mathematical model of the pressure vessel problem can be expressed as follows:

$$\min_{x} f(X) = 0.6224 x_1 x_3 x_4 + 1.7781 x_2 x_3^2 + 3.1661 x_1^2 x_4 + 19.84 x_1^2 x_3 \quad (44)$$

$$\text{s.t.} \quad g_1(X) = -x_1 + 0.0193 x_3 \le 0$$
$$g_2(X) = -x_2 + 0.00954 x_3 \le 0$$
$$g_3(X) = -\pi x_3^2 x_4 - \frac{4}{3}\pi x_3^3 + 1{,}296{,}000 \le 0$$
$$g_4(X) = x_4 - 240 \le 0 \quad (45)$$

In several works, this problem has been solved considering the following variable bounds:

$$\text{Region I}: \quad 1 \times 0.0625 \le x_1, x_2 \le 99 \times 0.0625, \quad 10 \le x_3, x_4 \le 200 \quad (46)$$

Up to now, several techniques have been utilized to solve the described problem, including improved ACO [61], genetic adaptive search [62], a branch and bound technique [63], a GA-based co-evolution model [64], the augmented Lagrangian multiplier method [65], co-evolutionary PSO [66], evolution strategies [67], feasibility-based tournament selection [68], CS [69], a hybrid algorithm based on PSO with passive congregation [70], and quantum-behaved PSO [71]. The comparative results for Region I are presented in Table 17, while the corresponding statistical results are summarized in Table 18.
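As an illustration of the formulation in Eqs. (44)–(45), the sketch below codes the pressure vessel cost function and its four inequality constraints (g_i(X) ≤ 0 means feasible). The simple static-penalty wrapper is only one possible way to hand the problem to an optimizer; the paper itself uses the constraint-handling scheme of the original WCA [19], and the penalty factor here is an illustrative assumption.

```python
import numpy as np

def pressure_vessel_cost(x):                                   # Eq. (44)
    x1, x2, x3, x4 = x
    return (0.6224 * x1 * x3 * x4 + 1.7781 * x2 * x3**2
            + 3.1661 * x1**2 * x4 + 19.84 * x1**2 * x3)

def pressure_vessel_constraints(x):                            # Eq. (45), g_i(x) <= 0
    x1, x2, x3, x4 = x
    return np.array([
        -x1 + 0.0193 * x3,
        -x2 + 0.00954 * x3,
        -np.pi * x3**2 * x4 - (4.0 / 3.0) * np.pi * x3**3 + 1_296_000.0,
        x4 - 240.0,
    ])

def penalized_cost(x, factor=1e6):
    """Illustrative static-penalty objective combining cost and constraint violation."""
    violation = np.maximum(pressure_vessel_constraints(x), 0.0).sum()
    return pressure_vessel_cost(x) + factor * violation
```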


Table 16 Statistical features of the best results computed by different approaches (NA means not available)
ID and Optimum | Statistical features | Montes and Coello [55] | Becerra and Coello [56] | Gandomi et al. [54] | Tessema and Yen [57] | Hedar and Fukushima [58] | Koziel and Michalewicz [59] | Cabrera and Coello [60] | Present study

G1 Best -15.00000 -15.00000 -15.00000 -15.00000 -14.99911 -14.78640 -15.00000 -15.00000102


-15.00000 Average -14.79200 -15.00000 -14.65818 -14.96600 -14.99332 -14.70820 13.27340 -14.67319699
Worst -12.74300 -14.99999 -12.45309 -13.09700 -14.97998 -14.61540 -9.70120 -12.81320103
SD NA 2.00E-06 7.05E-01 7.00E-01 4.81E-03 NA 1.41E?00 1.40E?00
G2 Best 0.803619 0.803619 0.803619 -0.803202 0.754913 0.799530 0.803620 0.770041722
0.803619 Average 0.746236 0.724886 0.753470 -0.789398 0.371708 0.796710 0.777143 0.512411047
Worst 0.302179 0.590908 0.562553 -0.745712 0.271311 0.791190 0.711603 0.254140793
SD NA 7.01E-02 5.40E-02 1.33E-01 9.80E-02 NA 1.91E-02 1.80E-01
G3 Best 1.00000 0.99541 0.99998 -1.00000 1.00000 0.99970 1.00040 1.004091113
1.00000 Average 0.64000 0.78864 0.98955 -0.97100 0.99919 0.99890 0.99360 0.923079924
Worst 0.02900 0.63992 0.93641 -0.88700 0.99152 0.99780 0.66740 0.934110741
SD NA 1.15E-01 1.71E-02 3.01E-01 1.65E-03 NA 4.71E?02 2.31E-01
G4 Best -30,665.54 -30,665.54 -30,665.54 -30,665.40 -30,665.54 -30,664.50 30,665.54 -30,666.75111
-30,665.54 Average -30,592.15 -30,665.54 -30,665.54 -30,663.92 -30,665.47 -30,655.30 30,665.54 -30,665.5212
Worst -29,986.21 -30,665.54 -30,665.54 -30,656.47 -30,664.69 -30,645.90 30,665.53 -30,665.5010
SD NA 0.00E?00 0.00E?00 2.04E?00 1.73E-01 NA 6.83E?04 2.01E-04
G5 Best 5126.497 5126.571 5126.499 5126.907 5126.498 NA 5126.647 5127.871027
5126.498 Average 5218.729 5207.411 5129.425 5208.897 5126.498 NA 5495.239 5401.044243
Worst 5502.410 5327.390 5181.474 5564.642 5126.498 NA 6272.742 6113.100476
SD NA 6.92E?01 1.09E?01 2.47E?02 0.00E?00 NA 4.05E?02 4.973E?02
G6 Best -6961.814 -6961.814 -6961.814 -6961.046 -6961.814 -6952.100 6961.837 -6963.914303
-6961.814 Average -6367.575 -6961.814 -6961.814 -6953.823 -6961.814 -6342.600 6961.837 -6963.617273
Worst -2236.950 -6961.814 -6961.814 -6943.304 -6961.814 -5473.900 6961.836 -6963.501223
SD NA 0.00E?00 0.00E?00 5.88E?00 0.00E?00 NA 2.61E?04 1.83E-04
G7 Best 24.3060 24.3062 24.3062 24.8380 24.3106 24.6200 24.3278 24.39271859
24.3060 Average 104.5990 24.3062 24.3065 25.4150 24.3795 24.8260 24.6996 27.998076241
Worst 1120.5410 24.3062 24.3080 33.0950 24.6444 25.0690 25.2962 31.41437829
SD NA 1.00E-06 3.80E-04 2.17E?00 7.16E-02 NA 2.52E?01 1.97E?00
G8 Best 0.095825 0.095825 0.095825 -0.095825 0.095825 0.095825 0.095825 0.095825037
0.095825 Average 0.091292 0.095825 0.095825 -0.095825 0.095825 0.089157 0.095825 0.095825037
Worst 0.027188 0.095825 0.095825 -0.092697 0.095825 0.029144 0.095825 0.095825037
SD NA 0.00E?00 0.00E?00 1.06E–03 0.00E?00 NA 0.00E?00 7.17E-14
G9 Best 680.6300 680.6301 680.6301 680.7700 680.6301 680.9100 680.6307 680.6430774
680.6300 Average 692.4720 680.6301 680.6301 681.2400 680.6364 681.1600 680.6391 680.7710412
Worst 839.7800 680.6301 680.6301 682.0800 680.6983 683.1800 680.6671 680.9607224
SD NA 0.00E?00 0.00E?00 3.22E-01 1.45E-02 NA 6.68E?03 7.87E-02
G10 Best 7049.248 7049.248 7049.261 7069.981 7059.864 7147.900 7090.452 7102.710279
7049.248 Average 8442.660 7049.248 7049.471 7201.017 7509.321 8163.600 7747.630 9512.110245
Worst 15,580.370 7049.248 7051.782 7489.406 9398.649 9659.300 10,533.666 18,321.00114
SD NA 1.67E-04 4.91E-01 1.38E?02 5.42E?02 NA 5.52E?02 2.377E?03
G11 Best 0.750000 0.749900 0.750000 0.750000 0.749999 0.750000 0.749900 0.748819932
0.750000 Average 0.760000 0.757995 0.750000 0.750000 0.749999 0.750000 0.767300 0.748900043
Worst 0.870000 0.796455 0.750000 0.760000 0.749999 0.750000 0.992500 0.748943712
SD NA 1.71E-02 1.00E-08 2.00E-03 0.00E?00 NA 6.00E-02 1.12E?00
G12 Best 1.000000 1.000000 1.000000 -1.000000 -1.000000 -1.000000 1.000000 1.000000000
1.000000 Average 1.000000 1.000000 1.000000 -1.000000 -1.000000 -0.999135 1.000000 1.000000000


Table 16 continued
ID and Optimum | Statistical features | Montes and Coello [55] | Becerra and Coello [56] | Gandomi et al. [54] | Tessema and Yen [57] | Hedar and Fukushima [58] | Koziel and Michalewicz [59] | Cabrera and Coello [60] | Present study

Worst 1.000000 1.000000 1.000000 -1.000000 -1.000000 -0.991950 1.000000 1.000000000


SD NA 0.00E?00 0.00E?00 1.41E-04 0.00E?00 NA 0.00E?00 1.012E-17
G13 Best 0.053866 0.056180 0.053950 0.053941 0.053950 NA 0.059410 0.512074938
0.053950 Average 0.747227 0.288324 0.053952 0.054713 0.297720 NA 0.813350 1.513117413
Worst 2.259875 0.392100 0.053972 0.885276 0.438851 NA 2.444150 10.41396776
SD NA 1.67E-01 5.65E-06 0.275000 1.89E-01 NA 3.81E?01 3.77E?00

Fig. 3 Pressure vessel. a Schematic, b stress heat map, c displacement heat map [51]

Referring to the results, the solutions yielded by CWCA III are better compared to the original WCA and the other investigated optimizers. Hence, the improved CWCA III strategy is able to demonstrate a competent performance in tackling this constrained problem.

In order to examine the entire constrained domain, the upper limit of the variable x4 has been extended to 240 [82], i.e., the range of the decision variables is updated to:

$$\text{Region II}: \quad 1 \times 0.0625 \le x_1, x_2 \le 99 \times 0.0625, \quad 10 \le x_3 \le 200, \quad 10 \le x_4 \le 240 \quad (47)$$

In this region, Gandomi et al. [81], Dimopoulos [82], Mahdavi et al. [50], and Hedar and Fukushima [58] solved the problem by the previously mentioned methods. Table 17 lists the best solution vectors attained by the different methods, including CWCA III, in Region II. From Table 18, it can be seen that the performance of CWCA III is still better than some of the other optimizers. The solutions obtained by the modified WCA are preferable compared to the WCA strategy. The time elapsed for one run of the CWCA III optimizer in Regions I and II is 7.901 s and 8.127 s, respectively.

5.4.2 Welded beam design

Welded beam design, which is depicted in Fig. 4, is a fundamental engineering problem frequently described throughout the literature to evaluate the performance of different optimizers [83]. In this problem, the primary objective is to find the minimum fabricating cost of the welded beam structure subject to constraints on the shear stress (τ), bending stress (σ), buckling load (Pc), end deflection (δ), and side restrictions. The variables of the problem are the thickness of the weld, h = x1, the length of the welded joint, l = x2, the width of the beam, t = x3, and the thickness of the beam, b = x4. Hence, the decision vector is formed as X = (h, l, t, b) = (x1, x2, x3, x4). For this problem, the mathematical formulation can be written as follows:

$$\min_{x} f(X) = 1.10471 x_1^2 x_2 + 0.04811 x_3 x_4 (14 + x_2) \quad (48)$$

$$\text{s.t.} \quad g_1(X) = \tau(X) - \tau_{max} \le 0$$
$$g_2(X) = \sigma(X) - \sigma_{max} \le 0$$
$$g_3(X) = x_1 - x_4 \le 0$$
$$g_4(X) = 0.125 - x_1 \le 0$$
$$g_5(X) = \delta(X) - 0.25 \le 0$$
$$g_6(X) = P - P_c(X) \le 0 \quad (49)$$

$$0.1 \le x_1 \le 2, \quad 0.1 \le x_2 \le 10, \quad 0.1 \le x_3 \le 10, \quad 0.1 \le x_4 \le 2 \quad (50)$$

where τ is the shear stress in the weld, τmax is the maximum allowable shear stress of the weld, σ is the normal stress on the beam, σmax is the allowable normal stress of the beam material, Pc stands for the bar buckling load, P is the load, and δ is the beam end deflection. The shear stress τ has two components, the primary stress (τ1) and the secondary stress (τ2), and is given as:


Table 17 Comparison of the best solution for pressure vessel design problem found by different methods
Region Method Design variables Cost
x1 x2 x3 x4 f(X)

I Sandgren [63] 1.125000 0.6250 47.700000 117.701000 8129.1036


Deb and Gene [62] 0.937500 0.5000 48.329000 112.67900 6410.3811
Hu et al. [72] 0.8125 0.4375 42.098450 176.6366000 6059.131296
Zhang and Wang [73] 1.1250 0.6250 58.290000 43.6930000 7197.7000
Kaveh and Talatahari [61] 0.8125 0.4375 42.098353 176.637751 6059.7258
Montes and Coello [67] 0.8125 0.4375 42.098087 176.640518 6059.7456
Gandomi et al. [69] 0.8125 0.4375 42.0984456 176.6365958 6059.7143348
Kaveh and Talatahari [70] 0.8125 0.4375 42.103566 176.573220 6059.0925
Lee and Geem [74] 1.1250 0.6250 58.278900 43.75490000 7198.433
Montes et al. [75] 0.8125 0.4375 42.098446 176.6360470 6059.701660
Coello [64] 0.8125 0.4375 40.323900 200.000000 6288.7445
Akay and Karaboga [76] 0.8125 0.4375 42.098446 176.636596 6059.714339
Cagnina et al. [77] 0.8125 0.4375 42.098445 176.6365950 6059.714335
He and Wang [66] 0.8125 0.4375 42.091266 176.746500 6061.0777
Coello and Montes [68] 0.8125 0.4375 42.097398 176.654050 6059.946
Kannan and Kramer [65] 1.1250 0.6250 58.291000 43.690000 7198.0428
Coelho [71] 0.8125 0.4375 42.098400 176.6372000 6059.7208
He et al. [78] 0.8125 0.4375 42.098445 176.6365950 6059.7143
Akhtar et al. [79] 0.8125 0.4375 41.9768 182.2845 6171.000
Mirjalili et al. [80] 0.8125 0.4375 42.0907382 176.738690 6060.8066
Mirjalili et al. [51] 0.8125 0.4345 42.089181 176.758731 6051.5639
Sadollah et al. [22] NA NA NA NA 5885.3327
Original WCA 0.777168 0.383649 40.319618 200.0000000 5875.1957629
Present study 0.777102 0.383127 40.319531 200.0000000 5873.1874916
II Mahdavi et al. [50] 0.75 0.375 38.86010 221.36553 5849.76169
Gandomi et al. [81] 0.75 0.375 38.86010 221.36547 5850.38306
Hedar and Fukushima [58] 0.7683257 0.3797837 39.8096222 207.2255595 5868.764836
Dimopoulos [82] 0.75 0.375 38.86010 221.36549 5850.38306
Original WCA 0.728028 0.359359 37.773496 238.757085 5796.1850718
Present study 0.727612 0.359656 37.771023 238.510803 5788.6492866
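To make the cost figures in Table 17 easy to check, a minimal Python sketch of the pressure vessel cost evaluation is given below. The formulation itself is not restated in this section of the paper; the snippet assumes the standard pressure vessel model used throughout the cited studies (x1 shell thickness, x2 head thickness, x3 inner radius, x4 cylinder length), so it should be read as an illustrative reconstruction rather than the authors' implementation.

```python
import math

def pressure_vessel_cost(x):
    """Fabrication cost of the cylindrical pressure vessel; x = (x1, x2, x3, x4)."""
    x1, x2, x3, x4 = x
    return (0.6224 * x1 * x3 * x4 + 1.7781 * x2 * x3 ** 2
            + 3.1661 * x1 ** 2 * x4 + 19.84 * x1 ** 2 * x3)

def pressure_vessel_constraints(x):
    """Inequality constraints of the standard formulation, each expected <= 0."""
    x1, x2, x3, x4 = x
    return [
        -x1 + 0.0193 * x3,                                   # minimum shell thickness
        -x2 + 0.00954 * x3,                                  # minimum head thickness
        -math.pi * x3 ** 2 * x4 - (4.0 / 3.0) * math.pi * x3 ** 3 + 1296000.0,  # volume requirement
        x4 - 240.0,                                          # length limit
    ]

# Widened Region II bounds of Eq. (47); Region I uses the narrower ranges given earlier in the paper.
REGION_II_BOUNDS = [(0.0625, 99 * 0.0625), (0.0625, 99 * 0.0625), (10.0, 200.0), (10.0, 240.0)]

# "Present study" Region II solution from Table 17
x_best = (0.727612, 0.359656, 37.771023, 238.510803)
print(round(pressure_vessel_cost(x_best), 4))   # ~5788.65, matching the tabulated cost
print(pressure_vessel_constraints(x_best))      # constraint values at that point
```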

$\tau(X) = \sqrt{\tau_1^2 + 2\tau_1\tau_2\,\frac{x_2}{2R} + \tau_2^2},\quad \tau_1 = \frac{P}{\sqrt{2}\,x_1 x_2},\quad \tau_2 = \frac{MR}{J}$   (51)

where

$M = P\left(L + \frac{x_2}{2}\right),\quad J(X) = 2\left\{\sqrt{2}\,x_1 x_2\left[\frac{x_2^2}{12} + \left(\frac{x_1 + x_3}{2}\right)^2\right]\right\}$   (52)

$R = \sqrt{\frac{x_2^2}{4} + \left(\frac{x_1 + x_3}{2}\right)^2},\quad \sigma(X) = \frac{6PL}{x_4 x_3^2},\quad \delta(X) = \frac{4PL^3}{E x_3^3 x_4},\quad P_c(X) = \frac{4.013\sqrt{EG x_3^2 x_4^6/36}}{L^2}\left(1 - \frac{x_3}{2L}\sqrt{\frac{E}{4G}}\right)$   (53)

$G = 12\times10^6\ \mathrm{psi},\quad E = 30\times10^6\ \mathrm{psi},\quad P = 6000\ \mathrm{lb},\quad L = 14\ \mathrm{in},\quad \sigma_{\max} = 30000\ \mathrm{psi},\quad \tau_{\max} = 13600\ \mathrm{psi},\quad \delta_{\max} = 0.25\ \mathrm{in}$   (54)

Hwang and He [84] and Deb [85] solved the welded beam design problem by using GA-based approaches. Lee and Geem [74] and He et al. [78] employed HS and PSO to solve the described problem, respectively. Ragsdell and Phillips [86] compared the optimal results of the APPROX, DAVID, SIMPLEX, and RANDOM algorithms. The optimal solutions discovered by the proposed improved technique are listed in Table 19, under the version I section, along with the best solutions obtained by some other compared methods [50, 75, 82, 85–88].
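To help verify the version I entries of Tables 19 and 20, a minimal Python sketch of the objective and constraint evaluation in Eqs. (48)–(54) is given below. The function and variable names are illustrative only; this is a reconstruction from the formulas above, not the authors' code.

```python
import math

# Problem constants from Eq. (54)
P, L, E, G = 6000.0, 14.0, 30e6, 12e6
TAU_MAX, SIGMA_MAX, DELTA_MAX = 13600.0, 30000.0, 0.25

def welded_beam_cost(x):
    """Fabrication cost of Eq. (48); x = (h, l, t, b)."""
    x1, x2, x3, x4 = x
    return 1.10471 * x1 ** 2 * x2 + 0.04811 * x3 * x4 * (14.0 + x2)

def welded_beam_constraints(x):
    """Inequality constraints g1..g6 of Eq. (49), each expected <= 0."""
    x1, x2, x3, x4 = x
    M = P * (L + x2 / 2.0)                                                  # Eq. (52)
    R = math.sqrt(x2 ** 2 / 4.0 + ((x1 + x3) / 2.0) ** 2)                   # Eq. (53)
    J = 2.0 * (math.sqrt(2.0) * x1 * x2 * (x2 ** 2 / 12.0 + ((x1 + x3) / 2.0) ** 2))
    tau1 = P / (math.sqrt(2.0) * x1 * x2)
    tau2 = M * R / J
    tau = math.sqrt(tau1 ** 2 + 2.0 * tau1 * tau2 * x2 / (2.0 * R) + tau2 ** 2)  # Eq. (51)
    sigma = 6.0 * P * L / (x4 * x3 ** 2)
    delta = 4.0 * P * L ** 3 / (E * x3 ** 3 * x4)
    pc = (4.013 * math.sqrt(E * G * x3 ** 2 * x4 ** 6 / 36.0) / L ** 2
          * (1.0 - x3 / (2.0 * L) * math.sqrt(E / (4.0 * G))))
    return [tau - TAU_MAX, sigma - SIGMA_MAX, x1 - x4,
            0.125 - x1, delta - DELTA_MAX, P - pc]

# Version I solution reported for the present study in Table 19
x_best = (0.24436271, 6.22147416, 8.29028563, 0.24436790)
print(round(welded_beam_cost(x_best), 5))        # ~2.3813, matching the tabulated cost
print(welded_beam_constraints(x_best))           # the stress and buckling constraints are essentially active here
```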


Table 18 Statistical results of different methods for pressure vessel problem


Region Method Best Average Worst SD Median

I Sandgren [63] 8129.1036 NA NA NA NA


Cagnina et al. [77] 6059.714335 6092.0498 NA 12.1725 NA
Coelho [71] 6059.7208 6440.3786 7544.4925 448.4711 6257.5943
Montes and Coello [67] 6059.7456 6850.0049 7332.8798 426.0000 NA
Kaveh and Talatahari [61] 6059.7258 6081.7812 6150.1289 67.2418 NA
He and Wang [66] 6061.0777 6147.1332 6363.8041 86.4545 NA
Kaveh and Talatahari [70] 6059.0925 6075.2567 6135.3336 41.6825 NA
Gandomi et al. [69] 6059.714 6447.7360 6495.3470 502.693 NA
He et al. [78] 6059.7143 6289.92881 NA 305.78 NA
Akay and Karaboga [76] 6059.714339 6245.308144 NA 205 NA
Kannan and Kramer [65] 7198.0428 NA NA NA NA
Deb and Gene [62] 6410.3811 NA NA NA NA
Coello [64] 6288.7445 6293.8432 6308.1497 7.4133 NA
Coello and Montes [68] 6059.9463 6177.2533 6469.3220 130.9297 NA
Akhtar et al. [79] 6171.000 6335.050 6453.650 NA NA
Mirjalili et al. [80] 6060.8066 NA NA NA NA
Mirjalili et al. [51] 6051.5639 NA NA NA NA
Sadollah et al. [22] 5885.3327 5933.1276 6326.7716 153.43 NA
Original WCA 5875.1957629 6405.9792843 7307.6943748 488.7897438 6226.0137891
Present study 5873.1874916 5912.4135822 6421.7458001 443.2158852 6112.7933402
II Dimopoulos [82] 5850.38306 NA NA NA NA
Mahdavi et al. [50] 5849.7617 NA NA NA NA
Hedar and Fukushima [58] 5868.764836 6164.585867 6804.328100 257.473670 NA
Gandomi et al. [81] 5850.38306 5937.33790 6258.96825 164.54747 NA
Original WCA 5796.1850710 6156.278684 7030.496501 322.655923 6099.581546
Present study 5788.6492866 6103.774173 6971.110752 294.490733 6100.186102

Fig. 4 Welded beam design. a Schematic, b stress heat map, c displacement heat map [51]

In addition, the statistical simulation results are presented in Table 20. With regard to Table 20, it may be recognized that the best, average, worst, standard deviation, and median of the solutions achieved by CWCA III are better compared to WCA and some other optimizers. The consumed time of CWCA III and WCA is 10.142 s and 11.57 s, respectively. Hence, it can be concluded that the improved CWCA III is able to establish an operative strategy to compromise between the quality of the computed solutions and the computational burden.
In the literature, various authors, including [58, 65, 69, 78, 91] and also [62, 67, 68, 71, 73, 76, 81, 90], have utilized different versions of the welded beam optimization problem.


Table 19 Comparison of the best solution for welded beam problem found by different methods
Version Method Design variables Cost
x1 x2 x3 x4 f (X)

I Rao [83] 0.245500 6.196000 8.273000 0.245500 2.3860


Ray and Liew [88] 0.244438276 6.2379672340 8.2885761430 0.2445661820 2.3854347
Ragsdell and Phillips [86] 0.245500 6.196000 8.273000 0.245500 2.385937
Deb [87] NA NA NA NA 2.38119
Mehta and Dasgupta [89] 0.24436895 6.21860635 8.29147256 0.24436895 2.3811341
Lee and Geem [74] 0.2442 6.2231 8.2915 0.2443 2.38
Hwang and He [84] 0.223100 1.5815 12.84680 0.2245 2.25
Deb [85] 0.248900 6.173000 8.178900 0.253300 2.433116
Akhtar et al. [79] 0.2407 6.4851 8.2399 0.2497 2.4426
Original WCA 0.24438213 6.22147223 8.29148721 0.24436824 2.38164462
Present study 0.24436271 6.22147416 8.29028563 0.24436790 2.38129130
II Coello [64] 0.208800 3.420500 8.997500 0.210000 1.748309
Kaveh and Talatahari [70] 0.205729 3.469875 9.036805 0.205765 1.724849
He and Wang [66] 0.202369 3.544214 9.048210 0.205723 1.728024
Dimopoulos [82] 0.2015 3.5620 9.041398 0.205706 1.731186
Coello and Montes [68] 0.205986 3.471328 9.020224 0.206480 1.728226
Montes and Coello [67] 0.199742 3.612060 9.037500 0.206082 1.73730
Montes et al. [75] 0.205730 3.470489 9.036624 0.205730 1.724852
Cagnina et al. [77] 0.205729 3.470488 9.036624 0.205729 1.724852
Hedar and Fukushima [58] 0.205644261 3.472578742 9.03662391 0.2057296 1.7250022
Hu et al. [72] 0.20573 3.47049 9.03662 0.20573 1.72485084
Fesanghary et al. [90] 0.20572 3.47060 9.03682 0.20572 1.7248
Gandomi et al. [81] 0.2015 3.562 9.0414 0.2057 1.73121
Mahdavi et al. [50] 0.20573 3.47049 9.03662 0.20573 1.7248
Kaveh and Talatahari [61] 0.205700 3.471131 9.036683 0.205731 1.724918
Akay and Karaboga [76] 0.205730 3.470489 9.036624 0.205730 1.724852
Mehta and Dasgupta [89] 0.20572885 3.47050567 9.03662392 0.20572964 1.724855
Mirjalili et al. [80] 0.205463 3.473193 9.044502 0.205695 1.72645
Mirjalili et al. [51] 0.205676 3.478377 9.03681 0.205778 1.72624
Original WCA 0.206729489 3.449090388 9.036645805 0.205729523 1.723512139
Present study 0.205724510 3.45017429 9.036631224 0.205728114 1.722067177

In this second version, some constraints, namely the deflection $\delta(X)$, the buckling load $P_c(X)$, and the polar moment of inertia $J(X)$, are taken in a modified form, along with an additional constraint $g_7(X)$, as follows:

$g_7(X) = 0.10471 x_1^2 + 0.04811 x_3 x_4 (14 + x_2) - 5 \le 0$   (55)

$\delta(X) = \frac{6PL^3}{E x_3^3 x_4},\quad P_c(X) = \frac{4.013 E\sqrt{x_3^2 x_4^6/36}}{L^2}\left(1 - \frac{x_3}{2L}\sqrt{\frac{E}{4G}}\right),\quad J(X) = 2\left\{\sqrt{2}\,x_1 x_2\left[\frac{x_2^2}{4} + \left(\frac{x_1 + x_3}{2}\right)^2\right]\right\}$   (56)

The complete results are presented in Table 19 under the version II section. With regard to these solutions, it is recognized that the proposed CWCA III demonstrates a superior performance compared to the other algorithms. Statistical results for the best, median, average, worst, and standard deviation of the solutions obtained using CWCA III are tabulated in Table 20. Based on this table, it can be found that the best vector attained by the conventional WCA is preferable to the solutions discovered by other techniques. However, the results found by CWCA III are also superior to the results obtained by the WCA strategy. Therefore, the proposed improved algorithm can successfully show an efficient, superior performance in dealing with this constrained problem.

5.4.3 Tension–compression string design


Table 20 Statistical results of different methods for welded beam design


Version Method Best Average Worst SD Median

I Hwang and He [84] 2.25 2.26 2.28 NA NA


Deb [85] 2.433116 NA NA NA NA
Ray and Liew [88] 2.3854347 3.2551371 6.3996785 0.9590780 3.0025883
Rao [83] 2.3860 NA NA NA NA
Lee and Geem [74] 2.38 NA NA NA NA
Deb [87] 2.38119 NA NA NA NA
Akhtar et al. [79] 2.4426 NA NA NA NA
Original WCA 2.38164462 4.17413855 7.19985202 3.15077541 3.64120048
Present study 2.38129130 4.00158207 6.41076384 2.98710336 3.44175852
II Montes et al. [75] 1.724852 1.725 NA 1E-15 NA
Mehta and Dasgupta [89] 1.724855 1.724865 1.72489 NA 1.724861
He and Wang [66] 1.728024 1.748831 1.782143 0.012926 NA
Akay and Karaboga [76] 1.724852 1.741913 NA 0.031 NA
Kaveh and Talatahari [61] 1.724918 1.729752 1.775961 0.009200 NA
Cagnina et al. [77] 1.724852 2.0574 NA 0.2154 NA
Hedar and Fukushima [58] 1.7250022 1.7564428 1.8843960 0.0424175 NA
Montes and Coello [67] 1.737300 1.813290 1.994651 0.070500 NA
Coello [64] 1.748309 1.771973 1.785835 0.011220 NA
Coello and Montes [68] 1.728226 1.792654 1.993408 0.07471 NA
Kaveh and Talatahari [70] 1.724849 1.727564 1.759522 0.008254 NA
Gandomi et al. [81] 1.7312065 1.8786560 2.3455793 0.2677989 NA
Dimopoulos [82] 1.731186 NA NA NA NA
Mirjalili et al. [80] 1.72645 NA NA NA NA
Mirjalili et al. [51] 1.72624 NA NA NA NA
Original WCA 1.723512139 1.730328948 1.782538969 0.013428392 1.724286844
Present study 1.722067177 1.731371086 1.797167139 0.012441997 1.721147938

Fig. 5 Tension–compression string: a schematic, b stress heat map, c displacement heat map [51]

This problem was described by Arora [91]; the aim is to minimize the weight of a tension–compression spring subject to constraints on minimum deflection (g1), shear stress (g2), surge frequency (g3), and a limit on the outside diameter (g4). The design variables are the wire diameter (x1), the mean coil diameter (x2), and the number of active coils (x3). The problem is depicted in Fig. 5.
The mathematical formulation of this problem can be described as follows:

$\min_X f(X) = (x_3 + 2)\,x_2 x_1^2$   (57)

s.t.
$g_1(X) = 1 - \dfrac{x_2^3 x_3}{71785 x_1^4} \le 0,$
$g_2(X) = \dfrac{4x_2^2 - x_1 x_2}{12566\,(x_2 x_1^3 - x_1^4)} + \dfrac{1}{5108 x_1^2} - 1 \le 0,$
$g_3(X) = 1 - \dfrac{140.45 x_1}{x_2^2 x_3} \le 0,$
$g_4(X) = \dfrac{x_1 + x_2}{1.5} - 1 \le 0,$   (58)
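A corresponding Python sketch of Eqs. (57)–(58), together with the variable bounds of Eq. (59), is given below. As before, the names are illustrative and the snippet is a reconstruction of the stated formulation, not the authors' implementation.

```python
def spring_weight(x):
    """Objective of Eq. (57); x = (x1, x2, x3) = (wire diameter, coil diameter, active coils)."""
    x1, x2, x3 = x
    return (x3 + 2.0) * x2 * x1 ** 2

def spring_constraints(x):
    """Constraints g1..g4 of Eq. (58), each expected <= 0."""
    x1, x2, x3 = x
    g1 = 1.0 - x2 ** 3 * x3 / (71785.0 * x1 ** 4)
    g2 = ((4.0 * x2 ** 2 - x1 * x2) / (12566.0 * (x2 * x1 ** 3 - x1 ** 4))
          + 1.0 / (5108.0 * x1 ** 2) - 1.0)
    g3 = 1.0 - 140.45 * x1 / (x2 ** 2 * x3)
    g4 = (x1 + x2) / 1.5 - 1.0
    return [g1, g2, g3, g4]

# Variable bounds of Eq. (59)
BOUNDS = [(0.05, 2.0), (0.25, 1.3), (2.0, 15.0)]

# "Present study" solution from Table 21
x_best = (0.0517091014, 0.3571073363, 11.2708257743)
print(round(spring_weight(x_best), 9))                 # ~0.012671577, matching Table 21
print([round(g, 6) for g in spring_constraints(x_best)])
```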


Table 21 Comparison of the best solution for tension–compression string design by various methods
Method Design variables Cost
x1 x2 x3 f (X)

Belegundu [92] 0.05 0.315900 14.25000 0.0128334


Montes and Coello [67] 0.051643 0.355360 11.397926 0.012698
Omran and Salman [93] 0.0516837458 0.3565898352 11.2964717107 0.0126652375
Ray and Saini [94] 0.050417 0.321532 13.979915 0.013060
Coello and Montes [68] 0.051989 0.363965 10.890522 0.0126810
Ray and Liew [88] 0.0521602170 0.368158695 10.6484422590 0.01266924934
Arora [91] 0.053396 0.399180 9.185400 0.0127303
Tsai [95] 0.05168906 0.3567178 11.28896 0.01266523
Hu et al. [72] 0.051466369 0.351383949 11.60865920 0.0126661409a
Montes et al. [75] 0.051688 0.356692 11.290483 0.012665
He and Wang [66] 0.051728 0.357644 11.244543 0.0126747
Cagnina et al. [77] 0.051583 0.354190 11.438675 0.012665
Zhang et al. [96] 0.0516890614 0.3567177469 11.2889653382 0.012665233
He et al. [78] 0.05169040 0.35674999 11.28712599 0.0126652812a
Keveh and Talatahari [61] 0.051865 0.361500 11.00000 0.0126432a
Coelho [71] 0.051515 0.352529 11.538862 0.012665
Coello [64] 0.051480 0.351661 11.632201 0.01270478
Hedar and Fukushima [58] 0.05174250340926 0.35800478345599 11.21390736278739 0.012665285
Raj et al. [97] 0.05386200 0.41128365 8.68437980 0.01274840
Akay and Karaboga [76] 0.051749 0.358179 11.203763 0.012665
Mahdavi et al. [50] 0.05115438 0.34987116 12.0764321 0.0126706
Mirjalili et al. [51] 0.05169 0.356737 11.28885 0.012666
Original WCA 0.0517208702 0.3579276279 11.1912042488 0.012630231a
Present study 0.0517091014 0.3571073363 11.2708257743 0.012671577
a Infeasible results

$0.05 \le x_1 \le 2,\quad 0.25 \le x_2 \le 1.3,\quad 2 \le x_3 \le 15$   (59)

Arora [91] utilized a numerical optimization technique known as constraint correction at constant cost. Belegundu [92] solved this problem by eight different mathematical optimizers, and the best of those results is shown here. Montes and Coello [67] employed various evolution strategies to tackle this problem. He and Wang [66] utilized a co-evolutionary PSO algorithm (CPSO) to handle this problem. Coello [64] and Coello and Montes [68] solved this problem by using GA-based methods. Table 21 shows the best results found by CWCA III in comparison with the solutions reported in the literature. The statistical simulation results are also presented in Table 22.
From Table 21, it can be seen that the best solution obtained by WCA is superior to the best solution found by CWCA III, but it is infeasible. The solutions provided by Kaveh and Talatahari [61], Hu et al. [72], and He et al. [78] are also infeasible, since some of the constraints are violated. From the results, it can be seen that the standard deviation of the results of CWCA III is better than that of WCA, which reveals that the improved CWCA III is capable of providing reasonably feasible, accurate results in solving this problem. Referring to Table 22, it can be seen that the solutions realized by WCA are almost superior, so the competent performance of the conventional optimizer in tackling this problem is also substantiated here. Additionally, the consumed time of CWCA III and WCA for this problem is 10.032 s and 11.76 s, respectively.

5.4.4 Himmelblau nonlinear optimization problem

Finally, CWCA III is benchmarked on the Himmelblau problem, which was proposed in [98]. This problem includes five positive design variables X = [x1, x2, x3, x4, x5], six nonlinear inequality constraints, and ten boundary conditions. This optimization problem can be described as follows:

$\min_X f(X) = 5.3578547 x_3^2 + 0.8356891 x_1 x_5 + 37.293239 x_1 - 40792.141$   (60)


Table 22 Statistical results of different methods for tension/compression string


Method Best Average Worst SD Median

Belegundu [92] 0.0128334 NA NA NA NA


Ray and Saini [94] 0.0130600 0.015526 0.018992 NA NA
Coello [64] 0.01270478 0.01276920 0.01282208 3.9390E-05 0.01275576
He and Wang [66] 0.0126747 0.012730 0.012924 5.1985E-05 NA
Coello and Montes [68] 0.0126810 0.012742 0.012973 5.9000E-05 NA
He et al. [78] 0.0126652812 0.01270233 NA 4.12439E-05 NA
Hedar and Fukushima [58] 0.012665285 0.012665299 0.012665338 2.2E-08 NA
Zhang et al. [96] 0.012665233 0.012669366 0.012738262 1.25E-05 NA
Hu et al. [72] 0.0126661409 0.012718975 NA 6.446E-05 NA
Cagnina et al. [77] 0.012665 0.0131 NA 4.1E-04 NA
Kaveh and Talatahari [61] 0.0126432 0.012720 0.012884 3.4888E-05 NA
Montes et al. [75] 0.012665 0.012666 NA 2.0E-06 NA
Akay and Karaboga [76] 0.012665 0.012709 NA 0.012813 NA
Ray and Liew [88] 0.01266924934 0.012922669 0.016717272 5.92E-04 0.012922669
Coelho [71] 0.012665 0.013524 0.017759 1.268E-03 0.012957
Omran and Salman [93] 0.0126652375 0.0126652642 NA NA NA
Arora [91] 0.0127303 NA NA NA NA
Montes and Coello [67] 0.012698 0.013461 0.164850 9.66E-04 NA
Mirjalili et al. [51] 0.012666 NA NA NA NA
Original WCA 0.012630231 0.013388089 0.017722009 1.0864E-03 0.012849091
Present study 0.012671577 0.013401730 0.016810785 0.4578E-03 0.012714273

s.t.  $0 \le g_1(X) \le 92,\quad 90 \le g_2(X) \le 110,\quad 20 \le g_3(X) \le 25$   (61)

where
$g_1(X) = 85.334407 + 0.0056858 x_2 x_5 + 0.0006262 x_1 x_4 - 0.0022053 x_3 x_5$
$g_2(X) = 80.51249 + 0.0071317 x_2 x_5 + 0.0029955 x_1 x_2 + 0.0021813 x_3^2$
$g_3(X) = 9.300961 + 0.0047026 x_3 x_5 + 0.0012547 x_1 x_3 + 0.0019085 x_3 x_4$   (62)

$78 \le x_1 \le 102,\quad 33 \le x_2 \le 45,\quad 27 \le x_3, x_4, x_5 \le 45$   (63)

In the literature, different approaches have been investigated to tackle this problem, such as PSO [78], CS [69], GA [89, 99], and HS [50]. In some of them, another version of this problem has been employed, in which the parameter 0.0006262 in the constraint g1 is replaced with 0.00026 [65, 73, 91, 94]. The results of CWCA III for both versions of this problem are tabulated in Table 23. It is worth mentioning that the solutions obtained by [50, 65, 75, 91, 94] for version I are infeasible since the g1 constraint is violated.
The best, median, average, worst, and standard deviation of the values obtained by the CWCA III algorithm are presented in Table 24. Based on the results, it can be determined that the CWCA III algorithm demonstrates a comparable performance with respect to WCA and previous works.

6 An application of chaotic WCA for training of NNs

In previous sections, the performance of the chaos-embedded WCA was evaluated on several optimization tasks. In this section, CWCA is employed for training NNs in order to investigate the efficiency of the chaotic approach in relieving the problem of trapping in local optima. Detailed characteristics of this problem are provided in [101], and the results of the chaotic algorithm are compared with the solutions obtained in [101]. In this application, two three-layer feed-forward NNs have been utilized to approximate the functions $F_1(x) = \sin(2x e^{2x})$ and $y(x_1, x_2) = (1 + x_1^{1.5} + x_2^{2})^{2}$. The sigmoid function has been chosen as the mapping function in the hidden layer, and a linear function has been used in the output layer. The fitness function is represented as

$e = \frac{1}{2}\sum_{i=1}^{n}(y - y^{*})^2,$   (64)

$F = 1/e,$   (65)

Table 23 Optimal results of Himmelblau nonlinear optimization problem
Version Method Design variables Cost Constraints

x1 x2 x3 x4 x5 f(X) 0 ≤ g1 ≤ 92 90 ≤ g2 ≤ 110 20 ≤ g3 ≤ 25

I Himmelblau [98] NA NA NA NA NA -30,373.9490 NA NA NA


Lee and Geem [74] 78.00 33.00 29.995 45.00 36.776 -30,665.500 92.00004a 98.84051 19.99994a
Deb [87] NA NA NA NA NA -30,665.539 NA NA NA
Dimopoulos [82] 78.00 33.00 29.995256 45.00 36.775813 -30,665.54 92.00000 98.84050 20.00000
a
He et al. [78] 78.00 33.00 29.995256 45.00 36.7758129 -30,665.539 93.28536 100.40478 20.00000
Gandomi et al. [69] 78.00 33.00 29.99616 45.00 36.77605 -30,665.233 91.99996 98.84067 20.0003
Homaifar et al. [99] 80.39 35.07 32.05 40.33 33.34 -30,005.700 91.65619 99.53690 20.02553
Original WCA 78.00 33.00 29.990463 45.00 36.7808442 -30,666.751127 92.00099a 94.91721 19.999000a
Present study 78.00 33.00 29.995256 45.00 36.775813 -30,665.538673 92.000000 94.915402 20.000000
a
II Omran & Salman [93] 78.00 33.00 27.0709971 45.00 44.9692425 -31,025.55626 93.28536 100.40478 20.00000
Fesanghary et al. [90] 78.00 33.00 27.085149 45.00 44.925329 -31,024.3166 93.27834a 100.39612 20.00000
Coello [64] 78.0495 33.007 27.081 45.00 44.94 -31,020.859 93.28381a 100.40786 20.00191
Hu et al. [72] 78.00 33.00 27.070997 45.00 44.9692425 -31,025.5614 92 100.404784 20
Coello [100] 78.5958 33.01 27.6460 45.00 45.0000 -30,810.359 91.956402 100.54511 20.251919
a
Original WCA 78.00 33.00 27.066957 45.00 44.97392777 -31,026.4264504 92.000999 97.209281 19.999000a
Present study 78.00 33.00 27.0709971 45.00 44.96924241 -31,025.5602531 91.999999 97.2077013 20.000
a Violate constraints
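Since the footnote of Table 23 flags several reported optima as violating the g1 bound, a small Python sketch of Eqs. (60)–(62), with the version I coefficients as reconstructed above, may be useful for checking candidate solutions; the helper names are illustrative only, and version II would simply replace 0.0006262 by 0.00026 in g1.

```python
def himmelblau_objective(x):
    """Objective of Eq. (60)."""
    x1, x2, x3, x4, x5 = x
    return 5.3578547 * x3 ** 2 + 0.8356891 * x1 * x5 + 37.293239 * x1 - 40792.141

def himmelblau_constraints(x):
    """Constraint functions of Eq. (62); feasible iff
    0 <= g1 <= 92, 90 <= g2 <= 110, 20 <= g3 <= 25 (Eq. (61))."""
    x1, x2, x3, x4, x5 = x
    g1 = 85.334407 + 0.0056858 * x2 * x5 + 0.0006262 * x1 * x4 - 0.0022053 * x3 * x5
    g2 = 80.51249 + 0.0071317 * x2 * x5 + 0.0029955 * x1 * x2 + 0.0021813 * x3 ** 2
    g3 = 9.300961 + 0.0047026 * x3 * x5 + 0.0012547 * x1 * x3 + 0.0019085 * x3 * x4
    return g1, g2, g3

# Version I solution reported for the present study in Table 23
x_best = (78.0, 33.0, 29.995256, 45.0, 36.775813)
print(round(himmelblau_objective(x_best), 4))   # ~ -30665.54, matching the tabulated cost
print(himmelblau_constraints(x_best))           # constraint values at that point
```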

Table 24 Statistical results of the Himmelblau problem


Version Method Best Median Average Worst SD

I Deb [87] -30,665.537 -30,665.535 NA -29,846.654 NA


Lee and Geem [74] -30,665.500 NA NA NA NA
He et al. [78] -30,665.539 NA -30,643.989 NA 70.043
Dimopoulos [82] -30,665.54 NA NA NA NA
Gandomi et al. [69] -30,665.2327 NA NA NA 11.6231
Original WCA -30,666.7511276 -30,666.7510661 -30,666.7431539 -30,666.570905 0.0333038374
Present study -30,665.538673 -30,665.48314829 -30,665.40358145 -30,664.7191255 0.3787927
II Coello [64] -31,020.859 -31,017.21369099 -30,984.24070309 -30,792.40773775 73.633536
Hu et al. [72] -31,025.56142 NA -31,025.56142 NA 0
Fesanghary et al. [90] -31,024.3166 NA NA NA NA
Omran and Salman [93] -31,025.55626 NA -31,025.5562644829 NA NA
Original WCA -31,026.4264504 -31,026.4263964 -31,026.42591510 -31,026.42150651 0.0011068243
Present study -31,025.5412555 -31,025.5812452 -31,025.54841263 -31,025.49727982 0.0234913

Table 25 Results of different algorithms for training of NNs


Experiment Method Best Average Worst SD Convergent generation Average run time

Network-1 DBPSO [101] 438.76 386.75 322.53 12.97 658.35 125.37


MCPSO [101] 414.55 355.89 148.31 38.02 676.48 137.19
PSO [101] 213.17 196.81 102.90 25.21 881.39 169.47
NPSO [101] 154.77 91.82 56.39 42.03 NA 198.77
Original WCA 409.12 312.04 298.52 28.07 703.44176 128.23
Present study 442.14 388.35 348.42 12.13 658.10471 119.85
Network-2 DBPSO [101] 15.86 11.36 3.27 2.08 536.63 132.16
MCPSO [101] 13.57 6.06 3.88 2.69 631.37 153.37
PSO [101] 13.72 4.47 1.09 3.16 903 207.86
NPSO [101] 0.12 0.18 0.02 0.33 NA 209.38
Original WCA 13.01 8.87 3.71 3.12 614.79 164.61
Present study 16.41 10.14 3.84 2.46 572.06 150.44
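As a complement to Table 25, the sketch below shows how the reciprocal-error fitness of Eqs. (64)–(65) can be evaluated for a three-layer feed-forward network whose weights and biases are encoded as a single decision vector, which is what a WCA/CWCA individual would carry in this application. The hidden-layer size, the sampling range, and the random weights are illustrative assumptions and are not taken from [101].

```python
import numpy as np

def forward(weights, X, n_hidden=5):
    """Three-layer feed-forward net: sigmoid hidden layer, linear output layer.
    `weights` is the flat vector an optimizer individual would encode."""
    n_in = X.shape[1]
    w1 = weights[: n_in * n_hidden].reshape(n_in, n_hidden)
    b1 = weights[n_in * n_hidden: n_in * n_hidden + n_hidden]
    w2 = weights[n_in * n_hidden + n_hidden: n_in * n_hidden + 2 * n_hidden]
    b2 = weights[-1]
    hidden = 1.0 / (1.0 + np.exp(-(X @ w1 + b1)))   # sigmoid mapping function
    return hidden @ w2 + b2                          # linear output

def fitness(weights, X, y_true):
    """Reciprocal-error fitness of Eqs. (64)-(65)."""
    y_pred = forward(weights, X)
    e = 0.5 * np.sum((y_true - y_pred) ** 2)         # Eq. (64)
    return 1.0 / e                                   # Eq. (65)

# Example with samples of the first target function (network 1); the range [0, 2] is assumed.
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 2.0, size=(50, 1))
y = np.sin(2.0 * X[:, 0] * np.exp(2.0 * X[:, 0]))    # F1(x) as printed in Sect. 6
dim = 1 * 5 + 5 + 5 + 1                              # number of weights/biases to optimize
print(fitness(rng.standard_normal(dim), X, y))
```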

where y and y* denote the outputs of the real model and the NN-based model, respectively, n is the number of samples, e symbolizes the error, and F indicates the fitness function. Table 25 summarizes the best, average, and worst solutions, the standard deviation, the convergent generation, and the average running time obtained via the improved method in comparison with the results of other meta-heuristics. From the results, it can be seen that the standard WCA is superior to PSO and NPSO in training network 1. In this task, CWCA III statistically outperforms the WCA technique at a significance level of α = 0.05. Based on the results, it is recognized that the performance of the chaotic optimizer is comparable or superior to that of the other optimizers. It should be noted that the run time of DBPSO in training network 2 is still better than the consumed time of CWCA III and the WCA strategy.

7 Conclusions

Since 1998, when the chaos optimization method was developed on the basis of deterministic chaos, the hybridization of chaotic maps with meta-heuristic algorithms has turned into an interdisciplinary research field. Several investigations have revealed that the usage of chaotic-based operators instead of random processes can significantly improve the diversification and intensification abilities of optimization strategies.


In this paper, the chaos theme has been utilized to enhance the efficiency of the WCA algorithm. The standard algorithm was developed by mimicking the idealized hydrological cycle in natural environments, which can also be explained by deterministic chaos. With respect to the chaotic characteristics of the water cycle in nature, three variants of WCA are proposed by utilizing 13 non-invertible chaotic maps. In these strategies, different deterministic chaotic operations are utilized instead of the constant and/or random values of the conventional optimizer. The overall success rate of these algorithms on several benchmark problems demonstrates that CWCA III with the sinusoidal map has a superior performance compared to the other WCA-based approaches. In addition, the most effective chaotic sequences for the CWCA I and CWCA II strategies are the ICMIC signal and the sinusoidal map, respectively.
For evaluating the reliability of CWCA III, this method was compared to the standard WCA and the other chaotic versions in solving unconstrained tasks. Statistical results reveal that CWCA III can effectively explore the best solution with more precision. From the simulation results, it can be concluded that incorporating the chaos theme into the WCA algorithm can manifestly enhance the feasibility and quality of the results. In addition, the new improved method was utilized to tackle some well-studied constrained problems such as the G1–G13 benchmarks, pressure vessel design, welded beam design, tension–compression spring design, and the Himmelblau problem. These test cases were chosen to substantiate the effectiveness and feasibility of CWCA III against previous techniques for constrained tasks. The results also show that the chaotic CWCA III outperforms the conventional WCA and some other methods in training NNs. With regard to the statistical results, CWCA III is able to realize better or comparable solutions compared to previous works. It can be concluded that CWCA III is a competent optimizer for solving unconstrained tasks. The modified algorithm also outperforms the other studied optimizers on the constrained test cases. Future studies can be dedicated to the extension of the WCA strategy and its improved versions to tackle mixed-type problems and complex optimization tasks in discrete spaces. More elaborate simulations may also be carried out using parallel or distributed computation techniques.

Appendix

List of 13 constrained benchmark problems:

Problem G1

$\min_x f(x) = 5\sum_{i=1}^{4} x_i - 5\sum_{i=1}^{4} x_i^2 - \sum_{i=5}^{13} x_i$

s.t.
$g_1(x) = 2x_1 + 2x_2 + x_{10} + x_{11} - 10 \le 0$, $g_2(x) = 2x_1 + 2x_3 + x_{10} + x_{12} - 10 \le 0$, $g_3(x) = 2x_2 + 2x_3 + x_{11} + x_{12} - 10 \le 0$,
$g_4(x) = -8x_1 + x_{10} \le 0$, $g_5(x) = -8x_2 + x_{11} \le 0$, $g_6(x) = -8x_3 + x_{12} \le 0$,
$g_7(x) = -2x_4 - x_5 + x_{10} \le 0$, $g_8(x) = -2x_6 - x_7 + x_{11} \le 0$, $g_9(x) = -2x_8 - x_9 + x_{12} \le 0$,
$x_i \ge 0$, $i = 1, \ldots, 13$; $x_i \le 1$, $i = 1, \ldots, 9, 13$.

Bound restrictions: UB = (1, 1, 1, 1, 1, 1, 1, 1, 1, 100, 100, 100, 1) and LB = (0, ..., 0).
Global minimum: x_best = (1, 1, 1, 1, 1, 1, 1, 1, 1, 3, 3, 3, 1), f(x_best) = -15.

Problem G2

$\max_x f(x) = \left|\dfrac{\sum_{i=1}^{n}\cos^4(x_i) - 2\prod_{i=1}^{n}\cos^2(x_i)}{\sqrt{\sum_{i=1}^{n} i\,x_i^2}}\right|$

s.t. $g_1(x) = -\prod_{i=1}^{n} x_i + 0.75 \le 0$, $g_2(x) = \sum_{i=1}^{n} x_i - 7.5n \le 0$.

Bound restrictions: UB = (10, ..., 10) and LB = (0, ..., 0).
Best known value: f(x_best) = 0.803619 for n = 20.

Problem G3

$\max_x f(x) = (\sqrt{n})^{n}\prod_{i=1}^{n} x_i$

s.t. $h_1(x) = \sum_{i=1}^{n} x_i^2 - 1 = 0$.

Bound restrictions: UB = (1, ..., 1) and LB = (0, ..., 0).
Global maximum: $x_{best} = (1/\sqrt{n}, \ldots, 1/\sqrt{n})$, f(x_best) = 1.

Problem G4

$\min_x f(x) = 5.3578547 x_3^2 + 0.8356891 x_1 x_5 + 37.293239 x_1 - 40792.141$

s.t. $g_1(x) = u(x) - 92 \le 0$, $g_2(x) = -u(x) \le 0$, $g_3(x) = v(x) - 110 \le 0$, $g_4(x) = -v(x) + 90 \le 0$, $g_5(x) = w(x) - 25 \le 0$, $g_6(x) = -w(x) + 20 \le 0$,

where
$u(x) = 85.334407 + 0.0056858 x_2 x_5 + 0.0006262 x_1 x_4 - 0.0022053 x_3 x_5$,
$v(x) = 80.51249 + 0.0071317 x_2 x_5 + 0.0029955 x_1 x_2 + 0.0021813 x_3^2$,
$w(x) = 9.300961 + 0.0047026 x_3 x_5 + 0.0012547 x_1 x_3 + 0.0019085 x_3 x_4$.

Bound restrictions: UB = (102, 45, 45, 45, 45) and LB = (78, 33, 27, 27, 27).
Global minimum: x_best = (78, 33, 29.995256025682, 45, 36.775812905788), f(x_best) = -30665.539.

Problem G5

$\min_x f(x) = 3x_1 + 10^{-6} x_1^3 + 2x_2 + \frac{2}{3}\times10^{-6} x_2^3$

s.t. $g_1(x) = x_3 - x_4 - 0.55 \le 0$, $g_2(x) = x_4 - x_3 - 0.55 \le 0$,
$h_1(x) = 1000[\sin(-x_3 - 0.25) + \sin(-x_4 - 0.25)] + 894.8 - x_1 = 0$,
$h_2(x) = 1000[\sin(x_3 - 0.25) + \sin(x_3 - x_4 - 0.25)] + 894.8 - x_2 = 0$,
$h_3(x) = 1000[\sin(x_4 - 0.25) + \sin(x_4 - x_3 - 0.25)] + 1294.8 = 0$.

Bound restrictions: UB = (1200, 1200, 0.55, 0.55) and LB = (0, 0, -0.55, -0.55).
Best known solution: x_best = (679.9453, 1026, 0.118876, -0.3962336), f(x_best) = 5126.4981.

Problem G6

$\min_x f(x) = (x_1 - 10)^3 + (x_2 - 20)^3$

s.t. $g_1(x) = -(x_1 - 5)^2 - (x_2 - 5)^2 + 100 \le 0$, $g_2(x) = (x_1 - 6)^2 + (x_2 - 5)^2 - 82.81 \le 0$.

Bound restrictions: UB = (100, 100) and LB = (13, 0).
Global minimum: x_best = (14.095, 0.84296), f(x_best) = -6961.81388.

Problem G7

$\min_x f(x) = x_1^2 + x_2^2 + x_1 x_2 - 14x_1 - 16x_2 + (x_3 - 10)^2 + 4(x_4 - 5)^2 + (x_5 - 3)^2 + 2(x_6 - 1)^2 + 5x_7^2 + 7(x_8 - 11)^2 + 2(x_9 - 10)^2 + (x_{10} - 7)^2 + 45$

s.t.
$g_1(x) = 4x_1 + 5x_2 - 3x_7 + 9x_8 - 105 \le 0$, $g_2(x) = 10x_1 - 8x_2 - 17x_7 + 2x_8 \le 0$,
$g_3(x) = -8x_1 + 2x_2 + 5x_9 - 2x_{10} - 12 \le 0$, $g_4(x) = 3(x_1 - 2)^2 + 4(x_2 - 3)^2 + 2x_3^2 - 7x_4 - 120 \le 0$,
$g_5(x) = 5x_1^2 + 8x_2 + (x_3 - 6)^2 - 2x_4 - 40 \le 0$, $g_6(x) = 0.5(x_1 - 8)^2 + 2(x_2 - 4)^2 + 3x_5^2 - x_6 - 30 \le 0$,
$g_7(x) = x_1^2 + 2(x_2 - 2)^2 - 2x_1 x_2 + 14x_5 - 6x_6 \le 0$, $g_8(x) = -3x_1 + 6x_2 + 12(x_9 - 8)^2 - 7x_{10} \le 0$.

Bound restrictions: UB = (10, ..., 10) and LB = (-10, ..., -10).
Global minimum: x_best = (2.171996, 2.363683, 8.773926, 5.095984, 0.9906548, 1.430574, 1.321644, 9.828726, 8.280092, 8.375927), f(x_best) = 24.3062091.

Problem G8

$\max_x f(x) = \dfrac{\sin^3(2\pi x_1)\,\sin(2\pi x_2)}{x_1^3\,(x_1 + x_2)}$

s.t. $g_1(x) = x_1^2 - x_2 + 1 \le 0$, $g_2(x) = 1 - x_1 + (x_2 - 4)^2 \le 0$.

Bound restrictions: UB = (10, 10) and LB = (0, 0).
Global maximum: x_best = (1.2279713, 4.2453733), f(x_best) = 0.095825.

Problem G9

$\min_x f(x) = (x_1 - 10)^2 + 5(x_2 - 12)^2 + x_3^4 + 3(x_4 - 11)^2 + 10x_5^6 + 7x_6^2 + x_7^4 - 4x_6 x_7 - 10x_6 - 8x_7$

s.t.
$g_1(x) = 2x_1^2 + 3x_2^4 + x_3 + 4x_4^2 + 5x_5 - 127 \le 0$, $g_2(x) = 7x_1 + 3x_2 + 10x_3^2 + x_4 - x_5 - 282 \le 0$,
$g_3(x) = 23x_1 + x_2^2 + 6x_6^2 - 8x_7 - 196 \le 0$, $g_4(x) = 4x_1^2 + x_2^2 - 3x_1 x_2 + 2x_3^2 + 5x_6 - 11x_7 \le 0$.

Bound restrictions: UB = (10, ..., 10) and LB = (-10, ..., -10).
Global minimum: x_best = (2.330499, 1.951372, -0.4775414, 4.365726, -0.6244870, 1.038131, 1.594227), f(x_best) = 680.6300573.

Problem G10

$\min_x f(x) = x_1 + x_2 + x_3$

s.t.
$g_1(x) = -1 + 0.0025(x_4 + x_6) \le 0$, $g_2(x) = -1 + 0.0025(-x_4 + x_5 + x_7) \le 0$, $g_3(x) = -1 + 0.01(-x_5 + x_8) \le 0$,
$g_4(x) = 100x_1 - x_1 x_6 + 833.33252 x_4 - 83333.333 \le 0$, $g_5(x) = x_2 x_4 - x_2 x_7 - 1250 x_4 + 1250 x_5 \le 0$,
$g_6(x) = x_3 x_5 - x_3 x_8 - 2500 x_5 + 1250000 \le 0$.

Bound restrictions: UB = (10000, 10000, 10000, 1000, 1000, 1000, 1000, 1000) and LB = (100, 1000, 1000, 10, 10, 10, 10, 10).
Global minimum: x_best = (579.3167, 1359.943, 5110.071, 182.0174, 295.5985, 217.9799, 286.4162, 395.5979), f(x_best) = 7049.3307.

Problem G11

$\min_x f(x) = x_1^2 + (x_2 - 1)^2$

s.t. $h_1(x) = x_2 - x_1^2 = 0$.

Bound restrictions: UB = (1, 1) and LB = (-1, -1).
Global minimum: $x_{best} = (1/\sqrt{2},\ 1/2)$, f(x_best) = 0.75.

Problem G12

$\min_x f(x) = 1 - 0.01\left[(x_1 - 5)^2 + (x_2 - 5)^2 + (x_3 - 5)^2\right]$

s.t. $g_{i,j,k}(x) = (x_1 - i)^2 + (x_2 - j)^2 + (x_3 - k)^2 - 0.0625 \le 0$, $i, j, k = 1, 2, \ldots, 9$.

Bound restrictions: UB = (10, 10, 10) and LB = (0, 0, 0).
Global minimum: x_best = (5, 5, 5), f(x_best) = 1.

Problem G13

$\min_x f(x) = e^{x_1 x_2 x_3 x_4 x_5}$

s.t. $h_1(x) = x_1^2 + x_2^2 + x_3^2 + x_4^2 + x_5^2 - 10 = 0$, $h_2(x) = x_2 x_3 - 5 x_4 x_5 = 0$, $h_3(x) = x_1^3 + x_2^3 + 1 = 0$.

Bound restrictions: UB = (2.3, 2.3, 3.2, 3.2, 3.2) and LB = (-2.3, -2.3, -3.2, -3.2, -3.2).
Global minimum: x_best = (-1.717143, 1.595709, 1.827247, -0.7636413, 0.763645), f(x_best) = 0.0539498.

References

1. Yang XS (2010) Engineering optimization: an introduction with metaheuristic applications. Wiley, Hoboken
2. Yildiz AR (2009) A new design optimization framework based on immune algorithm and Taguchi's method. Comput Ind 60:613–620
3. Yildiz AR (2008) Optimal structural design of vehicle components using topology design and optimization. Mater Test 50:224–228
4. Rahimi S, Roodposhti MS, Abbaspour RA (2014) Using combined AHP–genetic algorithm in artificial groundwater recharge site selection of Gareh Bygone Plain, Iran. Environ Earth Sci 72:1979–1992
5. Jordehi AR (2015) Chaotic bat swarm optimisation (CBSO). Appl Soft Comput 26:523–530
6. Abbaspour RA, Samadzadegan F (2011) Time-dependent personal tour planning and scheduling in metropolises. Expert Syst Appl 38:12439–12452
7. Yildiz AR (2013) Hybrid Taguchi-differential evolution algorithm for optimization of multi-pass turning operations. Appl Soft Comput 13:1433–1439
8. Yildiz AR (2013) Comparison of evolutionary-based optimization algorithms for structural design optimization. Eng Appl Artif Intell 26:327–333
9. Yildiz AR (2012) A comparative study of population-based optimization algorithms for turning operations. Inf Sci 210:81–88
10. Jordehi AR (2014) Particle swarm optimisation for dynamic optimisation problems: a review. Neural Comput Appl 25:1507–1516
11. Jordehi AR (2015) Enhanced leader PSO (ELPSO): a new PSO variant for solving global optimisation problems. Appl Soft Comput 26:401–417
12. Jordehi AR (2015) A review on constraint handling strategies in particle swarm optimisation. Neural Comput Appl 26(6):1265–1275
13. Jiang BLW (1998) Optimizing complex functions by chaos search. Cybern Syst 29:409–419
14. Yildiz AR, Solanki KN (2012) Multi-objective optimization of vehicle crashworthiness using a new particle swarm based approach. Int J Adv Manuf Technol 59:367–376
15. Rezaee Jordehi A, Jasni J (2013) Parameter selection in particle swarm optimisation: a survey. J Exp Theor Artif Intell 25:527–542
16. Jordehi AR, Jasni J (2015) Particle swarm optimisation for discrete optimisation problems: a review. Artif Intell Rev 43(2):243–258
17. Yildiz AR (2013) Optimization of multi-pass turning operations using hybrid teaching learning-based approach. Int J Adv Manuf Technol 66:1319–1326
18. Jordehi AR (2015) Optimal setting of TCSC's in power systems using teaching-learning-based optimisation algorithm. Neural Comput Appl 26(5):1249–1256

19. Eskandar H, Sadollah A, Bahreininejad A, Hamdi M (2012) Water cycle algorithm—a novel metaheuristic optimization method for solving constrained engineering optimization problems. Comput Struct 110:151–166
20. Sadollah A, Eskandar H, Bahreininejad A, Kim JH (2015) Water cycle algorithm for solving multi-objective optimization problems. Soft Comput 19(9):2587–2603
21. Sadollah A, Eskandar H, Bahreininejad A, Kim JH (2015) Water cycle, mine blast and improved mine blast algorithms for discrete sizing optimization of truss structures. Comput Struct 149:1–16
22. Sadollah A, Eskandar H, Bahreininejad A, Kim JH (2015) Water cycle algorithm with evaporation rate for solving constrained and unconstrained optimization problems. Appl Soft Comput 30:58–71
23. Sadollah A, Eskandar H, Kim JH (2015) Water cycle algorithm for solving constrained multi-objective optimization problems. Appl Soft Comput 27:279–298
24. Sadollah A, Eskandar H, Yoo DG, Kim JH (2015) Approximate solving of nonlinear ordinary differential equations using least square weight function and metaheuristic algorithms. Eng Appl Artif Intell 40:117–132
25. Ott E (2002) Chaos in dynamical systems. Cambridge University Press, Cambridge
26. Stehlik J (1999) Deterministic chaos in runoff series. J Hydrol Hydromech 47:271–287
27. Xu C, Duan H, Liu F (2010) Chaotic artificial bee colony approach to Uninhabited Combat Air Vehicle (UCAV) path planning. Aerosp Sci Technol 14:535–541
28. Talatahari S, Azar BF, Sheikholeslami R, Gandomi AH (2012) Imperialist competitive algorithm combined with chaos for global optimization. Commun Nonlinear Sci Numer Simul 17:1312–1319
29. Jothiprakash V, Arunkumar R (2013) Optimization of hydropower reservoir using evolutionary algorithms coupled with chaos. Water Resour Manage 27:1963–1979
30. Wang GG, Guo L, Gandomi AH, Hao GS, Wang H (2014) Chaotic krill herd algorithm. Inf Sci 274:17–34
31. Gandomi AH, Yang XS (2014) Chaotic bat algorithm. J Comput Sci 5:224–232
32. Fister I, Perc M, Kamal SM (2015) A review of chaos-based firefly algorithms: perspectives and research challenges. Appl Math Comput 252:155–165
33. Li C, An X, Li R (2015) A chaos embedded GSA-SVM hybrid system for classification. Neural Comput Appl 26:713–721
34. Liao GC, Tsao TP (2006) Application of a fuzzy neural network combined with a chaos genetic algorithm and simulated annealing to short-term load forecasting. IEEE Trans Evolut Comput 10:330–340
35. Jordehi AR (2014) A chaotic-based big bang–big crunch algorithm for solving global optimisation problems. Neural Comput Appl 25:1329–1335
36. Alatas B (2010) Chaotic harmony search algorithms. Appl Math Comput 216:2687–2699
37. Gandomi AH, Yun GJ, Yang XS, Talatahari S (2013) Chaos-enhanced accelerated particle swarm optimization. Commun Nonlinear Sci Numer Simul 18:327–340
38. dos Santos Coelho L, Mariani VC (2008) Use of chaotic sequences in a biologically inspired algorithm for engineering design optimization. Expert Syst Appl 34:1905–1913
39. Jordehi AR (2015) Seeker optimisation (human group optimisation) algorithm with chaos. J Exp Theor Artif Intell 1–10. doi:10.1080/0952813X.2015.1020568
40. Jordehi AR (2015) A chaotic artificial immune system optimisation algorithm for solving global continuous optimisation problems. Neural Comput Appl 26(4):827–833
41. Saremi S, Mirjalili S, Lewis A (2014) Biogeography-based optimisation with chaos. Neural Comput Appl 25:1077–1097
42. Schuster HG, Just W (2006) Deterministic chaos: an introduction. Wiley, Hoboken
43. Tavazoei MS, Haeri M (2007) Comparison of different one-dimensional maps as chaotic search pattern in chaos optimization algorithms. Appl Math Comput 187:1076–1085
44. Hilborn RC (2000) Chaos and nonlinear dynamics: an introduction for scientists and engineers. Oxford University Press, Oxford
45. He D, He C, Jiang LG, Zhu HW, Hu GR (2001) Chaotic characteristics of a one-dimensional iterative map with infinite collapses. IEEE Trans Circuits Syst I Fundam Theory Appl 48:900–906
46. Erramilli A, Singh R, Pruthi P (1995) An application of deterministic chaotic maps to model packet traffic. Queueing Syst 20:171–206
47. Li Y, Deng S, Xiao D (2011) A novel Hash algorithm construction based on chaotic neural network. Neural Comput Appl 20:133–141
48. Xiang T, Liao X, Wong KW (2007) An improved particle swarm optimization algorithm combined with piecewise linear chaotic map. Appl Math Comput 190:1637–1645
49. Devaney RL (1989) An introduction to chaotic dynamical systems. Westview Press, Colorado
50. Mahdavi M, Fesanghary M, Damangir E (2007) An improved harmony search algorithm for solving optimization problems. Appl Math Comput 188:1567–1579
51. Mirjalili S, Mirjalili SM, Lewis A (2014) Grey wolf optimizer. Adv Eng Softw 69:46–61
52. Derrac J, García S, Molina D, Herrera F (2011) A practical tutorial on the use of nonparametric statistical tests as a methodology for comparing evolutionary and swarm intelligence algorithms. Swarm Evolut Comput 1:3–18
53. Wilcoxon F (1945) Individual comparisons by ranking methods. Biom Bull 1(6):80–83
54. Gandomi AH, Yang XS, Alavi AH, Talatahari S (2013) Bat algorithm for constrained optimization tasks. Neural Comput Appl 22:1239–1255
55. Mezura Montes E, Coello CAC (2005) A simple multimembered evolution strategy to solve constrained optimization problems. IEEE Trans Evolut Comput 9:1–17
56. Becerra RL, Coello CAC (2006) Cultured differential evolution for constrained optimization. Comput Methods Appl Mech Eng 195:4303–4322
57. Tessema B, Yen GG (2006) A self adaptive penalty function based algorithm for constrained optimization. In: Evolutionary computation, 2006. CEC 2006. IEEE Congress on, IEEE, pp 246–253
58. Hedar AR, Fukushima M (2006) Derivative-free filter simulated annealing method for constrained continuous global optimization. J Global Optim 35:521–549
59. Koziel S, Michalewicz Z (1999) Evolutionary algorithms, homomorphous mappings, and constrained parameter optimization. Evol Comput 7:19–44
60. Cabrera JCF, Coello CAC (2007) Handling constraints in particle swarm optimization using a small population size. In: Gelbukh A, Kuri Morales AF (eds) MICAI 2007: advances in artificial intelligence, Springer, pp 41–51
61. Kaveh A, Talatahari S (2010) An improved ant colony optimization for constrained engineering design problems. Eng Comput 27:155–182
62. Deb K, Goyal M (1996) A combined genetic adaptive search (GeneAS) for engineering design. Comput Sci Inform 26:30–45
63. Sandgren E (1990) Nonlinear integer and discrete programming in mechanical design optimization. J Mech Des 112:223–229


64. Coello CAC (2000) Use of a self-adaptive penalty approach for engineering optimization problems. Comput Ind 41:113–127
65. Kannan B, Kramer SN (1994) An augmented Lagrange multiplier based method for mixed integer discrete continuous optimization and its applications to mechanical design. J Mech Des 116:405–411
66. He Q, Wang L (2007) An effective co-evolutionary particle swarm optimization for constrained engineering design problems. Eng Appl Artif Intell 20:89–99
67. Mezura Montes E, Coello CAC (2008) An empirical study about the usefulness of evolution strategies to solve constrained optimization problems. Int J Gen Syst 37:443–473
68. Coello CAC, Montes EM (2002) Constraint-handling in genetic algorithms through the use of dominance-based tournament selection. Adv Eng Inform 16:193–203
69. Gandomi AH, Yang XS, Alavi AH (2013) Cuckoo search algorithm: a metaheuristic approach to solve structural optimization problems. Eng Comput 29:17–35
70. Kaveh A, Talatahari S (2009) Engineering optimization with hybrid particle swarm and ant colony optimization. Asian J Civ Eng 10:611–628
71. dos Santos Coelho L (2010) Gaussian quantum-behaved particle swarm optimization approaches for constrained engineering design problems. Expert Syst Appl 37:1676–1683
72. Hu X, Eberhart RC, Shi Y (2003) Engineering optimization with particle swarm. In: Swarm intelligence symposium, 2003. SIS'03. Proceedings of the 2003 IEEE, IEEE, pp 53–57
73. Zhang C, Wang HP (1993) Mixed-discrete nonlinear optimization with simulated annealing. Eng Optim 21:277–291
74. Lee KS, Geem ZW (2005) A new meta-heuristic algorithm for continuous engineering optimization: harmony search theory and practice. Comput Methods Appl Mech Eng 194:3902–3933
75. Mezura Montes E, Coello CAC, Velázquez Reyes J, Muñoz Dávila L (2007) Multiple trial vectors in differential evolution for engineering design. Eng Optim 39:567–589
76. Karaboga D, Akay B (2009) A comparative study of artificial bee colony algorithm. Appl Math Comput 214:108–132
77. Cagnina LC, Esquivel SC, Coello CAC (2008) Solving engineering optimization problems with the simple constrained particle swarm optimizer. Inform (Slov) 32:319–326
78. He S, Prempain E, Wu Q (2004) An improved particle swarm optimizer for mechanical design optimization problems. Eng Optim 36:585–605
79. Akhtar S, Tai K, Ray T (2002) A socio-behavioural simulation model for engineering design optimization. Eng Optim 34:341–354
80. Mirjalili S, Mirjalili SM, Hatamlou A (2015) Multi-verse optimizer: a nature-inspired algorithm for global optimization. Neural Comput Appl 1–19. doi:10.1007/s00521-015-1870-7
81. Gandomi AH, Yang XS, Alavi AH (2011) Mixed variable structural optimization using firefly algorithm. Comput Struct 89:2325–2336
82. Dimopoulos GG (2007) Mixed-variable engineering optimization based on evolutionary and social metaphors. Comput Methods Appl Mech Eng 196:803–817
83. Rao SS, Rao S (2009) Engineering optimization: theory and practice. Wiley, New York
84. Hwang SF, He RS (2006) A hybrid real-parameter genetic algorithm for function optimization. Adv Eng Inform 20:7–21
85. Deb K (1991) Optimal design of a welded beam via genetic algorithms. AIAA J 29:2013–2015
86. Ragsdell K, Phillips D (1976) Optimal design of a class of welded structures using geometric programming. J Manuf Sci Eng 98:1021–1025
87. Deb K (2000) An efficient constraint handling method for genetic algorithms. Comput Methods Appl Mech Eng 186:311–338
88. Ray T, Liew KM (2003) Society and civilization: an optimization algorithm based on the simulation of social behavior. IEEE Trans Evolut Comput 7:386–396
89. Mehta VK, Dasgupta B (2012) A constrained optimization algorithm based on the simplex search method. Eng Optim 44:537–550
90. Fesanghary M, Mahdavi M, Minary-Jolandan M, Alizadeh Y (2008) Hybridizing harmony search algorithm with sequential quadratic programming for engineering optimization problems. Comput Methods Appl Mech Eng 197:3080–3091
91. Arora J (2004) Introduction to optimum design. Academic Press, New York
92. Belegundu AD, Arora JS (1985) A study of mathematical programming methods for structural optimization, Part I: Theory. Int J Numer Methods Eng 21:1583–1599
93. Omran MG, Salman A (2009) Constrained optimization using CODEQ. Chaos Solitons Fractals 42:662–668
94. Ray T, Saini P (2001) Engineering design optimization using a swarm with an intelligent information sharing among individuals. Eng Optim 33:735–748
95. Tsai JF (2005) Global optimization of nonlinear fractional programming problems in engineering design. Eng Optim 37:399–409
96. Zhang M, Luo W, Wang X (2008) Differential evolution with dynamic stochastic selection for constrained optimization. Inf Sci 178:3043–3074
97. Raj KH, Sharma R (2005) An evolutionary computational technique for constrained optimisation in engineering design. IE (I) J—MC 86:121–128
98. Himmelblau DM (1972) Applied nonlinear programming. McGraw-Hill, New York
99. Homaifar A, Qi CX, Lai SH (1994) Constrained optimization via genetic algorithms. Simulation 62:242–253
100. Coello CAC (2000) Treating constraints as objectives for single-objective evolutionary optimization. Eng Optim 32(3):275–308
101. Chen D, Zhao C, Zhang H (2011) An improved cooperative particle swarm optimization and its application. Neural Comput Appl 20:171–182

