DOI 10.1007/s00521-014-1806-7
ORIGINAL ARTICLE
Received: 7 July 2014 / Accepted: 8 December 2014 / Published online: 31 December 2014
© The Natural Computing Applications Forum 2014
S. Saremi (corresponding author)
School of Information and Communication Technology, Nathan Campus, Griffith University, Brisbane, QLD 4111, Australia
e-mail: shahrzad.saremi@griffithuni.edu.au

S. Saremi
Queensland Institute of Business and Technology, Mt Gravatt, Brisbane, QLD 4122, Australia

S. Z. Mirjalili · S. M. Mirjalili
Zharfa Pajohesh System (ZPS) Co., Unit 5, No. 30, West 208 St., Third Sq. Tehranpars, P.O. Box 1653745696, Tehran, Iran
1 Introduction
Over the last two decades, nature-inspired optimization techniques have become reliable alternatives to conventional optimization approaches. As their name implies, such methods mostly mimic natural phenomena for optimization. The majority of nature-inspired algorithms are classified as stochastic optimization techniques; the term stochastic stands in opposition to deterministic.
In deterministic optimization, a given problem yields the same solution in every run. Stochastic algorithms, in contrast, may find different solutions for an optimization problem in every run. The general framework of stochastic optimization algorithms is that they start the optimization process with one or a set of random solutions. The candidate solution(s) are then improved based on the mechanisms of the algorithm. This process is run iteratively until an end criterion is satisfied.
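This general framework can be sketched as follows. The objective function, bounds, and the placeholder improvement step (a random walk toward the best solution found so far) are illustrative assumptions, not a mechanism from any specific algorithm discussed here.

```python
import random

def stochastic_optimize(objective, lb, ub, n_agents=30, max_iter=100, seed=None):
    """Generic skeleton of a stochastic optimizer: random initial
    solutions, iterative improvement, fixed-iteration end criterion."""
    rng = random.Random(seed)
    dim = len(lb)
    # Start with a set of random candidate solutions within the bounds.
    agents = [[rng.uniform(lb[d], ub[d]) for d in range(dim)]
              for _ in range(n_agents)]
    best = list(min(agents, key=objective))
    for _ in range(max_iter):
        # Algorithm-specific improvement mechanism; here, each agent
        # takes a random step toward the best solution found so far.
        for agent in agents:
            for d in range(dim):
                agent[d] += rng.uniform(0, 1) * (best[d] - agent[d])
                agent[d] = min(max(agent[d], lb[d]), ub[d])  # clamp to bounds
        candidate = min(agents, key=objective)
        if objective(candidate) < objective(best):
            best = list(candidate)
    return best

# Example: minimize the 2-D sphere function.
sphere = lambda x: sum(v * v for v in x)
result = stochastic_optimize(sphere, lb=[-100, -100], ub=[100, 100], seed=1)
```

Any of the algorithms listed below instantiates this skeleton by replacing the improvement step with its own operators.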
Due to the advantages of stochastic optimization paradigms, such as high local optima avoidance, high convergence speed, simplicity, and derivative-free mechanisms, several algorithms have been proposed recently. Some of the most popular are the genetic algorithm (GA), particle swarm optimization (PSO), ant colony optimization (ACO), and differential evolution (DE). There are also a
massive number of recently proposed algorithms in the
literature: gravitational local search (GLSA) [1], big bang
big crunch (BBBC) [2], gravitational search algorithm
(GSA) [3], central force optimization (CFO) [4], artificial
chemical reaction optimization algorithm (ACROA) [5],
black hole (BH) [6] algorithm, small-world optimization
algorithm (SWOA) [7], galaxy-based search algorithm
(GbSA) [8], curved space optimization (CSO) [9], biogeography-based optimizer (BBO) [10], marriage in honey
bees optimization algorithm (MBO) in 2001 [11], artificial
fish-swarm algorithm (AFSA) in 2003 [12], termite algorithm in 2005 [13], wasp swarm algorithm in 2007 [14],
monkey search in 2007 [15], bee collecting pollen algorithm (BCPA) in 2008 [16], Cuckoo search (CS) in 2009
[17], dolphin partner optimization (DPO) in 2009 [18],
firefly algorithm (FA) in 2010 [19], bird mating optimizer
(BMO) in 2012 [20], and fruit fly optimization algorithm
(FOA) in 2012 [21].
The large number of applications of stochastic optimization techniques in science and engineering is further evidence of the popularity and applicability of these algorithms. One way of improving the performance of the algorithms in this field is hybridization, in which different operators and mechanisms of two or more algorithms are combined. Some of the examples are
[Figure: evolutionary population dynamics (EPD) operating on the population of a meta-heuristic]
The position of a grey wolf is updated with respect to the position of the prey as follows:

\vec{X}(t+1) = \vec{X}_p(t) - \vec{A} \cdot \vec{D}    (3.2)

where t indicates the current iteration, \vec{A} = 2\vec{a} \cdot \vec{r}_1 - \vec{a}, \vec{C} = 2\vec{r}_2, \vec{D} = |\vec{C} \cdot \vec{X}_p(t) - \vec{X}(t)|, \vec{X}_p is the position vector of the prey, \vec{X} indicates the position vector of a grey wolf, \vec{a} is linearly decreased from 2 to 0, and \vec{r}_1, \vec{r}_2 are random vectors in [0, 1].

It should be noted that each omega wolf is required to update its position with respect to the alpha, beta, and delta wolves simultaneously as follows [46]:

\vec{D}_\alpha = |\vec{C}_1 \cdot \vec{X}_\alpha - \vec{X}|    (3.3)
\vec{D}_\beta = |\vec{C}_2 \cdot \vec{X}_\beta - \vec{X}|    (3.4)
\vec{D}_\delta = |\vec{C}_3 \cdot \vec{X}_\delta - \vec{X}|    (3.5)
\vec{X}_1 = \vec{X}_\alpha - \vec{A}_1 \cdot \vec{D}_\alpha    (3.6)
\vec{X}_2 = \vec{X}_\beta - \vec{A}_2 \cdot \vec{D}_\beta    (3.7)
\vec{X}_3 = \vec{X}_\delta - \vec{A}_3 \cdot \vec{D}_\delta    (3.8)
\vec{X}(t+1) = (\vec{X}_1 + \vec{X}_2 + \vec{X}_3) / 3    (3.9)
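The update defined by Eqs. (3.2)–(3.9) can be sketched in NumPy as below. The leader positions and coefficients are illustrative values, not taken from the paper's experiments; only the update formulas themselves follow the equations above.

```python
import numpy as np

def gwo_update(X, X_alpha, X_beta, X_delta, a, rng):
    """Update one grey wolf's position with respect to the alpha, beta,
    and delta wolves (Eqs. 3.3-3.9)."""
    X_new = np.zeros_like(X)
    for leader in (X_alpha, X_beta, X_delta):
        A = 2 * a * rng.random(X.shape) - a   # A = 2a*r1 - a
        C = 2 * rng.random(X.shape)           # C = 2*r2
        D = np.abs(C * leader - X)            # e.g. D_alpha = |C1*X_alpha - X|
        X_new += leader - A * D               # e.g. X1 = X_alpha - A1*D_alpha
    return X_new / 3                          # X(t+1) = (X1 + X2 + X3) / 3

# Hypothetical wolf and leader positions, for illustration only.
rng = np.random.default_rng(0)
X = rng.uniform(-100, 100, size=5)
X_alpha, X_beta, X_delta = (rng.uniform(-1, 1, size=5) for _ in range(3))
X_next = gwo_update(X, X_alpha, X_beta, X_delta, a=1.0, rng=rng)
```

In a full run, a would be decreased linearly from 2 to 0 over the iterations, and the alpha, beta, and delta wolves would be the three best solutions found so far.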
\vec{X}(t+1) = (ub - lb) \cdot \vec{r} + lb    (3.13)

where ub and lb indicate the upper and lower bounds of the variables and \vec{r} is a random vector in [0, 1].
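The random repositioning of Eq. (3.13) can be sketched as below. Reading EPD as "discard the worst half of the population and reinitialize it" is our interpretation of the mechanism, and the function name is illustrative.

```python
import numpy as np

def epd_reinitialize(population, fitness, lb, ub, rng):
    """Evolutionary population dynamics step: the worst half of the
    population is repositioned randomly within the bounds, following
    Eq. (3.13): X(t+1) = (ub - lb) * r + lb."""
    n, dim = population.shape
    order = np.argsort(fitness)        # ascending: best first (minimization)
    worst = order[n // 2:]             # indices of the worst half
    r = rng.random((len(worst), dim))  # random vectors in [0, 1]
    population[worst] = (ub - lb) * r + lb
    return population

# Hypothetical population and fitness values, for illustration only.
rng = np.random.default_rng(42)
pop = rng.uniform(-100, 100, size=(6, 3))
fit = np.array([1.0, 5.0, 0.5, 9.0, 3.0, 7.0])
pop = epd_reinitialize(pop, fit, lb=-100.0, ub=100.0, rng=rng)
```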
Unimodal benchmark functions:

Function | Dim | Range | f_min
F1(x) = \sum_{i=1}^{n} x_i^2 | 30 | [-100, 100] | 0
F2(x) = \sum_{i=1}^{n} |x_i| + \prod_{i=1}^{n} |x_i| | 30 | [-10, 10] | 0
F3(x) = \sum_{i=1}^{n} ( \sum_{j=1}^{i} x_j )^2 | 30 | [-100, 100] | 0
F4(x) = \max_i \{ |x_i|, 1 \le i \le n \} | 30 | [-100, 100] | 0
F5(x) = \sum_{i=1}^{n-1} [ 100 (x_{i+1} - x_i^2)^2 + (x_i - 1)^2 ] | 30 | [-30, 30] | 0
F6(x) = \sum_{i=1}^{n} ( \lfloor x_i + 0.5 \rfloor )^2 | 30 | [-100, 100] | 0
F7(x) = \sum_{i=1}^{n} i x_i^4 + \mathrm{random}[0, 1) | 30 | [-1.28, 1.28] | 0
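For concreteness, two of the unimodal benchmarks above, F1 (sphere) and F5 (Rosenbrock), can be implemented directly; these are the standard definitions, sketched here for convenience, with lowercase names mirroring the table labels.

```python
def f1(x):
    """F1: sphere function, sum of x_i^2; minimum 0 at the origin."""
    return sum(v * v for v in x)

def f5(x):
    """F5: Rosenbrock function, sum of 100*(x_{i+1} - x_i^2)^2 + (x_i - 1)^2;
    minimum 0 at (1, ..., 1)."""
    return sum(100 * (x[i + 1] - x[i] ** 2) ** 2 + (x[i] - 1) ** 2
               for i in range(len(x) - 1))
```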
Multimodal benchmark functions:

Function | Dim | Range | f_min
F8(x) = \sum_{i=1}^{n} -x_i \sin(\sqrt{|x_i|}) | 30 | [-500, 500] | -418.9829 × 5
F9(x) = \sum_{i=1}^{n} [ x_i^2 - 10 \cos(2 \pi x_i) + 10 ] | 30 | [-5.12, 5.12] | 0
F10(x) = -20 \exp( -0.2 \sqrt{ \frac{1}{n} \sum_{i=1}^{n} x_i^2 } ) - \exp( \frac{1}{n} \sum_{i=1}^{n} \cos(2 \pi x_i) ) + 20 + e | 30 | [-32, 32] | 0
F11(x) = \frac{1}{4000} \sum_{i=1}^{n} x_i^2 - \prod_{i=1}^{n} \cos( \frac{x_i}{\sqrt{i}} ) + 1 | 30 | [-600, 600] | 0
F12(x) = \frac{\pi}{n} \{ 10 \sin^2(\pi y_1) + \sum_{i=1}^{n-1} (y_i - 1)^2 [ 1 + 10 \sin^2(\pi y_{i+1}) ] + (y_n - 1)^2 \} + \sum_{i=1}^{n} u(x_i, 10, 100, 4) | 30 | [-50, 50] | 0
F13(x) = 0.1 \{ \sin^2(3 \pi x_1) + \sum_{i=1}^{n} (x_i - 1)^2 [ 1 + \sin^2(3 \pi x_i + 1) ] + (x_n - 1)^2 [ 1 + \sin^2(2 \pi x_n) ] \} + \sum_{i=1}^{n} u(x_i, 5, 100, 4) | 30 | [-50, 50] | 0

where y_i = 1 + \frac{x_i + 1}{4} and

u(x_i, a, k, m) =
  k (x_i - a)^m,   x_i > a
  0,               -a < x_i < a
  k (-x_i - a)^m,  x_i < -a
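Similarly, two of the multimodal benchmarks, F9 (Rastrigin) and F10 (Ackley), can be sketched directly from the definitions above; these are the standard forms of both functions.

```python
import math

def f9(x):
    """F9: Rastrigin function, sum of x_i^2 - 10*cos(2*pi*x_i) + 10;
    minimum 0 at the origin."""
    return sum(v * v - 10 * math.cos(2 * math.pi * v) + 10 for v in x)

def f10(x):
    """F10: Ackley function; minimum 0 at the origin."""
    n = len(x)
    s1 = sum(v * v for v in x) / n
    s2 = sum(math.cos(2 * math.pi * v) for v in x) / n
    return -20 * math.exp(-0.2 * math.sqrt(s1)) - math.exp(s2) + 20 + math.e
```

Unlike the unimodal set, these functions have many local minima, which is what makes them useful for testing local optima avoidance.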
Results on the benchmark functions (ave = average, std = standard deviation):

Function | GWOEPD ave | GWOEPD std | GWO ave | GWO std
F1 | 1.6915 | 0.6126 | 86.2764 | 463.9271
F2 | 54.8619 | 23.4144 | 55.5763 | 24.6718
F3 | 5,603.5083 | 2,463.6797 | 5,795.515 | 2,599.173
F4 | 1.1309 | 0.2842 | 1.0276 | 0.2583
F5 | 798.0715 | 794.6384 | 651.1185 | 767.3522
F6 | 149.0123 | 806.9220 | 160.3304 | 869.2208
F7 | 0.2440 | 0.2653 | 0.2462 | 0.2277
F8 | -12,702.47 | 932.4343 | -12,465.01 | 1,362.617
F9 | 147.8416 | 54.1366 | 169.9558 | 61.9270
F10 | 20.8627 | 0.0876 | 20.8604 | 0.0740
F11 | 1.9205 | 5.3332 | 1.9207 | 5.3631
F12 | 0.1422 | 0.1634 | 0.1623 | 0.1695
F13 | 0.0825 | 0.0314 | 0.0967 | 0.0346
5 Conclusion
References
53. Molga M, Smutnicki C (2005) Test functions for optimization needs
54. Yang X-S (2010) Test problems in optimization. arXiv preprint arXiv:1008.0549
55. Mirjalili S, Lewis A (2013) S-shaped versus V-shaped transfer functions for binary particle swarm optimization. Swarm Evolut Comput 9:1–14
56. Saremi S, Mirjalili S, Lewis A (2014) How important is a transfer function in discrete heuristic algorithms. Neural Comput Appl 1–16. doi:10.1007/s00521-014-1743-5
57. Saremi S, Mirjalili S, Lewis A (2014) Biogeography-based optimisation with chaos. Neural Comput Appl 25(5):1077–1097
58. Saremi S, Mirjalili SM, Mirjalili S (2014) Chaotic krill herd optimization algorithm. Proc Technol 12:180–185