$$x_i(gen+1) = \begin{cases} x_i^{child}(gen) & \text{if } f\left(x_i^{child}\right) \text{ is better than } f\left(x_i^{p}\right) \\ x_i^{p}(gen) & \text{otherwise} \end{cases} \qquad (2)$$
Therefore, the knock-out competition mechanism also serves as the fitness evaluation scheme
for the DE algorithm. The parameter settings for the DE algorithm are given in Table 1, and
the DE procedure is shown in Algorithm 1.
Algorithm 1: Differential Evolution (DE)
Step 1: Initialize the individual size N, the population size P, the cross-over probability CR and the amplification factor F
Step 2: Randomly initialize the population vectors, x_i^G
Step 3: Randomly select one principal parent, x_i^p
Step 4: Randomly select three auxiliary parents, x_i^a
Step 5: Perform differential mutation and generate the mutated vector, V_i
Step 6: Recombine V_i with x_i^p to generate the child trial vector, x_i^child
Step 7: Perform knock-out competition for next-generation survival selection
Step 8: If the fitness criterion is satisfied and t = T_max, halt and print solutions; else proceed to Step 3
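The mutation, recombination and knock-out steps above can be sketched compactly. The following is an illustrative Python sketch (the paper's own implementation was in C++), written for minimization on a stand-in sphere objective; maximizing a weighted sum simply flips the comparison in the knock-out step. The bounds and the sphere objective are assumptions for the example, not the paper's setup:

```python
import random

def de_optimize(f, bounds, P=7, F=0.3, CR=0.667, T_max=200):
    """Minimal DE sketch: differential mutation, binomial crossover,
    and knock-out (one-to-one) survival selection, for minimization."""
    N = len(bounds)  # individual size (number of decision variables)
    # Randomly initialize P population vectors of N decision variables each
    pop = [[random.uniform(lo, hi) for (lo, hi) in bounds] for _ in range(P)]
    for _ in range(T_max):
        for i in range(P):
            xp = pop[i]  # principal parent
            # three auxiliary parents, distinct from the principal parent
            a1, a2, a3 = random.sample([p for j, p in enumerate(pop) if j != i], 3)
            v = [a1[d] + F * (a2[d] - a3[d]) for d in range(N)]  # mutated vector
            jr = random.randrange(N)  # force at least one gene from the mutated vector
            child = [v[d] if (random.random() < CR or d == jr) else xp[d]
                     for d in range(N)]
            # clamp the child trial vector back into the feasible box
            child = [min(max(child[d], bounds[d][0]), bounds[d][1]) for d in range(N)]
            if f(child) < f(xp):  # knock-out competition against the principal parent
                pop[i] = child
    return min(pop, key=f)

# Usage: minimize a simple 6-variable sphere function as a stand-in objective
best = de_optimize(lambda x: sum(v * v for v in x), [(-5.0, 5.0)] * 6)
```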
Table 1: Differential Evolution (DE) Parameter Settings

Parameters                        Values
Individual size, N                6
Population size, P                7
Mutation amplification factor, F  0.3
Cross-over probability, CR        0.667
Gravitational Search Algorithm (GSA)
The GSA is a meta-heuristic algorithm first developed in 2009 by E. Rashedi et
al. [23]. The technique was inspired by the law of gravity and the idea of interacting
masses. The algorithm uses the Newtonian gravitational laws, where the search agents are the
associated masses. Thus, the gravitational forces influence the motion of these masses, with
lighter masses gravitating towards heavier masses (which signify good solutions) during
these interactions. The gravitational force hence acts as the communication mechanism for
the masses (analogous to pheromone deposition for the ant agents in ACO [32] and the social
component for the particle agents in PSO [25]). The positions of the masses correspond to the
solution space in the search domain, while the masses characterize the fitness space. As the
iterations proceed and gravitational interactions occur, the masses are expected to
conglomerate at their fittest positions and provide an optimal solution to the problem.
Initially, the GSA algorithm randomly generates a distribution of masses, m_i(t) (the search
agents), and also sets an initial position, x_i^d, for each mass. For a minimization problem, the
least fit mass, m_worst(t), and the fittest mass, m_best(t), at time t are calculated as follows:
$$m_{best}(t) = \min_{j \in [1,N]} m_j(t) \qquad (3)$$
$$m_{worst}(t) = \max_{j \in [1,N]} m_j(t) \qquad (4)$$
For a maximization problem, it is simply the reverse. The inertial mass, m'_i(t), and the
gravitational masses, M_i(t), are then computed based on the fitness map developed
previously.
$$m'_i(t) = \frac{m_i(t) - m_{worst}(t)}{m_{best}(t) - m_{worst}(t)} \qquad (5)$$
$$M_i(t) = \frac{m'_i(t)}{\sum_{j=1}^{N} m'_j(t)} \qquad (6)$$
such that,
$$M_{ai} = M_{pi} = M_{ii} = M_i, \quad \forall i \in [1, N] \qquad (7)$$
Then the gravitational constant, G(t+1), and the Euclidean distance, R_ij(t), are computed as
follows:
$$G(t+1) = G(t)\exp\left(-\alpha \frac{t}{T_{max}}\right) \qquad (8)$$
$$R_{ij}(t) = \left\| x_i(t) - x_j(t) \right\|_2 \qquad (9)$$
where α is some arbitrary constant, T_max is the maximum number of iterations, and x_i(t) and
x_j(t) are the positions of particles i and j at time t. The interaction forces at time t, F_ij^d(t), for
each of the masses are then computed:
$$F_{ij}^{d}(t) = G(t)\left(\frac{M_{pi}(t)\,M_{aj}(t)}{R_{ij}(t) + \varepsilon}\right)\left(x_j^{d}(t) - x_i^{d}(t)\right) \qquad (10)$$
where ε is some small parameter. The total force acting on each mass i is given in
stochastic form as follows:
$$F_i^{d}(t) = \sum_{j=1,\, j \neq i}^{N} rand(w_j)\,F_{ij}^{d}(t), \quad rand(w_j) \in [0,1] \qquad (11)$$
where rand(w_j) is a randomly assigned weight. Consequently, the acceleration of each of the
masses, a_i^d(t), is as follows:
$$a_i^{d}(t) = \frac{F_i^{d}(t)}{M_{ii}(t)} \qquad (12)$$
After the computation of the particle acceleration, the particle velocity and position are then
calculated:
$$v_i^{d}(t+1) = rand(w_j)\,v_i^{d}(t) + a_i^{d}(t) \qquad (13)$$
$$x_i^{d}(t+1) = x_i^{d}(t) + v_i^{d}(t+1) \qquad (14)$$
The iterations are continued until all mass agents are at their fittest positions in the fitness
landscape and some user-defined stopping criterion is met. The GSA algorithm is presented in
Algorithm 2 and the parameter settings are given in Table 2:
Algorithm 2: Gravitational Search Algorithm (GSA)
Step 1: Initialize the number of mass agents, m_i, and their initial positions, x_i(0)
Step 2: Initialize the algorithm parameters G(0) and α
Step 3: Compute the gravitational and inertial masses based on the fitness map
Step 4: Compute the gravitational constant, G(t)
Step 5: Compute the distance between agents, R_ij(t)
Step 6: Compute the total force, F_i^d(t), and the acceleration, a_i^d(t), of each agent
Step 7: Compute the new velocity, v_i(t), and position, x_i(t), of each agent
Step 8: If the fitness criterion is satisfied and t = T_max, halt and print solutions; else proceed to Step 3
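A single GSA iteration under equations (3)-(14) can be sketched as follows. This is an illustrative Python sketch (not the authors' C++ code), using the minimization form of the mass normalization and treating the stochastic weights rand(w_j) as fresh uniform draws; the guard values for degenerate denominators are assumptions of the sketch:

```python
import math
import random

def gsa_step(X, V, fits, t, G0=100.0, alpha=20.0, eps=0.01, T_max=100):
    """One GSA iteration following Eqs. (3)-(14), minimization form."""
    N, dim = len(X), len(X[0])
    best, worst = min(fits), max(fits)                  # Eqs. (3)-(4)
    den = (best - worst) if best != worst else 1.0      # guard: identical fitnesses
    m = [(f - worst) / den for f in fits]               # Eq. (5): normalized masses
    total = sum(m) or 1.0
    M = [mi / total for mi in m]                        # Eq. (6)
    G = G0 * math.exp(-alpha * t / T_max)               # Eq. (8)
    newX, newV = [], []
    for i in range(N):
        acc = [0.0] * dim
        for j in range(N):
            if j == i:
                continue
            R = math.dist(X[i], X[j])                   # Eq. (9): Euclidean distance
            for d in range(dim):
                # Eqs. (10)-(12) combined: dividing the force by M_i cancels
                # the agent's own mass, leaving only M_j in the acceleration
                acc[d] += random.random() * G * M[j] * (X[j][d] - X[i][d]) / (R + eps)
        v = [random.random() * V[i][d] + acc[d] for d in range(dim)]  # Eq. (13)
        newV.append(v)
        newX.append([X[i][d] + v[d] for d in range(dim)])             # Eq. (14)
    return newX, newV
```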
Table 2: Gravitational Search Algorithm (GSA) Settings

Parameters                Values
Initial parameter, G_0    100
Number of mass agents, n  6
Constant parameter, α     20
Constant parameter, ε     0.01
3. Hypervolume Indicator
The Hypervolume Indicator (HVI) is the only strictly pareto-compliant indicator that can be
used to measure the quality of solution sets in MO optimization problems [13], [24]. Strict
pareto compliance means that, given two solution sets for a particular MO problem, the
solution set that dominates the other will have a higher indicator value. The HVI measures
the volume of the dominated section of the objective space and can be applied in
multi-dimensional scenarios. When using the HVI, a reference point needs to be defined.
Relative to this point, the volume of the space of all dominated solutions can be measured.
The HVI of a solution set X can be defined as follows:
$$HVI(X) = vol\left(\bigcup_{(x_1, \ldots, x_d) \in X} [r_1, x_1] \times \cdots \times [r_d, x_d]\right) \qquad (15)$$
where (r_1, ..., r_d) is the reference point and vol(·) is the usual Lebesgue measure. In this work,
the HVI is used to measure the quality of the approximation of the pareto front by the GSA
and the DE algorithms when used in conjunction with the weighted sum approach.
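For intuition, in the two-objective maximization case the HVI of equation (15) reduces to a sum of rectangle areas swept against the reference point. The sketch below is illustrative only (the problems treated later are four-objective, which needs a proper multi-dimensional algorithm):

```python
def hvi_2d(points, ref):
    """Hypervolume of the region dominated by `points` and bounded
    below by the reference point `ref`, for two maximized objectives."""
    # Keep only points that strictly dominate the reference point
    pts = [p for p in points if p[0] > ref[0] and p[1] > ref[1]]
    # Sweep in decreasing f1; each point contributes a rectangle whose
    # height is its f2 improvement over the best f2 seen so far
    pts.sort(key=lambda p: -p[0])
    vol, h = 0.0, ref[1]
    for f1, f2 in pts:
        if f2 > h:
            vol += (f1 - ref[0]) * (f2 - h)
            h = f2
    return vol

# Usage: three mutually non-dominated points against the origin
print(hvi_2d([(3, 1), (2, 2), (1, 3)], (0, 0)))  # union of rectangles: 6.0
```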
4. Application Data
In the green sand mould system, the response parameters of the mould heavily influence
the quality of the final product. In Surekha et al. [19], these parameters are selected as the
objective functions. The response parameters are: green compression strength (f1),
permeability (f2), hardness (f3) and bulk density (f4). These objectives, in turn, are
influenced by the process variables, which are: the grain fineness number (A), the percentage
of clay content (B), the percentage of water content (C) and the number of strokes (D). The
objective functions and the ranges of the decision variables are as follows:
$$\begin{aligned} f_1 = {} & 17.2527 - 1.7384A - 2.7463B + 32.3203C + 6.575D + 0.014A^2 + 0.0945B^2 \\ & - 7.7857C^2 - 1.2079D^2 + 0.0468AB - 0.1215AC - 0.0451AD + 0.5516BC \\ & + 0.6378BD + 2.689CD \end{aligned} \qquad (16)$$

$$\begin{aligned} f_2 = {} & 1192.51 - 15.98A - 35.66B + 9.51C - 105.66D + 0.07A^2 + 0.45B^2 \\ & - 4.13C^2 + 4.22D^2 + 0.11AB + 0.2AC + 0.52AD + 1.19BC + 1.99BD \\ & - 3.1CD \end{aligned} \qquad (17)$$

$$\begin{aligned} f_3 = {} & 38.2843 - 0.0494A + 2.4746B + 7.8434C + 7.774D + 0.001A^2 - 0.00389B^2 \\ & - 1.6988C^2 - 0.6556D^2 - 0.0015AB - 0.0151AC - 0.0006AD - 0.075BC \\ & - 0.1938BD + 0.65CD \end{aligned} \qquad (18)$$

$$\begin{aligned} f_4 = {} & 1.02616 + 0.01316A - 0.00052B - 0.06845C + 0.0083D - 0.00008A^2 \\ & + 0.0009B^2 + 0.0239C^2 - 0.00107D^2 - 0.00004AB - 0.00018AC \\ & + 0.00029AD - 0.00302BC - 0.00019BD - 0.00186CD \end{aligned} \qquad (19)$$

$$52 \le A \le 94, \quad 8 \le B \le 12, \quad 1.5 \le C \le 3, \quad 3 \le D \le 5 \qquad (20)$$
To obtain the size distributions of the silica sand and the grain fineness number, sieve
analysis tests were carried out in Parappagoudar et al. [33]. Similarly, the authors also
conducted gelling index tests to determine the strength of the clay. Next, experiments
were conducted by varying combinations of the parameters using the central composite
design. The mathematical model of the green mould system was developed, with the
objective functions given in equations (16)-(19) and the constraints given in equation
(20). The MO optimization problem statement for the green mould system problem is as
follows:
$$\text{Max } (f_1, f_2, f_3, f_4) \quad \text{subject to} \quad 52 \le A \le 94,\; 8 \le B \le 12,\; 1.5 \le C \le 3,\; 3 \le D \le 5 \qquad (21)$$
The algorithms used in this work were programmed using the C++ programming language on
a personal computer (PC) with an Intel dual core processor running at 2 GHz.
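Under the assumption that equations (16)-(20) are the fitted response models, they can be encoded directly. The sketch below is in Python (the paper's own code was C++) and wires the four models into the weighted-sum scalarization of equation (21):

```python
def f1(A, B, C, D):  # green compression strength, Eq. (16)
    return (17.2527 - 1.7384*A - 2.7463*B + 32.3203*C + 6.575*D
            + 0.014*A**2 + 0.0945*B**2 - 7.7857*C**2 - 1.2079*D**2
            + 0.0468*A*B - 0.1215*A*C - 0.0451*A*D
            + 0.5516*B*C + 0.6378*B*D + 2.689*C*D)

def f2(A, B, C, D):  # permeability, Eq. (17)
    return (1192.51 - 15.98*A - 35.66*B + 9.51*C - 105.66*D
            + 0.07*A**2 + 0.45*B**2 - 4.13*C**2 + 4.22*D**2
            + 0.11*A*B + 0.2*A*C + 0.52*A*D
            + 1.19*B*C + 1.99*B*D - 3.1*C*D)

def f3(A, B, C, D):  # mould hardness, Eq. (18)
    return (38.2843 - 0.0494*A + 2.4746*B + 7.8434*C + 7.774*D
            + 0.001*A**2 - 0.00389*B**2 - 1.6988*C**2 - 0.6556*D**2
            - 0.0015*A*B - 0.0151*A*C - 0.0006*A*D
            - 0.075*B*C - 0.1938*B*D + 0.65*C*D)

def f4(A, B, C, D):  # bulk density, Eq. (19)
    return (1.02616 + 0.01316*A - 0.00052*B - 0.06845*C + 0.0083*D
            - 0.00008*A**2 + 0.0009*B**2 + 0.0239*C**2 - 0.00107*D**2
            - 0.00004*A*B - 0.00018*A*C + 0.00029*A*D
            - 0.00302*B*C - 0.00019*B*D - 0.00186*C*D)

# decision variable ranges for (A, B, C, D), Eq. (20)
BOUNDS = [(52, 94), (8, 12), (1.5, 3), (3, 5)]

def weighted_fitness(x, w):
    """Scalarized objective w1*f1 + w2*f2 + w3*f3 + w4*f4 to maximize, Eq. (21)."""
    A, B, C, D = x
    return (w[0]*f1(A, B, C, D) + w[1]*f2(A, B, C, D)
            + w[2]*f3(A, B, C, D) + w[3]*f4(A, B, C, D))
```

Evaluating these models at a feasible point and maximizing `weighted_fitness` for a given weight vector reproduces one scalarized subproblem of the progression used to trace the frontier.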
5. Results and Analysis
In this work, the solution sets which approximate the pareto frontier were
obtained using the DE and the GSA methods. The quality of these solutions was measured
using the HVI. The nadir point (or reference point) used in the HVI is a specified point
that is dominated by all solution sets produced by the algorithms. The nadir point
selected in this work is (r1, r2, r3, r4) = (15, 50, 40, 1). The individual solutions (for specific
weights) of the GSA and the DE algorithms were gauged with the HVI and the best, median
and worst solutions were determined. The individual solutions for the GSA algorithm and
their HVI values are given in Table 3. The associated weights, computational time as well
as the number of function evaluations are given in Table 4.
For the approximation of the pareto frontier, 21 solutions for various weights were obtained
for both algorithms. The approximate pareto frontier obtained using the GSA algorithm
is shown in Figure 1:
Table 3: Individual Solutions generated by the GSA algorithm

Description             Best       Median     Worst
Objective Function
  f1                    34.6562    25.8028    25.5917
  f2                    205.619    210.839    210.877
  f3                    81.1777    78.8635    78.8113
  f4                    1.46948    1.47897    1.47937
Decision Variable
  x1                    52.0015    52.0023    52.0009
  x2                    8.00543    8.00205    8.0007
  x3                    2.49446    1.51268    1.5
  x4                    3.19889    3.00254    3
Hypervolume Indicator   59134.56   32342.82   31702.15

Table 4: Weights, computational time and the number of function evaluations of the
individual solutions generated by the GSA algorithm

Description                   Best    Median     Worst
Computational Time (secs)     0.52    0.128571   0.042857
No. of Function Evaluations   182     45         15
w1                            0.2     0.1        0.1
w2                            0.3     0.6        0.2
w3                            0.4     0.2        0.5
w4                            0.1     0.1        0.2
Figure 1: Pareto frontiers of the objectives obtained by the GSA method
The individual solutions for the DE algorithm and their HVI values are given in
Table 5. The associated weights, computational time as well as the number of function
evaluations are given in Table 6. The approximate pareto frontier obtained using the DE
algorithm is shown in Figure 2:
Figure 2: Pareto frontiers of the objectives obtained by the DE method
Table 5: Individual Solutions generated by the DE algorithm

Description             Best       Median     Worst
Objective Function
  f1                    50.2281    45.8468    39.4176
  f2                    137.769    137.19     137.105
  f3                    86.289     85.3638    83.7878
  f4                    1.50073    1.50953    1.52143
Decision Variable
  x1                    53.0457    55.5626    59.4985
  x2                    9.39392    9.13418    8.74959
  x3                    2.99849    2.99044    2.99944
  x4                    4.39392    4.13418    3.74959
Hypervolume Indicator   71665.77   62166.44   48561.85

Table 6: Weights, computational time and the number of function evaluations of the
individual solutions generated by the DE algorithm

Description                   Best       Median     Worst
Computational Time (secs)     0.191129   0.419344   6.022004
No. of Function Evaluations   67         47         2111
w1                            0.4        0.4        0.2
w2                            0.2        0.2        0.2
w3                            0.3        0.1        0.1
w4                            0.1        0.3        0.5
The best solution candidate in Table 4 was obtained by the GSA method at the weights
(objective function trade-offs) (w1, w2, w3, w4) = (0.2, 0.3, 0.4, 0.1). The best solution
candidate obtained by the DE method was at the weights (w1, w2, w3, w4) = (0.4, 0.2, 0.3, 0.1)
(see Table 6). Using the HVI, it can be observed that the best solution obtained by the DE
algorithm dominates the best solution produced by the GSA algorithm by 21.19%. In
addition, the number of function evaluations taken to obtain the best solution by the DE
method is smaller than that of the GSA method. The comparison of the best candidate
solutions obtained by the DE and GSA methods in this work, as well as the PSO method in
Surekha et al. [19], is shown in Table 7. In Table 7, it can be seen that the best solutions
produced by DE and GSA dominate the one produced by the PSO approach [19].
The HVI computed for the entire frontier of each solution set produced by an algorithm
gives the true measure of dominance when comparing algorithms. In this work, the HVI for
the entire frontier was computed for each algorithm, and the execution time for each
algorithm to generate the entire frontier was also recorded. The HVI of the entire frontier
for the solution sets produced by PSO [19], DE as well as GSA, and the associated
execution times, are shown in Table 8.
Table 7: Comparison of the best solutions obtained by the algorithms

Description           PSO [19]      DE            GSA
Objective Function
  f1                  55.4112       50.2281       34.6562
  f2                  107.8949      137.769       205.619
  f3                  84.7936       86.289        81.1777
  f4                  1.5079        1.50073       1.46948
Decision Variable
  x1                  52.0001       53.0457       52.0015
  x2                  11.9998       9.39392       8.00543
  x3                  2.8452        2.99849       2.49446
  x4                  4.9999        4.39392       3.19889
HVI                   8876.715583   71665.77161   59134.55919
Table 8: The HVI obtained by the algorithms and the computational time for the entire
frontier

                            PSO [19]     DE             GSA
HVI                         130,108.48   4,925,326.52   3,095,494.45
Computational time (secs)   0.013        39.08          21.20
As shown in Table 8, the frontier produced by the DE algorithm is more dominant than those
produced by the GSA and PSO [19] algorithms by 59.1128% and 3685.5538% respectively.
The GSA algorithm produces a frontier that is more dominant than that of the PSO [19]
algorithm by 2279.1643%. The dominance ranking of the approximate pareto frontiers
produced by the algorithms is as follows:
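These percentages follow directly from the ratios of the frontier hypervolumes (PSO 130,108.48; DE 4,925,326.52; GSA 3,095,494.45); a quick arithmetic check:

```python
# Frontier hypervolumes for the three algorithms
hvi = {"PSO": 130108.48, "DE": 4925326.52, "GSA": 3095494.45}

def dominance_gain(a, b):
    """Percentage by which frontier a's hypervolume exceeds frontier b's."""
    return 100.0 * (hvi[a] - hvi[b]) / hvi[b]

print(round(dominance_gain("DE", "GSA"), 4))   # ~59.11
print(round(dominance_gain("DE", "PSO"), 4))   # ~3685.55
print(round(dominance_gain("GSA", "PSO"), 4))  # ~2279.16
```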
$$DE \succ GSA \succ PSO\,[19] \qquad (22)$$
where the symbol ≻ denotes "is more dominant than". A new optimum is achieved by the
DE method (see Table 7), since it outperforms the GSA and the PSO [19] methods. The
individual best solution of the DE method compromises on the objective f2 while maximizing
the objectives f1, f3 and f4 very effectively as compared to the one obtained using GSA.
Compared to the one obtained using the PSO [19] method, the individual best solution of the
DE method compromises on the objectives f1 and f2 while maximizing the objectives f3 and
f4. Thus, it can be said that the DE method in this work surpasses the overall optimization
capabilities of GSA and PSO [19] (see the HVI values in Table 8). In terms of the
computational time taken to produce the entire approximate pareto frontier, the DE method
takes the longest, followed by the GSA and the PSO [19] methods respectively. Although the
DE algorithm produces the most dominant frontier, it sacrifices computational time as
compared to the GSA and PSO [19] methods.
In Surekha et al. [19], the GA and the PSO methods were used in conjunction with the
weighted sum method on an Intel Pentium IV processor (single core). As mentioned
previously, the algorithms presented in this work, DE and GSA, were executed on an Intel
dual-core processor, which is superior to the machine used in Surekha et al. [19].
However, it can be seen that the computational times of the DE and GSA algorithms are far
greater than those of the GA and PSO in [19]. Although DE and GSA were executed on a
superior machine, these algorithms appear computationally inferior to the GA and PSO [19]
algorithms. This can be mainly attributed to the higher complexity of the DE and GSA
algorithms presented in this work.
In Surekha et al. [19], as well as in this work, the scalarization scheme adopted is the
weighted sum method, which produces a progression of approximate pareto-efficient
solutions for various trade-offs in the objective functions. As can be seen in Figures 1 and 2,
the frontier produced by the DE algorithm is more uniformly spaced than that of the GSA
algorithm. The frontier produced by the GSA algorithm appears conglomerated and localized
at certain portions of the objective space. As seen in this work, this spacing property heavily
influences the ability of an algorithm to approximate the pareto frontier. Localized frontiers
such as the ones produced by the GSA algorithm miss out on certain solutions in the
objective space. This causes the GSA algorithm to have a lower HVI than the DE algorithm
and, in effect, the approximate pareto frontier produced by the GSA algorithm is less
dominant than the one produced by the DE algorithm.
One of the setbacks of the weighted sum method is that it does not guarantee pareto
optimality (only in the weak sense [34]). Besides, scalarization techniques (such as the
weighted sum as well as Normal Boundary Intersection (NBI)) cannot approximate sections
of the pareto frontier that are concave [35]. Although the HVI metric is most effective in
benchmarking the dominance of solution sets produced by algorithms, it is very dependent on
the choice of the nadir point (reference point). This is the HVI metric's only weakness.
In this work, the DE and GSA algorithms performed stable computations during the
program executions. All pareto-efficient solutions produced by the algorithms developed in
this work were feasible and no constraints were violated. One of the advantages of the DE
algorithm over the other algorithms used in this work is that it is highly effective at
approximating the pareto frontier. Although the DE algorithm performs well relative to the
other algorithms used in this work, its execution time is clearly much higher than the one
obtained by Surekha et al. [19] using the PSO method. Since DE is an evolutionary-type
algorithm, the diversification of the search space is high, resulting in a high computational
time as compared with the GSA and PSO methods, which are swarm-type algorithms.
The GSA method can be regarded as the second-best optimizer after the DE method. In
comparison with the PSO algorithm, the GSA method produces superior results.
6. Conclusions
In this work, a new local maximum and a more dominant approximation of the pareto
frontier were achieved using the DE method, and more pareto-efficient solution options to
the green mould system MO optimization problem were obtained. In addition, with the DE
algorithm the solution spread of the frontier was near-uniformly distributed. When gauged
with the HVI metric, the DE algorithm produced the most dominant approximation of the
pareto frontier as compared to the GSA and the PSO [19] methods.
For future work, other meta-heuristic algorithms such as Genetic Programming (GP) [36],
Analytical Programming (AP) [37], Hybrid Neuro-GP [38], MO evolutionary algorithms [39],
[40] and Hybrid Neuro-Swarm [33] should be applied to the green mould system.
The solution sets obtained by these algorithms should be tested and evaluated using the
HVI metric. The HVI metric should be tested with various nadir points for a better
understanding of the effect of these points on the dominance measurements of solution sets.
During these numerical experiments, the spacing metric should also be measured and
compared to observe the uniformity of the spreads with respect to the algorithms.
Convergence and diversity metrics should also be utilized to compare the performance of
the algorithms. More large-scale MO problems should be studied using the DE and GSA
algorithms for a better understanding of these algorithms' performance and efficiency.
Acknowledgements
This work was supported by STIRF grant (STIRF CODE NO: 90/10.11) of University
Technology Petronas (UTP), Malaysia.
References
[1] H. Eschenauer, J. Koski, and A. Osyczka, Multicriteria Design Optimization, Springer-
Verlag, Berlin, 1990.
[2] R.B. Statnikov and J.B. Matusov, Multicriteria Optimization and Engineering,
Chapman and Hall, New York, 1995.
[3] E. Zitzler and L. Thiele, Multiobjective optimization using evolutionary algorithms: A
comparative case study, In A. E. Eiben, T. Back, M. Schoenauer, and H. P. Schwefel
(Eds.), Parallel Problem Solving from Nature, V, Springer, Berlin, Germany, 1998, pp
292-301.
[4] K. Deb, A. Pratap, S. Agarwal, T. Meyarivan, A Fast and Elitist Multi-Objective
Genetic Algorithm: NSGA-II, IEEE Transactions on Evolutionary Computation, 6 (2)
(2002) 182-197.
[5] P.C. Fishburn, Additive Utilities with Incomplete Product Set: Applications to
Priorities and Assignments, Operations Research Society of America (ORSA),
Baltimore, MD, U.S.A, 1967.
[6] E. Triantaphyllou, Multi-Criteria Decision Making: A Comparative Study. Dordrecht,
The Netherlands, Kluwer Academic Publishers (now Springer), 2000, pp 320.
[7] M. L. Luyben and C. A. Floudas, Analyzing the interaction of design and control. 1. A
multiobjective framework and application to binary distillation synthesis, Computers
and Chemical Engineering, 18 (10) (1994) 933-969.
[8] I. Das and J.E. Dennis, Normal-boundary intersection: A new method for generating
the Pareto surface in nonlinear multicriteria optimization problems, SIAM Journal on
Optimization 8(3) (1998) 631-657.
[9] H. Eschenauer, J. Koski, and A. Osyczka, Multicriteria Design Optimization, Berlin:
Springer-Verlag, 1990.
[10] E.Sandgren, Multicriteria design optimization by goal programming, In H. Adeli,
editor, Advances in Design Optimization, Chapman & Hall, London, 1994, pp 225-265.
[11] R. B. Stanikov and J. B. Matusov. Multicriteria Optimization and Engineering. New
York: Chapman and Hall, 1995.
[12] C. Grosan, M. Oltean., D.Dumitrescu, Performance Metrics for Multiobjective
Optimization Evolutionary Algorithms, In Proceedings of Conference on Applied and
Industrial Mathematics (CAIM), Oradea, 2003.
[13] E. Zitzler and L. Thiele, Multiobjective Optimization Using Evolutionary Algorithms -
A Comparative Case Study, In Conference on Parallel Problem Solving from Nature
(PPSN V), pp 292-301, 1998.
[14] J. Knowles and D. Corne, Properties of an Adaptive Archiving Algorithm for Storing
Nondominated Vectors, IEEE Transactions on Evolutionary Computation, 7(2):100-
116, 2003.
[15] C. Igel, N. Hansen, and S. Roth, Covariance Matrix Adaptation for Multi-objective
Optimization, Evolutionary Computation, 15(1):1-28, 2007.
[16] M. Emmerich, N. Beume, and B. Naujoks, An EMO Algorithm Using the
Hypervolume Measure as Selection Criterion, In Conference on Evolutionary Multi-
Criterion Optimization (EMO 2005), pp 62-76, Springer, 2005.
[17] M. Fleischer, The measure of Pareto optima. Applications to multi-objective
metaheuristics, In Conference on Evolutionary Multi-Criterion Optimization (EMO
2003), pp 519-533, Springer, 2003.
[18] E. Zitzler, L. Thiele, M. Laumanns, C. M. Fonseca, and V. Grunert da Fonseca,
Performance Assessment of Multiobjective Optimizers: An Analysis and Review, IEEE
Transactions on Evolutionary Computation, 7(2):117-132, 2003.
[19] B. Surekha, Lalith K. Kaushik, A. K. Panduy, A. P. R. Vundavilli and M. B.
Parappagoudar, Multi-objective optimization of green sand mould system using
evolutionary algorithms, International Journal of Advanced Manufacturing Technology,
2011, pp 1-9.
[20] K. Sushil, P.S. Satsangi, D.R. Prajapati, Optimization of green sand casting process
parameters of a foundry by using Taguchi method, International Journal of Advanced
Manufacturing Technology, 55 (2010) 23-34.
[21] R.S. Rosenberg, Simulation of genetic populations with biochemical properties, Ph.D.
thesis, University of Michigan, 1967.
[22] R. Storn and K. V. Price, Differential evolution: a simple and efficient adaptive
scheme for global optimization over continuous spaces, ICSI, Technical Report
TR-95-012, 1995.
[23] E. Rashedi, H. Nezamabadi-pour, S. Saryazdi, GSA: A Gravitational Search Algorithm,
Information Sciences, Vol. 179 (2009), pp 2232-2248.
[24] E. Zitzler and L. Thiele, Multiobjective evolutionary algorithms: a comparative case
study and the strength Pareto approach, IEEE Transactions on Evolutionary
Computation, 3(4):257-271, 1999.
[25] J. Kennedy, and R. Eberhart, Particle Swarm Optimization, IEEE Proceedings of the
International Conference on Neural Networks: Perth, Australia, 1995, pp. 1942-1948.
[26] X.-S. Yang, S. Deb, Cuckoo search via Levy flights, in: Proc. of World Congress on
Nature & Biologically Inspired Computing (NaBIC 2009), IEEE Publications, USA, pp
210-214, 2009.
[27] A. Chatterjee and G. K. Mahanti, Comparative Performance Of Gravitational Search
Algorithm And Modified Particle Swarm Optimization Algorithm For Synthesis Of
Thinned Scanned Concentric Ring Array Antenna, Progress In Electromagnetics
Research B, Vol. 25 ,pp 331- 348, 2010.
[28] J.H. Holland, Adaptation in Natural and Artificial Systems: An Introductory Analysis
with Applications to Biology, Control and Artificial Intelligence, MIT Press, USA,
1992.
[29] B. V. Babu, S. A. Munawar, Differential Evolution for the Optimal Design of Heat
Exchangers, In: Proceedings of All-India seminar on Chemical Engineering Progress
on Resource Development: A Vision 2010 and Beyond, Bhuvaneshwar, 2000.
[30] B. V. Babu, R. P.Singh, Synthesis & Optimization of Heat Integrated Distillation
Systems Using Differential Evolution, In: Proceedings of All- India seminar on
Chemical Engineering Progress on Resource Development: A Vision 2010 and Beyond,
Bhuvaneshwar, 2000
[31] R Angira, B. V. Babu, Optimization of Non-Linear Chemical Processes Using
Modified Differential Evolution (MDE), In: Proceedings of the 2
nd
Indian
International Conference on Artificial Intelligence, Pune, India, 2005a, pp. 911-923
[32] A.Colorni, M.Dorigo and V.Maniezzo, Distributed Optimization by Ant Colonies,
Proceedings of the first European Conference of Artificial Intelligence, Paris, France,
Elsevier Publishing, 1991, pp.134-142.
[33] M.B. Parappagoudar, D.K. Pratihar, G.L. Datta, Non-linear modeling using central
composite design to predict green sand mould properties, Proceedings IMechE B
Journal of Engineering Manufacture, 221, 2007, pp 881-894.
[34] S. K. Pradyumn, On the Normal Boundary Intersection Method for Generation of
Efficient Front, in Y. Shi et al. (Eds.): ICCS 2007, Part I, LNCS 4487, Springer-Verlag
Berlin Heidelberg, (2007) 310-317.
[35] E. Zitzler, J. Knowles, and L. Thiele, Quality Assessment of Pareto Set
Approximations, Chapter 15, J. Branke et al. (Eds.): Multiobjective Optimization,
LNCS 5252, pp 373-404, Springer-Verlag Berlin Heidelberg, 2008.
[36] J.R. Koza, Genetic Programming: On the Programming of Computers by means of
Natural Selection, MIT Press, USA, 1992.
[37] I. Zelinka, Analytic programming by Means of SOMA Algorithm, In: Proc. 8th
International Conference on Soft Computing Mendel'02, Brno, Czech Republic,
2002, pp 93-101.
[38] T. Ganesan, P. Vasant and I. Elamvazuthi, Optimization Of Nonlinear Geological
Structure Mapping Using Hybrid Neuro-Genetic Techniques, Mathematical and
Computer Modelling, 54 (11-12) (2011) 2913-2922.
[39] B.Y. Qu and P.N. Suganthan, Multi-objective evolutionary algorithms based on the
summation of normalized objectives and diversified selection, Information Sciences,
180 (2010) 3170-3181.
[40] Ke Li, S. Kwong, J. Cao, M. Li, J. Zheng and R. Shen, Achieving Balance Between
Proximity And Diversity In Multi-Objective Evolutionary Algorithm, Information
Sciences, 182 (2011) 220-242.
[41] I. Elamvazuthi, T. Ganesan, and P. Vasant, A comparative study of HNN and Hybrid
HNN-PSO techniques in the optimization of distributed generation (DG) power
systems, International Conference on Advanced Computer Science and Information
System (ICACSIS), pp 195-200, 2011.