
Multi-Objective Evolutionary Algorithms

Kalyanmoy Deb a
Kanpur Genetic Algorithm Laboratory (KanGAL)
Indian Institute of Technology Kanpur
Kanpur, Pin 208016 INDIA
deb@iitk.ac.in
http://www.iitk.ac.in/kangal/deb.html
a Currently visiting TIK, ETH Zürich

1
Overview of the Tutorial

• Multi-objective optimization
• Classical methods
• History of multi-objective evolutionary algorithms (MOEAs)
• Non-elitist MOEAs
• Elitist MOEAs
• Constrained MOEAs
• Applications of MOEAs
• Salient research issues

2
Multi-Objective Optimization

• We often face them


[Figure: trade-off between Cost (10k–100k) and Comfort (40%–90%), with a few alternative solutions marked]

3
More Examples

A cheaper but inconvenient flight versus a convenient but expensive flight

4
Which Solutions are Optimal?

Domination: x(1) dominates x(2) if

1. x(1) is no worse than x(2) in all objectives
2. x(1) is strictly better than x(2) in at least one objective

[Figure: six solutions plotted with f1 (maximize, 2–18) on the horizontal axis and f2 (minimize, 1–6) on the vertical axis]
5
Pareto-Optimal Solutions

Non-dominated solutions: Among a set of solutions P, the non-dominated set of solutions P′ are those that are not dominated by any member of the set P. O(MN²) algorithms exist.

Pareto-optimal solutions: When P = S (the entire search space), the resulting P′ is the Pareto-optimal set.

A number of solutions are optimal.

[Figure: the non-dominated front among six solutions in the f1 (maximize) vs f2 (minimize) plane]

6
Pareto-Optimal Fronts

[Figure: shapes of Pareto-optimal fronts in the f1–f2 plane for the four cases Min–Min, Min–Max, Max–Min, and Max–Max]

7
Preference-Based Approach

Multi-objective optimization problem (Minimize f1, f2, ..., fM subject to constraints)
→ Estimate a relative importance vector (w1, w2, ..., wM) using higher-level information
→ Single-objective optimization problem: F = w1 f1 + w2 f2 + ... + wM fM, or a composite function
→ Single-objective optimizer
→ One optimum solution

• Classical approaches follow it

8
Classical Approaches

• No-preference methods (heuristic-based)
• A posteriori methods (generating solutions)
• A priori methods (one preferred solution)
• Interactive methods (involving a decision-maker)

9
Weighted Sum Method

• Construct a weighted sum of objectives and optimize:

  F(x) = Σ_{m=1}^{M} w_m f_m(x)

• User supplies weight vector w

[Figure: feasible objective space in the f1–f2 plane; the weight vector (w1, w2) defines a search direction, and points a, b, c, d on the Pareto-optimal front are obtained for different weights]
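A small illustration (not from the slides) of sweeping the weight vector on a hypothetical bi-objective problem; the objective functions, bounds, and starting point below are assumptions chosen only to show the mechanics:

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical bi-objective problem (illustration only):
# minimize f1(x) = x^2 and f2(x) = (x - 2)^2 over x in [-5, 5]
def f(x):
    return np.array([x[0] ** 2, (x[0] - 2.0) ** 2])

def weighted_sum(w, x0=np.array([0.0])):
    """Minimize F(x) = sum_m w_m f_m(x) for one fixed weight vector w."""
    res = minimize(lambda x: np.dot(w, f(x)), x0, bounds=[(-5.0, 5.0)])
    return res.x, f(res.x)

# Each weight vector yields (at most) one Pareto-optimal solution
for w1 in np.linspace(0.0, 1.0, 5):
    x_opt, f_opt = weighted_sum(np.array([w1, 1.0 - w1]))
    print(f"w = ({w1:.2f}, {1 - w1:.2f})  x* = {x_opt[0]:.3f}  f* = {np.round(f_opt, 3)}")
```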

10
Difficulties with Weighted Sum Method

• Need to know w
• Non-uniformity in Pareto-optimal solutions
• Inability to find some Pareto-optimal solutions (non-convex regions of the front)

[Figure: feasible objective space with a partly non-convex Pareto-optimal front; solutions A, B, C, D obtained for different weight directions a, b]

11
ε-Constraint Method

• Optimize one objective, constrain all others:

  Minimize f_µ(x),
  subject to f_m(x) ≤ ε_m,  m ≠ µ

• User supplies an ε vector

[Figure: feasible objective space in the f1–f2 plane; different constraint levels ε1^a, ε1^b, ε1^c, ε1^d on f1 yield different Pareto-optimal solutions, such as B, C, D]

• Need to know relevant ε vectors
• Non-uniformity in Pareto-optimal solutions
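A minimal sketch (not from the slides) of the ε-constraint idea, using the same hypothetical bi-objective problem as in the weighted-sum sketch; the functions, bounds, and ε values are assumptions for illustration:

```python
import numpy as np
from scipy.optimize import NonlinearConstraint, minimize

# Hypothetical bi-objective problem (illustration only)
f1 = lambda x: x[0] ** 2
f2 = lambda x: (x[0] - 2.0) ** 2

def eps_constraint(eps1, x0=np.array([1.0])):
    """Minimize f2(x) subject to f1(x) <= eps1."""
    con = NonlinearConstraint(f1, -np.inf, eps1)
    res = minimize(f2, x0, bounds=[(-5.0, 5.0)], constraints=[con])
    return res.x, (f1(res.x), f2(res.x))

# Each epsilon level yields one Pareto-optimal solution
for eps1 in [0.25, 1.0, 2.25, 4.0]:
    x_opt, (v1, v2) = eps_constraint(eps1)
    print(f"eps1 = {eps1:4.2f}  x* = {x_opt[0]:.3f}  f = ({v1:.3f}, {v2:.3f})")
```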

12
Difficulties with Most Classical Methods

• Need to run a single-objective optimizer many times
• Expect a lot of problem knowledge
• Even then, a good distribution is not guaranteed
• Multi-objective optimization as an application of single-objective optimization

[Figure: a Pareto-optimal front in the f1–f2 plane (both axes 0–1) with solutions A, B, C, D obtained from repeated single-objective runs]

13
Ideal Multi-Objective Optimization

Multi-objective optimization problem (Minimize f1, f2, ..., fM subject to constraints)
→ Step 1: IDEAL multi-objective optimizer
→ Multiple trade-off solutions found
→ Step 2: Choose one solution using higher-level information

Step 1: Find a set of Pareto-optimal solutions
Step 2: Choose one from the set

14
Advantages of Ideal Multi-Objective Optimization

• Decision-making becomes easier and less subjective

• Single-objective optimization is a degen-


erate case of multi-objective optimiza-
tion
– Step 1 finds a single solution
– No need for Step 2
• Multi-modal optimization is a special
case of multi-objective optimization

15
Two Goals in Ideal Multi-Objective Optimization

1. Converge on the Pareto-optimal front
2. Maintain as diverse a distribution as possible

[Figure: a well-converged and well-spread set of solutions along the Pareto-optimal front in the f1–f2 plane]

16
Why Evolutionary?

• Population approach suits well to find multiple solutions


• Niche-preservation methods can be exploited to find diverse
solutions
[Figure: two populations plotted in the f1–f2 objective space (both axes 0–1)]

17
History of Multi-Objective Evolutionary Algorithms (MOEAs)

[Figure: bar chart of the number of studies per year, from 1990 and before through 1999 (vertical axis 0–140)]

18
• Early penalty-based approaches
• VEGA (1984)
• Goldberg's suggestion (1989)
• MOGA, NSGA, NPGA (1993-95)
• Elitist MOEAs (SPEA, NSGA-II, PAES, MOMGA etc.) (1998 – Present)
19
What to Change in a Simple GA?

• Modify the fitness computation


[Flowchart of a simple GA: Begin → initialize population, t = 0 → evaluation → assign fitness → reproduction → crossover → mutation → t = t + 1 → repeat until the termination condition is met → Stop]

20
Identifying the Non-dominated Set

Step 1: Set i = 1 and create an empty set P′.
Step 2: For a solution j ∈ P (but j ≠ i), check if solution j dominates solution i. If yes, go to Step 4.
Step 3: If more solutions are left in P, increment j by one and go to Step 2; otherwise, set P′ = P′ ∪ {i}.
Step 4: Increment i by one. If i ≤ N, go to Step 2; otherwise stop and declare P′ as the non-dominated set.

O(MN²) computational complexity
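A compact Python rendering of this procedure (an illustration, reusing the dominates() helper sketched earlier):

```python
def non_dominated_set(F, senses):
    """Return the indices of the non-dominated members of a population.

    F is a list of objective vectors; `senses` is the +1/-1 flag per
    objective used by dominates(). Straightforward O(M N^2) check.
    """
    P_prime = []
    for i, fi in enumerate(F):
        if not any(dominates(fj, fi, senses)
                   for j, fj in enumerate(F) if j != i):
            P_prime.append(i)
    return P_prime
```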

21
An Efficient Approach

Kung et al.'s algorithm (1975)

Step 1: Sort the population in descending order of importance of f1.
Step 2, Front(P): If |P| = 1, return P as the output of Front(P). Otherwise, T = Front(P^(1) ... P^(|P|/2)) and B = Front(P^(|P|/2+1) ... P^(|P|)). If the i-th solution of B is not dominated by any solution of T, create a merged set M = T ∪ {i}. Return M as the output of Front(P).

[Figure: recursive halving into T1, T2 and B1, B2, followed by merging]

O(N (log N)^(M−2)) for M ≥ 4 and O(N log N) for M = 2 and 3


22
A Simple Non-dominated Sorting Algorithm

• Identify the best non-dominated set
• Discard these solutions from the population
• Identify the next-best non-dominated set
• Continue till all solutions are classified (see the sketch after this list)
• We discuss an O(MN²) algorithm later
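A minimal sketch of this repeated-extraction procedure (illustrative only, building on non_dominated_set() above):

```python
def non_dominated_sort(F, senses):
    """Classify a population into successive non-dominated fronts.

    Repeatedly extract the current non-dominated set with
    non_dominated_set() and remove it, until every solution is
    classified. Returns a list of fronts (lists of indices into F).
    """
    remaining = list(range(len(F)))
    fronts = []
    while remaining:
        sub = [F[i] for i in remaining]
        local = non_dominated_set(sub, senses)
        front = [remaining[k] for k in local]
        fronts.append(front)
        kept = set(front)
        remaining = [i for i in remaining if i not in kept]
    return fronts
```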

23
Non-Elitist MOEAs

• Vector evaluated GA (VEGA) (Schaffer, 1984)


• Vector optimized EA (VOES) (Kursawe, 1990)
• Weight based GA (WBGA) (Hajela and Lin, 1993)
• Multiple objective GA (MOGA) (Fonseca and Fleming, 1993)
• Non-dominated sorting GA (NSGA) (Srinivas and Deb, 1994)
• Niched Pareto GA (NPGA) (Horn et al., 1994)
• Predator-prey ES (Laumanns et al., 1998)
• Other methods: Distributed sharing GA, neighborhood
constrained GA, Nash GA etc.

24
Non-Dominated Sorting GA (NSGA)

• A non-dominated sorting of the population


• First front: Fitness F = N to all
• Niching among all solutions in first front
• Note worst fitness (say Fw1 )
• Second front: Fitness Fw1 − 1 to all
• Niching among all solutions in second front
• Continue till all fronts are assigned a fitness

25
Non-Dominated Sorting GA (NSGA)

x      f1     f2     Front  Fitness before  Fitness after
−1.50  2.25   12.25  2      3.00            3.00
 0.70  0.49   1.69   1      6.00            6.00
 4.20  17.64  4.84   2      3.00            3.00
 2.00  4.00   0.00   1      6.00            3.43
 1.75  3.06   0.06   1      6.00            3.43
−3.00  9.00   25.00  3      2.00            2.00

[Figure: the six solutions and their fronts in the f1–f2 plane (both axes 0–30)]

• Niching in parameter space


• Non-dominated solutions are emphasized
• Diversity among them is maintained

26
Vector-Evaluated GA (VEGA)

• Divide population into M equal blocks


• Each block is reproduced with one objective function
• Complete population participates in crossover and mutation
• Bias towards the best solutions of individual objectives
• A non-dominated selection: Non-dominated solutions are
assigned more copies
• Mate selection: Two distant (in parameter space) solutions are
mated
• Both necessary aspects missing in one algorithm

27
Multi-Objective GA (MOGA)

• Count the number of solutions that dominate a solution (say n)
• Fitness: F = n + 1
• A fitness ranking adjustment
• Niching in fitness space
• The rest is similar to NSGA

Solution  F  Fitness  Asgn. Fit.
1         2  3        2.5
2         1  6        5.0
3         2  2        2.5
4         1  5        5.0
5         1  4        5.0
6         3  1        1.0
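A rough Python sketch of this ranking and fitness-averaging scheme (illustrative only, reusing dominates() from earlier; assigning raw fitness by sorted position, N down to 1, is an assumption consistent with the numbers in the table above):

```python
def moga_fitness(F, senses):
    """MOGA-style fitness sketch.

    Rank F_i = 1 + (number of solutions dominating i); raw fitness is then
    assigned by sorted rank position (N down to 1) and averaged among
    solutions sharing the same rank.
    """
    N = len(F)
    ranks = [1 + sum(dominates(fj, fi, senses)
                     for j, fj in enumerate(F) if j != i)
             for i, fi in enumerate(F)]
    order = sorted(range(N), key=lambda i: ranks[i])
    raw = {i: N - pos for pos, i in enumerate(order)}
    assigned = [0.0] * N
    for r in set(ranks):
        members = [i for i in range(N) if ranks[i] == r]
        avg = sum(raw[i] for i in members) / len(members)
        for i in members:
            assigned[i] = avg
    return ranks, assigned
```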

28
Niched Pareto GA (NPGA)

• Solutions in a tournament are checked for domination with


respect to a small subpopulation (tdom )
• If one dominated and other non-dominated, select second
• If both non-dominated or both dominated, choose the one with
smaller niche count in the subpopulation
• Algorithm depends on tdom
• Nevertheless, it has both necessary components

29
NPGA (cont.)

[Figure: a tournament between solutions X and Y; each is checked for domination against a subpopulation of size t_dom, and niche counts are computed in parameter space]

30
Shortcoming of Non-Elitist MOEAs

• Elite-preservation is missing
• Elite-preservation is important for proper convergence in
SOEAs
• Same is true in MOEAs
• Three tasks
– Elite preservation
– Progress towards the Pareto-optimal front
– Maintain diversity among solutions

31
Elitist MOEAs

Elite-preservation:
• Maintain an archive of non-dominated solutions

Progress towards the Pareto-optimal front:
• Preferring non-dominated solutions

Maintaining spread of solutions:
• Clustering, niching, or grid-based competition for a place in the archive

[Figures: an EA population feeding an elite archive; the non-dominated front among six solutions in the f1 (maximize) vs f2 (minimize) plane; clusters of archive members in the f1–f2 plane]

32
Elitist MOEAs (cont.)

• Distance-based Pareto GA (DPGA) (Osyczka and Kundu,


1995)
• Thermodynamical GA (TDGA) (Kita et al., 1996)
• Strength Pareto EA (SPEA) (Zitzler and Thiele, 1998)
• Non-dominated sorting GA-II (NSGA-II) (Deb et al., 1999)
• Pareto-archived ES (PAES) (Knowles and Corne, 1999)
• Multi-objective Messy GA (MOMGA) (Veldhuizen and
Lamont, 1999)
• Other methods: Pareto-converging GA, multi-objective
micro-GA, elitist MOGA with coevolutionary sharing

33
Elitist Non-dominated Sorting Genetic Algorithm
(NSGA-II)

Non-dominated sorting: O(MN²)

• Calculate (ni, Si) for each solution i
• ni: number of solutions dominating i
• Si: set of solutions dominated by i

[Figure: eleven solutions in the f1–f2 plane; for example, one solution has (ni, Si) = (5, Nil) and another (0, {9, 11})]
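A sketch of how the (ni, Si) bookkeeping yields the fronts (illustrative Python, reusing dominates() from earlier):

```python
def fast_non_dominated_sort(F, senses):
    """NSGA-II style sorting sketch using the (n_i, S_i) bookkeeping.

    n[i] counts solutions dominating i; S[i] holds the solutions i dominates.
    Members with n[i] == 0 form the first front; decrementing the counters of
    the solutions they dominate reveals the next front, and so on.
    """
    N = len(F)
    n = [0] * N
    S = [[] for _ in range(N)]
    fronts = [[]]
    for i in range(N):
        for j in range(N):
            if i == j:
                continue
            if dominates(F[i], F[j], senses):
                S[i].append(j)
            elif dominates(F[j], F[i], senses):
                n[i] += 1
        if n[i] == 0:
            fronts[0].append(i)
    k = 0
    while fronts[k]:
        next_front = []
        for i in fronts[k]:
            for j in S[i]:
                n[j] -= 1
                if n[j] == 0:
                    next_front.append(j)
        fronts.append(next_front)
        k += 1
    return fronts[:-1]  # drop the trailing empty front
```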

34
NSGA-II (cont.)

Elites are preserved

[Figure: parent population Pt and offspring Qt are combined into Rt; non-dominated sorting splits Rt into fronts F1, F2, F3, ...; the fronts fill the new population Pt+1 in order, crowding-distance sorting decides which members of the last accepted front are kept, and the remainder is rejected]

35
NSGA-II (cont.)

Diversity is maintained: O(MN log N)

[Figure: the crowding distance of solution i is measured by the cuboid formed by its neighbours i−1 and i+1 along the front in the f1–f2 plane]

Overall complexity: O(MN²)
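A minimal crowding-distance sketch for one front (illustrative only; it works on objective vectors and gives boundary solutions an infinite distance):

```python
def crowding_distance(front_objs):
    """Crowding distance for one front (list of M-objective vectors).

    Boundary solutions get infinite distance; interior ones accumulate the
    normalized side length of the cuboid defined by their neighbours.
    """
    n = len(front_objs)
    M = len(front_objs[0])
    dist = [0.0] * n
    for m in range(M):
        order = sorted(range(n), key=lambda i: front_objs[i][m])
        fmin, fmax = front_objs[order[0]][m], front_objs[order[-1]][m]
        dist[order[0]] = dist[order[-1]] = float("inf")
        if fmax == fmin:
            continue
        for pos in range(1, n - 1):
            i = order[pos]
            gap = front_objs[order[pos + 1]][m] - front_objs[order[pos - 1]][m]
            dist[i] += gap / (fmax - fmin)
    return dist
```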

36
NSGA-II Simulation Results

[Figures: NSGA-II (binary-coded) populations on two test problems, plotted in the f1–f2 objective space]

37
Strength Pareto EA (SPEA)

• Stores non-dominated solutions externally


• Pareto-dominance to assign fitness
– External members: Assign number of dominated solutions
in population (smaller, better)
– Population members: Assign sum of fitness of external
dominating members (smaller, better)
• Tournament selection and recombination applied to combined
current and elite populations
• A clustering technique to maintain diversity in the updated external population when its size exceeds a limit
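A rough sketch of this fitness assignment (illustrative only; representing solutions by objective vectors and normalizing the strength by N+1, as in the original SPEA paper, are assumptions of this sketch; dominates() is the helper from earlier):

```python
def spea_fitness(pop_objs, ext_objs, senses):
    """SPEA fitness sketch (smaller is better).

    Each external (elite) member gets a strength proportional to how many
    population members it dominates; each population member's fitness is one
    plus the summed strengths of the external members that dominate it.
    """
    N = len(pop_objs)
    strength = [sum(dominates(e, p, senses) for p in pop_objs) / (N + 1)
                for e in ext_objs]
    pop_fitness = [1.0 + sum(s for e, s in zip(ext_objs, strength)
                             if dominates(e, p, senses))
                   for p in pop_objs]
    return strength, pop_fitness
```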

38
SPEA (cont.)

• Fitness assignment and clustering methods

[Figures: fitness assignment between the population and the external population (Ext_pop) in function space; clustering of the external population, controlled by d and p_max, in function space]

39
Pareto Archived ES (PAES)

• An (1+1)-ES
• Parent pt and child ct are compared with an external archive At
• If ct is dominated by At , pt+1 = pt
• If ct dominates a member of At , delete it from At and include
ct in At and pt+1 = ct
• If |At | < N , include ct and pt+1 = winner(pt , ct )
• If |At| = N and ct does not lie in the hypercube H with the highest solution count, replace a random solution of H with ct and pt+1 = winner(pt, ct).
The winner is based on least number of solutions in the hypercube

40
Niching in PAES-(1+1)

[Figures: parent and offspring positions relative to the Pareto-optimal front in the f1–f2 plane, illustrating the hypercube-based niching used by (1+1)-PAES]

41
Constraint Handling

• Penalty function approach

Fm = fm + Rm Ω(g)

• Explicit procedures to handle infeasible solutions


– Jimenez’s approach
– Ray-Tang-Seow’s approach
• Modified definition of domination
– Fonseca and Fleming’s approach
– Deb et al.’s approach

42
Constrain-Domination Principle

A solution i constrain-dominates a solution j if any of the following is true:

1. Solution i is feasible and solution j is not.
2. Solutions i and j are both infeasible, but solution i has a smaller overall constraint violation.
3. Solutions i and j are feasible and solution i dominates solution j.

[Figure: feasible and infeasible solutions 1–6 in the f1–f2 plane, sorted into Front 1 and Front 2 under constrain-domination]
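As a small illustration (not from the slides), the three rules map onto a simple predicate; representing each solution as an (objectives, overall-violation) pair is an assumption of this sketch, and it reuses dominates() from earlier:

```python
def constrain_dominates(a, b, senses):
    """Constrain-domination sketch.

    Each solution is a (objectives, violation) pair, where `violation` is the
    overall constraint violation (0 means feasible).
    """
    (fa, va), (fb, vb) = a, b
    if va == 0 and vb > 0:          # i feasible, j infeasible
        return True
    if va > 0 and vb > 0:           # both infeasible: smaller violation wins
        return va < vb
    if va == 0 and vb == 0:         # both feasible: usual domination
        return dominates(fa, fb, senses)
    return False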

43
Constrained NSGA-II Simulation Results

[Figure: NSGA-II population on a constrained test problem, plotted in the f1–f2 objective space]

44
Applications of MOEAs

• Space-craft trajectory optimization


• Engineering component design
• Microwave absorber design
• Ground-water monitoring
• Extruder screw design
• Airline scheduling
• VLSI circuit design
• Other applications (refer Deb, 2001 and EMO-01 proceedings)

45
Spacecraft Trajectory Optimization

• Coverstone-Carroll et al. (2000) with JPL Pasadena


• Three objectives for inter-planetary trajectory design
– Minimize time of flight
– Maximize payload delivered at destination
– Maximize heliocentric revolutions around the Sun
• NSGA invoked with SEPTOP software for evaluation

46
Earth–Mars Rendezvous

[Figure: trade-off front of mass delivered to target (kg) versus transfer time (yrs) for the Earth–Mars rendezvous, with individuals such as 22, 36, 44, 72, 73 and 132 marked; trajectory plots for individuals 44, 72, 73 and 36, each launching from Earth on 09.01.05 and arriving at Mars between 10.16.06 and 02.04.09]

47
Salient Research Tasks

• Scalability of MOEAs to handle more than two objectives


• Mathematically convergent algorithms with guaranteed spread
of solutions
• Test problem design
• Performance metrics and comparative studies
• Controlled elitism
• Developing practical MOEAs – Hybridization, parallelization
• Application case studies

49
Hybrid MOEAs

• Combine EAs with a local search method


– Better convergence
– Faster approach
• Two hybrid approaches
– Local search to update each solution in an EA population
(Ishibuchi and Murata, 1998; Jaszkiewicz, 1998)
– First EA and then apply a local search

97
Posteriori Approach in an MOEA

Problem → MOEA → clustering → multiple local searches → non-domination check

• Which objective to use in the local search?

98
Proposed Local Search Method

• Weighted sum strategy (or a Tchebycheff metric):

  F = Σ_i w_i f_i

• f_i is scaled
• Weight w_j is chosen based on the location of the solution in the obtained front:

  w̄_j = [(f_j^max − f_j(x)) / (f_j^max − f_j^min)] / Σ_{k=1}^{M} [(f_k^max − f_k(x)) / (f_k^max − f_k^min)]

• Weights are normalized: Σ_i w_i = 1
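A small illustration of the weight formula above (not from the slides; the objective values and extremes in the usage line are hypothetical, and minimization of all objectives is assumed):

```python
def local_search_weights(fx, f_min, f_max):
    """Weight vector for the weighted-sum local search, per the formula above.

    `fx` is the objective vector of one obtained solution; `f_min`/`f_max`
    are the per-objective extremes over the obtained front. A solution near
    the best value of objective j receives a large w_j; weights sum to 1.
    """
    scaled = [(fmax - f) / (fmax - fmin)
              for f, fmin, fmax in zip(fx, f_min, f_max)]
    total = sum(scaled)
    return [s / total for s in scaled]

# Hypothetical front values, just to show the call
print(local_search_weights(fx=[0.2, 0.8], f_min=[0.0, 0.0], f_max=[1.0, 1.0]))
# -> [0.8, 0.2]: larger weight on the objective where the solution is better
```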

99
Fixed Weight Strategy

• Extreme solutions are assigned extreme weights
• Linear relation between weight and fitness
• Many solutions can converge to the same solution after local search

[Figure: the MOEA solution set in the f1–f2 plane (between f1^min, f1^max and f2^min, f2^max) and the reduced set after local search]

100
Design of a Cantilever Plate

[Figures: a cantilever base plate (100 mm × 60 mm) loaded by a force P; four plots of scaled deflection (2–18) versus weight (20–60) showing the NSGA-II front, the front after local search, the clustered solutions, and the final non-dominated solutions]

Nine trade-off solutions are chosen

102
Trade-off Solutions

(1.00, 0.00) (0.60, 0.40) (0.50, 0.50)

(0.43, 0.57) (0.38, 0.62) (0.35, 0.65)

(0.23, 0.77) (0.14, 0.86) (0.00, 1.00)

103
Conclusions

• Ideal multi-objective optimization is generic and pragmatic


• Evolutionary algorithms are ideal candidates
• Many efficient algorithms exist, more efficient ones are needed
• With some salient research studies, MOEAs will revolutionize
the act of optimization
• EAs have a definite edge in multi-objective optimization and
should become more useful in practice in coming years

106
