
Multi-objective Optimization

Implementation of Constrained GA Based on NSGA-II

Some introductory figures from: Kalyanmoy Deb, Multi-Objective Optimization using Evolutionary Algorithms, Wiley, 2001

Optimization
Optimization refers to finding one or more feasible solutions which correspond to extreme values of one or more objectives.

Find the design variables x that
  minimize f(x)   (single objective)
  subject to g_j(x) >= 0,  j = 1, ..., n_j
             h_k(x) = 0,   k = 1, ..., n_k
             x_i(L) <= x_i <= x_i(U)

Optimization Model Classification


Basic classifications are:
- Constrained or unconstrained
- Linear or non-linear
- Single objective or multi-objective

Another classification can be made by variables: continuous, discrete, or mixed-integer.

Single and Multi-objective Optimization


Single objective: only one objective function.
Multi-objective: two or more, and often conflicting, objective functions.
Example, buying a car: minimize cost and maximize comfort.

Pareto Optimal Front


Mapping between the feasible decision space and the objective space.
Dominated solutions: the set of design points performing worse than some other, better points.
Domination criterion:
A feasible solution x1 dominates another feasible solution x2 (denoted x1 ≺ x2) if both of the following conditions are true:
1) The solution x1 is no worse than x2 in all objectives, i.e. f_i(x1) <= f_i(x2) for all i (for minimization)
2) The solution x1 is strictly better than x2 in at least one objective, i.e. f_i(x1) < f_i(x2) for at least one i

Non-dominated solutions :
If two solutions are compared and neither dominates the other, the solutions are said to be non-dominated with respect to each other

Pareto optimal front : The function space representation of all the nondominated solutions
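
As a concrete illustration of the domination criterion above, here is a minimal MATLAB-style sketch (the function name dominates is illustrative, not part of the program described later; both objectives are assumed to be minimized):

function flag = dominates(f1, f2)
% f1, f2: 1-by-M vectors of objective values for two feasible designs (minimization assumed)
% Condition 1: no worse in every objective; Condition 2: strictly better in at least one
flag = all(f1 <= f2) && any(f1 < f2);
end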

Pareto Optimal Front (cont'd)

Options: Min-Min, Min-Max, Max-Min, Max-Max. Which front is which?

Solution Methods
Methods that try to avoid generating the Pareto front:
- Generate utopia point
- Define optimum based on some measure of distance from utopia point

Generating entire Pareto front


- Weighted sum of objectives with variable coefficients
- Optimize one objective for a range of constraints on the others
- Niching methods with population-based algorithms
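
As an illustration of the first option, a hedged MATLAB-style sketch of a weighted-sum sweep (f1, f2, and the starting point x0 are placeholders; any single-objective optimizer could replace fminsearch):

% Sweep the weight w to trace candidate Pareto points (convex portions of the front only)
pareto_pts = [];
for w = 0:0.05:1
    F = @(x) w*f1(x) + (1 - w)*f2(x);   % scalarized single objective
    x_opt = fminsearch(F, x0);          % single-objective optimizer
    pareto_pts = [pareto_pts; f1(x_opt), f2(x_opt)];
end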

Implementation of Multiobjective Constrained GA, Based on NSGA-II

Genetic Algorithms
Genetic algorithms imitate the natural optimization process of natural selection in evolution.
Coding: replace design variables with a continuous string of digits, or genes:
- Binary
- Integer
- Real

Population: create a population of design points
Selection: select parents based on fitness
Crossover: create child designs
Mutation: mutate child designs
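
These four steps can be summarized in a short MATLAB-style skeleton (a sketch only; all helper function names are placeholders for the operations above, not the actual routines of this program):

pop = initialize_population(N_pop, N_var, LB, UB);        % Population: random design points
for gen = 1:N_gen
    fitness  = evaluate_fitness(pop);                     % rank-based fitness
    parents  = select_parents(pop, fitness);              % Selection: e.g. roulette wheel
    children = crossover_and_mutate(parents, cross, mut); % Crossover and Mutation
    pop      = select_best([parents; children], N_pop);   % elitist survivor selection
end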

Problem Formulation

Inequality constraints are defined as g(x) >= 0

The current program is written for two objectives (M = 2); it is possible to change this.

NSGA-II
Non-dominated Sorting Genetic Algorithm II (NSGA-II) performs better than other constrained multi-objective optimizers* (PAES, SPEA):
- Better and faster convergence to the true optimal front
- Better spread on the Pareto optimal front

NSGA-II ranks designs based on non-domination. For example, consider a min-max problem (minimize cost, maximize comfort):


Design   Cost   Comfort
A        25K    65%
B        45K    80%
3        55K    50%

Design 3 is dominated by both design A and B (and thus undesirable), but design A and B are non-dominated with respect to one another (and thus Pareto optimal).

* Deb, K., et al., "A Fast and Elitist Multiobjective Genetic Algorithm: NSGA-II," IEEE Transactions on Evolutionary Computation, Vol. 6, No. 2, pp. 182-197, 2002.

Flow Chart *

* From presentation of Tushar Goel

Implementation
Initialize population
- Fixed population size (N_pop)
- Fixed number of variables (N_var)
- Discrete variables:
  - Variable upper (UB) and lower (LB) bounds
  - Number of increments (N_increments)

Randomly distributed throughout the design space
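
A minimal sketch of such an initialization, assuming integer-coded variables that take any value from LB(i) to UB(i) (variable names follow the settings listed later; the use of randi is an assumption, not necessarily how the program does it):

% Random integer population, uniformly distributed over the discrete design space
pop = zeros(N_pop, N_var);
for i = 1:N_var
    pop(:, i) = randi([LB(i) UB(i)], N_pop, 1);   % one of N_increments(i) levels per variable
end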

Ranking
Ranks designs based on nondomination
- The Pareto front is all rank-1 designs
- If the rank-1 designs are removed, the next Pareto front is all rank-2 designs, etc.
- The sorting method is different from what NSGA-II* details

[Figure: successive non-dominated fronts, Rank 1 and Rank 2]
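
One simple (not the fastest) way to assign these ranks is to repeatedly peel off the current non-dominated set, as in this hedged sketch (F is assumed to be an N_pop-by-M matrix of objective values, and dominates is a helper like the one sketched earlier):

ranks = zeros(N_pop, 1);            % 0 means not yet assigned to a front
current = 1;
while any(ranks == 0)
    unranked = find(ranks == 0);
    for a = unranked'               % a design joins the current front if no unranked design dominates it
        is_dominated = false;
        for b = unranked'
            if b ~= a && dominates(F(b,:), F(a,:))
                is_dominated = true;
                break;
            end
        end
        if ~is_dominated
            ranks(a) = current;
        end
    end
    current = current + 1;
end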

Constraints are handled with constraint-domination ideas:
- If two designs are both feasible, the standard non-domination rules are used
- If one design is feasible and the other is not, the feasible design is favored (ranked lower)
- If both designs are infeasible, the design with the smaller overall constraint violation is favored (ranked lower)
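
A hedged sketch of this constraint-domination comparison (assuming each design carries a non-negative overall constraint violation cv, with 0 meaning feasible; the function name is illustrative):

function flag = constrained_dominates(f1, cv1, f2, cv2)
% True if design 1 constraint-dominates design 2 (objectives minimized)
if cv1 == 0 && cv2 == 0
    flag = all(f1 <= f2) && any(f1 < f2);   % both feasible: standard non-domination
elseif cv1 == 0
    flag = true;                            % feasible beats infeasible
elseif cv2 == 0
    flag = false;                           % infeasible loses to feasible
else
    flag = cv1 < cv2;                       % both infeasible: smaller violation wins
end
end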

* Deb, K., et al., "A Fast and Elitist Multiobjective Genetic Algorithm: NSGA-II," IEEE Transactions on Evolutionary Computation, Vol. 6, No. 2, pp. 182-197, 2002.

Selection and Fitness


More fit designs have a higher chance of passing their genes to the next generation.
Fitness is based on rank; low-rank designs have higher fitness.
Selection: using a fitness-based roulette wheel
- Create a roulette wheel with ns segments
- Create a random number between 0 and 1
- Find the segment on the roulette wheel that contains the random number
- The segment number corresponds to the design number
- Build the parent database
* From presentation of Gerhard Venter
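
A minimal MATLAB-style sketch of the roulette-wheel steps listed above (fitness is assumed to be a vector of positive, rank-based values; variable names are illustrative):

% Build the wheel from normalized cumulative fitness, then spin it once per parent
wheel = cumsum(fitness) / sum(fitness);   % segment boundaries, one segment per design
parents = zeros(N_pop, N_var);
for k = 1:N_pop
    r = rand;                             % random number between 0 and 1
    idx = find(wheel >= r, 1, 'first');   % segment that contains the random number
    parents(k, :) = pop(idx, :);          % add the selected design to the parent database
end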

Child Population Creation


Select two parents for each reproduction randomly from the parent database.
Crossover:
- Probability close to 1
- One-point crossover: randomly select the crossover point
- Child = [parent1(1:cross_pt), parent2(cross_pt+1:N_var)]
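
Expanding the expression above into a short sketch (parent1 and parent2 are rows of the parent database; the no-crossover branch, which simply copies a parent, is an assumption):

if rand < cross                                                 % crossover probability close to 1
    cross_pt = randi(N_var - 1);                                % randomly selected crossover point
    child = [parent1(1:cross_pt), parent2(cross_pt+1:N_var)];   % one-point crossover
else
    child = parent1;                                            % no crossover: copy a parent (assumed)
end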

Mutation:
- Exploration parameter
- Probability of mutation is typically small (e.g. 0.2)
- Randomly select a gene to mutate
- Randomly modify the gene
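
A hedged sketch of this mutation step, assuming one randomly chosen gene is reset to a random value within its bounds (the exact modification rule used by the program may differ):

if rand < mut                            % mutation probability, e.g. 0.2
    g = randi(N_var);                    % randomly select a gene to mutate
    child(g) = randi([LB(g) UB(g)]);     % randomly modify it within its bounds
end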

Elitism
Keeps the best individuals:
- Combine the child and parent populations
- Select the best individuals from the combined population

* Figure from presentation of Tushar Goel
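
A minimal sketch of this elitist step (assuming non-domination ranks and crowding distances have already been computed for the combined population; the sortrows-based ordering is an assumption about the implementation):

combined = [parent_pop; child_pop];               % 2*N_pop designs
% Keep the N_pop best: lowest rank first, largest crowding distance second
[~, order] = sortrows([ranks, -crowd_dist]);
pop = combined(order(1:N_pop), :);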

Niching
Guides the selection process toward a uniformly spread-out Pareto front.
Uses a parameter based on crowding distance (c = a + b), where designs which provide the greatest spread along the Pareto front are favored.
Between two solutions with differing non-domination ranks, the solution with the lower (better) rank is preferred.
If both solutions belong to the same front, the solution located in the less crowded region is preferred.
* Figure from presentation of Tushar Goel
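
A hedged sketch of a crowding-distance computation for one front, following the NSGA-II definition (F holds the objective values of the designs in that front, one row per design; boundary designs receive an infinite distance so the extremes of the front are always kept):

function d = crowding_distance(F)
% F: n-by-M matrix of objective values for the designs in one non-dominated front
[n, M] = size(F);
d = zeros(n, 1);
for m = 1:M
    [vals, idx] = sort(F(:, m));      % sort the front along objective m
    d(idx(1))   = Inf;                % boundary designs are always preferred
    d(idx(end)) = Inf;
    span = vals(end) - vals(1);
    if span > 0
        for k = 2:n-1
            d(idx(k)) = d(idx(k)) + (vals(k+1) - vals(k-1)) / span;
        end
    end
end
end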

Example
Laminate Design

Problem Formulation
Objectives : Design a symmetric laminate
Maximize D11, maximize D22

Design variables:
- 8 to 16 layers
- Layup orientations: 0 <= θi <= 90 degrees, in 15-degree steps

Constraints:
- D12 <= 0.5*D11
- D12 <= 0.5*D22
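
One possible way to evaluate these constraints in the g >= 0 form used by the program is sketched below (an assumption only: D11, D22, D12 are taken from the laminate stiffness calculation, which is not shown, and the exact normalization behind the con1/con2 values reported later is not specified here):

% Constraint margins written so that a non-negative value indicates feasibility (assumed convention)
con1 = 0.5*D11 - D12;    % enforces D12 <= 0.5*D11
con2 = 0.5*D22 - D12;    % enforces D12 <= 0.5*D22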

Optimization Settings
N_pop = 10;     % size of the population
N_gen = 30;     % number of generations
cross = 1.0;    % crossover probability
mut = 0.2;      % mutation probability
LB = [1 1 1 1 0 0 0 0];
UB = [7 7 7 7 7 7 7 7];
N_increments = [7 7 7 7 8 8 8 8];
Use higher values of N_pop and N_gen for real problems.

Layup Orientations
For the last 4 layers, if the variable is 0, the ply does not exist.

% Map each design variable to a ply angle: X(i) = 1 gives 0 deg, ..., X(i) = 7 gives 90 deg
for i = 1:4
    ply_angles(i) = (X(i)-1)*15;
end
% Last 4 layers: a value of 0 means the ply does not exist
count = 5;
for i = 5:8
    if X(i) > 0
        ply_angles(count) = (X(i)-1)*15;
        count = count + 1;
    end
end

Pareto Front 30 Generations


[Figure: Pareto front after 30 generations, with selected designs labeled]

Design       A         B         C         D
D11          259.81    287.08    310.96    325.42
D22          333.23    317.54    300.29    275.83
con1         0.14248   0.06128   0.00750   0.00034
con2         0.00093   0.00744   0.02553   0.09029
Layup orientation (deg):
Layer 1      15        15        15        15
Layer 2      15        15        15        15
Layer 3      90        90        45        15
Layer 4      90        45        75        30
Layer 5      90        45        0         75
Layer 6      90        15        0         45
Layer 7      0         0         30        0
Layer 8      90        45        15        90

Layer 1 is the outermost layer.
