
pp. 21-24

A Comparative Study of the Genetic Algorithm and Particle Swarm Optimization

Sapna Katiyar

A.B.E.S. Institute of Technology, NH-24, Vijay Nagar, Ghaziabad 201010 UP.

sapna.katiyar@abes.in


Abstract -- The Genetic Algorithm (GA) and its many versions have been popular in academia and industry, mainly because of their intuitiveness, ease of implementation, and ability to effectively solve the highly nonlinear, mixed-integer optimization problems that are typical of complex engineering systems. The drawback of the GA is its expensive computational cost. Particle Swarm Optimization (PSO) is a relatively recent heuristic search method whose mechanics are inspired by the swarming or collaborative behavior of biological populations. PSO is similar to the GA in that both evolutionary heuristics are population-based search methods: in a single iteration, PSO and the GA move from one set of points (a population) to another, with likely improvement, using a combination of deterministic and probabilistic rules. This paper examines the claim that PSO has the same effectiveness as the GA (finding the true global optimum) but significantly better computational efficiency (fewer function evaluations) by applying statistical analysis and formal hypothesis testing. The major objective of this paper is to compare the computational effectiveness and efficiency of the GA and PSO using a formal hypothesis-testing approach.

Keywords: Genetic algorithm, Candidate solution, Hybrid PSO, Metaheuristics, Numerical optimization, Stochastic, Swarm.

I. INTRODUCTION

THE Genetic Algorithm (GA) was introduced in the mid-1970s by John Holland and his colleagues and students at the University of Michigan. The GA is inspired by the principles of genetics and evolution, and mimics the reproductive behavior observed in biological populations. The GA employs the principle of survival of the fittest in its search process to select and generate individuals (design solutions) that are adapted to their environment (design objectives/constraints). Therefore, over a number of generations (iterations), desirable traits (design characteristics) evolve and remain in the genome composition of the population (the set of design solutions generated each iteration) over weaker, undesirable traits.

The GA is applied to solve complex design optimization problems because it can handle both discrete and continuous variables and nonlinear objective and constraint functions without requiring gradient information.

Particle Swarm Optimization (PSO) was invented by Kennedy and Eberhart in the mid-1990s while attempting to simulate the choreographed, graceful motion of swarms of birds, as part of a socio-cognitive study investigating the notion of collective intelligence in biological populations. In PSO, a set of randomly generated solutions (the initial swarm) propagates through the design space towards the optimal solution over a number of iterations (moves), based on a large amount of information about the design space that is assimilated and shared by all members of the swarm. PSO is inspired by the ability of flocks of birds, schools of fish, and herds of animals to adapt to their environment, find rich sources of food, and avoid predators by implementing an information-sharing approach, hence developing an evolutionary advantage.

II. GENETIC ALGORITHM

In a genetic algorithm, a population of strings (called chromosomes, or the genotype of the genome), which encode candidate solutions (called phenotypes) to an optimization problem, evolves toward better solutions. Solutions are traditionally represented in binary as strings of 0s and 1s, but other encodings are also possible. The evolution usually starts from a population of randomly generated individuals and proceeds in generations. In each generation, the fitness of every individual in the population is evaluated; multiple individuals are stochastically selected from the current population (based on their fitness) and modified (recombined and possibly randomly mutated) to form a new population. The new population is then used in the next iteration of the algorithm. The algorithm terminates when either a maximum number of generations has been produced or a satisfactory fitness level has been reached for the population. If the algorithm terminates because the maximum number of generations was reached, a satisfactory solution may or may not have been found [1].

A typical genetic algorithm requires:

- A genetic representation of the solution domain
- A fitness function to evaluate the solution domain.

A standard representation of the solution is an array of bits. The main property that makes these genetic representations convenient is that their parts are easily aligned due to their fixed size, which facilitates simple crossover operations. Variable-length representations may also be used, but crossover implementation is more complex in this case.

[Figure: flow chart -- Start; create randomly selected initial population; fitness evaluation; fit chromosome? (No: discard); reproduction; mutation; optimal solution? (No: repeat; Yes: stop).]

Figure 1. Flow chart of the Genetic Algorithm.

AKGEC INTERNATIONAL JOURNAL OF TECHNOLOGY, Vol. 2, No. 2

Tree-like representations are explored in genetic programming, and graph-form representations are explored in evolutionary programming.

Objective Function of the Genetic Algorithm: The fitness function is defined over the genetic representation and measures the quality of the represented solution. The fitness function is always problem-dependent. For instance, in the knapsack problem, one wants to maximize the total value of objects that can be put in a knapsack of some fixed capacity. A representation of a solution might be an array of bits, where each bit represents a different object and the value of the bit (0 or 1) represents whether or not the object is in the knapsack. Not every such representation is valid, as the total size of the objects may exceed the capacity of the knapsack. The fitness of the solution is the sum of the values of all objects in the knapsack if the representation is valid, or 0 otherwise. In some problems it is hard or even impossible to define the fitness expression; in these cases, interactive genetic algorithms are used.

Once the genetic representation and the fitness function are defined, the GA proceeds to initialize a population of solutions randomly and then improve it through repeated application of the mutation, crossover, inversion, and selection operators.
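The knapsack fitness just described translates directly into code (a small sketch; the item values, weights, and capacity below are illustrative, not taken from the paper):

```python
def knapsack_fitness(bits, values, weights, capacity):
    """Sum of the values of selected items, or 0 if capacity is exceeded."""
    total_weight = sum(w for b, w in zip(bits, weights) if b)
    if total_weight > capacity:
        return 0  # invalid representation: items do not fit
    return sum(v for b, v in zip(bits, values) if b)

values = [60, 100, 120]   # illustrative item values
weights = [10, 20, 30]    # illustrative item weights
print(knapsack_fitness([1, 1, 0], values, weights, capacity=50))  # 160
print(knapsack_fitness([1, 1, 1], values, weights, capacity=50))  # 0 (too heavy)
```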

Implementation Algorithm: The genetic algorithm uses the chromosomes' fitness values to create a new population consisting of the fittest members. The flow chart of the GA is given in Fig. 1.
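The generational loop of Fig. 1 can be sketched in a few lines (a minimal illustration, not the paper's implementation; binary tournament selection, one-point crossover, and bit-flip mutation are assumed operator choices):

```python
import random

def genetic_algorithm(fitness, n_bits=20, pop_size=30, generations=100,
                      crossover_rate=0.9, mutation_rate=0.02):
    """Minimal generational GA: tournament selection, one-point
    crossover, and bit-flip mutation."""
    pop = [[random.randint(0, 1) for _ in range(n_bits)]
           for _ in range(pop_size)]
    best = max(pop, key=fitness)
    for _ in range(generations):
        def select():  # binary tournament: the fitter of two random picks
            a, b = random.sample(pop, 2)
            return a if fitness(a) >= fitness(b) else b
        children = []
        while len(children) < pop_size:
            p1, p2 = select(), select()
            if random.random() < crossover_rate:
                cut = random.randint(1, n_bits - 1)  # one-point crossover
                p1, p2 = p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]
            for child in (p1, p2):
                mutated = [1 - g if random.random() < mutation_rate else g
                           for g in child]  # bit-flip mutation
                children.append(mutated)
        pop = children[:pop_size]
        best = max(pop + [best], key=fitness)  # keep the best ever seen
    return best

# One-max problem: maximize the number of 1s in the bit string.
solution = genetic_algorithm(fitness=sum)
print(sum(solution))  # fitness of the best string found
```

Any fitness function over bit strings, such as the knapsack fitness above, can be passed in place of `sum`.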

Applications: Genetic algorithms find application in bioinformatics, phylogenetics, computational science, engineering, economics, chemistry, manufacturing, mathematics, physics, and other fields.

Learning Robot Behavior using Genetic Algorithms: The robot has become such a prominent tool that it plays an increasingly important role in many different industries. As such, it has to operate with great efficiency and accuracy. This may not sound very difficult if the environment in which the robot operates remains unchanged, since the robot's behavior could be pre-programmed. However, if the environment is ever-changing, it becomes extremely difficult, if not impossible, for programmers to anticipate every possible behavior of the robot. Deploying robots in changing environments is not only inevitable in modern technology but is also becoming more frequent. This led to the development of the learning robot.

III. PARTICLE SWARM OPTIMIZATION

PSO was first designed to simulate birds seeking food, which was defined as a "cornfield vector". A bird finds food through social cooperation with the other birds around it (within its neighborhood). The method was then extended to multidimensional search.

Particle swarm optimization (PSO) is a computational method that optimizes a problem by iteratively trying to improve a candidate solution with regard to a given measure of quality. Such methods are commonly known as metaheuristics, as they make few or no assumptions about the problem being optimized and can search very large spaces of candidate solutions. However, metaheuristics such as PSO do not guarantee that an optimal solution is ever found.

PSO does not use the gradient of the problem being optimized, which means it does not require the optimization problem to be differentiable, as is required by classic optimization methods such as gradient descent and quasi-Newton methods. PSO can therefore also be used on optimization problems that are partially irregular, noisy, change over time, etc. [2].


PSO optimizes a problem by having a population of candidate solutions (called a swarm of particles) and moving these particles around in the search space according to a few simple mathematical formulae. The movements of the particles are guided by each particle's own best known position in the search space as well as the entire swarm's best known position, both of which are updated as better positions are discovered. These improved positions then guide the movements of the swarm, and the process is repeated until a satisfactory solution is discovered.
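The movement rule just described can be sketched as follows (a minimal global-best PSO for minimization; the inertia weight `w` and acceleration coefficients `c1`, `c2` are typical textbook values, not taken from this paper):

```python
import random

def pso(objective, dim, bounds, swarm_size=30, iterations=200,
        w=0.7, c1=1.5, c2=1.5):
    """Minimal global-best PSO minimizing `objective` over `bounds`."""
    lo, hi = bounds
    pos = [[random.uniform(lo, hi) for _ in range(dim)]
           for _ in range(swarm_size)]
    vel = [[0.0] * dim for _ in range(swarm_size)]
    pbest = [p[:] for p in pos]                 # personal best positions
    pbest_val = [objective(p) for p in pos]
    gbest = min(pbest, key=objective)           # swarm's best known position
    for _ in range(iterations):
        for i in range(swarm_size):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                # velocity = inertia + cognitive pull + social pull
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = objective(pos[i])
            if val < pbest_val[i]:              # improved personal best
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < objective(gbest):      # improved swarm best
                    gbest = pos[i][:]
    return gbest

# Minimize the sphere function f(x) = sum(x_i^2); optimum at the origin.
best = pso(lambda x: sum(v * v for v in x), dim=3, bounds=(-5.0, 5.0))
print([round(v, 3) for v in best])
```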

PSO Variants: Many variants of the basic PSO algorithm are possible, and new, more sophisticated variants are continually being introduced in an attempt to improve optimization performance. One trend in this research is to build hybrid optimization methods by combining PSO with other optimization techniques [3, 5]. Variants include:

- Discrete PSO
- Constriction-coefficient PSO
- Bare-bones PSO
- Fully informed PSO.

Applications: The first practical application of PSO was in the field of neural-network training and was reported together with the algorithm itself (Kennedy and Eberhart, 1995). Many more areas of application have been explored since, including telecommunications, control, data mining, design, combinatorial optimization, power systems, signal processing, and many others. PSO algorithms have been developed to solve:

- Constrained optimization problems
- Min-max problems
- Multi-objective optimization problems
- Dynamic tracking.

Implementation Algorithm: The PSO algorithm is simple in concept, easy to implement, and computationally efficient. The original PSO was implemented in a synchronous manner (Fig. 2), but an improved convergence rate is achieved by the asynchronous PSO algorithm [4, 6].

IV. GENETIC ALGORITHM VERSUS PARTICLE SWARM OPTIMIZATION

The GA is inherently discrete, i.e., it encodes the design variables into bits of 0s and 1s, so it easily handles discrete design variables; PSO is inherently continuous and must be modified to handle discrete design variables. Unlike the GA with its binary encoding, in PSO the design variables can take any values, even outside their side constraints, based on their current position in the design space and the calculated velocity vector.

Fig 2: Synchronous PSO algorithm (Parallel Processing).

V. CONCLUSION

Particle Swarm Optimization (PSO) is a relatively recent heuristic search method based on the idea of collaborative behavior and swarming in biological populations. PSO is similar to the Genetic Algorithm (GA) in the sense that both are population-based search approaches and both depend on information sharing among their population members to enhance their search processes, using a combination of deterministic and probabilistic rules. By contrast, the GA is a well-established algorithm with many versions and applications.

The GA is very helpful when the developer does not have precise domain expertise, because GAs possess the ability to explore and learn from their domain. PSO can be applied to multi-objective problems, in which the fitness comparison takes Pareto dominance into account when moving the PSO particles, and non-dominated solutions are stored so as to approximate the Pareto front.

The objective of this research paper is to test the hypothesis that, although PSO and the GA on average yield the same effectiveness (solution quality), PSO is more computationally efficient (uses fewer function evaluations) than the GA.

VI. REFERENCES

[1] D.E. Goldberg, Genetic Algorithms in Search, Optimization and Machine Learning, Addison-Wesley, 1989.

[2] J. Kennedy and R. Eberhart, "Particle Swarm Optimization," Proceedings of the IEEE International Conference on Neural Networks, Perth, Australia, 1995, pp. 1942-1945.

[3] J. Kennedy and R. Eberhart, Swarm Intelligence, 1st ed., Academic Press, San Diego, CA, 2001.

[4] Yuhui Shi, "Particle Swarm Optimization," IEEE Neural Network Society Magazine, 2004.

[5] R. Poli, "Analysis of the publications on the applications of particle swarm optimization," Journal of Artificial Evolution and Applications, Article ID 685175, 10 pages, 2008.

[6] R. Poli, J. Kennedy, and T. Blackwell, "Particle swarm optimization: an overview," Swarm Intelligence, 1(1), pp. 33-57, 2007.

SAPNA KATIYAR was born in Kanpur, India, in 1979. She received the Bachelor's degree in Electronics and Communication Engineering from Moradabad Institute of Technology in 2001 and the Master's degree in Communication Systems from the University of Rajasthan in 2006. She is currently pursuing the Ph.D. degree in Adaptive Control Systems at Jamia Millia Islamia, New Delhi. She is working as an Associate Professor in the Department of Electronics and Communications, ABES Institute of Technology, Ghaziabad, and has over 10 years of teaching experience at the Sharda Group of Institutions, Agra; IILM Academy of Higher Learning, Greater Noida; ITS, Greater Noida; and GNIT, Greater Noida. Her areas of interest include communication systems, signal processing, image processing, control systems with microcontrollers, and artificial intelligence. She has authored books for B.Tech students in Electronics & Communications and Electrical Engineering.
