IEEE TRANSACTIONS ON SYSTEMS, MAN, AND CYBERNETICS, VOL. SMC-16, NO. 1, JANUARY/FEBRUARY 1986
I. INTRODUCTION
(Figure: an adaptive strategy supplies control input c to an environment E and receives feedback f, evaluated by a performance measure u.)
Suppose we desire to optimize a process having a response surface u, which depends on some input vector x. It
is assumed that no initial information is available concerning the surface u, but that a black box evaluation procedure can be invoked to compute the scalar function u(x).
The state of the art in such situations is to perform some sort of random search, perhaps combined with local hill-climbing procedures [5], [9]. Genetic algorithms (GA's) are global optimization techniques that avoid many of the shortcomings exhibited by local search techniques on difficult search spaces.
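For concreteness, such a baseline might be sketched as follows; the bounds, step size, evaluation budget, and function name here are illustrative assumptions, with u standing for the black-box evaluation procedure:

```python
import random

def random_search_with_hillclimb(u, dim, bounds, evals=1000, step=0.1):
    """Minimize black-box u(x): sample random points, locally perturb each sample."""
    lo, hi = bounds
    best_x = [random.uniform(lo, hi) for _ in range(dim)]
    best_u = u(best_x)
    for _ in range(evals):
        # Global phase: draw a fresh random point from the search space.
        x = [random.uniform(lo, hi) for _ in range(dim)]
        # Local phase: one hill-climbing perturbation around the sample.
        x2 = [min(hi, max(lo, xi + random.uniform(-step, step))) for xi in x]
        for cand in (x, x2):
            val = u(cand)
            if val < best_u:
                best_u, best_x = val, cand
    return best_x, best_u

# Example: minimize a parabola, u(x) = sum(x_i^2), over [-5.12, 5.12]^3.
x, val = random_search_with_hillclimb(lambda x: sum(xi * xi for xi in x),
                                      dim=3, bounds=(-5.12, 5.12))
```

Each sample here is independent of every other; a GA instead maintains a population whose members exchange information through recombination.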
A GA is an iterative procedure which maintains a constant-size population P(t) of candidate solutions. During
each iteration step, called a generation, the structures in
the current population are evaluated, and, on the basis of
those evaluations, a new population of candidate solutions
is formed (see Fig. 3).
The initial population P(0) can be chosen heuristically or at random. The structures of the population P(t + 1) are chosen from P(t) by a randomized selection procedure that ensures that the expected number of times a structure is chosen is approximately proportional to that structure's relative performance.
    procedure GA
    begin
        t <- 0;
        initialize P(t);
        evaluate P(t);
        while (termination condition not satisfied) do
        begin
            t <- t+1;
            select P(t);
            recombine P(t);
            evaluate P(t);
        end
    end

Fig. 3. A skeleton genetic algorithm.
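The skeleton of Fig. 3 can be fleshed out into a minimal runnable sketch. The bit-string representation, fitness-proportionate selection, one-point crossover, and the specific rates below are illustrative choices, not necessarily the exact operators used in the experiments:

```python
import random

def genetic_algorithm(evaluate, n_bits, pop_size=50, crossover_rate=0.6,
                      mutation_rate=0.001, generations=100):
    """Maximize evaluate(bits) over fixed-length bit strings (Fig. 3 skeleton)."""
    # initialize and evaluate P(0)
    pop = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    fit = [evaluate(ind) for ind in pop]
    for _ in range(generations):                       # t <- t + 1
        # select P(t): expected copies proportional to relative performance
        pop = [list(ind) for ind in random.choices(pop, weights=fit, k=pop_size)]
        # recombine P(t): one-point crossover on adjacent pairs, then mutation
        for i in range(0, pop_size - 1, 2):
            if random.random() < crossover_rate:
                cut = random.randint(1, n_bits - 1)
                pop[i][cut:], pop[i + 1][cut:] = pop[i + 1][cut:], pop[i][cut:]
        for ind in pop:
            for j in range(n_bits):
                if random.random() < mutation_rate:
                    ind[j] = 1 - ind[j]
        fit = [evaluate(ind) for ind in pop]           # evaluate P(t)
    best = max(range(pop_size), key=lambda i: fit[i])
    return pop[best], fit[best]

# Example: maximize the number of 1-bits ("one-max") in a 20-bit string.
best, score = genetic_algorithm(sum, n_bits=20)
```

Note that selection alone only copies existing structures; it is the recombination step that generates new candidate solutions.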
We now describe experiments which attempted to optimize the performance of GA's on a given set of function
optimization problems. These experiments were designed
to search the space of GA's defined by six control parameters, and to identify the optimal parameter settings with
However, it is difficult to predict how the various parameters interact. For example, what is the effect of increasing the population size while lowering the crossover rate? The analytic optimization of this space is well beyond our current understanding of GA's. It is also clear that an exhaustive enumeration of the space is infeasible. Our approach was to apply a metalevel GA to the problem of identifying high-performance GA's. Each structure in the population of the metalevel GA consisted of an 18-bit vector which identified a particular GA. The performance of each GA was measured during the performance of a series of function optimization tasks. The metalevel GA used this information to conduct a search for high-performance algorithms.

TABLE I
FUNCTIONS COMPRISING THE TEST ENVIRONMENT

    Function   Dimensions   Size of Space   Description
    f1         3            1.0 x 10^9      parabola
    f2         2            1.7 x 10^6      Rosenbrock's saddle [23]
    f3         5            1.0 x 10^15     step function
    f4         30           1.0 x 10^72     quartic with noise
    f5         2            1.6 x 10^10     Shekel's foxholes [24]

B. Task Environment

Each GA was evaluated by using it to perform five optimization tasks, one for each of five carefully selected numerical test functions. As a result of these optimization tasks, the GA was assigned a value according to one of the performance measures explained below. The functions comprising the task environment have been studied in previous studies of GA's [3], [6] and included functions with various characteristics, including discontinuous, multidimensional, and noisy functions. Table I gives a brief description of the test functions.

C. Performance Measures

Two performance metrics for adaptive search strategies were considered, online performance and offline performance [6]. The online performance of a search strategy s on a response surface e is defined as follows:

    u_e(s, T) = ave_t(u_e(t)),   t = 0, 1, ..., T

where u_e(t) is the performance of the structure evaluated at time t. That is, online performance is the average performance of all tested structures over the course of the search.

The offline performance of a search strategy s on a response surface e is defined as follows:

    u*_e(s, T) = ave_t(u*_e(t)),   t = 0, 1, ..., T

where u*_e(t) is the best performance achieved in the time interval [0, t]. Offline performance is the relevant measure when the search can be performed offline (e.g., via a simulation model), while the best structure so far is used to control the online system.

In order to measure global robustness, corresponding performance measures are defined for the entire set of response surfaces E:

    U_E(s, T) = 100.0 * ave_{e in E}(u_e(s, T) / u_e(rand, T))
    U*_E(s, T) = 100.0 * ave_{e in E}(u*_e(s, T) / u*_e(rand, T))

where u_e(rand, T) and u*_e(rand, T) are the online and offline performance, respectively, of pure random search on response surface e. As normalized, U_E and U*_E for random search will be 100.0, while U_E and U*_E for more effective search strategies will be correspondingly lower (for minimization problems).

D. Experimental Procedures

Two experiments were performed, one to optimize online performance and one to optimize offline performance. For each experiment, the procedure for obtaining the optimum GA was as follows.

1) One thousand GA's were evaluated, using a metalevel GA to perform the search through the space of GA's defined by the six GA parameters. Each evaluation comprised running one GA against each of the five test functions for 5000 function evaluations and normalizing the result with respect to the performance of random search on the same function. The metalevel GA started with a population of 50 randomly chosen GA's and used the standard parameter settings, i.e., GA(50, 0.6, 0.001, 1.0, 7, E). Past experience has shown that these parameters yield a fairly good search for a variety of problems, and so this was the natural choice for the metalevel.

2) Since GA's are randomized algorithms, the performance of a GA during a single trial in the metalevel experiment represents a sample from a distribution of performances. Therefore, it was decided that the GA's showing the best performances during step 1 would be subjected to more extensive testing. Each of the 20 best GA's in step 1 was again run against the task environment, this time for five trials for each test function, using different random number seeds for each trial. The GA which exhibited the best performance in this step was declared the winner of the experiment.

IV. RESULTS

A. Experiment 1-Online Performance

The first experiment was designed to search for the optimal GA with respect to online performance on the task environment. Fig. 4 shows the average online performance for the 50 GA's in each of the 20 generations of experiment 1. Recall that the overall score for random search on the task environment is 100.0. From the initial data point in Fig. 4, we can estimate that the average online performance of all GA's in our search space is approximately 56.6, or about 43.4 percent better than random search. Fig. 4 shows that the final population of GA's had significantly better performance than the average GA.
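The performance measures used throughout these experiments translate directly into code. The helper names and the short evaluation traces below are hypothetical, and a minimization problem is assumed as in the text:

```python
def online_performance(trace):
    """u_e(s, T): average of all evaluations u_e(t), t = 0..T."""
    return sum(trace) / len(trace)

def offline_performance(trace):
    """u*_e(s, T): average of the best-so-far values u*_e(t), t = 0..T."""
    best_so_far, total = float("inf"), 0.0
    for u in trace:
        best_so_far = min(best_so_far, u)   # minimization: lower is better
        total += best_so_far
    return total / len(trace)

def normalized_score(traces, rand_traces, measure):
    """U_E(s, T): 100.0 times the average over e in E of u_e(s,T)/u_e(rand,T)."""
    ratios = [measure(t) / measure(r) for t, r in zip(traces, rand_traces)]
    return 100.0 * sum(ratios) / len(ratios)

# A strategy whose evaluations improve over time:
trace = [10.0, 8.0, 9.0, 4.0, 5.0]
print(online_performance(trace))    # (10 + 8 + 9 + 4 + 5) / 5 = 7.2
print(offline_performance(trace))   # (10 + 8 + 8 + 4 + 4) / 5 = 6.8
```

Offline performance never exceeds online performance on the same trace, since each best-so-far term is bounded by the corresponding evaluation.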
Fig. 5. Experiment 2.
C. General Observations
Besides suggesting optimal GA's, the above experiments
also provide performance data for 2000 GA's with various
parameter settings. Given that these are not independent
samples from the space of GA's, it is rather difficult to
make valid statistical inferences from these data. Nevertheless, the data suggest some regularities that might warrant further study.
The experimental data confirms several observations first
made by De Jong on the basis of a relatively small number
of experiments [6]. For example, mutation rates above 0.05
are generally harmful with respect to online performance,
Fig. 4. Experiment 1.
Fig. 6. Average online performance of various population sizes according to Experiment 1.
Fig. 7. Online performance of GA1 and ... task.
REFERENCES

[5] ..., Michigan, 1970.
[6] ... systems, Ph.D. thesis, Dept. Computer and Communication Sciences, Univ. of Michigan, 1975.
[7] ——, "Adaptive system design: A genetic approach," IEEE Trans. Syst., Man, Cybern., vol. SMC-10, no. 9, pp. 566-574, Sept. 1980.
[8] L. C. W. Dixon, "The choice of step length, a crucial factor in the performance of variable metric algorithms," in Numerical Methods for Nonlinear Optimization, F. A. Lootsma, Ed. New York: Academic, 1972.
[9] W. Farrell, Optimization Techniques for Computerized Simulation. Los Angeles: CACI, 1975.
[10] J. M. Fitzpatrick, J. J. Grefenstette, and D. Van Gucht, "Image registration by genetic search," in Proc. IEEE Southeastcon '84, pp. 460-464, Apr. 1984.
[11] L. J. Fogel, A. J. Owens, and M. J. Walsh, Artificial Intelligence Through Simulated Evolution. New York: Wiley, 1966.
[12] M. P. Fourman, "Compaction of symbolic layout using genetic algorithms," in Proc. Int. Conf. Genetic Algorithms and Their Applications, pp. 141-153, July 1985.
[13] D. R. Frantz, Non-linearities in genetic adaptive search, Ph.D. thesis, Dept. Computer and Communication Sciences, Univ. of Michigan, 1972.
[14] W. Frei, T. Shibata, and C. C. Chen, "Fast matching of non-stationary images with false fix protection," in Proc. 5th Int. Conf. Pattern Recognition, vol. 1, pp. 208-212, IEEE, 1980.
[15] D. Goldberg, "Computer-aided gas pipeline operation using genetic algorithms and rule learning," Ph.D. thesis, Dept. Civil Eng., Univ. of Michigan, 1983.
[16] J. J. Grefenstette, R. Gopal, B. J. Rosmaita, and D. Van Gucht, "Genetic algorithms for the traveling salesman problem," in Proc. Int. Conf. Genetic Algorithms and Their Applications, pp. 160-168, July 1985.
[17] J. H. Holland, Adaptation in Natural and Artificial Systems. Ann Arbor, MI: Univ. of Michigan Press, 1975.
[18] ——, "Escaping brittleness," in Proc. Int. Machine Learning Workshop, pp. 92-95, June 1983.
[19] R. B. Hollstien, Artificial Genetic Adaptation in Computer Control Systems, Ph.D. thesis, Dept. Computer and Communication Sciences, Univ. of Michigan, 1971.
[20] R. A. Kruger and S. J. Riederer, Basic Concepts of Digital Subtraction Angiography. Boston: G. K. Hall, 1984.
[21] J. J. Little, "Automatic registration of landsat MSS images to digital elevation models," in Proc. IEEE Workshop Computer Vision: Representation and Control, pp. 178-184, 1982.
[22] E. Pettit and K. M. Swigger, "An analysis of genetic-based pattern tracking," in Proc. National Conf. on AI, AAAI-83, pp. 327-332, 1983.
[23] H. H. Rosenbrock, "An automatic method for finding the greatest or least value of a function," Computer J., vol. 3, pp. 175-184, Oct. 1960.
[24] J. Shekel, "Test functions for multimodal search techniques," in Proc. 5th Ann. Princeton Conf. Inform. Sci. Syst., 1971.
[25] S. F. Smith, "Flexible learning of problem solving heuristics through adaptive search," in Proc. 8th IJCAI, 1983.