Important mathematicians in the optimization field
by Wikipedians
Note: This book is based on the Wikipedia article "Optimization_(mathematics)". The supporting articles are those referenced as major expansions of selected sections.
Table of Contents
Optimization
George Dantzig
Richard E. Bellman
Ronald A. Howard
Leonid Kantorovich
Narendra Karmarkar
William Karush
Harold W. Kuhn
Joseph Louis Lagrange
John von Neumann
Lev Pontryagin
Naum Z. Shor
Albert W. Tucker
Hoang Tuy
Optimization
In mathematics and computer science,
optimization, or mathematical programming,
refers to choosing the best element from some
set of available alternatives.
In the simplest case, this means solving problems
in which one seeks to minimize or maximize a
real function by systematically choosing the
values of real or integer variables from within an
allowed set. This formulation, using a scalar, real-
valued objective function, is probably the
simplest example; the generalization of
optimization theory and techniques to other
formulations comprises a large area of applied
mathematics. More generally, it means finding
"best available" values of some objective function
given a defined domain, including a variety of
different types of objective functions and
different types of domains.
History
The first optimization technique, known as steepest descent, goes back to Gauss. Historically, the first term to be introduced was linear programming, which was invented by George Dantzig in the 1940s; the underlying method had been introduced earlier by Leonid Kantorovich in 1939.
Major sub-fields
•Convex programming studies the case when the objective function is convex and the constraints, if any, form a convex set. This can be viewed as a particular case of nonlinear programming or as a generalization of linear or convex quadratic programming.
1. Linear programming (LP), a type of
convex programming, studies the case in
which the objective function f is linear
and the set of constraints is specified
using only linear equalities and
inequalities. Such a set is called a
polyhedron or a polytope if it is bounded.
2. Second order cone programming (SOCP)
is a convex program, and includes
certain types of quadratic programs.
3. Semidefinite programming (SDP) is a subfield of convex optimization where the underlying variables are semidefinite matrices. It is a generalization of linear and convex quadratic programming.
Multi-objective optimization
Adding more than one objective to an optimization problem adds complexity. For example, to optimize a structural design one would want a design that is both light and rigid. Because these two objectives conflict, a trade-off exists: no single design is simultaneously the lightest and the most rigid.
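The trade-off can be made concrete by computing which candidates of a small design set are Pareto-optimal, i.e. not beaten on both objectives at once. The design names and numbers below are purely illustrative, not taken from any real structural problem:

```python
# Hypothetical candidate designs as (mass, compliance) pairs, both to
# be minimised. A design is Pareto-optimal if no other design is at
# least as good in both objectives and strictly better in one.
designs = {
    "A": (1.0, 9.0),
    "B": (2.0, 4.0),
    "C": (3.0, 2.0),
    "D": (3.5, 2.5),   # dominated by C: heavier AND less rigid
}

def dominates(p, q):
    """p dominates q: no worse in every objective, better in at least one."""
    return all(a <= b for a, b in zip(p, q)) and any(a < b for a, b in zip(p, q))

def pareto_front(points):
    """Keep only the designs that no other design dominates."""
    return {k: v for k, v in points.items()
            if not any(dominates(w, v) for w in points.values() if w != v)}
```

Here A, B and C are all Pareto-optimal compromises between the two objectives, while D is dominated and can be discarded.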
Multi-modal optimization
Optimization problems are often multi-modal; that is, they possess multiple good solutions. They
could all be globally good (same cost function
value) or there could be a mix of globally good
and locally good solutions. Obtaining all (or at
least some of) the multiple solutions is the goal of
a multi-modal optimizer.
Classical optimization techniques, because of their iterative approach, do not perform satisfactorily when used to obtain multiple solutions, since there is no guarantee that different solutions will be obtained even with different starting points.
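A minimal sketch of the restart issue: a crude descent routine started from different points on a multimodal function may or may not land in different minima, so a multi-modal optimizer has to manage restarts explicitly. The objective and step size here are illustrative assumptions:

```python
def local_descent(f, x0, step=0.01, max_iter=100_000):
    """Crude descent: nudge x downhill in fixed steps until no move helps."""
    x = x0
    for _ in range(max_iter):
        if f(x - step) < f(x):
            x -= step
        elif f(x + step) < f(x):
            x += step
        else:
            break
    return round(x, 1)

# A multimodal objective with two global minima, near x = -1 and x = +1.
f = lambda x: (x * x - 1) ** 2

# Different starting points can converge to different minima; collecting
# the distinct results is the job of a multi-modal optimizer.
minima = {local_descent(f, x0) for x0 in (-2.0, -0.5, 0.5, 2.0)}
```

For this function the four starts recover both minima, but nothing in the descent routine itself guarantees that outcome.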
Dimensionless optimization
Dimensionless optimization (DO) is used in design
problems, and consists of the following steps:
•Rendering the dimensions of the design
dimensionless
•Selecting a local region of the design space to
perform analysis on
•Creating an I-optimal design within the local
design space
•Forming response surfaces based on the
analysis
•Optimizing the design based on the evaluation of
the objective function, using the response surface
models
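As a one-dimensional stand-in for the response-surface step, classical parabolic interpolation fits a quadratic surrogate through three samples and optimizes the surrogate. This only illustrates the surrogate idea; it is not the I-optimal, multi-dimensional procedure described above:

```python
def parabolic_min(f, x0, h):
    """Fit a parabola through f at x0 - h, x0, x0 + h (a tiny quadratic
    response surface) and return the minimiser of that surrogate."""
    y_minus, y0, y_plus = f(x0 - h), f(x0), f(x0 + h)
    # Vertex of the interpolating parabola through three equally spaced points.
    return x0 - 0.5 * h * (y_plus - y_minus) / (y_plus - 2 * y0 + y_minus)

# For a quadratic objective the surrogate is exact in a single step.
x_star = parabolic_min(lambda x: (x - 1) ** 2, x0=0.0, h=1.0)
```

In practice the surrogate step is repeated: the model is refit in a new local region around the current best point, mirroring the loop of steps listed above.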
Optimization problems
An optimization problem can be represented in
the following way
Given: a function f : A → R from some set A
to the real numbers
Sought: an element x0 in A such that f(x0) ≤
f(x) for all x in A ("minimization") or such that
f(x0) ≥ f(x) for all x in A ("maximization").
Such a formulation is called an optimization
problem or a mathematical programming
problem (a term not directly related to computer
programming, but still in use for example in
linear programming - see History above). Many
real-world and theoretical problems may be
modeled in this general framework. Problems
formulated using this technique in the fields of
physics and computer vision may refer to the
technique as energy minimization, speaking of
the value of the function f as representing the
energy of the system being modeled.
Typically, A is some subset of the Euclidean space ℝⁿ, often specified by a set of constraints, equalities, or inequalities that the members of A have to satisfy.
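When A is a small finite set, the definition can be applied literally by checking every element; a minimal sketch:

```python
def argmin(f, A):
    """Return an x0 in A with f(x0) <= f(x) for all x in A (a minimiser).
    Ties are broken by the first occurrence in A."""
    return min(A, key=f)

# Minimise f(x) = (x - 3)**2 over a finite feasible set A (illustrative).
A = [0, 1, 2, 4, 5]
x0 = argmin(lambda x: (x - 3) ** 2, A)   # both 2 and 4 attain the value 1
```

For continuous domains this brute-force check is of course unavailable, which is why the analytical and computational techniques discussed later are needed.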
Notation
Optimization problems are often expressed with special notation. For example:

min x² + 1
x ∈ ℝ

This denotes the problem of minimizing the objective function x² + 1 over all real x; the minimum value is 1, attained at x = 0.
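The same problem can be solved numerically, here by plain gradient descent; the step size and starting point are arbitrary illustrative choices:

```python
def gradient_descent(df, x=5.0, lr=0.1, steps=200):
    """Minimise a smooth function by repeatedly stepping against its
    derivative df. Returns the final iterate."""
    for _ in range(steps):
        x -= lr * df(x)
    return x

# f(x) = x**2 + 1 has derivative f'(x) = 2x; the minimum value 1 is at x = 0.
x_star = gradient_descent(lambda x: 2 * x)
```

Each step multiplies the iterate by 0.8, so it shrinks geometrically toward the minimizer x = 0.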
Analytical characterization of
optima
Computational optimization
techniques
Optimization methods are crudely divided into two groups:

SVO - Single-variable optimization
MVO - Multi-variable optimization

For twice-differentiable functions, unconstrained problems can be solved by finding the points where the gradient of the objective function is zero (that is, the stationary points) and using the Hessian matrix to classify the type of each point: if the Hessian is positive definite, the point is a local minimum; if negative definite, a local maximum; and if indefinite, a saddle point.
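For a function of two variables, the Hessian test above can be sketched with Sylvester's criterion on the 2×2 Hessian (leading minors a and ac − b²):

```python
def classify_stationary_point(hessian):
    """Classify a stationary point from its symmetric 2x2 Hessian
    [[a, b], [b, c]]: positive definite -> local minimum, negative
    definite -> local maximum, indefinite -> saddle point."""
    (a, b), (_, c) = hessian
    det = a * c - b * b
    if det > 0:
        return "local minimum" if a > 0 else "local maximum"
    if det < 0:
        return "saddle point"
    return "inconclusive"   # semidefinite case: higher-order tests needed

# f(x, y) = x**2 + y**2 has Hessian [[2, 0], [0, 2]] at its stationary
# point (0, 0), so the test reports a local minimum.
```

For more than two variables the same idea applies via the signs of the Hessian's eigenvalues.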
•Cuckoo search
•Differential evolution
•Dynamic relaxation
•Evolution strategy
•Filled function method
•Firefly algorithm
•Genetic algorithms
•Harmony search
•Hill climbing
•IOSO
•Particle swarm optimization
•Quantum annealing
•Simulated annealing
•Stochastic tunneling
•Tabu search
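Several of the methods above are variations on simple local search. Hill climbing, for instance, can be sketched in a few lines; the objective and neighbourhood structure here are illustrative:

```python
import random

def hill_climbing(f, x0, neighbours, iters=1000, seed=0):
    """Basic hill climbing for minimisation: repeatedly pick a random
    neighbour and move to it whenever it improves the objective."""
    rng = random.Random(seed)
    x = x0
    for _ in range(iters):
        cand = rng.choice(neighbours(x))
        if f(cand) < f(x):
            x = cand
    return x

# Minimise f(n) = (n - 7)**2 over the integers, stepping by +/- 1.
best = hill_climbing(lambda n: (n - 7) ** 2, 0, lambda n: [n - 1, n + 1])
```

Methods like simulated annealing and tabu search refine this skeleton by occasionally accepting worse moves, to escape the local optima that plain hill climbing gets stuck in.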
Applications
Problems in rigid body dynamics (in particular articulated rigid body dynamics) often require mathematical programming techniques, since rigid body dynamics can be viewed as attempting to solve an ordinary differential equation on a constraint manifold.
George Dantzig

George Bernard Dantzig
Born: November 8, 1914, Portland, Oregon
Died: May 13, 2005 (aged 90), Stanford, California
Citizenship: American
Fields: Mathematics; operations research; computer science; economics; statistics
Institutions: U.S. Air Force Office of Statistical Control; RAND Corporation; University of California, Berkeley; Stanford University
Alma mater: University of Maryland (bachelor's degrees); University of Michigan (master's degree); University of California, Berkeley (Ph.D.)
Doctoral advisor: Jerzy Neyman
Doctoral students: Ilan Adler, Kurt Anstreicher, John Birge, Richard W. Cottle, B. Curtis Eaves, Robert Fourer, Saul Gass, Alfredo Iusem, Ellis Johnson, Hiroshi Konno, Irvin Lustig, Thomas Magnanti, S. Thomas McCormick V, David Morton, Mukund Thapa, Craig Tovey, Alan Tucker, Richard Van Slyke, Roger J-B Wets, Robert Wittrock, Yinyu Ye
Known for: Linear programming; simplex algorithm; Dantzig-Wolfe decomposition principle; generalized linear programming; generalized upper bounding; max-flow min-cut theorem of networks; quadratic programming; complementary pivot algorithms; linear complementarity problem; stochastic programming
Influences: Wassily Leontief, John von Neumann, Marshal K. Wood
Influenced: Kenneth J. Arrow, Robert Dorfman, Leonid Hurwicz, Tjalling C. Koopmans, Thomas L. Saaty, Paul Samuelson, Philip Wolfe
Notable awards: John von Neumann Theory Prize (1974); National Medal of Science (1975)
George Bernard Dantzig (November 8, 1914 – May 13, 2005) was an American mathematician, and Professor Emeritus of Transportation Sciences and Professor of Operations Research and of Computer Science at Stanford.

Dantzig is known for his development of the simplex algorithm, an algorithm for solving linear programming problems, and for his work with linear programming, some years after it was initially invented by the Soviet economist and mathematician Leonid Kantorovich. Dantzig is also the real-life protagonist of the story, often thought to be an urban legend, of a student who solved two famous unsolved problems after mistaking them for homework (see "Biography" below).
Biography
George Dantzig was born in Portland, Oregon; his middle name "Bernard" was chosen in honor of the writer George Bernard Shaw. His father, Tobias Dantzig, was a mathematician.
Work
Dantzig is "generally regarded as one of the three founders of linear programming, along with John von Neumann and Leonid Kantorovich". According to Freund (1994), "through his research in mathematical theory, computation, economic analysis, and applications to industrial problems, he has contributed more than any other researcher to the remarkable development of linear programming".
Mathematical statistics
An event in Dantzig's life became the origin of a famous story in 1939, while he was a graduate student at UC Berkeley. Near the beginning of a class for which Dantzig was late, professor Jerzy Neyman wrote two examples of famously unsolved statistics problems on the blackboard. When Dantzig arrived, he assumed that the two problems were a homework assignment and wrote them down. According to Dantzig, the problems "seemed to be a little harder than usual", but a few days later he handed in complete solutions, still believing they were an overdue assignment.
Linear programming
In 1946, as mathematical adviser to the U.S. Air Force Comptroller, he was challenged by his Pentagon colleagues to see what he could do to mechanize the Air Force planning process.
Richard E. Bellman
Richard E. Bellman
Born: August 26, 1920, New York City, New York
Died: March 19, 1984 (aged 63)
Fields: Mathematics and control theory
Alma mater: Princeton University; University of Wisconsin–Madison; Brooklyn College
Known for: Dynamic programming
Richard Ernest Bellman (August 26, 1920 – March 19, 1984) was an applied mathematician, celebrated for his invention of dynamic programming in 1953 and for important contributions in other fields of mathematics.
Biography
Bellman was born in 1920 in New York City,
where his father John James Bellman ran a small
grocery store on Bergen Street near Prospect
Park in Brooklyn. Bellman completed his studies
at Abraham Lincoln High School in 1937, and
studied mathematics at Brooklyn College where
he received a BA in 1941. He later earned an MA from the University of Wisconsin–Madison and a Ph.D. from Princeton University in 1946.
Work
Bellman equation
A Bellman equation, also known as a dynamic
programming equation, is a necessary condition
for optimality associated with the mathematical
optimization method known as dynamic
programming. Almost any problem which can be solved using optimal control theory can also be solved by analyzing the appropriate Bellman equation.
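A minimal illustration of a Bellman equation: for a deterministic shortest-path problem, the value of a state satisfies V(s) = min over actions of [cost + V(successor)], which value iteration solves by repeated sweeps. The graph below is hypothetical:

```python
# Hypothetical deterministic shortest-path problem: from each state an
# action follows one outgoing edge with the given cost; "T" is terminal.
edges = {
    "A": {"B": 1, "C": 4},
    "B": {"C": 2, "T": 6},
    "C": {"T": 1},
}

def value_iteration(edges, sweeps=10):
    """Solve the Bellman equation V(s) = min_a [cost(s, a) + V(next(s, a))]
    by repeated sweeps until the values settle."""
    V = {s: 0.0 for s in edges}
    V["T"] = 0.0
    for _ in range(sweeps):
        for s, succ in edges.items():
            V[s] = min(cost + V[t] for t, cost in succ.items())
    return V
```

The resulting V gives the optimal cost-to-go from each state, the discrete-time analogue of the value function that the HJB equation (next section) provides in continuous time.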
Hamilton–Jacobi–Bellman equation
The Hamilton–Jacobi–Bellman (HJB) equation is a partial differential equation which is
central to optimal control theory. The solution of
the HJB equation is the 'value function', which
gives the optimal cost-to-go for a given dynamical
system with an associated cost function. Classical
variational problems, for example, the
brachistochrone problem can be solved using this
method as well.
The equation is a result of the theory of dynamic
programming which was pioneered in the 1950s
by Richard Bellman and coworkers. The
corresponding discrete-time equation is usually
referred to as the Bellman equation. In
continuous time, the result can be seen as an
extension of earlier work in classical physics on
the Hamilton-Jacobi equation by William Rowan
Hamilton and Carl Gustav Jacob Jacobi.
Curse of dimensionality
The "curse of dimensionality" is a term coined by Bellman to describe the problem caused by the exponential increase in volume associated with adding extra dimensions to a (mathematical) space. One implication of the curse of dimensionality is that some methods for numerical solution of the Bellman equation require vastly more computer time when there are more state variables in the value function. For example, 100 evenly spaced sample points suffice to sample a unit interval with no more than 0.01 distance between points; an equivalent sampling of a 10-dimensional unit hypercube with a lattice of spacing 0.01 between adjacent points would require 10²⁰ sample points: thus, in some sense, the 10-dimensional hypercube can be said to be a factor of 10¹⁸ "larger" than the unit interval. (Adapted from an example by R. E. Bellman.)
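The arithmetic of the example is easy to reproduce:

```python
# Sampling the unit interval at spacing 0.01 takes about 100 points.
points_1d = 100

# The same lattice spacing across a 10-dimensional unit hypercube needs
# 100**10 = 10**20 points, a factor of 10**18 more than the interval.
points_10d = points_1d ** 10
blowup = points_10d // points_1d
```

This exponential growth is exactly why grid-based solution of the Bellman equation becomes infeasible as state variables are added.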
Bellman–Ford algorithm
The Bellman–Ford algorithm, sometimes referred to as the Label Correcting Algorithm, computes single-source shortest paths in a weighted digraph (where some of the edge weights may be negative).
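A standard self-contained implementation (relax every edge n − 1 times, then make one extra pass to detect negative cycles):

```python
def bellman_ford(edges, n, source):
    """Single-source shortest paths allowing negative edge weights.
    edges: list of (u, v, w) with vertices 0..n-1. Returns the distance
    list, or None if a negative cycle is reachable from the source."""
    INF = float("inf")
    dist = [INF] * n
    dist[source] = 0
    for _ in range(n - 1):                 # relax all edges n-1 times
        for u, v, w in edges:
            if dist[u] + w < dist[v]:
                dist[v] = dist[u] + w
    for u, v, w in edges:                  # any further improvement means
        if dist[u] + w < dist[v]:          # a reachable negative cycle
            return None
    return dist

# Example with a negative edge but no negative cycle.
dist = bellman_ford([(0, 1, 4), (0, 2, 5), (1, 2, -3)], n=3, source=0)
```

Here the path 0 → 1 → 2 of cost 1 beats the direct edge of cost 5, which Dijkstra's algorithm would miss in the presence of negative weights.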
Publications
Over the course of his career he published 619
papers and 39 books. During the last 11 years of
his life he published over 100 papers despite
suffering from crippling complications of a brain
surgery (Dreyfus, 2003). A selection:
•1959. Asymptotic Behavior of Solutions of
Differential Equations
•1961. An Introduction to Inequalities
•1961. Adaptive Control Processes: A Guided
Tour
•1962. Applied Dynamic Programming
•1967. Introduction to the Mathematical Theory
of Control Processes
•1970. Algorithms, Graphs and Computers
•1972. Dynamic Programming and Partial
Differential Equations
Ronald A. Howard
Ronald A. Howard (born August 27, 1934) has
been a professor at Stanford University since
1965. In 1964 he defined the profession of
decision analysis, and since then has been
developing the field as professor in the
Department of Engineering-Economic Systems
(now the Department of Management Science
and Engineering) in the School of Engineering at
Stanford.
Professor Howard directs teaching and research
in decision analysis at Stanford, and is the
Director of the Decisions and Ethics Center,
which examines the efficacy and ethics of social
arrangements. He was a founding Director and
Chairman of Strategic Decisions Group. Current
research interests are improving the quality of
decisions, life-and-death decision making, and the
creation of a coercion-free society.
In 1986 he received the Operations Research
Society of America's Frank P. Ramsey Medal "for
distinguished contributions in decision analysis".
In 1998 he received from the Institute for Operations Research and the Management Sciences (INFORMS) the first award for the teaching of operations research/management science practice.
Publications
•1960. Dynamic Programming and Markov
Processes, The M.I.T. Press.
•1971. Dynamic Probabilistic Systems (two
volumes), John F. Wiley & Sons, Inc., New York
City.
•1977. Readings in Decision Analysis. With Jim E.
Matheson (editors). SRI International, Menlo
Park, California.
Leonid Kantorovich
Leonid Vitaliyevich Kantorovich
Born: 19 January 1912, Saint Petersburg, Russian Empire
Died: 7 April 1986 (aged 74), Moscow, USSR
Nationality: Soviet
Fields: Mathematics
Alma mater: Leningrad State University
Known for: Linear programming; Kantorovich theorem; normed vector lattices (Kantorovich spaces); Kantorovich metric; Kantorovich inequality; approximation theory; iterative methods; functional analysis; numerical analysis; scientific computing
Notable awards: Sveriges Riksbank Prize in Economic Sciences in Memory of Alfred Nobel (1975)
Leonid Vitaliyevich Kantorovich (Russian:
Леонид Витальевич Канторович) (19 January
1912, Saint Petersburg – 7 April 1986, Moscow)
was a Soviet/Russian mathematician and economist, known for his theory and development of techniques for the optimal allocation of resources, for which he was awarded the 1975 Nobel Memorial Prize in Economics, shared with Tjalling C. Koopmans.
Mathematics
In mathematical analysis, Kantorovich had
important results in functional analysis,
approximation theory, and operator theory.
In particular, Kantorovich formulated
fundamental results in the theory of normed
vector lattices, which are called "K-spaces" in his
honor.
Kantorovich showed that functional analysis
could be used in the analysis of iterative methods,
obtaining the Kantorovich inequalities on the
convergence rate of the gradient method and of
Newton's method.
Kantorovich considered infinite-dimensional
optimization problems, such as the Kantorovich-
Monge problem of mass transfer. His analysis
proposed the Kantorovich metric, which is used
in probability theory, in the theory of the weak
convergence of probability measures.
Narendra Karmarkar
Narendra K. Karmarkar (born 1957) is an
Indian mathematician, renowned for developing
Karmarkar's algorithm. He is listed as an ISI
highly cited researcher.
Biography
Narendra Karmarkar was born in Gwalior to a
Marathi family. Karmarkar received his B.Tech from IIT Bombay in 1978, his M.S. from the California Institute of Technology, and his Ph.D. in computer science from the University of California, Berkeley.
He published his famous result in 1984 while he
was working for Bell Laboratories in New Jersey.
Karmarkar was a professor at the Tata Institute
of Fundamental Research in Bombay.
He is currently working on a new architecture for
supercomputing. Some of the ideas are published at the Fab5 conference organised by the MIT Center for Bits and Atoms.
Karmarkar has received a number of awards for his algorithm.
Work
Karmarkar's algorithm
Karmarkar's algorithm solves linear
programming problems in polynomial time. These
problems are represented by "n" variables and
"m" constraints. The previous method of solving these problems, the simplex method, represents the feasible region as a many-sided solid with a possibly exponential number of vertices, and approaches the solution by traversing from vertex to vertex. Karmarkar's novel interior-point method instead approaches the solution by cutting through the interior of this solid.
Consequently, complex optimization problems are
solved much faster using the Karmarkar
algorithm. A practical example of this efficiency
is the solution to a complex problem in
communications network optimization where the
solution time was reduced from weeks to days.
His algorithm thus enables faster business and
policy decisions. Karmarkar's algorithm has
stimulated the development of several other
interior point methods, some of which are used in
current codes for solving linear programs.
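The vertex-based picture that Karmarkar's method departs from can be illustrated by brute-force vertex enumeration for a tiny two-dimensional LP. This is emphatically not Karmarkar's algorithm, only the geometry it improves on: the optimum of a linear objective over a polytope lies at a vertex, and vertex-walking methods exploit that.

```python
from itertools import combinations

def lp_by_vertex_enumeration(c, constraints):
    """Maximise c.x over {x : a.x <= b} in 2-D by checking every vertex,
    i.e. every feasible intersection of two constraint boundaries.
    constraints: list of ((a1, a2), b) meaning a1*x + a2*y <= b."""
    eps = 1e-9
    best = None
    for ((a1, a2), b1), ((a3, a4), b2) in combinations(constraints, 2):
        det = a1 * a4 - a2 * a3
        if abs(det) < eps:
            continue                      # parallel boundaries: no vertex
        x = (b1 * a4 - b2 * a2) / det     # Cramer's rule for the 2x2 system
        y = (a1 * b2 - a3 * b1) / det
        if all(a[0] * x + a[1] * y <= b + eps for a, b in constraints):
            value = c[0] * x + c[1] * y
            if best is None or value > best:
                best = value
    return best

# Maximise x + y subject to x <= 3, y <= 2, x >= 0, y >= 0.
cons = [((1, 0), 3), ((0, 1), 2), ((-1, 0), 0), ((0, -1), 0)]
best = lp_by_vertex_enumeration((1, 1), cons)
```

The number of vertices can grow exponentially with the problem size, which is why a polynomial-time interior-point method was such a significant advance.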
Research on computer
architecture
Galois geometry
After working on the interior point method, Karmarkar worked on a new architecture for supercomputing, based on concepts from Galois geometry (projective geometry over finite fields).
Current investigations
Currently, he is synthesizing these concepts with
some new ideas he calls sculpturing free space (a
non-linear analogue of what has popularly been
described as folding the perfect corner). This
approach allows him to extend this work to the
physical design of machines. He is now
publishing updates on his recent work, including
an extended abstract. This new paradigm was
presented at IVNC, Poland on 16 July 2008, and
at MIT on 25 July 2008. Some of the recent work is published at the Fab5 conference organised by the MIT Center for Bits and Atoms.
William Karush

William Karush
Born: March 1, 1917
Died: February 22, 1997 (aged 79)
Known for: Contribution to the Karush–Kuhn–Tucker conditions
William Karush (March 1, 1917 – February 22, 1997) was a professor emeritus at California State University, Northridge, and a mathematician best known for his contribution to the Karush–Kuhn–Tucker conditions. He was the first to publish the necessary conditions for the inequality-constrained problem, in his master's thesis, although he became renowned only after a seminal conference paper by Harold W. Kuhn and Albert W. Tucker.
Selected works
•Webster's New World Dictionary of
Mathematics, MacMillan Reference Books,
Revised edition (April 1989), ISBN 978-
0131926677
•On the Maximum Transform and Semigroups of Transformations (1998), Richard Bellman, William Karush.
Harold W. Kuhn

Harold W. Kuhn
Residence: United States
Nationality: American
Fields: Mathematics
Institutions: Princeton University
Known for: Hungarian method; Karush–Kuhn–Tucker conditions
Notable awards: John von Neumann Theory Prize
Harold William Kuhn (born 1925) is an
American mathematician who studied game
theory. He won the 1980 John von Neumann
Theory Prize along with David Gale and Albert W.
Tucker. A Professor-Emeritus of Mathematics at
Princeton University, he is known for the Karush-
Kuhn-Tucker conditions, for developing Kuhn
poker as well as the description of the Hungarian
method for the assignment problem.
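The assignment problem solved by the Hungarian method can be stated directly as code via exhaustive search. The brute force below checks all n! assignments, whereas Kuhn's method runs in polynomial time, but it makes the objective explicit; the cost matrix is illustrative:

```python
from itertools import permutations

def min_cost_assignment(cost):
    """Solve the assignment problem by exhaustive search: cost[i][j] is
    the cost of giving task j to worker i. Returns (total, permutation),
    where permutation[i] is the task assigned to worker i. This checks
    n! assignments and is NOT the polynomial-time Hungarian method."""
    n = len(cost)
    return min(
        (sum(cost[i][p[i]] for i in range(n)), p)
        for p in permutations(range(n))
    )

best_cost, best_perm = min_cost_assignment([[4, 1, 3],
                                            [2, 0, 5],
                                            [3, 2, 2]])
```

For this matrix the cheapest assignment gives task 1 to worker 0, task 0 to worker 1, and task 2 to worker 2, at a total cost of 5.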
He is known for his association with John Forbes
Nash, as a fellow graduate student, a lifelong
friend and colleague, and a key figure in getting
Nash the attention of the Nobel Prize committee
that led to Nash's 1994 Nobel Prize in Economics.
Kuhn and Nash both had long associations and collaborations with Albert W. Tucker, who was Nash's dissertation advisor. Kuhn co-edited The Essential John Nash.
Bibliography
•Kuhn, H. W. (1955), "The Hungarian method for
the assignment problem", Naval Research
Logistics Quarterly, 2:83–87.
1. Republished as: Kuhn, H. W. (2005),
"The Hungarian method for the
assignment problem", Naval Research
Logistics, 52(1):7–21. DOI:
10.1002/nav.20053.
•Guillermo Owen (2004) IFORS' Operational
Research Hall of Fame Harold W. Kuhn
International Transactions in Operational
Research 11 (6), 715–718. doi:10.1111/j.1475-
3995.2004.00486.
•Kuhn, H.W. "Classics in Game Theory."
(Princeton University Press, 1997). ISBN 978-0-
691-01192-9.
Notes
Lagrange did not have a doctoral advisor, but academic genealogy authorities link his intellectual heritage to Leonhard Euler, who played the equivalent mentoring role.
Joseph Louis Lagrange
Scientific contribution
Lagrange was one of the creators of the calculus
of variations, deriving the Euler–Lagrange
equations for extrema of functionals. He also
extended the method to take into account
possible constraints, arriving at the method of
Lagrange multipliers. Lagrange invented the method of solving differential equations known as variation of parameters, applied differential calculus to the theory of probabilities, and did notable work on the solution of algebraic equations. He proved that every natural number is a sum of four squares. His treatise Théorie des fonctions analytiques laid some of the foundations of group theory, anticipating Galois.
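Lagrange's four-square theorem is easy to verify computationally for small numbers; a brute-force sketch:

```python
from itertools import product
from math import isqrt

def four_squares(n):
    """Find (a, b, c, d) with a*a + b*b + c*c + d*d == n. Lagrange's
    four-square theorem guarantees such a tuple exists for every
    natural number n; this brute force finds one."""
    limit = isqrt(n)
    for a, b, c in product(range(limit + 1), repeat=3):
        rest = n - a * a - b * b - c * c
        if rest >= 0 and isqrt(rest) ** 2 == rest:
            return (a, b, c, isqrt(rest))
    return None    # never reached, by the theorem

# Every n from 0 to 200 has a representation.
all_ok = all(four_squares(n) is not None for n in range(201))
```

The theorem itself (proved below under "Number Theory", 1770) is what guarantees the search always succeeds.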
Biography
Early years
Lagrange was born, of French and Italian descent
(a paternal great grandfather was a French army
officer who then moved to Turin), as Giuseppe
Lodovico Lagrangia in Turin. His father, who
had charge of the Kingdom of Sardinia's military
chest, was of good social position and wealthy,
but before his son grew up he had lost most of his
property in speculations, and young Lagrange
had to rely on his own abilities for his position.
He was educated at the college of Turin, but it was not until he was seventeen that he showed any taste for mathematics.
Variational calculus
Lagrange is one of the founders of the calculus of
variations. Starting in 1754, he worked on the
problem of tautochrone, discovering a method of
maximizing and minimizing functionals in a way
similar to finding extrema of functions. Lagrange
wrote several letters to Leonhard Euler between
1754 and 1756 describing his results. He outlined
his "δ-algorithm", leading to the Euler–Lagrange
equations of variational calculus and considerably
simplifying Euler's earlier analysis. Lagrange also
applied his ideas to problems of classical
mechanics, generalizing the results of Euler and
Maupertuis.
Euler was very impressed with Lagrange's results. It has sometimes been stated that "with characteristic courtesy he withheld a paper he had previously written, which covered some of the same ground, in order that the young Italian might have time to complete his work and claim the undisputed invention of the new calculus"; however, this chivalrous view has been disputed.
Miscellanea Taurinensia
In 1758, with the aid of his pupils, Lagrange
established a society, which was subsequently
incorporated as the Turin Academy of Sciences,
and most of his early writings are to be found in
the five volumes of its transactions, usually
known as the Miscellanea Taurinensia. Many of
these are elaborate papers. The first volume
contains a paper on the theory of the propagation
of sound; in this he indicates a mistake made by
Newton, obtains the general differential equation
for the motion, and integrates it for motion in a
straight line. This volume also contains the
complete solution of the problem of a string
vibrating transversely; in this paper he points out
a lack of generality in the solutions previously
given by Brook Taylor, D'Alembert, and Euler,
and arrives at the conclusion that the form of the
curve at any time t is given by the equation
y = a sin(mx) sin(nt). The article concludes with a masterly discussion of echoes, beats, and compound sounds. Other articles in this volume are on recurring series, probabilities, and the calculus of variations.
Berlin Academy
Already in 1756 Euler, with support from
Maupertuis, made an attempt to bring Lagrange
to the Berlin Academy. Later, D'Alembert
interceded on Lagrange's behalf with Frederick
of Prussia and wrote to Lagrange asking him to
leave Turin for a considerably more prestigious
position in Berlin. Lagrange turned down both
offers, responding in 1765 that
It seems to me that Berlin would not be at all suitable for me while M. Euler is there.
In 1766 Euler left Berlin for Saint Petersburg,
and Frederick wrote to Lagrange expressing the
wish of "the greatest king in Europe" to have "the
greatest mathematician in Europe" resident at his
court. Lagrange was finally persuaded and he
spent the next twenty years in Prussia, where he
produced not only the long series of papers
published in the Berlin and Turin transactions,
but his monumental work, the Mécanique
analytique. His residence at Berlin commenced
with an unfortunate mistake. Finding most of his
colleagues married, and assured by their wives
that it was the only way to be happy, he married;
his wife soon died, but the union was not a happy
one.
France
In 1786, Frederick died, and Lagrange, who had
found the climate of Berlin trying, gladly
accepted the offer of Louis XVI to move to Paris.
He received similar invitations from Spain and
Naples. In France he was received with every
mark of distinction and special apartments in the
Louvre were prepared for his reception, and he
became a member of the French Academy of
Sciences, which later became part of the National
Institute. At the beginning of his residence in Paris he was seized with an attack of melancholy, and even the printed copy of his Mécanique, on which he had worked for a quarter of a century, lay for more than two years unopened on his desk.
École normale
In 1795, Lagrange was appointed to a
mathematical chair at the newly established
École normale, which enjoyed only a brief
existence of four months. His lectures here were
quite elementary, and contain nothing of any
special importance, but they were published
because the professors had to "pledge themselves
to the representatives of the people and to each
other neither to read nor to repeat from
memory," and the discourses were ordered to be
taken down in shorthand in order to enable the
deputies to see how the professors acquitted
themselves.
École Polytechnique
Lagrange was appointed professor of the École
Polytechnique in 1794; and his lectures there are
described by mathematicians who had the good
fortune to be able to attend them, as almost
perfect both in form and matter. Beginning with
the merest elements, he led his hearers on until,
almost unknown to themselves, they were
themselves extending the bounds of the subject:
above all he impressed on his pupils the
advantage of always using general methods
expressed in a symmetrical notation.
On the other hand, Fourier, who attended his
lectures in 1795, wrote:
His voice is very feeble, at least in that he
does not become heated; he has a very
pronounced Italian accent and pronounces
the s like z … The students, of whom the
majority are incapable of appreciating him,
give him little welcome, but the professors
make amends for it.
Late years
In 1810, Lagrange commenced a thorough
revision of the Mécanique analytique, but he was
able to complete only about two-thirds of it
before his death in 1813. He was buried that same year in the Panthéon in Paris.
Work in Berlin
Lagrange was extremely active scientifically
during the twenty years he spent in Berlin. Not only
did he produce his splendid Mécanique
analytique, but he contributed between one and
two hundred papers to the Academy of Turin, the
Berlin Academy, and the French Academy. Some
of these are really treatises, and all without
exception are of a high order of excellence.
Except for a short time when he was ill he
produced on average about one paper a month.
Of these, note the following as amongst the most
important.
First, his contributions to the fourth and fifth
volumes, 1766–1773, of the Miscellanea
Taurinensia; of which the most important was the
one in 1771, in which he discussed how
numerous astronomical observations should be
combined so as to give the most probable result.
Lagrangian mechanics
Between 1772 and 1788, Lagrange re-formulated
Classical/Newtonian mechanics to simplify
formulas and ease calculations. These mechanics
are called Lagrangian mechanics.
Algebra
The greater number of his papers during this
time were, however, contributed to the Prussian
Academy of Sciences. Several of them deal with
questions in algebra.
•His discussion of representations of integers by
quadratic forms (1769) and by more general
algebraic forms (1770).
•His tract on the Theory of Elimination, 1770.
•Lagrange's theorem that the order of a
subgroup H of a group G must divide the order of
G.
•His papers of 1770 and 1771 on the general
process for solving an algebraic equation of any
degree via the Lagrange resolvents. This method
fails to give a general formula for solutions of an
equation of degree five and higher, because the
auxiliary equation involved has higher degree
than the original one. The significance of this
method is that it exhibits the already known
formulas for solving equations of second, third,
and fourth degrees as manifestations of a single
principle, and was foundational in Galois theory.
The complete solution of a binomial equation of
any degree is also treated in these papers.
Number Theory
Several of his early papers also deal with
questions of number theory.
•Lagrange (1766–1769) was the first to prove
that Pell's equation x^2 − ny^2 = 1 has a nontrivial
solution in the integers for any non-square
natural number n.
•He proved the theorem, stated by Bachet
without justification, that every positive integer is
the sum of four squares, 1770.
•He proved Wilson's theorem that if n is a prime,
then (n − 1)! + 1 is always a multiple of n, 1771.
•His papers of 1773, 1775, and 1777 gave
demonstrations of several results enunciated by
Fermat, and not previously proved.
•His Recherches d'Arithmétique of 1775
developed a general theory of binary quadratic
forms to handle the general problem of when an
integer is representable by such a form.
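The theorems listed above are easy to check computationally for small numbers. A brute-force sketch in Python (the helper names are my own, for illustration only):

```python
from itertools import product
from math import factorial, isqrt

def is_prime(n):
    # trial division up to isqrt(n)
    return n >= 2 and all(n % d for d in range(2, isqrt(n) + 1))

def wilson_holds(n):
    # Wilson's theorem: for n > 1, n is prime exactly when
    # (n - 1)! + 1 is a multiple of n.
    return (factorial(n - 1) + 1) % n == 0

def four_squares(n):
    # Lagrange's four-square theorem: every natural number is a sum of
    # four squares; exhaustive search over 0..isqrt(n) in each slot.
    r = isqrt(n)
    for a, b, c, d in product(range(r + 1), repeat=4):
        if a * a + b * b + c * c + d * d == n:
            return (a, b, c, d)
```

For every n from 2 to 24, `wilson_holds(n)` agrees with primality, and `four_squares` always finds a decomposition, as the theorems guarantee.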
Astronomy
Lastly, there are numerous papers on problems in
astronomy. Of these the most important are the
following:
•His work of 1772 on the three-body problem,
which led to the discovery of the Lagrangian
points.
Mécanique analytique
Over and above these various papers he
composed his great treatise, the Mécanique
analytique. In this he lays down the law of virtual
work, and from that one fundamental principle,
by the aid of the calculus of variations, deduces
the whole of mechanics, both of solids and fluids.
The object of the book is to show that the subject
is implicitly included in a single principle, and to
give general formulae from which any particular
result can be obtained. The method of
generalized co-ordinates by which he obtained
this result is perhaps the most brilliant result of
his analysis. Instead of following the motion of
each individual part of a material system, as
D'Alembert and Euler had done, he showed that,
if we determine its configuration by a sufficient
number of variables whose number is the same as
that of the degrees of freedom possessed by the
system, then the kinetic and potential energies of
the system can be expressed in terms of those
variables, and the differential equations of motion
thence deduced by simple differentiation.
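The procedure just described, expressing the kinetic and potential energies in a generalized coordinate and deducing the motion by differentiation, can be illustrated numerically. A minimal sketch for a simple pendulum, whose single generalized coordinate is the angle theta (the integrator choice and function names are my own, not Lagrange's):

```python
from math import cos, sin

def simulate_pendulum(theta0, omega0=0.0, g=9.81, l=1.0,
                      dt=1e-4, steps=100_000):
    """Integrate theta'' = -(g/l) sin(theta), the equation of motion
    that Lagrange's method yields for L = T - V with the generalized
    coordinate theta, using a semi-implicit Euler step."""
    theta, omega = theta0, omega0
    for _ in range(steps):
        omega -= (g / l) * sin(theta) * dt
        theta += omega * dt
    return theta, omega

def total_energy(theta, omega, m=1.0, g=9.81, l=1.0):
    # T + V; conserved along exact trajectories of the system
    return 0.5 * m * l**2 * omega**2 - m * g * l * cos(theta)
```

Because the dynamics conserve T + V, the simulated energy stays close to its initial value, which is a convenient check that the derived equation of motion is correct.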
Work in France
Infinitesimals
At a later period Lagrange reverted to the use of
infinitesimals in preference to founding the
differential calculus on the study of algebraic
forms; and in the preface to the second edition of
the Mécanique, which was issued in 1811, he
justifies the employment of infinitesimals.
Continued fractions
His Résolution des équations numériques,
published in 1798, was also the fruit of his
lectures at École Polytechnique. There he gives
the method of approximating to the real roots of
an equation by means of continued fractions, and
enunciates several other theorems. In a note at
the end he shows how Fermat's little theorem
that
a^(p−1) − 1 ≡ 0 (mod p)
where p is a prime and a is prime to p, may be
applied to give the complete algebraic solution of
any binomial equation. He also here explains how
the equation whose roots are the squares of the
differences of the roots of the original equation
may be used so as to give considerable
information as to the position and nature of those
roots.
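Both results lend themselves to quick numerical checks. The sketch below (helper names are my own) computes the convergents of a continued fraction and watches them close in on a real root, here the positive root of x^2 − 2 = 0, whose expansion is [1; 2, 2, 2, …]; the last line checks Fermat's little theorem directly:

```python
from fractions import Fraction

def convergents(terms):
    """Successive convergents h_k / k_k of the continued fraction
    [a0; a1, a2, ...], built with the standard two-term recurrence."""
    h_prev, h = 1, terms[0]
    k_prev, k = 0, 1
    out = [Fraction(h, k)]
    for a in terms[1:]:
        h, h_prev = a * h + h_prev, h
        k, k_prev = a * k + k_prev, k
        out.append(Fraction(h, k))
    return out

# sqrt(2), the positive root of x^2 - 2 = 0, is [1; 2, 2, 2, ...];
# each convergent approximates the root to within 1/k^2.
approx = convergents([1] + [2] * 10)[-1]

# Fermat's little theorem: a^(p-1) ≡ 1 (mod p) for prime p and a
# not divisible by p; checked here for p = 29.
fermat_ok = all(pow(a, 29 - 1, 29) == 1 for a in range(1, 29))
```

The rapid 1/k^2 convergence of the convergents is what makes the continued-fraction method effective for locating real roots.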
Apocrypha
He was of medium height and slightly formed,
with pale blue eyes and a colorless complexion.
He was nervous and timid; he detested
controversy, and, to avoid it, willingly allowed
others to take credit for what he had done
himself.
Due to thorough preparation, he was usually able
to write out his papers complete without a single
crossing-out or correction.
References
•Columbia Encyclopedia, 6th ed., 2005,
"Lagrange, Joseph Louis."
•W. W. Rouse Ball, 1908, "Joseph Louis
Lagrange (1736–1813)," A Short Account of the
History of Mathematics, 4th ed.
•Chanson, Hubert, 2007, "Velocity Potential in
Real Fluid Flows: Joseph-Louis Lagrange's
Contribution," La Houille Blanche 5: 127–31.
•Fraser, Craig G., 2005, "Théorie des fonctions
analytiques" in Grattan-Guinness, I., ed.,
Landmark Writings in Western Mathematics.
Elsevier: 258–76.
•Lagrange, Joseph-Louis, 1811. Mécanique
Analytique. Courcier (reissued by Cambridge
University Press, 2009; ISBN 9781108001748).
•Lagrange, J.L., 1781, "Mémoire sur la Théorie
du Mouvement des Fluides" (Memoir on the
Theory of Fluid Motion) in Serret, J.A., ed., 1867.
Oeuvres de Lagrange, Vol. 4. Paris: Gauthier-
Villars: 695–748.
•Pulte, Helmut, 2005, "Méchanique Analytique"
in Grattan-Guinness, I., ed., Landmark Writings
in Western Mathematics. Elsevier: 208–24.
John von Neumann
Biography
The eldest of three brothers, von Neumann was
born Neumann János Lajos (in Hungarian the
family name comes first) on December 28, 1903
in Budapest, Austro-Hungarian Empire, to a
wealthy Jewish family. His father was Neumann
Miksa (Max Neumann), a lawyer who worked in a
bank. His mother was Kann Margit (Margaret
Kann). Von Neumann's ancestors had originally
immigrated to Hungary from Russia.
John von Neumann
Quantum mechanics
At the International Congress of Mathematicians
of 1900, David Hilbert presented his famous list
of twenty-three problems considered central for
the development of the mathematics of the new
century. The sixth of these was the
axiomatization of physical theories. Among the
new physical theories of the century the only one
which had yet to receive such a treatment by the
end of the 1930s was quantum mechanics.
Quantum mechanics found itself in a condition of
foundational crisis similar to that of set theory at
the beginning of the century, facing problems of
both philosophical and technical natures. On the
one hand, its apparent non-determinism had not
been reduced to an explanation of a deterministic
form. On the other, there still existed two
independent but equivalent heuristic
formulations, the so-called matrix mechanical
formulation due to Werner Heisenberg and the
wave mechanical formulation due to Erwin
Schrödinger, but there was not yet a single,
unified satisfactory theoretical formulation.
After having completed the axiomatization of set
theory, von Neumann began to confront the
axiomatization of quantum mechanics. He
immediately realized, in 1926, that a quantum
system could be considered as a point in a so-
called Hilbert space.
Nuclear weapons
In the late 1930s, von Neumann began
to take more of an interest in applied (as opposed
to pure) mathematics. In particular, he developed
an expertise in explosions—phenomena which are
difficult to model mathematically. This led him to
a large number of military consultancies,
primarily for the Navy, which in turn led to his
involvement in the Manhattan Project. The
involvement included frequent trips by train to
the project's secret research facilities in Los
Alamos, New Mexico.
Von Neumann's principal contribution to the
atomic bomb itself was in the concept and design
of the explosive lenses needed to compress the
plutonium core of the Trinity test device and of
the "Fat Man" weapon later dropped on Nagasaki.
Computer science
Von Neumann's hydrogen bomb work also
played out in the realm of computing, where he
and Stanislaw Ulam developed simulations on von
Neumann's digital computers for the
hydrodynamic computations. During this time he
contributed to the development of the Monte
Carlo method, which allowed complicated
problems to be approximated using random
numbers. Because using lists of "truly" random
numbers was extremely slow, von Neumann
developed a form of making pseudorandom
numbers, using the middle-square method.
Though this method has been criticized as crude,
von Neumann was aware of this: he justified it as
being faster than any other method at his
disposal, and also noted that when it went awry it
did so obviously, unlike other methods which
could be subtly incorrect.
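Both techniques mentioned above are simple enough to sketch. The first function below follows von Neumann's middle-square recipe (square the state, keep the middle digits); the second shows the Monte Carlo idea by estimating pi, using Python's own generator in place of middle-square, since many middle-square seeds collapse quickly and conspicuously to zero, the "obvious" failure mode he noted. Function names are mine, for illustration:

```python
import random

def middle_square(state, digits=4):
    """One step of von Neumann's middle-square method: square the
    current state and keep the middle `digits` digits as the next
    pseudorandom state."""
    s = str(state * state).zfill(2 * digits)
    mid = (len(s) - digits) // 2
    return int(s[mid:mid + digits])

# A short run of the generator from seed 1234:
seq, state = [], 1234
for _ in range(5):
    state = middle_square(state)
    seq.append(state)
# seq is [5227, 3215, 3362, 3030, 1809]

def estimate_pi(samples=100_000, seed=0):
    """Monte Carlo estimate of pi: 4 times the fraction of uniform
    points in the unit square that land inside the quarter circle."""
    rng = random.Random(seed)  # modern stand-in for the original PRNG
    inside = sum(
        1 for _ in range(samples)
        if rng.random() ** 2 + rng.random() ** 2 <= 1.0
    )
    return 4 * inside / samples
```

With 100,000 samples the estimate typically lands within a few hundredths of pi, illustrating how random sampling approximates a quantity that is awkward to compute directly.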
Personality
Von Neumann invariably wore a conservative
grey flannel business suit - he was even known to
play tennis wearing his business suit - and he
enjoyed throwing large parties at his home in
Princeton, occasionally twice a week. His white
clapboard house at 26 Westcott Road was one of
the largest in Princeton. Despite being a
notoriously bad driver, he nonetheless enjoyed
driving (frequently while reading a book) -
occasioning numerous arrests as well as
accidents. When Cuthbert Hurd hired him as a
consultant to IBM, Hurd often quietly paid the
fines for his traffic tickets.
Honors
The John von Neumann Theory Prize of the
Institute for Operations Research and the
Management Sciences (INFORMS, previously
TIMS-ORSA) is awarded annually to an individual
(or group) that has made fundamental and
sustained contributions to theory in operations
research and the management sciences.
The IEEE John von Neumann Medal is awarded
annually by the IEEE "for outstanding
achievements in computer-related science and
technology."
Selected works
•Jean van Heijenoort, 1967. A Source Book in
Mathematical Logic, 1879-1931. Harvard Univ.
Press.
1. 1923. On the introduction of transfinite
numbers, 346-54.
2. 1925. An axiomatization of set theory,
393-413.
Biographical material
•Aspray, William, 1990. John von Neumann and
the Origins of Modern Computing.
•Chiara, Dalla, Maria Luisa and Giuntini, Roberto
1997, La Logica Quantistica in Boniolo, Giovani,
ed., Filosofia della Fisica (Philosophy of Physics).
Bruno Mondadori.
•Goldstine, Herman, 1980. The Computer from
Pascal to von Neumann.
•Halmos, Paul R., 1985. I Want To Be A
Mathematician. Springer-Verlag.
•Hashagen, Ulf, 2006: Johann Ludwig Neumann
von Margitta (1903–1957). Teil 1: Lehrjahre eines
jüdischen Mathematikers während der Zeit der
Weimarer Republik. In: Informatik-Spektrum 29
(2), S. 133-141.
•Hashagen, Ulf, 2006: Johann Ludwig Neumann
von Margitta (1903–1957). Teil 2: Ein
Privatdozent auf dem Weg von Berlin nach
Princeton. In: Informatik-Spektrum 29 (3), S. 227-
236.
Popular periodicals
•Good Housekeeping Magazine, September 1956
Married to a Man Who Believes the Mind Can
Move the World
•Life Magazine, February 25, 1957 Passing of a
Great Mind
Video
•John von Neumann, A Documentary (60 min.),
Mathematical Association of America
References
This article was originally based on material from
the Free On-line Dictionary of Computing, which
is licensed under the GFDL.
•Doran, Robert S.; Kadison, Richard V., eds.
(2004). Operator Algebras, Quantization, and
Noncommutative Geometry: A Centennial
Celebration Honoring John von Neumann and
Marshall H. Stone. American Mathematical
Society. ISBN 9780821834022.
•Heims, Steve J. (1980). John von Neumann and
Norbert Wiener: From Mathematics to the
Technologies of Life and Death. MIT Press.
Lev Pontryagin
Lev Semenovich Pontryagin (Russian: Лев
Семёнович Понтрягин) (3 September 1908 –
3 May 1988) was a Soviet Russian
mathematician. He was born in Moscow and lost
his eyesight in a primus stove explosion when he
was 14. Despite his blindness he was able to
become a mathematician due to the help of his
mother Tatyana Andreevna who read
mathematical books and papers (notably those of
Heinz Hopf, J. H. C. Whitehead and Hassler
Whitney) to him. He made major discoveries in a
number of fields of mathematics, including the
geometric parts of topology.
Work
He worked on duality theory for homology while
still a student. He went on to lay foundations for
the abstract theory of the Fourier transform, now
called Pontryagin duality. In topology he posed
the basic problem of cobordism theory. This led
to the introduction around 1940 of a theory of
characteristic classes, now called Pontryagin
classes, designed to vanish on a manifold that is a
boundary. Moreover, in operator theory there are
Pontryagin spaces, a class of indefinite inner
product spaces named after him.
Naum Z. Shor
Naum Z. Shor
Naum Zuselevich Shor
Born 1 January 1937
Kiev, Ukraine, USSR
Died 26 February 2006 (aged 69)
Nationality Soviet Union
Ukraine
Institutions V.M. Glushkov Institute of
Cybernetics, Kiev, Ukraine
Naum Zuselevich Shor (Ukrainian: Шор Наум
Зуселевич) (1 January 1937 – 26 February
2006) was a Soviet and Ukrainian mathematician
specializing in optimization.
He made significant contributions to nonlinear
and stochastic programming, numerical
techniques for non-smooth optimization, discrete
optimization problems, matrix optimization, and
dual quadratic bounds in multi-extremal
programming problems.
Shor became a full member of the National
Academy of Science of Ukraine in 1998.
Subgradient methods
N. Z. Shor is well known for his method of
generalized gradient descent with space dilation
in the direction of the difference of two
successive subgradients (the so-called r-
algorithm). The ellipsoid method was re-
invigorated by A.S. Nemirovsky and D.B. Yudin,
who developed a careful complexity analysis of its
approximation properties for problems of convex
minimization with real data. However, it was
Leonid Khachiyan who provided the rational-
arithmetic complexity analysis, using an ellipsoid
algorithm, that established that linear
programming problems can be solved in
polynomial time.
It has long been known that the ellipsoidal
methods are special cases of these subgradient-
type methods.
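For contrast with Shor's space-dilation r-algorithm, which the text above only names, here is the plain subgradient method that such algorithms refine, in a minimal Python sketch (the function names and the 1/k step-size rule are my own illustrative choices):

```python
def subgradient_descent(f, subgrad, x0, steps=2000):
    """Minimize a convex, possibly non-smooth, function with the basic
    subgradient method and a diminishing step size 1/k. The best point
    seen is tracked, since subgradient steps need not decrease f
    monotonically."""
    x, best_x, best_f = x0, x0, f(x0)
    for k in range(1, steps + 1):
        g = subgrad(x)
        x = x - (1.0 / k) * g
        if f(x) < best_f:
            best_x, best_f = x, f(x)
    return best_x

# f(x) = |x - 3| is convex but not differentiable at its minimizer;
# any value of sign(x - 3) serves as a subgradient there.
f = lambda x: abs(x - 3.0)
g = lambda x: 1.0 if x > 3.0 else (-1.0 if x < 3.0 else 0.0)
```

The iterates oscillate around the kink at x = 3 with shrinking amplitude, which is why the method tracks the best point rather than returning the final one; space-dilation methods like the r-algorithm speed this basic scheme up by reshaping the metric along successive subgradient differences.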
References
Bibliography
•"Congratulations to Naum Shor on his 65th
birthday", Journal of Global Optimization 24 (2):
111–114, 2002, doi:10.1023/A:1020215832722.
Albert W. Tucker
Albert W. Tucker
Born: 28 November 1905, Oshawa, Ontario,
Canada
Died: 25 January 1995 (aged 89), Hightstown,
N.J., U.S.
Residence: U.S.
Nationality: American; Canadian
Fields: Mathematics (combinatorial topology,
optimization)
Institutions: Princeton University
Alma mater: University of Toronto; Princeton
University
Doctoral advisor: Solomon Lefschetz
Doctoral students: David Gale, Marvin Minsky,
John Forbes Nash, Torrence Parsons, Lloyd
Shapley
Known for: Prisoner's dilemma, Karush-Kuhn-
Tucker conditions, combinatorial linear algebra
Influenced: Harold W. Kuhn, David Gale, R. T.
Rockafellar
Hoang Tuy
Hoàng Tụy (born December 17, 1927) is a
prominent Vietnamese applied mathematician.
He and Le Van Thiem are regarded as the two
founders of Vietnamese mathematics. He
received his PhD in mathematics from Moscow
State University in 1959. He has worked mainly
in global optimization, a field in which he did
pioneering work, and has published more than
160 refereed journal and conference articles. He
is presently with the Institute of Mathematics of
the Vietnamese Academy of Science and
Technology, where he served as director from
1980 to 1989. His son, Hoang Duong Tuan, is an
Associate Professor in Electrical Engineering and
Telecommunications at the University of New
South Wales, Australia, where he works on
applications of optimization in various
engineering fields. His son-in-law, Phan Thien
Thach, also works on optimization.
In 1997, a workshop in honor of Hoang Tuy was
organized at the Linköping Institute of
Technology, Sweden.
In December 2007, an international conference
on Nonconvex Programming was held in Rouen,
France to pay tribute to him on the occasion of
his 80th birthday, in recognition of his pioneering
work in global optimization.