
Disclaimer
This offering is not approved or endorsed by OpenCFD Limited, the producer of the OpenFOAM software and owner of the OPENFOAM and OpenCFD trade marks.
Optimization with DAKOTA and OpenFOAM
University of Genoa, DICCA
Dipartimento di Ingegneria Civile, Chimica e Ambientale
From 17th to 21st February, 2014
Your Lecturer
Joel GUERRERO
joel.guerrero@unige.it
guerrero@wolfdynamics.com

Your Lecturer
Damiano NATALI
damiano.natali@unige.it
natali@wolfdynamics.com

Your Lecturer
Matteo BARGIACCHI
matteo.bargiacchi@unige.it
bargiacchi@wolfdynamics.com
Today's lecture
1. Introduction to optimization methods
2. DAKOTA overview
3. Working with DAKOTA: Rosenbrock function
4. Working with DAKOTA: Branin function
5. Coupling DAKOTA and OpenFOAM: driven cavity case
6. Coupling DAKOTA and OpenFOAM: Ahmed body case
7. Coupling DAKOTA and OpenFOAM: NACA airfoil case (multi-objective optimization)
Introduction to optimization methods
What is optimization?
In plain English, optimization is the act of obtaining the best
result under given circumstances. This applies to any field
(finance, health, construction, operations, manufacturing,
transportation, engineering design, sales, public services, mail,
and so on).
The ultimate goal is either to minimize or to maximize an
outcome, a process or a function.
Optimization, in its broadest sense, can be applied to solve any
real life problem. Some typical applications from different
engineering disciplines indicate the wide scope of the subject:
Introduction to optimization methods
What is optimization?
Some typical applications indicate the wide scope of the subject:
Design of aircraft and aerospace structures for minimum weight.
Reduction of fuel consumption in transportation.
Design of civil engineering structures for minimum cost.
Optimum design of mechanical components.
Optimum design of electrical networks.
Shortest route taken by a salesperson visiting various cities during one tour.
Optimal production planning, controlling, and scheduling.
Selection of a site for an industry.
Planning of maintenance to reduce operating costs.
Inventory control.
Controlling the waiting times, idle times, and queuing in production lines to reduce costs.
Planning the best strategy to obtain maximum profit in the presence of a competitor.
Introduction to optimization methods
What is optimization?
Mathematically speaking, optimization is the minimization or maximization of a quantity of interest (objective function), subject to constraints on its variables. An optimization problem can be stated as follows:

Find $X = (x_1, x_2, \ldots, x_n)^T$ which minimizes or maximizes $f(X)$,

subject to the constraints

$g_j(X) \le 0, \quad j = 1, 2, \ldots, m$
$l_j(X) = 0, \quad j = 1, 2, \ldots, p$

where $X$ is an n-dimensional vector called the design vector, $f(X)$ is the objective function, and $g_j(X)$ and $l_j(X)$ are known as the inequality and equality constraints, respectively.
Introduction to optimization methods
Multimodal function
Let us use this figure as a model problem to introduce a few concepts. The
goal is to optimize this function or quantity of interest.
By the way, this figure can easily represent an actual application.
The minimum of $f(X)$ is the same as the maximum of $-f(X)$. This $f(X)$ has one local minimum, one local maximum, and one global minimum.
Introduction to optimization methods
Multimodal function
In physical or computer experiments, the quantity of interest often does not exhibit smooth behavior. Very often the response is noisy, which makes optimization very challenging. Also, evaluating the objective function can be very expensive.
(Figure: smooth f(X) vs. noisy f(X))
Introduction to optimization methods
Constrained multimodal function
We can optimize this function or quantity of interest, subject to many linear and/or non-linear constraints (equalities and inequalities).
Constrained $f(X)$: the grey area represents the non-feasible region.
Introduction to optimization methods
Multi-objective optimization
In multi-objective optimization we are interested in optimizing more than one objective
function simultaneously.
The final goal is to find a representative set of optimal solutions (the Pareto frontier), quantify the trade-offs in satisfying the different objectives, and/or find a single solution that satisfies the subjective preferences of a human decision maker.
DAKOTA overview
DAKOTA in a nutshell

You can download the DAKOTA toolkit at the following link:
http://dakota.sandia.gov/
Official releases and nightly stable releases are freely available worldwide via GNU GPL.

DAKOTA overview
DAKOTA in a nutshell

DAKOTA stands for Design and Analysis toolKit for Optimization and Terascale Applications.
DAKOTA is a general-purpose software toolkit for performing systems analysis and design on high-performance computers. DAKOTA provides algorithms for design optimization, uncertainty quantification, parameter estimation, design of experiments, and sensitivity analysis, as well as a range of parallel computing and simulation interfacing services.
DAKOTA is developed and supported by Sandia National Laboratories (USA).
DAKOTA is well documented and comes with many tutorials.
Extensive support is available via a dedicated mailing list.
DAKOTA overview
DAKOTA capabilities

Parameter Studies (PS).
Design of Experiments (DOE) / Design and Analysis of Computer Experiments (DACE).
Sensitivity Analysis (SA).
Uncertainty Quantification (UQ).
Optimization (OPT) via gradient-based, derivative-free local, and global methods.
Surrogate-Based Optimization (SBO).
Calibration: parameter estimation / nonlinear least squares capabilities.
Time-tested and advanced research algorithms to address challenging science and engineering simulations.
Generic interface to black-box solvers.
Scalable parallel computations, from desktops to clusters.
Asynchronous evaluations.
Matlab, Scilab, and AMPL interfaces.
DAKOTA overview
Typical automatic parameter variation study
(Diagram: DAKOTA, performing optimization, sensitivity analysis, uncertainty quantification, parametric studies, or design of experiments, sends input parameters (design variables) to a black-box solver and receives response metrics (design functions). The black box can be any kind of software; the only requirement is that it must be able to run from the shell.)
DAKOTA overview
Workflow of a DAKOTA case with a generic interface to a black box solver
(Diagram: DAKOTA reads a DAKOTA input file and writes a DAKOTA output file. For every function evaluation, DAKOTA writes a parameters file that is read by a user-supplied analysis driver script. The script pre-processes the code input (e.g. with APREPRO/DPREPRO), runs the black-box simulator, automatically post-processes the code output, and writes a results file back to DAKOTA. The VARIABLES, INTERFACE, RESPONSES, and METHOD blocks of the input file control this loop.)
DAKOTA overview
DAKOTA Input file




strategy
  single_method
  graphics, tabular_graphics_data
    tabular_graphics_file = 'rosen_grad_opt.dat'

method
  max_iterations = 100
  convergence_tolerance = 1e-4
  conmin_frcg

model
  single

variables
  continuous_design = 2
    initial_point  -1.2   1.0
    lower_bounds   -2.0  -2.0
    upper_bounds    2.0   2.0
    descriptors    'x1'  'x2'

interface
  analysis_driver = 'rosenbrock'
  direct

responses
  objective_functions = 1
  numerical_gradients
  no_hessians
responses (required): model output(s) to be studied.
interface (required): map from variables to responses; control parallelism.
variables (required): parameters input to simulation; different types, domains for different studies.
method (required): specify iterative algorithm and settings.
strategy (optional): coordinate hybrid methods and high level settings.
model (optional): single, surrogate, nested.
DAKOTA overview
DAKOTA execution
DAKOTA is most commonly run from a UNIX or Windows command prompt; it can also be run from JAGUAR (the DAKOTA GUI).
To run DAKOTA, just type:
dakota -i input_file.in
To run DAKOTA and save the standard output stream (stdout), i.e. the input variable and response information for each function evaluation, method-specific information, and so on:
dakota -i input_file.in -o output_file.out
DAKOTA overview
DAKOTA execution
In the input file, the strategy keyword tabular_graphics_data generates a tabular listing of inputs and outputs, called dakota_tabular.dat by default. This is useful for Excel, Matlab, gnuplot, or any other plotting or post-processing package.
To get additional information and command-line options:
dakota -help
By the way, the banana method also applies to DAKOTA: if you misspell something or use a keyword that does not exist, DAKOTA will list the available options.
DAKOTA overview
Summary of DAKOTA optimization methods
Gradient-based Optimization:

DOT: frcg, bfgs, mmfd, slp, sqp (commercial)
CONMIN: frcg, mfd
NPSOL: sqp (commercial)
NLPQLP: sqp (commercial)
OPT++: cg, Newton, quasi-Newton

Derivative-free Optimization:

COLINY: PS, EA, Solis-Wets, COBYLA, DIRECT
JEGA: MOGA, SOGA
EGO: efficient global optimization via Gaussian Process models
NCSU: DIRECT
OPT++: PDS (Parallel Direct Search, simplex based method)
Parameter studies: vector, list, centered, grid, multidimensional

Design of experiments:

DDACE: LHS, MC, grid, OA, OA_LHS, CCD, BB
FSUDace: CVT, Halton, Hammersley
PSUADE: MOAT
Sampling: LHS, MC, Incr. LHS, IS/AIS/MMAIS

Multi-objective optimization, Pareto, hybrid, multi-start, and surrogate-based optimization (local and global).
DAKOTA overview
Choosing an optimization method
Gradient-based methods:
Look for improvement based on derivative information.
Require analytic or numerical derivatives.
Efficient and scalable for smooth problems.
Converge to a local extremum.


DAKOTA overview
Choosing an optimization method
Derivative-free local:
Sample with bias/rules toward improvement.
Require only function values.
Good when derivatives are noisy, unreliable, or expensive.
Converge to a local extremum.


DAKOTA overview
Choosing an optimization method
Derivative-free global:
Broad exploration with selective exploitation.
Require only function values.
Typically computationally intensive.
Converge to a global extremum.
Suitable for multi-objective optimization.


DAKOTA overview
Choosing an optimization method
Key considerations (see the DAKOTA User's Manual, Usage Guidelines):
Trend and smoothness (perform local and global sensitivity analysis).
Simulation cost.
Constraint types present; single or multi-objective.
Goal: local optimization (improvement) or global optimization (best
possible).
Variable types present (real, integer, categorical).
Any special structure, e.g., quadratic objective, highly linearly
constrained.
Resources available.


DAKOTA overview
Choosing an optimization method
Unconstrained or bound-constrained problems
Smooth and cheap: any method but gradient-based will be fastest.
Smooth and expensive: gradient-based methods.
Non-smooth and cheap: non-gradient methods such as pattern search (local opt.), genetic algorithms (global opt.), DIRECT (global opt.), or surrogate-based optimization (quasi local/global opt.).
Non-smooth and expensive: surrogate-based optimization (SBO).
Nonlinearly-constrained problems
Smooth and cheap: gradient-based methods.
Smooth and expensive: gradient-based methods.
Non-smooth and cheap: non-gradient methods, SBO.
Non-smooth and expensive: SBO.


Working with DAKOTA: Rosenbrock function
Rosenbrock function: the banana function

$f(x, y) = (1 - x)^2 + 100\,(y - x^2)^2$

Global minimum: $f(x, y) = 0$ at $(x, y) = (1, 1)$
Search domain: $-2 \le x \le 2$, $-2 \le y \le 2$
In the directory $ptodac/model_fuctions/rosenbrock/ you will find the input files to run
these cases.
Working with DAKOTA: Rosenbrock function
Rosenbrock function: the banana function
It actually looks like a banana (well, use your imagination).
Working with DAKOTA: Rosenbrock function
Rosenbrock function: the banana function
In an ideal world, the optimization is done on a smooth function. But in reality (numerical or physical experiments), we have something like this (next slide).
Working with DAKOTA: Rosenbrock function
Rosenbrock function: the banana function
In reality, the quantity of interest often exhibits some noise. Here, optimization using gradient-based methods does not perform very well. We are going to study a sample case later on.
Working with DAKOTA: Rosenbrock function
So, what can we do in DAKOTA?

Hereafter, I will briefly explain a few of the methods available in DAKOTA.
If you are interested in learning more, please (I am begging you) refer to the user's guide, developer's manual, and reference manual.
Unlike the other software we are using, DAKOTA's documentation is extremely complete.
Working with DAKOTA: Rosenbrock function
Multidimensional study
In the directory $ptodac/model_fuctions/rosenbrock/rosenbrock_multi1 you will find
the input files to run this case.
Use multidimensional experiments for parametric studies. The output can be used in a sensitivity analysis, uncertainty quantification, initial screening, or for building a surrogate.
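As a rough guide, the core of such an input file can be as small as the sketch below (a minimal sketch only: the number of partitions is arbitrary, and keyword spellings should be checked against the DAKOTA reference manual for your version). It reuses the built-in 'rosenbrock' driver from the earlier example.

method
  multidim_parameter_study
    partitions = 8 8                 # 9 x 9 grid of evaluation points

variables
  continuous_design = 2
    lower_bounds  -2.0  -2.0
    upper_bounds   2.0   2.0
    descriptors   'x1'  'x2'

interface
  analysis_driver = 'rosenbrock'
  direct

responses
  objective_functions = 1
  no_gradients
  no_hessians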
Working with DAKOTA: Rosenbrock function
DACE: Latin Hypercube Sampling (LHS)
In the directory $ptodac/model_fuctions/rosenbrock/rosenbrock_dace1 you will find
the input files to run this case.
DACE stands for design and analysis of computer experiments. Use DACE methods for
deterministic experiments (computer simulations). In computer experiments we are interested in
sampling the parameter space in a representative way with the minimum number of samples.
The output can be used in a sensitivity analysis, uncertainty quantification, or building a surrogate.
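A sketch of the corresponding method block (the sample size and seed are arbitrary illustrative values; the variables, interface, and responses blocks can stay as in the multidimensional study above):

method
  sampling
    sample_type lhs                  # Latin Hypercube Sampling
    samples = 50
    seed = 531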
Working with DAKOTA: Rosenbrock function
DOE: Central Composite Design (CCD)
In the directory $ptodac/model_fuctions/rosenbrock/rosenbrock_doe1 you will find the
input files to run this case.
DOE stands for design of experiments. Use DOE methods for stochastic experiments (physical
experiments).
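A sketch of a central composite design specification through the DDACE package (only the method block changes; the other blocks are unchanged from the previous sketches):

method
  dace central_composite             # CCD over the two design variables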
Working with DAKOTA: Rosenbrock function
DOE/DACE guidelines
Parameter studies, classical design of experiments (DOE), design/analysis of
computer experiments (DACE), and sampling methods share the purpose of
exploring the parameter space.
When a global space-filling set of samples is desired, then the DOE, DACE, and
sampling methods are recommended. These techniques are useful for scatter plot
and variance analysis as well as surrogate model construction.
The distinction between DOE and DACE methods is that the former are intended for
physical experiments containing an element of nonrepeatability (and therefore tend to
place samples at the extreme parameter vertices), whereas the latter are intended for
repeatable computer experiments and are more space-filling in nature.
Refer to the DAKOTA User's Manual, Usage Guidelines.
Working with DAKOTA: Rosenbrock function
Gradient-based optimization: Fletcher-Reeves
In the directory $ptodac/model_fuctions/rosenbrock/rosenbrock_grad1 you will find
the input files to run this case.
Gradient-based optimizers are best suited for efficient navigation to a local minimum in the vicinity of the initial point. They are not intended to find global optima in nonconvex design spaces. There are many gradient-based optimizers implemented in DAKOTA. The Fletcher-Reeves method requires first-derivative information and can only be used in unconstrained problems.
Working with DAKOTA: Rosenbrock function
Gradient-free optimization: pattern search
In the directory $ptodac/model_fuctions/rosenbrock/rosenbrock_pattern_search
you will find the input files to run this case.
Derivative/gradient-free methods can be more robust than gradient-based approaches. They can be applied in situations where gradient calculations are too expensive or unreliable. Pattern search methods can be applied to nonlinear optimization problems. They generally walk through the domain according to a defined stencil of search directions.
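A sketch of a COLINY pattern search specification (step sizes, evaluation budget, and starting point are illustrative values only; the interface block is unchanged):

method
  coliny_pattern_search
    initial_delta = 0.5              # initial stencil step size
    threshold_delta = 1.e-4          # stop when the stencil shrinks below this
    max_function_evaluations = 2000

variables
  continuous_design = 2
    initial_point  -1.2   1.0
    lower_bounds   -2.0  -2.0
    upper_bounds    2.0   2.0
    descriptors    'x1'  'x2'

responses
  objective_functions = 1
  no_gradients                       # derivative-free: only function values are needed
  no_hessians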
Working with DAKOTA: Rosenbrock function
Evolutionary algorithm: COLINY_EA
In the directory $ptodac/model_fuctions/rosenbrock/rosenbrock_ea1 you will find the
input files to run this case.
Evolutionary algorithms are used for global optimization or multi-objective optimization. Evolutionary Algorithms (EA) are based on Darwin's theory of survival of the fittest. The EA simulates the evolutionary process by employing mathematical analogs of processes such as natural selection, breeding, and mutation.
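A sketch of the corresponding method block (population size, evaluation budget, and seed are illustrative; the remaining blocks match the pattern search sketch above):

method
  coliny_ea
    population_size = 50
    max_function_evaluations = 2000
    seed = 11011011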
Working with DAKOTA: Rosenbrock function
Evolutionary algorithm: SOGA
In the directory $ptodac/model_fuctions/rosenbrock/rosenbrock_ea2 you will find the
input files to run this case.
Ultimately, the EA identifies a design point (or a family of design points) that minimizes the objective
function. EA methods are often used when the problem is nonsmooth, multimodal, or poorly
behaved.
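A sketch of the corresponding SOGA (single-objective genetic algorithm, from the JEGA package) method block, again with illustrative values:

method
  soga
    population_size = 50
    max_function_evaluations = 2000
    seed = 1027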
Working with DAKOTA: Rosenbrock function
Optimization guidelines
Gradient-based optimization methods are highly efficient, with the best
convergence rates of all of the optimization methods. Gradient-based
optimization methods are the clear choice when the problem is smooth,
unimodal, and well-behaved.
However, when the problem exhibits nonsmooth, discontinuous, or
multimodal behavior, these methods can also be the least robust since
inaccurate gradients will lead to bad search directions.
Gradient-free optimization methods can be applied in situations where gradient calculations are too expensive or unreliable. In addition, some non-gradient-based methods can be used for global optimization.
Refer to the DAKOTA User's Manual, Usage Guidelines.
Working with DAKOTA: Rosenbrock function
Optimization guidelines
Nongradient-based methods deserve consideration when the problem may
be nonsmooth, multimodal, or poorly behaved.
Nongradient-based methods exhibit much slower convergence rates for
finding an optimum, and as a result, tend to be much more computationally
demanding than gradient-based methods. Clearly, for nongradient
optimization studies, the computational cost of the function evaluation must
be relatively small in order to obtain an optimal solution in a reasonable
amount of time.
Refer to the DAKOTA User's Manual, Usage Guidelines.
Working with DAKOTA: Branin function
The Branin function
$f(x, y) = \left(y - \dfrac{5.1}{4\pi^2}\,x^2 + \dfrac{5}{\pi}\,x - 6\right)^2 + 10\left(1 - \dfrac{1}{8\pi}\right)\cos(x) + 10$
The problem:
We want to optimize the Branin function using surrogate-based optimization (SBO).
Hence, we need to proceed as follows:
Design an experiment.
Build the surrogate.
Do the optimization at the surrogate level.
As we are doing surrogate-based optimization, computing the objective function (on the surrogate) is not expensive; consequently, any optimization method will work. For this tutorial we will use a gradient-based method.
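These three steps can be chained in a single DAKOTA input file by pointing an optimizer at a global surrogate model. The following is only a sketch of that pattern: the 'branin' analysis driver name, sample size, and seeds are assumptions, and the surrogate keywords (gaussian_process surfpack, dace_method_pointer) should be checked against the reference manual of your DAKOTA version.

strategy
  single_method
    method_pointer = 'OPT'

method
  id_method = 'OPT'
  model_pointer = 'SURR'
  conmin_frcg                          # gradient-based optimizer, run on the surrogate
    max_iterations = 50
    convergence_tolerance = 1e-6

model
  id_model = 'SURR'
  surrogate global
    gaussian_process surfpack          # kriging interpolation of the sampled points
    dace_method_pointer = 'DESIGN'

method
  id_method = 'DESIGN'
  model_pointer = 'TRUTH'
  sampling                             # the designed experiment used to build the surrogate
    sample_type lhs
    samples = 40
    seed = 531

model
  id_model = 'TRUTH'
  single

variables
  continuous_design = 2
    lower_bounds   -5.0   0.0
    upper_bounds   10.0  15.0
    descriptors    'x'   'y'

interface
  fork
    analysis_driver = 'branin'         # hypothetical driver returning f(x, y)

responses
  objective_functions = 1
  numerical_gradients                  # cheap, since they are evaluated on the surrogate
  no_hessians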
Working with DAKOTA: Branin function
The Branin function
Global minimum: $f(x, y) = 0.397887$ at $(x, y) = (-\pi, 12.275),\ (\pi, 2.275),\ (9.42478, 2.475)$
Search domain (subject to): $-5 \le x \le 10$, $0 \le y \le 15$
In the directory $ptodac/model_fuctions/branin/ you will find the input files to run these cases.
DACE
In the directory $ptodac/model_fuctions/branin/dace you will find the input files to run this
case.
Working with DAKOTA: Branin function
Surrogate: Kriging interpolation
In the directory $ptodac/model_fuctions/branin/surfpack you will find the input files to
run this case.
Working with DAKOTA: Branin function
Surrogate-based optimization
In the directory $ptodac/model_fuctions/branin/dakota you will find the input files to run
this case.
We conduct constrained gradient-based optimization on the surrogate. We use the CONMIN method of feasible directions (MFD), with multi-start. We choose different initial points because we want to increase the chances of finding all the minima. The red points are the global minima of the analytical Branin function, and the yellow points are the global minima found on the surrogate.
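One way to express this multi-start search in a DAKOTA input file is the multi_start strategy, sketched below (the number of starts, seed, and method settings are illustrative, and the surrogate model definition from the previous sketch is assumed; the exact multi_start keywords should be checked against your version's reference manual):

strategy
  multi_start
    method_pointer = 'NLP'
    random_starts = 5                  # several random initial points
    seed = 123

method
  id_method = 'NLP'
  model_pointer = 'SURR'               # optimize on the kriging surrogate
  conmin_mfd                           # CONMIN method of feasible directions
    max_iterations = 50
    convergence_tolerance = 1e-6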
Coupling DAKOTA and OpenFOAM: driven cavity case
Driven cavity optimization
In this tutorial we conduct a parametric study and a bounded (otherwise unconstrained) gradient-based optimization. We aim at finding the optimal velocity (design variable) that gives the maximum pressure (objective function or quantity of interest) in the middle of the cavity.
In the directory $ptodac/dakota_openfoam_cases/cavity1/ you will find the input files to
run this case.
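The coupling itself is handled by the interface block and a user-supplied analysis driver. A sketch of such a block follows (the script name 'run_cavity.sh' and the file names are assumptions; the driver is expected to substitute the lid velocity into the OpenFOAM case, for instance with dprepro, run the solver, extract the mid-cavity pressure, and write it, with the sign reversed so that minimization maximizes the pressure, into the results file):

interface
  fork
    asynchronous evaluation_concurrency = 2   # optional: run several cases at once
    analysis_driver = './run_cavity.sh'       # hypothetical user-supplied driver script
    parameters_file = 'params.in'
    results_file    = 'results.out'
    file_tag file_save                        # keep per-evaluation copies of the files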
Parametric study and high-fidelity gradient-based optimization (CONMIN FRCG)
Coupling DAKOTA and OpenFOAM: Ahmed body case
Ahmed body
In this tutorial we aim at optimizing the Ahmed body. The design variable is the slant angle and the objective function is the drag. In this tutorial we conduct a parametric study, a constrained gradient-based optimization, and surrogate-based optimization (SBO).
In the directory $ptodac/dakota_openfoam_cases/ahmed/ you will find the input files to run
this case.
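A sketch of how the gradient-based stage of this problem can be defined (bounds, starting point, and the driver name 'run_ahmed.sh' are assumptions; the driver would mesh the geometry for the requested slant angle, run the flow solver, and return the drag coefficient):

method
  conmin_mfd
    max_iterations = 50
    convergence_tolerance = 1e-4

variables
  continuous_design = 1
    initial_point   25.0
    lower_bounds    10.0
    upper_bounds    40.0
    descriptors     'slantAngle'

interface
  fork
    analysis_driver = './run_ahmed.sh'   # hypothetical driver: mesh, solve, extract drag

responses
  objective_functions = 1                # drag, to be minimized
  numerical_gradients                    # finite-difference gradients of the CFD response
  no_hessians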

By the way, this case is computationally expensive, so you had better run it in parallel.
Parametric study
Parametric study and high-fidelity gradient-based optimization (CONMIN MFD)
Surrogate-based optimization: Kriging interpolation with the CONMIN MFD optimization method.
Coupling DAKOTA and OpenFOAM: NACA airfoil case
NACA airfoil shape optimization
In this tutorial we aim at optimizing the shape of a NACA 4-series airfoil. The design variables are the maximum camber and the position of the maximum camber. The objective functions are the drag, lift, and moment coefficients. In this tutorial we also show how to set up a multi-objective optimization case using evolutionary algorithms (EA).
In the directory $ptodac/dakota_openfoam_cases/naca_shape/ you will find the input files
to run this case.
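A sketch of how such a multi-objective problem can be declared (descriptors, bounds, population size, and the driver name 'run_naca.sh' are assumptions; MOGA, used later in this tutorial, searches for the Pareto frontier directly from function values):

method
  moga
    population_size = 32
    max_function_evaluations = 2000
    seed = 1027

variables
  continuous_design = 2
    lower_bounds    0.0    0.2
    upper_bounds    0.09   0.7
    descriptors    'maxCamber' 'camberPos'

interface
  fork
    analysis_driver = './run_naca.sh'    # hypothetical driver: build the airfoil, run the solver, extract Cd, Cl, Cm

responses
  objective_functions = 3                # e.g. Cd, -Cl (maximize lift by minimizing -Cl), Cm
  no_gradients
  no_hessians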

By the way, this case is computationally expensive, so you had better run it in parallel.
NACA airfoil shape optimization
After building the surrogates, we can optimize the airfoil shape.
The goals are to maximize the lift coefficient and minimize the drag coefficient.
For multi-objective optimization we use the gradient-free MOGA method.
By the way, I am using the surrogates built from the multidimensional experiment.
At this point, let us do unconstrained surrogate-based optimization.
In the directory $ptodac/dakota_openfoam_cases/naca_shape/ you will find the input files
to run this case.
Thank you for your attention
Hands-on session
In the course's directory ($ptodac) you will find many tutorials; let us try to go through each one to understand DAKOTA and become functional with it.
If you have a case of your own, let me know and I will do my best to help you set up your case. But remember, the physics is yours.
