
Optimization Techniques

Dragos Mocrii, FAF-101


May 16, 2012

Report
Laboratory Work Nr.1
Due May 16.

Chisinau 2012

Problem 1
Implement the following line search algorithms for the functions below. Initial guess: (0, 1).
The algorithms:
Newton's Method
Simplified Newton's Method
Steepest Descent
The functions:
Rosenbrock - f(x, y) = (1 - x)^2 + 100(y - x^2)^2
Himmelblau - f(x, y) = (x^2 + y - 11)^2 + (x + y^2 - 7)^2
Solving:
Steepest Descent Method
The source code of the program in MATLAB:
function [result] = steepestDescent(x_k, a, eps, b, t)
x_0 = [0 0];
flag = true;
iter = 0;
iter2 = 0;
while norm(x_k - x_0) > eps && flag
    iter = iter + 1;
    x_0 = x_k;
    p_k = -dfRosenbrock(x_k);
    [a iter1] = armijoBacktracking(a, x_k, p_k, b, t);
    iter2 = iter2 + iter1;
    x_k = x_0 + a * p_k;
    % stop after at most 50000 iterations
    if iter <= 50000
        flag = true;
    else
        flag = false;
    end
end
total = iter + iter2;
result = [x_k total];
===========================================================
function [a i] = armijoBacktracking(a, x_k, p_k, b, t)
i = 0;
% shrink the step a by factor t until the Armijo sufficient-decrease condition holds
while fRosenbrock(x_k + a * p_k) > fRosenbrock(x_k) + a * b * p_k * dfRosenbrock(x_k)'
    a = a * t;
    i = i + 1;
    fprintf('x = %f, y = %f\n', x_k(1), x_k(2))
end
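As a cross-check of the listing above, here is a self-contained Python sketch of the same steepest descent with Armijo backtracking on the Rosenbrock function. The function names and the norm-based stopping test are my own; the parameters a = 0.1, b = 0.2, t = 1/2, eps = 10^-5 and the 50000-iteration cap follow the report.

```python
import numpy as np

def f_rosenbrock(v):
    x, y = v
    return 100.0 * (y - x**2)**2 + (1.0 - x)**2

def df_rosenbrock(v):
    x, y = v
    return np.array([2.0*x - 2.0 - 400.0*x*(y - x**2),
                     200.0*(y - x**2)])

def armijo_backtracking(a, x_k, p_k, b, t):
    # shrink a until the Armijo sufficient-decrease condition holds
    i = 0
    while f_rosenbrock(x_k + a*p_k) > f_rosenbrock(x_k) + a*b*p_k.dot(df_rosenbrock(x_k)):
        a *= t
        i += 1
    return a, i

def steepest_descent(x_k, a=0.1, eps=1e-5, b=0.2, t=0.5, max_iter=50000):
    x_prev = x_k + 1.0   # dummy previous point so the loop is entered
    it = 0
    while np.linalg.norm(x_k - x_prev) > eps and it < max_iter:
        it += 1
        x_prev = x_k
        p_k = -df_rosenbrock(x_k)   # steepest descent direction
        a, _ = armijo_backtracking(a, x_prev, p_k, b, t)
        x_k = x_prev + a * p_k
    return x_k, it
```

As in the MATLAB version, the step size is carried over between iterations and only ever shrinks, which is enough here because the curvature near the minimum dictates roughly the same step everywhere along the valley.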

Results:
For the Rosenbrock function
I have implemented the following functions in MATLAB:
function [result] = fRosenbrock(x_k)
result = 100*(x_k(2)-(x_k(1))^2)^2+(1-x_k(1))^2;
===========================================================
function [result] = dfRosenbrock(xk)
x=2*xk(1)-400*xk(1)*(xk(2) - xk(1)^2) - 2;
y=200*xk(2) - 200*xk(1)^2;
result = [x y];
We gave the following values to these variables:
x_k - the initial guess, [0, 1]
a - the step size; initially 0.1, afterwards it is computed by the armijoBacktracking function
eps - the precision, 10^-5
b - 0.2
t - 1/2
We have computed the following solution: x = 1.000000, y = 1.000000
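The reported solution can be checked analytically: the Rosenbrock gradient above vanishes at (1, 1), which is the global minimum with f(1, 1) = 0. A quick Python check (function names are mine):

```python
import numpy as np

def f_rosenbrock(v):
    x, y = v
    return 100.0 * (y - x**2)**2 + (1.0 - x)**2

def df_rosenbrock(v):
    x, y = v
    return np.array([2.0*x - 2.0 - 400.0*x*(y - x**2),
                     200.0*(y - x**2)])

v_star = np.array([1.0, 1.0])
# both the function value and the gradient are exactly zero at the minimiser
```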
For the Himmelblau function
I have implemented the following functions in MATLAB:
function [result] = fHimmelblau(x_k)
result = (x_k(1)^2 + x_k(2) - 11)^2 + (x_k(1) + x_k(2)^2 - 7)^2;
===========================================================
function [result] = dfHimmelblau(xk)
x=4*xk(1)^3+xk(1)*(4*xk(2)-42)+2*xk(2)^2 -14;
y=2*xk(1)^2+4*xk(1)*xk(2)+4*xk(2)^3-26*xk(2)-22;
result = [x y];
We gave the following values to these variables:
x_k - the initial guess, [1, 1]
a - the step size; initially 0.1, afterwards it is computed by the armijoBacktracking function
eps - the precision, 10^-5
b - 0.2
t - 1/2
I have computed the following solution: x = 3.000000, y = 2.000000
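Here too the result can be verified directly: (3, 2) is one of Himmelblau's four minima, with function value and gradient both zero. A short Python check (function names are mine, the gradient formula matches dfHimmelblau above):

```python
def f_himmelblau(x, y):
    return (x**2 + y - 11)**2 + (x + y**2 - 7)**2

def df_himmelblau(x, y):
    # analytic gradient, same expressions as dfHimmelblau
    return (4*x**3 + x*(4*y - 42) + 2*y**2 - 14,
            2*x**2 + 4*x*y + 4*y**3 - 26*y - 22)

# f(3, 2) = 0 and the gradient vanishes there
```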

Newton's Method
The source code of the program in MATLAB:

function [r] = newton(a, x, y)
v = [x, y];
l = 1;
tau = 0.6;
beta = 0.8;
alfa(1) = a;
% Newton direction: solve H(v) * p = -grad f(v)
p = -(H(v) \ ff(v(1), v(2))');
% Armijo backtracking along the Newton direction
while f(v + alfa(l) * p') > f(v) + alfa(l) * beta * (ff(v(1), v(2)) * p)
    alfa(l+1) = tau * alfa(l);
    disp(alfa(l));
    l = l + 1;
    if (alfa(l) == alfa(l-1))
        break
    end
end
disp(alfa(l))
r = alfa;
end
===========================================================
function [M] = H(x)
% Hessian of the Rosenbrock function:
% M(1,1) = 1200*x(1)^2 - 400*x(2) + 2;
% M(1,2) = -400*x(1);
% M(2,1) = -400*x(1);
% M(2,2) = 200;
% Hessian of the Himmelblau function:
M(1,1) = 12*x(1)^2 + 4*x(2) - 42;
M(1,2) = 4*x(1) + 4*x(2);
M(2,1) = 4*x(1) + 4*x(2);
M(2,2) = 4*x(1) + 12*x(2)^2 - 26;
end
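For comparison, a minimal Python sketch of the pure Newton iteration on the Himmelblau function, using the gradient and Hessian derived above. The starting point (3.5, 1.5) and the step-norm stopping rule are my own choices; the Hessian is positive definite there, so the iteration converges quadratically to the nearby minimum (3, 2).

```python
import numpy as np

def grad_h(v):
    x, y = v
    return np.array([4*x**3 + x*(4*y - 42) + 2*y**2 - 14,
                     2*x**2 + 4*x*y + 4*y**3 - 26*y - 22])

def hess_h(v):
    x, y = v
    return np.array([[12*x**2 + 4*y - 42, 4*x + 4*y],
                     [4*x + 4*y,          4*x + 12*y**2 - 26]])

def newton(v, eps=1e-10, max_iter=50):
    for _ in range(max_iter):
        p = np.linalg.solve(hess_h(v), -grad_h(v))  # solve H p = -grad f
        v = v + p
        if np.linalg.norm(p) < eps:   # stop when the Newton step is tiny
            break
    return v
```

Unlike steepest descent, no backtracking is needed this close to the minimum: the full Newton step already satisfies sufficient decrease, which is why a handful of iterations reach machine precision.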
===========================================================

function [r] = f(x)
% Rosenbrock: r = (1-x(1))^2 + 100*(x(2) - x(1)^2)^2;
% Himmelblau, expanded:
r = x(1)^4 + 2*x(1)^2*x(2) - 21*x(1)^2 + 2*x(1)*x(2)^2 - 14*x(1) + x(2)^4 - 13*x(2)^2 - 22*x(2) + 170;
end
===========================================================
function [r] = ff(a, b)
% Rosenbrock gradient:
% r1 = 2*a - 2 + 400*a^3 - 400*a*b;
% r2 = 200*b - 200*a^2;
% Himmelblau gradient:
r1 = 4*a^3 + 4*a*b - 42*a + 2*b^2 - 14;
r2 = 2*a^2 + 4*a*b + 4*b^3 - 26*b - 22;
r = [r1, r2];
end
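A quick way to catch coefficient slips in a hand-derived gradient (such as 3*a^3 where 4*a^3 is needed) is to compare it against central finite differences at an arbitrary test point. A Python sketch; the helper names and the test point are mine:

```python
def f_h(x, y):
    return (x**2 + y - 11)**2 + (x + y**2 - 7)**2

def grad_h(x, y):
    # analytic Himmelblau gradient, matching ff above
    return (4*x**3 + x*(4*y - 42) + 2*y**2 - 14,
            2*x**2 + 4*x*y + 4*y**3 - 26*y - 22)

h = 1e-6
x0, y0 = 1.3, -0.7
# central differences approximate each partial derivative to O(h^2)
fd = ((f_h(x0 + h, y0) - f_h(x0 - h, y0)) / (2*h),
      (f_h(x0, y0 + h) - f_h(x0, y0 - h)) / (2*h))
```

If the analytic and finite-difference values disagree beyond roughly 1e-4, one of the partial derivatives is wrong.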
Results:
