
Solving Nonlinear Equations (Root finding)

• Equation solving is one of the most basic problems in scientific computing.

• Roots of equations frequently occur in engineering design.

• For example, medical studies have established that a free-falling jumper's
chance of sustaining an injury increases significantly if the free-fall
velocity exceeds 36 m/s after 4 s of free fall. We want to determine the
mass at which this criterion is exceeded for a given drag coefficient of
0.25 kg/m.

We know the analytical solution for the free-fall velocity as a function of
time is given by

    v(t) = \sqrt{\frac{gm}{c_d}} \tanh\left( \sqrt{\frac{g c_d}{m}}\, t \right)

Rearranging,

    f(m) = \sqrt{\frac{gm}{c_d}} \tanh\left( \sqrt{\frac{g c_d}{m}}\, t \right) - v(t) = 0

Now we see that the answer to the problem is the value of m that makes
f(m) = 0, i.e. the root of the equation.

Graphical method

• A simple method for obtaining an estimate of the root of the equation
f(x) = 0 is to

• plot the function, and

• observe where it crosses the x-axis.

Graphical method

Example:
Use the graphical approach to determine the mass of a jumper with a drag
coefficient of 0.25 kg/m such that the velocity is 36 m/s after 4 s of free
fall. Take g = 9.81 m/s².

MATLAB code:
format long e; clear all; close all;
cd = 0.25;              % drag coefficient (kg/m)
g  = 9.81;              % gravitational acceleration (m/s^2)
v  = 36;                % velocity (m/s)
t  = 4;                 % time (s)
mp = linspace(50,200);  % trial masses (kg)
fp = sqrt(g*mp/cd).*tanh(sqrt(g*cd./mp)*t) - v;  % f(m) evaluated at each mass
plot(mp,fp,'LineWidth',2);
xlabel('m'); ylabel('f(m)');
grid;

Graphical method
[Plot of f(m) versus m for m = 50 to 200 kg: the curve crosses the x-axis at
approximately m ≈ 145 kg.]

Graphical method provides a rough approximation of the root.
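
As a quick check (not part of the original slides; the value m ≈ 145 kg is
simply read off the plot), one can evaluate f at the graphical estimate and
confirm that the residual is close to zero:

cd = 0.25; g = 9.81; v = 36; t = 4;
f = @(m) sqrt(g*m/cd).*tanh(sqrt(g*cd./m)*t) - v;
f(145)    % small residual, so m = 145 kg is a reasonable first estimate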



Solving Nonlinear Equations (Root finding)

The two major classes of methods are distinguished by the type of initial
guess they require:

Bracketing methods:
As the name implies, these are based on two initial guesses that "bracket" the
root, that is, guesses that lie on either side of the root.

Open methods:
These methods can involve one or more initial guesses, but there is no need
for them to bracket the root.

For well-posed problems, the bracketing methods always work but converge
slowly (i.e. they typically take a larger number of iterations).

In contrast, the open methods do not always work (i.e. they can diverge),
but when they do work they usually converge much more quickly.

Bisection method

Theorem

Let f be a continuous function on [a, b], satisfying f(a)*f(b)<0.

Then f has a root between a and b, that is, there exists a number r
satisfying a<r<b and f(r)=0.
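
For example, f(x) = x^3 + x − 1 (used in the bisection run below) is
continuous with f(0) = −1 < 0 and f(1) = 1 > 0, so f(0)·f(1) < 0 and the
theorem guarantees a root r with 0 < r < 1.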

Bisection method

Definition

A solution is correct to within p decimal places (significant figures) if

the absolute error is less than 0.5 × 10^(−p),

or the relative error is less than 0.5 × 10^(2−p) %.
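
A standard consequence (not stated explicitly on the slides): bisection halves
the bracketing interval at every step, so the number of iterations n needed to
bring the error of the final midpoint below a tolerance TOL on an initial
interval [a, b] satisfies

    \frac{b-a}{2^{\,n+1}} \le \mathrm{TOL}
    \quad\Longrightarrow\quad
    n \ge \log_2\!\left(\frac{b-a}{2\,\mathrm{TOL}}\right)

For the run below, [a, b] = [0, 1] and TOL = 0.5 × 10^(−3), so
n ≥ log2(1000) ≈ 9.97, i.e. 10 iterations (i = 0, …, 9), after which the
midpoint is correct to about 3 decimal places.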

Bisection method
a = 0; b = 1;            % initial bracketing interval
TOL = 0.5e-3;            % tolerance (about 3 correct decimal places)
f = inline('x^3+x-1');   % f(x) = x^3 + x - 1
[iter_number, xc] = Bisect(f,a,b,TOL);

 i    a_i          f(a_i)       c_i          f(c_i)       b_i          f(b_i)

 0 0.00000000 -1.00000000 0.50000000 -0.37500000 1.00000000 1.00000000
 1 0.50000000 -0.37500000 0.75000000 0.17187500 1.00000000 1.00000000
 2 0.50000000 -0.37500000 0.62500000 -0.13085938 0.75000000 0.17187500
 3 0.62500000 -0.13085938 0.68750000 0.01245117 0.75000000 0.17187500
 4 0.62500000 -0.13085938 0.65625000 -0.06112671 0.68750000 0.01245117
 5 0.65625000 -0.06112671 0.67187500 -0.02482986 0.68750000 0.01245117
 6 0.67187500 -0.02482986 0.67968750 -0.00631380 0.68750000 0.01245117
 7 0.67968750 -0.00631380 0.68359375 0.00303739 0.68750000 0.01245117
 8 0.67968750 -0.00631380 0.68164063 -0.00164600 0.68359375 0.00303739
 9 0.68164063 -0.00164600 0.68261719 0.00069374 0.68359375 0.00303739

Final root estimate: xc = 0.68212891

Bisection method
function [iter_number, xc] = Bisect(f,a,b,TOL)
fa = f(a); fb = f(b);
if sign(fa)*sign(fb) >= 0
    error('Root is NOT bracketed'); % program stops here
end

iter_number = 0;
while (b-a)/2 > TOL
    c = (a+b)/2;
    fc = f(c);
    if fc == 0
        break           % c is an exact solution, done
    end
    if sign(fc)*sign(fa) < 0
        b = c;          % a and c bracket the root: new interval is [a, c]
        fb = fc;
    else
        a = c;          % c and b bracket the root: new interval is [c, b]
        fa = fc;
    end
    iter_number = iter_number + 1;
end
xc = (a+b)/2;           % final root estimate (after coming out of the while loop)
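
The same function can be applied to the jumper problem from the
graphical-method example. A minimal sketch (not in the original slides; the
bracket [140, 160] kg is an assumption based on the graphical estimate of
roughly 145 kg):

% Hypothetical driver: find the jumper mass using Bisect
cd = 0.25; g = 9.81; v = 36; t = 4;           % parameters from the example
fm = @(m) sqrt(g*m/cd).*tanh(sqrt(g*cd./m)*t) - v;
[n, m_root] = Bisect(fm, 140, 160, 0.5e-3);   % f(140) < 0 and f(160) > 0
fprintf('mass = %.4f kg after %d iterations\n', m_root, n);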

Newton’s method
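For reference (the slides give only the code), the update computed inside
newton below is the standard Newton iteration, obtained by following the
tangent line at the current iterate down to the x-axis:

    x_{i+1} = x_i - \frac{f(x_i)}{f'(x_i)}, \qquad i = 0, 1, 2, \ldots
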
clc; clear all; close all; format long e
f = inline('x^3+x-1');        % f(x)
fprime = inline('3*x*x+1');   % derivative f'(x)
tol = 0.5e-3; x0 = -0.7;      % tolerance and initial guess
max_iteration = 100;
[xout,iter_number] = newton(x0,f,fprime,tol,max_iteration);

 i    x_i                e_i = |x_i - r|
 0   -0.70000000         1.38232780
 1    0.12712551         0.55520230
 2    0.95767812         0.27535032
 3    0.73482779         0.05249999
 4    0.68459177         0.00226397
 5    0.68233217         0.00000437
 6    0.68232780         0.00000000

After only six steps, the root is correct to eight digits.
Note: once convergence starts to take hold, the number of correct digits in
x_i approximately doubles on each iteration.
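
This doubling reflects the quadratic convergence of Newton's method at a
simple root, a standard result not derived on the slides: once close to the
root, the errors satisfy approximately

    e_{i+1} \approx M\, e_i^2, \qquad M = \left| \frac{f''(r)}{2 f'(r)} \right|

For f(x) = x^3 + x − 1 with r ≈ 0.68232780, M = |6r| / (2(3r^2 + 1)) ≈ 0.85,
and indeed e_5 / e_4^2 = 0.00000437 / 0.00226397^2 ≈ 0.85 in the table above.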

Newton’s method
function [xout,k] = newton(x,f,fprime,tol,max_iteration)
k = 0;
xnew = x;
xold = x + 5*tol;   % dummy value, used only to enter the while loop
fprintf(1,'%d %15.10f \n',k,xnew);
while abs(xnew-xold) > tol
    xold = xnew;
    xnew = xold - f(xold)/fprime(xold);   % Newton update
    k = k + 1;
    fprintf(1,'%d %15.10f \n',k,xnew);
    if k >= max_iteration
        break;
    end
end
xout = xnew;


Backward and forward error

Definition

Assume that f is a function and that r is a root, meaning that it satisfies
f(r) = 0. Assume that xc is an approximate solution of f(x) = 0.

For the root-finding problem,

• the backward error of the approximation xc is |f(xc)|,

• the forward error is |r − xc|.
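
For instance, using the results computed earlier for f(x) = x^3 + x − 1
(taking r ≈ 0.68232780 from Newton's method and the bisection estimate
xc = 0.68212891), the forward error is |r − xc| ≈ 2.0 × 10^(−4) while the
backward error is |f(xc)| ≈ 4.8 × 10^(−4); both are small here, but in general
the two errors need not be of comparable size.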


Newton’s method: multiplicity of roots

Definition

Assume that r is a root of a differentiable function f, i.e. assume that f(r) = 0.

Then if f(r) = f'(r) = f''(r) = … = f^(m−1)(r) = 0, but f^(m)(r) ≠ 0,

we say that f has a root of multiplicity m at r.

• f has a multiple root at r if the multiplicity is greater than one.

• The root is called simple if the multiplicity is one.
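
At a multiple root, Newton's method still converges but only linearly. A
standard remedy (used, but not written out, on the next slide) is the modified
Newton iteration, which restores quadratic convergence when the multiplicity m
is known:

    x_{i+1} = x_i - m\,\frac{f(x_i)}{f'(x_i)}

With m = 3 and x_0 = 1 this reproduces the Modified Newton's method column in
the comparison that follows.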


Newton’s method: multiplicity of roots

f(x) = sin(x) + x*x*cos(x) - x*x - x   % has a root of multiplicity 3 at r = 0

   Newton's method                 Modified Newton's method

    i   x_i                         i   x_i
    0   1.000000000000000           0   1.000000000000000
    1   0.721590239860747           1   0.164770719582242
    2   0.521370951820402           2   0.016207337711438
    3   0.375308308590761           3   0.000246541437739
    4   0.268363490527132           4   0.000000060720923
    5   0.190261613699237           5  -0.000000002389877
    6   0.133612505326191
    7   0.092925286725174           Only 5 iterations !!!
    8   0.064039266777341
    9   0.043778062160090
   10   0.029728055524228

In contrast, Newton's method will need 36 iterations for tol = 0.5e-6.
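
A minimal sketch of this modified iteration (not the slides' original code;
the loop structure, stopping test, and variable names are assumptions, while
the function, its derivative, m = 3, x0 = 1, and tol = 0.5e-6 come from the
slides):

% Modified Newton's method for f(x) = sin(x) + x^2*cos(x) - x^2 - x,
% which has a root of multiplicity m = 3 at r = 0.
f      = @(x) sin(x) + x.^2.*cos(x) - x.^2 - x;
fprime = @(x) cos(x) + 2*x.*cos(x) - x.^2.*sin(x) - 2*x - 1;  % f'(x)
m = 3; x = 1; tol = 0.5e-6;
for k = 1:50
    xold = x;
    x = x - m*f(x)/fprime(x);            % modified Newton update
    fprintf('%2d %20.15f\n', k, x);
    if abs(x - xold) <= tol, break; end   % same kind of stopping test as newton
end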
