
Numerical Analysis (9th Edition)

R L Burden & J D Faires

Beamer Presentation Slides

prepared by

John Carroll

Dublin City University

Outline

Relaxation Techniques for Solving Linear Systems (Numerical Analysis, Chapter 7):

- Residual Vectors
- The SOR Method
- Optimal ω
- The SOR Algorithm

Residual Vectors

Motivation

We have seen that the rate of convergence of an iterative technique depends on the spectral radius of the matrix associated with the method.

One way to select a procedure to accelerate convergence is to choose a method whose associated matrix has minimal spectral radius.

We start by introducing a new means of measuring the amount by which an approximation to the solution of a linear system differs from the true solution to the system. The method makes use of the vector described in the following definition.

Definition

Suppose x̃ ∈ ℝⁿ is an approximation to the solution of the linear system defined by

    Ax = b

The residual vector for x̃ with respect to this system is

    r = b - Ax̃

Comments

A residual vector is associated with each calculation of an approximate component to the solution vector.

The true objective is to generate a sequence of approximations that will cause the residual vectors to converge rapidly to zero.
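As a small worked illustration of the definition (the 2×2 system below is invented for this example, not taken from the slides), the residual of an approximation is computed directly from r = b - Ax̃:

```python
import numpy as np

# A small invented system; its exact solution is x = (1, 2).
A = np.array([[4.0, 1.0],
              [1.0, 3.0]])
b = np.array([6.0, 7.0])

x_tilde = np.array([1.1, 1.9])   # an approximation to the solution

# Residual vector from the definition: r = b - A x_tilde.
r = b - A @ x_tilde

# The residual vanishes exactly when x_tilde solves the system.
exact = np.array([1.0, 2.0])
assert np.allclose(b - A @ exact, 0.0)
```

A small residual norm suggests (though does not by itself guarantee) a good approximation.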

Looking at the Gauss-Seidel Method

Suppose we let

    r_i^{(k)} = (r_{1i}^{(k)}, r_{2i}^{(k)}, ..., r_{ni}^{(k)})^t

denote the residual vector for the Gauss-Seidel method corresponding to the approximate solution vector x_i^{(k)} defined by

    x_i^{(k)} = (x_1^{(k)}, x_2^{(k)}, ..., x_{i-1}^{(k)}, x_i^{(k-1)}, ..., x_n^{(k-1)})^t

The mth component of r_i^{(k)} is

    r_{mi}^{(k)} = b_m - Σ_{j=1}^{i-1} a_{mj} x_j^{(k)} - Σ_{j=i}^{n} a_{mj} x_j^{(k-1)}

Separating the j = i term from the second sum, this is

    r_{mi}^{(k)} = b_m - Σ_{j=1}^{i-1} a_{mj} x_j^{(k)} - Σ_{j=i+1}^{n} a_{mj} x_j^{(k-1)} - a_{mi} x_i^{(k-1)}

for each m = 1, 2, ..., n.

In particular, the ith component of r_i^{(k)} is

    r_{ii}^{(k)} = b_i - Σ_{j=1}^{i-1} a_{ij} x_j^{(k)} - Σ_{j=i+1}^{n} a_{ij} x_j^{(k-1)} - a_{ii} x_i^{(k-1)}

so

    a_{ii} x_i^{(k-1)} + r_{ii}^{(k)} = b_i - Σ_{j=1}^{i-1} a_{ij} x_j^{(k)} - Σ_{j=i+1}^{n} a_{ij} x_j^{(k-1)}     (E)

Recall, however, that in the Gauss-Seidel method, x_i^{(k)} is chosen to be

    x_i^{(k)} = (1/a_{ii}) [ b_i - Σ_{j=1}^{i-1} a_{ij} x_j^{(k)} - Σ_{j=i+1}^{n} a_{ij} x_j^{(k-1)} ]

so (E) can be rewritten as

    a_{ii} x_i^{(k-1)} + r_{ii}^{(k)} = a_{ii} x_i^{(k)}

Consequently, the Gauss-Seidel method can be characterized as choosing x_i^{(k)} to satisfy

    x_i^{(k)} = x_i^{(k-1)} + r_{ii}^{(k)} / a_{ii}
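This characterization is easy to verify numerically. The sketch below sweeps once through the components of the 3×3 system used in the example later in these slides and checks that the ordinary Gauss-Seidel update and the residual-correction form x_i^{(k-1)} + r_{ii}^{(k)}/a_{ii} agree:

```python
import numpy as np

A = np.array([[4.0, 3.0, 0.0],
              [3.0, 4.0, -1.0],
              [0.0, -1.0, 4.0]])
b = np.array([24.0, 30.0, -24.0])

x = np.array([1.0, 1.0, 1.0])    # x^(k-1)

for i in range(len(b)):
    # Ordinary Gauss-Seidel update for component i
    # (entries j < i already hold new values, j >= i hold old ones).
    gs = (b[i] - A[i, :i] @ x[:i] - A[i, i+1:] @ x[i+1:]) / A[i, i]

    # Residual-correction form: x_i^(k) = x_i^(k-1) + r_ii^(k) / a_ii,
    # where r_ii^(k) is the ith residual component at the current sweep state.
    r_ii = b[i] - A[i] @ x
    corrected = x[i] + r_ii / A[i, i]

    assert abs(gs - corrected) < 1e-12
    x[i] = gs
```

After this single sweep, x holds the first Gauss-Seidel iterate of the example system.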

We can derive another connection between the residual vectors and the Gauss-Seidel technique. Consider the residual vector r_{i+1}^{(k)} associated with the vector

    x_{i+1}^{(k)} = (x_1^{(k)}, ..., x_i^{(k)}, x_{i+1}^{(k-1)}, ..., x_n^{(k-1)})^t

Recall that the mth component of r_i^{(k)} is

    r_{mi}^{(k)} = b_m - Σ_{j=1}^{i-1} a_{mj} x_j^{(k)} - Σ_{j=i}^{n} a_{mj} x_j^{(k-1)}

so the ith component of r_{i+1}^{(k)} is

    r_{i,i+1}^{(k)} = b_i - Σ_{j=1}^{i} a_{ij} x_j^{(k)} - Σ_{j=i+1}^{n} a_{ij} x_j^{(k-1)}

                    = b_i - Σ_{j=1}^{i-1} a_{ij} x_j^{(k)} - Σ_{j=i+1}^{n} a_{ij} x_j^{(k-1)} - a_{ii} x_i^{(k)}

By the way x_i^{(k)} is chosen in the Gauss-Seidel method,

    x_i^{(k)} = (1/a_{ii}) [ b_i - Σ_{j=1}^{i-1} a_{ij} x_j^{(k)} - Σ_{j=i+1}^{n} a_{ij} x_j^{(k-1)} ]

this ith component is zero: r_{i,i+1}^{(k)} = 0. In a sense, then, the Gauss-Seidel technique is characterized by choosing each x_{i+1}^{(k)} in such a way that the ith component of r_{i+1}^{(k)} is zero.

The SOR Method

Reducing the Norm of the Residual Vector

Choosing x_{i+1}^{(k)} so that one coordinate of the residual vector is zero, however, is not necessarily the most efficient way to reduce the norm of the vector r_{i+1}^{(k)}.

If we modify the Gauss-Seidel procedure, as given by

    x_i^{(k)} = x_i^{(k-1)} + r_{ii}^{(k)} / a_{ii}

to

    x_i^{(k)} = x_i^{(k-1)} + ω r_{ii}^{(k)} / a_{ii}

then for certain choices of positive ω we can reduce the norm of the residual vector and obtain significantly faster convergence.

Introducing the SOR Method

Methods involving

    x_i^{(k)} = x_i^{(k-1)} + ω r_{ii}^{(k)} / a_{ii}

are called relaxation methods. For choices of ω with 0 < ω < 1, the procedures are called under-relaxation methods.

We will be interested in choices of ω with 1 < ω, and these are called over-relaxation methods.

They are used to accelerate the convergence for systems that are convergent by the Gauss-Seidel technique.

The methods are abbreviated SOR, for Successive Over-Relaxation, and are particularly useful for solving the linear systems that occur in the numerical solution of certain partial-differential equations.

A More Computationally-Efficient Formulation

Writing

    r_{ii}^{(k)} = b_i - Σ_{j=1}^{i-1} a_{ij} x_j^{(k)} - Σ_{j=i+1}^{n} a_{ij} x_j^{(k-1)} - a_{ii} x_i^{(k-1)}

we can reformulate

    x_i^{(k)} = x_i^{(k-1)} + ω r_{ii}^{(k)} / a_{ii}

in the form

    x_i^{(k)} = (1 - ω) x_i^{(k-1)} + (ω/a_{ii}) [ b_i - Σ_{j=1}^{i-1} a_{ij} x_j^{(k)} - Σ_{j=i+1}^{n} a_{ij} x_j^{(k-1)} ]

To determine the matrix form of the SOR method, we rewrite

    x_i^{(k)} = (1 - ω) x_i^{(k-1)} + (ω/a_{ii}) [ b_i - Σ_{j=1}^{i-1} a_{ij} x_j^{(k)} - Σ_{j=i+1}^{n} a_{ij} x_j^{(k-1)} ]

as

    a_{ii} x_i^{(k)} + ω Σ_{j=1}^{i-1} a_{ij} x_j^{(k)} = (1 - ω) a_{ii} x_i^{(k-1)} - ω Σ_{j=i+1}^{n} a_{ij} x_j^{(k-1)} + ω b_i

In vector form, we therefore have

    (D - ωL) x^{(k)} = [(1 - ω)D + ωU] x^{(k-1)} + ωb

from which we obtain:

    x^{(k)} = (D - ωL)^{-1} [(1 - ω)D + ωU] x^{(k-1)} + ω (D - ωL)^{-1} b

Letting

    T_ω = (D - ωL)^{-1} [(1 - ω)D + ωU]

and

    c_ω = ω (D - ωL)^{-1} b

the SOR technique has the fixed-point form

    x^{(k)} = T_ω x^{(k-1)} + c_ω
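The fixed-point form can be checked numerically. A sketch with NumPy, using the example system from these slides and the splitting A = D - L - U (D diagonal, -L the strictly lower part of A, -U the strictly upper part):

```python
import numpy as np

A = np.array([[4.0, 3.0, 0.0],
              [3.0, 4.0, -1.0],
              [0.0, -1.0, 4.0]])
b = np.array([24.0, 30.0, -24.0])
omega = 1.25

# Splitting A = D - L - U.
D = np.diag(np.diag(A))
L = -np.tril(A, -1)          # strictly lower triangle, sign flipped
U = -np.triu(A, 1)           # strictly upper triangle, sign flipped

M = np.linalg.inv(D - omega * L)
T = M @ ((1 - omega) * D + omega * U)   # T_omega
c = omega * M @ b                       # c_omega

# The exact solution of Ax = b is a fixed point of x = T x + c.
x_exact = np.array([3.0, 4.0, -5.0])
assert np.allclose(T @ x_exact + c, x_exact)

# Convergence of the iteration is governed by the spectral radius of T_omega.
rho = max(abs(np.linalg.eigvals(T)))
```

For this matrix and ω = 1.25 the spectral radius is well below 1, which is why the iteration converges quickly.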

Example

The linear system Ax = b given by

    4x_1 + 3x_2        =  24
    3x_1 + 4x_2 -  x_3 =  30
         -  x_2 + 4x_3 = -24

has the solution (3, 4, -5)^t.

Compare the iterations from the Gauss-Seidel method and the SOR method with ω = 1.25 using x^{(0)} = (1, 1, 1)^t for both methods.

Solution (1/3)

For each k = 1, 2, ..., the equations for the Gauss-Seidel method are

    x_1^{(k)} = -0.75 x_2^{(k-1)} + 6
    x_2^{(k)} = -0.75 x_1^{(k)} + 0.25 x_3^{(k-1)} + 7.5
    x_3^{(k)} =  0.25 x_2^{(k)} - 6

and the equations for the SOR method with ω = 1.25 are

    x_1^{(k)} = -0.25 x_1^{(k-1)} - 0.9375 x_2^{(k-1)} + 7.5
    x_2^{(k)} = -0.9375 x_1^{(k)} - 0.25 x_2^{(k-1)} + 0.3125 x_3^{(k-1)} + 9.375
    x_3^{(k)} =  0.3125 x_2^{(k)} - 0.25 x_3^{(k-1)} - 7.5

Solution (2/3)

Gauss-Seidel Iterations

    k      0     1           2            3           ...    7
    x_1    1     5.250000    3.1406250    3.0878906   ...    3.0134110
    x_2    1     3.812500    3.8828125    3.9267578   ...    3.9888241
    x_3    1    -5.046875   -5.0292969   -5.0183105   ...   -5.0027940

SOR Iterations with ω = 1.25

    k      0     1            2            3           ...    7
    x_1    1     6.312500     2.6223145    3.1333027   ...    3.0000498
    x_2    1     3.5195313    3.9585266    4.0102646   ...    4.0002586
    x_3    1    -6.6501465   -4.6004238   -5.0966863   ...   -5.0003486

Solution (3/3)

For the iterates to be accurate to 7 decimal places, the Gauss-Seidel method requires 34 iterations, as opposed to 14 iterations for the SOR method with ω = 1.25.
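The comparison can be reproduced in a few lines of Python. This is a sketch rather than the book's algorithm; the exact iteration counts depend on the stopping criterion chosen, so the code only checks that both methods converge to (3, 4, -5)^t and that SOR needs fewer sweeps:

```python
import numpy as np

A = np.array([[4.0, 3.0, 0.0],
              [3.0, 4.0, -1.0],
              [0.0, -1.0, 4.0]])
b = np.array([24.0, 30.0, -24.0])

def sor_iterations(omega, tol=1e-7, max_iter=200):
    """Run SOR (omega = 1 reduces to Gauss-Seidel); return (solution, sweep count)."""
    x = np.ones_like(b)                  # x^(0) = (1, 1, 1)^t
    for k in range(1, max_iter + 1):
        x_old = x.copy()
        for i in range(len(b)):
            # Gauss-Seidel value for component i, then relax by omega.
            gs = (b[i] - A[i, :i] @ x[:i] - A[i, i+1:] @ x[i+1:]) / A[i, i]
            x[i] = (1 - omega) * x[i] + omega * gs
        if np.linalg.norm(x - x_old, np.inf) < tol:
            return x, k
    return x, max_iter

x_gs, n_gs = sor_iterations(omega=1.0)    # Gauss-Seidel
x_sor, n_sor = sor_iterations(omega=1.25) # SOR

assert np.allclose(x_gs, [3.0, 4.0, -5.0], atol=1e-5)
assert np.allclose(x_sor, [3.0, 4.0, -5.0], atol=1e-5)
assert n_sor < n_gs
```

With this stopping test the sweep counts come out close to the 34-versus-14 figures quoted above.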

Optimal ω

How is the appropriate value of ω chosen when the SOR method is used?

Although no complete answer to this question is known for the general n × n linear system, the following results can be used in certain important situations.

Theorem (Kahan)

If a_{ii} ≠ 0, for each i = 1, 2, ..., n, then ρ(T_ω) ≥ |ω - 1|. This implies that the SOR method can converge only if 0 < ω < 2.

The proof of this theorem is considered in Exercise 9, Chapter 7 of Burden R. L. & Faires J. D., Numerical Analysis, 9th Ed., Cengage, 2011.

Theorem (Ostrowski-Reich)

If A is a positive definite matrix and 0 < ω < 2, then the SOR method converges for any choice of initial approximate vector x^{(0)}.

The proof of this theorem can be found in Ortega, J. M., Numerical Analysis; A Second Course, Academic Press, New York, 1972.
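Both theorems are easy to probe numerically for the tridiagonal example matrix used in these slides; the check below is an illustration of the bounds, not a proof:

```python
import numpy as np

A = np.array([[4.0, 3.0, 0.0],
              [3.0, 4.0, -1.0],
              [0.0, -1.0, 4.0]])

# Splitting A = D - L - U.
D = np.diag(np.diag(A))
L = -np.tril(A, -1)
U = -np.triu(A, 1)

def spectral_radius_T(omega):
    """Spectral radius of the SOR iteration matrix T_omega."""
    T = np.linalg.inv(D - omega * L) @ ((1 - omega) * D + omega * U)
    return max(abs(np.linalg.eigvals(T)))

for omega in [0.5, 1.0, 1.25, 1.5, 1.9]:
    rho = spectral_radius_T(omega)
    assert rho >= abs(omega - 1) - 1e-12   # Kahan lower bound
    assert rho < 1                         # Ostrowski-Reich: A is SPD, 0 < omega < 2
```

The Kahan bound holds for any matrix with nonzero diagonal; the ρ < 1 conclusion relies on this A being symmetric positive definite.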

Theorem

If A is positive definite and tridiagonal, then ρ(T_g) = [ρ(T_j)]² < 1 (where T_j and T_g are the Jacobi and Gauss-Seidel iteration matrices), and the optimal choice of ω for the SOR method is

    ω = 2 / (1 + √(1 - [ρ(T_j)]²))

The proof of this theorem can be found in Ortega, J. M., Numerical Analysis; A Second Course, Academic Press, New York, 1972.

Example

Find the optimal choice of ω for the SOR method for the matrix

    A = [  4   3   0 ]
        [  3   4  -1 ]
        [  0  -1   4 ]

Solution (1/3)

This matrix is clearly tridiagonal, so we can apply the result in the SOR theorem if we can also show that it is positive definite.

Because the matrix is symmetric, the theory tells us that it is positive definite if and only if all its leading principal submatrices have positive determinants.

This is easily seen to be the case because

    det(A) = 24,   det [ 4  3 ] = 7,   and   det([4]) = 4
                       [ 3  4 ]

Solution (2/3)

We compute

    T_j = D^{-1}(L + U)

        = [ 1/4   0    0  ] [  0  -3   0 ]
          [  0   1/4   0  ] [ -3   0   1 ]
          [  0    0   1/4 ] [  0   1   0 ]

        = [   0    -0.75    0   ]
          [ -0.75    0     0.25 ]
          [   0     0.25    0   ]

so that

    T_j - λI = [  -λ    -0.75    0   ]
               [ -0.75   -λ     0.25 ]
               [   0     0.25   -λ   ]

Solution (3/3)

Therefore

    det(T_j - λI) = det [  -λ    -0.75    0   ]
                        [ -0.75   -λ     0.25 ]
                        [   0     0.25   -λ   ]

                  = -λ(λ² - 0.625)

Thus

    ρ(T_j) = √0.625

and

    ω = 2 / (1 + √(1 - [ρ(T_j)]²)) = 2 / (1 + √(1 - 0.625)) ≈ 1.24

This explains the rapid convergence obtained in the last example when using ω = 1.25.
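The computation in this example can be confirmed with NumPy; a short sketch:

```python
import numpy as np

A = np.array([[4.0, 3.0, 0.0],
              [3.0, 4.0, -1.0],
              [0.0, -1.0, 4.0]])

# Jacobi iteration matrix T_j = D^{-1}(L + U), i.e. the off-diagonal
# part of A scaled by the inverse diagonal and sign-flipped.
D_inv = np.diag(1.0 / np.diag(A))
L_plus_U = -(A - np.diag(np.diag(A)))
Tj = D_inv @ L_plus_U

rho_j = max(abs(np.linalg.eigvals(Tj)))          # spectral radius of T_j
omega_opt = 2.0 / (1.0 + np.sqrt(1.0 - rho_j**2))

assert abs(rho_j - np.sqrt(0.625)) < 1e-8        # rho(T_j) = sqrt(0.625)
assert abs(omega_opt - 1.24) < 0.01              # about 1.24, as computed above
```

The eigenvalues of T_j are 0 and ±√0.625, matching the characteristic polynomial -λ(λ² - 0.625).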

The SOR Algorithm

To solve

    Ax = b

given the parameter ω and an initial approximation x^{(0)}:

INPUT   the entries a_{ij}, 1 ≤ i, j ≤ n, of the matrix A;
        the entries b_i, 1 ≤ i ≤ n, of b;
        the entries XO_i, 1 ≤ i ≤ n, of XO = x^{(0)};
        the parameter ω; tolerance TOL;
        maximum number of iterations N.

Step 1  Set k = 1

Step 2  While (k ≤ N) do Steps 3-6:

Step 3  For i = 1, ..., n
        set x_i = (1 - ω) XO_i + (ω/a_{ii}) [ b_i - Σ_{j=1}^{i-1} a_{ij} x_j - Σ_{j=i+1}^{n} a_{ij} XO_j ]

Step 4  If ||x - XO|| < TOL then OUTPUT (x_1, ..., x_n);
        STOP (The procedure was successful)

Step 5  Set k = k + 1

Step 6  For i = 1, ..., n set XO_i = x_i

Step 7  OUTPUT (Maximum number of iterations exceeded);
        STOP (The procedure was unsuccessful)
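The pseudocode translates directly into Python. A sketch under the same conventions, with 0-based indexing; the infinity norm in the stopping test is an assumed choice, since the algorithm leaves the norm unspecified:

```python
import numpy as np

def sor(A, b, x0, omega, tol=1e-7, max_iter=100):
    """SOR iteration following the algorithm above.

    Returns the approximate solution, or raises RuntimeError when the
    maximum number of iterations is exceeded.
    """
    n = len(b)
    XO = np.array(x0, dtype=float)            # previous iterate x^(k-1)
    for k in range(1, max_iter + 1):          # Steps 1-2
        x = XO.copy()
        for i in range(n):                    # Step 3
            # New values x_j for j < i, old values XO_j for j > i.
            s = A[i, :i] @ x[:i] + A[i, i+1:] @ XO[i+1:]
            x[i] = (1 - omega) * XO[i] + (omega / A[i, i]) * (b[i] - s)
        if np.linalg.norm(x - XO, np.inf) < tol:   # Step 4
            return x
        XO = x                                # Steps 5-6
    raise RuntimeError("Maximum number of iterations exceeded")  # Step 7

# Usage on the example system from these slides.
A = np.array([[4.0, 3.0, 0.0],
              [3.0, 4.0, -1.0],
              [0.0, -1.0, 4.0]])
b = np.array([24.0, 30.0, -24.0])
x = sor(A, b, x0=[1.0, 1.0, 1.0], omega=1.25)
assert np.allclose(x, [3.0, 4.0, -5.0], atol=1e-5)
```

Setting omega = 1 recovers the Gauss-Seidel method.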

Questions?
