
Generating a random diagonally dominant matrix:

To create an N-by-N matrix, the user calls the function matrix_generator(N). Using
the rand command, an N-by-N random matrix, A, is generated with each entry less than one.
To make the matrix diagonally dominant, I simply multiplied the identity by 2 and added it to
the random matrix. I also used the rand function to generate an N-by-1 vector, b, as the
right-hand side of the linear system Ax = b.
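The report's code is in MATLAB; as a sketch of the same idea in NumPy (my reconstruction, not the author's code), the generator might look like the following. Note one assumption: a diagonal shift of N is used here instead of the report's 2, since a shift of 2 only guarantees dominance for very small N (off-diagonal row sums average about N/2).

```python
import numpy as np

def matrix_generator(n, seed=None):
    """Random system Ax = b with a diagonally dominant A.

    Sketch only: random entries in [0, 1) plus a diagonal shift.
    The shift here is n (the report adds 2*I instead), which
    guarantees |a_ii| exceeds the sum of the off-diagonal |a_ij|.
    """
    rng = np.random.default_rng(seed)
    A = rng.random((n, n)) + n * np.eye(n)
    b = rng.random(n)
    return A, b
```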
Gauss Elimination:
To solve using Gauss elimination, the user calls the function gauss_elimination(A,b).
Forward elimination works row by row: for each pivot row k, the entry below the pivot is
divided by the pivot A(k,k) to get a multiplier, row k is scaled by this multiplier, and the
result is subtracted from the row below, zeroing out the entries under the pivot. Backward
substitution then solves the resulting upper-triangular system from the last unknown upward.
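The two phases can be sketched in NumPy as follows (a reconstruction of the standard algorithm, not the author's MATLAB code; no pivoting is done, which is safe here because A is diagonally dominant):

```python
import numpy as np

def gauss_elimination(A, b):
    """Solve Ax = b by forward elimination + backward substitution."""
    A = A.astype(float).copy()
    b = b.astype(float).copy()
    n = len(b)
    # forward elimination: zero out the entries below each pivot
    for k in range(n - 1):
        for i in range(k + 1, n):
            m = A[i, k] / A[k, k]        # multiplier for row i
            A[i, k:] -= m * A[k, k:]
            b[i] -= m * b[k]
    # backward substitution on the upper-triangular system
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        x[i] = (b[i] - A[i, i + 1:] @ x[i + 1:]) / A[i, i]
    return x
```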
Gauss-Jordan Elimination:
To solve using Gauss-Jordan elimination, the user calls the function gauss_jordan(A,b).
The code is very similar to Gauss elimination, except that entries both above and below each
pivot are eliminated, so the matrix is reduced to diagonal form and the solution can be read
off directly without a separate backward substitution.
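A minimal NumPy sketch of the Gauss-Jordan variant (again a reconstruction, not the report's MATLAB):

```python
import numpy as np

def gauss_jordan(A, b):
    """Solve Ax = b by reducing A to the identity (no pivoting)."""
    A = A.astype(float).copy()
    b = b.astype(float).copy()
    n = len(b)
    for k in range(n):
        piv = A[k, k]
        A[k] /= piv                  # normalize the pivot row
        b[k] /= piv
        # eliminate column k in every other row, above AND below the pivot
        for i in range(n):
            if i != k:
                f = A[i, k]
                A[i] -= f * A[k]
                b[i] -= f * b[k]
    return b  # A is now the identity, so b holds the solution
```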
Jacobi Iterations:
To approximate using Jacobi iterations, the user calls the function using
jacobi_iterations(A,b,tol,maxit). The variable tol is the tolerance and maxit is the maximum
number of iterations. The initial guess x0 is the zero vector. The code then iterates until the
error falls below the tolerance or maxit is reached.
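In Jacobi, every component of the new iterate is computed from the previous iterate only. A NumPy sketch of this loop (my reconstruction, with the same interface as the report's function):

```python
import numpy as np

def jacobi_iterations(A, b, tol=1e-6, maxit=200):
    """Jacobi iteration starting from x0 = 0; stops when the update
    is smaller than tol or when maxit sweeps have been done."""
    n = len(b)
    x = np.zeros(n)
    for it in range(maxit):
        x_new = np.empty(n)
        for i in range(n):
            s = A[i, :] @ x - A[i, i] * x[i]   # sum over j != i, old x only
            x_new[i] = (b[i] - s) / A[i, i]
        if np.max(np.abs(x_new - x)) < tol:
            return x_new, it + 1
        x = x_new
    return x, maxit
```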
Gauss Seidel:
To approximate using Gauss-Seidel, the user calls the function using
gauss_seidel(A,b,tol,maxit). The input variables are the same as for Jacobi, and the initial
guess is still the zero vector. The code is similar to Jacobi except that each newly computed
component is substituted immediately into the remaining updates within the same sweep.
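The in-place update is the only change from Jacobi, as this NumPy sketch shows (a reconstruction, not the author's code):

```python
import numpy as np

def gauss_seidel(A, b, tol=1e-6, maxit=200):
    """Gauss-Seidel: like Jacobi, but x is updated in place, so each
    new component is used immediately within the same sweep."""
    n = len(b)
    x = np.zeros(n)
    for it in range(maxit):
        x_old = x.copy()
        for i in range(n):
            # components before i are already updated; those after are not
            s = A[i, :i] @ x[:i] + A[i, i + 1:] @ x[i + 1:]
            x[i] = (b[i] - s) / A[i, i]
        if np.max(np.abs(x - x_old)) < tol:
            return x, it + 1
    return x, maxit
```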

SOR Solver:
The SOR solver can be called with sor_solver(A,b,tol,maxit,omega). All of the variables
are the same as for Gauss-Seidel and Jacobi, except omega, the overrelaxation factor. The code
is the same as Gauss-Seidel except that each Gauss-Seidel update (with its omega/a_ii factor)
is blended with (1 - omega) times the old value.
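The blended update can be sketched as follows (a NumPy reconstruction; omega = 1 reduces it to plain Gauss-Seidel):

```python
import numpy as np

def sor_solver(A, b, tol=1e-6, maxit=200, omega=1.04):
    """Successive over-relaxation:
    x_i <- (1 - omega) * x_i + (omega / a_ii) * (b_i - sum_{j != i} a_ij x_j)."""
    n = len(b)
    x = np.zeros(n)
    for it in range(maxit):
        x_old = x.copy()
        for i in range(n):
            s = A[i, :i] @ x[:i] + A[i, i + 1:] @ x[i + 1:]
            x[i] = (1 - omega) * x[i] + omega * (b[i] - s) / A[i, i]
        if np.max(np.abs(x - x_old)) < tol:
            return x, it + 1
    return x, maxit
```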

Results:
Tolerance = 1e-6
Max iterations = 200
Omega = 1.02, 1.04, 1.06, 1.08, 1.1
At N=50, all of the solvers were about the same; the direct solvers were just a bit faster than
the iterative solvers. Timings for the larger sizes, in seconds (MATLAB's built-in solver was
only timed from N=500 up):

N       Gauss    Gauss-Jordan    Jacobi    Gauss-Seidel    SOR (omega=1.04)    MATLAB A\b
100     0.0468   0.0156          0.25      0.234           0.56                -
200     0.156    0.14            0.641     0.484           1.203               -
500     2.2      2.3             2.56      1.86            4.59                0.065
1000    21.4     21              8.84      6.25            15.4                0.178
2000    216      214             35.6      23.57           59                  0.75

1- Which direct method is faster? Which iterative method is faster? How does the overrelaxation
factor affect the convergence speed of the SOR solver? Which method is the
fastest of all?
The faster direct method was Gauss-Jordan. The faster iterative method was Gauss-Seidel. The
effect of the overrelaxation factor varied from run to run; usually SOR was slower than the
other iterative methods. The fastest of all is MATLAB's solver. Out of the methods that I
coded, it depended on the size: for the larger matrices, Gauss-Seidel was the fastest by far,
while for the smaller matrices, Gauss-Jordan was the fastest.
2- How do your solvers perform compared to the MATLAB solver, i.e. A\b?
My solvers were much slower than MATLAB's, and the gap was much more noticeable at the larger sizes.

3- Is there a relation between the size (or rank) of matrix A and the performance of direct
methods compared to that of the iterative methods? Discuss.
As stated in question one, for smaller matrices the direct methods were faster, while for
larger matrices the iterative methods were much faster.
