
Eigenvalues and Eigenvectors

Ng Tin Yau (PhD)


Department of Mechanical Engineering
The Hong Kong Polytechnic University

Jan 2016

By Ng Tin Yau (PhD) 1/38

Table of Contents

Eigenproblem & Some of Its Applications

Numerical Examples

Diagonalization Problem

A Transformation Method: Jacobi


Introduction
Throughout these notes, K denotes either the real field R or the complex field C, and Kⁿ denotes the collection of all n-tuples whose components belong to K. Unless otherwise stated, A denotes an n × n matrix over K.

Given a square matrix A, suppose that there exist λ ∈ K and a nonzero vector v ∈ Kⁿ such that

    Av = λv    (1)

Then we say λ is an eigenvalue of A and v is an accompanying eigenvector. Finding such a pair (λ, v), also called an eigenpair, that satisfies equation (1) is called an eigenproblem.

Eigenproblems play a significant role in engineering applications and are frequently encountered in numerical methods.

A Mechanical Vibration Problem


Consider the vibration of a 3-spring, 2-mass system.

Applying Newton's second law to each mass, we obtain

    [ m1  0  ] [ ẍ1 ]   [ k1+k2   -k2  ] [ x1 ]   [ 0 ]
    [ 0   m2 ] [ ẍ2 ] + [ -k2   k2+k3 ] [ x2 ] = [ 0 ]

or, in matrix form,

    Mẍ + Kx = 0

Mechanical Vibration cont


By assuming that the solution is purely oscillatory, we write x = v e^{iωt}. Then the equation of motion becomes

    (-ω²M + K) v e^{iωt} = 0

Since e^{iωt} ≠ 0, we have

    (K - ω²M) v = 0

For a nontrivial solution, v ≠ 0, we must have det(K - ω²M) = 0. Physically, ωi represents the i-th natural frequency of the system, and the eigenvectors v(1) and v(2) are the mode shapes of the two masses. By letting A = M⁻¹K, we arrive at the equation

    Av = λv

where λ = ω².
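As a quick numerical sketch, the reduction A = M⁻¹K can be carried out with numpy. The masses and stiffnesses below are hypothetical values chosen only for illustration (they are not from the notes):

```python
import numpy as np

# Hypothetical data: m1 = m2 = 1, k1 = k2 = k3 = 1
M = np.diag([1.0, 1.0])
K = np.array([[ 2.0, -1.0],    # [[k1+k2, -k2],
              [-1.0,  2.0]])   #  [-k2, k2+k3]]

A = np.linalg.solve(M, K)      # A = M^{-1} K (solve instead of explicit inverse)
lam, V = np.linalg.eig(A)      # lambda = omega^2; columns of V are mode shapes
omega = np.sqrt(np.sort(lam))  # natural frequencies
print(omega)                   # -> [1. 1.73205081]
```
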

How to determine the eigenvalues of a square matrix?


Rewrite equation (1) as

    (λI - A)v = 0   or   (A - λI)v = 0    (2)

where I is the identity matrix. If λ is an eigenvalue of A and v ≠ 0, then we must have

    det(λI - A) = 0    (3)

Define the characteristic polynomial of A by

    p(λ) := det(λI - A)    (4)

Thus, the zeros of p(λ) are the eigenvalues of the matrix A. Hence, we have the following theorem:

Theorem
Let A be a complex square matrix. Then λ ∈ C is an eigenvalue of A if and only if det(λI - A) = 0.
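Equation (4) can be checked numerically: for a matrix argument, numpy's `np.poly` returns the coefficients of det(λI - A), and its roots are the eigenvalues. As a sketch, the matrix below is the one used later in Example 1:

```python
import numpy as np

A = np.array([[3.0, -4.0],
              [2.0, -6.0]])
coeffs = np.poly(A)                    # coefficients of p(lambda) = det(lambda*I - A)
zeros = np.sort(np.roots(coeffs))      # zeros of p(lambda)
direct = np.sort(np.linalg.eigvals(A)) # eigenvalues computed directly
print(zeros, direct)                   # both -> [-5.  2.]
```
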

Facts about eigenvalues

Theorem
Let A be a square matrix over K. Then
1. Every A has an eigenvalue in C.
2. The eigenvalue associated with a given eigenvector is uniquely determined.
3. If v(1), v(2) ∈ Kⁿ are eigenvectors with the same eigenvalue λ, then for any scalars c1, c2, the vector c1 v(1) + c2 v(2), if it is nonzero, is also an eigenvector with eigenvalue λ.

A warm up example
Example
Given

    A = (1/10) [ 4  3 ]
               [ 6  7 ]

verify that

    ( 1, (1, 2)^T )   and   ( 1/10, (1, -1)^T )

are eigenpairs of matrix A.

Solution:

    [ 4/10  3/10 ] [ 1 ]       [ 1 ]
    [ 6/10  7/10 ] [ 2 ]  = 1  [ 2 ]

and

    [ 4/10  3/10 ] [  1 ]    1   [  1 ]
    [ 6/10  7/10 ] [ -1 ]  = --  [ -1 ]
                             10
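The verification amounts to checking Av = λv for each pair; a minimal numpy check:

```python
import numpy as np

A = np.array([[4.0, 3.0],
              [6.0, 7.0]]) / 10.0
pairs = [(1.0, np.array([1.0, 2.0])),
         (0.1, np.array([1.0, -1.0]))]
for lam, v in pairs:
    assert np.allclose(A @ v, lam * v)   # Av = lambda*v for both eigenpairs
print("both eigenpairs verified")
```
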

Bookkeeping of cars of a car rental company


Example
A car rental company has offices in Edmonton and Calgary. Relying on its records, the company knows that on a monthly basis 40% of rentals from the Edmonton office are returned there and 60% are one-way rentals that are dropped off in the Calgary office. Similarly, 70% of rentals from the Calgary office are returned there, whereas 30% are dropped off in Edmonton.
(1) Obtain a matrix equation that describes the number of cars at the depots in Edmonton and Calgary at the beginning of each month.
(2) Estimate the number of cars at the depots in Edmonton and Calgary in the long run.
(3) Determine the number of cars in each office after 20 months if originally there are 100 cars and 500 cars at the Edmonton and Calgary offices, respectively.

Car rental company cont

(1) Obtain a matrix equation that describes the number of cars at the depots in Edmonton and Calgary at the beginning of each month.
Let xk and yk denote the number of cars at the depots in Edmonton and Calgary, respectively, at the beginning of month k (k = 0, 1, 2, ...). We can express this information in terms of difference equations:

    xk+1 = 0.4xk + 0.3yk
    yk+1 = 0.6xk + 0.7yk

In matrix form,

    [ xk+1 ]   [ 0.4  0.3 ] [ xk ]
    [ yk+1 ] = [ 0.6  0.7 ] [ yk ]

Car rental company cont


(2) Estimate the number of cars at the depots in Edmonton and Calgary in the long run.
Denote zk = (xk, yk)^T and let A be the coefficient matrix of the system of difference equations for the model. Then zk+1 = Azk, and notice that zk = A^k z0. Recall that the eigenvalues of A are λ1 = 1 and λ2 = 0.1, with corresponding eigenvectors

    v1 = (1, 2)^T   and   v2 = (1, -1)^T,

respectively. By writing z0 = Σ_{i=1}^{2} ci vi and using the fact that A vi = λi vi, we have

    zk = A^k z0 = Σ_{i=1}^{2} ci A^k vi = Σ_{i=1}^{2} ci λi^k vi = [ c1 + c2 (0.1)^k  ]
                                                                  [ 2c1 - c2 (0.1)^k ]

Now as k → ∞, we have lim zk = (c1, 2c1)^T. Thus, in the long run, the number of cars at the Edmonton depot tends to a value that is half the number of cars at the Calgary office.

Car rental company cont


(3) Determine the number of cars in each office after 20 months if originally there are 100 cars and 500 cars at the Edmonton and Calgary offices, respectively.
Let

    Q = [ 1   1 ]
        [ 2  -1 ]

Since

    Q⁻¹ = [ 1/3   1/3 ]    and    Q⁻¹AQ = Λ = [ 1   0   ]
          [ 2/3  -1/3 ]                       [ 0   0.1 ]

then zk+1 = Azk = QΛQ⁻¹zk and zk = (QΛQ⁻¹)^k z0 = QΛ^k Q⁻¹z0. Now for k = 20, we have

    [ x20 ]   [ 1   1 ] [ 1   0   ]^20 [ 1/3   1/3 ] [ x0 = 100 ]   [ 200 ]
    [ y20 ] = [ 2  -1 ] [ 0   0.1 ]    [ 2/3  -1/3 ] [ y0 = 500 ] ≈ [ 400 ]
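The answer in part (3) can be cross-checked by simply iterating the difference equation twenty times:

```python
import numpy as np

A = np.array([[0.4, 0.3],
              [0.6, 0.7]])
z = np.array([100.0, 500.0])   # (Edmonton, Calgary) at month 0
for _ in range(20):
    z = A @ z                  # z_{k+1} = A z_k
print(np.round(z))             # -> [200. 400.]
```
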

Example 1
Example
Compute the eigenvalues and their corresponding eigenvectors of
matrix


    A = [ 3  -4 ]
        [ 2  -6 ]

First, we need to compute

    det(λI - A) = det [ λ-3    4  ] = 0
                      [ -2   λ+6 ]

This leads to the characteristic equation

    (λ - 3)(λ + 6) + 8 = λ² + 3λ - 10 = (λ - 2)(λ + 5) = 0

Thus, the eigenvalues are λ1 = 2 and λ2 = -5.

Example 1 cont
Next we need to use the equation (λI - A)v = 0 to determine the corresponding eigenvectors.

For λ = 2, the equation becomes

    (λ1 I - A)v(1) = [ -1  4 ] [ v1(1) ]   [ 0 ]
                     [ -2  8 ] [ v2(1) ] = [ 0 ]

Set v2(1) = 1; then v1(1) = 4, hence v(1) = (4, 1)^T.

For λ = -5, the equation becomes

    (λ2 I - A)v(2) = [ -8  4 ] [ v1(2) ]   [ 0 ]
                     [ -2  1 ] [ v2(2) ] = [ 0 ]

Set v1(2) = 1; then v2(2) = 2, hence v(2) = (1, 2)^T.
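A quick numpy cross-check of the eigenpairs found above:

```python
import numpy as np

A = np.array([[3.0, -4.0],
              [2.0, -6.0]])
print(np.sort(np.linalg.eigvals(A)))      # -> [-5.  2.]
for lam, v in [(2.0, np.array([4.0, 1.0])),
               (-5.0, np.array([1.0, 2.0]))]:
    assert np.allclose(A @ v, lam * v)    # Av = lambda*v for each eigenpair
```
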

Example 2
Example
Compute the eigenvalues of matrix


    A = [ -2  -6 ]
        [  3   4 ]

The characteristic polynomial is

    det(λI - A) = det [ λ+2    6  ] = λ² - 2λ + 10 = 0
                      [ -3   λ-4 ]

Notice that if we restrict λ ∈ R, then no eigenvalue satisfies the characteristic equation! However, if we allow λ ∈ C, then we have

    λ1 = 1 + 3i   and   λ2 = 1 - 3i

where i = √-1.

Example 3
Example
Determine the eigenvalues and their corresponding eigenvectors for the following matrix.

    A = [ 1   1   4 ]
        [ 2   0   4 ]
        [ 1  -1  -2 ]

First, we need to compute

    det(A - λI) = det [ 1-λ    1      4    ]
                      [  2    -λ      4    ] = 0
                      [  1    -1   -(2+λ)  ]

This leads to the characteristic equation

    λ³ + λ² - 4λ - 4 = (λ + 1)(λ + 2)(λ - 2) = 0

Thus,

    λ1 = 2,   λ2 = -2,   λ3 = -1

Example 3 cont
For λ = 2, we have

    (A - 2I)v(1) = [ -1   1   4 ] [ v1(1) ]   [ 0 ]
                   [  2  -2   4 ] [ v2(1) ] = [ 0 ]   ⟹   v(1) = (1, 1, 0)^T
                   [  1  -1  -4 ] [ v3(1) ]   [ 0 ]

For λ = -2, we have

    (A + 2I)v(2) = [ 3   1   4 ] [ v1(2) ]   [ 0 ]
                   [ 2   2   4 ] [ v2(2) ] = [ 0 ]   ⟹   v(2) = (1, 1, -1)^T
                   [ 1  -1   0 ] [ v3(2) ]   [ 0 ]

For λ = -1, we have

    (A + I)v(3) = [ 2   1   4 ] [ v1(3) ]   [ 0 ]
                  [ 2   1   4 ] [ v2(3) ] = [ 0 ]   ⟹   v(3) = (1, 2, -1)^T
                  [ 1  -1  -1 ] [ v3(3) ]   [ 0 ]
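The three eigenpairs of Example 3 can be verified in a few lines:

```python
import numpy as np

A = np.array([[1.0,  1.0,  4.0],
              [2.0,  0.0,  4.0],
              [1.0, -1.0, -2.0]])
pairs = [( 2.0, np.array([1.0, 1.0,  0.0])),
         (-2.0, np.array([1.0, 1.0, -1.0])),
         (-1.0, np.array([1.0, 2.0, -1.0]))]
for lam, v in pairs:
    assert np.allclose(A @ v, lam * v)    # Av = lambda*v for each eigenpair
print(np.sort(np.linalg.eigvals(A)))      # -> [-2. -1.  2.]
```
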

Diagonalization Problem
A square matrix A is called a diagonal matrix if Aij = 0 whenever i ≠ j. It is easy to see that working with a diagonal matrix is much more convenient than working with a nondiagonal one. A matrix A is said to be similar to a matrix B if there exists a nonsingular matrix P such that P⁻¹AP = B. In particular, if A is similar to a diagonal matrix D, then A is said to be diagonalizable.

Now the two key questions are:
1. How do we know that a given square matrix is diagonalizable?
2. Supposing that the square matrix is diagonalizable, how do we obtain a matrix P?

In the sequel, we shall see that the diagonalization problem is directly related to an eigenproblem.

Idea of constructing P
Suppose that A is diagonalizable, so that P⁻¹AP = D. Denote by Dii = λi the diagonal entry at row i. Since P is invertible, its columns form a linearly independent subset of Cⁿ. Denote column i of P by v(i), so that

    P = [ v(1)  v(2)  ...  v(n) ]    (5)

From the relation AP = PD, we must have

    Av(i) = λi v(i)    (6)

or equivalently (λiI - A)v(i) = 0. Since v(i) cannot be zero, det(λiI - A) = 0. This suggests that we can solve for all the λi and then obtain the corresponding v(i); if all the v(i) are linearly independent, then we have succeeded! Hence, all we need are some theorems that guarantee that the eigenvectors form a linearly independent set. At the very least, we must have n eigenvectors at the outset.

Example 4
Example
Construct a matrix P such that P⁻¹AP is a diagonal matrix if

    A = [ 1   1   4 ]
        [ 2   0   4 ]
        [ 1  -1  -2 ]

By using the eigenvectors of A, we have

    P = [ v(1)  v(2)  v(3) ] = [ 1   1   1 ]              [  1   0   1 ]
                               [ 1   1   2 ]   and P⁻¹ =  [  1  -1  -1 ]
                               [ 0  -1  -1 ]              [ -1   1   0 ]

Then we have the desired result:

    P⁻¹AP = [ 2   0   0 ]
            [ 0  -2   0 ]
            [ 0   0  -1 ]
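The construction is easy to confirm numerically, using the eigenvectors found in Example 3 as the columns of P:

```python
import numpy as np

A = np.array([[1.0,  1.0,  4.0],
              [2.0,  0.0,  4.0],
              [1.0, -1.0, -2.0]])
P = np.array([[1.0,  1.0,  1.0],
              [1.0,  1.0,  2.0],
              [0.0, -1.0, -1.0]])   # columns are v(1), v(2), v(3)
D = np.linalg.inv(P) @ A @ P
print(np.round(D))                  # diagonal entries 2, -2, -1
assert np.allclose(D, np.diag([2.0, -2.0, -1.0]))
```
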

A matrix with n distinct eigenvalues


Theorem
Let 1 , 2 , . . . , n be distinct eigenvalues of an n n matrix A. If
v(1) , v(2) , . . . , v(n) are the eigenvectors of A such that i corresponds to
v(i) , then the set {v(1) , v(2) , . . . , v(n) } is linearly independent.

Corollary
Let A be an n × n matrix. If A has n distinct eigenvalues, then A is diagonalizable.

Example
The matrices

    [ 3  -4 ]         [  7  13  16 ]
    [ 2  -6 ]   and   [ 13  10  13 ]
                      [ 16  13   7 ]

are diagonalizable.

Algebraic Multiplicity & Geometric Multiplicity


Let p(λ) be the characteristic polynomial of A. Suppose that p(λ) is written in the form

    p(λ) = ∏_{i=1}^{k} (λ - λi)^{mi}    (7)

with λi ≠ λj for i ≠ j and Σ_{i=1}^{k} mi = n. The positive integer mi is called the algebraic multiplicity of the eigenvalue λi.

Let an n × n matrix A be an operator on the vector space Kⁿ and let λ be an eigenvalue of A. Denote the eigenspace of A corresponding to λ by

    Eλ = { v ∈ Kⁿ | Av = λv }    (8)

The dimension of the eigenspace Eλ, denoted dim(Eλ), is called the geometric multiplicity of λ.

Necessary & Sufficient Conditions for Diagonalizability

Theorem
Let λ be an eigenvalue of A having algebraic multiplicity m. Then 1 ≤ dim(Eλ) ≤ m.

Theorem
Let an n × n matrix A be an operator on the vector space Kⁿ, and let λ1, λ2, ..., λk be the distinct eigenvalues of A. Then A is diagonalizable if and only if for all 1 ≤ i ≤ k, the algebraic multiplicity of λi is equal to dim(Eλi).

In practice, for each eigenvalue λ we compute n - rank(A - λI), and if the algebraic multiplicity of λ is equal to n - rank(A - λI), then A is diagonalizable.
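The rank test reads directly as code. As a sketch, the matrix below is the one from Example 5 on the next slide, whose only eigenvalue λ = 3 has algebraic multiplicity 2; the helper name `geometric_multiplicity` is my own:

```python
import numpy as np

def geometric_multiplicity(A, lam):
    """dim(E_lam) = n - rank(A - lam*I)."""
    n = A.shape[0]
    return n - np.linalg.matrix_rank(A - lam * np.eye(n))

A = np.array([[ 4.0, 1.0],
              [-1.0, 2.0]])
print(geometric_multiplicity(A, 3.0))   # -> 1 < 2, so A is not diagonalizable
```
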

Example 5
Example
Determine the eigenvalues and eigenvectors of the matrix


    A = [  4  1 ]
        [ -1  2 ]

The characteristic polynomial is

    det [ λ-4   -1  ] = λ² - 6λ + 9 = (λ - 3)² = 0
        [  1   λ-2 ]

which gives an eigenvalue λ = 3. That is, the algebraic multiplicity of λ = 3 is 2. The eigenvector satisfies

    (3I - A)v = [ -1  -1 ] [ v1 ]   [ 0 ]
                [  1   1 ] [ v2 ] = [ 0 ]

Set v2 = 1; then v1 = -1, and hence v = (-1, 1)^T. However, since rank(3I - A) = 1, we have dim(E3) = 1 < 2. Therefore, A is not diagonalizable.

Example 5 cont
In fact, one can obtain another vector u, which is not an eigenvector, by solving the equation (A - λI)u = v, that is,

    [  1   1 ] [ u1 ]   [ v1 = -1 ]
    [ -1  -1 ] [ u2 ] = [ v2 =  1 ]

One possible solution is u = (-1, 0)^T. We now have a linearly independent set

    { v, u } = { (-1, 1)^T, (-1, 0)^T }

Notice that

    (A - λI)²u = (A - λI)v = 0

In this case, we say u is a generalized eigenvector of A associated with λ = 3.
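The defining chain for the generalized eigenvector can be verified directly:

```python
import numpy as np

A = np.array([[ 4.0, 1.0],
              [-1.0, 2.0]])
N = A - 3.0 * np.eye(2)               # A - lambda*I with lambda = 3
v = np.array([-1.0, 1.0])             # ordinary eigenvector
u = np.array([-1.0, 0.0])             # generalized eigenvector

assert np.allclose(N @ v, 0.0)        # (A - 3I)v = 0
assert np.allclose(N @ u, v)          # (A - 3I)u = v
assert np.allclose(N @ (N @ u), 0.0)  # hence (A - 3I)^2 u = 0
print("u is a generalized eigenvector")
```
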

Example 6
Example
Determine the eigenvalues and eigenvectors of the matrix

    A = [ 4  2  2 ]
        [ 2  1  1 ]
        [ 2  1  1 ]

The characteristic equation is λ²(λ - 6) = 0; hence, we have two eigenvalues, namely λ1 = 6 and λ2 = 0. Solving

    (A - 6I)v(1) = [ -2   2   2 ] [ v1(1) ]   [ 0 ]
                   [  2  -5   1 ] [ v2(1) ] = [ 0 ]
                   [  2   1  -5 ] [ v3(1) ]   [ 0 ]

we get v(1) = (2, 1, 1)^T. The algebraic multiplicity of λ1 is 1.

Example 6 cont
On the other hand, the algebraic multiplicity of λ2 is 2. The eigenvectors satisfy

    (A - 0I)v(2) = [ 4  2  2 ] [ v1(2) ]   [ 0 ]
                   [ 2  1  1 ] [ v2(2) ] = [ 0 ]
                   [ 2  1  1 ] [ v3(2) ]   [ 0 ]

Row reduction gives

    [ 4  2  2 ]     [ 2  1  1 ]
    [ 2  1  1 ]  →  [ 0  0  0 ]
    [ 2  1  1 ]     [ 0  0  0 ]

Set v(2) = (1, 0, -2)^T. Notice that rank(A) = 1; therefore, dim(E0) = 2, which is equal to the algebraic multiplicity of λ2. Hence, it is possible to find another eigenvector in E0. To find it, we reconsider the equation 2v1(3) + v2(3) + v3(3) = 0. By letting v(3) = (1, -2, 0)^T, the vectors v(2) and v(3) are linearly independent, and hence they jointly form a basis for E0.
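The rank and the three eigenpairs of this example can be checked at once:

```python
import numpy as np

A = np.array([[4.0, 2.0, 2.0],
              [2.0, 1.0, 1.0],
              [2.0, 1.0, 1.0]])
assert np.linalg.matrix_rank(A) == 1          # so dim(E_0) = 3 - 1 = 2
for lam, v in [(6.0, np.array([2.0,  1.0,  1.0])),
               (0.0, np.array([1.0,  0.0, -2.0])),
               (0.0, np.array([1.0, -2.0,  0.0]))]:
    assert np.allclose(A @ v, lam * v)        # Av = lambda*v for all three
print("A is diagonalizable with eigenvalues 6, 0, 0")
```
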

Brief Survey of Numerical Methods


Many methods can be used to determine the eigenvalues and
eigenvectors of a square matrix. For example, the power method,
deflation methods, QR method, Jacobi method, to name a few.
The power method can be used to find the dominant eigenvalue and
an associated eigenvector for an arbitrary matrix. The inverse power
method will find the eigenvalue closest to a given value and associated
eigenvector. This method is often used to refine an approximate
eigenvalue and to compute an eigenvector once an eigenvalue has been
found by some other technique.
Methods based on similarity transformations, such as Householder's method, are used to convert a symmetric matrix into a similar matrix that is tridiagonal (or upper Hessenberg if the matrix is not symmetric). Techniques such as the QR method can then be applied to the reduced matrix to obtain approximations to all the eigenvalues. The associated eigenvectors can be found by using an iterative method, such as the inverse power method, or by modifying the QR method to include the approximation of eigenvectors.

Symmetric Matrices
A matrix A is said to be symmetric if A = A^T. An n × n matrix Q is said to be an orthogonal matrix if Q⁻¹ = Q^T.

Theorem
If A is a real symmetric square matrix and D is a diagonal matrix whose diagonal entries are the eigenvalues of A, then there exists an orthogonal matrix Q such that D = Q^T AQ.

The following corollary to the above theorem demonstrates some of the interesting properties of symmetric matrices.

Corollary
If A is a real symmetric n × n matrix, then there exist n eigenvectors of A that form an orthonormal set, and the eigenvalues of A are real numbers.

The Jacobi Method

Suppose that we have a real symmetric square matrix A. The Jacobi method is an iterative method that produces all the eigenvalues and eigenvectors of a real symmetric matrix simultaneously. The method is based on the theorem of linear algebra stated previously; that is, we need to determine an orthogonal matrix Q such that D = Q^T AQ. However, from a practical viewpoint, it may not be possible to obtain a truly diagonal matrix D. Instead we seek a sequence of matrices {Dk}, k ∈ N, hoping that

    lim(k→∞) Dk = D    (9)

where

    Dk = Qk^T Dk-1 Qk,   k ∈ N    (10)

with D0 ≡ A. The eigenvalues are then given by the diagonal entries Dii(k) of matrix Dk. The corresponding eigenvectors {v(i)}, i = 1, ..., n, are given by the columns of the matrix V(k), where

    V(k) = Q1 Q2 ... Qk = [ v(1)  v(2)  ...  v(n) ]    (11)

How to convert a 2 × 2 matrix to a diagonal one?

Using the idea of rotating a vector in the plane, we have the rotation matrix

    Q = [ cos θ  -sin θ ]    (12)
        [ sin θ   cos θ ]

Then we can form the orthogonal transformation A' = Q^T AQ. Carrying out the matrix multiplication, we have

    A'11 = A11 cos²θ + 2A12 sin θ cos θ + A22 sin²θ                (13)
    A'12 = A'21 = (A22 - A11) sin θ cos θ + A12 (cos²θ - sin²θ)    (14)
    A'22 = A11 sin²θ - 2A12 sin θ cos θ + A22 cos²θ                (15)

To obtain a diagonal matrix, we need to kill the off-diagonal terms. In other words, we require A'12 = A'21 = 0, and using the identities cos 2θ = cos²θ - sin²θ and sin 2θ = 2 sin θ cos θ yields

    tan 2θ = 2A12 / (A11 - A22)    (16)
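Equations (12) to (16) can be exercised on a small symmetric matrix; the entries below are arbitrary illustrative values, not taken from the notes:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])       # arbitrary symmetric 2x2

theta = 0.5 * np.arctan2(2.0 * A[0, 1], A[0, 0] - A[1, 1])  # eq. (16)
c, s = np.cos(theta), np.sin(theta)
Q = np.array([[c, -s],
              [s,  c]])          # rotation matrix, eq. (12)
Ap = Q.T @ A @ Q                 # A' = Q^T A Q
assert abs(Ap[0, 1]) < 1e-12     # off-diagonal term has been killed
print(np.round(np.diag(Ap), 6))  # the two eigenvalues of A
```
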

How to determine matrix Qk?

Extending this idea to the n × n case and using our previous notation, we have

    Qk = [ I   0        0   0        0 ]
         [ 0   cos θk   0  -sin θk   0 ]
         [ 0   0        I   0        0 ]    (n × n)    (17)
         [ 0   sin θk   0   cos θk   0 ]
         [ 0   0        0   0        I ]

for all k ∈ N. Here the sine and cosine entries appear in positions (i, i), (i, j), (j, i) and (j, j). In this case, we require Dij(k+1) = Dji(k+1) = 0, which gives

    tan 2θk+1 = 2Dij(k) / ( Dii(k) - Djj(k) )    (18)

Thus, each step of the Jacobi method reduces a pair of off-diagonal elements to zero.

Example 6
Example
Find the eigenvalues and eigenvectors of the matrix

    A = [ 1  1  1 ]
        [ 1  2  2 ]
        [ 1  2  3 ]

Ans: The largest off-diagonal term is |A23| = 2. In this case, we have i = 2 and j = 3. Thus

    tan 2θ1 = 2A23 / (A22 - A33) = 4 / (2 - 3),   θ1 = (1/2) tan⁻¹(-4) = -0.662909

and

    Q1 = [ 1    0        0      ]   [ 1.0     0           0         ]
         [ 0  cos θ1  -sin θ1   ] = [ 0    0.7882054   0.6154122    ]
         [ 0  sin θ1   cos θ1   ]   [ 0   -0.6154122   0.7882054    ]

Example 6 - First Iteration

With D0 = A and using the calculated Q1, we have

    D1 = Q1^T D0 Q1 = [ 1.0        0.1727932  1.4036176 ]
                      [ 0.1727932  0.4384472  0.0       ]
                      [ 1.4036176  0.0        4.5615525 ]

Now we reduce the largest off-diagonal term of D1, namely |D13(1)| = 1.4036176, to zero. In this case, we have i = 1 and j = 3:

    tan 2θ2 = 2D13(1) / (D11(1) - D33(1)),   θ2 = (1/2) tan⁻¹( 2.8072352 / (1.0 - 4.5615525) ) = -0.333754

and

    Q2 = [ cos θ2  0  -sin θ2 ]   [ 0.9448193   0    0.3275920 ]
         [   0     1     0    ] = [ 0           1.0  0         ]
         [ sin θ2  0   cos θ2 ]   [ -0.3275920  0    0.9448193 ]

Example 6 - Second Iteration

Using the calculated Q2, we have

    D2 = Q2^T D1 Q2 = [ 0.5133313  0.1632584  0.0       ]
                      [ 0.1632584  0.4384472  0.0566057 ]
                      [ 0.0        0.0566057  5.0482211 ]

Next we reduce the largest off-diagonal term of D2, namely |D12(2)| = 0.1632584, to zero. In this case, we have i = 1 and j = 2. Now

    tan 2θ3 = 2D12(2) / (D11(2) - D22(2)),   θ3 = (1/2) tan⁻¹( 0.3265167 / (0.5133313 - 0.4384472) ) = 0.672676

and

    Q3 = [ cos θ3  -sin θ3  0 ]   [ 0.7821569  -0.6230815  0.0 ]
         [ sin θ3   cos θ3  0 ] = [ 0.6230815   0.7821569  0.0 ]
         [   0        0     1 ]   [ 0.0         0.0        1.0 ]

Example 6 - Third Iteration

Using the calculated Q3, we have

    D3 = Q3^T D2 Q3 = [ 0.6433861  0.0        0.0352699 ]
                      [ 0.0        0.3083924  0.0442745 ]
                      [ 0.0352699  0.0442745  5.0482211 ]

Suppose that you want to stop the process here; then the three approximate eigenvalues are

    λ1 = 0.6433861,   λ2 = 0.3083924,   λ3 = 5.0482211

In fact, the eigenvalues obtained by Matlab are

    λ1 = 0.6431,   λ2 = 0.3080,   λ3 = 5.0489

Example 6 - Eigenvectors

To obtain the corresponding eigenvectors we compute

    V(3) = Q1 Q2 Q3 = [  0.7389969  -0.5886994  0.3275920 ]
                      [  0.3334301   0.7421160  0.5814533 ]
                      [ -0.5854125  -0.3204631  0.7447116 ]

The eigenvectors are then given by the columns of the matrix V(3). That is,

    v(1) = (0.7389969, 0.3334301, -0.5854125)^T
    v(2) = (-0.5886994, 0.7421160, -0.3204631)^T
    v(3) = (0.3275920, 0.5814533, 0.7447116)^T

Using Matlab, we have the corresponding eigenvectors

    v(1) = (0.7370, 0.3280, -0.5910)^T
    v(2) = (-0.5910, 0.7370, -0.3280)^T
    v(3) = (0.3280, 0.5910, 0.7370)^T
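The three iterations above can be reproduced with a short script. This is a sketch of the classical Jacobi method as described in these slides (the largest off-diagonal entry is zeroed at each step, using the same tan⁻¹ branch as the worked example); the function name `jacobi` is my own:

```python
import numpy as np

def jacobi(A, iters):
    """Classical Jacobi: zero the largest off-diagonal entry at each step.
    A sketch following the notes; assumes A is real symmetric."""
    D = A.astype(float).copy()
    n = D.shape[0]
    V = np.eye(n)
    for _ in range(iters):
        off = np.abs(np.triu(D, 1))               # strict upper triangle
        i, j = np.unravel_index(np.argmax(off), off.shape)
        if np.isclose(D[i, i], D[j, j]):
            theta = np.pi / 4                     # tan(2*theta) undefined here
        else:
            theta = 0.5 * np.arctan(2.0 * D[i, j] / (D[i, i] - D[j, j]))
        c, s = np.cos(theta), np.sin(theta)
        Q = np.eye(n)
        Q[i, i] = Q[j, j] = c
        Q[i, j], Q[j, i] = -s, s                  # rotation in the (i, j) plane
        D = Q.T @ D @ Q                           # D_k = Q_k^T D_{k-1} Q_k
        V = V @ Q                                 # V(k) = Q1 Q2 ... Qk
    return np.diag(D), V

A = np.array([[1.0, 1.0, 1.0],
              [1.0, 2.0, 2.0],
              [1.0, 2.0, 3.0]])
lam, V = jacobi(A, 3)
print(np.round(lam, 7))   # approximately [0.6433861 0.3083924 5.0482211]
```

After three iterations the diagonal of D matches the slide's D3; running more iterations drives the remaining off-diagonal terms toward zero and the estimates toward the Matlab values.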

Exercises
(1) Given matrices

    A = [ 6  7  2 ]        B = [ 3  1  0 ]
        [ 4  5  2 ]            [ 1  4  2 ]
        [ 1  1  1 ]            [ 0  2  3 ]

determine the eigenvalues and eigenvectors of matrix A by the conventional method and of matrix B by the Jacobi method with 3 iterations ONLY.

(2) Given a matrix

    C = [ 3  1  2 ]
        [ 1  0  5 ]
        [ 1  1  4 ]

(a) Determine the eigenvalues and eigenvectors of C by the conventional method. (b) Compute the algebraic multiplicity and geometric multiplicity of all eigenvalues. (c) Is C a diagonalizable matrix?
