tr(BA) = (b_{11}a_{11} + b_{12}a_{21} + ... + b_{1n}a_{n1})
       + (b_{21}a_{12} + b_{22}a_{22} + ... + b_{2n}a_{n2})
       + ...
       + (b_{n1}a_{1n} + b_{n2}a_{2n} + ... + b_{nn}a_{nn})
Comparing the terms in the first column sum of tr(AB) with the first row sum of tr(BA), we see the
same entries. Similarly, the second column sum of tr(AB) equals the second row sum of tr(BA),
and so on. Hence tr(AB) = tr(BA).
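The identity tr(AB) = tr(BA) is easy to spot-check numerically. A minimal pure-Python sketch (the matrices A and B are arbitrary illustrative choices):

```python
# Verify tr(AB) = tr(BA) for two arbitrary square matrices.

def mat_mul(X, Y):
    """Multiply two square matrices given as lists of rows."""
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def trace(X):
    """Sum of the diagonal entries."""
    return sum(X[i][i] for i in range(len(X)))

A = [[1, 2, 3],
     [4, 5, 6],
     [7, 8, 10]]
B = [[2, 0, 1],
     [1, 3, 5],
     [4, 1, 2]]

print(trace(mat_mul(A, B)), trace(mat_mul(B, A)))  # prints: 104 104
```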
Lemma:
If A is an n × n matrix and B and C are invertible n × n matrices, then Col(AB) = Col(A) and
Row(CA) = Row(A), and thus Rank(AB) = Rank(A) = Rank(CA).
Proof:
Denote the columns of A by ~c1, ~c2, ..., ~cn and let ~x be the j-th column of B.
Then column j of AB is A~x = x1~c1 + x2~c2 + ... + xn~cn ∈ Col(A).
Therefore Col(AB) ⊆ Col(A). Conversely, since B is invertible we can write A = (AB)B^{-1}, so by
the same argument Col(A) ⊆ Col(AB), and hence Col(AB) = Col(A).
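The rank statement can be spot-checked numerically. A minimal sketch (matrices chosen for illustration; rank is computed by exact Gaussian elimination over fractions):

```python
from fractions import Fraction

def rank(M):
    """Rank via Gaussian elimination with exact fractions."""
    M = [[Fraction(x) for x in row] for row in M]
    rows, cols = len(M), len(M[0])
    r = 0
    for c in range(cols):
        # find a pivot in column c at or below row r
        pivot = next((i for i in range(r, rows) if M[i][c] != 0), None)
        if pivot is None:
            continue
        M[r], M[pivot] = M[pivot], M[r]
        for i in range(rows):
            if i != r and M[i][c] != 0:
                f = M[i][c] / M[r][c]
                M[i] = [a - f * b for a, b in zip(M[i], M[r])]
        r += 1
    return r

def mat_mul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

A = [[1, 2, 3],
     [2, 4, 6],    # row 2 is twice row 1, so rank(A) = 2
     [0, 1, 1]]
B = [[1, 1, 0],
     [0, 1, 1],    # invertible (det = 2)
     [1, 0, 1]]

print(rank(A), rank(mat_mul(A, B)), rank(mat_mul(B, A)))  # prints: 2 2 2
```

Multiplying by the invertible B on either side leaves the rank at 2, as the lemma predicts.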
If B = P^{-1}AP for some invertible matrix P, then

cB(λ) = det(λI − B)
      = det(λI − P^{-1}AP)
      = det(P^{-1}(λI)P − P^{-1}AP)
      = det(P^{-1}(λI − A)P)
      = det(P^{-1}) det(λI − A) det(P)
      = det(λI − A) = cA(λ)
Since A and B have the same characteristic polynomial, the equation c(λ) = 0 has the same
solutions for both, hence they have the same eigenvalues.
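For 2 × 2 matrices the characteristic polynomial is λ² − tr(M)λ + det(M), so checking that similarity preserves the trace and determinant checks the whole polynomial. A minimal sketch (A and P are arbitrary illustrative choices):

```python
from fractions import Fraction

def mat_mul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

A = [[4, 1], [2, 3]]
P = [[1, 1], [0, 1]]                      # any invertible matrix

# inverse of the 2x2 matrix P
(a, b), (c, d) = P
detP = Fraction(a * d - b * c)
P_inv = [[d / detP, -b / detP], [-c / detP, a / detP]]

B = mat_mul(mat_mul(P_inv, A), P)         # B = P^{-1} A P

tr = lambda M: M[0][0] + M[1][1]
det = lambda M: M[0][0] * M[1][1] - M[0][1] * M[1][0]
print(tr(A), det(A))   # prints: 7 10
print(tr(B), det(B))   # prints: 7 10  -> same characteristic polynomial
```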
Recall: Eigenvectors ~x of an n × n matrix A corresponding to an eigenvalue λ are the non-trivial
solutions to (λI − A)~x = ~0. If we include the trivial solution, we will have a subspace called the
eigenspace of A corresponding to λ, denoted E_λ. We can see that we have a subspace since it is
simply the null space of the matrix (λI − A).
Recall that A is diagonalizable iff there exists an invertible matrix P such that P^{-1}AP = D, a
diagonal matrix. We found P by placing the basic eigenvectors as columns of P. In order for P to
be invertible, we now know these vectors must be linearly independent and in fact form a basis for
R^n. This gives us the following result:
An n × n matrix A is diagonalizable iff A has n linearly independent eigenvectors. Furthermore,
these eigenvectors form a basis for R^n.
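A small worked check of this result (the matrix is an illustrative choice): A = [[4, 1], [2, 3]] has eigenvalues 5 and 2 with eigenvectors (1, 1) and (1, −2), so placing them in the columns of P diagonalizes A:

```python
from fractions import Fraction

def mat_mul(X, Y):
    n = len(X)
    return [[sum(X[i][k] * Y[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

A = [[4, 1], [2, 3]]       # eigenvalues 5 and 2
P = [[1, 1], [1, -2]]      # columns: eigenvectors for 5 and for 2

# inverse of the 2x2 matrix P
(a, b), (c, d) = P
detP = Fraction(a * d - b * c)
P_inv = [[d / detP, -b / detP], [-c / detP, a / detP]]

D = mat_mul(mat_mul(P_inv, A), P)
# D is the diagonal matrix diag(5, 2), with the eigenvalues in the
# same order as the eigenvector columns of P
```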
We know that the basic eigenvectors corresponding to a particular eigenvalue will be independent.
But what about the eigenvectors corresponding to different eigenvalues?
Theorem:
Let {~v1, ~v2, ..., ~vk} be eigenvectors corresponding to distinct eigenvalues λ1, λ2, ..., λk of an n × n
matrix A. Then {~v1, ~v2, ..., ~vk} is linearly independent.
Proof:
We will prove this indirectly, by contradiction.
Assume the set {~v1 , ~v2 , . . . , ~vk } is linearly dependent.
Then at least one of these vectors can be written as a linear combination of the others.
Let ~vp+1 be the first vector in the set that makes the set linearly dependent.
That is, in the set {~v1 , ~v2 , . . . , ~vp , ~vp+1 , . . . , ~vk }, the first p vectors are independent, and ~vp+1 is in the
span{~v1 , ~v2 , . . . , ~vp }.
Thus,
~vp+1 = c1~v1 + c2~v2 + ... + cp~vp.    (1)
Multiplying (1) by A and using A~vi = λi~vi gives
λp+1~vp+1 = c1λ1~v1 + c2λ2~v2 + ... + cpλp~vp.    (2)
Multiplying (1) by λp+1 and subtracting the result from (2) gives
~0 = c1(λ1 − λp+1)~v1 + c2(λ2 − λp+1)~v2 + ... + cp(λp − λp+1)~vp.
Since ~v1, ..., ~vp are linearly independent, each coefficient ci(λi − λp+1) must be zero, and since the
eigenvalues are distinct, λi − λp+1 ≠ 0, so ci = 0 for all i. But then (1) says ~vp+1 = ~0, contradicting
the fact that eigenvectors are non-zero. Therefore the set is linearly independent.
Example:
Suppose
A = [  5   8   16
       4   1    8
      -4  -4  -11 ]
and
B = [  2   1   1
       2   1  -2
      -1   0  -2 ].
Which matrix is diagonalizable?
Solution:
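A numerical sketch of the check, assuming the matrices above read A = [[5, 8, 16], [4, 1, 8], [−4, −4, −11]] and B = [[2, 1, 1], [2, 1, −2], [−1, 0, −2]]; under that assumption the characteristic polynomials factor as (λ − 1)(λ + 3)² and (λ − 3)(λ + 1)², so each matrix has a repeated eigenvalue, and diagonalizability hinges on the dimension of that eigenspace:

```python
from fractions import Fraction

def rank(M):
    """Rank via Gaussian elimination with exact fractions."""
    M = [[Fraction(x) for x in row] for row in M]
    rows, cols = len(M), len(M[0])
    r = 0
    for c in range(cols):
        pivot = next((i for i in range(r, rows) if M[i][c] != 0), None)
        if pivot is None:
            continue
        M[r], M[pivot] = M[pivot], M[r]
        for i in range(rows):
            if i != r and M[i][c] != 0:
                f = M[i][c] / M[r][c]
                M[i] = [a - f * b for a, b in zip(M[i], M[r])]
        r += 1
    return r

def shifted(M, lam):
    """Return M - lam*I."""
    n = len(M)
    return [[M[i][j] - (lam if i == j else 0) for j in range(n)]
            for i in range(n)]

A = [[5, 8, 16], [4, 1, 8], [-4, -4, -11]]   # assumed entries; eigenvalues 1, -3, -3
B = [[2, 1, 1], [2, 1, -2], [-1, 0, -2]]     # assumed entries; eigenvalues 3, -1, -1

# dim E_lambda = n - rank(A - lambda*I)
print(3 - rank(shifted(A, -3)))   # prints: 2 -> with E_1, three independent eigenvectors
print(3 - rank(shifted(B, -1)))   # prints: 1 -> only two independent eigenvectors in total
```

Under these assumed entries, A is diagonalizable while B is not.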
If we can write A = QDQ^{-1}, where Q is orthogonal, then the columns of Q form an orthonormal
basis for R^n. (Place the eigenvectors in the columns of Q as usual.)
Definition:
An n × n matrix A is orthogonally diagonalizable if there exists an orthogonal matrix Q such
that A = QDQ^{-1} or, equivalently, A = QDQ^T, where D is a diagonal matrix containing the eigenvalues of A.
Theorem:
A is orthogonally diagonalizable if and only if A is symmetric.
Proof:
We will prove this theorem in one direction.
Assume A is orthogonally diagonalizable.
Then we can write A = QDQT .
Then A^T = (QDQ^T)^T = (Q^T)^T D^T Q^T = QDQ^T = A, since D^T = D for a diagonal matrix D.
Thus, A is symmetric.
In fact, this theorem holds in the other direction as well. That is, if A is symmetric, then it is
orthogonally diagonalizable. This direction is shown in the text.
Theorem:
If A is a symmetric n × n matrix, then the eigenvectors corresponding to distinct eigenvalues of A are
orthogonal.
Proof:
First we note the following: if ~x and ~y are column vectors, then ~x · ~y = ~x^T ~y.
Let ~v1 and ~v2 be eigenvectors corresponding to distinct eigenvalues λ1 and λ2, respectively.
Suppose A~v1 = λ1~v1 and A~v2 = λ2~v2, where λ1 ≠ λ2.
On one hand,
A~v1 · ~v2 = λ1~v1 · ~v2 = λ1(~v1 · ~v2).
On the other hand,
A~v1 · ~v2 = (A~v1)^T ~v2
           = ~v1^T A^T ~v2
           = ~v1^T A~v2        (since A^T = A)
           = ~v1 · A~v2
           = ~v1 · λ2~v2
           = λ2(~v1 · ~v2).
Setting these two expressions for A~v1 · ~v2 equal to each other, we have λ1(~v1 · ~v2) = λ2(~v1 · ~v2), or
(λ1 − λ2)(~v1 · ~v2) = 0, and since λ1 ≠ λ2, we have ~v1 · ~v2 = 0 and so these eigenvectors are orthogonal.
Thus, if A is symmetric and has n distinct eigenvalues, the set of n corresponding eigenvectors will
be orthogonal.
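A quick check of this orthogonality on a small symmetric matrix (an illustrative choice; A = [[2, 1], [1, 2]] has eigenvalues 3 and 1):

```python
A = [[2, 1], [1, 2]]   # symmetric, eigenvalues 3 and 1
v1 = [1, 1]            # eigenvector for eigenvalue 3
v2 = [1, -1]           # eigenvector for eigenvalue 1

def mat_vec(M, v):
    return [sum(m * x for m, x in zip(row, v)) for row in M]

def dot(x, y):
    return sum(a * b for a, b in zip(x, y))

assert mat_vec(A, v1) == [3 * x for x in v1]   # A v1 = 3 v1
assert mat_vec(A, v2) == [1 * x for x in v2]   # A v2 = 1 v2
print(dot(v1, v2))   # prints: 0 -> orthogonal
```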
Orthogonally diagonalize
A = [  8  -2   2
      -2   5   4
       2   4   5 ].
Solution:
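A sketch of the verification, assuming the matrix above reads A = [[8, −2, 2], [−2, 5, 4], [2, 4, 5]], whose eigenvalues are 0, 9, 9. The two eigenvectors listed below for λ = 9 were chosen orthogonal to each other (e.g. via Gram–Schmidt within that eigenspace):

```python
A = [[8, -2, 2], [-2, 5, 4], [2, 4, 5]]   # assumed entries; eigenvalues 0, 9, 9

def mat_vec(M, v):
    return [sum(m * x for m, x in zip(row, v)) for row in M]

def dot(x, y):
    return sum(a * b for a, b in zip(x, y))

# mutually orthogonal eigenvectors (normalize them to get the columns of Q)
eig = [(0, [1, 2, -2]), (9, [-2, 1, 0]), (9, [2, 4, 5])]

for lam, v in eig:
    assert mat_vec(A, v) == [lam * x for x in v]   # A v = lambda v

vs = [v for _, v in eig]
print([dot(vs[i], vs[j]) for i in range(3) for j in range(i + 1, 3)])
# prints: [0, 0, 0] -> pairwise orthogonal
```

Normalizing these three vectors (lengths 3, √5, and √45) gives the columns of an orthogonal Q with Q^T A Q = diag(0, 9, 9).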