Textbook problems: Ch. 3: 3.3.1, 3.3.12, 3.3.13, 3.5.4, 3.5.6, 3.5.9, 3.5.30
Chapter 3
3.3.1 Let C = AB be the product of the orthogonal matrices A and B. Since the transpose of a product reverses the order, C̃ = B̃Ã, and therefore

C̃C = B̃ÃAB = B̃(ÃA)B = B̃B = I

so C is itself orthogonal.
This is a key step in showing that the orthogonal matrices form a group, because one of the requirements for a group is that the product of any two elements (i.e. A and B) yields a result (i.e. C) that is also in the group. This property is known as closure. Along with closure, we also need associativity (automatic for matrix multiplication), the existence of an identity element (the identity matrix, which is orthogonal), and the existence of an inverse (for an orthogonal matrix, A⁻¹ = Ã, which is again orthogonal). Since all four conditions are satisfied, the set of n × n orthogonal matrices forms the orthogonal group, denoted O(n). While general orthogonal matrices have determinant ±1, the subgroup of matrices with determinant +1 forms the “special orthogonal” group SO(n).
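The closure property can also be checked numerically. The following sketch (my own illustration, not part of the solution; the helper functions and sample angles are hypothetical) multiplies two 2 × 2 rotations and verifies that the product is again orthogonal.

```python
import math

# Illustrative closure check: the product of two orthogonal 2x2 matrices
# is again orthogonal.  Helper names and sample angles are my own.

def mat_mul(X, Y):
    """2x2 matrix product."""
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def transpose(X):
    return [[X[j][i] for j in range(2)] for i in range(2)]

def is_orthogonal(X, tol=1e-12):
    """X is orthogonal iff X-transpose times X equals the identity."""
    P = mat_mul(transpose(X), X)
    return all(abs(P[i][j] - (1.0 if i == j else 0.0)) < tol
               for i in range(2) for j in range(2))

def rotation(t):
    return [[math.cos(t), -math.sin(t)], [math.sin(t), math.cos(t)]]

A, B = rotation(0.7), rotation(1.9)
C = mat_mul(A, B)
print(is_orthogonal(A), is_orthogonal(B), is_orthogonal(C))  # True True True
```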
3.3.12 Writing the general 2 × 2 matrix as

A = ( a  b )
    ( c  d )

the orthogonality condition AÃ = I yields

i) a² + b² = 1,  ii) c² + d² = 1,  iii) ac + bd = 0
These are three equations for four unknowns, so there will be a free parameter
left over. There are many ways to solve the equations. However, one nice way is
to notice that a2 + b2 = 1 is the equation for a unit circle in the a–b plane. This
means we can write a and b in terms of an angle θ
a = cos θ, b = sin θ
c = cos φ, d = sin φ
Condition iii) then requires

ac + bd = cos θ cos φ + sin θ sin φ = cos(θ − φ) = 0

so that φ = θ ± π/2 (modulo 2π). Taking φ = θ − π/2 gives c = sin θ and d = −cos θ, or

A₁ = ( cos θ   sin θ )
     ( sin θ  −cos θ )                                    (1)

This looks almost like a rotation, but not quite (since the minus sign is in the wrong place).
The other choice, φ = θ − 3π/2 (equivalent to φ = θ + π/2), gives c = −sin θ and d = cos θ, or

A₂ = ( cos θ    sin θ )
     ( −sin θ   cos θ )                                   (2)
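As a sanity check on the two families (1) and (2), the short sketch below (illustrative only; the function names are mine) verifies that both are orthogonal for an arbitrary angle, with det A₁ = −1 (a reflection) and det A₂ = +1 (a proper rotation, hence in SO(2)).

```python
import math

# Check families (1) and (2): both orthogonal; A1 has det = -1 (reflection),
# A2 has det = +1 (proper rotation).  Names and sample angle are my own.

def A1(t):
    return [[math.cos(t), math.sin(t)], [math.sin(t), -math.cos(t)]]

def A2(t):
    return [[math.cos(t), math.sin(t)], [-math.sin(t), math.cos(t)]]

def det(M):
    return M[0][0] * M[1][1] - M[0][1] * M[1][0]

def rows_orthonormal(M, tol=1e-12):
    r0, r1 = M
    return (abs(r0[0]**2 + r0[1]**2 - 1) < tol and
            abs(r1[0]**2 + r1[1]**2 - 1) < tol and
            abs(r0[0]*r1[0] + r0[1]*r1[1]) < tol)

t = 0.42  # arbitrary sample angle
print(rows_orthonormal(A1(t)), round(det(A1(t)), 9))  # True -1.0
print(rows_orthonormal(A2(t)), round(det(A2(t)), 9))  # True 1.0
```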
3.3.13 Here |x⟩ and |y⟩ are column vectors. Under an orthogonal transformation S, |x′⟩ = S|x⟩ and |y′⟩ = S|y⟩. Show that the scalar product ⟨x|y⟩ is invariant under this orthogonal transformation.
Under the transformation, ⟨x′| = ⟨x|S̃, so the primed scalar product is

⟨x′|y′⟩ = ⟨x|S̃S|y⟩ = ⟨x|y⟩

where we used S̃S = I for an orthogonal matrix S. This demonstrates that the scalar product is invariant (the same in the primed and unprimed frames).
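The invariance can be illustrated numerically; in this sketch (my own, with arbitrary sample vectors and angle) a rotation S is applied to two vectors and the dot product is compared before and after.

```python
import math

# Illustrative check of 3.3.13: applying an orthogonal S to both vectors
# leaves their scalar product unchanged.  Sample values are my own.

def apply(S, v):
    return [S[0][0]*v[0] + S[0][1]*v[1],
            S[1][0]*v[0] + S[1][1]*v[1]]

def dot(u, v):
    return u[0]*v[0] + u[1]*v[1]

t = 1.1
S = [[math.cos(t), math.sin(t)], [-math.sin(t), math.cos(t)]]  # a rotation
x, y = [2.0, -1.0], [0.5, 3.0]
xp, yp = apply(S, x), apply(S, y)
print(abs(dot(xp, yp) - dot(x, y)) < 1e-12)  # True
```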
3.5.4 Show that a real matrix that is not symmetric cannot be diagonalized by an orthogonal
similarity transformation.
We take the hint, and start by denoting the real non-symmetric matrix by A.
Assuming that A can be diagonalized by an orthogonal similarity transformation,
that means there exists an orthogonal matrix S such that

SAS̃ = Λ

where Λ is diagonal. We can ‘invert’ this relation by multiplying both sides on the left by S̃ and on the right by S. This yields

A = S̃ΛS

Taking the transpose of both sides, and noting that a diagonal matrix satisfies Λ̃ = Λ, we find

Ã = S̃Λ̃S = S̃ΛS = A

Thus A would have to be symmetric, contradicting the assumption. Hence a real matrix that is not symmetric cannot be diagonalized by an orthogonal similarity transformation.
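The logic of the proof can also be seen numerically: for any orthogonal S and diagonal Λ, the combination S̃ΛS comes out symmetric automatically, so a nonsymmetric matrix can never arise this way. The sketch below (illustrative, with sample values of my own) checks this for one case.

```python
import math

# For any orthogonal S and diagonal Lam, S-transpose * Lam * S is
# automatically symmetric.  Sample values are my own.

def mat_mul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def transpose(X):
    return [[X[j][i] for j in range(2)] for i in range(2)]

t = 0.9
S = [[math.cos(t), math.sin(t)], [-math.sin(t), math.cos(t)]]
Lam = [[3.0, 0.0], [0.0, -2.0]]
A = mat_mul(transpose(S), mat_mul(Lam, S))
print(abs(A[0][1] - A[1][0]) < 1e-12)  # True: the result is symmetric
```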
3.5.6 Starting from the eigenvalue equation

A|xᵢ⟩ = λᵢ|xᵢ⟩

we multiply both sides on the left by A⁻¹ to obtain

|xᵢ⟩ = λᵢ A⁻¹|xᵢ⟩

Rewriting this as

A⁻¹|xᵢ⟩ = λᵢ⁻¹|xᵢ⟩

it is now obvious that A⁻¹ has the same eigenvectors as A, but with eigenvalues λᵢ⁻¹.
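A concrete 2 × 2 example (the sample matrix is my own choice, not from the text) illustrates the result: the eigenvector (1, 1) of A with eigenvalue 3 is also an eigenvector of A⁻¹, with eigenvalue 1/3.

```python
# Illustration of 3.5.6 with a sample 2x2 matrix of my own: an eigenvector
# of A is also an eigenvector of A-inverse, with reciprocal eigenvalue.

A = [[2.0, 1.0], [1.0, 2.0]]  # eigenvalues 3 and 1
det_A = A[0][0]*A[1][1] - A[0][1]*A[1][0]
A_inv = [[ A[1][1]/det_A, -A[0][1]/det_A],
         [-A[1][0]/det_A,  A[0][0]/det_A]]

def apply(M, v):
    return [M[0][0]*v[0] + M[0][1]*v[1],
            M[1][0]*v[0] + M[1][1]*v[1]]

x = [1.0, 1.0]                 # eigenvector of A with eigenvalue 3
print(apply(A, x))             # [3.0, 3.0]
print(all(abs(c - 1/3) < 1e-12 for c in apply(A_inv, x)))  # True
```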
3.5.9 Two Hermitian matrices A and B have the same eigenvalues. Show that A and B are
related by a unitary similarity transformation.
Since both A and B have the same eigenvalues, they can both be diagonalized according to

Λ = UAU† ,  Λ = VBV†

with the same diagonal matrix Λ. Equating the two expressions gives

UAU† = VBV†  ⇒  B = V†UAU†V = (V†U)A(V†U)†

Since W = V†U is a product of unitary matrices and hence itself unitary, A and B are related by the unitary similarity transformation B = WAW†.
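As a numerical illustration (the sample matrices are my own), one can build B = WAW† with W unitary and confirm that A and B share the same characteristic polynomial, hence the same eigenvalues; for 2 × 2 matrices it suffices to compare trace and determinant.

```python
# Illustrative check of 3.5.9 (sample matrices are my own): with W unitary,
# B = W A W-dagger has the same trace and determinant as A, hence (for 2x2
# matrices) the same eigenvalues.

def mat_mul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def dagger(X):
    return [[X[j][i].conjugate() for j in range(2)] for i in range(2)]

A = [[2 + 0j, 1j], [-1j, 3 + 0j]]                         # Hermitian
W = [[(1 + 1j)/2, (1 - 1j)/2], [(1 - 1j)/2, (1 + 1j)/2]]  # unitary
B = mat_mul(W, mat_mul(A, dagger(W)))

tr = lambda M: M[0][0] + M[1][1]
det = lambda M: M[0][0]*M[1][1] - M[0][1]*M[1][0]
print(abs(tr(A) - tr(B)) < 1e-12, abs(det(A) - det(B)) < 1e-12)  # True True
```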
3.5.30 a) Here the matrix is

A = ( 1   ε )
    ( ε   1 )

Note that the eigenvalues are degenerate for ε = 0, but the eigenvectors are orthogonal for all ε ≠ 0 and ε → 0.

We first find the eigenvalues through the secular equation

| 1 − λ     ε   |
|   ε     1 − λ | = (1 − λ)² − ε² = 0                     (3)

so that λ± = 1 ± ε.
Since the problem did not ask to normalize the eigenvectors, we can simply take

λ₊ = 1 + ε :   |x₊⟩ = ( 1 )
                      ( 1 )

and

λ₋ = 1 − ε :   |x₋⟩ = (  1 )
                      ( −1 )
Note that the eigenvectors |x₊⟩ and |x₋⟩ are orthogonal and independent of ε. In a way, we are just lucky that they are independent of ε (they did not have to turn out that way). However, orthogonality is guaranteed so long as the eigenvalues are distinct (i.e. ε ≠ 0). This was something we proved in class.
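These part a) results are easy to confirm numerically; the sketch below (assuming a sample value ε = 0.3 of my own choosing) checks the eigenvalue equations and the orthogonality of (1, 1) and (1, −1).

```python
# Numerical check of part a), assuming the sample value eps = 0.3:
# (1, 1) and (1, -1) are eigenvectors of [[1, eps], [eps, 1]] with
# eigenvalues 1 + eps and 1 - eps, and they are orthogonal for every eps.

eps = 0.3
A = [[1.0, eps], [eps, 1.0]]

def apply(M, v):
    return [M[0][0]*v[0] + M[0][1]*v[1],
            M[1][0]*v[0] + M[1][1]*v[1]]

xp, xm = [1.0, 1.0], [1.0, -1.0]
print(apply(A, xp))                 # [1.3, 1.3] = (1 + eps) * (1, 1)
print(apply(A, xm))                 # [0.7, -0.7] = (1 - eps) * (1, -1)
print(xp[0]*xm[0] + xp[1]*xm[1])    # 0.0 -> orthogonal
```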
b) Note that the eigenvalues are degenerate for ε = 0, and for this (nonsymmetric) matrix the eigenvectors (at ε = 0) do not span the space.
In this nonsymmetric case the matrix is

A = ( 1    1 )
    ( ε²   1 )

and the secular equation is

| 1 − λ     1   |
|   ε²    1 − λ | = (1 − λ)² − ε² = 0
Interestingly enough, this equation is the same as (3), even though the matrix is different. Hence this matrix has the same eigenvalues λ₊ = 1 + ε and λ₋ = 1 − ε.
For λ₊ = 1 + ε, the eigenvector equation is

( −ε    1 ) ( a )
( ε²   −ε ) ( b ) = 0   ⇒   −εa + b = 0   ⇒   b = εa

Up to normalization, this gives

λ₊ = 1 + ε :   |x₊⟩ = ( 1 )
                      ( ε )                               (4)
For the other eigenvalue, λ₋ = 1 − ε, we find

( ε    1 ) ( a )
( ε²   ε ) ( b ) = 0   ⇒   εa + b = 0   ⇒   b = −εa

Hence, we obtain

λ₋ = 1 − ε :   |x₋⟩ = (  1 )
                      ( −ε )                              (5)
In this nonsymmetric case, the eigenvectors do depend on ε. Furthermore, when ε = 0 it is easy to see that both eigenvectors degenerate into the same vector

( 1 )
( 0 )
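The part b) eigenvectors can be checked the same way; in this sketch (again assuming a sample value ε = 0.3 of my own) the vectors (1, ε) and (1, −ε) are verified to satisfy the eigenvalue equations with eigenvalues 1 ± ε.

```python
# Numerical check of part b), assuming eps = 0.3: (1, eps) and (1, -eps) are
# eigenvectors of [[1, 1], [eps**2, 1]] with eigenvalues 1 + eps and 1 - eps.

eps = 0.3
B = [[1.0, 1.0], [eps**2, 1.0]]

def apply(M, v):
    return [M[0][0]*v[0] + M[0][1]*v[1],
            M[1][0]*v[0] + M[1][1]*v[1]]

xp, xm = [1.0, eps], [1.0, -eps]
ok_plus  = all(abs(a - (1 + eps)*b) < 1e-12 for a, b in zip(apply(B, xp), xp))
ok_minus = all(abs(a - (1 - eps)*b) < 1e-12 for a, b in zip(apply(B, xm), xm))
print(ok_plus, ok_minus)  # True True
```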
c) Find the cosine of the angle between the two eigenvectors as a function of ε for 0 ≤ ε ≤ 1.
The eigenvectors of part a) are orthogonal, so the angle between them is 90°. Thus this part really refers to the eigenvectors of part b). Recalling that the angle can be defined through the inner product, we have

⟨x₊|x₋⟩ = |x₊| |x₋| cos θ

or

cos θ = ⟨x₊|x₋⟩ / ( ⟨x₊|x₊⟩^{1/2} ⟨x₋|x₋⟩^{1/2} )
Using the eigenvectors of (4) and (5), we find

cos θ = (1 − ε²) / ( √(1 + ε²) √(1 + ε²) ) = (1 − ε²) / (1 + ε²)
Recall that the Cauchy-Schwarz inequality guarantees that cos θ lies between −1 and +1. When ε = 0 we find cos θ = 1, so the eigenvectors are collinear (and degenerate), while for ε = 1 we find instead cos θ = 0, so the eigenvectors are orthogonal.
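As a closing check (my own sketch), the formula cos θ = (1 − ε²)/(1 + ε²) can be compared against a direct inner-product computation at ε = 0, 1/2, and 1.

```python
import math

# Compare cos(theta) computed directly from the part b) eigenvectors
# (1, eps) and (1, -eps) against the closed form (1 - eps^2)/(1 + eps^2).

def cos_angle(u, v):
    dot = u[0]*v[0] + u[1]*v[1]
    return dot / (math.hypot(*u) * math.hypot(*v))

for eps in (0.0, 0.5, 1.0):
    direct  = cos_angle([1.0, eps], [1.0, -eps])
    formula = (1 - eps**2) / (1 + eps**2)
    print(eps, abs(direct - formula) < 1e-12)  # matches at each eps
```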