
Transition matrices

Suppose now that S = {v_1, v_2, …, v_n} and T = {w_1, w_2, …, w_n} are bases for the n-dimensional vector space V. We shall examine the relationship between the coordinate vectors [u]_S and [u]_T of a vector u in V with respect to the bases S and T respectively.

Let the coordinate vector of w_j with respect to S be denoted [w_j]_S. Then the n × n matrix P whose j-th column is [w_j]_S is called the transition matrix from the T basis to the S basis.

(It is easier to memorize it in the form T = SP, where T is a row vector (matrix) with components w_1, …, w_n and S is a row vector (matrix) with components v_1, …, v_n.)

Then [u]_S = P[u]_T. Indeed, since u = S[u]_S = T[u]_T, one has S[u]_S = (SP)[u]_T = S(P[u]_T), that is [u]_S = P[u]_T.

Example 1. Let S = {v_1, v_2} and T = {w_1, w_2} be bases for R^2. Compute the transition matrix P from the T basis to the S basis.

Solution. Express each w_j as a linear combination of v_1 and v_2; the coefficients form the j-th column of P. In matrix form, T = SP, that is P = [ [w_1]_S  [w_2]_S ].
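The computation above can be sketched numerically. The bases below are illustrative assumptions, not the vectors from the original example; the identity T = SP gives P = S⁻¹T, and the check confirms [u]_S = P[u]_T.

```python
import numpy as np

# Hypothetical bases for R^2 (assumed for illustration).
v1, v2 = np.array([1.0, 0.0]), np.array([1.0, 1.0])   # basis S
w1, w2 = np.array([2.0, 1.0]), np.array([0.0, 1.0])   # basis T

S = np.column_stack([v1, v2])   # columns are the S-basis vectors
T = np.column_stack([w1, w2])   # columns are the T-basis vectors

# T = S P  =>  P = S^{-1} T ; column j of P is [w_j]_S.
P = np.linalg.solve(S, T)

# Check the change-of-coordinates identity [u]_S = P [u]_T for some u.
u_T = np.array([3.0, -1.0])       # coordinates of u relative to T
u = T @ u_T                       # the vector u itself
u_S = np.linalg.solve(S, u)       # coordinates of u relative to S
assert np.allclose(u_S, P @ u_T)
```

Solving the linear system S·x = w_j column by column is exactly "express w_j in terms of v_1, v_2"; `np.linalg.solve(S, T)` does all columns at once.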

Orthogonal basis

Definition. Let V be any real vector space. An inner product (dot product, scalar product) on V is a function that assigns to each ordered pair of vectors u, v in V a real number denoted by (u, v), satisfying:
a) (u, u) ≥ 0, and (u, u) = 0 if and only if u = 0;
b) (u, v) = (v, u) for any u, v in V;
c) (u + v, w) = (u, w) + (v, w) for any u, v, w in V;
d) (cu, v) = c(u, v) for any u, v in V and any real scalar c.

Example 1. An example of the inner product on R^n is (u, v) = u_1 v_1 + u_2 v_2 + ⋯ + u_n v_n, where u = (u_1, …, u_n) and v = (v_1, …, v_n).
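The four axioms can be checked numerically for the standard dot product; the vectors below are arbitrary test values chosen for illustration.

```python
import numpy as np

# Standard dot product on R^n, written out as in the definition above.
def inner(u, v):
    return float(np.dot(u, v))   # u1*v1 + u2*v2 + ... + un*vn

u = np.array([1.0, 2.0, -1.0])
v = np.array([3.0, 0.0, 4.0])
w = np.array([-2.0, 1.0, 5.0])
c = 2.5

assert inner(u, u) >= 0 and inner(np.zeros(3), np.zeros(3)) == 0  # axiom a)
assert inner(u, v) == inner(v, u)                                 # axiom b)
assert np.isclose(inner(u + v, w), inner(u, w) + inner(v, w))     # axiom c)
assert np.isclose(inner(c * u, v), c * inner(u, v))               # axiom d)
```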

Definition. A set S = {u_1, u_2, …, u_k} of vectors from R^n is called orthogonal if (u_i, u_j) = 0 for i ≠ j.

An orthonormal set of vectors is an orthogonal set of unit vectors. That is, S = {u_1, …, u_k} in R^n is called orthonormal if (u_i, u_j) = 0 for i ≠ j and (u_i, u_i) = 1 for all i.

Example 2. Let x_1, x_2, x_3 be vectors in R^3 satisfying (x_i, x_j) = 0 for i ≠ j. Then {x_1, x_2, x_3} is an orthogonal set in R^3, because each pair of distinct vectors has inner product zero. The vectors x_i/||x_i|| are unit vectors, because each has norm 1. So {x_1/||x_1||, x_2/||x_2||, x_3/||x_3||} is an orthonormal set.
Example 2. The natural basis {e_1, e_2, …, e_n} is an orthonormal set in R^n.
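Orthonormality of a set of vectors amounts to one matrix identity: if the vectors are the columns of V, the set is orthonormal exactly when VᵀV = I. A minimal sketch (the vectors x_1, x_2 are assumptions for illustration):

```python
import numpy as np

# A set of vectors is orthonormal iff V^T V = I, where the vectors
# are the columns of V: off-diagonal entries are (ui, uj), diagonal (ui, ui).
def is_orthonormal(vectors):
    V = np.column_stack(vectors)
    return np.allclose(V.T @ V, np.eye(len(vectors)))

e1, e2, e3 = np.eye(3)                 # the natural basis of R^3
print(is_orthonormal([e1, e2, e3]))    # True

x1 = np.array([1.0, 1.0, 0.0])         # orthogonal, but not unit vectors
x2 = np.array([1.0, -1.0, 0.0])
print(is_orthonormal([x1, x2]))        # False
print(is_orthonormal([x1 / np.linalg.norm(x1),
                      x2 / np.linalg.norm(x2)]))  # True after normalizing
```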

Theorem 1. Let S = {u_1, u_2, …, u_k} be an orthogonal set of nonzero vectors in R^n. Then S is linearly independent.

Example 3. The vectors of the natural basis form an orthonormal set in R^n. We know that these vectors are linearly independent, as Theorem 1 requires.

Corollary. An orthonormal set of vectors in R^n is linearly independent.

Definition. An orthogonal (orthonormal) basis for a vector space is a basis that is an orthogonal (orthonormal) set.
Theorem 2. Let S = {u_1, u_2, …, u_n} be an orthonormal basis for R^n and let u be any vector in R^n. Then u = c_1 u_1 + c_2 u_2 + ⋯ + c_n u_n, where c_i = (u, u_i).

Example 4. Let S = {u_1, u_2, u_3} be an orthonormal basis for R^3. Write the vector u as a linear combination of the vectors in S.

Solution. u = c_1 u_1 + c_2 u_2 + c_3 u_3, where c_i = (u, u_i). Thus each coefficient is obtained by a single inner product; no linear system has to be solved.
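Theorem 2 can be sketched numerically. The orthonormal basis of R^2 below is an assumption chosen for illustration; each coordinate is one inner product, and the linear combination reassembles u exactly.

```python
import numpy as np

# A hypothetical orthonormal basis for R^2 (assumed for illustration).
s = 1.0 / np.sqrt(2.0)
u1 = np.array([s, s])
u2 = np.array([s, -s])

u = np.array([3.0, 1.0])

# Theorem 2: the coordinates are ci = (u, ui) -- one dot product each.
c1, c2 = np.dot(u, u1), np.dot(u, u2)

# u is recovered as c1*u1 + c2*u2; no linear system is solved.
assert np.allclose(c1 * u1 + c2 * u2, u)
print(c1, c2)
```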

Theorem 3. (Gram-Schmidt Process) Let W be a nonzero subspace of R^n with basis S = {u_1, u_2, …, u_m}. Then there exists an orthogonal basis T = {v_1, v_2, …, v_m} for W.

The Gram-Schmidt process for computing an orthogonal basis T = {v_1, …, v_m} for a nonzero subspace W of R^n with basis S = {u_1, …, u_m}:

Step 1. Let v_1 = u_1.
Step 2. Compute the vectors v_i (i = 2, 3, …, m) by the formula
v_i = u_i − Σ_{k=1}^{i−1} ((u_i, v_k)/(v_k, v_k)) v_k.
The set of vectors T = {v_1, …, v_m} is an orthogonal set.
Step 3. Let w_i = v_i/||v_i||. Then {w_1, …, w_m} is an orthonormal basis for W.

Example 5. Consider the basis S = {u_1, u_2} for a subspace W of R^3. Use the Gram-Schmidt process to transform S to an orthonormal basis for W.

Solution.
Step 1. Let v_1 = u_1.
Step 2. v_2 = u_2 − ((u_2, v_1)/(v_1, v_1)) v_1.
Step 3. Normalize: w_1 = v_1/||v_1||, w_2 = v_2/||v_2||. Then {w_1, w_2} is an orthonormal basis for W.
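Steps 1-3 above translate directly into code. A minimal sketch; the input vectors u_1, u_2 are assumptions, not the ones from the original example:

```python
import numpy as np

# Gram-Schmidt process as in Steps 1-3 above.
# Returns an orthonormal basis for the span of the input basis vectors.
def gram_schmidt(basis):
    orthogonal = []
    for ui in basis:
        vi = ui.astype(float)
        for vk in orthogonal:              # Step 2: subtract projections
            vi = vi - (np.dot(ui, vk) / np.dot(vk, vk)) * vk
        orthogonal.append(vi)              # Step 1 is the i = 1 case: v1 = u1
    # Step 3: normalize to obtain an orthonormal basis.
    return [v / np.linalg.norm(v) for v in orthogonal]

u1 = np.array([1.0, 1.0, 1.0])   # hypothetical basis for a subspace of R^3
u2 = np.array([0.0, 1.0, 1.0])
w1, w2 = gram_schmidt([u1, u2])

assert np.isclose(np.dot(w1, w2), 0.0)       # orthogonal
assert np.isclose(np.linalg.norm(w1), 1.0)   # unit length
```

Note that the classical formula subtracts projections onto the already-computed v_k; since those are mutually orthogonal, using u_i or the partially reduced v_i in the numerator gives the same result in exact arithmetic.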

Theorem 4. If A = [u_1 u_2 … u_n] is an m × n matrix with linearly independent columns, then A can be factored as A = QR, where Q is an m × n matrix whose columns form an orthonormal basis for the column space of A and R is an n × n nonsingular upper triangular matrix.

The procedure for finding the QR factorization of an m × n matrix A with linearly independent columns is as follows.

Step 1. Let the columns of A be denoted u_1, u_2, …, u_n, and let W be the subspace of R^m with these vectors as basis.
Step 2. Transform the basis {u_1, …, u_n} to an orthonormal basis {w_1, …, w_n} for W by using the Gram-Schmidt process. Let Q = [w_1 w_2 … w_n] be the matrix whose columns are w_1, …, w_n.
Step 3. Compute R = Q^T A. You can memorize it in matrix form as:
1. R = ((u_i · w_j)_{i,j=1,2,…,n})^T, where i stands for the row number and j stands for the column number; or
2. R = W^T U, where W is the matrix with columns w_1, …, w_n, T stands for transpose, and U is the matrix with columns u_1, …, u_n.

Example 6. Let A = [u_1 u_2]. Compute the QR factorization of the matrix A.

Step 1. Let u_1, u_2 be the columns of A. Then {u_1, u_2} is a basis for the column space of A.
Step 2. Apply the Gram-Schmidt process to {u_1, u_2} to obtain the orthonormal basis {w_1, w_2}, and let Q = [w_1 w_2].
Step 3. Compute R = Q^T A, which is upper triangular.
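The three steps can be sketched for a concrete matrix. The matrix A below is an illustrative assumption; Gram-Schmidt on its columns gives Q, and R = QᵀA comes out upper triangular because each w_i is orthogonal to the earlier columns u_j (j < i).

```python
import numpy as np

# QR factorization following Steps 1-3; A is assumed for illustration.
A = np.array([[1.0, 1.0],
              [1.0, 0.0],
              [0.0, 1.0]])

# Steps 1-2: orthonormalize the columns u1, u2 of A (Gram-Schmidt).
Q = np.zeros_like(A)
for j in range(A.shape[1]):
    v = A[:, j].copy()
    for k in range(j):
        v -= np.dot(A[:, j], Q[:, k]) * Q[:, k]  # subtract projections
    Q[:, j] = v / np.linalg.norm(v)              # normalize

# Step 3: R = Q^T A (equivalently r_ij = (w_i, u_j)).
R = Q.T @ A

assert np.allclose(Q @ R, A)             # A = QR
assert np.allclose(Q.T @ Q, np.eye(2))   # orthonormal columns
```

For production work, `numpy.linalg.qr` computes the same factorization by a numerically more stable method (Householder reflections) than classical Gram-Schmidt.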

Definition. Let W be a subspace of R^n. A vector u in R^n is said to be orthogonal to W if it is orthogonal to every vector in W. The set of all vectors in R^n that are orthogonal to all vectors in W is called the orthogonal complement of W in R^n and is denoted by W^⊥.
Theorem 5. Let W be a subspace of R^n. Then:
a) W^⊥ is a subspace of R^n;
b) W ∩ W^⊥ = {0}.

Example 7. Let W be a subspace of R^n with basis {w_1, w_2}. Find a basis for W^⊥.

Solution. Let u be a vector in W^⊥. Then (u, w_1) = 0 and (u, w_2) = 0. Thus we have a homogeneous linear system whose unknowns are the components of u.

Hence the vectors spanning the solution space of this system span W^⊥ and form a basis for W^⊥.

Theorem 6. Every vector u from R^n can be written uniquely as u = w + v, where w is in W and v is in W^⊥.
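The method of Example 7 can be sketched numerically: u ∈ W^⊥ exactly when Bu = 0, where the rows of B are the basis vectors of W, so a basis for W^⊥ is a basis for the null space of B. The vectors w_1, w_2 below are assumptions for illustration.

```python
import numpy as np

# Hypothetical basis for a subspace W of R^4 (assumed for illustration).
w1 = np.array([1.0, 0.0, 1.0, 0.0])
w2 = np.array([0.0, 1.0, 0.0, 1.0])
B = np.vstack([w1, w2])   # (u, w1) = 0 and (u, w2) = 0  <=>  B u = 0

# The null space of B is W-perp; read an orthonormal basis for it off
# the SVD (valid here because the rows of B are linearly independent).
_, s, Vt = np.linalg.svd(B)
null_basis = Vt[len(s):]   # the last n - rank(B) right singular vectors

for u in null_basis:
    assert np.isclose(np.dot(u, w1), 0.0)   # orthogonal to every basis
    assert np.isclose(np.dot(u, w2), 0.0)   # vector of W, hence to W

# dim W-perp = n - dim W, consistent with Theorem 6 (R^4 = W + W-perp).
assert null_basis.shape == (2, 4)
```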
