E, F orthonormal bases of V
1
Questions: Is the change-of-basis matrix $P$ from $F$ to $E$ orthogonal?

Answer: $P$ is orthogonal.
2
Reason:

1. $\langle \vec{x}, \vec{y} \rangle = \langle [\vec{x}]_E, [\vec{y}]_E \rangle = [\vec{x}]_E \cdot [\vec{y}]_E$

2. $P = \big[\,[\vec{u}_1]_E \;\cdots\; [\vec{u}_n]_E\,\big]$, where $F = \{\vec{u}_1, \ldots, \vec{u}_n\}$

3. $\vec{u}_1, \ldots, \vec{u}_n$ orthonormal $\Rightarrow$ $[\vec{u}_1]_E, \ldots, [\vec{u}_n]_E$ orthonormal

4. $P$ orthogonal $\Rightarrow$ $P^{-1}$ orthogonal
3
To see 1 on the preceding slide, let $E = \{\vec{u}_1, \ldots, \vec{u}_n\}$,

$$[\vec{x}]_E = \begin{pmatrix} a_1 \\ \vdots \\ a_n \end{pmatrix}, \qquad [\vec{y}]_E = \begin{pmatrix} b_1 \\ \vdots \\ b_n \end{pmatrix}$$

Then

$$\begin{aligned}
\langle \vec{x}, \vec{y} \rangle
&= \Big\langle \sum_{i=1}^{n} a_i \vec{u}_i, \; \sum_{j=1}^{n} b_j \vec{u}_j \Big\rangle \\
&= \sum_{i,j} a_i b_j \langle \vec{u}_i, \vec{u}_j \rangle \\
&= \sum_{i} a_i b_i \langle \vec{u}_i, \vec{u}_i \rangle \\
&= \sum_{i} a_i b_i \\
&= \begin{pmatrix} a_1 \\ \vdots \\ a_n \end{pmatrix} \cdot \begin{pmatrix} b_1 \\ \vdots \\ b_n \end{pmatrix}
\end{aligned}$$
4
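The identity in step 1 is easy to sanity-check numerically. A minimal sketch in $\mathbb{R}^2$ with the dot product; the orthonormal basis and the test vectors here are made up for illustration:

```python
import math

# An orthonormal basis E of R^2 (illustrative choice)
s = 1 / math.sqrt(2)
e1, e2 = (s, s), (s, -s)

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

x, y = (2.0, 3.0), (-1.0, 4.0)

# Coordinates w.r.t. E: the i-th coordinate is the inner product with e_i
x_E = (dot(x, e1), dot(x, e2))
y_E = (dot(y, e1), dot(y, e2))

# <x, y> equals [x]_E . [y]_E
assert abs(dot(x, y) - dot(x_E, y_E)) < 1e-12
```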
Rule of thumb
5
Example
Extend an orthonormal set {~ uk } in V to
u1 , . . . , ~
an orthonormal basis of V .
Strategy:
6
Blast from the past:
7
How to make the extended basis of $\mathbb{R}^n$ orthonormal?
8
Example

$V = P_2$

$$\langle f, g \rangle = \int_{-1}^{1} f(x)\, g(x)\, dx$$

Problem 1. Extend $\sqrt{3/8}\,(1 + x)$, $\sqrt{1/8}\,(1 - 3x)$ to an orthonormal basis of $P_2$.

Problem 2. Find the orthogonal projection of $x$ onto $\mathrm{span}\{1 + x^2,\; x + x^2\}$.
9
To use the coordinatization approach, we need an orthonormal basis of $P_2$. Here's one:

$$E = \left\{ \sqrt{1/2},\;\; \sqrt{3/2}\, x,\;\; \sqrt{5/8}\,(3x^2 - 1) \right\}$$
10
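Orthonormality of both the given pair and of $E$ can be checked by integrating products of polynomials. A small sketch; polynomials are represented as coefficient lists, and the helper `ip` (our name) uses $\int_{-1}^{1} x^n\,dx = 2/(n+1)$ for even $n$ and $0$ for odd $n$:

```python
import math

def ip(p, q):
    # <p, q> = integral_{-1}^{1} p(x) q(x) dx; polys as coefficient lists
    return sum(a * b * 2.0 / (i + j + 1)
               for i, a in enumerate(p) for j, b in enumerate(q)
               if (i + j) % 2 == 0)

f = [math.sqrt(3 / 8)] * 2                          # sqrt(3/8)(1 + x)
g = [math.sqrt(1 / 8), -3 * math.sqrt(1 / 8)]       # sqrt(1/8)(1 - 3x)
E = [[1 / math.sqrt(2)],                            # sqrt(1/2)
     [0.0, math.sqrt(3 / 2)],                       # sqrt(3/2) x
     [-math.sqrt(5 / 8), 0.0, 3 * math.sqrt(5 / 8)]]  # sqrt(5/8)(3x^2 - 1)

# the given pair is orthonormal ...
assert abs(ip(f, f) - 1) < 1e-12
assert abs(ip(g, g) - 1) < 1e-12
assert abs(ip(f, g)) < 1e-12
# ... and so is E
for i, p in enumerate(E):
    for j, q in enumerate(E):
        assert abs(ip(p, q) - (1.0 if i == j else 0.0)) < 1e-12
```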
For example, the 2nd coordinate of $[g]_E$ is

$$\begin{aligned}
\langle g, u_2 \rangle &= \int_{-1}^{1} g(x)\, u_2(x)\, dx \\
&= \int_{-1}^{1} \sqrt{1/8}\,(1 - 3x) \cdot \sqrt{3/2}\, x \, dx \\
&= \sqrt{3/16} \int_{-1}^{1} (x - 3x^2)\, dx \\
&= \sqrt{3/16}\,(-3)\cdot\frac{2}{3} \\
&= -\frac{\sqrt{3}}{2}
\end{aligned}$$
11
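The same setup recovers the full coordinate vectors. A sketch checking the computation above along with the remaining coordinates of $[f]_E$ and $[g]_E$ (the helper `ip` is our name, not from the slides):

```python
import math

def ip(p, q):
    # <p, q> = integral_{-1}^{1} p(x) q(x) dx; polys as coefficient lists
    return sum(a * b * 2.0 / (i + j + 1)
               for i, a in enumerate(p) for j, b in enumerate(q)
               if (i + j) % 2 == 0)

f = [math.sqrt(3 / 8)] * 2                          # sqrt(3/8)(1 + x)
g = [math.sqrt(1 / 8), -3 * math.sqrt(1 / 8)]       # sqrt(1/8)(1 - 3x)
E = [[1 / math.sqrt(2)],
     [0.0, math.sqrt(3 / 2)],
     [-math.sqrt(5 / 8), 0.0, 3 * math.sqrt(5 / 8)]]

f_E = [ip(f, u) for u in E]
g_E = [ip(g, u) for u in E]

r3 = math.sqrt(3)
for got, want in zip(f_E, [r3 / 2, 1 / 2, 0.0]):
    assert abs(got - want) < 1e-12
for got, want in zip(g_E, [1 / 2, -r3 / 2, 0.0]):
    assert abs(got - want) < 1e-12
```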
Next, let

$$A = \big[\, [f]_E \;\; [g]_E \;\; I \,\big] = \begin{pmatrix} \sqrt{3}/2 & 1/2 & 1 & 0 & 0 \\ 1/2 & -\sqrt{3}/2 & 0 & 1 & 0 \\ 0 & 0 & 0 & 0 & 1 \end{pmatrix}$$

leading columns: 1, 2, 5, so take the corresponding columns of $A$:

$$\begin{pmatrix} \sqrt{3}/2 \\ 1/2 \\ 0 \end{pmatrix}, \qquad \begin{pmatrix} 1/2 \\ -\sqrt{3}/2 \\ 0 \end{pmatrix}, \qquad \begin{pmatrix} 0 \\ 0 \\ 1 \end{pmatrix}$$
13
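Here the three extracted columns happen to be orthonormal already: the first two are the coordinate vectors of an orthonormal pair, and $(0,0,1)^t$ is orthogonal to both. A quick float check:

```python
import math

r3 = math.sqrt(3)
cols = [[r3 / 2, 1 / 2, 0.0],
        [1 / 2, -r3 / 2, 0.0],
        [0.0, 0.0, 1.0]]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

# pairwise inner products form the identity pattern
for i in range(3):
    for j in range(3):
        assert abs(dot(cols[i], cols[j]) - (1.0 if i == j else 0.0)) < 1e-12
```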
Actually, there is an alternative approach to the above problem: put together the given orthonormal set $\{f, g\}$ with any basis (for example, the standard one $\{1, x, x^2\}$), and apply Gram-Schmidt to the set $\{f, g, 1, x, x^2\}$.
14
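This alternative can be sketched as Gram-Schmidt with pruning: any vector whose residual is (numerically) zero already lies in the span of the vectors kept so far and is dropped. A minimal sketch, assuming float coefficient lists and a small tolerance; the helpers `ip` and `minus` are our names:

```python
import math

def ip(p, q):
    # <p, q> = integral_{-1}^{1} p(x) q(x) dx; polys as coefficient lists
    return sum(a * b * 2.0 / (i + j + 1)
               for i, a in enumerate(p) for j, b in enumerate(q)
               if (i + j) % 2 == 0)

def minus(p, q, c):
    # p - c*q, padding coefficient lists to a common length
    n = max(len(p), len(q))
    p = p + [0.0] * (n - len(p))
    q = q + [0.0] * (n - len(q))
    return [a - c * b for a, b in zip(p, q)]

c, d = math.sqrt(3 / 8), math.sqrt(1 / 8)
vecs = [[c, c], [d, -3 * d], [1.0], [0.0, 1.0], [0.0, 0.0, 1.0]]  # f, g, 1, x, x^2

basis = []
for v in vecs:
    for b in basis:
        v = minus(v, b, ip(v, b))     # remove components along kept vectors
    n = math.sqrt(ip(v, v))
    if n > 1e-8:                      # discard vectors already in the span
        basis.append([a / n for a in v])

# 1 and x are already in span{f, g}, so only three vectors survive
assert len(basis) == 3
for i, p in enumerate(basis):
    for j, q in enumerate(basis):
        assert abs(ip(p, q) - (1.0 if i == j else 0.0)) < 1e-9
```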
Anyway, the end result is that this alternative approach has both pros and cons: it avoids the coordinatization (and the need for an orthonormal basis of $P_2$ to begin with), but on the other hand it involves a significant amount of algebraic manipulation with vectors in $P_2$, as opposed to (the more easily manageable) vectors in $\mathbb{R}^3$. And it wouldn't really involve much less work. However, it's certainly a reasonable approach.
15
For Problem 2, again we start by coordinatizing: compute the inner product of each of the vectors $x$, $1 + x^2$, $x + x^2$ with each vector $u_1, u_2, u_3$ in our orthonormal basis to get the coordinate vectors

$$[x]_E = \begin{pmatrix} 0 \\ \sqrt{6}/3 \\ 0 \end{pmatrix}, \qquad [1 + x^2]_E = \begin{pmatrix} 4\sqrt{2}/3 \\ 0 \\ 2\sqrt{10}/15 \end{pmatrix}, \qquad [x + x^2]_E = \begin{pmatrix} \sqrt{2}/3 \\ \sqrt{6}/3 \\ 2\sqrt{10}/15 \end{pmatrix}$$

Call these vectors $\vec{u}$, $\vec{v}_1$, $\vec{v}_2$.
16
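The three coordinate vectors can be verified the same way. A sketch, with polynomials as coefficient lists and floats throughout (`ip` and `coords` are our helper names):

```python
import math

def ip(p, q):
    # <p, q> = integral_{-1}^{1} p(x) q(x) dx; polys as coefficient lists
    return sum(a * b * 2.0 / (i + j + 1)
               for i, a in enumerate(p) for j, b in enumerate(q)
               if (i + j) % 2 == 0)

E = [[1 / math.sqrt(2)],
     [0.0, math.sqrt(3 / 2)],
     [-math.sqrt(5 / 8), 0.0, 3 * math.sqrt(5 / 8)]]
coords = lambda p: [ip(p, u) for u in E]

u  = coords([0.0, 1.0])        # x
v1 = coords([1.0, 0.0, 1.0])   # 1 + x^2
v2 = coords([0.0, 1.0, 1.0])   # x + x^2

expected = [
    (u,  [0.0, math.sqrt(6) / 3, 0.0]),
    (v1, [4 * math.sqrt(2) / 3, 0.0, 2 * math.sqrt(10) / 15]),
    (v2, [math.sqrt(2) / 3, math.sqrt(6) / 3, 2 * math.sqrt(10) / 15]),
]
for got, want in expected:
    assert all(abs(a - b) < 1e-12 for a, b in zip(got, want))
```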
The associated problem in $\mathbb{R}^3$ is to find the orthogonal projection of $\vec{u}$ onto the subspace $W$ spanned by $\vec{v}_1, \vec{v}_2$.
18
Confession: this time it really would have been less work to solve the problem directly in the inner product space $P_2$. With $p_1 = 1 + x^2$ and $p_2 = x + x^2$:

$$q_1 := p_1$$

$$q_2 := p_2 - \frac{\langle p_2, q_1 \rangle}{\langle q_1, q_1 \rangle}\, q_1 = \frac{5}{7}x^2 + x - \frac{2}{7}$$

Next use the formula for the orthogonal projection:

$$\frac{\langle x, q_1 \rangle}{\langle q_1, q_1 \rangle}\, q_1 + \frac{\langle x, q_2 \rangle}{\langle q_2, q_2 \rangle}\, q_2 = \frac{5}{8}x^2 + \frac{7}{8}x - \frac{1}{4}$$

(again all done on computer)
19
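Since everything in this direct computation is rational, it can be checked exactly with `fractions.Fraction`. A sketch of the Gram-Schmidt step and the projection (`ip` is our helper name):

```python
from fractions import Fraction as F

def ip(p, q):
    # exact <p, q> = integral_{-1}^{1} p(x) q(x) dx, rational coefficients
    return sum(F(a) * F(b) * F(2, i + j + 1)
               for i, a in enumerate(p) for j, b in enumerate(q)
               if (i + j) % 2 == 0)

p1 = [F(1), F(0), F(1)]   # 1 + x^2
p2 = [F(0), F(1), F(1)]   # x + x^2

q1 = p1
c = ip(p2, q1) / ip(q1, q1)                  # = 2/7
q2 = [b - c * a for a, b in zip(q1, p2)]
assert q2 == [F(-2, 7), F(1), F(5, 7)]       # (5/7)x^2 + x - 2/7

x = [F(0), F(1)]
c1 = ip(x, q1) / ip(q1, q1)                  # = 0: x is orthogonal to q1
c2 = ip(x, q2) / ip(q2, q2)                  # = 7/8
proj = [c1 * a + c2 * b for a, b in zip(q1, q2)]
assert proj == [F(-1, 4), F(7, 8), F(5, 8)]  # (5/8)x^2 + (7/8)x - 1/4
```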
Conclusion: sometimes it is definitely better to work directly in the inner product space rather than coordinatizing relative to an orthonormal basis.
20
Least Squares and Normal Equations

$A \in M_{mn}$

$W = \mathrm{Col}\, A$, a subspace of $\mathbb{R}^m$

$\vec{b} \in \mathbb{R}^m$

$\vec{c} := \mathrm{proj}_W \vec{b}$, the approximation to $\vec{b}$ from $W$

$\vec{c} = A\vec{x}$ for some $\vec{x} \in \mathbb{R}^n$
21
Consider the equation $A\vec{x} = \vec{b}$ as a system which we would like to solve. If the system is consistent, we know what to do. But what if the system is inconsistent?
22
Here's how we solve the least squares problem without using an orthogonal basis of $W$:

Thus $A\vec{x} = \vec{c}$ if and only if $\vec{b} - A\vec{x} \perp W$, that is, if and only if for every $\vec{y} \in \mathbb{R}^n$ we have

$$(A\vec{y}) \cdot (\vec{b} - A\vec{x}) = 0.$$
23
Therefore $\vec{x}$ is a least squares solution of the system $A\vec{x} = \vec{b}$ if and only if it is a solution of the normal equations:

$$A^t A \vec{x} = A^t \vec{b}.$$

We've seen that even if the original system $A\vec{x} = \vec{b}$ might be inconsistent, the normal equations $A^t A \vec{x} = A^t \vec{b}$ are always consistent.
24
Example

$$A = \begin{pmatrix} 1 & 1 \\ 2 & 0 \\ 2 & 1 \end{pmatrix}, \qquad \vec{b} = \begin{pmatrix} 1 \\ 1 \\ 1 \end{pmatrix}$$

$A\vec{x} = \vec{b}$ is inconsistent:

$$\big[\, A \;\; \vec{b} \,\big] \xrightarrow{\text{reduce}} \begin{pmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{pmatrix}$$

$$A^t = \begin{pmatrix} 1 & 2 & 2 \\ 1 & 0 & 1 \end{pmatrix}, \qquad A^t A = \begin{pmatrix} 9 & 3 \\ 3 & 2 \end{pmatrix}, \qquad A^t \vec{b} = \begin{pmatrix} 5 \\ 2 \end{pmatrix}$$
25
normal equations:

$$\begin{pmatrix} 9 & 3 \\ 3 & 2 \end{pmatrix} \vec{x} = \begin{pmatrix} 5 \\ 2 \end{pmatrix}$$

Gauss-Jordan elimination:

$$\begin{pmatrix} 9 & 3 & 5 \\ 3 & 2 & 2 \end{pmatrix} \xrightarrow{\text{reduce}} \begin{pmatrix} 1 & 0 & 4/9 \\ 0 & 1 & 1/3 \end{pmatrix}$$

So:

$$\vec{x} = \begin{pmatrix} 4/9 \\ 1/3 \end{pmatrix}$$
26
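This example can be checked end to end with exact arithmetic. A sketch with `fractions.Fraction` that forms $A^t A$ and $A^t \vec{b}$, solves the $2 \times 2$ system by Cramer's rule, and confirms the residual is orthogonal to $\mathrm{Col}\, A$:

```python
from fractions import Fraction as F

A = [[F(1), F(1)], [F(2), F(0)], [F(2), F(1)]]
b = [F(1), F(1), F(1)]
At = [list(col) for col in zip(*A)]

AtA = [[sum(r[k] * A[k][j] for k in range(3)) for j in range(2)] for r in At]
Atb = [sum(r[k] * b[k] for k in range(3)) for r in At]
assert AtA == [[9, 3], [3, 2]] and Atb == [5, 2]

# Solve the normal equations AtA x = Atb by Cramer's rule
det = AtA[0][0] * AtA[1][1] - AtA[0][1] * AtA[1][0]   # = 9
x = [(Atb[0] * AtA[1][1] - AtA[0][1] * Atb[1]) / det,
     (AtA[0][0] * Atb[1] - AtA[1][0] * Atb[0]) / det]
assert x == [F(4, 9), F(1, 3)]

# The residual b - Ax is orthogonal to every column of A
r = [b[i] - sum(A[i][j] * x[j] for j in range(2)) for i in range(3)]
assert all(sum(At[i][k] * r[k] for k in range(3)) == 0 for i in range(2))
```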
More on extending independent sets to bases:
28
By the general theory,

$$(\mathrm{Col}\, A)^\perp = \mathrm{Null}\, A^t$$

So, we solve the homogeneous problem $A^t \vec{x} = \vec{0}$:

$$A^t = \begin{pmatrix} 1 & 1 & 1 & 0 \\ 1 & 1 & -2 & -1 \end{pmatrix} \xrightarrow{\text{reduce}} \begin{pmatrix} 1 & 1 & 0 & -1/3 \\ 0 & 0 & 1 & 1/3 \end{pmatrix}$$

solution space:

$$\begin{pmatrix} -x_2 + (1/3)x_4 \\ x_2 \\ -(1/3)x_4 \\ x_4 \end{pmatrix} = x_2 \begin{pmatrix} -1 \\ 1 \\ 0 \\ 0 \end{pmatrix} + x_4 \begin{pmatrix} 1/3 \\ 0 \\ -1/3 \\ 1 \end{pmatrix}$$

So: these two vectors form a basis of $W = (\mathrm{Col}\, A)^\perp$.
29
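The two spanning vectors can be checked directly against $A^t$; a small sketch, assuming $A^t = \begin{psmallmatrix} 1 & 1 & 1 & 0 \\ 1 & 1 & -2 & -1 \end{psmallmatrix}$ as in this example:

```python
from fractions import Fraction as F

At = [[1, 1, 1, 0], [1, 1, -2, -1]]
v1 = [-1, 1, 0, 0]                   # x2 = 1, x4 = 0
v2 = [F(1, 3), 0, F(-1, 3), 1]       # x2 = 0, x4 = 1

# both vectors lie in Null A^t
for row in At:
    assert sum(a * c for a, c in zip(row, v1)) == 0
    assert sum(a * c for a, c in zip(row, v2)) == 0
```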
So to finish we need to apply Gram-Schmidt to this latter basis (then normalize). But first we scale the 2nd vector, getting a basis $\{\vec{u}_1, \vec{u}_2\}$ of $W$:

$$\vec{u}_1 = \begin{pmatrix} -1 \\ 1 \\ 0 \\ 0 \end{pmatrix}, \qquad \vec{u}_2 = \begin{pmatrix} 1 \\ 0 \\ -1 \\ 3 \end{pmatrix}$$

Gram-Schmidt produces an orthogonal basis $\{\vec{v}_1, \vec{v}_2\}$, where

$$\vec{v}_1 = \vec{u}_1 = \begin{pmatrix} -1 \\ 1 \\ 0 \\ 0 \end{pmatrix}, \qquad \vec{v}_2 = \vec{u}_2 - \frac{\vec{u}_2 \cdot \vec{v}_1}{\vec{v}_1 \cdot \vec{v}_1}\, \vec{v}_1 = \begin{pmatrix} 1/2 \\ 1/2 \\ -1 \\ 3 \end{pmatrix}$$
30
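The Gram-Schmidt step is quick to verify with exact fractions:

```python
from fractions import Fraction as F

u1 = [-1, 1, 0, 0]
u2 = [1, 0, -1, 3]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

c = F(dot(u2, u1), dot(u1, u1))            # = -1/2
v2 = [x - c * y for x, y in zip(u2, u1)]
assert v2 == [F(1, 2), F(1, 2), -1, 3]
assert dot(v2, u1) == 0                    # v2 is orthogonal to v1 = u1
```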
Actually, for convenience we scale the second vector $\vec{v}_2$, getting the following orthogonal basis of $W$:

$$\begin{pmatrix} -1 \\ 1 \\ 0 \\ 0 \end{pmatrix}, \qquad \begin{pmatrix} 1 \\ 1 \\ -2 \\ 6 \end{pmatrix}$$

Now we normalize:

$$\frac{1}{\sqrt{2}} \begin{pmatrix} -1 \\ 1 \\ 0 \\ 0 \end{pmatrix}, \qquad \frac{1}{\sqrt{42}} \begin{pmatrix} 1 \\ 1 \\ -2 \\ 6 \end{pmatrix}$$
31
Thus an orthonormal basis of $\mathbb{R}^4$ containing the given vectors is

$$\frac{1}{\sqrt{3}} \begin{pmatrix} 1 \\ 1 \\ 1 \\ 0 \end{pmatrix}, \qquad \frac{1}{\sqrt{7}} \begin{pmatrix} 1 \\ 1 \\ -2 \\ -1 \end{pmatrix}, \qquad \frac{1}{\sqrt{2}} \begin{pmatrix} -1 \\ 1 \\ 0 \\ 0 \end{pmatrix}, \qquad \frac{1}{\sqrt{42}} \begin{pmatrix} 1 \\ 1 \\ -2 \\ 6 \end{pmatrix}$$
32
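Orthonormality of the final basis is easiest to confirm on the unscaled vectors, whose squared norms are exactly 3, 7, 2, 42, matching the normalizing factors:

```python
import math

W = [[1, 1, 1, 0], [1, 1, -2, -1], [-1, 1, 0, 0], [1, 1, -2, 6]]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

# pairwise orthogonal ...
for i in range(4):
    for j in range(i + 1, 4):
        assert dot(W[i], W[j]) == 0
# ... with squared norms 3, 7, 2, 42
assert [dot(v, v) for v in W] == [3, 7, 2, 42]

# after scaling each vector by 1/sqrt(its squared norm), all have length 1
B = [[x / math.sqrt(dot(v, v)) for x in v] for v in W]
assert all(abs(dot(v, v) - 1) < 1e-12 for v in B)
```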