
MATH 110: LINEAR ALGEBRA

SPRING 2007/08
PROBLEM SET 4 SOLUTIONS
1. Let W₁, W₂ be subspaces of V such that V = W₁ ⊕ W₂. Let W be a subspace of V. Show that
if W₁ ⊆ W or W₂ ⊆ W, then

    W = (W ∩ W₁) ⊕ (W ∩ W₂).

Is this still true if we omit the condition W₁ ⊆ W or W₂ ⊆ W?
Solution. Without loss of generality, we may assume that W₁ ⊆ W. So W ∩ W₁ = W₁, and
we just need to show that

    W = W₁ ⊕ (W ∩ W₂).

By Problem 2(b) in Homework 2, the smallest subspace containing both W₁ and W ∩ W₂ is
W₁ + (W ∩ W₂); since W₁ ⊆ W and W ∩ W₂ ⊆ W, it follows that

    W₁ + (W ∩ W₂) ⊆ W.    (1.1)

To show the reverse inclusion, let w ∈ W. Then w ∈ V. Since V = W₁ ⊕ W₂, there exist
w₁ ∈ W₁ and w₂ ∈ W₂ such that

    w = w₁ + w₂.

Observe that

    w₂ = w − w₁ ∈ W ∩ W₂

since w₂ ∈ W₂ and w − w₁ ∈ W. Hence w = w₁ + w₂ ∈ W₁ + (W ∩ W₂). Since this is true for
arbitrary w ∈ W, we conclude that

    W ⊆ W₁ + (W ∩ W₂).    (1.2)

By (1.1) and (1.2),

    W = W₁ + (W ∩ W₂).

This sum is direct since

    (W ∩ W₁) ∩ (W ∩ W₂) ⊆ W₁ ∩ W₂ = {0}.
The statement is false without the condition W₁ ⊆ W or W₂ ⊆ W. Here is a counterexample.
Let W = {(x, y) ∈ ℝ² | x = y}, W₁ = {(x, y) ∈ ℝ² | y = 0}, and W₂ = {(x, y) ∈ ℝ² | x = 0}.
Then W₁ ⊕ W₂ = ℝ², since every (x, y) ∈ ℝ² may be written uniquely as (x, y) = (x, 0) + (0, y).
However, W ∩ W₁ = {(0, 0)} = W ∩ W₂, and so

    (W ∩ W₁) ⊕ (W ∩ W₂) = {(0, 0)} ≠ W.
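The counterexample can also be verified numerically. The sketch below (ours, not part of the original solutions; the helper `intersection_dim` is a hypothetical name) uses the identity dim(U ∩ V) = dim U + dim V − dim(U + V) for one-dimensional spans in ℝ².

```python
import numpy as np

# W = span{(1,1)}, W1 = span{(1,0)}, W2 = span{(0,1)}
w  = np.array([1.0, 1.0])
e1 = np.array([1.0, 0.0])
e2 = np.array([0.0, 1.0])

def intersection_dim(a, b):
    # dim(span{a} ∩ span{b}) = dim span{a} + dim span{b} - dim(span{a} + span{b})
    ra = np.linalg.matrix_rank(a.reshape(-1, 1))
    rb = np.linalg.matrix_rank(b.reshape(-1, 1))
    rab = np.linalg.matrix_rank(np.column_stack([a, b]))
    return ra + rb - rab

# W1 ⊕ W2 = R^2: the two axes together have rank 2
assert np.linalg.matrix_rank(np.column_stack([e1, e2])) == 2
# W ∩ W1 = {0} and W ∩ W2 = {0}, so (W ∩ W1) ⊕ (W ∩ W2) = {0} ≠ W
assert intersection_dim(w, e1) == 0
assert intersection_dim(w, e2) == 0
```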
2. For the following vector spaces V, find the coordinate representation of the respective elements.
(a) V = P₂ = {ax² + bx + c | a, b, c ∈ ℝ}. Find [p(x)]_B where

    p(x) = 2x² − 5x + 6,    B = [1, x − 1, (x − 1)²].
Solution. Since

    p(x) = 2x² − 5x + 6 = 3 − (x − 1) + 2(x − 1)²,

we have

    [p(x)]_B = [3, −1, 2]ᵀ.
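A quick numerical check of these coordinates (ours, not part of the original solution): for quadratics, agreement at three or more sample points forces equality as polynomials.

```python
import numpy as np

# Check [p]_B = (3, -1, 2) for p(x) = 2x^2 - 5x + 6 in B = [1, x-1, (x-1)^2]
coords = np.array([3.0, -1.0, 2.0])
xs = np.array([-1.0, 0.0, 2.0, 5.0])          # sample points
p = 2*xs**2 - 5*xs + 6
recon = coords[0]*1 + coords[1]*(xs - 1) + coords[2]*(xs - 1)**2
assert np.allclose(recon, p)
```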
Date: March 15, 2008 (Version 1.1).
(b) V = ℝ²ˣ². Find [A]_B where

    A = [2 3; 4 −7],
    B = ([1 1; 1 1], [0 −1; 1 0], [1 −1; 0 0], [1 0; 0 0]),

writing each 2×2 matrix [a b; c d] row by row.
Solution. Let

    [2 3; 4 −7] = α[1 1; 1 1] + β[0 −1; 1 0] + γ[1 −1; 0 0] + δ[1 0; 0 0].

Then comparing entries and solving

    α + γ + δ = 2,
    α − β − γ = 3,
    α + β = 4,
    α = −7,

gives α = −7, β = 11, γ = −21, δ = 30. So

    [A]_B = [−7, 11, −21, 30]ᵀ.
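The coordinates can be double-checked by flattening each matrix into a vector and solving the resulting 4×4 linear system (a verification sketch of ours, not part of the original solutions):

```python
import numpy as np

# Basis matrices of R^{2x2}; each is flattened row-major into a column of M
B = [np.array([[1, 1], [1, 1]]),
     np.array([[0, -1], [1, 0]]),
     np.array([[1, -1], [0, 0]]),
     np.array([[1, 0], [0, 0]])]
A = np.array([[2, 3], [4, -7]])

M = np.column_stack([b.flatten() for b in B])
coords = np.linalg.solve(M.astype(float), A.flatten().astype(float))
assert np.allclose(coords, [-7, 11, -21, 30])   # matches the solution above
```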
(c) V = ℝ². Let θ ∈ ℝ be fixed. Find [v]_B where

    v = [x; y],    B = ([cos θ; sin θ], [−sin θ; cos θ]).
Solution. Let

    [x; y] = α[cos θ; sin θ] + β[−sin θ; cos θ].

Then solving

    x = α cos θ − β sin θ,
    y = α sin θ + β cos θ,

gives

    α = x cos θ + y sin θ,
    β = −x sin θ + y cos θ.

So

    [v]_B = [x cos θ + y sin θ, −x sin θ + y cos θ]ᵀ.
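The change-of-basis matrix here is a rotation, so its inverse is its transpose, which is exactly what the closed-form coordinates say. A small numerical check (ours, with arbitrary sample values for θ, x, y):

```python
import numpy as np

theta = 0.7
x, y = 2.0, -3.0
# Columns of Q are the basis vectors of B
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
# Coordinates solve Q [alpha, beta]^T = [x, y]^T
alpha, beta = np.linalg.solve(Q, np.array([x, y]))
# Closed form from the solution (Q orthogonal, so Q^{-1} = Q^T)
assert np.isclose(alpha,  x*np.cos(theta) + y*np.sin(theta))
assert np.isclose(beta,  -x*np.sin(theta) + y*np.cos(theta))
```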
3. Let W₁ and W₂ be the following subspaces of ℝ⁴:

    W₁ = span{[1, −2, 3, 6]ᵀ, [4, 1, 3, 6]ᵀ, [5, −1, 6, 12]ᵀ},
    W₂ = span{[1, 1, 1, 1]ᵀ, [2, 1, 4, 5]ᵀ}.
(a) Find a basis of W₁ ∩ W₂.
Solution. Let the three vectors spanning W₁ be u₁, u₂, u₃ and the two vectors spanning
W₂ be v₁, v₂, respectively. If w ∈ W₁ ∩ W₂, then there exist α₁, α₂, α₃ ∈ ℝ and β₁, β₂ ∈ ℝ
such that

    α₁u₁ + α₂u₂ + α₃u₃ = w = β₁v₁ + β₂v₂.
In other words, we want to solve

    α₁[1, −2, 3, 6]ᵀ + α₂[4, 1, 3, 6]ᵀ + α₃[5, −1, 6, 12]ᵀ = β₁[1, 1, 1, 1]ᵀ + β₂[2, 1, 4, 5]ᵀ.
Forming the augmented system to solve for α₁, α₂, α₃ in terms of β₁, β₂ (alternatively, we
could also solve for β₁ and β₂ in terms of α₁, α₂, α₃), we get, upon Gauss-Jordan elimination,

    [  1   4   5 | β₁ + 2β₂ ]        [ 1  0  1 | −(1/3)β₁ − (2/9)β₂ ]
    [ −2   1  −1 | β₁ +  β₂ ]   →    [ 0  1  1 |  (1/3)β₁ + (5/9)β₂ ]
    [  3   3   6 | β₁ + 4β₂ ]        [ 0  0  0 |  β₁ + 3β₂          ]
    [  6   6  12 | β₁ + 5β₂ ]        [ 0  0  0 |  β₁ + 3β₂          ]
For this system to be consistent, we must have β₁ + 3β₂ = 0, that is, β₁ = −3β₂. Hence

    w = β₁[1, 1, 1, 1]ᵀ + β₂[2, 1, 4, 5]ᵀ = β₂[−1, −2, 1, 2]ᵀ

for some β₂ ∈ ℝ. In other words,

    W₁ ∩ W₂ = span{[−1, −2, 1, 2]ᵀ}.
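This can be confirmed with rank computations (a numerical sketch of ours, not part of the original solutions), using dim(W₁ ∩ W₂) = dim W₁ + dim W₂ − dim(W₁ + W₂):

```python
import numpy as np

# Columns are the spanning vectors of W1 and W2
U = np.array([[1, 4, 5], [-2, 1, -1], [3, 3, 6], [6, 6, 12]], dtype=float)
V = np.array([[1, 2], [1, 1], [1, 4], [1, 5]], dtype=float)
w = np.array([-1, -2, 1, 2], dtype=float)

rank = np.linalg.matrix_rank
# dim(W1 ∩ W2) = dim W1 + dim W2 - dim(W1 + W2)
dim_int = rank(U) + rank(V) - rank(np.hstack([U, V]))
assert dim_int == 1
# w lies in W1 and in W2: appending it changes neither rank
assert rank(np.column_stack([U, w])) == rank(U)
assert rank(np.column_stack([V, w])) == rank(V)
```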
(b) Find a basis of W₁ + W₂.
Solution. We will do this in (e). For an alternative way, see the solution to Homework 5,
Problem 1 from Fall 2007.
(c) Extend the basis of W₁ ∩ W₂ in (a) to get a basis of W₁.
Solution. We will apply the Adding-On Algorithm to {[−1, −2, 1, 2]ᵀ} and check each
of the three vectors u₁, u₂, u₃ in turn. Since

    [1, −2, 3, 6]ᵀ = λ[−1, −2, 1, 2]ᵀ

clearly has no solution, we add u₁ to get {[−1, −2, 1, 2]ᵀ, [1, −2, 3, 6]ᵀ}. Since

    [4, 1, 3, 6]ᵀ = λ₁[−1, −2, 1, 2]ᵀ + λ₂[1, −2, 3, 6]ᵀ

has the solution λ₁ = −9/4, λ₂ = 7/4, u₂ need not be added. Since

    [5, −1, 6, 12]ᵀ = λ₁[−1, −2, 1, 2]ᵀ + λ₂[1, −2, 3, 6]ᵀ

has the solution λ₁ = −9/4, λ₂ = 11/4, u₃ need not be added. Hence {[−1, −2, 1, 2]ᵀ, [1, −2, 3, 6]ᵀ}
is a basis for W₁ by the Adding-On Algorithm.
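A rank-based sanity check of this basis (ours, not part of the original solutions): two independent vectors that lie inside the two-dimensional space W₁ must form a basis of it.

```python
import numpy as np

U = np.array([[1, 4, 5], [-2, 1, -1], [3, 3, 6], [6, 6, 12]], dtype=float)
C = np.column_stack([[-1, -2, 1, 2], [1, -2, 3, 6]]).astype(float)

rank = np.linalg.matrix_rank
assert rank(C) == 2                              # the two vectors are independent
assert rank(np.hstack([U, C])) == rank(U) == 2   # and they lie in W1 = span(U), which has dim 2
```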
(d) Extend the basis of W₁ ∩ W₂ in (a) to get a basis of W₂.
Solution. We will apply the Adding-On Algorithm to {[−1, −2, 1, 2]ᵀ} and check each
of the two vectors v₁, v₂ in turn. Since

    [1, 1, 1, 1]ᵀ = λ[−1, −2, 1, 2]ᵀ

clearly has no solution, we add v₁ to get {[−1, −2, 1, 2]ᵀ, [1, 1, 1, 1]ᵀ}. Since

    [2, 1, 4, 5]ᵀ = λ₁[−1, −2, 1, 2]ᵀ + λ₂[1, 1, 1, 1]ᵀ

has the solution λ₁ = 1, λ₂ = 3, v₂ need not be added. Hence {[−1, −2, 1, 2]ᵀ, [1, 1, 1, 1]ᵀ} is
a basis for W₂ by the Adding-On Algorithm.
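The same rank check works for W₂ (again a numerical sketch of ours):

```python
import numpy as np

V = np.array([[1, 2], [1, 1], [1, 4], [1, 5]], dtype=float)
C = np.column_stack([[-1, -2, 1, 2], [1, 1, 1, 1]]).astype(float)

rank = np.linalg.matrix_rank
assert rank(C) == 2                              # independent
assert rank(np.hstack([V, C])) == rank(V) == 2   # both lie in W2, which has dim 2
```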
(e) From the bases in (c) and (d), obtain a basis of W₁ + W₂.
Solution. Recall from Homework 3, Problem 3(b) that

    span(S₁) + span(S₂) = span(S₁ ∪ S₂).

So from (c) and (d),

    W₁ + W₂ = span{[−1, −2, 1, 2]ᵀ, [1, −2, 3, 6]ᵀ} + span{[−1, −2, 1, 2]ᵀ, [1, 1, 1, 1]ᵀ}
            = span{[−1, −2, 1, 2]ᵀ, [1, −2, 3, 6]ᵀ, [1, 1, 1, 1]ᵀ}.

Since

    α₁[−1, −2, 1, 2]ᵀ + α₂[1, −2, 3, 6]ᵀ + α₃[1, 1, 1, 1]ᵀ = [0, 0, 0, 0]ᵀ

implies α₁ = α₂ = α₃ = 0, the vectors [−1, −2, 1, 2]ᵀ, [1, −2, 3, 6]ᵀ, [1, 1, 1, 1]ᵀ are linearly
independent. Hence {[−1, −2, 1, 2]ᵀ, [1, −2, 3, 6]ᵀ, [1, 1, 1, 1]ᵀ} is a basis for W₁ + W₂.
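Numerically (a sketch of ours, not part of the original solutions): the three vectors have rank 3, the five original spanning vectors also have rank 3, and adjoining the three to the five does not increase the rank, so they form a basis of W₁ + W₂.

```python
import numpy as np

U = np.array([[1, 4, 5], [-2, 1, -1], [3, 3, 6], [6, 6, 12]], dtype=float)
V = np.array([[1, 2], [1, 1], [1, 4], [1, 5]], dtype=float)
C = np.column_stack([[-1, -2, 1, 2], [1, -2, 3, 6], [1, 1, 1, 1]]).astype(float)

rank = np.linalg.matrix_rank
assert rank(C) == 3                       # three independent vectors
assert rank(np.hstack([U, V])) == 3       # dim(W1 + W2) = 3
assert rank(np.hstack([U, V, C])) == 3    # and they lie in W1 + W2
```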
4. Let W₁, W₂, W₃ be subspaces of a vector space V.
(a) Show that

    dim(W₁ + W₂) + dim(W₁ ∩ W₂) = dim(W₁) + dim(W₂).
Solution. Let dim(W₁ ∩ W₂) = k and let {u₁, …, uₖ} be a basis for W₁ ∩ W₂. Note that
{u₁, …, uₖ} is a linearly independent subset of W₁, and by the Adding-On Algorithm we
may add vectors v₁, …, vₘ to it so that

    {u₁, …, uₖ, v₁, …, vₘ}

is a basis for W₁. Applying the same argument to W₂, we may also add vectors
w₁, …, wₙ so that

    {u₁, …, uₖ, w₁, …, wₙ}

is a basis for W₂. Hence

    dim(W₁) = k + m,    dim(W₂) = k + n.
From Homework 3, Problem 3(b), we know that

    W₁ + W₂ = span({u₁, …, uₖ, v₁, …, vₘ} ∪ {u₁, …, uₖ, w₁, …, wₙ})
            = span{u₁, …, uₖ, v₁, …, vₘ, w₁, …, wₙ}.

We will show that {u₁, …, uₖ, v₁, …, vₘ, w₁, …, wₙ} is linearly independent. Suppose

    α₁u₁ + ⋯ + αₖuₖ + β₁v₁ + ⋯ + βₘvₘ + γ₁w₁ + ⋯ + γₙwₙ = 0.    (4.3)
We let

    v = α₁u₁ + ⋯ + αₖuₖ + β₁v₁ + ⋯ + βₘvₘ ∈ W₁.    (4.4)

Then (4.3) implies that

    v = −γ₁w₁ − ⋯ − γₙwₙ ∈ W₂.    (4.5)

So v ∈ W₁ ∩ W₂, and since this has {u₁, …, uₖ} as a basis, there exist δ₁, …, δₖ such that

    v = δ₁u₁ + ⋯ + δₖuₖ.    (4.6)
Now (4.5) and (4.6) together imply that

    δ₁u₁ + ⋯ + δₖuₖ + γ₁w₁ + ⋯ + γₙwₙ = 0;

and since {u₁, …, uₖ, w₁, …, wₙ} is linearly independent, we get

    δ₁ = ⋯ = δₖ = γ₁ = ⋯ = γₙ = 0.

Substituting γ₁ = ⋯ = γₙ = 0 into (4.3) gives

    α₁u₁ + ⋯ + αₖuₖ + β₁v₁ + ⋯ + βₘvₘ = 0;

and since {u₁, …, uₖ, v₁, …, vₘ} is linearly independent, we get

    α₁ = ⋯ = αₖ = β₁ = ⋯ = βₘ = 0.
This shows that {u₁, …, uₖ, v₁, …, vₘ, w₁, …, wₙ} is linearly independent and is
thus a basis for W₁ + W₂. Therefore dim(W₁ + W₂) = k + m + n. Finally, we have

    dim(W₁ + W₂) + dim(W₁ ∩ W₂) = (k + m + n) + k
                                = (k + m) + (k + n)
                                = dim(W₁) + dim(W₂).
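The formula can be spot-checked on random subspaces (a sketch of ours, not part of the original solutions; `null_basis` is a hypothetical helper). An explicit spanning set for W₁ ∩ W₂ comes from the null space of [A | −B], since every w in the intersection satisfies w = Ax = By.

```python
import numpy as np

rank = np.linalg.matrix_rank

def null_basis(X, tol=1e-10):
    # Orthonormal basis of the null space of X, computed via the SVD
    _, s, vh = np.linalg.svd(X)
    r = int(np.sum(s > tol))
    return vh[r:].T

rng = np.random.default_rng(7)
for _ in range(10):
    shared = rng.standard_normal((6, 2))                  # forced common directions
    A = np.hstack([shared, rng.standard_normal((6, 2))])  # columns span W1
    B = np.hstack([shared, rng.standard_normal((6, 1))])  # columns span W2
    # every w in W1 ∩ W2 is A x = B y for some (x, y) in the null space of [A | -B]
    N = null_basis(np.hstack([A, -B]))
    inter = A @ N[:A.shape[1], :]          # spanning set for W1 ∩ W2
    # dim(W1 + W2) + dim(W1 ∩ W2) = dim(W1) + dim(W2)
    assert rank(np.hstack([A, B])) + rank(inter) == rank(A) + rank(B)
```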
(b) Suppose dim(W₁ + W₂) = dim(W₁ ∩ W₂) + 1. Show that either W₁ ⊆ W₂ or W₂ ⊆ W₁.
Solution. Since

    W₁ ∩ W₂ ⊆ W₁ ⊆ W₁ + W₂,

by Theorem 3.14,

    dim(W₁ ∩ W₂) ≤ dim(W₁) ≤ dim(W₁ + W₂).

From the condition given, we must have either

    dim(W₁) = dim(W₁ ∩ W₂), in which case W₁ = W₁ ∩ W₂,    (4.7)

or

    dim(W₁) = dim(W₁ + W₂), in which case W₁ = W₁ + W₂.    (4.8)

By Problems 2(b) and 2(c) in Homework 2, case (4.7) implies that W₁ ⊆ W₂, while
case (4.8) implies W₂ ⊆ W₁.
(c) Show that

    dim(W₁ ∩ W₂ ∩ W₃) ≥ dim(W₁) + dim(W₂) + dim(W₃) − 2 dim(V).
Solution. This follows directly from applying the formula in (a) twice:

    dim(W₁ ∩ W₂ ∩ W₃) = dim(W₁) + dim(W₂ ∩ W₃) − dim(W₁ + (W₂ ∩ W₃))
                      = dim(W₁) + dim(W₂) + dim(W₃) − dim(W₂ + W₃) − dim(W₁ + (W₂ ∩ W₃))
                      ≥ dim(W₁) + dim(W₂) + dim(W₃) − 2 dim(V),

where the last inequality follows from dim(W₂ + W₃) ≤ dim(V) and
dim(W₁ + (W₂ ∩ W₃)) ≤ dim(V) (by Theorem 3.14).
5. Let V be a vector space over ℝ. We have seen in Homework 1, Problem 2 that W = V × V may
be made into a vector space over ℂ with appropriate addition and scalar multiplication; W is
called the complexification of V. Show that

    dimℂ(W) = dimℝ(V).
Solution. Let dimℝ(V) = n and let B = {e₁, e₂, …, eₙ} be a basis of V. We claim that

    B′ = {(e₁, 0), (e₂, 0), …, (eₙ, 0)}

is a basis of W, and so dimℂ(W) = n too. Let (u, v) ∈ W. Then since u ∈ V and v ∈ V, there
exist a₁, …, aₙ ∈ ℝ and b₁, …, bₙ ∈ ℝ such that

    u = a₁e₁ + ⋯ + aₙeₙ    and    v = b₁e₁ + ⋯ + bₙeₙ.

Hence

    (u, v) = (a₁e₁ + ⋯ + aₙeₙ, b₁e₁ + ⋯ + bₙeₙ)
           = (a₁e₁, b₁e₁) + ⋯ + (aₙeₙ, bₙeₙ)
           = (a₁ + b₁i) · (e₁, 0) + ⋯ + (aₙ + bₙi) · (eₙ, 0),

since

    (aⱼ + bⱼi) · (eⱼ, 0) = (aⱼeⱼ − bⱼ0, bⱼeⱼ + aⱼ0) = (aⱼeⱼ, bⱼeⱼ).
Hence B′ spans W. To show linear independence, let a₁ + b₁i, …, aₙ + bₙi ∈ ℂ be such that

    (a₁ + b₁i) · (e₁, 0) + ⋯ + (aₙ + bₙi) · (eₙ, 0) = (0, 0),

that is,

    (a₁e₁ + ⋯ + aₙeₙ, b₁e₁ + ⋯ + bₙeₙ) = (0, 0),

which is equivalent to

    a₁e₁ + ⋯ + aₙeₙ = 0    and    b₁e₁ + ⋯ + bₙeₙ = 0.

So a₁ = ⋯ = aₙ = 0 and b₁ = ⋯ = bₙ = 0 by the linear independence of e₁, …, eₙ, and so

    a₁ + b₁i = ⋯ = aₙ + bₙi = 0 + 0i.

Hence (e₁, 0), …, (eₙ, 0) are linearly independent over ℂ.
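For V = ℝⁿ, the spanning step can be demonstrated concretely. The sketch below (ours, not part of the original solutions) models W = V × V with the scalar multiplication rule (a + bi)·(u, v) = (au − bv, bu + av) used in the solution, and checks that the (eⱼ, 0) recover an arbitrary pair (a, b).

```python
import numpy as np

# Complexification of V = R^n: pairs (u, v) with (a + bi)·(u, v) = (au - bv, bu + av)
def cmul(z, w):
    a, b = z.real, z.imag
    u, v = w
    return (a*u - b*v, b*u + a*v)

n = 3
e = np.eye(n)
# (a_j + b_j i)·(e_j, 0) = (a_j e_j, b_j e_j), so {(e_j, 0)} spans W over C
a = np.array([1.0, -2.0, 0.5])
b = np.array([3.0, 0.0, -1.0])
u_part, v_part = np.zeros(n), np.zeros(n)
for j in range(n):
    pu, pv = cmul(complex(a[j], b[j]), (e[j], np.zeros(n)))
    u_part += pu
    v_part += pv
assert np.allclose(u_part, a) and np.allclose(v_part, b)
```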