
Linear Algebra

Final exam - Groups 19/59-29-39

Questions (0.5 points each, 2 points total)


Q1: Let A be an m × n matrix with p pivot columns. Compute dim(Col(A)),
dim(Nul(A)) and dim(Row(A)).
The dimension of the column space equals the number of pivot columns, so
dim(Col(A)) = p; the dimension of the null space of A equals the number of
free variables, that is, the number of non-pivot columns, so dim(Nul(A)) = n − p;
the dimension of the row space of A equals the number of pivot columns of A^T,
which is the same as the number of pivot columns of A, so dim(Row(A)) = p.
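As an illustration of these dimension counts, they can be checked numerically on a concrete example; in the sketch below the matrix is arbitrary and chosen only for the check.

```python
import numpy as np

# Arbitrary 3x5 example: the third row is the sum of the first two,
# so the matrix has p = 2 pivot columns.
A = np.array([[1., 2., 0., 1., 3.],
              [0., 0., 1., 4., 2.],
              [1., 2., 1., 5., 5.]])

m, n = A.shape
p = np.linalg.matrix_rank(A)                       # number of pivot columns
print("dim Col(A) =", p)                           # -> 2
print("dim Nul(A) =", n - p)                       # -> 3 free variables
print("dim Row(A) =", np.linalg.matrix_rank(A.T))  # -> 2, the same as p
```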
Are the following statements true or false? Justify your answers: if true,
briefly cite appropriate facts or theorems; if false, explain why or give a
counterexample.
Q2: If v_1, ..., v_4 are linearly independent vectors in R^4, then {v_1, v_2, v_3}
is also linearly independent.
True. In fact, if the equation x_1 v_1 + x_2 v_2 + x_3 v_3 = 0 had a nontrivial
solution, then so would the equation x_1 v_1 + x_2 v_2 + x_3 v_3 + 0 v_4 = 0. But that
cannot happen because {v_1, ..., v_4} is a linearly independent set. Therefore the
set {v_1, v_2, v_3} is linearly independent.
Q3: If A is an invertible n × n matrix, then the equation Ax = b is
consistent for each b in R^n.
True. This follows from Theorem 5 in Section 2.2 of Lay's book: if A is invertible,
then for each b in R^n the equation Ax = b has the (unique) solution x = A^{-1} b.
Q4: If the distance from u to v equals the distance from u to −v, then
u and v are orthogonal.
True. The squared distance from u to v is

\|u - v\|^2 = (u - v) \cdot (u - v) = \|u\|^2 + \|v\|^2 - 2\, u \cdot v

while the squared distance from u to −v is

\|u + v\|^2 = (u + v) \cdot (u + v) = \|u\|^2 + \|v\|^2 + 2\, u \cdot v

If the distances are equal, then so are the squared distances. Then

\|u\|^2 + \|v\|^2 - 2\, u \cdot v = \|u\|^2 + \|v\|^2 + 2\, u \cdot v

which gives

u \cdot v = 0

and so u and v are orthogonal.
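A quick numerical illustration, with two arbitrarily chosen orthogonal vectors:

```python
import numpy as np

u = np.array([3., -1., 2.])
v = np.array([2., 4., -1.])              # u . v = 6 - 4 - 2 = 0, so u and v are orthogonal

dist_to_v = np.linalg.norm(u - v)        # distance from u to v
dist_to_minus_v = np.linalg.norm(u + v)  # distance from u to -v
print(np.isclose(dist_to_v, dist_to_minus_v))  # -> True
print(np.isclose(u @ v, 0.0))                  # -> True
```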
Problems (1 point each, 8 points total)
P1: Let T : R^3 → R^3 with T(x) = Ax and

A = \begin{pmatrix} 1 & 2 & 4 \\ -1 & 3 & 3 \\ 2 & 0 & 3 \end{pmatrix}

Find Range(T) and discuss whether T is onto, one-to-one, both or none of
them.
To find Range(T) is to find the set of vectors b in R^3 such that the
equation Ax = b has a solution. Row reducing the augmented matrix of the system gives

\begin{pmatrix} 1 & 2 & 4 & b_1 \\ -1 & 3 & 3 & b_2 \\ 2 & 0 & 3 & b_3 \end{pmatrix}
\sim
\begin{pmatrix} 1 & 2 & 4 & b_1 \\ 0 & 5 & 7 & b_1 + b_2 \\ 0 & 0 & 3 & -6 b_1 + 4 b_2 + 5 b_3 \end{pmatrix}

There is a pivot in every row, so the system is consistent for every b and Range(T) = R^3.
Since the range is the whole codomain, the transformation is by definition onto R^3, and
since there is also a pivot in every column the solution is unique, so the transformation
is also one-to-one.
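The same conclusion can be checked numerically with the matrix A written above: rank 3 means a pivot in every row (onto) and in every column (one-to-one).

```python
import numpy as np

A = np.array([[ 1., 2., 4.],
              [-1., 3., 3.],
              [ 2., 0., 3.]])

print(np.linalg.matrix_rank(A))   # -> 3, so T is onto and one-to-one
print(np.linalg.det(A))           # -> 3.0 (nonzero, up to rounding)
```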
P2: Suppose A, B and X are n × n invertible matrices, and also A − AX
is invertible. Consider the equation

(A - AX)^{-1} = X^{-1} B

Solve for X. If you need to invert a matrix, explain why that matrix is
invertible.

Multiplying both sides on the left by (A − AX), we get

I = (A - AX) X^{-1} B

Multiplying both sides on the right by B^{-1} X, we get

B^{-1} X = A - AX

Taking X to the left-hand side and pulling out the common factor, we get

(B^{-1} + A) X = A

which means that A is written as a product of two matrices. Then, since A is
invertible, so is the product (B^{-1} + A)X; since X is known to be invertible, by the
theorem about the inverse of a product, (B^{-1} + A) is invertible as well. Then,
multiplying both sides on the left by (B^{-1} + A)^{-1}, we get the solution

X = (B^{-1} + A)^{-1} A
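The formula can be sanity-checked with randomly generated matrices; this is only a sketch, assuming the random A, B (and the resulting A − AX) come out invertible, which happens with probability 1.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
A = rng.normal(size=(n, n))   # generic random matrices are (almost surely) invertible
B = rng.normal(size=(n, n))

X = np.linalg.inv(np.linalg.inv(B) + A) @ A   # X = (B^-1 + A)^-1 A
lhs = np.linalg.inv(A - A @ X)                # (A - AX)^-1
rhs = np.linalg.inv(X) @ B                    # X^-1 B
print(np.allclose(lhs, rhs))                  # -> True
```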
P3: Assume that the matrix A is row equivalent to B, where

A = \begin{pmatrix} 1 & 0 & -3 & 0 \\ 2 & 1 & -8 & 0 \\ 4 & 1 & -14 & 0 \\ 2 & 1 & -7 & 1 \end{pmatrix}
\qquad
B = \begin{pmatrix} 1 & 0 & 0 & 3 \\ 0 & 1 & 0 & 2 \\ 0 & 0 & 1 & 1 \\ 0 & 0 & 0 & 0 \end{pmatrix}

Without calculations, list rank(A) and dim(Nul(A)). Then, find bases for
Col(A) and Nul(A).
The rank of A is the dimension of the column space of A, which is equal
to the number of pivot columns of A, and this can be read directly from the
matrix B. So, rank(A) = 3. The dimension of the null space is equal to the
number of free variables, so dim(Nul(A)) = 1. A basis for Col(A) is
given by the pivot columns of A (the first three columns):

B_{\mathrm{Col}(A)} = \left\{
\begin{pmatrix} 1 \\ 2 \\ 4 \\ 2 \end{pmatrix},
\begin{pmatrix} 0 \\ 1 \\ 1 \\ 1 \end{pmatrix},
\begin{pmatrix} -3 \\ -8 \\ -14 \\ -7 \end{pmatrix}
\right\}

For Nul(A), we need the reduced echelon form, but B is already in reduced
echelon form and therefore we can read the solution of Ax = 0 directly: x_4 is free and

\begin{pmatrix} x_1 \\ x_2 \\ x_3 \\ x_4 \end{pmatrix}
= x_4 \begin{pmatrix} -3 \\ -2 \\ -1 \\ 1 \end{pmatrix}

Then, a basis for Nul(A) is given by

B_{\mathrm{Nul}(A)} = \left\{ \begin{pmatrix} -3 \\ -2 \\ -1 \\ 1 \end{pmatrix} \right\}
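Both answers can be verified numerically with the matrix A written above:

```python
import numpy as np

A = np.array([[1., 0.,  -3., 0.],
              [2., 1.,  -8., 0.],
              [4., 1., -14., 0.],
              [2., 1.,  -7., 1.]])

print(np.linalg.matrix_rank(A))        # -> 3, so dim Nul(A) = 4 - 3 = 1
v = np.array([-3., -2., -1., 1.])      # proposed basis vector for Nul(A)
print(np.allclose(A @ v, 0.0))         # -> True, so v is indeed in Nul(A)
```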
P4: Consider the following set

W = \left\{ \begin{pmatrix} x \\ y \end{pmatrix} : xy \ge 0 \right\}

Show that W is NOT a subspace of R^2.
We can show that by finding two vectors in W such that their sum is not
in W; for example, u = (2, 3)^T and v = (-1, -7)^T are both in W, but their sum
z = u + v = (1, -4)^T is not in W, since 1 \cdot (-4) < 0. Then, W is not closed under
vector addition, and therefore is not a subspace of R^2.
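The counterexample is easy to check numerically:

```python
import numpy as np

def in_W(p):
    """Membership test for W = {(x, y) : x*y >= 0}."""
    return p[0] * p[1] >= 0

u = np.array([ 2.,  3.])
v = np.array([-1., -7.])
print(in_W(u), in_W(v), in_W(u + v))   # -> True True False: W is not closed under addition
```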
P5: Find the eigenvalues of the following matrix

A = \begin{pmatrix} 1 & 3 & 3 \\ 3 & 1 & 3 \\ 3 & 3 & 1 \end{pmatrix}

Discuss whether it is diagonalizable or not WITHOUT computing the eigenvectors.
Expanding det(A − λI) along the first row, we get

(1 - \lambda)\left[(1 - \lambda)^2 - 9\right] - 3\left[3(1 - \lambda) - 9\right] + 3\left[9 - 3(1 - \lambda)\right] = 0

The first term can be factored as a difference of squares, while the second
and third terms are the same except for a sign. Then, we get

(1 - \lambda)\left[(1 - \lambda) + 3\right]\left[(1 - \lambda) - 3\right] - 6\left[3(1 - \lambda) - 9\right] = 0

and pulling out the common factor (1 - \lambda) - 3, we get

\left[(1 - \lambda) - 3\right]\left\{(1 - \lambda)\left[(1 - \lambda) + 3\right] - 18\right\} = 0

That is,

(-2 - \lambda)\left[\lambda^2 - 5\lambda - 14\right] = 0

from which we read that one solution is \lambda_1 = -2; the quadratic equation
has solutions \lambda_2 = -2 and \lambda_3 = 7, so the eigenvalues are -2
with multiplicity 2 and 7 with multiplicity one. The matrix is diagonalizable
because it is symmetric.
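The eigenvalues can be confirmed numerically; since A is symmetric, NumPy's routine for symmetric matrices applies:

```python
import numpy as np

A = np.array([[1., 3., 3.],
              [3., 1., 3.],
              [3., 3., 1.]])

print(np.linalg.eigvalsh(A))   # -> [-2. -2.  7.]: eigenvalue -2 twice and 7 once
```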
P6: The eigenvalues of the matrix

A = \begin{pmatrix} -1 & 0 & 1 \\ 0 & 1 & 1 \\ 0 & 0 & 2 \end{pmatrix}

are -1, 1 and 2. Compute its eigenvectors and discuss whether A is diagonalizable or not.
The eigenvectors are obtained by solving, for each eigenvalue \lambda, the homogeneous
system (A - \lambda I)x = 0, finding

v_{\lambda=-1} = \begin{pmatrix} 1 \\ 0 \\ 0 \end{pmatrix} \qquad
v_{\lambda=1} = \begin{pmatrix} 0 \\ 1 \\ 0 \end{pmatrix} \qquad
v_{\lambda=2} = \begin{pmatrix} 1 \\ 3 \\ 3 \end{pmatrix}

Then, A is diagonalizable because the dimension of each eigenspace is equal
to the algebraic multiplicity of the corresponding eigenvalue. Also, it was
straightforward to see that it is diagonalizable because A is an n × n matrix
with n distinct eigenvalues.
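The eigenvectors and the diagonalization can be confirmed numerically with the matrix A written above:

```python
import numpy as np

A = np.array([[-1., 0., 1.],
              [ 0., 1., 1.],
              [ 0., 0., 2.]])

w, V = np.linalg.eig(A)        # eigenvalues -1, 1, 2 (possibly in a different order)
print(w)
# A is diagonalizable: with P collecting the eigenvectors, P^-1 A P is diagonal.
P = V
print(np.allclose(np.linalg.inv(P) @ A @ P, np.diag(w)))   # -> True
```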
P7: Find the matrix Q corresponding to the QR factorization of

A = \begin{pmatrix} 1 & -1 & 2 \\ 0 & 2 & 6 \\ 1 & 3 & 2 \end{pmatrix}

We apply the Gram-Schmidt orthogonalization procedure (we refer to the
columns of A as x_1, x_2, x_3):

v_1 = x_1 = \begin{pmatrix} 1 \\ 0 \\ 1 \end{pmatrix}

v_2 = x_2 - \frac{x_2 \cdot v_1}{v_1 \cdot v_1} v_1 = \begin{pmatrix} -2 \\ 2 \\ 2 \end{pmatrix}

v_3 = x_3 - \frac{x_3 \cdot v_1}{v_1 \cdot v_1} v_1 - \frac{x_3 \cdot v_2}{v_2 \cdot v_2} v_2 = \begin{pmatrix} 2 \\ 4 \\ -2 \end{pmatrix}

Normalizing the vectors so obtained, we can form Q:

Q = \begin{pmatrix} 1/\sqrt{2} & -1/\sqrt{3} & 1/\sqrt{6} \\ 0 & 1/\sqrt{3} & 2/\sqrt{6} \\ 1/\sqrt{2} & 1/\sqrt{3} & -1/\sqrt{6} \end{pmatrix}
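The result can be compared with NumPy's built-in QR factorization; the columns of the computed Q may differ from the hand computation by a sign, which is the usual non-uniqueness of the factorization.

```python
import numpy as np

A = np.array([[1., -1., 2.],
              [0.,  2., 6.],
              [1.,  3., 2.]])

Q, R = np.linalg.qr(A)
print(Q)                                 # columns agree with the Gram-Schmidt result up to sign
print(np.allclose(Q @ R, A))             # -> True
print(np.allclose(Q.T @ Q, np.eye(3)))   # -> True: the columns of Q are orthonormal
```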
P8: Given

A = \begin{pmatrix} 1 & 2 \\ -1 & 2 \\ 1 & 2 \end{pmatrix}
\qquad
b = \begin{pmatrix} -1 \\ 0 \\ 2 \end{pmatrix}

find a least-squares solution of Ax = b.

We compute

A^T A = \begin{pmatrix} 3 & 2 \\ 2 & 12 \end{pmatrix}
\qquad
A^T b = \begin{pmatrix} 1 \\ 2 \end{pmatrix}

and with this we solve the normal equations A^T A \hat{x} = A^T b to obtain

\hat{x} = \begin{pmatrix} 1/4 \\ 1/8 \end{pmatrix}
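The least-squares solution can be checked numerically, either through the normal equations or with NumPy's built-in least-squares routine:

```python
import numpy as np

A = np.array([[ 1., 2.],
              [-1., 2.],
              [ 1., 2.]])
b = np.array([-1., 0., 2.])

x_hat = np.linalg.solve(A.T @ A, A.T @ b)      # normal equations A^T A x = A^T b
print(x_hat)                                   # -> [0.25  0.125], i.e. (1/4, 1/8)
print(np.linalg.lstsq(A, b, rcond=None)[0])    # same result from the built-in routine
```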
