ISBN-13: 9781502901811
ISBN-10: 1502901811
About the Author

Mohammed Kaabar is a math tutor at the Math Learning Center (MLC) at Washington State University, Pullman, and he is interested in linear algebra, scientific computing, numerical analysis, differential equations, and several programming languages such as SQL, C#, Scala, C++, C, JavaScript, Python, HTML 5 and MATLAB. He is a member of the Institute of Electrical and Electronics Engineers (IEEE), IEEE Antennas and Propagation Society, IEEE Consultants Network, IEEE Smart Grid Community, IEEE Technical Committee on RFID, IEEE Life Sciences Community, IEEE Green ICT Community, IEEE Cloud Computing Community, IEEE Internet of Things Community, IEEE Committee on Earth Observations, IEEE Electric Vehicles Community, IEEE Electron Devices Society, IEEE Communications Society, and IEEE Computer Society. He has participated in several competitions, conferences, research papers and projects. He is an online instructor of numerical analysis at Udemy Inc, San Francisco, CA. In addition, he is a Technical Program Committee (TPC) member, reviewer and presenter at CCA-2014, WSMEAP 2014, EECSI 2014, JIEEEC 2013 and WCEEENG 2012. He worked as an electrical engineering intern at Al-Arabia for Safety and Security L.L.C., and he has received several educational awards and certificates from accredited institutions. For more information about the author and his free online courses, please visit his personal website: http://www.mohammed-kaabar.net.

Table of Contents

1 Systems of Linear Equations 1
1.1 Row Operations Method 1
1.2 Basic Algebra of Matrix 17
1.3 Linear Combinations 19
1.4 Square Matrix 29
1.5 Inverse Square Matrix 33
1.6 Transpose Matrix 42
1.7 Determinants 46
1.8 Cramer's Rule 53
1.9 Adjoint Method 55
1.10 Exercises 60
2 Vector Spaces 62
2.1 Span and Vector Space 62
2.2 The Dimension of Vector Space 65
2.3 Linear Independence 66
2.4 Subspace and Basis 69
2.5 Exercises 82
3 Homogeneous Systems 84
3.1 Null Space and Rank 84
3.2 Linear Transformation 91
3.3 Kernel and Range 100
3.4 Exercises 105
4 Characteristic Equation of Matrix 107
4.1 Eigenvalues and Eigenvectors 107
4.2 Diagonalizable Matrix 112
4.3 Exercises 115
5 Matrix Dot Product 116
5.1 The Dot Product in ℝⁿ 116
5.2 Gram-Schmidt Orthonormalization 118
5.3 Exercises 120
Answers to Odd-Numbered Exercises 121
Bibliography 125

Introduction

In this book, I wrote five chapters: Systems of Linear Equations, Vector Spaces, Homogeneous Systems, Characteristic Equation of Matrix, and Matrix Dot Product. I also added exercises at the end of each chapter to let students practice additional sets of problems beyond the examples, and they can check their solutions to some of these exercises in the "Answers to Odd-Numbered Exercises" section at the end of this book. This book is very useful for college students who have studied Calculus I, and for other students who want to review some linear algebra concepts before studying a second course in linear algebra. From my experience as a math tutor at the Math Learning Center, I have noticed that some students have difficulty understanding some linear algebra concepts in general, and vector space concepts in particular. Therefore, my purpose is to provide students with an interactive method that explains each concept and then provides different sets of examples related to that concept. If you have any comments related to the contents of this book, please email them to mohammed.kaabar@email.wsu.edu.

I wish to express my gratitude and appreciation to my father, my mother, and my brother. I would also like to give special thanks to my mathematics professor, Dr. Ayman Badawi, Professor of Mathematics & Statistics at AUS, for his brilliant efforts in revising the content of this book and for giving me permission to use his lecture notes and his online resources as a guide while writing this book. I would also like to thank all math professors at Washington State University, Pullman. Ultimately, I hope this book serves as a milestone for developing more math books that can serve our mathematical society.
Chapter 1

Systems of Linear Equations

In this chapter, we discuss how to solve n × m systems of linear equations.

Example 1.1.1 Solve for x and y in the following 2 × 2 system of linear equations: …

Now, we need to substitute the value of y into one of the two original equations. Let's substitute y into the first equation.

As we can see from the above augmented matrix, the first column represents the coefficients of x₁ in the three linear equations. The second column represents the coefficients of x₂ in the three linear equations. The third column represents the coefficients of x₃ in the three linear equations. The fourth column is not a coefficient column; it represents the constants of the three linear equations.

−2R₁ + R₂ → R₂

(This means that we multiply the first row by −2 and add it to the second row; the change will be only in the second row, and there is no change in the first row.)

Definition 1.1.2 An n × m system of linear equations has a unique solution if each variable has exactly one value.
Step 5: Move to the third row and do the same as we did in the first row and the second row of the matrix.

R₃ + R₂ → R₂

(This means that we add the third row to the second row; the change will be only in the second row, and there is no change in the third row.)

Hence, we obtain:

[ 1 0 0 | 4/3 ]
[ 0 1 0 |  …  ]
[ 0 0 1 | 5/3 ]

Therefore, x₁ = 4/3, x₂ = …, and x₃ = 5/3. These values are one solution only. Hence, the system has a unique solution (one solution).

Now, let's apply what we have learned from example 1.1.3 to the following example 1.1.4.

Example 1.1.4 Solve for x₁, x₂, x₃, x₄ and x₅ in the following 3 × 5 system of linear equations:

x₂ − x₃ + x₄ − x₅ = 1
−2x₁ + x₃ − x₅ = 0
−x₂ + x₃ + 2x₄ − 10x₅ = 12

Solution: In the above system, we have 3 equations and 5 unknown variables x₁, x₂, x₃, x₄ and x₅. To solve this 3 × 5 system, we need to use the Row Operations Method. First, we need to construct the augmented matrix:

   x₁  x₂  x₃  x₄  x₅ |  C
[  0   1  −1   1  −1  |  1 ]
[ −2   0   1   0  −1  |  0 ]
[  0  −1   1   2 −10  | 12 ]

The leader number here is 1.

R₁ + R₃ → R₃

   x₁  x₂  x₃  x₄  x₅ |  C
[  0   1  −1   1  −1  |  1 ]
[ −2   0   1   0  −1  |  0 ]
[  0   0   0   3 −11  | 13 ]

−½R₂ → R₂

By doing the same steps as in example 1.1.3, we obtain the following:

x₂ − x₃ + (8/3)x₅ = −10/3
x₁ − (1/2)x₃ + (1/2)x₅ = 0
x₄ − (11/3)x₅ = 13/3

Since we have 3 leader numbers (numbers equal to 1)
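The row operations used in Example 1.1.4 are easy to check mechanically. The following is a small illustrative sketch (not from the book): it applies the row-addition and row-multiplication operations to the augmented matrix of Example 1.1.4 using plain Python lists; the helper names `add_rows` and `scale_row` are my own.

```python
from fractions import Fraction

# Augmented matrix of Example 1.1.4 (columns x1..x5 | constants)
M = [
    [0,  1, -1, 1,  -1,  1],
    [-2, 0,  1, 0,  -1,  0],
    [0, -1,  1, 2, -10, 12],
]

def add_rows(matrix, src, dst):
    """Row-addition operation: R_src + R_dst -> R_dst."""
    matrix[dst] = [a + b for a, b in zip(matrix[src], matrix[dst])]

def scale_row(matrix, row, c):
    """Row-multiplication operation: c * R_row -> R_row."""
    matrix[row] = [Fraction(c) * a for a in matrix[row]]

add_rows(M, 0, 2)                 # R1 + R3 -> R3
print(M[2])                       # [0, 0, 0, 3, -11, 13]
scale_row(M, 1, Fraction(-1, 2))  # -1/2 * R2 -> R2 (leader number becomes 1)
```

Using exact fractions avoids the rounding errors that floating-point row reduction can introduce.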
Hence, AB =
[ 1  5  3 ]
[ 3 15  9 ]
[ 1  3  2 ]

Part b: As we did in part a, but the difference here is that we work with rows instead of columns.

Step 1: 1st row of AB: 1·[1 1 1] + 2·[0 2 1] = [1 5 3]

Step 2: 2nd row of AB: 3·[1 1 1] + 6·[0 2 1] = [3 15 9]

Step 3: 3rd row of AB: 1·[1 1 1] + 1·[0 2 1] = [1 3 2]

Hence, AB =
[ 1  5  3 ]
[ 3 15  9 ]
[ 1  3  2 ]

Part c: Here we use the usual matrix multiplication:

AB = [1 2; 3 6; 1 1] · [1 1 1; 0 2 1]

Hence, AB =
[ 1  5  3 ]
[ 3 15  9 ]
[ 1  3  2 ]

Part d: Since AB = C, c₂₃ means that we need to find the number that is located in the 2nd row and 3rd column.

C =
[ c₁₁ c₁₂ c₁₃ ]   [ 1  5  3 ]
[ c₂₁ c₂₂ c₂₃ ] = [ 3 15  9 ]
[ c₃₁ c₃₂ c₃₃ ]   [ 1  3  2 ]

Now, let's explain each element of matrix C:
c₁₁ is the number located in the 1st row and 1st column: c₁₁ = 1.
c₁₂ is the number located in the 1st row and 2nd column: c₁₂ = 5.
c₁₃ is the number located in the 1st row and 3rd column: c₁₃ = 3.
c₂₁ is the number located in the 2nd row and 1st column: c₂₁ = 3.
c₂₂ is the number located in the 2nd row and 2nd column: c₂₂ = 15.
c₂₃ is the number located in the 2nd row and 3rd column: c₂₃ = 9.
c₃₁ is the number located in the 3rd row and 1st column: c₃₁ = 1.
c₃₂ is the number located in the 3rd row and 2nd column: c₃₂ = 3.
c₃₃ is the number located in the 3rd row and 3rd column: c₃₃ = 2.

Hence, c₂₃ = 9.

Part e: As we did in part a. Since B has order 2 × 3 and A has order 3 × 2, BA will have order 2 × 2 according to Definition 1.3.1.

Step 1: 1st column of BA: 1·[1; 0] + 3·[1; 2] + 1·[1; 1] = [5; 7]

Step 2: 2nd column of BA: 2·[1; 0] + 6·[1; 2] + 1·[1; 1] = [9; 13]

Hence, BA =
[ 5  9 ]
[ 7 13 ]

Part f: As we did in part b.

Step 1: 1st row of BA: 1·[1 2] + 1·[3 6] + 1·[1 1] = [5 9]

Step 2: 2nd row of BA: 0·[1 2] + 2·[3 6] + 1·[1 1] = [7 13]

Hence, BA =
[ 5  9 ]
[ 7 13 ]

Result 1.3.1 In general, matrix multiplication is not commutative (i.e. AB is not necessarily equal to BA).

Example 1.3.4 Given the following matrix A and matrix B:

A = [1 0; 0 0], B = [0 0; 1 0]

a) Find AB.
b) Find BA.

Solution: Part a: Using the usual matrix multiplication, we obtain:

AB = [1 0; 0 0][0 0; 1 0] = [0 0; 0 0]

Part b: Using the usual matrix multiplication, we obtain:

BA = [0 0; 1 0][1 0; 0 0] = [0 0; 1 0]

Result 1.3.2 It is possible that the product of two non-zero matrices is a zero matrix. However, this is not true for the product of real numbers.

Example 1.3.5 Given the following matrix A, matrix B and matrix D:

A = [1 0; 0 0], B = [0 0; 1 0], D = [0 0; 0 1]

a) Find AB.
b) Find AD.

Solution: Part a: Using the usual matrix multiplication, we obtain:

AB = [1 0; 0 0][0 0; 1 0] = [0 0; 0 0], a zero matrix.

Part b: Using the usual matrix multiplication, we obtain:

AD = [1 0; 0 0][0 0; 0 1] = [0 0; 0 0], a zero matrix.

Result 1.3.3 In general, if AB = AD and A is not a zero matrix, then it is possible that B ≠ D.
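The column-by-column and row-by-row computations above agree with the usual matrix multiplication, which can be verified with a short script. This is an illustrative sketch, not part of the book; the `matmul` helper name is my own.

```python
def matmul(A, B):
    """Multiply two matrices stored as lists of rows."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

A = [[1, 2], [3, 6], [1, 1]]
B = [[1, 1, 1], [0, 2, 1]]

print(matmul(A, B))  # [[1, 5, 3], [3, 15, 9], [1, 3, 2]]
print(matmul(B, A))  # [[5, 9], [7, 13]] -- AB != BA (Result 1.3.1)

# Result 1.3.2: two non-zero matrices whose product is a zero matrix
print(matmul([[1, 0], [0, 0]], [[0, 0], [1, 0]]))  # [[0, 0], [0, 0]]
```

The `zip(*B)` idiom iterates over the columns of B, so each entry of the product is a row-of-A times column-of-B dot product, exactly as in Definition 1.3.1.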
Definition 1.3.2 Suppose matrix A has an order (size) n × m. A is a zero matrix if each number of A is zero.

Example 1.3.5 Given the following system of linear equations:

x₁ + x₂ + x₃ = 3
−x₁ − x₂ + 2x₃ = 0
x₂ + 4x₃ = 5

Write the above system in matrix-form.

Solution: We write the above system in matrix-form as follows:

(Coefficient Matrix) · (Variable Column) = (Constant Column)

CX = A, where C is the Coefficient Matrix, X is the Variable Column, and A is the Constant Column.

[  1  1 1 ] [ x₁ ]   [ 3 ]
[ −1 −1 2 ] [ x₂ ] = [ 0 ]
[  0  1 4 ] [ x₃ ]   [ 5 ]

The above matrix-form means the following:

x₁·[1; −1; 0] + x₂·[1; −1; 1] + x₃·[1; 2; 4] = [3; 0; 5]

Hence, we obtain:

[ x₁ + x₂ + x₃  ]   [ 3 ]
[ −x₁ − x₂ + 2x₃ ] = [ 0 ]
[ x₂ + 4x₃      ]   [ 5 ]

Now, let x₁ = 1, x₂ = 1, x₃ = 1.

[  1  1 1 ] [ 1 ]   [ 3 ]
[ −1 −1 2 ] [ 1 ] = [ 0 ]
[  0  1 4 ] [ 1 ]   [ 5 ]

Result 1.3.4 Let CX = A be a system of linear equations:

C [x₁; x₂; …; xₙ] = [a₁; a₂; …; aₙ]

Then x₁ = b₁, x₂ = b₂, …, xₙ = bₙ is a solution to the system if and only if b₁C₁ + b₂C₂ + ⋯ + bₙCₙ = A, where C₁ is the 1st column of C, C₂ is the 2nd column of C, …, and Cₙ is the nth column of C.

Example 1.3.6 Given the following system of linear equations:

x₁ + x₂ = 2
2x₁ + 2x₂ = 4

Write the above system in matrix-form.

Solution: We write the above system in matrix-form CX = A as follows:

C = [1 1; 2 2], X = [x₁; x₂], A = [2; 4]

Hence, CX = A:

[ 1 1 ] [ x₁ ]   [ 2 ]
[ 2 2 ] [ x₂ ] = [ 4 ]

x₁·[1; 2] + x₂·[1; 2] = [2; 4]

Now, we need to choose values for x₁ and x₂ such that these values satisfy the above matrix-form.

First, let's try x₁ = 3 and x₂ = 4.

3·[1; 2] + 4·[1; 2] = [7; 14] ≠ [2; 4]

Therefore, x₁ = 3, x₂ = 4 is not a solution, and our assumption is wrong.

Now, let's try x₁ = 0 and x₂ = 2.

0·[1; 2] + 2·[1; 2] = [2; 4]

Therefore, x₁ = 0, x₂ = 2 is a solution.

Result 1.3.5 Let CX = A be a system of linear equations. If the system has a solution, the constant column A can be written as a linear combination of the columns of C.

Fact 1.4.5 ℝ²ˣ² = M₂(ℝ) = the set of all 2 × 2 matrices.

Now, we give some helpful notations:

ℤ: the set of all integers.
ℚ: the set of all rational numbers.
ℝ: the set of all real numbers.
ℕ: the set of all natural numbers.
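Result 1.3.4 (a solution is exactly a choice of coefficients making the columns of C combine into A) can be checked numerically. The following is a sketch of that check, not part of the book; the helper names are my own.

```python
def columns(C):
    """Return the columns of a matrix stored as a list of rows."""
    return [list(col) for col in zip(*C)]

def is_solution(C, b, A):
    """Check b1*C1 + b2*C2 + ... + bn*Cn == A (Result 1.3.4)."""
    cols = columns(C)
    combo = [sum(bj * col[i] for bj, col in zip(b, cols))
             for i in range(len(A))]
    return combo == A

C = [[1, 1], [2, 2]]
A = [2, 4]
print(is_solution(C, [3, 4], A))  # False: 3*(1,2) + 4*(1,2) = (7,14)
print(is_solution(C, [0, 2], A))  # True:  0*(1,2) + 2*(1,2) = (2,4)
```

The same check confirms x₁ = x₂ = x₃ = 1 for the 3 × 3 system above: `is_solution([[1,1,1],[-1,-1,2],[0,1,4]], [1,1,1], [3,0,5])` returns `True`.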
Part c: According to the given four row-operations steps, we need to find four elementary matrices from Z to C, which means m = 4 because we want to go 4 steps forward from Z to C. From part a, we already have the following:

I₃ = [1 0 0; 0 1 0; 0 0 1] —(−2R₁)→ [−2 0 0; 0 1 0; 0 0 1] (Step One)

Hence, E₁ = [−2 0 0; 0 1 0; 0 0 1].

I₃ —(2R₁+R₃→R₃)→ [1 0 0; 0 1 0; 2 0 1] (Step Two)

Hence, E₂ = [1 0 0; 0 1 0; 2 0 1].

I₃ —(−3R₁+R₂→R₂)→ [1 0 0; −3 1 0; 0 0 1] (Step Three)

Hence, E₃ = [1 0 0; −3 1 0; 0 0 1].

I₃ —(R₃↔R₁)→ [0 0 1; 0 1 0; 1 0 0] (Step Four)

Hence, E₄ = [0 0 1; 0 1 0; 1 0 0].

Hence, we obtain the following four elementary matrices:

E₄E₃E₂E₁ = [0 0 1; 0 1 0; 1 0 0][1 0 0; −3 1 0; 0 0 1][1 0 0; 0 1 0; 2 0 1][−2 0 0; 0 1 0; 0 0 1], so that E₄E₃E₂E₁Z = C.

Part d: According to the given three row-operations steps from B to Z, we need to find three elementary matrices, which means n = 3 because we want to go 3 steps backward from B to Z. Backward steps mean that we need to do the inverse steps (i.e. the inverse of −2R₁ is −½R₁ because it is a row-multiplication step). We start from the third step as follows:

The inverse of −3R₁+R₂→R₂ is 3R₁+R₂→R₂ because it is a row-addition step.

I₃ —(3R₁+R₂→R₂)→ [1 0 0; 3 1 0; 0 0 1] (Step Three)

Hence, F₁ = [1 0 0; 3 1 0; 0 0 1].

The inverse of 2R₁+R₃→R₃ is −2R₁+R₃→R₃ because it is a row-addition step.

I₃ —(−2R₁+R₃→R₃)→ [1 0 0; 0 1 0; −2 0 1] (Step Two)

Hence, F₂ = [1 0 0; 0 1 0; −2 0 1].

The inverse of −2R₁ is −½R₁ because it is a row-multiplication step.

I₃ —(−½R₁)→ [−½ 0 0; 0 1 0; 0 0 1] (Step One)

Hence, F₃ = [−½ 0 0; 0 1 0; 0 0 1].

Hence, we obtain the following three elementary matrices:

F₃F₂F₁ = [−½ 0 0; 0 1 0; 0 0 1][1 0 0; 0 1 0; −2 0 1][1 0 0; 3 1 0; 0 0 1], so that F₃F₂F₁B = Z.

Part e: According to the given last row-operations step from C to B, we need to find one elementary matrix because we want to go 1 step backward from C to B. The inverse of R₃↔R₁ is R₁↔R₃, which is the same as R₃↔R₁.

I₃ —(R₃↔R₁)→ [0 0 1; 0 1 0; 1 0 0]

Hence, we obtain: S = [0 0 1; 0 1 0; 1 0 0].

Part f: According to the given last row-operations step from B to C, we need to find one elementary matrix because we want to go 1 step forward from B to C.

I₃ —(R₃↔R₁)→ [0 0 1; 0 1 0; 1 0 0]

Hence, we obtain: X = [0 0 1; 0 1 0; 1 0 0].

Example 1.5.2 Given the following matrix:

A = [2 1; 4 0]

Find A⁻¹ if possible.

Solution: Finding A⁻¹ if possible means that we need to find a possible inverse matrix called A⁻¹ such that AA⁻¹ = I₂ = A⁻¹A. To find this possible matrix A⁻¹, we need to do the following steps:

Step 1: Write A and I₂ in the following form: (A | I₂)

[ 2 1 | 1 0 ]
[ 4 0 | 0 1 ]

Step 2: Do some row-operations until you get (I₂ | B); then B = A⁻¹. (If the left side cannot be reduced to I₂, then A is non-invertible.)
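Each elementary matrix above is just I₃ with one row operation applied, and multiplying it on the left of any matrix performs that same operation. The following sketch (not from the book; the sample matrix Z and helper names are my own) illustrates this for the four matrices of part c.

```python
def matmul(A, B):
    """Multiply two matrices stored as lists of rows."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

# Elementary matrices of part c, one per row operation applied to I3:
E1 = [[-2, 0, 0], [0, 1, 0], [0, 0, 1]]   # -2*R1 -> R1
E2 = [[1, 0, 0], [0, 1, 0], [2, 0, 1]]    # 2*R1 + R3 -> R3
E3 = [[1, 0, 0], [-3, 1, 0], [0, 0, 1]]   # -3*R1 + R2 -> R2
E4 = [[0, 0, 1], [0, 1, 0], [1, 0, 0]]    # R3 <-> R1

# Left-multiplying by E_k performs the corresponding row operation:
Z = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]     # a sample matrix (not from the book)
print(matmul(E1, Z))  # first row scaled by -2
print(matmul(E4, Z))  # rows 1 and 3 swapped
```

Chaining the four products E₄E₃E₂E₁Z applies the four steps in order, which is exactly the forward path from Z to C described in part c.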
Now, let's do the above step until we get the Completely-Reduced-Echelon matrix.

[ 2 1 | 1 0 ]   ½R₁ → R₁   [ 1 ½ | ½ 0 ]
[ 4 0 | 0 1 ]              [ 4 0 | 0 1 ]

[ 1 ½ | ½ 0 ]   −4R₁+R₂ → R₂   [ 1  ½ |  ½ 0 ]
[ 4 0 | 0 1 ]                  [ 0 −2 | −2 1 ]

[ 1  ½ |  ½ 0 ]   −½R₂ → R₂   [ 1 ½ | ½  0 ]
[ 0 −2 | −2 1 ]               [ 0 1 | 1 −½ ]

[ 1 ½ | ½  0 ]   −½R₂+R₁ → R₁   [ 1 0 | 0  ¼ ]
[ 0 1 | 1 −½ ]                  [ 0 1 | 1 −½ ]

Since we got the Completely-Reduced-Echelon matrix, which is the identity matrix I₂, A has an inverse matrix A⁻¹.

Hence, A⁻¹ =
[ 0  ¼ ]
[ 1 −½ ]

The same steps work for a 3 × 3 matrix. Given the matrix:

A =
[ 1  0 2 ]
[ 0  1 0 ]
[ 0 −1 1 ]

Find A⁻¹ if possible.

Solution: Finding A⁻¹ if possible means that we need to find a possible inverse matrix called A⁻¹ such that AA⁻¹ = I₃ = A⁻¹A. To find this possible matrix A⁻¹, we need to do the following steps:

Step 1: Write A and I₃ in the following form: (A | I₃)

[ 1  0 2 | 1 0 0 ]
[ 0  1 0 | 0 1 0 ]
[ 0 −1 1 | 0 0 1 ]

Step 2: Do some row-operations until you get (I₃ | B); then B = A⁻¹. (If the left side cannot be reduced to I₃, then A is non-invertible.)

Now, let's do the above step until we get the Completely-Reduced-Echelon matrix.

R₂+R₃ → R₃:
[ 1 0 2 | 1 0 0 ]
[ 0 1 0 | 0 1 0 ]
[ 0 0 1 | 0 1 1 ]

−2R₃+R₁ → R₁:
[ 1 0 0 | 1 −2 −2 ]
[ 0 1 0 | 0  1  0 ]
[ 0 0 1 | 0  1  1 ]

Hence, A⁻¹ =
[ 1 −2 −2 ]
[ 0  1  0 ]
[ 0  1  1 ]
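The two-step recipe above — augment A with the identity, then row-reduce until (A | I) becomes (I | A⁻¹) — can be automated. This is a minimal sketch using exact fractions, not part of the book; the function name `inverse` is my own.

```python
from fractions import Fraction

def inverse(A):
    """Gauss-Jordan: reduce (A | I) to (I | A^-1). Returns None if singular."""
    n = len(A)
    # Build the augmented matrix (A | I) with exact arithmetic.
    M = [[Fraction(x) for x in row] + [Fraction(i == j) for j in range(n)]
         for i, row in enumerate(A)]
    for col in range(n):
        pivot = next((r for r in range(col, n) if M[r][col] != 0), None)
        if pivot is None:
            return None                      # no leader number: non-invertible
        M[col], M[pivot] = M[pivot], M[col]  # Ri <-> Rk
        M[col] = [x / M[col][col] for x in M[col]]          # (1/a)*Ri -> Ri
        for r in range(n):
            if r != col and M[r][col] != 0:  # -a*Rcol + Rr -> Rr
                M[r] = [x - M[r][col] * y for x, y in zip(M[r], M[col])]
    return [row[n:] for row in M]

print(inverse([[2, 1], [4, 0]]))  # Example 1.5.2: rows (0, 1/4) and (1, -1/2)
```

Running it on the 3 × 3 matrix above reproduces the inverse with rows (1, −2, −2), (0, 1, 0), (0, 1, 1), and it returns `None` for a singular matrix such as [1 1; 2 2].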
Result 1.5.2 Given an n × n matrix A and the identity matrix Iₙ, suppose that with some row-operations we got A′ and Iₙ′ as follows: (A | Iₙ) —Row-Operations→ (A′ | Iₙ′). Then A′ = Iₙ′A.

Result 1.5.3 Given an n × n matrix A and the identity matrix Iₙ, suppose that with some row-operations we got Iₙ and B as follows: (A | Iₙ) —Row-Operations→ (Iₙ | B). Then B = A⁻¹, i.e. AA⁻¹ = Iₙ = A⁻¹A.

Fact 1.5.1 Given an n × m matrix A and the identity matrix Iₙ, Iₙ is called a left identity for all n × m matrices, such that IₙA = A. (i.e. given a 3 × 5 matrix A, then I₃A = A.)

Fact 1.5.2 Given an n × m matrix A and the identity matrix Iₘ, Iₘ is called a right identity for all n × m matrices, such that AIₘ = A. (i.e. given a 3 × 5 matrix A, then AI₅ = A.)

Result 1.5.4 Given an n × n matrix A, A⁻⁴ has a meaning if and only if A⁻¹ exists.

Result 1.5.5 Given an n × n matrix A, if A⁻¹ exists, then A⁻⁴ = A⁻¹ × A⁻¹ × A⁻¹ × A⁻¹.

1.6 Transpose Matrix

Definition 1.6.1 Given an n × m matrix A, Aᵀ is called the transpose of A, and it is an m × n matrix. We obtain Aᵀ from A by making the columns of A rows, or by making the rows of A columns.

Example 1.6.1 Given the following matrix:

A =
[ 2 3 1 0 ]
[ 5 1 2 3 ]
[ 1 1 0 5 ]

Find Aᵀ.

Solution: According to definition 1.6.1, A is a 3 × 4 matrix. Thus, Aᵀ should be 4 × 3. By making the columns of A rows, we obtain the following:

Aᵀ =
[ 2 5 1 ]
[ 3 1 1 ]
[ 1 2 0 ]
[ 0 3 5 ]

Definition 1.6.2 Given an n × m matrix A and its m × n transpose Aᵀ, the product AAᵀ is always defined, and it is an n × n matrix. (i.e. let D = AAᵀ; then D (n × n) = A (n × m) · Aᵀ (m × n).)
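Definition 1.6.1 ("make the columns of A rows") is a one-liner in Python using the `zip(*A)` idiom. A quick illustrative check of Example 1.6.1, not part of the book:

```python
def transpose(A):
    """Make the columns of A into rows (Definition 1.6.1)."""
    return [list(col) for col in zip(*A)]

A = [[2, 3, 1, 0],
     [5, 1, 2, 3],
     [1, 1, 0, 5]]
print(transpose(A))  # [[2, 5, 1], [3, 1, 1], [1, 2, 0], [0, 3, 5]]
```

Transposing twice returns the original matrix, which is Fact 1.6.2: (Aᵀ)ᵀ = A.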
Definition 1.6.3 Given an n × m matrix A and its m × n transpose Aᵀ, the product AᵀA is always defined, and it is an m × m matrix. (i.e. let D = AᵀA; then D (m × m) = Aᵀ (m × n) · A (n × m).)

Definition 1.6.4 Given an n × n matrix A and its n × n transpose Aᵀ, A is symmetric if Aᵀ = A.

Definition 1.6.5 Given an n × n matrix A and its n × n transpose Aᵀ, A is skew-symmetric if Aᵀ = −A.

Example 1.6.2 Given the following matrix:

A =
[  1 5 10 ]
[  5 3  7 ]
[ 10 7 10 ]

Show that A is symmetric.

Solution: According to definition 1.6.4, A is a 3 × 3 matrix. Thus, Aᵀ should be 3 × 3. By making the columns of A rows, we obtain the following:

Aᵀ =
[  1 5 10 ]
[  5 3  7 ]
[ 10 7 10 ]

Since Aᵀ = A, A is symmetric.

Example 1.6.3 Given the following matrix:

A =
[  0  2 5 ]
[ −2  0 3 ]
[ −5 −3 0 ]

Show that A is skew-symmetric.

Solution: According to definition 1.6.5, A is a 3 × 3 matrix. Thus, Aᵀ should be 3 × 3. By making the columns of A rows, we obtain the following:

Aᵀ =
[ 0 −2 −5 ]
[ 2  0 −3 ]
[ 5  3  0 ]

Since Aᵀ = −A = [0 −2 −5; 2 0 −3; 5 3 0], A is skew-symmetric.

Fact 1.6.1 Given an n × n matrix A, if A is skew-symmetric, then all numbers on the main diagonal of A are always zeros.

Fact 1.6.2 Given an n × m matrix A, (Aᵀ)ᵀ = A.

Fact 1.6.3 Given an n × m matrix A and an m × k matrix B, (AB)ᵀ = BᵀAᵀ. (Warning: (AB)ᵀ ≠ AᵀBᵀ.)

Fact 1.6.4 Given n × m matrices A and B, (A ± B)ᵀ = Aᵀ ± Bᵀ.

Result 1.6.1 Given a matrix A and a constant α, if A is symmetric, then αA is symmetric, such that (αA)ᵀ = αAᵀ.

Result 1.6.2 Given a matrix A and a constant α, if A is skew-symmetric, then αA is skew-symmetric, such that (αA)ᵀ = αAᵀ.

Result 1.6.3 Let A be an n × n matrix. There exist a symmetric matrix B and a skew-symmetric matrix C such that A is a linear combination of B and C. This means that there are numbers α₁ and α₂ such that A = α₁B + α₂C.

Proof of Result 1.6.3 We will show that A is a linear combination of B and C.
We assume that B is symmetric such that B = A + Aᵀ. Then:

Bᵀ = (A + Aᵀ)ᵀ = Aᵀ + (Aᵀ)ᵀ = Aᵀ + A = B.

Now, we assume that C is skew-symmetric such that C = A − Aᵀ. Then:

Cᵀ = (A − Aᵀ)ᵀ = Aᵀ − (Aᵀ)ᵀ = Aᵀ − A = −(A − Aᵀ) = −C.

By using algebra, we do the following:

A = ½B + ½C = ½(A + Aᵀ) + ½(A − Aᵀ) = ½A + ½Aᵀ + ½A − ½Aᵀ = A.

Thus, A is a linear combination of B and C.

Example 1.6.4 Given the following matrix:

A =
[ 2 1 4 ]
[ 3 0 1 ]
[ 5 6 7 ]

Find a symmetric matrix B and a skew-symmetric matrix C such that A = α₁B + α₂C for some numbers α₁ and α₂.

Solution: As we did in the above proof, we do the following:

B = A + Aᵀ = [2 1 4; 3 0 1; 5 6 7] + [2 3 5; 1 0 6; 4 1 7] = [4 4 9; 4 0 7; 9 7 14]

C = A − Aᵀ = [2 1 4; 3 0 1; 5 6 7] − [2 3 5; 1 0 6; 4 1 7] = [0 −2 −1; 2 0 −5; 1 5 0]

Let α₁ = α₂ = ½.

Thus, A = ½B + ½C = ½[4 4 9; 4 0 7; 9 7 14] + ½[0 −2 −1; 2 0 −5; 1 5 0].

1.7 Determinants

In this section, we introduce, step by step, the process for finding the determinant of a certain matrix. In addition, we discuss some important properties, such as which matrices are invertible and which are non-invertible, and we talk about the effect of row-operations on determinants.

Definition 1.7.1 The determinant is defined for square matrices. Given M₂(ℝ) = ℝ²ˣ², let A ∈ M₂(ℝ), where A is a 2 × 2 matrix, A = [a₁₁ a₁₂; a₂₁ a₂₂]. The determinant of A is represented by det(A) or |A|.

Hence, det(A) = |A| = a₁₁a₂₂ − a₁₂a₂₁ ∈ ℝ. (Warning: this definition works only for 2 × 2 matrices.)

Example 1.7.1 Given the following matrix:

A = [3 2; 5 7]

Find the determinant of A.

Solution: Using definition 1.7.1, we do the following:

det(A) = |A| = (3)(7) − (2)(5) = 21 − 10 = 11.

Thus, the determinant of A is 11.

Example 1.7.2 Given the following matrix:

A =
[ 1 0  2 ]
[ 3 1 −1 ]
[ 1 2  4 ]

Find the determinant of A.
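The decomposition in the proof of Result 1.6.3 — B = A + Aᵀ symmetric, C = A − Aᵀ skew-symmetric, A = ½B + ½C — is easy to verify in code. An illustrative sketch (not part of the book; helper names are my own) using the matrix of Example 1.6.4:

```python
def transpose(A):
    return [list(col) for col in zip(*A)]

def add(A, B, sign=1):
    """Entrywise A + sign*B."""
    return [[x + sign * y for x, y in zip(ra, rb)] for ra, rb in zip(A, B)]

A = [[2, 1, 4], [3, 0, 1], [5, 6, 7]]
B = add(A, transpose(A))      # symmetric part (before the 1/2 factor)
C = add(A, transpose(A), -1)  # skew-symmetric part (before the 1/2 factor)

print(B)  # [[4, 4, 9], [4, 0, 7], [9, 7, 14]]
print(C)  # [[0, -2, -1], [2, 0, -5], [1, 5, 0]]
# A = (1/2)B + (1/2)C, with B symmetric and C skew-symmetric
```

Note that C has zeros on its main diagonal, exactly as Fact 1.6.1 predicts for a skew-symmetric matrix.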
Solution: Since A is a 3 × 3 matrix such that A ∈ M₃(ℝ) = ℝ³ˣ³, we cannot use definition 1.7.1, because it is valid only for 2 × 2 matrices. Thus, we need to use the following method to find the determinant of A.

Step 1: Choose any row or any column. It is recommended to choose the one that has the most zeros. In this example, we prefer to choose the second column or the first row. Let's choose the second column:

A =
[ 1 0  2 ]
[ 3 1 −1 ]
[ 1 2  4 ]

a₁₂ = 0, a₂₂ = 1 and a₃₂ = 2.

Step 2: To find the determinant of A, we do the following. For a₁₂, since a₁₂ is in the first row and second column, we virtually remove the first row and second column:

(−1)¹⁺² a₁₂ det[3 −1; 1 4]

For a₂₂, since a₂₂ is in the second row and second column, we virtually remove the second row and second column:

(−1)²⁺² a₂₂ det[1 2; 1 4]

For a₃₂, since a₃₂ is in the third row and second column, we virtually remove the third row and second column:

(−1)³⁺² a₃₂ det[1 2; 3 −1]

Step 3: Add all of them together as follows:

det(A) = (−1)¹⁺² a₁₂ det[3 −1; 1 4] + (−1)²⁺² a₂₂ det[1 2; 1 4] + (−1)³⁺² a₃₂ det[1 2; 3 −1]

det(A) = (−1)³(0) det[3 −1; 1 4] + (−1)⁴(1) det[1 2; 1 4] + (−1)⁵(2) det[1 2; 3 −1]

det(A) = (−1)(0)(12 − (−1)) + (1)(1)(4 − 2) + (−1)(2)(−1 − 6)

det(A) = 0 + 2 + 14 = 16.

Thus, the determinant of A is 16.

Result 1.7.1 Let A ∈ Mₙ(ℝ). Then A is invertible (non-singular) if and only if det(A) ≠ 0.

The above result means that if det(A) ≠ 0, then A is invertible (non-singular), and if A is invertible (non-singular), then det(A) ≠ 0.

Example 1.7.3 Given the following matrix:

A = [2 3; 4 6]

Is A invertible (non-singular)?
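The cofactor-expansion method of Example 1.7.2 generalizes to any square matrix. The following is an illustrative recursive sketch (not from the book; it expands along the first row rather than the column chosen in the example, which gives the same value):

```python
def det(A):
    """Determinant by cofactor expansion along the first row."""
    n = len(A)
    if n == 1:
        return A[0][0]
    total = 0
    for j in range(n):
        # Virtually remove row 0 and column j, as in Example 1.7.2.
        minor = [row[:j] + row[j + 1:] for row in A[1:]]
        total += (-1) ** j * A[0][j] * det(minor)
    return total

print(det([[3, 2], [5, 7]]))                    # 11 (Example 1.7.1)
print(det([[1, 0, 2], [3, 1, -1], [1, 2, 4]]))  # 16 (Example 1.7.2)
```

Expanding along a row or column with many zeros, as the book recommends, only reduces the amount of arithmetic; the result is the same whichever row or column is chosen.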
Solution: Using result 1.7.1, we do the following:

det(A) = |A| = (2)(6) − (3)(4) = 12 − 12 = 0.

Since the determinant of A is 0, A is non-invertible (singular).

Thus, the answer is No, because A is non-invertible (singular).

Definition 1.7.2 Given A = [a₁₁ a₁₂; a₂₁ a₂₂], assume that det(A) ≠ 0 such that det(A) = a₁₁a₂₂ − a₁₂a₂₁. To find A⁻¹ (the inverse of A), we use the following format, which applies only to 2 × 2 matrices:

A⁻¹ = (1/det(A)) [a₂₂ −a₁₂; −a₂₁ a₁₁] = (1/(a₁₁a₂₂ − a₁₂a₂₁)) [a₂₂ −a₁₂; −a₂₁ a₁₁]

Example 1.7.4 Given the following matrix:

A = [3 2; −4 5]

Is A invertible (non-singular)? If yes, find A⁻¹.

Solution: Using result 1.7.1, we do the following:

det(A) = |A| = (3)(5) − (2)(−4) = 15 + 8 = 23 ≠ 0.

Since the determinant of A is not 0, A is invertible (non-singular).

Thus, the answer is Yes, and there exists A⁻¹ according to definition 1.7.2 as follows:

A⁻¹ = (1/det(A)) [5 −2; 4 3] = (1/23) [5 −2; 4 3] = [5/23 −2/23; 4/23 3/23]

Result 1.7.2 Let A ∈ Mₙ(ℝ) be a triangular matrix. Then det(A) = the multiplication of the numbers on the main diagonal of A.

There are three types of triangular matrices:

a) Upper Triangular Matrix: it has all zeros below the main diagonal of the n × n matrix.
(i.e. A = [1 7 3; 0 2 5; 0 0 4] is an Upper Triangular Matrix.)

b) Diagonal Matrix: it has all zeros both below and above the main diagonal of the n × n matrix.
(i.e. A = [1 0 0; 0 2 0; 0 0 4] is a Diagonal Matrix.)

c) Lower Triangular Matrix: it has all zeros above the main diagonal of the n × n matrix.
(i.e. A = [1 0 0; 5 2 0; 1 9 4] is a Lower Triangular Matrix.)

Fact 1.7.1 Let A ∈ Mₙ(ℝ). Then det(A) = det(Aᵀ).

Fact 1.7.2 Let A ∈ Mₙ(ℝ). If A is an invertible (non-singular) matrix, then Aᵀ is also invertible (non-singular). (i.e. (Aᵀ)⁻¹ = (A⁻¹)ᵀ.)

Proof of Fact 1.7.2 We will show that (Aᵀ)⁻¹ = (A⁻¹)ᵀ.

We know from previous results that AA⁻¹ = Iₙ.
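The 2 × 2 inverse formula of Definition 1.7.2 (swap the diagonal, negate the off-diagonal, divide by the determinant) fits in a few lines. An illustrative sketch, not from the book; the function name is my own:

```python
from fractions import Fraction

def inv2(A):
    """Inverse of a 2x2 matrix via Definition 1.7.2; None if det(A) == 0."""
    (a, b), (c, d) = A
    det = a * d - b * c
    if det == 0:
        return None                 # non-invertible (singular), Result 1.7.1
    f = Fraction(1, det)
    return [[f * d, -f * b], [-f * c, f * a]]

print(inv2([[3, 2], [-4, 5]]))  # Example 1.7.4: rows (5/23, -2/23), (4/23, 3/23)
print(inv2([[2, 3], [4, 6]]))   # None (Example 1.7.3: det = 0)
```

The `det == 0` guard is precisely Result 1.7.1: the formula is meaningful only for invertible (non-singular) matrices.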
By taking the transpose of both sides, we obtain:

(AA⁻¹)ᵀ = (Iₙ)ᵀ

Then, (A⁻¹)ᵀAᵀ = (Iₙ)ᵀ.

Since (Iₙ)ᵀ = Iₙ, we get (A⁻¹)ᵀAᵀ = Iₙ, and hence (Aᵀ)⁻¹ = (A⁻¹)ᵀ.

Row-operations affect determinants as follows:

* Rᵢ ↔ Rₖ (interchange two rows): the determinant changes sign, det(B) = −det(A).

* αRᵢ → Rᵢ (multiply a row by a non-zero constant α): the determinant is multiplied by α.

i.e. A = [1 2 3; 0 4 1; 2 0 1] —(3R₂)→ [1 2 3; 0 12 3; 2 0 1] = B

Assume that det(A) = γ, where γ is known; then det(B) = 3γ. Similarly, if det(B) = β is known, then det(A) = (1/3)β.

* αRᵢ + Rₖ → Rₖ (add a multiple of one row to another row, with α a non-zero constant): it has no effect on the determinant.

i.e. A = [1 2 3; 0 4 1; 2 0 1] —(αRᵢ+Rₖ→Rₖ)→ B. Then, det(B) = det(A).

Solution: Using what we have learned about the effect of row-operations on determinants:

det(D₁) = 2 · det(A) = 2 · 4 = 8, because D₁ has the first row of A multiplied by 2.

det(D₂) = 3 · det(D₁) = 3 · 8 = 24, because D₂ has the third row of D₁ multiplied by 3.

Similarly, det(D₃) = −2 · det(D₂) = −2 · 24 = −48, because D₃ has the fourth row of D₂ multiplied by −2.

Result 1.7.3 Assume A is an n × n matrix with a given det(A) = γ, and let α be a number. Then det(αA) = αⁿ · γ.

Result 1.7.4 Assume A and B are n × n matrices. Then:

a) det(AB) = det(A) · det(B).
b) If A⁻¹ and B⁻¹ exist, then (AB)⁻¹ = B⁻¹A⁻¹.
c) det(AB) = det(BA).
d) det(A) = det(Aᵀ).
e) If A⁻¹ exists, then det(A⁻¹) = 1/det(A).
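Results 1.7.3 and 1.7.4 a) can be spot-checked with 2 × 2 matrices, where the determinant is a one-line formula. An illustrative sketch, not from the book; the matrix B below is my own example:

```python
def det2(A):
    """Determinant of a 2x2 matrix (Definition 1.7.1)."""
    return A[0][0] * A[1][1] - A[0][1] * A[1][0]

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

A = [[3, 2], [-4, 5]]   # det = 23 (Example 1.7.4)
B = [[1, 2], [0, 4]]    # upper triangular: det = 1 * 4 (Result 1.7.2)

print(det2(matmul(A, B)) == det2(A) * det2(B))   # Result 1.7.4 a): True
alpha = 3
aA = [[alpha * x for x in row] for row in A]
print(det2(aA) == alpha ** 2 * det2(A))          # Result 1.7.3 with n = 2: True
```

Note the exponent in Result 1.7.3: multiplying the whole n × n matrix by α scales every one of the n rows, so the determinant picks up a factor of α per row.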
a. Find two elementary matrices, say E₁, E₂, such that E₁E₂A = A₂.

b. Find two elementary matrices, say F₁, F₂, such that F₁F₂A₂ = A.

6. Let A = [1 2 4; −1 −2 3; −2 −3 −6]. Find det(A). Is A invertible? Explain.

7. Let A = [4 −2; −3 2]. Is A invertible? If yes, find A⁻¹.

8. Use Cramer's Rule to find the solution for x₂ in the system:

2x₁ + x₂ − x₃ = 2
−2x₁ + 4x₂ + 2x₃ = 8
−2x₁ − x₂ + 8x₃ = −2

9. Let A = [2 −4 2 1; −2 0 2 −1; 1 −2 12 4; −2 4 −2 12]. Find the (2,4) entry of A⁻¹.

10. Find a 2 × 2 matrix A such that [−2 2; 1 4]·A + 3I₂ = 2A + [5 5; 3 2].

11. Given A⁻¹ = [2 −2 −2; −2 3 0; −2 2 3] and B = [1 −2 −2; −4 2 2; 0 1 −2]. Solve the system AX = [0; −1; 1].

Chapter 2

Vector Spaces

We start this chapter by reviewing some concepts of set theory, and we discuss some important concepts of vector spaces, including span and dimension. In the remaining sections we introduce the concept of linear independence. At the end of this chapter we discuss other concepts, such as subspace and basis.

2.1 Span and Vector Spaces

In this section, we review some concepts of set theory, and we give an introduction to span and vector spaces, including some examples related to these concepts. Before reviewing the concepts of set theory, it is recommended to revisit section 1.4 and read the notations of numbers and the representation of the three sets of numbers in figure 1.4.1.

Let's explain some symbols and notations of set theory:

3 ∈ ℤ means that 3 is an element of ℤ.
½ ∉ ℤ means that ½ is not an element of ℤ.
{ } means that it is a set.
{5} means that 5 is a subset of ℤ, and the set consists of exactly one element, which is 5.

Definition 2.1.1 The span of a certain set is the set of all possible linear combinations of the subset of that set.

Example 2.1.1 Find Span{1}.

Solution: According to definition 2.1.1, the span of the set {1} is the set of all possible linear combinations of the subset of {1}, which is 1.

Hence, Span{1} = ℝ.

Example 2.1.2 Find Span{(1,2),(2,3)}.

Solution: According to definition 2.1.1, the span of the set {(1,2),(2,3)} is the set of all possible linear combinations of the subsets of {(1,2),(2,3)}, which are (1,2) and (2,3). Thus, the following are some possible linear combinations:

(1,2) = 1·(1,2) + 0·(2,3)
(2,3) = 0·(1,2) + 1·(2,3)
(5,8) = 1·(1,2) + 2·(2,3)

Hence, {(1,2),(2,3),(5,8)} ∈ Span{(1,2),(2,3)}.

Example 2.1.3 Find Span{0}.

Solution: According to definition 2.1.1, the span of the set {0} is the set of all possible linear combinations of the subset of {0}, which is 0.

Hence, Span{0} = 0.

Example 2.1.4 Find Span{c}, where c is a non-zero integer.

Solution: Using definition 2.1.1, the span of the set {c} is the set of all possible linear combinations of the subset of {c}, which is c ≠ 0.

Thus, Span{c} = ℝ.

Definition 2.1.2 ℝⁿ = {(a₁, a₂, a₃, …, aₙ) | a₁, a₂, a₃, …, aₙ ∈ ℝ} is the set of all points where each point has exactly n coordinates.

Definition 2.1.3 (V, +, ·) is a vector space if it satisfies the following:

a. For every v₁, v₂ ∈ V, v₁ + v₂ ∈ V.
b. For every α ∈ ℝ and v ∈ V, αv ∈ V.

(i.e. given Span{x, y} for a set {x, y}, then √10·x + 2y ∈ Span{x, y}. Let's assume that v ∈ Span{x, y}; then v = c₁x + c₂y for some numbers c₁ and c₂.)
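Membership in a span like Span{(1,2),(2,3)} from Example 2.1.2 amounts to solving a small linear system for the coefficients. The following is an illustrative sketch for the 2 × 2 case (not from the book; it uses Cramer-style formulas, and the helper name is my own):

```python
from fractions import Fraction

def in_span_2d(u, v, w):
    """Solve a*u + b*v == w for points in R^2.
    Returns (a, b) when u and v are independent, else None."""
    det = u[0] * v[1] - u[1] * v[0]
    if det == 0:
        return None                      # u, v dependent: span is only a line
    a = Fraction(w[0] * v[1] - w[1] * v[0], det)
    b = Fraction(u[0] * w[1] - u[1] * w[0], det)
    return a, b

# (5, 8) = 1*(1, 2) + 2*(2, 3), the combination shown in Example 2.1.2:
print(in_span_2d((1, 2), (2, 3), (5, 8)))  # (Fraction(1, 1), Fraction(2, 1))
```

When the returned pair exists for every target point, the two vectors span all of ℝ², which is why Span{(1,2),(2,3)} = ℝ² while a single vector (Fact 2.2.1, next section) is never enough.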
2.2 The Dimension of Vector Space

Definition 2.2.1 Given a vector space V, the dimension of V is the minimum number of elements needed in V so that their Span is equal to V; it is denoted by dim(V). (i.e. dim(ℝ) = 1 and dim(ℝ²) = 2.)

Result 2.2.1 dim(ℝⁿ) = n.

Proof of Result 2.2.1 We will show that dim(ℝⁿ) = n.

Claim: D = Span{(1,0),(0,1)} = ℝ²

α₁(1,0) + α₂(0,1) = (α₁, α₂) ∈ ℝ²

Thus, D is a subset of ℝ² (D ⊆ ℝ²).

For every x₁, y₁ ∈ ℝ, (x₁, y₁) ∈ ℝ².

Therefore, (x₁, y₁) = x₁(1,0) + y₁(0,1) ∈ D.

We proved the above claim, and hence dim(ℝⁿ) = n.

Fact 2.2.1 Span{(3,4)} ≠ ℝ².

Proof of Fact 2.2.1 We will show that Span{(3,4)} ≠ ℝ². …

Fact 2.2.3 Span{(2,1),(1,0.5)} ≠ ℝ².

2.3 Linear Independence

In this section, we learn how to determine whether elements of a vector space are linearly independent or not.

Definition 2.3.1 Given a vector space (V, +, ·), we say v₁, v₂, …, vₙ ∈ V are linearly independent if none of them is a linear combination of the remaining vᵢ's. (i.e. (3,4), (2,0) ∈ ℝ² are linearly independent because we cannot write them as a linear combination of each other; in other words, we cannot find numbers α₁, α₂ such that (3,4) = α₁(2,0) and (2,0) = α₂(3,4).)

Definition 2.3.2 Given a vector space (V, +, ·), we say v₁, v₂, …, vₙ ∈ V are linearly dependent if at least one of the vᵢ's is a linear combination of the others.

Example 2.3.1 Assume v₁ and v₂ are linearly independent. Show that v₁ and 3v₁ + v₂ are linearly independent.

Solution: We will show that v₁ and 3v₁ + v₂ are linearly independent. Using proof by contradiction, we assume that v₁ and 3v₁ + v₂ are linearly dependent. Then, for some non-zero number c₁, v₁ = c₁(3v₁ + v₂). Using the distribution property and algebra, we obtain: …

Are these vectors independent elements?

Solution: First of all, to determine whether these vectors are independent elements or not, we need to write these vectors as a matrix:

[  1 0 −2 ]
[ −2 2  1 ]
[ −1 0  5 ]

Write each point as a row, and reduce this matrix to the Semi-Reduced Matrix.
ͳ െʹ Ͷ 70 M. Kaabar
െͳ ʹ Ͳ ʹ ൩ Each point is a row-operation. We
ͳ െʹ ͺ ͳͶ
need to reduce this matrix to Semi-Reduced Matrix.
ͳ െʹ Ͷ ܴ ܴ ՜ܴ ͳ െʹ Ͷ V D
ଵ ଶ ଶ
െͳ ʹ Ͳ ʹ ൩ െܴ ܴ ՜ ܴ Ͳ Ͳ Ͷ ͺ൩
ଵ ଷ ଷ
ͳ െʹ ͺ ͳͶ Ͳ Ͳ Ͷ ͺ
ͳ െʹ Ͷ
െܴଶ ܴଷ ՜ ܴଷ Ͳ Ͳ Ͷ ͺ൩ This is a Semi-Reduced
Ͳ Ͳ Ͳ Ͳ
Matrix.
Figure 2.4.1: Subspace of ܸ
Since there is a zero-row in the Semi-Reduced Matrix,
then the elements are dependent because we can write
at least one of them as a linear combination of the Fact 2.4.1 Every vector space is a subspace of itself.
others.
Example 2.4.1 Given a vector space ܮൌ ሼሺܿǡ ͵ܿሻȁܿ אԹሽ.
69
72 M. Kaabar
Now, we can answer the given questions as follows: Thus, ܮൌ ܵ݊ܽሼሺͳǡ͵ሻǡ ሺʹǡሻሽ.
We prove the above claim, and ܵ݊ܽሼሺͷǡͳͷሻሽ ് Թଶ . Solution: Since the equation of the above vector space
is a three-dimensional equation, there is no need to
Thus, ܮdoes not equal to Թଶ
draw it because it is difficult to draw it exactly. Thus,
Part c: Yes; ܮis a subspace of Թଶ because ܮlives inside
we can answer the above questions immediately.
a bigger vector space which is Թଶ .
Part a: Yes; ܦlives inside Թଷ .
71
74 M. Kaabar
Part b: D is a plane that passes through the origin (0,0,0). Since dim(D) = 2, any two independent points in D will form a basis for D. Hence, the following are some possible bases for D:

Basis for D is {(1,-1,0),(2,2,1)}.
Another basis for D is {(1,-1,0),(0,4,1)}.

Result 2.4.6 It is always true that |Basis| = dim(D).

Example 2.4.4 Given the following: M = Span{(-1,2,0,0),(1,-2,3,0),(-2,0,3,0)}. Find a basis for M.

Solution: We have an infinite set of points, and M lives inside ℝ⁴. Let's assume the following:

v1 = (-1,2,0,0)
v2 = (1,-2,3,0)
v3 = (-2,0,3,0)

We check whether v1, v2 and v3 are dependent elements. Using what we have learned so far from section 2.3 and example 2.4.3, we need to write these vectors as a matrix:

[ -1   2  0  0 ]
[  1  -2  3  0 ]
[ -2   0  3  0 ]

Each point is a row of the matrix. We need to reduce this matrix to a Semi-Reduced Matrix. Now, we apply the Row-Reduction Method to get the Semi-Reduced Matrix as follows:

R1 + R2 -> R2       [ -1   2  0  0 ]
-2R1 + R3 -> R3     [  0   0  3  0 ]
                    [  0  -4  3  0 ]

-R2 + R3 -> R3      [ -1   2  0  0 ]
                    [  0   0  3  0 ]
                    [  0  -4  0  0 ]

This is a Semi-Reduced Matrix. Since there is no zero-row in the Semi-Reduced Matrix, these elements are independent. All three points survived in the Semi-Reduced Matrix. Thus, dim(M) = 3. Since dim(M) = 3, any three independent points in M from the above matrices will form a basis for M. Hence, the following are some possible bases for M:

Basis for M is {(-1,2,0,0),(0,0,3,0),(0,-4,0,0)}.
Another basis for M is {(-1,2,0,0),(0,0,3,0),(0,-4,3,0)}.
Another basis for M is {(-1,2,0,0),(1,-2,3,0),(-2,0,3,0)}.

Example 2.4.5 Given the following: W = Span{(a, -2a + b, -a) | a, b ∈ ℝ}.

a. Show that W is a subspace of ℝ³.
b. Find a basis for W.
c. Rewrite W as a Span.

Solution: We have an infinite set of points, and W lives inside ℝ³.
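The dependency test of Example 2.4.4 can be cross-checked numerically: the rows are independent exactly when no zero-row appears in the Semi-Reduced Matrix, i.e. when the rank equals the number of rows. A minimal sketch with NumPy (the use of `numpy.linalg.matrix_rank` is an assumption of this sketch, not part of the text):

```python
import numpy as np

# Rows are the spanning points of Example 2.4.4: M = Span{v1, v2, v3} in R^4.
A = np.array([[-1,  2, 0, 0],
              [ 1, -2, 3, 0],
              [-2,  0, 3, 0]], dtype=float)

# Rank equal to the number of rows means the rows are independent.
rank = int(np.linalg.matrix_rank(A))
print(rank)  # 3, so the three points form a basis and dim(M) = 3
```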
v2 = (0,5,1,1)
v3 = (0,0,2,3)
v4 = (0,0,0,π)

Thus, the basis for ℝ⁴ is {v1, v2, v3, v4}, and

[ 0   2  1     4 ]                  [ 0  2  1     4 ]
[ 0  -2  3   -10 ]  R1 + R2 -> R2   [ 0  0  4    -6 ]
[ 0   0  4    -6 ]                  [ 0  0  4    -6 ]
[ 0   0  0  1000 ]                  [ 0  0  0  1000 ]
The First Way: We look for α1, α2 and α3 such that

1 = α1 - α2 + 0·α3
1 = α1 - α2 + 0·α3
2 = α1 + α3
2 = α1 + α3

Using what we have learned from chapter 1 to solve the above system of linear equations, we obtain a solution; for instance, α1 = 1, α2 = 0 and α3 = 1.

Hence, Yes: (1,1,2,2) ∈ D.

The Second Way (Recommended): We first need to find dim(D), and then a basis for D. We have to write v1, v2 and v3 as a matrix:

[  1   1  1  1 ]
[ -1  -1  0  0 ]
[  0   0  1  1 ]

Each point is a row of the matrix. We need to reduce this matrix to a Semi-Reduced Matrix. Now, we apply the Row-Reduction Method to get the Semi-Reduced Matrix as follows:

R1 + R2 -> R2     [ 1  1  1  1 ]
                  [ 0  0  1  1 ]
                  [ 0  0  1  1 ]

Thus, D = Span{(1,1,1,1),(0,0,1,1)}.

Now, we ask ourselves the following question:

Question: Can we find α1 and α2 such that (1,1,2,2) = α1·(1,1,1,1) + α2·(0,0,1,1)?

Answer: Yes:

1 = α1
1 = α1
2 = α1 + α2
2 = α1 + α2

Thus, α1 = α2 = 1.

Hence, Yes: (1,1,2,2) ∈ D.

2.5 Exercises

1. Let M = Span{(1,-1,1),(-1,0,1),(0,-1,2)}
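The membership question above — is (1,1,2,2) in D = Span{(1,1,1,1),(0,0,1,1)}? — amounts to solving a small linear system for the coefficients. A hedged NumPy sketch (illustrative, not part of the text):

```python
import numpy as np

# Basis vectors of D as columns; we look for alpha with basis @ alpha = target.
basis = np.array([[1.0, 0.0],
                  [1.0, 0.0],
                  [1.0, 1.0],
                  [1.0, 1.0]])
target = np.array([1.0, 1.0, 2.0, 2.0])

# Least squares finds the best coefficients; a zero residual means membership.
alpha, *_ = np.linalg.lstsq(basis, target, rcond=None)
in_span = bool(np.allclose(basis @ alpha, target))
print(alpha, in_span)
```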
c. Is C = [ -2  0 ; 0  -1 ] ∈ D? Why?
d. If the answer to part c is yes, then write C as a linear combination of the elements in A. Otherwise, write the basis of A as a Span.

7. Let K = Span{(1,-1,0),(2,-1,0),(1,0,0)}. Find dim(K).

8. Find a basis for the subspace of ℝ⁴ spanned by {(2,9,-2,53),(-3,2,3,-2),(8,-3,-8,17),(0,-3,0,15)}.

9. Does Span{(-2,1,2),(2,1,-1),(2,3,0)} equal ℝ³?

3.1 Null Space and Rank

In this section, we first give an introduction to homogeneous systems, and we discuss how to find the null space and rank of homogeneous systems. In addition, we explain how to find the row space and column space.

Definition 3.1.1 A Homogeneous System is an m × n system of linear equations in which all constant terms are zero. For example, the following is a homogeneous system:

2x1 + x2 - x3 + x4 = 0
3x1 + 5x2 + 3x3 + 4x4 = 0
-x2 + x3 - x4 = 0
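The homogeneous system above can be examined numerically. This sketch (NumPy assumed) checks that the zero vector always solves it and counts the free variables via the rank:

```python
import numpy as np

# Coefficient matrix of the homogeneous system in Definition 3.1.1.
C = np.array([[2,  1, -1,  1],
              [3,  5,  3,  4],
              [0, -1,  1, -1]], dtype=float)

zero = np.zeros(4)
assert np.allclose(C @ zero, 0)   # x1 = x2 = x3 = x4 = 0 always solves it

# Number of free variables = number of unknowns - Rank(C).
free_vars = C.shape[1] - int(np.linalg.matrix_rank(C))
print(free_vars)  # 1
```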
Imagine we have the following solution to the homogeneous system: x1 = x2 = x3 = x4 = 0. Then, this solution can be viewed as a point of ℝⁿ (here it is ℝ⁴): (0,0,0,0).

Result 3.1.1 The solutions of an m × n homogeneous system can be written as {(a1, a2, a3, a4, …, an) | a1, a2, a3, a4, …, an ∈ ℝ}.

Result 3.1.2 All solutions of an m × n homogeneous system form a subset of ℝⁿ, where n equals the number of variables.

Result 3.1.3 Given an m × n homogeneous system, we write it in matrix form:

C · [x1, x2, x3, …, xn]^T = [0, 0, …, 0]^T

where C is the coefficient matrix. Then, the set of all solutions of this system is a subspace of ℝⁿ.

Proof of Result 3.1.3 We assume that M = (m1, m2, …, mn) and W = (w1, w2, …, wn) are two solutions of the above system. We write them in matrix form:

C · [m1, m2, …, mn]^T = [0, …, 0]^T and C · [w1, w2, …, wn]^T = [0, …, 0]^T

We will show that M + W is a solution. Now, using algebra:

C · [m1, …, mn]^T + C · [w1, …, wn]^T = [0, …, 0]^T

By taking C as a common factor, we obtain:

C · ([m1, …, mn]^T + [w1, …, wn]^T) = [0, …, 0]^T
C · [m1 + w1, …, mn + wn]^T = [0, …, 0]^T

Thus, M + W is a solution.

Fact 3.1.1 If M = (m1, m2, …, mn) is a solution and α ∈ ℝ, then αM = (αm1, αm2, …, αmn) is a solution.

Fact 3.1.2 The only system whose solutions form a vector space is the homogeneous system.

Definition 3.1.2 The Null Space of a matrix A is the set of all solutions of the homogeneous system, and it is denoted by Null(A) or N(A).

Definition 3.1.3 The Rank of a matrix A is the number of independent rows or columns of A, and it is denoted by Rank(A).
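The closure argument of Result 3.1.3 and Fact 3.1.1 is easy to check on a concrete system. A small sketch (the 1 × 3 matrix is an assumed example, not from the text):

```python
import numpy as np

# A 1 x 3 homogeneous system: x1 + x2 + x3 = 0.
C = np.array([[1.0, 1.0, 1.0]])

M = np.array([1.0, -1.0, 0.0])   # one solution
W = np.array([1.0, 0.0, -1.0])   # another solution
assert np.allclose(C @ M, 0) and np.allclose(C @ W, 0)

# Result 3.1.3: the sum of two solutions is a solution.
assert np.allclose(C @ (M + W), 0)
# Fact 3.1.1: any scalar multiple of a solution is a solution.
assert np.allclose(C @ (3.0 * M), 0)
```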
Definition 3.1.4 The Row Space of a matrix A is the Span of the independent rows of A, and it is denoted by Row(A).

Definition 3.1.5 The Column Space of a matrix A is the Span of the independent columns of A, and it is denoted by Column(A).

Example 3.1.1 Given the following 3 × 5 matrix:

A = [ 1  -1  2  0  -1 ]
    [ 0   1  2  0   2 ]
    [ 0   0  0  1   0 ]

a. Find Null(A).
b. Find dim(Null(A)).
c. Rewrite Null(A) as a Span.
d. Find Rank(A).
e. Find Row(A).

Solution: Part a: To find the null space of A, we need to find the solution set of A as follows:

Step 1: Write the above matrix as an augmented matrix, with all constant terms equal to zero:

[ 1  -1  2  0  -1 | 0 ]
[ 0   1  2  0   2 | 0 ]
[ 0   0  0  1   0 | 0 ]

Step 2: Apply what we have learned from chapter 1 to solve systems of linear equations using the Row-Operation Method:

R2 + R1 -> R1     [ 1  0  4  0  1 | 0 ]
                  [ 0  1  2  0  2 | 0 ]
                  [ 0  0  0  1  0 | 0 ]

This is a Completely-Reduced Matrix.

Step 3: Read the solution of the above system of linear equations after using the Row-Operation Method:

x1 + 4x3 + x5 = 0
x2 + 2x3 + 2x5 = 0
x4 = 0

The free variables are x3 and x5. Assuming that x3, x5 ∈ ℝ, the solution of the above homogeneous system is as follows:

x1 = -4x3 - x5
x2 = -2x3 - 2x5
x4 = 0

Thus, according to definition 3.1.2,

Null(A) = {(-4x3 - x5, -2x3 - 2x5, x3, 0, x5) | x3, x5 ∈ ℝ}.

Part b: It is always true that dim(Null(A)) = dim(N(A)) = the number of free variables. Here, dim(Null(A)) = 2.
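The null space found in Example 3.1.1 can be reproduced numerically. This sketch builds an orthonormal basis of Null(A) from the SVD (the helper `null_space` and its tolerance are assumptions of the sketch):

```python
import numpy as np

def null_space(A, tol=1e-10):
    """Columns of the returned matrix form an orthonormal basis of Null(A)."""
    _, s, vT = np.linalg.svd(np.asarray(A, dtype=float))
    nnz = int(np.sum(s > tol))   # number of nonzero singular values = Rank(A)
    return vT[nnz:].T

A = np.array([[1, -1, 2, 0, -1],
              [0,  1, 2, 0,  2],
              [0,  0, 0, 1,  0]], dtype=float)

N = null_space(A)
assert np.allclose(A @ N, 0)     # every column really solves Ax = 0
print(N.shape[1])                # 2 = dim(Null(A)), the two free variables

# Result 3.1.4 (rank-nullity): Rank(A) + dim(Null(A)) = number of columns.
assert int(np.linalg.matrix_rank(A)) + N.shape[1] == A.shape[1]
```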
Part d: To find the rank of matrix A, we just need to change matrix A to the Semi-Reduced Matrix. We already did that in part a. Thus, Rank(A) = 3.

Part e: To find the row space of matrix A, we just need to write the Span of its independent rows. Thus,

Row(A) = Span{(1,-1,2,0,-1),(0,1,2,0,2),(0,0,0,1,0)}.

It is also a subspace of ℝ⁵.

Result 3.1.4 Let A be an m × n matrix. Then, Rank(A) + dim(N(A)) = n = the number of columns of A.

Result 3.1.5 Let A be an m × n matrix. Geometrically, Row(A) = Span{Independent Rows} "lives" inside ℝⁿ.

Solution: Part a: To find the row space of B, we need to change matrix B to the Semi-Reduced Matrix as follows:

[  1   1   1  1  1 ]  R1 + R2 -> R2   [ 1  1  1  1  1 ]
[ -1  -1  -1  0  2 ]  ------------->  [ 0  0  0  1  3 ]
[  0   0   0  0  0 ]                  [ 0  0  0  0  0 ]

This is a Semi-Reduced Matrix. To find the row space of matrix B, we just need to write the Span of its independent rows. Thus,

Row(B) = Span{(1,1,1,1,1),(0,0,0,1,3)}.

Part b: To find the column space of B, we need to change matrix B to the Semi-Reduced Matrix. We already did that in part a. Now, we need to locate the columns in the Semi-Reduced Matrix of B that contain the leaders (pivots).

Part c: To find the rank of matrix B, we just need to change matrix B to the Semi-Reduced Matrix. We already did that in part a.

P_n = the set of all polynomials of degree less than n. The algebraic expression of such a polynomial is of the following form:

a_{n-1} x^{n-1} + a_{n-2} x^{n-2} + … + a_1 x + a_0
b. Let D = Span{3x² - 2, -5x, x² - 10x - 4}. Find a basis for D.

Solution: Part a: We know that these polynomials live in P₃, and as a vector space P₃ is the same as ℝ³. According to result 3.2.2, we need to make each polynomial equivalent to a point of ℝ³; for example, (0,-5,0) ↔ -5x.

Part b: Since only 2 vectors survived after checking for dependency in part a, those two vectors form a basis for D.

Result 3.2.4 Given points v1, v2, …, vk in ℝⁿ where k < n. Choose one particular point, say Q, such that Q = c1v1 + c2v2 + ⋯ + ckvk where c1, c2, …, ck are constants. Then

Q = c1α2v2 + c1α3v3 + ⋯ + c1αkvk + c2v2 + ⋯ + ckvk
Q = (c1α2 + c2)v2 + (c1α3 + c3)v3 + ⋯ + (c1αk + ck)vk + 0·v1.

Thus, none of them is a linear combination of the others, which means that they are linearly independent. This is a contradiction. Therefore, our assumption that v1, v2, …, vk were linearly dependent is false. Hence, v1, v2, …, vk are linearly independent.

Result 3.2.5 Assume v1, v2, …, vk are independent and Q ∈ Span{v1, v2, …, vk}. Then, there exist unique numbers c1, c2, …, ck such that Q = c1v1 + c2v2 + ⋯ + ckvk.

Linear Transformation:

Definition 3.2.1 T: V → W where V is a domain and W is a co-domain. T is a linear transformation if for every v1, v2 ∈ V and α ∈ ℝ, we have the following: T(αv1 + v2) = αT(v1) + T(v2).

Part b: Since T((a1,a2)) = (3a1 + a2, a2, -a1), then with a1 = 1 and a2 = 0 we get T((1,0)) = (3(1) + 0, 0, -1) = (3,0,-1).

Part c: Proof: We assume that v1 = (a1,a2), v2 = (b1,b2), and α ∈ ℝ. We will show that T is a linear transformation. Using algebra, we start from the Left-Hand-Side (LHS):

αv1 + v2 = (αa1 + b1, αa2 + b2)
T(αv1 + v2) = T((αa1 + b1, αa2 + b2))
T(αv1 + v2) = (3αa1 + 3b1 + αa2 + b2, αa2 + b2, -αa1 - b1)

Now, we start from the Right-Hand-Side (RHS):

αT(v1) + T(v2) = αT(a1,a2) + T(b1,b2)
αT(v1) + T(v2) = α(3a1 + a2, a2, -a1) + (3b1 + b2, b2, -b1)
= (3αa1 + αa2, αa2, -αa1) + (3b1 + b2, b2, -b1)
= (3αa1 + αa2 + 3b1 + b2, αa2 + b2, -αa1 - b1)

Since the LHS equals the RHS, T is a linear transformation.
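Definition 3.2.1 also suggests a direct numerical test. This sketch probes T(αv1 + v2) = αT(v1) + T(v2) at random points for the map T((a1,a2)) = (3a1 + a2, a2, -a1) proven linear above; sampling is only illustrative, not a proof:

```python
import numpy as np

def T(v):
    """The map of the example: T((a1, a2)) = (3*a1 + a2, a2, -a1)."""
    a1, a2 = v
    return np.array([3 * a1 + a2, a2, -a1])

rng = np.random.default_rng(0)
for _ in range(100):
    v1, v2 = rng.normal(size=2), rng.normal(size=2)
    alpha = float(rng.normal())
    # Definition 3.2.1: T(alpha*v1 + v2) must equal alpha*T(v1) + T(v2).
    assert np.allclose(T(alpha * v1 + v2), alpha * T(v1) + T(v2))
```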
Result 3.2.6 Given T: ℝⁿ → ℝᵐ. Then T((a1, a2, a3, …, an)) has each coordinate equal to a linear combination of the ai's.

Example 3.2.3 Given T: ℝ³ → ℝ⁴ where ℝ³ is a domain and ℝ⁴ is a co-domain.

a. If T((x1,x2,x3)) = (-3x3 + x1, -10x2, 13, -x3), is T a linear transformation?
b. If T((x1,x2,x3)) = (-3x3 + x1, -10x2, 0, -x3), is T a linear transformation?

Solution: Part a: Since 13 is not a linear combination of x1, x2 and x3, T is not a linear transformation.

Part b: Since 0 is a linear combination of x1, x2 and x3, T is a linear transformation.

Example 3.2.4 Given T: ℝ² → ℝ³ where ℝ² is a domain and ℝ³ is a co-domain. If T((a1,a2)) = (a1² + a2, -a2), is T a linear transformation?

Solution: Since a1² + a2 is not a linear combination of a1 and a2, T is not a linear transformation.

Example 3.2.5 Given T: ℝ → ℝ. If T(x) = 10x, is T a linear transformation?

Solution: Since 10x is a linear combination of x (that is, αx = 10x with α = 10), T is a linear transformation.

Example 3.2.6 Find the standard basis for ℝ².

Solution: The standard basis for ℝ² is the rows of I₂. Since

I₂ = [ 1  0 ]
     [ 0  1 ]

the standard basis for ℝ² is {(1,0),(0,1)}.

Example 3.2.7 Find the standard basis for ℝ³.

Solution: The standard basis for ℝ³ is the rows of I₃. Since

I₃ = [ 1  0  0 ]
     [ 0  1  0 ]
     [ 0  0  1 ]

the standard basis for ℝ³ is {(1,0,0),(0,1,0),(0,0,1)}.

Example 3.2.8 Find the standard basis for P₃.

Solution: The standard basis for P₃ is {1, x, x²}.

Example 3.2.9 Find the standard basis for P₄.

Solution: The standard basis for P₄ is {1, x, x², x³}.

Example 3.2.10 Find the standard basis for ℝ²ˣ² = M₂ₓ₂(ℝ).

Solution: The standard basis for ℝ²ˣ² = M₂ₓ₂(ℝ) is

{ [ 1  0 ]   [ 0  1 ]   [ 0  0 ]   [ 0  0 ] }
{ [ 0  0 ] , [ 0  0 ] , [ 1  0 ] , [ 0  1 ] }

because ℝ²ˣ² = M₂ₓ₂(ℝ) = ℝ⁴ as a vector space, where the standard basis for ℝ²ˣ² is the rows of

I₄ = [ 1  0  0  0 ]
     [ 0  1  0  0 ]
     [ 0  0  1  0 ]
     [ 0  0  0  1 ]

represented as 2 × 2 matrices.

Example 3.2.11 Let T: ℝ² → ℝ³ be a linear transformation such that
T(2,0) = (0,1,4)
T(-1,1) = (2,1,5)

Find T(3,5).
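A linear transformation is determined by its values on two independent points, so T(3,5) can be computed by first solving for the coordinates of (3,5) in terms of (2,0) and (-1,1). A NumPy sketch of that computation:

```python
import numpy as np

# Columns are the points where T is known: (2,0) and (-1,1).
P = np.array([[2.0, -1.0],
              [0.0,  1.0]])
c = np.linalg.solve(P, np.array([3.0, 5.0]))   # coordinates of (3,5)

# Images under T of the two points, as rows.
images = np.array([[0.0, 1.0, 4.0],    # T(2,0)
                   [2.0, 1.0, 5.0]])   # T(-1,1)

T35 = c @ images   # c1*T(2,0) + c2*T(-1,1)
print(T35)
```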
Solution: The given points are (2,0) and (-1,1). These two points are independent because of the following:

[  2  0 ]  (1/2)R1 + R2 -> R2   [ 2  0 ]
[ -1  1 ]  ------------------>  [ 0  1 ]

Every point in ℝ² is a linear combination of (2,0) and (-1,1). There exist unique numbers c1 and c2 such that (3,5) = c1(2,0) + c2(-1,1):

3 = 2c1 - c2
5 = c2

Now, we substitute c2 = 5 into 3 = 2c1 - c2, and we obtain:

3 = 2c1 - 5
c1 = 4

Hence, (3,5) = 4(2,0) + 5(-1,1).

T(3,5) = T(4(2,0) + 5(-1,1))
T(3,5) = 4T(2,0) + 5T(-1,1)
T(3,5) = 4(0,1,4) + 5(2,1,5) = (10,9,41)

Thus, T(3,5) = (10,9,41).

Example 3.2.12 Let T: ℝ → ℝ be a linear transformation such that T(1) = 3. Find T(5).

Solution: Since T is a linear transformation, T(5) = T(5·1) = 5T(1) = 5(3) = 15. If T were not a linear transformation, it would be impossible to find T(5) from T(1) alone.

3.3 Kernel and Range

In this section, we discuss how to find the standard matrix representation, and we give examples of how to find the kernel and the range.

Definition 3.3.1 Given T: ℝⁿ → ℝᵐ where ℝⁿ is a domain and ℝᵐ is a co-domain. Then, the Standard Matrix Representation is an m × n matrix; that is, a dim(Co-domain) × dim(Domain) matrix.

Definition 3.3.2 Given T: ℝⁿ → ℝᵐ where ℝⁿ is a domain and ℝᵐ is a co-domain. The Kernel is the set of all points in the domain whose image is the origin point, and it is denoted by Ker(T). This means that Ker(T) = Null Space of T.

Definition 3.3.3 The Range is the column space of the standard matrix representation, and it is denoted by Range(T).

Example 3.3.1 Given T: ℝ³ → ℝ⁴ where ℝ³ is a domain and ℝ⁴ is a co-domain.

T((x1,x2,x3)) = (-5x1, 2x2 + x3, -x1, 0)

a. Find the Standard Matrix Representation.
b. Find T((3,2,1)).
c. Find Ker(T).
d. Find Range(T).

Solution: Part a: According to definition 3.3.1, the Standard Matrix Representation, let's call it M, here is
4 × 3. We know from section 3.2 that the standard basis for the domain (here ℝ³) is {(1,0,0),(0,1,0),(0,0,1)}. We assume the following:

v1 = (1,0,0)
v2 = (0,1,0)
v3 = (0,0,1)

Part b:

T((3,2,1)) = 3·(-5, 0, -1, 0) + 2·(0, 2, 0, 0) + 1·(0, 1, 0, 0) = (-15, 5, -3, 0)

Part c: To find Ker(T), we set the equations of T equal to 0:

[ -5  0  0 | 0 ]
[  0  2  1 | 0 ]   Original Matrix
[ -1  0  0 | 0 ]
[  0  0  0 | 0 ]

Solution:

Part a: T(2x - 1) = ∫₀¹ (2x - 1) dx = (x² - x)|₀¹ = 0.

Part b: To find Ker(T), we set the equation of T equal to 0, with f(x) = a0 + a1·x ∈ P₂.
Thus,

T(f(x)) = ∫₀¹ (a0 + a1·x) dx = (a0·x + (a1/2)·x²)|₀¹ = 0
a0 + a1/2 - 0 = 0
a0 = -a1/2

Hence, Ker(T) = {-a1/2 + a1·x | a1 ∈ ℝ}. We also know that dim(Ker(T)) = 1 because there is one free variable. In addition, we can also find a basis by letting a1 be any real number not equal to zero, say a1 = 1, as follows:

Basis = {-1/2 + x}

Thus, Ker(T) = Span{-1/2 + x}.

Part c: It is very easy to find the range here. Range(T) = ℝ because we linearly transform from a second degree polynomial to a real number. For example, if we linearly transform from a third degree polynomial to a second degree polynomial, then the range will be P₂.

3.4 Exercises

1. Given T: ℝ³ → ℝ²ˣ² such that T((x1,x2,x3)) = [ x1  x1 ; x3  x2 ] is a linear transformation.
a. Find the standard matrix representation of T.
b. Find Ker(T).
c. Find a basis for Range(T) and write Range(T) as a Span.

2. Let T: ℝ² → ℝ be a linear transformation such that T(1,0) = 4 and T(2,-2) = 2. Find the standard matrix representation of T.

3. Given T: P₃ → ℝ such that T(a + bx + cx²) = ∫₀¹ (a + bx + cx²) dx. Find Ker(T).

4. Given T: ℝ⁴ → ℝ³ is a linear transformation such that T(x,y,z,w) = (x + y + z - 2w, -2w, w).
a. Find the standard matrix representation of T.
b. Find dim(Ker(T)).
c. Find Range(T).

5. Given T: P₄ → ℝ²ˣ² such that T(g(x)) = [ g(-1)  g(0) ; g(-1)  g(0) ] is a linear transformation.
a. Find the standard matrix representation of T.
b. Find Ker(T) and write Ker(T) as a Span.
c. Find a basis for Range(T) and write Range(T) as a Span.

6. Given T: P₃ → ℝ is a linear transformation such that T(1) = 6, T(x² + x) = -5, and T(x² + 2x + 1) = 4.
a. Find T(x), T(x²) and T(5x² + 3x + 8).
b. Find the standard matrix representation of T.
c. Find Ker(T) and write Ker(T) as a Span.
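For Example 3.3.1 above, the standard matrix representation, the image of (3,2,1) and the kernel can all be checked numerically. A sketch (the SVD-based null-space computation is an assumption of this sketch):

```python
import numpy as np

# T((x1,x2,x3)) = (-5x1, 2x2 + x3, -x1, 0) from Example 3.3.1.
def T(x):
    x1, x2, x3 = x
    return np.array([-5 * x1, 2 * x2 + x3, -x1, 0.0])

# Columns of the standard matrix are the images of the standard basis of R^3.
M = np.column_stack([T(e) for e in np.eye(3)])
assert np.allclose(M @ np.array([3.0, 2.0, 1.0]), T([3.0, 2.0, 1.0]))

# Ker(T) = Null(M): singular vectors whose singular value is zero.
_, s, vT = np.linalg.svd(M)
kernel = vT[int(np.sum(s > 1e-10)):].T
assert np.allclose(M @ kernel, 0)
print(kernel.shape[1])   # dim(Ker(T))
```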
Chapter 4
Example 4.2.2 Assume A is a 4 × 4 matrix with det(A - αI₄) = (2 - α)³ · (3 - α), and given that dim(E₂) = 3 and dim(E₃) = 1. Is A a diagonalizable matrix?

Solution: According to result 4.2.1, it is a diagonalizable matrix.

Example 4.2.3 Given the following 3 × 3 matrix:

A = [ 2  0   1 ]
    [ 0  1  -2 ]
    [ 0  0  -1 ]

Use example 4.1.2 from section 4.1 to answer the following questions:

a. Is A a diagonalizable matrix? If yes, find a diagonal matrix D and an invertible matrix L such that A = LDL⁻¹.
b. Find A⁶ such that A = LDL⁻¹.
c. Find A³ such that A = LDL⁻¹.

Solution: Part a: From example 4.1.2, we found the following:

E₂ = Span{(1,0,0)}
E₁ = Span{(0,1,0)}
E₋₁ = Span{(-1/3, 1, 1)}

According to result 4.2.1, A is a diagonalizable matrix. Now, we need to find a diagonal matrix D and an invertible matrix L such that A = LDL⁻¹. To find the diagonal matrix D, we create a 3 × 3 matrix and put the eigenvalues on the main diagonal (with repetition if there is a repetition); all other elements are zeros. Hence, the diagonal matrix D is as follows:

D = [ 2   0  0 ]
    [ 0  -1  0 ]
    [ 0   0  1 ]

To find an invertible matrix L, we create a 3 × 3 matrix like the one above, but each eigenvalue is represented by a column of the eigenspace that corresponds to that eigenvalue, as follows:

L = [ 1  -1/3  0 ]
    [ 0   1    1 ]
    [ 0   1    0 ]

Thus,

A = [ 2  0   1 ]
    [ 0  1  -2 ]  = LDL⁻¹
    [ 0  0  -1 ]

  = [ 1  -1/3  0 ] [ 2   0  0 ] [ 1  -1/3  0 ]⁻¹
    [ 0   1    1 ] [ 0  -1  0 ] [ 0   1    1 ]
    [ 0   1    0 ] [ 0   0  1 ] [ 0   1    0 ]

Part b: To find A⁶ such that A = LDL⁻¹, we do the following steps:

A⁶ = (LDL⁻¹)·(LDL⁻¹)· … ·(LDL⁻¹)
A⁶ = (LD²L⁻¹)· … ·(LDL⁻¹)
A⁶ = LD⁶L⁻¹
Thus,

A⁶ = LD⁶L⁻¹ = L [ 2   0  0 ]⁶ L⁻¹ = L [ 2⁶  0      0  ] L⁻¹ = L [ 64  0  0 ] L⁻¹
                [ 0  -1  0 ]           [ 0   (-1)⁶  0  ]         [ 0   1  0 ]
                [ 0   0  1 ]           [ 0   0      1⁶ ]         [ 0   0  1 ]

Part c: To find A³ such that A = LDL⁻¹, we do the following steps:

A³ = (LDL⁻¹)·(LDL⁻¹)·(LDL⁻¹)
A³ = (LD²L⁻¹)·(LDL⁻¹)
A³ = LD³L⁻¹
Thus,

A³ = LD³L⁻¹ = L [ 2   0  0 ]³ L⁻¹ = L [ 2³  0      0  ] L⁻¹ = L [ 8   0  0 ] L⁻¹
                [ 0  -1  0 ]           [ 0   (-1)³  0  ]         [ 0  -1  0 ]
                [ 0   0  1 ]           [ 0   0      1³ ]         [ 0   0  1 ]
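The identity Aⁿ = LDⁿL⁻¹ used in parts b and c can be verified numerically for the matrix of Example 4.2.3. A NumPy sketch:

```python
import numpy as np

A = np.array([[2.0, 0.0,  1.0],
              [0.0, 1.0, -2.0],
              [0.0, 0.0, -1.0]])

# Eigenvalues on the diagonal of D; matching eigenvectors as the columns of L.
D = np.diag([2.0, -1.0, 1.0])
L = np.array([[1.0, -1.0 / 3.0, 0.0],
              [0.0,  1.0,       1.0],
              [0.0,  1.0,       0.0]])
Linv = np.linalg.inv(L)

assert np.allclose(L @ D @ Linv, A)                 # A = L D L^-1
# Powers reduce to powers of the diagonal: A^3 = L D^3 L^-1.
A3 = L @ np.diag(np.diag(D) ** 3) @ Linv
assert np.allclose(A3, np.linalg.matrix_power(A, 3))
```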
4.3 Exercises

1. Given the following 4 × 4 matrix:

C = [ 1  0   0   0 ]
    [ 0  1   1   1 ]
    [ 0  0  -1   1 ]
    [ 0  0   0  -1 ]

Is C a diagonalizable matrix?

2. Assume that A is a 5 × 5 diagonalizable matrix, and given the following:

E₃ = Span{(2,1,0,0,1),(0,1,0,1,1),(0,0,2,2,0)}
dim(E₃) = 3
E₂ = Span{(0,0,0,1,1),(0,0,0,0,10)}

a. Find the characteristic polynomial of A.
b. Find a diagonal matrix D such that A = LDL⁻¹.
c. Find an invertible matrix L such that A = LDL⁻¹.

3. Assume that W is a 3 × 3 matrix, and det(W - αI₃) = (1 - α)(2 - α)². Given N(W - I) = Span{(1,2,0)} and N(W - 2I) = Span{(2,0,3)}. Is W a diagonalizable matrix? Explain.

Chapter 5

Matrix Dot Product

In this chapter, we discuss the dot product only in ℝⁿ. In addition, we give some results about the dot product in ℝⁿ. At the end of this chapter, we get introduced to a concept called "Gram-Schmidt Orthonormalization".

5.1 The Dot Product in ℝⁿ

In this section, we first give an example of the dot product in ℝⁿ, and then we give three important results related to this concept.

Example 5.1.1 Assume that A = (2,4,1,3) and B = (0,1,2,5) where A, B ∈ ℝ⁴. Find A·B.
Solution: To find A·B, we need to do a simple vector dot product as follows:

A·B = (2,4,1,3)·(0,1,2,5)
A·B = 2·0 + 4·1 + 1·2 + 3·5
A·B = 0 + 4 + 2 + 15

Thus, A·B = 21.

Result 5.1.1 If W1 and W2 are in ℝⁿ, W1 ≠ (0,0,0,…,0), W2 ≠ (0,0,0,…,0), and W1·W2 = 0, then W1 and W2 are independent.

Result 5.1.2 If W1 and W2 in ℝⁿ are independent, then W1·W2 may or may not equal 0. (For example, W1 = (1,1,1,1) and W2 = (0,1,1,1) are independent, yet W1·W2 = 3.)

Result 5.1.3 If W1, W2, W3 and W4 are in ℝⁿ and none of them is (0,0,0,…,0), then we say that W1, W2, W3 and W4 are orthogonal if they satisfy the following conditions:

W1·W2 = 0
W1·W3 = 0
W1·W4 = 0
W2·W3 = 0
W2·W4 = 0
W3·W4 = 0

Result 5.1.4 Assume that W = (x1, x2, x3, …, xn); then the squared norm of W is written as follows:

||W||² = x1² + x2² + x3² + ⋯ + xn² = W·W.

5.2 Gram-Schmidt Orthonormalization

In this section, we give one example that explains the concept of Gram-Schmidt Orthonormalization, and how it is related to what we have learned in chapters 2 and 3.

Example 5.2.1 Given the following: A = Span{(1,0,1,1),(0,1,0,1),(0,1,1,1)}. Find the orthogonal basis for A.

Hint: The orthogonal basis means Gram-Schmidt Orthonormalization.

Solution: To find the orthogonal basis for A, we need to do the following steps:

Step 1: Find a basis for A.

[ 1  0  1  1 ]  -R2 + R3 -> R3   [ 1  0  1  1 ]
[ 0  1  0  1 ]  -------------->  [ 0  1  0  1 ]
[ 0  1  1  1 ]                   [ 0  0  1  0 ]

This is the Semi-Reduced Matrix. Since we do not have a zero-row, dim(A) = 3. To write a basis for A, it is recommended to choose rows from the above Semi-Reduced Matrix. Thus, a basis for A is {(1,0,1,1),(0,1,0,1),(0,0,1,0)}.
Let's assume the following:

v1 = (1,0,1,1)
v2 = (0,1,0,1)
v3 = (0,0,1,0)

Step 3: Use the Gram-Schmidt Orthonormalization method.

Let's assume that the orthogonal basis for A is B_orthogonal = {W1, W2, W3}. Now, we need to find W1, W2 and W3. To find them, we need to do the following:

W1 = v1 = (1,0,1,1)

W2 = v2 - α1·(W_previous) where α1 = (v2·W1) / ||W1||²

Thus, W2 = v2 - ((v2·W1) / ||W1||²)·W1 = (0,1,0,1) - (1/3)·(1,0,1,1) = (-1/3, 1, -1/3, 2/3)

Thus, W3 = v3 - ((v3·W2) / ||W2||²)·W2 - ((v3·W1) / ||W1||²)·W1

W3 = (0,0,1,0) - (((0,0,1,0)·(-1/3, 1, -1/3, 2/3)) / ||(-1/3, 1, -1/3, 2/3)||²)·(-1/3, 1, -1/3, 2/3)
             - (((0,0,1,0)·(1,0,1,1)) / ||(1,0,1,1)||²)·(1,0,1,1)

Here (0,0,1,0)·W2 = -1/3, ||W2||² = 5/3, (0,0,1,0)·W1 = 1 and ||W1||² = 3, so

W3 = (0,0,1,0) + (1/5)·W2 - (1/3)·W1 = (-2/5, 1/5, 3/5, -1/5).

Hence, the orthogonal basis for A (Gram-Schmidt Orthonormalization) is

{(1,0,1,1), (-1/3, 1, -1/3, 2/3), (-2/5, 1/5, 3/5, -1/5)}.
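The three steps of Example 5.2.1 can be automated. A hedged sketch of the Gram-Schmidt projection loop (the function name is an assumption, not from the text):

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthogonalize a list of independent vectors by subtracting projections."""
    ortho = []
    for v in vectors:
        w = np.array(v, dtype=float)
        for u in ortho:
            w -= (np.dot(v, u) / np.dot(u, u)) * u   # subtract projection onto u
        ortho.append(w)
    return ortho

basis = [(1, 0, 1, 1), (0, 1, 0, 1), (0, 0, 1, 0)]
W1, W2, W3 = gram_schmidt(basis)

# Pairwise dot products vanish, as Result 5.1.3 requires.
assert abs(np.dot(W1, W2)) < 1e-12
assert abs(np.dot(W1, W3)) < 1e-12
assert abs(np.dot(W2, W3)) < 1e-12
```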
5.3 Exercises

11. x1 = 0, x2 = -3, x3 = 1

9. No; since the determinant of

[ -2   2  2 ]
[  1   1  3 ]
[  2  -1  0 ]

equals zero, the elements {(-2,1,2),(2,1,-1),(2,3,0)} do not span ℝ³.

4.3 Exercises

1. Since dim(E₋₁) ≠ 2, C is not diagonalizable.