You are on page 1of 67

Digital Proofer

A First Course in Linear


A First Course in Li... Algebra
Authored by Mr. Mohammed K A K...

6.0" x 9.0" (15.24 x 22.86 cm)


Black & White on White paper
130 pages

ISBN-13: 9781502901811
ISBN-10: 1502901811

Please carefully review your Digital Proof download for formatting,


grammar, and design issues that may need to be corrected.

We recommend that you review your book three times, with each time
focusing on a different aspect.

Check the format, including headers, footers, page


1 numbers, spacing, table of contents, and index.

2 Review any images or graphics and captions if applicable.

3 Read the book for grammatical errors and typos.

Once you are satisfied with your review, you can approve your proof
and move forward to the next step in the publishing process.

To print this proof we recommend that you scale the PDF to fit the size
of your printer paper.
About the Author
Mohammed Kaabar is a math tutor at the Math
Learning Center (MLC) at Washington State
Table of Contents
University, Pullman, and he is interested in linear
algebra, scientific computing, numerical analysis, 1 Systems of Linear Equations 1
differential equations, and several programming
languages such as SQL, C#, Scala, C++, C, JavaScript, 1.1 Row Operations Method……………….........….1
Python, HTML 5 and MATLAB. He is a member of of 1.2 Basic Algebra of Matrix……………………….17
Institute of Electrical and Electronics Engineers
(IEEE), IEEE Antennas and Propagation Society, 1.3 Linear Combinations…………………….…….19
IEEE Consultants Network, IEEE Smart Grid 1.4 Square Matrix……………………….………….29
Community, IEEE Technical Committee on RFID,
IEEE Life Sciences Community, IEEE Green. ICT 1.5 Inverse Square Matrix…………………..…….33
Community, IEEE Cloud Computing Community, 1.6 Transpose Matrix……………..………….…….42
IEEE Internet of Things Community, IEEE Committee
on Earth Observations, IEEE Electric Vehicles 1.7 Determinants……….……………………..…....46
Community, IEEE Electron Devices Society, IEEE 1.8 Cramer’s Rule……………………………..…….53
Communications Society, and IEEE Computer Society.
He participated in several competitions, conferences, 1.9 Adjoint Method………………………………….55
research papers and projects. He is an online 1.10 Exercises………………………………...……….60
instructor of numerical analysis at Udemy Inc, San
Francisco, CA. In addition, he is also a Technical
Program Committee (TPC) member, reviewer and
presenter at CCA-2014, WSMEAP 2014, EECSI 2014, 2 Vector Spaces 62
JIEEEC 2013 and WCEEENG 2012. He worked as
2.1 Span and Vector Space...................................62
electrical engineering intern at Al-Arabia for Safety
and Security L.L.C. He also received several 2.2 The Dimension of Vector Space………….......65
educational awards and certificates from accredited
2.3 Linear Independence……….………………….66
institutions. For more information about the author
and his free online courses, please visit his personal 2.4 Subspace and Basis…………………………….69
website: http://www.mohammed-kaabar.net.
2.5 Exercises…………………………………….…...82
3 Homogeneous Systems 84
Introduction
3.1 Null Space and Rank......................................84
In this book, I wrote five chapters: Systems of Linear
3.2 Linear Transformation…………………..…….91
Equations, Vector Spaces, Homogeneous Systems,
3.3 Kernel and Range …………………...….…….100 Characteristic Equation of Matrix, and Matrix Dot
Product. I also added exercises at the end of each
3.4 Exercises……………………………………..…105
chapter above to let students practice additional sets of
problems other than examples, and they can also check
their solutions to some of these exercises by looking at
4 Characteristic Equation of Matrix 107 “Answers to Odd-Numbered Exercises” section at the
end of this book. This book is very useful for college
4.1 Eigenvalues and Eigenvectors……..…..……107 students who studied Calculus I, and other students
4.2 Diagonalizable Matrix…...............................112 who want to review some linear algebra concepts
before studying a second course in linear algebra.
4.3 Exercises………………………………………...115 According to my experience as a math tutor at Math
Learning Center, I have noticed that some students
have difficulty to understand some linear algebra
5 Matrix Dot Product 116 concepts in general, and vector spaces concepts in
particular. Therefore, my purpose is to provide
5.1 The Dot Product in Թ௡ ..................................116 students with an interactive method to explain the
5.2 Gram-Schmidt Orthonormalization ……….118 concept, and then provide different sets of examples
related to that concept. If you have any comments
5.3 Exercises........................................................120 related to the contents of this book, please email your
comments to mohammed.kaabar@email.wsu.edu.
I wish to express my gratitude and appreciation to my
Answers to Odd-Numbered Exercises 121 father, my mother, and my brother. I would also like to
give a special thanks to my mathematics professor
Dr. Ayman Badawi, Professor of Mathematics &
Bibliography 125 Statistics at AUS, for his brilliant efforts in revising
the content of this book and giving me the permission
to use his lecture notes and his online resources as a
guide to write this book, and I would also like to thank
all math professors at Washington State University,
Pullman. Ultimately, I would appreciate to consider
this book as a milestone for developing more math
books that can serve our mathematical society.
Chapter 1 2 M. Kaabar

First of all, each variable in this system is to the power


Systems of Linear 1 which means that this system is a linear system. As
given in the question itself, ʹ ൈ ʹ implies that the
Equations number of equations is 2, and the number of unknown
variables is also 2.
In this chapter, we discuss how to solve ݊ ൈ ݉ systems
ʹൈʹ
of linear equations using row operations method. Then,
ሺ‫܎ܗܚ܍܊ܕܝۼ‬۳‫ܛܖܗܑܜ܉ܝܙ‬ሻ ൈ ሺ‫ܛ܍ܔ܊܉ܑܚ܉܄ܖܟܗܓܖ܃܎ܗܚ܍܊ܕܝۼ‬ሻ
we give an introduction to basic algebra of matrix
Then, the unknown variables in this question are x
including matrix addition, matrix subtraction and
and y. To solve for x and y, we need to multiply the
matrix multiplication. We cover in the remaining
first equation ͵‫ ݔ‬൅ ʹ‫ ݕ‬ൌ ͷ by 2, and we also need to
sections some important concepts of linear algebra
multiply the second equation െʹ‫ ݔ‬൅ ‫ ݕ‬ൌ െ͸ by 3.
such as linear combinations, determinants, square
Hence, we obtain the following:
matrix, inverse square matrix, transpose matrix, ͸‫ ݔ‬൅ Ͷ‫ ݕ‬ൌ ͳͲ ǥ ǥ ǥ ǥ ǥ Ǥ Ǥͳ

Cramer’s Rule and Adjoint Method. െ͸‫ ݔ‬൅ ͵‫ ݕ‬ൌ െͳͺ ǥ ǥ ǥ ǥ Ǥʹ
By adding equations 1 and 2, we get the following:
1.1 Row Operations Method ͹‫ ݕ‬ൌ െͺ
ି଼
Therefore, ‫ ݕ‬ൌ
First of all, let’s start with a simple example about ଻

݊ ൈ ݉ systems of linear equation. Now, we need to substitute the value of y in one of the

Example 1.1.1 Solve for x and y for the following ʹ ൈ ʹ two original equations. Let’s substitute y in the first

system of linear equations: equation ͵‫ ݔ‬൅ ʹ‫ ݕ‬ൌ ͷ as follows:


ି଼
͵‫ ݔ‬൅ ʹ‫ ݕ‬ൌ ͷ ͵‫ ݔ‬൅ ʹ ቀ ቁ ൌ ͷ is equivalent to
൜ ଻
െʹ‫ ݔ‬൅ ‫ ݕ‬ൌ െ͸
ି଼ ଵ଺ ହଵ ଵ଻
Solution: Let’s start analyzing this ʹ ൈ ʹ system. ͵‫ ݔ‬ൌ ͷ െ ʹ ቀ ଻ ቁ ൌ ͷ ൅ ቀ ଻ ቁ ൌ  ଻  Then, ‫ ݔ‬ൌ  ଻
ଵ଻ ି଼
Therefore, ‫ ݔ‬ൌ  ܽ݊݀‫ ݕ‬ൌ
1 ଻ ଻
Hence, we solve for x and y.
Example 1.1.2 Describe the following system: 4 M. Kaabar
ʹ‫ ݔ‬൅ ͵‫ ݕ‬െ ʹ‫ ݖ‬ൌ ͳͲ
൜ Solving for unknown variables in ʹ ൈ ʹ or ʹ ൈ ͵ systems
ͷ‫ ݔ‬൅ ʹ‫ ݕ‬൅ ‫ ݖ‬ൌ ͳ͵
of linear equations using some arithmetic operations
Solution: Let’s start asking ourselves the following
such as additions, subtractions and multiplications of
questions.
linear equations is very easy. But if we have for
Question 1: How many equations do we have?
example Ͷ ൈ Ͷ system of linear equations, using some
Question 2: How many unknown variables do we have?
arithmetic operations in this case will be very difficult
Question 2: What is the power of each unknown
variable? and it will take long time to solve it. Therefore, we are
If we can answer the three questions above, then we going to use a method called Row Operation Method to
solve complicated ݊ ൈ ݉ systems of linear equations.
can discuss the above systems.
Now, let’s start with an example discussing each step
Let’s start now answering the three questions above.
of Row Operation Method for solving ݊ ൈ ݉ system of
Answer to Question 1: We have 2 equations: ʹ‫ ݔ‬൅ ͵‫ ݕ‬െ
ʹ‫ ݖ‬ൌ ͳͲ andͷ‫ ݔ‬൅ ʹ‫ ݕ‬൅ ‫ ݖ‬ൌ ͳ͵. linear equations.
Example 1.1.3 Solve for x1, x2 and x3 for the following
Answer to Question 2: We have 3 unknown variables:
x, y and z. ͵ ൈ ͵ system of linear equations:
Answer to Question 3: Each unknown variable is to the ‫ݔ‬ଵ ൅ ‫ݔ‬ଶ െ ‫ݔ‬ଷ ൌ ʹ
power 1. ൝ ଵ െ ‫ݔ‬ଶ ൅ ‫ݔ‬ଷ ൌ ʹ
ʹ‫ݔ‬
After answering the above three question, we have now െ‫ݔ‬ଶ ൅ ʹ‫ݔ‬ଷ ൌ ͳ
an idea about the above system. As we remember from Solution: In the above system, we have 3 equations
and 3 unknown variables x1, x2 and x3. To solve this
example 1.1.1 that the system of equations has the
͵ ൈ ͵ system, we need to use Row Operation Method.
following form: The following steps will describe this method.
ሺ‫܎ܗܚ܍܊ܕܝۼ‬۳‫ܛܖܗܑܜ܉ܝܙ‬ሻ ൈ ሺ‫ܛ܍ܔ܊܉ܑܚ܉܄ܖܟܗܓܖ܃܎ܗܚ܍܊ܕܝۼ‬ሻ Step 1: Rewrite the system as a matrix, and this
Since the number of equations in this example is 2 and matrix is called Augmented Matrix. Each row in this
the number of unknown variables is 3, then using the matrix is a linear equation.
above form, the above system is ʹ ൈ ͵ system. Recall x1 x2 x3 C

that each unknown variable is to the power 1. This ͳ ͳ െͳ ʹ


൭ʹ െͳ ͳ อʹ൱
means that it is a ʹ ൈ ͵ linear system.
Ͳ െͳ ʹ ͳ

3
As we can see from the above augmented matrix, the 6 M. Kaabar
first column represents the coefficients of x1 in the
three linear equations. The second column represents -2R1+R2--Æ R2

the coefficients of x2 in the three linear equations. The (This means that we multiply the first row by -2 and

third column represents the coefficients of x3 in the we add it to the second row and the change will be only

three linear equations. The fourth column is not an in the second row and no change in the first row)

actual column but it just represents the constants Hence, we obtain:


ͳ ͳ െͳ ʹ
because each linear equation equals to a constant.
൭Ͳ െ͵ ͵ อെʹ൱
Step 2: Start with the first row in matrix and make the Ͳ െͳ ʹ ͳ
first non-zero number equals to 1. This 1 is called a
ିଵ
leader number. R2

x1 x2 x3 C (This means that we multiply the second row by


ିଵ
and

ͳ ͳ െͳ ʹ
൭ʹ െͳ ͳ อʹ൱ the change will be only in the second row and no
Ͳ െͳ ʹ ͳ change in other rows)
Since the leader number is already 1in this matrix,
Hence, we obtain:
then we do not need to do anything with this number. ͳ ͳ െͳ ʹ

Otherwise, we need to make this leader number equals ൭Ͳ ͳ െͳอ య ൱
Ͳ െͳ ʹ ͳ
to 1 by multiplying the whole row with a non-zero
number.
R2+R3--Æ R3
Step 3: Eliminate all numbers that are exactly below
(This means that we add the second row to the third
the leader number in other words use the leader
row and the change will be only in the third row and
number to eliminate all things below it.
no change in the second row)
Step 4: Move to the second row and repeat what has
been done in step 2.
5
-R2+R1--Æ R1 8 M. Kaabar
(This means that we multiply the second row by -1,
and we add it to the first row and the change will be After this example, we will get introduced to two new

only in the first row and no change in the second row) definitions.

Hence, we obtain: Definition 1.1.1 ݊ ൈ ݉ system of linear equations is

Ͷ consistent if it has a solution; otherwise, it is called

‫Ͳ ͳۇ‬ ͵ inconsistent.
Ͳ ተʹ‫ۊ‬
‫ͳ Ͳۈ‬ െͳ ‫ۋ‬ Definition 1.1.2 ݊ ൈ ݉ system of linear equations has a
‫Ͳ Ͳۈ‬ ͳ ተ͵‫ۋ‬
ͷ unique solution if each variable has exactly one value.
‫ۉ‬ ͵‫ی‬
Now, let’s apply what we have learned from example
Step 5: Move to the third row and do the same as we
1.1.3 for the following example 1.1.4.
did in the first row and the second row of matrix.
Example 1.1.4 Solve for x1, x2, x3, x4 and x5 for the
R3+R2--Æ R2
(This means that we add the third row to the second following ͵ ൈ ͷ system of linear equations:
‫ݔ‬ଶ െ ‫ݔ‬ଷ ൅ ‫ݔ‬ସ െ ‫ݔ‬ହ ൌ ͳ
row and the change will be only in the second row and
൝ െʹ‫ݔ‬ଵ ൅ ‫ݔ‬ଷ െ ‫ݔ‬ହ ൌ Ͳ
no change in the third row) െ‫ݔ‬ଶ ൅ ‫ݔ‬ଷ ൅ ʹ‫ݔ‬ସ െ ͳͲ‫ݔ‬ହ ൌ ͳʹ
Hence, we obtain: Solution: In the above system, we have 3 equations
Ͷ and 5 unknown variables x1, x2, x3, x4 and x5. To solve
͵ this ͵ ൈ ͷ system, we need to use Row Operation
‫ͳۇ‬ Ͳ Ͳተ͹‫ۊ‬
‫Ͳۈ‬ Method. First, we need to construct the augmented
ͳ Ͳ ‫ۋ‬
‫Ͳۈ‬ Ͳ ͳተ͵‫ۋ‬ matrix.
ͷ x1 x2 x3 x4 x5 C
‫ۉ‬ ͵‫ی‬
ସ ଻ ହ Ͳ ͳ െͳ ͳ െͳ ͳ
Therefore, x1= , x2= and x3= ൭െʹ Ͳ ͳ Ͳ െͳ อ Ͳ ൱
ଷ ଷ ଷ
Ͳ െͳ ͳ ʹ െͳͲ ͳʹ
These are one solution only.
The leader number here is 1.
Hence, the system has a unique solution (one solution).
Ͳ ͳ െͳ ͳ െͳ ͳ
൭െʹ Ͳ ͳ Ͳ െͳ อ Ͳ ൱
7 Ͳ െͳ ͳ ʹ െͳͲ ͳʹ
By doing the same steps in example 1.1.3, we obtain 10 M. Kaabar
the following:
R1+R3--Æ R3 ͺ ͳͲ
‫ݔ‬ଶ െ ‫ݔ‬ଷ ൅ ‫ݔ‬ହ ൌ െ
͵ ͵
x1 x2 x3 x4 x5 C
ͳ ͳ
Ͳ ͳ െͳ ͳ െͳ ͳ ‫ݔ‬ଵ െ ‫ݔ‬ଷ ൅ ‫ݔ‬ହ ൌ Ͳ
ʹ ʹ
൭െʹ Ͳ ͳ Ͳ െͳ อ Ͳ ൱ ͳͳ ͳ͵
Ͳ Ͳ Ͳ ͵ െͳͳ ͳ͵ ‫ݔ‬ସ െ ‫ݔ‬ହ ൌ
͵ ͵
ିଵ
R2 Since we have 3 leader numbers (numbers equal to 1)

x1 x2 x3 x4 x5 C in x1, x2 and x4 columns, then x1, x2 and x4 are called


Ͳ ͳ െͳ ͳ െͳ leading variables. All other variables are called free
െͳ ͳ ͳ
ቌͳ Ͳ Ͳ ተ Ͳ ቍ variables such that x3, x5 ‫ א‬Թ.
ʹ ʹ ͳ͵
Ͳ Ͳ Ͳ ͵ െͳͳ Since we have leading and free variables, we need to

R3 write the leading variables in terms of free variables.

x1 x2 x3 x4 x5 C ͳ ͳ
‫ݔ‬ଵ ൌ ‫ݔ‬ଷ െ ‫ݔ‬ହ
ͳ ʹ ʹ
Ͳ ͳ െͳ ͳ െͳ ͺ ͳͲ
Ͳ ‫ݔ‬ଶ ൌ ‫ݔ‬ଷ െ ‫ݔ‬ହ െ
൮ͳ Ͳ െͳȀʹͲ ͳȀʹ ተͳ͵൲ ͵ ͵
Ͳ Ͳ Ͳ ͳ െͳͳȀ͵ ͳͳ ͳ͵
͵ ‫ݔ‬ସ ൌ ‫ ݔ‬൅
͵ ହ ͵
-R3+R1--Æ R1
Now, let x3 = 0 and x5 = 0 (You can choose any value for
x1 x2 x3 x4 x5 C
x3 and x5 because both of them are free variables).
Ͳ ͳ െͳ Ͳ ͺȀ͵ െͳͲȀ͵
ቌͳ Ͳ െͳȀʹͲ ͳȀʹ ቮ Ͳ ቍ Therefore, we obtain
ଵଷ
Ͳ Ͳ Ͳ ͳ െͳͳȀ͵ ଷ ͳ ͳ
‫ݔ‬ଵ ൌ ሺͲሻ െ ሺͲሻ ൌ Ͳ
ʹ ʹ
ͺ ͳͲ ͳͲ
‫ ݔ‬ଶ ൌ Ͳ െ ሺͲሻ െ ൌെ
Hence, we obtain from the above matrix: ͵ ͵ ͵
ͳͳ ͳ͵ ͳ͵
‫ݔ‬ସ ൌ ሺͲሻ ൅ ൌ
͵ ͵ ͵
ଵ଴ ଵଷ
Hence, ‫ݔ‬ଵ ൌ Ͳǡ ‫ݔ‬ଶ ൌ  െ ܽ݊݀‫ݔ‬ସ ൌ
ଷ ଷ
9
Another possible solution for example 1.1.4: 12 M. Kaabar
Now, let x3 = 1 and x5 = 0 (You can choose any value for
x3 and x5 because both of them are free variables). Example 1.1.5 Solve for x1, x2 and x3 for the following

Therefore, we obtain ʹ ൈ ͵ system of linear equations:

ͳ ͳ ͳ ‫ ݔ‬൅ ʹ‫ݔ‬ଶ െ ‫ݔ‬ଷ ൌ ʹ


൜ ଵ
‫ݔ‬ଵ ൌ ሺͳሻ െ ሺͲሻ ൌ ʹ‫ݔ‬ଵ ൅ Ͷ‫ݔ‬ଶ െ ʹ‫ݔ‬ଷ ൌ ͸
ʹ ʹ ʹ
ͺ ͳͲ ͳͲ ͹ Solution: In the above system, we have 2 equations
‫ ݔ‬ଶ ൌ ͳ െ ሺͲሻ െ ൌͳെ ൌെ
͵ ͵ ͵ ͵ and 3 unknown variables x1, x2 and x3. To solve this
ͳͳ ͳ͵ ͳ͵ ʹ ൈ ͵ system, we need to use Row Operation Method.
‫ݔ‬ସ ൌ ሺͲሻ ൅ ൌ
͵ ͵ ͵ First, we need to construct the augmented matrix.
Hence, we obtain x1 x2 x3 C
ͳ ͹ ͳ͵ ͳ ʹ െͳ ʹ
‫ݔ‬ଵ ൌ ǡ ‫ݔ‬ଶ ൌ  െ ܽ݊݀‫ݔ‬ସ ൌ Š‹•‹•ƒ‘–Š‡”•‘Ž—–‹‘Ǥ ቀ ቚ ቁ
ʹ ͵ ͵ ʹ Ͷ െʹ ͸
Summary of Row Operations Method in ࢔ ൈ ࢓ Systems -2R1+R2--Æ R2
of Linear Equations: x1 x2 x3 C

Suppose ‫ ן‬is a non-zero constant, and ݅ܽ݊݀݇ are row ͳ ʹ െͳ ʹ


ቀ ቚ ቁ
Ͳ Ͳ Ͳ ʹ
numbers in the augmented matrix.
We stop here and read the above matrix as follows:
* ‫ן‬Ri , ‫( Ͳ ്ן‬Multiply a row with a non-zero ‫ݔ‬ଵ ൅ ʹ‫ݔ‬ଶ െ ‫ݔ‬ଷ ൌ ʹ
constant‫)ן‬. Ͳൌʹ
Hence, we get introduced to a new result.
* Ri՞Rk (Interchange two rows).
Result 1.1.1 The system is inconsistent if and only if
* ‫ן‬Ri +Rk --Æ Rk (Multiply a row with a non-zero after reducing one of the equations, we have 0 equals
constant‫ן‬ǡ ƒ†ƒ††‹––‘ƒ‘–Š‡””‘™). to a non-zero number.
Example 1.1.6 Solve for x1, x2 and x3 for the following
Note: The change is in Rk and no change is in Ri.
ʹ ൈ ͵ system of linear equations:
Let’s start with other examples that will give us some
‫ ݔ‬൅ ʹ‫ݔ‬ଶ െ ‫ݔ‬ଷ ൌ ʹ
important results in linear algebra. ൜ ଵ
ʹ‫ݔ‬ଵ ൅ Ͷ‫ݔ‬ଶ െ ʹ‫ݔ‬ଷ ൌ Ͷ
Solution: In the above system, we have 2 equations
and 3 unknown variables x1, x2 and x3. To solve this
11 ʹ ൈ ͵ system, we need to use Row Operation Method.
First, we need to construct the augmented matrix as 14 M. Kaabar
we did in the previous example.
x1 x2 x3 C For what values of b and d will the system be
ͳ ʹ െͳ ʹ consistent?
ቀ ቚ ቁ
ʹ Ͷ െʹ Ͷ Solution: We need to multiply the leader number (2) by
-2R1+R2--Æ R2 a non-zero constant to make it equal to 1 instead of 2
as follows:
x1 x2 x3 C ଵ
R1
ͳ ʹ െͳ ʹ ଶ
ቀ ቚ ቁ
Ͳ Ͳ Ͳ Ͳ ͷ ͵ ͳ
ͳ െ
We stop here and read the above matrix as follows: ൮ ʹ ʹ ተ ʹ൲
‫ ͳݔ‬൅ ʹ‫ ʹݔ‬െ ‫ ͵ݔ‬ൌ ʹ െʹ Ͷ ܾ ͵
ͲൌͲ െʹ െͷ െ͵ ݀
The solution is x1= -2 x2 + x3 + 2
2R1+R2--Æ R2
x1 is a leading variable, and x2, x3 ‫ א‬Թ free variables.
Now, let x2 = 1 and x3 = 0 (You can choose any value for 2R1+R3--Æ R3

x2 and x3 because both of them are free variables). ͷ ͵ ͳ


ͳ െ
൮ ʹ ʹ ተ ʹ ൲
Therefore, we obtain ‫ ͳݔ‬ൌ െʹሺͲሻ ൅ ሺͲሻ ൅ ʹ ൌ ʹܽ݊݀ Ͳ ͻ ܾ൅͵ ʹ
‫ݔ‬ଶ ൌ Ͳܽ݊݀‫ݔ‬ଷ ൌ ͲǤ Ͳ Ͳ Ͳ ݀െͳ

Hence, we get introduced to new results. Therefore, ܾ ‫ א‬Թand d=1


Result 1.1.2 The ݊ ൈ ݉system is consistent if and only When the system is consistent, then we must have
if it has a unique solution (no free variable). infinitely many solutions. In this case, b is a free
Result 1.1.3 The ݊ ൈ ݉system is consistent if and only variable.
if it has infinitely many solutions (free variables). Example 1.1.8 Given an augmented matrix of a
Result 1.1.4 Assume the ݊ ൈ ݉system is consistent,
ͳ െͳ ʹ ܽ
and we have more variables than equations (m>n). system:൭െͳ ʹ െʹอܾ൱
Then, the system has infinitely many solutions. െʹ ʹ െͶ ܿ
Example 1.1.7 Given an augmented matrix of a For what values of a, b and c will the system be
ʹ ͷ ͵ െͳ consistent?
system:൭െʹ Ͷ ܾ อ ͵ ൱ Solution: We need to use the Row Operation Method to
െʹ െͷ െ͵ ݀
find a, b and c as follows:
13 R1+R2--Æ R2
2R1+R3--Æ R3
ͳ െͳ ʹ ܽ 16 M. Kaabar
൭Ͳ ͳ Ͳอ ܽ ൅ ܾ൱
Ͳ Ͳ Ͳ ʹܽ ൅ ܿ Result 1.1.7 The solution is called Non-Trivial if it has
R2+R1--Æ R1 at least one value that is not zero.
ͳ Ͳ ʹ ʹܽ ൅ ܾ Result 1.1.8 In a homogeneous system if number of
൭Ͳ ͳ Ͳอ ܽ ൅ ܾ ൱ variables is greater than number of equations, then
Ͳ Ͳ Ͳ ʹܽ ൅ ܿ the system has infinitely many solutions.
We stop here and read the above matrix. Hence, we Result 1.1.9 Given the following ͵ ൈ ͸ matrix:
obtain: Ͳ ͳ Ͳ Ͳ ͳ ͳ
൥ͳ Ͳ ͵ʹ ͷ ͸൩ This matrix is called Reduced
x1 + 2x3 = 2a + b
Ͳ Ͳ Ͳ Ͳ Ͳ ͳ
x2 = a + b Matrix because all the numbers exactly below each
0 = 2a + c leader number are zeros.
Therefore, 2a + c = 0 which means that c = -2a Result 1.1.10 Given the following ͵ ൈ ͸ matrix:
Hence, the solution is ܽǡ ܾ ‫ א‬Թ (free variables), and Ͳ Ͳ ͳʹ Ͳ
c = -2a. ൥ͳ ͷ Ͳʹ Ͳ൩ This matrix is called Completely-
Let’s now discuss some new definitions and results of Ͳ Ͳ ͲͲ ͳ
linear algebra. Reduced Matrix because all the numbers exactly below
Definition 1.1.3 Suppose we have the following ͵ ൈ ͵ and above each leader number are zeros.
Result 1.1.11 The matrix is called Echelon Matrix if it
system of linear equations:
is a combination of Reduced Matrix and Leader
ʹ‫ݔ‬ଵ െ ͵‫ݔ‬ଶ ൅ ‫ݔ‬ଷ ൌ Ͳ Numbers that are arranged so that the leader number
൝ ‫ݔ‬ଵ ൅ ‫ݔ‬ଶ െ ͵‫ݔ‬ଷ ൌ Ͳ in ith row is to the right of the leader number in the
െ‫ݔ‬ଵ ൅ ͵‫ݔ‬ଶ െ ‫ݔ‬ଷ ൌ Ͳ
previous row, and the rows that are entirely zeros
This system is called homogeneous system of linear come at the end of the matrix.
equations because all constants on the right-side are Ͳ ͳ ʹ͵ ͹
zeros. ‫ͳ Ͳ Ͳۍ‬ͷ ͸‫ې‬
‫ێ‬ ‫ۑ‬
Result 1.1.5 Every ݊ ൈ ݉homogeneous system is ‫ ۑ͵ ͳ Ͳ Ͳ Ͳێ‬This is an Echelon Matrix.
consistent. ‫ۑͳ Ͳ Ͳ Ͳ Ͳ ێ‬
‫ۑͲ ͲͲ Ͳ Ͳێ‬
Result 1.1.6 Since every ݊ ൈ ݉homogeneous system is ‫ےͲ Ͳ Ͳ Ͳ Ͳ ۏ‬
consistent, then x1=0, x2=0, x3=0, …, xm=0 is a solution. Ͳ ͳ Ͳ Ͳ ͳ ͳ
This solution is called Trivial Solution. ൥ͳ Ͳ ͵ʹ ͷ ͸൩ This is NOT Echelon Matrix.
15 Ͳ Ͳ Ͳ Ͳ Ͳ ͳ
Example 1.1.9 Given the following reduced matrix: 18 M. Kaabar
Ͳ ͳ Ͳ Ͳ ͳ ͳ
൥ͳ Ͳ ͵ʹ ͷ ͸൩ ʹ ͳ ͵ ͵‫͵כ͵ ͳכ͵ ʹכ‬ ͸ ͵ ͻ
Ͳ Ͳ Ͳ Ͳ Ͳ ͳ 3A = ͵ ‫ כ‬ቂ ቃൌቂ ቃൌቂ ቃ
Ͳ ͳ ͷ ͵‫כ͵ ͳכ͵ Ͳכ‬ͷ Ͳ ͵ ͳͷ
Covert this reduced matrix to echelon matrix. Part b: In matrix addition we can only add matrices of
Solution: We just need to interchange two rows as the same order only.
follows: The order of matrix A isʹ ൈ ͵.
R1 Å--Æ R2 The order of matrix B is ͵ ൈ ʹ.
ͳ Ͳ ͵ ʹ ͷ ͸ Since matrix A and matrix B have different orders,
൥Ͳ ͳ ͲͲ ͳ ͳ൩ then we cannot find A+B.
Ͳ Ͳ Ͳ Ͳ Ͳ ͳ Hence, there is no answer for part b.
Now, the matrix is an echelon matrix.
Part c: In matrix subtraction we can only subtract
Result 1.1.12 To convert Reduced Matrix to Echelon
matrices of the same order only.
Matrix, we just need to use interchange rows.
The order of matrix A isʹ ൈ ͵.
Result 1.1.13 To convert Completely-Reduced Matrix
The order of matrix B is ͵ ൈ ʹ.
to Reduced-Echelon Matrix, we just need to use
Since matrix A and matrix B have different orders,
interchange rows.
then we cannot find A-B.
1.2 Basic Algebra of Matrix Hence, there is also no answer for part c.
Part d: First, we need to multiply 2 by matrix C as
In this section, we discuss an example of basic algebra follows:
െ͵ ʹ ͷቃ ቂʹ ‫ כ‬െ͵ ʹ ‫ כ ʹ ʹ כ‬ͷቃ
of matrix including matrix addition, subtraction and 2C = ʹ ‫ כ‬ቂ ൌ ൌ
ͳ ͹ ͵ ʹ‫כʹ ͳכ‬͹ ʹ‫͵כ‬
multiplication. െ͸ Ͷ ͳͲ
ቂ ቃ
ʹ ͳͶ ͸
Example 1.2.1 Given the following three matrices: Since we are already have 3A from part a, we just need
ͳ ʹ
ʹ ͳ ͵ െ͵ ʹ ͷቃ to find 3A-2C. As we know, in matrix subtraction we
A= ቂ ቃ , B = ͸ ͳ൩ , C = ቂ

Ͳ ͳ ͷ ͳ ͹ ͵ can only subtract matrices of the same order only.
͵ ͳ
a) Find 3A. The order of matrix 3A isʹ ൈ ͵.
b) Find A+B. The order of matrix 2C is ʹ ൈ ͵.
c) Find A-B. Both of them have the same order. Hence, we can find
d) Find 3A-2C. 3A-2C as follows:
Solution: Part a: We just need to multiply 3 by matrix ͸ ͵ ͻ െ͸ Ͷ ͳͲ ͳʹ െͳ െͳ
3A-2C = ቂ ቃെቂ ቃൌቂ ቃ
Ͳ ͵ ͳͷ ʹ ͳͶ ͸ െʹ െͳͳ ͻ
A as follows:
17
1.3 Linear Combinations 20 M. Kaabar

Solution: Part a: To represent the columns of matrix A


In this section, we discuss the concept of linear
as a linear combination, we need to do the same steps
combinations of either columns or rows of a certain as we did in example 1.3.1.
matrix. Step 1: Put each column of matrix A individually:
Example 1.3.1 Suppose we have 5 apples, 7 oranges
ͳ ʹ ͵
and 12 bananas. Represent them as a linear ൥Ͳ ൩ ൥ͷ ൩ ൥ͳ ൩
combination of fruits. ʹ ͳ ʹ
Solution: To represent them as a linear combination,
we need to do the following steps: Step 2: Separate each one of them by addition sign.
Step 1: Put each fruit individually:
Apples Oranges Bananas ͳ ʹ ͵
Step 2: Separate each one of them by addition sign. ൥Ͳ ൩ + ൥ͷ ൩ + ൥ͳ ൩
Apples + Oranges + Bananas ʹ ͳ ʹ
Step 3: Put 1 box in front of each one..
Step 3: Put 1 box in front of each one.
Apples + Oranges + Bananas
Step 4: Write the number of each fruit in each box..
ͳ ʹ ͵
5 Apples + 7 Oranges + 12 Bananas ൥Ͳ ൩ + ൥ͷ ൩ + ൥ͳ ൩
ʹ ͳ ʹ
This representation is called a Linear Combination of
Fruits. Step 4: Write random number for each column in each
Now, we can apply what we have learned in example box because it is not mentioned any number for any
1.3.1on rows and columns of a certain matrix. column in the question.
Example 1.3.2 Given the following matrix A:
ͳ ʹ ͵ ͳ ʹ ͵
A = ൥Ͳ ͷ ͳ ൩ 5 ൥Ͳ ൩ + -3 ൥ͷ ൩ + 2 ൥ͳ ൩
ʹ ͳ ʹ ʹ ͳ ʹ
a) Find a linear combination of the columns of
matrix A. This representation is called a Linear Combination of
b) Find a linear combination of the rows of matrix the Columns of Matrix A.
A.
19
Part b: To represent the rows of matrix A as a linear 22 M. Kaabar
combination, we need to do the same steps as we did in
part a. Example 1.3.3 Given the following matrix A and
Step 1: Put each row of matrix A individually: matrix B:
ͳ ʹ
ͳ ͳ ͳ
ሾͳ ʹ ͵ሿ ሾͲ ͷ ͳሿ ሾʹ ͳ ʹሿ A = ൥͵ ͸ ൩ , B= ቂ ቃ
Ͳ ʹ ͳ
ͳ ͳ
a) Find AB such that each column of AB is a linear
Step 2: Separate each one of them by addition sign.
combination of the columns of A.
b) Find AB such that each row of AB is a linear
ሾͳ ʹ ͵ሿ + ሾͲ ͷ ͳሿ + ሾ ʹ ͳ ʹሿ
combination of the rows of B.
c) Find AB using the usual matrix multiplication.
Step 3: Put 1 box in front of each one.
d) If AB = C, then find c23.
e) Find BA such that each column of BA is a linear
ሾͳ ʹ ͵ሿ + ሾͲ ͷ ͳሿ + ሾʹ ͳ ʹሿ
combination of the columns of B.
f) Find BA such that each row of BA is a linear
Step 4: Write random number for each row in each box
combination of the rows of A.
because it is not mentioned any number for any row in
the question.
Solution: Part a: Since A has an orderሺ•‹œ‡ሻ͵ ൈ ʹ, and
4 ሾ ͳ ʹ ͵ሿ + -8 ሾͲ ͷ ͳሿ + 11 ሾʹ ͳ ʹሿ B has an orderሺ•‹œ‡ሻʹ ൈ ͵, then AB will have an
orderሺ•‹œ‡ሻ͵ ൈ ͵ according to Definition 1.3.1.
This representation is called a Linear Combination of
Step 1: 1st column of AB:
the Rows of Matrix A.
ͳ ʹ ͳ
Definition 1.3.1 Suppose matrix A has an order (size) 1 *൥͵൩ + 0 * ൥͸൩ = ൥͵൩
݊ ൈ ݉ , and matrix B has an order (size) ݉ ൈ ݇, then ͳ ͳ ͳ
the number of columns of matrix A equals to the
number of rows of matrix B. If we assume that the Step 2: 2nd column of AB:
usual multiplication of matrix A by matrix B equals to
a new matrix C such that A ή B = C, then the matrix C ͳ ʹ ͷ
1 *൥͵൩ + 2 * ൥͸൩ = ൥ͳͷ൩
has an order (size) ݊ ൈ ݇. (i.e. If A has an
ͳ ͳ ͵
orderሺ•‹œ‡ሻ͵ ൈ ͹, and B has an orderሺ•‹œ‡ሻ͹ ൈ ͳͲ, then
C has an orderሺ•‹œ‡ሻ͵ ൈ ͳͲ).
21
Step 3: 3rd column of AB: 24 M. Kaabar

ͳ ʹ ͵ ͳ‫ͳכ‬൅ʹ‫ͳכͳ Ͳכ‬൅ʹ‫ͳכͳ ʹכ‬൅ʹ‫ͳכ‬


1 *൥͵൩ + 1 * ൥͸൩ = ൥ͻ൩ AB ൌ ͵ ‫ ͳ כ‬൅ ͸ ‫ ͳ כ ͵ Ͳ כ‬൅ ͸ ‫ ͳ כ ͵ ʹ כ‬൅ ͸ ‫ͳ כ‬൩

ͳ ͳ ʹ ͳ‫ͳכ‬൅ͳ‫ͳכͳ Ͳכ‬൅ͳ‫ͳכͳ ʹכ‬൅ͳ‫ͳכ‬

ͳ ͷ ͵ ͳ ͷ ͵
Hence, AB = ൥͵ ͳͷ ͻ൩ Hence, AB = ൥͵ ͳͷ ͻ൩
ͳ ͵ ʹ ͳ ͵ ʹ

Part b: As we did in part a but the difference here is Part d: Since AB = C, then c23 means that we need to
rows instead of columns. find the number that is located in 2nd row and 3rd
Step 1: 1st row of AB: column.
ܿଵଵ ܿଵଶ ܿଵଷ ͳ ͷ ͵
ܿ ܿ ܿ
C = ଶଵ ଶଶ ଶଷ ൌ ͵ ͳͷ ͻ൩
൥ ൩ ൥
1 * ሾͳ ͳ ͳሿ + 2 * ሾ Ͳ ʹ ͳሿ = ሾͳ ͷ ͵ሿ
ܿଷଵ ܿଷଶ ܿଷଷ ͳ ͵ ʹ
Now, let’s explain each element of matrix C.
Step 2: 2nd row of AB:
c11 means that the number that is located in 1st row
and 1st column. c11 = 1.
3 * ሾͳ ͳ ͳሿ + 6 * ሾ Ͳ ʹ ͳሿ = ሾ͵ ͳͷ ͻሿ
c12 means that the number that is located in 1 st row
and 2nd column. c12 = 5.
Step 3: 3rd row of AB:
c13 means that the number that is located in 1 st row
and 3rd column. c13 = 3.
1 * ሾͳ ͳ ͳሿ + 1 * ሾ Ͳ ʹ ͳሿ = ሾͳ ͵ ʹሿ
c21 means that the number that is located in 2 nd row
and 1st column. c21 = 3.
ͳ ͷ ͵
Hence, AB = ൥͵ ͳͷ ͻ൩ c22 means that the number that is located in 2 nd row
ͳ ͵ ʹ and 2nd column. c22 = 15.
c23 means that the number that is located in 2 nd row
Part c: Here we use the usual matrix multiplication. and 3rd column. c23 = 9.
c31 means that the number that is located in 3 rd row
ͳ ʹ and 1st column. c31 = 1.
ͳ ͳ ͳ
AB =൥͵ ͸൩ ‫ כ‬ቂ ቃ c32 means that the number that is located in 3 rd row
Ͳ ʹ ͳ
ͳ ͳ
and 2nd column. c32 = 3.
c33 means that the number that is located in 3 rd row
23
and 3rd column. c33 = 2.
Hence, c23 = 9. 26 M. Kaabar

Part e: As we did in part a. Since B has an Example 1.3.4 Given the following matrix A and
orderሺ•‹œ‡ሻʹ ൈ ͵ǡ and A has an orderሺ•‹œ‡ሻ͵ ൈ ʹ, then matrix B:
BA will have an orderሺ•‹œ‡ሻʹ ൈ ʹ according to ͳ Ͳ Ͳ Ͳ
A=ቂ ቃ , B= ቂ ቃ
Definition 1.3.1. Ͳ Ͳ ͳ Ͳ
a) Find AB.
Step 1: 1st column of BA:
b) Find BA.
ͳ ͳ ͳ ͷ
1 *ቂ ቃ + 3 * ቂ ቃ + 1 * ቂ ቃ = ቂ ቃ Solution: Part a: Using the usual matrix
Ͳ ʹ ͳ ͹
multiplication, we obtain:
Step 2: 2nd column of BA: ͳ Ͳ Ͳ Ͳ Ͳ Ͳ
AB = ቂ ቃቂ ቃ=ቂ ቃ
Ͳ Ͳ ͳ Ͳ Ͳ Ͳ
ͳ ͳ ͳ ͻ
2 *ቂ ቃ + 6 * ቂ ቃ + 1 * ቂ ቃ = ቂ ቃ Part b: Using the usual matrix multiplication, we
Ͳ ʹ ͳ ͳ͵
obtain:
ͷ ͻቃ Ͳ Ͳ ͳ Ͳ Ͳ Ͳ
Hence, BA = ቂ BA = ቂ ቃቂ ቃ= ቂ ቃ
͹ ͳ͵ ͳ Ͳ Ͳ Ͳ ͳ Ͳ

Part f: As we did in part b. Result 1.3.2 It is possible that the product of two non-
Step 1: 1st row of AB: zero matrices is a zero matrix. However, this is not
true for product of real numbers.
1 * ሾͳ ʹሿ + 1 * ሾ ͵ ͸ሿ + 1 * ሾͳ ͳሿ = ሾͷ ͻሿ
Example 1.3.5 Given the following matrix A, matrix B
Step 2: 2nd row of AB: and matrix D:
ͳ Ͳ Ͳ Ͳ Ͳ Ͳ
A=ቂ ቃ , B= ቂ ቃ , D= ቂ ቃ
0 * ሾͳ Ͳ Ͳ ͳ Ͳ Ͳ ͳ
ʹሿ + 2 * ሾ ͵ ͸ሿ + 1 * ሾͳ ͳሿ = ሾ͹ ͳ͵ሿ
a) Find AB.
b) Find AD.
ͷ ͻቃ
Hence, BA = ቂ
͹ ͳ͵
Solution: Part a: Using the usual matrix
Result 1.3.1 In general, matrix multiplication is not multiplication, we obtain:
commutative (i.e. AB is not necessarily equal to BA). ͳ Ͳ Ͳ Ͳ Ͳ Ͳ
AB = ቂ ቃቂ ቃ=ቂ ቃ zero matrix.
Ͳ Ͳ ͳ Ͳ Ͳ Ͳ
25
Part b: Using the usual matrix multiplication, we 28 M. Kaabar
obtain:
ͳ Ͳ Ͳ Ͳ Ͳ Ͳ Now, let ‫ݔ‬ଵ ൌ ͳǡ ‫ݔ‬ଶ ൌ ͳǡ ƒ†‫ݔ‬ଷ ൌ ͳǤ
AD = ቂ ቃቂ ቃ=ቂ ቃ zero matrix.
Ͳ Ͳ Ͳ ͳ Ͳ Ͳ ͳ ͳ ͳ ͳ ͵
൥െͳ െͳ ʹ൩ ൥ͳ൩ ൌ ൥Ͳ൩
Result 1.3.3 In general, If AB = AD and A is not a zero Ͳ ͳ Ͷ ͳ ͷ
matrix, then it is possible that B ് D.
Definition 1.3.2 Suppose matrix A has an order (size) Result 1.3.4 Let CX = A be a system of linear equations
‫ݔ‬ଵ ܽଵ
݊ ൈ ݉ , A is a zero matrix if each number of A is a zero ‫ ݔ ۍ‬ଶ ‫ܽ ۍ ې‬ଶ ‫ې‬
number. C ‫ ێ‬Ǥ ‫ ێ= ۑ‬Ǥ ‫ۑ‬
‫ ێ‬Ǥ ‫ ێ ۑ‬Ǥ ‫ۑ‬
‫ ݔ ۏ‬௡ ‫ܽ ۏ ے‬௡ ‫ے‬
Example 1.3.5 Given the following system of linear
Then, šଵ ൌ „ଵ ǡ šଶ ൌ „ଶ ǡ ǥ Ǥ ǡ š୬ ൌ „୬ is a solution to the
equations:
‫ݔ‬ଵ ൅ ‫ݔ‬ଶ ൅ ‫ݔ‬ଷ ൌ ͵ system if and only if „ଵ ଵ ൅ „ଶ ଶ ൅ ‫ ڮ‬൅ „୬ ୫ ൌ ǡC1 is
൝െ‫ݔ‬ଵ െ ‫ݔ‬ଶ ൅ ʹ‫ݔ‬ଷ ൌ Ͳ the 1st column of C, C2 is the 2nd column of C, …. , Cm is
‫ݔ‬ଶ ൅ Ͷ‫ݔ‬ଷ ൌ ͷ nth column of C.
Write the above system in matrix-form.
Solution: We write the above system in matrix-form as Example 1.3.6 Given the following system of linear
follows: equations:
(Coefficient Matrix) ή (Variable Column) = (Constant ‫ ݔ‬൅ ‫ݔ‬ଶ ൌ ʹ
൜ ଵ
Column) ʹ‫ݔ‬ଵ ൅ ʹ‫ݔ‬ଶ ൌ Ͷ
CX = A where C is a Coefficient Matrix, X is a Variable Write the above system in matrix-form.
Column, and A is a Constant Column. Solution: We write the above system in matrix-form
ͳ ͳ ͳ ‫ݔ‬ଵ ͵ CX = A as follows:
൥െͳ െͳ ʹ൩ ൥‫ݔ‬ଶ ൩ ൌ ൥Ͳ൩ ͳ ͳ ‫ͳݔ‬ ʹ
C=ቂ ቃ , X=ቂ ቃ , A=ቂ ቃ
Ͳ ͳ Ͷ ‫ݔ‬ଷ ͷ ʹ ʹ ‫ʹݔ‬ Ͷ
The above matrix-form means the following: Hence, CX = A:
ͳ ͳ ͳ ͳ ͳ ‫ݔ‬ଵ ʹ
ቂ ቃቂ ቃ = ቂ ቃ
‫ݔ‬ଵ ‫ כ‬൥െͳ൩ ൅ ‫ݔ‬ଶ ‫ כ‬൥െͳ൩ ൅ ‫ݔ‬ଷ ‫ כ‬൥ʹ൩ ʹ ʹ ‫ݔ‬ଶ Ͷ
Ͳ ͳ Ͷ ͳ ͳ ʹ
‫ݔ‬ଵ ‫ כ‬ቂ ቃ ൅ ‫ݔ‬ଶ ‫ כ‬ቂ ቃ ൌ  ቂ ቃ
Hence, we obtain: ʹ ʹ Ͷ
‫ݔ‬ଵ ൅ ‫ݔ‬ଶ ൅ ‫ݔ‬ଷ ൌ ͵ Now, we need to choose values for x1 and x2 such that
͵
൥െ‫ݔ‬ଵ െ ‫ݔ‬ଶ ൅ ʹ‫ݔ‬ଷ ൌ Ͳ൩ ൌ ൥Ͳ൩ these values satisfy that above matrix-form.
‫ݔ‬ଶ ൅ Ͷ‫ݔ‬ଷ ൌ ͷ ͷ First, let’s try x1 = 3 and x2 = 4.
ͳ ͳ ͹ ʹ
27 ͵ ‫ כ‬ቂ ቃ ൅ Ͷ ‫ כ‬ቂ ቃ ൌ ቂ ቃ ്  ቂ ቃ
ʹ ʹ ͳͶ Ͷ
Therefore, x1 = 3, x2 = 4 is not solution, and our 30 M. Kaabar
assumption is wrong.
Now, let’s try x1 = 0 and x2 = 2. Fact 1.4.5 Թ૛ൈ૛ ൌ ଶ ሺԹሻ = set of all ʹ ൈ ʹ matrices.
ͳ ͳ ʹ ʹ
Ͳ ‫ כ‬ቂ ቃ ൅ ʹ ‫ כ‬ቂ ቃ ൌ ቂ ቃ ൌ ቂ ቃ
ʹ ʹ Ͷ Ͷ Now, we give some helpful notations:
Therefore, x1 = 0, x2 = 2 is solution.
Ժ: Set of all integers.
Է: Set of all rational numbers.
Result 1.3.5 Let CX = A be a system of linear
Թ : Set of all real number.
equations. The constant column A can be written
Գ : Set of all natural numbers.
uniquely as a linear combination of the columns of C.

The following figure 1.4.1 represents three main sets:


1.4 Square Matrix Ժǡ Էƒ†ԹǤ

In this section, we discuss some examples and facts of


Թ
square matrix and identity matrix.
Definition 1.4.1 In general, let A is a matrix, and n be
Է
a positive integer such that An is defined only if
number of rows of A = number of columns of A. (i.e. Let
A is a matrix with a size (order) ൈ ݉ , A2 = AήA is
Ժ
defined only if n = m). This matrix is called a Square-
Matrix.
The following is some facts regarding square-matrix
and set of real numbers ԹǤ

Fact 1.4.1 Let A is a matrix, ି୬ is not equal to .
୅౤
Fact 1.4.2 Let A is a matrix, ି୬ has a specific meaning
if A is a square-matrix and A is invertible Figure 1.4.1: Representation of Three Sets of Numbers
(nonsingular). We will talk about it in section 1.5.
Fact 1.4.3 A set of real numbers Թ has a multiplicative The above figure shows that the largest set is Թ , and
identity equals to 1. (i.e. ܽ ή ͳ ൌ ͳ ή ܽ ൌ ܽ). the smallest set is Ժ. In addition, we know that a
Fact 1.4.4 A set of real numbers Թ has an additive rational number is an integer divided by a non-zero
identity equals to 0. (i.e. ܽ ൅ Ͳ ൌ Ͳ ൅ ܽ ൌ ܽ). integer.
29
Fact 1.4.6 ͵ ‫ א‬Թ means that 3 is an element of the 32 M. Kaabar
setԹǤ (“‫ ”א‬is a mathematical symbol that means
“belong to”). Hence, the multiplicative identity for all ʹ ൈ ʹ matrices
Fact 1.4.7 ξʹ ‫ א‬Թ means that ξʹ is an element of the
ͳ Ͳ
setԹǤ (“‫ ”א‬is a mathematical symbol that means is ଶ ൌ  ቂ ቃǤ
Ͳ ͳ
“belong to”).
ଵ ଵ
Example 1.4.2 Givenଷ ሺԹሻ ൌ Թ૜ൈ૜ . What is ଷ ?
Fact 1.4.8 ‫ ב‬Ժ means that is not an element of the
ଶ ଶ
Solution: We need to find the multiplicative identity
setԺǤ (“‫ ”ב‬is a mathematical symbol that means “does
not belong to”). for all ͵ ൈ ͵ matrices. This means that  ଷ ൌ ଷ  ൌ  for
Fact 1.4.9  ‫ א‬ଶ ሺԹሻ means that A is a ʹ ൈ ʹ matrix. every ‫ א‬ଷ ሺԹሻ.
Fact 1.4.10 Թ࢔ൈ࢔ ൌ ௡ ሺԹሻ = set of all ݊ ൈ ݊ matrices. ͳ Ͳ Ͳ ܽଵଵ ܽଵଶ ܽଵଷ
(i.e. Թ૜ൈ૜ ൌ ଷ ሺԹሻ = set of all ͵ ൈ ͵ matrices). ଷ ൌ  ൥Ͳ ͳ Ͳ൩ and  ൌ ൥ܽଶଵ ܽଶଶ ܽଶଷ ൩
Ͳ Ͳ ͳ ܽଷଵ ܽଷଶ ܽଷଷ
After learning all above facts, let’s test our knowledge Now, let’s check if the multiplicative identity is right:
by asking the following question. ܽଵଵ ܽଵଶ ܽଵଷ ͳ Ͳ Ͳ ܽଵଵ ܽଵଶ ܽଵଷ
Question 1.4.1 Does ௡ ሺԹሻ has a multiplicative  ଷ ൌ ൥ܽଶଵ ܽଶଶ ܽଶଷ ൩ ൥Ͳ ͳ Ͳ൩ ൌ ൥ܽଶଵ ܽଶଶ ܽଶଷ ൩ ൌ 
identity? ܽଷଵ ܽଷଶ ܽଷଷ Ͳ Ͳ ͳ ܽଷଵ ܽଷଶ ܽଷଷ
Solution: Yes; there is a matrix called ௡ where ͳ Ͳ Ͳ ܽଵଵ ܽଵଶ ܽଵଷ ܽଵଵ ܽଵଶ ܽଵଷ
ଷ  ൌ ൥Ͳ ͳ Ͳ൩ ൥ܽଶଵ ܽଶଶ ܽଶଷ ൩ ൌ ൥ܽଶଵ ܽଶଶ ܽଶଷ ൩ ൌ 
௡ ‫ א‬௡ ሺԹሻ such that  ௡ ൌ ௡  ൌ  for every ‫ א‬௡ ሺԹሻ.
Ͳ Ͳ ͳ ܽଷଵ ܽଷଶ ܽଷଷ ܽଷଵ ܽଷଶ ܽଷଷ
Example 1.4.1 Givenଶ ሺԹሻ ൌ Թ૛ൈ૛ . What is ଶ ? Hence, the multiplicative identity for all ͵ ൈ ͵ matrices
Solution: We need to find the multiplicative identity ͳ Ͳ Ͳ
is ଷ ൌ  ൥Ͳ ͳ Ͳ൩Ǥ
for all ʹ ൈ ʹ matrices. This means that  ଶ ൌ ଶ  ൌ  for
Ͳ Ͳ ͳ
every ‫ א‬ଶ ሺԹሻ. Result 1.4.1 Assume we have a square-matrix with
ͳ Ͳ ܽଵଵ ܽଵଶ ݊ ൈ ݊ size and the main-diagonal is ܽଵଵ ǡ ܽଶଶ ǡ ܽଷଷ ǡ ǥ ǡ ܽ௡௡
ଶ ൌ  ቂ ቃ and  ൌ ቂܽ ܽଶଶ ቃ
Ͳ ͳ ଶଵ

Now, let’s check if the multiplicative identity is right:


ܽͳͳ ‫ڮ‬ ܽͳ݊
ܽʹʹ ǥ
 ଶ ൌ ቂܽ
ܽଵଵ ܽଵଶ ͳ Ͳ
ቃ ቂ ቃ ቂ
ܽଵଵ ܽଵଶ ൥‫ڭ‬ ‫ڭ‬൩
ܽଶଶ Ͳ ͳ ൌ ܽଶଵ ܽଶଶ ቃ ൌ  ǥ ǥ
ଶଵ
ܽ݊ͳ ‫ڮ‬ ܽ݊݊
ͳ Ͳ ܽଵଵ ܽଵଶ ܽଵଵ ܽଵଶ
ଶ  ൌ ቂ ቃ ቂܽ ܽ ቃ ൌ ቂܽ ܽଶଶ ቃ ൌ  Then, the multiplicative identity for the above ݊ ൈ ݊
Ͳ ͳ ଶଵ ଶଶ ଶଵ
matrix is as follows:
31
ͳ ‫ڮ‬ Ͳ 34 M. Kaabar
ͳ Ͳ
௡ ൌ ቎ ‫ڭ‬ ‫ڭ‬቏
Ͳ ͳ
Ͳ ‫ͳ ڮ‬ d) Find Elementary Matrices, L1, L2,…, Ln such
All ones on the main-diagonal and zeros elsewhere. that L1 L2 …. Ln B = Z.
e) Find a matrix S such that SC=B.
1.5 Inverse Square Matrix f) Find a matrix X such that XB=C.
In this section, we discuss the concept of inverse
square matrix, and we give some examples of Solution: Part a: Since Z is ͵ ൈ Ͷ matrix and we
multiply from left by -2, then D must be a square
elementary matrix.
matrix. Hence, D must be͵ ൈ ͵.
Example 1.5.1Given the following matrices with the We use the multiplicative identity, and we multiply it
following row-operations steps: by -2 from left as follows:
ͳ ʹ ͵ Ͷ െʹ െͶ െ͸ െͺ ͳ Ͳ Ͳ െʹ Ͳ Ͳ
 ൌ ൥Ͳ ͳെͳ ʹ൩ ,  ൌ ൥ Ͳ ͳ െͳ ʹ ൩ ଷ ൌ  ൥Ͳ ͳ Ͳ൩-2R1Æ ൥ Ͳ ͳ Ͳ൩ ൌ 
Ͳ ͳ ͳ ͵ Ͳ ͳ ͳ ͵ Ͳ Ͳ ͳ Ͳ Ͳ ͳ
ͳ ʹ ͵ Ͷ D is called Elementary Matrix of Type I.
൥Ͳ ͳെͳ ʹ൩
െʹ Ͳ Ͳ
Ͳ ͳ ͳ ͵ Thus, Ͳ ͳ Ͳ൩  ൌ 

-2R1Æ Ͳ Ͳ ͳ
െʹ െͶ െ͸ െͺ
൥Ͳ ͳ െͳ ʹ൩ Part b: Since W is ͵ ൈ Ͷ matrix and we multiply from
ଵ ଵ
Ͳ ͳ ͳ ͵ right by െ because the inverse of -2 is െ , then K
ଶ ଶ
2R1+R3--Æ R3 must be a square matrix. Hence, K must be͵ ൈ ͵.
-3R1+R2--Æ R2 We use the multiplicative identity, and we multiply it

Matrix B R3ÅÆ R1 Matrix C by െ ଶ from left as follows:

a) Find a matrix D such that DZ=W. ͳ Ͳ Ͳ െ Ͳ Ͳ
ଵ ଶ
ଷ ൌ  ൥Ͳ ͳ Ͳ ൩  െ R1Æ ቎ Ͳ ͳ Ͳ቏ ൌ 

b) Find a matrix K such that KW=Z. Ͳ Ͳ ͳ Ͳ Ͳ ͳ
c) Find Elementary Matrices, F1, F2, F3, …, Fm K is also called Elementary Matrix of Type I.

such that F1 F2 …. Fm Z = C. െଶ Ͳ Ͳ
Thus, ቎ Ͳ ͳ Ͳ቏  ൌ 
33
Ͳ Ͳ ͳ
Result 1.5.1 Each row-operation corresponds to one 36 M. Kaabar
and only one elementary matrix.

ͳ Ͳ Ͳ Ͳ Ͳ ͳ
Part c: According to the given four row-operations ଷ ൌ  ൥Ͳ ͳ Ͳ൩ R3ÅÆ R1൥Ͳ ͳ Ͳ൩ (Step Four)
steps, we need to find four elementary matrices from Z Ͳ Ͳ ͳ ͳ Ͳ Ͳ
to C which means m=4 because we want to go 4 steps
forward from Z to C. From part a, we already have the Hence, we obtain the following four elementary
following: matrices:
ͳ Ͳ Ͳ െʹ Ͳ Ͳ
ଷ ൌ  ൥Ͳ ͳ Ͳ൩-2R1Æ ൥ Ͳ ͳ Ͳ൩ (Step One)
Ͳ Ͳ ͳ Ͳ Ͳ ͳ Ͳ Ͳ ͳ ͳ Ͳ Ͳ ͳ Ͳ Ͳ െʹ Ͳ Ͳ
൥Ͳ ͳ Ͳ൩൥െ͵ ͳ Ͳ൩൥Ͳ ͳ Ͳ൩ ൥ Ͳ ͳ Ͳ൩  ൌ 
ͳ Ͳ Ͳ Ͳ Ͳ ͳ ʹ Ͳ ͳ Ͳ Ͳ ͳ
െʹ Ͳ Ͳ
Hence, ൥ Ͳ ͳ Ͳ൩  ൌ 
Ͳ Ͳ ͳ Part d: According to the given three row-operations
steps from B to Z, we need to find three elementary
ͳ Ͳ Ͳ ͳ Ͳ Ͳ matrices which means n=3 because we want to go 3
ଷ ൌ  ൥Ͳ ͳ Ͳ൩2R1+R3--Æ R3൥Ͳ ͳ Ͳ൩ (Step Two) steps backward from B to Z. Backward steps mean
Ͳ Ͳ ͳ ʹ Ͳ ͳ that we need to do inverse steps (i.e. The inverse of

-2R1 is െ R1 because it is row-multiplication step). We
ͳ Ͳ Ͳ െʹ Ͳ Ͳ ଶ
Hence, ൥Ͳ ͳ Ͳ൩ ൥ Ͳ ͳ Ͳ൩  ൌ  start from the third step as follows:
ʹ Ͳ ͳ Ͳ Ͳ ͳ The inverse of -3R1+R2--Æ R2 is 3R1+R2--Æ R2 because
it is row-addition step.
ͳ Ͳ Ͳ ͳ Ͳ Ͳ ͳ Ͳ Ͳ ͳ Ͳ Ͳ
ଷ ൌ  ൥Ͳ ͳ Ͳ൩-3R1+R2--Æ R2൥െ͵ ͳ Ͳ൩ (Step Three) ଷ ൌ  ൥Ͳ ͳ Ͳ൩3R1+R2--Æ R2 ൥͵ ͳ Ͳ൩ (Step Three)
Ͳ Ͳ ͳ Ͳ Ͳ ͳ Ͳ Ͳ ͳ Ͳ Ͳ ͳ

ͳ Ͳ Ͳ
ͳ Ͳ Ͳ ͳ Ͳ Ͳ െʹ Ͳ Ͳ Hence, ൥͵ ͳ Ͳ൩  ൌ 
Hence, ൥െ͵ ͳ Ͳ൩൥Ͳ ͳ Ͳ൩ ൥ Ͳ ͳ Ͳ൩  ൌ  Ͳ Ͳ ͳ
Ͳ Ͳ ͳ ʹ Ͳ ͳ Ͳ Ͳ ͳ The inverse of 2R1+R3--Æ R3 is -2R1+R3--Æ R3 because
it is row-addition step.

ͳ Ͳ Ͳ ͳ Ͳ Ͳ
35 ଷ ൌ  ൥Ͳ ͳ Ͳ൩-2R1+R3--Æ R3 ൥ Ͳ ͳ Ͳ൩ (Step Two)
Ͳ Ͳ ͳ െʹ Ͳ ͳ
38 M. Kaabar
ͳ Ͳ Ͳ ͳ Ͳ Ͳ
Hence,൥ Ͳ ͳ Ͳ൩ ൥͵ ͳ Ͳ൩  ൌ  Part f: According to the given last row-operations step
െʹ Ͳ ͳ Ͳ Ͳ ͳ
ଵ from B to C, we need to find one elementary matrix
The inverse of -2R1 is െ R1 because it is row-
ଶ because we want to go 1 step forward from B to C.
multiplication step ͳ Ͳ Ͳ Ͳ Ͳ ͳ
ଷ ൌ  ൥Ͳ ͳ Ͳ൩ R3ÅÆ R1൥Ͳ ͳ Ͳ൩
ͳ Ͳ Ͳ െ

Ͳ Ͳ Ͳ Ͳ ͳ ͳ Ͳ Ͳ
ଵ ଶ
ଷ ൌ  ൥Ͳ ͳ Ͳ൩  െ ଶR1Æ ቎ Ͳ ͳ Ͳ቏ (Step One)
Ͳ Ͳ ͳ Ͳ Ͳ ͳ
Ͳ Ͳ ͳ Hence, we obtain: ൥Ͳ ͳ Ͳ൩  ൌ . This means that
ͳ Ͳ Ͳ
Hence, we obtain the following three elementary Ͳ Ͳ ͳ
matrices: X = ൥Ͳ ͳ Ͳ൩.
ͳ Ͳ Ͳ
ͳ
െ Ͳ Ͳ ͳ Ͳ Ͳ ͳ Ͳ Ͳ Example 1.5.2 Given the following matrix:
൦ ʹ ൪൥ Ͳ ͳ Ͳ ൩ ൥͵ ͳ Ͳ൩  ൌ  ʹ ͳ
Ͳ ͳ Ͳ ൌቂ ቃ
െʹ Ͳ ͳ Ͳ Ͳ ͳ Ͷ Ͳ
Ͳ Ͳ ͳ
Find A-1 if possible.
Part e: According to the given last row-operations step
from C to B, we need to find one elementary matrix Solution: Finding A-1 if possible means that we need to
because we want to go 1 step backward from C to B. find a possible inverse matrix called A-1 such that
The inverse of R3ÅÆ R1 is R1ÅÆ R3 which is the same AA-1 = ଶ = A-1A. To find this possible matrix A-1, we
as R3ÅÆ R1. need to do the following steps:
ͳ Ͳ Ͳ Ͳ Ͳ ͳ Step 1: Write A and ଶ in the following form:
ଷ ൌ  ൥Ͳ ͳ Ͳ൩ R3ÅÆ R1൥Ͳ ͳ Ͳ൩
Ͳ Ͳ ͳ ͳ Ͳ Ͳ
ሺƒ–”‹šȁ †‡–‹–›ƒ–”‹š ʹ ሻ

Ͳ Ͳ ͳ ʹ ͳ ͳ Ͳ
Hence, we obtain: ൥Ͳ ͳ Ͳ൩  ൌ . This means that ቀ ቚ ቁ
Ͷ Ͳ Ͳ ͳ
ͳ Ͳ Ͳ
Ͳ Ͳ ͳ Step 2: Do some Row-Operations until you get the
S = ൥Ͳ ͳ Ͳ൩. following:
ͳ Ͳ Ͳ ሺ‘›‘—•‡‡ ʹ ǫ ˆ›‡•ǡ –Š‡ ൌ െͳ ȁሻ
37 ሺ‘›‘—•‡‡ ʹ ǫ ˆ‘ǡ –Š‡Šƒ•‘‹˜‡”•‡ƒ–”‹šȁሻ
Now, let’s do the above step until we get the 40 M. Kaabar
Completely-Reduced-Echelon matrix.
ʹ ͳ ͳ Ͳ
ͳ ͳ Solution: Finding A-1 if possible means that we need to
ͳ Ͳ
ቀ ቚ ቁ െ ଶR1Æ ൬


ʹ ʹ ൰ find a possible inverse matrix called A-1 such that
Ͷ Ͳ Ͳ ͳ Ͷ Ͳ Ͳ ͳ
AA-1 = ଷ = A-1A. To find this possible matrix A-1, we
ͳ ͳ
ଵ ଵ
need to do the following steps:
ͳ Ͳ
൬ ฬ
ʹ ʹ ൰-4R1+R2--Æ R2ቆͳ ቤ ଶ

Ͳ
ቇ Step 1: Write A and ଷ in the following form:
Ͳ െʹ െʹ ͳ
Ͷ Ͳ Ͳ ͳ ሺƒ–”‹šȁ †‡–‹–›ƒ–”‹š ͵ ሻ

ଵ ଵ ଵ

Ͳ ͳ Ͳ ʹ ͳ Ͳ Ͳ
ͳ Ͳ ଵ ͳ ଶ
ቆ ଶ ቤ ଶ ቇ െ R2Æ ቌ ଶቮ ଵቍ
Ͳ െʹ െʹ ͳ ଶ
Ͳ ͳͳ െ ቆͲ ͳ ͲቤͲ ͳ Ͳቇ

Ͳ െͳ ͳ Ͳ Ͳ ͳ
ଵ ଵ
Step 2: Do some Row-Operations until you get the
ଵ Ͳ ͳ ͲͲ following:
ቌͳ ଶ ଵ ସ
ଶቮ ଵቍ െ ଶR2+R1--Æ R1ቌͲ ቮ
ͳͳ ଵቍ
Ͳ ͳͳ െ


ଶ ሺ‘›‘—•‡‡ ͵ ǫ ˆ›‡•ǡ –Š‡ ൌ െͳ ȁሻ
ሺ‘›‘—•‡‡ ͵ ǫ ˆ‘ǡ –Š‡Šƒ•‘‹˜‡”•‡ƒ–”‹šȁሻ
Since we got the Completely-Reduced-Echelon matrix
Now, let’s do the above step until we get the
which is the identity matrix ଶ , then A has an inverse
Completely-Reduced-Echelon matrix.
matrix which is A-1.
ͳ Ͳ ʹ ͳ Ͳ Ͳ

Ͳ
Hence, A-1 = ቎ ସ
ଵ቏
ቆͲ ͳ ͲቤͲ ͳ ͲቇR2+R3--Æ R3
ͳ െ
ଶ Ͳ െͳ ͳ Ͳ Ͳ ͳ

Example 1.5.3 Given the following matrix: ͳ Ͳ ʹ ͳ Ͳ Ͳ


ͳ Ͳ ʹ
 ൌ ൥Ͳ ͳ Ͳ ൩
ቆͲ ͳ ͲቤͲ ͳ Ͳቇ -2R3+R1--Æ R1
Ͳ െͳ ͳ Ͳ Ͳ ͳ Ͳ ͳ ͳ
Find A if possible.
-1

ͳ Ͳ Ͳ ͳ െʹ െʹ
ቆͲ ͳ ͲቤͲ ͳ Ͳ ቇ
39
Ͳ Ͳ ͳ Ͳ ͳ ͳ
42 M. Kaabar

Since we got the Completely-Reduced-Echelon matrix


which is the identity matrix ଷ , then A has an inverse 1.6 Transpose Matrix
matrix which is A-1.
ͳ െʹ െʹ In this section, we introduce the concept of transpose
Hence, A = ൥Ͳ ͳ
-1 Ͳ൩ matrix, and we discuss some examples of symmetric
Ͳ ͳ ͳ
Result 1.5.2 Given ݊ ൈ ݉ matrix A and identity matrix and skew-symmetric matrices.

௡ , and with some row-operations, we got Ԣ and ௡ Ԣ as Definition 1.6.1 Given ݊ ൈ ݉ matrix A, ୘ is called A
transpose and it is ݉ ൈ ݊ matrix. We obtain ୘ from A
follows: ሺȁ ݊ ሻ Row-Operations-Æ ሺԢȁ ݊ Ԣሻ
by making columns of A rows or by making rows of A
Then, ௡ᇱ  ൌ Ԣ.
columns.
Result 1.5.3 Given ݊ ൈ ݊ matrix A and identity matrix
Example 1.6.1 Given the following matrix:
ିଵ
௡ , and with some row-operations, we got  and ௡ as ʹ ͵ ͳ Ͳ
 ൌ ൥ͷ ͳʹ ͵൩
follows: ሺȁ ݊ ሻ Row-Operations-Æ ሺ ݊ ȁ ሻ
െͳ
ͳ ͳ Ͳ ͷ

Find  .
Then,AA-1 = ௡ = A-1A.
Fact 1.5.1 Given ݊ ൈ ݉ matrix A and identity matrix ௡ , Solution: According to definition 1.6.1, A is͵ ൈ Ͷ
௡ is called a left identity for all ݊ ൈ ݉ matrices such matrix. Thus, ୘ should beͶ ൈ ͵. By making columns
of A rows, we obtain the following:
that ௡  = A. (i.e. Given͵ ൈ ͷ matrix A, then ଷ  = A). ʹ ͷ ͳ
Fact 1.5.2 Given ݊ ൈ ݊ matrix A and identity matrix ௠ , ୘ ൌ ቎͵ ͳ ͳ቏
ͳ ʹ Ͳ
௠ is called a right identity for all ݊ ൈ ݊ matrices such Ͳ ͵ ͷ
that  ௠ = A. (i.e. Given͵ ൈ ͷ matrix A, then  ହ = A). Definition 1.6.2 Given ݊ ൈ ݉ matrix A and݉ ൈ ݊
Result 1.5.4 Given ݊ ൈ ݊ matrix A, A-4 has a meaning if ƒ–”‹š୘ , then ୘ is always defined, and it is ݊ ൈ ݊
and only if A-1 exists. matrix. (i.e. Let୘ ൌ , then ௡ൈ௠ ୘௠ൈ௡ ൌ ௡ൈ௡ ).
Result 1.5.5 Given ݊ ൈ ݊ matrix A, If A-1 exists, then Definition 1.6.3 Given ݊ ൈ ݉ matrix A and݉ ൈ ݊
A-4 = A-1 ൈ A-1ൈ A-1ൈ A-1. ƒ–”‹š୘ , then ୘  is always defined, and it is ݉ ൈ ݉
41 matrix. (i.e. Let୘  ൌ , then ୘௠ൈ௡ ௡ൈ௠ ൌ ௠ൈ௠ ).
Definition 1.6.4 Given ݊ ൈ ݊ matrix A and݊ ൈ ݊ 44 M. Kaabar
ƒ–”‹š୘ , then  is symmetric if ୘ ൌ Ǥ
Ͳ െʹ െͷ
Definition 1.6.5 Given ݊ ൈ ݊ matrix A and݊ ൈ ݊ ୘
Since  ൌ െ ൌ ൥ʹ Ͳ െ͵൩, then A is skew-
ƒ–”‹š୘ , then  is skew-symmetric if ୘ ൌ െǤ ͷ ͵ Ͳ
Example 1.6.2 Given the following matrix: symmetric.
ͳ ͷ ͳͲ Fact 1.6.1 Given ݊ ൈ ݊ matrix A, if A is skew-
ൌ൥ͷ ͵ ͹൩
ͳͲ ͹ ͳͲ symmetric, then all numbers on the main diagonal of A
Show that A is symmetric. are always zeros.
Fact 1.6.2 Given ݊ ൈ ݉ matrix A, then ሺ୘ ሻ୘ ൌ Ǥ
Solution: According to definition 1.6.4, A is͵ ൈ ͵
matrix. Thus, ୘ should be͵ ൈ ͵. By making columns Fact 1.6.3 Given ݊ ൈ ݉ matrix A and ݉ ൈ ݇ matrix B,
of A rows, we obtain the following: then ሺ ୘ ሻ୘ ൌ  ୘ ୘ Ǥ (Warning:ሺ ୘ ሻ୘ ് ୘  ୘ ).
ͳ ͷ ͳͲ
୘ ൌ ൥ ͷ ͵ ͹ ൩ Fact 1.6.4 Given ݊ ൈ ݉ matrices A and B, then
ͳͲ ͹ ͳͲ ሺ േ ሻ୘ ൌ ୘ േ ୘ Ǥ
ͳ ͷ ͳͲ
Since ୘ ൌ  ൌ ൥ ͷ ͵ ͹ ൩, then A is symmetric. Result 1.6.1 Given matrix A and constantߙ, if A is
ͳͲ ͹ ͳͲ symmetric, then ߙ is symmetric such that
ሺߙሻ୘ ൌ ߙ୘ Ǥ
Example 1.6.3 Given the following matrix:
Ͳ ʹ ͷ Result 1.6.2 Given matrix A and constantߙ, if A is
 ൌ ൥െʹ Ͳ ͵൩ skew-symmetric, then ߙ is skew-symmetric such that
െͷ െ͵ Ͳ
Show that A is skew-symmetric. ሺߙሻ୘ ൌ ߙ୘ Ǥ
Result 1.6.3 Let A be ݊ ൈ ݊ matrix, there exists a
Solution: According to definition 1.6.5, A is͵ ൈ ͵ symmetric matrix B and a skew-symmetric matrix C
matrix. Thus, ୘ should be͵ ൈ ͵. By making columns
such that A is a linear combination of B and C. This
of A rows, we obtain the following:
Ͳ െʹ െͷ means that those should be numbers ߙଵ ƒ†ߙଶ such
୘ ൌ ൥ʹ Ͳ െ͵൩
that  ൌ ߙଵ  ൅ ߙଶ .
ͷ ͵ Ͳ
Proof of Result 1.6.3 We will show that A is a linear
43 combination of B and C.
We assume that B is symmetric such that ൌ  ൅ ୘ . 46 M. Kaabar
 ୘ ൌ ሺ ൅ ୘ ሻ୘ ൌ ୘ ൅ ሺ୘ ሻ୘ ൌ ୘ ൅  ൌ Ǥ
Now, we assume that C is skew-symmetric such that 1.7 Determinants
 ൌ  െ ୘ . In this section, we introduce step by step for finding
 ୘ ൌ ሺ െ ୘ ሻ୘ ൌ ୘ െ ሺ୘ ሻ୘ ൌ ୘ െ  ൌ െሺ െ ୘ ሻ ൌ െ determinant of a certain matrix. In addition, we
By using algebra, we do the following: discuss some important properties such as invertible
ͳ ͳ ͳ ͳ and non-invertible. In addition, we talk about the
 ൅  ൌ ሺ ൅ ୘ ሻ ൅ ሺ െ ୘ ሻ
ʹ ʹ ʹ ʹ
effect of row-operations on determinants.
ͳ ͳ ͳ ͳ
ൌ  ൅ ୘ ൅  െ ୘ ൌ Ǥ Definition 1.7.1 Determinant is a square matrix. Given
ʹ ʹ ʹ ʹ
Thus, A is a linear combination of B and C. ଶ ሺԹሻ ൌ Թ૛ൈ૛ ൌ Թ૛ൈ૛ , let  ‫ א‬ଶ ሺԹሻ where A is ʹ ൈ ʹ
Example 1.6.4 Given the following matrix: ܽଵଵ ܽଵଶ
matrix,  ൌ ቂܽ ܽଶଶ ቃǤ The determinant of A is
ʹ ͳ Ͷ ଶଵ
 ൌ ൥͵ Ͳ ͳ ൩ represented by †‡–ሺሻ‘”ȁȁ.
ͷ ͸ ͹
Find symmetric matrix B and skew-symmetric matrix Hence, †‡–ሺሻ ൌ ȁȁ ൌ ܽଵଵ ܽଶଶ െ ܽଵଶ ܽଶଵ ‫ א‬Թ. (Warning:
C such that  ൌ ߙଵ  ൅ ߙଶ  for some numbersߙଵ ƒ†ߙଶ . this definition works only for ʹ ൈ ʹ matrices).
Example 1.7.1 Given the following matrix:
Solution: As we did in the above proof, we do the ͵ ʹ
following: ൌቂ ቃ
ͷ ͹
ʹ ͳ Ͷ ʹ ͵ ͷ Ͷ Ͷ ͻ Find the determinant of A.
 ൌ  ൅ ୘ ൌ ൥͵ Ͳ ͳ൩ ൅ ൥ͳ Ͳ ͸൩ ൌ ൥Ͷ Ͳ ͹ ൩
ͷ ͸ ͹ Ͷ ͳ ͹ ͻ ͹ ͳͶ
ʹ ͳ Ͷ ʹ ͵ ͷ Ͳ െʹ െͳ Solution: Using definition 1.7.1, we do the following:
 ൌ  െ ୘ ൌ ൥͵ Ͳ ͳ൩ െ ൥ͳ Ͳ ͸൩ ൌ ൥ʹ Ͳ െͷ൩ †‡–ሺሻ ൌ ȁȁ ൌ ሺ͵ሻሺ͹ሻ െ ሺʹሻሺͷሻ ൌ ʹͳ െ ͳͲ ൌ ͳͳǤ
ͷ ͸ ͹ Ͷ ͳ ͹ ͳ ͷ Ͳ Thus, the determinant of A is 11.

Let ߙଵ ൌ ߙଶ ൌ

Example 1.7.2 Given the following matrix:
Ͷ Ͷ ͻ Ͳ െʹ െͳ
ଵ ଵ ଵ ଵ ͳ Ͳ ʹ
Thus,  ൌ  ൅  ൌ ൥Ͷ Ͳ ͹ ൩ ൅ ଶ ൥ʹ Ͳ െͷ൩
ଶ ଶ ଶ  ൌ ൥͵ ͳ െͳ൩
ͻ ͹ ͳͶ ͳ ͷ Ͳ
ͳ ʹ Ͷ
Find the determinant of A.
45
Solution: Since A is ͵ ൈ ͵ matrix such that 48 M. Kaabar
 ‫ א‬ଷ ሺԹሻ ൌ Թ૜ൈ૜ , then we cannot use definition 1.7.1
because it is valid only for ʹ ൈ ʹ matrices. Thus, we ͳ Ͳ ʹ
need to use the following method to find the  ൌ ൥͵ ͳ െͳ൩
ͳ ʹ Ͷ
determinant of A. ͳ ʹ
Step 1: Choose any row or any column. It is ሺെͳሻଷାଶ ܽଷଶ †‡– ቂ ቃ
͵ െͳ
recommended to choose the one that has more zeros. Step 3: Add all of them together as follows:
In this example, we prefer to choose the second column
͵ െͳ ͳ ʹ
or the first row. Let’s choose the second column as †‡–ሺሻ ൌ ሺെͳሻଵାଶ ܽଵଶ †‡– ቂ ቃ ൅ ሺെͳሻଶାଶ ܽଶଶ †‡– ቂ ቃ
ͳ Ͷ ͳ Ͷ
follows: ͳ ʹ
ͳ Ͳ ʹ ൅ ሺെͳሻଷାଶ ܽଷଶ †‡– ቂ ቃ
͵ െͳ
 ൌ ൥͵ ͳ െͳ൩
͵ െͳ ͳ ʹ
ͳ ʹ Ͷ †‡–ሺሻ ൌ ሺെͳሻଷ ሺͲሻ†‡– ቂ ቃ ൅ ሺെͳሻସ ሺͳሻ†‡– ቂ ቃ
ܽଵଶ ൌ Ͳǡ ܽଶଶ ൌ ͳƒ†ܽଷଶ ൌ ʹǤ ͳ Ͷ ͳ Ͷ
Step 2: To find the determinant of A, we do the ͳ ʹ
൅ ሺെͳሻହ ሺʹሻ†‡– ቂ ቃ
͵ െͳ
following: Forܽଵଶ , since ܽଵଶ is in the first row and
͵ െͳ ͳ ʹ
second column, then we virtually remove the first row †‡–ሺሻ ൌ ሺെͳሻሺͲሻ†‡– ቂ ቃ ൅ ሺͳሻሺͳሻ†‡– ቂ ቃ
ͳ Ͷ ͳ Ͷ
and second column.
ͳ ʹ
ͳ Ͳ ʹ ൅ ሺെͳሻሺʹሻ†‡– ቂ ቃ
͵ െͳ
 ൌ ൥͵ ͳ െͳ൩
ͳ ʹ Ͷ †‡–ሺሻ ൌ ሺെͳሻሺͲሻሺͳʹ െ െͳሻ ൅ ሺͳሻሺͳሻሺͶ െ ʹሻ ൅ ሺെͳሻሺʹሻሺെͳ
͵ െͳ െ ͸ሻ
ሺെͳሻଵାଶ ܽଵଶ †‡– ቂ ቃ
ͳ Ͷ
†‡–ሺሻ ൌ Ͳ ൅ ʹ ൅ ͳͶ ൌ ͳ͸Ǥ
Forܽଶଶ , since ܽଶଶ is in the second row and second
column, then we virtually remove the second row and Thus, the determinant of A is 16.
second column. Result 1.7.1 Let  ‫ א‬௡ ሺԹሻ. Then, A is invertible
ͳ Ͳ ʹ
 ൌ ൥͵ ͳ െͳ൩ (non-singular) if and only if †‡–ሺሻ ് ͲǤ
ͳ ʹ Ͷ The above result means that if †‡–ሺሻ ് Ͳ, then A is
ͳ ʹ
ሺെͳሻଶାଶ ܽଶଶ †‡– ቂ ቃ invertible (non-singular), and if A is invertible (non-
ͳ Ͷ
singular), then †‡–ሺሻ ് Ͳ.
Forܽଷଶ , since ܽଷଶ is in the third row and second
column, then we virtually remove the third row and Example 1.7.3 Given the following matrix:
second column. ʹ ͵
ൌቂ ቃ
47 Ͷ ͸
Is A invertible (non-singular)? 50 M. Kaabar

Solution: Using result 1.7.1, we do the following: Result 1.7.2 Let  ‫ א‬௡ ሺԹሻ be a triangular matrix.
†‡–ሺሻ ൌ ȁȁ ൌ ሺʹሻሺ͸ሻ െ ሺ͵ሻሺͶሻ ൌ ͳʹ െ ͳʹ ൌ ͲǤ
Then, †‡–ሺሻ = multiplication of the numbers on the
Since the determinant of A is 0, then A is non-
invertible (singular). main diagonal of A.
Thus, the answer is No because A is non-invertible There are three types of triangular matrix:
(singular).
ܽଵଵ ܽଵଶ a) Upper Triangular Matrix: it has all zeros on the
Definition 1.7.2 Given  ൌ ቂܽ ܽଶଶ ቃ. Assume that
ଶଵ left side of the diagonal of ݊ ൈ ݊ matrix.
†‡–ሺሻ ് Ͳ such that †‡–ሺሻ ൌ ܽଵଵ ܽଶଶ െ ܽଵଶ ܽଶଵ . To find
ͳ ͹ ͵
ିଵ (the inverse of A), we use the following format that (i.e.  ൌ ൥Ͳ ʹ ͷ൩ is an Upper Triangular Matrix).
applies only for ʹ ൈ ʹ matrices: Ͳ Ͳ Ͷ
ͳ ܽଶଶ െܽଵଶ b) Diagonal Matrix: it has all zeros on both left and
ିଵ ൌ ቂെܽ ܽଵଵ ቃ
†‡–ሺ‫ܣ‬ሻ ଶଵ right sides of the diagonal of ݊ ൈ ݊ matrix.
ͳ ܽଶଶ െܽଵଶ ͳ Ͳ Ͳ
ିଵ ൌ ቂെܽ ܽଵଵ ቃ (i.e.  ൌ ൥Ͳ ʹ Ͳ൩ is a Diagonal Matrix).
ܽଵଵ ܽଶଶ െ ܽଵଶ ܽଶଵ ଶଵ
Ͳ Ͳ Ͷ
Example 1.7.4 Given the following matrix: c) Lower Triangular Matrix: it has all zeros on the
͵ ʹ
ൌቂ ቃ right side of the diagonal of ݊ ൈ ݊ matrix.
െͶ ͷ
Is A invertible (non-singular)? If Yes, Find ିଵ . ͳ Ͳ Ͳ
(i.e.  ൌ ൥ͷ ʹ Ͳ൩ is a Diagonal Matrix).
Solution: Using result 1.7.1, we do the following: ͳ ͻ Ͷ
†‡–ሺሻ ൌ ȁȁ ൌ ሺ͵ሻሺͷሻ െ ሺʹሻሺെͶሻ ൌ ͳͷ ൅ ͺ ൌ ʹ͵ ് ͲǤ Fact 1.7.1 Let  ‫ א‬௡ ሺԹሻ. Then, †‡–ሺሻ ൌ †‡–ሺ୘ ሻ.
Since the determinant of A is not 0, then A is invertible Fact 1.7.2 Let  ‫ א‬௡ ሺԹሻ. If A is an invertible (non-
(non-singular).
singular) matrix, then ୘ is also an invertible (non-
Thus, the answer is Yes, there exists ିଵ according to
definition 1.7.2 as follows: singular) matrix. (i.e. ሺ୘ ሻିଵ ൌ ሺିଵ ሻ୘ ).
ͷ ʹ Proof of Fact 1.7.2 We will show that ሺ୘ ሻିଵ ൌ ሺିଵ ሻ୘ .
ͳ ͳ െ
ͷ െʹ ͷ െʹ
ିଵ ൌ ቂ ቃൌ ቂ ቃ ൌ ൦ʹ͵ ʹ͵൪
We know from previous results that ିଵ ൌ ௡ .
ሺ ሻ
†‡– ‫ ܣ‬Ͷ ͵ ʹ͵ Ͷ ͵ Ͷ ͵
ʹ͵ ʹ͵ By taking the transpose of both sides, we obtain:
49
ሺିଵ ሻ୘ ൌ ሺ ௡ ሻ୘ 52 M. Kaabar
Then, ሺିଵ ሻ୘ ୘ ൌ ሺ ௡ ሻ୘
Since ሺ ௡ ሻ୘ ൌ ௡ , then ሺିଵ ሻ୘ ୘ ൌ ௡ . * Ri՞Rk (Interchange two rows). It has no effect on

Similarly, ሺ୘ ሻିଵ ୘ ൌ ሺ ௡ ሻ୘ ൌ ௡ . the determinants.

Thus, ሺ୘ ሻିଵ ൌ ሺିଵ ሻ୘ . In general, the effect of Column-Operations on

The effect of Row-Operations on determinants: determinants is the same as for Row-Operations.


Example 1.7.5 Given the following Ͷ ൈ Ͷ matrix A with
Suppose ‫ ן‬is a non-zero constant, and ݅ܽ݊݀݇ are row
some Row-Operations:
numbers in the augmented matrix.  2R1 --Æ A1 3R3 --Æ A2 -2R4 --Æ A4
* ‫ן‬Ri , ‫( Ͳ ്ן‬Multiply a row with a non-zero If †‡–ሺሻ ൌ Ͷ, then find †‡–ሺଷ ሻ

constant‫)ן‬.
Solution: Using what we have learned from the effect
ͳ ʹ ͵ ͳ ʹ ͵ of determinants on Row-Operations:
i.e.  ൌ ൥Ͳ Ͷ ͳ൩ 3R2 --Æ ൥Ͳ ͳʹ ͵൩ ൌ 
†‡–ሺଵ ሻ ൌ ʹ ‫–‡† כ‬ሺሻ ൌ ʹ ‫ כ‬Ͷ ൌ ͺ because ଵ has the first
ʹ Ͳ ͳ ʹ Ͳ ͳ
row of A multiplied by 2.
Assume that †‡–ሺሻ ൌ ɀ where ɀ is known, then
†‡–ሺଶ ሻ ൌ ͵ ‫–‡† כ‬ሺଵ ሻ ൌ ͵ ‫ כ‬ͺ ൌ ʹͶ because ଶ has the
†‡–ሺሻ ൌ ͵ɀ. third row of ଵ multiplied by 3.
Similarly, if †‡–ሺሻ ൌ Ⱦ ™Š‡”‡Ⱦ‹•‘™ǡ then Similarly, †‡–ሺଷ ሻ ൌ െʹ ‫–‡† כ‬ሺଶ ሻ ൌ െʹ ‫ʹ כ‬Ͷ ൌ െͶͺ
ଵ because ଷ has the fourth row of ଶ multiplied by -2.
†‡–ሺሻ ൌ ଷ Ⱦ.

* ‫ן‬Ri +Rk --Æ Rk (Multiply a row with a non-zero Result 1.7.3 Assume ‹•݊ ൈ ݊ƒ–”‹šwith a given
constant‫ן‬ǡ ƒ†ƒ††‹––‘ƒ‘–Š‡””‘™). †‡–ሺሻ ൌ ߛ . Let ߙ be a number. Then, †‡–ሺߙሻ ൌ ߙ ௡ ‫ߛ כ‬.
ͳ ʹ ͵ Result 1.7.4 Assume ƒ†ƒ”‡݊ ൈ ݊ƒ–”‹…‡•Ǥ
i.e.  ൌ ൥Ͳ Ͷ ͳ൩  ‫ן‬Ri +Rk --Æ Rk
ʹ Ͳ ͳ Then: a) †‡–ሺሻ ൌ †‡–ሺሻ ‫–‡† כ‬ሺሻǤ
ͳ ʹ ͵ b) Assume ିଵ exists and  ିଵ exists.
൥Ͳ ͳʹ ͵൩ ൌ 
ʹ Ͳ ͳ Then, ሺሻିଵ ൌ  ିଵ ିଵ Ǥ
Then, †‡–ሺሻ ൌ †‡–ሺሻ. c)†‡–ሺሻ ൌ †‡–ሺሻǤ
d)†‡–ሺሻ ൌ †‡–ሺ୘ ሻǤ
51

e) If ିଵ exists, then †‡–ሺିଵ ሻ ൌ Ǥ 54 M. Kaabar
ୢୣ୲ሺ୅ሻ

Proof of Result 1.7.4 (b) We will show that ͳ ͵ Ͷ


Let C = ൥ͳ ʹ ͳ൩ Then, the solutions for the system of
ሺሻିଵ ൌ  ିଵ ିଵ .
͹ Ͷ ͵
If we multiply ( ିଵ ିଵ ) by (AB), we obtain:
linear equations are:
ିଵ ሺ ିଵ ିଵ ሺ ିଵ
  ሻ ൌ  ௡ ሻB =   ൌ ௡ Ǥ ܽଵ ͵ Ͷ
Thus, ሺሻ ିଵ
ൌ ିଵ ିଵ
 Ǥ †‡– ൥ ‫ͳ ʹ ڭ‬൩
ܽ௡ Ͷ ͵
Proof of Result 1.7.4 (e) We will show that ‫ݔ‬ଵ ൌ
†‡–ሺሻ

†‡–ሺିଵ ሻ ൌ . ͳ ܽଵ Ͷ
ୢୣ୲ሺ୅ሻ
†‡– ൥ͳ ‫ͳ ڭ‬൩
Since ିଵ ൌ ௡ , then †‡–ሺିଵ ሻ ൌ †‡–ሺ‫ܫ‬௡ ሻ ൌ ͳǤ ͹ ܽ௡ ͵
‫ݔ‬ଶ ൌ
†‡–ሺିଵ ሻ ൌ †‡–ሺሻ ‫–‡† כ‬ሺିଵ ሻ ൌ ͳǤ †‡–ሺሻ
ଵ ͳ ͵ ܽଵ
Thus, †‡–ሺିଵ ሻ ൌ Ǥ
ୢୣ୲ሺ୅ሻ †‡– ͳ ʹ ‫ ڭ‬൩

͹ Ͷ ܽ௡
‫ݔ‬ଷ ൌ
†‡–ሺሻ
1.8 Cramer’s Rule
In this section, we discuss how to use Cramer’s Rule to Example 1.8.1 Use Cramer’s Rule to solve the following
system of linear equations:
solve systems of linear equations.
ʹ‫ݔ‬ଵ ൅ ͹‫ݔ‬ଶ ൌ ͳ͵

Definition 1.8.1 Given ݊ ൈ ݊ system of linear equations. െͳͲ‫ݔ‬ଵ ൅ ͵‫ݔ‬ଶ ൌ െͶ
Let  ൌ  be the matrix form of the given system:
Solution: First of all, we write ʹ ൈ ʹ system in the form
‫ݔ‬ଵ ܽଵ
‫ ݔ ۍ‬ଶ ‫ܽ ۍ ې‬ଶ ‫ې‬  ൌ  according to definition 1.8.1.
‫ۑ ێ ۑ ێ‬
 ‫ ݔ ێ‬ଷ ‫ ۑ‬ൌ ‫ܽ ێ‬ଷ ‫ۑ‬ ʹ ͹ ‫ݔ‬ଵ ͳ͵
ቂ ቃቂ ቃ ൌ ቂ ቃ
െͳͲ ͵ ‫ݔ‬ଶ െͶ
‫ۑڭێ ۑڭێ‬ ʹ ͹
‫ ݔ ۏ‬௡ ‫ܽ ۏ ے‬௡ ‫ے‬ Since C in this form is ቂ ቃ, then
െͳͲ ͵
The system has a unique solution if and only if
†‡–ሺሻ ൌ ሺʹ ‫͵ כ‬ሻ െ ൫͹ ‫ כ‬ሺെͳͲሻ൯ ൌ ͸ െ ሺെ͹Ͳሻ ൌ ͹͸ ് ͲǤ
†‡–ሺሻ ് Ͳ. Cramer’s Rule tells us how to
The solutions for this system of linear equations are:
find‫ݔ‬ଵ ǡ ‫ݔ‬ଶ ǡ ǥ ǡ ‫ݔ‬௡ as follows:
53
ͳ͵ ͹ ͳ͵ ͹ 56 M. Kaabar
†‡– ቂ ቃ †‡– ቂ ቃ
‫ݔ‬ଵ ൌ െͶ ͵ ൌ െͶ ͵ ൌ ͸͹
†‡–ሺሻ ͹͸ ͹͸
ܿଵଵ ܿଵଶ ܿଵଷ
ʹ ͳ͵ ʹ ͳ͵ ܿ
 ൌ ൥ ଶଵ ܿଶଶ ܿଶଷ ൩
†‡– ቂ ቃ †‡– ቂ ቃ ͳʹʹ
‫ݔ‬ଶ ൌ െͳͲ െͶ ൌ െͳͲ െͶ ൌ ܿଷଵ ܿଷଶ ܿଷଷ
†‡–ሺሻ ͹͸ ͹͸
଺଻ ଵଶଶ
Thus, the solutions are ‫ݔ‬ଵ ൌ ƒ†‫ݔ‬ଶ ൌ Ǥ In general, to find every element of coefficient matrix
଻଺ ଻଺

A, we need to use the following form:


1.9 Adjoint Method ܿ௜௞ ൌ ሺെͳሻ௜ା௞ †‡–ሾܴ݁݉‫ ݄݅݁ݐ݁ݒ݋‬௧௛ ‫ ݇݀݊ܽݓ݋ݎ‬௧௛ ܿ‫ܣ݂݋݊݉ݑ݈݋‬ሿ
Now, using the above form, we can find all coefficients
In this section, we introduce a new mathematical
of matrix A:
method to find the inverse matrix. We also give some
ͳ െʹ
examples of using Adjoint Method to find inverse ܿଵଵ ൌ ሺെͳሻଵାଵ †‡– ቂ ቃൌʹ
Ͳ ʹ
matrix and its entry elements. Ͳ െʹ
ܿଵଶ ൌ ሺെͳሻଵାଶ †‡– ቂ ቃൌͲ
Ͳ ʹ
Example 1.9.1 Given the following matrix:
Ͳ ͳ
ͳ Ͳ ʹ ܿଵଷ ൌ ሺെͳሻଵାଷ †‡– ቂ ቃൌͲ
Ͳ Ͳ
 ൌ ൥ʹ ͳ െʹ൩
Ͳ Ͳ ʹ Ͳ ʹ
ܿଶଵ ൌ ሺെͳሻଶାଵ †‡– ቂ ቃൌͲ
Find ିଵ using Adjoint Method. Ͳ ʹ
ͳ ʹ
Solution: To find the inverse matrix of A, we need to do ܿଶଶ ൌ ሺെͳሻଶାଶ †‡– ቂ ቃൌʹ
Ͳ ʹ
the following steps: ͳ Ͳ
Step 1: Find the determinant of matrix A, and check ܿଶଷ ൌ ሺെͳሻଶାଷ †‡– ቂ ቃൌͲ
Ͳ Ͳ
whether ିଵ exists or not. Ͳ ʹ
ܿଷଵ ൌ ሺെͳሻଷାଵ †‡– ቂ ቃ ൌ െʹ
Using the previous discussed method for finding the ͳ െʹ
͵ ൈ ͵inverse matrix, we get:†‡–ሺሻ ൌ ʹ ് ͲǤ ͳ ʹ
ܿଷଶ ൌ ሺെͳሻଷାଶ †‡– ቂ ቃൌʹ
Therefore, ିଵ exists. Ͳ െʹ
ͳ Ͳ
Step 2: Calculate the coefficient matrix of A such that ܿଷଷ ൌ ሺെͳሻଷାଷ †‡– ቂ ቃൌͳ
Ͳ ͳ
A = ଷൈଷ because  is ͵ ൈ ͵ matrix.
ʹ Ͳ Ͳ
Hence,  ൌ ൥ Ͳ ʹ Ͳ൩
55 െʹ ʹ ͳ
Step 3: Find the adjoint of matrix A as follows: 58 M. Kaabar
In general, adjoint of A = ƒ†Œሺሻ ൌ  ୘.
ʹ Ͳ െʹ
ሺ ሻ ୘
Thus, ƒ†Œ  ൌ  ൌ Ͳ ʹ ʹ ൩
൥ ʹ Ͳ െʹ ͳ
Ͳ Ͳ ͳ  ଶ ൅  ଷ ՜   ଷ ቎Ͳ ͳ Ͳ ͷ቏
Ͳ Ͳ െͳ ͹
This formula is always true such that Ͳ Ͳ Ͳ Ͷ
 ‫Œ†ƒ כ‬ሺሻ ൌ †‡–ሺሻ ௡ .

Now, using result 1.7.2 for finding the determinant of
If †‡–ሺሻ ് Ͳǡ –Š‡ ቂ ‫Œ†ƒ כ‬ሺሻቃ ൌ ௡ Ǥ the upper triangular matrix:
ୢୣ୲ሺ୅ሻ
ଵ ଵ Since  ‫ א‬௡ ሺԹሻ is a triangular matrix. Then,
Hence, ିଵ ൌ ୢୣ୲ሺ୅ሻ ‫Œ†ƒ כ‬ሺሻ ൌ ୢୣ୲ሺ୅ሻ ‫  כ‬୘ .
†‡–ሺሻ = multiplication of the numbers on the main
ͳ Ͳ െͳ diagonal of A.
ͳ ͳ ʹ Ͳ െʹ
Ͳ ͳ ͳ ʹ Ͳ െʹ ͳ
ିଵ ൌ ‫  כ‬୘ ൌ ‫ כ‬൥Ͳ ʹ ʹ ൩ൌ൦ ͳ൪
†‡–ሺሻ ʹ ቎Ͳ ͳ Ͳ ͷ቏
Ͳ Ͳ ͳ Ͳ Ͳ
ʹ Ͳ Ͳ െͳ ͹
Ͳ Ͳ Ͳ Ͷ
Example 1.9.2 Given the following matrix:
ʹ Ͳ െʹ ͳ †‡–ሺሻ  ൌ ʹ ‫ כ ͳ כ‬െͳ ‫ כ‬Ͷ ൌ െͺ ് ͲǤ
 ൌ ቎െʹ ͳ ʹ Ͷ቏
Therefore, ିଵ exists.
െͶ െͳ ͵ Ͳ
Ͳ Ͳ Ͳ Ͷ Step 2: Use the following general form for (i,k) entry of
Find (2,4) entry of ିଵ . ିଵ :
Solution: To find the (2,4) entry of ିଵ , we need to do ሺ݅ǡ ݇ሻ െ ݁݊‫݂݋ݕݎݐ‬െͳ ൌ
the following steps: ܿ௞௜ ሺെͳሻ௜ା௞ †‡–ሾܴ݁݉‫ ݄݇݁ݐ݁ݒ݋‬௧௛ ‫ ݅݀݊ܽݓ݋ݎ‬௧௛ ܿ‫ܣ݂݋݊݉ݑ݈݋‬ሿ
Step 1: Find the determinant of matrix A, and check ൌ
†‡–ሺሻ †‡–ሺሻ
whether ିଵ exists or not.
Now, using the above form, we can find (2,4)-entry of
Use the Row-Operation Method, we obtain the
following: matrix A:
ʹ Ͳ െʹ ͳ ʹ Ͳ െʹ ͳ ʹ െʹ ͳ ʹ െʹ ͳ
െʹ ͳ ʹ Ͷ ଵ ൅  ଶ ՜   ଶ Ͳ ͳ Ͳ ͷ቏ ሺെͳሻଶାସ †‡– ൥െʹ ʹ Ͷ൩ ሺെͳሻଶାସ †‡– ൥െʹ ʹ Ͷ൩
቎ ቏ ቎ ܿସଶ െͶ ͵ Ͳ ൌ െͶ ͵ Ͳ
െͶ െͳ ͵ Ͳ ʹ ଶ ൅  ଷ ՜   ଷ  Ͳ െͳ െͳ ʹ ൌ
Ͳ Ͳ Ͳ Ͷ Ͳ Ͳ Ͳ Ͷ †‡–ሺሻ †‡–ሺሻ െͺ
ʹ െʹ ͳ
Step 3: Find the determinant of ൥െʹ ʹ Ͷ൩.
െͶ ͵ Ͳ
57
Let’s call the above matrix F such that 60 M. Kaabar
ʹ െʹ ͳ
ൌ ൥െʹ ʹ Ͷ൩
െͶ ͵ Ͳ
1.10 Exercises
To find the determinant of F, we need to use the Row-
Operation Method to reduce the matrix F. 1. Solve the following system of linear equations:
ʹ െʹ ͳ ʹ െʹ ͳ
ଵ ൅  ଶ ՜   ଶ ʹ‫ݔ‬ଷ െ ‫ݔ‬ସ ൅ ‫ ଺ݔ‬ൌ ͳͲ
ൌ ൥െʹ ʹ Ͷ൩ ൥Ͳ Ͳ ͷ൩
ʹଵ ൅  ଷ ՜   ଷ  ൝ െ‫ݔ‬ଵ ൅ ͵‫ݔ‬ଶ ൅ ‫ݔ‬ହ ൌ Ͳ
െͶ ͵ Ͳ Ͳ െͳ ʹ
‫ݔ‬ଵ ൅ ʹ‫ݔ‬ଶ ൅ ʹ‫ݔ‬ଷ െ ‫ݔ‬ସ ൅ ʹ‫ݔ‬ହ ൅ ‫ ଺ݔ‬ൌ ͳʹ
ʹ െʹ ͳ 2. Given an augmented matrix of a
 ଶ ՞   ଷ ൥Ͳ െͳ ʹ൩
ͳ െͳ ܿ ʹ
Ͳ Ͳ ͷ
system:൭െͳ ͳ Ͷอ ͵ ൱
ʹ െʹ ͳ Ͷ െ͵ ܾ ͳͲ
Let’s call the above matrix D such that  ൌ ൥Ͳ െͳ ʹ൩
Ͳ Ͳ ͷ a. For what values of c and b will the system be
Now, using result 1.7.2 for finding the determinant of consistent?
this upper triangular matrix: b. If the system is consistent, when do you have a
ʹ െʹ ͳ
 ൌ ൥Ͳ െͳ ʹ൩ unique solution?.
Ͳ Ͳ ͷ ʹ ͳ ͳ
†‡–ሺሻ  ൌ ʹ ‫ כ‬െͳ ‫ כ‬ͷ ൌ െͳͲ ് ͲǤ െͳ ʹ െͳ ͵
3. Let ൌ ൥Ͳ െͶ ͳ Ͳ቏
ͳ ͳ ͳ ൩ and ൌ ቎ ͳ
Therefore, †‡–ሺ ሻ  ൌ  െ†‡–ሺሻ ൌ ͳͲ ് ͲǤ ͳ ͳ
െʹ Ͳ െͳ ͳ
Thus, The (2,4)-entry of matrix A is: െͳ Ͳ ͵
ʹ െʹ ͳ a. Find the third row of HF.
ሺെͳ ሻଶାସ †‡– ൥െʹ ʹ Ͷ൩
ܿସଶ b. Find the third column of FH.
ൌ െͶ ͵ Ͳ ൌ ሺെͳሻ଺ ‫ Ͳͳ כ‬ൌ ͳͲ ൌ െ ͷ
†‡–ሺሻ െͺ െͺ െͺ Ͷ c. Let HF = A. Find ܽସଶ Ǥ
ൌ െͳǤʹͷ ͳ ʹ Ͷ
4. Let  ൌ ൥െͳ െʹ െ͵൩. If possible find ିଵ Ǥ
െʹ െ͵ െͺ
‫(׵‬2,4)-entry of matrix  ൌ െͳǤʹͷ 5. Given A is ʹ ൈ Ͷ matrix such that
‫ۯ‬ଵ  ՜ ‫ۯ‬૚ ʹଵ ൅  ଶ ՜   ଶ ‫ۯ‬૛ .

59
a. Find two elementary matrices say ‫ܧ‬ଵ ǡ ‫ܧ‬ଶ such that 62 M. Kaabar
‫ܧ‬ଵ ‫ܧ‬ଶ ‫ ܣ‬ൌ ‫ܣ‬ଶ Ǥ
b. Find two elementary matrices say ‫ܨ‬ଵ ǡ ‫ܨ‬ଶ such that Chapter 2
‫ܨ‬ଵ ‫ܨ‬ଶ ‫ܣ‬ଶ ൌ ‫ܣ‬Ǥ
6. Let A = [ 1 2 4 ; -1 -2 3 ; -2 -3 -7 ]. Find det(A). Is A invertible? Explain.

7. Let A = [ 4 -2 ; -3 2 ]. Is A invertible? If yes, find A^-1.

8. Use Cramer's Rule to find the solution to x2 in the system:
   2x1 + x2 - x3 = 2
   -2x1 + 4x2 + 2x3 = 8
   -2x1 - x2 + 8x3 = -2

9. Let A = [ 2 -4 2 1 ; -2 0 2 -1 ; 1 -2 12 4 ; -2 4 -2 12 ]. Find the (2,4) entry of A^-1.

10. Find a 2 × 2 matrix A such that [ -2 2 ; 5 5 ] A + 3I2 = 2A + [ 1 4 ; 3 2 ].

11. Given A^-1 = [ 2 -2 -2 ; -2 3 0 ; -2 2 3 ] and B = [ 1 -2 -2 ; -4 2 2 ; 0 1 -2 ]. Solve the system AX = [ 0 ; -1 ; 1 ].


Chapter 2

Vector Spaces

We start this chapter by reviewing some concepts of set theory, and we discuss some important concepts of vector spaces including span and dimension. In the remaining sections we introduce the concept of linear independence. At the end of this chapter we discuss other concepts such as subspace and basis.

2.1 Span and Vector Spaces

In this section, we review some concepts of set theory, and we give an introduction to span and vector spaces including some examples related to these concepts.

Before reviewing the concepts of set theory, it is recommended to revisit section 1.4, and read the notations of numbers and the representation of the three sets of numbers in figure 1.4.1.

Let's explain some symbols and notations of set theory:

3 ∈ ℤ means that 3 is an element of ℤ.

1/2 ∉ ℤ means that 1/2 is not an element of ℤ.

{ } means that it is a set.
{5} means the subset of ℤ consisting of exactly one element, which is 5.

Definition 2.1.1 The span of a certain set is the set of all possible linear combinations of the elements of that set.

Example 2.1.1 Find Span{1}.

Solution: According to definition 2.1.1, the span of the set {1} is the set of all possible linear combinations of the element of {1}, which is 1. Hence, Span{1} = ℝ.

Example 2.1.2 Find Span{(1,2),(2,3)}.

Solution: According to definition 2.1.1, the span of the set {(1,2),(2,3)} is the set of all possible linear combinations of the elements of {(1,2),(2,3)}, which are (1,2) and (2,3). Thus, the following are some possible linear combinations:

(1,2) = 1 · (1,2) + 0 · (2,3)
(2,3) = 0 · (1,2) + 1 · (2,3)
(5,8) = 1 · (1,2) + 2 · (2,3)

Hence, (1,2), (2,3) and (5,8) all belong to Span{(1,2),(2,3)}.

Example 2.1.3 Find Span{0}.

Solution: According to definition 2.1.1, the span of the set {0} is the set of all possible linear combinations of the element of {0}, which is 0. Hence, Span{0} = {0}.

Example 2.1.4 Find Span{c} where c is a non-zero integer.

Solution: Using definition 2.1.1, the span of the set {c} is the set of all possible linear combinations of the element of {c}, which is c ≠ 0. Thus, Span{c} = ℝ.

Definition 2.1.2 ℝⁿ = {(a1, a2, a3, …, an) | a1, a2, a3, …, an ∈ ℝ} is the set of all points where each point has exactly n coordinates.

Definition 2.1.3 (V, +, ·) is a vector space if it satisfies the following:
a. For every v1, v2 ∈ V, v1 + v2 ∈ V.
b. For every α ∈ ℝ and v ∈ V, αv ∈ V.
(i.e. Given Span{x, y} and the set {x, y}, then √10 · x + 2y ∈ Span{x, y}. If we assume that v ∈ Span{x, y}, then v = c1 x + c2 y for some numbers c1 and c2.)
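To make Example 2.1.2 concrete, the sketch below (added as an illustration, not taken from the original text) finds the coefficients that express (5,8) as a linear combination of (1,2) and (2,3) by solving a small linear system with NumPy.

import numpy as np

# The columns of M are the spanning vectors (1,2) and (2,3).
M = np.array([[1.0, 2.0],
              [2.0, 3.0]])
coeffs = np.linalg.solve(M, np.array([5.0, 8.0]))   # c1*(1,2) + c2*(2,3) = (5,8)
print(coeffs)        # [1. 2.]
print(M @ coeffs)    # [5. 8.]  -- so (5,8) belongs to Span{(1,2),(2,3)}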
2.2 The Dimension of a Vector Space

In this section, we discuss how to find the dimension of a vector space, and how it is related to what we have learned in section 2.1.

Definition 2.2.1 Given a vector space V, the dimension of V is the minimum number of elements of V needed so that their Span equals V, and it is denoted by dim(V). (i.e. dim(ℝ) = 1 and dim(ℝ²) = 2.)

Result 2.2.1 dim(ℝⁿ) = n.

Proof of Result 2.2.1 We will show that dim(ℝⁿ) = n. (We write out the case n = 2; the same argument works for any n using the n standard unit points.)

Claim: D = Span{(1,0),(0,1)} = ℝ².

α1(1,0) + α2(0,1) = (α1, α2) ∈ ℝ².

Thus, D is a subset of ℝ² (D ⊆ ℝ²).

For every x1, y1 ∈ ℝ, (x1, y1) ∈ ℝ². Therefore, (x1, y1) = x1(1,0) + y1(0,1) ∈ D.

This proves the above claim, and hence dim(ℝⁿ) = n.

Fact 2.2.1 Span{(3,4)} ≠ ℝ².

Proof of Fact 2.2.1 We will show that Span{(3,4)} ≠ ℝ².

Claim: F = Span{(3,4)} ≠ ℝ², where (6,5) ∈ ℝ².

We cannot find a number α such that (6,5) = α(3,4), so (6,5) does not belong to F.

This proves the above claim, and hence Span{(3,4)} ≠ ℝ².

Fact 2.2.2 Span{(1,0),(0,1)} = ℝ².

Fact 2.2.3 Span{(2,1),(1,0.5)} ≠ ℝ².
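One way to check facts like 2.2.1–2.2.3 numerically is to compute the rank of the matrix whose rows are the given points: the rank equals the dimension of their span. The following NumPy lines are an added illustration of this idea.

import numpy as np

print(np.linalg.matrix_rank(np.array([[3.0, 4.0]])))                 # 1 -> Span{(3,4)} is a line, not R^2
print(np.linalg.matrix_rank(np.array([[1.0, 0.0], [0.0, 1.0]])))     # 2 -> Span{(1,0),(0,1)} = R^2
print(np.linalg.matrix_rank(np.array([[2.0, 1.0], [1.0, 0.5]])))     # 1 -> Span{(2,1),(1,0.5)} is not R^2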
2.3 Linear Independence

In this section, we learn how to determine whether a set of vectors is linearly independent or not.

Definition 2.3.1 Given a vector space (V, +, ·), we say v1, v2, …, vn ∈ V are linearly independent if none of them is a linear combination of the remaining vi's. (i.e. (3,4), (2,0) ∈ ℝ² are linearly independent because we cannot write them as a linear combination of each other; in other words, we cannot find numbers α1, α2 such that (3,4) = α1(2,0) and (2,0) = α2(3,4).)

Definition 2.3.2 Given a vector space (V, +, ·), we say v1, v2, …, vn ∈ V are linearly dependent if at least one of the vi's is a linear combination of the others.

Example 2.3.1 Assume v1 and v2 are linearly independent. Show that v1 and 3v1 + v2 are linearly independent.

Solution: We will show that v1 and 3v1 + v2 are linearly independent. Using proof by contradiction, we assume that v1 and 3v1 + v2 are linearly dependent. Then, for some non-zero number c1, v1 = c1(3v1 + v2). Using the distribution property and algebra, we obtain:

v1 = 3v1c1 + v2c1
v1 - 3v1c1 = v2c1
v1(1 - 3c1) = v2c1
v1 = (c1 / (1 - 3c1)) v2

(If 1 - 3c1 = 0, then v2c1 = 0 with c1 ≠ 0 would force v2 = 0, which is also impossible for independent vectors.)

Thus, v1 is a scalar multiple of v2, which means that v1 and v2 are linearly dependent. This is a contradiction, because v1 and v2 were assumed to be linearly independent. Therefore, our assumption that v1 and 3v1 + v2 were linearly dependent is false. Hence, v1 and 3v1 + v2 are linearly independent.

Example 2.3.2 Given the following vectors:
v1 = (1, 0, -2)
v2 = (-2, 2, 1)
v3 = (-1, 0, 5)
Are these vectors independent elements?

Solution: First of all, to determine whether these vectors are independent elements or not, we need to write these vectors as a matrix.

[  1   0  -2 ]
[ -2   2   1 ]
[ -1   0   5 ]

Each point is written as a row of this matrix. We need to reduce this matrix to a Semi-Reduced Matrix.

Definition 2.3.3 A Semi-Reduced Matrix is a reduced matrix, but the leader numbers can be any non-zero numbers.

Now, we apply the Row-Reduction Method to get the Semi-Reduced Matrix as follows:

[  1   0  -2 ]   2R1 + R2 -> R2,  R1 + R3 -> R3   [ 1   0  -2 ]
[ -2   2   1 ]                                    [ 0   2  -3 ]
[ -1   0   5 ]                                    [ 0   0   3 ]

This is a Semi-Reduced Matrix. Since none of the rows in the Semi-Reduced Matrix becomes a zero-row, the elements are independent, because we cannot write any one of them as a linear combination of the others.
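The same rank idea gives a quick machine check of Example 2.3.2: three vectors in ℝ³ are linearly independent exactly when the matrix having them as rows has rank 3. The short NumPy sketch below is an added illustration.

import numpy as np

V = np.array([[ 1.0, 0.0, -2.0],    # v1
              [-2.0, 2.0,  1.0],    # v2
              [-1.0, 0.0,  5.0]])   # v3
print(np.linalg.matrix_rank(V))     # 3, so v1, v2, v3 are linearly independent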
Example 2.3.3 Given the following vectors:
v1 = (1, -2, 4, 6)
v2 = (-1, 2, 0, 2)
v3 = (1, -2, 8, 14)
Are these vectors independent elements?

Solution: First of all, to determine whether these vectors are independent elements or not, we need to write these vectors as a matrix.

[  1  -2   4    6 ]
[ -1   2   0    2 ]
[  1  -2   8   14 ]

Each point is written as a row of this matrix. We need to reduce this matrix to a Semi-Reduced Matrix. Now, we apply the Row-Reduction Method to get the Semi-Reduced Matrix as follows:

[  1  -2   4    6 ]   R1 + R2 -> R2,  -R1 + R3 -> R3   [ 1  -2   4   6 ]   -R2 + R3 -> R3   [ 1  -2   4   6 ]
[ -1   2   0    2 ]                                    [ 0   0   4   8 ]                    [ 0   0   4   8 ]
[  1  -2   8   14 ]                                    [ 0   0   4   8 ]                    [ 0   0   0   0 ]

This is a Semi-Reduced Matrix. Since there is a zero-row in the Semi-Reduced Matrix, the elements are dependent, because we can write at least one of them as a linear combination of the others.

2.4 Subspace and Basis

In this section, we discuss one of the most important concepts in linear algebra that is known as subspace. In addition, we give some examples explaining how to find the basis for a subspace.

Definition 2.4.1 A subspace is a vector space, but we call it a subspace because it lives inside a bigger vector space. (i.e. Given vector spaces V and D, then according to figure 2.4.1, D is called a subspace of V.)

Figure 2.4.1: Subspace of V (the set D drawn inside the bigger vector space V)

Fact 2.4.1 Every vector space is a subspace of itself.

Example 2.4.1 Given a vector space L = {(c, 3c) | c ∈ ℝ}.
a. Does L live in ℝ²?
b. Does L equal ℝ²?
c. Is L a subspace of ℝ²?
d. Does L equal Span{(0,3)}?
e. Does L equal Span{(1,3),(2,6)}?

Solution: To answer all these questions, we first need to draw an equation from this vector space, say y = 3x. The following figure represents the graph of the above equation, and it passes through the point (1,3).
Figure 2.4.2: Graph of y = 3x

Now, we can answer the given questions as follows:

Part a: Yes; L lives in ℝ².

Part b: No; L does not equal ℝ². To show that, we prove the following claim:

Claim: L = Span{(5,15)} ≠ ℝ², where (5,15) ∈ ℝ².

Every point of Span{(5,15)} has the form α(5,15); for example, (20,60) = 4(5,15) belongs to L. However, a point such as (20,61) cannot be written as α(5,15), since 20 = 5α forces α = 4 while 15 · 4 = 60 ≠ 61.

This proves the above claim, and Span{(5,15)} ≠ ℝ². Thus, L does not equal ℝ².

Part c: Yes; L is a subspace of ℝ², because L lives inside a bigger vector space which is ℝ².

Part d: No; according to the graph in figure 2.4.2, (0,3) does not belong to L.

Part e: Yes; because we can write (1,3) and (2,6) as a linear combination of each other:

α1(1,3) + α2(2,6) = ((α1 + 2α2), (3α1 + 6α2))
α1(1,3) + α2(2,6) = ((α1 + 2α2), 3(α1 + 2α2))

Assume c = (α1 + 2α2); then we obtain:

α1(1,3) + α2(2,6) = {(c, 3c) | c ∈ ℝ} = L.

Thus, L = Span{(1,3),(2,6)}.

Result 2.4.1 L is a subspace of ℝ² if it satisfies the following:
a. L lives inside ℝ².
b. L consists only of lines through the origin (0,0).

Example 2.4.2 Given a vector space D = {(a, b, 1) | a, b ∈ ℝ}.
a. Does D live in ℝ³?
b. Is D a subspace of ℝ³?

Solution: Since the equation of the above vector space is a three-dimensional equation, there is no need to draw it, because it is difficult to draw it exactly. Thus, we can answer the above questions immediately.

Part a: Yes; D lives inside ℝ³.

Part b: No; since ሺͲǡͲǡͲሻ ‫ܦ ב‬, then ‫ ܦ‬is not a subspace


Example 2.4.3 Let‫ ܦ‬ൌ ܵ‫݊ܽ݌‬ሼሺͳǡ െͳǡͲሻǡ ሺʹǡʹǡͳሻǡ ሺͲǡͶǡͳሻሽ.
of Թଷ .
a. Find †‹ሺ‫ܦ‬ሻ.

Fact 2.4.2 Assume ‫ ܦ‬lives insideԹ . If we can write ‫ܦ‬
b. Find a basis for ‫ܦ‬.
as a ܵ‫݊ܽ݌‬, then it is a subspace of Թ௡ .
Solution: First of all, we have infinite set of points, and

Fact 2.4.3 Assume ‫ ܦ‬lives insideԹ . If we cannot write ‫ ܦ‬lives inside Թଷ . Let’s assume the following:
‫ ܦ‬as a ܵ‫݊ܽ݌‬, then it is not a subspace of Թ௡ .
‫ݒ‬ଵ ൌ ሺͳǡ െͳǡͲሻ

Fact 2.4.4 Assume ‫ ܦ‬lives insideԹ . If ሺͲǡͲǡͲǡ ǥ ǡͲሻ is in ‫ݒ‬ଶ ൌ ሺʹǡʹǡͳሻ
‫ܦ‬, then ‫ ܦ‬is a subspace of Թ௡ .
‫ݒ‬ଷ ൌ ሺͲǡͶǡͳሻ

Fact 2.4.5 Assume ‫ ܦ‬lives insideԹ . If ሺͲǡͲǡͲǡ ǥ ǡͲሻ is Part a: To find †‹ሺ‫ܦ‬ሻ, we check whether ‫ݒ‬ଵ ǡ ‫ݒ‬ଶ and ‫ݒ‬ଷ
not in ‫ܦ‬, then ‫ ܦ‬is not a subspace of Թ௡ . are dependent elements or not. Using what we have
Now, we list the main results on Թ௡ : learned so far from section 2.3: We need to write these
vectors as a matrix.
Result 2.4.2 Maximum number of independent points
ͳ െͳ Ͳ
is ݊. ൥ʹ ʹ ͳ൩ Each point is a row-operation. We need to
Result 2.4.3 Choosing any ݊ independent points in Թ௡ , Ͳ Ͷ ͳ
reduce this matrix to Semi-Reduced Matrix.
say ܳଵ ǡ ܳଶ ǡ ǥ ǡ ܳ௡ , then Թ௡ ൌ ܵ‫݊ܽ݌‬ሼܳଵ ǡ ܳଶ ǡ ǥ ǡ ܳ௡ ሽ.
Now, we apply the Row-Reduction Method to get the
Result 2.4.4 †‹ሺԹ௡ ሻ ൌ ݊.
Semi-Reduced Matrix as follows:
Results 2.4.3 and 2.4.4 tell us the following: In order to
ͳ െͳ Ͳ ͳ െͳ Ͳ
get all Թ௡ , we need exactly ݊ independent points. ൥ʹ ʹ ͳ൩െʹܴଵ ൅ ܴଶ ՜ ܴଶ ൥Ͳ Ͷ ͳ൩ െܴଶ ൅ ܴଷ ՜ ܴଷ
Ͳ Ͷ ͳ Ͳ Ͷ ͳ
Result 2.4.5 Assume Թ௡ ൌ ܵ‫݊ܽ݌‬ሼܳଵ ǡ ܳଶ ǡ ǥ ǡ ܳ௞ ሽ, then
݇ ൒ ݊ (݊ points of the ܳ௞ Ԣ‫ ݏ‬are independents). ͳ െͳ Ͳ
൥Ͳ Ͷ ͳ൩ This is a Semi-Reduced Matrix.
Definition 2.4.2 Basis is the set of points that is needed Ͳ Ͳ Ͳ
to ܵ‫ ݊ܽ݌‬the vector space. Since there is a zero-row in the Semi-Reduced Matrix,
then these elements are dependent because we can
73 write at least one of them as a linear combination of
the others. Only two points survived in the Semi- 76 M. Kaabar
Reduced Matrix. Thus, †‹ሺ‫ܦ‬ሻ ൌ ʹ.

Part b: ‫ ܦ‬is a plane that passes through the origin Now, we apply the Row-Reduction Method to get the
ሺͲǡͲǡͲሻ. Since †‹ሺ‫ܦ‬ሻ ൌ ʹ, then any two independent Semi-Reduced Matrix as follows:
points in ‫ ܦ‬will form a basis for ‫ܦ‬. Hence, the following
െͳ ʹ Ͳ Ͳ ܴ ൅ܴ ՜ܴ െͳ ʹ Ͳ Ͳ
are some possible bases for ‫ܦ‬: ൥ͳ െʹ ͵
ଵ ଶ ଶ
Ͳ൩ െʹܴ ൅ ܴ ՜ ܴ ൥ Ͳ Ͳ ͵ Ͳ൩
ଵ ଷ ଷ
െʹ Ͳ ͵ Ͳ Ͳ െͶ ͵ Ͳ
Basis for ‫ ܦ‬is ሼሺͳǡ െͳǡͲሻǡ ሺʹǡʹǡͳሻሽ.
െͳ ʹ Ͳ Ͳ
Another basis for ‫ ܦ‬is ሼሺͳǡ െͳǡͲሻǡ ሺͲǡͶǡͳሻሽ. െܴଶ ൅ ܴଷ ՜ ܴଷ ൥ Ͳ Ͳ ͵ Ͳ൩ This is a Semi-Reduced
Ͳ െͶ Ͳ Ͳ
Result 2.4.6 It is always true that ȁ‫ݏ݅ݏܽܤ‬ȁ ൌ ݀݅݉ሺ‫ܦ‬ሻ. Matrix.
Example 2.4.4 Given the following: Since there is no zero-row in the Semi-Reduced Matrix,
‫ ܯ‬ൌ ܵ‫݊ܽ݌‬ሼሺെͳǡʹǡͲǡͲሻǡ ሺͳǡ െʹǡ͵ǡͲሻǡ ሺെʹǡͲǡ͵ǡͲሻሽ. then these elements are independent. All the three
points survived in the Semi-Reduced Matrix. Thus,
Find a basis for ‫ܯ‬.
†‹ሺ‫ܯ‬ሻ ൌ ͵. Since †‹ሺ‫ܯ‬ሻ ൌ ͵, then any three
Solution: We have infinite set of points, and ‫ ܯ‬lives independent points in ‫ ܯ‬from the above matrices will
inside Թସ . Let’s assume the following: form a basis for ‫ܯ‬. Hence, the following are some
‫ݒ‬ଵ ൌ ሺെͳǡʹǡͲǡͲሻ possible bases for ‫ܯ‬:

‫ݒ‬ଶ ൌ ሺͳǡ െʹǡ͵ǡͲሻ Basis for ‫ ܯ‬is ሼሺെͳǡʹǡͲǡͲሻǡ ሺͲǡͲǡ͵ǡͲሻǡ ሺͲǡ െͶǡͲǡͲሻሽ.
‫ݒ‬ଷ ൌ ሺെʹǡͲǡ͵ǡͲሻ
Another basis for ‫ ܯ‬isሼሺെͳǡʹǡͲǡͲሻǡ ሺͲǡͲǡ͵ǡͲሻǡ ሺͲǡ െͶǡ͵ǡͲሻሽ.
We check if ‫ݒ‬ଵ ǡ ‫ݒ‬ଶ and ‫ݒ‬ଷ are dependent elements.
Using what we have learned so far from section 2.3 Another basis for ‫ ܯ‬isሼሺെͳǡʹǡͲǡͲሻǡ ሺͳǡ െʹǡ͵ǡͲሻǡ ሺെʹǡͲǡ͵ǡͲሻሽ.
and example 2.4.3: We need to write these vectors as a Example 2.4.5 Given the following:
matrix.
ܹ ൌ ܵ‫݊ܽ݌‬ሼሺܽǡ െʹܽ ൅ ܾǡ െܽሻȁܽǡ ܾ ‫ א‬Թሽ.
െͳ ʹ Ͳ Ͳ
a. Show that ܹ is a subspace of Թଷ .
൥ ͳ െʹ ͵ Ͳ൩ Each point is a row-operation. We
െʹ Ͳ ͵ Ͳ b. Find a basis for ܹ.
need to reduce this matrix to Semi-Reduced Matrix.
c. Rewrite ܹ as a ܵ‫݊ܽ݌‬.
Solution: We have infinite set of points, and ܹ lives

Part a: We write each coordinate of ܹ as a linear


combination of the free variables ܽ and ܾ. Example 2.4.6 Given the following:
‫ ܪ‬ൌ ܵ‫݊ܽ݌‬ሼሺܽଶ ǡ ͵ܾ ൅ ܽǡ െʹܿǡ ܽ ൅ ܾ ൅ ܿሻȁܽǡ ܾǡ ܿ ‫ א‬Թሽ.
ܽ ൌͳήܽ൅Ͳήܾ
Is ‫ ܪ‬a subspace of Թସ ?
െʹܽ ൅ ܾ ൌ െʹ ή ܽ ൅ ͳ ή ܾ
Solution: We have infinite set of points, and ‫ ܪ‬lives
െܽ ൌ െͳ ή ܽ ൅ Ͳ ή ܾ inside Թସ . We try write each coordinate of ‫ ܪ‬as a linear
combination of the free variables ܽǡ ܾ and ܿ.
Since it is possible to write each coordinate of ܹ as a
linear combination of the free variables ܽ and ܾ, then ܽଶ ൌ ‫ ݎܾ݁݉ݑܰ݀݁ݔ݅ܨ‬ή ܽ ൅ ‫ ݎܾ݁݉ݑܰ݀݁ݔ݅ܨ‬ή ܾ ൅ ‫ ݎܾ݁݉ݑܰ݀݁ݔ݅ܨ‬ή ܿ
we conclude that ܹ is a subspace of Թଷ .
ܽଶ is not a linear combination of ܽǡ ܾ and ܿ.
Part b: To find a basis for ܹ, we first need to find
We assume that ‫ ݓ‬ൌ ሺͳǡͳǡͲǡͳሻ ‫ܪ א‬, and ܽ ൌ ͳǡ ܾ ൌ ܿ ൌ Ͳ.
†‹ሺܹሻ. To find †‹ሺܹሻ, let’s play a game called (ON-
OFF GAME) with the free variables ܽƒ†ܾǤ If ߙ ൌ െʹ, then െʹ ή ‫ ݓ‬ൌ െʹ ή ሺͳǡͳǡͲǡͳሻ ൌ ሺെʹǡ െʹǡͲǡ െʹሻ ‫ܪ ב‬.

ܽ ܾ ܲ‫ݐ݊݅݋‬ Since it is impossible to write each coordinate of ‫ ܪ‬as a


ͳ Ͳ ሺͳǡ െʹǡ െͳሻ linear combination of the free variables ܽǡ ܾ and ܿ, then
Ͳ ͳ ሺͲǡͳǡͲሻ we conclude that ‫ ܪ‬is not a subspace of Թସ .
Now, we check for independency: We already have the Example 2.4.7 Form a basis for Թସ .
ͳ െʹ െͳ
Semi-Reduced Matrix: ቂ ቃǤThus, †‹ሺܹ ሻ ൌ ʹ. Solution: We just need to select any random four
Ͳ ͳ Ͳ
independent points, and then we form a Ͷ ൈ Ͷ matrix
Hence, the basis for ܹ is ሼሺͳǡ െʹǡ െͳሻǡ ሺͲǡͳǡͲሻሽ. with four independent rows as follows:

Part b: Since we found the basis for ܹ, then it is easy ʹ ͵ Ͳ Ͷ


to rewrite ܹ as a ܵ‫ ݊ܽ݌‬as follows: ቎Ͳ ͷ ͳ ͳ ቏ Note: ߨ ௘ is a number.
Ͳ Ͳ ʹ ͵
ܹ ൌ ܵ‫݊ܽ݌‬ሼሺͳǡ െʹǡ െͳሻǡ ሺͲǡͳǡͲሻሽǤ Ͳ Ͳ Ͳ ߨ௘

Let’s assume the following:


Fact 2.4.6 †‹ሺܹ ሻ ൑ ܰ‫ ݁݁ݎܨ݂݋ݎܾ݁݉ݑ‬െ ܸܽ‫ݏ݈ܾ݁ܽ݅ݎ‬Ǥ
‫ݒ‬ଵ ൌ ሺʹǡ͵ǡͲǡͶሻ

77 ‫ݒ‬ଶ ൌ ሺͲǡͷǡͳǡͳሻ
‫ݒ‬ଷ ൌ ሺͲǡͲǡʹǡ͵ሻ
‫ݒ‬ସ ൌ ሺͲǡͲǡͲǡ ߨ ௘ ሻ Ͳ ʹ ͳ Ͷ Ͳ ʹ ͳ Ͷ
቎Ͳ െʹ ͵ െͳͲ ቏ܴ ൅ ܴ ՜ ܴ ቎͵
ଵ ଶ ଶ Ͳ
Ͳ ͷ ͵Ͳ቏
Ͳ Ͳ Ͷ െ͸ Ͳ Ͷ െ͸
Thus, the basis for Թସ ൌ ሼ‫ݒ‬ଵ ǡ ‫ݒ‬ଶ ǡ ‫ݒ‬ଷ ǡ ‫ݒ‬ସ ሽ, and Ͳ Ͳ Ͳ ͳͲͲͲ Ͳ Ͳ Ͳ ͳͲͲͲ

This is a Semi-Reduced Matrix.


ܵ‫݊ܽ݌‬ሼ‫ݒ‬ଵ ǡ ‫ݒ‬ଶ ǡ ‫ݒ‬ଷ ǡ ‫ݒ‬ସ ሽ ൌ Թସ .
Thus, the basis for Թସ is
Example 2.4.8 Form a basis for Թସ that contains the
following two independent points: ሼሺͲǡʹǡͳǡͶሻǡ ሺͲǡ െʹǡ͵ǡ െͳͲሻǡ ሺ͵ǡͲǡͷǡ͵Ͳሻǡ ሺͲǡͲǡͲǡͳͲͲͲሻሽ.
ሺͲǡʹǡͳǡͶሻƒ†ሺͲǡ െʹǡ͵ǡ െͳͲሻ. Example 2.4.9 Given the following:
Solution: We need to add two more points to the given ‫ ܦ‬ൌ ܵ‫݊ܽ݌‬ሼሺͳǡͳǡͳǡͳሻǡ ሺെͳǡ െͳǡͲǡͲሻǡ ሺͲǡͲǡͳǡͳሻሽ
one so that all four points are independent. Let’s Is ሺͳǡͳǡʹǡʹሻ ‫?ܦ א‬
assume the following:  Solution: We have infinite set of points, and ‫ ܦ‬lives
‫ݒ‬ଵ ൌ ሺͲǡʹǡͳǡͶሻ inside Թସ . There are two different to solve this
example:
‫ݒ‬ଶ ൌ ሺͲǡ െʹǡ͵ǡ െͳͲሻ
‫ݒ‬ଷ ൌ ሺͲǡͲǡͶǡ െ͸ሻ This is a random point. The First Way: Let’s assume the following:
‫ݒ‬ସ ൌ ሺͲǡͲǡͲǡͳͲͲͲሻ This is a random point. ‫ݒ‬ଵ ൌ ሺͳǡͳǡͳǡͳሻ
Then, we need to write these vectors as a matrix. ‫ݒ‬ଶ ൌ ሺെͳǡ െͳǡͲǡͲሻ
Ͳ ʹ ͳ Ͷ ‫ݒ‬ଷ ൌ ሺͲǡͲǡͳǡͳሻ
቎Ͳ െʹ ͵ െͳͲ ቏ Each point is a row-operation. We We start asking ourselves the following question:
Ͳ Ͳ Ͷ െ͸
Ͳ Ͳ Ͳ ͳͲͲͲ
Question: Can we find ߙଵ ǡ ߙଶ and ߙଷ such that
need to reduce this matrix to Semi-Reduced Matrix.
ሺͳǡͳǡʹǡʹሻ ൌ ߙଵ ή ‫ݒ‬ଵ ൅ ߙଶ ή ‫ݒ‬ଶ ൅ ߙଷ ή ‫ݒ‬ଷ ?
Now, we apply the Row-Reduction Method to get the
Answer: Yes but we need to solve the following system
Semi-Reduced Matrix as follows:
of linear equations:

ͳ ൌ ߙଵ െ ߙଶ ൅ Ͳ ή ߙଷ
ͳ ൌ ߙଵ െ ߙଶ ൅ Ͳ ή ߙଷ

ʹ ൌ ߙଵ ൅ ߙଷ
‫ ܦ‬ൌ ܵ‫݊ܽ݌‬ሼሺͳǡͳǡͳǡͳሻǡ ሺͲǡͲǡͳǡͳሻሽ.
ʹ ൌ ߙଵ ൅ ߙଷ
Now, we ask ourselves the following question:
Using what we have learned from chapter 1 to solve
the above system of linear equations, we obtain: Question: Can we find ߙଵ ǡ ߙଶ and ߙଷ such that
ሺͳǡͳǡʹǡʹሻ ൌ ߙଵ ή ሺͳǡͳǡͳǡͳሻ ൅ ߙଶ ή ሺͲǡͲǡͳǡͳሻ?
ߙଵ ൌ ߙଶ ൌ ߙଷ ൌ ͳ
Answer: Yes:
Hence, Yes: ሺͳǡͳǡʹǡʹሻ ‫ܦ א‬Ǥ
ͳ ൌ ߙଵ
The Second Way (Recommended): We first need to find
݀݅݉ሺ‫ܦ‬ሻ, and then a basis for ‫ܦ‬. We have to write ͳ ൌ ߙଵ
‫ݒ‬ଵ ǡ ‫ݒ‬ଶ ƒ† ‫ݒ‬ଷ as a matrix.
ʹ ൌ ߙଵ ൅ ߙଶ
ͳ ͳ ͳ ͳ
൥െͳ െͳ Ͳ Ͳ൩ Each point is a row-operation. We ʹ ൌ ߙଵ ൅ ߙଶ
Ͳ Ͳ ͳ ͳ
need to reduce this matrix to Semi-Reduced Matrix. Thus, ߙଵ ൌ ߙଶ ൌ ߙଷ ൌ ͳ.

Now, we apply the Row-Reduction Method to get the Hence, Yes: ሺͳǡͳǡʹǡʹሻ ‫ܦ א‬Ǥ
Semi-Reduced Matrix as follows:

ͳ ͳ ͳ ͳ ͳ ͳ ͳ ͳ
2.5 Exercises
൥െͳ െͳ Ͳ ൩ܴ
Ͳ ଵ ൅ ܴଶ ՜ ܴ ൥
ଶ Ͳ Ͳ ͳ ͳ൩
Ͳ Ͳ ͳ ͳ Ͳ Ͳ ͳ ͳ 1. Let ‫ ܯ‬ൌ ܵ‫݊ܽ݌‬ሼሺͳǡ െͳǡͳሻǡ ሺെͳǡͲǡͳሻǡ ሺͲǡ െͳǡʹሻሽ

ͳ ͳ ͳ ͳ a. Find a basis for ‫ܯ‬.


െܴଶ ൅ ܴଷ ՜ ܴଷ ൥Ͳ Ͳ ͳ ͳ൩ This is a Semi-Reduced b. Is ሺͳǡ െ͵ǡͷሻ ‫ ?ܯ א‬Why?
Ͳ Ͳ Ͳ Ͳ
Matrix. 2. Let ‫ ܦ‬ൌ ሼሺ‫ݔ‬ǡ ‫ݕ‬ǡ ‫ݖ‬ǡ ‫ݐ‬ሻ ‫ א‬Թସ ȁ‫ݔ‬ǡ ‫ݕ‬ǡ ‫ݖ‬ǡ ‫ א ݐ‬Թǡ
‫ ݔ‬൅ ʹ‫ ݖ‬൅ ͵‫ ݐ‬ൌ Ͳǡ ܽ݊݀‫ ݕ‬െ ‫ ݖ‬൅ ‫ ݐ‬ൌ Ͳሽ
Since there is a zero-row in the Semi-Reduced Matrix,
then these elements are dependent. Thus, †‹ሺ‫ܦ‬ሻ ൌ ʹ. a. Show that ‫ ܦ‬is a subspace of Թସ .
b. Find a basis for ‫ܦ‬.
Thus, Basis for ‫ ܦ‬is ሼሺͳǡͳǡͳǡͳሻǡ ሺͲǡͲǡͳǡͳሻሽ, and
c. Write ‫ ܦ‬as a ܵ‫݊ܽ݌‬.
݉
3. Let ‫ ܧ‬ൌ ሼቈ݉ ൅ ‫ ݓ‬቉ ȁ݉ǡ ‫ א ݓ‬Թሽ
‫ݓ‬ Chapter 3
a. Show that ‫ ܧ‬is a subspace of Թଷ .
b. Find a basis for ‫ܧ‬. Homogeneous Systems
c. Write ‫ ܧ‬as a ܵ‫݊ܽ݌‬.
In this chapter, we introduce the homogeneous
4. Find a basis for the subspace ‫ ܨ‬where
systems, and we discuss how they are related to what
‫ ݓ‬െ ‫ ݏ‬൅ ͵‫ݑ‬ Ͷ‫ ݓ‬൅ ͵‫ ݏ‬െ ͻ‫ݑ‬ ʹ‫ݓ‬
‫ ܨ‬ൌ ሼቂ ቃ ȁ‫ݓ‬ǡ ‫ݏ‬ǡ ‫ א ݑ‬Թሽ we have learned in chapter 2. We start with an
ͺ‫ ݓ‬൅ ʹ‫ ݏ‬െ ͸‫ݑ‬ ͷ‫ݓ‬ Ͳ
5. Determine the value(s) of ‫ ݔ‬such that the points introduction to null space and rank. Then, we study
ሺͳǡͲǡͷሻǡ ሺͳǡʹǡͶሻǡ ሺͳǡͶǡ ‫ݔ‬ሻ are dependent. one of the most important topics in linear algebra
ʹ െͳ ͳ െͳ ͷ െ͵ which is linear transformation. At the end of this
6. Let ‫ ܦ‬ൌ ܵ‫݊ܽ݌‬ሼቂ ቃǡቂ ቃǡቂ ቃሽ
Ͳ ͳ Ͳ ͲǤͷ Ͳ ʹǤͷ
chapter we discuss how to find range and kernel, and
a. Find ݀݅݉ሺ‫ܦ‬ሻ.
their relation to sections 3.1 and 3.2.
b. Find a basis ‫ܣ‬for ‫ܦ‬.

c. Is ‫ ܥ‬ൌ ቂ
െʹ Ͳ
Ͳ െͳ
ቃ ‫ ?ܦ א‬Why? 3.1 Null Space and Rank
d. If the answer to part c is yes, then write ‫ ܥ‬as a In this section, we first give an introduction to
linear combination of the elements in ‫ܣ‬. Otherwise, homogeneous systems, and we discuss how to find the
write the basis ‫ ܣ‬as a ܵ‫݊ܽ݌‬. null space and rank of homogeneous systems. In
7. Let ‫ ܭ‬ൌ ܵ‫݊ܽ݌‬ሼሺͳǡ െͳǡͲሻǡ ሺʹǡ െͳǡͲሻǡ ሺͳǡͲǡͲሻሽ. addition, we explain how to find row space and column
Find݀݅݉ሺ‫ܭ‬ሻ. space.

8. Find a basis for the subspace of Թ spanned by Definition 3.1.1 Homogeneous System is a ݉ ൈ ݊
ሼሺʹǡͻǡ െʹǡͷ͵ሻǡ ሺെ͵ǡʹǡ͵ǡ െʹሻǡ ሺͺǡ െ͵ǡ െͺǡͳ͹ሻǡ ሺͲǡ െ͵ǡͲǡͳͷሻሽ. system of linear equations that has all zero constants.
9. Does the ܵ‫݊ܽ݌‬ሼሺെʹǡͳǡʹሻǡ ሺʹǡͳǡ െͳሻǡ ሺʹǡ͵ǡͲሻሽ equal to (i.e. the following is an example of homogeneous

Թ ? ʹ‫ݔ‬ଵ ൅ ‫ݔ‬ଶ െ ‫ݔ‬ଷ ൅ ‫ݔ‬ସ ൌ Ͳ
system): ൝͵‫ݔ‬ଵ ൅ ͷ‫ݔ‬ଶ ൅ ͵‫ݔ‬ଷ ൅ Ͷ‫ݔ‬ସ ൌ Ͳ
83 െ‫ݔ‬ଶ ൅ ‫ݔ‬ଷ െ ‫ݔ‬ସ ൌ Ͳ
Imagine we have the following solution to the
݉ଵ ‫ݓ‬ଵ Ͳ
homogeneous system: ‫ݔ‬ଵ ൌ ‫ݔ‬ଶ ൌ ‫ݔ‬ଷ ൌ ‫ݔ‬ସ ൌ Ͳ. ‫݉ ۍ‬ଶ ‫ې‬ ‫ݓۍ‬ଶ ‫ېͲۍ ې‬
‫ۑ ێ‬ ‫ۑ ێ ۑ ێ‬
Then, this solution can be viewed as a point of Թ௡ (here Now, using algebra:‫ ܯ‬൅ ܹ ൌ ‫݉ ێ ܥ‬ଷ ‫ ۑ‬൅ ‫ݓێ ܥ‬ଷ ‫ ۑ‬ൌ ‫ۑͲێ‬
‫ۑ ڭ ێ‬ ‫ۑڭێ ۑ ڭ ێ‬
is Թସ ) : ሺͲǡͲǡͲǡͲሻ ‫݉ۏ‬௡ ‫ے‬ ‫ݓۏ‬௡ ‫ےͲۏ ے‬
Result 3.1.1 The solution of a homogeneous system By taking ‫ ܥ‬as a common factor, we obtain:
݉ ൈ ݊ can be written as ݉ଵ ‫ݓ‬ଵ Ͳ
‫݉ ۍ‬ଶ ‫ݓۍ ې‬ଶ ‫ې‬ ‫ېͲۍ‬
ሼሺܽଵ ǡ ܽଶ ǡ ܽଷ ǡ ܽସ ǡ ǥ ǡ ܽ௡ ȁܽଵ ǡ ܽଶ ǡ ܽଷ ǡ ܽସ ǡ ǥ ǡ ܽ௡ ‫ א‬Թሽ. ‫ۑ ێ ۊ ۑ ێ ۑ ێۇ‬
‫݉ ێۈ ܥ‬ଷ ‫ ۑ‬൅ ‫ݓێ‬ଷ ‫ ۋۑ‬ൌ ‫ۑͲێ‬
Result 3.1.2 All solutions of a homogeneous system ‫ۑ ڭ ێ ۑ ڭ ێ‬ ‫ۑڭێ‬
‫ۏ‬
‫ ۉ‬௡݉ ‫ے‬ ‫ۏ‬ ‫ݓ‬ ‫ے‬
௡ ‫ی‬ ‫ےͲۏ‬
݉ ൈ ݊ form a subset of Թ௡ , and it is equal to the
݉ଵ ൅ ‫ݓ‬ଵ Ͳ
number of variables. ‫ ݉ۍ‬൅ ‫ې ۍ ې ݓ‬
‫ ݉ ێ‬൅ ‫ۑͲێ ۑ ݓ‬
ଶ ଶ
‫ ێܥ‬ଷ ଷ ‫ ۑ‬ൌ ‫ۑͲێ‬
Result 3.1.3 Given a homogeneous system ݉ ൈ ݊. We
‫ێ‬ ‫ڭ‬ ‫ۑڭێ ۑ‬
‫ݔ‬ଵ Ͳ ‫݉ۏ‬௡ ൅ ‫ݓ‬௡ ‫ےͲۏ ے‬
‫ݔ ۍ‬ଶ ‫ېͲۍ ې‬
‫ۑ ێ ۑ ێ‬ Thus, ‫ ܯ‬൅ ܹ is a solution.
write it in the matrix-form: ‫ݔ ێ ܥ‬ଷ ‫ ۑ‬ൌ ‫ ۑͲێ‬where ‫ ܥ‬is a
‫ۑڭێ ۑ ڭ ێ‬ Fact 3.1.1 If ‫ܯ‬ଵ ൌ ሺ݉ଵ ǡ ݉ଶ ǡ ǥ ǡ ݉௡ ሻ is a solution, and
‫ݔۏ‬௡ ‫ےͲۏ ے‬
ߙ ‫ א‬Թ, then ߙ‫ ܯ‬ൌ ሺߙ݉ଵ ǡ ߙ݉ଶ ǡ ǥ ǡ ߙ݉௡ ሻ is a solution.
coefficient. Then, the set of all solutions in this system
Fact 3.1.2 The only system where the solutions form a
is a subspace of Թ௡ .
vector space is the homogeneous system.
Proof of Result 3.1.3 We assume that
Definition 3.1.2 Null Space of a matrix, say ‫ ܣ‬is a set of
‫ܯ‬ଵ ൌ ሺ݉ଵ ǡ ݉ଶ ǡ ǥ ǡ ݉௡ ሻ and ܹଵ ൌ ሺ‫ݓ‬ǡ ‫ݓ‬ଶ ǡ ǥ ǡ ‫ݓ‬௡ ሻ are two
all solutions to the homogeneous system, and it is
solutions to the above system. We will show that
denoted by ܰ‫݈݈ݑ‬ሺ‫ܣ‬ሻ or ܰሺ‫ܣ‬ሻ.
‫ ܯ‬൅ ܹ is a solution. We write them in the matrix-form:
݉ଵ Ͳ ‫ݓ‬ଵ Ͳ Definition 3.1.3 Rank of a matrix, say ‫ ܣ‬is the number
‫݉ ۍ‬ଶ ‫ې Ͳ ۍ ې‬ ‫ݓۍ‬ଶ ‫ېͲۍ ې‬
‫ۑ ێ ۑ ێ‬ ‫ۑ ێ ۑ ێ‬ of independent rows or columns of ‫ܣ‬, and it is denoted
‫݉ێ ܥ‬ଷ ‫ ۑ‬ൌ ‫ ۑͲێ‬and ‫ݓێ‬ଷ ‫ ۑ‬ൌ ‫ۑͲێ‬
‫ۑڭێ ۑ ڭ ێ‬ ‫ۑڭێ ۑ ڭ ێ‬ by ܴܽ݊݇ሺ‫ܣ‬ሻ.
‫݉ ۏ‬௡ ‫ے Ͳ ۏ ے‬ ‫ݓۏ‬௡ ‫ےͲۏ ے‬
Definition 3.1.4 Row Space of a matrix, say ‫ ܣ‬is the
ܵ‫݊ܽ݌‬of independent rows of ‫ܣ‬, and it is denoted by ͳ െͳ ʹ Ͳ െͳ Ͳ
൭Ͳ ͳ ʹ Ͳ ʹ อͲ൱ܴଶ ൅ ܴଵ ՜ ܴଵ
ܴ‫ݓ݋‬ሺ‫ܣ‬ሻ. Ͳ Ͳ Ͳ ͳ Ͳ Ͳ
Definition 3.1.5 Column Space of a matrix, say ‫ ܣ‬is the ͳ Ͳ Ͷ Ͳ ͳͲ
ܵ‫݊ܽ݌‬of independent columns of ‫ܣ‬, and it is denoted by ൭Ͳ ͳ ʹ Ͳ ʹอͲ൱ This is a Completely-Reduced
Ͳ Ͳ Ͳ ͳ ͲͲ
‫݊݉ݑ݈݋ܥ‬ሺ‫ܣ‬ሻ. Matrix.
Example 3.1.1 Given the following ͵ ൈ ͷ matrix:
Step 3: Read the solution for the above system of linear
ͳ െͳ ʹ Ͳ െͳ equations after using Row-Operation.
‫ ܣ‬ൌ ൥Ͳ ͳ ʹ Ͳ ʹ ൩.
Ͳ Ͳ Ͳ ͳ Ͳ ‫ݔ‬ଵ ൅ Ͷ‫ݔ‬ଷ ൅ ‫ݔ‬ହ ൌ Ͳ
a. Find ܰ‫݈݈ݑ‬ሺ‫ܣ‬ሻ. ‫ݔ‬ଶ ൅ ʹ‫ݔ‬ଷ ൅ ʹ‫ݔ‬ହ ൌ Ͳ
‫ݔ‬ସ ൌ Ͳ
b. Find ݀݅݉ሺܰ‫݈݈ݑ‬ሺ‫ܣ‬ሻሻ.
c. Rewrite ܰ‫݈݈ݑ‬ሺ‫ܣ‬ሻ as ܵ‫݊ܽ݌‬. Free variables are ‫ݔ‬ଷ and ‫ݔ‬ହ .

d. Find ܴܽ݊݇ሺ‫ܣ‬ሻ. Assuming that ‫ݔ‬ଷ ,‫ݔ‬ହ ‫ א‬Թ. Then, the solution of the
e. Find ܴ‫ݓ݋‬ሺ‫ܣ‬ሻ. above homogeneous system is as follows:

Solution: Part a: To find the null space of ‫ܣ‬, we need to ‫ݔ‬ଵ ൌ െͶ‫ݔ‬ଷ െ ‫ݔ‬ହ
find the solution of‫ ܣ‬as follows: ‫ݔ‬ଶ ൌ െʹ‫ݔ‬ଷ െ ʹ‫ݔ‬ହ
‫ݔ‬ସ ൌ Ͳ
Step 1: Write the above matrix as an Augmented-
Thus, according to definition 3.1.2,
Matrix, and make all constants’ terms zeros.

ͳ െͳ ʹ Ͳ െͳ Ͳ ܰ‫݈݈ݑ‬ሺ‫ܣ‬ሻ ൌ ሼሺെͶ‫ݔ‬ଷ െ ‫ݔ‬ହ ǡ െʹ‫ݔ‬ଷ െ ʹ‫ݔ‬ହ ǡ ‫ݔ‬ଷ ǡ Ͳǡ ‫ݔ‬ହ ሻȁ‫ݔ‬ଷ ,‫ݔ‬ହ ‫ א‬Թሽ.
൭Ͳ ͳ ʹ Ͳ ʹ อͲ൱
Ͳ Part b: It is always true that
Ͳ Ͳ ͳ Ͳ Ͳ

Step 2: Apply what we have learned from chapter 1 to ݀݅݉൫ܰ‫݈݈ݑ‬ሺ‫ܣ‬ሻ൯ ൌ ݀݅݉൫ܰሺ‫ܣ‬ሻ൯ ൌ ݄ܶ݁ܰ‫ݏ݈ܾ݁ܽ݅ݎܸܽ݁݁ݎܨ݂݋ݎܾ݁݉ݑ‬
solve systems of linear equations use Row-Operation Here, ݀݅݉൫ܰ‫݈݈ݑ‬ሺ‫ܣ‬ሻ൯ ൌ ʹ.
Method.


Definition 3.1.6 The nullity of a matrix, say ‫ ܣ‬is the


Result 3.1.6 Let ‫ ܣ‬be ݉ ൈ ݊ matrix. The geometric
dimension of the null space of ‫ܣ‬, and it is denoted by
meaning of ‫݊݉ݑ݈݋ܥ‬ሺ‫ܣ‬ሻ ൌ ܵ‫݊ܽ݌‬ሼ‫ݏ݊݉ݑ݈݋ܥݐ݊݁݀݊݁݌݁݀݊ܫ‬ሽ
݀݅݉ሺܰ‫݈݈ݑ‬ሺ‫ܣ‬ሻሻ or݀݅݉ሺܰሺ‫ܣ‬ሻሻ.
“lives” inside Թ௠ .
Part c: We first need to find a basis for ܰ‫݈݈ݑ‬ሺ‫ܣ‬ሻ as
Result 3.1.7 Let ‫ ܣ‬be ݉ ൈ ݊ matrix. Then,
follows: To find a basis for ܰ‫݈݈ݑ‬ሺ‫ܣ‬ሻ, we play a game
called (ON-OFF GAME) with the free variables ‫ݔ‬ଷ and ܴܽ݊݇ሺ‫ܣ‬ሻ ൌ ݀݅݉൫ܴ‫ݓ݋‬ሺ‫ܣ‬ሻ൯ ൌ ݀݅݉ሺ‫݊݉ݑ݈݋ܥ‬ሺ‫ܣ‬ሻሻ.
‫ݔ‬ହ Ǥ Example 3.1.2 Given the following ͵ ൈ ͷ matrix:
‫ݔ‬ଷ ‫ݔ‬ହ ܲ‫ݐ݊݅݋‬ ͳ ͳ ͳ ͳ ͳ
ͳ Ͳ ሺെͶǡ െʹǡͳǡͲǡͲሻ ‫ ܤ‬ൌ ൥െͳ െͳ െͳͲ ʹ൩.
Ͳ ͳ ሺെͳǡ െʹǡͲǡͲǡͳሻ Ͳ Ͳ Ͳ Ͳ Ͳ
a. Find ܴ‫ݓ݋‬ሺ‫ܤ‬ሻ.
The basis for ܰ‫݈݈ݑ‬ሺ‫ܣ‬ሻ ൌ ሼሺെͶǡ െʹǡͳǡͲǡͲሻǡ ሺെͳǡ െʹǡͲǡͲǡͳሻሽ.
b. Find ‫݊݉ݑ݈݋ܥ‬ሺ‫ܤ‬ሻ.
Thus, ܰ‫ ݈݈ݑ‬ሺ‫ܣ‬ሻ ൌ ܵ‫݊ܽ݌‬ሼሺെͶǡ െʹǡͳǡͲǡͲሻǡ ሺെͳǡ െʹǡͲǡͲǡͳሻሽ. c. Find ܴܽ݊݇ሺ‫ܤ‬ሻ.
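As an added numerical illustration of Example 3.1.1, the NumPy sketch below checks that the two basis vectors found for Null(A) are indeed solutions of the homogeneous system, and that Rank(A) plus dim(Null(A)) equals the number of columns.

import numpy as np

A = np.array([[1, -1, 2, 0, -1],
              [0,  1, 2, 0,  2],
              [0,  0, 0, 1,  0]], dtype=float)
basis = np.array([[-4, -2, 1, 0, 0],
                  [-1, -2, 0, 0, 1]], dtype=float)

print(A @ basis.T)                  # both columns are zero vectors, so the basis vectors lie in Null(A)
print(np.linalg.matrix_rank(A))     # 3 = Rank(A); with 5 columns, dim(Null(A)) = 5 - 3 = 2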

Part d: To find the rank of matrix ‫ܣ‬, we just need to Solution: Part a: To find the row space of ‫ܤ‬, we need to
change matrix ‫ ܤ‬to the Semi-Reduced Matrix as
change matrix ‫ ܣ‬to the Semi-Reduced Matrix. We
follows:
already did that in part a. Thus, ܴܽ݊݇ሺ‫ܣ‬ሻ ൌ ͵Ǥ
ͳ ͳ ͳ ͳ ͳ ܴ ൅ܴ ՜ܴ ͳ ͳ ͳ ͳ ͳ
Part e: To find the row space of matrix ‫ܣ‬, we just need ൥െͳ െͳ െͳͲ
ଵ ଶ ଶ
ʹ ൩ ܴ ൅ ܴ ՜ ܴ ൥Ͳ Ͳ Ͳͳ ͵൩
ଵ ଷ ଷ
to write the ܵ‫ ݊ܽ݌‬of independent rows. Thus, Ͳ Ͳ Ͳ Ͳ Ͳ Ͳ Ͳ Ͳ Ͳ Ͳ

ܴ‫ݓ݋‬ሺ‫ܣ‬ሻ ൌ ܵ‫݊ܽ݌‬ሼሺͳǡ െͳǡʹǡͲǡ െͳሻǡ ሺͲǡͳǡʹǡͲǡʹሻǡ ሺͲǡͲǡͲǡͳǡͲሻሽǤ This is a Semi-Reduced Matrix. To find the row space
It is also a subspace of Թହ . of matrix ‫ܤ‬, we just need to write the ܵ‫ ݊ܽ݌‬of
Result 3.1.4 Let ‫ ܣ‬be ݉ ൈ ݊ matrix. Then, independent rows. Thus,
ܴܽ݊݇ሺ‫ܣ‬ሻ ൅ ݀݅݉൫ܰ ሺ‫ܣ‬ሻ൯ ൌ ݊ ൌ ܰ‫ܣ݂݋ݏ݊݉ݑ݈݋ܥ݂݋ݎܾ݁݉ݑ‬. ܴ‫ݓ݋‬ሺ‫ܤ‬ሻ ൌ ܵ‫݊ܽ݌‬ሼሺͳǡͳǡͳǡͳǡͳሻǡ ሺͲǡͲǡͲǡͳǡ͵ሻሽǤ

Result 3.1.5 Let ‫ ܣ‬be ݉ ൈ ݊ matrix. The geometric Part b: To find the column space of ‫ܤ‬, we need to
change matrix ‫ ܤ‬to the Semi-Reduced Matrix. We
meaning of ܴ‫ݓ݋‬ሺ‫ܣ‬ሻ ൌ ܵ‫݊ܽ݌‬ሼ‫ݏݓ݋ܴݐ݊݁݀݊݁݌݁݀݊ܫ‬ሽ “lives”
already did that in part a. Now, we need to locate the
inside Թ௡ . columns in the Semi-Reduced Matrix of ‫ ܤ‬that contain

the leaders, and then we should locate them to the


original matrix ‫ܤ‬. Before discussing polynomials, we need to know the
following mathematical facts:

Fact 3.2.1 Թ௡ൈ௠ ൌ Թ௡ൈ௠ ൌ ‫ܯ‬௡ൈ௠ ሺԹሻ is a vector space.


ͳ ͳ ͳ ͳ ͳ
൥Ͳ Ͳ Ͳͳ ͵൩ Semi-Reduced Matrix Fact 3.2.2 Թଶൈଷ is equivalent to Թ଺ as a vector space.
Ͳ Ͳ Ͳ Ͳ Ͳ
ͳ ʹ ͵
(i.e. ቂ ቃ ‹•‡“—‹˜ƒŽ‡––‘ሺͳǡʹǡ͵ǡͲǡͳǡͳሻ).
Ͳ ͳ ͳ

ͳ ͳ ͳ ͳ ͳ Fact 3.2.3 Թଷൈଶ is equivalent to Թ଺ as a vector space.


൥െͳ െͳ െͳͲ ʹ൩ Matrix ‫ܤ‬
Ͳ Ͳ Ͳ Ͳ Ͳ ͳ ʹ
(i.e. ͵
൥ Ͳ൩ ‹•‡“—‹˜ƒŽ‡––‘ሺͳǡʹǡ͵ǡͲǡͳǡͳሻ).
Each remaining columns is a linear combination of the ͳ ͳ
first and fourth columns.
After knowing the above facts, we introduce
Thus, ‫݊݉ݑ݈݋ܥ‬ሺ‫ܤ‬ሻ ൌ ܵ‫݊ܽ݌‬ሼሺͳǡ െͳǡͲሻǡ ሺͳǡͲǡͲሻሽ. polynomials as follows:

Part c: To find the rank of matrix ‫ܤ‬, we just need to ܲ௡ ൌ ܵ݁‫ ݁݁ݎ݂݃݁݀݋ݏ݈ܽ݅݉݋݊ݕ݈݋݌݈݈݂ܽ݋ݐ‬൏ ݊Ǥ

change matrix ‫ ܣ‬to the Semi-Reduced Matrix. We The algebraic expression of polynomials is in the
already did that in part a. Thus, following from: ܽ௡ ‫ ݔ‬௡ ൅ ܽ௡ିଵ ‫ ݔ‬௡ିଵ ൅ ‫ ڮ‬൅ ܽଵ ‫ ݔ‬ଵ ൅ ܽ଴

ܴܽ݊݇ሺ‫ܣ‬ሻ ൌ ݀݅݉൫ܴ‫ݓ݋‬ሺ‫ܤ‬ሻ൯ ൌ ݀݅݉ሺ‫݊݉ݑ݈݋ܥ‬ሺ‫ܤ‬ሻሻ ൌ ʹǤ ܽ௡ ǡ ܽ௡ିଵ ƒ†ܽଵ are coefficients.

3.2 Linear Transformation ݊ƒ†݊ െ ͳ are exponents that must be positive


integers whole numbers.
We start this section with an introduction to
ܽ଴ is a constant term.
polynomials, and we explain how they are similar to
The degree of polynomial is determined by the highest
Թ௡ as vector spaces. At the end of this section we
power (exponent).
discuss a new concept called linear transformation.
We list the following examples of polynomials:


x ܲଶ ൌ ܵ݁‫ ݁݁ݎ݂݃݁݀݋ݏ݈ܽ݅݉݋݊ݕ݈݋݌݈݈݂ܽ݋ݐ‬൏ ʹ (i.e. ͵‫ ݔ‬ଶ െ ʹ ൌ െʹ ൅ Ͳ‫ ݔ‬൅ ͵‫ ݔ‬ଶ ՞ ሺെʹǡͲǡ͵ሻ


͵‫ ݔ‬൅ ʹ ‫ܲ א‬ଶ , Ͳ ‫ܲ א‬ଶ , ͳͲ ‫ܲ א‬ଶ , ξ͵ ‫ܲ א‬ଶ but
ξ͵ξ‫ܲ ב ݔ‬ଶ ). െͷ‫ ݔ‬ൌ Ͳ െ ͷ‫ ݔ‬൅ Ͳ‫ ݔ‬ଶ ՞ ሺͲǡ െͷǡͲሻ
x ܲସ ൌ ܵ݁‫ ݁݁ݎ݂݃݁݀݋ݏ݈ܽ݅݉݋݊ݕ݈݋݌݈݈݂ܽ݋ݐ‬൏ Ͷ (i.e. ͸‫ ݔ‬ଶ െ ͳͲ‫ ݔ‬െ Ͷ ൌ െͶ െ ͳͲ‫ ݔ‬൅ ͸‫ ݔ‬ଶ ՞ ሺെͶǡ െͳͲǡ͸ሻ
͵ͳ‫ ݔ‬ଶ ൅ Ͷ ‫ܲ א‬ସ ).
x If ܲሺ‫ ݔ‬ሻ ൌ ͵, then ݀݁݃൫ܲ ሺ‫ ݔ‬ሻ൯ ൌ Ͳ. Now, we need to write these vectors as a matrix.
x ξ‫ ݔ‬൅ ͵ is not a polynomial. െʹ Ͳ ͵
൥Ͳ െͷ Ͳ൩ Each point is a row-operation. We need
Result 3.2.1 ܲ௡ is a vector space. െͶ െͳͲ ͸
to reduce this matrix to Semi-Reduced Matrix.
Fact 3.2.4 Թଶൈଷ ൌ ‫ܯ‬ଶൈଷ ሺԹሻ as a vector space same as
Թ଺ . Then, we apply the Row-Reduction Method to get the
Semi-Reduced Matrix as follows:
Result 3.2.2 ܲ௡ is a vector space, and it is the same as
Թ௡ . (i.e. ܽ଴ ൅ ܽଵ ‫ ݔ‬ଵ ൅ ‫ ڮ‬൅ ܽ௡ିଵ ‫ ݔ‬௡ିଵ ՞ ሺܽ଴ ǡ ܽଵ ǡ ǥ ǡ ܽ௡ିଵ ሻ. െʹ Ͳ ͵ െʹ Ͳ ͵
Note: The above form is in an ascending order. ൥Ͳ െͷ Ͳ൩െʹܴଵ ൅ ܴଷ ՜ ܴଷ ൥ Ͳ െͷ Ͳ൩
െͶ െͳͲ ͸ Ͳ െͳͲ Ͳ
Result 3.2.3 ݀݅݉ሺܲ௡ ሻ ൌ ݊Ǥ
െʹ Ͳ ͵
Fact 3.2.5 ܲଷ ൌ ܵ‫݊ܽ݌‬ሼ͵‫ݏ݈ܽ݅݉݋݊ݕ݈݋ܲݐ݊݁݀݊݁݌݁݀݊ܫ‬ǡ ܽ݊݀ െʹܴଶ ൅ ܴଷ ՜ ܴଷ ൥ Ͳ െͷ Ͳ൩ This is a Semi-Reduced
Ͳ Ͳ Ͳ
‫ ݁݁ݎ݃݁ܦ݂݋݄ܿܽܧ‬൏ ͵ሽ. (i.e. ܲଷ ൌ ܵ‫݊ܽ݌‬ሼͳǡ ‫ݔ‬ǡ ‫ ʹݔ‬ሽ).
Matrix.
Example 3.2.1 Given the following polynomials:
Since there is a zero-row in the Semi-Reduced Matrix,
͵‫ ݔ‬ଶ െ ʹǡ െͷ‫ݔ‬ǡ ͸‫ ݔ‬ଶ െ ͳͲ‫ ݔ‬െ Ͷ. then these elements are dependent. Thus, the answer
a. Are these polynomials independent? to this question is NO.

b. Let ‫ ܦ‬ൌ ܵ‫݊ܽ݌‬ሼ͵‫ ݔ‬ଶ െ ʹǡ െͷ‫ݔ‬ǡ ͸‫ ݔ‬ଶ െ ͳͲ‫ ݔ‬െ Ͷሽ. Find a Part b: Since there are only 2 vectors survived after
basis for ‫ܦ‬. checking for dependency in part a, then the basis for ‫ܦ‬
ሺͲǡ െͷǡͲሻ ՞ െͷ‫ݔ‬.
Solution: Part a: We know that these polynomial live
in ܲଷ , and as a vector space ܲଷ is the same as Թଷ . According Result 3.2.4 Given ‫ݒ‬ଵ ǡ ‫ݒ‬ଶ ǡ ǥ ǡ ‫ݒ‬௞ points in Թ௡ where
to result 3.2.2, we need to make each polynomial ݇ ൏ ݊. Choose one particular point, say ܳ, such that
equivalent to Թ௡ as follows: ܳ ൌ ܿଵ ‫ݒ‬ଵ ൅ ܿଶ ‫ݒ‬ଶ ൅ ‫ ڮ‬൅ ܿ௞ ‫ݒ‬௞ where ܿଵ ǡ ܿଶ ǡ ǥ ǡ ܿ௞ are

constants. If ܿଵ ǡ ܿଶ ǡ ǥ ǡ ܿ௞ are unique, then ‫ݒ‬ଵ ǡ ‫ݒ‬ଶ ǡ ǥ ǡ ‫ݒ‬௞


Example 3.2.2 Givenܶǣ Թଶ ՜ Թଷ where Թଶ is a domain
are independent.
and Թଷ is a co-domain. ܶ൫ሺܽଵ ǡ ܽଶ ሻ൯ ൌ ሺ͵ܽଵ ൅ ܽଶ ǡ ܽଶ ǡ െܽଵ ሻ.
Note: The word “unique” in result 3.2.4 means that
a. Find ܶሺሺͳǡͳሻሻ.
there is only one value for each of ܿଵ ǡ ܿଶ ǡ ǥ ǡ ܿ௞ .
b. Find ܶሺሺͳǡͲሻሻ.
Proof of Result 3.2.4 By using proof by contradiction, c. Show that ܶ is a linear transformation.
we assume that ‫ݒ‬ଵ ൌ ߙଶ ‫ݒ‬ଶ ൅ ߙଷ ‫ݒ‬ଷ ൅ ‫ ڮ‬൅ ߙ௞ ‫ݒ‬௞ where Solution: Part a: Since ܶ൫ሺܽଵ ǡ ܽଶ ሻ൯ ൌ ሺ͵ܽଵ ൅ ܽଶ ǡ ܽଶ ǡ െܽଵ ሻ,
ߙଶ ǡ ߙଷ ǡ ǥ ǡ ߙ௞ are constants. Our assumption means that then ܽଵ ൌ ܽଶ ൌ ͳ. Thus, ܶ൫ሺͳǡͳሻ൯ ൌ ሺ͵ሺͳሻ ൅ ͳǡͳǡ െͳሻ ൌ
it is dependent. Using algebra, we obtain: ሺͶǡͳǡ െͳሻ.

ܳ ൌ ܿଵ ߙଶ ‫ݒ‬ଶ ൅ ܿଵ ߙଷ ‫ݒ‬ଷ ൅ ‫ ڮ‬൅ ܿଵ ߙ௞ ‫ݒ‬௞ ൅ ܿଶ ‫ݒ‬ଶ ൅ ‫ ڮ‬൅ ܿ௞ ‫ݒ‬௞ . Part b: Since ܶ൫ሺܽଵ ǡ ܽଶ ሻ൯ ൌ ሺ͵ܽଵ ൅ ܽଶ ǡ ܽଶ ǡ െܽଵ ሻ, then
ܳ ൌ ሺܿଵ ߙଶ ൅ܿଶ ሻ‫ݒ‬ଶ ൅ ሺܿଵ ߙଷ ൅ ܿଷ ሻ‫ݒ‬ଷ ൅ ‫ ڮ‬൅ ሺܿଵ ߙ௞ ൅ ܿ௞ ሻ‫ݒ‬௞ ൅ ܽଵ ൌ ͳƒ†ܽଶ ൌ Ͳ. Thus, ܶ൫ሺͳǡͲሻ൯ ൌ ሺ͵ሺͳሻ ൅ ͲǡͲǡ െͳሻ ൌ
Ͳ‫ݒ‬ଵ Ǥ Thus, none of them is a linear combination of the ሺ͵ǡͲǡ െͳሻ.

others which means that they are linearly Part c: Proof: We assume that ‫ݒ‬ଵ ൌ ሺܽଵ ǡ ܽଶ ሻ,
independent. This is a contradiction. Therefore, our ‫ݒ‬ଶ ൌ ሺܾଵ ǡ ܾଶ ሻ, and ߙ ‫ א‬Թ. We will show that ܶ is a linear
assumption that ‫ݒ‬ଵ ǡ ‫ݒ‬ଶ ǡ ǥ ǡ ƒ†‫ݒ‬௞ were linearly transformation. Using algebra, we start from the Left-
dependent is false. Hence, ‫ݒ‬ଵ ǡ ‫ݒ‬ଶ ǡ ǥ ǡ ƒ†‫ݒ‬௞ are linearly Hand-Side (LHS):
independent. ߙ‫ݒ‬ଵ ൅ ‫ݒ‬ଶ ൌ ሺߙܽଵ ൅ ܾଵ ǡ ߙܽଶ ൅ ܾଶ ሻ
Result 3.2.5 Assume ‫ݒ‬ଵ ǡ ‫ݒ‬ଶ ǡ ǥ ǡ ‫ݒ‬௞ are independent and ܶሺߙ‫ݒ‬ଵ ൅ ‫ݒ‬ଶ ሻ ൌ ܶሺሺߙܽଵ ൅ ܾଵ ǡ ߙܽଶ ൅ ܾଶ ሻሻ
ܳ ‫݊ܽ݌ܵ א‬ሼ‫ݒ‬ଵ ǡ ‫ݒ‬ଶ ǡ ǥ ǡ ‫ݒ‬௞ ሽ. Then, there exists unique ܶሺߙ‫ݒ‬ଵ ൅ ‫ݒ‬ଶ ሻ ൌ ሺ͵ߙܽଵ ൅ ͵ܾଵ ൅ ߙܽଶ ൅ ܾଶ ǡ ߙܽଶ ൅ ܾଶ ǡ െߙܽଵ െ ܾଵ ሻ
number ܿଵ ǡ ܿଶ ǡ ǥ ǡ ܿ௞ such that ܳ ൌ ܿଵ ‫ݒ‬ଵ ൅ ܿଶ ‫ݒ‬ଶ ൅ ‫ ڮ‬൅
Now, we start from the Right-Hand-Side (RHS):
ܿ௞ ‫ݒ‬௞ .
ߙܶሺ‫ݒ‬ଵ ሻ ൅ ܶሺ‫ݒ‬ଶ ሻ ൌ ߙܶሺܽଵ ǡ ܽଶ ሻ ൅ ܶሺܾଵ ǡ ܾଶ ሻ
Linear Transformation: ߙܶሺ‫ݒ‬ଵ ሻ ൅ ܶሺ‫ݒ‬ଶ ሻ ൌ ߙሺ͵ܽଵ ൅ ܽଶ ǡ ܽଶ ǡ െܽଵ ሻ ൅ ሺ͵ܾଵ ൅ ܾଶ ǡ ܾଶ ǡ െܾଵ ሻ
Definition 3.2.1 ܶǣ ܸ ՜ ܹ where ܸ is a domain and ܹ is ൌ ሺ͵ߙܽଵ ൅ ߙܽଶ ǡ ߙܽଶ ǡ െߙܽଵ ሻ ൅ ሺ͵ܾଵ ൅ ܾଶ ǡ ܾଶ ǡ െܾଵ ሻ
a co-domain. ܶ is a linear transformation if for every ൌ ሺ͵ߙܽଵ ൅ ߙܽଶ ൅ ͵ܾଵ ൅ ܾଶ ǡ ߙܽଶ ൅ ܾଶ ǡ െߙܽଵ െ ܾଵ ሻ
‫ݒ‬ଵ ǡ ‫ݒ‬ଶ ‫ ܸ א‬and ߙ ‫ א‬Թ, we have the following:
ܶሺߙ‫ݒ‬ଵ ൅ ‫ݒ‬ଶ ሻ ൌ ߙܶሺ‫ݒ‬ଵ ሻ ൅ ܶሺ‫ݒ‬ଶ ሻ. Thus, ܶ is a linear transformation.
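A quick way to gain confidence in the conclusion of Example 3.2.2 is to test the defining condition T(αv1 + v2) = αT(v1) + T(v2) on randomly chosen vectors. The following NumPy sketch is an added illustration; the function T below encodes the formula from that example.

import numpy as np

def T(v):
    # T((a1, a2)) = (3*a1 + a2, a2, -a1), as in Example 3.2.2
    a1, a2 = v
    return np.array([3 * a1 + a2, a2, -a1])

rng = np.random.default_rng(0)
v1, v2 = rng.standard_normal(2), rng.standard_normal(2)
alpha = float(rng.standard_normal())

print(np.allclose(T(alpha * v1 + v2), alpha * T(v1) + T(v2)))   # True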
௡ ௠
Result 3.2.6 Given ܶǣ Թ ՜ Թ . Then,
ܶሺሺܽଵ ǡ ܽଶ ǡ ܽଷ ǡ ǥ ǡ ܽ௡ ሻሻ ൌ Each coordinate is a linear ͳ Ͳ
Since ‫ܫ‬ଶ ൌ ቂ ቃ, then the standard basis for Թଶ is
combination of the ܽ௜ Ԣ‫ݏ‬. Ͳ ͳ
ሼሺͳǡͲሻǡ ሺͲǡͳሻሽ.
Example 3.2.3 Givenܶǣ Թଷ ՜ Թସ where Թଷ is a domain
Example 3.2.7 Find the standard basis for Թଷ .
and Թସ is a co-domain.
Solution: The standard basis for Թଷ is the rows of ‫ܫ‬ଷ .
a. If ܶ൫ሺ‫ݔ‬ଵ ǡ ‫ݔ‬ଶ ǡ ‫ݔ‬ଷ ሻ൯ ൌ ሺെ͵‫ݔ‬ଷ ൅ ͸‫ݔ‬ଵ ǡ െͳͲ‫ݔ‬ଶ ǡ ͳ͵ǡ െ‫ݔ‬ଷ ሻ, is
ܶ a linear transformation? ͳ Ͳ Ͳ
b. If ܶ൫ሺ‫ݔ‬ଵ ǡ ‫ݔ‬ଶ ǡ ‫ݔ‬ଷ ሻ൯ ൌ ሺെ͵‫ݔ‬ଷ ൅ ͸‫ݔ‬ଵ ǡ െͳͲ‫ݔ‬ଶ ǡ Ͳǡ െ‫ݔ‬ଷ ሻ, is ܶ Since ‫ܫ‬ଷ ൌ ൥Ͳ ͳ Ͳ൩, then the standard basis for Թଷ is
Ͳ Ͳ ͳ
a linear transformation? ሼሺͳǡͲǡͲሻǡ ሺͲǡͳǡͲሻǡ ሺͲǡͲǡͳሻሽ.
Solution: Part a: Since 13 is not a linear combination of
Example 3.2.8 Find the standard basis for ܲଷ .
‫ݔ‬ଵ ǡ ‫ݔ‬ଶ ƒ†‫ݔ‬ଷ . Thus, ܶ is not a linear transformation.
Solution: The standard basis for ܲଷ is ሼͳǡ ‫ݔ‬ǡ ‫ ݔ‬ଶ ሽ.
Part b: Since 0 is a linear combination of ‫ݔ‬ଵ ǡ ‫ݔ‬ଶ ƒ†‫ݔ‬ଷ .
Thus, ܶ is a linear transformation. Example 3.2.9 Find the standard basis for ܲସ .

Example 3.2.4 Givenܶǣ Թଶ ՜ Թଷ where Թଶ is a domain Solution: The standard basis for ܲସ is ሼͳǡ ‫ݔ‬ǡ ‫ ݔ‬ଶ ǡ ‫ ݔ‬ଷ ሽ.
and Թଷ is a co-domain. If ܶ൫ሺܽଵ ǡ ܽଶ ሻ൯ ൌ ሺܽଵ ଶ ൅ ܽଶ ǡ െܽଶ ሻ,
Example 3.2.10 Find the standard basis for Թଶൈଶ ൌ
is ܶ a linear transformation?
‫ܯ‬ଶൈଶ ሺԹሻ.
Solution: Since ܽଵ ଶ ൅ ܽଶ is not a linear combination of
Solution: The standard basis for Թଶൈଶ ൌ ‫ܯ‬ଶൈଶ ሺԹሻ is
ܽଵ ƒ†ܽଶ . Hence, ܶ is not a linear transformation. ͳ Ͳ Ͳ ͳ Ͳ Ͳ Ͳ Ͳ
ሼቂ ቃǡቂ ቃǡቂ ቃǡቂ ቃሽ because Թଶൈଶ ൌ
Ͳ Ͳ Ͳ Ͳ ͳ Ͳ Ͳ ͳ
Example 3.2.5 Givenܶǣ Թ ՜ Թ. If ܶሺ‫ ݔ‬ሻ ൌ ͳͲ‫ݔ‬, is ܶ a
‫ܯ‬ଶൈଶ ሺԹሻ ൌ Թସ as a vector space where standard basis
linear transformation? ͳ Ͳ Ͳ Ͳ
Solution: Since it is a linear combination of ܽଵ such for Թଶൈଶ ൌ ‫ܯ‬ଶൈଶ ሺԹሻ is the rows of ‫ܫ‬ସ ൌ ቎Ͳ ͳ Ͳ Ͳ቏that
Ͳ Ͳ ͳ Ͳ
thatߙܽଵ ൌ ͳͲ‫ݔ‬. Hence, ܶ is a linear transformation. Ͳ Ͳ Ͳ ͳ
are represented by ʹ ൈ ʹ matrices.
Example 3.2.6 Find the standard basis for Թଶ .
Example 3.2.11 Let ܶǣ Թଶ ՜ Թଷ be a linear

Solution: The standard basis for Թ is the rows of ‫ܫ‬ଶ .
transformation such that
97 ܶሺʹǡͲሻ ൌ ሺͲǡͳǡͶሻ

ܶሺെͳǡͳሻ ൌ ሺʹǡͳǡͷሻ

Find ܶሺ͵ǡͷሻ.
3.3 Kernel and Range
In this section, we discuss how to find the standard
Solution: The given points are ሺʹǡͲሻ and ሺെͳǡͳሻ. These
two points are independent because of the following: matrix representation, and we give examples of how to
find kernel and range.
ʹ Ͳ ͳ ʹ Ͳ
ቂ ቃ ܴଵ ൅ ܴଶ ՜ ܴଶ ቂ ቃ
െͳ ͳ ʹ Ͳ ͳ Definition 3.3.1 Given ܶǣ Թ௡ ՜ Թ௠ where Թ௡ is a
Every point in Թଶ is a linear combination of ሺʹǡͲሻ and domain and Թ௠ is a co-domain. Then, Standard Matrix
ሺെͳǡͳሻ. There exists unique numbers ܿଵ and ܿଶ such Representation is a ݉ ൈ ݊matrix. This means that it is
that ሺ͵ǡͷሻ ൌ ܿଵ ሺʹǡͲሻ ൅ ܿଶ ሺെͳǡͳሻ.
݀݅݉ሺ‫ ݋ܥ‬െ ‫ ݊݅ܽ݉݋ܦ‬ሻ ൈ ݀݅݉ሺ‫݊݅ܽ݉݋ܦ‬ሻ matrix.
͵ ൌ ʹܿଵ െ ܿଶ
ͷ ൌ ܿଶ Definition 3.3.2 Given ܶǣ Թ௡ ՜ Թ௠ where Թ௡ is a
Now, we substitute ܿଶ ൌ ͷ in ͵ ൌ ʹܿଵ െ ܿଶ , we obtain: domain and Թ௠ is a co-domain. Kernel is a set of all
͵ ൌ ʹܿଵ െ ͷ
points in the domain that have image which equals to
ܿଵ ൌ Ͷ
Hence, ሺ͵ǡͷሻ ൌ ͶሺʹǡͲሻ ൅ ͷሺെͳǡͳሻ. the origin point, and it is denoted by ‫ݎ݁ܭ‬ሺܶሻ. This
ܶሺ͵ǡͷሻ ൌ ܶሺͶሺʹǡͲሻ ൅ ͷሺെͳǡͳሻሻ means that ‫ݎ݁ܭ‬ሺܶሻ ൌ ܰ‫݂ܶ݋݁ܿܽ݌݈݈ܵݑ‬.
ܶሺ͵ǡͷሻ ൌ ͶܶሺʹǡͲሻ ൅ ͷܶሺെͳǡͳሻ
Definition 3.3.3 Range is the column space of standard
ܶሺ͵ǡͷሻ ൌ ͶሺͲǡͳǡͶሻ ൅ ͷሺʹǡͳǡͷሻ ൌ ሺͳͲǡͻǡͶͳሻ
matrix representation, and it is denoted by ܴܽ݊݃݁ሺܶሻ.
Thus, ܶሺ͵ǡͷሻ ൌ ሺͳͲǡͻǡͶͳሻ.
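The computation in Example 3.2.11 can also be organized as two small matrix steps: solve for the coefficients c1, c2 with (3,5) = c1(2,0) + c2(-1,1), and then combine the known images. The NumPy lines below are an added illustration of exactly that.

import numpy as np

P = np.array([[2.0, -1.0],          # columns: the points (2,0) and (-1,1)
              [0.0,  1.0]])
c = np.linalg.solve(P, np.array([3.0, 5.0]))
print(c)                            # [4. 5.]

images = np.array([[0.0, 1.0, 4.0],     # T(2,0)
                   [2.0, 1.0, 5.0]])    # T(-1,1)
print(c @ images)                   # [10.  9. 41.] = T(3,5), by linearity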
Example 3.3.1 Given ܶǣ Թଷ ՜ Թସ where Թଷ is a domain
Example 3.2.12 Let ܶǣ Թ ՜ Թ be a linear
and Թସ is a co-domain.
transformation such that ܶሺͳሻ ൌ ͵. Find ܶሺͷሻ.
ܶ൫ሺ‫ݔ‬ଵ ǡ ‫ݔ‬ଶ ǡ ‫ݔ‬ଷ ሻ൯ ൌ ሺെͷ‫ݔ‬ଵ ǡ ʹ‫ݔ‬ଶ ൅ ‫ݔ‬ଷ ǡ െ‫ݔ‬ଵ ǡ Ͳሻ
Solution: Since it is a linear transformation, then a. Find the Standard Matrix Representation.
ܶሺͷሻ ൌ ܶሺͷ ή ͳሻ ൌ ͷܶሺͳሻ ൌ ͷሺ͵ሻ ൌ ͳͷ. If it is not a linear b. Find ܶሺሺ͵ǡʹǡͳሻሻ.
transformation, then it is impossible to find ܶሺͷሻ. c. Find ‫ݎ݁ܭ‬ሺܶሻ.
d. Find ܴܽ݊݃݁ሺܶሻ.

Solution: Part a: According to definition 3.3.1, the
99 Standard Matrix Representation, let’s call it ‫ܯ‬, here is
Ͷ ൈ ͵. We know from section 3.2 that the standard
basis for domain (here is Թଷ ) is ሼሺͳǡͲǡͲሻǡ ሺͲǡͳǡͲሻǡ ሺͲǡͲǡͳሻሽ. െͷ Ͳ Ͳ െͳͷ
We assume the following: ܶ൫ሺ͵ǡʹǡͳሻ൯ ൌ ͵ ή ቎ Ͳ ቏ ൅ ʹ ή ቎ʹ቏ ൅ ͳ ή ቎ͳ቏ ൌ ቎ ͷ ቏
െͳ Ͳ Ͳ െ͵
‫ݒ‬ଵ ൌ ሺͳǡͲǡͲሻ Ͳ Ͳ Ͳ Ͳ
‫ݒ‬ଶ ൌ ሺͲǡͳǡͲሻ െͳͷ

‫ݒ‬ଷ ൌ ሺͲǡͲǡͳሻ ቎ ͷ ቏ is equivalent to ሺെͳͷǡͷǡ െ͵ǡͲሻ. This lives in the


െ͵
Now, we substitute each point of the standard basis for Ͳ
co-domain. Thus, ܶ൫ሺ͵ǡʹǡͳሻ൯ ൌ ሺെͳͷǡͷǡ െ͵ǡͲሻ.
domain in ܶ൫ሺ‫ݔ‬ଵ ǡ ‫ݔ‬ଶ ǡ ‫ݔ‬ଷ ሻ൯ ൌ ሺെͷ‫ݔ‬ଵ ǡ ʹ‫ݔ‬ଶ ൅ ‫ݔ‬ଷ ǡ െ‫ݔ‬ଵ ǡ Ͳሻ as
follows: Part c: According to definition 3.3.2, ‫ݎ݁ܭ‬ሺܶሻ is a set of
ܶ൫ሺͳǡͲǡͲሻ൯ ൌ ሺെͷǡͲǡ െͳǡͲሻ all points in the domain that have imageൌ ሺͲǡͲǡͲǡͲሻ.
ܶ൫ሺͲǡͳǡͲሻ൯ ൌ ሺͲǡʹǡͲǡͲሻ Hence, ܶ൫ሺ‫ݔ‬ଵ ǡ ‫ݔ‬ଶ ǡ ‫ݔ‬ଷ ሻ൯ ൌ ሺͲǡͲǡͲǡͲሻ. This means the
ܶ൫ሺͲǡͲǡͳሻ൯ ൌ ሺͲǡͳǡͲǡͲሻ ‫ݔ‬ଵ Ͳ
following: ‫ ܯ‬൥‫ݔ‬ଶ ൩ ൌ ቎Ͳ቏
‫ݔ‬ଵ Ͳ
‫ݔ‬ଷ
‫ݔ‬
Our goal is to find ‫ ܯ‬so that ܶ൫ሺ‫ݔ‬ଵ ǡ ‫ݔ‬ଶ ǡ ‫ݔ‬ଷ ሻ൯ ൌ ‫ ܯ‬൥ ଶ ൩. Ͳ
‫ݔ‬ଷ െͷ Ͳ Ͳ ‫ݔ‬ Ͳ

቎ Ͳ ʹ ͳ ቏ ൥‫ݔ‬ଶ ൩ ൌ ቎Ͳ቏
െͷ Ͳ Ͳ െͳ Ͳ Ͳ ‫ݔ‬ Ͳ

‫ ܯ‬ൌ ቎ Ͳ ʹ ͳ቏ This is the Standard Matrix Ͳ Ͳ Ͳ Ͳ
െͳ Ͳ Ͳ
Ͳ Ͳ Ͳ Since ‫ݎ݁ܭ‬ሺܶሻ ൌ ܰ‫݈݈ݑ‬ሺ‫ܯ‬ሻ, then we need to find ܰሺ‫ܯ‬ሻ as
Representation. The first, second and third columns follows:
represent ܶሺ‫ݒ‬ଵ ሻǡ ܶሺ‫ݒ‬ଶ ሻƒ†ܶሺ‫ݒ‬ଷ ሻ. െͷ Ͳ Ͳ Ͳ ͳ Ͳ ͲͲ
ͳ
‫ݔ‬ଵ ቌ Ͳ ʹ ͳቮͲቍ െ ܴଵ ቌ Ͳ ʹ ͳቮͲቍ ܴଵ ൅ ܴଷ ՜ ܴଷ
െͳ Ͳ Ͳ Ͳ ͷ െͳ Ͳ Ͳ Ͳ
Part b: Since ൫ሺ‫ݔ‬ଵ ǡ ‫ݔ‬ଶ ǡ ‫ݔ‬ଷ ሻ൯ ൌ ‫ ܯ‬൥‫ݔ‬ଶ ൩ , then Ͳ Ͳ Ͳ Ͳ Ͳ Ͳ ͲͲ
‫ݔ‬ଷ
െͷ Ͳ Ͳ ͵ ͳ Ͳ ͲͲ ͳ Ͳ Ͳ Ͳ
ቌͲ ʹ ͳቮͲቍ ܴଶ ቌͲ ͳ ͲǤͷቮͲቍ This is a Completely-

ܶ൫ሺ͵ǡʹǡͳሻ൯ ൌ ቎ Ͳ ʹ ͳ቏ ൥ʹ൩ Ͳ Ͳ ͲͲ ଶ Ͳ Ͳ Ͳ Ͳ
െͳ Ͳ Ͳ
ͳ Ͳ Ͳ ͲͲ Ͳ Ͳ Ͳ Ͳ
Ͳ Ͳ Ͳ
Reduced Matrix. Now, we need to read the above
matrix as follows:
‫ݔ‬ଵ ൌ Ͳ
ͳ
‫ݔ‬ଶ ൅ ‫ݔ‬ଷ ൌ Ͳ
ʹ Thus, ܴܽ݊݃݁ሺܶሻ ൌ ܵ‫݊ܽ݌‬ሼሺെͷǡͲǡ െͳǡͲሻǡ ሺͲǡʹǡͲǡͲሻሽ.
ͲൌͲ
ͲൌͲ Result 3.3.1 Given ܶǣ Թ௡ ՜ Թ௠ where Թ௡ is a domain
To write the solution, we need to assume that and Թ௠ is a co-domain. Let ‫ ܯ‬be a standard matrix
‫ݔ‬ଷ ‫ א‬Թሺ‫݈ܾ݁ܽ݅ݎܸܽ݁݁ݎܨ‬ሻ. representation. Then,

Hence,‫ݔ‬ଵ ൌ Ͳ and ‫ݔ‬ଶ ൌ െ ‫ݔ‬ଷ . ܴܽ݊݃݁ሺܶሻ ൌ ܵ‫݊ܽ݌‬ሼ‫ܯ݂݋ݏ݊݉ݑ݈݋ܥݐ݊݁݀݊݁݌݁݀݊ܫ‬ሽ.


ܰሺ‫ܯ‬ሻ ൌ ሼሺͲǡ െ ‫ݔ‬ଷ ǡ ‫ݔ‬ଷ ሻȁ‫ݔ‬ଷ ‫ א‬Թሽ. Result 3.3.2 Given ܶǣ Թ௡ ՜ Թ௠ where Թ௡ is a domain

By letting ‫ݔ‬ଷ ൌ ͳ, we obtain: and Թ௠ is a co-domain. Let ‫ ܯ‬be a standard matrix
ܰ‫ݕݐ݈݈݅ݑ‬ሺ‫ܯ‬ሻ ൌ ܰ‫ ݏ݈ܾ݁ܽ݅ݎܸܽ݁݁ݎܨ݂݋ݎܾ݁݉ݑ‬ൌ ͳ, and representation. Then, ݀݅݉൫ܴܽ݊݃݁ሺܶሻ൯ ൌ ܴܽ݊݇ሺ‫ܯ‬ሻ =

‫ ݏ݅ݏܽܤ‬ൌ ሼሺͲǡ െ ǡ ͳሻሽ Number of Independent Columns.


Thus, ‫ݎ݁ܭ‬ሺܶሻ ൌ ܰሺ‫ ܯ‬ሻ ൌ ܵ‫݊ܽ݌‬ሼሺͲǡ െ ǡ ͳሻሽ. Result 3.3.3 Given ܶǣ Թ௡ ՜ Թ௠ where Թ௡ is a domain
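For Example 3.3.1, the kernel and range conclusions can be double-checked from the standard matrix representation M. The sketch below (an added illustration) verifies that (0, -1/2, 1) is sent to the zero vector and that Rank(M) + dim(Ker(T)) equals dim(domain) = 3.

import numpy as np

M = np.array([[-5, 0, 0],
              [ 0, 2, 1],
              [-1, 0, 0],
              [ 0, 0, 0]], dtype=float)

print(M @ np.array([0.0, -0.5, 1.0]))    # [0. 0. 0. 0.], so (0, -1/2, 1) is in Ker(T)
print(np.linalg.matrix_rank(M))          # 2 = dim(Range(T)); then dim(Ker(T)) = 3 - 2 = 1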

and Թ௠ is a co-domain. Let ‫ ܯ‬be a standard matrix
Part d: According to definition 3.3.3, ܴܽ݊݃݁ሺܶሻ is the representation. Then, ݀݅݉൫‫ݎ݁ܭ‬ሺܶሻ൯ ൌ ܰ‫ݕݐ݈݈݅ݑ‬ሺ‫ܯ‬ሻ.
column space of ‫ܯ‬. Now, we need to locate the columns
in the Completely-Reduced Matrix in part c that Result 3.3.4 Given ܶǣ Թ௡ ՜ Թ௠ where Թ௡ is a domain
contain the leaders, and then we should locate them to and Թ௠ is a co-domain. Let ‫ ܯ‬be a standard matrix
the original matrix as follows: representation. Then, ݀݅݉൫ܴܽ݊݃݁ሺܶሻ൯ ൅ ݀݅݉ሺ‫ݎ݁ܭ‬ሺܶሻሻ ൌ
݀݅݉ሺ‫݊݅ܽ݉݋ܦ‬ሻ.

ͳ Ͳ Ͳ Ͳ Example 3.3.2 Given ܶǣ ܲଶ ՜ Թ. ܶሺ݂ ሺ‫ ݔ‬ሻሻ ൌ ‫׬‬଴ ݂ሺ‫ ݔ‬ሻ݀‫ ݔ‬is
ቌͲ ͳ ͲǤͷቮͲቍ Completely-Reduced Matrix a linear transformation.
Ͳ Ͳ Ͳ Ͳ a. Find ܶሺʹ‫ ݔ‬െ ͳሻ.
Ͳ Ͳ Ͳ Ͳ
b. Find ‫ݎ݁ܭ‬ሺܶሻ.
c. Find ܴܽ݊݃݁ሺܶሻ.

െͷ Ͳ ͲͲ
ͳቮͲቍOrignial Matrix Solution:
ቌͲ ʹ
‫ݔ‬ൌͳ
െͳ Ͳ ͲͲ ଵ
Part a: ܶሺʹ‫ ݔ‬െ ͳሻ ൌ ‫׬‬଴ ሺʹ‫ ݔ‬െ ͳሻ݀‫ ݔ‬ൌ ‫ ݔ‬ଶ െ ‫ ݔ‬ቚ ൌ ͲǤ
Ͳ Ͳ ͲͲ ‫ݔ‬ൌͲ
Part b: To find ‫ݎ݁ܭ‬ሺܶሻ, we set equation of ܶ ൌ Ͳ, and
݂ሺ‫ ݔ‬ሻ ൌ ܽ଴ ൅ ܽଵ ‫ܲ א ݔ‬ଶ .
ଵ ௔భ ‫ݔ‬ൌͳ
Thus, ܶሺ݂ሺ‫ݔ‬ሻሻ ൌ ‫׬‬଴ ሺܽ଴ ൅ ܽଵ ‫ ݔ‬ሻ݀‫ ݔ‬ൌ ܽ଴ ‫ ݔ‬൅ ‫ݔ‬ଶ ቚ ൌͲ
ଶ ‫ݔ‬ൌͲ 2. Let ܶǣ Թଶ ՜ Թ be a linear transformation such that
ܽଵ
ܽ଴ ൅ െ Ͳ ൌ Ͳ
ʹ ܶሺͳǡͲሻ ൌ Ͷ, ܶሺʹǡ െʹሻ ൌ ʹ. Find the standard matrix

ܽ଴ ൌ െ ଶభ representation of ܶ.
௔భ
Hence, ‫ݎ݁ܭ‬ሺܶሻ ൌ ሼെ ൅ ܽଵ ‫ݔ‬ȁܽଵ ‫ א‬Թሽ. We also know that 3. Given ܶǣ ܲଷ ՜ Թ such that ܶሺܽ ൅ ܾ‫ ݔ‬൅ ܿ‫ ݔ‬ଶ ሻ ൌ


݀݅݉ሺ‫ݎ݁ܭ‬ሺܶሻ ൌ ͳ because there is one free variable. In ‫׬‬଴ ሺܽ ൅ ܾ‫ ݔ‬൅ ܿ‫ ݔ‬ଶ ሻ݀‫ݔ‬. Find ‫ݎ݁ܭ‬ሺܶሻ.
addition, we can also find basis by letting ܽଵ be any
4. Given ܶǣ Թସ ՜ Թଷ is a linear transformation such
real number not equal to zero, say ܽଵ ൌ ͳ, as follows:
ͳ that ܶሺ‫ݔ‬ǡ ‫ݕ‬ǡ ‫ݖ‬ǡ ‫ݓ‬ሻ ൌ ሺ‫ ݔ‬൅ ‫ ݕ‬൅ ‫ ݖ‬െ ʹ‫ݓ‬ǡ െʹ‫ݓ‬ǡ ‫ݓ‬ሻ.
‫ ݏ݅ݏܽܤ‬ൌ ሼെ ൅ ‫ݔ‬ሽ
ʹ a. Find the standard matrix representation of ܶ.

Thus, ‫ܶ ݎ݁ܭ‬ሻ ൌ ܵ‫݊ܽ݌‬ሼെ ൅ ‫ݔ‬ሽ.

ଶ b. Find ݀݅݉ሺ‫ݎ݁ܭ‬ሺܶሻሻ.
Part c: It is very easy to find range here. ܴܽ݊݃݁ሺܶሻ ൌ Թ c. Find ܴܽ݊݃݁ሺܶሻ.
because we linearly transform from a second degree
polynomial to a real number. For example, if we 5. Given ܶǣ ܲସ ՜  Թଶൈଶ such that
linearly transform from a third degree polynomial to a ݃ሺെͳሻ ݃ሺͲሻ
second degree polynomial, then the range will be ܲଶ . ܶሺ݃ሺ‫ݔ‬ሻሻ ൌ ൤ ൨ is a linear transformation.
݃ሺെͳሻ ݃ሺͲሻ
a. Find the standard matrix representation of ܶ.
3.4 Exercises b. Find ‫ݎ݁ܭ‬ሺܶሻ and write ‫ݎ݁ܭ‬ሺܶሻ as a ܵ‫݊ܽ݌‬.
1. Given ܶǣ Թଷ ՜  Թଶൈଶ such that c. Find a basis for ܴܽ݊݃݁ሺܶሻ and write ܴܽ݊݃݁ሺܶሻ as
‫ݔ‬ଵ ‫ݔ‬ଵ aܵ‫݊ܽ݌‬.
ܶ൫ሺ‫ݔ‬ଵ ǡ ‫ݔ‬ଶ ǡ ‫ݔ‬ଷ ሻ൯ ൌ ቂ‫ ݔ ݔ‬ቃ is a linear transformation.
ଷ ଶ

a. Find the standard matrix representation of ܶ. 6. Given ܶǣ ܲଷ ՜ Թ is a linear transformation such that
b. Find ‫ݎ݁ܭ‬ሺܶሻ. ܶሺͳሻ ൌ ͸ǡ ܶሺ‫ ݔ‬ଶ ൅ ‫ ݔ‬ሻ ൌ െͷ, and ܶሺ‫ ݔ‬ଶ ൅ ʹ‫ ݔ‬൅ ͳሻ ൌ Ͷ.
c. Find a basis for ܴܽ݊݃݁ሺܶሻ and write ܴܽ݊݃݁ሺܶሻ as ƒǤ Find ܶሺ‫ ݔ‬ሻǡ ܶሺ‫ ݔ‬ଶ ሻƒ†ܶሺͷ‫ ݔ‬ଶ ൅ ͵‫ ݔ‬൅ ͺሻ.
aܵ‫݊ܽ݌‬. b. Find the standard matrix representation of ܶ.
c. Find ‫ݎ݁ܭ‬ሺܶሻ and write ‫ݎ݁ܭ‬ሺܶሻ as a ܵ‫݊ܽ݌‬.

Chapter 4

Solution: If such ߙ and such ܳ exist, we say ߙ is an


Characteristic Equation of eigenvalue of ‫ܣ‬, and we say ܳ is an eigenvector of ‫ܣ‬
corresponds to the eigenvalue ߙ.
ܽଵ ܽଵ
Matrix ‫ܽ ۍ‬ଶ ‫ې‬ ‫ܽ ۍ‬ଶ ‫ې‬
‫ܽ ێ ܣ‬ଷ ‫ ۑ‬ൌ ߙ ‫ܽ ێ‬ଷ ‫ۑ‬
‫ۑڭێ‬ ‫ۑڭێ‬
In this chapter, we discuss how to find eigenvalues and ‫ܽ ۏ‬௡ ‫ے‬ ‫ܽۏ‬௡ ‫ے‬
eigenvectors of a matrix. Then, we introduce the ܽଵ ܽଵ Ͳ
‫ܽ ۍ‬ଶ ‫ې‬ ‫ܽ ۍ‬ଶ ‫ې Ͳ ۍ ې‬
diagonalizable matrix, and we explain how to ‫ܽ ێ ܣ‬ଷ ‫ ۑ‬െ ߙ ‫ܽ ێ‬ଷ ‫ ۑ‬ൌ ‫ۑۑͲێێ‬
‫ۑڭێ‬ ‫ۑڭێ ۑ ڭ ێ‬
determine whether a matrix is diagonalizable or not. ‫ܽ ۏ‬௡ ‫ے‬ ‫ܽ ۏ‬௡ ‫ے Ͳ ۏ ے‬
At the end of this chapter we discuss how to find ܽଵ Ͳ
‫ܽ ۍ‬ଶ ‫ېͲۍ ې‬
‫ۑ ێ‬
diagonal matrix and invertible matrix. ሾ‫ ܣ‬െ ߙ‫ܫ‬௡ ሿ ‫ܽ ێ‬ଷ ‫ ۑ‬ൌ ‫ۑͲێ‬
‫ۑڭێ ۑ ڭ ێ‬
‫ܽۏ‬௡ ‫ےͲۏ ے‬
4.1 Eigenvalues and We conclude that such ߙ and ሺܳ ് ‘”‹‰‹ሻexist if and
only if ݀݁‫ݐ‬ሺ‫ ܣ‬െ ߙ‫ܫ‬௡ ሻ ൌ Ͳ.
Eigenvectors
Note: ݀݁‫ݐ‬ሺ‫ ܣ‬െ ߙ‫ܫ‬௡ ሻ is called characteristic polynomial of
In this section, we give an example explaining the ‫ܣ‬.
steps for finding eigenvalues and eigenvectors. Example 4.1.2 Given the following ͵ ൈ ͵ matrix:
Example 4.1.1 Given ݊ ൈ ݊ matrix, say ‫ܣ‬, Let ߙ ‫ א‬Թ. ʹ Ͳ ͳ
Can we find a point ܳ in Թ௡ , ‫ ܣ‬ൌ ൥Ͳ ͳ െʹ൩
Ͳ Ͳ െͳ
ܳ ൌ ሺܽଵ ǡ ܽଶ ǡ ǥ ǡ ܽ௡ ሻ, such that ܳ ് ሺͲǡͲǡͲǡͲǡ ǥ ǡͲሻ, and
ܽଵ ܽଵ a. Find all eigenvalues of ‫ܣ‬.
‫ܽ ۍ‬ଶ ‫ې‬ ‫ܽ ۍ‬ଶ ‫ې‬ b. For each eigenvalue, find the corresponding
‫ܣ‬௡ൈ௡ ܳ ൌ ‫ܽ ێ ܣ‬ଷ ‫ ۑ‬ൌ ߙ ‫ܽ ێ‬ଷ ‫?ۑ‬ eigenspace.
‫ۑڭێ‬ ‫ۑڭێ‬
‫ܽ ۏ‬௡ ‫ے‬ ‫ܽۏ‬௡ ‫ے‬ Solution: Part a: To find all eigenvalues, we set
characteristic polynomial of ‫ ܣ‬ൌ Ͳ. This means that
݀݁‫ݐ‬ሺ‫ ܣ‬െ ߙ‫ܫ‬௡ ሻ ൌ Ͳ. Then, we do the following:
݀݁‫ݐ‬ሺ‫ ܣ‬െ ߙ‫ܫ‬௡ ሻ ൌ Ͳ
ʹ Ͳ ͳ ͳ Ͳ Ͳ Thus, we form the augmented-matrix for the
݀݁‫ ݐ‬൭൥Ͳ ͳ െʹ൩ െ ߙ ൥Ͳ ͳ Ͳ൩൱ ൌ Ͳ homogeneous system as follows:
Ͳ Ͳ െͳ Ͳ Ͳ ͳ Ͳ Ͳ ͳ Ͳ
ʹ Ͳ ͳ ߙ Ͳ Ͳ
൭Ͳ െͳ െʹอͲ൱
݀݁‫ ݐ‬൭൥Ͳ ͳ െʹ൩ െ ൥ Ͳ ߙ Ͳ ൩൱ ൌ Ͳ
Ͳ Ͳ െ͵ Ͳ
Ͳ Ͳ െͳ Ͳ Ͳ ߙ ܽଷ ൌ Ͳ
ʹെߙ Ͳ ͳ
݀݁‫ ݐ‬൭൥ Ͳ ͳെߙ െʹ ൩൱ ൌ Ͳ It is very easy to solve it: ܽଶ ൌ െʹܽଷ ൌ Ͳ
Ͳ Ͳ െͳ െ ߙ ܽଷ ൌ Ͳ
ʹെߙ Ͳ ͳ
݀݁‫ ݐ‬൭൥ Ͳ ͳെߙ െʹ ൩൱ ൌ ሺʹ െ ߙሻ ή ሺͳ െ ߙሻ ή ሺെͳ െ ߙሻ ൌ Ͳ ܽଵ ‫ א‬Թ (Free-Variable), and ܽଶ ൌ ܽଷ ൌ Ͳ.
Ͳ Ͳ െͳ െ ߙ ‫ܧ‬ଶ ൌ ሼሺܽଵ ǡ ͲǡͲሻȁܽଵ ‫ א‬Թሽ. Now, we can select any value for
ߙൌʹ ܽଵ ് Ͳ. Let’s select ܽଵ ൌ ͳ.
Thus, the eigenvalues of ‫ ܣ‬are: ߙ ൌ ͳ We can also find ݀݅݉ሺ‫ܧ‬ଶ ሻ ൌ ͳ.
ߙ ൌ െͳ Hence, ‫ܧ‬ଶ ൌ ܵ‫݊ܽ݌‬ሼሺͳǡͲǡͲሻሽ. Similarly, we can find the
Part b: Since the first eigenvalue is 2, then the eigenspaces that correspond to the other eigenvalues:
eigenspace of ‫ ܣ‬that corresponds to the eigenvalue 2 ߙ ൌ ͳ and ߙ ൌ െͳ. To find eigenspace that corresponds
equals to the set of all points in Թଷ such that to ߙ ൌ ͳ, say ‫ܧ‬ଵ , we need to find ‫ܧ‬ଵ ൌ ܰሺ‫ ܣ‬െ ͳ‫ܫ‬ଷ ሻ. Note:
ሺ‫ܣݐ݊݅݋݌‬ሻ ൌ ߙ ή ‫ ݐ݊݅݋݌‬ൌ set of all eigenvectors of ‫ ܣ‬that ܰ here representsܰ‫݈݈ݑ‬. Now, we do the following:
correspond to ߙ ൌ ሺʹ ൅ ‫ݐ݊݅݋݌݊݅݃݅ݎ݋‬ሻ. To find eigenspace ʹ Ͳ ͳ ͳ Ͳ Ͳ
that corresponds to ߙ ൌ ʹ, say ‫ܧ‬ଶ , we need to find ‫ܧ‬ଵ ൌ ܰሺ‫ ܣ‬െ ͳ‫ܫ‬ଷ ሻ ൌ ܰ ൭൥Ͳ ͳ െʹ൩ െ ͳ ൥Ͳ ͳ Ͳ൩൱
‫ܧ‬ଶ ൌ ܰሺ‫ ܣ‬െ ʹ‫ܫ‬ଷ ሻ. Note: ܰ here representsܰ‫݈݈ݑ‬. Now, we Ͳ Ͳ െͳ Ͳ Ͳ ͳ
do the following: ʹെͳ Ͳ ͳ
‫ܧ‬ଵ ൌ ܰሺ‫ ܣ‬െ ͳ‫ܫ‬ଷ ሻ ൌ ܰ ൭൥ Ͳ ͳെͳ െʹ ൩൱
ʹ Ͳ ͳ ͳ Ͳ Ͳ
‫ܧ‬ଶ ൌ ܰሺ‫ ܣ‬െ ʹ‫ܫ‬ଷ ሻ ൌ ܰ ൭൥Ͳ ͳ െʹ൩ െ ʹ ൥Ͳ ͳ Ͳ൩൱ Ͳ Ͳ െͳ െ ͳ
ͳ Ͳ ͳ
Ͳ Ͳ െͳ Ͳ Ͳ ͳ ‫ܧ‬ଵ ൌ ܰሺ‫ ܣ‬െ ͳ‫ܫ‬ଷ ሻ ൌ ܰ ൥Ͳ Ͳ െʹ൩
ʹെʹ Ͳ ͳ
‫ܧ‬ଶ ൌ ܰሺ‫ ܣ‬െ ʹ‫ܫ‬ଷ ሻ ൌ ܰ ൭൥ Ͳ Ͳ Ͳ െʹ
ͳെʹ െʹ ൩൱
Thus, as we did previously, we form the augmented-
Ͳ Ͳ െͳ െ ʹ
Ͳ Ͳ ͳ matrix for the homogeneous system as follows:
‫ܧ‬ଶ ൌ ܰሺ‫ ܣ‬െ ʹ‫ܫ‬ଷ ሻ ൌ ܰ ൥Ͳ െͳ െʹ൩ ͳ Ͳ ͳ Ͳ
Ͳ Ͳ െ͵ ൭Ͳ Ͳ െʹอͲ൱
Ͳ Ͳ െʹ Ͳ
ܽଵ ൌ െܽଷ
It is very easy to solve it: ܽ ൌ Ͳ
109 ଷ
ܽଶ ‫ א‬Թ (Free-Variable), and ܽଵ ൌ ܽଷ ൌ Ͳ.
‫ܧ‬ଵ ൌ ሼሺͲǡ ܽଶ ǡ Ͳሻȁܽଶ ‫ א‬Թሽ. Now, we can select any value for
ܽଶ ് Ͳ. Let’s select ܽଶ ൌ ͳ.
4.2 Diagonalizable Matrix
We can also find ݀݅݉ሺ‫ܧ‬ଵ ሻ ൌ ͳ.
In this section, we explain the concept of
Hence, ‫ܧ‬ଵ ൌ ܵ‫݊ܽ݌‬ሼሺͲǡͳǡͲሻሽ. Finally, to find eigenspace
that corresponds to ߙ ൌ െͳ, say ‫ିܧ‬ଵ , we need to find diagonalization, and we give some examples explaining
‫ିܧ‬ଵ ൌ ܰሺ‫ ܣ‬൅ ͳ‫ܫ‬ଷ ሻ. Note: ܰ here representsܰ‫݈݈ݑ‬. Now, it, and how to find the diagonal and invertible
we do the following: matrices.
ʹ Ͳ ͳ ͳ Ͳ Ͳ Definition 4.2.1 ‫ ܣ‬is diagonalizable if there exists an
‫ିܧ‬ଵ ൌ ܰሺ‫ ܣ‬൅ ͳ‫ܫ‬ଷ ሻ ൌ ܰ ൭൥Ͳ ͳ െʹ൩ ൅ ͳ ൥Ͳ ͳ Ͳ൩൱
Ͳ Ͳ െͳ Ͳ Ͳ ͳ invertible matrix ‫ܮ‬, and a diagonal matrix ‫ ܦ‬such that
ʹ൅ͳ Ͳ ͳ ‫ ܣ‬ൌ ‫ିܮܦܮ‬ଵ .
‫ିܧ‬ଵ ൌ ܰሺ‫ ܣ‬൅ ͳ‫ܫ‬ଷ ሻ ൌ ܰ ൭൥ Ͳ ͳ൅ͳ െʹ ൩൱
Ͳ Ͳ െͳ ൅ ͳ Result 4.2.1 ‫ ܣ‬is ݊ ൈ ݊diagonalizable matrix if and only
͵ Ͳ ͳ
‫ିܧ‬ଵ ൌ ܰሺ‫ ܣ‬൅ ͳ‫ܫ‬ଷ ሻ ൌ ܰ ൥Ͳ ʹ െʹ൩ if ݀݁‫ݐ‬ሺ‫ ܣ‬െ ߙ‫ܫ‬௡ ሻ is written as multiplication of linear
Ͳ Ͳ Ͳ equation, say ݀݁‫ݐ‬ሺ‫ ܣ‬െ ߙ‫ܫ‬௡ ሻ = some constants ሺܿଵ െ ߙ ሻ ή
Thus, as we did previously, we form the augmented-
ሺܿଶ െ ߙ ሻ ή ǥ ή ሺܿ௞ െ ߙሻ, and ݀݅݉൫‫ܧ‬௖೔ ൯ ൌ ݊௜ (Multiplicity of
matrix for the homogeneous system as follows:
͵ Ͳ ͳ Ͳ the Eigenvalues) for ݅ ൑ ݅ ൑ ݇.
൭Ͳ ʹ െʹอͲ൱
Ͳ Ͳ Ͳ Ͳ Example 4.2.1 Assume ‫ ܣ‬is ͷ ൈ ͷ matrix, and

ܽଵ ൌ െ ܽଷ ݀݁‫ݐ‬ሺ‫ ܣ‬െ ߙ‫ܫ‬ହ ሻ ൌ ሺ͵ െ ߙ ሻଶ ή ሺെʹ െ ߙ ሻ ή ሺͶ െ ߙ ሻ, and the

It is very easy to solve it: ܽଶ ൌ ܽଷ
ߙൌ͵
ͲൌͲ eigenvalues of ‫ ܣ‬are ߙ ൌ െʹ
ܽଷ ‫ א‬Թ (Free-Variable). ߙൌͶ

‫ିܧ‬ଵ ൌ ሼሺെ ܽଷ ǡ ܽଷ ǡ ܽଷ ሻȁܽଷ ‫ א‬Թሽ. Now, we can select any Given ݀݅݉ሺ‫ܧ‬ଷ ሻ ൌ ͳǡ ݀݅݉ሺ‫ିܧ‬ଶ ሻ ൌ ͳƒ† ݀݅݉ሺ‫ܧ‬ସ ሻ ൌ ʹ.

value for ܽଷ ് Ͳ. Let’s select ܽଷ ൌ ͳ. Is ‫ ܣ‬diagonalizable matrix?
We can also find ݀݅݉ሺ‫ିܧ‬ଵ ሻ ൌ ͳ.

Hence, ‫ିܧ‬ଵ ൌ ܵ‫݊ܽ݌‬ሼሺെ ǡ ͳǡͳሻሽ. Solution: It is not a diagonalizable matrix because

݀݅݉ሺ‫ܧ‬ଷ ሻ ൌ ͳ, and it must be equal to 2 instead of 1.

Example 4.2.2 Assume ‫ ܣ‬is Ͷ ൈ Ͷ matrix, and
repetition if there is a repetition, and all other
݀݁‫ݐ‬ሺ‫ ܣ‬െ ߙ‫ܫ‬ସ ሻ ൌ ሺʹ െ ߙ ሻଷ ή ሺ͵ െ ߙ ሻ, and given
elements are zeros. Hence, the diagonal matrix ‫ ܦ‬is as
݀݅݉ሺ‫ܧ‬ଶ ሻ ൌ ͵ƒ† ݀݅݉ሺ‫ܧ‬ଷ ሻ ൌ ͳ. follows:
ʹ Ͳ Ͳ
Is ‫ ܣ‬diagonalizable matrix? ‫ ܦ‬ൌ ൥Ͳ െͳ Ͳ൩
Ͳ Ͳ ͳ
Solution: According result 4.2.1, it is a diagonalizable
To find an invertible matrix ‫ܮ‬, we create a ͵ ൈ ͵ matrix
matrix.
like the one above but each eigenvalue above is
Example 4.2.3 Given the following ͵ ൈ ͵ matrix: represented by a column of the eigenspace that
ʹ Ͳ ͳ corresponds to that eigenvalue as follows:
‫ ܣ‬ൌ Ͳ ͳ െʹ൩. Use example 4.1.2 from section 4.1 to
൥ ͳ
Ͳ Ͳ െͳ ͳ െ Ͳ
‫ܮ‬ൌ൦ ͵ ൪
answer the following questions: Ͳ ͳ ͳ
Ͳ ͳ Ͳ
a. Is ‫ ܣ‬diagonalizable matrix? If yes, find a
diagonal matrix ‫ܦ‬, and invertible matrix ‫ ܮ‬such ʹ Ͳ ͳ
Thus,‫ ܣ‬ൌ ൥Ͳ ͳ െʹ൩ ൌ ‫ିܮܦܮ‬ଵ
that ‫ ܣ‬ൌ ‫ିܮܦܮ‬ଵ . Ͳ Ͳ െͳ
b. Find ‫ ଺ܣ‬such that ‫ ܣ‬ൌ ‫ିܮܦܮ‬ଵ .
ଵ ଵ ିଵ
c. Find ‫ܣ‬ଷ଴଴ such that ‫ ܣ‬ൌ ‫ିܮܦܮ‬ଵ . ͳ െ Ͳ ʹ Ͳ Ͳ ͳ െ Ͳ
ଷ ଷ
ൌ ቎Ͳ ͳ ͳ቏ ൥Ͳ െͳ Ͳ൩ ቎ Ͳ ͳ ͳ቏
Solution: Part a: From example 4.1.2, we found the
Ͳ ͳ Ͳ Ͳ Ͳ ͳ Ͳ ͳ Ͳ
following:
‫ܧ‬ଶ ൌ ܵ‫݊ܽ݌‬ሼሺͳǡͲǡͲሻሽ Part b: To find ‫ ଺ܣ‬such that ‫ ܣ‬ൌ ‫ିܮܦܮ‬ଵ , we do the
‫ܧ‬ଵ ൌ ܵ‫݊ܽ݌‬ሼሺͲǡͳǡͲሻሽ following steps:
ͳ
‫ିܧ‬ଵ ൌ ܵ‫݊ܽ݌‬ሼሺെ ǡ ͳǡͳሻሽ ‫ ܣ‬ൌ ‫ିܮܦܮ‬ଵ
͵
According to result 4.2.1, ‫ ܣ‬is a diagonalizable matrix. ‫ ଺ܣ‬ൌ ሺ‫ିܮܦܮ‬ଵ ሻ ή ሺ‫ିܮܦܮ‬ଵ ሻ ή ǥ ή ሺ‫ିܮܦܮ‬ଵ ሻ
Now, we need to find a diagonal matrix ‫ܦ‬, and ‫ ଺ܣ‬ൌ ሺ‫ܦܮ‬ଶ ‫ିܮ‬ଵ ሻ ή ǥ ή ሺ‫ିܮܦܮ‬ଵ ሻ
invertible matrix ‫ ܮ‬such that ‫ ܣ‬ൌ ‫ିܮܦܮ‬ଵ . To find a ‫ ଺ܣ‬ൌ ‫ିܮ ଺ܦܮ‬ଵ
diagonal matrix ‫ܦ‬, we create a ͵ ൈ ͵ matrix, and we
put eigenvalues on the main diagonal with


Thus,
݀݅݉ሺ‫ܧ‬ଷ ሻ ൌ ͵
ʹ Ͳ Ͳ ଺ ‫ܧ‬ଶ ൌ ܵ‫݊ܽ݌‬ሼሺͲǡͲǡͲǡͳǡͳሻǡ ሺͲǡͲǡͲǡͲǡͳͲሻሽ
‫ ଺ܣ‬ൌ ‫ିܮ ଺ܦܮ‬ଵ ൌ ‫ ܮ‬൥Ͳ െͳ Ͳ൩ ‫ିܮ‬ଵ ൌ
Ͳ Ͳ ͳ a. Find the characteristic polynomial of ‫ܣ‬.

ʹ Ͳ Ͳ ͸Ͷ Ͳ Ͳ
b. Find a diagonal matrix ‫ ܦ‬such that ‫ ܣ‬ൌ ‫ିܮܦܮ‬ଵ .
‫ ܮ‬൥ Ͳ ሺെͳሻ଺ Ͳ ൩ ‫ିܮ‬ଵ ൌ ‫ ܮ‬൥ Ͳ ͳ Ͳ൩ ‫ିܮ‬ଵ
Ͳ Ͳ ͳ c. Find an invertible matrix ‫ ܮ‬such that ‫ ܣ‬ൌ ‫ିܮܦܮ‬ଵ .
Ͳ Ͳ ͳ଺

Part c: To find ‫ܣ‬ଷ଴଴ such that ‫ ܣ‬ൌ ‫ିܮܦܮ‬ଵ , we do the 3. Assume that ܹ is ͵ ൈ ͵ matrix, and
following steps: ܹ െ ߙ‫ܫ‬ଷ ൌ ሺͳ െ ߙሻሺʹ െ ߙሻଶ . Given ܰሺܹ െ ‫ ܫ‬ሻ ൌ
ܵ‫݊ܽ݌‬ሼሺͳǡʹǡͲሻሽ, and ܰሺܹ െ ʹ‫ ܫ‬ሻ ൌ ܵ‫݊ܽ݌‬ሼሺʹǡͲǡ͵ሻሽ. Is ܹ
‫ ܣ‬ൌ ‫ିܮܦܮ‬ଵ diagonalizable matrix? Explain.
‫ܣ‬ଷ଴଴ ൌ ሺ‫ିܮܦܮ‬ଵ ሻ ή ሺ‫ିܮܦܮ‬ଵ ሻ ή ǥ ή ሺ‫ିܮܦܮ‬ଵ ሻ
‫ܣ‬ଷ଴଴ ൌ ሺ‫ ܦܮ‬ଶ ‫ିܮ‬ଵ ሻ ή ǥ ή ሺ‫ିܮܦܮ‬ଵ ሻ
‫ܣ‬ଷ଴଴ ൌ ‫ܦܮ‬ଷ଴଴ ‫ିܮ‬ଵ
Chapter 5
ʹ Ͳ Ͳ ଷ଴଴
Thus, ‫ܣ‬ଷ଴଴ ൌ ‫ܦܮ‬ଷ଴଴ ‫ିܮ‬ଵ ൌ ‫ ܮ‬൥Ͳ െͳ Ͳ൩ ‫ିܮ‬ଵ ൌ
Ͳ Ͳ ͳ
Matrix Dot Product
ʹଷ଴଴ Ͳ Ͳ ʹଷ଴଴ Ͳ Ͳ
‫ܮ‬൥ Ͳ ሺെͳሻ ଷ଴଴ ିଵ
Ͳ ൩‫ ܮ‬ൌ ‫ܮ‬൥ Ͳ ͳ Ͳ൩ ‫ିܮ‬ଵ In this chapter, we discuss the dot product only in Թ௡ .
Ͳ Ͳ ͳଷ଴଴ Ͳ Ͳ ͳ In addition, we give some results about the dot product
in Թ௡ Ǥ At the end of this chapter, we get introduced to a
4.3 Exercises concept called “Gram-Schmidt Orthonormalization”.
1. Given the following Ͷ ൈ Ͷ matrix:
ͳ Ͳ Ͳ Ͳ 5.1 The Dot Product in Թ௡
‫ ܥ‬ൌ ቎Ͳ ͳ ͳ ͳ ቏. Is ‫ ܥ‬diagonalizable matrix?
Ͳ Ͳ െͳ ͳ
Ͳ Ͳ Ͳ െͳ In this section, we first give an example of the dot
product in Թ௡ , and then we give three important
2. Assume that ‫ ܣ‬is ͷ ൈ ͷ diagonalizable matrix, and
given the following: results related to this concept.
‫ܧ‬ଷ ൌ ܵ‫݊ܽ݌‬ሼሺʹǡͳǡͲǡͲǡͳሻǡ ሺͲǡͳǡͲǡͳǡͳሻǡ ሺͲǡͲǡʹǡʹǡͲሻሽ Example 5.1.1 Assume that ‫ ܣ‬ൌ ሺʹǡͶǡͳǡ͵ሻ and
‫ ܤ‬ൌ ሺͲǡͳǡʹǡͷሻ where ‫ܣ‬ǡ ‫ א ܤ‬Թସ . Find ‫ ܣ‬ή ‫ܤ‬.
Solution: To find ‫ ܣ‬ή ‫ܤ‬, we need to do a simple vector
Result 5.1.4 Assume that ܹ ൌ ሺ‫ݔ‬ଵ ǡ ‫ݔ‬ଶ ǡ ‫ݔ‬ଷ ǡ ǥ ǡ ‫ݔ‬௡ ሻ, then
dot product as follows:
the squared-norm of ܹ is written as follows:
‫ ܣ‬ή ‫ ܤ‬ൌ ሺʹǡͶǡͳǡ͵ሻ ή ሺͲǡͳǡʹǡͷሻ ȁȁܹ ȁȁଶ ൌ ‫ݔ‬ଵ ଶ ൅ ‫ݔ‬ଶ ଶ ൅ ‫ݔ‬ଷ ଶ ൅ ‫ ڮ‬൅ ‫ݔ‬௡ ଶ ൌ ܹ ή ܹ.
‫ܣ‬ή‫ ܤ‬ൌ ʹήͲ൅Ͷήͳ൅ͳήʹ൅͵ήͷ
‫ ܣ‬ή ‫ ܤ‬ൌ Ͳ ൅ Ͷ ൅ ʹ ൅ ͳͷ 5.2 Gram-Schmidt
Thus, ‫ ܣ‬ή ‫ ܤ‬ൌ ʹͳ
Result 5.1.1 If ܹଵ ƒ†ܹଶ in Թ௡ and ܹଵ ് ሺͲǡͲǡͲǡ ǥ ǡͲሻ
Orthonormalization
and ܹଶ ് ሺͲǡͲǡͲǡ ǥ ǡͲሻ, and ܹଵ ή ܹଶ ൌ Ͳ, then ܹଵ and ܹଶ In this section, we give one example that explains the
are independent. concept of Gram-Schmidt Orthonormalization, and
Result 5.1.2 If ܹଵ ƒ†ܹଶ in Թ௡ are independent, then how it is related to what we have learned in chapters 2
may/maybe not ܹଵ ή ܹଶ ൌ Ͳ. (i.e. Assume that and 3.
ܹଵ ൌ ሺͳǡͳǡͳǡͳሻ and ܹଶ ൌ ሺͲǡͳǡͳǡͳሻ, then ܹଵ ή ܹଶ ൌ ͵) Example 5.2.1 Given the following:
Result 5.1.3 If ܹଵ ǡ ܹଶ ǡ ܹଷ ƒ†ܹସ in Թ௡ and none of ‫ ܣ‬ൌ ܵ‫݊ܽ݌‬ሼሺͳǡͲǡͳǡͳሻǡ ሺͲǡͳǡͲǡͳሻǡ ሺͲǡͳǡͳǡͳሻሽ. Find the
orthogonal basis for ‫ܣ‬.
them is ሺͲǡͲǡͲǡ ǥ ǡͲሻ, then we say that ܹଵ ǡ ܹଶ ǡ ܹଷ ƒ†ܹସ
Hint: The orthogonal basis means Gram-Schmidt
are orthogonal if they satisfy the following conditions: Orthonormalization.
ܹଵ ή ܹଶ ൌ Ͳ
Solution: To find the orthogonal basis for ‫ܣ‬, we need to
ܹଵ ή ܹଷ ൌ Ͳ do the following steps:
Step 1: Find a basis for ‫ܣ‬.
ܹଵ ή ܹସ ൌ Ͳ
ͳ Ͳ ͳ ͳ ͳ Ͳ ͳ ͳ
ܹଶ ή ܹଷ ൌ Ͳ ൥Ͳ ͳ Ͳ ͳ൩ െܴଶ ൅ ܴଷ ՜ ܴଷ ൥Ͳ ͳ Ͳ ͳ൩ This is the
Ͳ ͳ ͳ ͳ Ͳ Ͳ ͳ Ͳ
ܹଶ ή ܹସ ൌ Ͳ Semi-Reduced Matrix.
ܹଷ ή ܹସ ൌ Ͳ Since we do now have a zero-row, then ݀݅݉ሺ‫ܣ‬ሻ ൌ ͵.
To write a basis for ‫ܣ‬, it is recommended to choose
rows in the above Semi-Reduced Matrix. Thus, a basis
117 for ‫ ܣ‬is ሼሺͳǡͲǡͳǡͳሻǡ ሺͲǡͳǡͲǡͳሻǡ ሺͲǡͲǡͳǡͲሻሽ.

Step 2: Vector spaces assumptions. ௩ ήௐ


ܹଷ ൌ ‫ݒ‬ଷ െ ߙଶ ή ܹଶ െ ߙଵ ή ܹଵ where ߙଵ ൌ ቀ య ȁȁభమ ቁ and
ȁȁௐ భ
Since the basis for ‫ ܣ‬is ሼሺͳǡͲǡͳǡͳሻǡ ሺͲǡͳǡͲǡͳሻǡ ሺͲǡͲǡͳǡͲሻሽ, we
௩ ήௐ
య మ
assume the following: ߙଶ ൌ ቀȁȁௐ ȁȁమ
ቁ.

௩యήௐమ ௩ ήௐ
‫ݒ‬ଵ ൌ ሺͳǡͲǡͳǡͳሻ Thus, ܹଷ ൌ ‫ݒ‬ଷ െ ൬ మ
య భ
൰ ή ܹଶ െ ቀȁȁௐ ቁ ή ܹଵ
หȁௐమ ȁห ȁȁమభ
‫ݒ‬ଶ ൌ ሺͲǡͳǡͲǡͳሻ
ͳ ͳ ʹ
‫ݒ‬ଷ ൌ ሺͲǡͲǡͳǡͲሻ ‫ۇ‬ሺͲǡͲǡͳǡͲሻ ή ቀെ ͵ ǡ ͳǡ െ ͵ ǡ ͵ቁ‫ۊ‬ ͳ ͳ ʹ
ܹଷ ൌ ሺͲǡͲǡͳǡͲሻ െ ‫ۈ‬ ଶ ‫ ۋ‬ή ൬െ ͵ ǡ ͳǡ െ ͵ ǡ ͵൰
Step 3: Use Gram-Schmidt Orthonormalization ͳ ͳ ʹ
ቤቚቀെ ǡ ͳǡ െ ǡ ቁቚቤ
‫ۉ‬ ͵ ͵ ͵ ‫ی‬
method.
ሺͲǡͲǡͳǡͲሻ ή ሺͳǡͲǡͳǡͳሻ
Let’s assume that the orthogonal basis for ‫ ܣ‬is െ൭ ଶ ൱ ή ሺͳǡͲǡͳǡͳሻ
หȁሺͳǡͲǡͳǡͳሻȁห
‫୓ܤ‬୰୲୦୭୥୭୬ୟ୪ ൌ ሼܹଵ ǡ ܹଶ ǡ ܹଷ ሽ. Now, we need to find ସ ଵ ଵଵ ଻
Thus, ܹଷ ൌ ቀെ ǡെ ǡ ǡ െ ቁ.
ଵହ ହ ଵହ ଵହ
ܹଵ ǡ ܹଶ ƒ†ܹଷ . To find them, we need to do the
following: Hence, the orthogonal basis for ‫( ܣ‬Gram-Schmidt
Orthonormalization) is
ܹଵ ൌ ‫ݒ‬ଵ ൌ ሺͳǡͲǡͳǡͳሻ ଵ ଵ ଶ ସ ଵ ଵଵ ଻
ሼሺͳǡͲǡͳǡͳሻǡ ቀെ ǡ ͳǡ െ ǡ ቁ ǡ ቀെ ǡെ ǡ ǡ െ ቁሽ.
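The projections used in Example 5.2.1 can be carried out by a short routine, which makes it easy to re-check the hand computation of W2 and W3. The sketch below is an added illustration; it produces orthogonal (not normalized) vectors, so every pairwise dot product of the output should be zero.

import numpy as np

def gram_schmidt(vectors):
    # Subtract from each vector its projections onto the previously produced vectors.
    orthogonal = []
    for v in vectors:
        w = np.array(v, dtype=float)
        for u in orthogonal:
            w = w - (w @ u) / (u @ u) * u
        orthogonal.append(w)
    return orthogonal

basis = [(1, 0, 1, 1), (0, 1, 0, 1), (0, 0, 1, 0)]
W1, W2, W3 = gram_schmidt(basis)
print(W1, W2, W3)                       # W2 = (-1/3, 1, -1/3, 2/3)
print(W1 @ W2, W1 @ W3, W2 @ W3)        # all three dot products are (numerically) zero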
௩ ήௐ ଷ ଷ ଷ ଵହ ହ ଵହ ଵହ
ܹଶ ൌ ‫ݒ‬ଶ െ ߙଵ ή ሺ‫ܹݏݑ݋݅ݒ݁ݎ݌‬௜ ᇲ ௦ ሻ where ߙଵ ൌ ቀ మ ȁȁభమ ቁ
ȁȁௐ భ

Thus, ܹଶ ൌ ‫ݒ‬ଶ െ ൬
௩మήௐభ
మ ൰ ή ܹଵ 5.3 Exercises
หȁௐభ ȁห

1. Let ‫ ܦ‬ൌ ܵ‫݊ܽ݌‬ሼሺͲǡͲǡͳǡͳሻǡ ሺͳǡͲǡͳǡͳሻǡ ሺͳǡ െͳǡͳǡͲሻሽ. Find the


‫ݒ‬ଶ ή ܹଵ
ܹଶ ൌ ‫ݒ‬ଶ െ ൭ ଶ൱ ή ܹଵ orthogonal basis for ‫ܦ‬.
หȁܹଵ ȁห
2. Let ‫ ܧ‬ൌ ܵ‫݊ܽ݌‬ሼሺͳǡͳǡ െͳǡͲሻǡ ሺͲǡͳǡͳǡͳሻǡ ሺ͵ǡͷǡ െͳǡʹሻሽ. Find
ሺͲǡͳǡͲǡͳሻ ή ሺͳǡͲǡͳǡͳሻ
ൌ ሺͲǡͳǡͲǡͳሻ െ ൭ ଶ ൱ ή ሺͳǡͲǡͳǡͳሻ the orthogonal basis for ‫ܧ‬.
หȁሺͳǡͲǡͳǡͳሻȁห
ͳ
ൌ ሺͲǡͳǡͲǡͳሻ െ ή ሺͳǡͲǡͳǡͳሻ
͵
ͳ ͳ ͳ ͳ ͳ ʹ
ൌ ሺͲǡͳǡͲǡͳሻ െ ൬ ǡ Ͳǡ ǡ ൰ ൌ ൬െ ǡ ͳǡ െ ǡ ൰
͵ ͵ ͵ ͵ ͵ ͵
Answers to Odd-Numbered Exercises


1.10 Exercises 1. a. Basis for ‫ ܯ‬is ሼሺͳǡ െͳǡͳሻǡ ሺെͳǡͲǡͳሻሽ.
b. ሺͳǡ െ͵ǡͷሻ ‫ ܯ א‬because there is a zero row that
1. ‫ݔ‬ସ ǡ ‫ݔ‬ହ ǡ ‫ א ଺ݔ‬Թ corresponds to ሺͳǡ െ͵ǡͷሻ which means that ሺͳǡ െ͵ǡͷሻ is
ଷ ଷ
‫ݔ‬ଵ ൌ ‫ݔ‬ସ ൅ ‫ݔ‬ହ െ ‫ ଺ݔ‬൅ ͳͷ dependent, and ሺͳǡ െ͵ǡͷሻcan be written as a linear
ଶ ଶ
ଷ ଷ ଷ ଵଷ
‫ݔ‬ଶ ൌ െ ‫ݔ‬ସ ൅ ‫ݔ‬ହ ൅ ‫ ଺ݔ‬െ combination.
ସ ଶ ସ ଶ
ଵ ଵ
‫ݔ‬ଷ ൌ ‫ݔ‬ସ െ ‫ ଺ݔ‬൅ ͷ ‫ݔ‬ ܽ
ଶ ଶ
3. a. Let ‫ݒ‬ଵ ൌ ൥ ‫ݔ‬ ൅ ‫ݕ‬ ൩ ‫ܧ א‬ǡ and ‫ݒ‬ଶ ൌ ቈܽ ൅ ‫ݖ‬቉, then
3. a. ሾെ͵ ͵ െͳ ͷሿ ‫ݕ‬ ‫ݖ‬
͹ ‫ݔ‬൅ܽ
b. ൥Ͷ൩ ‫ݒ‬ଵ ൅ ‫ݒ‬ଶ ൌ ൥ሺ‫ ݔ‬൅ ܽሻ ൅ ሺ‫ ݕ‬൅ ‫ݖ‬ሻ൩ ‫ܧ א‬.
Ͳ ‫ݕ‬൅‫ݖ‬
‫ݔ‬
c. െʹ
For ݉ ‫ א‬Թ and ‫ݒ‬ଵ ൌ ൥‫ ݔ‬൅ ‫ݕ‬൩ ‫ܧ א‬, then
ͳ Ͳ ʹ Ͳ ‫ݕ‬
5. a. ቂ ቃቂ ቃ ‫ ܣ‬ൌ ‫ܣ‬ଶ ߙ‫ݔ‬ ߙ‫ݔ‬
ʹ ͳ Ͳ ͳ
ଵ ߙ‫ݒ‬ଵ ൌ ൥ߙሺ‫ ݔ‬൅ ‫ݕ‬ሻ൩ ൌ ൥ߙ‫ ݔ‬൅ ߙ‫ݕ‬൩ ‫ܧ א‬. Thus, ‫ ܧ‬is a subspace
Ͳ቉ ቂ ͳ Ͳቃ
b. ቈ ଶ ‫ܣ‬ଶ ൌ ‫ܣ‬ ߙ‫ݕ‬ ߙ‫ݕ‬
Ͳ ͳ െʹ ͳ of Թଷ .
ͳ ͳ b. Basis for ‫ ܧ‬is ሼሺͳǡͳǡͲሻǡ ሺͲǡͳǡͳሻሽ.
7. ‫ ܣ‬is invertible (non-singular),, ‫ିܣ‬ଵ ൌ ቈ ଷ ʹ቉
ଶ c. ‫ ܧ‬ൌ ܵ‫݊ܽ݌‬ሼሺͳǡͳǡͲሻǡ ሺͲǡͳǡͳሻሽ.
ଶ ଶ ଵ 5. ‫ ݔ‬ൌ ͵
ሺିଵሻమశర ୢୣ୲൥ିଶ ଶ ିଵ൩
௖రమ ଵ ଵଶ ସ ଶ଼ ଻
9. ൌ ൌ െ ଵଵସସ ൌ െ ଶ଼଺
ୢୣ୲ሺ୅ሻ ିଵଵସସ 7. ݀݅݉ሺ‫ ܭ‬ሻ ൌ ʹ

‫ݔ‬ଵ ൌ Ͳ െʹ ʹ ʹ
11. ‫ݔ‬ଶ ൌ െ͵ 9. No, since the determinant of ൥ ͳ െͳ ͵൩ equals
‫ݔ‬ଷ ൌ ͳ ʹ െͳ Ͳ
zero, then the elements ሼሺെʹǡͳǡʹሻǡ ሺʹǡͳǡ െͳሻǡ ሺʹǡ͵ǡͲሻሽ do
not Թଷ .


3.4 Exercises 5.3 Exercises


1. a. Standard Matrix Representation of ܶ is
1. The orthogonal basis for ‫( ܣ‬Gram-Schmidt
ͳ Ͳ Ͳ
Orthonormalization) is
቎ͳ Ͳ Ͳ቏ ଵ ଵ
Ͳ Ͳ ͳ ሼሺͲǡͲǡͳǡͳሻǡ ሺͳǡͲǡͲǡͲሻǡ ቀͲǡ െͳǡ ǡ െ ቁሽ.
ଶ ଶ
Ͳ ͳ Ͳ
b. Basis for ‫ݎ݁ܭ‬ሺܶሻ is ሼሺͳǡͳǡͲǡͲሻǡ ሺͲǡͲǡͲǡͳሻǡ ሺͲǡͲǡͳǡͲሻሽ.
Thus, ‫ݎ݁ܭ‬ሺܶሻ is ܵ‫݊ܽ݌‬ሼሺͳǡͳǡͲǡͲሻǡ ሺͲǡͲǡͲǡͳሻǡ ሺͲǡͲǡͳǡͲሻሽ.
ͳ ͳ Ͳ Ͳ Ͳ Ͳ
c. Basis for ܴܽ݊݃݁ሺܶሻ ൌ ቄቂ ቃǡቂ ቃǡቂ ቃቅ.
Ͳ Ͳ Ͳ ͳ ͳ Ͳ
ͳ ͳ Ͳ Ͳ Ͳ Ͳ
Thus, ‫ݎ݁ܭ‬ሺܶሻ is ܵ‫ ݊ܽ݌‬ቄቂ ቃǡቂ ቃǡቂ ቃቅ.
Ͳ Ͳ Ͳ ͳ ͳ Ͳ

3. ‫ݎ݁ܭ‬ሺܶሻ ൌ ܵ‫݊ܽ݌‬ሼെͲǤͷ ൅ ‫ݔ‬ǡ െ ൅ ‫ ݔ‬ଶ ሽ.

5. a. Standard Matrix Representation of ܶ is


ͳ െͳ ͳ െͳ
቎ͳ Ͳ Ͳ Ͳ ቏
ͳ െͳ ͳ െͳ
ͳ Ͳ Ͳ Ͳ
b. Basis for ‫ݎ݁ܭ‬ሺܶሻ is ሼ‫ ݔ‬൅ ‫ ݔ‬ଶ ǡ െ‫ ݔ‬൅ ‫ ݔ‬ଷ ሽ.
Thus, ‫ݎ݁ܭ‬ሺܶሻ is ܵ‫݊ܽ݌‬ሼ‫ ݔ‬൅ ‫ ݔ‬ଶ ǡ െ‫ ݔ‬൅ ‫ ݔ‬ଷ ሽ.
ͳ ͳ െͳ Ͳ
c. Basis for ܴܽ݊݃݁ሺܶሻ ൌ ቄቂ ቃǡቂ ቃቅ.
ͳ ͳ െͳ Ͳ
ͳ ͳ െͳ Ͳ
Thus, ‫ݎ݁ܭ‬ሺܶሻ is ܵ‫ ݊ܽ݌‬ቄቂ ቃǡቂ ቃቅ.
ͳ ͳ െͳ Ͳ

4.3 Exercises
1. Since ݀݅݉ሺ‫ିܧ‬ଵ ሻ ് ʹ, ‫ ܥ‬is not diagonalizable.

3. ܹ is not diagonalizable because ݀݅݉ሺ‫ܧ‬ଶ ሻ must be 2


not 1.
