
T10.

If an inconsistent system has r > n, then applying the FVCS Theorem would give n - r, a negative number of free variables. Recall that r is the number of nonzero rows in the row-reduced augmented matrix; since each nonzero row has a leading 1, r is also the number of leading 1's, and since each leading 1 heads a pivot column, r is also the number of pivot columns. The augmented matrix has n + 1 columns, so the largest possible value of r is n + 1, which occurs when the (n+1)th column (the column of constants) is a pivot column, i.e. exactly when r > n. Applying FVCS with r = n + 1 would then give n - r = n - (n + 1) = -1. Thus the number of free variables we could possibly "discover" is -1, an absurdity showing that FVCS applies only to consistent systems.
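This boundary case can be checked mechanically. The sketch below (a hypothetical example, using a small exact-arithmetic row reducer) row-reduces the augmented matrix of the inconsistent system x1 = 0, x1 = 1, counts the nonzero rows r, and confirms that n - r comes out to -1:

```python
from fractions import Fraction

def rref(M):
    """Return the reduced row-echelon form of M (a list of rows)."""
    M = [[Fraction(x) for x in row] for row in M]
    nrows, ncols = len(M), len(M[0])
    lead = 0
    for r in range(nrows):
        if lead >= ncols:
            break
        i = r
        while M[i][lead] == 0:          # find a pivot in column `lead`
            i += 1
            if i == nrows:
                i, lead = r, lead + 1
                if lead == ncols:
                    return M
        M[r], M[i] = M[i], M[r]                        # swap pivot row up
        M[r] = [x / M[r][lead] for x in M[r]]          # scale pivot to 1
        for j in range(nrows):
            if j != r:                                 # clear the column
                M[j] = [a - M[j][lead] * b for a, b in zip(M[j], M[r])]
        lead += 1
    return M

# Inconsistent system in n = 1 variable:  x1 = 0  and  x1 = 1.
R = rref([[1, 0], [1, 1]])       # augmented matrix has n + 1 = 2 columns
r = sum(any(x != 0 for x in row) for row in R)
n = 1
print(r, n - r)   # 2 -1  -- r = n + 1, so "n - r free variables" is -1
```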

T40. Suppose the augmented matrix of a consistent system of linear equations has two identical columns. Row operations act identically on identical columns, so after row-reducing, the two columns remain identical; this is true for any augmented matrix with two identical columns. Identical columns cannot both be pivot columns: a pivot column contains a leading 1 as its only nonzero entry, so two identical pivot columns would have their leading 1's in the same row, violating the definition of reduced row-echelon form. Hence at most one of the two columns is a pivot column, and the variable corresponding to the other column is a free variable. Since the system is consistent and has at least one free variable, the solution set is infinite.

HSE T10. A system of linear equations is homogeneous if and only if the system has the zero vector as a solution. First, suppose the system LS(A, b) is homogeneous, i.e. the vector of constants is the zero vector, b = 0. Substituting zero for each variable gives 0 = 0 for every equation, which is true. Thus the zero vector is a solution.

Conversely, suppose the system of linear equations LS(A, b) has the zero vector as a solution. Substituting zero for every variable makes each left-hand side zero, so each constant must be zero, and the vector of constants is the zero vector. The system can therefore be written as LS(A, 0), which is precisely the definition of a homogeneous system. Thus if the system has the zero vector as a solution, the system is homogeneous.
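Both directions amount to the observation that substituting the zero vector makes every left-hand side vanish. A minimal check, with a made-up coefficient matrix:

```python
def evaluate(A, x):
    """Left-hand sides of the system LS(A, b) at the vector x."""
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

A = [[2, -1, 3], [1, 4, -2]]    # hypothetical coefficient matrix
zero = [0, 0, 0]
lhs = evaluate(A, zero)
print(lhs)   # [0, 0] -- so the zero vector solves LS(A, b) exactly when b = 0
```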

T20. Suppose the homogeneous system of equations LS(A, 0) is

a_11 x_1 + a_12 x_2 + ... + a_1n x_n = 0
a_21 x_1 + a_22 x_2 + ... + a_2n x_n = 0
...
a_m1 x_1 + a_m2 x_2 + ... + a_mn x_n = 0

and that u = [u_1, u_2, ..., u_n] is one solution to the system, so that for each i, 1 <= i <= m,

a_i1 u_1 + a_i2 u_2 + ... + a_in u_n = 0.

Now substitute v = 4u = [4u_1, 4u_2, ..., 4u_n] in place of u. For each i, 1 <= i <= m, we get

a_i1 (4u_1) + a_i2 (4u_2) + ... + a_in (4u_n) = 4(a_i1 u_1 + a_i2 u_2 + ... + a_in u_n) = 4(0) = 0.

So v satisfies every equation of the homogeneous system LS(A, 0). Thus v = 4u is a solution to the system.
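The computation above can be spot-checked numerically. Assuming a small made-up homogeneous system and a known solution u:

```python
def evaluate(A, x):
    """Left-hand sides of LS(A, 0) at the vector x."""
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

A = [[1, 2, -1], [2, 4, -2]]   # hypothetical homogeneous system LS(A, 0)
u = [1, 0, 1]                  # u is a solution: each left-hand side is 0
assert evaluate(A, u) == [0, 0]

v = [4 * ui for ui in u]       # the scaled vector v = 4u
print(evaluate(A, v))          # [0, 0] -- v is again a solution
```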

NM M51. Every system of linear equations has a unique solution, infinitely many solutions, or no solution. According to the problem, the coefficient matrix is singular, so its row-reduced form is not the identity matrix, and the system cannot have a unique solution. This leaves only two possibilities: infinitely many solutions, or no solution.

M52. Here the coefficient matrix is nonsingular, so row-reducing it yields the identity matrix, and we can infer that the corresponding system of linear equations has a unique solution. Moreover, since the system is not homogeneous, the zero vector is not that solution.
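For 2 x 2 systems this trichotomy can be made concrete with Cramer's rule: a nonzero determinant (nonsingular coefficient matrix) gives a unique solution, while a zero determinant leaves only the consistent/inconsistent split. A sketch, with made-up matrices:

```python
def classify2(A, b):
    """Classify the 2x2 system LS(A, b) by its determinant."""
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
    if det != 0:                       # nonsingular: unique solution
        return "unique"
    # Singular: the rows of A are proportional; the system is consistent
    # exactly when b respects the same proportion.
    consistent = (A[0][0] * b[1] == A[1][0] * b[0]
                  and A[0][1] * b[1] == A[1][1] * b[0])
    return "infinitely many" if consistent else "none"

print(classify2([[1, 2], [3, 4]], [1, 1]))   # unique (M52: nonsingular)
print(classify2([[1, 2], [2, 4]], [3, 6]))   # infinitely many (M51)
print(classify2([[1, 2], [2, 4]], [3, 7]))   # none (M51)
```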

T10. Let A be a singular matrix and let B be its row-reduced form. Since A is singular, B is not the identity matrix. Recall that r, the number of nonzero rows of B, equals the number of leading 1's, which in turn equals the number of pivot columns. Because B is in reduced row-echelon form but is not the identity matrix, the number of pivot columns is strictly less than n, the number of columns of A, and hence the number of nonzero rows is also less than n. Therefore B has at least one zero row, which by the definition of RREF must lie below every row containing a nonzero entry.

T12. Let A be an n x n matrix in reduced row-echelon form. We will use two properties from the definition of RREF: (1) the leftmost nonzero entry of a nonzero row is equal to 1, and (2) that leading 1 is the only nonzero entry in its column.

First, suppose every column of A is a pivot column. Then each column contains a leading 1, which is the leftmost nonzero entry of some nonzero row, and by property (2) that leading 1 is the sole nonzero entry of its column. Since there are n pivot columns in an n x n matrix, there are n leading 1's occupying n distinct rows and n distinct columns, and every other entry of A is zero. In RREF the leading 1 of each successive row lies strictly to the right of the one above it, so these n leading 1's must fall on the diagonal. Thus A is the identity matrix.

Conversely, suppose A is the n x n identity matrix. Then each column has a single nonzero entry, a 1, which is also the leftmost nonzero entry of its row; these are exactly properties (1) and (2) above. Each column therefore contains a leading 1, i.e. every column of A is a pivot column.
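T10's conclusion can be seen on a small singular matrix, where a single row operation already produces the zero row. A sketch with a made-up 2 x 2 example:

```python
# Singular matrix: the second row is twice the first.
A = [[1, 2], [2, 4]]
A[1] = [a - 2 * b for a, b in zip(A[1], A[0])]   # R2 <- R2 - 2 R1
print(A)    # [[1, 2], [0, 0]] -- the row-reduced form has a zero row,
            # so it is not the identity matrix
```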

V.VO T5. Proposition 1. For any vectors u, v, w, if u + v = u + w, then v = w.
Proof: Let u, v, w be vectors, and suppose u + v = u + w.
1. Then -u + (u + v) = -u + (u + w), Additive Property of Equality
2. So (-u + u) + v = (-u + u) + w, Additive Associativity
3. Thus, we have 0 + v = 0 + w, Additive Inverse
4. And it follows that v = w. Zero Vector
Thus, for any vectors u, v, w, if u + v = u + w, then v = w.

T6. Proposition 2. For any vector u, 0u = 0.
Proof: Let u be a vector.
1. Since 0 + 0 = 0, we have 0u = (0 + 0)u, Substitution
2. We then have 0u = 0u + 0u, Distributive across Scalar Addition
3. It follows that 0u + [-(0u)] = (0u + 0u) + [-(0u)], Additive Property of Equality
4. So 0u + [-(0u)] = 0u + (0u + [-(0u)]), Additive Associativity
5. So that 0 = 0u + 0, Additive Inverse
6. And thus 0 = 0u. Zero Vector
Thus, for any vector u, 0u = 0.

T7. Proposition 3. For any scalar c, c0 = 0.
Proof: Let c be an arbitrary scalar.
1. Then c0 = c(0 + 0), Zero Vector
2. So c0 = c0 + c0, Distributive across Vector Addition
3. We then have c0 + (-c0) = (c0 + c0) + (-c0), Additive Property of Equality
4. So that c0 + (-c0) = c0 + (c0 + (-c0)), Additive Associativity
5. It follows that 0 = c0 + 0, Additive Inverse
6. And finally we have 0 = c0. Zero Vector
Thus, for any scalar c, c0 = 0.

T13. For 1 <= i <= m,

[u + v]_i = [u]_i + [v]_i        by the definition of CVA
          = [v]_i + [u]_i        by commutativity of addition in C
          = [v + u]_i            again by CVA.

Since the individual components of u + v and v + u are equal for all i, 1 <= i <= m, the definition of CVE tells us that the vectors are equal. Thus u + v = v + u for all u, v in C^m.

T17. For 1 <= i <= m,

[(αβ)u]_i = (αβ)[u]_i            by definition of CVSM
          = α(β[u]_i)            by associativity of multiplication in C
          = α[βu]_i              by definition of CVSM
          = [α(βu)]_i            by definition of CVSM.

Since the individual components of (αβ)u and α(βu) are equal for all i, 1 <= i <= m, the definition of CVE tells us that the vectors are equal. Thus (αβ)u = α(βu) for all scalars α, β and all u in C^m.

T18. For 1 <= i <= m,

[(α + β)u]_i = (α + β)[u]_i      by definition of CVSM
             = α[u]_i + β[u]_i   by distributivity in C
             = [αu]_i + [βu]_i   by definition of CVSM
             = [αu + βu]_i       by definition of CVA.

Since the individual components of (α + β)u and αu + βu are equal for all i, 1 <= i <= m, the definition of CVE tells us that the vectors are equal. Thus (α + β)u = αu + βu for all scalars α, β and all u in C^m.
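The component-wise arguments in T13, T17, and T18 can be spot-checked with Python's built-in complex numbers, treating vectors in C^2 as tuples (the particular vectors and scalars below are arbitrary, chosen integer-valued so the arithmetic is exact):

```python
add = lambda x, y: tuple(xi + yi for xi, yi in zip(x, y))   # CVA
smul = lambda c, x: tuple(c * xi for xi in x)               # CVSM

u = (1 + 2j, 3 - 1j)
v = (0 + 1j, 2 + 2j)
a, b = 2 - 1j, 1 + 3j                 # scalars alpha and beta

assert add(u, v) == add(v, u)                          # T13: u + v = v + u
assert smul(a * b, u) == smul(a, smul(b, u))           # T17: (ab)u = a(bu)
assert smul(a + b, u) == add(smul(a, u), smul(b, u))   # T18: (a+b)u = au + bu
print("all component-wise identities hold")
```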

T31. Here u and v are vectors in C^2; let u = [ ] and v = [ ]. Computing both orders of the operation, u ∗ v = [ ] but v ∗ u = [ ], so for some i, 1 <= i <= 2, the i-th components differ. Thus the commutative property does not hold. Also (u ∗ v) ∗ v = [ ] while u ∗ (v ∗ v) = [ ], so again for some i, 1 <= i <= 2, the components differ. Thus the associativity property does not hold.
