
13  The rank-nullity (dimension) theorem

13.1  Rank and nullity of a matrix

Definition: The rank of the matrix A is the dimension of the row space of A, and is denoted R(A).

Examples: The rank of I_{n×n} is n; the rank of 0_{m×n} is 0. The rank of the 3 × 5 matrix considered above is 3.

Theorem: The rank of a matrix in Gauss-Jordan form is equal to the number of leading variables.

Proof: In the GJ form of a matrix, every non-zero row has a leading 1, which is the only non-zero entry in its column. No elementary row operation can zero out a leading 1, so these non-zero rows are linearly independent. Since all the other rows are zero, the dimension of the row space of the GJ form is equal to the number of leading 1s, which is the same as the number of leading variables.

Definition: The nullity of the matrix A is the dimension of the null space of A, and is denoted by N(A). (This is to be distinguished from Null(A), which is a subspace; the nullity is a number.)

Examples: The nullity of I is 0. The nullity of the 3 × 5 matrix considered above (Chapter 12) is 2. The nullity of 0_{m×n} is n.

Theorem: The nullity of a matrix in Gauss-Jordan form is equal to the number of free variables.

Proof: Suppose A is m × n, and that the GJ form has j leading variables and k free variables, where j + k = n. Then, when computing the solution to the homogeneous equation, we solve for the first j (leading) variables in terms of the remaining k free variables, which we'll call s_1, s_2, . . . , s_k. The general solution to the homogeneous equation, as we know, consists of all linear combinations of the form

    s_1 v_1 + s_2 v_2 + · · · + s_k v_k,

where v_i is the solution obtained by setting s_i = 1 and the other free variables to 0. Thus in v_1 the entry 1 appears in position j + 1 (the slot of the first free variable) and the other free-variable slots hold 0, and so on, through v_k, whose 1 appears in position n. The vectors {v_1, . . . , v_k} are linearly
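These two counts are easy to check by machine. Here is a minimal sketch using SymPy; the 3 × 5 matrix below is an arbitrary stand-in (not the matrix from Chapter 12):

```python
from sympy import Matrix

# An arbitrary 3x5 example (not the matrix from Chapter 12).
A = Matrix([[1, 2, 0, 3, 1],
            [0, 0, 1, 4, 2],
            [0, 0, 0, 0, 1]])

R, pivots = A.rref()        # Gauss-Jordan (reduced row echelon) form
rank = len(pivots)          # number of leading 1s = number of leading variables
nullity = A.cols - rank     # number of free variables
print(rank, nullity)        # 3 2
```

SymPy's `Matrix.nullspace()` returns a basis built exactly as in the proof above: one vector per free variable, so its length equals the nullity.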

independent and form a basis for the null space of A. And there are k of them, the same as the number of free variables.

Exercise: What are the rank and nullity of the following matrices?

    A = [ 1 0 ]       B = [ 1 0 3 7 ]
        [ 3 4 ] ,         [ 0 1 4 9 ]
        [ 7 9 ]

We now have to address the question: how are the rank and nullity of the matrix A related to those of its Gauss-Jordan form?

Definition: The matrix B is said to be row equivalent to A if B can be obtained from A by a finite sequence of elementary row operations. If B is row equivalent to A, we write B ∼ A.

In pure matrix terms, B ∼ A ⟺ there exist elementary matrices E_1, . . . , E_k such that B = E_k E_{k−1} · · · E_2 E_1 A. If we write C = E_k E_{k−1} · · · E_2 E_1, then C is invertible. Conversely, if C is invertible, then C can be expressed as a product of elementary matrices, so a much simpler definition can be given:

Definition: B is row equivalent to A if B = CA, where C is invertible.

We can now establish two important results:

Theorem: If B ∼ A, then Null(B) = Null(A).

Proof: Suppose x ∈ Null(A). Then Ax = 0. Since B ∼ A, we have B = CA for some invertible matrix C, and it follows that Bx = CAx = C0 = 0, so x ∈ Null(B). Therefore Null(A) ⊆ Null(B). Conversely, if x ∈ Null(B), then Bx = 0. But B = CA, where C is invertible, being a product of elementary matrices. Thus Bx = CAx = 0. Multiplying by C^{−1} gives Ax = C^{−1} 0 = 0, so x ∈ Null(A), and Null(B) ⊆ Null(A). So the two sets are equal, as advertised.

Theorem: If B ∼ A, then the row space of B is identical to that of A.

Proof: We've already done this (see section 11.1). We're just restating the result in a slightly different context.

Summarizing these results: Row operations change neither the row space nor the null space of A.

Corollary 1: If R is the Gauss-Jordan form of A, then R has the same null space and row space as A.
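Both invariance theorems can be tested numerically. A small sketch (the matrices A and C below are arbitrary illustrations): multiplying on the left by an invertible C, which is the same as performing a sequence of row operations, changes neither the null space nor the row space:

```python
from sympy import Matrix

A = Matrix([[1, 2, 3],
            [2, 4, 6],
            [1, 0, 1]])

# An invertible C encodes a finite sequence of row operations: B = C*A ~ A.
C = Matrix([[1, 0, 0],
            [2, 1, 0],
            [0, 3, 1]])   # unit lower triangular, hence invertible
B = C * A

# Row-equivalent matrices share their null space ...
assert A.nullspace() == B.nullspace()
# ... and their row space (same Gauss-Jordan form, hence same row space).
assert A.rref()[0] == B.rref()[0]
```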

Corollary 2: If B ∼ A, then R(B) = R(A), and N(B) = N(A).

Proof: If B ∼ A, then both A and B have the same GJ form, and hence the same rank (equal to the number of leading ones) and nullity (equal to the number of free variables).

The following result may be somewhat surprising:

Theorem: The number of linearly independent rows of the matrix A is equal to the number of linearly independent columns of A. In particular, the rank of A is also equal to the number of linearly independent columns, and hence to the dimension of the column space of A.

Proof (sketch): As an example, consider the matrix

    A = [ 3 2 1 ]
        [ 1 2 3 ]
        [ 0 2 4 ]

Observe that columns 1, 2, and 3 are linearly dependent, with col_1(A) = 2 col_2(A) − col_3(A). You should be able to convince yourself that doing any row operation on the matrix A doesn't affect this equation. Even though the row operation changes the entries of the various columns, it changes them all in the same way, and the equation continues to hold. The span of the columns can, and generally will, change under row operations (why?), but this doesn't affect the result. For this example, the column space of the original matrix has dimension 2, and this is preserved under any row operation.

The actual proof would consist of the following steps: (1) identify a maximal linearly independent set of columns of A; (2) argue that this set remains linearly independent if row operations are done on A; (3) conclude that the number of linearly independent columns in the GJ form of A is the same as the number of linearly independent columns in A. The number of linearly independent columns of A is then just the number of leading entries in the GJ form of A, which is, as we know, the same as the rank of A.
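This invariance of column relations can be checked directly. A sketch in SymPy, with a 3 × 3 matrix chosen so that col_1 = 2 col_2 − col_3 (an illustrative choice, not a special case):

```python
from sympy import Matrix

# A 3x3 matrix whose columns satisfy col1 = 2*col2 - col3.
A = Matrix([[3, 2, 1],
            [1, 2, 3],
            [0, 2, 4]])
assert A.col(0) == 2*A.col(1) - A.col(2)

# Apply a row operation (row3 -> row3 - row1): the columns all change,
# but the same linear relation still holds among them.
B = A.elementary_row_op('n->n+km', row=2, k=-1, row2=0)
assert B.col(0) == 2*B.col(1) - B.col(2)

# Row rank equals column rank.
assert A.rank() == A.T.rank() == 2
```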

13.2  The rank-nullity theorem

This is also known as the dimension theorem, and version 1 (we'll see another later in the course) goes as follows:

Theorem: Let A be m × n. Then n = N(A) + R(A), where n is the number of columns of A.
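Before proving the theorem, it is easy to sanity-check on any matrix. A minimal SymPy sketch with an arbitrary 3 × 4 example:

```python
from sympy import Matrix

# An arbitrary 3x4 example: n = 4 columns.
A = Matrix([[1, 2, 3, 4],
            [2, 4, 6, 8],
            [0, 1, 1, 1]])

n = A.cols
assert n == A.rank() + len(A.nullspace())   # n = R(A) + N(A)
print(A.rank(), len(A.nullspace()))         # 2 2
```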

Let's assume, for the moment, that this is true. What good is it? Answer: You can read off both the rank and the nullity from the echelon form of the matrix A. Suppose A can be row-reduced to

    [ 1 * 0 0 * ]
    [ 0 0 1 0 * ]
    [ 0 0 0 1 * ]

(where * denotes an arbitrary entry). Then it's clear (why?) that the dimension of the row space is 3, or equivalently, that the dimension of the column space is 3. Since there are 5 columns altogether, the dimension theorem says that n = 5 = 3 + N(A), so N(A) = 2. We can therefore expect to find two linearly independent solutions to the homogeneous equation Ax = 0. Alternatively, inspection of the echelon form of A reveals that there are precisely 2 free variables, x_2 and x_5. So we know that N(A) = 2 (why?), and therefore rank(A) = 5 − 2 = 3.

Proof of the theorem: This is, at this point, almost trivial. We have shown above that the rank of A is the same as the rank of the Gauss-Jordan form of A, which is equal to the number of leading entries in the Gauss-Jordan form. We also know that the dimension of the null space is equal to the number of free variables in the reduced echelon (GJ) form of A. And we know further that the number of free variables plus the number of leading entries is exactly the number of columns. So n = N(A) + R(A), as claimed.

Exercises: Find the rank and nullity of the following - do the absolute minimum (zero!) amount of computation possible:

    [ 3 1 ]       [ 2 5 3 ]
    [ 6 2 ] ,     [ 1 4 2 ]
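The read-off described above can be automated. A sketch in SymPy, with arbitrary numbers chosen for the non-pivot entries of a 3 × 5 Gauss-Jordan form like the one above:

```python
from sympy import Matrix

# A 3x5 matrix already in Gauss-Jordan form; the non-pivot entries
# (2, 5, 7, 9) are arbitrary choices for illustration.
R = Matrix([[1, 2, 0, 0, 5],
            [0, 0, 1, 0, 7],
            [0, 0, 0, 1, 9]])

pivots = R.rref()[1]     # R is already reduced, so this just reports the pivots
rank = len(pivots)       # leading 1s in columns 1, 3, 4  ->  rank 3
nullity = R.cols - rank  # free variables x2 and x5       ->  nullity 2
print(rank, nullity)     # 3 2
```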

(T/F) For any matrix A, R(A) = R(A^t). Give a proof or counterexample.

(T/F) For any matrix A, N(A) = N(A^t). Give a proof or counterexample.
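Both questions can be probed numerically before attempting a proof. A sketch with one arbitrary non-square matrix (a single example settles nothing, but it suggests which statement to try to prove and which to attack with a counterexample):

```python
from sympy import Matrix

# An arbitrary 2x3 example to probe both statements.
A = Matrix([[1, 2, 3],
            [4, 5, 6]])

print(A.rank(), A.T.rank())                       # 2 2  (ranks agree)
print(len(A.nullspace()), len(A.T.nullspace()))   # 1 0  (nullities differ)
```

Note that A is 2 × 3 while A^t is 3 × 2, so the two null spaces live in different spaces to begin with; the dimension theorem applied to each explains the printed nullities.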
