Span

In this section we will cover a topic that we'll see off and on over the course of this chapter. Let's start off by going back to part (b) of Example 4 from the previous section. In that example we saw that the null space of the given matrix consisted of all the vectors of the form

\[\vec x = t\left( {7,1} \right)\]

We would like a more compact way of stating this result and by the end of this section we'll have that.

Let's first revisit an idea that we saw quite some time ago. In the section on Matrix Arithmetic we looked at linear combinations of matrices and columns of matrices. We can also talk about linear combinations of vectors.

Definition 1 We say the vector w from the vector space V is a linear combination of the vectors ${\vec v_1},{\vec v_2}, \ldots ,{\vec v_n}$, all from V, if there are scalars ${c_1},{c_2}, \ldots ,{c_n}$ so that w can be written

\[\vec w = {c_1}{\vec v_1} + {c_2}{\vec v_2} + \cdots + {c_n}{\vec v_n}\]

So, we can see that the null space we were looking at above is in fact all the linear combinations of the vector (7,1). It may seem strange to talk about linear combinations of a single vector since that is really scalar multiplication, but we can think of it as that if we need to.

The null space above was not the first time that we've seen linear combinations of vectors however. When we were looking at Euclidean n-space we introduced these things called the standard basis vectors. The standard basis vectors for ${\mathbb{R}^n}$ were defined as,

\[{\vec e_1} = \left( {1,0,0, \ldots ,0} \right)\hspace{0.5in}{\vec e_2} = \left( {0,1,0, \ldots ,0} \right)\hspace{0.5in} \cdots \hspace{0.5in}{\vec e_n} = \left( {0,0,0, \ldots ,1} \right)\]

We saw that we could take any vector $\vec u = \left( {{u_1},{u_2}, \ldots ,{u_n}} \right)$ from ${\mathbb{R}^n}$ and write it as,

\[\vec u = {u_1}{\vec e_1} + {u_2}{\vec e_2} + \cdots + {u_n}{\vec e_n}\]

Or, in other words, we could write $\vec u$ as a linear combination of the standard basis vectors, ${\vec e_1},{\vec e_2}, \ldots ,{\vec e_n}$. We will be revisiting this idea again in a couple of sections, but the point here is simply that we've seen linear combinations of vectors prior to us actually discussing them here.
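As a quick illustration with a made up vector, in ${\mathbb{R}^3}$ we have

\[\left( {3, - 1,4} \right) = 3{\vec e_1} - {\vec e_2} + 4{\vec e_3} = 3\left( {1,0,0} \right) - \left( {0,1,0} \right) + 4\left( {0,0,1} \right)\]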

Let's take a look at an example or two.

Example 1 Determine if the vector $\vec w$ is a linear combination of the two given vectors.
(a) Is $\vec w$ a linear combination of ${\vec v_1}$ and ${\vec v_2}$? [Solution]
(b) Is $\vec w$ a linear combination of ${\vec v_1}$ and ${\vec v_2}$? [Solution]
(c) Is $\vec w$ a linear combination of ${\vec v_1}$ and ${\vec v_2}$? [Solution]

Solution

(a) Is $\vec w$ a linear combination of ${\vec v_1}$ and ${\vec v_2}$?

In each of these cases we'll need to set up and solve the following equation,

\[\vec w = {c_1}{\vec v_1} + {c_2}{\vec v_2}\]

Then set coefficients equal to arrive at a system of linear equations in the unknowns ${c_1}$ and ${c_2}$.

If the system is consistent (i.e. has at least one solution) then w is a linear combination of the two vectors. If there is no solution then w is not a linear combination of the two vectors.
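If you'd like to check this sort of thing numerically, here is a minimal NumPy sketch. The vectors in it are made up stand-ins (the specific vectors from this example aren't reproduced here); the test itself is just the usual rank condition for consistency of a linear system.

    import numpy as np

    # Stand-in vectors chosen only for illustration; replace them with the
    # vectors you actually want to test.
    v1 = np.array([1.0, 2.0])
    v2 = np.array([3.0, -1.0])
    w = np.array([5.0, 3.0])

    # The equation c1*v1 + c2*v2 = w is the linear system A @ c = w, where
    # the columns of A are the candidate vectors.
    A = np.column_stack([v1, v2])
    augmented = np.column_stack([A, w])

    # The system is consistent (w is a linear combination) exactly when the
    # coefficient matrix and the augmented matrix have the same rank.
    if np.linalg.matrix_rank(A) == np.linalg.matrix_rank(augmented):
        c, *_ = np.linalg.lstsq(A, w, rcond=None)
        print("w is a linear combination; coefficients:", c)
    else:
        print("w is not a linear combination of v1 and v2")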

We'll leave it to you to verify that this system has a single solution. Therefore, $\vec w$ is a linear combination of ${\vec v_1}$ and ${\vec v_2}$ and we can write it as $\vec w = {c_1}{\vec v_1} + {c_2}{\vec v_2}$ with the coefficients from that solution.

[Return to Problems]

(b) Is $\vec w$ a linear combination of ${\vec v_1}$ and ${\vec v_2}$?

For this part we'll need to do the same kind of thing and set up the corresponding system of equations.

This time the system turns out to have infinitely many solutions. This means $\vec w$ is a linear combination of ${\vec v_1}$ and ${\vec v_2}$. However, unlike the previous part, there are literally an infinite number of ways in which we can write the linear combination: every solution of the system gives a set of coefficients that works.
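Here is a similar NumPy sketch of what "infinitely many ways" looks like computationally, again with made up stand-in vectors (one is deliberately a multiple of the other so that the system has a free variable): a particular solution plus any multiple of a null space direction of the coefficient matrix gives another set of coefficients that works.

    import numpy as np

    # Stand-in vectors for illustration only; v2 is a multiple of v1 so the
    # system below has infinitely many solutions.
    v1 = np.array([1.0, 2.0, -1.0])
    v2 = np.array([2.0, 4.0, -2.0])
    w = np.array([3.0, 6.0, -3.0])

    A = np.column_stack([v1, v2])

    # One particular solution (exact here, since the system is consistent) ...
    c_particular, *_ = np.linalg.lstsq(A, w, rcond=None)

    # ... and a direction in the null space of A, taken from the SVD.
    null_dir = np.linalg.svd(A)[2][-1]

    # Every choice of t below gives coefficients that reproduce w exactly.
    for t in (0.0, 1.0, -2.5):
        c = c_particular + t * null_dir
        print(c, np.allclose(A @ c, w))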
[Return to Problems]

(c) Is $\vec w$ a linear combination of ${\vec v_1}$ and ${\vec v_2}$?

Setting up the corresponding system of equations for this part and attempting to solve it, we find that the system does not have a solution and so $\vec w$ is not a linear combination of ${\vec v_1}$ and ${\vec v_2}$.

[Return to Problems]

So, this example was kept fairly simple, but if we add in more components and/or more vectors to the set the problem will work in essentially the same manner.

Now that we've seen how linear combinations work and how to tell if a vector is a linear combination of a set of other vectors we need to move into the real topic of this section. In the opening of this section we recalled a null space that we'd looked at in the previous section. We can now see that the null space from that example is nothing more than all the linear combinations of the vector (7,1) (and again, it is kind of strange to be talking about linear combinations of a single vector).

As pointed out at the time, we're after a more compact notation for denoting this. It is now time to give that notation.

Definition 2 Let ${\vec v_1},{\vec v_2}, \ldots ,{\vec v_n}$ be a set of vectors in a vector space V and let W be the set of all linear combinations of the vectors ${\vec v_1},{\vec v_2}, \ldots ,{\vec v_n}$. The set W is the span of the vectors ${\vec v_1},{\vec v_2}, \ldots ,{\vec v_n}$ and is denoted by

\[W = \mathrm{span}\left\{ {{{\vec v}_1},{{\vec v}_2}, \ldots ,{{\vec v}_n}} \right\}\]

We also say that the vectors ${\vec v_1},{\vec v_2}, \ldots ,{\vec v_n}$ span W.

So, with this notation we can now see that the null space that we examined at the start of this section is now nothing more than,

\[\mathrm{span}\left\{ {\left( {7,1} \right)} \right\}\]

Before we move on to some examples we should get a nice theorem out of the way.

Theorem 1 Let ${\vec v_1},{\vec v_2}, \ldots ,{\vec v_n}$ be vectors in a vector space V and let their span be $W = \mathrm{span}\left\{ {{{\vec v}_1},{{\vec v}_2}, \ldots ,{{\vec v}_n}} \right\}$ then,
(a) W is a subspace of V.
(b) W is the smallest subspace of V that contains all of the vectors ${\vec v_1},{\vec v_2}, \ldots ,{\vec v_n}$.

Proof:
(a) So, we need to show that W is closed under addition and scalar multiplication. Let $\vec u$ and $\vec w$ be any two vectors from W. Now, since W is the set of all linear combinations of ${\vec v_1},{\vec v_2}, \ldots ,{\vec v_n}$ that means that both $\vec u$ and $\vec w$ must be a linear combination of these vectors. So, there are scalars ${c_1},{c_2}, \ldots ,{c_n}$ and ${k_1},{k_2}, \ldots ,{k_n}$ so that,

\[\vec u = {c_1}{\vec v_1} + {c_2}{\vec v_2} + \cdots + {c_n}{\vec v_n}\hspace{0.5in}\mbox{and}\hspace{0.5in}\vec w = {k_1}{\vec v_1} + {k_2}{\vec v_2} + \cdots + {k_n}{\vec v_n}\]

Now, let's take a look at the sum.

\[\vec u + \vec w = \left( {{c_1} + {k_1}} \right){\vec v_1} + \left( {{c_2} + {k_2}} \right){\vec v_2} + \cdots + \left( {{c_n} + {k_n}} \right){\vec v_n}\]

So the sum, $\vec u + \vec w$, is a linear combination of the vectors ${\vec v_1},{\vec v_2}, \ldots ,{\vec v_n}$ and hence must be in W and so W is closed under addition.

Now, let k be any scalar and let's take a look at,

\[k\vec u = \left( {k{c_1}} \right){\vec v_1} + \left( {k{c_2}} \right){\vec v_2} + \cdots + \left( {k{c_n}} \right){\vec v_n}\]

As we can see the scalar multiple, $k\vec u$, is a linear combination of the vectors ${\vec v_1},{\vec v_2}, \ldots ,{\vec v_n}$ and hence must be in W and so W is closed under scalar multiplication.

Therefore, W is closed under addition and scalar multiplication and so must be a subspace of V.

(b) In these cases when we say that W is the smallest vector space that contains the set of vectors ${\vec v_1},{\vec v_2}, \ldots ,{\vec v_n}$ we're really saying that if $W'$ is also a vector space that contains ${\vec v_1},{\vec v_2}, \ldots ,{\vec v_n}$ then it will also contain a complete copy of W as well.

So, let's start this off by noticing that W does in fact contain each of the ${\vec v_i}$'s since,

\[{\vec v_i} = 0{\vec v_1} + 0{\vec v_2} + \cdots + 1{\vec v_i} + \cdots + 0{\vec v_n}\]

Now, let $W'$ be a vector space that contains ${\vec v_1},{\vec v_2}, \ldots ,{\vec v_n}$ and consider any vector $\vec u$ from W. If we can show that $\vec u$ must also be in $W'$ then we'll have shown that $W'$ contains a copy of W since it will contain all the vectors in W. Now, $\vec u$ is in W and so must be a linear combination of ${\vec v_1},{\vec v_2}, \ldots ,{\vec v_n}$,

\[\vec u = {c_1}{\vec v_1} + {c_2}{\vec v_2} + \cdots + {c_n}{\vec v_n}\]

Each of the terms in this sum, ${c_i}{\vec v_i}$, is a scalar multiple of a vector that is in $W'$ and since $W'$ is a vector space it must be closed under scalar multiplication and so each ${c_i}{\vec v_i}$ is in $W'$. But this means that $\vec u$ is the sum of a bunch of vectors that are in $W'$ which is closed under addition and so that means that $\vec u$ must in fact be in $W'$.

We've now shown that $W'$ contains every vector from W and so must contain W itself.

Now, let's take a look at some examples of spans.

Example 2 Describe the span of each of the following sets of vectors.
(a) ${\vec v_1}$ and ${\vec v_2}$
(b) ${\vec v_1}$ and ${\vec v_2}$

Solution
(a) The span of this set of vectors, $\mathrm{span}\left\{ {{{\vec v}_1},{{\vec v}_2}} \right\}$, is the set of all linear combinations and we can write down a general linear combination for these two vectors, $a{\vec v_1} + b{\vec v_2}$. So, it looks like $\mathrm{span}\left\{ {{{\vec v}_1},{{\vec v}_2}} \right\}$ will be all of the vectors that can be put in this form for any choices of a and b.

(b) This one is fairly similar to the first one. A general linear combination will again look like $a{\vec v_1} + b{\vec v_2}$, and so $\mathrm{span}\left\{ {{{\vec v}_1},{{\vec v}_2}} \right\}$ will be all the vectors of this form for any choices of a and b.

Example 3 Describe the span of each of the following sets of vectors.

(a) ${\vec v_1}$ and ${\vec v_2}$
(b) ${\vec v_1}$, ${\vec v_2}$ and ${\vec v_3}$

Solution

These work exactly the same as the previous set of examples worked. The only difference is that this time we aren't working in ${\mathbb{R}^n}$ for this example.

(a) Here is a general linear combination of these vectors: $a{\vec v_1} + b{\vec v_2}$. Here it looks like $\mathrm{span}\left\{ {{{\vec v}_1},{{\vec v}_2}} \right\}$ will be all the diagonal matrices in ${M_{22}}$.

(b) A general linear combination in this case is $a{\vec v_1} + b{\vec v_2} + c{\vec v_3}$. In this case $\mathrm{span}\left\{ {{{\vec v}_1},{{\vec v}_2},{{\vec v}_3}} \right\}$ will be all the polynomials that do not have a quadratic term.

Now, let's see if we can determine a set of vectors that will span some of the common vector spaces that we've seen. What we'll need in each of these examples is a set of vectors with which we can write a general vector from the space as a linear combination of the vectors in the set.

Example 4 Determine a set of vectors that will exactly span each of the following vector spaces.
(a) ${\mathbb{R}^n}$ [Solution]
(b) ${M_{22}}$ [Solution]
(c) ${P_n}$ [Solution]

Solution
Okay, before we start this let's think about just what we need to show here. We'll need to find a set of vectors so that the span of that set will be exactly the space given. In other words, we need to show that the span of our proposed set of vectors is in fact the same set as the vector space.

So just what do we need to do to mathematically show that two sets are equal? Let's suppose that we want to show that A and B are equal sets. To do this we'll need to show that each a in A will be in B and in doing so we'll have shown that B will at the least contain all of A. Likewise, we'll need to show that each b in B will be in A and in doing that we'll have shown that A will contain all of B. However, the only way that A can contain all of B and B can contain all of A is for A and B to be the same set.

So, for our examples we'll need to determine a possible set of spanning vectors and then show that every vector from our vector space is in the span of our set of vectors. Next we'll need to show that each vector in our span will also be in the vector space.

(a) ${\mathbb{R}^n}$

We've pretty much done this one already. Earlier in the section we showed that any vector from ${\mathbb{R}^n}$ can be written as a linear combination of the standard basis vectors, ${\vec e_1},{\vec e_2}, \ldots ,{\vec e_n}$, and so at the least the span of the standard basis vectors will contain all of ${\mathbb{R}^n}$. However, since any linear combination of the standard basis vectors is going to be a vector in ${\mathbb{R}^n}$ we can see that ${\mathbb{R}^n}$ must also contain the span of the standard basis vectors.

Therefore, the span of the standard basis vectors must be ${\mathbb{R}^n}$.
[Return to Problems]

(b) ${M_{22}}$

We can use the result of Example 3(a) above as a guide here. In that example we saw a set of matrices that would span all the diagonal matrices in ${M_{22}}$ and so we can do a natural extension to get a set that will span all of ${M_{22}}$. It looks like the following set should do that.

\[{\vec v_1} = \begin{bmatrix}1&0\\0&0\end{bmatrix}\hspace{0.25in}{\vec v_2} = \begin{bmatrix}0&1\\0&0\end{bmatrix}\hspace{0.25in}{\vec v_3} = \begin{bmatrix}0&0\\1&0\end{bmatrix}\hspace{0.25in}{\vec v_4} = \begin{bmatrix}0&0\\0&1\end{bmatrix}\]

Clearly any linear combination of these four matrices will be a $2 \times 2$ matrix and hence in ${M_{22}}$ and so the span of these matrices must be contained in ${M_{22}}$.

Likewise, given any matrix from ${M_{22}}$,

\[\begin{bmatrix}a&b\\c&d\end{bmatrix}\]

we can write it as the following linear combination of these vectors.

\[\begin{bmatrix}a&b\\c&d\end{bmatrix} = a{\vec v_1} + b{\vec v_2} + c{\vec v_3} + d{\vec v_4}\]

and so ${M_{22}}$ must be contained in the span of these vectors and so these vectors will span ${M_{22}}$.
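If it helps, here is a small NumPy check of this decomposition. It assumes the four matrices written above and uses a made up matrix M; the point is simply that the entries of M serve directly as the coefficients a, b, c and d.

    import numpy as np

    # The four matrices from above, each with a single 1.
    E = [np.array([[1, 0], [0, 0]]),
         np.array([[0, 1], [0, 0]]),
         np.array([[0, 0], [1, 0]]),
         np.array([[0, 0], [0, 1]])]

    # A made up 2x2 matrix; its own entries are the coefficients.
    M = np.array([[3, -1],
                  [4, 7]])

    combo = sum(coeff * Ei for coeff, Ei in zip(M.flatten(), E))
    print(np.array_equal(M, combo))  # True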

[Return to Problems]

(c) ${P_n}$

We can use Example 3(b) to help with this one. First recall that ${P_n}$ is the set of all polynomials of degree n or less. Using Example 3(b) as a guide it looks like the following set of vectors will work for us.

\[{\vec v_0} = 1\hspace{0.25in}{\vec v_1} = x\hspace{0.25in}{\vec v_2} = {x^2}\hspace{0.25in} \cdots \hspace{0.25in}{\vec v_n} = {x^n}\]

Note that we used subscripts that matched the degree of the term and so started at ${\vec v_0}$ instead of the usual ${\vec v_1}$.

It should be clear (hopefully) that a linear combination of these is a polynomial of degree n or less and so will be in ${P_n}$. Therefore the span of these vectors will be contained in ${P_n}$.

Likewise, we can write a general polynomial of degree n or less,

\[p = {a_0} + {a_1}x + \cdots + {a_n}{x^n}\]

as the following linear combination

\[p = {a_0}{\vec v_0} + {a_1}{\vec v_1} + \cdots + {a_n}{\vec v_n}\]

Therefore ${P_n}$ is contained in the span of these vectors and this means that the span of these vectors is exactly ${P_n}$.
[Return to Problems]

There is one last idea about spans that we need to discuss and it's best illustrated with an example.

Example 5 Determine if the following sets of vectors will span ${\mathbb{R}^3}$.
(a) ${\vec v_1}$, ${\vec v_2}$ and ${\vec v_3}$ [Solution]
(b) ${\vec v_1}$, ${\vec v_2}$ and ${\vec v_3}$ [Solution]

Solution

(a) ${\vec v_1}$, ${\vec v_2}$ and ${\vec v_3}$

Okay, let's think about how we've got to approach this. Clearly the span of these vectors will be in ${\mathbb{R}^3}$ since they are vectors from ${\mathbb{R}^3}$. The real question is whether or not ${\mathbb{R}^3}$ will be contained in the span of these vectors, $\mathrm{span}\left\{ {{{\vec v}_1},{{\vec v}_2},{{\vec v}_3}} \right\}$. In the previous example our sets contained vectors with which we could easily show this. However, in this case it's not so clear. So to answer that question here we'll do the following.

Choose a general vector $\vec u = \left( {{u_1},{u_2},{u_3}} \right)$ from ${\mathbb{R}^3}$ and determine if we can find scalars ${c_1}$, ${c_2}$ and ${c_3}$ so that $\vec u$ is a linear combination of the given vectors. Or,

\[\vec u = {c_1}{\vec v_1} + {c_2}{\vec v_2} + {c_3}{\vec v_3}\]

If we set components equal we arrive at a system of three linear equations in the unknowns ${c_1}$, ${c_2}$ and ${c_3}$. In matrix form this is $A\vec c = \vec u$, where the columns of the coefficient matrix A are the three given vectors.

What we need to do is to determine if this system will be consistent (i.e. have at least one solution) for every possible choice of $\vec u = \left( {{u_1},{u_2},{u_3}} \right)$. Nicely enough this is very easy to do if you recall Theorem 9 from the section on Determinant Properties. This theorem tells us that this system will be consistent for every choice of $\vec u$ provided the coefficient matrix is invertible and we can check that by doing a quick determinant computation. So, if we denote the coefficient matrix as A we'll leave it to you to verify that $\det \left( A \right) \ne 0$.
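Here is a minimal NumPy version of this determinant check. The three vectors are made up stand-ins, not the ones from this example; the columns of A are the candidate spanning vectors and the test is simply whether det(A) is nonzero.

    import numpy as np

    # Stand-in vectors for illustration; replace with the vectors being tested.
    v1 = np.array([2.0, 0.0, 1.0])
    v2 = np.array([-1.0, 3.0, 4.0])
    v3 = np.array([1.0, 1.0, -2.0])

    # Coefficient matrix whose columns are the candidate spanning vectors.
    A = np.column_stack([v1, v2, v3])

    # The system A @ c = u has a solution for every u exactly when A is
    # invertible, i.e. when its determinant is nonzero.
    d = np.linalg.det(A)
    print("det(A) =", d)
    print("spans R^3" if not np.isclose(d, 0.0) else "does not span R^3")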

Therefore the coefficient matrix is invertible and so this system will have a solution for every choice of $\vec u = \left( {{u_1},{u_2},{u_3}} \right)$. This in turn tells us that ${\mathbb{R}^3}$ is contained in $\mathrm{span}\left\{ {{{\vec v}_1},{{\vec v}_2},{{\vec v}_3}} \right\}$ and so we've now shown that

\[\mathrm{span}\left\{ {{{\vec v}_1},{{\vec v}_2},{{\vec v}_3}} \right\} = {\mathbb{R}^3}\]

[Return to Problems]

(b) ${\vec v_1}$, ${\vec v_2}$ and ${\vec v_3}$

We'll do this one a little quicker. As with the first part, let's choose a general vector $\vec u = \left( {{u_1},{u_2},{u_3}} \right)$ from ${\mathbb{R}^3}$ and form up the system that we need to solve. We'll leave it to you to verify that the matrix form of this system is again $A\vec c = \vec u$, with the three given vectors as the columns of A.

This system will have a solution for every choice of $\vec u = \left( {{u_1},{u_2},{u_3}} \right)$ if the coefficient matrix, A, is invertible. However, in this case we have $\det \left( A \right) = 0$ (you should verify this) and so the coefficient matrix is not invertible.

This in turn tells us that there is at least one choice of $\vec u = \left( {{u_1},{u_2},{u_3}} \right)$ for which this system will not have a solution and so that $\vec u$ cannot be written as a linear combination of these three vectors. Note that there are in fact infinitely many choices of $\vec u$ that will not yield solutions!

Now, we know that $\mathrm{span}\left\{ {{{\vec v}_1},{{\vec v}_2},{{\vec v}_3}} \right\}$ is contained in ${\mathbb{R}^3}$, but we've just shown that there is at least one vector from ${\mathbb{R}^3}$ that is not contained in $\mathrm{span}\left\{ {{{\vec v}_1},{{\vec v}_2},{{\vec v}_3}} \right\}$ and so the span of these three vectors will not be all of ${\mathbb{R}^3}$.

[Return to Problems]

This example has shown us two things. First, it has shown us that we can't just write down any set of three vectors and expect them to span ${\mathbb{R}^3}$. This is an idea we're going to be looking at in much greater detail in the next couple of sections.

Secondly, we've now seen at least two different sets of vectors that will span ${\mathbb{R}^3}$: the three vectors from Example 5(a) as well as the standard basis vectors for ${\mathbb{R}^3}$. This tells us that the set of vectors that will span a vector space is not unique. In other words, we can have more than one set of vectors span the same vector space.
