HW#2, solutions to selected problems

2) This is simple but important. You can find a counterexample by taking almost any three vectors in R^2. For example, the vectors (1,0), (0,1), and (1,1) will do the trick: note that (1,1) = (1,0) + (0,1), so these three vectors are linearly dependent, even though any two of them are linearly independent.
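If you like checking such things by machine, here is a tiny sketch in sympy (a Python library for symbolic math) that verifies these facts about the three vectors above; it is not part of the required solution.

    import sympy as sp

    # The three vectors from the counterexample above, as columns.
    v1, v2, v3 = sp.Matrix([1, 0]), sp.Matrix([0, 1]), sp.Matrix([1, 1])
    print(sp.Matrix.hstack(v1, v2, v3).rank())    # 2 < 3: the three vectors are linearly dependent
    print(sp.Matrix.hstack(v1, v2).rank(),
          sp.Matrix.hstack(v1, v3).rank(),
          sp.Matrix.hstack(v2, v3).rank())        # 2 2 2: any two of them are linearly independent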

3) This is straightforward: we must check that the intersection of the subspaces W_1 and W_2 is itself a subspace. Let's do it.

Nonempty: since W_1 and W_2 are subspaces, they both contain the 0 vector, so 0 is in the intersection of W_1 and W_2, so this intersection is nonempty.

Closure under addition: suppose x and y are in the intersection of W_1 and W_2. Then x and y are in both W_1 and W_2. Since W_1 is a subspace, and x and y are in W_1, it follows that x+y is in W_1. Likewise, since W_2 is a subspace, x+y is in W_2. Since x+y is in both W_1 and W_2, it follows that x+y is in the intersection of W_1 and W_2.

Closure under scalar multiplication: same idea, but simpler. Suppose x is in the intersection of W_1 and W_2 and c is a scalar. Since x is in W_1 and W_1 is a subspace, cx is in W_1; likewise cx is in W_2. So cx is in the intersection of W_1 and W_2.
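To see the result in action on a concrete example, here is a small sympy (Python) sketch; the two subspaces of R^3 below are my own illustration, not taken from the problem. It computes a basis for the intersection of two planes in R^3.

    import sympy as sp

    # Illustration (my choice of subspaces, not from the problem):
    # W_1 = span{(1,0,0),(0,1,0)} and W_2 = span{(0,1,0),(0,0,1)} in R^3.
    A = sp.Matrix([[1, 0], [0, 1], [0, 0]])   # columns span W_1
    B = sp.Matrix([[0, 0], [1, 0], [0, 1]])   # columns span W_2
    # A vector lies in both spans exactly when A*x = B*y for some x, y,
    # i.e. when (x, y) is in the nullspace of the block matrix [A | -B].
    M = A.row_join(-B)
    for n in M.nullspace():
        print((A * n[:2, :]).T)               # Matrix([[0, 1, 0]]): the intersection is the line spanned by (0,1,0)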

4) Suppose a_1v_1+...+a_nv_n=b_1v_1+...+b_nv_n. Subtracting, we obtain (a_1-b_1)v_1+...+(a_n-b_n)v_n=0. Since v_1,...,v_n are linearly independent by hypothesis, each coefficient a_i-b_i=0, so a_i=b_i, for each i=1,...,n. (Isn't that cute?)
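Here is the same phenomenon in a concrete case, sketched in sympy (Python); the two vectors below are my own example, not from the problem. Since they are independent, a vector in their span has exactly one set of coefficients.

    import sympy as sp

    # Illustration (my choice of vectors, not from the problem): v1 and v2 are linearly independent.
    v1 = sp.Matrix([1, 1, 0])
    v2 = sp.Matrix([0, 1, 1])
    A = sp.Matrix.hstack(v1, v2)
    x = 2*v1 + 3*v2                            # x = (2, 5, 3)
    c1, c2 = sp.symbols('c1 c2')
    print(sp.linsolve((A, x), c1, c2))         # {(2, 3)}: exactly one way to write x in terms of v1 and v2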

6) This seems almost obvious, but how do we prove it? Some thought is required. First pass: We need to show that W=V. Since W is a subset of V, we just have to check that if x is a vector in V, then x is also in W. How do we do that? Well, let's suppose that x is not in W, and try to deduce a contradiction. We have to use our assumptions somehow. It helps to give things names, so let n denote the dimension of W, and let's choose a basis w_1,...,w_n of W. We observe that the vectors w_1,...,w_n,x have to be linearly independent. (Proof: This is by an argument we gave in class. Suppose c_1w_1+...+c_nw_n+c_{n+1}x=0. We have to show that c_1=...=c_{n+1}=0. We must have c_{n+1}=0, because otherwise we could divide the above equation by -c_{n+1} and move x to the other side to find that x is in the span of w_1,...,w_n, contradicting the fact that x is not in W. But then c_1w_1+...+c_nw_n=0, and since w_1,...,w_n are independent, it follows that c_1=...=c_n=0.) But now we have n+1 linearly independent vectors in V, while dim(V)=n, and we know from a proposition proved in class that this is impossible. This contradiction completes the proof.

Remark: the above argument is a bit convoluted. Sometimes it is possible to logically transform a proof by contradiction into a direct proof and get a simpler argument. In this case we obtain the following.

Polished proof: We need to show that W=V. Since W is a subset of V, we just have to check that if x is a vector in V, then x is also in W. Let w_1,...,w_n be a basis for W. Since dim(V)=n, the n+1 vectors w_1,...,w_n,x in V must be linearly dependent, by a proposition proved in class. So there is a nontrivial linear relation c_1w_1+...+c_nw_n+c_{n+1}x=0. Now c_{n+1} cannot equal zero, because otherwise this would be a nontrivial relation among w_1,...,w_n alone, and we know that w_1,...,w_n are independent. Dividing the above equation by -c_{n+1} and moving x to the other side, we find that x is in the span of w_1,...,w_n, so x is in W.
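As a concrete (and admittedly low-dimensional) illustration of the polished proof, here is a sympy (Python) sketch; the basis w_1, w_2 of W below is my own choice, not from the problem. With V = R^2 and dim(W) = 2, every vector x of V is forced into the span of w_1, w_2.

    import sympy as sp

    # Illustration (my choice, not from the problem): V = R^2, and w1, w2 are independent,
    # so W = span{w1, w2} is a subspace of V with dim(W) = dim(V) = 2.
    w1 = sp.Matrix([1, 2])
    w2 = sp.Matrix([3, 4])
    A = sp.Matrix.hstack(w1, w2)
    x1, x2 = sp.symbols('x1 x2')
    x = sp.Matrix([x1, x2])                    # an arbitrary vector of V
    print(A.solve(x))                          # the coefficients c with c1*w1 + c2*w2 = x, so x is in W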

Note: if W is a subspace of V and dim(W)=dim(V)=infinity, it does not necessarily follow that W=V. For example, take W to be the space of differentiable functions from R to R, and V the space of continuous functions from R to R. Then W is a proper subspace of V (the function f(x)=|x| is continuous but not differentiable at 0), and both spaces are infinite-dimensional (each contains the linearly independent functions 1, x, x^2, ...).