
Suggested Solutions to HW 4

October 14, 2010


1. (8a) Suppose, for contradiction, that the set is linearly dependent. Then we get an equality a1(1, 1, 0) + a2(1, 0, 1) + a3(0, 1, 1) = 0 for some a1, a2, a3 ∈ F which are not all zero. Comparing coordinates gives a1 + a2 = 0, a1 + a3 = 0, and a2 + a3 = 0. The first two equations give a2 = a3, which together with the third gives 2a3 = 0, so a3 = 0 (here we use that 2 ≠ 0 in F), and hence a1 = a2 = a3 = 0. Contradiction. Hence, these three vectors are linearly independent.
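As a concrete sanity check over the rationals (a sketch, not the general field F of the exercise), the three vectors can be stacked into a matrix and the determinant computed; a nonzero determinant confirms linear independence:

```python
# Stack (1,1,0), (1,0,1), (0,1,1) as rows and compute the 3x3 determinant
# by cofactor expansion; a nonzero value means the rows are independent.
def det3(m):
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
            - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
            + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

d = det3([[1, 1, 0], [1, 0, 1], [0, 1, 1]])
print(d)  # -2: nonzero, so the vectors are independent over Q
```

Note that −2 reduces to 0 modulo 2, matching the point in the proof where we must divide by 2.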

2. (9) Suppose u or v is a multiple of the other. Then there exists a ∈ F, a ≠ 0, such that u = av or v = au. In other words, u − av = 0 or v − au = 0. This implies {u, v} is linearly dependent. Conversely, if {u, v} is linearly dependent, then there exist a, b ∈ F, with a ≠ 0 or b ≠ 0, such that au + bv = 0. If b ≠ 0, then we get v = −(a/b)u. If a ≠ 0, we get u = −(b/a)v. This proves the result.

3. (10) {(1, 0, 0), (0, 1, 0), (1, 1, 0)} is an example. We can check that (1, 0, 0) + (0, 1, 0) − (1, 1, 0) = 0, so the set is linearly dependent. Also, no one of these vectors is a multiple of another.

4. (11) Our conclusion is that there are 2^n vectors in span(S). Every element of span(S) can be represented as a1u1 + a2u2 + ... + anun with ai ∈ Z2 for i = 1, 2, ..., n. Since the field is Z2, each ai can only be 0 or 1, so every coefficient has exactly two choices. Moreover, because u1, ..., un are linearly independent, distinct choices of coefficients give distinct vectors. Hence span(S) contains 2 · 2 ··· 2 = 2^n vectors.
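The count can be checked directly in a short script; as a stand-in for a generic linearly independent set, this sketch uses the standard basis vectors of Z2^n (an assumption made for illustration):

```python
from itertools import product

def span_size_over_Z2(vectors):
    """Count the distinct linear combinations of `vectors` over Z2."""
    dim = len(vectors[0])
    combos = set()
    for coeffs in product([0, 1], repeat=len(vectors)):
        v = tuple(sum(c * u[i] for c, u in zip(coeffs, vectors)) % 2
                  for i in range(dim))
        combos.add(v)
    return len(combos)

# Standard basis of Z2^4 -- a concrete linearly independent set with n = 4.
basis = [tuple(1 if j == i else 0 for j in range(4)) for i in range(4)]
print(span_size_over_Z2(basis))  # 2**4 = 16
```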

5. (13) (a) Suppose {u, v} is linearly independent, so au + bv = 0 implies a = b = 0. Now, r(u + v) + s(u − v) = 0 implies (r + s)u + (r − s)v = 0. It means r + s = 0 and r − s = 0, and thus r = s = 0. So {u + v, u − v} is linearly independent.

Now suppose {u + v, u − v} is linearly independent, so a(u + v) + b(u − v) = 0 implies a = b = 0. If ru + sv = 0, then ((r + s)/2)(u + v) + ((r − s)/2)(u − v) = 0 (here we use that the characteristic of F is not two, so we may divide by 2). It means (r + s)/2 = (r − s)/2 = 0, and thus r = s = 0. So {u, v} is linearly independent.

(b) Suppose {u, v, w} is linearly independent, so au + bv + cw = 0 implies a = b = c = 0. Now, r(u + v) + s(u + w) + t(v + w) = 0 implies (r + s)u + (r + t)v + (s + t)w = 0. It means r + s = r + t = s + t = 0. By solving this system of linear equations, we get r = s = t = 0. So {u + v, u + w, v + w} is linearly independent.

Now suppose {u + v, u + w, v + w} is linearly independent, so a(u + v) + b(u + w) + c(v + w) = 0 implies a = b = c = 0. If ru + sv + tw = 0, then ((r + s − t)/2)(u + v) + ((r − s + t)/2)(u + w) + ((−r + s + t)/2)(v + w) = 0. Hence (r + s − t)/2 = (r − s + t)/2 = (−r + s + t)/2 = 0. Again, solving this system of linear equations, we get r = s = t = 0. So {u, v, w} is linearly independent.

6. (14) Suppose S is linearly dependent and S ≠ {0}. Then there are m distinct vectors u1, u2, ..., um in S and scalars a1, a2, ..., am, not all zero, such that a1u1 + a2u2 + ... + amum = 0. Since the ai are not all zero, without loss of generality we may assume am ≠ 0. We then have a representation of um as um = −(a1/am)u1 − (a2/am)u2 − ... − (am−1/am)um−1. Let v = um and n = m − 1. We conclude that v is a linear combination of u1, ..., un.

Conversely, if S = {0}, then S is linearly dependent. Now suppose there exist n + 1 distinct vectors v, u1, u2, ..., un in S such that v is a linear combination of u1, u2, ..., un. We have v = a1u1 + a2u2 + ... + anun. This implies a1u1 + a2u2 + ... + anun − v = 0, in which the coefficient of v is −1 ≠ 0. This means S is linearly dependent.

7. (16) Suppose S is linearly independent. By Theorem 1.6, every finite subset of S is also linearly independent. Now suppose every finite subset of S is linearly independent, and assume for contradiction that S is linearly dependent. Then there exist a finite number of distinct vectors u1, u2, ..., um in S and scalars a1, a2, ..., am, not all zero, such that a1u1 + a2u2 + ... + amum = 0. Then {u1, ..., um} is a finite linearly dependent subset of S, which contradicts the condition that every finite subset of S is linearly independent. Hence, S must be linearly independent.

8. (18) We prove the result by mathematical induction on the number of polynomials. If S consists of only one nonzero polynomial, S is clearly linearly independent. Now assume the result is true when S consists of m nonzero polynomials, no two of which have the same degree. We need to prove that the result is true when S contains m + 1 such polynomials. Consider the set S = {p1, ..., pm, pm+1}. Without loss of generality, we may assume the degree of pm+1 is larger than the degree of pi for 1 ≤ i ≤ m. We now consider the equality am+1pm+1 + ... + a1p1 = 0. Let d be the degree of pm+1. Expanding the left-hand side, the coefficient of x^d is am+1 times the (nonzero) leading coefficient of pm+1, since every other pi has degree less than d. Thus, am+1pm+1 + ... + a1p1 = 0 implies am+1 = 0. This implies ampm + ... + a1p1 = 0. However, by the induction hypothesis, {p1, ..., pm} is linearly independent. Hence, we have a1 = a2 = ... = am = am+1 = 0 and {p1, ..., pm+1} is linearly independent. This proves the result by mathematical induction.

9. (10)(a) The polynomial must be of degree at least 2. We construct the function g by the Lagrange interpolation process described in the textbook. Set

f1(x) = (x + 1)(x − 1) / ((−2 + 1)(−2 − 1)),
f2(x) = (x − 1)(x + 2) / ((−1 − 1)(−1 + 2)),
f3(x) = (x + 2)(x + 1) / ((1 + 2)(1 + 1)).

Now, g = a1f1 + a2f2 + a3f3, where a1 = −6, a2 = 5, a3 = 3. We get g(x) = −4x^2 − x + 8.
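The interpolation can be double-checked with exact rational arithmetic. In Lagrange interpolation the coefficients ai are the prescribed values g(ci) at the nodes, so the data points read off from the solution are g(−2) = −6, g(−1) = 5, g(1) = 3:

```python
from fractions import Fraction

def lagrange_coeffs(points):
    """Return coefficients [c0, c1, ...] of the interpolating polynomial
    c0 + c1*x + ... passing through the given (x, y) points."""
    n = len(points)
    coeffs = [Fraction(0)] * n
    for i, (xi, yi) in enumerate(points):
        # Build the Lagrange basis polynomial f_i as a coefficient list.
        basis = [Fraction(1)]
        denom = Fraction(1)
        for j, (xj, _) in enumerate(points):
            if j == i:
                continue
            basis = [Fraction(0)] + basis          # multiply by x
            for k in range(len(basis) - 1):
                basis[k] -= xj * basis[k + 1]      # ... minus xj
            denom *= (xi - xj)
        for k in range(n):
            coeffs[k] += yi * basis[k] / denom
    return coeffs

# Data points recovered from the solution: g(-2) = -6, g(-1) = 5, g(1) = 3.
print(lagrange_coeffs([(-2, -6), (-1, 5), (1, 3)]))  # 8, -1, -4: g(x) = -4x^2 - x + 8
```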

10. (16) Let Eij be the matrix whose only nonzero entry is a 1 in the i-th row and j-th column. It is easy to see that B = {Eij | i ≤ j, 1 ≤ i, j ≤ n} is a basis of W. We can then calculate the dimension of W by counting the number of elements in B: 1 + 2 + ... + n = n(n + 1)/2. Hence, the dimension of W is n(n + 1)/2.
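The counting step can be verified mechanically for small n:

```python
# Count the basis matrices E_ij with i <= j for small n and compare
# with the closed form n(n+1)/2.
for n in range(1, 6):
    count = sum(1 for i in range(1, n + 1)
                for j in range(1, n + 1) if i <= j)
    assert count == n * (n + 1) // 2
    print(n, count)  # e.g. n = 3 gives 6, n = 4 gives 10
```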

11. (19) If a1A1^t + a2A2^t + ... + anAn^t = 0, then we get (a1A1 + a2A2 + ... + anAn)^t = 0. Hence, a1A1 + a2A2 + ... + anAn = 0. Since {A1, A2, ..., An} is linearly independent, we must have a1 = a2 = ... = an = 0. So we get the conclusion that {A1^t, A2^t, ..., An^t} is linearly independent.

12. (22) The necessary and sufficient condition is that W1 ⊆ W2.
(⇒): Notice that W1 ∩ W2 ⊆ W1. Since the dimension of W1 is finite, the equality dim(W1 ∩ W2) = dim(W1) and Theorem 1.11 imply W1 ∩ W2 = W1. Hence, W1 ⊆ W2.
(⇐): If we have W1 ⊆ W2, then W1 ∩ W2 = W1. Hence, dim(W1 ∩ W2) = dim(W1).

13. (23) (a) The necessary and sufficient condition on v is that v is a linear combination of v1, ..., vk. Suppose this is true. Then v ∈ span(v1, ..., vk), which implies span(v, v1, ..., vk) ⊆ span(v1, ..., vk); the inclusion span(v1, ..., vk) ⊆ span(v, v1, ..., vk) is trivial. So we get dim(W1) = dim(W2). Now suppose dim(W1) = dim(W2). {v1, ..., vk} is a linearly independent subset of W2, and since dim(W2) = k, {v1, ..., vk} is a basis of W2. This implies span(v, v1, ..., vk) = span(v1, ..., vk). Hence v is a linear combination of v1, ..., vk.

(b) If dim(W1) ≠ dim(W2), then v cannot be a linear combination of v1, ..., vk; otherwise, by part (a), we would have span(v, v1, ..., vk) = span(v1, ..., vk), which contradicts the assumption. Since v is not a linear combination of v1, ..., vk, the set {v, v1, ..., vk} is linearly independent, so comparing bases of the two spaces, the number of elements in a basis of W2 is equal to the number of elements in a basis of W1 plus one. In other words, we have dim(W1) + 1 = dim(W2).

14. (25) Let {v1 , v2 , , vm } be a basis of V , and {w1 , w2 , , wn } be a basis of W . It is easy to check that {(v1 , 0), (v2 , 0), , (vm , 0), (0, w1 ), (0, w2 ), , (0, wn )} is a basis for the vector space Z, so dim(Z) = m + n.
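As a concrete illustration of this product basis (with hypothetical m = 2 and n = 3, using standard bases), the construction can be listed explicitly:

```python
# Standard bases of V = R^2 and W = R^3, chosen for illustration.
V_basis = [(1, 0), (0, 1)]
W_basis = [(1, 0, 0), (0, 1, 0), (0, 0, 1)]

# Basis of Z = V x W: pair each V-basis vector with the zero of W,
# and the zero of V with each W-basis vector.
zero_V, zero_W = (0, 0), (0, 0, 0)
Z_basis = [(v, zero_W) for v in V_basis] + [(zero_V, w) for w in W_basis]
print(len(Z_basis))  # m + n = 5
```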

15. (29) (a) Let {u1, ..., uk} be a basis of W1 ∩ W2, extend it to a basis {u1, ..., uk, v1, ..., vm} of W1 and to a basis {u1, ..., uk, w1, ..., wn} of W2. Then we can check that S = {u1, ..., uk, v1, ..., vm, w1, ..., wn} is a basis of W1 + W2. That S is a generating set is trivial; to prove that S is a basis, we need to prove S is linearly independent. Consider a1u1 + ... + akuk + b1v1 + ... + bmvm + c1w1 + ... + cnwn = 0. Rearranging the equality, we get c1w1 + ... + cnwn = −(a1u1 + ... + akuk + b1v1 + ... + bmvm) ∈ W1. Hence, c1w1 + ... + cnwn ∈ W1 ∩ W2. We can then write c1w1 + ... + cnwn = d1u1 + ... + dkuk, which gives c1w1 + ... + cnwn − d1u1 − ... − dkuk = 0. Since {u1, ..., uk, w1, ..., wn} is linearly independent, we have c1 = c2 = ... = cn = 0. Similarly, b1 = b2 = ... = bm = 0. This implies a1u1 + ... + akuk = 0, and by the linear independence of {u1, ..., uk} we also have a1 = ... = ak = 0. Since S is a basis with k + m + n elements, W1 + W2 is finite-dimensional and dim(W1 + W2) = k + m + n. On the other hand, dim W1 + dim W2 − dim(W1 ∩ W2) = (k + m) + (k + n) − k = k + m + n = dim(W1 + W2). So we get dim W1 + dim W2 − dim(W1 ∩ W2) = dim(W1 + W2).

(b) If V = W1 ⊕ W2, then W1 + W2 = V and W1 ∩ W2 = {0}. Hence dim V = dim(W1 + W2) = dim(W1) + dim(W2) − dim(W1 ∩ W2) = dim(W1) + dim(W2). Conversely, suppose dim V = dim(W1) + dim(W2). Since dim V = dim(W1 + W2) = dim(W1) + dim(W2) − dim(W1 ∩ W2), we have dim(W1 ∩ W2) = 0, which means W1 ∩ W2 = {0}. We get V = W1 ⊕ W2.

16. (30) Write elements of W1 as [a b; c a], the 2×2 matrices whose diagonal entries are equal. First, we can easily see that 0 = [0 0; 0 0] ∈ W1. If x, y ∈ W1, say x = [a1 b1; c1 a1] and y = [a2 b2; c2 a2], then x + y = [a1+a2 b1+b2; c1+c2 a1+a2] ∈ W1. Also, if x ∈ W1 and k ∈ F, then kx = [ka1 kb1; kc1 ka1] ∈ W1. So W1 is a subspace of V. By the same steps, we can prove that W2 = {[0 a; −a b] : a, b ∈ F} is a subspace of V.

Since {[1 0; 0 1], [0 1; 0 0], [0 0; 1 0]} is a basis of W1, we get dim W1 = 3. Also, observe that {[0 1; −1 0], [0 0; 0 1]} is a basis of W2, so we get dim W2 = 2.

Finally, we can check that W1 ∩ W2 = {[0 a; −a 0] : a ∈ F}. Hence, {[0 1; −1 0]} is a basis of W1 ∩ W2. We can conclude that dim(W1 ∩ W2) = 1. Therefore, we have dim(W1 + W2) = dim W1 + dim W2 − dim(W1 ∩ W2) = 4.
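The four dimensions can be verified numerically by flattening each 2×2 matrix into a length-4 vector and computing ranks with exact rational Gaussian elimination (a sketch over Q):

```python
from fractions import Fraction

def rank(rows):
    """Gaussian elimination over Q; returns the number of pivots."""
    m = [[Fraction(x) for x in row] for row in rows]
    r = 0
    for c in range(len(m[0])):
        piv = next((i for i in range(r, len(m)) if m[i][c] != 0), None)
        if piv is None:
            continue
        m[r], m[piv] = m[piv], m[r]
        for i in range(len(m)):
            if i != r and m[i][c] != 0:
                f = m[i][c] / m[r][c]
                m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

# 2x2 matrices flattened to length-4 vectors (row-major order).
W1 = [(1, 0, 0, 1), (0, 1, 0, 0), (0, 0, 1, 0)]  # generators of {[a b; c a]}
W2 = [(0, 1, -1, 0), (0, 0, 0, 1)]               # generators of {[0 a; -a b]}
print(rank(W1), rank(W2), rank(W1 + W2))  # 3 2 4, so the intersection has dim 3 + 2 - 4 = 1
```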

17. (31) (a) Since W2 is finite-dimensional and W1 ∩ W2 ⊆ W2, we have dim(W1 ∩ W2) ≤ dim W2 = n.
(b) dim(W1 + W2) = dim W1 + dim W2 − dim(W1 ∩ W2) ≤ dim W1 + dim W2 = m + n.
