
LINEAR ALGEBRA

W W L CHEN
© W W L Chen, 1994, 2008.

This chapter is available free to all individuals, on the understanding that it is not to be used for financial gain, and may be downloaded and/or photocopied, with or without permission from the author. However, this document may not be kept on any information storage and retrieval system without permission from the author, unless such system is not accessible to any individuals other than its owners.

Chapter 5
INTRODUCTION TO VECTOR SPACES

5.1. Real Vector Spaces

Before we give any formal definition of a vector space, we shall consider a few concrete examples of such an abstract object. We first study two examples from the theory of vectors which we first discussed in Chapter 4.

Example 5.1.1. Consider the set R2 of all vectors of the form u = (u1, u2), where u1, u2 ∈ R. Consider vector addition and also multiplication of vectors by real numbers. It is easy to check that we have the following properties:
(1.1) For every u, v ∈ R2, we have u + v ∈ R2.
(1.2) For every u, v, w ∈ R2, we have u + (v + w) = (u + v) + w.
(1.3) For every u ∈ R2, we have u + 0 = 0 + u = u.
(1.4) For every u ∈ R2, we have u + (−u) = 0.
(1.5) For every u, v ∈ R2, we have u + v = v + u.
(2.1) For every c ∈ R and u ∈ R2, we have cu ∈ R2.
(2.2) For every c ∈ R and u, v ∈ R2, we have c(u + v) = cu + cv.
(2.3) For every a, b ∈ R and u ∈ R2, we have (a + b)u = au + bu.
(2.4) For every a, b ∈ R and u ∈ R2, we have (ab)u = a(bu).
(2.5) For every u ∈ R2, we have 1u = u.

Example 5.1.2. Consider the set R3 of all vectors of the form u = (u1, u2, u3), where u1, u2, u3 ∈ R. Consider vector addition and also multiplication of vectors by real numbers. It is easy to check that we have properties analogous to (1.1)–(1.5) and (2.1)–(2.5) in the previous example, with reference to R2 being replaced by R3.

We next turn to an example from the theory of matrices which we first discussed in Chapter 2.
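In passing, properties such as (1.1)–(2.5) lend themselves to quick numerical spot-checks. The sketch below (not part of the original text; tuples stand for vectors in R2) is illustrative only, since a finite check is of course no substitute for a general proof.

```python
# Spot-check of properties (1.1)-(2.5) for sample vectors in R2,
# with tuples as vectors; all names here are illustrative.
def add(u, v):
    return (u[0] + v[0], u[1] + v[1])

def scale(c, u):
    return (c * u[0], c * u[1])

u, v, w = (1.0, 2.0), (-3.0, 0.5), (4.0, -1.0)
a, b = 2.0, -5.0
zero = (0.0, 0.0)

assert add(u, add(v, w)) == add(add(u, v), w)                # (1.2)
assert add(u, zero) == u and add(zero, u) == u               # (1.3)
assert add(u, scale(-1.0, u)) == zero                        # (1.4)
assert add(u, v) == add(v, u)                                # (1.5)
assert scale(a, add(u, v)) == add(scale(a, u), scale(a, v))  # (2.2)
assert scale(a + b, u) == add(scale(a, u), scale(b, u))      # (2.3)
assert scale(a * b, u) == scale(a, scale(b, u))              # (2.4)
assert scale(1.0, u) == u                                    # (2.5)
```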
Chapter 5 : Introduction to Vector Spaces page 1 of 17


Example 5.1.3. Consider the set M2,2(R) of all 2 × 2 matrices with entries in R. Consider matrix addition and also multiplication of matrices by real numbers. Denote by O the 2 × 2 null matrix. It is easy to check that we have the following properties:
(1.1) For every P, Q ∈ M2,2(R), we have P + Q ∈ M2,2(R).
(1.2) For every P, Q, R ∈ M2,2(R), we have P + (Q + R) = (P + Q) + R.
(1.3) For every P ∈ M2,2(R), we have P + O = O + P = P.
(1.4) For every P ∈ M2,2(R), we have P + (−P) = O.
(1.5) For every P, Q ∈ M2,2(R), we have P + Q = Q + P.
(2.1) For every c ∈ R and P ∈ M2,2(R), we have cP ∈ M2,2(R).
(2.2) For every c ∈ R and P, Q ∈ M2,2(R), we have c(P + Q) = cP + cQ.
(2.3) For every a, b ∈ R and P ∈ M2,2(R), we have (a + b)P = aP + bP.
(2.4) For every a, b ∈ R and P ∈ M2,2(R), we have (ab)P = a(bP).
(2.5) For every P ∈ M2,2(R), we have 1P = P.

We also turn to an example from the theory of functions.

Example 5.1.4. Consider the set A of all functions of the form f : R → R. For any two functions f, g ∈ A, define the function f + g : R → R by writing (f + g)(x) = f(x) + g(x) for every x ∈ R. For every function f ∈ A and every number c ∈ R, define the function cf : R → R by writing (cf)(x) = cf(x) for every x ∈ R. Denote by 0 : R → R the zero function, where 0(x) = 0 for every x ∈ R. Then it is easy to check that we have the following properties:
(1.1) For every f, g ∈ A, we have f + g ∈ A.
(1.2) For every f, g, h ∈ A, we have f + (g + h) = (f + g) + h.
(1.3) For every f ∈ A, we have f + 0 = 0 + f = f.
(1.4) For every f ∈ A, we have f + (−f) = 0.
(1.5) For every f, g ∈ A, we have f + g = g + f.
(2.1) For every c ∈ R and f ∈ A, we have cf ∈ A.
(2.2) For every c ∈ R and f, g ∈ A, we have c(f + g) = cf + cg.
(2.3) For every a, b ∈ R and f ∈ A, we have (a + b)f = af + bf.
(2.4) For every a, b ∈ R and f ∈ A, we have (ab)f = a(bf).
(2.5) For every f ∈ A, we have 1f = f.

There are many more examples of sets where properties analogous to (1.1)–(1.5) and (2.1)–(2.5) in the four examples above hold.
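The pointwise operations of Example 5.1.4 can be sketched directly with closures; the following illustration (names assumed, not from the text) mirrors the definitions of f + g, cf and the zero function.

```python
# Pointwise operations on functions f: R -> R, as in Example 5.1.4.
def f_add(f, g):
    return lambda x: f(x) + g(x)     # (f + g)(x) = f(x) + g(x)

def f_scale(c, f):
    return lambda x: c * f(x)        # (cf)(x) = c f(x)

zero_fn = lambda x: 0.0              # the zero function, the additive identity

f = lambda x: x * x
g = lambda x: 3.0 * x

assert f_add(f, g)(2.0) == 10.0                  # (2^2) + (3*2)
assert f_add(f, zero_fn)(5.0) == f(5.0)          # property (1.3)
assert f_add(f, f_scale(-1.0, f))(7.0) == 0.0    # property (1.4)
```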
This apparent similarity leads us to consider an abstract object which will incorporate all these individual cases as examples. We say that these examples are all vector spaces over R.

Definition. A vector space V over R, or a real vector space V, is a set of objects, known as vectors, together with vector addition + and multiplication of vectors by elements of R, satisfying the following properties:
(VA1) For every u, v ∈ V, we have u + v ∈ V.
(VA2) For every u, v, w ∈ V, we have u + (v + w) = (u + v) + w.
(VA3) There exists an element 0 ∈ V such that for every u ∈ V, we have u + 0 = 0 + u = u.
(VA4) For every u ∈ V, there exists −u ∈ V such that u + (−u) = 0.
(VA5) For every u, v ∈ V, we have u + v = v + u.
(SM1) For every c ∈ R and u ∈ V, we have cu ∈ V.
(SM2) For every c ∈ R and u, v ∈ V, we have c(u + v) = cu + cv.
(SM3) For every a, b ∈ R and u ∈ V, we have (a + b)u = au + bu.
(SM4) For every a, b ∈ R and u ∈ V, we have (ab)u = a(bu).
(SM5) For every u ∈ V, we have 1u = u.

Remark. The elements a, b, c ∈ R discussed in (SM1)–(SM5) are known as scalars. Multiplication of vectors by elements of R is sometimes known as scalar multiplication.


Example 5.1.5. Let n ∈ N. Consider the set Rn of all vectors of the form u = (u1, . . . , un), where u1, . . . , un ∈ R. For any two vectors u = (u1, . . . , un) and v = (v1, . . . , vn) in Rn and any number c ∈ R, write

u + v = (u1 + v1, . . . , un + vn)    and    cu = (cu1, . . . , cun).

To check (VA1), simply note that u1 + v1, . . . , un + vn ∈ R. To check (VA2), note that if w = (w1, . . . , wn), then

u + (v + w) = (u1, . . . , un) + (v1 + w1, . . . , vn + wn) = (u1 + (v1 + w1), . . . , un + (vn + wn))
            = ((u1 + v1) + w1, . . . , ((un + vn) + wn) = (u1 + v1, . . . , un + vn) + (w1, . . . , wn)
            = (u + v) + w.

If we take 0 to be the zero vector (0, . . . , 0), then u + 0 = 0 + u = u, giving (VA3). Next, writing −u = (−u1, . . . , −un), we have u + (−u) = 0, giving (VA4). To check (VA5), note that u + v = (u1 + v1, . . . , un + vn) = (v1 + u1, . . . , vn + un) = v + u. To check (SM1), simply note that cu1, . . . , cun ∈ R. To check (SM2), note that

c(u + v) = c(u1 + v1, . . . , un + vn) = (c(u1 + v1), . . . , c(un + vn)) = (cu1 + cv1, . . . , cun + cvn)
         = (cu1, . . . , cun) + (cv1, . . . , cvn) = cu + cv.

To check (SM3), note that (a + b)u = ((a + b)u1, . . . , (a + b)un) = (au1 + bu1, . . . , aun + bun) = (au1, . . . , aun) + (bu1, . . . , bun) = au + bu. To check (SM4), note that (ab)u = ((ab)u1, . . . , (ab)un) = (a(bu1), . . . , a(bun)) = a(bu1, . . . , bun) = a(bu). Finally, to check (SM5), note that 1u = (1u1, . . . , 1un) = (u1, . . . , un) = u. It follows that Rn is a vector space over R. This is known as the n-dimensional Euclidean space.

Example 5.1.6. Let k ∈ N. Consider the set Pk of all polynomials of the form

p(x) = p0 + p1x + . . . + pkxk,    where p0, p1, . . . , pk ∈ R.

In other words, Pk is the set of all polynomials of degree at most k and with coefficients in R. For any two polynomials p(x) = p0 + p1x + . . . + pkxk and q(x) = q0 + q1x + . . . + qkxk in Pk and for any number c ∈ R, write

p(x) + q(x) = (p0 + q0) + (p1 + q1)x + . . . + (pk + qk)xk    and    cp(x) = cp0 + cp1x + . . . + cpkxk.
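A polynomial in Pk is determined by its coefficients, so the operations just defined can be sketched as operations on coefficient lists (an assumed representation, for illustration only).

```python
# Polynomials in P_k as coefficient lists [p0, p1, ..., pk].
def p_add(p, q):
    return [a + b for a, b in zip(p, q)]   # add coefficients termwise

def p_scale(c, p):
    return [c * a for a in p]              # scale every coefficient

p = [1, 0, 2]                              # 1 + 2x^2 in P_2
q = [0, 3, -2]                             # 3x - 2x^2 in P_2
assert p_add(p, q) == [1, 3, 0]            # (1 + 2x^2) + (3x - 2x^2) = 1 + 3x
assert p_scale(3, p) == [3, 0, 6]
assert p_add(p, p_scale(-1, p)) == [0, 0, 0]   # p(x) + (-p(x)) = 0
```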

To check (VA1), simply note that p0 + q0, . . . , pk + qk ∈ R. To check (VA2), note that if we write r(x) = r0 + r1x + . . . + rkxk, then we have

p(x) + (q(x) + r(x)) = (p0 + p1x + . . . + pkxk) + ((q0 + r0) + (q1 + r1)x + . . . + (qk + rk)xk)
                     = (p0 + (q0 + r0)) + (p1 + (q1 + r1))x + . . . + (pk + (qk + rk))xk
                     = ((p0 + q0) + r0) + ((p1 + q1) + r1)x + . . . + ((pk + qk) + rk)xk
                     = ((p0 + q0) + (p1 + q1)x + . . . + (pk + qk)xk) + (r0 + r1x + . . . + rkxk)
                     = (p(x) + q(x)) + r(x).


If we take 0 to be the zero polynomial 0 + 0x + . . . + 0xk, then p(x) + 0 = 0 + p(x) = p(x), giving (VA3). Next, writing −p(x) = −p0 − p1x − . . . − pkxk, we have p(x) + (−p(x)) = 0, giving (VA4). To check (VA5), note that

p(x) + q(x) = (p0 + q0) + (p1 + q1)x + . . . + (pk + qk)xk = (q0 + p0) + (q1 + p1)x + . . . + (qk + pk)xk
            = q(x) + p(x).

To check (SM1), simply note that cp0, . . . , cpk ∈ R. To check (SM2), note that

c(p(x) + q(x)) = c((p0 + q0) + (p1 + q1)x + . . . + (pk + qk)xk)
               = c(p0 + q0) + c(p1 + q1)x + . . . + c(pk + qk)xk
               = (cp0 + cq0) + (cp1 + cq1)x + . . . + (cpk + cqk)xk
               = (cp0 + cp1x + . . . + cpkxk) + (cq0 + cq1x + . . . + cqkxk)
               = cp(x) + cq(x).

To check (SM3), note that

(a + b)p(x) = (a + b)p0 + (a + b)p1x + . . . + (a + b)pkxk
            = (ap0 + bp0) + (ap1 + bp1)x + . . . + (apk + bpk)xk
            = (ap0 + ap1x + . . . + apkxk) + (bp0 + bp1x + . . . + bpkxk)
            = ap(x) + bp(x).

To check (SM4), note that

(ab)p(x) = (ab)p0 + (ab)p1x + . . . + (ab)pkxk = a(bp0) + a(bp1)x + . . . + a(bpk)xk
         = a(bp0 + bp1x + . . . + bpkxk) = a(bp(x)).

Finally, to check (SM5), note that 1p(x) = 1p0 + 1p1x + . . . + 1pkxk = p0 + p1x + . . . + pkxk = p(x). It follows that Pk is a vector space over R. Note also that the vectors are the polynomials.

There are a few simple properties of vector spaces that we can deduce easily from the definition.

PROPOSITION 5A. Suppose that V is a vector space over R, and that u ∈ V and c ∈ R.
(a) We have 0u = 0.
(b) We have c0 = 0.
(c) We have (−1)u = −u.
(d) If cu = 0, then c = 0 or u = 0.

Proof. (a) By (SM1), we have 0u ∈ V. Hence

0u + 0u = (0 + 0)u     (by (SM3))
        = 0u           (since 0 ∈ R).

It follows that

0u = 0u + 0                    (by (VA3))
   = 0u + (0u + (−(0u)))       (by (VA4))
   = (0u + 0u) + (−(0u))       (by (VA2))
   = 0u + (−(0u))              (from above)
   = 0                         (by (VA4)).

(b) By (SM1), we have c0 ∈ V. Hence

c0 + c0 = c(0 + 0)     (by (SM2))
        = c0           (by (VA3)).

It follows that

c0 = c0 + 0                    (by (VA3))
   = c0 + (c0 + (−(c0)))       (by (VA4))
   = (c0 + c0) + (−(c0))       (by (VA2))
   = c0 + (−(c0))              (from above)
   = 0                         (by (VA4)).

(c) We have

(−1)u = (−1)u + 0              (by (VA3))
      = (−1)u + (u + (−u))     (by (VA4))
      = ((−1)u + u) + (−u)     (by (VA2))
      = ((−1)u + 1u) + (−u)    (by (SM5))
      = ((−1) + 1)u + (−u)     (by (SM3))
      = 0u + (−u)              (since −1 + 1 = 0 in R)
      = 0 + (−u)               (from (a))
      = −u                     (by (VA3)).

(d) Suppose that cu = 0 and c ≠ 0. Then c⁻¹ ∈ R and

u = 1u          (by (SM5))
  = (c⁻¹c)u     (since c ∈ R \ {0})
  = c⁻¹(cu)     (by (SM4))
  = c⁻¹0        (assumption)
  = 0           (from (b)),

as required.

5.2. Subspaces

Example 5.2.1. Consider the vector space R2 of all points (x, y), where x, y ∈ R. Let L be a line through the origin 0 = (0, 0). Suppose that L is represented by the equation αx + βy = 0; in other words,

L = {(x, y) ∈ R2 : αx + βy = 0}.

Note first of all that 0 = (0, 0) ∈ L, so that (VA3) and (VA4) clearly hold in L. Also (VA2) and (VA5) clearly hold in L. To check (VA1), note that if (x, y), (u, v) ∈ L, then αx + βy = 0 and αu + βv = 0, so that α(x + u) + β(y + v) = 0, whence (x, y) + (u, v) = (x + u, y + v) ∈ L. Next, note that (SM2)–(SM5) clearly hold in L. To check (SM1), note that if (x, y) ∈ L, then αx + βy = 0, so that α(cx) + β(cy) = 0, whence c(x, y) = (cx, cy) ∈ L. It follows that L forms a vector space over R. In fact, we have shown that every line in R2 through the origin is a vector space over R.

Definition. Suppose that V is a vector space over R, and that W is a subset of V. Then we say that W is a subspace of V if W forms a vector space over R under the vector addition and scalar multiplication defined in V.

Example 5.2.2. We have just shown in Example 5.2.1 that every line in R2 through the origin is a subspace of R2. On the other hand, if we work through the example again, then it is clear that we have really only checked conditions (VA1) and (SM1) for L, and that 0 = (0, 0) ∈ L.

PROPOSITION 5B. Suppose that V is a vector space over R, and that W is a non-empty subset of V. Then W is a subspace of V if the following conditions are satisfied:
(SP1) For every u, v ∈ W, we have u + v ∈ W.
(SP2) For every c ∈ R and u ∈ W, we have cu ∈ W.

Proof. To show that W is a vector space over R, it remains to check that W satisfies (VA2)–(VA5) and (SM2)–(SM5). To check (VA3) and (VA4) for W, it clearly suffices to check that 0 ∈ W. Since W is non-empty, there exists u ∈ W. Then it follows from (SP2) and Proposition 5A(a) that 0 = 0u ∈ W. The remaining conditions (VA2), (VA5) and (SM2)–(SM5) hold for all vectors in V, and hence also for all vectors in W.

Example 5.2.3. Consider the vector space R3 of all points (x, y, z), where x, y, z ∈ R. Let P be a plane through the origin 0 = (0, 0, 0). Suppose that P is represented by the equation αx + βy + γz = 0; in other words,

P = {(x, y, z) ∈ R3 : αx + βy + γz = 0}.

To check (SP1), note that if (x, y, z), (u, v, w) ∈ P, then αx + βy + γz = 0 and αu + βv + γw = 0, so that α(x + u) + β(y + v) + γ(z + w) = 0, whence (x, y, z) + (u, v, w) = (x + u, y + v, z + w) ∈ P. To check (SP2), note that if (x, y, z) ∈ P, then αx + βy + γz = 0, so that α(cx) + β(cy) + γ(cz) = 0, whence c(x, y, z) = (cx, cy, cz) ∈ P. It follows that P is a subspace of R3.

Next, let L be a line through the origin 0 = (0, 0, 0). Suppose that (α, β, γ) ∈ R3 is a non-zero point on L. Then we can write

L = {t(α, β, γ) : t ∈ R}.

Suppose that u = t(α, β, γ) ∈ L and v = s(α, β, γ) ∈ L, and that c ∈ R. Then

u + v = t(α, β, γ) + s(α, β, γ) = (t + s)(α, β, γ) ∈ L,

giving (SP1). Also,

cu = c(t(α, β, γ)) = (ct)(α, β, γ) ∈ L,

giving (SP2). It follows that L is a subspace of R3. Finally, it is not difficult to see that both {0} and R3 are subspaces of R3.

Example 5.2.4. Note that R2 is not a subspace of R3. First of all, R2 is not a subset of R3. Note also that vector addition and scalar multiplication are different in R2 and R3.

Example 5.2.5. Suppose that A is an m × n matrix and 0 is the m × 1 zero column matrix. Consider the system Ax = 0 of m homogeneous linear equations in the n unknowns x1, . . . , xn, where

    ( x1 )
x = (  . )
    (  . )
    ( xn )

is interpreted as an element of the vector space Rn, with the usual vector addition and scalar multiplication. Let S denote the set of all solutions of the system. Suppose that x, y ∈ S and c ∈ R. Then

A(x + y) = Ax + Ay = 0 + 0 = 0,

giving (SP1). Also,

A(cx) = c(Ax) = c0 = 0,

giving (SP2). It follows that S is a subspace of Rn. To summarize, the space of solutions of a system of m homogeneous linear equations in n unknowns is a subspace of Rn.
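The closure argument of Example 5.2.5 can be illustrated numerically; the matrix below is a hypothetical example, not one from the text.

```python
# The solution set of Ax = 0 is closed under addition and scalar
# multiplication (conditions (SP1) and (SP2) of Proposition 5B).
import numpy as np

A = np.array([[1.0, 2.0, -1.0],
              [2.0, 4.0, -2.0]])      # a sample 2x3 homogeneous system

x = np.array([1.0, 0.0, 1.0])         # one solution: A @ x = 0
y = np.array([-2.0, 1.0, 0.0])        # another solution: A @ y = 0
assert np.allclose(A @ x, 0) and np.allclose(A @ y, 0)

assert np.allclose(A @ (x + y), 0)    # (SP1): a sum of solutions
assert np.allclose(A @ (5.0 * x), 0)  # (SP2): a scalar multiple
```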


Example 5.2.6. As a special case of Example 5.2.5, note that if we take two non-parallel planes in R3 through the origin 0 = (0, 0, 0), then the intersection of these two planes is clearly a line through the origin. However, each plane is represented by a homogeneous equation in the three unknowns x, y, z ∈ R. It follows that the intersection of the two planes is the collection of all solutions (x, y, z) ∈ R3 of the system formed by the two homogeneous equations in the three unknowns x, y, z representing these two planes. We have already shown in Example 5.2.3 that the line representing all these solutions is a subspace of R3.

Example 5.2.7. We showed in Example 5.1.3 that the set M2,2(R) of all 2 × 2 matrices with entries in R forms a vector space over R. Consider the subset

W = { ( a11  a12 )                          }
    { ( a21   0  ) : a11, a12, a21 ∈ R }

of M2,2(R). Since

( a11  a12 )   ( b11  b12 )   ( a11 + b11  a12 + b12 )
( a21   0  ) + ( b21   0  ) = ( a21 + b21      0     )

and

  ( a11  a12 )   ( ca11  ca12 )
c ( a21   0  ) = ( ca21    0  ),

it follows that (SP1) and (SP2) are satisfied. Hence W is a subspace of M2,2(R).

Example 5.2.8. We showed in Example 5.1.4 that the set A of all functions of the form f : R → R forms a vector space over R. Let C0 denote the set of all functions of the form f : R → R which are continuous at x = 2, and let C1 denote the set of all functions of the form f : R → R which are differentiable at x = 2. Then it follows from the arithmetic of limits and the arithmetic of derivatives that C0 and C1 are both subspaces of A. Furthermore, C1 is a subspace of C0 (why?). On the other hand, let k ∈ N. Recall from Example 5.1.6 the vector space Pk of all polynomials of the form

p(x) = p0 + p1x + . . . + pkxk,    where p0, p1, . . . , pk ∈ R.

In other words, Pk is the set of all polynomials of degree at most k and with coefficients in R. Clearly Pk is a subspace of C1.

5.3. Linear Combination

In this section and the next two, we shall study ways of describing the vectors in a vector space V. Our ultimate goal is to be able to determine a subset B of vectors in V and describe every element of V in terms of elements of B in a unique way. The first step in this direction is summarized below.

Definition. Suppose that v1, . . . , vr are vectors in a vector space V over R. By a linear combination of the vectors v1, . . . , vr, we mean an expression of the type

c1v1 + . . . + crvr,

where c1, . . . , cr ∈ R.

Example 5.3.1. In R2, every vector (x, y) is a linear combination of the two vectors i = (1, 0) and j = (0, 1), for clearly (x, y) = xi + yj.

Example 5.3.2. In R3, every vector (x, y, z) is a linear combination of the three vectors i = (1, 0, 0), j = (0, 1, 0) and k = (0, 0, 1), for clearly (x, y, z) = xi + yj + zk.
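The observation of Examples 5.3.1 and 5.3.2 can be phrased computationally: the entries of a vector are precisely its coefficients with respect to i, j, k. A brief sketch (sample numbers assumed):

```python
# Every (x, y, z) in R3 is the linear combination x*i + y*j + z*k.
import numpy as np

i, j, k = np.eye(3)              # i = (1,0,0), j = (0,1,0), k = (0,0,1)
x, y, z = 2.0, -1.0, 4.0
v = x * i + y * j + z * k
assert np.allclose(v, [2.0, -1.0, 4.0])
```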


Example 5.3.3. In R4, the vector (1, 4, −2, 6) is a linear combination of the two vectors (1, 2, 0, 4) and (1, 1, 1, 3), for we have (1, 4, −2, 6) = 3(1, 2, 0, 4) − 2(1, 1, 1, 3). On the other hand, the vector (2, 6, 0, 9) is not a linear combination of the two vectors (1, 2, 0, 4) and (1, 1, 1, 3), for

(2, 6, 0, 9) = c1(1, 2, 0, 4) + c2(1, 1, 1, 3)

would lead to the system of four equations

c1 + c2 = 2,    2c1 + c2 = 6,    c2 = 0,    4c1 + 3c2 = 9.

It is easily checked that this system has no solutions.

Example 5.3.4. In the vector space A of all functions of the form f : R → R described in Example 5.1.4, the function cos 2x is a linear combination of the three functions cos2 x, cosh2 x and sinh2 x. It is not too difficult to check that cos 2x = 2 cos2 x + sinh2 x − cosh2 x, noting that cos 2x = 2 cos2 x − 1 and cosh2 x − sinh2 x = 1.

We observe that in Example 5.3.1, every vector in R2 is a linear combination of the two vectors i and j. Similarly, in Example 5.3.2, every vector in R3 is a linear combination of the three vectors i, j and k. On the other hand, we observe that in Example 5.3.3, not every vector in R4 is a linear combination of the two vectors (1, 2, 0, 4) and (1, 1, 1, 3). Let us therefore investigate the collection of all vectors in a vector space V that can be represented as linear combinations of a given set of vectors in V.

Definition. Suppose that v1, . . . , vr are vectors in a vector space V over R. The set

span{v1, . . . , vr} = {c1v1 + . . . + crvr : c1, . . . , cr ∈ R}

is called the span of the vectors v1, . . . , vr. We also say that the vectors v1, . . . , vr span V if span{v1, . . . , vr} = V; in other words, if every vector in V can be expressed as a linear combination of the vectors v1, . . . , vr.

Example 5.3.5. The two vectors i = (1, 0) and j = (0, 1) span R2.

Example 5.3.6. The three vectors i = (1, 0, 0), j = (0, 1, 0) and k = (0, 0, 1) span R3.

Example 5.3.7. The two vectors (1, 2, 0, 4) and (1, 1, 1, 3) do not span R4.

PROPOSITION 5C.
Suppose that v1, . . . , vr are vectors in a vector space V over R.
(a) Then span{v1, . . . , vr} is a subspace of V.
(b) Suppose further that W is a subspace of V and v1, . . . , vr ∈ W. Then span{v1, . . . , vr} ⊆ W.

Proof. (a) Suppose that u, w ∈ span{v1, . . . , vr} and c ∈ R. Then there exist a1, . . . , ar, b1, . . . , br ∈ R such that

u = a1v1 + . . . + arvr    and    w = b1v1 + . . . + brvr.
Then

u + w = (a1v1 + . . . + arvr) + (b1v1 + . . . + brvr) = (a1 + b1)v1 + . . . + (ar + br)vr ∈ span{v1, . . . , vr}

and

cu = c(a1v1 + . . . + arvr) = (ca1)v1 + . . . + (car)vr ∈ span{v1, . . . , vr}.

It follows from Proposition 5B that span{v1, . . . , vr} is a subspace of V.

(b) Suppose that c1, . . . , cr ∈ R and u = c1v1 + . . . + crvr ∈ span{v1, . . . , vr}. If v1, . . . , vr ∈ W, then it follows from (SM1) for W that c1v1, . . . , crvr ∈ W. It then follows from (VA1) for W that u = c1v1 + . . . + crvr ∈ W.

Example 5.3.8. In R2, any non-zero vector v spans the subspace {cv : c ∈ R}. This is clearly a line through the origin. Also, try to draw a picture to convince yourself that any two non-zero vectors that are not on the same line span R2.

Example 5.3.9. In R3, try to draw pictures to convince yourself that any non-zero vector spans a subspace which is a line through the origin; any two non-zero vectors that are not on the same line span a subspace which is a plane through the origin; and any three non-zero vectors that do not lie on the same plane span R3.
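Whether a given vector lies in the span of a set of vectors reduces to the solvability of a linear system, as in Example 5.3.3. A numerical sketch using that example's two spanning vectors, with the combination 3a − 2b of them as the member:

```python
# Testing membership in span{(1,2,0,4), (1,1,1,3)} via least squares:
# the vector lies in the span exactly when the residual vanishes.
import numpy as np

A = np.array([[1.0, 1.0],
              [2.0, 1.0],
              [0.0, 1.0],
              [4.0, 3.0]])            # columns are the two spanning vectors

def in_span(u):
    c, *_ = np.linalg.lstsq(A, u, rcond=None)
    return bool(np.allclose(A @ c, u))

assert in_span(3.0 * A[:, 0] - 2.0 * A[:, 1])       # a combination is in the span
assert not in_span(np.array([2.0, 6.0, 0.0, 9.0]))  # shown impossible in Example 5.3.3
```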

5.4. Linear Independence

We first study two simple examples.

Example 5.4.1. Consider the three vectors v1 = (1, 2, 3), v2 = (3, 2, 1) and v3 = (3, 3, 3) in R3. Then

span{v1, v2, v3} = {c1(1, 2, 3) + c2(3, 2, 1) + c3(3, 3, 3) : c1, c2, c3 ∈ R}
                 = {(c1 + 3c2 + 3c3, 2c1 + 2c2 + 3c3, 3c1 + c2 + 3c3) : c1, c2, c3 ∈ R}.

Write (x, y, z) = (c1 + 3c2 + 3c3, 2c1 + 2c2 + 3c3, 3c1 + c2 + 3c3). Then it is not difficult to see that

( x )   ( 1 3 3 ) ( c1 )
( y ) = ( 2 2 3 ) ( c2 ),
( z )   ( 3 1 3 ) ( c3 )

and so (do not worry if you cannot understand why we take this next step)

           ( x )              ( 1 3 3 ) ( c1 )             ( c1 )
( 1 −2 1 ) ( y ) = ( 1 −2 1 ) ( 2 2 3 ) ( c2 ) = ( 0 0 0 ) ( c2 ) = ( 0 ),
           ( z )              ( 3 1 3 ) ( c3 )             ( c3 )

so that x − 2y + z = 0. It follows that span{v1, v2, v3} is a plane through the origin and not R3. Note, in fact, that 3v1 + 3v2 − 4v3 = 0. Note also that

det ( 1 3 3 )
    ( 2 2 3 )
    ( 3 1 3 ) = 0.

Example 5.4.2. Consider the three vectors v1 = (1, 1, 0), v2 = (5, 1, 3) and v3 = (2, 7, −4) in R3. Then

span{v1, v2, v3} = {c1(1, 1, 0) + c2(5, 1, 3) + c3(2, 7, −4) : c1, c2, c3 ∈ R}
                 = {(c1 + 5c2 + 2c3, c1 + c2 + 7c3, 3c2 − 4c3) : c1, c2, c3 ∈ R}.

Write (x, y, z) = (c1 + 5c2 + 2c3, c1 + c2 + 7c3, 3c2 − 4c3). Then it is not difficult to see that

( x )   ( 1 5  2 ) ( c1 )
( y ) = ( 1 1  7 ) ( c2 ),
( z )   ( 0 3 −4 ) ( c3 )

so that

( −25  26  33 ) ( x )   ( −25  26  33 ) ( 1 5  2 ) ( c1 )   ( 1 0 0 ) ( c1 )   ( c1 )
(   4  −4  −5 ) ( y ) = (   4  −4  −5 ) ( 1 1  7 ) ( c2 ) = ( 0 1 0 ) ( c2 ) = ( c2 ).
(   3  −3  −4 ) ( z )   (   3  −3  −4 ) ( 0 3 −4 ) ( c3 )   ( 0 0 1 ) ( c3 )   ( c3 )

It follows that for every (x, y, z) ∈ R3, we can find c1, c2, c3 ∈ R such that (x, y, z) = c1v1 + c2v2 + c3v3. Hence span{v1, v2, v3} = R3. Note that

det ( 1 5  2 )
    ( 1 1  7 )
    ( 0 3 −4 ) ≠ 0,

and that the only solution for (0, 0, 0) = c1v1 + c2v2 + c3v3 is c1 = c2 = c3 = 0.

Definition. Suppose that v1, . . . , vr are vectors in a vector space V over R.
(LD) We say that v1, . . . , vr are linearly dependent if there exist c1, . . . , cr ∈ R, not all zero, such that c1v1 + . . . + crvr = 0.
(LI) We say that v1, . . . , vr are linearly independent if they are not linearly dependent; in other words, if the only solution of c1v1 + . . . + crvr = 0 in c1, . . . , cr ∈ R is given by c1 = . . . = cr = 0.

Example 5.4.3. Let us return to Example 5.4.1 and consider again the three vectors v1 = (1, 2, 3), v2 = (3, 2, 1) and v3 = (3, 3, 3) in R3. Consider the equation c1v1 + c2v2 + c3v3 = 0. This can be rewritten in matrix form as

( 1 3 3 ) ( c1 )   ( 0 )
( 2 2 3 ) ( c2 ) = ( 0 ).
( 3 1 3 ) ( c3 )   ( 0 )

Since

det ( 1 3 3 )
    ( 2 2 3 )
    ( 3 1 3 ) = 0,

the system has non-trivial solutions; for example, (c1, c2, c3) = (3, 3, −4), so that 3v1 + 3v2 − 4v3 = 0. Hence v1, v2, v3 are linearly dependent.
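The dependence found in Example 5.4.3 can be confirmed numerically (a small check, using numpy):

```python
# The matrix with columns v1, v2, v3 from Example 5.4.3 is singular,
# and (3, 3, -4) is a non-trivial solution of c1 v1 + c2 v2 + c3 v3 = 0.
import numpy as np

A = np.array([[1.0, 3.0, 3.0],
              [2.0, 2.0, 3.0],
              [3.0, 1.0, 3.0]])       # columns v1, v2, v3

assert abs(np.linalg.det(A)) < 1e-9   # determinant zero: dependent
c = np.array([3.0, 3.0, -4.0])
assert np.allclose(A @ c, 0)          # 3 v1 + 3 v2 - 4 v3 = 0
```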


Example 5.4.4. Let us return to Example 5.4.2 and consider again the three vectors v1 = (1, 1, 0), v2 = (5, 1, 3) and v3 = (2, 7, −4) in R3. Consider the equation c1v1 + c2v2 + c3v3 = 0. This can be rewritten in matrix form as

( 1 5  2 ) ( c1 )   ( 0 )
( 1 1  7 ) ( c2 ) = ( 0 ).
( 0 3 −4 ) ( c3 )   ( 0 )

Since

det ( 1 5  2 )
    ( 1 1  7 )
    ( 0 3 −4 ) ≠ 0,

the only solution is c1 = c2 = c3 = 0. Hence v1, v2, v3 are linearly independent.

Example 5.4.5. In the vector space A of all functions of the form f : R → R described in Example 5.1.4, the functions x, x2 and sin x are linearly independent. To see this, note that for every c1, c2, c3 ∈ R, the linear combination c1x + c2x2 + c3 sin x is never identically zero unless c1 = c2 = c3 = 0.

Example 5.4.6. In Rn, the vectors e1, . . . , en, where

ej = (0, . . . , 0, 1, 0, . . . , 0),    with j − 1 zeros before the 1 and n − j zeros after it,    for every j = 1, . . . , n,
are linearly independent (why?).

We observe in Examples 5.4.3–5.4.4 that determining whether a collection of vectors in R3 is linearly dependent comes down to whether a system of homogeneous linear equations has non-trivial solutions. The same idea can be used to prove the following result concerning Rn.

PROPOSITION 5D. Suppose that v1, . . . , vr are vectors in the vector space Rn. If r > n, then v1, . . . , vr are linearly dependent.

Proof. For every j = 1, . . . , r, write vj = (a1j, . . . , anj). Then the equation c1v1 + . . . + crvr = 0 can be rewritten in matrix form as

( a11 . . . a1r ) ( c1 )   ( 0 )
(  .         .  ) (  . ) = ( . )
(  .         .  ) (  . )   ( . )
( an1 . . . anr ) ( cr )   ( 0 ).

If r > n, then there are more variables than equations. It follows that there must be non-trivial solutions c1, . . . , cr ∈ R. Hence v1, . . . , vr are linearly dependent.

Remarks. (1) Consider two vectors v1 = (a11, a21) and v2 = (a12, a22) in R2. To study linear independence, we consider the equation c1v1 + c2v2 = 0, which can be written in matrix form as

( a11 a12 ) ( c1 )   ( 0 )
( a21 a22 ) ( c2 ) = ( 0 ).

The vectors v1 and v2 are linearly independent precisely when

det ( a11 a12 )
    ( a21 a22 ) ≠ 0.

This can be interpreted geometrically in the following way: the area of the parallelogram formed by the two vectors v1 and v2 is in fact equal to the absolute value of the determinant of the matrix formed with v1 and v2 as the columns; in other words, the area is the absolute value of

det ( a11 a12 )
    ( a21 a22 ).

It follows that the two vectors are linearly dependent precisely when the parallelogram has zero area; in other words, when the two vectors lie on the same line. On the other hand, if the parallelogram has positive area, then the two vectors are linearly independent.

(2) Consider three vectors v1 = (a11, a21, a31), v2 = (a12, a22, a32) and v3 = (a13, a23, a33) in R3. To study linear independence, we consider the equation c1v1 + c2v2 + c3v3 = 0, which can be written in matrix form as

( a11 a12 a13 ) ( c1 )   ( 0 )
( a21 a22 a23 ) ( c2 ) = ( 0 ).
( a31 a32 a33 ) ( c3 )   ( 0 )

The vectors v1, v2 and v3 are linearly independent precisely when

det ( a11 a12 a13 )
    ( a21 a22 a23 )
    ( a31 a32 a33 ) ≠ 0.

This can be interpreted geometrically in the following way: the volume of the parallelepiped formed by the three vectors v1, v2 and v3 is in fact equal to the absolute value of the determinant of the matrix formed with v1, v2 and v3 as the columns; in other words, the volume is the absolute value of

det ( a11 a12 a13 )
    ( a21 a22 a23 )
    ( a31 a32 a33 ).

It follows that the three vectors are linearly dependent precisely when the parallelepiped has zero volume; in other words, when the three vectors lie on the same plane. On the other hand, if the parallelepiped has positive volume, then the three vectors are linearly independent.

(3) What is the geometric interpretation of two linearly independent vectors in R3? Well, note that if v1 and v2 are non-zero and linearly dependent, then there exist c1, c2 ∈ R, not both zero, such that c1v1 + c2v2 = 0. This forces the two vectors to be multiples of each other, so that they lie on the same line, whence the parallelogram they form has zero area. It follows that if two vectors in R3 form a parallelogram with positive area, then they are linearly independent.
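Remark (1) can be checked numerically with sample vectors (illustrative values assumed):

```python
# Parallelogram area as |det|, and independence as non-zero area.
import numpy as np

v1 = np.array([2.0, 1.0])
v2 = np.array([1.0, 3.0])
area = abs(np.linalg.det(np.column_stack([v1, v2])))
assert np.isclose(area, 5.0)          # |2*3 - 1*1| = 5: independent

w2 = -3.0 * v1                        # a multiple of v1: same line
zero_area = abs(np.linalg.det(np.column_stack([v1, w2])))
assert np.isclose(zero_area, 0.0)     # zero area: dependent
```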

5.5. Basis and Dimension

In this section, we complete the task of describing uniquely every element of a vector space V in terms of the elements of a suitable subset B. To motivate the ideas, we first consider an example.

Example 5.5.1. Let us consider the three vectors v1 = (1, 1, 0), v2 = (5, 1, 3) and v3 = (2, 7, −4) in R3, as in Examples 5.4.2 and 5.4.4. We have already shown that span{v1, v2, v3} = R3, and that the vectors v1, v2, v3 are linearly independent. Furthermore, we have shown that for every u = (x, y, z) ∈ R3, we can write u = c1v1 + c2v2 + c3v3, where c1, c2, c3 ∈ R are determined uniquely by

( c1 )   ( −25  26  33 ) ( x )
( c2 ) = (   4  −4  −5 ) ( y ).
( c3 )   (   3  −3  −4 ) ( z )
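The computation in Example 5.5.1 is an instance of a general recipe: if linearly independent vectors are placed as the columns of a matrix A, then A is invertible and the coordinates of u are obtained by applying the inverse of A to u. A sketch with hypothetical basis vectors (not those of the example):

```python
# Coordinates of u relative to a basis stored as the columns of A.
import numpy as np

A = np.array([[2.0, 0.0, 1.0],
              [1.0, 1.0, 0.0],
              [0.0, 3.0, 1.0]])       # columns: a sample basis of R3

u = np.array([3.0, 2.0, 4.0])
c = np.linalg.solve(A, u)             # the unique coordinates c1, c2, c3
assert np.allclose(A @ c, u)          # u = c1 v1 + c2 v2 + c3 v3
assert np.allclose(np.linalg.inv(A) @ u, c)
```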

Definition. Suppose that v1, . . . , vr are vectors in a vector space V over R. We say that {v1, . . . , vr} is a basis for V if the following two conditions are satisfied:
(B1) We have span{v1, . . . , vr} = V.
(B2) The vectors v1, . . . , vr are linearly independent.

Example 5.5.2. Consider two vectors v1 = (a11, a21) and v2 = (a12, a22) in R2. Suppose that

det ( a11 a12 )
    ( a21 a22 ) ≠ 0;

in other words, suppose that the parallelogram formed by the two vectors has non-zero area. Then it follows from Remark (1) in Section 5.4 that v1 and v2 are linearly independent. Furthermore, for every u = (x, y) ∈ R2, there exist c1, c2 ∈ R such that u = c1v1 + c2v2. Indeed, c1 and c2 are determined as the unique solution of the system

( a11 a12 ) ( c1 )   ( x )
( a21 a22 ) ( c2 ) = ( y ).

Hence span{v1, v2} = R2. It follows that {v1, v2} is a basis for R2.

Example 5.5.3. Consider three vectors of the type v1 = (a11, a21, a31), v2 = (a12, a22, a32) and v3 = (a13, a23, a33) in R3. Suppose that

det ( a11 a12 a13 )
    ( a21 a22 a23 )
    ( a31 a32 a33 ) ≠ 0;

in other words, suppose that the parallelepiped formed by the three vectors has non-zero volume. Then it follows from Remark (2) in Section 5.4 that v1, v2 and v3 are linearly independent. Furthermore, for every u = (x, y, z) ∈ R3, there exist c1, c2, c3 ∈ R such that u = c1v1 + c2v2 + c3v3. Indeed, c1, c2 and c3 are determined as the unique solution of the system

( a11 a12 a13 ) ( c1 )   ( x )
( a21 a22 a23 ) ( c2 ) = ( y ).
( a31 a32 a33 ) ( c3 )   ( z )

Hence span{v1, v2, v3} = R3. It follows that {v1, v2, v3} is a basis for R3.

Example 5.5.4. In Rn, the vectors e1, . . . , en, where

ej = (0, . . . , 0, 1, 0, . . . , 0),    with j − 1 zeros before the 1 and n − j zeros after it,    for every j = 1, . . . , n,

are linearly independent and span Rn. Hence {e1, . . . , en} is a basis for Rn. This is known as the standard basis for Rn.

Example 5.5.5. In the vector space M2,2(R) of all 2 × 2 matrices with entries in R as discussed in Example 5.1.3, the set consisting of the four matrices

( 1 0 )   ( 0 1 )   ( 0 0 )   ( 0 0 )
( 0 0 ),  ( 0 0 ),  ( 1 0 ),  ( 0 1 )

is a basis.

Example 5.5.6. In the vector space Pk of polynomials of degree at most k and with coefficients in R as discussed in Example 5.1.6, the set {1, x, x2, . . . , xk} is a basis.

PROPOSITION 5E. Suppose that {v1 , . . . , vr } is a basis for a vector space V over R. Then every element u V can be expressed uniquely in the form u = c1 v1 + . . . + cr vr , where c1 , . . . , cr R.

Proof. Since u ∈ V = span{v1, . . . , vr}, there exist c1, . . . , cr ∈ R such that u = c1 v1 + . . . + cr vr. Suppose now that b1, . . . , br ∈ R satisfy c1 v1 + . . . + cr vr = b1 v1 + . . . + br vr. Then (c1 − b1)v1 + . . . + (cr − br)vr = 0. Since v1, . . . , vr are linearly independent, it follows that c1 − b1 = . . . = cr − br = 0. Hence c1, . . . , cr are uniquely determined.

We have shown earlier that a vector space can have many bases. For example, any collection of three vectors not on the same plane is a basis for R3. In the following discussion, we attempt to find out some properties of bases. However, we shall restrict our discussion to the following simple case.

Definition. A vector space V over R is said to be finite-dimensional if it has a basis containing only finitely many elements.

Example 5.5.7. The vector spaces Rn, M2,2(R) and Pk that we have discussed earlier are all finite-dimensional.

Recall that in Rn, the standard basis has exactly n elements. On the other hand, it follows from Proposition 5D that any basis for Rn cannot contain more than n elements. However, can a basis for Rn contain fewer than n elements? We shall answer this question by showing that all bases for a given vector space have the same number of elements. As a first step, we establish the following generalization of Proposition 5D.

PROPOSITION 5F. Suppose that {v1, . . . , vn} is a basis for a vector space V over R. Suppose further that r > n, and that the vectors u1, . . . , ur ∈ V. Then the vectors u1, . . . , ur are linearly dependent.

Proof. Since {v1, . . . , vn} is a basis for the vector space V, we can write

u1 = a11 v1 + . . . + an1 vn,
  ...
ur = a1r v1 + . . . + anr vn,

where aij ∈ R for every i = 1, . . . , n and j = 1, . . . , r. Let c1, . . . , cr ∈ R. Since v1, . . . , vn are linearly independent, it follows that if

c1 u1 + . . . + cr ur = c1 (a11 v1 + . . . + an1 vn) + . . . + cr (a1r v1 + . . . + anr vn)
                     = (a11 c1 + . . . + a1r cr)v1 + . . . + (an1 c1 + . . . + anr cr)vn = 0,

then a11 c1 + . . . + a1r cr = . . . = an1 c1 + . . . + anr cr = 0; in other words, we have the homogeneous system

[ a11 . . . a1r ] [ c1 ]   [ 0 ]
[  .         .  ] [ .  ] = [ . ]
[ an1 . . . anr ] [ cr ]   [ 0 ]
If r > n, then there are more variables than equations. It follows that there must be non-trivial solutions c1, . . . , cr ∈ R. Hence u1, . . . , ur are linearly dependent.

PROPOSITION 5G. Suppose that V is a finite-dimensional vector space over R. Then any two bases for V have the same number of elements.

Proof. Note simply that by Proposition 5F, the vectors in the basis with more elements must be linearly dependent, and so cannot form a basis.

We are now in a position to make the following definition.

Definition. Suppose that V is a finite-dimensional vector space over R. Then we say that V is of dimension n if a basis for V contains exactly n elements.

Example 5.5.8. The vector space Rn has dimension n.

Example 5.5.9. The vector space M2,2(R) of all 2 × 2 matrices with entries in R, as discussed in Example 5.1.3, has dimension 4.

Example 5.5.10. The vector space Pk of all polynomials of degree at most k and with coefficients in R, as discussed in Example 5.1.6, has dimension (k + 1).

Example 5.5.11. Recall Example 5.2.5, where we showed that the set of solutions to a system of m homogeneous linear equations in n unknowns is a subspace of Rn. Consider now the homogeneous system

[ 1  3  −5  1   5 ] [ x1 ]   [ 0 ]
[ 1  4  −7  3  −2 ] [ x2 ]   [ 0 ]
[ 1  5  −9  5  −9 ] [ x3 ] = [ 0 ]
[ 0  3  −6  2  −1 ] [ x4 ]   [ 0 ]
                    [ x5 ]
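Since the signs in this coefficient matrix have been reconstructed from a damaged source, a quick exact-arithmetic sanity check is worthwhile. The sketch below (the matrix entries are an assumption carried over from the reconstruction above) computes the rank by Gaussian elimination, confirming that the solution space has dimension 5 − 3 = 2, and verifies the two solution vectors discussed next.

```python
from fractions import Fraction

# Coefficient matrix of the homogeneous system (entries as reconstructed).
A = [[1, 3, -5, 1,  5],
     [1, 4, -7, 3, -2],
     [1, 5, -9, 5, -9],
     [0, 3, -6, 2, -1]]

def rank(matrix):
    """Rank via Gaussian elimination with exact rational arithmetic."""
    m = [[Fraction(x) for x in row] for row in matrix]
    rows, cols, r = len(m), len(m[0]), 0
    for c in range(cols):
        pivot = next((i for i in range(r, rows) if m[i][c] != 0), None)
        if pivot is None:
            continue
        m[r], m[pivot] = m[pivot], m[r]
        for i in range(rows):
            if i != r and m[i][c] != 0:
                f = m[i][c] / m[r][c]
                m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

# Rank 3 in 5 unknowns, so the solution space has dimension 5 - 3 = 2.
print(rank(A))  # -> 3

# The two basis vectors for the solution space given in the text.
v1 = [-1, 2, 1, 0, 0]
v2 = [-1, -3, 0, 5, 1]
for v in (v1, v2):
    assert all(sum(a * x for a, x in zip(row, v)) == 0 for row in A)
```

Exact rationals (rather than floats) are used so that the rank computation cannot be disturbed by rounding.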

The solutions can be described in the form

       [ −1 ]      [ −1 ]
       [  2 ]      [ −3 ]
x = c1 [  1 ] + c2 [  0 ] ,
       [  0 ]      [  5 ]
       [  0 ]      [  1 ]

where c1, c2 ∈ R (the reader must check this). It can be checked that (−1, 2, 1, 0, 0) and (−1, −3, 0, 5, 1) are linearly independent and so form a basis for the space of solutions of the system. It follows that the space of solutions of the system has dimension 2.

Suppose that V is an n-dimensional vector space over R. Then any basis for V consists of exactly n linearly independent vectors in V. Suppose now that we have a set of n linearly independent vectors in V. Will this form a basis for V? We have already answered this question in the affirmative in the cases when the vector space is R2 or R3. To seek an answer to the general case, we first establish the following result.

PROPOSITION 5H. Suppose that V is a finite-dimensional vector space over R. Then any finite set of linearly independent vectors in V can be expanded, if necessary, to a basis for V.

Proof. Let S = {v1, . . . , vk} be a finite set of linearly independent vectors in V. If S spans V, then the proof is complete. If S does not span V, then there exists vk+1 ∈ V that is not a linear combination
of the elements of S. The set T = {v1, . . . , vk, vk+1} is a finite set of linearly independent vectors in V; for otherwise, there exist c1, . . . , ck, ck+1 ∈ R, not all zero, such that c1 v1 + . . . + ck vk + ck+1 vk+1 = 0. If ck+1 = 0, then c1 v1 + . . . + ck vk = 0, contradicting the assumption that S is a finite set of linearly independent vectors in V. If ck+1 ≠ 0, then

vk+1 = −(c1/ck+1)v1 − . . . − (ck/ck+1)vk,

contradicting the assumption that vk+1 is not a linear combination of the elements of S. We now study the finite set T of linearly independent vectors in V. If T spans V, then the proof is complete. If T does not span V, then we repeat the argument. Note that the number of vectors in a linearly independent expansion of S cannot exceed the dimension of V, in view of Proposition 5F. So eventually some linearly independent expansion of S will span V.

PROPOSITION 5J. Suppose that V is an n-dimensional vector space over R. Then any set of n linearly independent vectors in V is a basis for V.

Proof. Let S be a set of n linearly independent vectors in V. By Proposition 5H, S can be expanded, if necessary, to a basis for V. By Proposition 5F, any expansion of S will result in a linearly dependent set of vectors in V. It follows that S is already a basis for V.

Example 5.5.12. Consider the three vectors v1 = (1, 2, 3), v2 = (3, 2, 1) and v3 = (3, 3, 3) in R3, as in Examples 5.4.1 and 5.4.3. We showed that these three vectors are linearly dependent, and span the plane x − 2y + z = 0. Note that

v3 = (3/4)v1 + (3/4)v2,
and that v1 and v2 are linearly independent. Consider now the vector v4 = (0, 0, 1). Note that v4 does not lie on the plane x − 2y + z = 0, so that {v1, v2, v4} is a linearly independent set. It follows that {v1, v2, v4} is a basis for R3.
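The claims in this example are easy to verify mechanically. The sketch below checks the linear combination v3 = (3/4)v1 + (3/4)v2, the plane equation x − 2y + z = 0, and the determinant criterion for linear independence of three vectors in R3 (the hand-rolled `det3` helper is for illustration only).

```python
from fractions import Fraction

v1, v2, v3, v4 = (1, 2, 3), (3, 2, 1), (3, 3, 3), (0, 0, 1)

# v3 is a linear combination of v1 and v2 with coefficients 3/4 and 3/4.
c = Fraction(3, 4)
assert all(c * a + c * b == d for a, b, d in zip(v1, v2, v3))

# v1, v2, v3 all lie on the plane x - 2y + z = 0, but v4 does not.
plane = lambda p: p[0] - 2 * p[1] + p[2]
assert plane(v1) == plane(v2) == plane(v3) == 0 and plane(v4) != 0

def det3(a, b, c):
    """Determinant of the 3x3 matrix with rows a, b, c (cofactor expansion)."""
    return (a[0] * (b[1] * c[2] - b[2] * c[1])
          - a[1] * (b[0] * c[2] - b[2] * c[0])
          + a[2] * (b[0] * c[1] - b[1] * c[0]))

# Nonzero determinant: {v1, v2, v4} is linearly independent, hence a basis
# for R^3 by Proposition 5J (three independent vectors in a 3-dimensional space).
print(det3(v1, v2, v4))  # -> -4
assert det3(v1, v2, v4) != 0
```

By contrast, det3(v1, v2, v3) is zero, which is another way of seeing that the original three vectors are linearly dependent.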
Problems for Chapter 5

1. Determine whether each of the following subsets of R3 is a subspace of R3:
   a) {(x, y, z) ∈ R3 : x = 0}
   b) {(x, y, z) ∈ R3 : x + y = 0}
   c) {(x, y, z) ∈ R3 : xz = 0}
   d) {(x, y, z) ∈ R3 : y ≥ 0}
   e) {(x, y, z) ∈ R3 : x = y = z}

2. For each of the following collections of vectors, determine whether the first vector is a linear combination of the remaining ones:
   a) (1, 2, 3); (1, 0, 1), (2, 1, 0) in R3
   b) x^3 + 2x^2 + 3x + 1; x^3, x^2 + 3x, x^2 + 1 in P4
   c) (1, 3, 5, 7); (1, 0, 1, 0), (0, 1, 0, 1), (0, 0, 1, 1) in R4

3. For each of the following collections of vectors, determine whether the vectors are linearly independent:
   a) (1, 2, 3), (1, 0, 1), (2, 1, 0) in R3
   b) (1, 2), (3, 5), (1, 3) in R2
   c) (2, 5, 3, 6), (1, 0, 0, 1), (4, 0, 9, 6) in R4
   d) x^2 + 1, x + 1, x^2 + x in P3

4. Find the volume of the parallelepiped in R3 formed by the vectors (1, 2, 3), (1, 0, 1) and (3, 0, 2).

5. Let S be the set of all functions y that satisfy the differential equation

   d^2y/dx^2 − 3(dy/dx) + 2y = 0.

   Show that S is a subspace of the vector space A described in Example 5.1.4.

6. For each of the sets in Problem 1 which is a subspace of R3, find a basis for the subspace, and then extend it to a basis for R3.