
Fall 2015 ARE201-Simon

Problem Set #01- Answer key


First Linear Algebra Problem Set

(1) Using the definition of linear independence, i.e., a set of vectors $\{v^1, \ldots, v^k, \ldots, v^m\}$ is a linearly independent set if for all $t \in \mathbb{R}^m$, $\sum_{k=1}^m t_k v^k = 0$ implies $t = 0$, prove the following properties: (Note: once you have shown a property, you can use it to show the following ones.)
a) A singleton set $\{v\}$ is a linearly independent set if and only if $v$ is not the zero vector.

Ans: First, let's prove the sufficiency direction by contrapositive ($(\neg B \Rightarrow \neg A) \Leftrightarrow (A \Rightarrow B)$). The singleton containing the zero vector is not a linearly independent set, because we can find a $\lambda \neq 0$ such that $\lambda v = 0$ (take $\lambda = 1$, for example). This proves that if a singleton is a linearly independent set, then its vector is not the zero vector. For the necessary condition, let's take a nonzero vector $v$, and let $\lambda$ be a scalar such that $\lambda v = 0$. Since $v \neq 0$, there is at least one nonzero component $v_l$ of $v$. For this component, $\lambda v_l = 0$, so $\lambda = 0$ and $\{v\}$ is a linearly independent set.

b) Two nonzero vectors are linearly independent if and only if they are not collinear (or proportional, i.e., for two vectors $(u, v)$, there exists $\lambda \in \mathbb{R}$ such that $u = \lambda v$).

Ans: First, let's prove the sufficiency direction by contradiction. Let $(u, v)$ be a pair of nonzero, linearly independent vectors that are collinear. By definition of collinearity, there exists a scalar $\alpha$ such that $v = \alpha u$. Thus $v - \alpha u = 0$, and this constitutes a linear combination of $(u, v)$ equal to zero but with nonzero scalars (at least the coefficient $1$ in front of $v$ is not equal to zero). Thus $(u, v)$ is linearly dependent, which contradicts the assumption.
Second, we prove the necessary part by contrapositive ($(\neg B \Rightarrow \neg A) \Leftrightarrow (A \Rightarrow B)$). Suppose that $u$ and $v$ are two linearly dependent nonzero vectors. Then we just need to show that they are collinear. By definition of linear dependence, there exist two scalars $(\alpha, \beta) \neq (0, 0)$ such that $\alpha u + \beta v = 0$. One of these scalars is necessarily different from zero; suppose it is $\alpha$. Then we can write $u = -\frac{\beta}{\alpha} v$, so these two vectors are collinear. This proves that two vectors that are not collinear are linearly independent.

c) For $n > 1$, $(v^1, \ldots, v^n)$ is a linearly dependent set if and only if one of the vectors $v^i$ in the set is a linear combination of the other $n - 1$ vectors.

Ans: The case of two vectors was shown in b). Now let's look at a case where $n > 2$. Suppose $(v^1, \ldots, v^n)$ are linearly dependent; then there exists a set of scalars $(\lambda_1, \ldots, \lambda_n)$, not all equal to zero, such that $\sum_{k=1}^n \lambda_k v^k = 0$. There is at least one scalar not equal to zero; suppose $\lambda_i \neq 0$. Then we have $\lambda_i v^i = -\sum_{k \neq i} \lambda_k v^k$, or $v^i = -\frac{1}{\lambda_i} \sum_{k \neq i} \lambda_k v^k$, so $v^i$ is a linear combination of the other $n - 1$ vectors with coefficients $\mu_k = -\frac{\lambda_k}{\lambda_i}$. This shows the sufficiency part.
Conversely, if there exists $i \in \{1, \ldots, n\}$ such that $v^i$ is a linear combination of the $n - 1$ other vectors, then there exists a set of scalars $(\mu_1, \ldots, \mu_{i-1}, \mu_{i+1}, \ldots, \mu_n)$ such that $v^i = \sum_{k \neq i} \mu_k v^k$. We will construct a set of scalars in order to prove that these vectors are linearly dependent: set $\lambda_k = \mu_k$ for $k \neq i$ and $\lambda_i = -1$.
Then we have $\sum_{k \neq i} \mu_k v^k - v^i = 0$, i.e., $\sum_{k \neq i} \lambda_k v^k + \lambda_i v^i = 0$, which simplifies to $\sum_{k=1}^n \lambda_k v^k = 0$. So we found a linear combination of the vectors equal to zero for which at least one scalar, $\lambda_i = -1$, is not equal to zero, so the vectors are linearly dependent.

d) If $(v^1, \ldots, v^n)$ is a linearly independent set and $y$ is a different vector, then $(v^1, \ldots, v^n, y)$ is linearly dependent if and only if $y$ is a linear combination of $v^1, \ldots, v^n$.

Ans: First we look at the sufficiency case. Suppose $(v^1, \ldots, v^n)$ is a linearly independent set and $(v^1, \ldots, v^n, y)$ is linearly dependent. Because of this linear dependence, there exists a set of scalars $(\lambda_1, \ldots, \lambda_{n+1})$, not all equal to $0$, such that $\sum_{k=1}^n \lambda_k v^k + \lambda_{n+1} y = 0$. There are two possibilities: $\lambda_{n+1} = 0$ or $\lambda_{n+1} \neq 0$.
If $\lambda_{n+1} = 0$, then $\sum_{k=1}^n \lambda_k v^k = 0$ and by linear independence all the scalars are equal to zero, which is impossible by definition of the linear dependence of $(v^1, \ldots, v^n, y)$. Thus $\lambda_{n+1} \neq 0$. Knowing that, we can write $\sum_{k=1}^n \lambda_k v^k = -\lambda_{n+1} y$, so $y = -\frac{1}{\lambda_{n+1}} \sum_{k=1}^n \lambda_k v^k$. Thus $y$ is a linear combination of the $n$ vectors with coefficients $-\frac{\lambda_k}{\lambda_{n+1}}$.
For the necessary part, suppose that $y$ is a linear combination of $(v^1, \ldots, v^n)$; then there exists a set of scalars $(\mu_1, \ldots, \mu_n)$ such that $y = \sum_{k=1}^n \mu_k v^k$. By a similar reasoning as in our proof of c), we can construct a set of scalars $(\lambda_1, \ldots, \lambda_{n+1})$, not all equal to zero, such that $\sum_{k=1}^n \lambda_k v^k + \lambda_{n+1} y = 0$. We just need to define these scalars: set $\lambda_k = \mu_k$ for all $k \in \{1, \ldots, n\}$ and set $\lambda_{n+1} = -1$. Thus we just showed that $(v^1, \ldots, v^n, y)$ is linearly dependent.

e) If $(v^1, \ldots, v^n)$ is a linearly independent set, then any subset of this set (such as $(v^1, \ldots, v^i)$ with $i < n$) is also linearly independent.

Ans: This case only requires showing one direction; we proceed by contradiction. Suppose that $(v^1, \ldots, v^n)$ is linearly independent but that $(v^1, \ldots, v^i)$ is linearly dependent. Then there exists a set of scalars $(\lambda_1, \ldots, \lambda_i)$, not all equal to zero, such that $\sum_{k=1}^i \lambda_k v^k = 0$. We can add zero to this expression: for $k \in \{i+1, \ldots, n\}$ set $\lambda_k = 0$; then we have $\sum_{k=1}^i \lambda_k v^k + \sum_{k=i+1}^n \lambda_k v^k = 0$, i.e., $\sum_{k=1}^n \lambda_k v^k = 0$. But at least one of the scalars $\{\lambda_1, \ldots, \lambda_i\}$ is not equal to zero, because $(\lambda_1, \ldots, \lambda_i) \neq 0$. This contradicts our assumption of linear independence of $(v^1, \ldots, v^n)$, so necessarily $(v^1, \ldots, v^i)$ is linearly independent. This is true for any subset of the original set of linearly independent vectors.
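These properties are easy to sanity-check numerically. Below is a minimal sketch (not part of the original answer key), assuming numpy: a set of vectors is linearly independent iff the matrix with those vectors as columns has rank equal to the number of vectors. The helper name `is_independent` is our own.

```python
import numpy as np

def is_independent(vectors):
    # Vectors are linearly independent iff the matrix whose columns
    # are the vectors has rank equal to the number of vectors.
    A = np.column_stack(vectors)
    return np.linalg.matrix_rank(A) == len(vectors)

v1 = np.array([1.0, 0.0, 0.0])
v2 = np.array([0.0, 1.0, 0.0])
v3 = np.array([1.0, 1.0, 0.0])       # v3 = v1 + v2

print(is_independent([v1]))          # True: a nonzero singleton (property a)
print(is_independent([v1, v2]))      # True: not collinear (property b)
print(is_independent([v1, v2, v3]))  # False: v3 is a combination of the others (property c)
```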

(2) Show that if $v^1$ and $v^2$ are linearly independent vectors in $\mathbb{R}^2$, then any vector $w \in \mathbb{R}^2$ can be written as a linear combination of $v^1$ and $v^2$. To show this, use only the fact that $v^1$ and $v^2$ are linearly independent iff they are not collinear. (I.e., do not use properties of matrices or determinants.)
Hint #1: Try to write $w$ as $\alpha v^1 + \beta v^2$. Under what conditions can you solve for $\alpha$ and $\beta$?
Hint #2: One way of utilizing hint #1 is to use the following rather clumsy characterization of collinearity:
 1 2
 neither v 1 nor v 2 is zero =⇒ v2 = v2 or

 1 1 1
v1 2
v1
1 2 v 1 v 2
v and v are colinear iff neither v21 nor v22 is zero =⇒ 11 = 12 or (1)

 v2 v2

at least one of the vectors is zero
The second and third branches of this characterization say, respectively, that either “rise
over run” or “run over rise” must be the same for the two vectors.

Ans: Following hint #1, we will write
$$w_1 = \alpha v_1^1 + \beta v_1^2 \tag{2a}$$
$$w_2 = \alpha v_2^1 + \beta v_2^2 \tag{2b}$$
and will attempt to solve for $\alpha$ and $\beta$. We'll start with the "if" part, i.e., assume linear independence and show that we can solve for $\alpha$ and $\beta$. Since $v^1$ and $v^2$ are linearly independent and hence not collinear, either (a) $v_1^1 \neq 0$ or (b) $v_2^1 \neq 0$ (i.e., $v^1$ cannot be zero). Without loss of generality, assume that (a) is true. (If (b) is true, run the same argument starting from (2b).) We can then divide both sides of (2a) by $v_1^1$ and obtain
$$\alpha = \frac{w_1}{v_1^1} - \frac{v_1^2}{v_1^1} \beta \tag{3}$$
Now, substituting for $\alpha$ in (2b),
$$w_2 = \left( \frac{w_1}{v_1^1} - \frac{v_1^2}{v_1^1} \beta \right) v_2^1 + \beta v_2^2$$
so that
$$w_2 v_1^1 = w_1 v_2^1 + \beta \left( v_2^2 v_1^1 - v_1^2 v_2^1 \right) \tag{4}$$

We can now solve for $\beta$ iff $v_2^2 v_1^1 - v_1^2 v_2^1 \neq 0$, in which case
$$\beta = \frac{w_2 v_1^1 - w_1 v_2^1}{v_2^2 v_1^1 - v_1^2 v_2^1}$$
and so, substituting this expression for $\beta$ back into (3),
$$
\alpha = \frac{w_1}{v_1^1} - \beta \frac{v_1^2}{v_1^1}
= \frac{w_1}{v_1^1} - \frac{w_2 v_1^1 - w_1 v_2^1}{v_2^2 v_1^1 - v_1^2 v_2^1} \cdot \frac{v_1^2}{v_1^1}
= \frac{w_1 \left( v_2^2 v_1^1 - v_1^2 v_2^1 \right) - v_1^2 \left( w_2 v_1^1 - w_1 v_2^1 \right)}{v_1^1 \left( v_2^2 v_1^1 - v_1^2 v_2^1 \right)}
= \frac{w_1 v_2^2 - v_1^2 w_2}{v_2^2 v_1^1 - v_1^2 v_2^1}.
$$
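As a quick numerical check of these closed-form expressions, here is a small sketch in numpy (not part of the original answer key; the function name `coefficients` is our own):

```python
import numpy as np

def coefficients(v1, v2, w):
    # Solve w = alpha*v1 + beta*v2 in R^2 with the formulas derived above.
    # Indexing convention: v1[0] = v_1^1, v1[1] = v_2^1, similarly for v2 and w.
    d = v2[1] * v1[0] - v2[0] * v1[1]     # v_2^2 v_1^1 - v_1^2 v_2^1
    if d == 0:
        raise ValueError("v1 and v2 are collinear: cannot solve uniquely")
    beta = (w[1] * v1[0] - w[0] * v1[1]) / d
    alpha = (w[0] * v2[1] - v2[0] * w[1]) / d
    return alpha, beta

v1, v2, w = np.array([1.0, 2.0]), np.array([3.0, 1.0]), np.array([5.0, 5.0])
alpha, beta = coefficients(v1, v2, w)
print(alpha, beta)                               # 2.0 1.0
print(np.allclose(alpha * v1 + beta * v2, w))    # True
```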

We now need to show that $v^1$ and $v^2$ being linearly independent implies $v_1^2 v_2^1 - v_2^2 v_1^1 \neq 0$. First note that if $v_1^2 = 0$, then necessarily $v_2^2 \neq 0$, and hence $v_1^2 v_2^1 - v_2^2 v_1^1 = -v_2^2 v_1^1 \neq 0$. On the other hand, if $v_1^2 \neq 0$, then $v_1^2 v_2^1 - v_2^2 v_1^1 = v_1^1 v_1^2 \left( \frac{v_2^1}{v_1^1} - \frac{v_2^2}{v_1^2} \right)$. Now $v_1^1 v_1^2$ is nonzero by assumption, and we obtain that the term in parentheses is nonzero by negating the implication in the second branch of our characterization of collinearity.
Now for the "only if" part. Assume that $v^1$ and $v^2$ are linearly dependent. We need to show that we can't solve for $\alpha$ and $\beta$. From the expression for $\beta$, this will be the case if $v_2^2 v_1^1 - v_1^2 v_2^1 = 0$. From (1), there are three possibilities:

(a) $v_1^i = v_2^i = 0$ for $i = 1$ or $i = 2$ (at least one of the vectors is zero);

(b) both $v_1^1$ and $v_1^2$ are nonzero and $\dfrac{v_2^1}{v_1^1} = \dfrac{v_2^2}{v_1^2}$;

(c) both $v_2^1$ and $v_2^2$ are nonzero and $\dfrac{v_1^1}{v_2^1} = \dfrac{v_1^2}{v_2^2}$.

All three conditions imply that $v_2^2 v_1^1 - v_1^2 v_2^1 = 0$.

(3) What is the difference between a minimal spanning set for a vector space and a basis for a vector space? Provide an example highlighting this difference.

Ans: A minimal spanning set does not have to consist of elements of the vector space $V$, while the elements of a basis must belong to $V$. For example, consider the vector space
$$V = \left\{ x \,\middle|\, x = \alpha \begin{pmatrix} 1 \\ 1 \end{pmatrix},\ \alpha \in \mathbb{R} \right\},$$
which is a straight line through the origin.

A minimal spanning set for $V$ is given by the two vectors
$$v^1 = \begin{pmatrix} 1 \\ 0 \end{pmatrix}, \quad v^2 = \begin{pmatrix} 0 \\ 1 \end{pmatrix}.$$
Take away either of the two vectors and the remaining vector does not span $V$. However, they are not a basis, as neither $v^1$ nor $v^2$ is an element of $V$.
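This distinction can be illustrated numerically. Below is a small sketch (assuming numpy; the helper name `in_span` is our own) that tests whether a vector lies in the span of a given set via least squares:

```python
import numpy as np

def in_span(vectors, x):
    # x lies in the span of the given vectors iff the least-squares
    # residual of the system  A t = x  is (numerically) zero.
    A = np.column_stack(vectors)
    t, *_ = np.linalg.lstsq(A, x, rcond=None)
    return np.allclose(A @ t, x)

v1, v2 = np.array([1.0, 0.0]), np.array([0.0, 1.0])
d = np.array([1.0, 1.0])          # direction vector of the line V

print(in_span([v1, v2], d))       # True: together v1 and v2 span V
print(in_span([v1], d))           # False: dropping v2 loses V
print(in_span([d], v1))           # False: v1 itself is not an element of V
```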

(4) Simon & Blume question 11.14 (page 249). Explain your answer.

Ans: This question asks you to determine whether a set of vectors is a basis or not. From Simon & Blume Theorem 11.7 (page 248) you know that a basis of $\mathbb{R}^n$ contains exactly $n$ vectors, and from Theorem 11.8 you know that $n$ vectors form a basis iff they are linearly independent. There are again several ways to do this:

(i) Following problem 2, show that the 3 vectors are linearly independent.

(ii) Calculate the determinant of the 3 vectors in parts b)-d). If it is nonzero, we know that the 3 vectors are linearly independent and therefore form a basis.

Since we did method (i) in problem 2, let's do method (ii) here; a numerical cross-check follows part e) below.

a) Two vectors cannot span $\mathbb{R}^3$, and we therefore know that the vectors can't be a basis. (In general, a set of $n$ vectors can span at most a space of dimension $n$.)
b) $\det \begin{pmatrix} 1 & 1 & 1 \\ 1 & 2 & 0 \\ 1 & 1 & 1 \end{pmatrix}$

Expanding along the third column:
$$\det = 1 \cdot \begin{vmatrix} 1 & 2 \\ 1 & 1 \end{vmatrix} + 1 \cdot \begin{vmatrix} 1 & 1 \\ 1 & 2 \end{vmatrix} = (1 - 2) + (2 - 1) = 0$$
Since the determinant is zero, the three vectors are linearly dependent and thus can't be a basis.
c) $\det \begin{pmatrix} 6 & 5 & 4 \\ 3 & 2 & 1 \\ 9 & 8 & 7 \end{pmatrix}$

Expanding along the third column:
$$\det = 4 \cdot \begin{vmatrix} 3 & 2 \\ 9 & 8 \end{vmatrix} - 1 \cdot \begin{vmatrix} 6 & 5 \\ 9 & 8 \end{vmatrix} + 7 \cdot \begin{vmatrix} 6 & 5 \\ 3 & 2 \end{vmatrix} = 4(24 - 18) - (48 - 45) + 7(12 - 15) = 24 - 3 - 21 = 0$$
Since the determinant is zero, the three vectors are linearly dependent and thus can't be a basis.
d) $\det \begin{pmatrix} 1 & 1 & 1 \\ 1 & 2 & 0 \\ 1 & 1 & 0 \end{pmatrix}$

Expanding along the third column:
$$\det = 1 \cdot \begin{vmatrix} 1 & 2 \\ 1 & 1 \end{vmatrix} = 1 - 2 = -1$$
Since the determinant is nonzero, the three vectors are linearly independent and span $\mathbb{R}^3$, and hence are a basis.
e) 4 vectors cannot be a basis for $\mathbb{R}^3$, as they have to be linearly dependent. (In general, if your vector space has dimension $n$ and you have more than $n$ vectors, they have to be linearly dependent.)
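The determinants in parts b)-d) are easy to cross-check numerically, for instance with numpy (a sketch, not part of the original answer key):

```python
import numpy as np

# The three matrices from parts b)-d).
B = np.array([[1, 1, 1], [1, 2, 0], [1, 1, 1]])
C = np.array([[6, 5, 4], [3, 2, 1], [9, 8, 7]])
D = np.array([[1, 1, 1], [1, 2, 0], [1, 1, 0]])

for name, M in [("b", B), ("c", C), ("d", D)]:
    det = np.linalg.det(M)
    verdict = "basis" if abs(det) > 1e-12 else "not a basis"
    print(f"{name}) det = {det:.0f} -> {verdict}")   # b) 0, c) 0, d) -1
```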

(5) Consider the following conjecture: Let $V$ be a vector space spanned by the set $\{v^1, v^2, \ldots, v^n\}$. The set $\{v^1, v^2, \ldots, v^n\}$ is a minimal spanning set for $V$ iff the vectors $v^1, v^2, \ldots, v^n$ are linearly independent. Is the conjecture true? Prove your answer.

Ans: The conjecture is false because of the "⇐" direction. Consider
$$v^1 = \begin{pmatrix} 1 \\ 0 \\ 0 \end{pmatrix}, \quad v^2 = \begin{pmatrix} 0 \\ 1 \\ 0 \end{pmatrix}, \quad v^3 = \begin{pmatrix} 0 \\ 0 \\ 1 \end{pmatrix}$$
and let
$$V = \left\{ \alpha \begin{pmatrix} 1 \\ 1 \\ 0 \end{pmatrix},\ \alpha \in \mathbb{R} \right\}$$
(a line in the plane where $x_3 = 0$).
Clearly, $v^1$, $v^2$, and $v^3$ are linearly independent, and they span $V$ (as they span $\mathbb{R}^3$ and hence any subset of it), but they are not a minimal spanning set (as you can drop $v^3$ and still span $V$).
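A quick rank-based check of this counterexample (a numpy sketch, not part of the original answer key):

```python
import numpy as np

v1, v2, v3 = np.eye(3)            # the standard basis vectors of R^3
d = np.array([1.0, 1.0, 0.0])     # direction vector of the line V

# The three vectors are linearly independent:
print(np.linalg.matrix_rank(np.column_stack([v1, v2, v3])) == 3)   # True

# But v3 can be dropped and {v1, v2} still spans V:
A = np.column_stack([v1, v2])
t, *_ = np.linalg.lstsq(A, d, rcond=None)
print(np.allclose(A @ t, d))                                       # True
```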

(6) Let $U$ and $W$ be vector subspaces of the vector space $V$. Define the space
$U + W = \{x \mid x = u + w,\ u \in U,\ w \in W\}$. Show that $U + W$ is a vector subspace of $V$.

Ans: Recall that you have to check three conditions to show that $X$ is a vector subspace of the vector space $V$:

(i) $X$ has to be nonempty;

(ii) $\forall x_1, x_2 \in X: x_1 + x_2 \in X$;

(iii) $\forall x_1 \in X,\ \alpha \in \mathbb{R}: \alpha x_1 \in X$.

Now let's prove the claim that if $U$ and $W$ are vector subspaces, then so is $U + W$.

(i) Since $0$ is an element of any vector subspace, it is an element of both $U$ and $W$. Therefore
$$0 = \underbrace{0}_{\in U} + \underbrace{0}_{\in W}$$
and hence $0$ is an element of $U + W$. Consequently, $U + W$ is nonempty.
(ii) (1) $\forall x_1, x_2 \in U + W$: $\exists u_1, u_2 \in U$ and $\exists w_1, w_2 \in W$ such that $x_1 = u_1 + w_1$ and $x_2 = u_2 + w_2$.
(2) Since $U$ is a vector space, we know that $u_1 + u_2 \in U$.
(3) Since $W$ is a vector space, we know that $w_1 + w_2 \in W$.
(4) From (2) and (3) we therefore know that
$$x_1 + x_2 = u_1 + w_1 + u_2 + w_2 \ \text{(by (1))} = \underbrace{u_1 + u_2}_{\in U} + \underbrace{w_1 + w_2}_{\in W}$$
is an element of $U + W$.
(iii) (1) $\forall x_1 \in U + W$: $\exists u_1 \in U$ and $\exists w_1 \in W$ such that $x_1 = u_1 + w_1$.
(2) Since $U$ is a vector space, we know that $\forall \alpha \in \mathbb{R}: \alpha u_1 \in U$.
(3) Since $W$ is a vector space, we know that $\forall \alpha \in \mathbb{R}: \alpha w_1 \in W$.
(4) From (2) and (3) we therefore know that
$$\alpha x_1 = \alpha (u_1 + w_1) \ \text{(by (1))} = \underbrace{\alpha u_1}_{\in U} + \underbrace{\alpha w_1}_{\in W}$$
is an element of $U + W$.
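For concrete finite-dimensional examples, this proof has a computational counterpart: stacking spanning vectors of $U$ and $W$ yields a spanning set of $U + W$, so membership and closure can be checked by rank. A small numpy sketch (our own illustration; the helper name `in_span` is an assumption):

```python
import numpy as np

def in_span(A, x):
    # x lies in the column span of A iff appending x does not raise the rank.
    return np.linalg.matrix_rank(np.column_stack([A, x])) == np.linalg.matrix_rank(A)

U = np.array([[1.0], [0.0], [0.0]])   # U = span{(1,0,0)}
W = np.array([[0.0], [1.0], [0.0]])   # W = span{(0,1,0)}
S = np.column_stack([U, W])           # columns of U and W together span U + W

u, w = 2 * U[:, 0], -3 * W[:, 0]
print(in_span(S, u + w))                   # True: sums stay in U + W
print(in_span(S, 5 * (u + w)))             # True: scalar multiples stay in U + W
print(in_span(S, np.array([0., 0., 1.])))  # False: (0,0,1) is not in U + W
```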
