
9. Consider the vectors cos(x + α) and sin(x) in C[−π, π]. For what values of α will the two vectors be linearly dependent? Give a graphical interpretation of your answer.

Solution:
Let us first define a matrix A whose first row contains the two functions and whose second row contains their derivatives:

    A = [ sin(x)    cos(x + α) ]
        [ cos(x)   −sin(x + α) ]

We know from Theorem 3.3.3 that these vectors will be linearly independent when det(A) ≠ 0. Therefore we need to consider the values of α that make det(A) = 0.
We compute, using the identity cos(A − B) = cos(A)cos(B) + sin(A)sin(B):

    det(A) = −sin(x)sin(x + α) − cos(x)cos(x + α)
           = −(cos(x)cos(x + α) + sin(x)sin(x + α))
           = −cos(x − (x + α))
           = −cos(−α)
           = −cos(α)
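As a quick check (a sympy sketch added here, not part of the original solution), the determinant above does simplify to −cos(α), with no dependence on x:

    import sympy as sp

    x, alpha = sp.symbols('x alpha')
    # matrix of the two functions and their derivatives
    A = sp.Matrix([[sp.sin(x),  sp.cos(x + alpha)],
                   [sp.cos(x), -sp.sin(x + alpha)]])
    print(sp.simplify(A.det()))   # -cos(alpha)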
Since cos(α) = 0 exactly when α is an odd multiple of π/2, we consider those values of α. Clearly, when α is an odd multiple of π/2, the graph of cos(x) is shifted to the left or right by an odd multiple of π/2. When this occurs, cos(x + α) becomes equal to either sin(x) or −sin(x).
sin(x) and sin(x) are clearly linearly dependent because the system
    c1 sin(x) + c2 sin(x) = 0
has the nontrivial solution c1 = 1 and c2 = −1.
Similarly, sin(x) and −sin(x) are linearly dependent because the system
    c1 sin(x) + c2 (−sin(x)) = 0
has the nontrivial solution c1 = 1 and c2 = 1.
Thus, the two vectors are linearly dependent exactly when α is an odd multiple of π/2. (Graphically, shifting the cosine curve left or right by an odd multiple of π/2 makes its graph coincide with the graph of sin(x) or with its reflection −sin(x), as described above.)
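The collapse onto ±sin(x) at odd multiples of π/2 can also be seen symbolically; the following sympy sketch (again my own addition, not part of the textbook solution) prints the shifted cosine for a few such values of α:

    import sympy as sp

    x = sp.symbols('x')
    for alpha in (sp.pi/2, -sp.pi/2, 3*sp.pi/2):
        # cos(x + k*pi/2) reduces to +/- sin(x) for odd k
        print(alpha, '->', sp.simplify(sp.cos(x + alpha)))   # -sin(x), sin(x), sin(x)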
13. Prove that any nonempty subset of a linearly independent set of vectors {v1, ..., vn} is also linearly independent.
Solution:
I will prove the contrapositive: if some nonempty subset of {v1, ..., vn} is linearly dependent, then the whole set is linearly dependent.
Suppose (relabeling the vectors if necessary) that the subset is {v1, ..., vk}, with k ≤ n, and that it is linearly dependent. Then there exist scalars c1, c2, ..., ck, not all equal to 0, such that
    c1 v1 + c2 v2 + ... + ck vk = 0.
Now consider the full set {v1, ..., vn}, which contains every vector of the subset. Assigning the coefficient 0 to each of the remaining vectors, we obtain scalars c1, ..., ck, ck+1 = 0, ..., cn = 0, not all zero, for which
    c1 v1 + c2 v2 + ... + ck vk + ck+1 vk+1 + ... + cn vn = 0.
Hence {v1, ..., vn} is linearly dependent. Since we have proven the contrapositive of the original statement, the original statement also holds, which is our desired result.
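The statement can also be illustrated numerically. In the sketch below (numpy, with an arbitrary example set of vectors that is not from the text), linear independence shows up as full column rank, and it persists for any column subset:

    import numpy as np

    V = np.array([[1., 0., 2.],
                  [0., 1., 1.],
                  [0., 0., 3.]])          # columns v1, v2, v3 are linearly independent
    print(np.linalg.matrix_rank(V))       # 3

    subset = V[:, [0, 2]]                 # the subset {v1, v3}
    print(np.linalg.matrix_rank(subset))  # 2 = number of columns, so still independent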

14. Let A be an m × n matrix. Show that if A has linearly independent column vectors, then N(A) = {0}.

Solution:
I will call the column vectors of A: a1, a2, ..., an. Since the column vectors of A are linearly independent, we know by definition that
    c1 a1 + c2 a2 + ... + cn an = 0
only has the solution c1 = c2 = ... = cn = 0.
Recall that N(A) = {x ∈ R^n : Ax = 0}. This means that for any x in N(A) we must have Ax = 0. Let us represent any x ∈ R^n by x = (c1, c2, ..., cn)^T, where each ci is a scalar. Then, by the hint given in the book, we know that
    Ax = c1 a1 + c2 a2 + ... + cn an.
Since c1 a1 + c2 a2 + ... + cn an = 0 implies c1 = c2 = ... = cn = 0, we know that Ax = 0 implies x = (0, 0, ..., 0)^T, which clearly means that N(A) = {0}.
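A small numerical illustration of this fact (a sympy sketch with an arbitrary example matrix, not taken from the book):

    import sympy as sp

    A = sp.Matrix([[1, 0],
                   [2, 1],
                   [0, 3]])       # 3 x 2 matrix whose columns are linearly independent
    print(A.nullspace())          # [] : no basis vectors, so N(A) = {0}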
15. Let {x1, ..., xk} be linearly independent vectors in R^n, and let A be a nonsingular n × n matrix. Define yi = Axi for i = 1, ..., k. Show that {y1, ..., yk} are linearly independent.
Solution:
We have that {x1, ..., xk} are linearly independent vectors in R^n, so the n × k matrix X = (x1 ... xk) has linearly independent columns; that is, Xc = 0 only for c = 0. We also have that A is nonsingular, so A is invertible. Now consider the product
    AX = (Ax1 ... Axk) = (y1 ... yk).
Suppose c1 y1 + c2 y2 + ... + ck yk = 0 for scalars c1, ..., ck; in matrix form this says (AX)c = 0, where c = (c1, ..., ck)^T. Multiplying on the left by A^(−1) gives Xc = 0, and since the columns of X are linearly independent, c1 = c2 = ... = ck = 0. Therefore the only linear combination of y1, ..., yk equal to the zero vector is the trivial one, so {y1, ..., yk} are linearly independent, our desired result.
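The same conclusion can be checked numerically. In this sketch (numpy, with arbitrary example vectors and an arbitrary nonsingular A, none of which come from the text), the rank of (y1 ... yk) equals k, confirming independence:

    import numpy as np

    X = np.array([[1., 0.],
                  [0., 1.],
                  [1., 1.]])              # columns x1, x2 in R^3, independent (k = 2)
    A = np.array([[2., 0., 1.],
                  [0., 1., 0.],
                  [1., 0., 1.]])          # nonsingular 3 x 3 matrix (det = 1)
    Y = A @ X                             # columns are y1 = A x1, y2 = A x2
    print(np.linalg.matrix_rank(X), np.linalg.matrix_rank(Y))  # 2 2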
