Josh Hernandez
October 27, 2009
1.4 - Linear Combinations and Systems of Linear Equations
2. Solve the following systems of linear equations.
b.
    2x_1 - 7x_2 + 4x_3 = 10
     x_1 - 2x_2 +  x_3 =  3
    2x_1 -  x_2 - 2x_3 =  6
Solution:
1. Scaling down from first pivot:
     1 ( 2x_1 - 7x_2 + 4x_3 = 10)
    -2 (  x_1 - 2x_2 +  x_3 =  3)
    -1 ( 2x_1 -  x_2 - 2x_3 =  6)
2. Summing down from first row:
    2x_1 - 7x_2 + 4x_3 = 10
          -3x_2 + 2x_3 =  4
          -6x_2 + 6x_3 =  4
3. Scaling down from second pivot:
    2x_1 - 7x_2 + 4x_3 = 10
     2 ( -3x_2 + 2x_3 = 4)
    -1 ( -6x_2 + 6x_3 = 4)
4. Summing down from second row:
    2x_1 - 7x_2 + 4x_3 = 10
          -3x_2 + 2x_3 =  4
                 -2x_3 =  4
5. Scaling up from second pivot:
    -3 ( 2x_1 - 7x_2 + 4x_3 = 10)
     7 (       -3x_2 + 2x_3 =  4)
                      -2x_3 =  4
6. Summing up from second pivot:
    -6x_1       + 2x_3 = -2
         -3x_2 + 2x_3 =  4
                -2x_3 =  4
7. Summing up from third pivot:
    -6x_1 = 2
    -3x_2 = 8
    -2x_3 = 4
8. Solution:
    x_1 = -1/3
    x_2 = -8/3
    x_3 = -2
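The elimination above can be reproduced in exact rational arithmetic. This is an illustrative sketch (the helper code and variable names are mine, not part of the original solution):

```python
from fractions import Fraction as F

# Augmented matrix for the system in part (b):
#   2x1 - 7x2 + 4x3 = 10
#    x1 - 2x2 +  x3 =  3
#   2x1 -  x2 - 2x3 =  6
A = [[F(2), F(-7), F(4), F(10)],
     [F(1), F(-2), F(1), F(3)],
     [F(2), F(-1), F(-2), F(6)]]

n = 3
# Forward elimination: clear the entries below each pivot.
for p in range(n):
    for r in range(p + 1, n):
        m = A[r][p] / A[p][p]
        A[r] = [a - m * b for a, b in zip(A[r], A[p])]
# Back substitution.
x = [F(0)] * n
for p in reversed(range(n)):
    s = A[p][n] - sum(A[p][j] * x[j] for j in range(p + 1, n))
    x[p] = s / A[p][p]

print(x)  # [Fraction(-1, 3), Fraction(-8, 3), Fraction(-2, 1)]
```

The fractions module keeps every intermediate value exact, so the computed solution matches x_1 = -1/3, x_2 = -8/3, x_3 = -2 with no rounding.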
d.
    x_1 + 2x_2 + 2x_3        =  2
    x_1        + 8x_3 + 5x_4 = -6
    x_1 +  x_2 + 5x_3 + 5x_4 =  3
Solution:
1. Scaling down from first pivot:
        x_1 + 2x_2 + 2x_3        =  2
    -1 (x_1        + 8x_3 + 5x_4 = -6)
    -1 (x_1 +  x_2 + 5x_3 + 5x_4 =  3)
2. Summing down from first row:
    x_1 + 2x_2 + 2x_3        =  2
          2x_2 - 6x_3 - 5x_4 =  8
           x_2 - 3x_3 - 5x_4 = -1
3. Scaling down from second pivot:
    x_1 + 2x_2 + 2x_3        =  2
          2x_2 - 6x_3 - 5x_4 =  8
    -2 (   x_2 - 3x_3 - 5x_4 = -1)
4. Summing down from second row:
    x_1 + 2x_2 + 2x_3        =  2
          2x_2 - 6x_3 - 5x_4 =  8
                        5x_4 = 10
5. Scaling up from second pivot:
    -1 (x_1 + 2x_2 + 2x_3 = 2)
          2x_2 - 6x_3 - 5x_4 =  8
                        5x_4 = 10
6. Summing up from second pivot:
    -x_1 - 8x_3 - 5x_4 =  6
    2x_2 - 6x_3 - 5x_4 =  8
                  5x_4 = 10
7. Summing up from third pivot:
    -x_1 - 8x_3 = 16
    2x_2 - 6x_3 = 18
           5x_4 = 10
8. Solution:
    x_1 = -16 - 8x_3
    x_2 =   9 + 3x_3
    x_3 =  x_3
    x_4 =  2
This solution set is the line through the point (-16, 9, 0, 2) in the direction (-8, 3, 1, 0).
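Back-substituting the equations from step 7 gives x_1 = -16 - 8x_3, x_2 = 9 + 3x_3, x_4 = 2, with x_3 free. A quick sketch checking this parametric solution against the original system for several values of the free variable (the residuals helper is my own):

```python
# Check the parametric solution to part (d): from step 7,
#   x1 = -16 - 8*x3,  x2 = 9 + 3*x3,  x4 = 2,  x3 free.
def residuals(t):
    x1, x2, x3, x4 = -16 - 8 * t, 9 + 3 * t, t, 2
    # Each entry should be 0 if (x1, x2, x3, x4) solves the system.
    return (x1 + 2 * x2 + 2 * x3 - 2,
            x1 + 8 * x3 + 5 * x4 + 6,
            x1 + x2 + 5 * x3 + 5 * x4 - 3)

for t in (0, 1, -2, 10):
    assert residuals(t) == (0, 0, 0)
print("solution line: point (-16, 9, 0, 2), direction (-8, 3, 1, 0)")
```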
3. For each of the following lists of vectors in R^3, determine whether the first vector can be expressed as a linear combination of the other two.
b. (1, 2, -3), (-3, 2, 1), (2, -1, -1).
Solution: We can find such a linear combination, i.e. a, b ∈ R such that
    a(1, 2, -3) + b(-3, 2, 1) = (2, -1, -1),
if we can find a solution to the linear system
     a - 3b =  2
    2a + 2b = -1
   -3a +  b = -1
1. Scaling down from first pivot:
     6 (  a - 3b =  2)
    -3 ( 2a + 2b = -1)
     2 (-3a +  b = -1)
2. Summing down from first row:
    a - 3b =  2
      -24b = 15
      -16b = 10
The bottom two rows are equivalent; both reduce to 8b = -5.
3. Scaling up from the second pivot:
    8 (a - 3b =  2)
    3 (    8b = -5)
4. Summing up from the second row:
    8a =  1
    8b = -5
5. Solution:
    a =  1/8
    b = -5/8
So (2, -1, -1) = (1/8)(1, 2, -3) - (5/8)(-3, 2, 1). Multiplying through by 8 and rearranging, (1, 2, -3) = 8(2, -1, -1) + 5(-3, 2, 1), so the first vector can indeed be expressed as a linear combination of the other two, and these vectors are linearly dependent.
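An exact-arithmetic check of this system, using Python's fractions module (the variable names are mine):

```python
from fractions import Fraction as F

# a*(1,2,-3) + b*(-3,2,1) = (2,-1,-1) gives the system
#   a - 3b = 2,   2a + 2b = -1,   -3a + b = -1.
# Substituting a = 2 + 3b (from the first equation) into the second:
# 2(2 + 3b) + 2b = -1  =>  8b = -5.
b = F(-5, 8)
a = 2 + 3 * b
assert a - 3 * b == 2          # first equation
assert 2 * a + 2 * b == -1     # second equation
assert -3 * a + b == -1        # third equation is consistent too
print(a, b)                    # 1/8 -5/8
```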
d. (2, -1, 0), (1, 2, -3), (1, -3, 2).
Solution: Suppose there existed such a linear combination,
    a(2, -1, 0) + b(1, 2, -3) = (1, -3, 2).
Then there must be a solution to the linear system
    2a +  b =  1
    -a + 2b = -3
        -3b =  2
So b = -2/3. Plugging this into the first two equations,
    2a - 2/3 = 1
    -a + 2(-2/3) = -3
Solving for a gives conflicting results: a = 5/6 and a = 5/3. This system of equations has no solution, so the three vectors are linearly independent.
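The inconsistency can be confirmed in exact arithmetic (the variable names are my own):

```python
from fractions import Fraction as F

# b is forced by the third equation: -3b = 2.
b = F(2, -3)
# Each of the first two equations then determines its own value of a:
a_from_eq1 = (1 - b) / 2          # 2a + b = 1
a_from_eq2 = 2 * b + 3            # -a + 2b = -3  =>  a = 2b + 3
print(a_from_eq1, a_from_eq2)     # 5/6 5/3
assert a_from_eq1 != a_from_eq2   # inconsistent: no such a, b exist
```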
5h. Determine whether the given vector is in the span of S:
    v = [1 0; 0 1],
    S = { v_1 = [1 0; -1 0],  v_2 = [0 1; 0 1],  v_3 = [1 1; 0 0] },
where [a b; c d] denotes the 2×2 matrix with rows (a, b) and (c, d).
Solution: Suppose there existed a, b, c ∈ R such that av_1 + bv_2 + cv_3 = v. A quick check of the set reveals that only v_1 has a nonzero entry in the bottom-left position. Since v has a zero in that position, a = 0. Likewise, considering the bottom-right entry, we see that b = 1. This leaves us with the much simpler linear combination
    [0, 1; 0, 1] + c [1, 1; 0, 0] = [c, 1 + c; 0, 1] = [1, 0; 0, 1].
Comparing the remaining entries gives the conflicting solutions c = 1 (top left) and c = -1 (top right), so v is not in the span of S.
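Equivalently, one can flatten the matrices to vectors in R^4 and compare ranks: v lies in span(S) exactly when adding it does not raise the rank. A sketch in exact arithmetic (the rank helper is my own, not from the text):

```python
from fractions import Fraction as F

def rank(rows):
    # Row-reduce a list of row vectors over the rationals.
    m = [[F(x) for x in r] for r in rows]
    r = 0
    for c in range(len(m[0])):
        piv = next((i for i in range(r, len(m)) if m[i][c] != 0), None)
        if piv is None:
            continue
        m[r], m[piv] = m[piv], m[r]
        for i in range(len(m)):
            if i != r and m[i][c] != 0:
                f = m[i][c] / m[r][c]
                m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

v = (1, 0, 0, 1)                                   # target, flattened row-major
S = [(1, 0, -1, 0), (0, 1, 0, 1), (1, 1, 0, 0)]    # v1, v2, v3 flattened
print(rank(S), rank(S + [v]))  # 3 4 -> v is not in span(S)
```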
10. Show that if
    M_1 = [1 0; 0 0],   M_2 = [0 0; 0 1],   and   M_3 = [0 1; 1 0],
then the span of {M_1, M_2, M_3} is the set of all symmetric 2×2 matrices.
Solution: The generic symmetric 2×2 matrix satisfies the equation A = A^t, or
    [a_11 a_12; a_21 a_22] = [a_11 a_21; a_12 a_22].
This gives the solutions
    a_11 = a_11,  a_12 = a_12,  a_21 = a_12,  a_22 = a_22,
and the solution set
    { [a_11 a_12; a_12 a_22] : a_11, a_12, a_22 ∈ R }.
This generic symmetric matrix can be generated from the given matrices,
    a_11 [1 0; 0 0] + a_22 [0 0; 0 1] + a_12 [0 1; 1 0],
and conversely any linear combination of M_1, M_2, and M_3 is symmetric, so the span of {M_1, M_2, M_3} is exactly the set of symmetric 2×2 matrices.
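This can be spot-checked numerically: every choice of coefficients produces the generic symmetric matrix with those entries. A quick sketch (the combo helper and coefficient names are mine):

```python
from fractions import Fraction as F
import random

M1 = [[1, 0], [0, 0]]
M2 = [[0, 0], [0, 1]]
M3 = [[0, 1], [1, 0]]

def combo(a11, a22, a12):
    # Entrywise a11*M1 + a22*M2 + a12*M3.
    return [[a11 * M1[i][j] + a22 * M2[i][j] + a12 * M3[i][j]
             for j in range(2)] for i in range(2)]

for _ in range(100):
    a11, a12, a22 = (F(random.randint(-9, 9)) for _ in range(3))
    A = combo(a11, a22, a12)
    # The combination is symmetric with arbitrary prescribed entries.
    assert A == [[a11, a12], [a12, a22]]
print("every symmetric 2x2 matrix is a combination of M1, M2, M3")
```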
14. Show that if S_1 and S_2 are arbitrary subsets of a vector space V, then
    span(S_1 ∪ S_2) = span(S_1) + span(S_2).
Solution:
    span(S_1 ∪ S_2) = { Σ_{i=1}^n c_i u_i : u_i ∈ S_1 ∪ S_2, c_i ∈ R }
                    = { Σ_{i=1}^n c_i u_i : u_i ∈ S_1 or u_i ∈ S_2, c_i ∈ R }.
Now, we can divide the u_i into those vectors from S_1 and those from S_2. Conversely, we can combine the v_i and w_i into a sum of vectors from S_1 ∪ S_2.
                    = { Σ_{i=1}^m c_i v_i + Σ_{i=1}^m d_i w_i : v_i ∈ S_1, w_i ∈ S_2, c_i, d_i ∈ R }
                    = { v + w : v ∈ span(S_1), w ∈ span(S_2) }
                    = span(S_1) + span(S_2).
1.5 - Linear Dependence and Linear Independence
2. Determine whether the following sets are linearly dependent or independent.
d. {x^3 - x, 2x^2 + 4, -2x^3 + 3x^2 + 2x + 6} in P_3(R)
Solution: The linear equation
    a(x^3 - x) + b(2x^2 + 4) + c(-2x^3 + 3x^2 + 2x + 6) = 0
can be rearranged, combining like terms,
    x^3(a - 2c) + x^2(2b + 3c) + x(-a + 2c) + 1(4b + 6c) = 0.
Since different powers of x are linearly independent, each coefficient must vanish:
    a - 2c = 0,  2b + 3c = 0,  -a + 2c = 0,  4b + 6c = 0.
The first and third equations coincide, and the fourth is twice the second, so c may be chosen freely with a = 2c and b = -3c/2. Taking c = 2 gives the nontrivial relation
    4(x^3 - x) - 3(2x^2 + 4) + 2(-2x^3 + 3x^2 + 2x + 6) = 0,
so the set is linearly dependent.
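The dependence can be double-checked by treating each polynomial as its coefficient vector in the basis {x^3, x^2, x, 1}; the relation with coefficients 4, -3, 2 comes from solving the coefficient equations above (a quick sketch):

```python
# Coefficient vectors of the three polynomials in the basis {x^3, x^2, x, 1}:
p1 = (1, 0, -1, 0)     # x^3 - x
p2 = (0, 2, 0, 4)      # 2x^2 + 4
p3 = (-2, 3, 2, 6)     # -2x^3 + 3x^2 + 2x + 6
# The dependence 4*p1 - 3*p2 + 2*p3 = 0, checked coefficient by coefficient:
print([4 * a - 3 * b + 2 * c for a, b, c in zip(p1, p2, p3)])  # [0, 0, 0, 0]
```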
    a_nn = -Σ_{i=1}^{n-1} a_ii.    (1)
Breaking this apart into a basis, we have two different sorts of matrices. First, those with zeros along the diagonal, whose basis is {E_ij : i ≠ j}, where E_ij denotes the n×n matrix with a 1 in position (i, j) and zeros elsewhere:
    E_12,  E_13,  ...,  E_{n,n-2},  E_{n,n-1}.
There are n^2 - n off-diagonal entries, so this basis contains n^2 - n matrices. Second, there are the diagonal matrices. These have the basis
    E_11 - E_nn,  E_22 - E_nn,  ...,  E_{n-1,n-1} - E_nn,
that is, diag(1, 0, ..., 0, -1), diag(0, 1, 0, ..., 0, -1), ..., diag(0, ..., 0, 1, -1).
There are n - 1 of these basis matrices, one for each diagonal entry except the last. Joining these two bases we have a linearly independent set (for each (i, j) ≠ (n, n), there is exactly one basis matrix M such that M_ij ≠ 0) spanning W (any matrix of the form in (1) can be built from a combination of these vectors, where the coefficients a_ij correspond to the basis vectors in the obvious way). The dimension of W is exactly (n^2 - n) + (n - 1) = n^2 - 1.
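The count (n^2 - n) + (n - 1) = n^2 - 1 can be sanity-checked for a small n. The sketch below (my own helper code, for n = 4, assuming per equation (1) that W consists of the n×n matrices whose diagonal entries sum to zero) builds both families of basis matrices, confirms they all lie in W, and reconstructs a random element of W with the "obvious" coefficients:

```python
from fractions import Fraction as F
import random

n = 4
basis = []
# Off-diagonal unit matrices E_ij, i != j: n^2 - n of them.
for i in range(n):
    for j in range(n):
        if i != j:
            basis.append([[F(1) if (r, c) == (i, j) else F(0)
                           for c in range(n)] for r in range(n)])
# Diagonal matrices E_ii - E_nn, i = 1, ..., n-1: n - 1 of them.
for i in range(n - 1):
    D = [[F(0)] * n for _ in range(n)]
    D[i][i], D[n - 1][n - 1] = F(1), F(-1)
    basis.append(D)

assert len(basis) == n * n - 1                      # (n^2 - n) + (n - 1)
assert all(sum(M[i][i] for i in range(n)) == 0 for M in basis)

# A random matrix whose diagonal sums to zero ...
A = [[F(random.randint(-5, 5)) for _ in range(n)] for _ in range(n)]
A[n - 1][n - 1] = -sum(A[i][i] for i in range(n - 1))

# ... is recovered using its own entries as coefficients.
R = [[F(0)] * n for _ in range(n)]
k = 0
for i in range(n):                 # off-diagonal coefficients a_ij
    for j in range(n):
        if i != j:
            for r in range(n):
                for c in range(n):
                    R[r][c] += A[i][j] * basis[k][r][c]
            k += 1
for i in range(n - 1):             # diagonal coefficients a_ii
    for r in range(n):
        for c in range(n):
            R[r][c] += A[i][i] * basis[k][r][c]
    k += 1
assert R == A
```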
20. Let V be a vector space having dimension n, and let S be a generating subset of V.
(a) Prove that there is a subset of S that is a basis of V.
Solution: Let β = {v_1, ..., v_n} be a basis of V. We cannot assume that S is finite, but we need only find a finite subset of S that generates β.

For each v_i, there is some collection {w_i1, w_i2, ..., w_ik_i} ⊆ S and scalars d_i1, ..., d_ik_i such that v_i = Σ_{j=1}^{k_i} d_ij w_ij. Define
    T = {w_ij : 1 ≤ i ≤ n, 1 ≤ j ≤ k_i}.
Then T is a finite subset of S. Observe that, for any v ∈ V, we can write v = Σ c_i v_i, so
    v = Σ_{i=1}^n c_i ( Σ_{j=1}^{k_i} d_ij w_ij ).
Thus T spans V. By Theorem 1.10, there must be a subset of T ⊆ S which is a basis of V.
(b) Prove that S contains at least n vectors.
Solution: By the above, T ⊆ S contains a basis of V, which must have exactly n vectors. Therefore S must have at least n vectors.
33. Let W_1 and W_2 be subspaces of a vector space V with bases β_1 and β_2, respectively.
(a) Assume W_1 ⊕ W_2 = V. Prove that β_1 ∩ β_2 = ∅ and that β_1 ∪ β_2 is a basis for V.
Solution: First, if v lies in β_1 ∩ β_2, then v ∈ W_1 ∩ W_2 = {0}. However, 0 cannot be a member of any basis, neither β_1 nor β_2. Therefore β_1 ∩ β_2 = ∅.
Denote the elements of β_1 and β_2 by v_1, v_2, ... and w_1, w_2, ..., respectively. We now must prove that (1) β_1 ∪ β_2 is linearly independent, and that (2) β_1 ∪ β_2 spans V.
(1) Suppose
    c_1 v_1 + ... + c_m v_m + c_{m+1} w_1 + ... + c_{m+n} w_n = 0    (2)
for some c_1, ..., c_{m+n}. We can rearrange this sum:
    c_1 v_1 + ... + c_m v_m = -c_{m+1} w_1 - ... - c_{m+n} w_n.
Now, the LHS lies in W_1, and the RHS lies in W_2, so both must lie in W_1 ∩ W_2 = {0}. Thus
    c_1 v_1 + ... + c_m v_m = 0   and   c_{m+1} w_1 + ... + c_{m+n} w_n = 0,
but both of these linear combinations must be trivial, since β_1 and β_2 are both bases. Thus c_1 = c_2 = ... = c_{m+n} = 0, so (2) must be trivial. Therefore β_1 ∪ β_2 must be linearly independent.
(2) Now, given u ∈ V, we can write u = v + w, with v ∈ W_1 and w ∈ W_2. Likewise, we can write v = Σ_{i=1}^{n_1} c_i v_i and w = Σ_{i=1}^{n_2} d_i w_i. So
    u = v + w = Σ_{i=1}^{n_1} c_i v_i + Σ_{i=1}^{n_2} d_i w_i,
which is a linear combination in β_1 ∪ β_2. So the union of bases spans V.
(b) Conversely, if β_1 ∩ β_2 = ∅ and β_1 ∪ β_2 is a basis for V, prove that W_1 ⊕ W_2 = V.
Solution: Denote the elements of β_1 and β_2 by v_1, v_2, ... and w_1, w_2, ..., respectively. We need to prove that (1) W_1 ∩ W_2 = {0}, and (2) W_1 + W_2 = V.
(1) Suppose u ∈ W_1 ∩ W_2. Then we can write u in two ways:
    c_1 v_1 + c_2 v_2 + ... + c_m v_m = u = d_1 w_1 + d_2 w_2 + ... + d_n w_n.
This gives the linear combination
    c_1 v_1 + c_2 v_2 + ... + c_m v_m - d_1 w_1 - d_2 w_2 - ... - d_n w_n = 0.
Observe that all vectors in the sum lie in β_1 ∪ β_2, which is a basis, so all coefficients must be zero. Therefore u = 0. So W_1 ∩ W_2 = {0}.
(2) Now, any u ∈ V can be written as a linear combination in β_1 ∪ β_2, that is to say, of vectors from either β_1 or β_2. Rearrange this sum so that it has the form
    u = Σ_{i=1}^{n_1} c_i v_i + Σ_{j=1}^{n_2} d_j w_j = v + w,
where v is some vector in W_1 and w is some vector in W_2. Thus V = W_1 + W_2.
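A small concrete instance of part (a) can be checked directly; the subspaces and bases below are my own choices in R^3, not from the exercise:

```python
# Concrete instance in R^3: W1 = span(beta1), W2 = span(beta2),
# with W1 (+) W2 = R^3 (a direct sum).
beta1 = [(1, 0, 0), (0, 1, 0)]
beta2 = [(1, 1, 1)]

# beta1 and beta2 are disjoint, and their union is a basis of R^3
# exactly when the 3x3 matrix with these rows is invertible.
a, b, c = beta1 + beta2
det = (a[0] * (b[1] * c[2] - b[2] * c[1])
       - a[1] * (b[0] * c[2] - b[2] * c[0])
       + a[2] * (b[0] * c[1] - b[1] * c[0]))
assert set(beta1).isdisjoint(beta2)
assert det != 0   # the union of the two bases is a basis of R^3
```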