
7 Bilinear forms and inner products

Definition 7.1 A bilinear form on a vector space $V$ over a field $F$ is a function $\phi\colon V \times V \to F$ such that
$$\phi(\lambda u + \mu v, w) = \lambda\phi(u, w) + \mu\phi(v, w)$$
$$\phi(u, \lambda v + \mu w) = \lambda\phi(u, v) + \mu\phi(u, w)$$
for all $u, v, w \in V$ and all $\lambda, \mu \in F$. We say that $\phi$ is linear in each argument.
A bilinear form $\phi$ on $V$ is called symmetric if
$$\phi(u, v) = \phi(v, u)$$
for all $u, v \in V$.
Example 7.2 If $A \in M_{n \times n}(F)$ is any $n \times n$ matrix with entries in the field $F$ then there is a bilinear form $\phi_A$ on the vector space $F^n$ over $F$ defined by
$$\phi_A(u, v) = u A v^T$$
for all $u, v \in F^n$. Moreover $\phi_A$ is symmetric if and only if $A$ is a symmetric matrix (that is, $A^T = A$).
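In NumPy this construction is a one-liner; the following sketch (the helper name `phi` is ours, not standard) also illustrates the symmetry claim:

```python
import numpy as np

def phi(A, u, v):
    """Evaluate the bilinear form phi_A(u, v) = u A v^T for vectors u, v."""
    return u @ A @ v

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])       # symmetric, so phi_A is symmetric
u = np.array([1.0, 2.0])
v = np.array([3.0, -1.0])

print(phi(A, u, v), phi(A, v, u))  # equal, since A = A^T
```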
For the first time in this course it is now important that our field of scalars $F$ should be $\mathbb{R}$, because we want to know what it means for scalars to be positive.
Definition 7.3 A symmetric bilinear form $\phi$ on a vector space $V$ over $\mathbb{R}$ is called positive definite if
$$\phi(u, u) \geq 0 \text{ for all } u \in V$$
and
$$\phi(u, u) = 0 \iff u = 0.$$
A symmetric matrix $A \in M_{n \times n}(\mathbb{R})$ is called positive definite if the corresponding symmetric bilinear form $\phi_A$ on $\mathbb{R}^n$ defined in Example 7.2 above is positive definite.
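The notes do not give a test for positive definiteness, but a standard numerical criterion is that a symmetric real matrix is positive definite exactly when all its eigenvalues are strictly positive. A minimal sketch, assuming NumPy:

```python
import numpy as np

def is_positive_definite(A, tol=1e-12):
    """Symmetric A is positive definite iff all its eigenvalues are > 0."""
    if not np.allclose(A, A.T):
        return False
    return bool(np.all(np.linalg.eigvalsh(A) > tol))

print(is_positive_definite(np.array([[2.0, 1.0], [1.0, 3.0]])))  # True
print(is_positive_definite(np.array([[1.0, 2.0], [2.0, 1.0]])))  # False: eigenvalues 3, -1
```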
Example 7.4 The bilinear form $\phi_{I_n}$ on $\mathbb{R}^n$ associated as in Example 7.2 to the identity matrix $I_n \in M_{n \times n}(\mathbb{R})$ is a positive definite symmetric bilinear form. $\phi_{I_n}$ is the usual dot product on $\mathbb{R}^n$ given by
$$\phi_{I_n}(u, v) = u v^T = u \cdot v = \sum_{i=1}^n u_i v_i$$
for all $u = (u_1, \ldots, u_n)$ and $v = (v_1, \ldots, v_n)$ in $\mathbb{R}^n$.
Remark 7.5 It is traditional to use notation such as $u \cdot v$ or $\langle u, v \rangle$ (or even $(u, v)$, though this is not recommended as it can be confused with the notation for ordered pairs) for positive definite symmetric bilinear forms (also called inner products) on real vector spaces.
Definition 7.6 Let $V$ be a real vector space. An inner product on $V$ is a positive definite symmetric bilinear form on $V$; that is, it is a function $\langle \cdot, \cdot \rangle\colon V \times V \to \mathbb{R}$ with the following properties.
(1) For all $v \in V$ we have $\langle v, v \rangle \geq 0$ and $\langle v, v \rangle = 0 \iff v = 0$ (positive definiteness).
(2) For all $v, w \in V$ we have $\langle v, w \rangle = \langle w, v \rangle$ (symmetry).
(3) For all $u, v, w \in V$ and $\lambda, \mu \in \mathbb{R}$ we have $\langle \lambda u + \mu v, w \rangle = \lambda\langle u, w \rangle + \mu\langle v, w \rangle$ (linearity in the first argument).
Clearly linearity in the second argument follows from symmetry and linearity in the first argument.
An inner product space is a real vector space $V$ together with an inner product $\langle \cdot, \cdot \rangle$ on $V$.
Example 7.7 As noted above, the usual dot product is an inner product on $\mathbb{R}^n$. So is $\phi_A$ for any diagonal matrix $A \in M_{n \times n}(\mathbb{R})$ with strictly positive diagonal entries.
Example 7.8 Let $V$ be the vector space $\mathbb{R}_n[X]$ of polynomials of degree at most $n$ in $X$ with real coefficients. Then there is an inner product $\langle \cdot, \cdot \rangle$ on $V$ given by
$$\langle f, g \rangle = \int_0^1 f(t)g(t)\,dt.$$
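For polynomials this integral can be computed exactly from an antiderivative; a sketch using numpy.polynomial (the helper name `inner` is illustrative):

```python
from numpy.polynomial import Polynomial

def inner(f, g):
    """<f, g> = integral from 0 to 1 of f(t)g(t) dt, via the antiderivative."""
    F = (f * g).integ()          # antiderivative of the product f*g
    return F(1) - F(0)

f = Polynomial([0.0, 1.0])       # f(X) = X
g = Polynomial([1.0, 0.0, 1.0])  # g(X) = 1 + X^2

print(inner(f, g))               # integral of t + t^3 over [0,1] = 1/2 + 1/4 = 0.75
```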
An inner product allows us to define the length of a vector.
Definition 7.9 Let $\langle \cdot, \cdot \rangle$ be an inner product on a real vector space $V$. If $v \in V$ then the length of $v$ is
$$\|v\| = \sqrt{\langle v, v \rangle}.$$
Theorem 7.10 (Cauchy–Schwarz inequality)
Let $\langle \cdot, \cdot \rangle$ be an inner product on a real vector space $V$. If $u, v \in V$ then
$$|\langle u, v \rangle| \leq \|u\| \, \|v\|$$
with equality holding if and only if $u$ and $v$ are linearly dependent (that is, one is a scalar multiple of the other).
Proof: If $u, v \in V$ and $\lambda \in \mathbb{R}$ it follows from the properties of an inner product that
$$0 \leq \langle u - \lambda v, u - \lambda v \rangle = \langle u, u \rangle - 2\lambda\langle u, v \rangle + \lambda^2\langle v, v \rangle = \|u\|^2 - 2\lambda\langle u, v \rangle + \lambda^2\|v\|^2$$
with equality if and only if $u = \lambda v$. If $u = 0$ or $v = 0$ then the result is trivial, so we can assume that $u \neq 0$ and $v \neq 0$. Then we can take
$$\lambda = \|u\|/\|v\|$$
to get $2\|u\|^2 \geq 2\|u\|\,\|v\|^{-1}\langle u, v \rangle$ and hence $\langle u, v \rangle \leq \|u\| \, \|v\|$; applying the same argument with $v$ replaced by $-v$ bounds $-\langle u, v \rangle$ as well, so $|\langle u, v \rangle| \leq \|u\| \, \|v\|$. Moreover if equality holds here then $u = \lambda v$ so $u$ and $v$ are linearly dependent, and conversely if one of $u$ and $v$ is a scalar multiple of the other then it is easy to check that equality holds.
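A quick numerical illustration of the inequality and of the equality case, using the dot product on $\mathbb{R}^5$ (this is not part of the proof):

```python
import numpy as np

rng = np.random.default_rng(0)
u = rng.standard_normal(5)
v = rng.standard_normal(5)

lhs = abs(u @ v)
rhs = np.linalg.norm(u) * np.linalg.norm(v)
print(lhs <= rhs + 1e-12)  # True for any u, v

# Equality exactly when u and v are linearly dependent:
w = 2.5 * u
print(np.isclose(abs(u @ w), np.linalg.norm(u) * np.linalg.norm(w)))  # True
```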
The Cauchy–Schwarz inequality allows us to define the angle between two non-zero vectors in an inner product space.
Definition 7.11 Let $\langle \cdot, \cdot \rangle$ be an inner product on a real vector space $V$. The angle between non-zero vectors $u$ and $v$ in $V$ is the unique element $\theta$ of $[0, \pi]$ such that
$$\langle u, v \rangle = \|u\|\,\|v\| \cos\theta.$$
Vectors $u, v \in V$ are said to be orthogonal if $\langle u, v \rangle = 0$.
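With the dot product this definition can be evaluated directly; a small sketch (the clip guards against rounding pushing the cosine just outside $[-1, 1]$):

```python
import numpy as np

def angle(u, v):
    """Angle in [0, pi] between non-zero vectors u, v under the dot product."""
    c = (u @ v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.arccos(np.clip(c, -1.0, 1.0))

u = np.array([1.0, 0.0])
v = np.array([1.0, 1.0])
print(angle(u, v))  # pi/4, approximately 0.7854
```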
Remark 7.12 We can also define inner products (often called Hermitian inner products) for complex vector spaces, but we need to modify the axioms for real inner products, since they become inconsistent over the complex numbers.
Definition 7.13 Let $V$ be a complex vector space. A Hermitian inner product on $V$ is a function $\langle \cdot, \cdot \rangle\colon V \times V \to \mathbb{C}$ with the following properties.
(1) For all $v \in V$ we have $\langle v, v \rangle \in \mathbb{R}$ and $\langle v, v \rangle \geq 0$, and $\langle v, v \rangle = 0 \iff v = 0$ (positive definiteness).
(2) For all $v, w \in V$ we have $\langle v, w \rangle = \overline{\langle w, v \rangle}$ (complex-conjugate symmetry).
(3) For all $u, v, w \in V$ and $\lambda, \mu \in \mathbb{C}$ we have $\langle \lambda u + \mu v, w \rangle = \lambda\langle u, w \rangle + \mu\langle v, w \rangle$ (linearity in the first argument).
From complex-conjugate symmetry and linearity in the first argument we get complex-conjugate linearity in the second argument: $\langle u, \lambda v + \mu w \rangle = \bar{\lambda}\langle u, v \rangle + \bar{\mu}\langle u, w \rangle$ for all $u, v, w \in V$.
A Hermitian inner product space is a vector space $V$ over $\mathbb{C}$ together with a Hermitian inner product $\langle \cdot, \cdot \rangle$ on $V$.
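A sketch of the standard Hermitian inner product on $\mathbb{C}^n$, namely $\langle u, v \rangle = \sum_i u_i \overline{v_i}$, which is linear in the first argument as in the definition above:

```python
import numpy as np

def herm(u, v):
    """<u, v> = sum of u_i * conj(v_i): linear in u, conjugate-linear in v."""
    return np.sum(u * np.conj(v))

u = np.array([1.0 + 2.0j, 3.0j])
v = np.array([2.0 - 1.0j, 1.0 + 1.0j])

print(np.isclose(herm(u, v), np.conj(herm(v, u))))  # complex-conjugate symmetry
print(herm(u, u).imag, herm(u, u).real >= 0)        # <u, u> is real and non-negative
```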
Remark 7.14 The length of a vector in a Hermitian inner product space can be defined in exactly the same way as in a real vector space, and the definition of orthogonality of two vectors is also the same; however there is no analogous definition of the angle between two non-zero vectors. The Cauchy–Schwarz inequality for a Hermitian inner product space says that for all vectors $u$ and $v$
$$|\langle u, v \rangle| \leq \|u\| \, \|v\|$$
with equality holding if and only if $u$ and $v$ are linearly dependent (the proof is the same as in the real case except that $\lambda$ has to be chosen more carefully).
Definition 7.15 Let $\langle \cdot, \cdot \rangle$ be an inner product on a real vector space $V$. A subset $B$ of $V$ is said to be orthogonal if $\langle u, v \rangle = 0$ for all $u, v \in B$ such that $u \neq v$. It is said to be orthonormal if in addition $\langle u, u \rangle = 1$ for all $u \in B$.
An orthonormal basis of $V$ is an orthonormal subset of $V$ which is also a basis of $V$.
Lemma 7.16 Let $\langle \cdot, \cdot \rangle$ be an inner product on a real vector space $V$ and let $B$ be an orthogonal subset of $V$ consisting of non-zero vectors. Then $B$ is linearly independent.
Proof: Suppose that $v_1, \ldots, v_n$ are distinct elements of $B$ and $\lambda_1, \ldots, \lambda_n \in \mathbb{R}$ and that $\lambda_1 v_1 + \cdots + \lambda_n v_n = 0$. Then if $1 \leq j \leq n$ we have
$$0 = \langle \lambda_1 v_1 + \cdots + \lambda_n v_n, v_j \rangle = \lambda_1\langle v_1, v_j \rangle + \cdots + \lambda_n\langle v_n, v_j \rangle = \lambda_j\langle v_j, v_j \rangle.$$
By the hypothesis on $B$ we have $v_j \neq 0$ and so $\langle v_j, v_j \rangle \neq 0$, and therefore $\lambda_j = 0$. Thus $B$ is linearly independent.
Lemma 7.17 Let $\langle \cdot, \cdot \rangle$ be an inner product on a finite-dimensional real vector space $V$. Let $B = \{v_1, \ldots, v_n\}$ be a basis for $V$, and for each $v \in V$ let $[v]_B$ be the coordinate vector of $v$ with respect to the basis $B$. Then the basis $B$ is orthonormal if and only if
$$\langle u, v \rangle = ([u]_B)^T [v]_B \text{ for all } u, v \in V.$$
Proof to be completed.
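The lemma can be checked numerically for the dot product: take the columns of an orthogonal matrix as the basis, so coordinate vectors are obtained by solving a linear system. A sketch:

```python
import numpy as np

# The columns of Q form an orthonormal basis of R^2 for the dot product.
theta = 0.3
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

u = np.array([1.0, 2.0])
v = np.array([-3.0, 0.5])

u_B = np.linalg.solve(Q, u)  # coordinate vector [u]_B
v_B = np.linalg.solve(Q, v)  # coordinate vector [v]_B

print(np.isclose(u @ v, u_B @ v_B))  # True: the basis is orthonormal
```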
Recall that a square real matrix $A$ is orthogonal if and only if $A^{-1} = A^T$.
Definition 7.18 Let $\langle \cdot, \cdot \rangle_V$ be an inner product on a real vector space $V$ and let $\langle \cdot, \cdot \rangle_W$ be an inner product on a real vector space $W$. A linear transformation $T\colon V \to W$ is called an orthogonal linear transformation of the inner product spaces if
$$\langle T(u), T(v) \rangle_W = \langle u, v \rangle_V \text{ for all } u, v \in V.$$
Lemma 7.19 A linear transformation $T\colon V \to W$ of real inner product spaces is orthogonal if and only if its matrix $A$ with respect to any orthonormal bases $B_V$ and $B_W$ of $V$ and $W$ is orthogonal.
Proof: If $v \in V$ let $[v]_{B_V}$ be the coordinate vector of $v$ with respect to the basis $B_V$, and if $w \in W$ let $[w]_{B_W}$ be the coordinate vector of $w$ with respect to the basis $B_W$. Then if $v \in V$ we have
$$[T(v)]_{B_W} = A[v]_{B_V}$$
and so by Lemma 7.17 if $u, v \in V$ then
$$\langle T(u), T(v) \rangle_W = ([T(u)]_{B_W})^T [T(v)]_{B_W} = (A[u]_{B_V})^T A[v]_{B_V} = ([u]_{B_V})^T A^T A [v]_{B_V}.$$
If $A$ is orthogonal then this is equal to
$$([u]_{B_V})^T [v]_{B_V} = \langle u, v \rangle_V$$
and so $T$ is orthogonal. Conversely if $T$ is orthogonal then
$$([u]_{B_V})^T A^T A [v]_{B_V} = ([u]_{B_V})^T [v]_{B_V}$$
for all $u, v \in V$, and so taking $u$ and $v$ to be the $i$-th and $j$-th elements of the orthonormal basis $B_V$ we find that the $(i, j)$-th entry of $A^T A$ is the $(i, j)$-th entry of the identity matrix for all choices of $i$ and $j$, and thus $A$ is orthogonal.
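For example, a plane rotation satisfies $A^T A = I$ and therefore preserves the dot product, as the lemma predicts; a quick check:

```python
import numpy as np

# Matrix of a rotation of R^2 with respect to the standard (orthonormal) basis.
theta = 0.7
A = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

print(np.allclose(A.T @ A, np.eye(2)))  # True, so A is orthogonal

# Directly verify that the dot product is preserved:
u, v = np.array([1.0, 2.0]), np.array([3.0, -1.0])
print(np.isclose((A @ u) @ (A @ v), u @ v))  # True
```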
Remark 7.20 (non-examinable)
The Gram–Schmidt procedure allows us to construct from any basis $v_1, \ldots, v_n$ of a finite-dimensional real inner product space $V$ an orthonormal basis $e_1, \ldots, e_n$ such that
$$\mathrm{Sp}(e_1, \ldots, e_k) = \mathrm{Sp}(v_1, \ldots, v_k)$$
for $k = 1, \ldots, n$, as follows.
Let $e_1 = v_1/\|v_1\|$, which is well defined since $v_1, \ldots, v_n$ is linearly independent and so $v_1 \neq 0$. Then $\|e_1\| = 1$.
Let $e_2 = w_2/\|w_2\|$ where
$$w_2 = v_2 - \langle v_2, e_1 \rangle e_1.$$
This is well defined since $v_1, \ldots, v_n$ is linearly independent and $e_1$ is a scalar multiple of $v_1$ and so $w_2 \neq 0$. Then $\|e_2\| = 1$ and $\langle e_1, e_2 \rangle = 0$ and $\mathrm{Sp}(e_1, e_2) = \mathrm{Sp}(v_1, v_2)$.
Assume inductively that we have defined an orthonormal set $e_1, \ldots, e_k$ such that $\mathrm{Sp}(e_1, \ldots, e_j) = \mathrm{Sp}(v_1, \ldots, v_j)$ for $j = 1, \ldots, k$. Let $e_{k+1} = w_{k+1}/\|w_{k+1}\|$ where
$$w_{k+1} = v_{k+1} - \langle v_{k+1}, e_1 \rangle e_1 - \cdots - \langle v_{k+1}, e_k \rangle e_k.$$
This is well defined since $v_1, \ldots, v_n$ is linearly independent and $\mathrm{Sp}(e_1, \ldots, e_k) = \mathrm{Sp}(v_1, \ldots, v_k)$ and so $w_{k+1} \neq 0$. Then $\|e_{k+1}\| = 1$ and $\langle e_j, e_{k+1} \rangle = 0$ if $1 \leq j \leq k$ and $\mathrm{Sp}(e_1, \ldots, e_{k+1}) = \mathrm{Sp}(v_1, \ldots, v_{k+1})$ by the Steinitz Exchange Lemma.
We can repeat this procedure until we obtain an orthonormal basis $e_1, \ldots, e_n$ for $V$ such that
$$\mathrm{Sp}(e_1, \ldots, e_k) = \mathrm{Sp}(v_1, \ldots, v_k)$$
for $k = 1, \ldots, n$.
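The procedure translates directly into code. A sketch for the dot product on $\mathbb{R}^n$ (classical Gram–Schmidt exactly as described above; numerical practice usually prefers the modified variant for stability):

```python
import numpy as np

def gram_schmidt(vs):
    """Orthonormalise the linearly independent rows of vs, as in Remark 7.20."""
    es = []
    for v in vs:
        w = v - sum((v @ e) * e for e in es)  # subtract projections onto e_1, ..., e_k
        es.append(w / np.linalg.norm(w))      # normalise; w != 0 by linear independence
    return np.array(es)

vs = np.array([[1.0, 1.0, 0.0],
               [1.0, 0.0, 1.0],
               [0.0, 1.0, 1.0]])
es = gram_schmidt(vs)
print(np.allclose(es @ es.T, np.eye(3)))  # True: the rows of es are orthonormal
```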
Note also that if $v_1, \ldots, v_k$ is already orthonormal for some $k$ then we will get $e_j = v_j$ for $j = 1, \ldots, k$. So if $S = \{e_1, \ldots, e_k\}$ is any orthonormal subset of $V$, then $S$ is linearly independent by Lemma 7.16, so we can extend $S$ to a basis $e_1, \ldots, e_k, v_{k+1}, \ldots, v_n$ for $V$ and then apply the Gram–Schmidt procedure to obtain an orthonormal basis $e_1, \ldots, e_k, e_{k+1}, \ldots, e_n$ for $V$ containing $S$.