
Exercise 11.3.1

We begin with a lemma.

Lemma: Let $R$ be a commutative unital ring and let $M$, $N$, and $A$ be left unital $R$-modules. If $\varphi : M \times N \to A$ is $R$-bilinear, then the induced map $\Phi : M \to \mathrm{Hom}_R(N, A)$ given by $\Phi(m)(n) = \varphi(m, n)$ is a well-defined $R$-module homomorphism.

Proof: To see well-definedness, we need to verify that each $\Phi(m)$ is a module homomorphism. To that end, note that
$$\Phi(m)(n_1 + r n_2) = \varphi(m, n_1 + r n_2) = \varphi(m, n_1) + r \varphi(m, n_2) = \Phi(m)(n_1) + r \Phi(m)(n_2).$$
Similarly, to show that $\Phi$ itself is a module homomorphism, note that
$$\Phi(m_1 + r m_2)(n) = \varphi(m_1 + r m_2, n) = \varphi(m_1, n) + r \varphi(m_2, n) = \big(\Phi(m_1) + r \Phi(m_2)\big)(n),$$
so that $\Phi(m_1 + r m_2) = \Phi(m_1) + r \Phi(m_2)$. $\square$

[Note to self: In a similar way, if $\varphi : M \times N \times P \to A$ is trilinear, then the induced map $M \to \mathrm{Hom}_R(N, \mathrm{Hom}_R(P, A))$ is a module homomorphism, or "unilinear" if you will. That is to say, in a concrete fashion we can think of multilinear maps as the uncurried versions of higher order functions on modules. (!!!) (I just had a minor epiphany and it made me happy. Okay, so the usual isomorphism $F^n \cong (F^n)^*$ is just this lemma applied to the dot product... that's cool.) Moreover, if $N$ and $A$ are $R$-algebras, then $\Phi(m) : N \to A$ is an algebra homomorphism if and only if $\varphi(m, n_1 n_2) = \varphi(m, n_1) \varphi(m, n_2)$ for all $n_1, n_2 \in N$ and $\varphi(m, 1) = 1$.]
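For example, take $M = N = F^n$ and $A = F$ with $\varphi$ the dot product, $\varphi(v, w) = \sum_{i=1}^n v_i w_i$. The lemma yields the linear transformation
$$\Phi : F^n \to (F^n)^*, \qquad \Phi(v)(w) = \sum_{i=1}^n v_i w_i.$$
Evaluating at the standard basis vectors gives $\Phi(v)(e_i) = v_i$, so the kernel of $\Phi$ is trivial; since $F^n$ and its dual have the same finite dimension, $\Phi$ is precisely the isomorphism $F^n \cong (F^n)^*$ mentioned above.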

Now define $\varphi : \mathrm{Hom}_F(V, V) \times V^* \to V^*$ by $\varphi(f, \lambda) = \lambda \circ f$. This map is certainly bilinear, and so by the lemma induces the linear transformation $\Phi : \mathrm{Hom}_F(V, V) \to \mathrm{Hom}_F(V^*, V^*)$ given by $\Phi(f)(\lambda) = \lambda \circ f$. Since $V$ has finite dimension, and since its dual space $V^*$ has the same dimension, $\mathrm{Hom}_F(V, V)$ and $\mathrm{Hom}_F(V^*, V^*)$ also have the same finite dimension; so to see that $\Phi$ is an isomorphism of vector spaces it suffices to show that the kernel is trivial. To that end, suppose $\Phi(f) = 0$. Then we have $\lambda(f(v)) = 0$ for all $\lambda \in V^*$ and all $v \in V$. If there exists a $v$ such that $f(v)$ is a nonzero element, then by the Building-up Lemma there is a basis of $V$ containing $f(v)$. In particular, there is a linear transformation $\lambda : V \to F$ such that $\lambda(f(v)) = 1$. That is, we have $\Phi(f)(\lambda)(v) = 1$, so that $\Phi(f) \neq 0$, a contradiction. Hence $f = 0$, so $\Phi$ is injective, and so an isomorphism of vector spaces.

Note that $\Phi(f \circ g)(\lambda) = \lambda \circ f \circ g = \big(\Phi(g) \circ \Phi(f)\big)(\lambda)$, so that $\Phi(f \circ g) = \Phi(g) \circ \Phi(f)$, while a ring homomorphism would satisfy $\Phi(f \circ g) = \Phi(f) \circ \Phi(g)$. If $V$ has dimension greater than 1, then $\mathrm{Hom}_F(V, V)$ is not a commutative ring. Thus these expressions need not be equal in general. In fact, if we choose $f$, $g$, and $v$ such that $(f \circ g)(v) \neq (g \circ f)(v)$, and then choose $\lambda$ such that $\lambda\big((f \circ g)(v)\big) \neq \lambda\big((g \circ f)(v)\big)$, then clearly $\Phi(f \circ g) \neq \Phi(f) \circ \Phi(g)$, and $\Phi$ is not a ring isomorphism if $\dim_F V > 1$. On the other hand, if $\dim_F V = 1$, then $\mathrm{Hom}_F(V, V)$ is commutative, and $\Phi$ is a ring isomorphism. These rings are in any case clearly isomorphic, since $V$ and $V^*$ are vector spaces of the same dimension; note that $\mathrm{Hom}_F(V, V)$ and $\mathrm{Hom}_F(V^*, V^*)$ are both $F$-algebras via the usual scalar multiplication.
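Concretely, with respect to a basis of $V$ and the corresponding dual basis of $V^*$, the map $\Phi$ sends a transformation with matrix $M$ to the transformation with matrix $M^t$, and $(MN)^t = N^t M^t$. The failure of multiplicativity is already visible for $2 \times 2$ matrices:
$$A = \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}, \quad B = \begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix}, \qquad (AB)^t = 0 \neq \begin{pmatrix} 0 & 0 \\ 1 & 0 \end{pmatrix} = A^t B^t.$$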

We now construct an explicit $F$-algebra isomorphism. Fix a basis $B = \{e_1, \ldots, e_n\}$ of $V$ with dual basis $B^* = \{e_1^*, \ldots, e_n^*\}$, and identify the linear transformation $f \in \mathrm{Hom}_F(V, V)$ with its matrix $M(f)$ with respect to this basis. (Likewise for $\mathrm{Hom}_F(V^*, V^*)$, using $B^*$.) Now define $\Psi : \mathrm{Hom}_F(V, V) \to \mathrm{Hom}_F(V^*, V^*)$ by letting $\Psi(f)$ be the transformation whose matrix with respect to $B^*$ is $M(f)$. It is clear that $\Psi$ is well defined, and moreover is an $F$-vector space homomorphism. Note also that $M(f \circ g) = M(f) M(g)$, so that $\Psi(f \circ g) = \Psi(f) \circ \Psi(g)$. Thus $\Psi$ is a ring homomorphism; since $M(\alpha f) = \alpha M(f)$, we have $\Psi(\alpha f) = \alpha \Psi(f)$ for all $\alpha \in F$ and all $f \in \mathrm{Hom}_F(V, V)$, and indeed $\Psi$ is an $F$-algebra homomorphism. It remains to be seen that $\Psi$ is an isomorphism; it suffices to show injectivity. To that end, suppose $\Psi(f) = 0$. Then $M(f) = 0$, and so $f = 0$. Thus $\Psi$ is an $F$-algebra isomorphism. Note that $\Psi$ depends essentially on our choice of a basis, and so is not natural.
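To contrast $\Psi$ with the earlier $\Phi$: if $f$ has matrix $M(f) = \begin{pmatrix} 1 & 1 \\ 0 & 1 \end{pmatrix}$ with respect to $B$, then $\Psi(f)$ has this same matrix with respect to $B^*$, while $\Phi(f)$ has the matrix $M(f)^t = \begin{pmatrix} 1 & 0 \\ 1 & 1 \end{pmatrix}$. So $\Psi$ is multiplicative at the cost of being basis-dependent, while $\Phi$ is basis-free at the cost of reversing composition.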

Exercise 11.3.5

We claim that $V^* \cong \prod_{b \in B} F_b$, where $B$ is a basis of $V$. To prove this, for each $b \in B$ let $F_b$ be a copy of $F$. Now define $\pi_b : V^* \to F_b$ by $\pi_b(\lambda) = \lambda(b)$; each $\pi_b$ is an $F$-linear transformation. By the universal property of direct products, there exists a unique $\theta : V^* \to \prod_{b \in B} F_b$ such that $p_b \circ \theta = \pi_b$ for all $b \in B$, where $p_b$ denotes the canonical projection. We claim that $\theta$ is an isomorphism. To see surjectivity, let $(x_b)_{b \in B} \in \prod_{b \in B} F_b$. Now define $\lambda \in V^*$ by letting $\lambda(b) = x_b$ and extending linearly; certainly $\theta(\lambda) = (x_b)_{b \in B}$. To see injectivity, suppose $\theta(\lambda) = 0$. Then $\lambda(b) = 0$ for all $b \in B$. Since $B$ is a basis of $V$, we have $\lambda = 0$. Thus $\theta$ is an isomorphism, and we have $V^* \cong \prod_{b \in B} F_b$.

Since $V$ is infinite dimensional, $B$ is infinite, and by this previous exercise $\prod_{b \in B} F_b$ has strictly larger dimension than does $\bigoplus_{b \in B} F_b \cong V$. Thus $V^*$ has strictly larger dimension than does $V$.
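For a concrete instance, take $V = F[x]$ with basis $B = \{1, x, x^2, \ldots\}$. The isomorphism above sends $\lambda \in V^*$ to the sequence $(\lambda(x^n))_{n \geq 0}$, identifying $V^*$ with the space $F[[x]]$ of formal power series $\sum_{n \geq 0} \lambda(x^n)\, x^n$, while $V$ itself corresponds to the much smaller subspace of polynomials.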

Exercise 11.4.1

We begin with a definition. If $A = (a_{i,j}) \in M_n(F)$, then the $(i,j)$ minor of $A$ is the matrix $\hat{A}_{i,j}$ obtained by deleting the $i$th row and $j$th column of $A$, where $\hat{A}_{i,j}$ has dimension $(n-1) \times (n-1)$. Recall that the cofactor expansion formula for $\det A$ along the $i$th row is
$$\det A = \sum_{j=1}^n (-1)^{i+j} a_{i,j} \det \hat{A}_{i,j}.$$
The analogous expansion along the $j$th column is
$$\det A = \sum_{i=1}^n (-1)^{i+j} a_{i,j} \det \hat{A}_{i,j},$$
which we presently prove to be true. First, note that $\widehat{(A^t)}_{j,i} = (\hat{A}_{i,j})^t$; this follows from our definition of minors and the fact that $(A^t)_{i,j} = a_{j,i}$. Now we have the following, expanding $\det A^t$ along its $j$th row:
$$\det A = \det A^t = \sum_{i=1}^n (-1)^{j+i} (A^t)_{j,i} \det \widehat{(A^t)}_{j,i} = \sum_{i=1}^n (-1)^{i+j} a_{i,j} \det (\hat{A}_{i,j})^t = \sum_{i=1}^n (-1)^{i+j} a_{i,j} \det \hat{A}_{i,j},$$
where the first and last equalities use the fact that a matrix and its transpose have the same determinant, as desired.
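For instance, expanding a $3 \times 3$ determinant along its second column (the matrix here is chosen purely for illustration):
$$\det \begin{pmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \\ 7 & 8 & 10 \end{pmatrix} = -2 \det \begin{pmatrix} 4 & 6 \\ 7 & 10 \end{pmatrix} + 5 \det \begin{pmatrix} 1 & 3 \\ 7 & 10 \end{pmatrix} - 8 \det \begin{pmatrix} 1 & 3 \\ 4 & 6 \end{pmatrix} = -2(-2) + 5(-11) - 8(-6) = -3,$$
which agrees with the expansion along the first row.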

Exercise 11.4.2

Let $B$ be the reduced row echelon form of $A \in M_n(F)$, and let $P$ be invertible such that $B = PA$. Suppose the columns of $A$ are linearly independent. Now $A$ has column rank $n$; in particular, $B = I$. Now $1 = \det B = \det P \det A$; so $\det A \neq 0$. We prove the converse contrapositively. Suppose the columns of $A$ are linearly dependent; then the column rank of $A$ is strictly less than $n$, so that $B$ has a row of all zeros. Using the cofactor expansion formula along that row, $\det B = 0$. Since $P$ is invertible, its determinant is nonzero; thus $\det A = 0$. Hence if $\det A \neq 0$, then the columns of $A$ are linearly independent.
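To illustrate (with matrices chosen for the purpose):
$$\det \begin{pmatrix} 1 & 2 \\ 2 & 4 \end{pmatrix} = 0,$$
and indeed the second column is twice the first, while
$$\det \begin{pmatrix} 1 & 2 \\ 3 & 4 \end{pmatrix} = -2 \neq 0,$$
and the columns are linearly independent.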

Exercise 11.4.4

Note that if $\sigma \in S_n$ is not the identity, then there must exist an element $i$ such that $\sigma(i) < i$, and likewise an element $j$ such that $\sigma(j) > j$. (Otherwise, we can show by induction that $\sigma(i) = i$ for all $i$.) Now recalling the naive formula for computing $\det A$, namely
$$\det A = \sum_{\sigma \in S_n} \epsilon(\sigma)\, a_{1,\sigma(1)} a_{2,\sigma(2)} \cdots a_{n,\sigma(n)},$$
note that if $A$ is an upper- or lower-triangular matrix, then every term corresponding to a non-identity permutation contains a factor from strictly below (respectively above) the diagonal and so vanishes; thus $\det A$ is merely the product of the diagonal entries.

In particular, we have $\det I = 1$; if $E$ is an elementary matrix which adds a multiple of one row to another, then $\det E = 1$; and if $E$ multiplies a row by the scalar $\alpha$, then $\det E = \alpha$. Note also that the row operation of interchanging two rows is equivalent to three operations which add a multiple of one row to another and one scalar row multiplication by $-1$. Since determinants are multiplicative, the determinant of the elementary matrix achieving this interchange is $-1$.

Suppose $\det A \neq 0$. In this previous exercise, we saw that the columns of $A$ are linearly independent. In particular, all columns in the reduced row echelon form of $A$ are pivotal, and so $A$ is row equivalent to the identity matrix. Conversely, suppose $A$ is row equivalent to the identity matrix. Then the columns of $A$ are linearly independent, and so $\det A \neq 0$. Suppose now that $A = E_1 E_2 \cdots E_k$, where each $E_i$ is an elementary matrix. Since determinants are multiplicative, it is clear that $\det A = \det E_1 \det E_2 \cdots \det E_k$ has the given form.
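The decomposition of an interchange can be checked directly on a generic pair of rows $a$ and $b$:
$$\begin{pmatrix} a \\ b \end{pmatrix} \xrightarrow{R_1 \mapsto R_1 + R_2} \begin{pmatrix} a + b \\ b \end{pmatrix} \xrightarrow{R_2 \mapsto R_2 - R_1} \begin{pmatrix} a + b \\ -a \end{pmatrix} \xrightarrow{R_1 \mapsto R_1 + R_2} \begin{pmatrix} b \\ -a \end{pmatrix} \xrightarrow{R_2 \mapsto -R_2} \begin{pmatrix} b \\ a \end{pmatrix},$$
so the corresponding elementary matrix has determinant $1 \cdot 1 \cdot 1 \cdot (-1) = -1$, as claimed.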

Exercise 11.4.5

Evidently, the first given matrix is in reduced row echelon form after performing a sequence of nine row operations. This sequence required one interchange and three scalar multiplications of rows; so, by the previous exercise, its determinant is $-1$ times the reciprocal of the product of the three scalars used. Similarly, the second given matrix is in reduced row echelon form after performing a sequence of eleven row operations, and its determinant is computed in the same way.
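As an illustration of the method (the matrix here is chosen for the purpose, not taken from the text):
$$\begin{pmatrix} 0 & 2 \\ 3 & 1 \end{pmatrix} \xrightarrow{R_1 \leftrightarrow R_2} \begin{pmatrix} 3 & 1 \\ 0 & 2 \end{pmatrix} \xrightarrow{R_1 \mapsto \frac{1}{3} R_1} \begin{pmatrix} 1 & \frac{1}{3} \\ 0 & 2 \end{pmatrix} \xrightarrow{R_2 \mapsto \frac{1}{2} R_2} \begin{pmatrix} 1 & \frac{1}{3} \\ 0 & 1 \end{pmatrix} \xrightarrow{R_1 \mapsto R_1 - \frac{1}{3} R_2} I,$$
one interchange and two scalar multiplications, so the determinant is $\big( (-1) \cdot \frac{1}{3} \cdot \frac{1}{2} \big)^{-1} = -6$, in agreement with direct computation.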
