Math 415. Applied Linear Algebra


Final Exam Checklist
Chapter 1. Basics

This chapter contains basic material, but it is important only as it
arises in different contexts in later chapters. So scan the definitions
and properties and review the techniques.

Chapter 2. Vector Spaces and Bases

2.1. Real Vector Spaces

What are R^n, P^(n), M_{m x n}, and F(I)? In particular, how are
addition and scalar multiplication defined on these spaces?

2.2. Subspaces

Make sure you know your definitions and how to verify that something
is a subspace.

2.3. Span and Linear Independence

Once again definitions are basic, so know what linear independence and
span mean, both generally and in specific applications.

2.4. Bases and Dimension

Definitions again, especially of basis, dimension, and the coordinates
of a vector relative to a basis. It is here that Gaussian elimination
comes into the calculations. Be ready for example calculations in R^n
and P^(n), and for computing bases for the subspace of solutions of one
or more linear homogeneous equations.
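
If you want to double-check a hand computation of that last kind, here
is a short sketch (using SymPy, which is not part of the course, so
treat it as a spot-check only):

    import sympy as sp

    # Solution space of x1 + 2*x2 - x3 = 0 and 2*x1 + 4*x2 - 2*x3 = 0
    A = sp.Matrix([[1, 2, -1],
                   [2, 4, -2]])
    print(A.nullspace())   # Gaussian elimination happens behind the scenes;
                           # two free variables give two basis vectors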

2.5. The Fundamental Matrix Subspaces

The four fundamental subspaces of a matrix are mainly important for the
results in Section 5.6, but you need to know their definitions and how
to find bases for them, since they come up in other places.
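
A spot-check for bases of all four subspaces (SymPy again, not course
material): they can all be read off from row reductions of A and its
transpose.

    import sympy as sp

    A = sp.Matrix([[1, 2, 3],
                   [2, 4, 6]])      # rank 1, so every subspace is nontrivial
    image    = A.columnspace()      # column space (range) of A
    kernel   = A.nullspace()        # kernel of A
    corange  = A.T.columnspace()    # row space = column space of A^T
    cokernel = A.T.nullspace()      # left null space = kernel of A^T
    print(image, kernel, corange, cokernel)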

In addition, there are two principal superposition results in this
section, Propositions 2.37 and 2.39; you need to know what they say, how
to use them, and how to replicate their proofs.

2.6. Graphs and Incidence Matrices

The principal techniques here are finding a basis of independent
circuits for a graph and then writing other circuits in terms of your
independent ones. What is the kernel of the incidence matrix of a
connected graph, and why?
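
For the last question, the answer to aim for is the one-dimensional span
of the all-ones vector. A sketch you could use to convince yourself
(SymPy, with a made-up triangle graph; the sign convention below is an
assumption, so match it to Section 2.6):

    import sympy as sp

    # Incidence matrix: rows = edges, columns = nodes,
    # -1 at the edge's start node, +1 at its end node.
    A = sp.Matrix([[-1,  1,  0],    # edge from node 1 to node 2
                   [ 0, -1,  1],    # edge from node 2 to node 3
                   [-1,  0,  1]])   # edge from node 1 to node 3
    print(A.nullspace())            # spanned by (1, 1, 1): the graph is
                                    # connected, so only constant potentials
                                    # are invisible to every edge difference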

Chapter 3. Inner Products and Norms

3.1. Inner Products

Inner products are important for defining orthogonality of elements in
R^n, P^(n), C^0[a, b], and other vector spaces. You should know the
definition and how to verify orthogonality, in particular in polynomial
and function spaces.
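
For instance, under the L^2 inner product <f, g> = integral of f*g over
[-1, 1] on C^0[-1, 1], the polynomials 1 and x are orthogonal. A
one-line check (SymPy, spot-check only):

    import sympy as sp

    x = sp.symbols('x')
    print(sp.integrate(1 * x, (x, -1, 1)))   # 0, so 1 and x are orthogonal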

3.2. Inequalities

Know the statement of the Cauchy-Schwarz inequality and be able to
explain how it allows us to define angles between vectors, even if those
vectors are polynomials or functions.
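
The point is that Cauchy-Schwarz guarantees |<v, w>| <= ||v|| ||w||, so
the ratio <v, w> / (||v|| ||w||) lies in [-1, 1] and can legitimately be
called cos(theta). A numerical sketch (NumPy, made-up vectors):

    import numpy as np

    v = np.array([1.0, 2.0, 2.0])
    w = np.array([2.0, 0.0, 1.0])
    cos_theta = v @ w / (np.linalg.norm(v) * np.linalg.norm(w))
    print(np.degrees(np.arccos(cos_theta)))   # well defined: |cos_theta| <= 1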

3.4. Positive Definite Matrices

This is where our study of quadratic forms begins, and it is the
characterization in Theorem 3.21 that connects quadratic forms to
symmetric matrices. You should know how to determine whether a symmetric
matrix is positive definite, semi-definite, or indefinite.
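
On the exam you would check pivots or leading principal minors by hand,
but for a spot-check: a symmetric matrix is positive definite exactly
when all of its eigenvalues are positive (NumPy, made-up matrix):

    import numpy as np

    K = np.array([[ 2.0, -1.0],
                  [-1.0,  2.0]])
    eigs = np.linalg.eigvalsh(K)    # eigenvalues of a symmetric matrix
    print(eigs, np.all(eigs > 0))   # all positive => positive definite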

3.5. Completing the Square

Completing the square is the same as the LDL^T factorization of a
symmetric matrix, and it is here that the matrix factorizations in
Chapter 1 become important. If you can complete the square, then you can
more easily assess definiteness.
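
Here is a bare-bones LDL^T by elimination, written from scratch rather
than taken from the course, and assuming the leading pivots are nonzero
(no pivoting):

    import numpy as np

    def ldlt(K):
        # LDL^T of a symmetric matrix, assuming nonzero leading pivots
        n = K.shape[0]
        L, D = np.eye(n), np.zeros(n)
        K = K.astype(float)
        for j in range(n):
            D[j] = K[j, j]                     # pivot = diagonal entry
            L[j+1:, j] = K[j+1:, j] / D[j]     # multipliers below the pivot
            K[j+1:, j+1:] -= np.outer(L[j+1:, j], K[j, j+1:])   # eliminate
        return L, D

    K = np.array([[4.0, 2.0],
                  [2.0, 3.0]])
    L, D = ldlt(K)
    print(L, D)   # K = L @ np.diag(D) @ L.T; D = (4, 2) > 0 => pos. def.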

Chapter 4. Minimization and Least Squares Approximation

4.1. Minimization Problems

Be able to describe the closest point problem geometrically and to write
it down analytically as a minimization problem: given a point b and a
subspace W, minimize ||b - w|| over all w in W.

4.2. Minimization of Quadratic Functions

The results in this section are fundamental to minimization problems
for quadratic forms. You need to know how to take a quadratic form you
are given and write it in the form of the general quadratic function on
page 186 (equation (4.10)), since Theorem 4.1 then tells you how to
minimize it. Once again, Examples 4.2 and 4.3 are good prototypes of how
to do the minimization in practice.
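
If I am remembering the convention of (4.10) correctly, the general
quadratic function is p(x) = x^T K x - 2 x^T f + c, and for positive
definite K Theorem 4.1 gives the unique minimizer x* = K^{-1} f with
minimum value c - f^T K^{-1} f. A numerical spot-check (NumPy; the
particular K, f, c are made up):

    import numpy as np

    K = np.array([[2.0, 1.0],
                  [1.0, 2.0]])      # positive definite
    f = np.array([1.0, 0.0])
    c = 3.0

    x_star = np.linalg.solve(K, f)  # minimizer x* = K^{-1} f
    p_min = c - f @ x_star          # minimum value c - f^T K^{-1} f
    print(x_star, p_min)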

4.3. Least Squares and the Closest Point

What is the least squares solution of a linear system, and why is it
important? What are the normal equations for finding the least squares
solution, and how do they arise?
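
The normal equations are A^T A x = A^T b, which arise by requiring the
residual b - Ax to be orthogonal to the columns of A. A sketch comparing
them with a library least-squares call (NumPy, made-up data):

    import numpy as np

    A = np.array([[1.0, 0.0],
                  [1.0, 1.0],
                  [1.0, 2.0]])      # overdetermined: 3 equations, 2 unknowns
    b = np.array([6.0, 0.0, 0.0])

    x_normal = np.linalg.solve(A.T @ A, A.T @ b)     # normal equations
    x_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)  # library answer
    print(x_normal, x_lstsq)                         # the two should agree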

Chapter 5. Orthogonality

5.1. Orthogonal Bases

Here is where we begin to see the great power in working with a basis
in which the vectors are orthogonal or orthonormal. Know what it
means for a collection of vectors to be mutually orthogonal and how
this relates to linear independence. Be able to prove linear
independence for an orthogonal collection of vectors. Be able to find
the coordinates of a vector relative to an orthogonal/orthonormal basis.
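
For an orthogonal basis u_1, ..., u_n, the coordinates of v are simply
c_i = <v, u_i> / ||u_i||^2, with no linear system to solve. A quick
sketch (NumPy, made-up vectors):

    import numpy as np

    u = [np.array([1.0,  1.0, 0.0]),
         np.array([1.0, -1.0, 0.0]),
         np.array([0.0,  0.0, 1.0])]          # orthogonal basis of R^3
    v = np.array([3.0, 1.0, 2.0])
    print([v @ ui / (ui @ ui) for ui in u])   # [2.0, 1.0, 2.0], and indeed
                                              # 2*u1 + 1*u2 + 2*u3 = v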

5.2. The Gram-Schmidt Process

Not much to say here. This is an important technique.
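
Still, for checking hand computations, here is a bare-bones classical
Gram-Schmidt (it assumes the inputs are linearly independent, and it is
not the numerically stable variant used in practice):

    import numpy as np

    def gram_schmidt(vectors):
        basis = []
        for v in vectors:
            w = v - sum((v @ q) * q for q in basis)   # subtract projections
            basis.append(w / np.linalg.norm(w))       # then normalize
        return basis

    vs = [np.array([1.0, 1.0, 0.0]), np.array([1.0, 0.0, 1.0])]
    for q in gram_schmidt(vs):
        print(q)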

5.3. Orthogonal Matrices

Orthogonal matrices come from using the vectors of an orthonormal basis
as the columns of a matrix. Know the definition and basic properties of
orthogonal matrices (products, determinant, connection to orthonormal
bases). Also be familiar with the form of rotation matrices about the
standard axes in 2 and 3 dimensions, since these are important in
Chapter 7 on linear transformations.
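
For example, the 2-dimensional rotation matrix is orthogonal with
determinant 1; checking both properties numerically (NumPy):

    import numpy as np

    t = np.pi / 3
    Q = np.array([[np.cos(t), -np.sin(t)],
                  [np.sin(t),  np.cos(t)]])    # rotation by angle t
    print(np.allclose(Q.T @ Q, np.eye(2)))     # True: columns orthonormal
    print(np.linalg.det(Q))                    # 1.0: a proper rotation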

5.5. Orthogonal Projections and Least Squares

Here is where we learn the connection between the closest point problem
and orthogonal projections. Moreover, orthogonal projections are easy if
the basis of your subspace is orthogonal. Review these ideas
geometrically and in some examples.
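
With an orthogonal basis u_1, ..., u_k of the subspace, the projection
of b is just the sum of the one-dimensional projections onto each basis
vector. A sketch (NumPy, made-up data):

    import numpy as np

    u1 = np.array([1.0,  1.0, 0.0])    # orthogonal basis of a plane in R^3
    u2 = np.array([1.0, -1.0, 1.0])
    b  = np.array([2.0,  0.0, 3.0])

    proj = (b @ u1) / (u1 @ u1) * u1 + (b @ u2) / (u2 @ u2) * u2
    print(proj)                               # closest point in span{u1, u2}
    print((b - proj) @ u1, (b - proj) @ u2)   # residual is orthogonal: 0, 0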

5.6. Orthogonal Subspaces

In many ways, the results in this section are at once the most
theoretical and the most important in the subject. Know the definition
of orthogonality of two subspaces and of orthogonal complements, and be
able to find orthogonal complements (that is, bases for them) in simple
cases.

Theorem 5.55 is vital since it tells you how to deduce whether Ax = b
admits solutions for a given vector b, so be able to find bases of
cokernels!! Moreover, Theorem 5.59 and Equation (5.82) tell us how to
avoid problems with multiple solutions by looking for the unique
solution of minimum norm. Then we can get the full set of solutions by
adding a general vector from the kernel.
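
A sketch of both ideas on a made-up rank-deficient system (NumPy): b is
compatible exactly when it is orthogonal to a basis of the cokernel, and
the pseudoinverse picks out the minimum norm solution.

    import numpy as np

    A = np.array([[1.0, 2.0],
                  [2.0, 4.0]])        # rank 1: kernel and cokernel nontrivial
    b = np.array([1.0, 2.0])

    z = np.array([-2.0, 1.0])         # basis of the cokernel (A^T z = 0)
    print(np.isclose(z @ b, 0.0))     # True => Ax = b is solvable

    x_min = np.linalg.pinv(A) @ b     # minimum norm solution, (0.2, 0.4)
    print(x_min)                      # full solution set: x_min plus any
                                      # multiple of the kernel vector (-2, 1)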

Chapter 6. Equilibrium

6.2. Electrical Networks

If you get an electrical network problem, you will need to know Section 2.6
and should be able to solve problems either with an external current imposed
and a grounded node, or with a battery vector in the network and a node
grounded.
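
A sketch of the external-current version on a made-up triangle network
(NumPy; unit resistances, node 3 grounded, one unit of current fed in at
node 1; the sign conventions below are my best recollection of Sections
2.6 and 6.2, so check them against your notes):

    import numpy as np

    # Triangle network: rows of A = edges, columns = nodes.
    A = np.array([[-1.0,  1.0,  0.0],   # edge from node 1 to node 2
                  [ 0.0, -1.0,  1.0],   # edge from node 2 to node 3
                  [-1.0,  0.0,  1.0]])  # edge from node 1 to node 3
    C = np.eye(3)                       # unit conductances on every edge

    K = A.T @ C @ A                     # weighted graph Laplacian
    f = np.array([1.0, 0.0])            # 1 unit of current into node 1

    # Ground node 3: drop its row and column, then solve for the rest.
    u = np.linalg.solve(K[:2, :2], f)   # potentials at nodes 1 and 2
    print(u)                            # (2/3, 1/3); node 3 sits at 0
    print(C @ A @ np.append(u, 0.0))    # resulting currents on the edges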

Chapter 7. Linearity

7.1. Linear Functions

Theorem 7.5 is a vital result since it characterizes all linear
functions from R^n to R^m as matrices, so know its proof. Moreover, know
how to find a matrix representation for a linear function by finding the
action of the function on basis vectors.

Be familiar with the following two correspondences between linear
functions and matrices: a) the matrix corresponding to a composition of
functions is the product of the separate matrices, and b) the matrix
corresponding to the inverse of a linear function is the inverse of its
own matrix representation.
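
A sketch of building the matrix column by column from the action on the
standard basis (NumPy; the particular map, rotation by 90 degrees, is
just an example):

    import numpy as np

    def L(x):                    # some linear map: here, rotate 90 degrees
        return np.array([-x[1], x[0]])

    e1, e2 = np.eye(2)           # standard basis of R^2
    A = np.column_stack([L(e1), L(e2)])   # columns = images of basis vectors
    x = np.array([3.0, 4.0])
    print(np.allclose(A @ x, L(x)))       # True: A represents L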

7.2. Linear Transformations

Know the change of basis results characterized in Equations (7.27) and
(7.28) and how to use them in practice. See Handout 3 for an example.
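
Assuming (7.27) and (7.28) are the usual change of basis formulas, the
rule is: if the columns of S are the new basis vectors, the matrix of
the map in the new basis is B = S^{-1} A S. A numerical sketch (NumPy,
made-up A and S):

    import numpy as np

    A = np.array([[2.0, 1.0],
                  [0.0, 3.0]])          # the map in the standard basis
    S = np.array([[1.0, 1.0],
                  [0.0, 1.0]])          # columns = new basis vectors
    print(np.linalg.solve(S, A @ S))    # B = S^{-1} A S; here the new
                                        # basis diagonalizes the map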

Chapter 8. Eigenvalues

8.2. Eigenvalues and Eigenvectors

Know your definitions: eigenvalue, eigenvector, eigenspace,
characteristic polynomial, multiplicity, trace, etc., and how they are
interconnected. And most important, know how to solve eigenvalue
problems.
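
By hand this means solving det(A - lambda*I) = 0 and then solving
(A - lambda*I)v = 0 for each root; for checking your answers, a library
call (NumPy, made-up matrix):

    import numpy as np

    A = np.array([[4.0, 1.0],
                  [2.0, 3.0]])
    # characteristic polynomial: lambda^2 - 7*lambda + 10, roots 5 and 2
    lams, V = np.linalg.eig(A)            # columns of V are eigenvectors
    print(lams)
    print(np.allclose(A @ V, V * lams))   # A v_i = lambda_i v_i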

8.3. Eigenvector Bases and Diagonalization

Diagonalization is one of the most important products of linear
algebra. It is used everywhere in linear theory to "decouple" systems of
equations. Know the leading results: a) eigenvectors corresponding to
distinct eigenvalues are linearly independent, b) if A is complete, then
there is an invertible S and a diagonal D such that S^{-1} A S = D. How
do you find S and D? This is what we refer to as "diagonalizing A".
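
The recipe: take the columns of S to be the eigenvectors and the
diagonal entries of D to be the corresponding eigenvalues. A spot-check
(NumPy, made-up matrix):

    import numpy as np

    A = np.array([[2.0, 1.0],
                  [1.0, 2.0]])
    lams, S = np.linalg.eig(A)       # eigenvectors as the columns of S
    D = np.linalg.solve(S, A @ S)    # D = S^{-1} A S
    print(np.round(D, 10))           # diag(3, 1); ordering may vary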

8.4. Eigenvalues of Symmetric Matrices

Symmetric matrices are the "best of all worlds" and we learn that here:
a) the eigenvalues are real, b) eigenvectors for distinct eigenvalues
are orthogonal, and c) there is an orthonormal basis of R^n consisting
of eigenvectors of A. Be prepared to give a proof of c), given a) and
b). Consequence: how do you tell whether a symmetric matrix is positive
definite (semi-definite)? Moreover, understand how this translates into
the special diagonalization Q^T A Q = D (how do you find Q and D?). In
general, recognize when your basis vectors are orthogonal and the
advantages this gives you.
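
In code, the symmetric diagonalization looks like this: Q collects an
orthonormal set of eigenvectors, so Q^{-1} = Q^T (NumPy, made-up
symmetric matrix):

    import numpy as np

    A = np.array([[2.0, 1.0],
                  [1.0, 2.0]])
    lams, Q = np.linalg.eigh(A)      # symmetric case: Q is orthogonal
    print(lams)                      # real eigenvalues; all > 0 => pos. def.
    print(np.allclose(Q.T @ A @ Q, np.diag(lams)))   # Q^T A Q = D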

Chapter 9. Linear Dynamical Systems

8.1 and 9.1. Basic Solution Techniques

Of prime importance here was the result that eigenvalues λ and
eigenvectors v of A give us eigensolutions u(t) = e^(λt) v to the linear
system, and if we can find n of these (i.e. the matrix A is complete),
then the general solution is a general linear combination of these
eigensolutions. Know how to implement this idea and, in particular, how
to solve initial value problems via eigenvalue methods.
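
A sketch of solving the initial value problem u' = Au, u(0) = b this
way: expand b in the eigenvector basis, then attach the factor e^(λt)
to each term (NumPy, made-up data):

    import numpy as np

    A = np.array([[0.0, 1.0],
                  [1.0, 0.0]])
    b = np.array([1.0, 0.0])        # initial condition u(0) = b

    lams, V = np.linalg.eig(A)
    c = np.linalg.solve(V, b)       # coordinates of b in the eigenbasis

    def u(t):                       # u(t) = sum_i c_i e^(lambda_i t) v_i
        return V @ (c * np.exp(lams * t))

    print(u(0.0))                   # recovers b
    print(u(1.0))                   # (cosh 1, sinh 1) for this particular A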

9.2. Stability of Linear Systems

Be familiar with the different definitions of equilibrium, stability,
and instability, and how these can be assessed in linear systems using
information about the eigenvalues.
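
The basic test: the zero equilibrium of u' = Au is asymptotically stable
when every eigenvalue has negative real part (NumPy sketch; borderline
cases with zero real part need the finer analysis from this section):

    import numpy as np

    A = np.array([[-1.0,  2.0],
                  [ 0.0, -3.0]])
    lams = np.linalg.eigvals(A)
    print(lams.real)                  # -1 and -3, both negative
    print(np.all(lams.real < 0))      # True => asymptotically stable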
