
Eigenvalues, Eigenvectors and Eigenspaces Diagonalizable Matrices


Eigenvalues, Eigenvectors and Eigenspaces of the Linear Operators

Eigenvalue:

Let T be a linear operator on a vector space V over the field F. An eigenvalue (or characteristic value) of T is a scalar λ in F for which there is a non-zero vector v in V with T v = λv.

Eigenvector:

If λ is an eigenvalue of T then any vector v satisfying T v = λv is called an eigenvector of T associated with the eigenvalue λ .

Eigenspace:

If λ is an eigenvalue of T then the collection of all vectors v in V such that T v = λv is called the eigenspace associated with the eigenvalue λ .


Other Terminologies for Eigenvalues:

Eigenvalue

Characteristic Value

Characteristic root

Latent root

Proper value

Spectral value

Note: The zero vector trivially satisfies T 0 = 0 = λ · 0 for every scalar λ. So "find an eigenvector v associated with the eigenvalue λ" means: find a non-zero vector v such that T v = λv.

Example

Let T : R² → R² be the linear operator defined by

T [ x ]  =  [ 23  −12 ] [ x ]
  [ y ]     [ 40  −21 ] [ y ],

i.e., T(x, y) = (23x − 12y, 40x − 21y).

Observe that T(3, 5) = 3(3, 5) and T(1, 2) = (−1)(1, 2). Therefore, 3 and −1 are eigenvalues of T.

Corresponding to the eigenvalue 3, (3, 5) is an eigenvector and {(3k, 5k) : k ∈ R} is the eigenspace.

Corresponding to the eigenvalue −1, (1, 2) is an eigenvector and {(k, 2k) : k ∈ R} is the eigenspace.
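These two eigenpairs are easy to verify directly; a minimal plain-Python check (the matrix below is the matrix of T written out above):

```python
# Matrix of T from the example above
A = [[23, -12],
     [40, -21]]

def matvec(M, v):
    """Multiply a 2x2 matrix by a 2-vector."""
    return [M[0][0]*v[0] + M[0][1]*v[1],
            M[1][0]*v[0] + M[1][1]*v[1]]

print(matvec(A, [3, 5]))   # [9, 15]  = 3  * (3, 5)
print(matvec(A, [1, 2]))   # [-1, -2] = -1 * (1, 2)
```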

Example of a linear operator on an infinite dimensional space having no eigenvalues

Let C[R] be the vector space (over the field R ) that consists of continuous, real valued functions on R . Let T be the linear operator on C[R] defined by

(Tf)(x) = ∫₀ˣ f(t) dt.

Then, T has no eigenvalues. (Reason: if Tf = λf with λ ≠ 0, then f = (1/λ)Tf is differentiable with f(0) = 0 and λf′ = f, which forces f = 0; and if λ = 0, then ∫₀ˣ f(t) dt = 0 for all x, so again f = 0.)

Note:

If T is a linear operator on a finite dimensional vector space V over the complex field C then T has an eigenvalue.


Theorem (for a linear operator on finite dimensional vector space)

Theorem:

Let T be a linear operator on a finite dimensional vector space V . Then, the following are equivalent.

1 λ is an eigenvalue of T.

2 The operator (T − λI) is singular (not invertible).

3 Determinant of (T − λI) = 0.

How to find eigenvalues?

If B is an ordered basis for V and A = [T ] B , then

(T − λI) is singular if and only if (A − λI) is singular.

Eigenvalues, Eigenvectors, Eigenspaces of the square matrices

Eigenvalue: Let A be an n × n matrix over the field F. An eigenvalue (or characteristic value) of A is a scalar λ in F such that the matrix (A − λI) is singular.

Eigenvector: If λ is an eigenvalue of A then any vector v satisfying Av = λv is called an eigenvector of A associated with the eigenvalue λ .

Eigenspace: If λ is an eigenvalue of A then the collection of all vectors v in V such that Av = λv is called the eigenspace associated with the eigenvalue λ .

Denote the eigenspace of the matrix A associated with the eigenvalue λ by E λ (A).


Results:

λ is an eigenvalue of A if and only if (A − λI) is singular, if and only if the determinant of (A − λI) = 0, if and only if (λI − A) is singular, if and only if the determinant of (λI − A) = 0.

Eigenspace of A associated with the eigenvalue λ is

E_λ(A) = {v ∈ V : Av = λv}
       = {v ∈ V : (A − λI)v = 0}
       = Null space of (A − λI)
       = Set of all solutions of the linear system (A − λI)v = 0
       = Null space of (λI − A)
       = Set of all solutions of the linear system (λI − A)v = 0.

Eigenspaces are invariant subspaces

Definition: Let T be a linear operator on a vector space V. A subspace U of V is said to be invariant (or T-invariant) under the linear map T if T u ∈ U for each u ∈ U.

Result: Let T be a linear operator on a vector space V. If λ is an eigenvalue of T then the eigenspace E_λ(T) of T associated with the eigenvalue λ is an invariant subspace of V.

Example

Find the eigenvalues of the matrix

A = [ 1   2  −1 ]
    [ 1   0   1 ]
    [ 4  −4   5 ].

Find the eigenspace and the basis & dimension of the eigenspace associated with each eigenvalue of A.

Step 1: Finding eigenvalues. Solve the equation det(λI − A) = 0.

det(λI − A) = | λ−1   −2     1  |
              |  −1    λ    −1  |
              |  −4    4   λ−5  |
            = λ³ − 6λ² + 11λ − 6 = (λ − 1)(λ − 2)(λ − 3).

Eigenvalues of A are 1, 2 and 3.
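For a 3 × 3 matrix the characteristic polynomial is λ³ − tr(A)λ² + (sum of principal 2 × 2 minors)λ − det(A), which gives a quick way to check the coefficients above in plain Python (a small sketch, no linear-algebra library assumed):

```python
A = [[1, 2, -1],
     [1, 0,  1],
     [4, -4, 5]]

def det2(a, b, c, d):
    """Determinant of the 2x2 matrix [[a, b], [c, d]]."""
    return a*d - b*c

def det3(M):
    """3x3 determinant by cofactor expansion along the first row."""
    return (M[0][0]*det2(M[1][1], M[1][2], M[2][1], M[2][2])
          - M[0][1]*det2(M[1][0], M[1][2], M[2][0], M[2][2])
          + M[0][2]*det2(M[1][0], M[1][1], M[2][0], M[2][1]))

trace = A[0][0] + A[1][1] + A[2][2]
# Sum of the three principal 2x2 minors
minors = (det2(A[1][1], A[1][2], A[2][1], A[2][2])
        + det2(A[0][0], A[0][2], A[2][0], A[2][2])
        + det2(A[0][0], A[0][1], A[1][0], A[1][1]))
# Characteristic polynomial: l^3 - trace*l^2 + minors*l - det3(A)
print(trace, minors, det3(A))   # 6 11 6
```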

Example (continuation)

Step 2(a): Corresponding to the eigenvalue 1, finding an eigenvector and the eigenspace

When λ = 1, to find an eigenvector, we need to solve the equation (λI − A)v = (1 × I − A)v = (I − A)v = 0. Equivalently, solve the homogeneous system Bv = 0 where B = (I − A).

B = I − A = [  0  −2   1 ]          [ 1  0   1/2 ]
            [ −1   1  −1 ],   R =   [ 0  1  −1/2 ]   (row reduced form of B).
            [ −4   4  −4 ]          [ 0  0    0  ]

So, the solution set is { k(−1/2, 1/2, 1) : k ∈ R }, and the eigenspace of A associated with the eigenvalue 1 is

E_1(A) = { k(−1/2, 1/2, 1) : k ∈ R }.

The vector (−1/2, 1/2, 1) is a basis for the eigenspace E_1(A) and the dimension of E_1(A) is 1.

Example (continuation)

Step 2(b): Corresponding to the eigenvalue 2, finding an eigenvector and the eigenspace

When λ = 2, to find an eigenvector, we need to solve the equation (λI − A)v = (2 × I − A)v = (2I − A)v = 0. Equivalently, solve the homogeneous system Bv = 0 where B = (2I − A).

B = 2I − A = [  1  −2   1 ]          [ 1  0   1/2 ]
             [ −1   2  −1 ],   R =   [ 0  1  −1/4 ]   (row reduced form of B).
             [ −4   4  −3 ]          [ 0  0    0  ]

So, the solution set is { k(−1/2, 1/4, 1) : k ∈ R }, and the eigenspace of A associated with the eigenvalue 2 is

E_2(A) = { k(−1/2, 1/4, 1) : k ∈ R }.

The vector (−1/2, 1/4, 1) is a basis for the eigenspace E_2(A) and the dimension of E_2(A) is 1.

Example (continuation)

Step 2(c): Corresponding to the eigenvalue 3, finding an eigenvector and the eigenspace

When λ = 3, to find an eigenvector, we need to solve the equation (λI − A)v = (3 × I − A)v = (3I − A)v = 0. Equivalently, solve the homogeneous system Bv = 0 where B = (3I − A).

B = 3I − A = [  2  −2   1 ]          [ 1  0   1/4 ]
             [ −1   3  −1 ],   R =   [ 0  1  −1/4 ]   (row reduced form of B).
             [ −4   4  −2 ]          [ 0  0    0  ]

So, the solution set is { k(−1/4, 1/4, 1) : k ∈ R }, and the eigenspace of A associated with the eigenvalue 3 is

E_3(A) = { k(−1/4, 1/4, 1) : k ∈ R }.

The vector (−1/4, 1/4, 1) is a basis for the eigenspace E_3(A) and the dimension of E_3(A) is 1.
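Scaling the basis vectors to integers, the three eigenpairs found above are (1, (−1, 1, 2)), (2, (−2, 1, 4)) and (3, (−1, 1, 4)); a direct plain-Python check that Av = λv for each pair:

```python
A = [[1, 2, -1],
     [1, 0,  1],
     [4, -4, 5]]

def matvec(M, v):
    """Multiply a 3x3 matrix by a 3-vector."""
    return [sum(M[i][j]*v[j] for j in range(3)) for i in range(3)]

pairs = [(1, [-1, 1, 2]),   # eigenvalue 1, scaled basis of E_1(A)
         (2, [-2, 1, 4]),   # eigenvalue 2, scaled basis of E_2(A)
         (3, [-1, 1, 4])]   # eigenvalue 3, scaled basis of E_3(A)

for lam, v in pairs:
    assert matvec(A, v) == [lam*x for x in v]
print("all eigenpairs verified")
```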

Relation of the eigenvalues to the determinant and the trace of the matrix

Theorem: If A is an n × n matrix with the eigenvalues λ 1 , λ 2 , ··· , λ n (repeated according to multiplicity) then

Det(A) = λ_1 λ_2 ··· λ_n.

Tr(A) = λ_1 + λ_2 + ··· + λ_n.

If

A = [ 3  1  −1 ]
    [ 2  2  −1 ]
    [ 2  2   0 ]

then the eigenvalues of A are the roots of the equation

λ³ − 5λ² + 8λ − 4 = (λ − 1)(λ − 2)² = 0.

The determinant of A is 1 × 2 × 2 = 4 and the trace of A is 1 + 2 + 2 = 5.
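A quick plain-Python check of the theorem for this matrix (3 × 3 determinant by cofactor expansion):

```python
A = [[3, 1, -1],
     [2, 2, -1],
     [2, 2,  0]]

def det3(M):
    """3x3 determinant by cofactor expansion along the first row."""
    return (M[0][0]*(M[1][1]*M[2][2] - M[1][2]*M[2][1])
          - M[0][1]*(M[1][0]*M[2][2] - M[1][2]*M[2][0])
          + M[0][2]*(M[1][0]*M[2][1] - M[1][1]*M[2][0]))

trace = A[0][0] + A[1][1] + A[2][2]
# Product of eigenvalues = det, sum of eigenvalues = trace
print(det3(A), trace)   # 4 5  (= 1*2*2 and 1+2+2)
```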

Some Results

Theorem: Let λ_1, λ_2, ..., λ_k be distinct eigenvalues of a matrix A and v_1, v_2, ..., v_k be corresponding (non-zero) eigenvectors. Then v_1, v_2, ..., v_k are linearly independent.

Theorem: Let λ_1, λ_2, ..., λ_k be distinct eigenvalues of a matrix A and E_λ1(A), E_λ2(A), ..., E_λk(A) be the corresponding eigenspaces. Then, the sum E_λ1(A) + E_λ2(A) + ··· + E_λk(A) is direct.

Theorem: If A and B are two n × n square matrices then AB and BA have the same eigenvalues.
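The AB/BA theorem can be illustrated with a small sketch (the matrices below are illustrative choices, not from the slides). For 2 × 2 matrices it suffices to compare traces and determinants, since the characteristic polynomial is x² − tr(M)x + det(M):

```python
A = [[0, 1],
     [0, 0]]
B = [[0, 0],
     [1, 0]]

def matmul(X, Y):
    """Multiply two 2x2 matrices."""
    return [[sum(X[i][k]*Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def charpoly2(M):
    """Coefficients (trace, det) of x^2 - trace*x + det."""
    return (M[0][0] + M[1][1], M[0][0]*M[1][1] - M[0][1]*M[1][0])

AB, BA = matmul(A, B), matmul(B, A)
print(AB, BA)                          # [[1, 0], [0, 0]] [[0, 0], [0, 1]]
print(charpoly2(AB) == charpoly2(BA))  # True, even though AB != BA
```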

Theorem: If λ is an eigenvalue of a square matrix A then λ is an eigenvalue of the transpose matrix A^t of the matrix A.

Theorem: If λ is an eigenvalue of an invertible square matrix A, then λ ≠ 0 and λ⁻¹ is an eigenvalue of the inverse matrix A⁻¹ of the matrix A.

Theorem: Let A be an n × n matrix. Then 0 is an eigenvalue of A iff A is singular.

Theorem: Let A be an n × n matrix. If λ is an eigenvalue of A then λ^n is an eigenvalue of A^n for each n ∈ N. If v ≠ 0 is an eigenvector associated with the eigenvalue λ, then v is an eigenvector of A^n associated with the eigenvalue λ^n for each n ∈ N.
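For instance, with the 3 × 3 matrix from the worked example above and its eigenpair (3, (−1, 1, 4)), the theorem predicts A²v = 9v:

```python
A = [[1, 2, -1],
     [1, 0,  1],
     [4, -4, 5]]

def matvec(M, v):
    """Multiply a 3x3 matrix by a 3-vector."""
    return [sum(M[i][j]*v[j] for j in range(3)) for i in range(3)]

v = [-1, 1, 4]          # eigenvector of A for the eigenvalue 3
Av = matvec(A, v)
A2v = matvec(A, Av)     # A^2 v, computed as A(Av)
print(Av, A2v)          # [-3, 3, 12] [-9, 9, 36]  i.e. 3v and 9v
```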

Characteristic Polynomial

Definition: If A is an n × n matrix then the polynomial M(x) = Det(xI − A) is called a characteristic polynomial of A.

Note: The polynomial P(x) = Det(A − xI) is also a characteristic polynomial of A.

Clearly, the roots of the characteristic polynomial are the eigenvalues of A. Therefore, the eigenvalues are also called the characteristic roots or characteristic values of A.

A polynomial is said to be a monic polynomial if the coefficient of the highest degree term is 1.

Note: The characteristic polynomial M(x) = Det(xI − A) is a monic polynomial.

Result on Eigenvalues of Similar Matrices

Theorem: Similar matrices have the same characteristic polynomial and hence they have the same eigenvalues. But the eigenvectors need not be the same.

Proof: If B = P⁻¹AP then

det(xI − B) = det(xI − P⁻¹AP)
            = det(xP⁻¹P − P⁻¹AP)
            = det(xP⁻¹IP − P⁻¹AP)
            = det(P⁻¹ xIP − P⁻¹AP)
            = det(P⁻¹(xI − A)P)
            = det(P⁻¹) det(xI − A) det(P)
            = det(xI − A) det(P⁻¹) det(P)
            = det(xI − A) det(P⁻¹P) = det(xI − A) det(I)
            = det(xI − A).
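A small numerical illustration of the theorem (the matrices A and P here are illustrative choices, not from the slides): B = P⁻¹AP must have the same trace and determinant as A, hence the same characteristic polynomial x² − tr·x + det.

```python
A = [[2, 1],
     [0, 3]]
P = [[1, 1],        # any invertible P works; det P = 1 here
     [0, 1]]
Pinv = [[1, -1],    # inverse of P (check: P * Pinv = I)
        [0,  1]]

def matmul(X, Y):
    """Multiply two 2x2 matrices."""
    return [[sum(X[i][k]*Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

B = matmul(matmul(Pinv, A), P)

def charpoly2(M):
    """Coefficients (trace, det) of x^2 - trace*x + det."""
    return (M[0][0] + M[1][1], M[0][0]*M[1][1] - M[0][1]*M[1][0])

print(B)                              # [[2, 0], [0, 3]]
print(charpoly2(A) == charpoly2(B))   # True
```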

Some Results


Theorem: Let V be a finite dimensional vector space over the field F and let T be a linear operator on V . If B is an ordered basis for V and A = [T ] B , then the eigenvalues of T are the roots of the characteristic polynomial of A which lie in the field F .

Theorem: A square matrix A and its transpose A t have the same characteristic polynomial.


Algebraic multiplicity and Geometric multiplicity of Eigenvalues

Definition (Algebraic multiplicity):

An eigenvalue λ of the matrix A has algebraic multiplicity k if (x − λ)^k is the highest power of (x − λ) that divides the characteristic polynomial of A.

That is, the power of the factor (x − λ) in the characteristic polynomial is the algebraic multiplicity of λ.

Definition (Geometric multiplicity):

The geometric multiplicity of an eigenvalue λ of the matrix A is the dimension of the eigenspace associated with the eigenvalue λ of A .


Result and Definition

Result: The geometric multiplicity of an eigenvalue λ does not exceed its algebraic multiplicity.

Definition: An eigenvalue λ of A is said to be regular if the geometric multiplicity of λ is equal to its algebraic multiplicity.

Example (from Section 6.2 of Hoffman & Kunze)

A = [ 3  1  −1 ]
    [ 2  2  −1 ]
    [ 2  2   0 ]

The characteristic polynomial of A is given by

x³ − 5x² + 8x − 4 = (x − 1)(x − 2)².

The algebraic multiplicity of the eigenvalue 2 is 2. The geometric multiplicity of the eigenvalue 2 is 1. Reason: a basis for E_2(A) is (1, 1, 2).

The algebraic multiplicity of the eigenvalue 1 is 1. The geometric multiplicity of the eigenvalue 1 is 1. Reason: a basis for E_1(A) is (1, 0, 2).
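The geometric multiplicity of the eigenvalue 2 can be computed as 3 − rank(2I − A); a plain-Python Gaussian-elimination sketch using exact rational arithmetic (Fractions, to avoid rounding):

```python
from fractions import Fraction as F

A = [[3, 1, -1],
     [2, 2, -1],
     [2, 2,  0]]

def rank(M):
    """Rank of a 3x3 matrix via Gaussian elimination over the rationals."""
    M = [[F(x) for x in row] for row in M]
    r = 0
    for col in range(3):
        piv = next((i for i in range(r, 3) if M[i][col] != 0), None)
        if piv is None:
            continue
        M[r], M[piv] = M[piv], M[r]
        for i in range(3):
            if i != r and M[i][col] != 0:
                f = M[i][col] / M[r][col]
                M[i] = [a - f*b for a, b in zip(M[i], M[r])]
        r += 1
    return r

B2 = [[2*(i == j) - A[i][j] for j in range(3)] for i in range(3)]  # 2I - A
print(3 - rank(B2))   # geometric multiplicity of the eigenvalue 2 -> 1
```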

Minimal Polynomial


Definition: If A is an n × n matrix then the minimal polynomial of A is the monic polynomial m(x) of smallest degree such that m(A) = 0 .

Definition: If T is a linear operator on a finite dimensional vector space V then the minimal polynomial of T is the minimal polynomial of a matrix A = [T ] B for any ordered basis B of V .

Let A = [ 1   2  −1 ]
        [ 1   0   1 ]   (the matrix from the earlier worked example, with eigenvalues 1, 2, 3).
        [ 4  −4   5 ]

Observe that (A − I)(A − 2I)(A − 3I) = 0. That is, m(A) = (A − 1 × I)(A − 2 × I)(A − 3 × I) = 0. Therefore, the minimal polynomial of A is m(x) = (x − 1)(x − 2)(x − 3).
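That (A − I)(A − 2I)(A − 3I) is the zero matrix can be confirmed directly:

```python
A = [[1, 2, -1],
     [1, 0,  1],
     [4, -4, 5]]

def matmul(X, Y):
    """Multiply two 3x3 matrices."""
    return [[sum(X[i][k]*Y[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def shift(M, c):
    """Return M - c*I."""
    return [[M[i][j] - c*(i == j) for j in range(3)] for i in range(3)]

Z = matmul(matmul(shift(A, 1), shift(A, 2)), shift(A, 3))
print(Z)   # [[0, 0, 0], [0, 0, 0], [0, 0, 0]]
```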

Examples

Let A = [ 1  0  0 ]
        [ 0  0  0 ]
        [ 0  0  0 ].

Note that A(A − I) = 0, and hence also A²(A − I) = 0. The characteristic polynomial of A is x²(x − 1). The minimal polynomial of A is x(x − 1).

Let B = [ 1  0  0 ]
        [ 0  0  1 ]
        [ 0  0  0 ].

Note that B(B − I) ≠ 0 but B²(B − I) = 0. The characteristic polynomial of B is x²(x − 1). The minimal polynomial of B is x²(x − 1).

Some Results


Result: Similar matrices have the same minimal polynomial.

Result: A square matrix A and its transpose A t have the same minimal polynomial.

Result: A square matrix A is non-singular iff the constant term in the minimal polynomial of A is non-zero.


Some Examples

1 The matrices

A = [ 1  0  0 ]           B = [ 1  0  0 ]
    [ 0  0  0 ]   and         [ 0  0  1 ]
    [ 0  0  0 ]               [ 0  0  0 ]

have the same characteristic polynomial but different minimal polynomials. The characteristic polynomial of both A and B is x²(x − 1). The minimal polynomial of A is x(x − 1) and the minimal polynomial of B is x²(x − 1).

2 The matrices

A = [ 1  0  0 ]           B = [ 1  0  0 ]
    [ 0  1  0 ]   and         [ 0  2  0 ]
    [ 0  0  2 ]               [ 0  0  2 ]

have the same minimal polynomial but different characteristic polynomials (and so are not similar). The minimal polynomial of both A and B is (x − 1)(x − 2). The characteristic polynomial of A is (x − 1)²(x − 2) and the characteristic polynomial of B is (x − 1)(x − 2)².
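A direct check of the first pair with plain matrix arithmetic: x(x − 1) annihilates A but not B, while x²(x − 1) annihilates B:

```python
A = [[1, 0, 0], [0, 0, 0], [0, 0, 0]]
B = [[1, 0, 0], [0, 0, 1], [0, 0, 0]]
I = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]

def matmul(X, Y):
    """Multiply two 3x3 matrices."""
    return [[sum(X[i][k]*Y[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def sub(X, Y):
    return [[X[i][j] - Y[i][j] for j in range(3)] for i in range(3)]

zero = [[0]*3 for _ in range(3)]
print(matmul(A, sub(A, I)) == zero)              # True:  x(x-1) kills A
print(matmul(B, sub(B, I)) == zero)              # False: x(x-1) does not kill B
print(matmul(matmul(B, B), sub(B, I)) == zero)   # True:  x^2(x-1) kills B
```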

Main Theorem

Cayley-Hamilton Theorem: Let A be an n × n matrix and let M (x) be the characteristic polynomial of A . Then, M (A) = 0 . In other words, the minimal polynomial divides the characteristic polynomial of A.

Note: The roots of the minimal and characteristic polynomials are the same, though their multiplicities may differ.

Cayley-Hamilton Theorem: Let T be a linear operator on a finite dimensional vector space V and let M (x) be the characteristic polynomial of T . Then, M (T ) = 0 . In other words, the minimal polynomial divides the characteristic polynomial of T .

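For the 3 × 3 matrix with characteristic polynomial x³ − 5x² + 8x − 4 used above, Cayley-Hamilton says A³ − 5A² + 8A − 4I = 0; verified directly:

```python
A = [[3, 1, -1],
     [2, 2, -1],
     [2, 2,  0]]

def matmul(X, Y):
    """Multiply two 3x3 matrices."""
    return [[sum(X[i][k]*Y[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

A2 = matmul(A, A)
A3 = matmul(A2, A)
# M(A) = A^3 - 5 A^2 + 8 A - 4 I, computed entrywise
M_of_A = [[A3[i][j] - 5*A2[i][j] + 8*A[i][j] - 4*(i == j)
           for j in range(3)] for i in range(3)]
print(M_of_A == [[0]*3 for _ in range(3)])   # True
```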

Diagonalizable

Definition: Let T be a linear operator on a finite dimensional vector space V . We say that T is diagonalizable if there is a basis B for V such that each vector of B is an eigenvector of T .

If B is an ordered basis for V consisting only of eigenvectors of T, then the matrix representation of T relative to the basis B is a diagonal matrix.

Example

Let T : R² → R² be the linear operator defined by

T [ x ]  =  [ 1  2 ] [ x ]
  [ y ]     [ 3  2 ] [ y ].

Observe that T(2, 3) = 4(2, 3) and T(1, −1) = (−1)(1, −1). The set B = {v_1 = (2, 3), v_2 = (1, −1)} consisting of eigenvectors is an ordered basis for R². Then,

[T]_B = [ 4   0 ]
        [ 0  −1 ]

which is a diagonal matrix.

Generalization of previous slide example

Let T be a linear operator on an n-dimensional vector space V. Let B = {v_1, v_2, ..., v_n} be an ordered basis for V in which each v_i is an eigenvector of T; then the matrix of T in the ordered basis B is diagonal. If T v_i = λ_i v_i then

[T]_B = [ λ_1   0   ···   0  ]
        [  0   λ_2  ···   0  ]
        [  .    .    .    .  ]
        [  0    0   ···  λ_n ].

Note: We don't require that the scalars λ_1, ..., λ_n be distinct. We need only n distinct eigenvectors that form a basis for V.

Result: Let V be a finite dimensional vector space over the field F. Let T be a linear operator on V and let A be its associated matrix relative to some basis B of V. Then, T (or A) is diagonalizable over F if and only if there exists an invertible matrix P over F such that P⁻¹AP is a diagonal matrix, that is, A is similar to a diagonal matrix.

Example

Let T : R² → R² be the linear operator defined by T(X) = AX for X = (x, y) ∈ R², where

A = [ 1  2 ]
    [ 3  2 ].

Observe that T(2, 3) = 4(2, 3) and T(1, −1) = (−1)(1, −1). Construct a matrix P whose column vectors are eigenvectors.

Set P = [ 2   1 ]           Then, P⁻¹ = [ 1/5   1/5 ]
        [ 3  −1 ].                      [ 3/5  −2/5 ].

Then,

B = [ 4   0 ] = P⁻¹AP
    [ 0  −1 ]

which is a diagonal matrix. Therefore T (or A) is diagonalizable.
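The computation of B = P⁻¹AP can be reproduced exactly with Python's Fraction type (the matrices are those of this example):

```python
from fractions import Fraction as F

A = [[F(1), F(2)], [F(3), F(2)]]
P = [[F(2), F(1)], [F(3), F(-1)]]   # columns are the eigenvectors (2,3), (1,-1)

detP = P[0][0]*P[1][1] - P[0][1]*P[1][0]          # -5
Pinv = [[ P[1][1]/detP, -P[0][1]/detP],
        [-P[1][0]/detP,  P[0][0]/detP]]           # [[1/5, 1/5], [3/5, -2/5]]

def matmul(X, Y):
    """Multiply two 2x2 matrices."""
    return [[sum(X[i][k]*Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

B = matmul(matmul(Pinv, A), P)
print(B == [[F(4), F(0)], [F(0), F(-1)]])   # True: B = diag(4, -1)
```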

When is T (or A) diagonalizable?

Theorem: Let V be a finite dimensional vector space with dim V = n. Let T be a linear operator on V and let A be its associated matrix relative to some basis B of V. Let λ_1, λ_2, ..., λ_k be the distinct eigenvalues of T. Let W_i = E_λi(A) denote the eigenspace associated with the eigenvalue λ_i for i = 1, ..., k. Then, the following are equivalent.

1 T (or A ) is diagonalizable.

2 The characteristic polynomial for T is

f(x) = (x − λ_1)^{d_1} (x − λ_2)^{d_2} ··· (x − λ_k)^{d_k}

and dim W_i = d_i. That is, the geometric multiplicity of each eigenvalue is equal to its algebraic multiplicity (i.e., each eigenvalue is regular).

3 dim W_1 + dim W_2 + ··· + dim W_k = n = dim V.

Example of Diagonalizable Matrix (Sec. 6.2 of H-K)

A = [  5  −6  −6 ]
    [ −1   4   2 ]
    [  3  −6  −4 ]

The characteristic polynomial of A is given by

x³ − 5x² + 8x − 4 = (x − 1)(x − 2)².

The algebraic multiplicity of the eigenvalue 1 is 1. The geometric multiplicity of the eigenvalue 1 is 1 (Reason: a basis for E_1(A) is (3, −1, 3)). The algebraic multiplicity of the eigenvalue 2 is 2. The geometric multiplicity of the eigenvalue 2 is 2 (Reason: a basis for E_2(A) is {(2, 1, 0), (2, 0, 1)}).

Set P = [  3  2  2 ]                         [ 1  0  0 ]
        [ −1  1  0 ].   Then P⁻¹AP = D =     [ 0  2  0 ].
        [  3  0  1 ]                         [ 0  0  2 ]

Example of Non-Diagonalizable Matrix

A = [ 1  1 ]
    [ 0  1 ]

The characteristic polynomial of A is given by (x − 1)².

The eigenspace E_1(A) has a basis (1, 0) and the dimension of E_1(A) is 1. But dim V = dim R² = 2 > 1. Therefore, A is not diagonalizable.
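Equivalently, the geometric multiplicity of the eigenvalue 1 is 2 − rank(A − I) = 1, which is strictly less than its algebraic multiplicity 2:

```python
A = [[1, 1],
     [0, 1]]

# A - I is already in row echelon form here, so its rank is the
# number of non-zero rows: rows are (0, 1) and (0, 0), i.e. rank 1.
AmI = [[A[i][j] - (i == j) for j in range(2)] for i in range(2)]
rank = sum(1 for row in AmI if any(row))
geometric_mult = 2 - rank
print(geometric_mult)   # 1  (< algebraic multiplicity 2, so A is not diagonalizable)
```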