
Annexure-I

TERM PAPER

ENGINEERING MATHEMATICS-I
MTH101

Topic: What Are The Applications Of Eigen Values?

DOA:

DOR:

DOS:

Submitted to: Submitted by

Acknowledgement- I would like to express my appreciation to the people and sources
that helped me with this term paper. I am highly thankful to Miss Chadni for her active
support, valuable time and advice, whole-hearted guidance, sincere cooperation and
painstaking involvement during the preparation of the term paper; with her great help I
could complete it. Some books also helped me understand the applications of eigen values.
I would also like to thank my colleagues, who always encouraged me to complete this term
paper. Finally, I thank once more all those who helped and encouraged me.

Abstract Of Work Undertaken- The roots that satisfy the characteristic
equation of a matrix are known as its eigen values.

Table Of Contents-
* Introduction
* Applications of eigen values
* Schrödinger equation
* Molecular orbitals
* Factor analysis
* Vibration analysis
* Eigenfaces
* Tensor of inertia
* Stress tensor
* Eigenvalues of a graph
* In matrix factorization
* Quantum mechanics
* Geology and glaciology

Introduction- When we talk about eigen values, the most important underlying concept is the
matrix. A matrix is a rectangular array of m·n numbers (or functions) arranged in m rows
(horizontal lines) and n columns (vertical lines). These numbers, known as elements or
entries, are enclosed in brackets [ ], ( ) or ║ ║. The order of such a matrix is m×n, and
it is said to be a rectangular matrix.

Example-

In mathematics, a matrix (plural matrices, or less commonly matrixes) is a rectangular
array of numbers, such as

    [  1   9  13 ]
    [ 20  55   4 ]

An item in a matrix is called an entry or an element. The example has entries 1, 9, 13,
20, 55, and 4. Entries are often denoted by a variable with two subscripts, e.g. a21 for
the entry in row 2 and column 1. Matrices of the same size can be added and subtracted
entry-wise, and matrices
of compatible sizes can be multiplied. These operations have many of the properties of
ordinary arithmetic, except that matrix multiplication is not commutative, that
is, AB and BA are not equal in general. Matrices consisting of only one column or row
define the components of vectors, while higher-dimensional (e.g., three-dimensional)
arrays of numbers define the components of a generalization of a vector called a tensor.
Matrices with entries in other fields or rings are also studied.

Matrices are a key tool in linear algebra. One use of matrices is to represent linear
transformations, which are higher-dimensional analogs of linear functions of the
form f(x) = cx, where c is a constant; matrix multiplication corresponds
to composition of linear transformations. Matrices can also keep track of
the coefficients in a system of linear equations. For a square matrix,
the determinant and inverse matrix (when it exists) govern the behavior of solutions to
the corresponding system of linear equations, and eigenvalues and eigenvectors provide
insight into the geometry of the associated linear transformation.

Matrices find many applications. Physics makes use of matrices in various domains, for
example in geometrical optics and matrix mechanics; the latter led to studying in more
detail matrices with an infinite number of rows and columns. Graph theory uses matrices
to keep track of distances between pairs of vertices in a graph. Computer graphics uses
matrices to project 3-dimensional space onto a 2-dimensional screen. Matrix
calculus generalizes classical analytical notions such as derivatives of functions
or exponentials to matrices. The latter is a recurring need in solving ordinary differential
equations. Serialism and dodecaphonism are musical movements of the 20th century
that use a square mathematical matrix to determine the pattern of music intervals.

Now we come to the eigen values. In mathematics, eigenvalue, eigenvector,


and eigenspace are related concepts in the field of linear algebra. Eigenvalues, eigenvectors
and eigenspaces are properties of a matrix.

Have you ever heard the words eigenvalue and eigenvector? They are
derived from the German word "eigen" which means "proper" or "characteristic." An
eigenvalue of a square matrix is a scalar that is usually represented by the Greek letter λ
(pronounced "lambda"). As you might suspect, an eigenvector is a vector. Moreover, we
require that an eigenvector be a non-zero vector; in other words, an eigenvector cannot be
the zero vector.

Let A be a square matrix of order n×n and let X be a non-zero column matrix of order
n×1. Then there exists a scalar λ such that

AX = λX

where λ is called an eigen value of A, and X is called an eigen vector corresponding to the
eigen value λ.

Characteristic equation: |A − λI| = 0

Order of the characteristic equation = order of the matrix.

An example of finding eigen values is given below.

Consider a 2×2 matrix whose characteristic equation factors as (λ − 6)(λ − 1) = 0.

So the eigen values satisfy

λ − 6 = 0, i.e. λ = 6

and

λ − 1 = 0, i.e. λ = 1.

So the two eigen values of the given matrix are 1 and 6.
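The calculation above can be reproduced numerically. A minimal sketch, where the 2×2 matrix is an illustrative assumption chosen to have the same characteristic polynomial λ² − 7λ + 6 (the original example matrix is not reproduced in the text):

```python
import numpy as np

# Illustrative matrix (an assumption): its characteristic polynomial is
# lambda^2 - 7*lambda + 6 = (lambda - 1)(lambda - 6).
A = np.array([[3.0, 2.0],
              [3.0, 4.0]])

eigenvalues = np.sort(np.linalg.eigvals(A))
print(eigenvalues)  # the eigen values 1 and 6
```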

Sum of the eigen values = sum of the principal diagonal elements of the matrix.

Example- For a 2×2 matrix with diagonal elements 3 and 4, solving the characteristic
equation gives the eigen values 6 and 1.

Sum of eigen values = sum of diagonal elements

6 + 1 = 3 + 4

7 = 7

Product of the eigen values = determinant of the given matrix.

Example- For a 2×2 matrix whose determinant is −4 − 12 = −16, solving the characteristic
equation gives the eigen values 4 and −4.

Product of eigen values = determinant of the given matrix

4 × (−4) = −4 − 12

−16 = −16
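Both checks (sum = trace, product = determinant) can be verified in code. A minimal sketch, where the matrix is an illustrative assumption with determinant 2·(−2) − 4·3 = −4 − 12 = −16 and eigen values 4 and −4:

```python
import numpy as np

# Illustrative matrix (an assumption), eigen values 4 and -4.
B = np.array([[2.0, 4.0],
              [3.0, -2.0]])

eig = np.linalg.eigvals(B)
assert np.isclose(eig.sum(), np.trace(B))         # sum of eigen values = trace
assert np.isclose(eig.prod(), np.linalg.det(B))   # product = determinant
print(np.sort(eig))
```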

There are some properties of eigen values, which are given below.

Property 1- For any square matrix A, the eigen values of A and of its transpose A^T are the
same.

Property 2- If the eigen values of A are λ1, λ2, then the eigen values of A^(-1) are 1/λ1,
1/λ2.

Property 3- If the eigen values of A are λ1, λ2, then the eigen values of kA are kλ1, kλ2,
and the eigen vectors are the same.

Property 4- If the eigen values of A are λ1, λ2, then the eigen values of A^n are λ1^n,
λ2^n.

Property 5- Two eigen vectors X and Y are said to be orthogonal if X^T Y = 0 (equivalently
Y^T X = 0).

Property 6- The eigen values of diagonal, upper triangular and lower triangular matrices
are given by their diagonal elements.

Property 7- The algebraic multiplicity of an eigen value λ is its order as a root of the
characteristic polynomial.

Property 8- The geometric multiplicity of λ is the number of linearly independent eigen
vectors corresponding to λ.
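Properties 1 to 4 can be checked numerically. A minimal sketch, where the matrix is an arbitrary illustrative choice with eigen values 1 and 6:

```python
import numpy as np

A = np.array([[3.0, 2.0],
              [3.0, 4.0]])          # illustrative matrix, eigen values 1 and 6
lam = np.sort(np.linalg.eigvals(A))

# Property 1: A and A^T have the same eigen values.
assert np.allclose(np.sort(np.linalg.eigvals(A.T)), lam)
# Property 2: eigen values of A^-1 are the reciprocals 1/lambda.
assert np.allclose(np.sort(np.linalg.eigvals(np.linalg.inv(A))), np.sort(1 / lam))
# Property 3: eigen values of k*A are k*lambda.
assert np.allclose(np.sort(np.linalg.eigvals(5 * A)), 5 * lam)
# Property 4: eigen values of A^n are lambda^n.
assert np.allclose(np.sort(np.linalg.eigvals(np.linalg.matrix_power(A, 3))), lam ** 3)
```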

Applications of Eigen Values-

Schrödinger equation- An example of an eigenvalue equation where the
transformation T is represented in terms of a differential operator is the time-independent
Schrödinger equation in quantum mechanics:

H ψE = E ψE

where H, the Hamiltonian, is a second-order differential operator and ψE, the wave function,
is one of its eigenfunctions corresponding to the eigenvalue E, interpreted as its energy.

However, in the case where one is interested only in the bound state solutions of the
Schrödinger equation, one looks for ψE within the space of square integrable functions.
Since this space is a Hilbert space with a well-defined scalar product, one can introduce
a basis set in which ψE and H can be represented as a one-dimensional array and a matrix
respectively. This allows one to represent the Schrödinger equation in matrix form.
Bra–ket notation is often used in this context. A vector, which represents a state of the
system, in the Hilbert space of square integrable functions is written |ψE⟩. In this
notation, the Schrödinger equation is

H |ψE⟩ = E |ψE⟩

where |ψE⟩ is an eigenstate of H. H is a self-adjoint operator, the infinite-dimensional
analog of a Hermitian matrix. As in the matrix case, H |ψE⟩ in the equation above is
understood to be the vector obtained by applying the transformation H to |ψE⟩.
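As a concrete illustration of the matrix form, the particle-in-a-box Hamiltonian can be discretized by finite differences (a standard sketch; the units ħ = m = 1 and box length 1 are assumptions), so the energies appear as eigenvalues of a symmetric matrix:

```python
import numpy as np

# Finite-difference sketch of H = -(1/2) d^2/dx^2 on (0, 1) with
# psi(0) = psi(1) = 0, in assumed units hbar = m = 1.
n = 500                      # interior grid points
h = 1.0 / (n + 1)            # grid spacing
H = (np.diag(np.full(n, 2.0))
     + np.diag(np.full(n - 1, -1.0), 1)
     + np.diag(np.full(n - 1, -1.0), -1)) / (2 * h ** 2)

E = np.linalg.eigvalsh(H)                        # ascending eigenvalues
exact = (np.pi ** 2 / 2) * np.arange(1, 4) ** 2  # analytic E_k = k^2 pi^2 / 2
print(E[:3], exact)  # the numerical energies closely match the analytic ones
```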
Molecular orbitals- In quantum mechanics, and in particular
in atomic and molecular physics, within Hartree–Fock theory the
atomic and molecular orbitals can be defined by the eigenvectors of the Fock
operator. The corresponding eigenvalues are interpreted as ionization potentials
via Koopmans' theorem. In this case, the term eigenvector is used in a somewhat
more general meaning, since the Fock operator is explicitly dependent on the
orbitals and their eigenvalues. To underline this aspect one speaks of a
nonlinear eigenvalue problem. Such equations are usually solved by an iteration
procedure, called in this case the self-consistent field method. In quantum chemistry
one often represents the Hartree–Fock equation in a non-orthogonal basis set. This
particular representation is a generalized eigenvalue problem called the Roothaan
equations.
Factor analysis- In factor analysis, the eigenvectors of a covariance
matrix or correlation matrix correspond to factors, and eigenvalues to the variance explained
by these factors. Factor analysis is a statistical technique used in the social sciences and
in marketing, product management, operations research, and other applied sciences that deal
with large quantities of data. The objective is to explain most of the covariability among a
number of observable random variables in terms of a smaller number of unobservable latent
variables called factors. The observable random variables are modeled as linear
combinations of the factors, plus unique variance terms. Eigenvalues are used in the
analysis performed by Q-methodology software: factors with eigenvalues greater than 1.00
are considered significant, explaining an important amount of the variability in the data,
while eigenvalues less than 1.00 are considered too weak to explain a significant portion
of the data variability.
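The eigenvalue-greater-than-1 (Kaiser) rule can be sketched on a hypothetical correlation matrix; the numbers below are invented for illustration:

```python
import numpy as np

# Hypothetical 4-variable correlation matrix with two correlated pairs.
R = np.array([[1.0, 0.8, 0.1, 0.1],
              [0.8, 1.0, 0.1, 0.1],
              [0.1, 0.1, 1.0, 0.7],
              [0.1, 0.1, 0.7, 1.0]])

eigvals = np.sort(np.linalg.eigvalsh(R))[::-1]   # descending eigenvalues
retained = int((eigvals > 1.0).sum())            # Kaiser criterion
print(eigvals, retained)  # two factors have eigenvalue > 1
```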
Vibration analysis- Eigenvalue problems occur naturally in the vibration analysis of
mechanical structures with many degrees of freedom. The eigenvalues are used to determine
the natural frequencies of vibration, and the eigenvectors determine the shapes of these
vibrational modes. The orthogonality properties of the eigenvectors allow decoupling of the
differential equations, so that the system can be represented as a linear summation of the
eigenvectors. The eigenvalue problem of complex structures is often solved using finite
element analysis.
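A minimal vibration sketch, assuming an illustrative two-mass, three-spring chain with equal masses m and stiffnesses k: the squared natural frequencies are the eigenvalues of M⁻¹K, since K x = ω² M x.

```python
import numpy as np

m, k = 1.0, 100.0                      # illustrative values
M = np.diag([m, m])                    # mass matrix
K = np.array([[2 * k, -k],
              [-k, 2 * k]])            # stiffness matrix

# K x = omega^2 M x  =>  omega^2 are the eigenvalues of M^-1 K.
omega_sq = np.sort(np.linalg.eigvals(np.linalg.solve(M, K)).real)
natural_freqs = np.sqrt(omega_sq)      # rad/s: sqrt(k/m) and sqrt(3k/m)
print(natural_freqs)
```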
Eigenfaces- In image processing, processed images of faces can be seen as vectors whose
components are the brightnesses of the pixels. The dimension of this vector space is the
number of pixels. The eigenvectors of the covariance matrix associated with a large set of
normalized pictures of faces are called eigenfaces; this is an example of principal
components analysis. They are very useful for expressing any face image as a linear
combination of some of them. In the facial recognition branch of biometrics, eigenfaces
provide a means of applying data compression to faces for identification purposes. Research
has also been conducted on eigen vision systems for determining hand gestures.

Similar to this concept, eigenvoices represent the general direction of variability in human
pronunciations of a particular utterance, such as a word in a language. Based on a linear
combination of such eigenvoices, a new voice pronunciation of the word can be constructed.
These concepts have been found useful in automatic speech recognition systems, for speaker
adaptation.
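The eigenface construction can be sketched on toy data; here random vectors stand in for face images, so this illustrates the principle rather than a real face pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)
images = rng.normal(size=(20, 16))     # 20 fake "images", 16 pixels each
centered = images - images.mean(axis=0)

cov = np.cov(centered, rowvar=False)   # 16x16 pixel covariance matrix
eigvals, eigvecs = np.linalg.eigh(cov)
eigenfaces = eigvecs[:, ::-1]          # columns ordered by decreasing variance

# Any image is a linear combination of eigenfaces (full basis => exact):
weights = centered @ eigenfaces
reconstructed = weights @ eigenfaces.T + images.mean(axis=0)
assert np.allclose(reconstructed, images)
```

Keeping only the first few columns of `eigenfaces` gives the data-compression use mentioned above: each face is then stored as a short vector of weights.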
Tensor of inertia- In mechanics, the eigenvectors of the inertia tensor define the principal
axes of a rigid body. The tensor of inertia is a key quantity required in order to determine the
rotation of a rigid body around its center of mass.
Stress tensor- In solid mechanics, the stress tensor is symmetric and so can be
decomposed into a diagonal tensor with the eigenvalues on the diagonal and eigenvectors as a
basis. Because it is diagonal, in this orientation, the stress tensor has no shear components;
the components it does have are the principal components.
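Diagonalizing a stress tensor is a direct eigenvalue computation. A sketch with illustrative stress values (MPa):

```python
import numpy as np

# Symmetric stress tensor (illustrative numbers, MPa).
sigma = np.array([[50.0,  30.0,  0.0],
                  [30.0, -20.0,  0.0],
                  [ 0.0,   0.0, 10.0]])

principal_stresses, principal_dirs = np.linalg.eigh(sigma)

# In the principal basis the tensor is diagonal: no shear components.
in_principal_basis = principal_dirs.T @ sigma @ principal_dirs
assert np.allclose(in_principal_basis, np.diag(principal_stresses))
print(principal_stresses)   # the principal stresses
```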
Eigenvalues of a graph- In spectral graph theory, an eigenvalue of a graph is defined as
an eigenvalue of the graph's adjacency matrix A, or (increasingly) of the
graph's Laplacian matrix, which is either T − A (sometimes called the combinatorial
Laplacian) or I − T^(−1/2) A T^(−1/2) (sometimes called the normalized Laplacian), where T
is a diagonal matrix with T_{v,v} equal to the degree of vertex v, and in T^(−1/2) the
(v, v)-th entry is deg(v)^(−1/2). The k-th principal eigenvector of a graph is defined as
the eigenvector corresponding to either the k-th largest or the k-th smallest eigenvalue of
the Laplacian. The first principal eigenvector of the graph is also referred to simply as
the principal eigenvector.

The principal eigenvector is used to measure the centrality of its vertices. An example
is Google's PageRank algorithm. The principal eigenvector of a modified adjacency matrix of
the World Wide Web graph gives the page ranks as its components. This vector corresponds
to the stationary distribution of the Markov chain represented by the row-normalized
adjacency matrix; however, the adjacency matrix must first be modified to ensure a stationary
distribution exists. The second smallest eigenvector can be used to partition the graph into
clusters, via spectral clustering. Other methods are also available for clustering.
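The PageRank idea can be sketched with power iteration on a small made-up link graph; the adjacency matrix and the damping factor 0.85 are illustrative:

```python
import numpy as np

# Made-up link graph: entry (i, j) = 1 if page i links to page j.
A = np.array([[0, 1, 1, 0],
              [0, 0, 1, 0],
              [1, 0, 0, 1],
              [0, 0, 1, 0]], dtype=float)

P = A / A.sum(axis=1, keepdims=True)   # row-normalized transition matrix
d = 0.85                               # damping ensures a stationary distribution
G = d * P + (1 - d) / len(A)           # modified ("Google") matrix

rank = np.full(len(A), 1.0 / len(A))
for _ in range(100):                   # power iteration -> principal left eigenvector
    rank = rank @ G
print(rank)   # page 2, with the most in-links, ranks highest
```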
In matrix factorization- Eigen Values can be used in matrix factorization. They have
applications in areas of applied mathematics as diverse as economics and quantum
mechanics.

Mathematical economics refers to the application of mathematical methods to


represent economic theories and analyze problems posed in economics. It allows formulation
and derivation of key relationships in a theory with clarity, generality, rigor, and
simplicity. Much of classical economics can be presented in simple geometric terms or
elementary mathematical notation. Mathematical economics, however, conventionally makes
use of calculus and matrix algebra in economic analysis in order to make powerful claims
that would be more difficult without such mathematical tools. These tools are prerequisites
for formal study, not only in mathematical economics but in contemporary economic theory
in general. Economic problems often involve so many variables that mathematics is the only
practical way of attacking and solving them. Alfred Marshall argued that every economic
problem which can be quantified, analytically expressed and solved, should be treated by
means of mathematical work.

Quantum mechanics- Quantum mechanics had enormous success in explaining many of


the features of our world. The individual behaviour of the subatomic particles that make up
all forms of matter—electrons, protons, neutrons, photons and others—can often only be
satisfactorily described using quantum mechanics. Quantum mechanics has strongly
influenced string theory, a candidate for a theory of everything and the multiverse hypothesis.

Quantum mechanics is important for understanding how individual atoms combine


covalently to form chemicals or molecules. The application of quantum mechanics
to chemistry is known as quantum chemistry. (Relativistic) quantum mechanics can in
principle mathematically describe most of chemistry. Quantum mechanics can provide
quantitative insight into ionic and covalent bonding processes by explicitly showing which
molecules are energetically favorable to which others, and by approximately how much. Most
of the calculations performed in computational chemistry rely on quantum mechanics.
Geology and glaciology- In geology, especially in the study of glacial till, eigenvectors
and eigenvalues are used as a method by which a mass of information about the orientation
and dip of a clast fabric's constituents can be summarized in 3-D space by six numbers. In
the field, a geologist may collect such data for hundreds or thousands of clasts in a soil
sample, which can only be compared graphically, such as in a Tri-Plot (Sneed and Folk)
diagram or as a Stereonet on a Wulff Net. The output of the orientation tensor is the three
orthogonal (perpendicular) axes of space. Eigenvectors output from programs such as
Stereo32 are in the order E1 ≥ E2 ≥ E3, with E1 being the primary orientation of clast
orientation/dip, E2 the secondary and E3 the tertiary, in terms of strength. The clast
orientation is defined as the eigenvector, on a compass rose of 360°. Dip is measured as
the eigenvalue, the modulus of the tensor: this is valued from 0° (no dip) to 90°
(vertical). The relative values of E1, E2, and E3 are dictated by the nature of the
sediment's fabric. If E1 = E2 = E3, the fabric is said to be isotropic.
Core Chapters-

Quantum Mechanics- Much of modern technology operates at a scale where quantum effects
are significant. Examples include the laser, the transistor (and thus the microchip),
the electron microscope, and magnetic resonance imaging. The study of semiconductors led
to the invention of the diode and the transistor, which are indispensable for
modern electronics.

Researchers are currently seeking robust methods of directly manipulating quantum states.
Efforts are being made to develop quantum cryptography, which will allow guaranteed secure
transmission of information. A more distant goal is the development of quantum computers
which are expected to perform certain computational tasks exponentially faster than
classical computers. Another active research topic is quantum teleportation, which deals with
techniques to transmit quantum information over arbitrary distances.

Quantum tunnelling is vital in many devices, even in the simple light switch, as
otherwise the electrons in the electric current could not penetrate the potential barrier made
up of a layer of oxide. Flash memory chips found in USB drives use quantum tunnelling to
erase their memory cells.
References Cited- Higher Engineering Mathematics by B. V. Ramana, and Internet sources.
