What you will find in this chapter:
Prerequisites:
3.1 Introduction:
An R.C matrix is a square matrix in which the i-th row, for every value of i, is orthogonal to the i-th column. Though the set of all R.C matrices is a sub-class of the square matrices, it refrains from obeying some of the basic tenets of matrix algebra. The member matrices of the set of R.C matrices, except the null matrix, are always non-singular, and they conceal many striking characteristics seemingly common in nature. This has opened many avenues for further research work.
We introduce the R.C matrix and try to unveil some of its salient features. These are the features which have become known to us on looking at its basic structure from different angles.
We give an illustration:

A = [ 1  2  −3
      1  2  −6
      1  1   3 ]  is an R.C matrix.

We write its general form. An R.C matrix A = [a_ij] of order n × n from the set JJn satisfies the R.C property

(a_ii)² + Σ_{j=1, j≠i}^{n} a_ij a_ji = 0  for each i = 1 to n    ..(1)

e.g., for the first row and first column, (a_11)² + a_12 a_21 + a_13 a_31 + … + a_1n a_n1 = 0.
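To make the defining property concrete, here is a minimal sketch; the helper name is mine, and the sample entries follow this chapter's worked example. It tests each i-th row against the i-th column:

```python
# Hypothetical helper (not from the text): a matrix m is R.C when the
# dot product of its i-th row with its i-th column vanishes for every i.
def is_rc_matrix(m, tol=1e-9):
    n = len(m)
    for i in range(n):
        row = m[i]
        col = [m[j][i] for j in range(n)]
        if abs(sum(r * c for r, c in zip(row, col))) > tol:
            return False
    return True

A = [[1, 2, -3],
     [1, 2, -6],
     [1, 1, 3]]
print(is_rc_matrix(A))  # True: each i-th row is orthogonal to the i-th column
```

The same helper also accepts the null matrix, in line with definition (3) below.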
R.C matrices of higher order exist, and they can even be constructed by extending the R.C property from a given R.C matrix of order 3 × 3.
We have developed an elegant method of extending an R.C matrix of order n × n to the next higher order (n + 1) × (n + 1). We shall discuss the same in the section to follow. Just for citation purposes, we write R.C matrices of order 3 × 3 and 4 × 4.
[3 × 3 and 4 × 4 specimen R.C matrices; the printed entries are not recoverable from the source.]
[Extension of an R.C matrix from 2 × 2 to an R.C matrix of any subsequent higher order is shown in the annexure. Readers are requested to go through the technical proceedings there.]
For the discussion that follows we fix the 3 × 3 matrix

A = [ a  b  c
      x  y  z
      p  q  r ]    ..(2)

as our standard matrix and treat it as an R.C matrix with all real entries.
[If all the real entries are zero, then it is a null matrix; hence the defining property of an R.C matrix permits the null matrix in the category of R.C matrices,
i.e. the null matrix O = [ 0 0 0 ; 0 0 0 ; 0 0 0 ] is, by definition, an R.C matrix.    ..(3)]
We have, by the definition of the R.C matrix applied to matrix A as in (2), the following conditions:
a² + bx + cp = 0    ..(4)
bx + y² + qz = 0    ..(5)
cp + qz + r² = 0    ..(6)
We shall write
A = bx, B = cp, and C = qz    ..(7)
i.e. a² + A + B = 0, y² + A + C = 0, and r² + B + C = 0.
From the three equations written above, we derive
a² + y² − r² = −2A ⟹ A = −(a² + y² − r²)/2.
In the same way we can write two more equations. All three are listed below.
A = bx = (−a² − y² + r²)/2
B = cp = (−a² + y² − r²)/2
C = qz = (a² − y² − r²)/2    ..(8)
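Assuming the entry layout of definition (2) (rows a b c / x y z / p q r), the half-difference expressions for bx, cp, and qz can be checked numerically on the chapter's sample matrix:

```python
# Numerical check (my own sketch) of the relations in (8),
# using the entry names of definition (2).
a, b, c = 1, 2, -3
x, y, z = 1, 2, -6
p, q, r = 1, 1, 3

A_ = b * x   # bx
B_ = c * p   # cp
C_ = q * z   # qz

assert A_ == (-a**2 - y**2 + r**2) / 2   # bx = (-a^2 - y^2 + r^2)/2
assert B_ == (-a**2 + y**2 - r**2) / 2   # cp = (-a^2 + y^2 - r^2)/2
assert C_ == (a**2 - y**2 - r**2) / 2    # qz = (a^2 - y^2 - r^2)/2
print(A_, B_, C_)  # 2 -3 -6
```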
Using the relations derived above, we will now prove some important notions in terms of theorems.
Theorem 01: An R.C matrix with real entries, except the null matrix, cannot be a symmetric matrix.
Proof: Let the R.C matrix be A = [ a b c ; x y z ; p q r ] as defined by (2). If it is a symmetric matrix then x = b, p = c, and q = z, so each of bx = b², cp = c², and qz = z² is non-negative and can never be negative.
Using the expressions given in (8): bx ≥ 0 gives a² + y² − r² ≤ 0; in the same way, cp ≥ 0 gives a² − y² + r² ≤ 0; and qz ≥ 0 gives −a² + y² + r² ≤ 0.
Adding all the results obtained above, we have a² + y² + r² ≤ 0, which is possible for real values of a, y, and r only when a = y = r = 0.
This in turn makes each of bx, cp, and qz equal to zero, i.e. b = c = z = 0, and hence every entry of the matrix vanishes.
This proves that, except the null matrix, an R.C matrix cannot be a symmetric one. This proves the theorem.
Lemma: On the same lines, a non-null R.C matrix cannot be a skew-symmetric matrix.
Proof: For a skew-symmetric matrix the diagonal elements vanish, i.e. a = y = r = 0, while x = −b, p = −c, and q = −z, so each of bx = −b², cp = −c², and qz = −z² is less than or equal to zero. On adding (4), (5), and (6) we get 2(bx + cp + qz) = −(a² + y² + r²) = 0, which forces bx = cp = qz = 0, i.e. b = c = z = 0. The matrix is then the null matrix, contrary to the assumption that it is non-null. This proves the lemma.
In the next section we prove some defining features of 3 × 3 R.C matrices and then establish that these can be extended to R.C matrices of higher order as well.
Theorem 02: An R.C matrix, except the null matrix, is always a non-singular matrix.
Proof: By definition we accept that the null matrix is an R.C matrix, and in that case the R.C matrix is a singular one.
In the case when an R.C matrix is non-null, the defining property of the R.C matrix supports the claim of its non-singularity.
[By the definition of an R.C matrix, its i-th row vector is orthogonal to the i-th column vector only, making the result of their dot product / inner product equal to zero. Had it been orthogonal to any column other than the i-th one, the column vectors would become linearly dependent, which results in singularity of the matrix.]
We conclude that an R.C matrix, except the null matrix, is always a non-singular matrix.
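The non-singularity claim can be spot-checked on the chapter's sample matrix with a hand-rolled 3 × 3 determinant (a sketch; the function name is mine):

```python
# Cofactor expansion along the first row of a 3x3 matrix.
def det3(m):
    (a, b, c), (x, y, z), (p, q, r) = m
    return a * (y * r - z * q) - b * (x * r - z * p) + c * (x * q - y * p)

A = [[1, 2, -3], [1, 2, -6], [1, 1, 3]]
print(det3(A))  # -3: non-zero, so this non-null R.C matrix is non-singular
```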
Theorem 03: In a 3 × 3 R.C matrix with real entries, the products of at least two of the pairs of symmetrically placed off-diagonal entries are negative.
[This is an important clue for constructing a 3 × 3 R.C matrix. The same concept can be extended to R.C matrices of higher order.]
Proof: Let us consider the R.C matrix A = [ a b c ; x y z ; p q r ] as defined by (2).
As mentioned in the statement, we want to prove that at least two of the terms bx, cp, and qz must be negative; this is rather one of the most important salient features of an R.C matrix. The theorem targets establishing that at least two of bx, cp, and qz are strictly less than zero.
Being an R.C matrix, the entries follow the R.C property. We write relations (4), (5), and (6) below:
a² + bx + cp = 0
bx + y² + qz = 0
cp + qz + r² = 0
With the notation of (7), these relations become a² + A + B = 0, y² + A + C = 0, and r² + B + C = 0.
From this junction we discuss the different sign cases for A = bx, B = cp, and C = qz.
Case 1: A, B, and C all > 0 is not possible for an R.C matrix, since then a² + A + B > 0 would contradict the first relation.
Case 2: Any two of A, B, and C > 0 with the remaining one < 0 is not possible either: if, say, A > 0 and B > 0, then a² + A + B > 0 again contradicts a² + A + B = 0, and the other pairs fail in the same way against the remaining two relations.
The cases ruled out above support the statement; for a non-null R.C matrix the only possibility is the following.
Case 3: Two of A, B, and C are < 0 and the remaining one is > 0.
Say A = bx < 0; this implies −a² − y² + r² < 0. And say B = cp < 0; this implies −a² + y² − r² < 0. Adding them, we get −2a² < 0, which is true [∵ A is a real-entry matrix]. In addition to this, C = qz > 0 implies a² − y² − r² > 0, which is consistent with the two relations above. Hence at least two of bx, cp, and qz are negative, which proves the theorem.
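On the chapter's sample matrix the sign pattern promised by the theorem can be counted directly (a small sketch of mine):

```python
# Count how many of the off-diagonal pair products bx, cp, qz are negative.
A = [[1, 2, -3], [1, 2, -6], [1, 1, 3]]
bx = A[0][1] * A[1][0]   # b*x = 2*1 = 2
cp = A[0][2] * A[2][0]   # c*p = (-3)*1 = -3
qz = A[1][2] * A[2][1]   # q*z = (-6)*1 = -6
print(sum(v < 0 for v in (bx, cp, qz)) >= 2)  # True
```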
3.5 Eigen Values and Eigen Vectors:
This is one of the most important and useful notions in matrix algebra. For a given square matrix A there exists a non-zero vector X such that, for some value λ, the matrix equation AX = λX is satisfied. 'λ' is called an Eigen value and X is called the corresponding Eigen vector. In this section we discuss salient features of Eigen values and Eigen vectors of the R.C matrix.
3.5.1 Introduction to Important Preliminaries:
Before we proceed to enunciate our findings, we would like to mention certain preliminaries regarding the R.C matrix and its Eigen values. This will simplify our proceedings; all of these will prove useful in the arguments proving the next theorems.
Let us initially focus our attention on A = [ a b c ; x y z ; p q r ], an R.C matrix with all real entries as defined by (2).
Let λ1, λ2, and λ3 be the Eigen values of A with corresponding non-zero Eigen vectors X1, X2, and X3.
[1] As the matrix A is an R.C matrix, by its defining properties we have the following results. These results are already mentioned in the earlier work, but just to abridge we cite them at this point.
a² + bx + cp = 0,  bx + y² + qz = 0,  and  cp + qz + r² = 0
A = bx = (−a² − y² + r²)/2
B = cp = (−a² + y² − r²)/2
C = qz = (a² − y² − r²)/2
∴ a² + y² + r² = −2(bx + cp + qz)    ..(9)
[2] Also, recalling the facts pertaining to the Eigen values λ1, λ2, and λ3, we write:
(a) Sum of the Eigen values = trace of the matrix: λ1 + λ2 + λ3 = a + y + r    ..(10)
(b) Sum of the products of the Eigen values taken two at a time: λ1λ2 + λ2λ3 + λ3λ1 = (ay − bx) + (ar − cp) + (yr − qz)    ..(11)
(c) Product of the Eigen values: λ1λ2λ3 = det A = |A|    ..(12)
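These facts, together with the vanishing sum of squares of the Eigen values that follows from them, can be verified numerically on the chapter's sample matrix (a sketch; numpy's eigvals is used for convenience):

```python
import numpy as np

A = np.array([[1, 2, -3], [1, 2, -6], [1, 1, 3]], dtype=float)
lam = np.linalg.eigvals(A)

assert np.isclose(lam.sum().real, np.trace(A))           # sum = trace
assert np.isclose(np.prod(lam).real, np.linalg.det(A))   # product = |A|
assert abs((lam**2).sum()) < 1e-8                        # sum of squares is 0
```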
With this on hand we state some theorems.
Theorem 04: The Eigen values of an R.C matrix with real entries are either all zero or exactly one of them is real.
Proof: As the null matrix is also an R.C matrix, in that case all the Eigen values are zero and the claim holds. If the matrix A is not a null matrix, then we proceed as follows.
If any one of the Eigen values were zero, it would imply that |A| = 0. This would mean that the R.C matrix is a singular matrix, which violates the non-singularity of a non-null R.C matrix established above. Further, combining the trace and pairwise-product facts with the R.C relation a² + y² + r² = −2(bx + cp + qz),
λ1² + λ2² + λ3² = (λ1 + λ2 + λ3)² − 2(λ1λ2 + λ2λ3 + λ3λ1) = a² + y² + r² + 2(bx + cp + qz) = 0    ..(13)
so all three Eigen values cannot be real and non-zero. As the characteristic equation is a real cubic, it has at least one real root, and its complex roots occur in conjugate pairs.
Then one Eigen value is real and different from zero while the other two are complex conjugates of each other.    ..(14)
3.5.2 Important Derivation:
In tune with the above results, we have up till now two important results, (13) and (14).
From (13), with no λ being zero, we write λ1² + λ2² + λ3² = 0
⟹ λ2² + λ3² = −λ1², and λ2 and λ3, we conclude, are complex conjugates of each other.
Let λ2 = α + iβ and λ3 = α − iβ. So we get λ2² + λ3² = 2(α² − β²) = −λ1².
∴ −λ1² = 2(α² − β²)    ..(15)
Deductions: The results established above help deduce the following relations for the R.C matrix A = [ a b c ; x y z ; p q r ]:
[1] Trace: λ1 + λ2 + λ3 = a + y + r = T (say)
[2] Sum of the products taken two at a time: λ1λ2 + λ2λ3 + λ3λ1 = S (say)
[3] Determinant: λ1λ2λ3 = |A| = det(A) = D (say)
[4] λ1² + λ2² + λ3² = 0
[5] λ2 + λ3 = 2α = T − λ1 and λ2 − λ3 = 2iβ
[6] λ1 = T − 2α, i.e. α = (T − λ1)/2
[7] λ1(2α) + λ2λ3 = S
[8] Using λ2λ3 = D/λ1 = α² + β², we have β² = D/λ1 − α².
This conveys that knowing only the real Eigen value λ1 (together with T and D) is sufficient to write the remaining two complex conjugate Eigen values.
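The deductions can be exercised end to end: from T, D, and the lone real Eigen value, the conjugate pair is rebuilt and compared against a direct computation (a sketch; the variable names are mine):

```python
import numpy as np

A = np.array([[1, 2, -3], [1, 2, -6], [1, 1, 3]], dtype=float)
T, D = np.trace(A), np.linalg.det(A)
lam = np.linalg.eigvals(A)
lam1 = lam[np.argmin(np.abs(lam.imag))].real   # the single real Eigen value

alpha = (T - lam1) / 2                # alpha = (T - lam1)/2
beta = np.sqrt(D / lam1 - alpha**2)   # beta^2 = lam2*lam3 - alpha^2 = D/lam1 - alpha^2

rebuilt = np.sort_complex(np.array([lam1, alpha + 1j * beta, alpha - 1j * beta]))
assert np.allclose(rebuilt, np.sort_complex(lam), atol=1e-8)
```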
[9] Fact: in a given R.C matrix there exists at least one column or row, say C, such that C = c·C′ for some non-zero real value 'c'; i.e. the column or the row is a real constant multiple of a simpler vector C′.
For A = [ 1 2 −3 ; 1 2 −6 ; 1 1 3 ], the third column C3 = (−3, −6, 3)ᵀ = 3(−1, −2, 1)ᵀ.
3.5.3 Graphical Method of Approximating the Real Eigen Value:
By now it is well established that a non-null R.C matrix has only one non-zero real Eigen value, while the remaining two are complex conjugates. [At this stage we reiterate that a real-entry R.C matrix can be neither symmetric nor skew-symmetric.]
The aim of this section is to locate graphically, and approximate algebraically, the real root of the characteristic equation of the given R.C matrix. As we have discussed many properties inter-linking the different Eigen values of a given R.C matrix, we state here what we shall require. From the relations above, the characteristic equation is
λ³ − Tλ² + Sλ − D = 0
For the real Eigen root λ1, the graph of f(λ) = λ³ − Tλ² + Sλ − D, drawn on a set of perpendicular real axes, will intersect the horizontal axis in a point, say x1 = λ1; its location is our objective.
For λ = 0, f(λ) = −D, where, as said earlier, D = det A = |A| = λ1λ2λ3, and λ2, λ3 are the complex Eigen values. Plotting this, we get the graph of a cubic curve. We parallel our work by citing a real R.C matrix.
Let A = [ 1 2 −3 ; 1 2 −6 ; 1 1 3 ], with T = trace = 6, S = 18, and D = −3.
f(λ) = λ³ − 6λ² + 18λ + 3; for λ = 0, f(λ) = 3. This situation is graphed below in Figure 3.5.1. Figure 3.5.2 shows its magnification on an interval about its intersection with the X-axis.
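The graphical reading of the X-intercept can be imitated by bisection on f(λ) (a sketch of mine; the bracket [−1, 0] comes from f(−1) < 0 < f(0) = 3):

```python
# f is strictly increasing here (f'(x) = 3x^2 - 12x + 18 has no real roots),
# so the sign change on [-1, 0] brackets the unique real root.
def f(lam):
    return lam**3 - 6 * lam**2 + 18 * lam + 3

lo, hi = -1.0, 0.0
for _ in range(60):
    mid = (lo + hi) / 2
    if f(mid) < 0:
        lo = mid
    else:
        hi = mid
root = (lo + hi) / 2
print(round(root, 4))  # -0.1581
```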
While we derived and critically reviewed the characteristics of R.C matrices of dimensions 2 × 2 and onwards, we found many interesting features. We admit that we have searched only a small area, and we continue our efforts, inspired by every new result we work upon. Excavating such unknown areas may engage mathematically ignited minds. All constructive suggestions are welcome.
Annexure: As discussed, in this section we elaborate the technique of extending a 2 × 2 R.C matrix to R.C matrices of higher order. We begin with a simple R.C matrix of order 2 × 2.
Let us consider A1 = [ 1  −1
                       1   1 ].
Let us consider the corresponding column system as the lines y = (1)x + 1 and y = (−1)x + 1, which shows perpendicular lines in R² space.
The entries are planted in different positions along with new real values; it is so planned that they satisfy the R.C property. Writing the extended 3 × 3 matrix as [ a b c ; x y z ; p q r ], we have
a² + bx + cp = 0,  bx + y² + qz = 0,  and  cp + qz + r² = 0.
This gives us a free choice for the selection of the variables remaining within the given equations. We select convenient real values for the free variables.
The extended version of the R.C matrix is now of order 3 × 3. [The printed specimen and its entries are not recoverable from the source.] Again, on the same lines, this can be extended to an R.C matrix of size 4 × 4.
For example, if we consider an R.C matrix B [its printed entries are not recoverable from the source], then the Eigen values are obtained from
|B − λI| = 0 ⇒ p(λ) = λ³ − 8λ² + 32λ + 128 = 0
Step 01:
The initial approximation of the real Eigen value is taken as λ0 = 0. This initial approximation will always work, as we know that any real non-null R.C matrix has one real Eigen value and a pair of complex conjugate Eigen values; and since the sum of the squares of the Eigen values of an R.C matrix is always zero, all the Eigen values of an R.C matrix are zero only if the real Eigen value is zero.
p(0) = 128
This gives the height 'h' of the characteristic curve in the XY plane at λ = 0, which is always non-zero, as discussed above.
The characteristic equation is a cubic polynomial, and using the nature of its graph we can claim a point on the curve with height '−h' (same magnitude but opposite in sign):
p(λ) = −128 ⇒ λ³ − 8λ² + 32λ + 128 = −128
So the equation is always consistent and gives a real solution, λ0* ≈ −3.52470; averaging it with λ0 gives the first approximation λ1 = −1.76235.
Step 02:
Now we iterate the same process with input λ1 = −1.76235 into the characteristic equation:
p(−1.76235) = 41.28413657
Claiming the point of opposite height, p(λ) = −41.28413657, the equation is again consistent and gives a real solution, λ1* = −2.74987.
Then the average of these two approximations is the second approximation:
λ2 = (λ1 + λ1*)/2 = ((−1.76235) + (−2.74987))/2 ⇒ λ2 = −2.25611
Step 03:
Now we iterate the same process with input λ2 = −2.25611 into the characteristic equation:
p(−2.25611) = 3.60054850
Claiming the point of opposite height, p(λ) = −3.60054850, the equation gives a real solution λ2* ≈ −2.34119, and averaging yields the third approximation λ3 = −2.29865.
Step 04:
Now we repeat the same process with input λ3 = −2.29865 into the characteristic equation:
p(−2.29865) = 0.02727735
Claiming the point of opposite height, p(λ) = −0.02727735, the equation gives a real solution λ3* ≈ −2.29929, and averaging yields the fourth approximation λ4 = −2.29897.
The successive approximations are λ1 = −1.76235, λ2 = −2.25611, λ3 = −2.29865, and λ4 = −2.29897.
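The steps above can be sketched programmatically; the opposite-height averaging scheme is the text's, while the bisection sub-solver and its bracket are my own choices (p is strictly increasing, so each auxiliary equation has exactly one real solution):

```python
def p(lam):
    return lam**3 - 8 * lam**2 + 32 * lam + 128

def solve(target, lo=-10.0, hi=10.0):
    # p'(x) = 3x^2 - 16x + 32 has no real roots, so p is strictly
    # increasing and p(x) = target has exactly one real solution.
    for _ in range(80):
        mid = (lo + hi) / 2
        if p(mid) < target:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

lam = 0.0                     # initial approximation, lambda_0 = 0
for _ in range(4):
    mirror = solve(-p(lam))   # the point of height -p(lambda_n) on the curve
    lam = (lam + mirror) / 2  # average the two approximations
print(round(lam, 5))          # -2.29897
```

Small rounding differences aside, the iterates track the text's sequence and settle on the real Eigen value near −2.29897.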
Conclusion:
Four approximations give a solution accurate to three decimal places. For further accuracy we can repeat the same steps.