
Cady Bright

30.1.17

Invertible Matrix Theorem Proof

1. 2x2 - Invertible
I have a square 2x2 matrix A whose first row is (a, b) and second row is (c, d), and whose entries are constrained only by the following:
If a is 0, then b and c are not 0;
If b is 0, then a and d are not 0;
If c is 0, then a and d are not 0;
If d is 0, then b and c are not 0;
And no row or column may be equal to, or a scalar multiple of, another.
Then, when I take the determinant, det(A) = ad - bc, the result is not equal to 0, because there are no zero rows or zero columns and the products ad and bc do not cancel each other out.
Therefore, when I take the inverse of matrix A (by swapping the a and d entries, negating the b and c entries, and scaling every entry by 1/det(A)), the result is well defined. It would not be if the determinant were equal to zero, because every entry would then have to be scaled by 1/0, which does not exist.
This inverse is verified by multiplying it by matrix A: the product is the identity matrix. This in turn shows both that matrix A has two pivot positions (the two 1 entries of the identity matrix) and that it can be row reduced to the identity matrix.
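As a quick check of the rule just described, the following is a minimal symbolic sketch (using sympy, which is not part of the original write-up) confirming that swapping a and d, negating b and c, and scaling by 1/det(A) produces the inverse, and that multiplying it by A gives the identity:

import sympy as sp

a, b, c, d = sp.symbols('a b c d')
A = sp.Matrix([[a, b], [c, d]])
det_A = A.det()                          # a*d - b*c

# Swap a and d, negate b and c, scale every entry by 1/det(A):
A_inv = sp.Matrix([[d, -b], [-c, a]]) / det_A

print(sp.simplify(A * A_inv))            # Matrix([[1, 0], [0, 1]])
print(sp.simplify(A_inv - A.inv()))      # Matrix([[0, 0], [0, 0]])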
This reasoning, that if one of the above statements holds for a square matrix then all of them do, can be seen in the matrix below:

Matrix A = [ ]
Here det(A) = -20, and A is row equivalent to the identity matrix through the following row-reduction process:

= [ ]  R1: 2R2 - R1

= [ ]  R2: 3/5R1 - R2

= [ ]  R1: 1/10R1, R2: 1/2R2

= [ ]
There are two pivot positions - both 1 entries in the above identity matrix. The transpose of A is also an invertible matrix, which can be shown through the same process:

A^T = [ ]
Here det(A^T) = -20, and A^T is row equivalent to the identity matrix through the following row-reduction process:

= [ ]  R2: 2R1 - R2

= [ ]  R1: 3/5R2 - R1

= [ ]  R2: 1/10R2, R1: 1/2R1

= [ ]
There are two pivot positions - both 1 entries in the above identity matrix. Finally, A^T has an inverse matrix:

Matrix (A^T)^-1 = [ ]
Matrix A also has an inverse matrix:

Matrix A^-1 = [ ]
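Since the entries of the matrix above are not reproduced here, the sketch below uses a hypothetical stand-in with the same determinant (-20) to check the whole chain of statements numerically with numpy (not part of the original write-up):

import numpy as np

# Hypothetical stand-in for matrix A, chosen so that det(A) = -20:
A = np.array([[2.0, 6.0],
              [3.0, -1.0]])

print(np.linalg.det(A))                          # -20.0 (up to floating point)
A_inv = np.linalg.inv(A)                         # exists because det(A) != 0
print(np.allclose(A @ A_inv, np.eye(2)))         # True: A * A^-1 = I
print(np.allclose(np.linalg.inv(A.T), A_inv.T))  # True: A^T is also invertible
print(np.linalg.matrix_rank(A))                  # 2 - a pivot position in every row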
2. 2x2 - Not Invertible
By the same reasoning as above, if one of the above statements is incorrect for a square matrix, then all of them are. This can be seen in the matrix below:

Matrix A = [ ]
The determinant of this matrix is det(A) = 0, and it is not row equivalent to the identity matrix: when you try to row-reduce it, you get a zero row:

= [ ]  R1: 2R2 - R1

= [ ]
Thus, A does not have a pivot position in every row. The transpose of A is also not an invertible matrix, which can be shown using the same process:

A^T = [ ]
In this matrix, det(A^T) = 0, and it is also not row equivalent to the identity matrix:

= [ ]  R2: 4R1 - R2

= [ ]
There is also not a pivot position in every row of A^T. Finally, A^T has no inverse matrix:

Matrix (A^T)^-1 = [ ]


Matrix A also has no inverse matrix:

Matrix A^-1 = [ ]
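A hypothetical singular stand-in (again checked with numpy, which is not part of the original write-up) shows the same chain of statements failing together once one row is a scalar multiple of another:

import numpy as np

# Hypothetical singular stand-in: the second row is 2 times the first.
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])

print(np.linalg.det(A))            # 0.0
print(np.linalg.matrix_rank(A))    # 1 - no pivot position in the second row
try:
    np.linalg.inv(A)
except np.linalg.LinAlgError as err:
    print("no inverse:", err)      # "Singular matrix"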
3. 3x3 - Invertible

I have a square 3x3 matrix A whose rows are (a, b, c), (d, e, f), and (g, h, i), and whose entries are constrained only by the following:
If a is 0, then b, c, d, and g are not 0;
If b is 0, then a, c, e, and h are not 0;
If c is 0, then a, b, f, and i are not 0;
If d is 0, then a, g, e, and f are not 0;
If e is 0, then b, h, d, and f are not 0;
If f is 0, then c, i, d, and e are not 0;
If g is 0, then a, d, h, and i are not 0;
If h is 0, then b, e, g, and i are not 0;
If i is 0, then g, h, f, and c are not 0;
And no row or column may be equal to, or a scalar multiple of, another.
Then, when I take the determinant, det(A) = a(ei - fh) - b(di - fg) + c(dh - eg), the result is not equal to 0, because there are no zero rows or zero columns and the terms do not cancel each other out.
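As a small sanity check of this cofactor expansion, here is a sketch (using a hypothetical 3x3 example, not the matrix used later in this section) comparing the formula against numpy's determinant:

import numpy as np

def det3(m):
    # Cofactor expansion along the first row, matching the formula above:
    # det(A) = a(ei - fh) - b(di - fg) + c(dh - eg)
    (a, b, c), (d, e, f), (g, h, i) = m
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

M = [[1, 2, 0], [3, 1, 4], [2, 0, 5]]         # hypothetical example
print(det3(M))                                # -9
print(np.linalg.det(np.array(M, float)))      # -9.0, up to floating point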
Therefore, since det(A) is not 0, the inverse of matrix A is well defined. It would not be if the determinant were equal to zero, because every entry would then have to be scaled by 1/0, which does not exist.
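The "1/det(A)" scaling relied on here can be made explicit with a standard identity that the text does not state outright: A times its adjugate (the transposed matrix of cofactors) equals det(A) times the identity, so A^-1 = adjugate(A)/det(A) whenever det(A) is not 0. A symbolic check:

import sympy as sp

a, b, c, d, e, f, g, h, i = sp.symbols('a b c d e f g h i')
A = sp.Matrix([[a, b, c], [d, e, f], [g, h, i]])

# A * adjugate(A) expands to det(A) * I, so dividing the adjugate by det(A)
# (when det(A) != 0) gives the inverse:
print(sp.expand(A * A.adjugate() - A.det() * sp.eye(3)))  # zero matrix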
This inverse is verified by multiplying it by matrix A: the product is the identity matrix. This in turn shows both that matrix A has three pivot positions (the three 1 entries of the identity matrix) and that it can be row reduced to the identity matrix.
This reasoning, that if one of the above statements holds for a square matrix then all of them do, can be seen in the matrix below:

Matrix A = [ ]
Here det(A) = 13, and A is row equivalent to the identity matrix through the following row-reduction process:

= [ ]  R2: 2R1 + R3

= [ ]  R1: R2 - R1

= [ ]  R3: 6R2 - R3

= [ ]  R3 <-> R2

= [ ]  R3: 39/13R2 - R3

= [ ]  R2: 1/13R2

= [ ]  R1: R2 + R1

= [ ]
There are three pivot positions - all of the 1 entries in the above identity matrix. The transpose of A is also an invertible matrix, which can be shown through the same process:

A^T = [ ]
Here det(A^T) = 13, and A^T is row equivalent to the identity matrix through the following row-reduction process:

= [ ]  R3: 2R1 + R3

= [ ]  R1: 3/5R2 - R1

= [ ]  R2 <-> R3

= [ ]  R3: 3R2 + R1

= [ ]  R1: 2R3 + R1

= [ ]  R1: 1/13R1

= [ ]  R2: 3R1 - R2

= [ ]  R3: 7R1 + R3

= [ ]
There are three pivot positions - all of the 1 entries in the above identity matrix. Finally, A^T has an inverse matrix:

Matrix (A^T)^-1 = [ ]
Matrix A also has an inverse matrix:

Matrix A^-1 = [ ]
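As with the 2x2 case, the original entries are not reproduced here, so the sketch below checks the same chain of statements on a hypothetical 3x3 stand-in with the same determinant (13), this time using sympy's exact row reduction:

import sympy as sp

# Hypothetical stand-in for the 3x3 matrix A, chosen so that det(A) = 13:
A = sp.Matrix([[2, 1, 0],
               [1, 3, 1],
               [0, 1, 3]])

print(A.det())                   # 13
rref, pivots = A.rref()          # reduced row echelon form and pivot columns
print(rref)                      # the 3x3 identity matrix
print(len(pivots))               # 3 - a pivot position in every row
print(A.T.inv() == A.inv().T)    # True: A^T is invertible as well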
4. 3x3 - Not Invertible
By the same reasoning as above, if one of the above statements is incorrect for a square matrix, then all of them are. This can be seen in the matrix below:

Matrix A = [ ]
Here det(A) = 0, and A is not row equivalent to the identity matrix, as the following row reduction shows:

= [ ]  R1: R1 - R3

= [ ]
Thus, A does not have a pivot position in every row. The transpose of A is also not an invertible matrix, which can be shown using the same process:

A^T = [ ]
Here det(A^T) = 0, and A^T is not row equivalent to the identity matrix, as the following row reduction shows:
= [ ]  R3: -4R1 + R3

= [ ]  R2: 2R3 + R2

= [ ]  R2: 1/3R2, R3: 1/3R3

= [ ]  R1: R1 - R3

= [ ]  R1: 2R2 - R1

= [ ]
There is also not a pivot position in every row of A^T. Finally, A^T has no inverse matrix:

Matrix (A^T)^-1 = [ ]
Matrix A also has no inverse matrix:

Matrix A^-1 = [ ]
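A hypothetical singular 3x3 stand-in (checked with numpy, which is not part of the original write-up) shows all of the statements failing together once the rows are linearly dependent:

import numpy as np

# Hypothetical singular stand-in: the third row is the first row plus the second.
A = np.array([[1.0, 2.0, 3.0],
              [0.0, 1.0, 1.0],
              [1.0, 3.0, 4.0]])

print(np.linalg.det(A))            # 0.0 (up to floating point)
print(np.linalg.matrix_rank(A))    # 2 - no pivot position in the last row
try:
    np.linalg.inv(A)
except np.linalg.LinAlgError as err:
    print("no inverse:", err)      # "Singular matrix"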
