
BUSINESS MATHEMATICS AND STATISTICS

Ques. 1.
a. What is a matrix? Explain the addition and multiplication of matrices, giving an example of each.
b. Find the inverse of the following matrix:

        | 2  4  6 |
    A = | 1  5  8 |
        | 3  4  6 |
Answer 1:
a. What is a matrix? Explain the addition and multiplication of matrices, giving an example of each.
Answer a:
A matrix is a rectangular array of entries or elements, which can be variables, constants, functions, etc. A matrix is denoted by an uppercase letter, sometimes with a subscript which gives the number of rows by the number of columns in the matrix. For example, A_mn denotes a matrix with the name A, which has m rows and n columns. The entries in a matrix are denoted by the name of the matrix in lowercase, with subscripts which identify which row and column the entry is from: an entry denoted a_ij is in row i, column j. For example, an entry denoted a_23 would be in the second row, in the third column (counting from the upper left, of course). Two matrices are equal when their corresponding elements are equal; symbolically, (a_ij) = (b_ij) if and only if a_ij = b_ij for all i, j. The entries in the matrix are usually enclosed in rounded brackets, although they may also be enclosed in square brackets.
The following are examples of matrices:

        | 2  1  7 |        | 2 |         | 1  0 |            | a11 .... a1n |
    A = |-8  0  2 |,   B = | 2 |,   I2 = | 0  1 |,  C(mxn) = | ............ |
                           | 1 |                             | am1 .... amn |
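These shapes are easy to experiment with in code; the following sketch (ours, not part of the original answer, and assuming NumPy is available) builds matrices like the examples above and prints their sizes:

    import numpy as np

    A = np.array([[2, 1, 7], [-8, 0, 2]])   # 2x3 matrix
    B = np.array([[2], [2], [1]])           # 3x1 column matrix
    I2 = np.eye(2)                          # 2x2 identity matrix
    print(A.shape, B.shape, I2.shape)       # (2, 3) (3, 1) (2, 2)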
There are some special types of matrices. A square matrix has the same number of rows as columns, and is usually denoted A_nxn. A diagonal matrix is a square matrix with entries only along the diagonal, with all others being zero. A diagonal matrix whose diagonal entries are all 1 is called an identity matrix. The identity matrix is denoted I_n, or simply I. The zero matrix 0_mn is a matrix with m rows and n columns of all zeroes.
Given two matrices A and B, they are considered equal (A = B) if they are the same size, with the exact same entries in the same locations in the matrices.
Matrix Operations:
Under limited circumstances matrices can be added, subtracted, and multiplied. Two matrices can be added or subtracted only if they are the same size. Then (a_ij) ± (b_ij) = (a_ij ± b_ij), which says that the sum or difference of two matrices is the matrix formed by adding or subtracting the corresponding elements.
These rules for adding and subtracting matrices give matrix addition the same properties as ordinary addition and subtraction. It is closed (among matrices of the same size), commutative, and associative. There is an additive identity (the matrix consisting entirely of zeros) and an additive inverse.
Multiplication is much trickier. For multiplication to be possible, the matrix on the left must have as many columns as the matrix on the right has rows. That is, one can multiply an m x n matrix by an n x p matrix, but not an m x n matrix by a p x q matrix if p is not equal to n. The product of an m x n matrix and an n x p matrix will be an m x p matrix.
Let's understand them in detail as follows:
ADDITION OF MATRICES:
Addition of matrices is very similar to addition of vectors. In fact, a vector can generally be considered as a one-column matrix, with n rows corresponding to the n dimensions of the vector. In order to add matrices, they must be the same size, that is, they must have an equal number of rows and an equal number of columns. We then add matching elements as shown below:

        | 2  1 |        | 0  2 |
    A = |-4  3 |,   B = | 1 -3 |
        | 2 -2 |        | 3 -2 |

            | 2+0    1+2    |   |  2  3 |
    A + B = |-4+1    3+(-3) | = | -3  0 |
            | 2+3   -2+(-2) |   |  5 -4 |
Matrix addition has the following properties:
1) A + B = B + A (commutative)
2) A + (B + C) = (A + B) + C (associative)
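As a quick check (a sketch of ours, not part of the original answer), the addition example and the commutative property can be verified in NumPy:

    import numpy as np

    A = np.array([[2, 1], [-4, 3], [2, -2]])
    B = np.array([[0, 2], [1, -3], [3, -2]])
    print(A + B)                          # [[ 2  3] [-3  0] [ 5 -4]]
    print(np.array_equal(A + B, B + A))   # True: addition is commutative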
SCALAR MULTIPLICATION
Scalar multiplication of matrices is also similar to scalar multiplication of vectors. The scalar is multiplied by each element of the matrix, giving us a new matrix of the same size. Examples are shown below:

        | 2  1 |
    A = |-4  3 |,   k = 2,   m = -3
        | 2 -2 |

           | 2  1 |   | 2(2)    2(1)  |   |  4  2 |
    kA = 2 |-4  3 | = | 2(-4)   2(3)  | = | -8  6 |
           | 2 -2 |   | 2(2)    2(-2) |   |  4 -4 |

            | 2  1 |   | -3(2)    -3(1)  |   | -6 -3 |
    mA = -3 |-4  3 | = | -3(-4)   -3(3)  | = | 12 -9 |
            | 2 -2 |   | -3(2)    -3(-2) |   | -6  6 |

Scalar multiplication has the following properties:
1) c(A + B) = cA + cB (distributive), where c is a scalar
2) (c + d)A = cA + dA (distributive), where c, d are scalars
3) c(dA) = (cd)A
Matrix subtraction, similar to vector subtraction, can be performed by multiplying the matrix to be subtracted by the scalar -1, and then adding it. So, A - B = A + (-B) = (-B) + A. So, like adding matrices, subtracting matrices requires them to be the same size, and then operating on the elements of the matrices.
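Scalar multiplication and subtraction follow the same pattern in code; this short sketch (our illustration, reusing the A and B above) mirrors the hand computations:

    import numpy as np

    A = np.array([[2, 1], [-4, 3], [2, -2]])
    B = np.array([[0, 2], [1, -3], [3, -2]])
    print(2 * A)    # [[ 4  2] [-8  6] [ 4 -4]]
    print(-3 * A)   # [[-6 -3] [12 -9] [-6  6]]
    print(np.array_equal(A - B, A + (-1) * B))  # True: subtraction is adding -1 times B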
MATRIX MULTIPLICATION
Two matrices can also be multiplied to find their product. In order to multiply two matrices, the number of columns in the first matrix must equal the number of rows in the second matrix. So if we have A_2x3 and B_3x4, then the product AB exists, while the product BA does not. This is one of the most important things to remember about matrix multiplication. Matrix multiplication is not commutative. That is, AB ≠ BA. Even when both products exist, they do not have to be (and are not usually) equal. Additional properties of matrix multiplication are shown below.
Matrix multiplication involves multiplying entries along the rows of the first matrix with entries along the columns of the second matrix. For example, to find the entry in the first row and first column of the product AB, we would take entries from the first row of A with the first column from B. We take the first entry in that row, and multiply (regular multiplication of real numbers) it with the first entry in the column in the second matrix. We do that with each entry in the row/column, and add them together. So, entry (AB)_ij = a_i1 b_1j + a_i2 b_2j + ... + a_in b_nj. This seems complicated, but it is fairly easy to see visually. We continue this process for each entry in the product matrix, multiplying respective rows in A by columns in B. So, if the size of A is m x n, and the size of B is n x p, then the size of the product AB is m x p. We show this process in the example below.
We now show some properties of matrix multiplication, followed by an example:
1) A(BC) = (AB)C (associative)
2) A(B + C) = AB + AC (left distributive)
3) (A + B)C = AC + BC (right distributive)
4) k(AB) = (kA)B = A(kB), where k is a scalar
5) AB ≠ BA (not commutative)
Example:
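The worked example here did not survive reproduction, so the following is a small product of our own choosing, with a NumPy check. A is 2x3 and B is 3x2, so AB exists and is 2x2:

    import numpy as np

    A = np.array([[1, 2, 3], [4, 5, 6]])       # 2x3
    B = np.array([[7, 8], [9, 10], [11, 12]])  # 3x2
    print(A @ B)   # [[ 58  64] [139 154]], e.g. 58 = 1*7 + 2*9 + 3*11

Here BA also happens to exist (3x2 times 2x3 gives 3x3), but it is a different matrix of a different size, which illustrates property 5.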
Matrix multiplication can also be written in exponent form. This requires that we have a square matrix. Like real number multiplication and exponents, A^n means that we multiply A together n times. So A^2 = AA, A^5 = AAAAA, and so on. We should note, however, that unlike real number multiplication, A^2 = 0 does not imply that A = 0. The same is true for higher exponents.
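A minimal illustration of that last point (the matrix is our example, not from the original): the nonzero matrix N below squares to the zero matrix:

    import numpy as np

    N = np.array([[0, 1], [0, 0]])   # nonzero matrix
    print(N @ N)                     # [[0 0] [0 0]]: N^2 = 0 although N != 0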
Question 1:
b. Find the inverse of the following matrix:

        | 2  4  6 |
    A = | 1  5  8 |
        | 3  4  6 |
Answer:

    |A| = Det A = 2 | 5  8 | - 4 | 1  8 | + 6 | 1  4 |
                    | 4  6 |     | 3  6 |     | 3  5 |

    |A| = 2 (5x6 - 4x8) - 4 (1x6 - 3x8) + 6 (1x4 - 3x5)
        = 2 (30 - 32) - 4 (6 - 24) + 6 (4 - 15)
        = 2 (-2) - 4 (-18) + 6 (-11)
        = -4 + 72 - 66
    |A| = 2

The matrix of cofactors is:

        | +(5x6 - 4x8)   -(1x6 - 3x8)   +(1x4 - 3x5) |   | -2   18  -11 |
    C = | -(4x6 - 4x6)   +(2x6 - 3x6)   -(2x4 - 3x4) | = |  0   -6    4 |
        | +(4x8 - 5x6)   -(2x8 - 1x6)   +(2x5 - 1x4) |   |  2  -10    6 |

Transposing C gives the adjoint, and A^-1 = (1/|A|) adj A:

                   | -2    0    2 |   | -2/2    0/2    2/2 |
    A^-1 = (1/2) x | 18   -6  -10 | = | 18/2   -6/2  -10/2 |
                   | -11   4    6 |   | -11/2   4/2    6/2 |

Hence, the inverse of the given matrix is:

           |  -1      0    1 |
    A^-1 = |   9     -3   -5 |
           | -11/2    2    3 |
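As a numerical check (a sketch of ours, not part of the original answer), NumPy reproduces both the determinant and the inverse:

    import numpy as np

    A = np.array([[2, 4, 6], [1, 5, 8], [3, 4, 6]])
    print(np.linalg.det(A))   # 2.0 (up to floating-point error)
    print(np.linalg.inv(A))   # [[-1. 0. 1.] [ 9. -3. -5.] [-5.5 2. 3.]]

Note that -11/2 appears as -5.5 in the floating-point output.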
Question 2:
a. Explain the correlation between two variables. How is it measured?
b. The following table gives the sales in lakh rupees and the profits in lakh rupees. Find Pearson's correlation coefficient.

    X = Sales:     20  26  32  44  55  60  72  80  90  102
    Y = Profits:    4   6   6   8  10  11  13  14  22   26

Also form the regression equations. What is the expected profit if the sales is 150 lakhs?
Answers:
a. Explain the correlation between two variables. How is it measured?
Answer a:
A correlation is the measurement of the relationship between two variables. These variables already occur in the group or population and are not controlled by the experimenter. Correlation is a statistical measurement of the relationship between two variables. Possible correlations range from +1 to -1. Correlation is a statistical technique that can show whether and how strongly pairs of variables are related.
For example, height and weight are related; taller people tend to be heavier than shorter people. The relationship isn't perfect. People of the same height vary in weight, and you can easily think of two people you know where the shorter one is heavier than the taller one. Nonetheless, the average weight of people 5'5" is less than the average weight of people 5'6", and their average weight is less than that of people 5'7", etc. Correlation can tell you just how much of the variation in peoples' weights is related to their heights.
There are several different correlation techniques. The Survey System's optional Statistics Module includes the most common type, called the Pearson or product-moment correlation. The module also includes a variation on this type called partial correlation. The latter is useful when you want to look at the relationship between two variables while removing the effect of one or two other variables.
The main result of a correlation is called the correlation coefficient (or "r"). It ranges from -1.0 to +1.0. The closer r is to +1 or -1, the more closely the two variables are related.
If r is close to 0, it means there is no relationship between the variables.
If r is positive, it means that as one variable gets larger the other gets larger. A positive correlation is a direct relationship where, as the amount of one variable increases, the amount of a second variable also increases.
If r is negative, it means that as one gets larger, the other gets smaller (often called an "inverse" correlation). In a negative correlation, as the amount of one variable goes up, the levels of another variable go down.
While correlation coefficients are normally reported as r = (a value between -1 and +1), squaring them makes them easier to understand. The square of the coefficient (or r square) is equal to the percent of the variation in one variable that is related to the variation in the other. After squaring r, ignore the decimal point. For example, an r of .5 means 25% of the variation is related (.5 squared = .25). An r value of .7 means 49% of the variance is related (.7 squared = .49).
A correlation report can also show a second result of each test - statistical significance. In this case, the significance level will tell you how likely it is that the correlations reported may be due to chance in the form of random sampling error. If you are working with small sample sizes, choose a report format that includes the significance level. This format also reports the sample size.
A key thing to remember when working with correlations is never to assume a correlation means that a change in one variable causes a change in another. Sales of personal computers and athletic shoes have both risen strongly in the last several years and there is a high correlation between them, but you cannot assume that buying computers causes people to buy athletic shoes (or vice versa).
The second caveat is that the Pearson correlation technique works best with linear relationships: as one variable gets larger, the other gets larger (or smaller) in direct proportion. It does not work well with curvilinear relationships (in which the relationship does not follow a straight line). An example of a curvilinear relationship is age and health care. They are related, but the relationship doesn't follow a straight line. Young children and older people both tend to use much more health care than teenagers or young adults. Multiple regression (also included in the Statistics Module) can be used to examine curvilinear relationships, but it is beyond the scope of this article.
The Pearson Product-Moment Correlation Coefficient (r), or correlation coefficient for short, is a measure of the degree of linear relationship between two variables, usually labelled X and Y. While in regression the emphasis is on predicting one variable from the other, in correlation the emphasis is on the degree to which a linear model may describe the relationship between two variables. In regression the interest is directional: one variable is predicted and the other is the predictor. In correlation the interest is non-directional: the relationship is the critical aspect.
The computation of the correlation coefficient is most easily accomplished with the aid of a statistical calculator. The value of r was found on a statistical calculator during the estimation of regression parameters in the last chapter. Although definitional formulas will be given later in this chapter, the reader is encouraged to review the procedure to obtain the correlation coefficient on the calculator at this time.
The correlation coefficient may take on any value between plus and minus one.
The sign of the correlation coefficient (+, -) defines the direction of the relationship, either positive or negative.
UNDERSTANDING AND INTERPRETING THE CORRELATION COEFFICIENT
The correlation coefficient may be understood by various means, each of which will now be examined in turn.
Scatter plots:
The scatter plots presented below perhaps best illustrate how the correlation coefficient changes as the linear relationship between the two variables is altered. When r = 0.0 the points scatter widely about the plot, the majority falling roughly in the shape of a circle. As the linear relationship increases, the circle becomes more and more elliptical in shape until the limiting case is reached (r = 1.00 or r = -1.00) and all the points fall on a straight line.
A number of scatter plots and their associated correlation coefficients are presented below in order that it may better help in estimating the value of the correlation coefficient based on a scatter plot in the associated computer exercise.
[Scatter plots shown for r = 1.00, r = -.54, and r = +.85.]
Slope of the Regression Line of z-scores:
The correlation coefficient is the slope (b) of the regression line when both the X and Y variables have been converted to z-scores. The larger the size of the correlation coefficient, the steeper is the slope. This is related to the difference between the intuitive regression line and the actual regression line discussed above.
Variance Interpretation:
The squared correlation coefficient (r^2) is the proportion of variance in Y that can be accounted for by knowing X. Conversely, it is the proportion of variance in X that can be accounted for by knowing Y. One of the most important properties of variance is that it may be partitioned into separate additive parts.
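For instance, with the r = 0.85 example worked below, r^2 = 0.85 x 0.85 = 0.7225, so roughly 72% of the variance in Y is accounted for by knowing X.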
CALCULATION OF THE CORRELATION COEFFICIENT
The easiest method of computing a correlation coefficient is to use a statistical calculator or computer program. Barring that, the correlation coefficient may be computed using the following definitional formula, stated in its z-score form:

    r = SUM(z_X * z_Y) / (N - 1)

Computation using this formula is demonstrated below on some example data. Computation is rarely done in this manner and is provided as an example of the application of the definitional formula, although this formula provides little insight into the meaning of the correlation coefficient.
    X     Y     z_X      z_Y      z_X * z_Y
    12    33    -1.07    -0.61     0.65
    15    31    -0.70    -1.38     0.97
    19    35    -0.20     0.15    -0.03
    25    37     0.55     0.92     0.51
    32    37     1.42     0.92     1.31
                          SUM  =   3.40

So r = 3.40 / (5 - 1) = 0.85.
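The same demonstration can be scripted; this sketch (ours, not part of the original text) recomputes the z-scores with the sample standard deviation and checks the result against NumPy's built-in correlation:

    import numpy as np

    x = np.array([12, 15, 19, 25, 32])
    y = np.array([33, 31, 35, 37, 37])
    z_x = (x - x.mean()) / x.std(ddof=1)   # ddof=1: sample standard deviation
    z_y = (y - y.mean()) / y.std(ddof=1)
    r = (z_x * z_y).sum() / (len(x) - 1)
    print(round(r, 2))                     # 0.85
    print(np.corrcoef(x, y)[0, 1])         # ~0.851, the library check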
b. The following table gives the sales in lakh rupees and the profits in lakh rupees. Find Pearson's correlation coefficient.

    X = Sales:     20  26  32  44  55  60  72  80  90  102
    Y = Profits:    4   6   6   8  10  11  13  14  22   26

Also form the regression equations. What is the expected profit if the sales is 150 lakhs?
Answer b:
N = 10

    X (Sales)   Y (Profits)     X^2     Y^2      X*Y
       20            4           400      16       80
       26            6           676      36      156
       32            6          1024      36      192
       44            8          1936      64      352
       55           10          3025     100      550
       60           11          3600     121      660
       72           13          5184     169      936
       80           14          6400     196     1120
       90           22          8100     484     1980
      102           26         10404     676     2652
    Total 581      120         40749    1898     8678
;o, we plug the numbers from this table into the formula, and do the math7
! J&(.+='+* (:+&.&$("K@;#rt JL&(.9('9>*(:+&"
$
ML&(.&+>+ B(&$("
$
MK
! J&'(=(K @ s#rtJ=>>$>.9:+(K
! &'(=( @ &'+>=.$$%=$
r
x2
- ?.D.
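The raw-score arithmetic above is easy to mirror in code; this sketch (ours, not part of the original answer) reproduces r directly from the data:

    import numpy as np

    x = np.array([20, 26, 32, 44, 55, 60, 72, 80, 90, 102])
    y = np.array([4, 6, 6, 8, 10, 11, 13, 14, 22, 26])
    n = len(x)
    num = n * (x * y).sum() - x.sum() * y.sum()          # 17060
    den = np.sqrt((n * (x**2).sum() - x.sum()**2)
                  * (n * (y**2).sum() - y.sum()**2))     # 17896.22...
    print(num / den)                                     # ~0.9533, i.e. 0.95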
Regression equation:

    X (Sales)  Y (Profits)  X-Xbar  Y-Ybar  (X-Xbar)^2  (Y-Ybar)^2  (X-Xbar)*(Y-Ybar)
       20           4        -38.1    -8      1451.61       64            304.8
       26           6        -32.1    -6      1030.41       36            192.6
       32           6        -26.1    -6       681.21       36            156.6
       44           8        -14.1    -4       198.81       16             56.4
       55          10         -3.1    -2         9.61        4              6.2
       60          11          1.9    -1         3.61        1             -1.9
       72          13         13.9     1       193.21        1             13.9
       80          14         21.9     2       479.61        4             43.8
       90          22         31.9    10      1017.61      100            319
      102          26         43.9    14      1927.21      196            614.6
    Total 581     120                         6992.9       458           1706
    Mean   58.1    12
Regression equation formula:
The regression equation is a linear equation of the form: Y = b0 + b1*X.
Where, b1 = [SUM (x - xbar)(y - ybar)] / [SUM (x - xbar)^2]
          = [1706 / 6992.9] = 0.244
       b0 = ybar - (b1 * xbar)
          = 12 - (0.244 * 58.1) = -2.174
Therefore, the regression equation is as follows:
Y = -2.174 + 0.244 * X
What is the expected profit if the sales is 150 lakhs?
Answer: Y = -2.174 + 0.244 * 150
          = -2.174 + 36.6
Y = 34.43 lakhs, i.e., the expected profit.
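A quick script check (ours, not part of the original answer) confirms both the fitted line and the prediction:

    import numpy as np

    x = np.array([20, 26, 32, 44, 55, 60, 72, 80, 90, 102])
    y = np.array([4, 6, 6, 8, 10, 11, 13, 14, 22, 26])
    b1, b0 = np.polyfit(x, y, 1)      # degree-1 fit returns slope first, then intercept
    print(b1, b0)                     # ~0.244, ~-2.174
    print(b0 + b1 * 150)              # ~34.42, expected profit at sales = 150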