
SETTING THE SCENE

You have been appointed Dean of Admissions at the Mesmer School of Health Care and Tonsorial Trades. Your contract stipulates that you will receive a bonus of $100,000 each year that the graduation rate exceeds 75%. Only after signing the contract do you find that the success rate for the last 5 years has averaged only 23.7%. You decide that the only way to increase this abysmal figure is to impose tighter admissions criteria, and you meet with the faculty to draw up a list of the desired attributes of successful students. They arrive at three: (1) the eyes of an eagle, (2) the hands of a woman, and (3) the soul of a Byzantine usurer. You devise a test battery for applicants, with five tests in each area, just to be sure you've covered the areas well. Unfortunately, the test battery takes 12.6 hours to administer, and you're still not sure that all of the tests in each area are tapping the right skills. Is there any way you can (1) make sure you're measuring these three areas and (2) eliminate tests that are either redundant or measuring something entirely different?

We wouldn't be asking these questions unless the answers were "yes." The techniques we cover in this chapter to solve the Dean's dilemma are principal components analysis (PCA) and factor analysis (FA). They differ from techniques we discussed earlier in one important way: no distinction is made between independent and dependent variables; all are treated equally and are based on one group of subjects. That is, the goal of these techniques is to examine the structure of the relationship among the variables, not to see how they relate to other variables, such as group membership or a set of dependent variables. For this reason, some people have referred to these techniques as "untargeted."²

To jump ahead of the story a bit, our beleaguered Dean will use these two procedures to: (1) explore the relationship among the variables, (2) see if the pattern of results can be explained by a smaller number of underlying constructs (sometimes called latent variables or factors), (3) test some hypotheses about the data, and (4) reduce the number of variables to a more manageable size.³ In the dark, distant past, around 0 BC,⁴ PCA and FA were used for quite different purposes. However, the distinction between them has gradually disappeared, and now PCA is used almost exclusively as simply the first step in FA. We'll keep using both terms because they're still around, and we'll indicate where one technique ends and the other begins.

So, let's get back to the Dean's dilemma. After searching the literature for appropriate tests to use, he comes up with the 15 listed in the box on page 130, which he administers to the 200 applicants over a 1-day period.

²"Some people" means we forgot who, and we can't find the reference.
³There are other ways these techniques can be used, but we won't go into them.
⁴That's "Before Computers."

REGRESSION AND CORRELATION


WHAT ARE 'FACTORS'?


What he hopes to find is shown in Figure 15-1: three different attributes, labeled in the large circles on the left and each tapped by five of the tests. Let's talk about the attributes for a moment. Strictly speaking, they don't really exist. You can't see or measure "Soul of a Byzantine Usurer" directly; you infer its presence from behaviors that are supposedly based on it. We expect (based on our theory of what Byzantine usurers are like) that people who have more of this attribute would charge higher interest rates, act more "Scrooge-like," overcharge more, and so on, than would people who have less of the attribute. To give another example, we can't see "intelligence"; what we see and measure are various manifestations of intelligence. If our theory of intelligence is correct, people who have more of it should have a larger vocabulary, know more facts, work out puzzles faster, and complete more school than do people with less of it. What we measure are the purported consequences of the attribute, and we say that the common thread that makes them all correlate with each other is the underlying attribute itself.

FIGURE 15-1. Three attributes (Eyes, Hands, and Soul), each measured by five tests.

In psychological jargon, we call these attributes hypothetical constructs; in statistics, they are called factors or latent variables. One purpose of PCA and FA is to determine if numerous measures (these could be paper-and-pencil tests, individual items on the tests themselves, physical characteristics, or whatever) can be explained on the basis of a smaller number of these factors. In this example, the Dean wants to know if applicants' performance on these 15 tests can be explained by the 3 underlying factors; he will use these techniques to confirm his hypothesis. In other situations, we may not know beforehand how many factors (if any) there are, and the object in doing the statistics is to determine this number. This is referred to as the exploratory use of PCA and FA.

Actually, Figure 15-1 oversimplifies the relationship between factors and variables quite a bit. If variables 1 through 5 were determined solely by the Eye of an Eagle factor, they would all yield identical results. The correlations among them would all be 1.00, and only one would need to be measured. In fact, the value of each variable is determined by two things (ignoring any measurement error): (1) the degree to which it is correlated with the factor (represented by the arrow coming from the large circles); and (2) its unique contribution, that is, what variable 1 measures that variables 2 through 5 do not, and so on (shown by the arrow from the boxes labeled U in Figure 15-2).
PRINCIPAL COMPONENTS AND FACTOR ANALYSIS

FIGURE 15-2. Adding the unique component of each variable to Figure 15-1.

FIGURE 15-3. A more accurate picture, with each factor contributing to each variable.

We can show this somewhat more complicated, but accurate, picture in Figure 15-2.

What exactly is meant by "uniqueness"?⁶ We can best define it in terms of its converse, communality. The communality of a variable can be approximated by its multiple correlation, R², with all of the other variables; that is, how much it has in common with them and can be predicted by them. The uniqueness for variable 1 is then simply (1 − R²): that portion of variable 1 that cannot be predicted by (i.e., is unrelated to) the remaining variables.

Before we go on, let's complicate the picture just a bit more. Figure 15-2 assumes that factor 1 plays a role only for variables 1 through 5, factor 2 for 6 through 10, and factor 3 for 11 through 15. In reality, each of the factors influences all of the variables to some degree, as in Figure 15-3. We've added signs of these influences only for the contribution of the first factor on the other 10 variables. Factors 2 and 3 exert a similar influence on the variables, but putting in the lines would have complicated the picture too much. What we hope to find is that the influence of the factors represented by the dashed lines is small when compared with that of the solid lines.

⁶What exactly is meant by anything? That's a question we'd best leave for the philosophers.

HOW IT'S DONE

The Correlation Matrix

As we mentioned a bit earlier, the first few steps in FA, for historical reasons, go by the name of PCA. We begin with a correlation matrix. On a technical note, we start with a correlation matrix mainly because, in our fields, the variables are each measured with very different units, so we convert all of them to standard scores. If the variables all used a similar metric (such as when we factor analyze items on a test, each using a 0-to-7 scale), it would be better to begin with a variance-covariance matrix.
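To make this first step concrete, here is a minimal sketch of ours in Python; the data are simulated stand-ins for the Dean's 15 tests on 200 applicants, not values from this chapter.

```python
import numpy as np

# A minimal sketch (our own, with simulated data standing in for the Dean's
# 15 tests on 200 applicants): convert each variable to standard scores and
# build the correlation matrix.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 15))               # 200 applicants, 15 tests

Z = (X - X.mean(axis=0)) / X.std(axis=0)     # z scores: mean 0, SD 1
R = (Z.T @ Z) / len(Z)                       # 15 x 15 correlation matrix

print(np.allclose(R, np.corrcoef(X, rowvar=False)))  # True: same as np.corrcoef
print(np.allclose(np.diag(R), 1.0))                  # True: 1.00s on the diagonal
print(15 * 14 // 2)                                  # 105 unique off-diagonal correlations
```

If the variables shared a common metric, one would start from `np.cov(X, rowvar=False)` instead of standardizing.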
If life were good to us, we'd probably not need to go any further than a correlation matrix; we'd find that all of the variables that measure one factor correlate very strongly with each other and do not correlate with the measures of the other attributes (i.e., the picture in Figure 15-2). However, this is almost never the case. The correlations within a factor are rarely much above .85, and the measures are almost always correlated with "unrelated" ones to some degree (more like Figure 15-3).

Thus we are left looking for patterns in a matrix of [n × (n − 1)] ÷ 2 unique correlations; in our case, (15 × 14) ÷ 2, or 105 (not counting the 1.00s along the main diagonal), as shown in Table 15-1. Needless to say, trying to make sense of this just by eye is close to impossible.

Before going on to the next step, it's worthwhile to do a few "diagnostic checks" on this correlation matrix. The reason is that computers are incredibly dumb animals. If no underlying factorial structure exists (e.g., a correlation matrix consisting of purely random numbers between −.30 and +.30, i.e., pretty close to 0, with 1.00s on the main diagonal, because a variable is always perfectly correlated with itself), the computer would still grind away merrily, churning out reams of paper, full of numbers and graphs, signifying nothing. The extreme example of this is an identity matrix, which has 1.00s along the main diagonal and zeros for all the off-diagonal terms. So several tests, formal and otherwise, have been developed to ensure that something is around to factor analyze.

Some of the most useful "tests" do not involve any statistics at all, other than counting. Tabachnick and Fidell (1989)⁷ recommend nothing more sophisticated than an eyeball check of the correlation matrix; if you have only a few correlations higher than .30, save your paper and stop right there.

A slightly more stringent test is to look at a matrix of the partial correlations. This "test" is based on the fact that, if the variables do indeed correlate with each other because of an underlying factor structure, then the correlation between any two variables should be small after partialling out the effects of the other variables. Some computer programs, such as BMDP, print out the partial correlation matrix. Others, such as SPSS/PC, give you its first cousin (on its mother's side), an antiimage correlation matrix. This is nothing more than a partial correlation matrix with the signs of the off-diagonal elements reversed, for some reason that surpasseth human understanding. In either case, they're interpreted in the opposite way as is the correlation matrix: a large number of high partial correlations indicates you shouldn't proceed.

A related diagnostic test involves looking at the communalities. Because they are the squared multiple correlations, as opposed to partial correlations, they should be above .60 or so, reflecting the fact that the variables are related to each other to some degree. You have to be careful interpreting the communalities in SPSS/PC. The first time it prints them out, they may (depending on other options we'll discuss later) all be 1.00. Later in the output, there will be another column of them, with values ranging from 0.0 to 1.0; this is the column to look at.

Among the formal statistical tests, one of the oldest is the Bartlett Test of Sphericity. Without going into the details of how it's calculated, it yields a chi-square statistic. If its value is small, and the associated p level is over .05, then the correlation matrix doesn't differ significantly from an identity matrix and you should stop right there. However, Tabachnick and Fidell (1989) state that the Bartlett test is "notoriously sensitive," especially with large sample sizes, so even if it is statistically significant, it doesn't mean that you can safely proceed. Consequently, Bartlett's test is a one-sided test: if it says you shouldn't go on to the Principal Components stage, don't; but if it says you can go on, it ain't necessarily so.

⁷Their book is filled with uncommonly good wisdom and should be on your shelf.
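For readers who want to see the machinery, the Bartlett statistic can be computed directly from its standard formula. This is a sketch of ours, not code from this chapter, and the random data are deliberately structureless.

```python
import numpy as np

# Bartlett's test of sphericity, from its standard formula:
#   chi2 = -(n - 1 - (2p + 5)/6) * ln|R|,  with p(p - 1)/2 degrees of freedom.
def bartlett_sphericity(R, n):
    p = R.shape[0]
    chi2 = -(n - 1 - (2 * p + 5) / 6) * np.log(np.linalg.det(R))
    return chi2, p * (p - 1) // 2

# Purely random numbers: R is close to an identity matrix, so chi2 should
# hover near its degrees of freedom (105) rather than far above them.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 15))
chi2, df = bartlett_sphericity(np.corrcoef(X, rowvar=False), n=200)
print(df)   # 105
```

A chi-square near its degrees of freedom is nonsignificant, which is the test's way of telling you to stop right there.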
Another test is the Kaiser-Meyer-Olkin Measure of Sampling Adequacy (usually referred to by its nickname, KMO), which is based on the squared partial correlations.

In the SPSS/PC computer package, the KMO value for each variable is printed along the main diagonal of the antiimage correlation matrix, and a summary value is also given. This allows you to check the overall adequacy of the matrix and also see which individual variables may not be pulling their full statistical weight. If the value is in the .60s, Kaiser describes the measure as "mediocre"; those values in the .50s are "miserable" and lower ones are "unacceptable"; you should proceed with the next step accordingly. Similarly, you can consider eliminating variables that show low levels of adequacy.
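The KMO summary value can likewise be computed by hand: the partial correlations come from the inverse of the correlation matrix. This is our own sketch on simulated data; only the cutoff labels are Kaiser's.

```python
import numpy as np

# Kaiser-Meyer-Olkin measure: summed squared correlations divided by summed
# squared correlations plus summed squared partial (anti-image) correlations.
def kmo(R):
    P = np.linalg.inv(R)
    d = np.sqrt(np.outer(np.diag(P), np.diag(P)))
    partial = -P / d                        # partial correlation matrix
    off = ~np.eye(R.shape[0], dtype=bool)   # off-diagonal mask
    r2 = (R[off] ** 2).sum()
    q2 = (partial[off] ** 2).sum()
    return r2 / (r2 + q2)

# Simulated data WITH a built-in three-factor structure (five markers each):
rng = np.random.default_rng(2)
F = rng.normal(size=(200, 3))
L = np.zeros((15, 3))
for j in range(15):
    L[j, j // 5] = 0.8
X = F @ L.T + 0.6 * rng.normal(size=(200, 15))

overall = kmo(np.corrcoef(X, rowvar=False))
print(overall > 0.5)   # True: with real structure, KMO clears the "miserable" .50s
```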
Extracting the Factors

Assuming that all has gone well in the previous steps, we now go on to extracting the factors, a procedure only slightly less painful than extracting teeth. The purpose of this is to come up with a series of linear combinations of the variables to define each factor. For factor 1, this would look something like:

F₁ = w₁₁X₁ + w₁₂X₂ + … + w₁ₖXₖ

where the X terms are the k (in this case, 15) variables and the ws are weights. These w terms have two subscripts; the first shows that they go with factor 1, and the second indicates with which variable they're associated. The reason is that, if we have 15 variables, we will end up with 15 factors and therefore 15 equations in the form of the one above. For example, the second factor would look like:

F₂ = w₂₁X₁ + w₂₂X₂ + …

Now, this may seem like a tremendous amount of effort was expended to get absolutely nowhere. If we began with 15 variables and ended up with 15 factors, what have we gained? Actually, quite a bit. The ws for the first factor are chosen so that they express the largest amount of variance in the sample. The ws in the second factor are derived to meet two criteria: (1) the second factor is uncorrelated with the first, and (2) it expresses the largest amount of variance left over after the first factor is considered. The ws in all the remaining factors are calculated in the same way, with each factor uncorrelated with and explaining less variance than the previous ones. So, if a factorial structure is present in the data, most of the variance may be explained on the basis of only the first few factors.

Again returning to our example, the Dean hopes that the first 3 factors are responsible for most of the variance among the variables and that the remaining 12 factors will be relatively "weak" (i.e., he won't lose too much information if he ignores them). The actual results are given in Table 15-2. For the moment, ignore the column headed "Eigenvalue" (we get back to this cryptic word a bit later) and look at the last one, "Cumulative percent." Notice that the first factor accounts for 37.4% of the variance, the first two for over 50%, and the first five for almost 75% of the variance of the original data. So he actually may end up with what he's looking for.

TABLE 15-2. Eigenvalue and cumulative percent of variance for each of the 15 factors.

What we've just described is the essence of PCA. What it tries to do, then, is explain the variance among a bunch of variables in terms of uncorrelated (the statistical term is orthogonal) underlying factors or latent variables. The way it's used now is to try to reduce the number of factors as much as possible so as to get a more parsimonious explanation of what's going on. In fact, though, PCA is only one way of determining the factors.

BMDP has four different methods, and SPSS has seven.

So, which one do you use, PCA or one of the others? Unless you want to delve into the minutiae of how one technique differs from the others,⁸ you might as well go with PCA. Several people have compared the results of the different procedures and have generally found the same thing. If the data are well-behaved (i.e., large subject-to-variable ratio, few useless variables, no extreme deviation from normality, and no outliers), then all of the solutions yield comparable results when you go on to the next step, factor analysis. If the data aren't well-behaved, your mother should have told you that you shouldn't be messing around with them to begin with.

⁸And believe us, it isn't worth the effort.
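Under the hood, PCA extraction is just an eigendecomposition of the correlation matrix: the eigenvectors supply the ws, and each eigenvalue is the variance its factor explains. A sketch of ours, again on simulated data with three built-in factors:

```python
import numpy as np

# PCA extraction sketched as an eigendecomposition of R (simulated data with
# three built-in factors, five marker variables each; the setup is ours).
rng = np.random.default_rng(3)
F = rng.normal(size=(200, 3))
load = np.zeros((15, 3))
for j in range(15):
    load[j, j // 5] = 0.8
X = F @ load.T + 0.6 * rng.normal(size=(200, 15))

R = np.corrcoef(X, rowvar=False)
eigvals, W = np.linalg.eigh(R)               # eigh returns ascending order
eigvals, W = eigvals[::-1], W[:, ::-1]       # largest eigenvalue first

# Each column of W holds one factor's ws; the factor itself is the linear
# combination F_i = w_i1*X_1 + ... + w_i15*X_15 of the z-scored variables.
Z = (X - X.mean(0)) / X.std(0)
scores = Z @ W

# The eigenvalues sum to the number of variables, since each z-scored
# variable contributes exactly 1 unit of variance:
print(round(eigvals.sum(), 6))   # 15.0
```

The factor scores produced this way are mutually uncorrelated, which is the orthogonality the text describes.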
'Eigctlvaluc'coluñh with. Kaiser criterion olten rqsults in too many factors
of Tdble 15 -2. See, we (factors ihat may not appear if we we¡e to repljcate
On Keeping and Discarding Factors the study) when more than about 50 variables exist
A few paragraphs back, we mentioned that one of and in too lew Iactors when fewer than 20 va¡iables
the purposes of the factor extraction phase was to are considered (Horn and Engstrom, 1979).
helored L1y Albefl
¡educe the number of facto¡s, so that only a few The Lawley test üies to get around the first
Eú1st¿út, used whett
'sttong' ones remain. But first we have to resolve problem by looking at the significance of the facto¡s.
he i]ds dbouf lo hit
what we mean by'strong,'and what criteria we Unfortunately, it's quite sensitive to the sample size
)jolt w¡ th solh¿tl1i]1g apply. As with the previous phase (factor extraction) and usually results ir1 too many factors being kept
that wotlld fdke 6
and the next one (factor rotation), the problem isn'l when the sample size is large enough to meet the
onlllr to _lig re oul.
a lack of answe6, but rather a surfeit of them. minimal c¡iteria {about which, mo¡e later). CoDse-
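The criterion itself is one line of code. In this sketch of ours (same simulated three-factor data as above), it recovers the built-in number of factors:

```python
import numpy as np

# The eigenvalue-one (Kaiser) criterion: keep factors whose eigenvalue
# exceeds 1.0, i.e., factors explaining more variance than a single
# z-scored variable contributes. (Simulated data with three built-in factors.)
rng = np.random.default_rng(3)
F = rng.normal(size=(200, 3))
load = np.zeros((15, 3))
for j in range(15):
    load[j, j // 5] = 0.8
X = F @ load.T + 0.6 * rng.normal(size=(200, 15))

eigvals = np.sort(np.linalg.eigvalsh(np.corrcoef(X, rowvar=False)))[::-1]
kept = int((eigvals > 1.0).sum())
print(kept)   # 3
```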
This test has two problems. The first is that it's somewhat arbitrary: a factor with an eigenvalue of 1.01 is retained, whereas one with a value of .99 is rejected. This ignores the fact that eigenvalues, like any other parameter in statistics, are measured with some degree of error. On replication, these numbers will likely change to some degree, leading to a different solution. The second problem is that the Kaiser criterion often results in too many factors (factors that may not appear if we were to replicate the study) when more than about 50 variables exist and in too few factors when fewer than 20 variables are considered (Horn and Engstrom, 1979).

The Lawley test tries to get around the first problem by looking at the significance of the factors. Unfortunately, it's quite sensitive to the sample size and usually results in too many factors being kept when the sample size is large enough to meet the minimal criteria (about which, more later). Consequently, we don't see it around much any more.

A somewhat better test is Cattell's Scree Test. This is another one of those very powerful statistical tests that rely on nothing more than your eyeball.¹³ We start off by plotting the eigenvalues for each of the 15 factors, as in Figure 15-4 (actually, we don't have to do it; most computer packages do it for us at no extra charge). In many cases (but by no means all), there's a sharp break in the curve between the point where it's descending and where it levels off; that is, where the slope of the curve changes from negative to close to zero.¹⁴

¹³Kaiser (1970) refers to this technique by a different name (because in matrix algebra, an eigenvalue is called a root of the matrix). Could this be professional jealousy?
The last "real" factor is the one before the scree (the relatively flat portion of the curve) begins. If several breaks are in the descending line, usually the first one is chosen, but this can be modified by two considerations. First, we usually want to have at least three factors. Second, the scree may start after the second or third break. We see this in Figure 15-4; there is a break after the second factor, but it looks like the scree starts after the third factor, so we'll keep the first three. In this example, the number of factors retained with the Kaiser criterion and with the scree test is the same.

FIGURE 15-4. A scree plot for the 15 factors.

The fact that no statistical test exists for the scree test poses a bit of a problem for computer programs, which love to deal with numbers. Almost all programs use the eigenvalue-one criterion as a default when they go on to the next steps of factor analysis. If you do a scree plot and decide you won't keep all the factors that have eigenvalues over 1.0, you have to run the FA in two steps: once to produce the scree plot, and again for you to override the eigenvalue criterion. You can usually do this by specifying either the minimum eigenvalue (equal to the value of the smallest one you want to retain) or the actual number of factors to keep. It's a pain in the royal derriere to have to do it in two steps, but it can be done.

¹⁴In geology, "scree" is the rubble that accumulates at the foot of a hill; here, it's the junk below the strong factors. In what other stats book do you also get a basic grounding in geology at the same time?
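The numbers behind a scree plot are just the sorted eigenvalues. A sketch of ours, on the same simulated data, shows the sharp break after the last "real" factor; with matplotlib, `plt.plot(range(1, 16), eigvals, "o-")` would draw the curve itself.

```python
import numpy as np

# Scree-plot data: sorted eigenvalues and the size of each successive drop.
# The scree begins where the drops flatten out. (Simulated data, as before.)
rng = np.random.default_rng(3)
F = rng.normal(size=(200, 3))
load = np.zeros((15, 3))
for j in range(15):
    load[j, j // 5] = 0.8
X = F @ load.T + 0.6 * rng.normal(size=(200, 15))

eigvals = np.sort(np.linalg.eigvalsh(np.corrcoef(X, rowvar=False)))[::-1]
drops = -np.diff(eigvals)                  # step down between adjacent factors
# The drop after factor 3 (the last real one) dwarfs every drop along the scree:
print(drops[2] > drops[3:].max())          # True
```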

The Matrix of Factor Loadings

After we've extracted the factors and decided on how many to retain, the computer gives us a table (like Table 15-3) that is variously called the Factor Matrix, the Factor Loading Matrix, or the Factor Structure Matrix. Just to confuse things even more, it can also be called the Factor Pattern Matrix. As long as we keep the factors orthogonal to each other, the factor structure matrix and the factor pattern matrix are identical. When we relax this restriction (a topic we'll discuss a bit later), the two matrices become different.

Table 15-3 tells us the correlation between each variable and the various factors. In statistical jargon, we speak of the variables loading on the factors. So, "Visual Acuity" loads .627 on factor 1 (i.e., correlates .627 with the first factor), .285 on factor 2, and .347 on factor 3. As with other correlations, a higher (absolute) value means a closer relationship between the factor and the variable. In this case, then, "Visual Acuity" is most closely associated with the first factor.

A couple of interesting and informative points about factor loadings. First, they are standardized regression coefficients (β weights), which we first ran across in multiple regression. In factor analysis, the DV is the original variable itself and the factors are the IVs. As long as the factors are orthogonal, these regression coefficients are identical to correlation coefficients. (The reason is that, if the factors are uncorrelated, i.e., orthogonal, then the β weights are not dependent on one another.) This becomes important later, when we see what happens when we relax the requirement of orthogonality.

Second, the communality of a variable, which we approximated with R² previously, can now be derived exactly. For each variable, it is the sum of the squared factor loadings across the factors that we've kept. Looking at Table 15-3, it would be (.62684)² + (.28525)² + (.34653)² = .594 for ACUITY. We usually use the abbreviation h² for the communality, and therefore the uniqueness is written as (1 − h²).

At this point, we still don't know what the factors mean. The first factor is simply the one that accounts for most of the variance; it does not necessarily reflect the first factor we want to find (such as the Eyes of an Eagle), or the variables higher up on the list. However, we'll postpone our discussion of interpretation until after we've discussed factor rotation below.
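The ACUITY arithmetic is easy to check; in this sketch of ours, the three loadings are the ones quoted from Table 15-3.

```python
import numpy as np

# Communality of ACUITY: the sum of its squared loadings on the three
# retained factors; uniqueness is whatever variance is left over.
loadings = np.array([0.62684, 0.28525, 0.34653])   # ACUITY's loadings, Table 15-3
h2 = (loadings ** 2).sum()
uniqueness = 1 - h2

print(round(h2, 3))           # 0.594
print(round(uniqueness, 3))   # 0.406
```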
Rotating the Factors

Why rotate at all? Up to now, what we've done wouldn't arouse strong emotions among most statisticians.¹⁵

We've simply transformed a number of variables into factors. The only subjective element was in selecting the number of factors to retain. However, if we asked for the factor matrix to include all of the factors, rather than just those over some criterion, we could go back and forth between factors and variables without losing any information at all. It is the next step, factor rotation, that really gets the dander up among some (unenlightened) statistical folks. The reason is that we have, literally, an infinite number of ways we can rotate the factors. Which rotation we decide to use (assuming we don't merely accept the program's default options without question) is totally a matter of choice on the analyst's part.

So, if factor rotation is still somewhat controversial, why do we do it? Unlike other acts that arouse strong passions, we can't explain it simply on the basis of the fact that it's fun. To us true believers, factor rotation serves some useful functions. The primary one is to help us understand what (if anything) is going on with the factors.

To simplify interpretation of the factors, the factor loading matrix should satisfy four conditions:
1. The variance should be fairly evenly distributed across the factors.
2. Each variable should load on only one factor.
3. The factor loadings should be close to 1.0 or 0.0.
4. The factors should be unipolar (all the strong variables have the same sign).

Let's see how well the factor loading matrix in Table 15-3 meets these criteria.

1. Distribution of variance. If we go back to Table 15-2, we can add up the eigenvalues of the first three factors. Their sum, 9.1788, shows the amount of variance explained by them (which is 61.2% of the total variance of 15). Of this amount, the first factor accounts for (5.6025 ÷ 9.1788), or 61.0%, the second factor for (2.0252 ÷ 9.1788), or 22.1%, and the third factor for the remaining 16.9%. So, the first factor contains a disproportionate share of the total variance explained by the three factors. We can also see this in the fact that all of the variables load strongly on this factor (Table 15-3): 12 of the 15 have loadings over .50 on factor 1, and only 2 variables (NYSTAGMUS and CARROTS) load higher on another factor than they do on factor 1. This situation is extremely common and is found because consistency tends to occur in people across various measures. What factor 1 often picks up is this "general factor," which only rarely tells us something we didn't already know.

2. Factorial complexity. Whenever a variable loads strongly on two or more factors, we call it factorially complex. In Table 15-3, NYSTAGMUS loads strongly on factors 1 and 2, CARROTS loads on all 3 factors to comparable degrees, and so on. Factorial complexity makes it more difficult to interpret the role of the variable. INTEREST is explained by both factor 1 and factor 2 and, conversely, the explanation of these factors must take CARROTS into account. It would make life much easier if we could understand the factors on the basis of mutually exclusive sets of variables.

3. Magnitude of the loadings. This is really a consequence of the second criterion. If a variable loads strongly on one factor, then its loadings on the other factors will be close to 0. The reason is that the sum of the squares of the loadings across factors (the variable's communality, you'll remember) remains constant when we rotate; so as some loadings go up, others have to go down.

4. Unipolar factors. If some loadings were positive and others negative, then a high score on the factor would indicate more of some variables, whereas a low score would indicate more of other variables. Again, in the interest of interpretive ease, we'd like the factor to be unipolar; that is, a higher score on the factor means more of the latent variable, and a lower score simply means less of it. This occurs when all of the factor loadings have the same sign.

¹⁵To the extent to which strong emotions can be aroused in statisticians (which is why we refer to statisticians as "they," rather than "we").
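The arithmetic in criterion 1 can be reproduced directly. In this sketch of ours, the two eigenvalues and the 9.1788 total are the values as we read them from Table 15-2, with the third eigenvalue obtained by subtraction.

```python
# Share of explained variance carried by each of the three retained factors.
total_variance = 15.0            # 15 z-scored variables
explained = 9.1788               # sum of the first three eigenvalues
e1, e2 = 5.6025, 2.0252          # first two eigenvalues (Table 15-2)
e3 = explained - e1 - e2         # third, by subtraction

print(round(100 * explained / total_variance, 1))   # 61.2  (% of total)
print(round(100 * e1 / explained, 1))               # 61.0  (% of explained)
print(round(100 * e2 / explained, 1))               # 22.1
print(round(100 * e3 / explained, 1))               # 16.9
```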
From a mathematical viewpoint, nothing is wrong with most of the variance being in one factor, or with factorial complexity, or with loadings in the middle range, or with bipolar factors. However, it is easiest to interpret the results of a factor analysis if we can meet these criteria and aim for structural simplicity. This is what rotating the factors tries to do. Unfortunately, no one's found a way to optimize all of these criteria at once. A rotation that spreads the variance equally across the factors may not necessarily reduce factorial complexity; and one that reduces complexity may not produce unipolar factors. Needless to say, this has resulted in a profusion of rotation techniques, each one designed to give priority to a different criterion, and all of which yield somewhat different results.¹⁶ The one that's used most is called varimax, and that's what we'll go with first.

A simple example. Let's see how rotating the factors can help meet the four criteria and grant us our wish for simplicity. However, because it's hard to draw 3-dimensional pictures (1 dimension for each factor), we'll start off by forcing the PCA to give us only two factors. We can then generalize the procedure to three or more factors, although we won't be able to visualize the results as readily.

By asking for two factors, our factor loading table will have just two columns. Let's plot each variable, using the loading on factor 1 as the X coordinate and the loading on the second factor as the Y coordinate.

¹⁶What makes factor analysis unique in the field of statistics is that the techniques are not named after people. However, after having to get your tongue around terms like binormamin or oblimax, you almost wish they had been given human names.
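Varimax itself can be sketched in a few lines using the standard SVD-based algorithm. This is our illustration, not output from this chapter; the six-variable loading matrix is made up.

```python
import numpy as np

# A minimal varimax rotation: find the orthogonal rotation that maximizes
# the variance of the squared loadings within each column (Kaiser's criterion).
def varimax(A, n_iter=100, tol=1e-8):
    p, k = A.shape
    rot = np.eye(k)
    crit_old = 0.0
    for _ in range(n_iter):
        B = A @ rot
        u, s, vt = np.linalg.svd(A.T @ (B ** 3 - B * (B ** 2).sum(0) / p))
        rot = u @ vt
        if s.sum() < crit_old * (1 + tol):
            break
        crit_old = s.sum()
    return A @ rot, rot

# Two clusters of made-up variables, each a mix of both unrotated factors:
A = np.array([[0.70, 0.70], [0.75, 0.65], [0.65, 0.75],
              [0.70, -0.70], [0.65, -0.75], [0.75, -0.65]])
rotated, rot = varimax(A)

# Rotation is orthogonal, so each variable's communality is untouched:
print(np.allclose((rotated ** 2).sum(1), (A ** 2).sum(1)))   # True
print(np.allclose(rot.T @ rot, np.eye(2)))                   # True
```

Most packages pair this core step with Kaiser normalization, scaling each row to unit communality before rotating and back afterward.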
What we'll get is Figure 15-5, where we can see problems with all of the criteria: (1) all of the variables show some degree of loading on factor 1; (2) most of the variables are in the middle portions of the quadrants, showing that they are loading on both of the factors; (3) the factor loadings all seem to fall between .4 and .8 on factor 1, and most of them are between .2 and .6 (absolute values) on factor 2; and (4) factor 1 is unipolar, but factor 2 is definitely bipolar.
Now, keeping the axes orthogonal (at right angles) to each other, let's rotate them (Figure 15-6). The new axes are labeled factor 1′ and factor 2′. The only problem is that if we continue to rotate the axes clockwise until factor 1′ is horizontal, all of the factor 2′ coordinates will be negative; again, not a statistical problem, but it makes interpretation a bit harder. We can correct this little annoyance simply by reversing all of the signs of the factor 2′ factor loadings, which is quite kosher, mathematically speaking. We end up with Figure 15-7.

FIGURE 15-5 A factor plot of the two-factor solution.
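Mechanically, rotating the axes is just multiplying the loading matrix by an orthogonal rotation matrix. Here is a minimal numpy sketch with made-up two-factor loadings (not the Dean's actual data); it also checks the key property that an orthogonal rotation never changes the communalities:

```python
import numpy as np

# Hypothetical two-factor loadings (rows = variables), invented for illustration
L = np.array([
    [0.60,  0.45],
    [0.55,  0.50],
    [0.65, -0.40],
    [0.70, -0.35],
])

theta = np.radians(-40)  # rotate the axes clockwise by 40 degrees
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

L_rot = L @ R  # loadings on the rotated factors

# An orthogonal rotation preserves each variable's communality
# (its row sum of squared loadings)
print(np.allclose((L**2).sum(axis=1), (L_rot**2).sum(axis=1)))  # True
```

Varimax simply searches over such rotation angles for the one that pushes the loadings closest to 0 and 1.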
How do our criteria fare in this picture? (1) A group of variables are showing a high loading on factor 2 but not on factor 1, demonstrating that not all of the variables are loading on the first factor any more. (2) The variables seem to be closer to the axes than to the middle of the quadrant, indicating reduced factorial complexity. (3) Each variable is closer to the top on one factor and closer to the origin for the other factor, showing that the loadings are nearer to 1.0 or 0.0. (4) All of the variables are in or very near to the first quadrant. This means that all of the signs are positive (or those loadings which are negative are very small), resulting in unipolar factors.

When we have more than 2 factors, we can plot all possible pairs of them. However, if we had as few as 5 factors, we'd have 10 graphs to wade through; 10 factors would result in 45 graphs, and so on.

FIGURE 15-6 Figure 15-5, with the rotated axes superimposed.

Orthogonal versus oblique rotations. Before returning to our original problem, let's use this two-factor solution to illustrate one more point. You'll remember that earlier we said, ". . . keeping the axes orthogonal (at right angles) to each other, let's rotate them" (Norman and Streiner, personal communication). However, the factors don't have to be orthogonal. In fact, having some degree of correlation among the factors is probably a better reflection of reality than having strictly independent ones. So, although it's easier to think of Hands of a Woman as being a completely separate attribute from Eyes of an Eagle, it's likely more accurate to think of them as being correlated to some degree.

When we rotated the axes in Figure 15-6, we were still left with some of the variables being near the middle of the quadrant. Because the angle between the axes was fixed at 90 degrees, there was little we could do. But, relaxing the condition that the factors have to be orthogonal, we can draw each axis closer to the middle of each group of variables, as in Figure 15-8. We call this an oblique rotation.

FIGURE 15-7 Figure 15-6, with the rotated axes turned more to be horizontal and vertical.
138 REGRESSION AND CORRELATION

The advantage is that oblique solutions often lead to greater structural simplicity (using the criteria we listed before) than do orthogonal rotations. The tradeoff is that we now have to contend with the factors being correlated with each other to varying degrees. Instead of the relatively simple description of Figure 15-2, where the value of each variable is determined only by its "own" factor and its unique component, we have a more complicated situation (Figure 15-9). In this case, to understand what factor 1 is measuring, we not only have to look at the variables that have a high loading, but we also have to consider any correlation between factor 1 and the others.

The correlation among the factors leads to another issue, which we briefly mentioned earlier. As long as the factors were uncorrelated, each variable's regression coefficients for the factors were the same as the correlations between the variable and the factors; that is, the loadings could be interpreted either as simple correlations or as β weights. However, once we introduce some correlation between the factors, this equivalence doesn't hold any more. The factor pattern matrix still consists of the loadings defined as partial regression coefficients, but now the factor structure matrix holds the simple correlations between the variables and the factors. The higher the correlation among the factors, the greater the difference between these two matrices.

So, even though oblique rotations may mirror reality more closely than do orthogonal ones, most people prefer the latter. The reason is that orthogonal rotations have a number of desirable qualities. Because the factors are uncorrelated with each other (that's the mathematical meaning of "orthogonal"), any score derived from one factor will correlate 0 with scores derived from the other factors. This is a useful property if the results of a PCA or FA are to be further analyzed with another statistical test. Also, as we've said, the interpretation of the factors is far easier if they are all independent from one another.

FIGURE 15-8 An oblique rotation to the factor plot in Figure 15-5.

FIGURE 15-9 In an oblique rotation, the factors are correlated with each other.

Back to the Dean. Before we leave the topic of rotations, let's just see how our three-factor solution fared with a varimax rotation. We'll skip the graphing stage because, in the absence of 3-dimensional graph paper, we would have to look at three factor plots for the unrotated solution (factor 1 vs 2; 1 vs 3; and 2 vs 3) and an equal number after the rotation. Instead, we'll focus on the factor matrix. The unrotated matrix was given in Table 15-3; the rotated one is in Table 15-4.

Before rotation, these three factors accounted for 61.2% of the total variance; this doesn't change. What does change is the distribution of the variance across factors. If you recall, of the variance that is explained, factor 1 was responsible for 61.0%, factor 2 for 22.1%, and factor 3 for 16.9%. After rotation, these numbers become 37.0%, 33.2%, and 29.8%: obviously a much more equitable division. This is also reflected in the fact that now only five variables load strongly on factor 1; previously, the majority of them did.

The other criteria did just as well. If we plot the absolute magnitudes of the unrotated factor loadings, as we did in the left side of Figure 15-10, we see that most of them fall between .3 and .7. The right side shows the same thing for the rotated loadings; the graph is much more bimodal, with

relatively few values in the middle range. So we seemingly have succeeded in driving the loadings closer to 0.0 or 1.0. Also, in the unrotated solution, 12 of the 45 loadings were negative; in the rotated one, only 3 are, and they are relatively small. Last, only one variable, CHECKS, shows any degree of factorial complexity. The conclusion, then, is that rotating the axes got us a lot closer to structural simplicity.

TABLE 15-4 The rotated factor matrix (Factor 1, Factor 2, Factor 3).
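Under the standard definitions (the pattern matrix holds the partial regression weights of each variable on each factor; the structure matrix holds the simple variable-to-factor correlations), the relationship between the two matrices for an oblique solution is just a matrix product with the factor correlation matrix Φ. A sketch with invented numbers:

```python
import numpy as np

# Hypothetical oblique two-factor solution: pattern matrix (partial
# regression weights) and the correlation between the factors
pattern = np.array([
    [0.80, 0.05],
    [0.75, 0.10],
    [0.05, 0.70],
    [0.10, 0.85],
])
phi = np.array([[1.0, 0.4],
                [0.4, 1.0]])  # factor correlation matrix

# Structure matrix: simple correlations between variables and factors
structure = pattern @ phi
print(structure[0])  # the first variable now "picks up" some of factor 2

# When the factors are uncorrelated (phi = identity), the two coincide,
# which is why the distinction vanishes for orthogonal rotations
print(np.allclose(pattern @ np.eye(2), pattern))  # True
```

The higher the off-diagonal value in `phi`, the further the two matrices drift apart, exactly as the text describes.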

INTERPRETING THE FACTORS


Now that we've got the factors, what do we do with them? The first step is to determine which variables load on each factor. To do this, we have to figure out which loadings are significant and which can be safely ignored. We know a couple of ways of doing this. One way is to adopt some minimum value, such as .30 or .40. The problem is that any number we choose is completely arbitrary and doesn't take the sample size into account; a loading of .38 may be meaningful if we had 1,000 subjects, but it may represent only a chance fluctuation from 0 with 10 subjects.

A better method would be to retain only those loadings which are statistically significant. We can do this by looking up the critical value in a table for the correlation (see Table F in the appendix). But which value to use? Stevens (1986) recommends (1) using the 1% level of significance rather than the 5% because of the number of tests that will be done, and then (2) doubling that value because the SEs of factor loadings are up to twice those of ordinary correlations. When the sample size is over 100 (and we'll soon see why it had better be), a good approximation to use would be:

CV = 5.152 / √(N − 2)    (15-1)

Where did these numbers come from? When N > 100, the normal curve is a good approximation for the correlation distribution, and 2.576 marks off the 1% level of significance. Following Stevens, we double this (hence, 5.152) and then multiply by the SE for a correlation, which is [1 / √(N − 2)], and voilà! So, if you want to use the 5% level, use 3.920 in the numerator.

Let's use this for our data. Because we had 200 most unwilling people taking the tests,17 we would get:

CV = 5.152 / √198 = .366    (15-2)

17 This was just an editorial comment; their state of mind does not affect the …
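Stevens's rule is easy to put into code. A small sketch (the doubling of the z value is his adjustment for the larger standard error of factor loadings):

```python
import math

def critical_loading(n, z=2.576):
    """Minimum loading to treat as significant, following Stevens:
    double the z for the chosen alpha (2.576 for the 1% level,
    1.96 for 5%) and multiply by the SE of a correlation, 1/sqrt(n - 2)."""
    return 2 * z / math.sqrt(n - 2)

print(round(critical_loading(200), 3))        # 0.366, as in the text
print(round(critical_loading(200, 1.96), 3))  # 5% level: 0.279
```

Note how quickly the cutoff shrinks with sample size: with N = 1,000 it drops to about .163, which is why a loading of .38 can be meaningful there but not with a handful of subjects.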

FIGURE 15-10 Plot of the factor loadings for the unrotated and rotated solutions.

If we now go back to Table 15-4 and eliminate all loadings lower than this (and round down to three decimals to make the numbers easier to read), we get Table 15-5. Suddenly the light shines; it looks like we've pulled some degree of order out of chaos.18 Factor 1 consists of six variables: CHECKS, INTEREST, SCROOGE, DUNNING, OVERCHARGE, and BILLING. This looks very much like the postulated Soul factor, with the addition of the CHECKS variable (a point we'll return to soon). Similarly, factor 2 corresponds to the Hands attribute, and factor 3 to Eyes.

TABLE 15-5 Matrix of significant factor loadings

Variable          Factor 1   Factor 2   Factor 3
Acuity                                  .699
Color                                   .601
Nystagmus                               …
Depth                                   …
Carrots                                 .810
Fine dexterity               .786
Gross dexterity              …
Softness                     .857
Checks            .570       .417
Interest          .732
Scrooge           .739
Dunning           .797
Overcharge        .702
Billing           …

However, there's one fly in the ointment. The CHECKS variable is both factorially complex (loading on factors 1 and 2), and its highest loading is on the "wrong" factor. So, what do we do with it? We have three options:

1. We can throw that test out of the battery because it isn't tapping what we thought it would. If there are enough variables remaining in the factors (a minimum of three), this may be a sensible option. We would also toss out variables that didn't load well on any factor. This would be the case when the variable is quite complex, loading on a number of factors, or when it loads on some factor we didn't retain.
2. We can keep the variable in both factors. However, if our aim is to achieve simplicity and end up with uncorrelated factors, this wouldn't be a good choice.
3. If the variable is one we devised (e.g., an item on a test we're writing, or an entire test we're developing), we could rewrite it. The downside of this is that we would have to repeat the whole study with a new group of subjects to see if the revised variable is better than the original. However, if we're developing a scale, and one factor has relatively few items, this may be our only alternative.19 In our example, because the Dean will have a new batch of 200 consenting adults next year, this option is feasible.

Table 15-5 can also help the Dean in another way. If he wants to make the test battery shorter, he can eliminate those tests with the lowest factor loadings. Needless to say, the reduced battery will not predict the factors as well, so yet another tradeoff has to be made.

Before we waltz away, though, we should make two last checks of the factors. A factor should consist of at least three variables (Tabachnick and Fidell say you can get away with two, but we feel that's low). Any factor that contains fewer should be discarded. Second, it's wise to go back to the original correlation matrix and see if the variables in the factor are indeed correlated with each other. Although it's unusual, situations can arise in which they're not, and again that factor should be thrown away.

18 Doubters would say we've created chaos out of order, but what do these old sticks-in-the-mud know?

19 Tooting our own horn a bit, see Streiner and Norman (1989) for more details on scale development.

USING THE FACTORS

In many cases, the steps we've gone through are as far as researchers want to go. They've used PCA and FA to either explore the data or confirm some hypotheses about them, and also to eliminate variables that were either not too helpful or factorially complex. However, we can use these procedures in another way: to reduce the number of variables. We may want to do this for a few reasons.

First, subject-to-variable ratios that are too low for some multivariable procedures may still be okay in FA (see below). So we can use PCA and FA to change a large number of variables into a smaller number of factors, which we can then analyze with linear regression or something else. Second, it may be easier for us to understand what a pattern of (say) three factors means rather than trying to juggle 15 scores in our minds all at once. What we would like to do, then, is to come up with one number for each factor. In our example, each person would have 3 scores, rather than 15, which, in essence, increases the subject/variable ratio by 5.

We mentioned earlier that the factor loadings are partial regression weights. So why not simply use them like a regression equation? The reason is that they were derived to predict the value of the variable from the factors. What we want to do is just the opposite, to predict the factor from the variables. So, if we want, we can command the computer to give us a factor score coefficient matrix, such as the one in Table 15-6. Each column is a regression equation, with the predicted factor score as the DV and the variables as the IVs.

So, the three factor scores for subject 1 would be found by plugging her 15 standardized scores into the equations, which would then read:
FS1 = (.02081)ACUITY − (.00017)COLOR + … + (.26515)BILLING
FS2 = (−.06561)ACUITY − (.0901)COLOR + … − (.07488)BILLING
FS3 = (.27844)ACUITY + (.26219)COLOR + … + (.00191)BILLING

TABLE 15-6 Factor score coefficient matrix (Factor 1, Factor 2, Factor 3).
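These equations are just a matrix product of the subject's standardized scores with the factor score coefficient matrix. A sketch in numpy, using invented coefficients for only 3 of the 15 tests (the values are illustrative, not the actual Table 15-6 entries):

```python
import numpy as np

# Hypothetical factor score coefficients (rows = variables, cols = factors);
# only three of the Dean's 15 variables shown, with invented values
B = np.array([
    [ 0.021, -0.066,  0.278],   # ACUITY
    [-0.002, -0.090,  0.262],   # COLOR
    [ 0.265, -0.075,  0.002],   # BILLING
])

z = np.array([1.2, -0.5, 0.8])  # one subject's standardized test scores

factor_scores = z @ B  # one score per factor for this subject
print(factor_scores.shape)  # (3,)
```

With the full battery, `B` would be 15 × 3 and `z` would hold all 15 standardized scores, yielding the subject's FS1, FS2, and FS3 in one line.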

Most computer programs can calculate the factor scores for us and then save them in a file, making the job of transferring the results to another program much easier.

It almost goes without saying that if we have one way to do things in FA, a couple of other ways are lurking around just to complicate our lives. Computing factor scores is no exception. All of them yield scores with a mean of 0. Where they differ is in (1) the variance of the scores, and (2) the correlation among the factor scores. Although the factors are uncorrelated (assuming we stopped at PCA or used an orthogonal rotation in FA), the factor scores themselves may be, depending on which technique we use.

However, we'll mention one more fact about factor scores that may actually simplify your life. When more than 10 variables are loading on a factor, you can probably forget about the equations entirely. If you use unit weights, set each significant loading equal to 1.00 (or −1.00 if it's negative) and the nonsignificant ones to 0.00; then all you have to do is add up the (standardized) scores; forget about multiplying them by the coefficients. The reason is that with more than 10 IVs, the β weights don't improve the prediction over unit weights to any degree that's worth worrying about (Cohen, 1990; Wainer, 1976). Actually, this is most true when the variables are totally uncorrelated with each other; the greater the magnitude of the correlation, the greater the possible loss in efficiency when using these unit weights.

SAMPLE SIZE

In factor analysis, there are no power tables (at least none that we know about) to tell exactly how many subjects to use. What we do have are firmly held beliefs20 and a few Monte Carlo simulations. What they boil down to is this: (1) we must have an absolute minimum of 5 subjects per variable; with the proviso that we have (2) at least 100 subjects. Gorsuch (1983), one of the grand-daddies of FA, and the person who proposed these guidelines, said that this should suffice only if the communalities are high and there are many variables for each factor. If you don't meet these two conditions, then you should probably at least double the subject/variable ratio, as well as the total number of subjects analyzed. We dare say that if these rules are followed, the number of factor analyses performed each year will drop by about 70%, resulting in much joy among readers of journals and much consternation within the paper manufacturing industry.

20 Spoken with about as much vehemence as debating if angels can …, and with about the same degree of data to back them up.
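The sample-size rule of thumb just described can be written down directly. A sketch (the `conditions_met` flag is our own shorthand for Gorsuch's proviso about high communalities and many variables per factor):

```python
def minimum_sample_size(n_variables, conditions_met=True):
    """Gorsuch-style rule of thumb: at least 5 subjects per variable
    and at least 100 subjects overall; if the communalities are not
    high or the factors have few variables, double both requirements."""
    if conditions_met:
        return max(5 * n_variables, 100)
    return max(10 * n_variables, 200)

print(minimum_sample_size(15))         # 100 (the Dean's 15-test battery)
print(minimum_sample_size(15, False))  # 200
print(minimum_sample_size(30))         # 150
```

So the Dean's 200 test-takers comfortably clear the bar for 15 variables even in the pessimistic case.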

In an attempt to gain immortality by attaching his name to a questionnaire, one of the authors develops a test for budding social workers called the Streiner Knowledge of Relationships, Empathy, and Warmth scale (the SCREW). He starts off with 12 items, which he hopes will tap these three areas, and administers them to a validation sample of 61 already blooming SW types. The rotated factor loading matrix is shown in the table.

Factor 1   Factor 2   Factor 3   Factor 4
