Factor analysis is a general name denoting a class of procedures primarily used for data reduction and summarization. Variables are not classified as either dependent or independent. Instead, the whole set of interdependent relationships among the variables is examined in order to define a set of common underlying dimensions called factors.
To identify a new, smaller set of uncorrelated variables to replace the original set of correlated variables in subsequent analyses such as regression or discriminant analysis. For example, psychographic factors may be used as independent variables to explain the differences between loyal and nonloyal customers.
Assumptions
-- Models are usually based on linear relationships.
-- Models assume that the data collected are interval scaled.
-- Multicollinearity in the data is desirable, because the objective is to identify interrelated sets of variables.
-- The data should be amenable to factor analysis: they should not be such that each variable is correlated only with itself, with no correlation with any other variable. Such a correlation matrix is an identity matrix, and factor analysis cannot be done on such data.
An Example
A study was conducted to determine customers' perceptions of and attitudes toward an airline. A set of 10 statements was constructed, and respondents were asked to rate each on a 7-point scale (1 = completely agree, 7 = completely disagree). The statements were as follows:
1. The airline is always on time.
2. The seats are very comfortable.
3. I love the food they provide.
4. Their air-hostesses are very courteous.
5. My boss/friend flies with the same airline.
6. The airline has newer aircraft.
7. I get the advantage of a frequent-flyer program.
8. It suits my schedule.
9. My mom feels safe when I fly with this airline.
10. Flying with this airline complements my lifestyle and social standing in society.
Example (Contd.)
Do the ten different statements indicate 10 different factors that influence a customer to fly with this airline? Or are there correlations among these statements, so that we can identify only a few factors with which some of these statements can be associated?
Similarly, a factor score can be calculated for each respondent. If there were 20 respondents, we would get a table containing 20 rows of factor scores.
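As a sketch of how such a table arises, the snippet below computes factor scores for 20 hypothetical respondents from standardized ratings and an assumed weight matrix. Both the data and the weights are illustrative, not results from the airline study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: 20 respondents rating 6 statements (V1..V6) on a 7-point scale.
ratings = rng.integers(1, 8, size=(20, 6)).astype(float)

# Standardize each variable; factor scores are computed from z-scores.
z = (ratings - ratings.mean(axis=0)) / ratings.std(axis=0)

# Hypothetical factor-score weights for two factors (in practice these are
# estimated from the factor solution, e.g. by the regression method).
weights = np.array([
    [ 0.35,  0.05],
    [-0.10,  0.40],
    [ 0.33,  0.02],
    [-0.12,  0.38],
    [-0.30, -0.15],
    [-0.05,  0.42],
])

# One row of factor scores per respondent: 20 respondents -> a 20 x 2 table.
factor_scores = z @ weights
print(factor_scores.shape)  # (20, 2)
```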
The factors are extracted using a technique called Principal Component Analysis (PCA).
It is possible to extract as many factors as there are variables, but that would defeat the very purpose of factor analysis; hence a smaller number of factors needs to be found. The question is: how many?
Several procedures are available:
-- Determine the number based on eigenvalues. An eigenvalue represents the amount of variance associated with a factor. Generally, only factors with an eigenvalue greater than 1.0 are retained.
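The eigenvalue criterion can be sketched as follows, using simulated data built from two hypothetical latent factors (the data-generating loadings are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated data: two latent factors driving six observed variables.
n = 200
f = rng.normal(size=(n, 2))
loadings = np.array([[0.9, 0.0], [0.0, 0.8], [0.85, 0.1],
                     [0.1, 0.8], [0.8, 0.0], [0.0, 0.85]])
x = f @ loadings.T + 0.4 * rng.normal(size=(n, 6))

# Eigenvalues of the correlation matrix: each is the variance accounted for
# by one principal component (together they sum to the number of variables).
corr = np.corrcoef(x, rowvar=False)
eigenvalues = np.linalg.eigvalsh(corr)[::-1]  # sorted, largest first

# Kaiser criterion: retain only factors with an eigenvalue > 1.0.
n_factors = int(np.sum(eigenvalues > 1.0))
print(eigenvalues.round(2))
print("Factors retained:", n_factors)
```

With this two-factor structure, only the first two eigenvalues exceed 1.0, so the criterion recovers the number of factors built into the simulation.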
A scree plot is a plot of the eigenvalues against the number of factors, in order of extraction. Typically, the plot has a distinct break between the steep slope of the first few factors and a gradual trailing off of the remaining factors. This gradual trailing off is referred to as the scree.
19-12
Scree Plot
[Plot of eigenvalues (y-axis, 0.0 to 3.0) against component number (x-axis)]
Kaiser-Meyer-Olkin (KMO) measure of sampling adequacy: this index compares the magnitudes of the observed correlation coefficients to the magnitudes of the partial correlation coefficients. A value greater than 0.5 is typically considered good enough for conducting factor analysis on the data under consideration.
Bartlett's test of sphericity: a test used to examine the hypothesis that the variables are uncorrelated in the population. If this hypothesis can be rejected, the data are suitable for factor analysis.
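A minimal sketch of both checks, implemented directly from their standard definitions with NumPy/SciPy and applied to simulated data (the data-generating loadings are hypothetical):

```python
import numpy as np
from scipy import stats

def bartlett_sphericity(x):
    """Bartlett's test: H0 is that the correlation matrix is an identity matrix."""
    n, p = x.shape
    r = np.corrcoef(x, rowvar=False)
    chi2 = -(n - 1 - (2 * p + 5) / 6.0) * np.log(np.linalg.det(r))
    df = p * (p - 1) // 2
    return chi2, stats.chi2.sf(chi2, df)

def kmo(x):
    """KMO: observed correlations vs. partial correlations (off-diagonal)."""
    r = np.corrcoef(x, rowvar=False)
    inv = np.linalg.inv(r)
    d = np.sqrt(np.outer(np.diag(inv), np.diag(inv)))
    partial = -inv / d                       # partial correlation matrix
    off = ~np.eye(r.shape[0], dtype=bool)    # off-diagonal mask
    return np.sum(r[off] ** 2) / (np.sum(r[off] ** 2) + np.sum(partial[off] ** 2))

rng = np.random.default_rng(2)
f = rng.normal(size=(300, 2))
loadings = np.array([[0.9, 0.0], [0.0, 0.8], [0.85, 0.1],
                     [0.1, 0.8], [0.8, 0.0], [0.0, 0.85]])
x = f @ loadings.T + 0.4 * rng.normal(size=(300, 6))

chi2, p_value = bartlett_sphericity(x)
print(f"Bartlett chi2 = {chi2:.1f}, p = {p_value:.4f}, KMO = {kmo(x):.3f}")
```

For these strongly interrelated variables, Bartlett's hypothesis of uncorrelated variables is rejected and the KMO exceeds 0.5, so the data pass both checks.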
Correlation Matrix
[Lower-triangular correlation matrix of variables V1-V6; diagonal entries are 1.000]
Factor Matrix
Variables   Factor 1   Factor 2
V1           0.928      0.253
V2          -0.301      0.795
V3           0.936      0.131
V4          -0.342      0.789
V5          -0.869     -0.351
V6          -0.177      0.871
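From the factor matrix above we can compute two standard summaries: the communality of each variable (the variance it shares with the factors, i.e. the sum of squared loadings across a row) and the variance explained by each factor (the sum of squared loadings down a column):

```python
import numpy as np

# Unrotated factor loadings from the factor matrix above (V1..V6).
loadings = np.array([
    [ 0.928,  0.253],
    [-0.301,  0.795],
    [ 0.936,  0.131],
    [-0.342,  0.789],
    [-0.869, -0.351],
    [-0.177,  0.871],
])

# Communality: variance of each variable accounted for by the two factors.
communalities = np.sum(loadings ** 2, axis=1)

# Variance explained by each factor: sum of squared loadings in its column.
variance_explained = np.sum(loadings ** 2, axis=0)

print(communalities.round(3))
print(variance_explained.round(3))
```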
Although the initial or unrotated factor matrix indicates the relationship between the factors and individual variables, it seldom results in factors that can be interpreted, because the factors are correlated with many variables. Therefore, through rotation the factor matrix is transformed into a simpler one that is easier to interpret. In rotating the factors, we would like each factor to have nonzero, or significant, loadings or coefficients for only some of the variables. Likewise, we would like each variable to have nonzero or significant loadings with only a few factors, if possible with only one. The rotation is called orthogonal rotation if the axes are maintained at right angles.
The most commonly used method for rotation is the varimax procedure. This is an orthogonal method of rotation that minimizes the number of variables with high loadings on a factor, thereby enhancing the interpretability of the factors. Orthogonal rotation results in factors that are uncorrelated. The rotation is called oblique rotation when the axes are not maintained at right angles, and the factors are correlated. Sometimes, allowing for correlations among factors can simplify the factor pattern matrix. Oblique rotation should be used when factors in the population are likely to be strongly correlated.
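A compact varimax sketch, using a standard SVD-based formulation of Kaiser's procedure and applied to the unrotated loadings shown earlier. This is an illustrative implementation, not the output of any particular statistics package:

```python
import numpy as np

def varimax(loadings, max_iter=100, tol=1e-8):
    """Orthogonal varimax rotation of a loading matrix (minimal sketch)."""
    p, k = loadings.shape
    rotation = np.eye(k)
    total = 0.0
    for _ in range(max_iter):
        lam = loadings @ rotation
        # SVD step of Kaiser's varimax criterion.
        u, s, vt = np.linalg.svd(
            loadings.T @ (lam ** 3 - (1.0 / p) * lam @ np.diag((lam ** 2).sum(axis=0)))
        )
        rotation = u @ vt          # stays orthogonal: product of orthogonal matrices
        if s.sum() < total * (1 + tol):
            break                  # criterion stopped improving
        total = s.sum()
    return loadings @ rotation

# Unrotated loadings from the earlier factor matrix (V1..V6).
L = np.array([[ 0.928,  0.253], [-0.301,  0.795], [ 0.936,  0.131],
              [-0.342,  0.789], [-0.869, -0.351], [-0.177,  0.871]])

rotated = varimax(L)
print(rotated.round(3))
```

Because the rotation is orthogonal, each variable's communality (row sum of squared loadings) is unchanged; only how the variance is distributed across the factors changes.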
A factor can then be interpreted in terms of the variables that load high on it.
Another useful aid in interpretation is to plot the variables, using the factor loadings as coordinates. Variables at the end of an axis are those that have high loadings on only that factor, and hence describe the factor.
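A simple way to operationalize this interpretation step is to assign to each factor the variables whose absolute loadings exceed a cutoff; 0.5 is a common rule of thumb. The rotated loadings below are hypothetical values chosen for illustration:

```python
import numpy as np

# Hypothetical rotated loadings for V1..V6 on two factors.
rotated = np.array([[ 0.93, -0.08], [-0.05,  0.85], [ 0.93,  0.02],
                    [-0.10,  0.85], [-0.93, -0.10], [ 0.08,  0.88]])

# Interpret each factor via the variables that load high on it
# (|loading| > 0.5 taken as "high").
for j in range(rotated.shape[1]):
    high = [f"V{i + 1}" for i in range(6) if abs(rotated[i, j]) > 0.5]
    print(f"Factor {j + 1}: {', '.join(high)}")
```

Here V1, V3, and V5 describe Factor 1, while V2, V4, and V6 describe Factor 2.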
[Factor loading plot: variables V1-V6 plotted using their factor loadings on Factor 1 and Factor 2 as coordinates]
A few examples
We can now take a few examples with hypothetical data and run a factor analysis on them.
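As one such example, the sketch below generates hypothetical ratings for 10 statements driven by three latent factors and runs a principal-component extraction with the eigenvalue-greater-than-one rule. All names and numbers are illustrative:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical responses: 200 respondents, 10 statements, three latent
# factors each driving a distinct group of statements.
n = 200
latent = rng.normal(size=(n, 3))
loadings_true = np.zeros((10, 3))
loadings_true[[0, 1, 2, 3], 0] = 0.8   # statements 1-4 load on factor 1
loadings_true[[5, 6, 7], 1] = 0.8      # statements 6-8 load on factor 2
loadings_true[[4, 8, 9], 2] = 0.8      # statements 5, 9, 10 load on factor 3
x = latent @ loadings_true.T + 0.5 * rng.normal(size=(n, 10))

# Principal component extraction from the correlation matrix.
corr = np.corrcoef(x, rowvar=False)
eigenvalues, eigenvectors = np.linalg.eigh(corr)
order = np.argsort(eigenvalues)[::-1]
eigenvalues, eigenvectors = eigenvalues[order], eigenvectors[:, order]

# Retain factors with eigenvalue > 1.0; loadings are the eigenvectors
# scaled by the square roots of their eigenvalues.
retained = int(np.sum(eigenvalues > 1.0))
loadings = eigenvectors[:, :retained] * np.sqrt(eigenvalues[:retained])
print("Factors retained:", retained)
print(loadings.round(2))
```

The extraction recovers the three-factor structure built into the simulated data.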