Submitted By:
Faculty of Arts
MUZAFFARABAD
1. Calculation of Mean (Average) in SPSS
We will demonstrate how to compute a mean (average) variable from a set of variables in SPSS.
In this example, let’s say we have some scores (out of 100) on different college
subject tests. There are scores on Math, English, Biology and Chemistry from 13
students.
What we want to do is average each student’s scores across the 4 subjects to see what their mean
score is; in other words, we are creating a mean variable.
In the Compute Variable window (opened by clicking Transform > Compute Variable), first enter
the name of the new variable to be created in the ‘Target Variable’ box.
Remember, SPSS does not allow spaces in variable names. A good convention is to add the suffix
‘_avg’ to the variable name to signify that it is a mean.
Within the brackets of the mean function, enter all of the variables to be averaged, separating
each one with a comma. Ensure the variables are entered exactly as they appear in the SPSS
datasheet. To avoid errors with typos, you can also double-click on the variables listed in the box
to the left of the window.
MEAN(Maths,English,Biology,Chemistry)
The output
After running the MEAN compute function in SPSS, the new variable should be visible in
the data sheet. This value will then be the mean of the variables entered into the function. In the
example, this is the ‘Score_avg’ variable.
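For readers who want to check the arithmetic outside SPSS, the same row-wise mean can be sketched in Python with pandas. The scores below are made up for illustration and are not the guide’s actual data on 13 students:

```python
import pandas as pd

# Hypothetical scores (out of 100) for three students.
df = pd.DataFrame({
    "Maths":     [70, 85, 60],
    "English":   [65, 90, 55],
    "Biology":   [80, 75, 70],
    "Chemistry": [75, 80, 65],
})

# Equivalent of the SPSS expression MEAN(Maths,English,Biology,Chemistry)
# entered in the Compute Variable window. Like SPSS's MEAN function,
# pandas' row-wise mean skips missing values by default.
df["Score_avg"] = df[["Maths", "English", "Biology", "Chemistry"]].mean(axis=1)
print(df["Score_avg"].tolist())  # one mean score per student
```

Each student’s four subject scores are averaged into a single ‘Score_avg’ value, just as the compute function creates a new column in the SPSS data sheet.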
Conclusion
In this guide, we have described how to compute the mean of several variables in SPSS.
Specifically, the MEAN() function is used within the Compute Variable option.
Pearson Correlation
Also known as Pearson’s correlation, or the Pearson product-moment correlation (PPMC).
To run the bivariate Pearson Correlation, click Analyze > Correlate > Bivariate. Select the
variables Height and Weight and move them to the Variables box. In the Correlation
Coefficients area, select Pearson. In the Test of Significance area, select your desired
significance test, two-tailed or one-tailed. We will select a two-tailed significance test in this
example. Check the box next to Flag significant correlations.
Click OK to run the bivariate Pearson Correlation. Output for the analysis will display in the
Output Viewer.
To run a bivariate Pearson Correlation in SPSS, click Analyze > Correlate > Bivariate.
The Bivariate Correlations window opens, where you will specify the variables to be used in the
analysis. All of the variables in your dataset appear in the list on the left side. To select variables
for the analysis, select the variables in the list on the left and click the blue arrow button to move
them to the right, in the Variables field.
A (Variables): The variables to be used in the bivariate Pearson Correlation. You must select at
least two continuous variables, but may select more than two. The test will produce correlation
coefficients for each pair of variables in this list.
D (Flag significant correlations): Checking this option will include asterisks next to
statistically significant correlations in the output (* at the alpha = 0.05 level, ** at the
alpha = 0.01 level). By default, SPSS flags significance at the alpha = 0.05 and alpha = 0.01
levels, but not at the alpha = 0.001 level (which is treated as alpha = 0.01).
E (Options): Clicking Options will open a window where you can specify which Statistics to
include (i.e., Means and standard deviations, Cross-product deviations and covariances) and
how to address Missing Values (i.e., Exclude cases pairwise or Exclude cases listwise). Note
that the pairwise/listwise setting does not affect your computations if you are only entering two
variables, but can make a very large difference if you are entering three or more variables into the
correlation procedure.
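The pairwise/listwise distinction can be seen in a small Python sketch using hypothetical data. pandas’ `corr()` excludes missing values pairwise by default, while dropping incomplete cases first mimics listwise exclusion:

```python
import numpy as np
import pandas as pd

# Hypothetical data: three variables, with one missing Weight value.
df = pd.DataFrame({
    "Height": [150, 160, 170, 180, 190],
    "Weight": [50,  60,  np.nan, 80,  90],
    "Age":    [20,  30,  25,  40,  35],
})

# Pairwise exclusion: each correlation uses every case that is valid for
# THAT pair, so Height-Age still uses all 5 rows despite the missing Weight.
pairwise = df.corr()

# Listwise exclusion: any case missing on ANY variable is dropped from all
# correlations, so every pair here is computed from only 4 rows.
listwise = df.dropna().corr()

print(pairwise.loc["Height", "Age"], listwise.loc["Height", "Age"])
```

With only two variables the two settings would give identical results; with three or more, as here, the Height–Age correlation changes depending on whether the case with the missing Weight is kept.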
Output
Table of Pearson Correlation output. Height and weight have a significant positive
correlation (r=0.513, p < 0.001).
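The coefficient SPSS reports in the Correlations table can be reproduced by hand, for example in Python with NumPy. The height/weight values below are made up for illustration and will not reproduce the r = 0.513 from the guide’s dataset:

```python
import numpy as np

# Hypothetical height (cm) and weight (kg) measurements for 7 people.
height = np.array([150, 160, 165, 170, 180, 175, 155])
weight = np.array([52,  61,  60,  68,  80,  72,  54])

# Pearson's r: the covariance of the two variables divided by the product
# of their standard deviations -- the value shown in SPSS's output table.
r = np.corrcoef(height, weight)[0, 1]
print(round(r, 3))
```

A value near +1 indicates a strong positive linear relationship, near −1 a strong negative one, and near 0 no linear relationship.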
In SPSS Statistics, we created two variables so that we could enter our data: Income (the
independent variable), and Price (the dependent variable).
The steps below show you how to analyze your data using linear regression in SPSS
Statistics when none of the six assumptions in the previous section, Assumptions, have been
violated. At the end of these steps, we show you how to interpret the results from your linear
regression.
Click Analyze > Regression > Linear... on the top menu, as shown below:
Transfer the independent variable, Income, into the Independent(s): box and the dependent
variable, Price, into the Dependent: box. You can do this either by dragging and dropping the
variables or by using the appropriate buttons. You will end up with the following screen:
You now need to check four of the assumptions discussed in the Assumptions section above: no
significant outliers (assumption #3); independence of observations (assumption #4);
homoscedasticity (assumption #5); and normal distribution of errors/residuals (assumption #6).
You can do this by using the Statistics and Plots features, and then selecting the
appropriate options within these two dialogue boxes. In our enhanced linear regression guide, we
show you which options to select in order to test whether your data meet these four
assumptions.
Click the OK button. This will generate the results.
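The model SPSS estimates here, Price = b0 + b1 × Income, can be sketched in Python with NumPy. The Income and Price values below are hypothetical, not the guide’s dataset:

```python
import numpy as np

# Hypothetical Income (independent) and Price (dependent) values.
income = np.array([20, 30, 40, 50, 60], dtype=float)
price  = np.array([12, 17, 21, 26, 30], dtype=float)

# Ordinary least squares fit of Price = b0 + b1 * Income -- the same model
# SPSS estimates via Analyze > Regression > Linear.
b1, b0 = np.polyfit(income, price, 1)

# R^2: the proportion of variance in Price explained by Income, which SPSS
# reports in its Model Summary table.
predicted = b0 + b1 * income
ss_res = np.sum((price - predicted) ** 2)
ss_tot = np.sum((price - price.mean()) ** 2)
r_squared = 1 - ss_res / ss_tot
print(round(b1, 3), round(b0, 3), round(r_squared, 3))
```

The slope b1 corresponds to the unstandardized B coefficient for Income in SPSS’s Coefficients table, and b0 to the constant.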