
ASSIGNMENT ON

Calculation of Mean, Correlation


and Regression in SPSS

Subject: Research Methodology

Level: M. Com (3rd Semester)

Submitted To: Saliha Gull Abbasi

Submitted By:

Muhammad Kashif (Roll No 01)

Shahid Mehmood (Roll No. 04)

Abdul Wahab Abbasi (Roll No. 55)

Department of Management Sciences

Faculty of Arts

UNIVERSITY OF AZAD JAMMU AND KASHMIR,

MUZAFFARABAD

1. Calculation of Mean (Average) in SPSS

We will demonstrate how to compute a mean (average) variable from a set of variables in SPSS.

In this example, let’s say we have some scores (out of 100) on different college
subject tests. There are scores on Maths, English, Biology and Chemistry from 13
students.
What we want to do is average each student’s scores across the 4 subjects to see
what their mean score is, thus creating a mean variable.

Steps to calculate mean

1. In SPSS, go to ‘Transform > Compute Variable‘.

2. In the new Compute Variable window, first enter the name of the new variable to be created
in the ‘Target Variable‘ box.
Remember, SPSS does not like spaces in variable names. A good practice is to add the suffix
‘_avg’ to the variable name to signify that it is a mean.

In the ‘Numeric Expression‘ box, enter the function ‘MEAN(?)’.

Within the brackets of the mean function, enter all of the variables to be averaged, separating
each one with a comma. Ensure the variables are entered exactly as they appear in the SPSS
datasheet. To avoid errors with typos, you can also double-click on the variables listed in the box
to the left of the window.

In the example, the function will be the following:

MEAN(Maths,English,Biology,Chemistry)

3. Finally, click the ‘OK‘ button to compute the mean variable.
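
For reference, the same calculation can also be written as SPSS syntax instead of using the menus; clicking ‘Paste‘ in the Compute Variable window generates equivalent syntax. The sketch below assumes the four test variables are named Maths, English, Biology and Chemistry and that the new variable is to be called Score_avg, as in this example.

* Create a new variable holding each student's average test score.
COMPUTE Score_avg = MEAN(Maths, English, Biology, Chemistry).
EXECUTE.

Running these lines from a Syntax window has the same effect as clicking ‘OK‘ in the dialogue.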

The output

After running the MEAN compute function in SPSS, the new variable should be visible in
the data sheet. Each value of this variable is the mean of the variables entered into the function
for that case. In the example, this is the ‘Score_avg‘ variable.

Conclusion
In this guide, we have described how to compute the mean of several variables in SPSS.
Specifically, the ‘MEAN()‘ function is used within the Compute Variable option.

2. Calculation of Correlation in SPSS

Pearson Correlation

The bivariate Pearson Correlation produces a sample correlation coefficient, r,
which measures the strength and direction of linear relationships between pairs of
continuous variables.

This measure is also known as:

• Pearson’s correlation
• Pearson product-moment correlation (PPMC)

Run a Bivariate Pearson Correlation

To run the bivariate Pearson Correlation, click Analyze > Correlate > Bivariate. Select the
variables Height and Weight and move them to the Variables box. In the Correlation
Coefficients area, select Pearson. In the Test of Significance area, select your desired
significance test, two-tailed or one-tailed. We will select a two-tailed significance test in this
example. Check the box next to Flag significant correlations.
Click OK to run the bivariate Pearson Correlation. Output for the analysis will display in the
Output Viewer.
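
Equivalently, the analysis can be run from a Syntax window. The sketch below assumes the two variables are named Height and Weight, as above; the /PRINT line corresponds to the two-tailed test with significant correlations flagged, and /MISSING=PAIRWISE is the default treatment of missing values.

* Bivariate Pearson correlation between Height and Weight.
CORRELATIONS
  /VARIABLES=Height Weight
  /PRINT=TWOTAIL NOSIG
  /MISSING=PAIRWISE.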

Explanation of the Bivariate Correlations dialogue box

To run a bivariate Pearson Correlation in SPSS, click Analyze > Correlate > Bivariate.

The Bivariate Correlations window opens, where you will specify the variables to be used in the
analysis. All of the variables in your dataset appear in the list on the left side. To select variables
for the analysis, select the variables in the list on the left and click the blue arrow button to move
them to the right, in the Variables field.
A (Variables): The variables to be used in the bivariate Pearson Correlation. You must select at
least two continuous variables, but may select more than two. The test will produce correlation
coefficients for each pair of variables in this list.

B (Correlation Coefficients): There are multiple types of correlation coefficients. By
default, Pearson is selected. Selecting Pearson will produce the test statistics for a bivariate
Pearson Correlation.

C (Test of Significance): Click Two-tailed or One-tailed, depending on your desired
significance test. SPSS uses a two-tailed test by default.

D (Flag significant correlations): Checking this option will flag statistically significant
correlations in the output with asterisks: one asterisk (*) for significance at the alpha = 0.05
level and two asterisks (**) for the alpha = 0.01 level. The alpha = 0.001 level is not flagged
separately; it is treated as alpha = 0.01.

E (Options): Clicking Options will open a window where you can specify which Statistics to
include (i.e., Means and standard deviations, Cross-product deviations and covariances) and
how to address Missing Values (i.e., Exclude cases pairwise or Exclude cases listwise). Note
that the pairwise/listwise setting does not affect your computations if you are only entering two
variables, but it can make a very large difference if you are entering three or more variables into
the correlation procedure.
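
As a rough illustration of this distinction in syntax, the missing-values choice appears as the /MISSING subcommand of the CORRELATIONS procedure. The third variable, Age, is only a hypothetical placeholder added to show a case where the setting matters.

* With LISTWISE, a case missing a value on any listed variable is dropped from every correlation.
* With PAIRWISE (the default), a case is excluded only from correlations for which it lacks data.
CORRELATIONS
  /VARIABLES=Height Weight Age
  /PRINT=TWOTAIL NOSIG
  /MISSING=LISTWISE.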
Output

The results will display the correlations in a table, labeled Correlations.

In this example, Height and Weight have a significant positive
correlation (r = 0.513, p < 0.001).

3. Calculation of Regression in SPSS

We will use the following example to understand how regression is calculated.

A salesperson for a large car brand wants to determine whether there is a
relationship between an individual's income and the price they pay for a car. As
such, the individual's "income" is the independent variable and the "price" they
pay for a car is the dependent variable. The salesperson wants to use this
information to determine which cars to offer potential customers in new areas
where average income is known.

In SPSS Statistics, we created two variables so that we could enter our data: Income (the
independent variable) and Price (the dependent variable).
The steps below show you how to analyze your data using linear regression in SPSS
Statistics when none of the six assumptions of linear regression have been violated. At the end
of these steps, we show you how to interpret the results from your linear regression.

Click Analyze > Regression > Linear... on the top menu.

You will be presented with the Linear Regression dialogue box.

Transfer the independent variable, Income, into the Independent(s): box and the dependent
variable, Price, into the Dependent: box. You can do this either by dragging and dropping the
variables or by using the appropriate buttons.
You now need to check four of the assumptions of linear regression: no significant outliers,
independence of observations, homoscedasticity, and normal distribution of errors/residuals.
You can do this by selecting the appropriate options in the relevant dialogue boxes of the Linear
Regression procedure. In our enhanced linear regression guide, we show you which options to
select in order to test whether your data meets these four assumptions.
Click the OK button. This will generate the results.
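
For completeness, the analysis above can also be expressed in syntax. The sketch below assumes the variables are named Income and Price, as above; the subcommands after /METHOD are one common way of requesting residual diagnostics (a Durbin-Watson statistic, a histogram and normal probability plot of the standardized residuals, a casewise listing of outliers, and a standardized residual-versus-predicted scatterplot) for the assumption checks just described, rather than the exact options the dialogue boxes would paste.

* Simple linear regression of car Price on Income.
* The last three subcommands request residual diagnostics for the assumption checks.
REGRESSION
  /MISSING LISTWISE
  /STATISTICS COEFF OUTS R ANOVA
  /DEPENDENT Price
  /METHOD=ENTER Income
  /SCATTERPLOT=(*ZRESID ,*ZPRED)
  /RESIDUALS DURBIN HISTOGRAM(ZRESID) NORMPROB(ZRESID)
  /CASEWISE PLOT(ZRESID) OUTLIERS(3).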
