
Research Process, Research Design

and Questionnaires
RESEARCH PROCESS
Identify and Define Research Problem

Theory / Practice

Hypotheses / Conceptualization

Research Design

Data collection

Data Analysis

Findings







In this workshop we
talk about all of the
steps in the research
process except Data
Analysis and Findings.
What is a problem?

. . . any situation where a gap exists between the
actual and the desired state.

A problem does not necessarily mean that something
is seriously wrong. It could simply indicate an interest
in improving an existing situation. Thus, problem
definitions can include both existing problems in the
current situation as well as the quest for idealistic
states in the future.
RESEARCH PROCESS Research Problem
How are problems identified?

1. Observation: the manager/researcher senses that changes
are occurring, or that some new behaviors, attitudes,
feelings, communication patterns, etc., are surfacing in
one's environment. The manager may not understand
exactly what is happening, but can definitely sense that
things are not what they should be.

2. Preliminary Data Collection: use of interviews, both
unstructured and structured, to get an idea or feel for
what is happening in the situation.

3. Literature Survey: a comprehensive review of the
published and unpublished work from secondary sources
of data in the areas related to the problem.
RESEARCH PROCESS Problem Identification
A literature survey ensures that:

1. Important variables likely to influence the problem are
not left out of the study.
2. A clearer idea emerges regarding what variables are
most important to consider, why they are important, and
how they should be investigated.
3. The problem is more accurately and precisely defined.
4. The interviews cover all important topics.
5. The research hypotheses are testable.
6. The research can be replicated.
7. One does not reinvent the wheel; that is, time is not
wasted trying to rediscover something that is already
known.
8. The problem to be investigated is perceived by the
scientific community as relevant and significant.
RESEARCH PROCESS Problem Identification
Typical Business Research Problems:

1. Training programs are not as effective as anticipated.
2. Sales volume of products/services is not increasing.
3. Balancing of accounting ledgers is becoming increasingly
difficult.
4. The newly installed information system is not being used by
the employees for whom it was designed.
5. Introduction of flexible work hours has created more
problems than it has solved.
6. Anticipated results of a recent merger/acquisition have not
been realized.
7. Inventory control systems are not effective.
8. Frequent interruptions in production.
9. Low employee morale.
10. Frequent customer complaints.
11. Installation of an MIS keeps getting delayed.
12. Ad campaign is not generating new sales prospects.

RESEARCH PROCESS Problem Identification
What are some business problems
you are aware of or have confronted?
RESEARCH PROCESS Problem Identification
Problem Definition Steps:
Understand and define the complete problem. If more
than one problem is identified, separate and prioritize
them in terms of by whom and when they will be dealt with.
Identify and separate out measurable symptoms to
distinguish the root problem from easily observable
symptoms. For example, a manager may identify
declining sales or lost market share as the problem, but
the real problem may be bad advertising, low salesperson
morale, or ineffective distribution. Similarly, low
productivity may be a symptom of employee morale or
motivation problems, or supervisor issues.
Determine the unit of analysis = individuals, households,
businesses, objects (e.g., products, stores), geographic
areas, etc., or some combination.
Determine the relevant variables, including specifying
independent and dependent relationships, constructs, etc.
RESEARCH PROCESS Problem Definition
Examples of Well-Defined problems:

1. Has the new packaging affected the sales of the product?
2. How do price and quality rate on consumers' evaluation of products?
3. Is the effect of participative budgeting on performance moderated by
control systems?
4. Does better automation lead to greater asset investment per dollar of
output?
5. Has the new advertising message resulted in higher recall?
6. To what extent do the organizational structure and type of information
systems account for the variance in the perceived effectiveness of
managerial decision-making?
7. Will expansion of international operations result in an improvement in
the firm's image and value?
8. What are the effects of downsizing on the long-range growth patterns
of companies?
9. What are the components of quality of life?
10. What are the specific factors to be considered in creating a data
warehouse for a manufacturing company?
RESEARCH PROCESS Problem Definition
RESEARCH PROCESS Definitions
Variable = the observable and measurable characteristics/attributes the
researcher specifies, studies, and draws conclusions about.

Types of Variables:
Independent variable = also called a predictor variable, it is a variable or
construct that influences or explains the dependent variable either in a positive
or negative way.
Dependent variable = also known as a criterion variable, it is a variable or
construct the researcher hopes to understand, explain and/or predict.
Moderator variable = a variable that has an effect on the relationship between the
independent and dependent variables. The presence of a moderator variable modifies
the original relationship between the independent and dependent variables by
interacting with the independent variable to influence the strength of the
relationship with the dependent variable.
Mediating variable = also known as an intervening variable, it is a variable
that surfaces as a function of the independent variable and explains the
relationship between the dependent and independent variables. Moderator
variables specify when certain effects will occur whereas mediators speak to
how or why such effects occur. Moreover, mediators explain how external
events take on internal psychological significance.
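To make the moderator/mediator distinction concrete, here is a minimal regression-style sketch (not from the source slides; the variable names and data are invented) showing how a moderator enters an analysis as an interaction term, while a mediator is examined by relating the independent variable to the mediator and then both to the dependent variable.

```python
# Hypothetical sketch: moderator vs. mediator in a regression setting, numpy only.
import numpy as np

rng = np.random.default_rng(0)
n = 200
x = rng.normal(size=n)                            # independent variable (e.g., price)
m = rng.normal(size=n)                            # moderator (e.g., discount restrictions)
med = 0.6 * x + rng.normal(scale=0.5, size=n)     # mediator (e.g., perceived value)
y = 0.4 * x + 0.3 * m + 0.5 * x * m + 0.7 * med + rng.normal(scale=0.5, size=n)

def ols(design, outcome):
    """Ordinary least squares coefficients via numpy's lstsq."""
    coefs, *_ = np.linalg.lstsq(design, outcome, rcond=None)
    return coefs

ones = np.ones(n)

# Moderation: the interaction term (x * m) captures how m changes the
# strength of the x -> y relationship.
moderation_design = np.column_stack([ones, x, m, x * m])
print("moderation coefficients:", ols(moderation_design, y))

# Mediation: x predicts the mediator, and the mediator (with x) predicts y;
# a shrinking direct x coefficient suggests partial or full mediation.
print("x -> mediator:", ols(np.column_stack([ones, x]), med))
print("x + mediator -> y:", ols(np.column_stack([ones, x, med]), y))
```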
RESEARCH PROCESS Definitions continued . . .
Measurement = the process of determining the direction and intensity of
feelings about persons, events, concepts, ideas, and/or objects of interest
that are defined as being part of the business problem. As part of
measurement, researchers use predetermined rules to assign numbers or
labels to: (1) individuals' attitudes, behaviors, characteristics, etc.; (2)
objects' features or attributes; and (3) any other phenomenon or event
being investigated. Rules tell researchers how to assign numbers or labels;
e.g., assign the numbers 1 to 7 to responses based on the intensity of an
individual's feelings, beliefs, etc.
Measurement involves two processes: (1) identification/development of
constructs; and (2) scale measurement. The first process involves
identifying and defining what is to be measured, while the second process
involves selecting the scale to measure the construct(s).
Construct = also referred to as a concept, it is an abstract idea formed in the
mind based on a set of facts or observations. The idea is a combination of a
number of similar characteristics of the construct. Examples of constructs
include: brand awareness, brand familiarity, purchase intentions,
satisfaction, importance, trust, service quality, role ambiguity, etc.
Scale measurement = using a set of symbols or numbers to represent the
range of possible responses to a research question.
Examples of Constructs Investigated in Marketing:
Construct: Operational Description
Brand Awareness: Percentage of respondents that have heard of a designated brand; awareness could be either unaided or aided.
Brand Attitudes: The number of respondents and their intensity of feeling positive or negative toward a specific brand.
Purchase Intentions: The number of people planning to buy the specified object (e.g., product or service) within a designated time period.
Importance of Factors: The extent to which specific factors influence a person's purchase choice.
Psychographics: The attitudes, opinions, interests and lifestyle characteristics of individuals providing the information.
Satisfaction: How people evaluate their post-purchase consumption experience with a particular product, service or company.
RESEARCH PROCESS Constructs
Role Ambiguity Construct
Conceptual/theoretical definition = the difference between the information
available to the person (actual knowledge) and that which is required for
adequate performance of a role.
Operational definition = the amount of uncertainty an individual feels
regarding job role responsibilities and expectations from supervisors,
other employees and customers.
Measurement scale = consists of 45 items assessed using a 5-point scale,
with category labels 1 = very certain, 2 = certain, 3 = neutral,
4 = uncertain, and 5 = very uncertain.
Examples of items:
How much freedom of action I am expected to have.
How I am expected to handle non-routine activities on the job.
The sheer amount of work I am expected to do.
To what extent my boss is open to hearing my point of view.
How satisfied my boss is with me.
How I am expected to interact with my customers.
Source: Singh & Rhoads, JMR, August 1991, p. 328.
Service Quality Construct
Conceptual/theoretical definition = the difference between an individual's
expectations of service and their actual experiences.
Operational definition = how individuals react to their actual service
experience with a company relative to their expectations that a
company will possess certain service characteristics.
Measurement scale = consists of 82 items assessed using a 7-point scale,
with category labels 1 = not at all essential to 7 = absolutely essential.
Examples of items:
Employees of excellent companies will give prompt service to customers.
Excellent companies will have the customers' best interests at heart.
Excellent companies will perform services right the first time.
Employees of excellent companies will never be too busy to respond to
customer requests.
Excellent companies will give customers individual attention.
Materials associated with products and services of excellent companies
(such as pamphlets or statements) will be visually appealing.
Source: Parasuraman, Zeithaml & Berry, JM, Fall 1985, p. 44.
RESEARCH PROCESS
Identify and Define Research Problem

Theory / Practice

Hypotheses / Conceptualization

Research Design

Data collection

Data Analysis

Findings







What is theory?
RESEARCH PROCESS Theory/Practice
Theory = a systematic set of relationships
providing a consistent and comprehensive explanation
of a phenomenon. In practice, a theory is a
researcher's attempt to specify the entire set of
dependence relationships explaining a particular set
of outcomes.

Theory is based on prior empirical research, past
experiences and observations of behavior, attitudes,
or other phenomena, and other theories that provide a
perspective for developing possible relationships.

Theory is used to prepare a theoretical framework
for the research.
RESEARCH PROCESS Theory/Practice
RESEARCH PROCESS
Identify and Define Research Problem

Theory / Practice

Hypotheses / Conceptualization

Research Design

Data collection

Data Analysis

Findings







Hypotheses = preconceptions the researcher
develops regarding the relationships represented
in the data, typically based on theory, practice or
previous research.
Examples:

The average number of cups of coffee students drink
during finals will be greater than the average they
consume at other times.

Younger, part-time employees of Samouel's restaurant
are more likely to search for a new job.
RESEARCH PROCESS Hypotheses
Theoretical Framework = a written description
that includes a conceptual model. It integrates all
the information about the problem in a logical
manner, describes the relationships among the
variables, explains the theory underlying these
relationships, and indicates the nature and direction
of the relationships.

The process of developing a theoretical
framework involves conceptualization, which is a
visual specification (conceptual model) of the
theoretical basis of the relationships you would like
to examine.
RESEARCH PROCESS Theoretical Framework
Basic Features of a Good Theoretical Framework:

1. The variables/constructs considered relevant to the study are
clearly identified and labeled.
2. The discussion states how the variables/constructs are
related to each other, e.g., dependent, independent,
moderator, etc.
3. If possible, the nature (positive or negative) of the
relationships as well as the direction is hypothesized on the
basis of theory, previous research or researcher judgment.
4. There is a clear explanation of why you expect these
relationships to exist.
5. A visual (schematic) diagram of the theoretical framework is
prepared to clearly illustrate the hypothesized relationships.

RESEARCH PROCESS Theory/Practice
RESEARCH PROCESS Conceptual Models
Model 1: Price (independent variable) → Purchase Likelihood (dependent variable)

Model 2: Price (independent variable) → Purchase Likelihood (dependent variable),
with Discount Level Restrictions as a moderator variable
RESEARCH PROCESS Conceptual Models
Model 1: Price (independent variable) → Perceived Value (mediator variable) → Purchase Likelihood (dependent variable)
(full mediation)

Model 2: Price → Perceived Value → Purchase Likelihood, plus a direct path from Price to Purchase Likelihood
(partial mediation)
Group Exercise: Use the Samouel's and Gino's
restaurant database variables to develop a
theoretical framework/conceptual model of the
relationships that could be examined. Consider
and evaluate several models, but be prepared to
report your most interesting or thought-provoking
model.
Theoretical Framework Conceptualization
Conceptual Models: Samouel's Employee Database
Potential Hypotheses:
Commitment is positively related to supervision, work groups and compensation.
Intention to Search is negatively related to supervision, work groups & compensation.
Model 1: Supervision, Work Groups and Compensation → Employee Commitment

Model 2: Supervision, Work Groups and Compensation → Intention to Search
Variable  Description  Variable Type
Restaurant Perceptions
X1  Excellent Food Quality  Metric
X2  Attractive Interior  Metric
X3  Generous Portions  Metric
X4  Excellent Food Taste  Metric
X5  Good Value for the Money  Metric
X6  Friendly Employees  Metric
X7  Appears Clean & Neat  Metric
X8  Fun Place to Go  Metric
X9  Wide Variety of Menu Items  Metric
X10  Reasonable Prices  Metric
X11  Courteous Employees  Metric
X12  Competent Employees  Metric
Selection Factor Rankings
X13  Food Quality  Nonmetric
X14  Atmosphere  Nonmetric
X15  Prices  Nonmetric
X16  Employees  Nonmetric
Relationship Variables
X17  Satisfaction  Metric
X18  Likely to Return in Future  Metric
X19  Recommend to Friend  Metric
X20  Frequency of Patronage  Nonmetric
X21  Length of Time a Customer  Nonmetric
Classification Variables
X22  Gender  Nonmetric
X23  Age  Nonmetric
X24  Income  Nonmetric
X25  Competitor  Nonmetric
X26  Which Ad Viewed (#1, 2 or 3)  Nonmetric
X27  Ad Rating  Metric
X28  Respondents that Viewed Ads  Nonmetric

Description of Customer Survey Variables
Variable  Description  Variable Type
Work Environment Measures
X1  I am paid fairly for the work I do.  Metric
X2  I am doing the kind of work I want.  Metric
X3  My supervisor gives credit and praise for work well done.  Metric
X4  There is a lot of cooperation among the members of my work group.  Metric
X5  My job allows me to learn new skills.  Metric
X6  My supervisor recognizes my potential.  Metric
X7  My work gives me a sense of accomplishment.  Metric
X8  My immediate work group functions as a team.  Metric
X9  My pay reflects the effort I put into doing my work.  Metric
X10  My supervisor is friendly and helpful.  Metric
X11  The members of my work group have the skills and/or training to do their job well.  Metric
X12  The benefits I receive are reasonable.  Metric
Relationship Measures
X13  Loyalty: I have a sense of loyalty to Samouel's restaurant.  Metric
X14  Effort: I am willing to put in a great deal of effort beyond that expected to help Samouel's restaurant be successful.  Metric
X15  Proud: I am proud to tell others that I work for Samouel's restaurant.  Metric
Classification Variables
X16  Intention to Search  Metric
X17  Length of Time an Employee  Nonmetric
X18  Work Type = Part-Time vs. Full-Time  Nonmetric
X19  Gender  Nonmetric
X20  Age  Nonmetric
X21  Performance  Metric

Description of Employee Survey Variables
RESEARCH PROCESS
Identify and Define Research Problem

Theory / Practice

Hypotheses / Conceptualization

Research Design

Data collection

Data Analysis

Findings







RESEARCH DESIGN Types
Research Design Alternatives by Purpose:

(1) Exploratory: to formulate the problem, develop
hypotheses, identify constructs, establish priorities
for research, refine ideas, clarify concepts, etc.

(2) Descriptive: to describe characteristics of certain
groups, estimate the proportion of people in a population
who behave in a given way, and to make directional
predictions.

(3) Causal: to provide evidence of the relationships
between variables, the sequence in which events
occur, and/or to eliminate other possible explanations.
Two Broad Approaches:
1. Qualitative.

2. Quantitative.
Research Design Approaches
Role of Qualitative Research:
Search of academic, trade and professional
literature (both traditional & Internet).
Use of interviews, brainstorming, focus groups.
Internalization of how others have undertaken
both qualitative and quantitative research.
Use of existing questionnaires/constructs.
Outcome of Qualitative Research:
Improve conceptualization.
Clarify research design, including data collection
approach.
Draft questionnaire.
RESEARCH DESIGN
Role of Quantitative Research:
Quantify data and generalize results from
sample to population.
Facilitates examination of a large number of
representative cases.
Structured approach to data collection.
Enables extensive statistical analysis.
Outcome of Quantitative Research:
Validation of qualitative research findings.
Confirmation of hypotheses, theories, etc.
Recommend final course of action.
RESEARCH DESIGN
RESEARCH PROCESS
Identify and Define Research Problem

Theory / Practice

Hypotheses / Conceptualization

Research Design

Data collection

Data Analysis

Findings







DATA COLLECTION
Approaches:

Observation
Human
Mechanical/Electronic Devices

Surveys
Self-Completion
Mail/Overnight Delivery/Fax
Electronic
Interviewer-Administered
Face-to-Face: Home, Work, Mall, Focus Groups
Telephone

DATA COLLECTION
Selection of data collection approach?

Budget
Knowledge of issues: qualitative vs. quantitative
Respondent Participation
Taste Test; Ad Test
Card Sorts; Visual Scaling
Time Available

DATA COLLECTION
Types of Data:

Primary

Secondary

PRIMARY DATA
Primary Data Sources:

Informal discussions; brainstorming
Focus groups
Observational Methods
Structured & Unstructured Surveys
Experiments

Primary Data Focus Groups
Focus Groups = bring a small group of people (10-12)
together for an interactive, spontaneous discussion of a
particular topic or concept. Discussion is led by a trained
moderator and usually lasts about 1½ hours.

Typical Objectives:

To identify and define problems.
To generate new ideas about products, services, delivery
methods, etc.
To test advertising themes, positioning statements, company
and product names, etc.
To discover new constructs and measurement methods.
To understand customer needs, wants, attitudes, behaviors,
preferences and motives.

Primary Data
Factors Influencing Overall Mobile Phone Satisfaction
2003 2002
Features 27% 21%
Durability 23% 16%
Physical Design 19% 28%
Battery Function 16% 16%
Operation 15% 19%


2004 Wireless Retail Sales Satisfaction Study
Sales Staff 44%
Price/Promotion 28%
Store Display 14%
Store Facility 14%

Source: J.D. Power and Associates, 2002, 2003 & 2004.
These factors
typically are
identified in
qualitative
focus groups
(exploratory
research).
These percentages
typically are determined
in quantitative surveys
(descriptive research).
Hotel Selection Factors:
1. Location
2. Past Experience
3. Recommendations of Friends and Family
4. Brand Reputation

Guest Satisfaction Factors:
1. Guest Room
2. Departure Process
3. Pre-Arrival/Arrival Experiences
4. Hotel Services
5. Food & Beverage services

Note: the first three factors account for more than 70 percent
of guest satisfaction ratings.

Source: J.D. Power & Associates, August 21, 2001.
Primary Data
Original Equipment Tire Satisfaction Study:

1. Product Quality 39%
- Number of tires with a problem
- Number of problems experienced
- Number of original tires replaced
2. Long-Term Performance 22%
- Wearability
- Length of warranty
- Overall reliability & dependability
- Freedom from pull to left or right
3. Situational Performance 19%
- Traction on wet roads
- Traction at fast starts
- Holds road well in emergencies
- Lack of vibration at highway speeds
- Overall safety
- Overall ride at highway speeds
4. Design 14%
- Road quietness
- Style & appearance of sidewalls
- Tread design
- Size of tire matches size of vehicle
5. Winter Traction 5%

Source: J.D. Power & Associates, August 27, 2001.
Primary Data
What is the construct
in this study?
PRIMARY DATA Focus Groups
Focus Groups:

Some of my best experiences?




Some of my worst experiences?





PRIMARY DATA Observations
CONSIDERATIONS:

Methods: human/mechanical/electronic.

Useful where respondent cannot or will not
articulate the answer.

Cannot be used to measure thoughts, feelings,
attitudes, opinions, etc.
Purpose of Questionnaires:
To obtain information that cannot be easily
observed or is not already available in
written or electronic form.

Questionnaires enable researchers to measure
concepts/constructs.
PRIMARY DATA QUESTIONNAIRES




Steps in Questionnaire Design:
1. Initial Considerations: problem, objectives,
target population, sampling, etc.
2. Clarification of Concepts: select variables,
constructs, measurement approach, etc.
3. Developing the Questionnaire
Length and sequence.
Types of questions.
Sources of questions.
Wording, coding, layout and instructions.
4. Pre-testing the Questionnaire.
5. Questionnaire Administration Planning.
QUESTIONNAIRE DESIGN
Open-ended Questions = place no constraints on
respondents; i.e., they are free to answer in their own
words and to give whatever thoughts come to mind.

Closed-ended Questions = respondent is given the
option of choosing from a number of predetermined
answers.
Two Types of Questions:
1. Open-ended
2. Closed-ended
QUESTIONNAIRE DESIGN
Examples of Open-ended Questions:

How do you typically decide which restaurant you will
eat at?
Which mutual funds have you been investing in for the
past year?
How are your investment funds performing?
Do you think airport security is better now than it
was six months ago?
QUESTIONNAIRE DESIGN
Open-ended Questions
Typically used in exploratory/qualitative studies.
Typically used in personal interview surveys involving
small samples.
Allows respondent freedom of response.
Respondent must be articulate and willing to spend time
giving a full answer.
Data is in narrative form, which can be time-consuming and
difficult to code and analyze.
Possible researcher bias in interpretation.
Narrative is analyzed using content analysis. Software
is available (e.g., NUD*IST).
QUESTIONNAIRE DESIGN
Content Analysis Software:
TextSmart is a software package that enables users to view,
manipulate and automate the coding or categorization of narrative
response data. The ability to automate the examination and
organization of narrative data is particularly helpful when a
large-scale survey is undertaken. It can be used to analyze any
textual data, and its output can be exported to SPSS for further
analysis. For example, you can do correspondence analysis on a
contingency table from a TextSmart analysis. For more information
about TextSmart and related SPSS products visit the WWW site
www.spss.com.

QSR NUD*IST stands for Non-Numerical Unstructured Data
Indexing and Theorizing. It is a popular computer software package
used by researchers to analyze text from focus group or interview
transcripts, literary documents and so on. It can also index non-textual
data such as photographs, tape recordings, films and so on. Users can
use it to index and link several documents in a structured way to
produce categorical data in a form amenable to further analysis.
NUD*IST output can be exported to software programs such as SPSS
and Excel. For more information about QSR NUD*IST and its related
product NVIVO visit their website
(http://www.scolari.co.uk/qsr/qsr_n4.htm).
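For illustration only, the sketch below shows the basic idea behind automated coding of narrative (open-ended) responses using plain Python. It is not TextSmart or NUD*IST, and the codebook, categories and responses are invented.

```python
# Hypothetical sketch: keyword-based coding of open-ended responses
# into categories, then tabulating category frequencies.
from collections import Counter

# Category -> keywords mapping; both the categories and keywords are illustrative.
codebook = {
    "food quality": ["taste", "fresh", "delicious", "quality"],
    "service": ["waiter", "staff", "friendly", "rude", "slow"],
    "price": ["expensive", "cheap", "price", "value"],
}

responses = [
    "The food was fresh and delicious but the staff was slow.",
    "Great value for the price.",
    "Our waiter was rude.",
]

def code_response(text):
    """Return the set of categories whose keywords appear in the text."""
    text = text.lower()
    return {cat for cat, words in codebook.items()
            if any(word in text for word in words)}

counts = Counter(cat for r in responses for cat in code_response(r))
print(counts)   # frequency of each category across respondents
```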

Closed-end Questions:

Single Answer
Multiple Answer
Rank Order
Numeric
Likert-Type Scales
Semantic Differential
QUESTIONNAIRE DESIGN
1. Did you check your email this morning? __ Yes __ No
2. Do you believe Enron senior executives should be put in jail? __ Yes __ No
3. Should the U.K. adopt the Euro or keep the Pound?
__ Adopt the Euro
__ Keep the Pound
4. Which countries in Europe have you traveled to in the last six months?
__ Belgium
__ Germany
__ France
__ Holland
__ Italy
__ Switzerland
__ Spain
__ Other (please specify) _____________
5. How often do you eat at Samouel's Greek Cuisine restaurant?
__ Never
__ 1-4 times per year
__ 5-8 times per year
__ 9-12 times per year
__ More than 12 times per year
Examples of Closed-end Questions:
Closed-end Questions
Typically used in quantitative studies.
Assumption is researcher has knowledge to pre-specify
response categories.
Data can be pre-coded and is therefore in a form amenable
for use with statistical packages (e.g., SPSS, SAS);
data capture is therefore easier (see the coding sketch below).
More difficult to design but simplifies analysis.
Used in studies involving large samples.
Limited range of response options.
QUESTIONNAIRE DESIGN
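A minimal sketch of the pre-coding idea noted in the list above, assuming illustrative category labels and numeric codes (not taken from any particular survey):

```python
# Hypothetical sketch: pre-coding closed-end responses as numbers so the
# data file can be read directly by a statistics package.
PATRONAGE_CODES = {
    "Never": 0,
    "1-4 times per year": 1,
    "5-8 times per year": 2,
    "9-12 times per year": 3,
    "More than 12 times per year": 4,
}

raw_answers = ["Never", "5-8 times per year", "1-4 times per year"]
coded = [PATRONAGE_CODES[a] for a in raw_answers]
print(coded)   # [0, 2, 1] -- ready for import into SPSS, SAS, etc.
```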
Broad Considerations
Sequencing of questions.
Identification of concepts.
How many questions are required to capture
each concept.
Question wording.
Overall length of questionnaire.
Placing of sensitive questions.
Ability of respondents.
Level of measurement.
Open-ended versus closed-end questions.
QUESTIONNAIRE DESIGN
Questionnaire Sequence
Opening Questions
Research Topic Questions
Classification Questions
QUESTIONNAIRE DESIGN
Screening or Filter Questions:
. . . are used to ensure respondents included in the
study are those that meet the pre-determined criteria
of the target population.


Tonight we are talking with individuals who are 18
years of age or older and have 50 percent or more of
the responsibility for banking decisions in your
household. Are you that person? __ Yes __ No
QUESTIONNAIRE DESIGN Opening Questions
Rapport Questions:
. . . are used to establish rapport with the respondent
by gaining their attention and stimulating their interest
in the topic.

Have you seen any good movies in the last month?
__ Yes __ No

What is your favorite seafood restaurant?

QUESTIONNAIRE DESIGN Opening Questions
Concept/construct = an abstract idea formed in the mind. The idea
is a combination of a number of similar characteristics/variables that
collectively define the concept and are used to measure it. Constructs
are abstract/intangible and cannot be directly observed or measured
because they are the mental images a person attaches to an object,
such as attitudes, feelings, perceptions, expectations, or expressions
of future actions (e.g., purchase intentions).
Example Concept: Customer Service issues for
a B-to-B situation

Reliable delivery
Technical sales Support
Inside sales representatives
Field sales representatives
Complaint resolution
Ordering/Invoicing
Website design
QUESTIONNAIRE DESIGN
Research Topic Questions
Concepts
Concept Identification
Conceptual definition, e.g., Service Quality:
As perceived by customers, it is the difference
between customers' expectations or desires
of a vendor and their perceptions of the actual
situation (their experiences).

Working Definition for Concept
Decompose definition into components.
Search for items that are measurable.
QUESTIONNAIRE DESIGN
Service Quality Construct:




Research has shown the service quality construct can be indirectly
represented by the following measurable components:
The service provider's ability to . . .
communicate and listen to consumers;
sincerely empathize with customers in interpreting their needs
and wants;
be tactful in responding to customers' questions, objections, and
problems;
create an impression of reliability in performing services;
create an image of credibility by keeping promises;
demonstrate sufficient technical knowledge and competence;
exhibit strong interpersonal skills in dealing with customers.
QUESTIONNAIRE DESIGN
Concept Development Exercise:
Concept = Restaurant Service Quality

1. What are the components of service quality as
they relate to a restaurant?

2. How do you measure these components?
QUESTIONNAIRE DESIGN
Preparing Good Questions:



Use Simple Words.
Be brief.
Avoid Ambiguity.
Avoid Leading Questions.
Avoid Double-Barreled Questions.
Check Questionnaire Layout.
Prepare Clear Instructions.
Watch Question Sequence.
QUESTIONNAIRE DESIGN
QUESTIONNAIRE DESIGN





Recently a survey was conducted by the United Nations using a
sample from several different countries. The question asked
was:

" Would you please give your opinion about the food shortage in
the rest of the world?"

The survey was a huge failure. Why?

In Africa they did not know what 'food' meant.
In Western Europe, they did not know what 'shortage' meant.
In Eastern Europe they did not know what 'opinion' meant.
In South America they did not know what 'please' meant.
And in the U.S., they did not know what 'the rest of the
world' meant.
Avoid Position Bias:




Position Bias:
How important are flexible hours in evaluating
job alternatives?
What factors are important in evaluating
job alternatives?

No Position Bias:
What factors are important in evaluating
job alternatives?
How important are flexible hours in evaluating
job alternatives?
QUESTIONNAIRE DESIGN
QUESTIONNAIRE DESIGN





To what extent do you agree or disagree with the
following statements?

Harrods employees are friendly and helpful.

Harrods employees are courteous and knowledgeable.

Double-Barreled Questions:
QUESTIONNAIRE DESIGN





. . . are used to direct respondents to answer the right
questions as well as questions in the proper sequence.

Have you seen or heard any advertisements for wireless
telephone service in the past 30 days?
If No, go to question #10.
If Yes, were the advertisements on radio or TV or both?
If the advertisements were on TV or on both radio and
TV, then go to question #6.
If the advertisements were on radio, then go to
question #8.

Following questions #6 and #8 the next question would be:

Were any of the advertisements for Sprint PCS?
Branching Questions:
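A small sketch, assuming the question numbers from the example above, of how this branching (skip) logic could be expressed programmatically, for instance in a computer-assisted or web survey:

```python
# Hypothetical sketch of the branching logic described above, written as a
# simple routing function; the question numbers follow the slide's example.
def next_question(saw_ad, medium=None):
    """Return the number of the next question to ask."""
    if not saw_ad:
        return 10                 # If No, go to question #10
    if medium in ("TV", "both"):
        return 6                  # TV, or both radio and TV -> question #6
    if medium == "radio":
        return 8                  # radio only -> question #8
    raise ValueError("medium must be 'TV', 'radio' or 'both' when saw_ad is True")

print(next_question(False))          # 10
print(next_question(True, "radio"))  # 8
print(next_question(True, "both"))   # 6
```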
QUESTIONNAIRE DESIGN





Introducing and explaining how to answer a series of
questions on a particular topic.
Transition statements from one section (topic) of the
questionnaire to another.
Which question to go to next (branching or skipping).
How many answers are acceptable, e.g., "Check only
one response" or "Check as many as apply."
Whether respondents are supposed to answer the
question by themselves, or can consult another
person or reference materials.
What to do when the questionnaire is completed, e.g.,
"When finished, place this in the postage-paid
envelope and mail it."
Issues Self-Completion Instructions:
QUESTIONNAIRE DESIGN





How to increase respondent participation.
How to screen out respondents that are not wanted and
still keep them happy.
What to say when respondents ask how to answer a
particular question.
When concepts may not be easily understood, how to
define them.
When answer alternatives are to be read to respondents
(aided response) or not to be read (unaided response).
How to follow branching or skip patterns.
When and how to probe.
How to end the interview.
Issues Interviewer-Assisted Instructions:
Identify the response bias in each of the questions below:
1. Do you advocate a lower speed limit to save human lives?
2. When you visited the museum, how many times did you read the
plaques that explain what the exhibit contained?
3. About what time do you ordinarily eat dinner?
4. How important is it for stores to carry a large variety of different
brands of this product?
5. Would you favor increasing taxes to cope with the current fiscal
crisis?
6. Dont you see some danger in the new policy?
7. What small appliance, such as countertop appliances, have you
purchased in the past month?
8. When you buy fast food, what percentage of the time do you
order each of the following types of food?
9. Do you like orange juice?
QUESTIONNAIRE DESIGN
Comments on Questions:
1. A loaded question because everyone wants to save lives. Also, it
presumes that lower speed limits save lives.
2. Too specific because respondents likely cannot remember the
exact number of times.
3. Ambiguous because it is not clear whether "dinner" refers to the
midday or the evening meal.
4. Not specific enough about the types of stores.
5. Overemphasis because it refers to a "crisis."
6. Leading question because it uses "danger" in the sentence.
7. Answers are likely to relate only to countertop appliances and not
all small appliances.
8. Overgeneralization because it does not specify a time period.
9. Ambiguous because respondents may like orange juice for themselves,
or for their kids, but the question does not say which.
QUESTIONNAIRE DESIGN
Objective: to identify possible shortcomings of the questionnaire.
Approaches: informal or formal.
No hard and fast rules.
Can assess:
clarity of questions
clarity of instructions
cover letter
which questions are relevant
whether key questions have been overlooked
sources of bias
adequacy of codes and categories for pre-coded questions
quality of responses
likely response rate
time to complete the questionnaire
cost of data collection
ability to perform meaningful analyses
Pre-testing Questionnaires:
QUESTIONNAIRE DESIGN
Scales = the approach used to measure
concepts (constructs).
Two Options:
1. Use published scales.
2. Develop original scales.
Scale Development

Sources of Published Scales
Organizational Behavior and Management
Price, James L., Handbook of Organizational Measurement, International Journal of
Manpower, Vol. 18, Number 4/5/6, 1997, ISSN 0143-7720, www.mcb.co.uk
Has 28 chapters with constructs measuring organizational behavior.

Management Information Systems (MIS)
www.ucalgary.ca/~newsted/surveys.html.
www.misq.org/archivist/home.html.

Marketing
Bearden, William O. and Richard Netemeyer, Handbook of Marketing Scales, Sage
Publications, 2nd ed., 1998. Summarizes over 130 marketing related scales.

Bruner, Gordon and Paul Hensel, Marketing Scales Handbook, Chicago, Ill.: American
Marketing Association, 1992. Includes almost 600 scales.

General
Robinson, John P., Phillip R. Shaver and Lawrence S. Wrightsman, Measures of
Personal and Social Psychological Attitudes, San Diego, CA: Academic Press, 1991.
Contains over 150 published scales in 11 different areas.

Buros Institute of Mental Measurements website has reviews of published tests
and measurements. www.unl.edu/buros

Online Questionnaire Design

Decision Analyst
www.decisionanalyst.com

Decisive Technology
www.decisive.com

Perseus Development
www.perseusdevelopment.com

Socratic Technologies
www.sotech.com

SPSS
www.spss.com

Survey Builder
www.surveybuilder.com

SurveyPro
www.surveypro.com

SurveySez
www.surveysez.com

WebSurveyor
www.websurveyor.com

Types of Scales:
Metric (interval & ratio)
Likert-type
Summated-Ratings (Likert)
Numerical
Semantic Differential
Graphic-Ratings
Nonmetric (nominal & ordinal)
Categorical
Constant Sum Method
Paired Comparisons
Rank Order
Sorting
MEASUREMENT SCALES
Examples of Likert-Type Scales:
When I hear about a new restaurant, I eat there to see what
it is like.

1 = Strongly Agree   2 = Agree Somewhat   3 = Neither Agree nor Disagree   4 = Disagree Somewhat   5 = Strongly Disagree
MEASUREMENT SCALES Metric
When I hear about a new restaurant, I eat there to see what
it is like.

Strongly Agree   1   2   3   4   5   Strongly Disagree
(only the endpoints are labeled)
Summated Ratings Scales:
A scaling technique in which respondents are asked to
indicate their degree of agreement or disagreement with
each of a number of statements. A subject's attitude score
(summated rating) is the total obtained by summing over
the items in the scale and dividing by the number of items
to get the average.

Example:

My sales representative is . . . .
SD D N A SA
Courteous ___ ___ ___ ___ ___
Friendly ___ ___ ___ ___ ___
Helpful ___ ___ ___ ___ ___
Knowledgeable ___ ___ ___ ___ ___
MEASUREMENT SCALES Metric
Alternative Approach to Summated Ratings scales:

When I hear about a new restaurant, I eat there to see what it is like.

1 = Strongly Agree   2 = Agree Somewhat   3 = Neither Agree nor Disagree   4 = Disagree Somewhat   5 = Strongly Disagree

I always eat at new restaurants when someone tells me they are good.

1 = Strongly Agree   2 = Agree Somewhat   3 = Neither Agree nor Disagree   4 = Disagree Somewhat   5 = Strongly Disagree
MEASUREMENT SCALES Metric
This approach includes a separate labeled Likert scale with each item
(statement). The summated rating is the total of the responses for all the
items divided by the number of items (a short computational sketch follows below).
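The computational sketch referred to above: a minimal example, with made-up responses, of how the summated (averaged) rating is obtained for each respondent.

```python
# Hypothetical sketch: computing a summated (averaged) rating across the
# items of a Likert-type scale for each respondent. Responses are coded
# 1 (Strongly Agree) to 5 (Strongly Disagree), as in the examples above.
import numpy as np

# Rows = respondents, columns = the items (statements) of the scale.
responses = np.array([
    [1, 2, 2, 1],   # respondent 1
    [4, 5, 4, 4],   # respondent 2
    [3, 3, 2, 3],   # respondent 3
])

summated_rating = responses.sum(axis=1) / responses.shape[1]
print(summated_rating)   # average item score per respondent
```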
Numerical Scales:
Example:

Using a 10-point scale, where 1 is "not at all important"
and 10 is "very important," how important is ______ in
your decision to do business with a particular vendor?

Note: you fill in the blank with an attribute, such as reliable
delivery, product quality, complaint resolution, and so forth.
MEASUREMENT SCALES Metric
Semantic Differential Scales:
A scaling technique in which respondents are asked to
check which space between a set of bipolar adjectives or
phrases best describes their feelings toward the stimulus
object.

Example:
My sales representative is . . . .
Courteous ___ ___ ___ ___ ___ Discourteous
Friendly ___ ___ ___ ___ ___ Unfriendly
Helpful ___ ___ ___ ___ ___ Unhelpful
Honest ___ ___ ___ ___ ___ Dishonest
MEASUREMENT SCALES Metric
Graphic-Ratings Scales:
A scaling technique in which respondents are asked to indicate their
ratings of an attribute by placing a check at the appropriate point
on a line that runs from one extreme of the attribute to the other.

Please evaluate each attribute in terms of how important the
attribute is to you personally (your company) by placing an X
at the position on the horizontal line that most reflects your
feelings.
Not Important Very Important
Courteousness _____________________________________
Friendliness _____________________________________
Helpfulness _____________________________________
Knowledgeable _____________________________________
MEASUREMENT SCALES Metric
Categorical scale:
Categorical scales are nominally measured opinion
scales that have two or more response categories.

How satisfied are you with your current job?
[ ] Very Satisfied
[ ] Somewhat Satisfied
[ ] Neither Satisfied nor Dissatisfied
[ ] Somewhat Dissatisfied
[ ] Very Dissatisfied

Note: Some researchers consider this a metric scale when the categories are coded 1 to 5.
MEASUREMENT SCALES Nonmetric
Constant-Sum Method:
A scaling technique in which respondents are asked to divide
some given sum among two or more attributes on the basis of
their importance to them.

Please divide 100 points among the following attributes in
terms of the relative importance of each attribute to you.

Courteous Service ____
Friendly Service ____
Helpful Service ____
Knowledgeable Service ____
Total 100
MEASUREMENT SCALES Nonmetric
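A minimal sketch, with invented point allocations, of how constant-sum responses can be checked and converted into relative importance weights:

```python
# Hypothetical sketch: verifying that a constant-sum allocation adds to 100
# points and converting it to relative importance weights.
allocation = {
    "Courteous Service": 20,
    "Friendly Service": 15,
    "Helpful Service": 40,
    "Knowledgeable Service": 25,
}

total = sum(allocation.values())
if total != 100:
    raise ValueError(f"Allocations must sum to 100, got {total}")

weights = {attr: points / total for attr, points in allocation.items()}
print(weights)   # e.g., Helpful Service carries weight 0.40
```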
Paired Comparison Method:
A scaling technique in which respondents are given
pairs of stimulus objects and asked which object in a
pair they prefer most.

Please circle the attribute describing a sales
representative which you consider most desirable.

Courteous versus Knowledgeable
Friendly versus Helpful
Helpful versus Courteous
MEASUREMENT SCALES Nonmetric
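A minimal sketch, with invented choices, of how paired-comparison responses can be tallied into an overall preference ordering:

```python
# Hypothetical sketch: tallying paired-comparison choices across respondents
# to obtain an overall preference ordering of the attributes.
from collections import Counter

# Each tuple is (pair shown, attribute the respondent circled).
choices = [
    (("Courteous", "Knowledgeable"), "Knowledgeable"),
    (("Friendly", "Helpful"), "Helpful"),
    (("Helpful", "Courteous"), "Helpful"),
    (("Courteous", "Knowledgeable"), "Courteous"),
]

wins = Counter(winner for _pair, winner in choices)
print(wins.most_common())   # attributes ordered by how often they were preferred
```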
Sorting:
A scaling technique in which respondents are
asked to indicate their beliefs or opinions by
arranging objects (items) on the basis of
perceived importance, similarity, preference
or some other attribute.
MEASUREMENT SCALES Nonmetric
Rank Order Method:
A scaling technique in which respondents are presented
with several stimulus objects simultaneously and asked
to order or rank them with respect to a specific
characteristic.

Please rank the following attributes on how important each is
to you in relation to a sales representative. Place a 1 beside
the attribute which is most important, a 2 next to the
attribute that is second in importance, and so on.

Courteous Service ___
Friendly Service ___
Helpful Service ___
Knowledgeable Service ___

MEASUREMENT SCALES Nonmetric
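A minimal sketch, with invented rankings, of one common (if simplified) way to summarize rank-order data: computing each attribute's mean rank across respondents. Ranks are ordinal, so such averages should be interpreted cautiously.

```python
# Hypothetical sketch: mean rank of each attribute across respondents
# (a lower mean rank indicates the attribute was ranked as more important).
rankings = [
    {"Courteous": 2, "Friendly": 4, "Helpful": 1, "Knowledgeable": 3},
    {"Courteous": 3, "Friendly": 4, "Helpful": 2, "Knowledgeable": 1},
    {"Courteous": 1, "Friendly": 3, "Helpful": 2, "Knowledgeable": 4},
]

attributes = rankings[0].keys()
mean_rank = {a: sum(r[a] for r in rankings) / len(rankings) for a in attributes}
for attr, rank in sorted(mean_rank.items(), key=lambda kv: kv[1]):
    print(attr, round(rank, 2))
```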
Practical Decisions When Developing Scales:




Number of items (indicators) to measure a concept?
Number of scale categories?
Odd or even number of categories?
(Include a neutral point?)
Balanced or unbalanced scales?
Forced or non-forced choice?
(Include a "Don't Know" option?)
Category labels for scales?
Scale reliability and validity?
Scale Development
Balanced vs. Unbalanced Scales?




Balanced:
To what extent do you consider TV shows with sex and
violence to be acceptable for teenagers to view?
__ Very Acceptable
__ Somewhat Acceptable
__ Neither Acceptable nor Unacceptable
__ Somewhat Unacceptable
__ Very Unacceptable
Unbalanced:
__ Very Acceptable
__ Somewhat Acceptable
__ Unacceptable

Scale Development
Forced or Non-Forced?



How likely are you to purchase a laptop PC in the next six months?
Very Unlikely   1   2   3   4   5   6   Very Likely      __ No Opinion
Scale Development
Category Labels for Scales?




Verbal Label:
How important is the size of the hard drive in selecting a laptop PC to purchase?
1 = Very Unimportant   2 = Somewhat Unimportant   3 = Neither Important nor Unimportant   4 = Somewhat Important   5 = Very Important

Numerical Label:
How likely are you to purchase a laptop PC in the next six months?
Very Unlikely   1   2   3   4   5   Very Likely
Unlabeled:
How important is the weight of the laptop PC in deciding which brand
to purchase?
Very Unimportant   ___   ___   ___   ___   ___   Very Important
Scale Development
Choosing a Measurement Scale:
Capabilities of Respondents.
Context of Scale Application.
Data Analysis Approach.
Validity and Reliability.
MEASUREMENT SCALES
Assessing Measurement Scales:

Validity

Reliability
MEASUREMENT SCALES
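One widely used reliability check for multi-item scales is Cronbach's alpha (coefficient alpha). The slides do not show a formula, so the sketch below is an illustrative implementation with made-up data, not the workshop's prescribed method.

```python
# Hypothetical sketch: Cronbach's alpha, a common internal-consistency
# (reliability) estimate for a multi-item scale.
import numpy as np

def cronbach_alpha(items):
    """items: 2-D array, rows = respondents, columns = scale items."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

scale = np.array([
    [4, 5, 4, 5],
    [2, 2, 3, 2],
    [3, 3, 3, 4],
    [5, 4, 5, 5],
    [1, 2, 1, 2],
])
print(round(cronbach_alpha(scale), 3))   # values around 0.7 or above are usually treated as acceptable
```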
Measurement Error = occurs when the
values obtained in a survey (observed values)
are not the same as the true values
(population values).
RESEARCH DESIGN
Types of Errors:

Nonresponse = problem definition, refusal, sampling, etc.
Response = respondent or interviewer.
Data Collection Instrument:
Construct Development.
Scaling Measurement.
Questionnaire Design/Sequence, etc.
Data Analysis.
Interpretation.
SECONDARY DATA
Data that has been gathered
previously for other purposes.
SECONDARY DATA
Secondary Data Issues:

Availability
Relevance
Accuracy
Sufficiency

RESEARCH PROCESS
Identify and Define Research Problem

Theory / Practice

Hypotheses / Conceptualization

Research Design

Data collection

Data Analysis

Findings







Methods:
Dependence
Multiple Regression
Discriminant Analysis
ANOVA/MANOVA

Interdependence
Factor Analysis
Cluster Analysis
Data Analysis
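As an illustration of a dependence method, here is a minimal multiple regression sketch using ordinary least squares. The variable names echo the customer survey described earlier, but the data and coefficients are invented.

```python
# Hypothetical sketch of a dependence method (multiple regression) with numpy only.
import numpy as np

rng = np.random.default_rng(1)
n = 100
food_quality = rng.uniform(1, 7, n)    # illustrative 7-point perception ratings
prices = rng.uniform(1, 7, n)
employees = rng.uniform(1, 7, n)
satisfaction = (0.5 * food_quality + 0.2 * prices + 0.3 * employees
                + rng.normal(scale=0.5, size=n))

# Design matrix with an intercept column; coefficients estimated by least squares.
X = np.column_stack([np.ones(n), food_quality, prices, employees])
coefs, *_ = np.linalg.lstsq(X, satisfaction, rcond=None)
print("intercept and slopes:", np.round(coefs, 2))
```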
Learning Checkpoint:

Define a research problem to be studied.
Identify the topics/concepts that will be covered
to answer research questions.
Identify the types of questions and/or scaling
you will use.
How will you evaluate the questions/scales you use?
Determine the best way to collect the data.
Present group suggestions; defend.
Research Design & Data Collection
