
Topics & Presenters

Topics:
• Qualitative Research Instrument
• Quantitative Research Instrument
• Validity & Reliability of Quantitative Research Instrument

Presenters: Permata Salsabila & Yeni Fitri Nurqadriyanti H

Stages of Qualitative Research
Creswell (2012)

1. Exploring the problem
2. Having a literature review
3. Stating purpose & research questions
4. Collecting data — the instrument is a tool for measuring, observing, or documenting data
5. Writing the report
Instrument

In qualitative research, the instrument is "the researcher itself": "the researcher is the key instrument" (Sugiyono, 2017). The researcher must be valid.

The human instrument aims to:
• determine the purpose of research
• collect, evaluate, and analyze the data
• draw conclusions
Why does the researcher become the qualitative research instrument?
Nasution in Sugiyono (2017)

01 The researcher is a sensitive tool who can react to any stimulus from the environment.
02 The researcher can adapt to all aspects of the situation and can collect diverse data.
03 The researcher can analyze the data and make hypotheses.
04 Only a human researcher can draw a conclusion.
How to collect qualitative data?

• Observation
• Interview
• Documents
• Audiovisual Materials
Observation

Observation is the process of gathering open-ended, firsthand information by observing people and places at a research site.

Advantage: it fosters an in-depth and rich understanding of a phenomenon, situation, setting, and the behavior of the participants in that setting.

Disadvantage: you will be limited to those sites and situations where you can gain access.
Observation Method

1. Select a site & people to observe
2. Ease into the site
3. Identify what to observe (5W + 1H)
4. Design some means for recording notes (field notes)
5. Conduct the observation
6. Write the observation report
Interview

An interview consists of oral questions asked by the interviewer and oral responses by the research participants.

Advantages:
• Provides useful information.
• We can see participants' behavior directly.

Disadvantages:
• The researchers have to filter the answers to make a summary.
• It can go anywhere: participants may say more than they intended to say.
Forms of Interview
Creswell (2012)

• One-on-one / face-to-face interview
• Focus group interview
• Telephone interview
• Email interview
Steps in Constructing an Interview

1. Defining the purpose of the study
2. Selecting a sample
3. Designing the interview format
4. Developing questions
5. Selecting and training interviewers
6. Conducting the interview
7. Analyzing the interview data

Types of Question Items on an Interview

Closed-ended questions:
• Do you strongly agree?
• Do you agree?
• Are you undecided?
• Do you disagree?
• Do you strongly disagree?

Open-ended questions:
• Please explain your response in more detail.
Documents

Documents consist of public and private records that qualitative researchers obtain about a site or participants in a study: newspapers, minutes of meetings, personal journals, and letters.

Advantage: they provide valuable information, ready to be analyzed without requiring transcription.

Disadvantage: sometimes difficult to obtain and locate.
Steps in Collecting Documents

1. Identify the types of documents
2. Seek permission to use them
3. If you ask participants to keep a journal, provide specific instructions
4. Examine the accuracy, completeness, and usefulness in answering the research questions
5. Record information from the documents
Audiovisual Materials

Audiovisual materials consist of images or sounds: videotapes, digital images, paintings, films, and pictures.

Advantages:
• People easily relate to images because they are so pervasive in our society.
• They provide an opportunity for the participants to share their perceptions of reality directly.

Disadvantage: difficult to analyze because of the rich information.
Steps in Constructing Audiovisual Materials

1. Determine what visual material can provide information to answer the research questions
2. Identify the visual material available and obtain permission to use it
3. Check the accuracy and authenticity of the visual material
4. Collect the data and organize it
Quantitative Research Instruments

In quantitative research, instruments are used to measure or observe the variables in the study.

HOW? Develop the instrument yourself, locate and modify an existing one, or locate one and use it entirely.

WHAT? Questionnaires and tests.
QUESTIONNAIRES

Constructing Questionnaires

1. Reviewing the literature
2. Knowing the respondents
3. Deciding the required information
4. Constructing questionnaire items
5. Reexamining and revising questions
6. Pretesting questionnaires
7. Editing questionnaires
8. Specifying procedures for use
Types of Questionnaires

• Structured: consisting of a set of standardized questions with a fixed scheme, specified wording, and ordered questions, such as closed-ended questions.
• Semi-structured: containing both open-ended questions and closed-ended questions.
• Unstructured: mostly containing open-ended questions.
Questionnaire Formats

• Dichotomous question
• Multiple-choice question
• Rank-order scaling
• The rating scale
• The Stapel scale
• The constant-sum question
• The demographic question
• The open-ended question
Advantages of Questionnaires
• Reach a large number of people relatively easily and economically
• Provide quantifiable answers
• Less time-consuming than interview or observation
• Easy to analyze

Disadvantages of Questionnaires
• It is complex
• Unanswered questions
Types of Tests: Parametric & Non-Parametric

A parametric test is a hypothesis test providing generalizations for making statements about the mean of the population. It is used when the researcher has information about the population parameters.

A non-parametric test makes fewer underlying assumptions and does not require the population's distribution to be described by specific parameters. It is used when the researcher has no information regarding the population parameters.
Norm-Referenced & Criterion-Referenced Tests

A norm-referenced test is used to measure students' performances in comparison to other students (to check whether students have performed better or worse than others).

A criterion-referenced test is an assessment that measures students' performances against fixed criteria. It uses test scores to judge them, with items such as quizzes, multiple-choice, true-false, and open-ended questions.
Types of Data to Measure

• Performance data: achievement tests, intelligence tests, aptitude tests, and others.
• Behavioural data: behavioural checklists.
• Attitudinal data: affective scales, feelings.
• Factual data: public documents or school records, such as grade reports, school attendance records, student demographic data, census data, and others.
Criteria of Good Instruments

01 Widely cited & recent
02 Accepted scales of measurement
03 Fit the research questions/hypotheses in your study
04 Reviewed by other authors
Validity of Quantitative Instrument

Validity is the degree to which a test measures what it is supposed to measure and, consequently, permits appropriate interpretation of scores (Gay et al., 2012).

Three types of validity: content validity, criterion-related validity, and construct validity.
Content Validity

Content validity is the degree to which a test measures an intended content area (Gay et al., 2012). It comprises item validity and sampling validity.
How to measure content validity?

By experts' judgment (a minimum of 3 experts), using the Aiken index V or the Gregory index.

Aiken's V:

V = Σs / [n(c − 1)], where s = r − lo

V  = Aiken's validity index
lo = the lowest score in the scoring category
c  = the number of categories/criteria
r  = the rater's category selection score using a Likert scale
n  = the number of raters

The V index ranges from 0 to 1: strongly valid if V > 0.8, moderately valid if V = 0.4–0.8, weakly valid if V < 0.4.
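As a worked sketch of Aiken's formula, V for a single item can be computed directly from the experts' ratings. The function name and the example ratings below are hypothetical, not from the source:

```python
def aiken_v(ratings, lo=1, c=5):
    """Aiken's V content-validity index for one item.

    ratings: scores the expert raters gave the item (here a 1-5 Likert scale)
    lo:      lowest possible score in the scoring category
    c:       number of categories/criteria
    """
    n = len(ratings)                        # n = the number of raters
    s_total = sum(r - lo for r in ratings)  # s = r - lo, summed over raters
    return s_total / (n * (c - 1))          # V = sum(s) / [n(c - 1)]

# Three experts rate an item 4, 5, and 4 on a 1-5 scale:
print(round(aiken_v([4, 5, 4]), 2))  # 0.83 -> strongly valid (V > 0.8)
```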
Criterion-Related Validity

Criterion-related validity is the degree to which scores on two different measures are correlated (Lodico et al., 2010).

• Predictive validity uses the score on a test to predict performance in a future situation.
• Concurrent validity examines the degree to which one test correlates with another taken in the same time frame.
How to measure Criterion-Related Validity?

Compute the correlation coefficient manually, or using MS Excel or SPSS. If the coefficient is high (near 1.0), the test has good concurrent validity (Gay et al., 2012).
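As an illustration, the coefficient can be computed by hand as a Pearson product-moment correlation. A minimal sketch; the two score lists below are made up for demonstration:

```python
from math import sqrt

def pearson(x, y):
    """Pearson product-moment correlation between two lists of scores."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical scores on a new test and an established criterion test
# taken in the same time frame (concurrent validity):
new_test  = [55, 62, 70, 78, 85]
criterion = [58, 60, 72, 80, 88]
print(round(pearson(new_test, criterion), 2))  # near 1.0 -> good concurrent validity
```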
Construct Validity

Construct validity is a search for evidence that an instrument is accurately measuring an abstract trait or ability. Three sources of construct validity: homogeneity, convergence, and theory.

• Analyze using Exploratory Factor Analysis (EFA): SPSS, SAS, MINITAB, R, MPLUS, etc.
• Analyze using Confirmatory Factor Analysis (CFA): Lisrel, AMOS, MPLUS, PLS, etc.
Threats to Validity

• Unclear test directions & cheating
• Confusing and ambiguous test items
• Vocabulary too difficult for test takers
• Overly difficult and complex sentence structures
• Inconsistent and subjective scoring methods
• Untaught items included on achievement tests
• Failure to follow standardized test administration procedures


Reliability

Reliability is a synonym for dependability, consistency, and replicability (Cohen et al., 2008).

Three aspects of reliability: stability, equivalence, and internal consistency.
Reliability as Stability

Stability is a measure of consistency over time and over similar samples (Cohen et al., 2008).

The test-retest correlation coefficient is usually estimated using Pearson statistics or a t-test. The instrument is reliable if the test-retest correlation coefficient is 0.80 or higher (Whiston, 2005).
Reliability as Equivalence
(Bowling, 2009, as cited in Oluwatayo, 2012)

• Alternate or parallel form: compare the scores on the two forms using either Pearson statistics or a t-test. The forms produce comparable responses if the correlation coefficient is at least 0.80.
• Inter-rater: the simplest level of calculating inter-rater agreement is using percentage statistics.
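The percentage approach to inter-rater agreement can be sketched as follows; the two raters' scores below are hypothetical:

```python
def percent_agreement(rater_a, rater_b):
    """Inter-rater agreement as the percentage of identical ratings."""
    matches = sum(a == b for a, b in zip(rater_a, rater_b))
    return 100 * matches / len(rater_a)

# Two raters scoring the same ten essays on a 1-5 rubric (made-up data):
rater_a = [3, 4, 2, 5, 4, 3, 3, 4, 5, 2]
rater_b = [3, 4, 3, 5, 4, 3, 2, 4, 5, 2]
print(percent_agreement(rater_a, rater_b))  # 80.0
```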
Reliability as Internal Consistency
(Cronbach, 1951, as cited in Oluwatayo, 2012)

Internal consistency tests for the homogeneity of items in a measuring instrument: the respondents' scores are analyzed using the appropriate statistical tools. In educational research, much attention is placed on internal consistency using split-half, item-total correlations, Kuder-Richardson, and Cronbach's alpha.
Split-Half
(Bryman & Cramer, 1990, as cited in Oluwatayo, 2012)

The items in an instrument can be split into two matched halves in terms of content and cumulative degree of difficulty. Correlate the marks on the odd items with those on the even items using Pearson's statistics, then correct for the whole set of items using the Spearman-Brown formula. The instrument is reliable if ri is 0.80 and above.
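A minimal sketch of the odd-even split with the Spearman-Brown correction, assuming one row of item scores per respondent (the data and function names are illustrative, not from the source):

```python
from math import sqrt

def pearson(x, y):
    """Pearson product-moment correlation between two lists of scores."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def split_half_reliability(item_scores):
    """Split-half reliability: correlate odd vs. even half-test scores,
    then correct to full test length with the Spearman-Brown formula."""
    odd  = [sum(row[0::2]) for row in item_scores]  # items 1, 3, 5, ...
    even = [sum(row[1::2]) for row in item_scores]  # items 2, 4, 6, ...
    r_half = pearson(odd, even)
    return 2 * r_half / (1 + r_half)                # Spearman-Brown correction

# Hypothetical 0/1 scores: 3 respondents x 4 items
data = [
    [1, 1, 1, 1],
    [0, 0, 0, 0],
    [1, 1, 0, 0],
]
print(round(split_half_reliability(data), 3))  # 1.0 -> the halves agree perfectly
```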
Kuder-Richardson 20 & 21 (KR-20 & KR-21)

The Kuder-Richardson formulas determine the homogeneity of items.

• KR-20 is for dichotomous items (scored 0 or 1):
  KR-20 = [k / (k − 1)] × [1 − Σpq / σ²]
  where k is the number of items, p the proportion answering an item correctly, q = 1 − p, and σ² the variance of the total scores.
• KR-21 is a simplified version of KR-20 that assumes all items are of equal difficulty.
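The KR-20 formula can be applied directly to a matrix of 0/1 item scores. A minimal sketch with made-up data:

```python
def kr20(item_scores):
    """KR-20 reliability for dichotomous (0/1) items.

    item_scores: one row per respondent, one column per item.
    KR-20 = [k / (k - 1)] * [1 - sum(p*q) / variance of total scores]
    """
    k = len(item_scores[0])                          # number of items
    n = len(item_scores)                             # number of respondents
    p = [sum(row[i] for row in item_scores) / n for i in range(k)]
    sum_pq = sum(pi * (1 - pi) for pi in p)          # item variances p * q
    totals = [sum(row) for row in item_scores]
    mean = sum(totals) / n
    var = sum((t - mean) ** 2 for t in totals) / n   # population variance
    return (k / (k - 1)) * (1 - sum_pq / var)

# Hypothetical right/wrong answers: 4 respondents x 3 items
data = [[1, 1, 1], [1, 1, 0], [0, 0, 0], [1, 0, 0]]
print(kr20(data))  # 0.75
```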
Cronbach's Alpha or Coefficient Alpha
(Nunnally, 1994)

Cronbach's alpha is for items on a continuum: dichotomous (0–1), Likert-scale (1–5), or essay items.

α = [k / (k − 1)] × [1 − Σσᵢ² / σₜ²]

The instrument is reliable if α is 0.70 and above (or at least 0.67 by a more lenient standard).
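A sketch of the coefficient-alpha computation on a small, hypothetical set of Likert responses (data and names are illustrative):

```python
def cronbach_alpha(item_scores):
    """Cronbach's alpha for items scored on a continuum (e.g. Likert scales).

    item_scores: one row per respondent, one column per item.
    alpha = [k / (k - 1)] * [1 - sum(item variances) / variance of totals]
    """
    k = len(item_scores[0])                 # number of items

    def variance(values):                   # population variance
        m = sum(values) / len(values)
        return sum((v - m) ** 2 for v in values) / len(values)

    item_vars = sum(variance([row[i] for row in item_scores]) for i in range(k))
    total_var = variance([sum(row) for row in item_scores])
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Hypothetical Likert responses: 3 respondents x 2 items
data = [[4, 5], [2, 3], [3, 4]]
print(round(cronbach_alpha(data), 3))  # 1.0 -> the two items vary together perfectly
```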
Factors Affecting Reliability

Direct factors:
• The length of the test and the quality of instrument items
• Conditions when collecting data or administering the instrument
• Objectivity in scoring

Indirect factors:
• The time between the first and second data collection
• Distribution of respondents' scores
• Difficulty level of the instrument
• The length of the instrument
References

Borg, W., & Gall, M. (2003). Educational research. New York: Longman Inc.

Cohen, L., Manion, L., & Morrison, K. (2008). Research methods in education (6th ed.). London & New York: Routledge Taylor & Francis Group.

Creswell, J. W. (2012). Educational research: Planning, conducting, and evaluating quantitative and qualitative research (4th ed.). United States of America: Pearson Education.

Lodico, M. G., Spaulding, D. T., & Voegtle, K. H. (2010). Methods in educational research: From theory to practice (2nd ed.). San Francisco: Jossey-Bass.

Oluwatayo, J. (2012). Validity and reliability issues in educational research. Journal of Educational and Social Research, 2(2), 391–400. doi:10.5901/jesr.2012.v2n2.391

Sugiyono. (2017). Metode penelitian kuantitatif, kualitatif, dan R&D [Quantitative, qualitative, and R&D research methods]. Bandung: Alfabeta.

Wilkinson, D., & Birmingham, P. (2003). Using research instruments: A guide for researchers. London: RoutledgeFalmer.
THANKS!
