
10 Rules For Writing Multiple Choice Questions
by Connie Malamed

This is a back-to-basics article about the undervalued and little-discussed multiple choice question. It's not as exciting as discussing 3D virtual learning
environments, but it might be just as important. If you need to use tests, then you want
to reduce the errors that occur from poorly written items.
The rules covered here make tests more accurate, so the questions are interpreted as
intended and the answer options are clear and without hints. Just in case you're not
familiar with multiple choice terminology, it's explained in the visual below.

Here are the ten rules. If you have any others, please add them through the Comments
form below.

Rule #1: Test comprehension and critical thinking, not just recall
Multiple choice questions are criticized for testing the superficial recall of knowledge.
You can go beyond this by asking learners to interpret facts, evaluate situations, explain
cause and effect, make inferences, and predict results.

Rule #2: Use simple sentence structure and precise wording
Write test questions in a simple structure that is easy to understand. And try to be as
accurate as possible in your word choices. Words can have many meanings depending
on colloquial usage and context.

Rule #3: Place most of the words in the question stem


If you're using a question stem rather than an entire question, ensure that most of the
words are in the stem. This way, the answer options can be short, making them less
confusing and more legible.

Rule #4: Make all distractors plausible


All of the wrong answer choices should be completely reasonable. This can be very
hard to accomplish, but avoid throwing in give-away distractors, as they detract from
the test's validity. If you're really stuck, get help from your friendly SME. (BTW, this word
can also be spelled as distracter.)

Rule #5: Keep all answer choices the same length


This can be difficult to achieve, but expert test-takers can use answer length as a hint to
the correct answer. Often the longest answer is the correct one. When I can't get all four
answers to the same length, I use two short and two long.
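If your items are stored electronically, this kind of imbalance can be flagged automatically. Here is a minimal Python sketch; the function name and the 1.5x threshold are my own assumptions, not from the article:

```python
def length_outliers(options, ratio=1.5):
    """Return indices of options much longer than the average of the rest.

    Assumption: character count is a reasonable proxy for perceived length.
    """
    flagged = []
    for i, opt in enumerate(options):
        others = [len(o) for j, o in enumerate(options) if j != i]
        if len(opt) > ratio * (sum(others) / len(others)):
            flagged.append(i)
    return flagged

# The long third option would stand out to a test-wise learner.
print(length_outliers([
    "short",
    "short too",
    "a very much longer correct answer choice",
    "brief",
]))  # → [2]
```

Running a check like this over a question bank is a quick way to find items where the correct answer's length gives it away.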

Rule #6: Avoid double negatives


No big news here, right? Don't use combinations of these words in the same question:
not, no, nor, the un- prefix, etc. For example, this type of question could confuse
test-takers: "Which of the following comments would NOT be unwelcome in a work
situation?" Flip it around and write it in the positive form: "Which of the following
comments are acceptable in a work situation?"

Rule #7: Mix up the order of the correct answers


Make sure that most of your correct answers aren't in the "b" and "c" positions, which
can often happen. Keep correct answers in random positions and don't let them fall into
a pattern that can be detected. When your test is written, go through and reorder where
the correct answers are placed, if necessary.
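If your questions live in a structured format, this reordering pass is easy to automate. A minimal Python sketch; the function name and data layout are my own, not from the article:

```python
import random

def shuffle_options(options, correct_index, rng=random):
    """Shuffle answer options; return (shuffled_options, new_correct_index)."""
    order = list(range(len(options)))
    rng.shuffle(order)                      # random permutation of positions
    shuffled = [options[i] for i in order]
    return shuffled, order.index(correct_index)

# Hypothetical item: the correct answer starts at position 0.
options = ["Sacramento", "Los Angeles", "San Diego", "San Francisco"]
shuffled, answer_at = shuffle_options(options, 0)
assert shuffled[answer_at] == "Sacramento"  # answer is tracked through the shuffle
```

Because the new index of the correct answer is returned alongside the shuffled list, the answer key stays accurate no matter where the option lands.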

Rule #8: Keep the number of options consistent


Did you ever have to convince a SME that he or she can't have answer choices that go
to "h" in one question and "c" in the next? It's something of a user interface issue. Making
the number of options consistent from question to question helps learners know what to
expect. Research doesn't seem to agree on whether 3, 4, or 5 options is best.
Personally, I like to use 4 options. It feels fair.

Rule #9: Avoid tricking test-takers


As faulty as they are, tests exist to measure knowledge. Never use questions or answer
options that could trick a learner. If a question or its options can be interpreted in two
ways or if the difference between options is too subtle, then find a way to rewrite it.

Rule #10: Use "All of the Above" and "None of the Above" with caution
I hate this rule because when you run out of distractors, "All of the Above" and "None of
the Above" can come in handy. But they may not promote good instruction. Here's
why. "All of the Above" can be an obvious give-away answer when it's not used
consistently. Also, the "All of the Above" option can encourage guessing if the learner
thinks one or two answers are correct. In addition, the downside to "None of the Above" is
that you can't tell if the learner really knew the correct answer.
++++++++++++++++++++++++

Tips For Writing Matching Format Test Items
by Connie Malamed

When you write test items in a matching format, do you stress about
which terms should go on the left and which on the right? Are you puzzled about when
to use the matching format and whether multiple choice would be better?
Here are some answers to these perplexing issues.

The Matching Format


The matching test item format provides a way for learners to connect a word, sentence
or phrase in one column to a corresponding word, sentence or phrase in a second
column. The items in the first column are called premises and the answers in the
second column are the responses. The convention is for learners to match the premise
on the left with a given response on the right. By convention, the items in Column A are
numbered and the items in Column B are labeled with capital letters.

Many authoring tools come with a pre-built matching test item template, which may
involve dragging responses to the premise or typing the letters from Column B into
Column A. The authoring tool templates may vary from the conventions of the written
format.

When to Use Matching

The matching test item format provides a change of pace, particularly for self-check and
review activities. Many instructional designers employ them in quizzes and tests too.
They are effective when you need to measure the learner's ability to identify the
relationship or association between similar items. They work best when the course
content has many parallel concepts, for example:

Terms and Definitions

Objects or Pictures and Labels

Symbols and Proper Names

Causes and Effects

Scenarios and Responses

Principles and Scenarios to which they apply

Construction Guidelines
If you decide to use a matching format, take the time to construct items that are valid
and reliable. Here are some guidelines for this.
1. Two-part directions. Your clear directions at the start of each question need two
parts: 1) how to make the match and 2) the basis for matching the response with the
premise. You can also include whether items can be re-used, but often pre-built
templates don't allow for this.
Example for exercise above: Drag each career name in Column B to the best definition
in Column A. No items may be used more than once.
2. Parallel content. Within one matching test item, use a common approach, such as
all terms and definitions or all principles and the scenarios to which they apply.

3. Plausible answers. All responses in Column B should be plausible answers to the
premises in Column A. Otherwise, the test loses some of its reliability because some
answers will be give-aways.
4. Clueless. Ensure your premises don't include hints through grammar (like implying
the answer must be plural) or hints from word choice (like using the term itself in a
definition).
5. Unequal responses. In an ideal world, you should present more responses than
premises, so the remaining responses don't work as hints to the correct answer. This is
not often possible when using a template.
6. Limited premises. Due to the capacity limitations of working memory, avoid a long
list of premises in the first column. A number that I've come across is to keep the list
down to six items. Even fewer might be better, depending on the characteristics of your
audience.
7. One correct answer. Every premise should have only one correct response.
Obvious, but triple-check to make sure each response can only work for one premise.

Pros and Cons


The matching test item format allows you to cover more content in one question than
you can with multiple choice. That's why I think they are excellent for intermittent
knowledge checks. They are also a very efficient approach to testing and can provide
an excellent objective measurement. In addition, they provide a way to add some
variety to your activities.
A disadvantage is the tendency to use this format for the simple recall of information.
Adult learners often require practice and testing of higher-order thinking skills, such as
problem solving. Don't limit your use of this format to recall of knowledge alone. Rather,
try to find ways to use matching for application and analysis too, such as presenting a
short scenario and asking for the best solution.

When using matching test items in an assessment, you'll need to identify the specifics
of how they will be scored. Some prefer to give partial credit when some, but not all,
of the responses are correct. Often, the authoring tool determines the approach, but if
you do have control, it's an issue you'll need to explore.
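If you do control the scoring logic, partial credit for a matching item is straightforward to compute. Here is a minimal Python sketch; the function name and the dictionary layout (premise numbers mapped to response letters, following the numbering/lettering convention described above) are my own assumptions:

```python
def score_matching(answer_key, responses, partial_credit=True):
    """Score a matching item; returns a value between 0.0 and 1.0.

    answer_key and responses map premise numbers to response letters,
    e.g. {1: "C", 2: "A", 3: "B"}.
    """
    correct = sum(1 for premise, letter in answer_key.items()
                  if responses.get(premise) == letter)
    if partial_credit:
        return correct / len(answer_key)   # fraction of premises matched correctly
    return 1.0 if correct == len(answer_key) else 0.0  # all-or-nothing

# Two of three matches correct: 2/3 with partial credit, 0.0 without.
key = {1: "C", 2: "A", 3: "B"}
print(score_matching(key, {1: "C", 2: "A", 3: "A"}))
print(score_matching(key, {1: "C", 2: "A", 3: "A"}, partial_credit=False))
```

The `partial_credit` flag makes the scoring policy an explicit, testable decision rather than whatever the tool happens to default to.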
+++++++++++++++++

Good, Better, Best: Multiple Choice Exam Construction
Do you remember the following children's ditty? "Good, better, best. Never let it rest. 'Til your good is better and your
better is best." This ditty reminds us as teachers who write multiple-choice exams that there is always room for
improvement. Writing better exams enables better assessment of our students' learning.
Before we consider examples of mistakes to avoid, the following example will help us to establish a common
language about the parts of a multiple-choice exam item.

Example and Parts of a Multiple-Choice Item

Stem:
3. What is chiefly responsible for the increase in the average length of life in the USA during the last fifty years?

Alternatives:
a. Compulsory health and physical education courses in public schools. (distractor)
*b. The reduced death rate among infants and young children. (answer)
c. The safety movement, which has greatly reduced the number of deaths from accidents. (distractor)
d. The substitution of machines for human labor. (distractor)

To make your good exams better, and to make your better exams the best, try to avoid these exam writing mistakes.

1. Poorly Written Stems
A stem is the section of a multiple-choice item that poses the problem that the students must answer. Stems can be
in the form of a question or an incomplete sentence. Poorly written stems fail to state the problem clearly when they
are vague, full of irrelevant data, or negatively written.

a. Avoid vague stems by stating the problem in the stem:


Poor Example
California:
a. Contains the tallest mountain in the United States.
b. Has an eagle on its state flag.
c. Is the second largest state in terms of area.
*d. Was the location of the Gold Rush of 1849.
Good Example
What is the main reason so many people moved to California in 1849?
a. California land was fertile, plentiful, and inexpensive.
*b. Gold was discovered in central California.
c. The east was preparing for a civil war.
d. They wanted to establish religious settlements.

b. Avoid wordy stems by removing irrelevant data:


Poor Example
Suppose you are a mathematics professor who wants to determine whether or not your teaching of a unit on
probability has had a significant effect on your students. You decide to analyze their scores from a test they took
before the instruction and their scores from another exam taken after the instruction. Which of the following t-tests is
appropriate to use in this situation?
*a. Dependent samples.
b. Heterogenous samples.
c. Homogenous samples.
d. Independent samples.
Good Example

When analyzing your students pretest and posttest scores to determine if your teaching has had a significant effect,
an appropriate statistic to use is the t-test for:
*a. Dependent samples.
b. Heterogenous samples.
c. Homogenous samples.
d. Independent samples.

c. Avoid negatively worded stems by stating the stem in a positive form:


Poor Example
A nurse is assessing a client who has pneumonia. Which of these assessment findings indicates that the client does
NOT need to be suctioned?
a. Diminished breath sounds.
*b. Absence of adventitious breath sounds.
c. Inability to cough up sputum.
d. Wheezing following bronchodilator therapy.
Good Example
Which of these assessment findings, if identified in a client who has pneumonia, indicates that the client needs
to be suctioned?
a. Absence of adventitious breath sounds.
b. Respiratory rate of 18 breaths per minute.
*c. Inability to cough up sputum.
d. Wheezing prior to bronchodilator therapy.
Note: Test-writing experts believe that negatively worded stems confuse students.

2. Poorly Written Alternatives


The alternatives in a multiple-choice item consist of the answer and the distractors, which are inferior or incorrect. Faculty
often find coming up with enough distractors to be the toughest part of exam writing. Common mistakes in writing
exam alternatives have to do with how the various alternatives relate. They should be mutually exclusive,
homogeneous, plausible, and consistently phrased.

a. Avoid Overlapping Alternatives


Poor Example

What is the average effective radiation dose from chest CT?


a. 1-8 mSv
b. 8-16 mSv
c. 16-24 mSv
d. 24-32 mSv
Good Example
What is the average effective radiation dose from chest CT?
a. 1-7 mSv
b. 8-15 mSv
c. 16-23 mSv
d. 24-32 mSv
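When alternatives are numeric ranges, overlap can be checked mechanically. A minimal Python sketch; the function name and the (low, high) tuple representation are my own assumptions:

```python
def range_overlaps(ranges):
    """Return index pairs of adjacent (when sorted) ranges that share a value.

    ranges: one (low, high) tuple per alternative, endpoints inclusive.
    """
    overlaps = []
    ordered = sorted(range(len(ranges)), key=lambda i: ranges[i][0])
    for a, b in zip(ordered, ordered[1:]):
        if ranges[b][0] <= ranges[a][1]:   # next range starts at or before the previous end
            overlaps.append((a, b))
    return overlaps

# The poor example's alternatives share every boundary value.
print(range_overlaps([(1, 8), (8, 16), (16, 24), (24, 32)]))  # → [(0, 1), (1, 2), (2, 3)]
```

An empty result means no two alternatives can both contain the correct value, so exactly one alternative can be right.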

b. Avoid Dissimilar Alternatives


Poor Example
Idaho is widely known as:
*a. The largest producer of potatoes in the United States.
b. The location of the tallest mountain in the United States.
c. The state with a beaver on its flag.
d. The Treasure State.
Good Example
Idaho is widely known for its:
a. Apples.
b. Corn.
*c. Potatoes.
d. Wheat.
Note: The good example tests students' knowledge of Idaho's agriculture. The poor example is confusing because
students are unsure if they are answering a question about Idaho's agriculture, geography, flag, or nickname.

c. Avoid implausible alternatives:


Poor Example
Which of the following artists is known for painting the ceiling of the Sistine Chapel?
a. Warhol.
b. Flintstone.

*c. Michelangelo.
d. Santa Claus.
Good Example
Which of the following artists is known for painting the ceiling of the Sistine Chapel?
a. Botticelli.
b. da Vinci.
*c. Michelangelo.
d. Raphael.

d. Avoid inconsistent phrasing of alternatives:


Poor Example
The term operant conditioning refers to the learning situation in which:
a. A familiar response is associated with a new stimulus.
b. Individual associations are linked together in sequence.
*c. A response of the learner is instrumental in leading to a subsequent reinforcing event.
d. Verbal responses are made to verbal stimuli.
Good Example
The term operant conditioning refers to the learning situation in which:
a. A familiar response is associated with a new stimulus.
b. Individual associations are linked together in sequence.
*c. The learners response leads to reinforcement.
d. Verbal responses are made to verbal stimuli.
Note: In the poor example, the answer is longer than the distractors. Some students are keen at spotting
these patterns. Also, the answer in the poor example uses language from the textbook, but the distractors are in the instructor's
own words. The good example makes the phrasing consistent in length and uses the instructor's language.

Sources:
The parts of a multiple-choice item and the examples in 1a, 1b, 2b, 2c, and 2d are from Steven J. Burton et al., "How to
Prepare Better Multiple-Choice Test Items: Guidelines for University Faculty"
(http://testing.byu.edu/info/handbooks/betteritems.pdf).

The examples in 1c are from Mary McDonald, Systematic Assessment of Learning Outcomes: Developing Multiple-Choice
Exams (Jones & Bartlett Publishers, 2002): 94.

The examples in 2a are from Janette Collins, "Writing Multiple-Choice Questions for Continuing Medical Education
Activities and Self-Assessment Modules," RadioGraphics 26 (2006):
549. http://radiographics.rsna.org/content/26/2/543.full.pdf+html

++++++++++++++++++++++++
