
RESEARCH AND TEACHING

Exploring the Use of an Online Quiz Game to Provide Formative Feedback in a Large-Enrollment, Introductory Biochemistry Course
By Rachel Milner, Jonathan Parrish, Adrienne Wright, Judy Gnarpe, and Louanne Keenan

In our large-enrollment, introductory biochemistry course for nonmajors, we provide students with formative feedback through practice questions in PDF format. Recently, we investigated possible benefits of providing the practice questions via an online game (Brainspan). Participants were randomly assigned to either the online game group or the PDF-only group. Both groups received identical practice questions and correct/incorrect feedback only, with no additional detail. We believed that the game might increase engagement with the material and improve performance on the examination. However, we found that the exam scores of the two groups did not differ. The attitude of participants toward the practice questions was similar for both formats, with agreement that the questions were appropriate for course objectives and helped in preparation for examinations. However, the game users were less likely to agree that the questions provided useful feedback on learning, suggesting that they had greater expectations for the practice questions delivered as a game. Game users were also more than twice as likely to state that they wanted the other or both formats.

Learning outcomes in higher education may be improved through formative feedback, at least partly because learning is enhanced through experience, reflection, and practice (Ericsson & Kintsch, 1995; Kolb & Kolb, 2009; Thatcher, 1990). That is, learning outcomes may be influenced by activities provided, such as a quiz (Biggs, 2012). It is argued that answering study questions creates more robust memory traces (Glass, Brill, & Ingate, 2008) and that repeated retrieval during learning is key to long-term retention (Karpicke & Roediger, 2007). Many studies on the use of quizzes have reported improved performance on exams (Gosper, 2010; Nicol & Macfarlane-Dick, 2006; Shute, Hansen, & Almond, 2008). Given this, it is increasingly common that instructors in higher education use quizzes as tools to support learning. These quizzes take a variety of forms, including paper-based and online homework quizzes or in-class Jeopardy-style games (Benek-Rivera & Mathews, 2004). Notably, much of the use of quizzes in formative assessment is electronic (Peat & Franklin, 2002) because of the large class sizes, which are an increasingly common feature in higher education.

Although quizzes can be useful learning tools, it is important to consider the variety of factors that influence whether they are effective (Akl et al., 2010; Biggs, 2012; B. R. Crisp, 2007; G. T. Crisp, 2012; Shute et al., 2008). These include timing of the feedback relative to the delivery of the content; the number of attempts to answer that a student is permitted; and the amount of detail provided in the feedback (Peat, Franklin, Devlin, & Charles, 2005). Gosper (2010) suggested that the nature of feedback depends on the level of knowledge required; immediate feedback is most important for memorization and recall of factual knowledge, and for this, feedback should be simply correct/incorrect. In contrast, for higher order cognitive tasks, delaying feedback might be more important, and when given, informative and elaborate feedback is more effective (Jaehnig & Miller, 2007), especially feedback including an explanation of how one would determine the correct answer (Narciss & Huth, 2006). Other factors that might influence the effectiveness of feedback include the length of time that a quiz is available, the type of quiz, and student preferences and motivation (Chickering, 2006; Dermo, 2009; Glass et al., 2008; Maxwell, 2010). Most critically, feedback must be engaged with by students to be effective, and so we must always consider factors that affect student engagement, including outcomes and rewards (Anderson et al., 2001; G. T. Crisp, 2012).

Students are generally motivated to use online quizzes, perceiving them as effective because they provide autonomy and fast feedback (Dermo, 2009; Glass et al., 2008; Maxwell, 2010). Motivation and engagement may also be improved through educational games (Horizon Report, 2006) because these “can be fun” (Hudson & Bristow, 2006). Despite this, the available evidence to date “neither confirms nor refutes the utility of educational games as an effective teaching strategy for medical students” (Akl et al., 2010), and there is room for further investigation of their impact. Therefore, in this study we explored the effectiveness of an online quiz game as an alternative means for providing students with practice questions. Our context was a large-enrollment, introductory biochemistry class with a pedagogical focus on recall of basic principles and terminology. Approximately half of the study participants had access to practice questions via a game, Brainspan, and the remaining participants were provided with identical practice questions via PDF, as usual. We hypothesized that providing the practice questions as a quiz game might increase engagement with the material, compared with PDF only, and thereby improve performance on the examination. Because engagement is influenced by individual differences (Anderson et al., 2001; Biggs, 2012; Chickering, 2006; Krathwohl & Anderson, 2010), we also explored student experiences with the game. Specifically, we asked two major research questions:

1. Is student performance in examinations improved by access to an online quiz game?
2. What are student attitudes toward the practice questions delivered in an online game?

Methods

Context (Introductory Biochemistry, BIOCH200)
The context was an introductory biochemistry course (BIOCH200) offered at a large, research-intensive university in Canada. The course is a large-class lecture format and is offered in each term in multiple sections with different instructors. It is team taught by experienced instructors who follow a common syllabus with shared materials to achieve consistent learning outcomes regardless of section. The course is focused on the “vocabulary” of the subject. Students are introduced to terms and basic concepts that they need to understand and memorize prior to enrolling in more advanced courses. Learning is evaluated by two multiple-choice examinations, and all students in a given term take identical examinations concurrently. Questions are selected from a large database of pretested questions that were created by instructors for BIOCH200. The question database is not available to students, but they are provided with practice questions from it as PDF documents. Only correct/incorrect feedback is given, with no elaboration (Figure 1).

FIGURE 1
One page of a feedback file, with the correct responses underlined. The question files were identical to the feedback files except for the underlining of the correct answer.

Biological Membranes
1. Which of the following statements about passive and primary active transporter proteins is FALSE?
   A. They are both integral membrane proteins.
   B. They both show a high degree of selectivity.
   C. Both require a concentration gradient to function.
   D. They both change conformation during transport.
2. Which one of the following statements is TRUE for passive transport across a biological membrane?
   A. Passive transport is driven by a solute concentration gradient.
   B. Passive transport is driven by ATP.
   C. Passive transport is irreversible.
   D. Passive transport is endergonic (requires an input of energy to occur).
   E. Passive transport is not specific with respect to the substrate.
3. Which of the following determines the force that “drives” an ion through an ion channel in a membrane?
   A. The size and shape of the channel.
   B. The size of the ion.
   C. The properties of the selectivity filter.
   D. The size of the concentration gradient across the membrane.
4. Which of the following statements about biological membranes is TRUE?
   A. The composition of membrane lipid bilayers may be varied slightly, to maintain it in the gel-crystalline state.
   B. The bilayer is stabilized by hydrophobic interactions between the polar lipid head groups and the aqueous environment.
   C. Integral membrane proteins penetrate or span the lipid bilayer, interacting with the hydrophobic lipid acyl chains.
   D. Peripheral membrane proteins are covalently bound with the polar lipid head groups of the bilayer.

Study design
The study was conducted in the five sections of BIOCH200 that were offered in the fall, winter, and spring terms of a single academic year. In all five sections, students were randomly assigned to one of two groups: One group received practice questions as normal, via PDF, and the other received password-protected access to the quiz game, Brainspan. During the study, many students assigned to the game informed us that they had obtained copies of the PDFs from their classmates. Given this, the comparison groups discussed here are students who had access to the game (game users) and those who did not (PDF only).

Each PDF was generated from a Word document (Figure 1) and e-mailed to the PDF-only group at the same time as game users were given access to a game. A separate “answer” PDF in which correct responses were underlined was emailed simultaneously. Students were encouraged to use the question file to self-test prior to viewing the answer in the other file, but they had access to both files at the same time and could view or print them without restriction.

Brainspan is an asynchronous multiplayer online quiz generator (Gnarpe, 2009). Instructors create multiple-choice questions and load these into the “games” they are creating. The appearance of the questions online is variable, with several distinct backgrounds available for selection, including a game show background. The background used in BIOCH200 was simple (Figure 2A). Students log on to the site and answer questions sequentially. For each question, there is immediate correct/incorrect feedback (Figures 2B and 2C). Students can check their progress in the game by generating a report that shows them the questions and categories they made errors on. The game saves scores so that the student can exit and reenter without losing points. While students are answering questions, they can challenge other students to answer questions that they have answered correctly. Challenges are associated with betting points, adding an element of competition. A challenge may be ignored. Questions in Brainspan can be linked to additional information files, and instructors can add feedback via a [Learn More] button, which becomes available after the student has answered a question (Figures 2B and 2C). Students can also send a message to their instructors, and instructors can monitor progress. However, in this study, no additional information or feedback was provided because we wanted to compare the use of the online game as directly as possible with the PDF delivery of simple correct/incorrect feedback.

This study used a mixed methodology to answer the two research questions. To determine whether student performance was improved by access to the game, we calculated and compared average examination scores for the game-users and PDF-only groups. To explore student attitudes, we used a survey. Toward the end of term, the survey was distributed at the beginning of a class period. Surveys were color coded and included a check box so that the respondents could confirm which group they were assigned to. The survey included a series of specific questions (Table 1) to which participants responded using a Likert scale (1 = strongly agree, 2 = agree, 3 = neutral, 4 = disagree, and 5 = strongly disagree), and an open-ended question: “Please add any comments or suggestions about the use of the practice questions/learning game in this course in the space below.”

Participants
Participants were the students in any of the five sections who voluntarily completed and returned a survey (N = 549). Of these, 252 reported being in the game-users group and 297 reported being in the PDF-only group. A total of 400 participants included a written response to the open-ended question.

Students from diverse university programs enroll in BIOCH200, including general science, kinesiology, medical laboratory sciences, nutrition, dental hygiene, physiology, chemistry, cell biology, and education. However, students enrolled in each section comprise equivalent random samples from the same population of students. Each section of the course is approximately the same size, with approximately 950 students enrolled in the five sections included in this study. This research complies with the University Standards for the Protection of Human Research Participants.

FIGURE 2
The appearance of the question and feedback screens in the quiz game. (A) The question was presented at the top of the screen with a series of responses listed below. Students selected a response by clicking on the [A], [B], [C], or [D] button. (B) The screen as it appeared if the response selected was correct. (C) The screen as it appeared if the response selected was incorrect.


TABLE 1
Student perceptions of the value of questions in game and PDF format.

1. The questions used were appropriate for accomplishing the objectives of the course.
   Game users: mean = 1.53, SD = 0.694, n = 252; PDF users: mean = 1.52, SD = 0.678, n = 297.
2. The questions helped me to build a solid understanding of core concepts.
   Game users: mean = 1.76, SD = 0.815, n = 251; PDF users: mean = 1.65, SD = 0.757, n = 297.
3.* The game or questions provided useful feedback to me on my learning. (r = .371)
   Game users: mean = 2.28, SD = 1.193, n = 250; PDF users: mean = 1.75, SD = 0.940, n = 294.
4.* The game or questions took too much time away from my studying. (r = .406)
   Game users: mean = 3.69, SD = 1.135, n = 241; PDF users: mean = 4.09, SD = 0.893, n = 294.
5.* The questions helped me to quickly identify concepts that I didn’t understand. (r = .429)
   Game users: mean = 2.16, SD = 0.948, n = 249; PDF users: mean = 1.90, SD = 0.833, n = 295.
6. Use of the game or questions resulted in less course stress.
   Game users: mean = 2.41, SD = 1.080, n = 248; PDF users: mean = 2.27, SD = 1.049, n = 294.
7. The game or questions helped me to “think like a biochemist.”
   Game users: mean = 2.83, SD = 1.006, n = 245; PDF users: mean = 2.76, SD = 0.993, n = 286.
8. The questions were designed well for learning this subject.
   Game users: mean = 1.83, SD = 0.764, n = 249; PDF users: mean = 1.76, SD = 0.774, n = 295.
9. I enjoyed the course more because of the use of the learning game or questions.
   Game users: mean = 2.53, SD = 1.107, n = 247; PDF users: mean = 2.38, SD = 0.987, n = 291.
10. The game or questions allowed me to engage directly with the content being presented.
   Game users: mean = 1.93, SD = 0.917, n = 249; PDF users: mean = 1.88, SD = 0.775, n = 285.
11.* I felt better prepared for exams due to usage of the game/practice questions. (r = .406)
   Game users: mean = 1.75, SD = 0.928, n = 248; PDF users: mean = 1.45, SD = 0.722, n = 286.

Note: Responses were collected using a Likert scale: 1 = strongly agree; 2 = agree; 3 = neutral; 4 = disagree; 5 = strongly disagree. Mean item scores for each question were calculated for the game-users and PDF-only groups. A mean item score of ≤2.0 is defined as the agree range, a score of >2.0 to <4.0 is defined as the neutral range, and a score of ≥4.0 is defined as the disagree range.

*For Items 3, 4, 5, and 11 (marked with an asterisk), a Mann–Whitney test indicated a significant difference in response by the two groups. Item 3: U = 27331.0, z = –5.469, p < .001. Item 4: U = 28786.0, z = –3.971, p < .001. Item 5: U = 31502.5, z = –3.074, p < .01. Item 11: U = 28821.5, z = –4.229, p < .001. The effect size (r) was calculated as recommended by Grissom and Kim (2012) and is equal to the Mann–Whitney U statistic divided by the product of the two sample sizes.
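The effect size described in the note is simply the U statistic scaled by the product of the two group sizes. As a minimal sketch of the kind of calculation described here and in the Results (the authors used SPSS, not Python), the following runs a Mann–Whitney test on two sets of Likert responses and computes that effect size; the response arrays are hypothetical placeholders, not the study data.

```python
# Minimal sketch of the analysis described in the Table 1 note: a Mann-Whitney
# test comparing Likert responses from two groups, plus the effect size defined
# as U divided by the product of the two sample sizes. The data are hypothetical
# placeholders, not the study data (the authors used SPSS).
import numpy as np
from scipy.stats import mannwhitneyu

game_users = np.array([1, 2, 2, 3, 4, 2, 5, 3, 2, 1])  # hypothetical Item 3 responses
pdf_users = np.array([1, 1, 2, 2, 1, 3, 2, 1, 2, 2])   # hypothetical Item 3 responses

u_stat, p_value = mannwhitneyu(game_users, pdf_users, alternative="two-sided")
effect_size = u_stat / (len(game_users) * len(pdf_users))  # U / (n1 * n2)

print(f"U = {u_stat:.1f}, p = {p_value:.3f}, effect size = {effect_size:.3f}")
```

The nonparametric test is used rather than a t-test because, as noted in the Results, the Likert responses are ordinal and nonnormally distributed.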

Data analysis
Examination scores and responses to the survey items were analyzed using SPSS. Mean scores for each question were compared for PDF-only and game-users groups. Mean scores on the final examination were also compared for PDF-only and game-user groups in each class.

The open-ended responses were analyzed by categorization and coding. Many of the written responses contained multiple codes, so the sum of the responses by category is greater than the number of participants (400 written responses included 547 distinct comments). The comments were analyzed first by two of the researchers, who categorized each comment as either negative or positive and then identified subcodes. These researchers ensured consistency in coding through regular discussions of their analysis, which led to agreement on the definitions of the codes. Each researcher then completed coding of the responses separately. The coding was cross-checked by a third researcher (Gibbs, 2007), who coded the responses herself, using the codes and definitions agreed on by the others, and then compared her own coding with that achieved by the first two researchers. The third researcher found complete agreement.

Results

Question 1: Is student performance in examinations improved by access to an online quiz game?
Figure 3 shows mean examination scores for PDF-only and game-users groups in each of the five sections of BIOCH200. We found no difference in examination scores for the game users compared with the PDF-only group. The scores were highly consistent from section to section and, notably, did not differ from mean scores achieved in other sections of BIOCH200 over a period of at least 4 years prior to the study (data not shown).

FIGURE 3
Performance on the final examination: Game versus PDF-only groups. Mean final examination scores for game-user and PDF-only groups in five sections of BIOCH200 offered in the academic year 2008–2009. The graph shows mean scores + SD for the game-user and PDF-only groups in each of the five sections. (The –SD error is assumed and not shown.)

Question 2: What are student attitudes toward the practice questions delivered in an online game?
Table 1 compares mean scores for game-users and PDF-only groups for each of the survey items. Because the Likert scale responses had a nonnormal distribution, we used the nonparametric Mann–Whitney test to determine whether there were any differences between the groups.

We found that the game-users and PDF-only groups agreed to the same extent that the questions were well designed and appropriate for learning, and this was unaffected by the format (Items 1, 2, 8, and 10). Also, both groups responded neutrally to Items 6, 7, and 9, suggesting that neither format was perceived as affecting stress levels or enjoyment of the course. Only the scores for Items 3, 4, 5, and 11 appeared to differ significantly for the two groups (p < .01, Table 1), and there was a medium effect size (≥0.3) for all four of these differences (≥0.5 is a large effect size). Effect size was determined as recommended for Mann–Whitney tests by Grissom and Kim (2012; Table 1). Interestingly, game users were less likely to agree that the questions provided useful feedback on learning (Item 3), less likely to agree that the questions helped them to quickly identify concepts that they didn’t understand (Item 5), and less likely to agree that the questions helped them to feel better prepared for exams (Item 11). In contrast, game users were more likely to agree that the practice questions took too much time away from their studying (Item 4).

We explored the difference in response for Item 3 further, by comparing the distribution of the scores for the two groups (Figure 4), and we found that whereas 85.0% of the PDF users agreed (score = 1 or 2) that the questions provided useful feedback on their learning, only 63.6% of game users agreed with this statement. Notably, in both cases feedback was limited to correct/incorrect only: in the game this was displayed on screen immediately after each question (Figure 2), whereas in the PDF it was displayed in a separate file (Figure 1).

FIGURE 4
The distribution of responses for Item 3: The game or questions provided useful feedback to me on my learning. Responses were collected using a Likert scale: 1 = strongly agree; 2 = agree; 3 = neutral; 4 = disagree; 5 = strongly disagree. For each group, the number of responses in each category is presented as a percentage of the total responses received. PDF users: mean = 1.75; SD = 0.940; n = 294. Game users: mean = 2.28; SD = 1.193; n = 250.
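As a companion to the Figure 4 comparison, here is a minimal sketch (again in Python, with hypothetical response arrays rather than the study data) of the percentage-agreement calculation reported above, where a response of 1 or 2 counts as agreement.

```python
# Sketch of the Figure 4 comparison: the percentage of each group whose Likert
# response falls in the "agree" range (1 = strongly agree or 2 = agree).
# The response arrays are hypothetical placeholders, not the study data.
import numpy as np

def percent_agree(responses):
    """Return the percentage of responses equal to 1 or 2 on the 5-point scale."""
    responses = np.asarray(responses)
    return 100.0 * np.mean(responses <= 2)

game_users = [1, 2, 3, 2, 4, 5, 2, 3, 1, 2]  # hypothetical Item 3 responses
pdf_users = [1, 1, 2, 2, 1, 2, 3, 2, 1, 2]   # hypothetical Item 3 responses

print(f"Game users agreeing: {percent_agree(game_users):.1f}%")
print(f"PDF users agreeing: {percent_agree(pdf_users):.1f}%")
```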

Comments and suggestions
Students were also invited to provide a written response concerning their overall impression of the game or PDF. Table 2 summarizes the categorization of these comments.

TABLE 2
Number of comments by category from game-user and PDF-only groups.

Negative comments (game users / PDF users / fraction of total 547 comments):
Feedback explanations: 69 / 20 / 16%
Wanted other format or both: 43 / 21 / 12%
Question content: 18 / 29 / 9%
Technical difficulties: 37 / 2 / 7%
Difficulties in studying: 29 / 2 / 6%
Errors in questions/answers: 5 / 17 / 4%
Boring, uninteresting: 7 / 6 / 3%
Disliked: 3 / 0 / 1%
Total negative: 211 / 97 / 56%

Positive comments (game users / PDF users / fraction of total 547 comments):
Exam preparation: 47 / 110 / 29%
Liked: 20 / 27 / 9%
Format convenience: 2 / 16 / 3%
Engaging and fun: 14 / 0 / 3%
Immediate feedback/instant: 2 / 1 / 1%
Total positive: 85 / 154 / 44%

Note: Written responses to the open-ended question were categorized as shown. Responses from game users (n = 195) and PDF users (n = 205) were categorized separately. In some cases, a single response addressed several issues that were coded under more than one category. Therefore, there are more comments recorded (n = 547) than written responses received (n = 400). The table shows the number of comments in each category received from the game-users and PDF-only groups. The total number of comments in each category (from both groups) is also given as a fraction (%) of the total number of comments received.

Game users submitted more negative comments: Overall, there were somewhat more negative than positive comments (308 negative and 239 positive). However, game users submitted more than twice as many negative comments as positive (211 negative and 85 positive), whereas PDF users submitted more positive comments than negative (154 positive and 97 negative).

Questions in PDF format were more valued in preparation for exams: Overwhelmingly, the majority of positive comments for both the game and the PDF were in the category exam preparation (56% of positive comments about the game and 71% of positive comments about the PDF). Indeed, 29% of all comments returned (157) were categorized as being positive with respect to preparation for the examination. These data suggest that regardless of format, the students perceived practice questions as helpful to them. Interestingly, however, 110 of the positive comments regarding preparation for examinations were from PDF users compared with only 47 from game users. Apparently, the game users did not feel quite as well prepared for examinations as their classmates who were assigned the PDF. This finding was surprising because the game and PDF provided students with access to an identical set of questions and equivalent, limited, correct/incorrect feedback. Some of the comments provided at least a partial explanation for this, suggesting that game users felt at a disadvantage in preparation for the exams because the game format was different from the exam:

   It needs to be more similar to the exams, I do not want them to be exactly the same but the exams have some completely new parts that I did not see in practice. (P3-275)

Questions presented as a game were perceived as engaging and fun: In keeping with other research exploring the use and benefits of learning games, 24% of the positive comments about the game were categorized as indicating that the respondents “liked” the game and 16% were categorized as indicating it was “engaging and fun.” Unsurprisingly, there were no positive comments suggesting that the practice questions in PDF format were “engaging and fun.”

   Learning game was fun. (G3-78)

   Overall it was fun to have the questions in that format . . . (G4-86)

Students expressed frustration with technical issues and format inconvenience: A number of game users provided negative comments, which were related to technical difficulties (37) and difficulties in studying (29). In contrast, there were almost no comments in this category from PDF users.

   I hated how the images had to be open separately and could not see the questions and image/diagram on same page. This made it difficult to engage in questions.


   Website was sometimes down, due to expiration of site certificate? I think students should be provided with both the PDF and game, to study over the questions after playing games. (G3-79)

   It was frustrating that you could not see the question when you clicked on the diagram, pop-up window. You should be able to move the pop-up window. The window with the question should be larger so you do not have to scroll a lot. (G3-37)

   I was not able to view any images on the game on my computer nor was I able to download any software that allowed me to view them. (G4-86)

   Bothersome to go back while studying, cannot print the notes and use them to study material and have to go through all the questions to review them for tests. Maybe a print option would be helpful for the ability to have a hard copy. (G3-78)

Game users were more likely to want the other format or both formats: A notable proportion of the comments (12%) expressed a desire for access to the other format or to both formats. However, twice as many of these comments were provided by game users (43) as by PDF users (21), suggesting that the desire for the “other format” was more pronounced among the game users. Many of the respondents noted that they had asked classmates for a copy of the questions in PDF format:

   I completely admit to asking my friend to send me one PDF, much better, I like having a hard copy. (G4-97)

   I used the PDF questions of my classmates who got them emailed to them and I know I am not the only one who did this all term. (G3-32)

This desire is likely related to the observation that students reported feeling “better prepared” for examinations when using the PDF.

Students wanted feedback explanations to the questions: The largest category of negative comments, 16% of all comments received, consisted of complaints about the lack of feedback explanations.

   Sometimes I did not understand why my answers were wrong so a way to explain how the mechanism works would be extremely helpful. (G1-1)

   There should be an explanation for why the option chosen was wrong and there should be a footnote describing where the topic discussed can be found in the book or in notes, the difficulty should be easier as well. (G1-6)

   It would be really nice if the answers were explained, otherwise the game is no different than a simple PDF of questions, waste of time. (G4-154)

Notably, more than three times as many game users as PDF users submitted comments in this category (69 compared with 20). That is, game users felt more strongly that they should have been provided with feedback explanations.

   The “learn more” link never had anything posted; it should have extra information to help you further understand the questions, regardless if you got it right or wrong. (G3-37)

Discussion

Question 1: Is student performance in examinations improved by access to an online quiz game?
We hypothesized that providing practice questions as an online game might increase motivation and engagement and thus improve performance on exams. However, we found no difference in performance between students who had access to practice questions in game format and those who had access only in PDF format.

Much educational research literature demonstrates the difficulty of measuring “improvements” in learning resulting from specific interventions. Even if we are confident that we measure learning outcomes well, those outcomes are influenced by many factors, including the student’s inherent ability, individual differences (Chickering, 2006), motivation (Biggs, 2012), and learning approach (Entwistle & Peterson, 2004; Entwistle, Tait, & McCune, 2000). By offering practice questions in game format, we intended to increase engagement with the course material. However, students differed notably in their level of engagement with the game; for example, while approximately one third of students in Sections A1, A2, and B2 reported using the challenge feature of the game regularly, another one third reported never using it. That is, students were not equally engaged by the game; many expressed a desire for the other format and indicated that they had acquired the PDF from classmates.

There is substantial evidence that learner differences can affect learning outcomes. In BIOCH200, we have previously found that learning approach (Entwistle & Peterson, 2004) is correlated with performance on the final examination (Milner, 2014). Also, in studying the impact of student personal response systems (clickers) on learning outcomes, we found that only the highest performers did better on the final examination (Addison, Wright, & Milner, 2009). Overall, it is possible that certain learners did benefit from using the game in the current study and that we simply did not measure it. It also remains possible that the game could improve longer term retention through extended practice, but this was not measured here either.

To address some of these issues, and to further explore the impact of Brainspan on student learning in BIOCH200, we are currently analyzing data from a more controlled study in which individual examination performances were measured, rather than class averages, and each student’s learning approach was assessed.

Question 2: What are student attitudes toward the practice questions delivered in an online game?
Student engagement with a quiz determines its effectiveness in learning (Anderson et al., 2001; Chickering, 2006). Because engagement is determined by individual preferences (Biggs, 2012), it is important to take student experiences into account when choosing or designing tools for learning (Gosper, 2010). In this study, we used a survey to test our assumption that students would engage with the material more extensively through the online game, Brainspan.

Overall, the survey responses revealed that students appreciated access to the questions in game format, with a large majority (84%) of game users agreeing that the use of the game should be continued. Game users reported that the questions were engaging and fun, and they valued the learning tool. Despite this, however, game users submitted many more negative comments than PDF users.

In general, game users seemed to feel strongly that detailed feedback explanations for the questions were missing. This was unexpected, because both the PDF and the game provided identical, limited correct/incorrect feedback. We suspect that this sentiment resulted from the [Learn More] button, an additional feature of the game that had been used in other courses. Unfortunately, in the study reported here, we had neither the time nor the resources to populate the field. In general, the open-ended comments revealed that students expected more from the game than from the PDF, and the [Learn More] button was certainly a factor in this.

Both PDF and game users agreed that the practice questions were helpful in preparation for exams. Again, however, the game users were less likely to agree, and the open-ended comments revealed that the game users felt that they had been denied a useful tool, the PDF. Many more game users than PDF users expressed a desire for the other format or both, and this is probably related to frustration with technical issues and format inconvenience, both of which were expressed by game users. It may also be related to the similarity in format between the PDF and the examination. Had the examination been in an online format, the reverse might well have been reported. Of course, the students reported only a perception, and there is no evidence that PDF users were, in fact, better prepared for the examination.

Conclusions and implications for practice
In this study, we observed no benefit of delivering practice questions in an online game compared with the PDF. This does not mean that there were no benefits, however, as impacts of learning tools on learning outcomes are difficult to measure (Biggs, 1996). Overall, Brainspan was received by students as a “fun” addition to BIOCH200, and the participants expressed a clear desire for the use of the game to be continued. We believe that online games like Brainspan have the potential to make worthwhile contributions in similar, large-enrollment, introductory-level science classes, particularly when provided through tried-and-tested quiz systems that require a minimal investment of time, money, and resources.

We were surprised by our students’ perception of disadvantage and format inconvenience with the game. We believe that these perceptions reflect (a) our decision not to populate the [Learn More] field for this study and (b) the students’ desire to have the format of practice questions match the format of the examination. We were reminded of the importance of soliciting feedback from students when implementing a new learning tool, because students’ perceptions of a tool will affect their engagement with it, and no tool can be useful if it is not used. ■


References
Addison, S., Wright, A., & Milner, R. (2009). Using clickers to improve student engagement and performance in an introductory biochemistry class. Biochemistry & Molecular Biology Education, 37(2), 84–91.
Akl, E. A., Pretorius, R. W., Sackett, K., Erdley, W. S., Bhoopathi, P. S., Alfarah, Z., & Schünemann, H. J. (2010). The effect of educational games on medical students’ learning outcomes: A systematic review: BEME guide no. 14. Medical Teacher, 32, 16–27.
Anderson, L. W., Krathwohl, D. R., Airasian, P. W., Cruikshank, K. A., Mayer, R. E., Pintrich, P. R., . . . Wittrock, M. C. (Eds.). (2001). A taxonomy for learning, teaching, and assessing: A revision of Bloom’s taxonomy of educational objectives (Complete ed.). New York, NY: Longman.
Benek-Rivera, J., & Mathews, V. E. (2004). Active learning with “jeopardy”: Students ask the questions. Journal of Management Education, 28, 104–118.
Biggs, J. (1996). Enhancing teaching through constructive alignment. Higher Education, 32, 347–364.
Biggs, J. (2012). What the student does: Teaching for enhanced learning. Higher Education Research and Development, 31, 39–55.
Chickering, A. W. (2006). Every student can learn—if. About Campus, 11(2), 9–15.
Crisp, B. R. (2007). Is it worth the effort? How feedback influences students’ subsequent submission of assessable work. Assessment & Evaluation in Higher Education, 32, 571–581.
Crisp, G. T. (2012). Integrative assessment: Reframing assessment practice for current and future learning. Assessment & Evaluation in Higher Education, 37, 33–43.
Dermo, J. (2009). e-Assessment and the student learning experience: A survey of student perceptions of e-assessment. British Journal of Educational Technology, 40, 203–214.
Entwistle, N. J., & Peterson, E. R. (2004). Conceptions of learning and knowledge in higher education: Relationships with study behaviour and influences of learning environments. International Journal of Educational Research, 41, 407–428.
Entwistle, N., Tait, H., & McCune, V. (2000). Patterns of response to an approaches to studying inventory across contrasting groups and contexts. European Journal of Psychology of Education, 15, 33–48.
Ericsson, K. A., & Kintsch, W. (1995). Long-term working memory. Psychological Review, 102, 211–245.
Gibbs, G. R. (2007). Analyzing qualitative data (Series: The SAGE Qualitative Research Kit). London, England: Sage.
Glass, A. L., Brill, G., & Ingate, M. (2008). Combined online and in-class pretesting improves exam performance in general psychology. Educational Psychology, 28, 483–503.
Gnarpe, J. (2009). A multiplayer learning game for medical education [Letter to the editor]. Medical Teacher, 31, 483–503.
Gosper, M. (2010). Designing online quizzes: A whole of curriculum approach. In Proceedings of the IADIS International Conference on Cognition and Exploratory Learning in Digital Age (pp. 11–18).
Grissom, R. J., & Kim, J. J. (2012). Effect sizes for research: Univariate and multivariate applications. New York, NY: Routledge.
Horizon Report. (2006). EDUCAUSE Learning Initiative and New Media Consortium. Available at https://net.educause.edu/ir/library/pdf/CSD4387.pdf
Hudson, J. N., & Bristow, D. R. (2006). Formative assessment can be fun as well as educational. Advances in Physiology Education, 30(1–4), 33–37.
Jaehnig, W., & Miller, M. L. (2007). Feedback types in programmed instruction: A systematic review. Psychological Record, 57, 219–232.
Karpicke, J. D., & Roediger, H. L. (2007). Repeated retrieval during learning is the key to long-term retention. Journal of Memory and Language, 57, 151–162.
Kolb, A. Y., & Kolb, D. A. (2009). The learning way: Meta-cognitive aspects of experiential learning. Simulation & Gaming, 40, 297–327.
Krathwohl, D. R., & Anderson, L. W. (2010). Merlin C. Wittrock and the revision of Bloom’s taxonomy. Educational Psychologist, 45, 64–65.
Maxwell, A. (2010). Assessment strategies for a history exam, or, why short-answer questions are better than in-class essays. History Teacher, 43, 233–245.
Milner, R. E. (2014). Learner differences and learning outcomes in an introductory biochemistry class: Attitude toward images, visual cognitive skills, and learning approach. Biochemistry and Molecular Biology Education, 42, 285–298.
Narciss, S., & Huth, K. (2006). Fostering achievement and motivation with bug-related tutoring feedback in a computer-based training for written subtraction. Learning and Instruction, 16, 310–322.
Nicol, D. J., & Macfarlane-Dick, D. (2006). Formative assessment and self-regulated learning: A model and seven principles of good feedback practice. Studies in Higher Education, 31, 199–218.
Peat, M., & Franklin, S. (2002). Supporting student learning: The use of computer-based formative assessment modules. British Journal of Educational Technology, 33, 515–523.
Peat, M., Franklin, S., Devlin, M., & Charles, M. (2005). Revisiting the impact of formative assessment opportunities on student learning. Australasian Journal of Educational Technology, 21, 102–117.
Rocco, S. (2007). Online assessment and evaluation. New Directions for Adult & Continuing Education, 2007(113), 75–86.
Shute, V. J., Hansen, E. G., & Almond, R. G. (2008). You can’t fatten a hog by weighing it—or can you? Evaluating an assessment for learning system called ACED. International Journal of Artificial Intelligence in Education, 18, 289–316.
Thatcher, D. C. (1990). Promoting learning through games and simulations. Simulation & Gaming, 21, 262–273.

Rachel Milner (rmilner@ualberta.ca) is a teaching professor in the Department of Biochemistry, Jonathan Parrish is a teaching associate professor in the Department of Biochemistry, Adrienne Wright is a teaching professor in the Department of Biochemistry, Judy Gnarpe is a teaching professor in the Department of Medical Microbiology and Immunology, and Louanne Keenan is an associate professor in the Department of Family Medicine and director of community engaged research, Division of Community Engagement, all at the University of Alberta in Edmonton, Alberta, Canada.



