
Running Head: SIMULATION EDUCATION EFFECTS

Simulation Education and the Effects on Retention and Self-Confidence


Gladys Dushane
University of Central Florida



Abstract
The Institute of Medicine published a report in 2000 titled To Err Is Human: Building a
Safer Health System. The report prompted a movement toward practices and systems
designed to improve patient safety, including the creation of better learning environments
for healthcare professionals. Simulation is being used as a learning tool to improve
patient safety (Klipfel et al., 2014). With the growing popularity of simulation, the
effectiveness of this learning environment must be assessed. Literature was obtained from
several databases to determine whether simulation education affects retention of
knowledge and self-confidence in health professionals. The literature search did not
support improved retention of knowledge from simulation education, but it did indicate
that simulation education has a positive impact on self-confidence. Recommended next
steps include research directed at determining whether there is a direct link between
improved self-confidence and positive patient outcomes.
Keywords: simulation, simulation education, simulation-based education, SBT, knowledge, retention, self-confidence



Significance
In order to effectively treat the general population in the healthcare setting, healthcare
professionals must first be properly educated. Meeting the needs of a variety of learning
styles requires multiple teaching methods, delivered as efficiently as possible. Simulation
may be an answer to educating healthcare professionals both effectively and efficiently
(Aggarwal et al., 2010).
There is a recent emphasis on the use of simulation to assist in educating health
professionals, and many undergraduate and graduate programs are using simulation as an
enhanced teaching method (Bradley, 2006). Simulation allows for errors and professional
growth without risking patient safety; health professionals and students are able to
improve their skills without harming patients in the learning process.
According to Bradley (2006), simulation is the imitation of behaviors or situations by
way of an apparatus for the purpose of training. Training in healthcare once required a
live subject and a hands-on approach in order to carry out a procedure from beginning to
end. Simulation education allows procedures to be performed from beginning to end
without the use of human subjects, and it is used for everything from minor procedures to
critical processes. Health professionals can, for example, simulate a resuscitation from
beginning to end in the hope of being better prepared during a live resuscitation. The
range of simulation is broad and the possibilities continue to evolve; as simulation labs
grow in popularity, even large and complicated simulations such as surgical procedures
are now possible.
The cost of simulation is largely unknown; it is often not reported or fully accounted for
in research studies (Zendejas et al., 2013). Lapkin and Levett-Jones (2011) reported that
medium-fidelity simulation was as effective as high-fidelity simulation at one-fifth the
cost. This suggests that, going forward, hospitals and healthcare settings trying to cut
costs will likely use medium-fidelity simulation when educating staff, students, and
professionals.
Retention of the information taught to health professionals is typically tested before
degrees, licenses, or certificates are granted; it is the defining measurement of whether a
healthcare student becomes a healthcare professional. Self-confidence is not measured in
the same manner (Aggarwal et al., 2010). Self-confidence cannot be measured with a
posttest, so it is a self-reported assessment. The lack of a direct, objective measurement
does not negate its importance: self-confidence is one of the most powerful regulators of
behavior and performance (Druckman, 1994).
In today's society, error is unacceptable. Simulation provides a hands-on environment
where errors can be made and corrected without harm to the patient and without
distressing effects on the provider (Aggarwal et al., 2010). The purpose of this paper is to
explore whether simulation positively affects retention of information and self-confidence
when used as a teaching method for health professionals.
Methods
The literature search used the databases CINAHL, MEDLINE, Academic OneFile,
ScienceDirect, and OAIster, limited to articles published in English between January
2004 and January 2014. The databases were searched using the keywords: healthcare
professions, simulation, education, self-confidence, and retention.



Inclusion criteria were results that discussed training, instruction, simulation methods and
models, simulation environments, education, research, and confidence. Exclusion criteria
were results that discussed patient outcomes, cost, profitability, length of stay, and patient
safety. Article quality and level of evidence were determined using criteria published by
Melnyk and Fineout-Overholt (2014).
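As an illustration only (not part of the original search protocol), the following minimal sketch shows how the stated date range and inclusion and exclusion topics could be applied programmatically to a set of candidate records; the record fields, titles, and topic sets are hypothetical.

```python
from datetime import date

# Hypothetical candidate records: (title, publication date, topics discussed)
candidates = [
    ("Simulation training and confidence in OB emergencies", date(2009, 6, 1),
     {"simulation methods", "education", "confidence"}),
    ("Cost analysis of a simulation laboratory", date(2011, 3, 1),
     {"cost", "profitability"}),
    ("Team training before 2004", date(2002, 5, 1),
     {"training", "education"}),
]

# Screening parameters taken from the Methods section
START, END = date(2004, 1, 1), date(2014, 1, 31)
INCLUDE = {"training", "instruction", "simulation methods", "simulation environment",
           "education", "research", "confidence"}
EXCLUDE = {"patient outcomes", "cost", "profitability", "length of stay", "patient safety"}

def keep(pub_date, topics):
    """Keep a record if it falls in the date window, touches at least one
    inclusion topic, and touches no exclusion topic."""
    in_window = START <= pub_date <= END
    return in_window and bool(topics & INCLUDE) and not (topics & EXCLUDE)

retained = [title for title, pub_date, topics in candidates if keep(pub_date, topics)]
print(retained)  # only the first hypothetical record survives screening
```

In this sketch the second record is dropped by the exclusion topics and the third by the date window, mirroring how the stated criteria narrow a candidate pool.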
Results
Search Results
In the collection phase, 23 potential articles were identified; six articles were kept and 17
were eliminated. Articles were eliminated if retention of the information taught and
confidence were not both measured in the study. The six articles include two cohort
studies, two randomized controlled trials, and two descriptive studies. The following
summaries, organized by the original table's columns (primary study and country; design,
level of evidence, and sample; characteristics of intervention; results), highlight the
themes of retention and self-confidence for each study.
Primary study, country: Miller et al. (2014), United States.
Design, level of evidence, sample: Cohort and longitudinal study; Level IV; 312 health professional participants from October 2009 to February 2012.
Characteristics of intervention: 10-hour curriculum consisting of a two-hour online module, three 45-minute hands-on workshops, and two immersion simulation scenarios. Simulations were repeated four times, and comparisons were made to earlier simulation trials. To evaluate performance, trained content experts completed quantitative assessments of the students' response skills.
Results: Knowledge items: students had a 31.9% improvement over pretest scores after the online mini-course (no data table available and no p value shown). Students scored better with repeated performance: safety, performance 2 vs. performance 1, p < 0.001, OR 9.00; safety, performance 4 vs. performance 3, p < 0.001, OR 4.57. Between 79% and 92% of participants indicated improved confidence in five areas: crisis communication (91.7%), situational awareness (85.7%), maintaining safety in an emergency (85.2%), triage (85.2%), and crisis leadership (79.2%).

Primary study, country: Birch et al. (2007), United Kingdom.
Design, level of evidence, sample: Cohort study; Level IV; 36 obstetric staff (junior and senior medical staff and midwives) in six teams of six members each.
Characteristics of intervention: Lecture-based teaching (LBT), simulation-based teaching (SBT), or a combination of lecture and simulation (LAS). Two teams had LBT (a full day of training), two teams had SBT (a full day of simulation), and two teams had LAS (half a day of lecture and half a day of simulation).
Results: Knowledge, short term: LAS made the greatest improvement in post-training score, improving 98 points; SBT improved 74 points; LBT improved 75 points. Knowledge, long term: SBT continued to improve by an additional 25 points, LBT declined 4 points, and LAS declined 3 points (p = 0.086). Confidence, short term: LAS improved 14 points; SBT and LBT improved 9 points each. Confidence, long term: SBT was maintained, LBT decreased 3 points, and LAS decreased 5 points.

Primary study, country: Sørensen et al. (2009), United Kingdom.
Design, level of evidence, sample: Descriptive, longitudinal study; Level VI; 147 health professionals for the first training and 192 for the second training, from June 2003 to June 2006.
Characteristics of intervention: Two training sessions (first sample: postpartum bleeding and shoulder dystocia; second sample: preeclampsia and neonatal resuscitation) using simulated environments, with 12 participants per two-and-a-half-hour session; a lecture was followed by a training workshop.
Results: Neonatal resuscitation knowledge: correct answers increased from 65% to 94% at early post-testing (p < 0.001), and late post-testing remained significantly higher than the pretest (p = 0.001). Postpartum bleeding: confidence scores did not improve immediately but had improved when measured 9-15 months after training (p = 0.007). Shoulder dystocia: confidence improved after initial testing, with no significant reduction at 9-15 months after training (p < 0.001). Preeclampsia: confidence improved after initial testing, with no significant reduction at 9-15 months after training (p < 0.001). Neonatal resuscitation: confidence improved after initial testing but decreased from early to late post-testing (p = 0.001).

Primary study, country: Curran et al. (2012), Canada.
Design, level of evidence, sample: Descriptive study; Level V; focus groups with 28 participants and an online survey questionnaire completed by 909 respondents; participants were rural and urban healthcare providers.
Characteristics of intervention: No direct intervention; data were collected through focus groups and online questionnaires.
Results: Overall, respondents reported preferences for methods with a hands-on format, such as practice with an instructor (m = 3.59), practice with another healthcare professional (m = 3.72), mock codes (m = 5.42), and practice with a mannequin (m = 5.74). Confidence levels were highest after recent practice (m = 82.79) and after an update (m = 79.95), and lowest when respondents were not familiar with a new guideline (m = 46.93) or felt their skills had deteriorated (m = 46.54).

Primary study, country: Curran et al. (2010), Canada.
Design, level of evidence, sample: Randomized controlled trial; Level II; 60 third-year medical students.
Characteristics of intervention: The experimental group was exposed to a computerized training simulator (ANAKIN), which includes a manikin simulator, instrumentation actuators and sensors, two-way audio and video, and a computer-mediated assessment program; the control group was exposed to a training video. Everyone received NRP training and was required to perform a megacode. At 4-month and 8-month intervals, students were again exposed to ANAKIN or to the training video.
Results: No significant difference between the mean scores of the study groups on knowledge test 1 (p = 0.927). No significant difference between the mean scores of the study groups on confidence test 1 (p = 0.071). There was a significant difference between confidence scores 1 and 2 (p = 0.000) and between scores 2 and 3 (p = 0.000).

Primary study, country: Kerr et al. (2013), Canada.
Design, level of evidence, sample: Randomized controlled trial; Level II; 27 participants who were residents or internists. Ten participants were randomized to the traditional group, while a group of nine and a group of eight were randomized to the simulation group.
Characteristics of intervention: Groups were randomized to either a simulation-based workshop or a traditional case-based interactive workshop. Both groups were exposed to the same obstetric-internal medicine case (shortness of breath in pregnancy secondary to pulmonary edema from severe preeclampsia). Multiple-choice question (MCQ) scores were used for measurement.
Results: No difference between pre- and post-test scores for the traditional group (p = 0.76, d = 0.76) or the simulation group (p = 0.19, d = -0.44). Self-reported confidence increased in both groups: d = 1.26 (p = 0.003) for the traditional group and d = 0.91 (p = 0.004) for the simulation group.

Retention
Six articles evaluated the association between simulation education and retention. Two
studies, by Miller (2014) and Sørensen (2009), did show improvement with simulation-based
teaching. These were the larger studies, with 312 and 339 (total) participants
respectively, and comprised a cohort study and a descriptive longitudinal study. The
Miller study showed a 31.9% increase over pretest scores, and the Sørensen study showed
a 29-percentage-point improvement in posttest scores (correct answers rose from 65% to
94%). The results were reported as significant, and knowledge was retained as evidenced
by improved posttest scores.
The research presented by Kerr (2013) and Curran (2010), both randomized controlled
trials, did not show a correlation between simulation and retention of information. These
studies were small, however (60 and 27 participants respectively), and the results were
not significant; posttest scores did not improve over pretest scores in either group.
Two further studies, a cohort study by Birch (2007) and a descriptive study by Curran
(2012), had small sample sizes of 36 and 28 participants respectively. The posttests given
supported a score improvement with simulation-based education, but the significance of
the results was not specified.
Self-confidence
Six articles evaluated the association between simulation education and reported
self-confidence. There was consistent evidence that self-confidence improved with
simulation-based education: all six research articles reported improved self-confidence
scores. Sørensen (2009), Curran (2010), and Kerr (2013) had significant results, with
participants reporting higher confidence after simulation-based education than after
lecture-based education. Miller (2014), Birch (2007), and Curran (2012) reported
improvement in posttest self-confidence, but no significance was obtained from the
research.
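To illustrate the kind of pre/post comparison behind the reported effect sizes and p values (for example, Kerr's d = 1.26, p = 0.003), the minimal sketch below computes a paired t-test and a paired-samples Cohen's d for hypothetical self-confidence ratings. The ratings are invented for the example and are not drawn from the reviewed studies; NumPy and SciPy are assumed to be available.

```python
import numpy as np
from scipy import stats

# Hypothetical self-confidence ratings (0-100) before and after a simulation
# workshop for ten participants; values are illustrative, not study data.
pre = np.array([45, 52, 60, 38, 55, 48, 62, 50, 41, 57], dtype=float)
post = np.array([68, 70, 75, 60, 66, 72, 80, 65, 58, 74], dtype=float)

diff = post - pre

# Paired t-test: does the mean change in confidence differ from zero?
t_stat, p_value = stats.ttest_rel(post, pre)

# Cohen's d for paired data (d_z): mean change divided by the SD of the changes.
d_z = diff.mean() / diff.std(ddof=1)

print(f"mean change = {diff.mean():.1f} points")
print(f"t = {t_stat:.2f}, p = {p_value:.4f}, d_z = {d_z:.2f}")
```

A self-reported scale analyzed this way still measures perception rather than performance, which is why the limitations discussed next matter.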
Limitations of the Evidence
Concerns about the evidence included the small sample sizes of the articles retrieved. A
lack of research on simulation outside of resuscitation was also a concern, as was the lack
of randomization and control groups in many of the studies. Limitations of the studies
included the type of healthcare professional studied; for example, medical students
without experience in a clinical area were studied rather than experienced nurses.



One of the biggest limitations in the evidence is that self-confidence is a subjective
experience. Measuring self-confidence from a posttest report is not, in and of itself, a
clinically meaningful statistic. If patient outcomes improved because participants reported
improved self-confidence, a more accurate judgment could be made about the
meaningfulness of positive self-confidence results.
Recommendations
Healthcare Policy
Healthcare policies and procedures are meant to be carried out in a systematic manner,
and simulation education can be a means of having healthcare professionals carry out a
protocol. This is especially true of simulations of resuscitation, which require a group
effort. Retention of information must be the outcome achieved in order for the policies to
be effective. Further research needs to be done with randomized controlled trials, and
combining results from multiple facilities would be a logical next step (see the sketch
below). Retention of the education taught is an important outcome that needs further
research with larger sample sizes.
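As a hedged illustration of what combining results from multiple facilities might look like, the sketch below pools hypothetical per-facility mean improvements in posttest scores using fixed-effect inverse-variance weighting, one common pooling approach. The facility names, effect estimates, and standard errors are invented for the example.

```python
import math

# Hypothetical per-facility results: mean posttest improvement (points) and
# its standard error. These numbers are invented for illustration only.
facility_results = {
    "Facility A": (12.0, 4.0),
    "Facility B": (8.5, 3.0),
    "Facility C": (15.0, 6.0),
}

# Fixed-effect inverse-variance pooling: weight each facility by 1 / SE^2.
weights = {name: 1.0 / se**2 for name, (_, se) in facility_results.items()}
total_weight = sum(weights.values())

pooled_effect = sum(weights[name] * est
                    for name, (est, _) in facility_results.items()) / total_weight
pooled_se = math.sqrt(1.0 / total_weight)

# 95% confidence interval for the pooled improvement.
ci_low = pooled_effect - 1.96 * pooled_se
ci_high = pooled_effect + 1.96 * pooled_se

print(f"pooled improvement = {pooled_effect:.1f} points "
      f"(95% CI {ci_low:.1f} to {ci_high:.1f})")
```

Pooling in this way gives more weight to facilities with more precise estimates, which is one reason multi-facility research can clarify whether the retention findings generalize.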
Education
The education level of the participants involved is important. If a research study concerns
neonatal resuscitation, for example, it is unfair to include participants who are unfamiliar
with the neonatal population. Going forward, recruiting personnel skilled in their specialty
field would produce fairer results.
Practice
The reported improvements in self-confidence scores bear indirectly on performance and
practice. Further testing needs to be completed to determine whether self-confidence does
in fact improve patient outcomes. Studying patient outcomes would be the next logical
step in assessing whether improved self-confidence is a relevant outcome of simulation;
positive patient outcomes would, in practice, make self-confidence an important outcome
of simulation.



References
Aggarwal, R., Mytton, O., Derbrew, M., Hananel, D., Heydenburg, M., Issenberg, B., ...
Reznick, R. (2010). Training and simulation for patient safety. Quality and Safety
in Health Care, 19, i34-i43.
Birch, L., Jones, N., Doyle, P., Green, P., McLaughlin, A., Champney, C., ... Taylor, K.
(2007). Obstetric skills drills: Evaluation of teaching methods. Nurse Education
Today, 27(8), 915-922.
Bradley, P. (2006). The history of simulation in medical education and possible future
directions. Medical Education, 40(3), 254-262.
Curran, V., Aziz, K., O'Young, S., & Bessell, C. (2010). Evaluation of the effect of a
computerized training simulator (ANAKIN) on the retention of neonatal
resuscitation skills. Teaching and Learning in Medicine, 16(2), 157-164.
Curran, V., Fleet, L., & Greene, M. (2012). An exploratory study of factors influencing
resuscitation skills retention and performance among health providers. Journal of
Continuing Education in the Health Professions, 32(2), 126-133.
Druckman, D. (1994). Self-confidence and performance. In Learning, remembering,
believing: Enhancing human performance. Washington, D.C.: National Academy
Press.
Kerr, B., Hawkins, T., Herman, R., Barnes, S., Kaufmann, S., Fraser, K., & Ma, I. (2013).
Feasibility of scenario-based simulation training versus traditional workshops in
continuing medical education: A randomized controlled trial. Medical Education
Online, 18, 1-7. doi:10.3402/meo.v18i0.21312



Klipfel, J. M., Carolan, B. J., Brytowski, N., Mitchell, C. A., Gettman, M. T., &
Jacobson, T. M. (2014). Patient safety improvement through in situ simulation
interdisciplinary team training. Urologic Nursing, 34(1), 39-46.
doi:10.7257/1053-816X.2014.34.1.39
Lapkin, S., & Levett-Jones, T. (2011). A cost-utility analysis of medium vs. high-fidelity
human patient simulation manikins in nursing education. Journal of Clinical
Nursing, 20(23/24), 3543-3552. doi:10.1111/j.1365-2702.2011.03843.x
Melnyk, B. M., & Fineout-Overholt, E. (2014). Evidence-based practice in nursing and
healthcare (3rd ed.). Philadelphia: Wolters Kluwer/Lippincott Williams &
Wilkins Health.
Miller, J. L. (2014). Improving emergency preparedness system readiness through
simulation and interprofessional education. Public Health Reports, 129, 129-135.
Sørensen, J., Løkkegaard, E., Johansen, M., Ringsted, C., Kreiner, S., & McAleer, S.
(2009). The implementation and evaluation of a mandatory multi-professional
obstetric skills training program. Acta Obstetricia et Gynecologica
Scandinavica, 88(10), 1107-1117.
Zendejas, B., Wang, A. T., Brydges, R., Hamstra, S. J., & Cook, D. A. (2013). Cost: The
missing outcome in simulation-based medical education research: A systematic
review. Surgery, 153(2), 160-176. doi:10.1016/j.surg.2012.06.025
