
REVIEW

Decision-making and safety in anesthesiology


Marjorie P. Stiegler (a) and Keith J. Ruskin (b)

Purpose of review
Anesthesiologists work in a complex environment that is intolerant of errors. Cognitive errors, or errors
in thought processes, are mistakes that a clinician makes despite knowing better. Several new studies
provide a better understanding of how to manage risk while making better decisions.
Recent findings
Heuristics, or mental shortcuts, allow physicians to make decisions quickly and efficiently but may be
responsible for errors in diagnosis and treatment. Using simple decision-making checklists can help
healthcare providers to make the correct decisions by monitoring their own thought processes.
Anesthesiologists can adopt risk assessment tools that were originally developed for use by pilots to
determine the hazards associated with a particular clinical management strategy.
Summary
Effective decision-making and risk management reduce the risk of adverse events in the operating room.
This article proposes several new decision-making and risk assessment tools for use in the operating room.
Keywords
cognitive errors, heuristics, medical errors, risk assessment, safety

INTRODUCTION
The operating room is a complex environment
in which life-threatening critical events may
occur without warning. Anesthesiologists must
make decisions quickly, often with incomplete
data in an environment that is intolerant of
errors. Although accidents and near misses in
the operating room are relatively uncommon on
an individual scale, thousands of adverse events
occur throughout the USA annually. Effective
decision-making, risk assessment, and risk management are, therefore, essential components of
patient safety.
Anesthesiologists were among the first to adopt
crisis resource management (CRM) techniques, and
other specialties soon followed. Although early
CRM and human factors training were modeled
almost entirely on an aviation paradigm, these
safety initiatives have been adapted and extensively
modified by physicians to fit the unique needs of
the patient care environment. A recent editorial
highlighted the growing body of medical literature
(over 140 studies on CRM), demonstrating the
interest and expertise in human factors within
the healthcare community [1]. The early detection
of error-producing situations and the design of
error-resistant systems are now critical components
of research in medical human factors [2&].

Unsafe acts, or deviations, can be divided into two categories: errors, which are unintended, and violations, which are deliberate. Errors may result
from a variety of factors, including unfamiliarity
with a given task, external pressures such as
production pressure, or systemic problems such as
fatigue due to extended work hours. Errors may be
caused by a lack of knowledge (a mistake) or
an unintentional incorrect action (slips of action
or lapses of attention). Cognitive errors are defined
as errors in the thinking process; a physician makes
an incorrect decision even though he or she knows
better. Knowledge-based errors may be prevented
with conventional training, but the solution to
cognitive errors remains elusive.
(a) Department of Anesthesiology, University of North Carolina at Chapel Hill, N2198 UNC Hospitals, North Carolina, and (b) Department of Anesthesiology, Yale University School of Medicine, New Haven, Connecticut, USA

Correspondence to Keith J. Ruskin, MD, Professor of Anesthesiology and Neurosurgery, Yale University School of Medicine, 333 Cedar Street TMP3, New Haven, CT 06520, USA. Tel: +1 203 785 2802; e-mail: keith.ruskin@yale.edu

Curr Opin Anesthesiol 2012, 25:724-729
DOI: 10.1097/ACO.0b013e328359307a
KEY POINTS

• Cognitive errors, or errors in thought processes, are mistakes that a clinician makes despite knowing better.

• Heuristics, or mental shortcuts, allow physicians to make decisions quickly and efficiently but may be responsible for errors in diagnosis and treatment.

• Using simple decision-making checklists can help healthcare providers to make the correct decisions by monitoring their own thought processes.

• Risk management strategies can help to reduce the likelihood that an adverse event will occur and minimize the harm to the patient if it does.

Violations in the operating room are usually the result of an attempt to achieve a goal that is incompatible with safe practice. Skipping a few tasks on the anesthesia machine checkout in order to get a case started on time, for example, might please the hospital administration but increases the risk of an equipment malfunction at a critical point in the procedure. One study of violations in anesthetic
practice found that the most common root cause
is habit: the violation had become part of the
physician's routine practice. The most important
protection against many violations is awareness:
even well-intentioned healthcare providers may
deliberately deviate from safe practice for what
seems like a plausible reason [3].
As with CRM, aviation was the first industry to incorporate strategies for the prevention of cognitive errors as part of a culture of safety. Errors
in judgment and decision-making have been found
to be a contributing factor in more than one in three
accidents and incidents in commercial and general
aviation [4]. In response to this problem, the US
Federal Aviation Administration (FAA) developed
a series of cognitive aids that have decreased the
risk of aviation accidents and incidents. This article
reviews cognitive errors and explains how cognitive
aids that were originally developed for pilots can
be adapted to improve risk assessment, judgment,
and decision-making by healthcare personnel,
potentially improving patient safety.

HEURISTICS
A heuristic is a mental shortcut that allows a person to make a decision more quickly, frugally, or accurately by ignoring part of the information [5&&].
Heuristics minimize the amount of complex
thinking a person has to do and are often linked
to subconscious processing. They are inherently
neither good nor bad, and they can be helpful when applied in situations to which they have been adapted [6]. Occam's razor is an example of a
medical heuristic; it dictates that the simplest
explanation for all presenting symptoms should
be sought. Trainees are often told "common things are common," or "if you hear hoofbeats, think of horses instead of zebras." These heuristics were
developed to protect novices against availability
bias (see Table 1 [12]) because striking and novel
diseases (fascinomas) are more easily remembered,
potentially causing the physician to overestimate
their base rate and pretest probability.
Medical heuristics combat subconscious availability bias, but they can also lead
a physician astray. Elstein and Schwartz [7] describe
heuristics as mental shortcuts commonly used
in decision-making that can lead to faulty reasoning
or conclusions. Experts learn to ignore some
information when making critical decisions; this
is especially true in fundamentally uncertain
domains such as medicine [6]. They also rely heavily
on cognitive shortcuts and intuitive processes,
especially when making high-stakes decisions under
time pressure. This environment may be particularly
prone to cognitive error [8,9]. Simple heuristics have
been shown to be more accurate than standard
statistical methods when some relevant information
is unknown or must be estimated from samples [5&&]
but may also lead to significant errors. Over-reliance
upon subconscious processes and mental shortcuts,
or using these aids in the wrong circumstances, may
result in cognitive errors.

COGNITIVE ERRORS
Cognitive errors are defined as thought-process
errors, or thinking mistakes that lead to incorrect
diagnoses, treatments, or both [10&&]. They are often
linked to failed heuristics and subconscious biases
and occur despite the availability of adequate
knowledge and data. Groopman [11] has stated that
technical errors account for only a small fraction of
incorrect diagnoses and treatments. Most errors are
mistakes in thinking. These thinking mistakes are
caused in part by subconscious processes, including
biases that may not even be recognized. Table 1 lists
several examples of cognitive errors that may be
particularly relevant to anesthesiology, but is not
a comprehensive list [12].


Table 1. Cognitive Errors

Premature closure: Accepting the first plausible diagnosis before it has been fully verified. ("No problem can withstand the assault of sustained thinking." Voltaire)

Feedback bias: Significant time elapses between actions and consequences, or outcome data are not reported; the absence of feedback is subconsciously processed as positive feedback. ("The greatest of faults is to be conscious of none." Thomas Carlyle)

Confirmation bias: "Believing is seeing": seeking confirming evidence to support a diagnosis while discounting disconfirming evidence, despite the latter sometimes being more definitive. ("I will look at any additional evidence to confirm the opinion to which I have already come." Lord Molson)

Availability bias: Error due to an emotionally memorable past experience (usually negative), subconsciously ignoring important differences between the current presentation and that prior experience. ("Nothing fixes a thing so intensely in the memory as the wish to forget it." Michel de Montaigne)

Omission bias: Tendency toward inaction rather than action, out of fear of failure or being wrong. May be especially likely when a significant authority gradient is perceived or real. ("The man who makes no mistakes does not usually make anything." Edward Phelps)

Commission bias: Tendency toward action rather than inaction, even when those actions are unindicated or founded on desperation. ("You will do foolish things, but do them with enthusiasm." Sidonie-Gabrielle Colette)

Sunk costs: Phenomenon in which the more effort and commitment are invested in a plan, the harder it may become psychologically to abandon or revise that plan. ("Insanity is doing the same thing over and over again and expecting different results." Albert Einstein)

Anchoring/fixation: Focusing on one feature exclusively, at the expense of comprehensive understanding. This may lead to misdiagnosis of a single problem, or to missing concurrent diagnoses by focusing on just one. ("He who has a one-track mind, his train of thought often becomes derailed." Arthur Blank)

Framing effect/unpacking principle: Allowing early presenting features to unduly influence decisions, particularly as related to transfer of care from one person or team to another. ("An error does not become truth by reason of multiplied propagation." Gandhi)

Overconfidence/denial: Inappropriate boldness or misplaced certainty of abilities; refusal to acknowledge a dire situation when faced with it. Heavily represented in quality assurance investigations, lawsuits, and morbidity and mortality conferences [11]. ("Thinking you know when in fact you don't is a fatal mistake." Bertrand Russell)

Outcome bias: Judging a decision on the eventual outcome, rather than on the merits of the decision at the time it was made. ("All's well that ends well.")

Adapted with permission from [12].

PREVENTING AND RECOVERING FROM COGNITIVE ERRORS

Strategies for prevention of and recovery from cognitive errors are not well established in anesthesiology, but some concepts can easily be adapted from other medical disciplines and from cognitive psychology. Some techniques can be employed at the time of decision-making, whereas others can be taught as part of a generic educational curriculum.
Generic educational strategies include familiarization with the major classes of heuristics, facilitating
prediction of the various circumstances under which
they might fail [13]. Teaching physicians about common cognitive errors and how to use routine practices
of thought-process scrutiny may help to identify
these traps when they occur. Studies of unconscious
mental influences demonstrate that self-awareness
leads to better management of these cognitive
distortions [14]. Unfortunately, physicians, like all
humans, are generally poor at accurately assessing their own performance, and may have poor insight into their own thought processes.
Cognitive forcing strategies are specific debiasing techniques that introduce self-monitoring into
the decision-making process. They are designed to
prevent errors by reducing automated, heuristic thinking and instead forcing deliberate, conscious
consideration of alternatives [13]. Although few
specific strategies have been developed for use
by anesthesiologists, cognitive forcing strategies
from outside disciplines can be used as examples.
Radiologists use systematic deconstruction as a specific strategy to combat confirmation bias. The radiologist reads every chest film in the exact same systematic way, reporting on all structures and findings, whether positive or negative, regardless of the study indication or clinical question asked. Emergency medicine physicians use "rule out the worst scenario" to increase the probability that all critical diagnoses have received consideration [15]. A similar maxim from emergency medicine states that "the most commonly missed fracture in the emergency department is the second" [16]. In other words, when a fracture or significant soft-tissue abnormality is found, the search should be continued for additional injuries [13].
Anesthesiologists can follow a Rule of Three
for diagnoses and therapeutic interventions. For
example, transient hypotension during induction
of anesthesia may be common, but if a treatment
(e.g., a vasopressor or fluid bolus) is initiated and
then repeated without effect, a differential of at least
three other diagnostic possibilities must be entertained before a third attempt at the same intervention is undertaken. Furthermore, any explanation for a given problem (e.g., hypotension is caused by surgical bleeding) must be weighed against at least three other diagnoses, even if the evidence for surgical bleeding seems compelling. This Rule of Three, proposed
here for the first time, forces consideration of
alternatives and prevents, among others, premature
closure, anchoring, sunk costs, framing, and confirmation bias.
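As a purely illustrative sketch, the Rule of Three could be expressed as a short decision-support prompt; the function name, the two-attempt threshold, and the example differential below are hypothetical assumptions, not part of any published or validated tool.

def rule_of_three_prompt(intervention_attempts, differential):
    """Warn before the same intervention is repeated a third time without
    at least three alternative diagnoses on the differential."""
    # Two attempts without effect trigger the rule described above.
    if intervention_attempts >= 2 and len(differential) < 3:
        return ("Two attempts without effect: entertain at least three other "
                "diagnostic possibilities before repeating this intervention.")
    return None

# Example: hypotension treated twice with a vasopressor, no response yet,
# and only one alternative diagnosis considered so far.
print(rule_of_three_prompt(2, ["hypovolemia"]))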

DECISION-MAKING AND RISK ASSESSMENT
Clinical decision-making is a process in which
a physician uses his or her knowledge to reason
through a novel situation in order to arrive at
a diagnosis or treatment plan [13]. Over 75% of
diagnostic errors can be attributed to cognitive
errors and faulty decision-making [17]. Having
already discussed cognitive errors and their related
strategies, we now present three additional tools:
the 3-P and DECIDE decision-making models,
and the PAVE risk-assessment method.
The 3-P decision-making tool, originally developed by the FAA (Fig. 1) [18], is one of the simplest cognitive forcing strategies.
To use this tool, the clinician first perceives that
the clinical situation has changed. He or she then
processes this information and determines a course
of action necessary to obtain a desirable outcome.
The last step is then to perform the needed actions.
This process is a closed loop: the clinician then
returns to the perceive step to determine whether
the intervention that was just made was successful.
This process assumes that every action produces
some kind of effect. Using this process causes the
clinician to actively look for that effect within an appropriate time frame.

FIGURE 1. The Perceive, Process, Perform cognitive forcing tool. Adapted with permission from [18].

If the effect does not occur, the most likely explanations are that the
clinician either misdiagnosed the problem or did
not apply the correct therapeutic intervention for
the correctly diagnosed problem. This tool can help
to prevent both knowledge-based errors (mistakes)
and inadvertent errors (slips), because the lack of
a successful outcome forces the anesthesiologist to
re-evaluate the clinical situation instead of blindly
pursuing an incorrect course of action.
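The closed-loop character of the tool can also be written out explicitly; the sketch below is a minimal, hypothetical Python rendering in which all function names and the example values are placeholders, not part of the FAA material.

def three_p_loop(perceive, process, perform, max_cycles=5):
    """Cycle through Perceive -> Process -> Perform until no problem remains."""
    for _ in range(max_cycles):
        situation = perceive()          # has the clinical situation changed?
        if situation is None:
            return True                 # nothing perceived; the loop closes
        plan = process(situation)       # choose the action expected to help
        perform(plan)                   # act, then loop back and re-perceive
    return False                        # effect never observed: re-evaluate

# Example with stub functions standing in for observation and action.
readings = iter([80, 95, None])         # perceived problems, then none
resolved = three_p_loop(
    perceive=lambda: next(readings),
    process=lambda bp: f"treat hypotension (systolic {bp} mmHg)",
    perform=print,
)
print("problem resolved:", resolved)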
The DECIDE model is a more detailed decision-making model that provides an organized method
for analyzing a situation and determining the
best course of action. The DECIDE model consists
of six steps:
(1) detect the fact that something has changed;
(2) estimate the need to react to the change;
(3) choose a desirable outcome;
(4) identify the actions needed to create that outcome;
(5) do the necessary actions; and
(6) evaluate the effects of the actions.
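For illustration only, the six steps can be held as an ordered checklist that is walked through explicitly; the short Python sketch below is hypothetical and simply paraphrases the list above as prompts.

DECIDE_STEPS = [
    ("Detect", "Has something changed?"),
    ("Estimate", "Is a reaction to the change needed?"),
    ("Choose", "What outcome is desired?"),
    ("Identify", "Which actions will create that outcome?"),
    ("Do", "Carry out those actions."),
    ("Evaluate", "Did the actions have the intended effect?"),
]

def decide_checklist():
    """Print each DECIDE prompt in order, forcing every step to be considered."""
    for step, prompt in DECIDE_STEPS:
        print(f"{step}: {prompt}")

decide_checklist()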
DECIDE was originally developed by the FAA
(FAA aeronautical decision-making), but a variant
of this model has been described for use by healthcare managers [19]. The dual-process theory of
decision-making suggests that DECIDE may help
to reduce diagnostic and therapeutic errors. This
theory hypothesizes that there are two distinct types
of reasoning: type 1 thinking is fast and intuitive.
Commonly repeated tasks are relegated to the
subconscious level, enabling a person to perform a
complex task (e.g., driving a car) without much
attention. Much routine clinical work is done with
type 1 thinking. If a situation seems familiar,
however, the physician may become overconfident
and miss critical details, resulting in an error
in diagnosis or treatment. Type 2 thinking is deliberate, slow, and analytic, and requires a substantial
amount of attention. Forcing a clinician to actively
consider a problem and determine a course of action
can decrease the impact of cognitive errors and
failed heuristics [20&&]. The 3-P and DECIDE models are, essentially, cognitive checklists that force the replacement of habitual thinking and pattern recognition with deliberate consideration of alternatives.

RISK MANAGEMENT
Risk is defined as "exposure to the possibility of loss, injury, or other adverse circumstance" (Oxford English Dictionary). This definition can be further
expanded to include the probability and severity of
an injury that occurs as the result of an exposure to a
given hazard. Risk management is a formal method
of evaluating the likelihood of a given hazard and
then formulating a strategy for minimizing the
exposure, decreasing the possibility or severity of
an adverse outcome, or making a decision to avoid
the hazard altogether [21]. The ultimate goal of a risk
management program in anesthetic practice is to
identify and mitigate risks before a patient is harmed
[22]. Most risk assessment tools used by physicians
estimate the probability of a patient
developing a specific medical condition (e.g., a
myocardial infarction after surgery). Few physicians,
however, receive any formal training in risk
management.
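One way to make the probability-and-severity framing concrete is a likelihood-by-severity matrix; the sketch below is a generic, hypothetical example in which the categories, scores, and action bands are assumptions rather than an FAA or anesthesia-specific standard.

LIKELIHOOD = {"improbable": 1, "remote": 2, "occasional": 3, "probable": 4}
SEVERITY = {"negligible": 1, "marginal": 2, "critical": 3, "catastrophic": 4}

def risk_level(likelihood, severity):
    """Combine likelihood and severity into an action band (example cut-offs)."""
    score = LIKELIHOOD[likelihood] * SEVERITY[severity]
    if score >= 9:
        return "unacceptable: mitigate or avoid before proceeding"
    if score >= 4:
        return "acceptable only with mitigation"
    return "acceptable"

# Example: an occasional hazard with critical severity.
print(risk_level("occasional", "critical"))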
Because so many aviation accidents and
incidents are caused at least in part by human error,
the FAA has developed a series of formalized
tools for risk assessment and risk management.
These tools, which have been designed to be easy
to remember and use, are taught to all pilots and
may be adapted by physicians. Risk management is
guided by the following principles:
(1) Accept no unnecessary risk: unnecessary risks
expose the physician or patient to hazards
without providing an appropriate level of
benefit. For example, taking a patient with
known gastroesophageal reflux disease who
has just eaten to the operating room for a carpal
tunnel release exposes him to the unnecessary
risk of aspiration pneumonitis.
(2) Make risk decisions at the appropriate level:
decisions about risk should be made by the
person best equipped to develop an appropriate
mitigation strategy. This is sometimes challenging when more than one attending physician is
caring for a patient (whether surgeon and anesthesiologist, or team of anesthesiologists) and
when care extenders and trainees are involved.
(3) Accept risk when the benefits outweigh the
costs: if the benefits to the patient outweigh
the hazards that have been identified, then a
given risk may be considered acceptable. For
example, anesthesiologists commonly bring a
patient with a full stomach to the operating
room for a repair of an open fracture.
(4) Integrate risk management into planning at all levels: risk is an unavoidable element of
anesthesia and surgery. Whenever possible, risks
should be anticipated and mitigated as much as
possible before the patient is brought to the
operating room.
One of the FAA's formal risk assessment tools,
PAVE, can be easily adapted for use by anesthesiologists. As it was originally developed, PAVE
stands for Pilot, Aircraft, Environment, and External
Pressures. Changing the acronym gives anesthesiologists a simple, step-by-step checklist to estimate
the risks associated with a clinical situation.
(1) Patient: surgical illness and comorbidities.
(2) Anesthesiologist: training and skills, recent
experience, fatigue.
(3) Environment: where is the procedure (e.g.,
operating room or remote location)? What
equipment is available? Who will help if a
problem arises?
(4) External pressures: production pressure.
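As with the decision-making models above, the adapted acronym lends itself to a simple structured checklist; the Python sketch below is purely illustrative, with prompts paraphrased from the list above, and is not a validated instrument.

PAVE_CHECKLIST = {
    "Patient": ["Surgical illness?", "Comorbidities?"],
    "Anesthesiologist": ["Training and skills?", "Recent experience?", "Fatigue?"],
    "Environment": ["Operating room or remote location?",
                    "What equipment is available?",
                    "Who will help if a problem arises?"],
    "External pressures": ["Production pressure?"],
}

def review_pave(notes):
    """Print each PAVE prompt with the clinician's note for that factor."""
    for factor, prompts in PAVE_CHECKLIST.items():
        print(factor)
        for prompt in prompts:
            print("  -", prompt)
        print("  note:", notes.get(factor, "not assessed"))

# Example: a partially completed preoperative assessment.
review_pave({"Patient": "open fracture, full stomach",
             "External pressures": "first case of the day"})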
Although the use of the PAVE risk assessment
tool in anesthetic practice has yet to be studied, it
seems logical to assume that formally assessing the
risks associated with each of these factors will
enhance the anesthesiologist's ability to identify
and mitigate the hazards associated with a surgical
procedure.

CONCLUSION
Effective decision-making and risk management are
critical elements of any strategy to reduce the risk
of adverse events in the operating room. Cognitive
errors are flaws in the thought process that may
result from failed mental shortcuts or biases such
as fixation errors. Cognitive forcing strategies are
techniques that allow an individual to monitor
his or her thought process and decision-making.
Tools such as the Rule of Three, 3-P, and DECIDE
can help a physician to choose the correct course
of action by creating a cognitive checklist; they
force the physician to reflect upon each step of
the decision-making process. Risk management
helps an anesthesiologist to identify and take
steps to mitigate risks before a patient can be
harmed. Although a comprehensive risk-management strategy is beyond the scope of this article,
the PAVE risk-assessment tool can help a physician
to systematically evaluate the risks associated with
a specific course of action.
This article proposes the use of one new
decision-making tool and has adapted several others
for use by anesthesiologists. Incorporating the use of these decision-making tools into clinical practice can decrease the risk of cognitive errors and improve patient safety.
Acknowledgements
Neither of the authors has received commercial sponsorship or funding for this article.
Conflicts of interest
There are no conflicts of interest.

REFERENCES AND RECOMMENDED READING
Papers of particular interest, published within the annual period of review, have
been highlighted as:
& of special interest
&& of outstanding interest
Additional references related to this topic can also be found in the Current
World Literature section in this issue (p. 747).
1. Gaba DM. Crisis resource management and teamwork training in anaesthesia. Br J Anaesth 2010; 105:3-6.
2. & Kontogiannis T. A systems perspective of managing error recovery and tactical re-planning of operating teams in safety critical domains. J Safety Res 2011; 42:73-85.
This article reviews the disadvantages of systems that rely upon error suppression and describes specific techniques that can be used to improve safety with error detection and recovery. Strategies include recovery from fixation errors, structured decision-making, anticipation of problems, and formulation of recovery plans.
3. Beatty PC, Beatty SF. Anaesthetists' intentions to violate safety guidelines. Anaesthesia 2004; 59:528-540.
4. Shappell S, Detwiler C, Holcomb K, et al. Human error and commercial aviation accidents: an analysis using the human factors analysis and classification system. Hum Factors 2007; 49:227-242.
5. && Gigerenzer G, Gaissmaier W. Heuristic decision making. Annu Rev Psychol 2011; 62:451-482.
This is a comprehensive review of the controversies surrounding the science of heuristics, with discussion of how well heuristics work in uncertain conditions, which should be used, and when.
6. Wegwarth O, Gaissmaier W, Gigerenzer G. Smart strategies for doctors and doctors-in-training: heuristics in medicine. Med Educ 2009; 43:721-728.
7. Elstein AS, Schwartz A. Clinical problem solving and diagnostic decision making: selective review of the cognitive literature. BMJ 2002; 324:729-732.
8. Berner ES, Graber ML. Overconfidence as a cause of diagnostic error in medicine. Am J Med 2008; 121:223.
9. Graber ML, Franklin N, Gordon R. Diagnostic error in internal medicine. Arch Intern Med 2005; 165:1493-1499.
10. && Stiegler MP, Neelankavil JP, Canales C, Dhillon A. Cognitive errors detected in anaesthesiology: a literature review and pilot study. Br J Anaesth 2012; 108:229-235.
This is a review of cognitive errors in anesthesiology and is the source of the table included in this article. It is among the first comprehensive reviews of cognitive errors in anesthesiology.
11. Groopman J. How Doctors Think. New York, NY: Houghton Mifflin; 2007.
12. Stiegler MP. Cognitive errors in anesthesiology: making mistakes even when we know better. In: Rosenblatt MBJ, Gross JB, editors. ASA Refresher Courses in Anesthesiology. Philadelphia: Wolters Kluwer Health/Lippincott Williams & Wilkins; 2012.
13. Croskerry P. Cognitive forcing strategies in clinical decisionmaking. Ann Emerg Med 2003; 41:110-120.
14. Fine C. A mind of its own: how your brain distorts and deceives. 1st ed. New York: W. W. Norton and Company; 2006.
15. Croskerry P. The cognitive imperative: thinking about how we think. Acad Emerg Med 2000; 7:1223-1231.
16. Kovacs G, Croskerry P. Clinical decision making: an emergency medicine perspective. Acad Emerg Med 1999; 6:947-952.
17. Graber M. Diagnostic errors in medicine: a case of neglect. Jt Comm J Qual Patient Saf 2005; 31:106-113.
18. Aviation Instructor's Handbook [FAA-H-8083-9A]. US Department of Transportation, Federal Aviation Administration; 2008.
19. Guo KL. DECIDE: a decision-making model for more effective decision making by healthcare managers. Healthcare Manag (Frederick) 2008; 27:118-127.
20. && Ely JW, Graber ML, Croskerry P. Checklists to reduce diagnostic errors. Acad Med 2011; 86:307-313.
This article discusses the ways in which faulty thinking processes can lead to errors in patient care and then proposes a series of decision-making checklists that physicians can use to organize their approach to problem solving.
21. Haimes YY. Systems-based guiding principles for risk modeling, planning, assessment, management, and communication. Risk Anal 2012; 32:1451-1467.
22. Davies J, Aitkenhead A. Clinical risk management in anaesthesia. In: Williams J, Vincent C, editors. Clinical Risk Management: Enhancing Patient Safety. 2nd ed. London, England: BMJ Books; 2001. pp. 111-137.
