
Human Errors

To Err Is Human

Human beings, even well-intentioned ones, will invariably make errors in a complex system.
Game Time
Questions:
- Why did your team fail to deliver the message?
- Who should be blamed for this failure?
[Figure: exercise controls labeled "Decrease RPM" and "Increase RPM"]
Anatomy of an Event

[Figure: an event occurs when an initiating action penetrates flawed defenses; behind the defenses lie vision, beliefs, & values; mission; goals; policies; processes; programs; latent organizational weaknesses; and error precursors.]
Myths of Human Error
- Most accidents are due to human error. Only if you include designers and managers: humans who cause accidents are at the end of a long line of people who could have prevented the accident.
- Accidents are caused by people, so eliminate the human element. This cannot be done: automation has to be designed and managed. The operator's role becomes monitor, backup, partner, and people must cope with technology.
- Newer technology doesn't eliminate error. Nor does even newer technology.
Terminology of Human Error

Error
The failure of a planned action to be completed as intended (error of execution), or the use of a wrong plan to achieve an aim (error of planning); the accumulation of errors results in an accident.
Human Error
"An inappropriate or undesirable human decision or behavior that reduces, or has the potential for reducing, effectiveness, safety, or system performance." (Sanders and McCormick, 1993)
"Those occasions in which a planned sequence of mental or physical activities fails to achieve its intended outcome, and when those failures cannot be attributed to chance." (Reason, 1990)
Chance is NOT involved.
Why Do Errors Happen?

Discrete Action Classification
- Errors of Omission - failure to do
- Errors of Commission - failure to do properly
- Sequence Errors - failure to do in the correct order
- Timing / Rate Errors - failure to do at a specific time or rate
Errors of Omission
Involve failure to do something.
An electrician was electrocuted while positioning
himself on a steel framework of an electric
substation.
There were several points of disconnect in order
to shut off power completely to the substation.
He apparently forgot to disconnect one of them.
Errors of Commission
Involve performing an act incorrectly.
A mechanic sitting on a conveyor belt called for his partner to lightly hit the start button to jog the belt a few inches forward.
The helper lost his balance and hit the button hard enough to start the belt at full speed.
The mechanic was pulled between the belt and a steel support 9 in. above the belt.
Sequence Error
Occurs when a person performs some task, or step in a task, out of sequence.
A crane operator lifting a 24-ton stone block had the crane overturn.
Rather than lifting the boom and then rotating it 90 degrees, he rotated the nearly flat boom first, and before he could lift it, the crane overturned.
Timing Error
Occurs when one fails to perform an action within an allotted time, performing either too fast or too slow.
Taking too long to remove one's hand from a workpiece in a drill press is a timing error that can result in a nasty injury.
Error: Two Types

Action Slips:
- Unintentional
- Interplay between contextual ambiguity and human fallibility

Erroneous Actions:
- Intention to perform the action
- Does not include violations
- The line blurs when a systematic strategy results in failure
Human Error Categories

                 Omission                          Commission
  Intentional    Don't lubricate the bearing       Add a little extra grease
  Unintentional  Forget to lubricate the bearing   Add the wrong grease
Mechanisms of Thought
There are three basic mechanisms of thought (Rasmussen):
1. Skill-Based
2. Rule-Based
3. Knowledge-Based
These mechanisms span the range from unconscious to conscious thought processes.
Mechanisms of Thought
Skill-Based
Skill is the ability to carry out a task;
Skill-based cognitive processing and
performance refers to actions that are automatic
and easy due to an acquired skill.
They usually happen quickly and without
express effort on the part of the actor.
These are unconscious actions that we don't need to explicitly "think about" in order to accomplish. Any decisions are usually automatic as well.
Skill-Based
Most training is concerned with skill
development, the end goal being the
development of an automatic process.
Typically the actor needs to understand how to
execute a set of instructions, but not understand
the reasons behind them. Through training, the
actor will become proficient enough--skilled
enough--to perform the actions without the need
of instructions.
Rules-based
Rules-based processing involves matching a stored rule to the context and problem currently facing the actor. These rules are typically of the "if X then Y" form, and can be based on past experience, explicit instructions, and so forth.
Rules-based
Rules-based processing comes into play when an automatic skill fails and the actor needs to fall back upon a set of explicit instructions or rules at his disposal. The actor examines and interprets the current situation, and chooses the rule that can best solve the problem.
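The "if X then Y" rule-matching idea can be sketched as a minimal lookup. This is an illustrative sketch only; the alarms and actions below are invented, not from the source:

```python
# Rule-based processing as a stored set of "if X then Y" rules:
# scan the rules and apply the first one whose condition matches
# the current situation. (All alarm/action names are invented.)

rules = [
    (lambda s: s["alarm"] == "low_oil_pressure", "shut down the pump"),
    (lambda s: s["alarm"] == "overheat", "reduce load and ventilate"),
]

def choose_action(situation):
    for condition, action in rules:
        if condition(situation):
            return action
    return None  # no rule applies: fall back on knowledge-based processing

print(choose_action({"alarm": "overheat"}))  # reduce load and ventilate
```

When `choose_action` returns `None`, no stored rule fits the situation; this is exactly the point at which the text says the actor falls back on knowledge-based processing.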
Knowledge-based
If rules-based processing doesn't solve the
problem, we fall back on knowledge-based
processing (we tend to prefer rules-based
solutions since they require less cognitive effort
on our part).
This is what happens when we are truly faced with novel or unfamiliar situations, or where low-level rules aren't appropriate (e.g. making strategic decisions, or establishing a medical diagnosis).
In general, this kind of processing involves the
processing of symbolic information.
Knowledge-based
As with rule-based processing, knowledge-based processing is a conscious process. It refers to what we typically think of as "analytic thought": the processing and analysis of personal, subjective knowledge.
Where skill is the ability to carry out a task, knowledge is the possession of "information, facts, and understanding" about a task. (You may know a lot about a task but still not be able to carry it out.)
Factors affecting thoughts and actions:
- The actual data you get from external sources
- A person's subjective values, ethics, attitude, and the social climate
- Any physical factors impacting a person's performance: fatigue, sleep loss, alcohol, drugs, illness
- Emotional factors: frustration, fear, anger, anxiety
Human failure can be reduced to a simple blunder tree:

Error
├── Error of Planning (Mistake)
└── Error of Execution
    ├── Slip Error
    └── Lapse Error

Slip Errors
An error of execution in which the action conducted was not what was intended; the wrong action is observable.
Lapse Error
An error of execution in which the action conducted was not what was intended; the wrong action is not observable.
Mistake
An error in which the action proceeds as planned but fails to achieve the intended outcome because the planned action was wrong; an error of planning.
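The slip/lapse/mistake distinction maps cleanly onto a small decision procedure. A minimal sketch, assuming a hypothetical helper name and boolean inputs (not from the source):

```python
# Classifier for the blunder tree above:
#   - plan wrong                      -> Mistake (error of planning)
#   - plan right, wrong action seen   -> Slip    (error of execution, observable)
#   - plan right, wrong action hidden -> Lapse   (error of execution, not observable)

def classify_error(plan_was_correct: bool, wrong_action_observable: bool) -> str:
    if not plan_was_correct:
        return "Mistake"  # action proceeds as planned, but the plan was wrong
    return "Slip" if wrong_action_observable else "Lapse"

print(classify_error(plan_was_correct=False, wrong_action_observable=False))  # Mistake
print(classify_error(plan_was_correct=True, wrong_action_observable=True))    # Slip
print(classify_error(plan_was_correct=True, wrong_action_observable=False))   # Lapse
```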
Slips: Errors of Action
A slip is an action not in accord with your intentions: a good plan but poor execution.
Since they are part of automatic, unconscious actions, slips are unintended acts due to a break in the routine.
Capture slips
Capture slips occur when you
automatically do something you didn't
mean to, usually because you fell into a
pattern you perform frequently.
For example, if you dial a particular
phone number often, your fingers get
used to hitting that particular sequence
of pushbuttons.
Description slips
Description slips occur when you haven't correctly told yourself what you want to do, i.e. an "incomplete or ambiguous specification of intention."
This usually happens when your intended action is similar to other actions you do a lot, so that you perform the right action but on the wrong object.
Associative activation slips
Associative activation slips occur when
your brain makes a faulty connection or
mental association between two ideas,
often when one is an external stimulus
that typically provokes a certain action.
A classic example is answering the phone when you hear the doorbell.
Loss of activation slips
Loss of activation slips occur when you lose track of what you're doing or trying to do (the "activation" of the process).
This is essentially a temporary memory loss, often due to an interruption such as someone handing you something, asking you a question, or poking you in the eye.
Mistakes: Errors of Intention
A mistake is a planning failure, where actions go as planned--but the plan was bad. These are errors of judgment, inference, and the like, that result in an incorrect intention, incorrect choice of criterion, or incorrect value judgment.
Slips can often be prevented through checks built into equipment and tools. For example, an O2/N2O ratio limiter prevents an anesthesiologist from accidentally administering a dangerous combination of gases.
Mistakes, on the other hand, stem from cognitive breakdowns and are often influenced by a number of external system factors, making them harder to predict and prevent.
Rule-based errors
Rule-based errors occur when the wrong rule is chosen, due either to misperception of the situation or to misapplication of a rule.
For example, selecting the wrong medication for a patient. The medication may be correctly ordered and administered (i.e. the procedure goes off without a hitch), but it is the wrong medication for that particular patient.
Misperceptions that lead to rule-based errors can stem from a number of sources, including external factors such as an unclear or partially hidden readout on an ICU display, confusing patient charts or lab result displays, and so forth.
Knowledge-based errors
Knowledge-based errors are the most complex of the errors. They typically arise from a lack or misapplication of knowledge. As a result, the intention of the actor is often itself erroneous. Common contributors include:
- The availability bias (choosing a course of action because it is the one that comes most readily to mind)
- The confirmation and overconfidence biases (fixating on a particular course of action and actively pursuing supporting evidence or ignoring contradictory evidence)
Three main fields involved in human
error:
1. Cognitive Science (also known as Cognitive
Engineering) is itself a mix of different
disciplines, including psychology, philosophy,
neuroscience, and artificial intelligence.
Cognitive scientists attempt to understand and
model cognitive abilities such as perception,
learning, language, memory, problem solving,
etc.
2. Human Factors or Ergonomics look at the
specifics of human performance and how it
can be improved. On the computer side,
human factors engineers can help determine
how to lay out the control panels of medical
devices in order to maximize user
performance.
3. Systems Analysis attempts to model systems and organizations in order to understand their functions, including their relationships with other systems and their subsystems. Researchers try to understand how various components of a system can contribute to a problem.
Different Approaches to Human Error

The problem of human error can be viewed in 2 ways:
1. The person approach
2. The system approach
Each has its model of error causation, and each model gives rise to different philosophies of error management.
Person Approach versus System Approach

Person approach:
- Focus on individuals
- Blaming individuals
- Methods: poster campaigns, writing another procedure, disciplinary measures, threat of litigation, retraining, blaming and shaming

System approach:
- Focus on the conditions under which individuals work (rules, expectations, work metrics, communication, etc.)
- Building defenses to avert errors/poor productivity or mitigate their effects
- Methods: creating better systems
Person approach, basis
The long-standing and widespread tradition of the person approach focuses on the unsafe acts (errors and procedural violations) of people on the front line.
Person approach, philosophy
This approach views these unsafe acts as arising primarily from aberrant mental processes such as forgetfulness, inattention, poor motivation, carelessness, negligence, and recklessness.
People are viewed as free agents capable of choosing between safe and unsafe modes of behavior.
If something goes wrong, a person or group must be responsible.
Person approach: countermeasures to errors
The associated countermeasures are directed mainly at reducing unwanted variability in human behavior:
Posters that appeal to people's fear, disciplinary measures, threat of litigation, retraining, naming, blaming, and shaming.
Followers of this approach tend to treat errors as moral issues, assuming that bad things happen to bad people: what has been called the "just-world hypothesis."
Person approach, why?
Blaming individuals is emotionally more satisfying than targeting institutions.
Uncoupling a person's unsafe acts from any institutional responsibility is in the interests of managers.
The person approach is also legally more convenient.
Person approach: shortcomings
Although some unsafe acts in any sphere are egregious, most are not. In aviation maintenance, about 90% of quality lapses were judged blameless.
Person approach: shortcomings
Effective risk management depends crucially on establishing a reporting culture. Without a detailed analysis of mishaps, incidents, near misses, and free lessons, we have no way of uncovering recurrent error traps.
The complete absence of such a reporting culture contributed crucially to the Chernobyl disaster.
Trust is a key element of a reporting culture, and this in turn requires the existence of a just culture: one where the line is clearly drawn between blameless and blameworthy actions.
Person approach: shortcomings
Focusing on the individual origins of error isolates unsafe acts from their system context.
Two important features of human error tend to be overlooked:
- It is often the best people who make the worst mistakes: error is not the monopoly of an unfortunate few.
- Far from being random, mishaps tend to fall into recurrent patterns. The same set of circumstances can provoke similar errors, regardless of the people involved.
The pursuit of greater safety is seriously impeded by an approach that does not seek out and remove the error-provoking properties within the system.
Blame and punishment
Anticipation of blame promotes cover-up.
Fear of criticism in close calls and near misses precludes rational analysis of possible injury precursor mechanisms, and thus the opportunity for constructive accident prevention.
Quit Complaining
Your Job Could Be Worse
System approach
Humans are fallible and errors are to be
expected, even in the best organizations
Errors are seen as consequences rather
than causes, having their origins not so
much in the perversity of human nature as
in upstream systemic factors.
Work Systems Theory

[Figure: the elements of a work system (tasks, technology, person, environment, organization, individual characteristics) interact; a misfit among them produces outcomes such as errors, performance, quality of care, satisfaction, and injury/illness.]
System approach: countermeasures to errors
Although we cannot change the human condition, we can change the conditions under which humans work.
A central idea is that of system defenses. All hazardous technologies possess barriers and safeguards. When an adverse event occurs, the important issue is not who blundered, but how and why the defenses failed.
The Swiss cheese model: how defenses, barriers, and safeguards may be penetrated by an accident trajectory

[Figure: slices of Swiss cheese as defensive layers; holes as weaknesses in the defensive layers.]
The Swiss cheese model of system
accident
Defenses, barriers, and safeguards
occupy a key position in the system
approach.
High technology systems have many
defensive layers: some are engineered,
others rely on people and others
depend on procedures and
administrative controls.
The Swiss cheese model of system accident
In an ideal world, each defensive layer would be intact. In reality, they are more like slices of Swiss cheese, having many holes, although unlike in the cheese, these holes are continually opening, shutting, and shifting their location.
The presence of holes in any one slice does not normally cause a bad outcome. Usually this can happen only when the holes in many layers momentarily line up to permit a trajectory of accident opportunity, bringing hazards into damaging contact with victims.
The Swiss cheese model of system accident
The holes in the defenses arise
for 2 reasons:
1. Active failures
2. Latent conditions
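The hole-alignment idea can be illustrated with a toy Monte Carlo simulation: an accident trajectory succeeds only when a hole happens to be open in every defensive layer at the same moment. This is a sketch under invented probabilities; the layer names follow the engineered/people/procedures split described in the source, but the numbers are arbitrary:

```python
import random

# Toy Swiss cheese simulation: each defensive layer has some probability
# of a hole being open at a given moment. A hazard reaches the victim
# only if it passes through a hole in EVERY layer. (Probabilities are
# invented for illustration.)
LAYERS = {
    "engineered safeguards": 0.05,
    "people": 0.10,
    "procedures / administrative controls": 0.08,
}

def trajectory_penetrates(rng):
    """One moment in time: does the hazard pass through every layer?"""
    return all(rng.random() < p for p in LAYERS.values())

rng = random.Random(42)
trials = 100_000
accidents = sum(trajectory_penetrates(rng) for _ in range(trials))
# Long-run rate is the product 0.05 * 0.10 * 0.08 = 0.0004:
# each individual hole is common, but full alignment is rare.
print(f"accident rate approx {accidents / trials:.5f}")
```

The point of the sketch is the multiplication: adding an independent layer, or shrinking any one layer's holes, cuts the overall accident rate multiplicatively, which is why the system approach invests in defenses rather than only in individuals.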
Two Kinds of Error
- Active Error
- Latent Error (leading to latent conditions)
Active Error
An error that occurs at the level of the
frontline operator and whose effects are
felt almost immediately
Latent Error
Errors in design, organization, training, or maintenance that lead to operator errors and whose effects typically lie dormant in the system for lengthy periods of time.
Latent Errors
Adverse consequences may lie dormant within the system for a long time, only becoming evident when they combine with other factors to breach the system's defences.
These errors are committed by those far removed in time and space from the immediate area: designers, high-level decision makers, managers, and maintenance personnel.
Their decisions are shaped by various factors: economic, political, and practical constraints.
Differences

Active Errors:
- Unsafe acts committed by those at the sharp end (operators)
- These actions can have immediate adverse consequences

Latent Errors:
- Created as a result of decisions taken at the highest levels of the organisation (choice of equipment, SOPs)
- Their damaging consequences become evident when they combine with local trigger factors
Anatomy of an Event

[Figure: an event occurs when an initiating action penetrates flawed defenses; behind the defenses lie vision, beliefs, & values; mission; goals; policies; processes; programs; latent organizational weaknesses; and error precursors.]
Active failures in the Swiss cheese model
Active failures are the unsafe acts committed by people who are in direct contact with the patient or system (slips, lapses, mistakes, and procedural violations).
Active failures have a direct and usually short-lived effect on the integrity of the defenses.
Latent conditions in the Swiss cheese model
Latent conditions are the inevitable resident
pathogens within a system.
They arise from decisions made by designers,
builders, procedure writers, and top-level
management.
They can translate into error-provoking
conditions within the workplace (time pressure,
understaffing, inadequate equipment, fatigue,
and inexperience)
They can create long-lasting holes and
weaknesses in the defenses (untrustworthy
alarms and indicators, unworkable procedures,
design and construction deficiencies).
Interaction between active failures and latent conditions
Latent conditions may lie dormant within the system for many years before they combine with active failures and local triggers to create an accident opportunity.
Active failures are often hard to foresee, but latent conditions can be identified and remedied before an adverse event occurs.
This approach leads to proactive rather than reactive risk management.
Active failures are like mosquitoes: they can be swatted one by one, but they still keep coming. The best remedies are to create more effective defenses and to drain the swamps in which they breed. The swamps, in this case, are the ever-present latent conditions.
Strategic Approach
Re + Md → ØE
1. Anticipate and prevent active errors (Re) at the job site.
2. Identify and eliminate latent organizational weaknesses (Md).
The choice is very simple:
Either you manage human error, or human error will manage you.
"Errors must be accepted as evidence
of systems flaws, not character flaws"

(Leape, 1997)