
[IP] Proper understanding of "The Human Factor" (Risks Digest 23.07)

Date: Thu, 11 Dec 2003 12:15:00 -0600
From: "Don Norman" <don@xxxxxxx>
Subject: Proper understanding of "The Human Factor"

  [Warning: This is not a posting of some news item. It is an essay -- well,
  a lecture -- triggered by two recent RISKS postings, particularly because
  the second posting completely misunderstood the purpose of the first,
  didn't bother to read the book that was being recommended, and exhibited
  an attitude on the part of designers that is the biggest risk of all risks
  -- because it is the kind of attitude that causes the very problems the
  RISKS group is designed to eliminate.  DN]

If we assume that the people who use technology are stupid ("Bubbas"), then
we will continue to design poorly conceived equipment, procedures, and
software, thus leading to more and more accidents, all of which can be
blamed upon the hapless users rather than the root cause -- ill-conceived
software, ill-conceived procedural requirements, ill-conceived business
practices, and ill-conceived design in general. This appears to be a lesson
that must be repeated frequently, even to the supposedly sophisticated
reader/contributor to RISKS.

It is far too easy to blame people when systems fail. The result is that
over 75% of all accidents are blamed on human error.  Wake up, people! When
the percentage is that high, it is a signal that something else is at fault
-- namely, the systems are poorly designed from a human point of view. As I
have said many times before (even within these RISKS mailings), if a valve
failed 75% of the time, would you get angry with the valve and simply
continue to replace it? No, you might reconsider the design specs. You would
try to figure out why the valve failed and solve the root cause of the
problem. Maybe it is underspecified, maybe there shouldn't be a valve there,
maybe some change needs to be made in the systems that feed into the valve.
Whatever the cause, you would find it and fix it. The same philosophy must
apply to people.

Item. I predict that the municipal water and wastewater treatment industry
is in for a series of serious accidents. Why? Because of postings like that
of Dave Brunberg (RISKS-23.06). He was triggered by Mike Smith's
recommendation of the book "The Human Factor" (RISKS-23.04), but did not
bother to read the book. So he tells us of the "Bubba factor" in his
industry, namely, the belief that operators (named "Bubba") are
characterized by stupidity, laziness, and general ineptness. Brunberg
complains that he must make his software work despite the incompetence of
his operators: "you walk a very fine line between making the plant so
inflexible that operators cannot respond to unforeseen problems and giving
Bubba a little too much latitude."

No wonder we continue to have problems. It is this attitude of developers
that causes the very problems they complain about. The book, The Human
Factor, is in fact an excellent argument against Brunberg's point of view.
In it, the author (Kim Vicente) points out that procedural demands, business
practices that reward productivity and punish safety, and the inability of
system designers to understand the real requirements on the plant operators
are what lead to failure. Poor Bubba is yelled at by his bosses for slowing
down production and penalized if he raises questions about safety. If he follows
procedures, he can't meet production requirements. If he violates them --
which is what everyone is forced to do -- he is punished if an accident
occurs. No matter that lots of other Bubbas have warned about that
likelihood.

Let me also recommend the excellent "Field Guide to Human Error
Investigations." Here, the author (Sidney Dekker) points out that the old
view of human error is that it is the cause of accidents whereas the new
view is that it is a symptom of trouble deeper inside a system. Alas, the
"old" view is in actuality the current view, whereas the "new" view is still
seldom understood. (The "new" view has only been around for 50 years, so I
suppose we need to give it more time.) The Field Guide is about aviation,
but it is very applicable to the wastewater industry as well -- and to hospitals,
and emergency crews, and manufacturing plants, and any situation where
accidents are being blamed on people.

The most serious RISK in all this is that people take the easy way out,
blame the operator for incompetence, and then smile smugly from their
air-conditioned office, far away from the plant. As long as this attitude
persists, we will have bigger and bigger accidents.

DISCLAIMER (MILD). My strong recommendation for "The Human Factor" appears
on the back jacket of that book and on my website.  My equally strong
recommendation for the "Field Guide" will appear on my website Real Soon
Now.

Dekker, S. (2002). The field guide to human error investigations. Burlington,
VT: Ashgate.

Vicente, K. J. (2003). The human factor: revolutionizing the way people live
with technology. Toronto: A. A. Knopf Canada.

Don Norman, Nielsen Norman Group and Northwestern University
norman@xxxxxxxxxxx http://www.jnd.org