Use, Misuse, Skill & Affordances
Marc Green
NOTE: This article appeared in the June 2006 issue of Trial under the title "Human Error Vs. Design Error." It is available online.
An experienced hospital nurse selects a Lasix vial from a cart containing several medications. After checking the label three separate times, she fills a syringe and injects the patient, who subsequently dies. The vial actually contained potassium chloride (KCl), as the label clearly stated. The nurse cannot explain the lapse.
A teenager goes to a nearby lake, where he walks on a pier. He passes two "No Diving" signs and reaches a point where "No Diving" is stenciled on the planks at his feet. He dives into the water, strikes his head on the lake bottom, and suffers spinal cord injuries that leave him paraplegic. He does not remember seeing the signs.
Accidents like these have two possible causes: human error and design error. The accident may have occurred because the actor, or "user," was inattentive, careless, or reckless, or because the product's design was unsafe and the user's behavior merely revealed the flaw.
To determine what caused an accident, many people will ask whether the user "could have done otherwise." If circumstances or poor design limited the user's actions, then he or she is blameless. If the user failed to prevent the mishap through possible action, then the cause is human error.
People often interpret the "could have done otherwise" criterion literally, as meaning that the user should always perform the correct action if it is physically possible. An important distinction exists, however, between possible behavior and likely behavior. Unconscious processes control most of our everyday behavior, and people typically act without conscious deliberation or consideration of risk.
People often cannot "do otherwise" for mental reasons that are just as compelling as physical ones. However, product design is sometimes based on unrealistic, idealized notions of perfect human behavior. A design that requires possible, but unlikely, behavior is just as defective as a design that has electrical or mechanical faults.
In the examples above, most people would say that the nurse was inattentive and the teenager was reckless. However, both assessments are wrong. Their behavior might be better described as skilled and adaptive.
Carelessness or Skilled Behavior?
A large body of scientific research shows that designers can predict user behavior with a reasonable degree of accuracy. However, faulty usability design is often missed, in part because such flaws are so pervasive and in part because most people don't have training in human factors, a discipline that applies scientific knowledge of perception, cognition, and response to product design.
Most people, jurors included, judge behavior based on commonsense "folk psychology" and introspection ("What would I have done?") rather than on scientific analysis. However, common sense often rests on several well-documented cognitive biases.
One is "fundamental attribution error," the strong tendency to attribute causation to "dispositional" factors inside the person (laziness, carelessness, etc.) rather than to environmental circumstances. Another is "hindsight bias," the mistaken belief that an outcome could have been readily predicted before the event occurred. Hindsight bias is strengthened by "counterfactual thinking," the creation of what-if scenarios. People will usually place causation on the mutable (changeable) aspects of an event. If a counterfactual suggests that the user could have changed the outcome by different behavior, then he or she must be the cause. (In contrast, few would say that gravity caused a fall. Gravity is immutable.)
These cognitive biases make for a much simpler world. If the nurse's inattention caused the accident, for example, then the problem could be easily fixed by firing her; if other circumstances were the cause, then the entire system of dispensing drugs might require review. The biases also make the solution cheaper: there is then no need to redesign the system or to hire additional nursing staff.
In contrast, a scientific analysis examines three sets of factors controlling behavior in the situation: 1) physical circumstances, 2) general human abilities, limitations, and predispositions, and 3) individual experiences, expectations, and goals.
The role of physical circumstances is usually obvious. A driver at night might experience dim lighting or short braking time. If physical constraints prevent correct action, then the user could not have done otherwise and is blameless.
In contrast, behavioral limits are less obvious. All humans are genetically endowed with abilities and limitations that fall into three categories: perception (input), cognition (processing), and response (output). People can only attend to so much (perception), remember so much accurately (cognition), or act so quickly (response).
Human predispositions have evolved to help people cope efficiently with a complex world. One example is "stimulus-response compatibility," an innate connection between perception and response.

These predispositions are especially important in stressful situations, when users perceptually narrow their information intake. However, efficiency and accuracy sometimes trade off: predispositions provide a "quick and dirty" answer, which is usually adequate but which may fail in unusual situations. For example, a driver who sees another car approaching on a collision course from the left will automatically steer to the right. This natural tendency to move away from objects heading toward us is fortunate because conscious analysis of a projectile's path could cause a fatal delay. Could a driver do otherwise and steer left to avoid the collision? The answer is "yes" in the literal sense but "no" in a more realistic, behavioral sense.
In addition, humans have many purely mental predispositions, called "heuristics" and "biases," that also operate to quickly guide behavior, reduce cognitive complexity, and conserve mental resources. Fundamental attribution error and hindsight bias are examples, but many others figure prominently in accidents.
One is "cue generalization," the switching from a difficult, attention-consuming visual cue to a simpler one. Recognizing a color or shape, for example, is a much simpler perceptual task than reading. Another powerful predisposition is "confirmation bias," the tendency to seek information that confirms already-held beliefs and tune out contradictory evidence.
Perhaps the most important predisposition is the creation of expectations based on experience. People would be virtually paralyzed if they had to stop and consciously analyze every detail of every situation before acting. Instead, we use experience to develop "schemata" and "scripts"--basic knowledge units that create expectations about how products work, what behavior is appropriate, and what effect each behavior will have. This knowledge need not be acquired through perception at the time of product use, so behavior is faster, smoother, and more efficient.
Users learn to focus attention on the important elements of a situation and ignore its irrelevant or unvarying aspects. They become "inattentionally blind" to information sources that do not affect the achievement of their goal(s). In cognitive psychology, we would say that they have switched from conscious "controlled" behavior to unconscious "automatic" behavior.
Most routine behavior is automatic. People are seldom consciously aware, for example, of how they steer their cars or walk down stairs. These are complex tasks that only seem effortless because we are so skilled at them. Our reliable, highly learned schemata almost eliminate the need to rely on conscious control.
In fact, this is what we often mean by skill: The user has learned to substitute automatic behavior for the slow and consciously controlled behavior typical of beginners. Asking a skilled user to explain his or her automatic behavior is like asking a dropped ball why it fell. It is impossible to verbalize the real causes and scientific principles at work.
Skilled, automatic behavior may fail under unusual circumstances. Since a person pays minimal attention to irrelevant and unvarying aspects of routine tasks, he or she may not notice a change when the situation isn't normal. For example, the nurse who administered the KCl probably scrutinized labels carefully when she first began her nursing career. Reading is a relatively difficult task requiring close attention and higher cognitive processes.
After performing the routine repeatedly without incident, the nurse unconsciously learned that the Lasix vials were always in a particular cart location; the label name, packaging, and location were redundant selection cues. She started selecting vials according to their location in the cart (cue generalization), which was simpler and more efficient because it required less mental processing.
On the day of the accident, the positions had been scrambled, and the KCl occupied the Lasix vial's usual place in the cart. The nurse was merely following an unconscious, efficient schema that had always worked. Her adaptation was skilled behavior, not inattention or carelessness. Skilled users "don't solve problems and don't make decisions; they do what normally works."
In another case, an infant received a fatal overdose of a drug after a pharmacist checked a reference but misread the dosage instructions, seeing an extra 0 that was not there. The pharmacist was a substitute whose previous dispensing experience was with adults, so she unconsciously expected to see a larger, adult-sized dosage. Further, she had consulted a second reference book and made exactly the same error. While seeing may be believing, it is equally true that believing is seeing.
These stories have several morals:
People can sometimes do otherwise in the physical sense, but not in the psychological sense. The nurse, for example, automatically adapted by shifting from reading text to responding to the simpler location cue. Once she had "correctly" selected the vial by location, confirmation bias took hold: a strong expectation based on heavily reinforced experience causes a person to miss or misperceive even the highly obvious. Her two additional checks of the vial's label could merely confirm what she believed was a correct initial selection.
Some may say that this argument is nonsense because the nurse could have read the label and was merely inattentive. I would answer that she could not do otherwise because she was not aware that she was not aware.
Behavior should not be disembodied from its context. Every action occurs in a context that includes the past and the future as well as the present. At the moment of the critical action, the past is reflected in the user's learned expectations and the future in the user's goals. Both expectations and goals direct attention and determine what is relevant and what can be ignored. The experience of successfully selecting a medication by cart location creates the expectation that label information can be ignored. A driver whose goal is to turn left at an intersection will look both left and right, but a driver who intends to turn right will look only to the left and ignore the right.
Statements by people about their automatic behavior usually have little value. Since the controls of automatic behavior largely operate outside of awareness, users cannot consciously introspect about them. The nurse, for example, could not verbalize the reasons for her behavior.
Bad outcomes do not necessarily imply bad behavior. It is easy to blame the nurse in hindsight--after the outcome is known--and to say that she could have done otherwise. Of course, she didn't know the outcome before acting, so the outcome should not be used to judge her behavior. Since most people do not set out to create accidents, their actions must have seemed reasonable to them at the time. To understand an accident, the first question should be, "Why did their behavior seem reasonable to them at the time?"
Circumstances--or more precisely, the people who create circumstances--often cause the accident. These people have the opportunity to make the deliberate choice to ensure proper design. The tendency of people to adapt by reducing mental effort (for example, by switching from reading to other cues) is foreseeable. The design of the hospital's drug-administration system failed to take likely human behavior into account and therefore was unsafe. Safe design must take people for what they are, not what we want them to be.
Use, Misuse, and Perceived Use
Many authorities attribute 80 to 90 percent of accidents to human error. However, the studies behind this figure use statistical analyses that do not consider whether the user's behavior was reasonable given the design or situation. They focus on the user because he or she was nearest in space and time to the mishap, while the designer remains a shadowy figure far from the spotlight.
Users may be blamed for misuse as well. Product designers usually aim for a particular functionality--the car seat is intended to keep the child restrained; the "No Diving" sign is meant to be a warning. User behavior that is not in accord with the designer's intent may be termed product "misuse"--a subtle but direct attempt to make the user the responsible party.
Imagine a designer who intended to create a safe toaster. If it had faulty wiring and electrocuted a user, the manufacturer would doubtless be held responsible because the toaster was poorly designed. Intention is not the criterion for proper design.
In contrast, consider warning signs. If the teenage diver ignored a warning sign, he probably would be blamed for his own injuries because of the assumption that the sign fulfilled its intended function. However, the words on the sign are not intrinsically a warning--they are just words, or perhaps more accurately, just a bunch of lines and squiggles. Whether they will actually function as a warning depends on a large number of design and user variables.
Most products are capable of fulfilling many functions. The user makes his or her own determination about proper use, a perception partly based on unconscious and automatic mental processes.
Instructions may play a role in novel situations but are unlikely to be significant when the task is routine. If intended use and perceived use do not coincide, the designer's viewpoint has no inherent precedence. The issues are whether the designer created a reasonably usable product, reasonably transmitted the intended use to the user, had reasonable expectations of likely user behavior, and avoided conveying unintended or unwanted functions to the user.
The teenage diver's accident shows how intended and perceived function need not be identical. Removed from the context in which it occurred, his behavior looks foolish, and one might conclude that he was just another reckless teenager.
However, the teenager had lived near the lake his entire life, had safely dived many times, and had seen many other people safely dive off the pier over the years, including many on the fateful day. He had never seen or heard of any diving accidents at the lake or seen the no-diving rule enforced. He had even seen hometown newspaper photographs of people diving from the pier.
From the teenager's perspective, he had to weigh the signs' credibility against a lifetime of direct experience. He didn't even "see" the signs that day. They were as irrelevant as the feeling of your shoes on the soles of your feet--which you probably hadn't noticed until just now.
In fact, the "No Diving" signs confused real and unreal hazards. The water depth around most of the pier was sufficient to allow safe diving, which is why no previous accidents had occurred. Unfortunately, the teenager dived into the one small area where the water was shallow. By overwarning, the signs "cried wolf," obscuring the real risk and destroying their own credibility.
Products may also signal unintended functions. One frequent cause is unintended "affordances," which are innate connections between a product's appearance and its possible uses. People will judge proper use simply by looking at the product because "some product characteristics strongly trigger a particular use action."
For example, in one case, a driver placed papers on his dashboard, causing a windshield reflection that partly obscured his view. His car then struck and killed a pedestrian walking along a road. The accident occurred because the driver responded to an unintended function suggested by the vehicle's design. The dashboard was a flat horizontal surface that could be used as a table. The driver would never have had the opportunity to make this error if the dashboard had been properly designed, say, with a curved surface. It was completely foreseeable that some drivers would use the dashboard as a table.
Final Analysis
In sum, this scientific approach to human behavior has several implications for attorneys. First, a user who "could have done otherwise" and prevented an accident is not necessarily at fault. Could-have-prevented does not necessarily imply should-have-prevented.
Second, the invisibility of unconscious mental processes leads to gross overestimation of human error and underestimation of design error.
Third, designs that rely on possible but unlikely behavior are just as defective as designs with faulty electrical or mechanical engineering. They ignore human factors and rest on unrealistic assumptions about perfect human behavior. Such cases should be examined to see whether the error was predictable and whether the design should have been based on a more realistic assessment of likely behavior.
Lastly, the defendant's human factors development process should be closely examined. Products are often developed without proper human factors design or evaluation.