Medical Equipment: Good Design or Bad Design?

Marc Green

Imagine that you are designing a new computerized system for monitoring the vital signs of critical care patients. Because it is a safety-critical system, it must include alarms for high-risk situations. You plan to have the floor nurse who activates the device for a new patient set the alarms, but you also recognize the need for the nurse to review and explicitly confirm all settings before activation. The nurse will set up the system, review the device settings, and confirm that the system is ready for use before activation.

A hypothetical design for the operator interface of your patient monitoring system might look like this. The nurse enters the patient information into a form on the system's computerized display. Next, the computer presents the nurse with a series of screens that show the system's alarm and data monitoring settings. The nurse reviews a screen, confirms it by hitting "enter," and then views the next screen. After three screens of information have been confirmed, the system is set and ready for use.
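The confirmation flow just described can be sketched as follows. This is a hypothetical illustration only, not the actual device software; the screen titles, key names, and function names are invented for the example:

```python
# Hypothetical sketch of the three-screen confirmation flow described above.
# Screen contents and names are invented; this is not the real device code.
SCREENS = [
    "Alarm settings",
    "Data monitoring settings",
    "Patient information summary",
]

def confirm_setup(screens, read_key):
    """Present each settings screen and wait for the confirmation key.

    Note the design property at issue: every screen is dismissed with the
    same identical response ("enter"), so the system cannot distinguish a
    careful review from a blind, automatic triple-tap of the key.
    """
    for title in screens:
        print(title)                    # nurse is supposed to review this
        while read_key() != "enter":    # identical keypress for every screen
            pass                        # ignore anything else
    return True                         # system armed, reviewed or not

# Simulated operator whose responses have "chained": three quick enters,
# with no requirement to process what each screen actually says.
keys = iter(["enter", "enter", "enter"])
armed = confirm_setup(SCREENS, lambda: next(keys))
```

The sketch makes the design flaw concrete: because each screen accepts the same single keypress, the loop completes whether or not the operator has read anything, which is exactly the behavior the next section examines.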

Is this a good design?

The answer is no. A similar device's design caused a patient death when a nurse failed to detect an alarm malfunction. When the hospital first introduced the device, the nurse initially viewed each screen and inspected its contents. With experience, however, her responses "chained" and became automatic: she would confirm each screen's settings by quickly hitting the enter key, without really reviewing the screen's information. On the fateful day, the alarm was not set because of a malfunction, but the nurse failed to notice. The patient subsequently arrested, no alarm sounded, and the patient died.

This error was highly predictable to anyone knowledgeable in human factors. As people learn a task, their behavior shifts from "controlled" to "automatic." In controlled behavior, people closely monitor the world and how their movements change it. With practice, two things happen. First, the person ceases to pay close attention to visual input, so unexpected changes will likely go unnoticed. Second, individual responses chain together to form a single extended response. The action proceeds on "muscle memory," the proprioceptive guidance from the hands themselves rather than from visual information; the first enter response provides the cue for the next. A series of identical responses, such as pressing the enter key three times in a row, strongly encourages response chaining.

For every new patient, the nurse would set the alarm and invariably receive a message confirming that the alarm was set. She unconsciously learned that the screens contained no new information and that her attention was better allocated to other matters. She also likely exhibited "confirmation bias," a powerful mental tendency to seek evidence that verifies already-held beliefs. When the alarm system failed, she missed the screen saying that the alarm was still off. The response chain, once started, ran to completion without conscious supervision.

This error was not due to a mental lapse by the nurse. On the contrary, she had learned what all skilled professionals learn - to ignore irrelevant information and to attend only to what matters. The error was "normal," predictable, and caused by faulty operator interface design.

Conclusion

The lesson is clear: human factors is important not only for making medical devices more efficient and more usable, but also for making them safer. Unfortunately, these goals sometimes conflict. Proper human factors design requires consideration of both usability and safety.

For a more complete discussion of these issues, see Green, M. (2004). "Nursing Error and Human Nature," Journal of Nursing Law, 9, 37-44.



Copyright © 2013 Marc Green, PhD
http://www.visualexpert.com