Behind Human Error
This work takes you behind the "human error" label. Divided into five parts, it summarizes the most significant results, explains the role of normal cognitive system factors in operating safely at the sharp end, shows how hindsight bias always enters into attributions of error, and much more.
What people are saying
LibraryThing Review
User Review - rhbouchard - LibraryThing
A very interesting perspective. The book makes the case for placing yourself in the shoes of those who made mistakes, because hindsight biases any fruitful discussion of problems.
Contents
AN INTRODUCTION TO THE SECOND STORY | 1 |
THE PROBLEM WITH HUMAN ERROR | 3 |
BASIC PREMISES | 19 |
COMPLEX SYSTEMS FAILURE | 35 |
LINEAR AND LATENT FAILURE MODELS | 41 |
COMPLEXITY CONTROL AND SOCIOLOGICAL MODELS | 61 |
RESILIENCE ENGINEERING | 83 |
OPERATING AT THE SHARP END | 97 |
BRINGING KNOWLEDGE TO BEAR IN CONTEXT | 101 |
MINDSET | 113 |
GOAL CONFLICTS | 123 |
HOW DESIGN CAN INDUCE ERROR | 141 |
CLUMSY USE OF TECHNOLOGY | 143 |
HOW COMPUTER-BASED ARTIFACTS SHAPE COGNITION AND COLLABORATION | 155 |
MODE ERROR IN SUPERVISORY CONTROL | 171 |
HOW PRACTITIONERS ADAPT TO CLUMSY TECHNOLOGY | 191 |
REACTIONS TO FAILURE | 197 |
HINDSIGHT BIAS | 199 |
ERROR AS INFORMATION | 215 |
BALANCING ACCOUNTABILITY AND LEARNING | 225 |
SUMMING UP HOW TO GO BEHIND THE LABEL HUMAN ERROR | 235 |
Other editions
Behind Human Error - David D. Woods, Sidney Dekker, Richard Cook, Leila Johannesen, Nadine Sarter. Limited preview - 2017