Often, when stories of accidents appear in the news, speculation begins that the hapless operator "made a mistake." With this assumption, many move on, satisfied that the cause has been found: operator error.
A surprising number of industry and government safety programs focus on carrot-and-stick efforts to transform people into reliable performers. Systems thinking, however, tells us that an error is a symptom of a system that needs change. People generally strive to do the right thing and get the task done efficiently; only in hindsight, armed with precise knowledge of the bad outcome, do observers apply the "error" label to the actions and decisions that preceded a mishap. The better question concerns context: were the causal actions or decisions made under unsafe or confusing conditions?
Finding or compensating for errors is only the beginning of pursuing change. Why did the error occur? Why did the decision seem like a good idea at the time? Discovering the factors that shape performance, and then improving the system or interface, will reduce safety risk far more reliably than counseling employees and returning them to an unsafe system.