Workplace Accidents and Human Error
When a workplace accident occurs, we often focus attention on the individual and the errors he or she made. But that’s not the only way to look at workplace accidents and human error.
According to professor of psychology Dr. James Reason, the human error problem can be viewed in two ways: the person approach and the system approach. Each has its own model of error causation, and each model gives rise to quite different philosophies of error management. Understanding these differences has important practical implications for coping with the ever-present risk of mishaps in the workplace.
According to Dr. Reason:
- The person approach focuses on the errors of individuals, blaming them for forgetfulness, inattention, or moral weakness. The emphasis is on unsafe acts.
- The system approach concentrates on the conditions under which individuals work and tries to build defenses to avert errors or mitigate their effects.
High reliability organisations – which have less than their fair share of accidents – recognise that human variability is a force to harness in averting errors, but they work hard to focus that variability and are constantly preoccupied with the possibility of failure.
The person approach emphasizes unsafe acts resulting from slips, lapses, fumbles, mistakes, and violations of safety rules. It blames the errors that lead to accidents on human failings such as forgetfulness, inattention, poor motivation, carelessness, negligence, and recklessness.
A serious weakness of the person approach is that focusing on the individual origins of error isolates unsafe acts from their system context. As a result, two important features of human error tend to be overlooked:
- Often, the best people make the worst mistakes.
- Rather than being random, mishaps tend to fall into recurrent patterns. The same set of circumstances can provoke similar errors, regardless of the people involved. The pursuit of greater safety is seriously impeded by an approach that does not seek out and remove the error provoking properties within the system at large.
The person approach remains the dominant tradition. From some perspectives it seems logical. Blaming individuals is seemingly easier and more satisfying than blaming the employer. People are viewed as being capable of choosing between safe and unsafe behaviors, so if something goes wrong, it seems obvious that an individual (or group of individuals) must have been responsible. Seeking as far as possible to separate a person’s unsafe acts from any employer responsibility is clearly in the interests of managers, and it also insulates the employer from a legal perspective.
Nevertheless, the person approach has serious shortcomings, and adherence to this approach is likely to thwart the development of safer workplaces. Without a detailed analysis of mishaps, incidents, near misses, and “free lessons,” we have no way of uncovering recurrent error traps or of knowing where the “edge” is until we fall over it. In plainer terms, by automatically blaming human error, there is less incentive to fully investigate other systemic issues that may have contributed to the accident. The complete absence of such a reporting culture within the Soviet Union contributed crucially to the Chernobyl disaster.
The system approach, on the other hand, recognizes that humans are fallible and errors are to be expected, even in the best organizations. “Errors are seen as consequences rather than causes, having their origins not so much in the perversity of human nature as in ‘upstream’ systemic factors,” says Reason. “These include recurrent error traps in the workplace and the organisational processes that give rise to them.”
Preventing workplace accidents under the system approach is based on the assumption that though you can’t change the human condition, you can change the conditions under which the employee works.
“A central idea is that of system defences,” Reason asserts. “All hazardous technologies possess barriers and safeguards. When an adverse event occurs, the important issue is not who blundered, but how and why the defences failed.”
This is often referred to as the “Swiss Cheese Model.” In an ideal world each defensive layer would be intact. In reality, however, they are more like slices of Swiss cheese, having many holes – though unlike in the cheese, these holes are continually opening, shutting, and shifting their location. The presence of holes in any one “slice” does not normally cause a bad outcome. That usually happens only when the holes in many layers momentarily line up to permit a trajectory of accident opportunity – bringing hazards into damaging contact with victims.
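The arithmetic behind the model is worth making concrete: if each defensive layer fails independently with some small probability, an accident requires every layer to fail at the same moment, so each added layer multiplies the residual risk down. The toy simulation below is purely illustrative (the layer names and probabilities are invented, not from Reason's work):

```python
import random

random.seed(42)

def accident_occurs(hole_probs):
    """An accident slips through only if every defensive layer
    happens to have a hole (i.e. fails) at the same moment."""
    return all(random.random() < p for p in hole_probs)

def accident_rate(hole_probs, trials=100_000):
    """Estimate how often the holes line up across all layers."""
    return sum(accident_occurs(hole_probs) for _ in range(trials)) / trials

# One leaky defence vs. three independent leaky defences,
# each with a 10% chance of having a hole at any given moment.
print(accident_rate([0.1]))            # roughly 0.1
print(accident_rate([0.1, 0.1, 0.1]))  # roughly 0.001
```

The point of the sketch is the comparison, not the numbers: three imperfect, independent defences are far stronger than one, which is why the system approach invests in layered barriers rather than perfecting any single one.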
High reliability organisations are the prime examples of the system approach. They anticipate the worst and equip themselves to deal with it at all levels of the organisation. It is hard, even unnatural, for individuals to remain chronically uneasy, so their organisational culture takes on a profound significance. Individuals may forget to be afraid, but the culture of a high reliability organisation provides them with both the reminders and the tools to help them remember. For these organisations, the pursuit of safety is not so much about preventing isolated failures, either human or technical, as about making the system as robust as is practicable in the face of its human and operational hazards. High reliability organisations are not immune to adverse events, but they have learnt the knack of converting these occasional setbacks into enhanced resilience of the system.
Over the past decade researchers into human factors have been increasingly concerned with developing the tools for managing unsafe acts. Error management has two components: limiting the incidence of dangerous errors and—since this will never be wholly effective—creating systems that are better able to tolerate the occurrence of errors and contain their damaging effects. Whereas followers of the person approach direct most of their management resources at trying to make individuals less fallible or wayward, adherents of the system approach strive for a comprehensive management programme aimed at several different targets: the person, the team, the task, the workplace, and the institution as a whole.
- James Reason: Human Error (amazon.com)
- Minister: Human error not behind French crash (news.yahoo.com)
- How to Ensure Safety at Workplace (visual.ly)
- Human Error Suspected in Propane Gas Explosion in Florida (breitbart.com)
- ‘Human error’ is not a root cause of problems (utcc.utoronto.ca)
- The Hogwarts’ guide to occupational health and safety (rospaworkplacesafety.com)
- Human error causes Taiwan’s nuclear plant shutdown (english.kyodonews.jp)