SafetyDog

A short list of Don’ts

In human error, Human Factors, Patient Safety on November 7, 2010 at 8:09 pm

When the outcome of a process is known, humans can develop a bias when looking at the process itself. When anesthesiologists were shown the same sequence of events, being told there had been a bad outcome influenced their evaluation of the behavior they saw (Woods, Dekker, Cook, Johannesen & Sarter, 2010). This has been shown to be true in other professions as well. The tendency to see an outcome as more likely in hindsight than it seemed in real time is known as hindsight bias. This is why so many failures are attributed to “human error.” In actuality, the fact that most of these failures do not occur on a regular basis shows that, despite the complexity, the humans are somehow usually controlling for failure. It is important to study the failure as well as the usual process that prevents failure.

When following up on a failure in a healthcare system, Woods et al. (2010) recommend avoiding these simplistic but common reactions:
“Blame and train”
“A little more technology will fix it”
“Follow the rules”

  1. Are these the most common responses or just a sample?

  2. Good question. We have all no doubt seen these used, or even recommended one or more of them ourselves, as the solution after an investigation.

    According to Woods et al. (2010), these three are the “usual” recommendations that come out of error investigations. The countermeasures of “blame and train”, “a little more technology will fix it”, and “follow the rules” are described as ineffective because they are based on hindsight bias. All of them carry the underlying assumption that somewhere along the line, the person should have predicted the bad outcome. But barring blatant sabotage of a process, people at the sharp end of an organization who are involved in a bad outcome were pursuing success. And very often they were behaving in a manner that had, in fact, brought success in the past.

    Workarounds, which are often cited as a source of failure, are created in the pursuit of success. According to Colonel Haskins (SIOP LEC, October 2010), people will always train to the test. So if an organization is monitoring administration times of a drug such as Coumadin, employees will do whatever it takes to get that drug given on time. They may skip other safety steps to accomplish this if they run into conflicting priorities. If the organizational metric is the presence of safety behaviors rather than productivity, then safety will always be chosen in times of conflicting priorities. How many investigators explore workarounds to achieve metrics as a possible contributor to failure?

    Woods et al. (2010) are very clear that any error investigator has the obligation to investigate not only the contributions to the error or failed outcome but also WHAT BLOCKED the operator from seeing that the outcome would be failure. What made sense to them at the time as the right thing to do, yet ultimately led to failure? This is where an organization gets its greatest safety gain, because failure can be prevented any time an operator has the feedback and the authority to get a failing process back on track. In fact, in complex, high-reliability systems it is often the human who is the hero.
