Archive for the ‘Human Factors’ Category

Beyond Reason…to Resiliency
In High Reliability Orgs, human error, Human Factors, Normal Accident Theory on November 13, 2010 at 5:06 pm

An earlier post presented James Reason’s Swiss cheese model of failure in complex organizations. This model and the concept of latent failures are linear models of failure: one breakdown follows another, and together they contribute to a failure by someone or something at the sharp end of a process.

More recent theories expand on this linear model and describe complex systems as interactive, in that complex interactions, processes and relationships combine in a non-linear fashion to produce failure. Examples are Normal Accident Theory (NAT) and the theory of High Reliability Organizations (HROs). NAT holds that once a system becomes complex enough, accidents are inevitable; at some point humans lose control of the situation and failure results, as in the case of Three Mile Island. In High Reliability Theory, organizations attempt to prevent the inevitable accident by monitoring the environment (St. Pierre, et al., 2008). HROs examine their near misses to find holes in their systems; they look for complex causes of error, reduce variability and increase redundancy in the hope of preventing failures (Woods, et al., 2010). While these efforts are worthwhile, they still have not reduced failures in organizations to an acceptable level. Sometimes double checks fail, and standardization and policies add complexity.

One of the new ways of thinking about safety is known as Resilience Engineering. … Read the rest of this entry »

A short list of Don’ts
In human error, Human Factors, Patient Safety on November 7, 2010 at 8:09 pm

When the outcome of a process is known, humans can develop a bias when looking at the process itself. When anesthesiologists were shown the same sequence of events, being told there was a bad outcome influenced their evaluation of the behavior they saw (Woods, Dekker, Cook, Johannesen & Sarter, 2010). This has been shown to be true in other professions as well. This tendency to see an outcome as more likely than it seemed in real time is known as hindsight bias, and it is why many failures are attributed to “human error.” In actuality, the fact that many of these failures do not occur on a regular basis shows that, despite complexity, humans are somehow usually controlling for failure. It is important to study the failure as well as the usual process that prevents failure.
When following up on a failure in a healthcare system, Woods, et al., (2010) recommend avoiding these simplistic but common reactions:
“Blame and train”
“A little more technology will fix it”
“Follow the rules”
Attach ’em!
In Human Factors, Patient Safety on October 9, 2010 at 1:54 pm

OSHA 2002:

“Workers operating a jackhammer must wear safety glasses and safety shoes that protect them against injury if the jackhammer slips or falls. A face shield also should be used… Working with noisy tools such as jackhammers requires proper, effective use of appropriate hearing protection.”
Are there products in the hospital setting that we could combine into all-in-one devices to make things safer and more available for either the patient or the staff?

Is 66 days enough?
In Checklists, Human Factors, Patient Safety, Safety climate on October 6, 2010 at 12:20 pm

One of the difficulties in infusing safety into the healthcare environment is getting safety behaviors to become habitual in bedside practice. The previously referenced degradation of the anesthesia safety policy, published in Quality and Safety in Health Care, is a perfect illustration of this dilemma. View the full text of the article: The natural lifespan of a safety policy: violations and system migration in anaesthesia.

A recent study published in the European Journal of Social Psychology focused on the length of time it takes for a behavior to become a habit…. Read the rest of this entry »
User Interfaces and Safety
In Human Factors on September 25, 2010 at 4:38 pm

I upgraded to a new Android touchscreen phone this week, the Motorola Backflip. What a great little device! It’s so easy to use, and so much faster and lighter than my old Windows Mobile phone. The user interface was so intuitive, I barely had to open the instruction booklet.
I am having only one problem and it’s a major one… Read the rest of this entry »
Scratch Tickets & Independent Double Checks
In Human Factors, Interuptions, Multitasking, Normal Accident Theory, Patient Safety, Teamwork on September 19, 2010 at 2:34 pm

I played tennis this morning with a friend. On the way home I thought I would stop at the supermarket to pick up some snacks for the Patriots game today. I realized I had forgotten my debit card (ah, the limitations of human memory). Looking for alternate forms of payment, I found winning lottery scratch tickets in my glove compartment.

I quickly added them up (3 of them) and confirmed that … Read the rest of this entry »
The “smart room” by GE
In Human Factors, Patient Safety, Safety climate on September 16, 2010 at 7:46 pm

The future of safety?
Missed Care: an error of omission
In Interuptions, Multitasking, Patient Safety on September 16, 2010 at 7:21 pm

According to Kalisch, Landstrom and Hinshaw (2009), one overlooked aspect in addressing patient safety is the concept of “missed care.” Missed care is classified in terms of error as an act of omission. It is a concept that nurses are well aware of but hesitant to bring into open discussion (Kalisch, et al., 2009). Some suspected reasons for covering up these omissions are guilt, a feeling of powerlessness to correct the situation and fear of punishment for not completing tasks. There are even reports of false documentation to hide these errors of omission because of fear of retribution and an acceptance of this as the norm (Kalisch, et al., 2009). Kalisch, et al. (2009) liken the hiding of these errors to the hiding of medication errors and near misses that was prevalent prior to the patient safety movement…. Read the rest of this entry »
Read it: “Crisis Management in Acute Care Settings”
In Human Factors, Patient Safety on September 8, 2010 at 12:06 pm

Crisis Management in Acute Care Settings by St. Pierre, Hofinger, and Buerschaper (2008).

This is one of those books that I find so relevant I have read it several times and own the hardcover, the paperback and the Kindle version. I have referred to it numerous times in papers and conversations. It is a short book at 227 pages but every page is filled with amazing material.
The preface to the book begins with the quote “All of life is problem solving” (Popper). Since they say it better than I, here are excerpts from the authors’ words describing what this book is about:
On a regular basis, healthcare professionals are faced with problems that are sudden, unexpected and pose a threat to a patient’s life. Worse still, these problems do not leave much time for… Read the rest of this entry »
What is the link between Human Factors Theory and cheese?
In Human Factors on September 6, 2010 at 6:32 pm

In the previous post, Some Fun with Human Factors, we looked at a maze with a mouse trying to get to the cheese to illustrate how confusing it can be to navigate the many user interfaces one encounters in a typical hospital workday. Now on to another cheese analogy in Human Factors: James Reason’s Model of Accident Causation.
