SafetyDog

Archive for November, 2010 | Monthly archive page

Nothing could be finer?

In Patient Safety on November 26, 2010 at 8:35 am

A new article in NEJM:
“In conclusion, harm to patients resulting from medical care was common in North Carolina, and the rate of harm did not appear to decrease significantly during a 6-year period ending in December 2007, despite substantial national attention and allocation of resources to improve the safety of care. Since North Carolina has been a leader in efforts to improve safety, a lack of improvement in this state suggests that further improvement is also needed at the national level. Although the absence of large-scale improvement is a cause for concern, it is not evidence that current efforts to improve safety are futile. On the contrary, data have shown that focused efforts to reduce discrete harms, such as nosocomial infections[10,36] and surgical complications,[37] can significantly improve safety. However, achieving transformational improvements in the safety of health care will require further study of which patient-safety efforts are truly effective across settings and a refocusing of resources, regulation, and improvement initiatives to successfully implement proven interventions.”


Usability and Safety

In ergonomics, Human Factors, Patient Safety, Resiliency, safety on November 24, 2010 at 2:13 pm

*Please watch the video at the end of this post.*
In the safety literature, recommended interventions are often ranked by their strength of effectiveness.
Here is a chart from an article by Stevens, Urmson, Campbell and Damignani (2010):

How it works: A simple safety intervention in a hospital is to have everyone wear visible picture IDs. The problem: the IDs keep flipping over, hiding the employee’s face and name. Solutions, from least effective to most effective:

  • Put a warning on the non-picture side that says “this side faces in” (safety still depends on human action).
  • Train employees to periodically check that their badge is facing out (safety depends on human memory).
  • Put the picture on BOTH sides of the badge so flipping over does not compromise the safety system (safety achieved through design: a forcing function!).

We tend to rely too much on training and memory. Read the rest of this entry »

Creative Patient Safety from Metro West!

In Patient Safety on November 17, 2010 at 11:13 am

Enhancing resilience: Medication Error Recovery

In human error, Human Factors, Resiliency on November 13, 2010 at 5:48 pm

Read the rest of this entry »

Beyond Reason…to Resiliency

In High Reliability Orgs, human error, Human Factors, Normal Accident Theory on November 13, 2010 at 5:06 pm

An earlier post presented James Reason’s Swiss cheese model of failure in complex organizations. This model and the concept of latent failures are linear models: failure is the result of one breakdown, then another, then another, all of which combine to produce a failure by someone or something at the sharp end of a process.
More recent theories expand on this linear model and describe complex systems as interactive: complex interactions, processes, and relationships combine in a non-linear fashion to produce failure. Examples include Normal Accident Theory (NAT) and the theory of High Reliability Organizations (HRO). NAT holds that once a system becomes complex enough, accidents are inevitable: there will come a point when humans lose control of a situation and failure results, as in the case of Three Mile Island. In High Reliability Theory, organizations attempt to prevent the inevitable accident by monitoring the environment (St. Pierre, et al., 2008). HROs examine their near misses to find holes in their systems; they look for complex causes of error, reduce variability, and increase redundancy in the hope of preventing failures (Woods, et al., 2010). While these efforts are worthwhile, they have not yet reduced failures in organizations to an acceptable level. Sometimes double checks fail, and standardization and policies can themselves increase complexity.

One of the new ways of thinking about safety is known as Resilience Engineering. Read the rest of this entry »

A short list of Don’ts

In human error, Human Factors, Patient Safety on November 7, 2010 at 8:09 pm

When the outcome of a process is known, humans can develop a bias when looking at the process itself. When anesthesiologists were shown the same sequence of events, being told there was a bad outcome influenced their evaluation of the behavior they saw (Woods, Dekker, Cook, Johannesen & Sarter, 2010). This has been shown to be true in other professions as well. This tendency to see an outcome as more likely than it seemed in real time is known as hindsight bias, and it is why many failures are attributed to “human error.” In actuality, the fact that these failures do not occur on a regular basis shows that, despite complexity, humans are usually controlling for failure. It is important to study the failure as well as the usual process that prevents failure.

When following up on a failure in a healthcare system, Woods, et al., (2010) recommend avoiding these simplistic but common reactions:
“Blame and train”
“A little more technology will fix it”
“Follow the rules”

Human Error in the news

In Patient Safety on November 6, 2010 at 1:23 pm

Read the following stories from Google News this week. All are being called “human error,” although many look like predictable human factors issues that should be accounted for in safety systems. They involve police, pilots, and NASA.

Police:
A woman reported calling 911 three times after being punched in the face by her brother. The call was routed to two different county police officers who were dealing with another violent situation nearby and a missing woman with Alzheimer’s. The call center logged the complaint by the woman who was punched, but the officers did not follow through with procedures. Read the rest of this entry »

Best Practice Alarm Fatigue

In alarm fatigue, Patient Safety on November 5, 2010 at 11:54 am

I thought I would share references from a recent literature search on alarm fatigue and cardiorespiratory monitoring of patients.

References
Read the rest of this entry »

Man versus System

In human error, Normal Accident Theory, Patient Safety, Safety climate on November 5, 2010 at 10:20 am


The person approach to safety issues assumes failures are the result of the individual(s) involved in direct patient contact. In this model, when something goes wrong it is the provider’s fault: a knowledge deficit, not paying attention (or other cognitive lapses), or not being at their best (St. Pierre, et al., 2008). Other descriptions assumed of individuals in a person approach to failures include: forgetful, unmotivated, negligent, lazy, stupid, reckless… Read the rest of this entry »

Poll

In Patient Safety on November 5, 2010 at 9:23 am

In a just, high-safety culture, failures are examined through a systems approach rather than a blame-the-individual approach. In a resilient organization that consistently monitors its failures and potential for failure, the most likely statement would be: “There was a medication error in the PACU.” The processes that led to the failure would be reviewed, and a prevention plan would be developed with the assistance of the nurse who was at the sharp end of the failure.
The statement about the “five rights” is unlikely to be heard, as it implies only individual accountability. For more information about the weaknesses of relying on the five rights framework, read this from the ISMP:

Q: Won’t medication errors be prevented if nurses just follow the “Five Rights?”
A: Many nurses during their training have learned about the “five rights” of medication use: the right patient, drug, time, dose and route.

However, the “five rights” focus on the nurse’s individual performance and do not reflect that responsibility for safe medication use lies with multiple individuals. Although the “five rights” serve as a useful check before administering medications, many other factors can contribute to a staff member’s failure to accurately verify them, despite their best efforts. For more detailed information, see the following articles.

• “Nurses’ rights regarding safe medication administration,” ISMP Medication Safety Alert!® Nurse Advise-ERR, July 2007
• “The five rights: A destination without a map,” ISMP Medication Safety Alert!®, Jan. 25, 2007
• “The five rights cannot stand alone,” ISMP Medication Safety Alert!® Nurse Advise-ERR, November 2004
• “The ‘five rights,’” ISMP Medication Safety Alert!®, April 7, 1999
ISMP report.
