SafetyDog

Archive for the ‘Root cause analysis’ Category

Stamp out unsafe processes

In Root cause analysis, safety on September 20, 2013 at 8:18 am

From Nancy Leveson’s site at MIT

Applying System Engineering to Pharmaceutical Safety by Nancy Leveson, Matthieu Couturier, John Thomas, Meghan Dierks, David Wierz, Bruce Psaty, Stan Finkelstein. Journal of Healthcare Engineering, Sept. 2012.

While engineering techniques are used in the development of medical devices and have been applied to individual healthcare processes, such as the use of checklists in surgery and ICUs, the application of system engineering techniques to larger healthcare systems is less common. System safety is the part of system engineering that uses modeling and analysis to identify hazards and to design the system to eliminate or control them. In this paper, we demonstrate how to apply a new safety engineering static and dynamic modeling and analysis approach to healthcare systems. Pharmaceutical safety is used as the example in the paper, but the same approach is potentially applicable to other complex healthcare systems.
One use for such modeling and analysis is to provide a rigorous way to evaluate the efficacy of potential policy changes as a whole. Less than effective changes may be made when they are created piecemeal to fix a current set of adverse events. Existing pressures and influences, not changed by the new procedures, can defeat the intent of the changes by leading to unintended and counterbalancing actions by system stakeholders. System engineering techniques can be used in re-engineering the system as a whole to achieve the system goals, including enhancing the safety of current drugs while, at the same time, encouraging the development of new drugs.

Read this and other papers about this new model of incident investigation here

RCA on Root Cause Analysis

In Force function, ismp, Root cause analysis on March 2, 2013 at 11:55 pm

RCA (root cause analysis) is a tool often used to provide an assessment after the occurrence of an adverse event or when investigating the safety of an environment. The idea behind this risk assessment is to uncover the overt and latent factors behind unsafe situations. In non-medical industries it has proven to be an effective tool, but in healthcare opinions about its usefulness vary. While many RCAs have uncovered surprising holes in healthcare safety systems, there are also concerns about its value.

In a 2008 interview with Robert Wachter, Albert Wu said “Although we are living in an era of evidence-based medicine, root cause analysis was widely adopted by the medical community in the 1990s without the benefit of much evidence. Every institution now conducts root cause analysis. Thousands of health care workers devote many hours to conducting these analyses, yet root cause analysis has never really been evaluated.” (AHRQ, http://webmm.ahrq.gov/perspective.aspx?perspectiveID=61)

Some of the barriers to conducting root cause analysis:
Read the rest of this entry »

Find the mistake

In adverse events, human error, Human Factors, Root cause analysis on February 2, 2013 at 5:56 pm

[Image: the “find the mistake” puzzle]

This mini human factors test is going around Facebook.
Once you find the mistake, it becomes almost impossible not to see it.

This illustrates two concepts:
1) We see what we expect to see; our brain “corrects” what does not conform, so we can easily misread labels.
2) Only hindsight is 20/20. When investigating an error after the fact, hindsight bias may lead us to think the error was foolish and easy to detect at the time. Now that you see the error in this little picture, it seems so evident that you wonder how it could have been missed in the first place.

This is why we need barcode medication identification systems for preparation and administration.
And this is why it is so important to understand what was actually known at the time of an error, not what we know in hindsight. Many errors occur while people are doing what they have always done; usually there is no significant deviation from the norm.
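
Returning to the barcode point above: the value of bedside scanning is that the computer compares codes literally instead of “seeing what it expects to see.” Below is a minimal sketch of that kind of check in Python; the Order fields, the verify_scan helper, and the NDC values are all hypothetical illustrations, not any vendor’s data model or API.

from dataclasses import dataclass

@dataclass
class Order:
    patient_id: str  # expected wristband barcode value (hypothetical)
    ndc: str         # National Drug Code expected for this dose (hypothetical)

def verify_scan(order: Order, scanned_patient_id: str, scanned_ndc: str) -> list:
    """Return a list of mismatches; an empty list means the scan checks out."""
    problems = []
    if scanned_patient_id != order.patient_id:
        problems.append("wrong patient")
    if scanned_ndc != order.ndc:
        problems.append("wrong drug/strength")
    return problems

# A look-alike product scans with a different NDC, so the mismatch is flagged
# even though the label may "look right" to a tired human reader.
order = Order(patient_id="MRN12345", ndc="00045-0496-30")
print(verify_scan(order, "MRN12345", "00045-0496-90"))  # ['wrong drug/strength']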

Back to the Future…

In Behavior change, High Reliability Orgs, Resiliency, Root cause analysis on January 28, 2012 at 8:08 am

ISMP newsletter, 1998

“Currently, there is no consistent process among healthcare organizations for detecting and reporting errors. Since many medication errors cause no harm to patients, they remain undetected or unreported. Still, organizations frequently depend on spontaneous voluntary error reports alone to determine a medication error rate. The inherent variability of determining an error rate in this way invalidates the measurement, or benchmark. A high error rate may suggest either unsafe medication practices or an organizational culture that promotes error reporting. Conversely, a low error rate may suggest either successful error prevention strategies or a punitive culture that inhibits error reporting. Also, the definition of a medication error may not be consistent among organizations or even between individual practitioners in the same organization. Thus, spontaneous error reporting is a poor method of gathering “benchmarks;” it is not designed to measure medication error rates.” Read the full newsletter here
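
To make the newsletter’s point concrete, here is a toy calculation (a sketch with invented numbers, not real data): the reported rate is roughly the true error rate multiplied by the fraction of errors that actually get reported, so the “benchmark” cannot distinguish a safe organization with an open reporting culture from an unsafe one with a punitive culture.

def reported_rate(true_errors_per_1000_doses, reporting_fraction):
    # Reported "benchmark" = true rate x fraction of errors actually reported.
    return true_errors_per_1000_doses * reporting_fraction

# Hospital A: more errors, but a punitive culture suppresses reporting.
print(reported_rate(true_errors_per_1000_doses=40, reporting_fraction=0.10))  # 4.0
# Hospital B: fewer errors, but an open culture encourages reporting.
print(reported_rate(true_errors_per_1000_doses=8, reporting_fraction=0.50))   # 4.0
# Both "benchmark" at 4 reported errors per 1,000 doses.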

Hey McFly, why have we made so little progress?

Let Employees Solve Problems

In Patient Safety, Root cause analysis, Safety climate on October 25, 2010 at 9:57 pm

Excerpt from Harvard Business School Working Knowledge:
“A Harvard research team recently set out to better understand what managers can do to encourage employees to speak up about problems, and to investigate how managers can encourage employees to offer solutions.
The team’s working paper, “Speaking Up Constructively: Managerial Practices that Elicit Solutions from Front-Line Employees,” considers data on nearly 7,500 incidents from a single hospital to determine whether two types of managerial actions increase the frequency with which frontline workers speak up by reporting incidents, and do so constructively by including solutions in their incident reports.”

Please read their article: View full post
Read “Speaking Up Constructively,” linked from that post

Brainwrite, not storm!

In Patient Safety, Root cause analysis, Teamwork on September 24, 2010 at 7:43 pm

When a safety issue arises, hospitals often convene a team to come up with ideas for a safer process. Three types of teams include project teams, virtual teams, and quality circles. A project team is time limited and focused on a one-time output (Borkowski, 2011). They are usually formed to solve a particular problem and exist only until that problem is resolved (Landy & Conte, 2010). A virtual team needs technology to exist. These teams can be permanent or task focused and are defined by their ability to work across time, space and physical distance (Borkowski, 2011). Quality circles are like mini think tanks in which a group of employees convenes to identify problems and generate ideas (Landy & Conte, 2010). This group submits its suggestions to management, who then decide whether to act on the proposals (Landy & Conte, 2010)…
Read the rest of this entry »
