Archive for the ‘Resiliency’ Category

User centered design and safety

In design, Resiliency on August 30, 2014 at 6:44 pm

How much could our health care workers contribute to patient safety if we gave them some time back?

Anita Tucker’s work highlights how healthcare’s hero culture rewards front-line staff for workarounds and prevents them from actually finding permanent improvements. On-the-spot problem solving gets more credit than making a permanent change. Nurses in particular are encouraged to solve problems superficially to get patients what they need rather than to improve performance over time.
Instead of fixing processes and environments, we are forcing front-line staff to encounter the same small roadblocks over and over. It’s like Groundhog Day every shift. Let’s fix these problems, no matter how small and unglamorous… let’s free up our front-line staff for safety efforts!

Neuroscience Saturday: BJ Fogg and Starter Steps

In Behavior change, Patient Safety, Resiliency, Safety climate, usability, user experience on July 19, 2014 at 8:35 am

Anyone who knows me knows I love BJ Fogg’s behavior models. He is a design psychologist who runs a persuasion lab at Stanford. His latest behavior change model is based on his research about lasting change, which basically comes down to: making things easy to do and changing the environment.
His latest little flip book sums up his findings to date.
Lots of lessons for us in healthcare and these are my take aways:
*We tend to love dramatic change initiatives. Secret: they usually don’t work.
*Starter steps or baby steps aren’t glamorous and flashy, but they work.
*We clearly need to reward change, not flashy marketing campaigns, when it comes to safety (how many hours have you spent on catchy acronyms… did it make a difference??).
*BJ describes certain warning signs that you are designing for epiphany instead of change. Secret: hoping staff epiphanies will lead to behavior change doesn’t usually work.

If you care about patient safety AT ALL, please read BJ’s latest little flipbook. I have never read so much great info in one place.
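Fogg’s published model holds that a behavior happens when motivation, ability, and a prompt converge at the same moment, and starter steps work by raising ability (making the thing easier) rather than chasing motivation spikes. Here is a minimal toy sketch of that idea; the threshold value and the specific numbers are illustrative assumptions, not figures from Fogg’s work:

```python
def behavior_occurs(motivation: float, ability: float, prompt: bool) -> bool:
    """Toy version of Fogg's behavior model: a behavior fires only when a
    prompt arrives AND motivation x ability clears an action threshold.
    The 1.0 threshold is an illustrative assumption, not Fogg's number."""
    ACTION_THRESHOLD = 1.0
    return prompt and (motivation * ability) > ACTION_THRESHOLD

# A dramatic initiative: a motivation spike, but the task stays hard.
print(behavior_occurs(motivation=1.4, ability=0.5, prompt=True))   # False

# A starter step: modest motivation, but the task is made very easy.
print(behavior_occurs(motivation=0.6, ability=2.0, prompt=True))   # True
```

The point of the sketch is the one the flipbook makes: since motivation is the hardest lever to hold high, redesigning the task to be easy is the more reliable path to the behavior.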

Chocolates and patient safety

In High Reliability Orgs, Normal Accident Theory, Patient Safety, Resiliency on September 22, 2013 at 3:17 pm

From Rosemary Gibson:

How Overtreatment and High Volume Health Care is Making Patient Safety a More Distant Reality by Rosemary Gibson

“This past week at the National Health Care Quality Colloquium I showed the classic “I Love Lucy” chocolate factory video during a presentation. The laughter was audible and the point was made: when the pace of work speeds up, work-arounds and cutting corners are inevitable. Employees tell the boss everything is fine – when it’s not.

In a health care system riddled with defects, where doctors and nurses are required by health care executives to work at a faster pace, the number of adverse events — and patients harmed — will increase. High volume health care and productivity targets are a toxic mix.”

It’s funny but not…
The chocolate scene on YouTube

Dig down deep or sidestep?

In Patient Safety, Resiliency on April 25, 2013 at 9:22 am

Is lateral thinking the key to improving patient safety?

Harvard Business Review


In Patient Safety, Resiliency on April 20, 2013 at 4:10 pm
Lessons from terrorism: we can’t prevent every accident or purposeful act, but we can respond and bounce back quickly to reduce impact.

When productivity trumps safety we all lose.

In human error, Normal Accident Theory, Resiliency, Safety climate on September 30, 2012 at 9:32 am

“You are judged by numbers in the lab,” McShane said. “There is a culture of pressure to get it done with no new resources. But there is no excuse for [cheating] at the end of the day.” (2012)

So goes the story of Annie Dookhan, a chemist in a Massachusetts crime lab who is suspected of compromising evidence in many of the 34,000 samples she tested over her nine-year career. Her motivation seems to have been no more nefarious than trying to look like a stellar employee.

What does this have to do with patient safety? It is common in hospitals today to push the limits of productivity. Add some more patients, add new procedures, add no more staff. In safety studies this can result in what is known as drift. You get through one shift with suboptimal staffing and nothing bad happens, so you chance it again, then again, and little by little, in order to cope, staff develop workarounds and shortcuts that come to be seen as normal (culture) and less risky, because staff have gotten no feedback about bad results. If staff continue to be judged on output (census, patient turnover, lower expenditures), they will make those their priority rather than follow safe procedures.

According to Cook (2000), work processes do not choose failure but drift toward it as production pressures and change erode the defenses that normally keep failure at a distance. “This drift is the result of systematic, predictable organizational factors at work, not simply erratic individuals. To understand how failure sometimes happens, one must first understand how success is obtained: people learn and adapt to create safety in a world fraught with gaps, hazards, trade-offs, and multiple goals.”

In safety-critical environments that deal with people’s lives, leaders should be preoccupied with failure, not productivity. A leader is responsible for identifying drift by being present in daily processes. Drift can be identified by observing staff behaviors, reviewing peer reports, and asking people what types of things they are worried about. Asking staff to “do their best” without a supporting environment will not produce a high-performing system. Productivity goals should be based on an analysis of the work, not on how much money is in the budget. I think it’s time that as a nation we say, in all instances, “if there isn’t enough money to do things right, don’t do them at all.”

Annie Dookhan made some bad choices, but she worked in an environment where bad choices were acceptable, and when peers did speak up, nothing was done. Who is responsible for this?

And who is responsible for the incarceration or punishment of people, some of whom might be innocent, who are imprisoned all because a culture of productivity outranked safe procedures? In these circumstances, just as in healthcare, humans always suffer.

Safety first. Productivity second. These cannot just be words and slogans. They have to be guiding principles that are evident in everything we do, in healthcare and in crime labs. It scares me that this lab was run by… the Department of Public Health 😦

Almost like being there…

In Force function, Patient empowerment, Resiliency on April 14, 2012 at 12:16 pm

Videos from the 2012 Healthcare Design Conference

Back to the Future…

In Behavior change, High Reliability Orgs, Resiliency, Root cause analysis on January 28, 2012 at 8:08 am

ISMP newsletter, 1998

“Currently, there is no consistent process among healthcare organizations for detecting and reporting errors. Since many medication errors cause no harm to patients, they remain undetected or unreported. Still, organizations frequently depend on spontaneous voluntary error reports alone to determine a medication error rate. The inherent variability of determining an error rate in this way invalidates the measurement, or benchmark. A high error rate may suggest either unsafe medication practices or an organizational culture that promotes error reporting. Conversely, a low error rate may suggest either successful error prevention strategies or a punitive culture that inhibits error reporting. Also, the definition of a medication error may not be consistent among organizations or even between individual practitioners in the same organization. Thus, spontaneous error reporting is a poor method of gathering “benchmarks”; it is not designed to measure medication error rates.” Read the full newsletter here
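The newsletter’s point can be made concrete with a toy calculation: two hypothetical hospitals with the exact same true error burden can post wildly different “benchmark” rates, because the observed number measures reporting culture as much as safety. All figures below are invented for illustration:

```python
def reported_error_rate(true_errors: int, doses: int, reporting_rate: float) -> float:
    """Observed medication errors per 1,000 doses, given what fraction of
    true errors staff actually report. All figures are hypothetical."""
    return (true_errors * reporting_rate) / doses * 1000

# Identical true burden: 200 errors per 100,000 doses at both hospitals.
open_culture = reported_error_rate(200, 100_000, reporting_rate=0.8)
punitive_culture = reported_error_rate(200, 100_000, reporting_rate=0.1)

print(open_culture)      # 1.6 reported errors per 1,000 doses
print(punitive_culture)  # 0.2 reported errors per 1,000 doses
```

The hospital with the healthier reporting culture looks eight times worse on paper, which is exactly why the ISMP calls voluntary reports a poor benchmark.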

Hey McFly, why have we made so little progress?

Human error and Hepatitis C

In human error, Resiliency on December 31, 2011 at 4:59 pm

The investigation into how a child in Boston contracted Hepatitis C via cardiac surgery, in which blood vessel tissue was transplanted, revealed a human error in reading the donor’s hepatitis status back in March.

Human error WILL occur, and resilience in catching and responding to these errors is what will keep patients safe. Resiliency is the ability of a system to adjust its functioning in the event of a mishap or under a state of continuous stress (Nemeth, Wears, Woods, Hollnagel & Cook, 2008). Even after the error in reading the tissue occurred, there was opportunity to prevent the error from reaching the child. Another person who had received a kidney from the same donor tested positive for Hepatitis C, but 11 days passed before a communication occurred with the Office of Blood, Organ, and Other Tissue Safety at the CDC (Conaboy, 2011, Boston Globe). The child’s surgery was performed 3 days before the official communication but 8 days after the kidney recipient tested positive. As soon as the first kidney recipient tested positive, the human error should have been discovered and further infections could have been prevented. A human error occurred, but system problems and communication impairments made this a larger catastrophe than it should have been. This illustrates that while the sharp-end workers are prone to human errors, the blunt-end administrators can add resiliency by building safer processes and systems. Compounding this error was the fact that organs and tissues are regulated by separate agencies: tissue banks are overseen by the FDA and organs by the Health Resources and Services Administration (Conaboy, 2011). The two have no protocols for sharing information. This is eerily similar to the situation before the 9/11 attacks, when the FBI and the CIA had no protocols for sharing information.

This is a lesson for all in terms of the open sharing of data. We must break down silos in healthcare where they occur and increase opportunity for feedback to those in the system as to the functioning of the system whenever possible. Putting the patient at the center of all we do is a first step in identifying how and where these silos exist.  Human error will occur but monitoring the system and sharing information will create resiliency that will mitigate harm.

Mobile Persuasion: the future of patient safety?

In Human Factors, Resiliency on September 24, 2011 at 6:56 pm

Many experts believe mobile phones are the future of societal behavior change (Fogg & Eckles, 2007). They are light and full of features, and the wireless networks on which they run rival the speeds of wired high-speed systems. The best features of mobile phones for persuasive behavior change are their ability to relay information, make that information actionable, and maintain links to social networks. The social connection is where the real magic happens (Fogg & Eckles, 2007). From a mobile platform one could conceivably fall in love, start a revolution, read a life-changing book or find a treasure.

In terms of patient safety, I see the ability to act on information, along with the social interaction, as holding the most promise. Abnormal labs are sent to an MD, who can click on them to reach an order set designed for treating the anomalies. The order relays to a nurse, who carries out the treatment and is beeped if any of the patient’s vital signs are compromised or even if the patient calls to report a symptom. A nurse could see what the patient ordered for a meal and put a hold on it if it contained too much potassium, for example.
With GPS and Bluetooth, healthcare providers can be tracked on handwashing performance or prompted to wash when they pass a pump. A provider who can’t come to the bedside could chat in real time with words and video.
Phones can already scan product barcodes, and this technology could enable healthcare providers to have all patient information with them wherever they go.

It’s time to throw out those one-way pagers and switch to interactive technology. Pagers function well in a hierarchical system; however, if relational coordination is a safety goal, the technology has to facilitate relationships and action.
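The lab-to-order-to-nurse loop described above is essentially an event-routing problem. This toy dispatcher (all role names, alert kinds, and suggested actions are hypothetical, invented for illustration) shows the core idea: push an actionable alert to whoever can close the loop, rather than fire a one-way page into the void:

```python
from dataclasses import dataclass, field

@dataclass
class Alert:
    patient_id: str
    kind: str    # e.g. "abnormal_lab", "vital_sign", "patient_report"
    detail: str

@dataclass
class CareTeam:
    """Toy two-way router: lab alerts go to the physician with a suggested
    order set, everything else goes to the nurse with a bedside action."""
    physician_inbox: list = field(default_factory=list)
    nurse_inbox: list = field(default_factory=list)

    def dispatch(self, alert: Alert) -> None:
        if alert.kind == "abnormal_lab":
            # Physician gets the lab plus a link to a matching order set.
            self.physician_inbox.append((alert, "open order set"))
        else:
            # Nurse gets vitals alarms and patient-reported symptoms.
            self.nurse_inbox.append((alert, "assess at bedside"))

team = CareTeam()
team.dispatch(Alert("pt-17", "abnormal_lab", "K+ 6.1 mmol/L"))
team.dispatch(Alert("pt-17", "patient_report", "palpitations"))
print(len(team.physician_inbox), len(team.nurse_inbox))  # 1 1
```

Each alert carries a next action with it, which is the difference between interactive technology and a pager beep: the recipient gets something they can act on, not just a number to call back.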
