Insights
Today on the balcony I am joined by Dirk de Korne, a leader in quality improvement from the Netherlands. We start our conversation by taking the view that, collectively, we must look the truth in the face if we are to prevent harm, and that there is no greater enemy of improvement than indifference to failure.
The challenge for all is to lead, manage, guide and coach others through a quality transformation process. This goes much deeper than tinkering with structure and adopting the right rhetoric. It is a sustained effort to embed safety as a defining imperative that permeates both individual behaviour and organizational culture. To improve performance, organizations must overcome varying degrees of systemic, cultural, and individual barriers. Leaders have an obligation to identify and nurture the people with courage and discipline to drive improvement and change.
We talk about lessons from the airline industry. While we recognize that the relationship between people is what makes an absolute comparison between healthcare and aviation inappropriate, there is much to learn from aviation; compared to healthcare, it is elegantly simple. The pilot and co-pilot have checklists that apply in every circumstance. They do not interface with each and every passenger and do not customize their service to address the uniqueness of what each passenger presents. In fact, the pilot is formally protected from contact and interruptions; entry to the cockpit is not permitted.
The proportion of their jobs that is technocratic and automated is much higher than in healthcare; commercial airliners literally fly themselves most of the time. Yes, there are things healthcare can learn from aviation: the absence of hierarchy in making safety decisions, standardization of equipment and supplies, simulation exercises, the importance of data, and the optimal use of technology. But healthcare and aviation are fundamentally different in character and complexity; if aviation is checkers, healthcare is multi-dimensional chess. By all means let’s strive for scientific and technocratic excellence, but let’s also pay attention to – and measure – the relational and behavioural side, making sure it is supported by a foundation of truth and strong values.
We are interrupted by the familiar voice of the “Ghost of Healthcare Despair” who shouts…
“When a plane crashes or when something goes wrong, they have a ‘black box’ that records everything, and the recorder is analyzed not by the company but by an independent investigator. If you believe that healthcare is more complex, and given the amount of harm that takes place in operating theatres, why doesn’t healthcare have ‘black boxes’ in the operating theatre? Why do you spend huge amounts of time and effort on retrospective interviews instead?”
Many studies have indeed shown that healthcare is often hazardous to patients, with unnecessary morbidity and mortality. We need to embrace this reality. What strikes us is that similar contributing factors – complexity of the work processes, organizational characteristics, professional autonomy – are faced by several industries, and some have found good approaches to mitigate their effect on safety and quality. Aviation’s black box is one of these.
According to William Rutherford, a retired U.S. Air Force flight surgeon, safety improvements developed by the aviation industry in three areas can be used within healthcare. The first concerns increasing transparency about errors: aviation encourages staff to report mistakes and designs interventions to reduce risks. The second is procedural standardization, i.e., curtailing operator autonomy while preserving operator authority. Third, the aviation industry has realized increased efficiency by extracting value from all parts of the system – human, information, and hardware – with team training programs. Rutherford claims that healthcare could benefit from these three innovations. However, little is known about whether and how they can be diffused to healthcare.
Since the early 1990s, learning from aviation has been a central part of the Rotterdam Eye Hospital’s strategy. Aviation was taken as one of the exemplary industries ‘since this industry has shown the possibility of handling more passengers, improving logistics and safety, and being highly service-orientated’. A comparison of passenger versus patient handling processes has been conducted from 1992 onward, resulting in the decision to adapt a series of innovations from aviation.
To stimulate the use of safety management principles, in 2008 the Rotterdam Eye Hospital introduced an innovation inspired by aviation’s “black box”, which, by recording all flight crew activities, is used to determine the cause(s) of an accident. In adopting the innovation, the hospital decided to have surgical team activities recorded. Aviation trainers were hired who video- and audio-taped several ophthalmic surgeries; the recordings were later used to give the team feedback on the application of the safety procedures taught during multidisciplinary safety (or crew resource management, CRM) training.
The voice and video recordings revealed team-specific differences in performing the time-out procedure. They also showed that the teams varied in their use of the safety communication rules agreed upon during the CRM training, and that the absence of team members at the pre-operative briefing resulted in a less structured surgery. The video recordings showed, for example, that after the patient arrived in the OR, a resident and a student received a medical-technical explanation of the procedure. There was no talk about the specific case, the actions to be performed, or potential problems. The resident was then unexpectedly expected to step in during the surgery and wasn’t able to do so. In the debriefing with an aviation expert, it was made clear that a ‘captain’ needs situational awareness of the competences of the colleague performing the operation. To prevent errors like this, the captain briefs the co-pilot beforehand on what to expect, so that the co-pilot’s situational awareness is updated; the ‘co-pilot’ can then ask questions or clarify things for the whole team. Since then, a briefing and debriefing have been introduced into the surgical program. We were only able to create this feedback because of the video capture.
This is definitely not a “black box” yet. When recent wrong-side surgeries occurred, there were no tapes available for incident analysis, since there is no automated flight data recorder. Huge amounts of time and effort are currently spent on retrospective interviews, analysis and reporting. Aviation has learned about the causes of incidents from cockpit voice recordings. Moreover, ‘real’ cockpit data is a powerful way to motivate medical professionals and create common awareness.
There are other mountains to climb. In aviation, there is strong legislation concerning the production and use of such recordings: after an incident, an independent investigator uses the data, not the public prosecutor or the police. This kind of legislation is currently lacking in healthcare. The information collected needs to be treated as protected health information, with its confidentiality and security safeguarded.
Aviation did not become a safe industry simply because of well-meaning, transparency-oriented pilots. Governmental bodies, such as national transportation safety boards, played an important role. Sector-wide systems approaches are needed. If black boxes have proven invaluable in improving safety in aviation, could they not prove equally invaluable in ensuring safety in medicine?
Next Week’s Guest on the Balcony of Personal Reflection: P. Davies-Scimeca in a conversation titled “Call Lights Are On Care Provider Distraction”.
About the Author(s)
Hugh MacLeod is CEO of the Canadian Patient Safety Institute. Dr. Dirk F. de Korne is Quality/Safety and Research Fellow, Rotterdam Ophthalmic Institute, Rotterdam Eye Hospital.
References
MacLeod, H.B.: Working Together for Safe, Efficient, Quality Care. Canadian Journal of Respiratory Therapy 46(4), Winter 2010.
Rutherford, W.: Aviation Safety: A Model for Health Care? Qual Saf Health Care 12:162-163, 2003.
de Korne, D.F., van Wijngaarden, J.D.H., Hiddema, U.F. et al.: Diffusing Aviation Innovations in a Hospital in the Netherlands. Jt Comm J Qual Patient Saf 36(8):339-347, 2010.
Comments
Rob Robson wrote:
Posted 2013/03/20 at 11:01 AM EDT
For a fascinating alternative view (from the Netherlands) readers may enjoy the following article:
Jerak-Zuiderent, S.: Certain Uncertainties: Modes of Patient Safety in Healthcare. Social Studies of Science 42:732, 2012.
The author coins the very intriguing phrase "Certain unsafety and uncertain safety". The concept of "certain unsafety" refers to the challenges (that can lead to unintentional patient harm) created by over-reliance on rules, regulations and policies when working in complex adaptive systems. The concept of "uncertain safety" refers to the idea that adapting to the uncertainty inherent in a CAS like healthcare can actually create safety. This is indeed what happens on an hourly basis throughout Canadian healthcare when front-line direct care providers adapt to a variety of limits in their under-specified working environments (for instance insufficient time, resources, controls - feedback mechanisms - and pre-conditions) and find ways to provide safe quality care to patients in spite of what the rules and regulations direct them to do.