Healthcare Policy

Healthcare Policy 1(4) May 2006: 12-20. doi:10.12927/hcpol.2006.18240
The Undisciplined Economist

High Reliability versus High Autonomy: Dryden, Murphy and Patient Safety

Robert G. Evans, Karen Cardiff and Sam Sheps


Healthcare is not a high-reliability industry. The adverse event rate is on the order of 10⁻²; industries such as aviation, nuclear power and railways achieve rates of 10⁻⁵ or better. Increasing awareness of this contrast has made "patient safety" a major topic of concern. High reliability in other industries flows from a combination of "engineered safety," tight regulation ("high-level constraints") and the development of a "culture of safety" that recognizes error as a systemic rather than a personal failure. In medicine, achieving such a combination would involve abandoning deeply embedded and centuries-old traditions of individualism, clinical autonomy and personal responsibility. This will not happen. Watch instead for safety concerns to be diverted into activities that do not threaten core values.


The First Story: Flight AO 1363

At 12:09 CST on March 10, 1989, Air Ontario #1363 took off from Dryden Municipal Airport. Just over a minute later the aircraft ceased to fly, crashing in a wooded area beyond the end of the runway and catching fire. Twenty-four people, passengers and crew, were killed. The proximate cause of the disaster was clear enough; the aircraft's wings were heavily "contaminated" with black ice and wet snow, and could not provide adequate lift. It failed to gain altitude, and hit the trees.

The accident might have been labelled a simple case of pilot error. Captain George Morwood's fatal command decision was certainly an error. He should not have attempted to take off under conditions so fraught with risk. But he did, and he was neither inexperienced nor incompetent. Nor, as Mr. Justice Moshansky dryly remarked, was there any evidence that he was suicidal. There had to be more to the story.  

The Second Story

There was - much more. The Hon. Virgil Moshansky, a Justice of the Court of Queen's Bench of Alberta - and also a pilot - was appointed to head a Commission of Inquiry "to examine the entire Canadian aviation system for organizational failures, both latent and active, which might have contributed to the Captain's faulty decision" (Moshansky 2005). His final report (Moshansky 1992) is a landmark document in Canadian aviation. All 191 of its recommendations were accepted.

Many other factors lurk behind "pilot error"; failure in complex dynamic systems is typically multi-causal. Morwood had been sent into a trap by an inexperienced and unqualified flight dispatcher. He had landed at Dryden to refuel a heavily loaded flight returning from Thunder Bay to Winnipeg. He kept his engines running; if stopped, they could not be restarted. The flight and all its passengers would be stranded, at great company expense, until other aircraft could arrive.

Air Ontario had, as an economy measure, cancelled plans to provide ground start facilities at Dryden. The on-board Auxiliary Power Unit, which could have restarted the engines, had been out of service for five days. Company policy and profits required keeping aircraft in the air; inessential repairs could wait. The weather in Dryden was foul, cold with freezing wet snow. But company policy - a policy that Moshansky described as "useless" - forbade de-icing while the engines were running. The pilot faced an ugly choice: abort the flight, or take his chances. In retrospect, he made the wrong choice. At the time, it must have seemed the better option.

The Silence of the Passengers  

Captain Morwood was not available to give evidence. But a plausible interpretation would be that he expected the wings to be blown clean during take-off. The reasons they were not are somewhat technical. But the fact was observable and observed from the passenger compartment, and this raises one of the most dramatic aspects of the crash.

As the doomed aircraft began its final run towards take-off, the flight attendants and several passengers felt distinctly uncomfortable about their situation. In addition to the two flight attendants, there were among the passengers two commercial pilots with their families. No one made any attempt to bring their concerns to the cockpit.

There was also on board an RCMP special constable, who did mention his strong concerns to the senior flight attendant. She told him, incorrectly, that the aircraft had automatic de-icers; when he challenged this statement, she shrugged. The junior flight attendant testified that when she had, on previous flights, brought forward concerns, she had simply been told not to worry, and her concerns were ignored. But what if the two pilot passengers had spoken up? In any case, it didn't happen.

Complex systems always operate with multiple goals that are inherently contradictory and require trade-offs. Justice Moshansky's report, however, provides an appalling catalogue of deliberate management decisions to cut corners on safety. There is an air of Greek tragedy about the interaction of the de-regulatory zeitgeist with the desperate scramble for corporate profitability, moving inexorably towards the destruction of Air Ontario flight 1363.  

The Endangered Species: Patients, not Passengers

Before resolving to travel in future by bus, however, one should recall that air travel was and is one of the safest activities in our society. It is a "minus-six" industry, with an adverse event rate on the order of 10⁻⁶ - roughly one adverse event per million operations. This does not excuse either the managerial cost-cutting or the palsied regulatory environment that sent 24 people to their deaths. But it stands in some contrast to the healthcare sector, where the adverse event rate is on the order of 10⁻². For the numerically challenged, the healthcare rate is 10,000 times larger.
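The gap is easy to verify with back-of-envelope arithmetic; a minimal sketch in Python, using the orders of magnitude quoted above rather than precise measurements:

```python
# Orders of magnitude quoted in the text, not precise figures.
healthcare_rate = 1e-2  # ~1 adverse event per 100 cases
aviation_rate = 1e-6    # ~1 adverse event per 1,000,000 operations

ratio = healthcare_rate / aviation_rate
print(round(ratio))  # 10000 - four orders of magnitude apart
```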

The contrast has been noticed. Practices in "high-reliability" industries may provide models, or at least lessons, for improvement of patient safety in what is clearly a "low-reliability" sector. Indeed, Moshansky (2005) was invited to reprise his summary of the events surrounding the Dryden crash, and his report more generally, at a Canadian Health Care Safety Symposium last year in Calgary.

From the other side of the fence, a writer in the Aviation Safety Letter ("Scrutinizing Aviation Culture" 2006) introduces readers to the term "professional courtesy," borrowed from medicine and law, as an explanation for the "silence of the lambs" in the passenger cabin of flight 1363 - the silence that perhaps cost them their last chance.  

"Hierarchical deference" might be a better term, avoiding the economic overtones of "professional courtesy," but the point is the same. In a strongly hierarchical workplace, offering advice - particularly warnings - to a superior necessarily implies that the superior has failed either to notice or adequately to consider some potentially critical facts. On flight 1363, according to testimony, there were really two crews, not one, and the flight deck crew were not perceived as welcoming advice from the girls who poured the coffee.  

This hierarchical division is a deeply embedded cultural reality in healthcare as well, and forms part of the backdrop to current calls for the creation of an alternative culture of safety. "Safety improvement efforts in health care often run up against traditional aspects of medicine's culture: steep hierarchies, tenuous teamwork, reluctance to acknowledge human fallibility, and a punitive approach to errors" (McCarthy and Blumenthal 2006). By contrast, "work environments committed to improving safety … are informed, just, and flexible; inspire individuals to report errors and near misses; and use safety data to learn and reform" (McCarthy and Blumenthal 2006).

People and Systems: Culture of Safety, Culture of Blame

These alternative cultures conceive of error in two quite different ways - the person approach and the system approach:

The person approach focuses on the unsafe acts - errors and procedural violations - of people … as arising primarily from aberrant mental processes such as forgetfulness, inattention, poor motivation, carelessness, negligence, and recklessness. … [C]ountermeasures … include poster campaigns … , writing another procedure … , disciplinary measures, threat of litigation, retraining, naming, blaming, and shaming. Followers of this approach tend to treat errors as moral issues. (Reason 2000)

The system approach, by contrast, accepts that "to err is human":

Errors are seen as consequences rather than causes, having their origins not so much in the perversity of human nature as in "upstream" systemic factors … . [T]hough we cannot change the human condition, we can change the conditions under which humans work. When an adverse event occurs, the important issue is not who blundered, but how and why the defences failed. (Reason 2000)

This is an important part of the second story.

Where Have We Heard This Before?

Students of the determinants of health and illness may at this point recognize a striking parallel with approaches to health promotion and disease prevention. There is a school of thought that views social differences in health as largely a consequence of individual bad behaviour arising from moral failings, a view epitomized by Satel (1997: 12-14) - "inferior nutrition, obesity, smoking, alcohol and drug abuse, and reckless sexual behaviour." Human error. Modern research and understanding, however, emphasize instead that the social environments in which people live and work create the conditions from which such behaviours emerge.

In short, there was no evidence that Captain Morwood was suicidal. The Dryden tragedy provides a textbook illustration of Reason's distinction. Captain Morwood's decision was the consequence not of his own perversity or "aberrant mental processes" but of " 'upstream' systemic factors" over which he had no control. He was trapped. A verdict of "pilot error" would have drawn a convenient veil over all those systemic failures - until the next tragedy.  

Good Efforts, But …

McCarthy and Blumenthal (2006) offer a collection of case studies of efforts to create a "culture of safety" in particular organizations. These seem to have been relatively successful, with measurable outcomes in terms of reduced adverse events. "Flattening hierarchies" and improving communications are recurrent themes in these examples. But can they be generalized, and will they last? The traditional aspects of medicine's culture that they point to seem far too deeply rooted - not only in history, but in the power relationships and the associated economic structure of North American medicine - to be permanently shifted by well-meaning and enthusiastic, but superficial, efforts to encourage greater teamwork and better communication. Captain Murphy, I think, would have recommended a different approach.

The Real Murphy's Law: Engineering Is Better Than Education

Captain Edward A. Murphy was in charge of collecting physiological data from a series of experiments begun by the US Air Force in 1946 to study the effects on the human body of extreme flying conditions. Directed by Captain (later Lieutenant Colonel) John Stapp, MD, these came to focus on the G-forces generated by rapid deceleration (read: crash). Stapp himself was strapped onto a "rocket sled" (called, alas, the "Gee Whiz") that was propelled rapidly down a track into a set of hydraulic brakes. Electrodes attached to various parts of his body recorded the results.  

The sled runs did not always yield as much data as expected. On one particular occasion, no data at all were recorded. It turned out that the technicians had wired every single electrode incorrectly. This experience and others similar to it gave us Murphy's Law: "If there are two or more ways of doing something, and one of them will lead to catastrophe, then someone will do it" (Matthews 1997).

The subsequent simplification to "If a thing can go wrong, it will" actually loses Murphy's point. He was not making a sad existential statement about the inherent cussedness of the universe, or a sardonic joke. Rather, he was stating a fundamental principle in safety engineering. The corollary to his law is: Engineer the situation so that there is only one way of doing a thing - the right way. Do not rely on the human element to make the correct choice. Humans may choose poorly.

Passing Gas Safely: Technology, Regulation and Culture

But can you do this in such a personalized field as medicine? Well, yes, and American anaesthesiologists have led the way (Gaba 2000). From a combination of inspired professional leadership and fear of malpractice litigation, they have for a number of years taken patient safety very seriously, indeed. One strategy they have adopted follows Murphy: the development of "engineered safety devices." Require gas hose connectors, for example, to be designed so that it is physically impossible to attach the hose to the wrong place. Require manufacturers to design equipment so that the knobs and the on/off switches work the same way on every machine. In general, standardize the work environment to minimize and, where possible, eliminate "recurrent error traps."  
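The keyed gas connector is Murphy's corollary implemented in hardware: the wrong attachment is not forbidden, it is impossible. The same design move can be sketched in software - the classes and keyings below are hypothetical, purely to illustrate designing out the wrong choice rather than training people to avoid it:

```python
# Illustrative sketch only: a "keyed" connection that cannot be
# made incorrectly, so no vigilance or retraining is required.
class KeyedPlug:
    """A plug shaped to fit only its matching socket."""
    def __init__(self, keying: str):
        self.keying = keying

class KeyedSocket:
    def __init__(self, keying: str):
        self.keying = keying

    def connect(self, plug: KeyedPlug) -> None:
        # The safety lives in the design, not the operator:
        # a mismatched plug simply cannot be seated.
        if plug.keying != self.keying:
            raise ValueError("plug does not fit: wrong keying")

oxygen_outlet = KeyedSocket("oxygen")
oxygen_outlet.connect(KeyedPlug("oxygen"))    # fits
# oxygen_outlet.connect(KeyedPlug("nitrous")) # raises ValueError
```

In a statically typed language the mismatch could even be rejected at compile time; either way, the point is Murphy's: leave only one way of doing the thing - the right way.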

Technological strategies, however, constitute only one component of the anaesthesiologists' overall approach to patient safety. Formulation and adoption of standards and guidelines for practice have also been important, as has the conscious adoption of a system perspective and "human factors engineering" - systematic and critical review of their own tasks and behaviour (Gaba 2000). There really does appear to have been a cultural shift, in this field at least.

The author of the Aviation Safety Letter ("Scrutinizing Aviation Culture" 2006) describes a similar cultural shift: "I'll be the first to admit that it takes a lot of nerve for an off-duty pilot to step out of the passenger mentality and speak out … . Fortunately, … [c]rew members now understand such advice as totally acceptable and expected." (But only from other pilots?)

Reason versus Tradition in Medicine: The Satisfactions of Scapegoating

The prospects for a general shift towards a "culture of safety" in medicine, however, may be a good deal less bright. To quote Reason (2000) again:

The person approach remains the dominant tradition in medicine, as elsewhere. From some perspectives it has much to commend it. Blaming individuals is emotionally more satisfying than targeting institutions. People are viewed as free agents capable of choosing between safe and unsafe modes of behaviour. If something goes wrong, it seems obvious that an individual (or group of individuals) must have been responsible. Seeking as far as possible to uncouple a person's unsafe acts from any institutional responsibility is clearly in the interests of managers. It is also legally more convenient, at least in Britain.

(All very familiar to students of the determinants of health.)

Nevertheless, the person approach has serious shortcomings and is ill suited to the medical domain. Indeed, continued adherence to this approach is likely to thwart the development of safer healthcare institutions. (Reason 2000)

Is High Reliability Worth the Cost? To Whom?

Sheps and Cardiff (2005) characterize healthcare as a "high-reliability-seeking" industry that is clearly not, at present, high reliability. But how serious are the leaders of that industry about seeking high reliability? Would physicians, in particular, be willing to pay the price, in terms of major organizational change? "Becoming ultrasafe may require healthcare to abandon traditions and autonomy that some professionals erroneously believe are necessary to make their work effective, profitable and pleasant" (Amalberti et al. 2005).

Are their beliefs erroneous - in particular, under the headings of profitability and pleasure? As Sheps and Cardiff (2005) note, the professional culture of medicine has deep roots in the mediaeval craft guilds and is remarkably consistent across regions and countries. Medicine is organized and governed the way it is because that is how physicians want it, and they have successfully fought off many efforts by others to change it. Financing is a permanent source of conflict in all countries, precisely because physicians' preferences inevitably collide with broader public objectives of access and cost control. "The 'abandonment of professional autonomy' (and, inter alia, the problems of professional self-regulation and control) is an essential lesson from high-reliability industries … . Related to the reduction or elimination of professional autonomy is the shift from the mindset of the craftsman to that of an 'equivalent actor' " (Sheps and Cardiff 2005: sec. 4.1). (It doesn't matter who the pilot - anaesthesiologist, surgeon? - is; they are interchangeable.) But this is a profession whose members are fiercely individualistic and deeply committed to autonomy.  

A cynical old economist would bet that faced with the requirements of a serious commitment to high reliability, most clinicians will say, "The hell with it. We will not give up our centuries-old (and highly profitable) traditions just to avoid the occasional adverse event. Symbolic gestures, sure, and expressions of great concern - but only so long as we stay, individually, in control." After all, unlike pilots, clinicians do not share the fate of their patients.  

How, then, to deal with the current concerns? A plausible strategy for deflection might be to conflate the concept of patient safety with the very different concept of quality of care. Then find some ignorant economist - there's no shortage - to tell the world that improved quality can be achieved only with more money. (The money = quality equation has a sorry history in health economics, now stretching over more than 30 years.) Et voilà! A potential threat to professional autonomy has been transmuted into yet another reason why healthcare needs more money - lead into gold.

Too cynical? Maybe, but let's wait and see.



Amalberti, R., Y. Auroy, D. Berwick and P. Barach. 2005 (May 3). "Five System Barriers to Achieving Ultrasafe Health Care." Annals of Internal Medicine 142(9): 756-64.

Gaba, D.M. 2000. "Anaesthesiology As a Model for Patient Safety in Health Care." British Medical Journal 320(7237): 785-88.

Matthews, R.A.J. 1997 (April). "The Science of Murphy's Law." Scientific American 276(4): 88-91.

McCarthy, D. and D. Blumenthal. 2006 (March). "Stories from the Sharp End: Case Studies in Safety Improvement." Milbank Quarterly 84(1): 165-200.

Moshansky, V.P. 1992. Commission of Inquiry into the Air Ontario Crash at Dryden, Ontario. Final Report. Ottawa: Minister of Supply and Services Canada.

Moshansky, V.P. 2005 (October 22). Address to the Halifax 5 - Canadian Health Care Safety Symposium, Calgary, AB.

Reason, J. 2000. "Human Error: Models and Management." British Medical Journal 320(7237): 768-70.

Satel, S. 1997 (February 17). "Race for the Cure." New Republic: 12-14.

"Scrutinizing Aviation Culture: Professional Courtesy." 2006 (February 27). Aviation Safety Letter 22: 38z.

Sheps, S. and K. Cardiff. 2005 (December). Governance for Patient Safety: Lessons from Non-Health Risk-Critical High-Reliability Industries. Project no. 6795-15-2003-5760006. Ottawa: Health Canada.

