Healthcare Quarterly 17(2), April 2014: 5–6. doi:10.12927/hcq.2014.23887

It’s Time to Revisit Flexner

Duncan G. Sinclair

The digital revolution is an already-momentous and fast-growing phenomenon with profound implications for health professional education and research. John Stackhouse, former editor-in-chief of The Globe and Mail, recently called it the new industrial revolution (Stackhouse 2014, January 25). It is the near-universal, almost-instant access that everybody in the world now has to everybody else and to the sum of recorded knowledge about nearly everything. The challenge it poses is at least equal to the one Abraham Flexner addressed over a century ago, when health professional education moved from its pre-20th-century apprenticeship model to the current one, based primarily on practitioners' need for a relatively deep understanding of the science and other knowledge underlying their work.

The Flexnerian transition to a university- and science-based model was made when two things became clear:

  1. Apprenticeships produced practitioners of highly variable competency. Flexner's report led to common standards for the education of doctors, then the only health professionals thought to be affected. Nurses were considered appropriately trained by the apprenticeship model in hospitals, and professionals in other fields of healthcare, if they existed at all, were few and far between.
  2. By the early 20th century, the sciences basic to the health professions had advanced to the point where practitioners had to understand their tenets in order to provide patients with the steadily increasing efficacy and quality of care from which we benefit today. Those scientific foundations continue to grow and to be incorporated in the educational standards set by those who certify graduates and trainees in the health professions as competent to practice.

The so-called digital revolution constitutes a powerful new force at work in the second decade of the 21st century. It is a force just as compelling as those that led the Carnegie Foundation to hire Flexner to take a hard look at what educational institutions were doing to prepare their graduates for practice as they entered the 20th century.

The new industrial (digital) revolution has many identities. In the health sciences, we think of molecular medicine, personalized therapeutics, individualized care, intelligent drug design and so on. But more generally, it means virtually everybody is instantly connected to as much of the world, here, there and far away, as they care to reach. Basically, the digital revolution gives individuals ready access to the universe of existing knowledge, including knowledge of threats to their health and well-being, as well as a growing ability to draw on a plethora of continuously recorded, app-derived personal physiological data. As these forces affect the health professions, one can foresee a fundamental shift from population-based care, treatments and research to those based on the fast-increasing knowledge of the genetic, lifestyle and clinical characteristics of individuals, both of patients and particularly of people seeking to avoid becoming patients. Accompanying this shift to individualized healthcare will be a growing insistence on a return to personalized care: care closely tailored to the needs, wants and cultural preferences of the individual patient and his or her family, a style of caring more comparable to that of yesteryear than of yesterday.

The fast-developing potential of individualized healthcare is well described by Eric Topol in The Creative Destruction of Medicine (2011). A cardiologist at the leading edge of applying knowledge derived from genomic sequencing and from technology that records physiological functions digitally on one's cell phone, Dr. Topol, clearly a lonely pioneer, decries the health professions' long resistance, medicine's especially, to embracing information management. He points out the remarkable and still-accelerating trend to connectivity made possible by the invention of the cell phone in 1973, the personal computer in 1980, the Internet in the 1990s and the proliferation of digital devices, cloud computing and social networking in the first decade of this century, during which ever-cheaper and more rapid sequencing of the genome has also been developed. Sequencing's cost continues to fall, and its speed to rise, at a rate that makes Moore's law seem very conservative.

In 2012, the first-ever iPOP, an integrative personal "omics" profile, was completed at Stanford. It included many billions of data points, among them some 40,000 biomarkers from analyses of the team leader's deoxyribonucleic acid (DNA), RNA, cell proteins, antibodies, metabolites and molecular signals. Complementing such data, the US Food and Drug Administration (FDA) has already approved a host of biodetectors, so-called "sniffers": home-use physiological data-recording applications, or apps, for everybody's smart phone.

And how on earth will we cope with such enormous volumes of data? Think of Watson, IBM's plain-language computer that prevailed over two champions on Jeopardy! a couple of years ago. According to a recent article in The Economist, "Watson is now smaller, more efficient and 24 times faster than it was when it won Jeopardy. It is being applied to industries such as finance and health, both of which rely on expensive human advisors. WellPoint and the Cleveland Clinic are actively collaborating in the development of Watson apps related to healthcare. The computer is some way from making doctors and financial advisors redundant, but it is well placed to provide them [everybody] with expert advice" ("A Cure for the Big Blues" 2014, January 11).

A number of Cs are fundamental to the changing ways:

  • Constant connectivity: With Facebook, Twitter, blogs, databases and the like, nearly everybody is connected to everybody else.
  • Collaboration and crowdsourcing: Smart phones, the Internet and social networks have created a participatory culture. Relevant to the health professions are sites such as PatientsLikeMe, MedHelp and others in which shared experiences are openly available to any and all who want to benefit from them. To quote Topol again, "Our go-to source for health and medical information is moving away from our doctor … increasingly [to] crowd- and friendsourcing."
  • Customized consumption: No longer content to buy an album, we download a particular song we want to hear by a favourite band.
  • Cloud computing: This is a cheap, seemingly unending capacity to store and instantly make available to everybody the world's total supply of data, information, recorded experience, advice and opinion.

These Cs together more than fulfill the concept of "creative destruction" made popular by the economist Joseph Schumpeter. While health professional education doesn't need "destruction," it does need, and quickly, to consider creatively how to do its job more effectively and efficiently in this new digital world.

Another aspect of the new industrial revolution is the mind-boggling potential of genomic-sequence analysis to unravel the causes of ill health and to prevent and repair them. Peter Huber, in The Cure in the Code (2013), argues that the legal framework the FDA follows in its adherence to the randomized controlled trial (RCT), the gold standard to be met before new drugs are licensed, is blocking the intelligent design of new pharmaceutical agents based on our increasing structural knowledge of the genomes of disease-causing agents and of the patients they affect. Basically, his thesis is that RCTs represent population-based research: finding out what works, with the least deleterious side effects, for the average person. Contemporary knowledge of the genome makes it possible to discriminate between people on the right-hand side of the response distribution, who respond well to a drug, and those on the left, who don't. It makes it possible to tailor treatments to subpopulations and then down to individuals, given the requisite information derived from analysis of their genome, their lifestyle, their clinical history, "sniffer" records and the health-threatening agent involved, be it a bacterium, virus, prion or whatever.

What should we be doing in health professional education to prepare students for maximally effective work in this new world? That's for the professions and educational institutions to answer. But there is no doubt that, as with information management generally, the health professions have to play catch-up. They are well behind the crest of the very big wave of the digital revolution. That wave is fast carrying forward to a shinier future those enterprises and fields that had the wit and discipline to get on their boards back in the days when the wave was smaller and easier to ride.

It's past time for the health professions to get into the water, pool their courage and resources and engage a modern Abraham Flexner to help them figure out what and how to change to meet the challenges of the digital revolution that is the hallmark of the 21st century.

About the Author(s)

Duncan G. Sinclair, PhD, is a one-time dean of the Faculty of Medicine and vice-principal of health sciences, and a fellow in the School of Policy Studies at Queen's University, in Kingston, Ontario.


References

"A Cure for the Big Blues." 2014, January 11. The Economist: 54–55.

Huber, P.W. 2013. The Cure in the Code. How 20th Century Law Is Undermining 21st Century Medicine. New York: Basic Books.

Stackhouse, J. 2014, January 25. "Davos Diary: A New Angst Settles Over the World's Elites." The Globe and Mail.

Topol, E. 2011. The Creative Destruction of Medicine: How the Digital Revolution Will Create Better Health Care. New York: Basic Books.
