
Healthcare Quarterly 17(Special Issue) October 2014: 45-50. doi: 10.12927/hcq.2014.23950
Key Levers To Patient Safety

Measurement of Quality and Safety in Healthcare: The Past Decade and the Next

Gary F. Teare

Abstract

The author calls for a critical assessment of the impact of investments made in the measurement of quality and safety, and reflects on whether a reorientation of some of this investment is required to realize the healthcare quality and safety improvement the system seeks. The article also reviews several typical Canadian initiatives and draws on the experience of health systems that have used measurement to great effect to suggest how investments in healthcare quality and safety measurement should be focused in the future.

There has been an explosion of healthcare performance (quality and safety) measurement activity in the decade since the Baker et al. (2004) study on patient safety in Canadian hospitals. Around the time of that study, several provinces had launched, or were developing, provincial health quality councils or similar functions in government. These entities began to develop and release public reports on aspects of health system performance, work with healthcare organizations and teams on improvement of health services and develop capability for "measurement for improvement." The Canadian Institute for Health Information (CIHI) took on responsibility for public reporting on Ontario hospital performance, building on methods developed by the University of Toronto for the Ontario Hospital Association Hospital Reports (OHA 2003), eventually expanding the effort nationally. The Canadian Patient Safety Institute (CPSI) was created and launched Safer Healthcare Now! (SHN!) and its related system of patient safety indicator measurement (CPSI 2004). Several provincial governments initiated work on metrics-based accountability agreements with regional health authorities or other healthcare agencies. Think tanks (e.g., Fraser Institute, Frontier Institute) and news media organizations (e.g., CBC's The Fifth Estate) have also produced public reports on the quality and safety of healthcare over the years.

This interest has been valuable in bringing attention and expertise to address what was a dearth of performance measurement in healthcare, a condition that set healthcare in sharp contrast to most other industries. It has also had detrimental effects. The many uncoordinated measurement and reporting initiatives have at times created a cacophony of measures, measurement approaches and messages that can confuse and distract rather than focus and provide insight helpful to systematic efforts to improve healthcare performance. This state of "indicator chaos" was highlighted, and potential solutions identified, at a May 2011 meeting in Saskatoon of representatives from many of the organizations and academics engaged in healthcare performance measurement (Health Quality Council 2011). A key idea that emerged from that meeting was that a nation-wide mechanism enabling coordination of, and collaboration in, measurement work would help to reduce unnecessary duplication of effort.

Attempts to Bring About Improvement by "Top Down" Measurement

The largest investments in performance measurement in Canadian healthcare have been oriented to a "top down" theory of change. This is reflected in numerous efforts and large investments to identify and develop standardized indicators, with appropriate adjustments for bias, that enable comparisons of performance among jurisdictions or healthcare organizations/facilities. Ontario's Hospital Reports (OHA 2003) and similar national or provincial initiatives have largely followed this path. The operative theory has been that if we get the measurement right, the facts will speak for themselves, and organizations or jurisdictions that are outliers on particular measures will be motivated to make the required changes in behaviour. This in turn will bring about needed improvement in the healthcare processes underlying the results reflected in the measures.

To that end, we have invested heavily in (and spend a lot of time criticizing) the scientific validity of performance indicators, in identifying frameworks and sets of measures that are meaningful and feasible to measure across organizations and jurisdictions and in building electronic tools on which to report them. The data used for this measurement are generally taken from existing standardized sources such as administrative health data, although in some cases new data collection is developed for the purpose. While the data all come out of the daily activity of healthcare, they are often abstracted from the care process, and the clinicians generating them are often barely aware of them. In other cases, where the data collection was created specifically for the measurement purpose, clinicians may be hyper-aware of the data and annoyed by the "add-on" activity of collecting it. The quality and safety measures themselves are generally calculated at some distance (in both space and time) from the point of care and are generally reported electronically, on a website or online reporting tool, using increasingly sophisticated graphics and methods to facilitate comparisons.

The typical response to this reporting is that an analyst distills the information into a report for the organization's leaders, who focus their attention on those measures where the organization is an outlier or is not performing as well as hoped. This is followed up by a command to "fix the problem."

Unfortunately, successful improvement based on this approach is limited and is often not sustained when the attention of the leader shifts elsewhere or when leadership changes. The measurement is disconnected from the daily processes of care, which is where the improvement needs to take place. The work is usually handed off to be directed by a committee and months may pass.

In successful cases, the key processes to be fixed are identified and improved. However, in getting there, the improvement team finds that the measures from the report that motivated the leader to say "fix it" are usually not timely enough to support process improvement work; the data they are based on are now old. Or the measures reflect only outcomes of care, and the team does not have information about the performance of the processes that lead to those outcomes, nor about the inputs (e.g., patients and materials) to those processes, which would help them interpret and contextualize the outcomes. So, successful improvement work requires the development of local, point-of-care measurement to understand and monitor the performance of those processes. Generally, the resources needed to do this improvement work are configured as additive to the care process itself. Enthusiastic clinical and administrative champions go "above and beyond" their daily work to make the improvements happen and to ensure that measurement and reporting supports (staff, tools) are put in place. Unfortunately, the success depends on these additional inputs, and when the enthusiasts tire or move on, or when leadership attention shifts to fixing a new problem, the efforts cease. The entropy inherent in the system can undo any improvements fairly quickly.

Attempts to Provide Support "From Away" to Local Improvement as Part of Larger Campaigns

Recognizing that outcomes-oriented measurement was insufficient, many organizations have attempted to help local improvement teams by providing training and support. Quality improvement "Breakthrough Collaboratives" (Health Quality Council 2008) and national healthcare safety campaigns, such as SHN!, are examples of this kind of initiative. These initiatives have played an important role in spreading a working knowledge of quality improvement and patient safety methodology. They also give point-of-care teams (microsystems) hands-on experience in capturing and using data to understand and improve their care processes.

SHN! engaged healthcare organizations and providers across the country in focusing on improving a few key areas of healthcare known to be associated with higher risk to patients' safety. The aim of the initiative was to help hospital healthcare teams reliably follow practices previously demonstrated to be effective in dramatically reducing the frequency of patient harm. From a measurement perspective, SHN! supported healthcare teams' evaluation of their process improvement by providing well-defined measures, not only of outcomes but also of the key underlying processes, and by providing electronic tools to facilitate local data capture and basic analysis. Eventually, an online tool was developed for data entry and basic reporting.

The improvement science and measurement support, and the kinds of measurement done in SHN!, provide an important next level of engagement to help local improvement teams meet what are still largely "top down" improvement goals. Having the important processes already identified, having appropriate measures already defined and having some technology in place to facilitate data capture and reporting with much less delay address some of the key reasons why the first kind of "top down" measurement often fails to lead to improved quality and safety. Unfortunately, the same key features of these initiatives, which enable them to achieve improvement results relatively quickly, can also be the source of their unsustainability.

In most examples of this kind of initiative, whether SHN! or any number of other similar programs, the weak link is that the improvement activity and the related measurement are still "add on" activities for the organization and the clinical teams delivering care. Teams struggle with the measurement and come to see it as something they are doing "for" the initiative or its sponsoring organization instead of for themselves and their own learning; to avoid this, measurement must be built into workflows so that it becomes a seamless and value-adding part of staff work. Having a separate online form or website for data entry and reporting does not fit clinical workflow. As a result, the job of measurement goes to a special resource (e.g., a research nurse or study coordinator), the kind that is the first to be cut under conditions of resource constraint (i.e., when leadership's priorities shift elsewhere or budgets are cut).

Measurement to Support Bottom Up Improvement in the Context of Top Down Prioritization

To borrow something often said of politics, "all improvement is local." Improvement in patient healthcare outcomes begins with improving the appropriateness of care and the processes by which it is delivered. It seems self-evident that engaging the hearts and minds of local healthcare teams (the patient, the clinicians, support staff and their immediate supervisors) in the effort to improve care is the way to sustainable, real improvement. This has certainly been the path taken by the healthcare systems most often looked to by others as exemplars of improvement success: places such as Virginia Mason Medical Center (VMMC) in Seattle, Intermountain Healthcare in Utah, Southcentral Foundation in Alaska and Jönköping County in Sweden. Each has achieved this in different ways, and there is not space to discuss them all. Here, we will focus on key lessons from Virginia Mason and Intermountain Healthcare pertaining to the important role measurement plays in improvement work and will touch on how some of these practices are being replicated in Canadian settings.

Visual Management

Visual management is a different form of "measurement and reporting": a technique promoted in the quality and safety improvement practices most thoroughly developed for manufacturing at Toyota (popularly called "lean") and adapted to healthcare by Virginia Mason (2014). Developing visual management of a process involves having the team that does the work understand their processes and, wherever possible, create standards for the operation of those processes. Visual management then involves creating visual (and sometimes audible) cues to signal to people working within that process when a critical step is ready to be taken or when a critical part of the process is not operating within the standard. For example, in a hospital or clinic, this could be a flag system on doorways to signal when a room is ready for a patient or when the patient is ready for a particular provider type or service. Or it could be tracking of patient flow through a clinic, with different-coloured indicators on a whiteboard showing whether each care team is on time or running behind, to enable on-the-fly management of the schedule. Visual management is also the motivation for the workspace clean-up and organization practice (called "5S") promoted in lean improvement methods.

Daily visual management (DVM) extends this kind of practice to how clinical teams make their work "visible" to each other, their leaders and their patients: through key process and outcome metrics that they capture during the course of their work and use to update a "visibility wall" (metrics board) on a daily or weekly basis. The team uses the metrics on the board as a focal point for daily and/or weekly team huddles to plan or evaluate their work and to identify to each other opportunities for improvement or progress on improvement ideas being tested. The content displayed on visibility walls is largely driven by what the local (microsystem) team considers important, based on their processes and what they are striving to improve. However, the walls can (and should) also be used to help the team see the connections between their local work and organizational/system improvement priorities. Mature visibility walls contain a balanced set of metrics to help the team reflect on its performance with respect to quality, patient and provider safety, patient and provider experience, cost and the delivery of services (usually in terms of timeliness and quantity).

DVM is often based on quite low-tech approaches to measurement, such as tracking patient flow on a whiteboard with hourly status summarization to spur any actions needed. However, DVM can also draw on information generated from electronic tools used in managing or delivering healthcare, such as digital whiteboards, bed management software and electronic medical records. The key is that the measurement and reporting are done in real, or very close to real, time so that the information can be used actively in decision-making. The collection and use of the data are built into the daily work routines.
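
As a purely illustrative sketch, not drawn from any of the systems described above, the kind of near-real-time tally that feeds a visibility wall can be very simple. The team names, status categories and hourly summary below are hypothetical; the point is only that statuses observed during the work are counted and displayed close to when they happen.

```python
from collections import Counter
from dataclasses import dataclass
from datetime import datetime

# Hypothetical status categories a team might track on its whiteboard.
STATUSES = ("on_time", "running_behind", "room_not_ready")

@dataclass
class FlowEvent:
    """One observation recorded during the clinic day: team, status, time."""
    team: str
    status: str
    observed_at: datetime

def hourly_summary(events: list[FlowEvent], hour: int) -> dict[str, Counter]:
    """Tally statuses per care team for one hour of the day, the kind of
    count a team might transcribe onto its visibility wall at a huddle."""
    summary: dict[str, Counter] = {}
    for e in events:
        if e.observed_at.hour == hour and e.status in STATUSES:
            summary.setdefault(e.team, Counter())[e.status] += 1
    return summary

# Example with two hypothetical care teams during the 10:00 hour.
events = [
    FlowEvent("Team A", "on_time", datetime(2014, 10, 1, 10, 15)),
    FlowEvent("Team A", "running_behind", datetime(2014, 10, 1, 10, 40)),
    FlowEvent("Team B", "on_time", datetime(2014, 10, 1, 10, 20)),
]
print(hourly_summary(events, hour=10))
```

In practice, of course, a whiteboard and a marker often serve this purpose perfectly well; the value lies in the immediacy of the information, not in the tooling.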

Building DVM into the work routine is not automatic. It takes purposeful work, commitment and a flexible approach to make it best suit the needs of the team and to help the team see the connection of their work to larger organizational goals. It ultimately proves its utility to the team by helping create a less chaotic work environment, helping to tell the story of their continuous improvement progress and helping to make evident the improvements in patient outcomes they are achieving. In the last two years, hospital units and some primary care clinics in Saskatchewan have begun to learn and apply this approach to measurement.

Ultimately, the practice of DVM cascades up, and a similar approach informs visual management at higher levels of the organization and the health system as a whole. In Saskatchewan, the Regional Health Authorities (RHAs) have developed organizational- and department-level visibility walls for leaders at each level to use to track the work in their area and to inform the "good questions" that leaders can ask of those who report to them, so that barriers to improvement can be identified and addressed. At the top level of the Saskatchewan healthcare system, the Deputy Minister of Health, Physician Advisors, RHA CEOs and Board Chairs all meet quarterly around a visibility wall to maintain focus on provincial improvement priorities.

Measurement to Assist with the Design of Care

While measuring and monitoring improvement in care processes is vital, and quality improvement methods (including those of "lean") are tremendously helpful in improving how care is delivered reliably and safely, much of what is done in healthcare is not based on a solid evidence base, so standardizing what care should be provided presents a special problem. That is not to say that standardization cannot be applied to problems of appropriateness in healthcare; rather, it means that a purposeful and careful approach is required to develop and use standards in determining what care to provide for patients. Intermountain Healthcare has developed a very robust method for doing this, built on measurement.

Called "Shared Baselines" – what Intermountain Healthcare did, was combine the standardization of experimental clinical trial methodology together with quality improvement methods to build standard evidence-informed routines into care while preserving clinicians' autonomy to treat each individual patient in the manner that seems to best suit that patient. The method is supported by a measurement system that is built into the clinical workflow to capture important aspects of patient characteristics (process inputs), key process decision/action points and patient and health system outcomes (clinical, experience and cost). Importantly –the measurement approach enables the capture of clinician-initiated protocol variations and includes a "learning loop" to feed the information on those variations and short- and long-term patient outcomes back to the clinicians on a regular basis (James 2014). The latter feature is key, as it forms the basis of "evidence-generating" healthcare – wherein aspects of healthcare, for which specific clinical trial grade evidence does not exist to guide decisions, can be informed by the documented accumulation of experience over time to improve care decisions. This is an important feature of this measurement approach, as most of healthcare in the real world is not provided to the highly selected patient populations included in clinical trials.

Where Intermountain Healthcare has excelled is that it prioritized its improvement work to focus first on the "golden few" care processes that comprise the bulk of the care the organization provides, developed an information system and approach to measurement that embedded measurement into the clinical workflow, adjusted its management structures to encourage use of the data for improvement and aligned financial incentives to enable clinicians to provide the right care without suffering a penalty for doing so (James and Savitz 2011). Today, Intermountain Healthcare is widely known for its highly effective use of information technology to guide improvement and the achievement of better patient outcomes. Information technology is an important ingredient in Intermountain Healthcare's measurement approach, but Brent James is quick to caution against jumping to computer use too quickly: Intermountain Healthcare wasted many millions of dollars in initial, failed attempts at health information systems until it aligned its IT strategy with its shared baselines clinical integration approach.

In a nutshell, the approach involves a team of clinicians, supported by measurement and quality improvement experts, visualizing the care process (the patient's journey through it) using process mapping, determining key decision points in the process, agreeing on a standard approach to care at those points and identifying key clinical, patient experience and cost outcomes pertinent to that care process. To round out the measurement needs, the team identifies key patient characteristics and other process inputs that will be important for properly interpreting process and outcome measures (i.e., for stratification). The team determines the kinds of feedback reports it will need to monitor the standards and to learn from clinician-initiated variations, and identifies the specific data that will need to be collected to produce those reports. The next phase involves identifying the most appropriate places within the workflow to collect specific data elements, running a trial of collecting those data using pre-coded forms or checklists on paper and producing initial copies of the reports. At that stage, a final selection of the most valuable reports is made, and only the data required to support them are "hard wired" into the electronic medical record and other electronic data collection tools. With regular feedback of the reports to clinicians, and scheduled annual minor and triennial major reviews of the shared baseline protocols, the standards are continuously updated to reflect the latest evidence, both from the published scientific literature and from the accumulated observations and interpretation of Intermountain Healthcare's own protocol variations data.
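
To make the measurement elements concrete, the following is a minimal, hypothetical sketch (not Intermountain Healthcare's actual system) of the kind of record a team might capture at a single protocol decision point, including clinician-initiated variation and its stated reason, together with a toy "learning loop" summary that feeds adherence, variation reasons and outcomes back to clinicians. All field names and categories are invented for illustration.

```python
from collections import Counter
from dataclasses import dataclass
from typing import Optional

@dataclass
class DecisionPointRecord:
    """One patient's passage through a single decision point in a shared-baseline
    protocol: the input characteristic needed for stratification, whether the
    standard was followed and, if not, the clinician's stated reason."""
    patient_id: str
    risk_stratum: str                        # process input used to interpret outcomes
    followed_protocol: bool
    variation_reason: Optional[str] = None   # recorded only when the clinician deviates
    outcome_good: Optional[bool] = None      # filled in when follow-up data arrive

def variation_feedback(records: list[DecisionPointRecord]) -> dict:
    """Toy feedback report: protocol adherence, tallied variation reasons and
    outcome rates split by whether the standard was followed."""
    report = {"adherence_rate": 0.0, "reasons": Counter(), "outcomes": {}}
    if not records:
        return report
    followed = [r for r in records if r.followed_protocol]
    varied = [r for r in records if not r.followed_protocol]
    report["adherence_rate"] = len(followed) / len(records)
    report["reasons"] = Counter(r.variation_reason for r in varied if r.variation_reason)
    for label, group in (("protocol", followed), ("variation", varied)):
        known = [r for r in group if r.outcome_good is not None]
        report["outcomes"][label] = (
            sum(r.outcome_good for r in known) / len(known) if known else None
        )
    return report
```

In a real system, such records would be captured within the electronic medical record as part of the clinical workflow rather than in a standalone script; the sketch is meant only to show the shape of the data and of the feedback that makes learning from variation possible.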

At present, no Canadian health organization or system has created a system for the design of care and active learning from variation as well-developed as that of Intermountain Healthcare. The Variations and Appropriateness Working Group of the Saskatchewan Surgical Initiative (2012) has followed Intermountain Healthcare's lead in designing a shared baseline protocol in vascular surgery and developed the measurement system following the method used by Intermountain Healthcare, but the province still lacks the information systems infrastructure to build measurement seamlessly into workflow and has not addressed the issues of clinical management structure or financial incentives. Alberta's Strategic Clinical Networks appear to have important elements of clinical management structure in place but have yet to reliably deliver the hoped-for improvements in care. The entire package of changes to care design, measurement, management structure and incentives that will work in a Canadian context has yet to be realized. From a measurement perspective, it is important to note that the successful approach used by Intermountain Healthcare required an investment in improvement and measurement expertise that could be embedded with clinical standards development teams for an extended period, initially to help develop and then to help maintain the shared baselines approach.

The Next Decade in Quality and Safety Measurement

As a country and as provincial/territorial healthcare systems, we will continue to need standardized, comparable metrics that can be used to identify areas where improvement is needed and to document trends in improvement (or its absence) over time among jurisdictions and organizations. There will also continue to be areas of care where the existing evidence relating specific processes to desired outcomes is quite solid; for these, mechanisms such as large-scale campaigns or collaboratives will still be needed to facilitate the spread of better practices. Each of these approaches has an important role to play and needs further investment to improve its effectiveness.

An emerging area of healthcare quality and safety measurement where significant investment and focus need to be placed going forward is helping clinical teams and leaders at all levels learn how to make their work processes visual and to manage them in that transparent way. In short, it will require openness to changing the healthcare leadership culture to one where transparency and visibility of processes and outcomes (the great, the good, the bad and the ugly) are a fundamental principle. We will only see visual management increase if leaders at all levels invest their time and their resources in developing it.

A key area of healthcare quality and safety measurement that needs investment is the development of local, measurement-savvy quality improvement support personnel. They would work at the local, regional and provincial levels with healthcare providers and patients to develop the kind of data and information that will be most useful to them in understanding and improving their care processes over time. These people must have strong numeracy and solid quality improvement science skills, and must be highly emotionally intelligent and skillful at working with groups of experts who often hold widely divergent opinions about the work at hand. There are few training programs in health systems or at universities to develop these skills. Moreover, people with strong numeracy and analytical skills in Canadian healthcare organizations presently tend to find their time largely occupied responding to "fix it" imperatives from leadership, motivated by top down kinds of measurement and reporting.

The last area requiring significantly new and different development attention is information technology. For too long, the focus of electronic medical record development has been to essentially replicate the paper medical record using bytes instead of a pen. Canada needs to develop information technology solutions that are easy to use for data capture within the clinical workflow and yet conform to compatibility standards that enable data to flow within the health system. We need flexible online tools with interfaces that are easy to adapt to different scenarios to capture data on the fly, and that do not require a lot of custom programming by consultants to get data into or out of them. We need personnel trained to work with these systems, embedded along with the improvement support people in clinical teams, to ensure the development of IT that truly enhances and fits with care workflow rather than adding extra work.

In conclusion, Canadian healthcare needs balance and parsimony (Meyer et al. 2012) in its approach to large-scale measurement initiatives, to ensure that much more time is given, and appropriate investments are made, to develop local and provincial capabilities for visual management and care (re)design.

About the Author(s)

Gary F. Teare, PhD, MSc, DVM, is executive director, Measurement and Analysis, at the Saskatchewan Health Quality Council in Saskatoon, SK.

Acknowledgment

The author thanks Danton Danielson, Manager of Evaluation, CPSI, for sharing valuable information and insights about the Safer Healthcare Now! program and its related Patient Safety Metrics program. All opinions, errors and conclusions in the paper are those of the author alone.

References

Baker, G.R., P.G. Norton, V. Flintoft, R. Blais, A. Brown, J. Cox, E. Etchells, W.A. Ghali, P. Hebert, S.R. Majumdar, M. O'Beirne, L. Palacios-Derflingher, R.J. Reid, S. Sheps and R. Tamblyn. 2004. "The Canadian Adverse Events Study: The Incidence of Adverse Events Among Hospital Patients in Canada." Canadian Medical Association Journal 170(11): 1678–86. doi: 10.1503/cmaj.1040498.

Canadian Patient Safety Institute. 2004. Safer Healthcare Now! Retrieved September 16, 2014. <http://www.saferhealthcarenow.ca/EN/Pages/default.aspx>.

Health Quality Council. 2008. The Courage of One, The Power of Many: The Saskatchewan Chronic Disease Management Experience. Retrieved October 3, 2014. <hqc.sk.ca/Portals/0/documents/chronic-disease-management-report.pdf>.

Health Quality Council. 2011. Think Big, Start Small, Act Now: Tackling Indicator Chaos. Retrieved October 3, 2014. <http://hqc.sk.ca/Portals/0/documents/tracking-indicator-choas.pdf>.

James, B.C. 2014. "The Cystic Fibrosis Improvement Story: We Count our Successes in Lives." BMJ Quality and Safety 23: 268–71. doi: 10.1136/bmjqs-2014-002839.

James, B. and L. Savitz. 2011. "How Intermountain Trimmed Healthcare Costs through Robust Quality Improvement Efforts." Health Affairs 30(6): 1185–91. doi: 10.1377/hlthaff.2011.0358.

Kenney, C. 2011. Transforming Health Care: Virginia Mason Medical Center's Pursuit of the Perfect Patient Experience. New York, NY: CRC Press.

Meyer, G.S. et al. 2012. "More Quality Measures Versus Measuring What Matters: A Call for Balance and Parsimony." BMJ Quality and Safety 21: 964–68. doi: 10.1136/bmjqs-2012-001081.

Ontario Hospital Association. 2003. Hospital Reports. Retrieved September 16, 2014. <http://www.oha.com/KnowledgeCentre/Library/HospitalReports/Pages/HospitalReports.aspx>.

Saskatchewan Surgical Initiative. 2012. Sooner, Safer, Smarter. Retrieved September 16, 2014. <http://www.sasksurgery.ca/sksi/surgicalinitiative.html>.
