
Healthcare Quarterly 1(4) June 1998: 5-6. doi:10.12927/hcq..16741

Quarterly Letters: An Effective Stimulus for Change

A.C.P. Powles


I read the article entitled "Benchmarking Comparisons of the Efficiency and Quality of Care of Canadian Teaching Hospitals" by Helyar et al. (Hospital Quarterly, Spring 1998) with great interest. As Chief of Staff of one of the Canadian teaching hospitals, I am charged by my Board of Trustees with responsibility for the quality of medical care provided to all patients. Like many of my colleagues, I am concerned that the attrition of healthcare resources through fiscal constraint will ultimately lead to deterioration in the quality of care deliverable to our teaching hospital patients, particularly when the alternative community-based facilities and resources are not yet in place.
"Benchmarking" and "best practices" are words in common use at our hospital and their underlying concepts have provided an effective stimulus for change. To ensure that benchmarking continues to be an effective stimulus for change, physicians and administrators would do well to remind themselves of the process of benchmarking, as quoted by Helyar et al.: "In healthcare, the key to benchmarking rests with understanding and improving the underlying processes and practices that drive quality, cost and clinical excellence. Benchmarking helps to determine how other organizations have achieved exemplary performance and suggests a method for adapting benchmark performance to one's own organization." And, later in the discussion: "The goal of the ACTH/HayGroup Benchmarking Comparison of Canadian Teaching Hospitals is to identify high-performing clinical and operational processes that can be emulated to improve the efficiency and quality of hospital services across Canada."

It is not enough to look within one's own institution for efficiencies or changes in process. It is necessary to examine how the high-performing hospitals provide care and what processes are used to achieve the benchmark indicator. The data provided in the article are just that: figures illustrating the variability of performance for each of the indicators. Those seeking to meet the benchmark will need information about the underlying processes. How is care provided in Hospital C, with the short average length of stay (ALOS) for stroke discharges (Exhibit 4)? Is this the result of the availability of a stroke rehabilitation unit, or is rehabilitation carried out elsewhere in a non-acute setting? Is the short length of stay an artifact caused by the opportunity to transfer the patient? In this day of the "integrated system," it would be appropriate to count all days during which care is delivered to the patient as an in-patient, whatever the institution. The same point can be made with regard to knee replacement (Exhibit 5), which shows two groups of hospitals, one with an ALOS of 7-8 days, the other (said to have lower case volumes) with an ALOS over 10 days. Those hospitals with high knee replacement volumes might be expected to function in conjunction with a rehabilitation centre; however, whether the days spent in the rehabilitation centre have been counted in the ALOS is unclear.

To avoid any potential for the ALOS numbers to become a shell game, it would be very useful to capture and display all institution-based patient days, whether in the teaching hospital or another institution, for all those CMGs where part of the care required is likely to be provided in non-acute settings. This will enable non-teaching hospital resources and costs to be identified and compared. "Benchmarking," to be of real value as we restructure to provide more care outside the hospital, must take into account both the resources used and the quality of care delivered, in hospital and non-hospital settings alike.

There is, inevitably, an emphasis on efficiency measures in this article: inevitable because of the data sources on the one hand, and because of the difficulty of defining quality of care on the other. Clinical and Operational Efficiencies are not necessarily indicators of the quality of care. Defining quality of care by two indicators, the Risk Adjusted Complication Ratio and the Risk Adjusted Mortality Index, as done in the article, is a start, but the conclusions are so surprising that the authors have to acknowledge the documentation issues as well as the possible impact of structural changes to the healthcare system on the quality of care provided. The Hospital Accreditation Standards define eight dimensions of quality: Efficiency, Effectiveness, Accessibility, Acceptability, Appropriateness, Safety/Risk, Competence, and Continuity. Many of these can be interpreted as relating to the performance of the institution in delivering care, rather than to the quality of care delivered to patients.
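A minimal sketch of the suggestion above, to count all institution-based days for an episode before computing ALOS by CMG, might look like the following. The record layout and field names (cmg, episode_id, days, facility_type) are hypothetical, chosen only to illustrate the calculation.

```python
# Illustrative sketch only: pool acute and non-acute patient days per episode
# before computing ALOS by CMG. Field names are hypothetical.
from collections import defaultdict

def combined_alos(stay_records):
    """Average length of stay per CMG, counting all institution-based days
    for an episode regardless of which facility provided them."""
    days_per_episode = defaultdict(int)   # (cmg, episode_id) -> total days
    episodes_per_cmg = defaultdict(set)   # cmg -> episode ids
    for r in stay_records:
        key = (r["cmg"], r["episode_id"])
        days_per_episode[key] += r["days"]          # acute + rehab + other
        episodes_per_cmg[r["cmg"]].add(r["episode_id"])
    return {
        cmg: sum(days_per_episode[(cmg, ep)] for ep in eps) / len(eps)
        for cmg, eps in episodes_per_cmg.items()
    }

# Example: a stroke episode split between an acute hospital and a rehab centre
records = [
    {"cmg": "stroke", "episode_id": 1, "days": 6, "facility_type": "acute"},
    {"cmg": "stroke", "episode_id": 1, "days": 18, "facility_type": "rehab"},
    {"cmg": "stroke", "episode_id": 2, "days": 21, "facility_type": "acute"},
]
print(combined_alos(records))  # {'stroke': 22.5}
```

Pooled in this way, the episode split between an acute hospital and a rehabilitation centre contributes its full 24 days to the ALOS, rather than only the acute portion.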

I applaud the Association of Canadian Teaching Hospitals and the HayGroup for what they have achieved so far, particularly in describing Clinical and Operational Efficiency, but I urge them to go further and set up a system to define and measure quality of care in terms of deliverables that have been shown to affect health outcomes. In this time of "evidence-based medicine" and the growing popularity of "care paths," we should be able to define quality of care differently, and measure it. This may need to be done by individual CMG. In myocardial infarction, for example, the "door to needle time" is an indication of compliance with an evidence-based standard of care; so, for the same condition, is the proportion of patients discharged on medications considered appropriate for the reduction of morbidity and mortality post-MI. This type of information can be obtained currently, at the cost of many human hours (e.g., the MACSTRAK Project), but it should in the future be more easily accessible via computerized medical records or, in the shorter term, through linkage of key elements such as the pharmacy and diagnostic databases.
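A minimal sketch of the kind of linkage described above might look like the following. The record layouts, field names, and the list of "appropriate" drug classes are illustrative assumptions, not a description of any existing pharmacy or diagnostic system.

```python
# Illustrative sketch only: link a diagnostic database and a pharmacy database
# to estimate the proportion of MI patients discharged on medications of a
# given class. Field names and drug classes are hypothetical.

APPROPRIATE_CLASSES = {"beta-blocker", "ASA"}  # illustrative list only

def post_mi_discharge_rate(diagnostic_db, pharmacy_db):
    """diagnostic_db: iterable of {'patient_id', 'diagnosis'};
    pharmacy_db: iterable of {'patient_id', 'drug_class'} discharge prescriptions."""
    mi_patients = {d["patient_id"] for d in diagnostic_db if d["diagnosis"] == "MI"}
    if not mi_patients:
        return 0.0
    on_appropriate_rx = {
        p["patient_id"]
        for p in pharmacy_db
        if p["patient_id"] in mi_patients and p["drug_class"] in APPROPRIATE_CLASSES
    }
    return len(on_appropriate_rx) / len(mi_patients)

# Example
diagnoses = [{"patient_id": 1, "diagnosis": "MI"}, {"patient_id": 2, "diagnosis": "MI"}]
prescriptions = [{"patient_id": 1, "drug_class": "beta-blocker"}]
print(post_mi_discharge_rate(diagnoses, prescriptions))  # 0.5
```

Even this simple join of two databases would yield, per CMG, an indicator of compliance with an evidence-based standard of care without chart-by-chart review.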

About the Author(s)

A.C.P. Powles, M.B., Ch.B.
Chief of Staff, St. Joseph's Hospital
Professor of Medicine
Faculty of Health Science
McMaster University
