Healthcare Quarterly

Healthcare Quarterly 13(4) September 2010: 24-26. doi:10.12927/hcq.2013.21994
Accelerating Excellence Report

By the Numbers: Measuring for Quality Care

Jane Coutts

Saskatchewan's Health Quality Council (HQC) was launched in 2003 with a mandate to not only measure and report on healthcare but also work with a range of partners to improve the province's health system. In late 2007, HQC's board decided it was time for Saskatchewan to reinvent its healthcare system, using the highest-performing systems in the world as its model. And in 2008, HQC launched Accelerating Excellence, a multi-level program to rethink, redesign and renew healthcare.

To help maintain momentum and show other provinces that high-performing healthcare can be achieved in Canada, HQC is documenting its journey. This fourth article in the series discusses how measurement is being used to improve quality in healthcare.

Faced with the numbers, there was only one choice: action. The numbers in question showed how many of the Saskatoon Health Region's chronic heart failure patients were readmitted to hospital after an initial stay between 2002 and 2004 – and compared them with national and provincial averages. They very clearly showed a problem (Table 1).


Table 1. Readmission rates for chronic heart failure patients

                     After 30 Days   After 90 Days   After 1 Year
Saskatoon                 12%             25%             41%
National average           8.7%           14.1%           23.6%

It wasn't a small difference, bad data or coincidence. Nor was it a problem across Saskatchewan: other health regions weren't getting the same kind of results. Simply put, Saskatoon was doing a bad job with heart failure patients. And something had to be done about it.
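Behind rates like those in Table 1 is a straightforward calculation: link each discharge to any subsequent admission for the same patient and count how many fall within a given window. Here is a minimal sketch of that calculation in Python, using entirely hypothetical records (none of the names or numbers come from the Saskatoon project):

```python
from datetime import date

# Hypothetical records for illustration only: each tuple is
# (discharge date of the initial heart-failure stay, date of first
# readmission, or None if the patient was never readmitted).
stays = [
    (date(2003, 1, 10), date(2003, 1, 30)),   # back after 20 days
    (date(2003, 2, 5),  date(2003, 7, 1)),    # back after 146 days
    (date(2003, 3, 12), None),                # never readmitted
    (date(2003, 4, 2),  date(2003, 4, 20)),   # back after 18 days
]

def readmission_rate(stays, window_days):
    """Share of discharges followed by a readmission within window_days."""
    readmitted = sum(
        1 for discharged, returned in stays
        if returned is not None and (returned - discharged).days <= window_days
    )
    return readmitted / len(stays)

for label, window in [("30 days", 30), ("90 days", 90), ("1 year", 365)]:
    print(f"Readmitted within {label}: {readmission_rate(stays, window):.0%}")
```

The hard part in practice is not the arithmetic but the linkage: matching records for the same patient across admissions, which is what national comparative datasets make possible.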

Patrick Robertson, clinical pharmacy manager for the region, was approached by two colleagues who were worried about the numbers. Together, the three prepared an application for an Innovation Grant from HQC to tackle the problem.

Their pilot project was based on W. Edwards Deming's continuous quality improvement principle of plan-do-study-act (PDSA); they planned five PDSA cycles during the pilot. The project itself was simple – hire a nurse to educate heart failure patients and follow up with them after discharge.

Linda Sinclair, the nurse who was hired, fairly quickly made a shocking discovery: most patients who had been in the hospital for heart failure had no idea what was wrong with them, or what to do about it. Told they had "fluid on the lung," they made no connection between the condition and what they were eating, and were unaware they needed to cut down radically on salt. No wonder they kept winding up back in hospital.

Fairly simple measures were introduced. An information package, featuring a big refrigerator magnet with the basic rules of self-care, was put together and given to every patient. Ms. Sinclair went around to wards, educating nurses on how to explain the condition to patients, and she herself called all patients shortly after they were discharged, to see what they understood and what she could help with.

This turned out to be a simple but effective strategy. After the pilot project, the 30-day readmission rate had dropped to 11%, the 90-day rate to 14% and the one-year rate to 21%.

Simple, but effective – just like the message contained in those original numbers: you're not doing this well enough. Fix it.

Unfortunately for Saskatchewan's Accelerating Excellence program, using measurement to improve healthcare quality is rarely that easy, and the farther you get from front-line care – as you try to find measures to guide organizations and regions and provinces – the harder it gets.

However, as the old saying goes, if you can't measure it, you can't manage it. So measuring is essential for quality improvement, according to healthcare consultant and HQC board member Steven Lewis. Fundamentally, he says, "you can't know how well you're doing if you don't measure, and it's hard to establish accountability without measurement and reporting."

But there are several other reasons measurement is a building block of quality improvement.

"People seem to respond to comparative performance data more positively than they do to hectoring," he says, and "the very act of measurement requires you to decide what to measure, which requires you to think more precisely about what you're doing and why."

That's where HQC found itself in late 2006, after a strategic planning meeting with 250 participants from across the province. HQC learned that regions found that the jumble of measurement and improvement initiatives they were being asked to participate in was actually getting in the way of their ability to deliver quality care.

Furthermore, the measurement data that were available, people said, were disconnected from what they needed to know to make everyday decisions. The data weren't necessarily the type people wanted in the first place, and then there was the issue of time lag – it's hard to base healthcare decisions on information that is often a year or more old.

They called it "indicator chaos" and asked HQC to develop a unified, organized system for measuring and reporting on quality that would make information easier to gather and more relevant for improving quality.

The response was to launch the Quality Insight project in 2007, a single province-wide measurement program. It issued a first, brief report in 2008, but even while it was being written, HQC knew that to be really useful, Quality Insight had to involve its target users in designing indicators. So they created the Quality Insight Working Group (QIWG), made up of representatives from the health regions, the Saskatchewan Cancer Agency and the Ministry of Health.

Participants were chosen to ensure the indicators they developed would be relevant to and trusted by providers, managers and leaders. Their work focused on developing standard indicators to use for reporting at the system level – on individual health, yes, but also on the overall health of the population, how providers are doing and the sustainability of the system.

"We're never going to be reporting on handwashing," says Gary Teare, director of quality measurement and analysis at HQC. Ward-level issues, such as processes to reduce infections or falls, he explains, are best measured and dealt with on the spot (though with broader comparisons, of course). The focus of Quality Insight, although it will help develop those "micro" indicators, is to figure out how to measure the things that reflect the overall quality of the health system and its contribution to the health of the population.

So then the tricky question – what to measure to do that? As a start, all the members of QIWG reviewed everything measured in their own organizations. That exercise revealed just how uncoordinated measurement is, even within an organization; it also showed that many organizations don't gather data at all, or don't do it regularly. It turned out that the data available from QIWG members mostly involve the results of acute care – what the working group called, in its first report, "data that describe how busy the system is, not how well it is doing."

The working group was struggling to find its first focus for action when Saskatchewan's Patient First Review of healthcare in the province wound up its work with a recommendation that "the health system take immediate action to improve Saskatchewan patients' surgical experiences." The Saskatchewan Surgical Initiative, launched by the Ministry of Health in response to that recommendation, signalled early on that it must include regular reporting on clear targets.

The job of developing that reporting system – one that would let the public and healthcare providers know whether waits were getting shorter, patient and provider experiences were improving and patients were doing better after surgery – was given to QIWG, with HQC supporting the work.

HQC welcomed it for the focus it gave them. "It gives the whole organization a way to prioritize the work it's doing," says Rosemary Gray, the consultant directing development of the online reporting tool for the surgical initiative. "It lets us streamline the projects and brings the organizations together around a common thread." (The next stage of her demanding job will be to take that common thread and work out how to build a web-based reporting system around it, one that adapts the same information to different audiences, from the curious public to the intent planner who needs to analyze every available number going back years.)

The surgical initiative has five overall objectives: shorter waits for surgery, a better experience for patients and families, safe high-quality care, support for good health (including fall prevention and smoking cessation) and a stronger patient-centred focus among providers.

At this early stage, QIWG is depending on data that already exist for its reports. Much of those data are the kind of information that was so helpful in fixing Saskatoon's poor record on caring for heart failure patients – national data on patient care, organized into comparative charts. That's invaluable at one level, as we've seen, but it doesn't help with finding better ways to run the system, planning services or improving overall patient health.

That means, Dr. Teare says, that the first report on the Surgical Initiative, due later this year, may not have data for even half the indicators they eventually hope to report on at regular intervals. And some of the information for the indicators they will "cobble together" will not be as robust as they would like. For instance, they will gather a fair bit of information about people who are referred for surgery, but there is almost as much to be learned from people who see a surgeon and don't have an operation: Should they not have been sent for a consultation at all? Should other treatments have been tried first? Did they not want to wait? There will be no way of knowing.

It also means the first report, perhaps the first few, will have a greater emphasis on what goes on in hospitals, but is likely to be much weaker on community care (a problem that should improve as greater attention is paid to systematic data collection in that sector and more doctors start using electronic medical records).

Ultimately, Dr. Teare hopes to have a set of indicators that will give everyone in Saskatchewan "a good line of sight from the work the providers are engaged with day to day and the goals the system is engaged with as a whole." Bonnie Brossart, chief executive officer of HQC, acknowledges that they are starting with a less-than-perfect information system: not all the information they would like is available, and some data are old by the time they are collected and released. But even this modest start can make a contribution to improving healthcare, she says.

"It's still information; it can inspire a conversation, but in the absence of the data the conversation doesn't happen, or it's bullied into directions that maybe there are no grounds for, but no one has any evidence for dealing with that."

Even imperfect measurement can be useful, Brossart says, because if people say it's incorrect, they have to be prepared to show how – which may encourage them to collect data themselves, or at least to define what they think is relevant and what would be useful to them.

"Don't let the perfect be the enemy of the good," she advises. "Yes, it's not timely, yes it's not perfect, but that's not entirely a bad thing because wherever it initiates a conversation with people who want to improve care, it's one more step in the right direction."

About the Author(s)

Jane Coutts is a healthcare writer based in Ottawa, Ontario. This article was commissioned by HQC.
