The authors review their 30 years' experience in determining the best research applications for routinely collected data from ministries of health, education and social services. They describe the rich research opportunities afforded by 40 years of health data – every patient contact with hospitals, physicians, drugs and more – tracing a path from the early difficulty of convincing an academic journal that meaningful findings could be drawn from information collected to pay bills and track patients, through studies on education (enrolment, grades, standardized tests for grades 1 to 12), family characteristics (residential moves, marital formation and breakdown, number and timing of births) and social services (welfare recipients, children taken into care, protection services offered to children in the family). They also detail how and why the Manitoba Centre for Health Policy was founded, and how it has continued through multiple changes of minister, deputy minister and government.

Being offered 4,500 words to reflect upon and review our professional lives is a privilege – and also a challenge. This offer gave us the opportunity to go back through files, datebooks and dusty folders, coming across long-lost memories and events.

Because creating a history of the Manitoba Centre for Health Policy (MCHP) and its work with administrative data is central to this exercise, we'll start there. The historical review identified a whole series of "highlights" – events that were particularly important, memorable and fun to recount. The review also prompted a focus on "lessons learned" – insights that may resonate with others trying to work in similar circumstances. The review also revealed a series of challenges that we faced and mostly survived. Finally, all this reflection underscores what remains to be done – the opportunities for the next generation of those working with administrative data and setting up policy centres.

Our History of Working with Administrative Data

Meeting Paul Henteleff, Assistant Executive Director, Health Services at the Manitoba Health Services Commission, who was responsible for the data section, and becoming aware of the remarkably rich, routinely collected electronic hospital and physician records for the Manitoba population, was unquestionably where everything started. We had recently arrived from the United States, where Noralou had conducted a series of interviews to determine the problems with a fractious, stalled merger of the Northwestern teaching hospitals. The potential for conducting research with anonymous data already collected on all past and present Manitobans was most attractive.

We started working with tonsillectomy both because it was a clinical issue of some interest to Paul Henteleff and the Manitoba Health Services Commission at that time, and because the American Academy of Pediatrics guidelines based their assessment of appropriateness on the number of episodes of respiratory illness a child had experienced. The number of times a child had been seen by a physician for such problems in the period before surgery could be counted using claims data. The first paper submitted for publication was rejected because reviewers doubted the validity of the diagnoses entered on physicians' claims. Fortunately, the next month the New England Journal of Medicine published a paper assessing the validity and reliability of clinical judgments (Koran 1975). Koran reported both intra-observer agreement (agreement of a physician with himself/herself regarding repeated observations) and inter-observer agreement (agreement of two or more physicians with one another). The same approach could be used for assessing the validity of diagnoses on physician claims. We compared the two diagnoses received by patients who had a respiratory diagnosis and who had a second physician visit with a respiratory diagnosis within one week following their first visit. Agreement rates on these two visits were almost as high as those Koran found in his research (Roos, Henteleff et al. 1977). Because some of the mismatches included such complaints as injuries or broken bones (which clearly could have reflected a new accurate diagnosis), reviewers were convinced that the administrative data were a valid research resource – and we had our first two major publications (Roos, Henteleff et al. 1977; Roos, Roos et al. 1977).
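
As a concrete illustration of this kind of agreement check, here is a minimal sketch in Python rather than the SAS we used. The field names are invented, and the sketch pairs any two consecutive visits rather than restricting to respiratory index visits as the original study did.

```python
# Minimal sketch: agreement between diagnoses on a visit and the next visit
# within a short window, in the spirit of Roos, Henteleff et al. (1977).
# Column names (patient_id, visit_date, dx_category) are illustrative only.
import pandas as pd

def revisit_agreement(claims: pd.DataFrame, window_days: int = 7) -> float:
    """Share of visit pairs (second visit within `window_days`) that carry
    the same diagnostic category."""
    claims = claims.sort_values(["patient_id", "visit_date"])
    # Pair each visit with the same patient's next visit.
    nxt = claims.groupby("patient_id").shift(-1)
    pairs = claims.assign(next_date=nxt["visit_date"], next_dx=nxt["dx_category"])
    pairs = pairs.dropna(subset=["next_date"])
    within = (pairs["next_date"] - pairs["visit_date"]).dt.days <= window_days
    pairs = pairs[within]
    return (pairs["dx_category"] == pairs["next_dx"]).mean()

claims = pd.DataFrame({
    "patient_id": [1, 1, 2, 2, 3],
    "visit_date": pd.to_datetime(
        ["1975-01-02", "1975-01-06", "1975-03-01", "1975-03-20", "1975-05-05"]),
    "dx_category": ["respiratory", "respiratory", "respiratory", "injury", "respiratory"],
})
print(f"Agreement within 7 days: {revisit_agreement(claims):.2f}")
```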

In the 1980s we continued with surgical procedures that are easy to study with administrative data, including cholecystectomy, hysterectomy, prostatectomy and hip replacement. There was little disagreement about whether a surgical procedure had occurred, and complications, particularly when they involved revisions (hip replacement) and re-operation (prostatectomy), were interesting and easy to track. The administrative data showed real strengths: revision rates for prostatectomy over an eight-year follow-up period were more than double the highest estimate available from the published literature (Wennberg et al. 1987). Several reasons were found for the discrepancies between our findings and those in the literature. Published studies sometimes lost patients to follow-up or used shorter follow-up periods. And when patients had complications following a procedure, they often didn't go back to the same surgeon and hence were missed in published data that focused on one group's or one hospital's patients.

A major innovation came when Les decided we could and should use the health system registration data to determine whether someone with no recorded complication following surgery was genuinely problem-free, or had simply left the province and been lost to follow-up. Noralou was miffed at the time because this meant completely reanalyzing a set of hysterectomy outcomes data – and it made little difference to the results. But the concept of the research registry, and our ability to track an individual's presence (or absence owing to moves or death) over long periods of time, greatly expanded the questions that could be addressed.
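
A minimal sketch of the registry idea, with invented dates and field names: coverage information lets a case with no recorded complication be classified as genuinely event-free or as censored because the person left the province (or died) before the follow-up window closed.

```python
# Minimal sketch: using registry coverage to handle loss to follow-up.
# All dates, field names and the two-year window are illustrative assumptions.
from dataclasses import dataclass
from datetime import date, timedelta
from typing import Optional

@dataclass
class Person:
    surgery_date: date
    coverage_end: Optional[date]   # None = still registered in the province
    complication: bool             # any complication recorded in the data

def classify(p: Person, follow_up: timedelta = timedelta(days=730),
             data_current_to: date = date(1985, 12, 31)) -> str:
    """Label a surgical case, distinguishing event-free from censored."""
    window_close = p.surgery_date + follow_up
    observed_until = min(data_current_to, p.coverage_end or data_current_to)
    if p.complication:
        return "complication"
    if observed_until >= window_close:
        return "event-free (fully observed)"
    return "censored (left province or died before window closed)"

cohort = [
    Person(date(1980, 3, 1), None, False),               # healthy, fully observed
    Person(date(1980, 6, 1), date(1981, 1, 15), False),  # moved away: censored
    Person(date(1981, 2, 1), None, True),                # complication recorded
]
for person in cohort:
    print(classify(person))
```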

Each of these clinical issues was approached working with a clinical specialist in the area – often the head of the department. We focused on hip replacement because an orthopaedic surgeon, David Lyttle, was concerned about quality-of-care issues and thought the administrative data could be used to look at re-operation rates. We found that 2.7% of patients required re-operation within two years and 4% of patients were readmitted to hospital with other surgical complications (Roos and Lyttle 1985). Our first non-surgical focus was on mortality rates associated with acute myocardial infarction (AMI), undertaken with a cardiologist, Andrew Morris, who was interested in standards of care in rural Manitoba (Morris et al. 1983). Concerned that out-of-hospital deaths might not be recorded in the Ministry of Health database with which we were working, we met with the local office of Vital Statistics and started a relationship resulting in the annual transfer of mortality data (including cause of death) from their offices to us at the university. This addition significantly improved the database, providing both a check on the registry and independent information on cause of death.

We also accomplished our first merger of administrative data with survey data – a representative cohort of elderly Manitobans interviewed as part of an initiative by the provincial gerontologist, Betty Havens (Mossey et al. 1981). This merger allowed us to address such important questions as, "Are those patients who are not accessing care, or not visiting physicians, individuals who are very sick and isolated and in need?" As we found using the survey data, the non-users in a universally funded healthcare system were basically healthy individuals (Shapiro and Roos 1985).

In 1984 the University of Manitoba and the Manitoba Health Services Commission exchanged letters, and the university accepted responsibility for housing the anonymized database. Over these early years we had little contact with the owners of the data, the Manitoba Health Services Commission, except with regard to new data requests. We spent little time with anyone in the Ministry of Health. Every three to five years when we wanted to add to or update our data sets, we would invite a prominent researcher (John Bunker came from Stanford University in 1986) to speak to key individuals in the ministry about how important the research was. Such advocacy helped us maintain access to the data. We also did little with the press. After one bad experience with a headline, whenever called by a reporter, Noralou would be as boring and dry and brief as possible.

All the work was supported by external funding agencies on topics that we, as researchers, identified as interesting and doable. By the mid-1980s our funding level had grown from $300,000 to $400,000 per year. We focused on publication in high-profile journals.

The Academic–Ministry Interface

Then, in 1988 we were asked to join Fraser Mustard's and Bob Evans's population health group at the Canadian Institute for Advanced Research (CIAR). This major change led to the founding of the Manitoba Centre for Health Policy (and Evaluation) in 1990. The CIAR group was struggling to develop a new approach to understanding the full range of determinants of health. The kinds of data and analysis being done in Manitoba were seen as both innovative and strategic.

The early years of the MCHP meant a steep learning curve. Noralou spent the first six months trying to determine who in the ministry used data for making decisions because she assumed we could provide whatever they needed. This simplistic view was quickly abandoned. She then thought that we would figure out the best set of "indicators" that could be developed from each type of data (hospitals, physicians, nursing homes), and we would then update these indicators annually. This step was important in deciding how to work with the data (and research on such indicators continues with each new data set). But it became clear that, if we were to be useful, we needed to focus on specific projects of interest to the most senior levels of the Ministry of Health. This, in turn, led to periodic meetings with the deputy minister and minister to discuss important issues facing the government, discussions that continue to this day and are central to maintaining the relevance of the MCHP's work.

A major driver of the government's agenda in the 1990s was the fiscal problems it and other provinces were facing. Manitoba was committed to cutting healthcare costs and particularly to closing hospital beds. Bed closures began in 1992; by 1996, 24% of the beds in Winnipeg hospitals had been closed. One of the early projects the MCHP took on was to monitor the effects of bed closures on mortality, readmission rates and access to care.

Our analyses showed that the system responded remarkably well. In the face of shortening lengths of stay and expanding outpatient surgery, we could detect no negative impacts on quality of care (Brownell et al. 2001). The minister wanted results as soon as possible, and we agreed to produce the first report on the first year of data available after closures began. (The MCHP typically receives the next year of data in the fall following the March 31 fiscal year-end.) With bed closures such a contentious issue (newspaper reports had predicted an increase in people dying in the streets), we were very concerned about the study's accuracy. We wrote up the first year's report (which found essentially no negative impact from the closures) and delivered it to the government, which planned to announce the results on a given date. One of our most uncomfortable periods was waiting for this announcement, because by then we had the second year of data. We knew the second year's results would come in only about three weeks before the announcement was due, and we pulled out all the stops to replicate our earlier analyses in case our early work was proven wrong. Fortunately, the early "no negative impact" findings held.

One of our analysts (Ron Wall), on a visit in 1992 to the Manitoba Health Services Commission, noticed the collection of forms and files on cost data routinely reported by Manitoba hospitals. He suggested that we could use the cost data, combined with case-mix data created from the hospital files, to estimate comparative costs per hospital across the system. When then-Minister of Health Donald Orchard was asked at an early meeting if he would be interested in our working on these calculations, he was keen. We emphasized that this was a major job, because we would first have to demonstrate that all the pieces could be validly measured.
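
The core of such a comparison is a cost per case-mix-weighted case. A minimal sketch follows; the figures, case-mix groups and weights are invented, and the method is a deliberate simplification of the costing work described here.

```python
# Minimal sketch of case-mix-adjusted cost comparison: divide each hospital's
# reported total cost by its weighted cases (cases x resource-intensity weight
# for each case-mix group). All numbers below are invented.
hospitals = {
    "Hospital 1": {"total_cost": 48_000_000,
                   "cases": {"medical": 9_000, "surgical": 3_000, "obstetric": 2_000}},
    "Hospital 2": {"total_cost": 61_000_000,
                   "cases": {"medical": 10_000, "surgical": 4_500, "obstetric": 2_500}},
}
# Resource-intensity weights by case-mix group (1.0 = an average case).
weights = {"medical": 0.9, "surgical": 1.6, "obstetric": 0.7}

for name, h in hospitals.items():
    weighted_cases = sum(n * weights[g] for g, n in h["cases"].items())
    print(f"{name}: ${h['total_cost'] / weighted_cases:,.0f} per weighted case")
```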

One of the first stages of the process compared lengths of stay for several types of patients at Winnipeg hospitals (Roos and Brownell 1994). Before the report was released, Manitoba's Deputy Minister of Health, Frank Maynard, called together the CEOs of the hospitals to whom we presented the results. One hospital was an outlier on several of the conditions, with its patients having the longest stays. When we got to one of the last groups of patients – those with a psychiatric diagnosis – the CEO of this "long-stay" hospital said (before he had seen the results), "Okay, I know we have a problem here; I have met with the head of psychiatry and we are trying to turn this around." In fact, his hospital had one of the shortest lengths of stay for these patients. This interaction was helpful in convincing the group that they really didn't know how their practice patterns compared with those of other hospitals. Our work, they somewhat grudgingly admitted, was potentially useful.

Our subsequent costing report focused attention on the high costs and high proportion of patients in Manitoba treated at teaching hospitals. As a result, teaching hospitals became a particular focus of bed cuts. Because we were not physicians, the MCHP – based in the Faculty of Medicine – was not always appreciated by other faculty members. At this time we were fortunate in having Nick Anthonisen as dean of medicine. While he didn't always agree with our conclusions, he respected our work. As part of the release of our most controversial reports, the dean convened a faculty forum where we were able to present our results and respond to questions and criticisms.

As academics, we could bring the research of others to the attention of the deputy minister of health. During this period of bed closures, one of the vice-presidents of a teaching hospital claimed that every patient at his hospital needed to be there; there was no room for early discharge. We knew that studies of the appropriateness of acute hospital care had been done in the United States and in another Canadian province, using physician-developed measures of acuity derived from medical records. We suggested to the deputy minister that if he was interested in having us assess acuity levels, and was willing to pay extra for the necessary abstractions from hospital records, questions of appropriate hospital use could be answered. He was interested, and the resulting study assessed 51% of the admissions and 67% of the Winnipeg hospital days used by adults with medical conditions as non-acute (inappropriate) (DeCoster et al. 1997).

While much of our early focus was driven by the government's cost-cutting agenda, and our results showing no negative effects were welcome, sometimes findings ran counter to government plans. The government committed early on to a strategy of redirecting rural patients, who occupied 20% of Winnipeg hospital beds, into less expensive rural hospitals. This decision seemed to make sense, as MCHP had shown how expensive care in the teaching hospitals was. In addition to saving money, such a policy appealed to the Progressive Conservative government's strong rural base. We were asked to determine which rural hospitals needed to be expanded to absorb the patients displaced by Winnipeg hospital closures. We were reluctant to take on this project, as we were fairly sure the answer would not be what was expected. We were right; our report showed essentially no rationale for expanding rural hospitals. However, we kept the minister and his deputy briefed on our early results. To give the minister, Don Orchard, particular credit (the hospital in his constituency would have benefited from this expansion), he understood the results and gave us the opportunity to brief caucus members and explain these unwanted findings.

How we thought about what needed to be done was much influenced by our association with the CIAR Population Health Group. This group changed the dialogue across Canada, focusing on the role played by socio-economic status and education as key determinants of health. Our research – showing the relatively high proportion of non-acute care patients in Winnipeg hospitals, the variations in surgical rates across the province and the inefficiencies in the current system – correspondingly helped to reorient thinking about the healthcare sector. In joining these two strands together, our work on acuity also showed that despite their higher rates of hospital use, hospital stays for patients who lived in the poorest neighbourhoods did not represent "social admissions," as some suggested; patients admitted from the poorest areas had acuity levels at least as high as those of individuals from the wealthier areas.

Key Insights Gained Over the Years

Organize the data infrastructure

Les had the insight to insist on several strategies:

  1. Force the use of one programming language throughout the operation – in our case, SAS. While there were a few problems and cases were made for adding other capabilities, this policy undoubtedly gave us a more efficient and more unified programming structure. New features have allowed SAS to be used in MCHP's Remote Access Sites (secure terminals located elsewhere at the University of Manitoba).
  2. Write generalized, probabilistic record linkage software using SAS. Having our own software provided a flexible capability for putting various files together and allowed needed additions to the program to be made in a timely manner; a minimal sketch of the underlying idea follows this list. A more recent version of the software is being used at several sites (including the Manitoba Ministry of Health), both nationally and internationally.
  3. Force a centralized documentation system of code and definitions describing how concepts are operationalized. Our concept dictionary and research resources were made Internet-accessible in the 1990s.
  4. Develop a population registry to provide a flexible way of generating population denominators. We could then understand not just who was getting health services, but who was not; not just who was enrolled in grade 12 and wrote exams, but who should have been writing them (i.e., those who had remained in Manitoba since birth and weren't writing because they had dropped out of school or had been held back) (Roos et al. 2010).
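
A minimal sketch of the idea behind such linkage software – a textbook Fellegi–Sunter comparison in Python, not the MCHP's SAS implementation; the comparison fields and the m- and u-probabilities are invented.

```python
# Minimal sketch of probabilistic record linkage weights.
# m = P(field agrees | records truly match); u = P(field agrees | non-match).
import math

FIELDS = {
    "surname":    {"m": 0.95, "u": 0.01},
    "birth_year": {"m": 0.98, "u": 0.05},
    "sex":        {"m": 0.99, "u": 0.50},
}

def link_weight(rec_a: dict, rec_b: dict) -> float:
    """Sum of log2 agreement/disagreement weights across comparison fields."""
    total = 0.0
    for field, p in FIELDS.items():
        if rec_a.get(field) == rec_b.get(field):
            total += math.log2(p["m"] / p["u"])
        else:
            total += math.log2((1 - p["m"]) / (1 - p["u"]))
    return total

a = {"surname": "SMITH", "birth_year": 1952, "sex": "F"}
b = {"surname": "SMITH", "birth_year": 1952, "sex": "F"}
c = {"surname": "SMYTHE", "birth_year": 1953, "sex": "F"}

print(link_weight(a, b))   # high weight -> likely the same person
print(link_weight(a, c))   # low weight  -> likely different people
# In practice, pairs above an upper threshold are accepted as links,
# pairs below a lower threshold rejected, and those in between reviewed.
```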

Ask for all the data

When we were first working with ministry staff to identify which data would be transferred to the university, staff asked us to specify which fields we wanted. Since we didn't know which fields they had, and record layouts were considered confidential, this was a painful process. Eventually we realized that blank hospital discharge forms and physician claim forms would provide information on the fields potentially available. Our current approach when acquiring new files is to ask for all valid data fields (other than names and street addresses); fields once thought to be irrelevant have often proved essential for a project. These rich data files form the backbone of the repository and can be accessed when the next study is designed.

Organize areas by health or socio-economic status

Typically, reports present data on areas organized geographically or alphabetically, to make specific areas easier to find. We started early on by ordering the areas from best to worst according to the health status of area residents (using the premature mortality ratio – the rate of deaths occurring before age 75). More recently, when data on outcomes for high-risk children are presented, findings are organized by the socio-economic status of the areas. We have found that multiple indicators of health and socio-economic status produce similar area rankings. (The Spearman correlation between our routinely used health measure – the premature mortality ratio – and our routinely used socio-economic factor index was 0.91, p<0.0001, across 80 areas.) Such organization keeps the focus on whether access to care reflects the needs of area residents and on how strongly socio-economic status is related to health and educational outcomes.
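
A minimal sketch, with invented figures, of this ordering and of the rank correlation reported above (the MCHP's premature mortality ratio is age- and sex-adjusted; the crude rate per 1,000 residents used here is a simplification).

```python
# Minimal sketch: rank areas by premature mortality and check how strongly the
# ranking tracks a socio-economic index. All figures are invented.
import pandas as pd
from scipy.stats import spearmanr

areas = pd.DataFrame({
    "area": ["A", "B", "C", "D", "E"],
    "premature_deaths": [120, 210, 95, 300, 150],   # deaths before age 75
    "population": [40_000, 42_000, 38_000, 41_000, 39_000],
    "ses_index": [0.8, -0.2, 1.1, -1.0, 0.3],        # higher = better off
})
areas["pmr"] = 1_000 * areas["premature_deaths"] / areas["population"]

# Present areas from best (lowest premature mortality) to worst,
# rather than alphabetically.
print(areas.sort_values("pmr")[["area", "pmr", "ses_index"]])

rho, p = spearmanr(areas["pmr"], areas["ses_index"])
print(f"Spearman rho = {rho:.2f} (p = {p:.3f})")
```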

Our most compelling evidence of the relationship between socio-economic status and health came after seeing Manton's (1991) US study estimating the life expectancy that could potentially be gained by eliminating all types of cancer – 2.8 years. At the time we were producing results (Roos et al. 2004) showing that the potential gains for residents of the lowest-income neighbourhoods, if they could achieve the health status of residents of high-income neighbourhoods, would be several times as large: 11.3 years for males and 7.7 years for females.

Data (if they are seen and understood) can change the dialogue (a bit)

Originally, the government had pledged that all care displaced by closing hospital beds would be replaced – that equivalent care would be found for these patients elsewhere in the system. With our work on acuity levels in different hospitals, on unexplainable differences in lengths of stay and on the variations in surgical rates across different areas of the province, the dialogue began to focus on the potential for making the acute care sector more efficient.

However, we have been less successful in attempts to review quality of care. A great deal of outcomes research, often resulting in hospital or physician "report cards," has been published (Robinowitz and Dudley 2006). We did early work evaluating surgical outcomes in Manitoba compared with other jurisdictions (Roos et al. 1990, 1992). Overall findings were highly favourable to Manitoba surgeons and hospitals, but the outcomes of one particular procedure performed in Manitoba hospitals (repair of hip fracture in the years 1979–1992) were worse than those in New England (Roos et al. 1996). Engaging the specialists involved in discussion of these data, or in the routine monitoring of such outcomes, proved impossible. Another project focusing on developing quality-of-care indicators for hospital comparisons proved divisive and raised serious issues with local physicians (Bruce et al. 2006). The most recent projects to succeed in monitoring quality of care have been led by a physician (Katz et al. 2006). Physician leadership may be a prerequisite for influencing the local system on quality concerns.

Interaction between academics and policy makers is a complex process

Policy making is clearly more complex than academics realize. We have been effective when evidence could be used in support of shifting the agenda or implementing policies that someone in the bureaucracy had already been working on. Often the evidence does not come as a surprise, but our making it public makes it harder to ignore; the evidence can then potentially be used to galvanize action.

Those in the Ministry of Health did not always welcome what we considered compelling data on system inefficiencies, information that we somewhat naively thought should have made their job easier. Evidence-based decision-making may require more political will than decision-making without such information. The real world is also complicated. While data may suggest that a given hospital discharges patients rapidly – a practice that would appear to be more efficient – we were also providing data showing that hospitals sometimes admit patients who do not really require acute care.

Often academics will work with information in ways that the data owners could not or would not have considered. To calculate an outcomes score across all students taking language arts exams, we combined the scores from similar tests given in different settings (to students in French immersion, those in French-language schools and those in English-language schools). This had not been done before. We could also use the health data (hospital and registry) to develop birth cohorts, ascertaining the enrolment status of each cohort member who did not take a language arts test even though he or she was still in the province and – had he or she stayed in school and continued with the cohort – should have been writing it. We understood why the ministry did not do this routinely; the ministry, in turn, could see the value of our efforts to develop population-based measures of educational achievement.
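
A minimal sketch of both steps, using hypothetical column names rather than the actual education file layout: scores from different test versions are standardized so they can be pooled, and the registry-based birth cohort serves as the denominator so that cohort members who wrote no test remain visible.

```python
# Minimal sketch: (1) put language-arts scores from different settings on a
# common scale by standardizing within each test version; (2) left-join onto a
# registry-based birth cohort so non-writers stay in the denominator.
# All column names and values are invented.
import pandas as pd

cohort = pd.DataFrame({"student_id": [1, 2, 3, 4, 5]})   # birth cohort from registry

scores = pd.DataFrame({
    "student_id": [1, 2, 3, 4],
    "test_version": ["english", "english", "french_imm", "francais"],
    "raw_score": [62.0, 75.0, 70.0, 58.0],
})

# Standardize within each test version so versions can be pooled.
scores["z"] = scores.groupby("test_version")["raw_score"].transform(
    lambda s: (s - s.mean()) / s.std(ddof=0) if s.std(ddof=0) > 0 else s * 0.0
)

# Student 5 has no test record but remains visible in the cohort.
outcomes = cohort.merge(scores[["student_id", "z"]], on="student_id", how="left")
outcomes["wrote_test"] = outcomes["z"].notna()
print(outcomes)
print(f"Share of cohort writing the test: {outcomes['wrote_test'].mean():.0%}")
```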

Ground rules for academics working with government

On the advice of university lawyers, our first contract stated that we would have the right to publish our findings. We also agreed to take on projects that the government requested but – after an early meeting with the deputy when he made two small requests with which Noralou was uncomfortable – agreed we would not "fire-fight" (do quick studies on government request). During the early years, Noralou would always take Evelyn Shapiro (who established the provincial home care program) or Brian Postl (the head of Community Health Sciences when the MCHP was established) to meetings with the minister and his deputy. Both were highly experienced in political interactions. Noralou wanted someone there who could help sort out what we could or should, as well as what we couldn't or shouldn't, do. She was also fortunate to have a very helpful government-based liaison, Tom McCormack. He advised us on when we needed to get the deputy minister involved in dealing with the bureaucracy, when asking for a "comfort" letter or memorandum of understanding was appropriate, and so forth.

Working across ministries

Working with the CIAR Population Health Group and focusing on the broader determinants of health led us to successfully seek Canadian Foundation for Innovation funding to bring education and social services databases into the repository. Work across the databases of different ministries creates high-payoff research insights. In conjunction with Marni Brownell, we have been able to understand the "overlap" across ministries in the high-risk populations they serve: how many teen mothers have parents receiving income assistance; how many were at one point in "protection" or in care of Family Services (Roos et al. 2010). As we started working with education and Family Services data, we had a breakfast meeting with the deputies from the various ministries before our MCHP Advisory Board meeting both to report back to them on progress and to gain insight into their priorities.

Timelines will likely be long

We completed our first project working with Family Services information in the mid-1990s but didn't do a second, or achieve the regular transfer of Family Services data to the MCHP, for another 10 years.

Challenges will occur

The province implemented an electronic system for tracking all out-of-hospital prescription drug purchases on a patient-specific basis. Noralou was a member of the committee setting this up, and joined others in arguing for not allowing the system to be owned and run by a private company. When the lead member of this system left government to work in the private sector for a company expecting to receive the prescription data, he wrote the deputy minister warning that the MCHP's security was lax and that there were breaches of confidentiality. Subsequently, the ministry and auditors conducted a six-month review of the MCHP's operations; we came out squeaky clean, and prescription drug data were eventually transferred to the MCHP.

When the Canadian Institute for Health Information (CIHI) was established in 1994, the new director met with the deputy minister, suggesting that the MCHP was no longer needed; the ministry should just cancel our contract and depend on CIHI for needed information. Fortunately our deputy minister at the time, John Wade, appreciated the strengths of the MCHP and said "thank you, no."

Working with the press

Most academics avoid the press. However, from the time the MCHP was founded, several of us decided we needed to work with the media to ensure that our research was accurately communicated to the public. We hired a media consultant to coach our authors in dealing with media representatives (for example, always have a series of sound bites ready to get the key messages across). Because the government seemed to pay attention to us (several of the first reports were released by the deputy or minister with us in attendance at the Legislature), the press paid attention to our reports. And because the press gave extensive coverage to these reports, stakeholders (hospitals, physicians) were also forced to look closely at our analyses and respond to them. The press and stakeholder response in turn influenced how the government saw us. We have subsequently tried, and changed, many approaches – press conference versus none, op-ed pieces when reports were released versus none – but working with the media to communicate key findings has remained a priority.

Locating a centre with external "deliverable" responsibilities

The government was straightforward when setting up the MCHP: the funding came with the stipulation that payment was being made for services to be delivered – hence the concept of the "deliverable" projects agreed upon every year – something quite different from typical "curiosity-driven" investigator-initiated research. While initially we worked only with the Manitoba Ministry of Health, with the expansion of the repository several ministries became involved; currently, five deputy ministers serve on the MCHP Advisory Board. The MCHP is located in the Department of Community Health Sciences within the Faculty of Medicine. This arrangement has worked well when the department head and the dean are "hands off" and appreciate the unconventional academic role of the director. Over the last 20 years, MCHP Advisory Board members have sometimes questioned the reporting lines, suggesting that the director should be independent of an individual faculty and should instead report through the vice-president of research. We never pursued this suggestion. However, it has become clear in recent years that a centre such as ours must have support at the highest levels of the university, both for research that is multidisciplinary (not strictly healthcare-focused, somewhat unusual for a centre sited in a medical faculty) and for the director's role, which involves different demands than academics typically face.

National and international linkage

Linkages with researchers outside Manitoba have been attempted in several ways. We have frequently brought in speakers from elsewhere in Canada and hosted visitors (sometimes for up to a year) from other universities and centres. Arrangements have been made for timely reports to support such national efforts as the Romanow Commission (On the Future of Health Care in Canada). Work using the data repository with Canadian and US investigators outside Manitoba has been particularly successful; the analysis has been done on-site at the MCHP with supervision by a local researcher. Such work has led to some of our most highly cited and policy-relevant studies (Fedson et al. 1993; Forget et al. 2002; Romano et al. 1993).

Studies involving joint or cooperative data analysis across two or more provinces have had mixed success. The hurdles have included inadequate documentation of variables and differing provincial rules regarding data access and handling (Kephart 2002). Relatively simple comparative analyses using Ontario and Manitoba data have been successful (Tu et al. 2001).

Projects with Statistics Canada have suffered from "midstream" policy changes promulgated in response to federal issues. Nonetheless, important papers have resulted from such collaboration, and the potential remains great (Mustard et al. 1999).

Challenges for the Future

Introducing new perspectives

The possibilities for tracking health and other histories of family members across now 40 years of repository data have led to several proposals by various researchers. New data sets, such as those on housing and justice, offer unique opportunities.

Routinizing evaluation and using administrative data to track outcomes

We were big fans of the work of Duncan Neuhauser (1991, 1992), described years ago at Case Western Hospital, where patients entering the hospital were routinely randomized to different wards in order to compare patient outcomes and assess the effect of different staffing levels, treatments, medication approaches and more. We always thought we could and should implement something similar in the Winnipeg hospitals, using the administrative data to track outcomes; however, this has never been achieved.
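
A minimal sketch of what such a "firms" design could look like, using simulated admissions and an invented readmission outcome (nothing here reflects an actual Winnipeg implementation, which, as noted, was never achieved).

```python
# Minimal sketch: admissions randomly assigned to parallel wards, with routinely
# collected outcomes (here, a simulated readmission flag) compared across wards.
import random
from collections import defaultdict

random.seed(42)
wards = ["ward_A", "ward_B"]

admissions = []
for patient_id in range(1, 401):
    ward = random.choice(wards)          # randomization at the point of admission
    readmitted = random.random() < (0.12 if ward == "ward_A" else 0.15)
    admissions.append((patient_id, ward, readmitted))

counts = defaultdict(lambda: [0, 0])     # ward -> [readmissions, admissions]
for _, ward, readmitted in admissions:
    counts[ward][0] += int(readmitted)
    counts[ward][1] += 1

for ward, (readmits, n) in sorted(counts.items()):
    print(f"{ward}: {readmits}/{n} readmitted ({readmits / n:.1%})")
```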

Informing the public

We were told by deputy ministers that they "got" our messages, but our real challenge was to shape the public view. There is not yet sufficient public understanding – whether of the lack of relationship between healthcare spending and health, of the inefficiencies in the healthcare system or of the long-term pay-offs from investing in high-risk children – to permit government to use our findings to set a political agenda. Academics can say things that those in government cannot. Several of us have recently received research support to try working with the media on exactly these issues.

Another suggestion has been that we need an updated version of a Misery Index (an old idea in economics that combines the inflation rate with the unemployment rate), or, more kindly, a Social Deficit or Social Balance Sheet, to monitor progress. Gauging society's success by growth in GDP alone needs to be challenged in informed ways. Ongoing work with the United Way and the City of Winnipeg has the potential to provide data to support efforts at improving society.

We look forward to working on some of these issues and contributing what we can to others. It has been fun!


About the Authors

Noralou P. Roos, PhD, Senior Research Scientist, Manitoba Centre for Health Policy, University of Manitoba, Winnipeg, MB

Leslie L. Roos, PhD, Distinguished Professor, Senior Research Scientist, Manitoba Centre for Health Policy, University of Manitoba, Winnipeg, MB


References

Brownell, M.D., N.P. Roos and L.L. Roos. 2001. "Monitoring Health Reform: A Report Card Approach." Social Science & Medicine 52(5): 657–70.

Bruce, S., H. Prior, A. Katz, M. Taylor, S. Latosinsky, P. Martens, C. DeCoster, M. Brownell, R. Soodeen and C. Steinbach. 2006. Application of Patient Safety Indicators in Manitoba: A First Look. Winnipeg: Manitoba Centre for Health Policy.

DeCoster, C., N. Roos, K.C. Carriere and S. Peterson. 1997. "Inappropriate Hospital Use by Patients Receiving Care for Medical Conditions: Targeting Utilization Review." Canadian Medical Association Journal 157(7): 889–96.

Fedson, D.S., A. Wajda, J.P. Nicol, G.W. Hammond, D.L. Kaiser and L.L. Roos. 1993. "Clinical Effectiveness of Influenza Vaccination in Manitoba in 1982–1983 and 1985–1986." Journal of the American Medical Association 270(16): 1956–61.

Forget, E., R.B. Deber and L.L. Roos. 2002. "Medical Savings Accounts: Will They Reduce Costs?" Canadian Medical Association Journal 167(2): 143–47.

Katz, A., R. Soodeen, C. DeCoster, B. Bogdanovic and D. Chateau. 2006. "Can the Quality of Care in Family Practice Be Measured Using Administrative Data?" Health Services Research 41(6): 2238–54.

Kephart, G. 2002. Barriers to Accessing and Analyzing Health Information in Canada. Ottawa: Canadian Institute for Health Information.

Koran, L.M. 1975. "The Reliability of Clinical Methods, Data and Judgments." New England Journal of Medicine 293(13): 642–46.

Manton, K. 1991. "The Dynamics of Population Aging: Demography and Policy Analysis." Milbank Quarterly 69(2): 309–38.

Morris, A.L., V. Nernberg, N.P. Roos, P.D. Henteleff and L.L. Roos. 1983. "Acute Myocardial Infarction: Urban and Rural Mortality." American Heart Journal 105: 44–53.

Mossey, J.M., B.J. Havens, N.P. Roos and E. Shapiro. 1981. "The Manitoba Longitudinal Study on Aging: Description and Methods." Gerontologist 21(5): 551–58.

Mustard, C.A., S. Derksen, J.-M. Berthelot and M.C. Wolfson. 1999. "Assessing Ecologic Proxies for Household Income: A Comparison of Household and Neighbourhood-Level Income Measures in the Study of Population Health Status." Health & Place 5(2): 157–71.

Neuhauser, D. 1991. "Parallel Providers, Ongoing Randomization and Continuous Improvement." Medical Care 29(7 Suppl.): js5–js8.

Neuhauser, D. 1992. "Progress on Firms Research." International Journal of Technology Assessment in Health Care 8(2): 321–24.

Robinowitz, D.L. and R.A. Dudley. 2006. "Public Reporting of Provider Performance. Can Its Impact Be Made Greater?" Annual Review of Public Health 27: 517–36.

Romano, P.S., L.L. Roos and J.G. Jollis. 1993. "Adapting a Clinical Comorbidity Index for Use with ICD-9-CM Administrative Data: Differing Perspectives." Journal of Clinical Epidemiology 46(10): 1075–79.

Roos, N.P. and M.D. Brownell. 1994. "Introducing Data into the Health Policy Process: Developing a Report on the Efficiency of Bed Use in Manitoba." Healthcare Management Forum 7(2): 46–50.

Roos, L.L., E.S. Fisher, R. Brazauskas, S.M. Sharp and E. Shapiro. 1992. "Health and Surgical Outcomes in Canada and the United States." Health Affairs (Millwood) 11(2): 56–72.

Roos, L.L., E.S. Fisher, S.M. Sharp, J.P. Newhouse, G.M. Anderson and T.A. Bubolz. 1990. "Postsurgical Mortality in Manitoba and New England." Journal of the American Medical Association 263(18): 2453–58.

Roos, L.L., R.K. Walld, P.S. Romano and S. Roberecki. 1996. "Short-Term Mortality After Repair of Hip Fracture. Do Manitoba Elderly Do Worse?" Medical Care 34(4): 310–26.

Roos, N.P., P.D. Henteleff and L.L. Roos Jr. 1977. "A New Audit Procedure Applied to an Old Question: Is the Frequency of T&A Justified?" Medical Care 15(1): 1–18.

Roos, N.P. and D. Lyttle. 1985. "Hip Arthroplasty Surgery in Manitoba: 1973–1978." Clinical Orthopaedics 199: 248–55.

Roos, N.P., L.L. Roos, M. Brownell and E.L. Fuller. 2010. "Enhancing Policy Makers' Understanding of Disparities: Relevant Data from an Information-Rich Environment." Milbank Quarterly 88(3): 382–403.

Roos, N.P., L.L. Roos and P.D. Henteleff. 1977. "Elective Surgical Rates: Do High Rates Mean Lower Surgical Standards?" New England Journal of Medicine 297(7): 360–65.

Roos, N.P., K. Sullivan, R. Walld and L. MacWilliam. 2004. "Potential Savings from Reducing Inequalities in Health." Canadian Journal of Public Health 95(6): 460–64.

Shapiro, E. and N.P. Roos. 1985. "Elderly Nonusers of Health Care Services. Their Characteristics and Their Health Outcomes." Medical Care 23(3): 247–57.

Tu, J.V., P.C. Austin, R. Walld, L. Roos, J. Agras and K.M. McDonald. 2001. "Development and Validation of the Ontario Acute Myocardial Infarction Mortality Prediction Rules." Journal of the American College of Cardiology 37(4): 992–97.

Wennberg, J.E., N. Roos, L. Sola, A. Schori and R. Jaffe. 1987. "Use of Claims Data Systems to Evaluate Health Care Outcomes. Mortality and Re-operation Following Prostatectomy." Journal of the American Medical Association 257(7): 933–36.