Healthcare Quarterly

Healthcare Quarterly 12(Sp) August 2009: 161-167. doi:10.12927/hcq.2009.20985
Creating Safety Learning Systems

TOCSIN: A Proposed Dashboard of Indicators to Control Healthcare-Associated Infections

Régis Blais, François Champagne and Louise Rousseau

Abstract

Healthcare-associated infections (HAIs) constitute a major safety problem. Healthcare managers need complete and valid information to fight these infections. The purpose of this study was to develop a dashboard of indicators to help healthcare managers monitor HAIs. A pilot testing approach was used, composed of the following steps: literature review, consultation with infection control experts and healthcare managers, operationalization of selected indicators, data collection from six Quebec hospital complexes to test the feasibility of the selected indicators and dissemination of results. The literature review identified 299 possible indicators. After consulting infection control experts and healthcare managers and collecting data in the hospitals, a proposed dashboard was created that includes 97 indicators divided into three categories (structure, process and outcome) and grouped into 22 themes. The proposed indicators are both scientifically valid and administratively feasible. However, many healthcare facilities need additional financial resources and expertise to measure these indicators and manage the information they will generate.

Healthcare-associated infections (HAIs), previously called nosocomial infections, constitute a serious patient safety problem. Traditionally, HAIs have been defined as infections acquired during hospitalization. Today, however, it is recognized that infections can be contracted in all settings where healthcare is delivered (Archibald and Hierholzer 2004). HAIs have always existed, particularly in the hospital environment, but the highly publicized outbreaks of severe acute respiratory syndrome (SARS) and Clostridium difficile in recent years in Canada, along with the growth of antibiotic-resistant bacteria, have raised concern about the problem and underlined the need for healthcare authorities to react vigorously.

Some studies have shown that HAIs rank second among avoidable adverse events, right after adverse drug events (Kohn et al. 1999; Leape et al. 1991). The infections that occur most frequently are urinary infections, nosocomial pneumonia, surgical site infections and primary nosocomial bacteremia. HAIs affect about 5-10% of hospitalized patients, depending on the level of care and characteristics of the clientele, and this rate appears to be growing (Gourdeau et al. 2005). The death rate attributable to these infections is estimated to be between 1% and 10%, depending on the type of infection; furthermore, these infections extend hospital stays by about four days (Centers for Disease Control 1992; Haley et al. 1985; Jarvis 1996). Studies conducted in the United States have shown that HAIs are very expensive for the healthcare system and that such costs vary depending on the infection site (Nettleman 2003). Based on projections by the Quebec nosocomial infections committee, it has been estimated that HAIs cost about $180 million and require an average of 1,200 beds annually in the province of Quebec alone (Comité sur les infections nosocomiales du Québec 2004).

It has been proposed that if appropriate actions are taken (e.g., the presence of a nurse and a physician with expertise in infection prevention, a structured program for infection prevention including a monitoring component, disclosure to surgeons of rates of surgical site infections), approximately one third of HAIs can be avoided (Haley et al. 1985; Mayhall 2004). Effective clinical and administrative measures exist to fight HAIs, but each healthcare facility should adapt them to its mission, size, clientele and environment (Mayhall 2004; Ranji et al. 2007; World Health Organization [WHO] 2002). To monitor their fight against HAIs, it has been recommended that healthcare facilities maintain a scorecard including appropriate indicators, some of which may be standardized for comparative purposes while others will remain specific to each facility (Comité d'examen sur la prévention et le contrôle des infections nosocomiales 2005).

Various organizations have proposed indicators that should be measured when dealing with HAIs (Canadian Council on Health Services Accreditation [CCHSA] 2007; Comité d'examen sur la prévention et le contrôle des infections nosocomiales 2005; Joint Commission on Accreditation of Healthcare Organizations [JCAHO] 1998; WHO 2002), and many healthcare facilities may have access to some ad hoc or sporadic data on infections. However, these indicators are partial and do not cover the full range of issues that need to be monitored. Furthermore, the proposed indicators have not always been operationally defined, nor have they all been organized into a scorecard or tested in the field. To our knowledge, there is no valid scorecard or tool available for local decision-makers that will allow them to monitor all relevant indicators and effectively fight against HAIs. Some scorecard or monitoring systems have been developed to follow up on HAIs and related factors. Interesting tools have been developed in the United States (Kahn et al. 1995), France (Institut de Veille Sanitaire 2004) and Great Britain (Comptroller and Auditor General 2000), for example, but they are incomplete and do not suit the Canadian healthcare context or the needs of various facilities. Furthermore, some facilities have developed their own in-house systems, but these are incomplete and not standardized.

The prime responsibility for fighting against HAIs rests with local healthcare administrations. Once decision-makers are provided with full and valid information, they will be in a position to make better decisions based on facts and to adjust their HAI-control activities to their context. To take the most appropriate actions in their respective environments, local decision-makers need specific and up-to-date information concerning the full range of infection issues. At present, decision-makers do not have a tool or scorecard that allows them to effectively monitor all relevant indicators in the fight against HAIs. The objective of our project, called TOCSIN, was to fill this gap by developing a dashboard of HAI indicators in response to the needs of local managers working in various types of healthcare facilities (teaching hospitals, regional hospitals, health and social service centres and long-term care centres). TOCSIN is an acronym for the French expression "Tableau organisationnel de contrôle et de suivi des infections nosocomiales," which translates to "organizational dashboard for controlling and monitoring nosocomial infections." The word tocsin in French also means "alarm bell": it is hoped that the proposed indicators will alert the persons in charge to situations that need action.

Methods

The project was developmental research. It followed a general pilot testing approach by which field workers were regularly consulted on the research products, which were adjusted as required. All standard rules of ethics were respected. The project was approved by the Research Ethics Committee of the Faculty of Medicine of the University of Montreal.

Below is a description of the project's six steps: literature review, consultations with experts, consultations with managers, operationalization of the indicators, data collection and analysis of selected indicators and dissemination of the results to decision-makers.

Literature Review

An exhaustive review of national and international literature allowed us to gain a good understanding of the variables that contribute to HAIs and the measures needed to fight such infections. Based on this review, 299 potential indicators were identified, representing the different variables that must be taken into consideration to control and monitor HAIs. Following Donabedian (1966), indicators were then classified into three main categories: structure, process and outcome.

Consultations with Experts

The list of 299 indicators was submitted to an expert panel composed of four physicians specializing in microbiology and infectious diseases and two nurses working in the field of infection control and prevention. Experts individually assessed each indicator. At the end of this step, the list was reduced to 95 indicators which, according to the majority of the experts consulted, appeared to be the most scientifically valid.

Consultations with Managers

The list of indicators selected in the previous step was then submitted to administrative and clinical managers with different levels of authority (e.g., chief executive officers, directors of professional/medical services, managers in charge of quality and nurses in charge of infection control) who are responsible for the fight against HAIs in four pilot healthcare facilities in Quebec (one teaching hospital, one hospital affiliated with a university and two health and social services centres, both of which included a hospital and one or more long-term care facilities). The indicators were also submitted to two infection control program managers at the Institut National de Santé Publique du Québec (Quebec Public Health Institute). We met with two to seven persons from each of these organizations in small group meetings. The purpose of the meetings was to verify with workers in the field how useful and feasible it would be to measure the proposed indicators. From their feedback, we were able to adjust and reformulate the indicators.

Operationalization of the Indicators

The step of indicator operationalization consisted of clearly defining how each indicator was to be measured (numerator, denominator, source, etc.), articulating the logic that justifies its use and developing a guide for the users (Champagne et al. 2005). This step was based on evidence from the literature and on consultation with the above-mentioned experts.
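As an illustration, the operational definition described above can be thought of as a small structured record per indicator. The following Python sketch is purely illustrative: the field names, the example values and the rate function are our own assumptions, not part of the TOCSIN specification.

```python
from dataclasses import dataclass

@dataclass
class IndicatorDefinition:
    """Operational definition of a TOCSIN-style indicator (illustrative only)."""
    indicator_id: str   # e.g., "R1.1" (outcome indicator in Table 1)
    description: str    # what the indicator measures
    numerator: str      # event being counted
    denominator: str    # population or exposure at risk
    data_source: str    # where the data come from
    rationale: str      # logic justifying the indicator's use

# Hypothetical example loosely based on indicator R1.1 in Table 1
r1_1 = IndicatorDefinition(
    indicator_id="R1.1",
    description="Bacteremia related to central catheters in intensive care",
    numerator="Central-catheter-related bacteremia cases in the ICU",
    denominator="Central-catheter days in the ICU",
    data_source="Laboratory and ICU records",
    rationale="Catheter-related bacteremia is a frequent, preventable HAI",
)

def rate_per_1000(events: int, exposure_days: int) -> float:
    """Conventional rate expression: events per 1,000 exposure days."""
    return 1000 * events / exposure_days

print(rate_per_1000(events=3, exposure_days=1500))  # 2.0 per 1,000 catheter days
```

A record of this kind makes explicit, for each indicator, exactly what must be counted, over what population and from which source, which is the substance of the user guide developed at this step.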

Data Collection and Analysis of Selected Indicators

The purpose of data collection and indicator analysis was to verify the feasibility of measuring the selected indicators. Indicator-specific data were collected from six healthcare facilities in Quebec (the four pilot centres mentioned above and two additional health and social services centres, both of which included a hospital and at least one long-term care facility). Two nurses trained in infection prevention and control were hired; they visited the facilities to collect data from key stakeholders who had, or were able to get, the needed information. Data were compiled and analyzed across all participating facilities. These data were used to determine the current capacity of facilities to measure selected indicators and to document the difficulties encountered or anticipated in the use of these indicators. As a result, minor adjustments were made to the list of indicators. This step also made it possible to draw conclusions about how to use the selected indicators.

Dissemination of Results to Decision-Makers

The list of selected indicators and results specific to each facility were presented in person to participating health managers and professionals, with the exception of one facility that was not available for the meeting but that received the documents by e-mail. These meetings provided opportunities for collecting managers' and professionals' reactions to the proposed indicators and preferences for using them.

Results

Proposed Indicators

At the end of the six-step process, 97 indicators were selected and divided into three categories (structure, process and outcome). Structure-related indicators are used to verify whether the organizational structure of the facility, including its financial, human and information resources, is compatible with standards set out by government authorities or other bodies responsible for the control and prevention of HAIs. Process-related indicators are employed to measure infection control and prevention activities currently used within the facility. Outcome-related indicators are used to track HAIs and provide an overall picture of trends that result from infection control and prevention actions. Within each of the three categories, indicators are then grouped by theme (n = 22).

Table 1 provides a list of the selected indicators. The 22 themes are presented in boldface. The operational definition of the indicators and the references that support the indicators are available from the first author (R.B.).


 

Table 1. Indicators selected under TOCSIN*

Structure-Related Indicators

S1. Facility-level project to fight HAIs
S1.1a ICP program approved by BOD or advisory committee to the CEO
S1.1b Annual objectives defined
S1.2a Annual action plan
S1.2b Date of plan adoption
S1.2c Dedicated budget

S2. Infection Prevention Committee
S2.1 Infection Prevention Committee
S2.2 Multidisciplinary/composition
S2.3 Number of meetings/year
S2.4 Member participation
S2.5a Mechanism to inform BOD
S2.5b What mechanism

S3. ICP operational team
S3.1 Physician(s)
S3.2 Pharmacist
S3.3 Head nurse
S3.4 Regular nursing staff
S3.5 Secretary
S3.6 IT staff
S3.7 Other regular staff
S3.8 Access to laboratory data
S3.9 Access to imaging data
S3.10 Access to pharmacy data
S3.11 Local monitoring software
S3.12 Provincial monitoring software
S3.13 Admission, discharge, transfer software
S3.14a Participation in ICP network
S3.14b At what level (regional, other)

S4. New product purchasing committee
S4.1a Purchasing committee
S4.1b Minutes of committee meetings
S4.1c ICP nurse on committee

S5. ICP trainers training plan
S5.1 Training plan for ICP trainers

S6. Human resources for hygiene and cleanliness in areas of clinical activity
S6.1 Human resources 24/7
S6.2a Addition of staff at outbreak
S6.2b Staff trained in ICP

S7. ICP communication plan
S7.1 Information management system
S7.2 Communication plan
S7.3 Patient information booklet containing information on HAI
S7.4 ICP regular bulletin

Process-Related Indicators

P1. ICP policies, protocols and procedures
P1.1a Management of outbreak: gastroenteritis
P1.1b Management of outbreak: influenza
P1.2 Cleaning and disinfection of endoscopes
P1.3 Cleaning and sterilization of critical reusable equipment
P1.4 Patient information management
P1.5 Patient isolation and cohorting
P1.6 Room disinfection
P1.7 Decontamination of non-critical reusable equipment
P1.8a Diseases transmissible by blood
P1.8b Varicella-zoster virus
P1.8c Tuberculosis
P1.9 MRB screening
P1.10 Verification of immune status of employees when hired

P2. Hand hygiene
P2.1 Hand hygiene policy
P2.2 Policy on gloves
P2.3 Compliance with hand hygiene
P2.4 Quantity of antiseptic handwash used

P3. Disinfection of surface areas and care equipment in patient's room
P3.1 Dedicated disinfection team
P3.2 Number of surface disinfections/24 hours
P3.3 Number of patient bathroom disinfections/24 hours
P3.4 Number of emergency bathroom disinfections/24 hours
P3.5 Cleaning and disinfection techniques written down and distributed
P3.6a Full disinfection audit after discharge or isolation
P3.6b Percentage of successful disinfections

P4. Guidelines for prevention of HAIs related to intravascular catheters
P4.1 Policy for installation and maintenance of intravascular catheters

P5. Equipment at bedside dedicated to a patient who has a transmissible HAI or dedicated to a cohort
P5.1 Patient-dedicated equipment
P5.2 Cohort-dedicated equipment

P6. Respiratory etiquette for patients, visitors and companions who show cold or influenza symptoms when they come to emergency or other outpatient departments or who are circulating within the facility
P6.1 Respiratory etiquette

P7. Antibiotic monitoring in intensive care
P7.1 Antibiotic monitoring
P7.2 Bacterial resistance monitoring

P8. Staff training in ICP
P8.1a Training when hired
P8.1b Percentage of employees trained when hired
P8.2 Ongoing training hours
P8.3 Percentage of employees who have received ongoing training
P8.4 Categories of employment that have received ongoing training

P9. Monitoring of the use of antibiotics
P9.1 Program to monitor use of antibiotics
P9.2 Pre-operative prophylactic antibiotic therapy at the right moment before incision
P9.3 Right choice of prophylactic antibiotic therapy
P9.4 Right duration of prophylactic antibiotic therapy
P9.5 Appropriate prophylactic antibiotic therapy according to three criteria

Outcome-Related Indicators

R1. Infections related to the use of medical equipment
R1.1 Bacteremia related to central catheters in intensive care
R1.2 Total bacteremia
R1.3a Pneumonia associated with mechanical ventilation
R1.3b Distinction: early versus late pneumonia
R1.4 Nosocomial urinary infections from bladder probes

R2. Influenza
R2.1 Vaccination of target clientele
R2.2 Vaccination of employees by category
R2.3 Number of outbreaks of three cases or more/seasonal period

R3. HAIs unrelated to medical equipment
R3.1 Nosocomial CDAD
R3.2 Nosocomial MRSA
R3.3 Bacteremia from nosocomial MRSA/nosocomial bacteremia from S. aureus
R3.4 Nosocomial VRE
R3.5 Viral nosocomial gastroenteritis
R3.6 RSV in pediatrics
R3.7a Surgical site infections by type of surgery
R3.7b Distinction: superficial versus deep versus organ infections

R4. Diseases transmissible by blood among staff
R4.1 Exposure to diseases transmissible by blood by category of employment
R4.2 Regular reporting to IPC regarding wounds

R5. Prevention of airborne transmissible diseases among staff following exposure
R5.1 Number of tuberculin test conversions over number of post-exposure to tuberculosis follow-ups
R5.2 Number of post-exposure follow-ups with negative serology over number of post-exposure to chickenpox follow-ups

R6. Aspergillosis
R6.1 Invasive aspergillosis

BOD = board of directors; CDAD = Clostridium difficile-associated diarrhea; CEO = chief executive officer; HAI = healthcare-associated infection; ICP = infection control and prevention; IPC = infection prevention committee; MRB = multi-resistant bacteria; MRSA = methicillin-resistant Staphylococcus aureus; RSV = respiratory syncytial virus; VRE = vancomycin-resistant enterococci.
*Themes bearing the same number across the structure, process and outcome categories are not directly related to one another.


 

Current Monitoring of HAIs

The consultation with participating facility managers (step three), as well as the data collection from these facilities to verify the feasibility of using the proposed indicators (step five), allowed us to make several observations about the current monitoring of HAIs. First, the means and practices for gathering data on HAIs vary considerably by facility and are quite often minimal. Although managers find them interesting, many of the proposed indicators are not currently measured. In most of the participating facilities, this is due to a perceived lack of financial and human resources. Second, facilities very often do not have any integrated information management system concerning HAIs. Such a system would define the methods to collect, analyze and disseminate information within the facility, as well as define the persons responsible for these various tasks. (Such a mechanism or system, which is the starting point for any process to monitor a phenomenon, corresponds to proposed indicator S7.1.) Consequently, the information gathered is often incomplete or contradictory within the same facility. Third, data collection activities in long-term care facilities are insufficiently developed compared with those of acute care hospitals (as is also true with general measures to fight against HAIs). Fourth, even if managers are interested in using the proposed indicators internally for formative purposes (i.e., for quality improvement), they are afraid of the data being used summatively (for accountability), and they are reluctant to see their facilities compared with other facilities unless their specific situation is taken into account (e.g., mission and type of clientele treated).

Discussion

The main objective of this study was to develop a dashboard of indicators for HAIs that would correspond to the needs of local healthcare facility managers. The approach used, which was based on an analysis of the scientific literature, consultations with infection control experts and healthcare facility managers and pilot testing of the indicators in different types of healthcare facilities, led to the identification of a series of indicators that are better in two ways than indicators that existed previously. First, the TOCSIN indicators are scientifically valid (i.e., based on the best available evidence), and they are more comprehensive than those proposed elsewhere (CCHSA 2007; Comité d'examen sur la prévention et le contrôle des infections nosocomiales 2005; Comptroller and Auditor General 2000; JCAHO 1998; Kahn et al. 1995; Institut de Veille Sanitaire 2004; WHO 2002), which do not cover the full range of structure, process and outcome dimensions of HAIs. Second, the TOCSIN indicators have been field tested and are considered feasible by healthcare managers. However, given their relatively large number, some considerations as to their use are warranted:

  1. A given healthcare facility only needs to measure the indicators relevant to its specific clientele and services, not all the TOCSIN indicators.
  2. Local managers, in consultation with their infection control and prevention team and according to the rate of infection in their setting and the processes they wish to strengthen, should decide which specific indicators to prioritize. For example, if surgical site infections are a priority, the healthcare facility will want to measure the structure indicator S3.10 (access to pharmacy data - to be able to follow the use of antibiotics), the process indicators P9.2-P9.5 (which assess the appropriateness of antibiotic therapy used during surgery) and the outcome indicators R3.7a and R3.7b (which measure the actual rate of surgical site infections).
  3. The frequency of measurement of indicators will vary according to the nature of indicators. Some indicators, especially structure-related ones, could be measured only once per year. Other indicators will be measured on a more regular basis (e.g., every quarter, every month or more frequently), depending on the need.
  4. The focus of measurement could alternate from period to period. For example, catheter-related infections could be measured in the first quarter, surgical site infections in the second, etc. (Pronovost et al. 2006).
  5. If an overall picture is sufficient, data for indicators from the same theme may be combined, thus reducing the amount of information. Of course, looking at single indicators may also be warranted at times. In sum, it is up to managers to select the indicators that best fit their local context and needs.
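The selection logic in points 1-5 above could be supported by very simple tooling. The Python sketch below is illustrative only: the indicator IDs come from Table 1, but the mapping of priority areas to indicator sets (and the second priority set entirely) are our own assumptions, not part of TOCSIN.

```python
# Map each priority area to relevant TOCSIN indicator IDs. The first set follows
# the surgical-site-infection example in point 2 above; the second is hypothetical.
PRIORITY_SETS = {
    "surgical_site_infections": [
        "S3.10",                         # structure: access to pharmacy data
        "P9.2", "P9.3", "P9.4", "P9.5",  # process: prophylactic antibiotic therapy
        "R3.7a", "R3.7b",                # outcome: surgical site infection rates
    ],
    "catheter_related_bacteremia": ["S3.8", "P4.1", "R1.1", "R1.2"],
}

def select_indicators(priorities):
    """Return the de-duplicated list of indicators a facility should measure,
    given the priority areas chosen by its managers (point 2 above)."""
    selected = []
    for priority in priorities:
        for indicator_id in PRIORITY_SETS[priority]:
            if indicator_id not in selected:
                selected.append(indicator_id)
    return selected

print(select_indicators(["surgical_site_infections"]))
```

The point of such a mapping is that each facility measures only the subset of indicators matching its clientele, services and current priorities, rather than all 97.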

This study has a number of limitations. First, the TOCSIN indicators apply mainly to short-term and long-term hospitalizations. Since HAIs also occur elsewhere, the proposed indicators would have to be adapted or new indicators developed for other contexts of care (home care, rehabilitation centres, outpatient care, emergency departments, etc.). Second, the TOCSIN indicators were not designed for the calculation of a global score (like France's ICALIN [Indice Composite des Activités de Lutte contre les Infections Nosocomiales; Institut de Veille Sanitaire 2004]), which can be useful for comparative purposes. Such a global score would require that indicators be weighted, which could be a major challenge since healthcare facilities are different (e.g., clientele, services) and may attach a different importance to various indicators. In addition, the possibility of the TOCSIN indicators being used for accountability purposes may reduce the adherence of managers, induce indicator misuse and thus limit indicator usefulness for improvement purposes (Perrin 1998). Third, the indicators proposed here have been based as much as possible on the best available research or, when research is lacking, on the opinion of experts. However, in some cases, there is limited evidence on the relationships between the structure, process and outcome dimensions. For example, we may ask whether the existence of a new product purchasing committee (structure-related indicator) will necessarily lead to a reduction in infections related to the use of medical equipment (outcome-related indicator). In such instances, additional research is needed to test the relationships between structure, process and outcome dimensions; this would in turn reinforce the validity of selected indicators.

Conclusion

HAIs are a major challenge in healthcare systems today. Healthcare managers need complete and valid information to make appropriate decisions to fight these infections. The proposed TOCSIN indicators, which appear to be both scientifically valid and administratively feasible, can be a useful information tool for that purpose. The Quebec Ministry of Health and Social Services has started to implement indicators based on TOCSIN. However, to effectively fight HAIs, many healthcare facilities need additional financial resources and expertise to measure the proposed indicators and manage the information they will generate.

About the Author

Régis Blais, PhD, is a professor in the Department of Health Administration and a researcher in the Groupe de recherche interdisciplinaire en santé (GRIS) of the University of Montreal, Montreal, Quebec. You can contact Dr. Blais at 514-343-5907, by fax at 514-343-2448 or by e-mail at regis.blais@umontreal.ca.

François Champagne, PhD, is a professor in the Department of Health Administration and a researcher in GRIS.

Louise Rousseau, PhD, is a clinical assistant professor in the Department of Health Administration and an associate researcher in GRIS.

Acknowledgment

We want to thank the following persons for their expert contribution in this study: Lucie Beaudreau, RN, Yves Benoît, MSc, Monique Delorme, RN, Marc Dionne, MD, Charles Frenette, MD, Yvan Gendron, MSc, Marie Gourdeau, MD, Andrée Larose, RN, Anne Lemay, PhD, Myrance Maillot, RN, and Claude Tremblay, MD. We also gratefully acknowledge the collaboration of the six hospital complexes that participated in the study and the Institut National de Santé Publique du Québec. This project was funded by a grant from the Canadian Patient Safety Institute and the Quebec Ministry of Health and Social Services.

References

Archibald, L.K. and W.J. Hierholzer Jr. 2004. "Principles of Infectious Diseases Epidemiology." In C.G. Mayhall, ed., Hospital Epidemiology and Infection Control (3rd ed.). Philadelphia: Lippincott, Williams & Wilkins.

Canadian Council on Health Services Accreditation. 2007. Goals of the CCHSA in Terms of Patient Safety and Required Organizational Practices (ROPs); Evaluation of Implementation and Proof of Compliance. Ottawa, ON: Author.

Centers for Disease Control. 1992. "Public Health Focus. Surveillance, Prevention and Control of HAIs." MMWR Morbidity and Mortality Weekly Report 41: 783-87.

Champagne, F., A.L. Guisset, J. Veillard and I. Trabut. 2005. The Performance Assessment Tool for Quality Improvement in Hospitals (PATH Project): A General Description (Report No. R05-06). Montreal, QC: Groupe de recherche interdisciplinaire en santé, Université de Montréal.

Comité d'examen sur la prévention et le contrôle des infections nosocomiales. 2005. D'abord, ne pas nuire ... les infections nosocomiales au Québec, un problème majeur de santé, une priorité. Québec City, QC: Ministère de la Santé et des Services sociaux.

Comité sur les infections nosocomiales du Québec. 2004. Normes en ressources humaines pour la prévention des infections au Québec. Québec City, QC: Author.

Comptroller and Auditor General. 2000. The Management and Control of Hospital Acquired Infection in Acute NHS Trusts in England. London, England: The Stationery Office.

Donabedian, A. 1966. "Evaluating the Quality of Medical Care." Milbank Memorial Fund Quarterly 44: 166-203.

Gourdeau, M., C. Tremblay and C. Frenette. 2005. Impacts des infections nosocomiales et efficacité d'un programme de prévention. Paper presented at the Colloque sur la prévention des infections nosocomiales, Montréal, QC.

Haley, R.W., D.H. Culver, J.W. White, W.M. Morgan, T.G. Emori, V.P. Munn and T.M. Hooton. 1985. "The Efficacy of Infection Surveillance and Control Programs in Preventing HAIs in US Hospitals." American Journal of Epidemiology 121: 182-205.

Institut de Veille Sanitaire. 2004. Recommandations pour la mise en œuvre d'un tableau de bord de la lutte contre les infections nosocomiales de chaque établissement de santé français. Paris, France: Author.

Jarvis, W.R. 1996. "Selected Aspects of the Socioeconomic Impact of HAIs: Morbidity, Mortality, Cost and Prevention." Infection Control and Hospital Epidemiology 17: 552-57.

Joint Commission on Accreditation of Healthcare Organizations. 1998. "Surveillance, Prevention and Control of Infection." In Comprehensive Accreditation Manual for Hospitals: The Official Handbook. Chicago, IL: Author.

Kahn, M.G., S.A. Steib, E.L. Spiznagel, D.W. Claiborne and V.J. Fraser. 1995. "Improvement in User Performance following Development and Routine Use of an Expert System." Medinfo 8 Pt. 2: 1064-67.

Kohn, L.T., J.M. Corrigan and M.S. Donaldson, eds. 1999. To Err Is Human: Building a Safer Health System. Washington, DC: Institute of Medicine, National Academy Press.

Leape, L.L., T.A. Brennan and N. Laird. 1991. "The Nature of Adverse Events in Hospitalized Patients. Results of the Harvard Medical Practice Study II." New England Journal of Medicine 324(6): 377-84.

Mayhall, C.G., ed. 2004. Hospital Epidemiology and Infection Control (3rd ed.). Philadelphia: Lippincott, Williams & Wilkins.

Nettleman, M.D. 2003. "Cost and Cost-Benefit of Infection Control." In R.P. Wenzel, ed., Prevention and Control of HAIs (4th ed.). Philadelphia: Lippincott, Williams & Wilkins.

Perrin, B. 1998. "Effective Use and Misuse of Performance Measurement." American Journal of Evaluation 19: 367-79.

Pronovost, P.J., M.R. Miller and R.M. Wachter. 2006. "Tracking Progress in Patient Safety: An Elusive Target." Journal of the American Medical Association 296(6): 696-99.

Ranji, S.R., K. Shetty, K.A. Posley, R. Lewis, V. Sundaram, C.M. Galvin and L.G. Winston. 2007. "Prevention of Healthcare-Associated Infections." Vol. 6 of K.G. Shojania, K.M. McDonald, R.M. Wachter and D.K. Owens, eds., Closing the Quality Gap: A Critical Analysis of Quality Improvement Strategies (Technical Review 9; AHRQ Publication No. 04(07)-0051-6). Rockville, MD: Agency for Healthcare Research and Quality.

World Health Organization. 2002. Prevention of Hospital Acquired Infections, A Practical Guide (2nd ed.). Geneva: Author.
