Reporting, Learning and the Culture of Safety
Systems that allow healthcare workers to report hazards, hazardous situations, errors, close calls and adverse events give the organizations receiving such reports the opportunity to learn from them and/or to hold people accountable for their actions. When organizational learning is the primary goal, reporting should be confidential, voluntary and easy to perform, and should lead to risk mitigation strategies following appropriate analysis; conversely, when the goal is accountability, reporting is more likely to be made mandatory. Reporting systems do not necessarily equate to safer patient care and have been criticized for capturing too many mundane events but only a small minority of important ones. Reporting has been inappropriately equated with patient safety activity and mistakenly used for "measuring" system safety. However, if properly designed and supported, a reporting system can be an important component of an organizational strategy to foster a safety culture.
Healthcare is not as safe as it should or could be: rates of adverse events, defined as situations where patients suffer harm from the healthcare they receive (or from not receiving care that would have helped), in acute care have been shown to be high. For example, the Canadian Adverse Events Study found that 7.5% of patients admitted to a Canadian hospital suffered an adverse event (Baker et al. 2004). The National Steering Committee on Patient Safety listed the comprehensive identification and reporting of hazards as one of "nine key principles for action" that served as a foundation for the committee's recommendations to make Canadian patients safer (National Steering Committee on Patient Safety 2002). Further, the committee recommended the adoption of non-punitive reporting policies within a quality improvement framework. Recently, the National System for Incident Reporting (Canadian Institute for Health Information 2011) was established by the Canadian Institute for Health Information; its focus at present is incidents involving hospital-based medication and intravenous fluids. The development of reporting systems to enhance patient safety has been proposed as a strategy in other countries; examples include the Australian Incident Monitoring System (Runciman 2002) and the National Reporting and Learning System in England and Wales (Williams and Osborn 2006).
Reporting is described in The Canadian Patient Safety Dictionary as "an activity where information is shared with appropriate responsible individuals or organizations for the purposes of system improvement" (Davies et al. 2003). Reporting is sometimes confused with disclosing, informing and notifying. Disclosing is the imparting of information, by healthcare workers (and, in some situations, healthcare organizations) to patients or their significant others, pertaining to any healthcare event affecting (or liable to affect) patients' interests (adapted from Davies et al. 2003). Informing, in this context, can be described as the sharing of safety-related information by an organization, or an individual healthcare provider, with stakeholders who are not responsible for the care of a particular patient or patient population (Figure 1). Reporting can be internal or external to a healthcare providing organization. Finally, notifying means to give formal notice (Merriam-Webster n.d.); in this context it would be providing notice to individuals in organizational positions of authority about an important safety event. Although voluntary sharing of patient safety incidents with provincial or national non-regulatory organizations is commonly referred to as reporting, using the definitions derived from The Canadian Patient Safety Dictionary, this voluntary sharing would more accurately, for the organization submitting the information, be referred to as informing.
What Are We Trying to Accomplish?
In addition to national strategies for incident reporting, internal systems are often set up locally; health system (e.g., health region, local health integration networks, groups of hospitals) and hospital-based incident reporting systems are ubiquitous. Is there a clear understanding of how reporting into these systems would translate into safer care? One goal of asking healthcare workers to submit reports of adverse events, close calls, hazardous situations or errors is to give the system the opportunity to learn of vulnerabilities or weaknesses in care delivery processes. When reports of adverse events and close calls are submitted, the expectation is that reporting an event will lead to lessons learned: an analysis of the event will shed light on contributing factors, and interventions that address the system deficiencies that contributed to harm will be undertaken to reduce the likelihood of recurrence. Reporting systems can also be created with accountability as the primary goal. Some organizations strive to hold healthcare providers more accountable for the care they provide by placing reports of close calls and adverse events in human resources files, which can then be used for performance assessment. Using reports to assess performance (negatively) is a strong motivation for healthcare providers to stop reporting, thus reducing the potential for organizational learning. Reporting systems with a primary purpose of accountability require the reporting activity to be mandatory; those with a primary purpose of learning are created so that the activity is voluntary and without repercussions either for not reporting or for what is reported (with exceptions for criminal activity or gross negligence).
Reporting systems have been used for purposes other than learning and accountability. Some organizations use their reporting system data to track or measure safety in their system. For example, it is not uncommon for organizations to use reports of falls as their "falls rate" performance measure. But because the reporting is incomplete and not systematic, the events and issues that are captured do not reflect an objective estimate of the rate of adverse events; reporting systems are not helpful for estimating whether a healthcare system is becoming more or less safe.
Many organizations rely mostly on reports of adverse events as their source of information about system weakness and vulnerability. However, several studies have demonstrated that reporting systems capture only a minority of adverse events and close calls. Sari et al. (2007), in a large UK National Health Service hospital, compared the organization's routine incident reporting system with a review of case notes in 1,006 admissions. Of the 324 patient safety incidents identified in 230 admissions, 83% were picked up by case note review only, 7% by the routine reporting system only and 10% by both. The case note review detected all 110 admissions in which a patient suffered harm; however, the reporting system detected only 5% of these. Similarly, Cullen et al. (1995) found that only three of 54 adverse drug events (6%) had a corresponding incident report filed. Reporting has also been criticized for capturing too many "mundane" events (Shojania 2008) and for inappropriately being equated with meaningful patient safety activity (Vincent 2007).
A Role for Reporting: Using Reporting Systems to Transform a Culture
Sustained improvement in the provision of safe patient care requires an organizational commitment to a safety culture; reporting is an important contributor to this (Vincent 2007). Reason (1997) identified a reporting culture as one of four attributes of a safety culture; the other attributes are a learning culture, flexibility and a just culture. Reason defined an organizational learning culture as "the willingness and the competence to draw the right conclusions from its safety information system, and the will to implement major reforms when their need is indicated" (1997: 196). He also suggested that the best safety information is derived from analysis of reports of incidents and near misses as well as from "proactive checks on the system's vital signs." A reporting culture is "an organizational climate in which people are prepared to report their errors and near-misses" (Reason 1997: 195). It is dependent on how an organization processes information and chooses to handle blame and punishment. Westrum (2004) described three archetypal organizational approaches for processing information: (1) pathological (messengers are shot/failure leads to scapegoating/novelty is crushed); (2) bureaucratic (messengers are neglected/failure leads to justice/novelty is problematic); and (3) generative (messengers are trained/failure leads to inquiry/novelty is implemented). Reason defined a just culture as "an atmosphere of trust in which people are encouraged, even rewarded, for providing essential safety-related information – but in which they are also clear about where the line must be drawn between acceptable and unacceptable behaviour" (1997: 195). Important success factors for an effective reporting system – based on our review of the literature and our experience in implementing a new, electronic safety and learning reporting system in the former Calgary Health Region – are listed in Table 1.
The contribution of reporting to high-quality, safe patient care and reporting system dependencies are shown in Figure 2.
Table 1. Important success factors for effective reporting systems
Experience Implementing a Region-Wide Safety and Learning Reporting System
In the former Calgary Health Region, external reviewers were asked to complete an analysis of the safety of Calgary's healthcare system following the deaths of two patients in 2004 (Baker et al. 2008). Among the multiple recommendations made for making care safer was a proposal to redesign the region's incident reporting system to ensure that reports were used "as learning opportunities for the entire organization" and not as occasions for performance management. What existed at the time in the region was a paper-based reporting system that required staff to complete a detailed form and submit it to their direct supervisor. If the supervisor believed the report detailed an "incident," he or she would submit the report, via internal mail, to a central safety office; copies were sometimes kept by supervisors and placed in personnel files. The average duration between the time an incident occurred and the report reaching the safety office was 30 days. Basic summary-type statistics were generated, but there was no systematic process for report analysis: managers assumed responsibility for reading reports and "closing the file." The most common actions were to speak with the staff member(s) involved or to forward the report to someone else so that he or she could speak with the staff member(s). Rarely was a report received from a physician.
The decision was made to completely change the reporting system to complement a planned effort to evolve an existing bureaucratic attitude in the region toward a more generative attitude, with the ultimate goal of creating a safety culture. This was signalled by decommissioning the existing paper-based "incident reporting system" (incident was poorly defined and could refer to non–patient safety issues) and replacing it with an electronic safety reporting and learning system. An easily identifiable link to the reporting system was created on the region's internal home page so that it was readily accessible.
The fundamental premise was that an increase in the number of reports would positively reflect a shift in attitude and culture. The tactics used to increase reporting rates included making the act of reporting (1) acknowledged, (2) worthwhile, (3) easy to perform and (4) safe (no real or perceived repercussions). Reporters received an automatically generated message acknowledging their effort to report and thanking them for submitting. To signal that reporting was worthwhile, a formal communication strategy was launched that produced patient safety alerts, safer practice notices and patient safety information sheets, which were derived in part from the reports received. The system was designed around simplicity – there were minimal screens for users to complete, with a minimal amount of required information; the focus was a narrative story rather than check boxes requiring users to "classify" the report.
To make the system safe, the practice of using reports for performance management was stopped, a change endorsed by the region's executives and board. A just and trusting culture policy was also introduced, based on a principle of appropriate accountability (i.e., not blame free), that declared that healthcare workers would not be punished for committing errors, regardless of whether a patient was harmed, but maintained accountability at an individual level in other circumstances (i.e., rule breaking, intention to harm). Additional safety was designed into the system itself: in the new electronic system, the reporter's name was required in order to submit a report but was kept confidential (supervisors were prevented from knowing the name of the reporter).
Initial pilot studies of the new system showed that old habits were hard to break; to maintain the past practice of performance management, some managers resorted to using patient identifiers, chronological information and staff rotation information to determine who the reporter was and/or which staff members were involved in the report. Therefore, the reporting system was redesigned with two important (and executive-approved) modifications. First, a daylong educational program, supported by tool kits and workbooks, was created and delivered to over 1,500 leaders. The program introduced them to the region's four new safety policies: Reporting, Disclosure, Just and Trusting Culture, and Informing. These workshops provided leaders with a theory-based safety model for the changes being introduced and the rationale behind the new reporting system. Second, all possible identifiers (including those of the patient) were removed from the submitted reports. The premise was that the primary purpose of this reporting system was to serve as a safety culture carrier. Staff could tell their patient safety stories with the expectation that managers, by reviewing the substance of the reports, would target the system rather than individuals for change.
However, one of the unintended consequences was the alienation of many front-line managers who felt a disconnect between their responsibility for the care being provided in their area and their inability to use the new system to manage individual events (as described in reports). Managers were receiving reports signalling that something untoward had happened in their area of responsibility, and they felt disempowered in their attempts to manage the issues. The managers were still adhering to the prevailing culture, with the belief that each report required individual follow-up with a staff member, which was in conflict with a new safety culture that promised confidentiality to the reporter. Some managers created a workaround – the development of a parallel paper-based event management system (recreating the old incident report).
In the final evaluation of the new reporting system, staff overwhelmingly asked that they be given the option of whether or not to leave their name on the electronic report that would be sent to their manager. Although the intent of the region's quality and safety portfolio that led the implementation of the new reporting system was to make it completely voluntary and confidential, it was learned that it would take time to evolve to this ideal. In retrospect, the investment needed in change management and the requirement to also replace the non-safety uses of the legacy incident reporting system were underestimated.
The analysis of safety learning reports was well supported; this was an important factor for success. Centrally, there were two expert coders, and each of the six clinical portfolios in the region had a patient safety leader who was responsible for (1) reading each report that originated within his or her portfolio; (2) scrubbing all identifiable information; (3) classifying and entering the report into the Safety Learning Database; (4) identifying those reports that required more immediate action and ensuring that timely notification was made available to responsible leaders; and (5) where appropriate, sending reports to members of "reading groups." For example, an interdisciplinary group of medication experts would read reports of adverse drug events and determine, through electronic communication with each other, whether there were particular reports that required action or themes emerging from groups of reports. This expert group reviewed multiple reports about the failure to remove timed medication patches from patients. They took a systems approach to addressing this hazard, publishing a patient safety alert and configuring the region's new patient care information system to provide electronic reminders for patch removal.
Notwithstanding some of the challenges with the new Safety Learning and Reporting System, the initial evaluation of its implementation showed positive results: reporting increased by 200% (Figure 3); reporter perception of safety (in reporting) and ease of use increased significantly (Table 2); there was an increased percentage of submitted reports that described close calls and hazards rather than adverse events; and feedback to reporters increased (more than 260 reporters received feedback on substantive and demonstrable changes that had been made to the system as a result of their reports).
|Table 2. Reporter perceptions: safety and ease of use for the old and new reporting systems|
|Level of Agreement|Pre-implementation (n = 309)|Post-implementation (n = 172)|p Value (Chi-Square)|
|Staff may be blamed unfairly when an incident report/safety learning report is submitted|
|I don't feel confident the incident report/safety learning report is kept confidential|
|The incident form/safety learning report takes too long to fill out|
|The incident report/safety learning report is complex and can be complicated to fill out|
|* For this response, n = 310.|
Before additional evaluations could be carried out, the region merged with other provincial regions in the creation of Alberta Health Services. This changed the direction of this system. However, the implementation of the Safety Learning and Reporting System demonstrated that it was possible to create an approach to reporting that was successful at meeting the requirements set out in Table 1 and thus supported a large healthcare organization's goal to positively influence its safety culture.
About the Author(s)
W. Ward Flemons, MD, FRCPC, is a professor of medicine at the University of Calgary, and a respirologist at Foothills Medical Centre, in Calgary, Alberta. You can contact him at 403-220-8722 or by email at firstname.lastname@example.org.
Glenn McRae, BSc, RN, BN, MBA, is an executive director with Emergency Medical Services in Alberta Health Services. You can contact him at 403-944-1323 or by email at email@example.com.
We would like to thank Dr. J.M. Davies for her helpful comments and review of the manuscript.
Baker, G.R., P.G. Norton, V. Flintoft, R. Blais, A. Brown, J. Cox et al. 2004. "The Canadian Adverse Events Study: The Incidence of Adverse Events among Hospital Patients in Canada." Canadian Medical Association Journal 170: 1678–88.
Baker, G.R., A. MacIntosh-Murray, C. Porcellato, L. Dionne, K. Stelmacovich and K. Born. 2008. High Performing Healthcare Systems: Delivering Quality by Design. Toronto, ON: Longwoods Publishing. Retrieved September 29, 2011. <http://www.longwoods.com/publications/books/571>.
Canadian Institute for Health Information. 2011. National System for Incident Reporting. Ottawa, ON: Author. Retrieved September 29, 2011. <http://www.cihi.ca/CIHI-ext-portal/internet/en/document/types+of+care/pharmaceutical/services_cmirps>.
Cullen, D.J., D.W. Bates and S.D. Small. 1995. "The Incident Reporting System Does Not Detect Adverse Drug Events: A Problem for Quality Improvement." Joint Commission Journal on Quality Improvement 21: 541–48.
Davies, J.M., P. Hébert and C. Hoffman. 2003. The Canadian Patient Safety Dictionary. Ottawa, ON: Royal College of Physicians and Surgeons of Canada. Retrieved September 28, 2011. <http://www.rcpsc.medical.org/publications/PatientSafetyDictionary_e.pdf>.
Merriam-Webster. n.d. "Notify." In Merriam-Webster Dictionary. Retrieved February 8, 2012. <http://www.merriam-webster.com/dictionary/notify?show=0&t=1328730248>.
Morath, J.M. and J.E. Turnbull. 2005. To Do No Harm. San Francisco, CA: Jossey-Bass.
National Steering Committee on Patient Safety. 2002. Building a Safer System: A National Integrated Strategy for Improving Patient Safety in Canada. Ottawa, ON: Author. Retrieved September 28, 2011. <http://rcpsc.medical.org/publications/building_a_safer_system_e.pdf>.
Reason, J. 1997. Managing the Risks of Organizational Accidents. Aldershot, United Kingdom: Ashgate Publishing Company.
Runciman, W. 2002. "Lessons from the Australian Patient Safety Foundation: Setting Up a National Patient Safety Surveillance System—Is This the Right Model?" Quality and Safety in Health Care 11: 246–51.
Sari, A., T.A. Sheldon, A. Cracknell and A. Turnbull. 2007. "Sensitivity of Routine System for Reporting Patient Safety Incidents in an NHS Hospital: Retrospective Patient Case Note Review." BMJ 334: 79. DOI: 10.1136/bmj.39031.507153.AE.
Shojania, K.G. 2008. "The Frustrating Case of Incident-Reporting Systems." Quality and Safety in Health Care 17: 400–2.
Vincent, C. 2007. "Incident Reporting and Patient Safety." BMJ 334(7584): 51.
Westrum, R. 2004. "A Typology of Organisational Cultures." Quality and Safety in Health Care 13: ii22–27.
Williams, S.K. and S.S. Osborn. 2006. "The Safety and Quality of Health Care: Where Are We Now? The Development of the National Reporting and Learning System in England and Wales, 2001–2005." Medical Journal of Australia 184(Suppl.): S65–68.