
Healthcare Quarterly 15(Special Issue) April 2012: 24-29. doi:10.12927/hcq.2012.22845
Human Factors In Patient Safety

From Discovery to Design: The Evolution of Human Factors in Healthcare

Joseph A. Cafazzo and Olivier St-Cyr

In some respects, the use of human factors engineering (HFE) methods in healthcare has been transformative. However, measurable progress in patient safety has been limited, and significant challenges remain. Where is HFE headed to address these challenges? We examine the progress HFE has made and the difficulties it faces in today's complex healthcare environment.

Human Factors in Safety Research

Applications of HFE in system design can be traced back to the mid-1900s. After World War II, many engineers were preoccupied with the safety of transportation systems. As a result, human factors became prominent in domains such as aviation and the military. Applying HFE principles in these domains gave rise to fundamental conceptual frameworks for understanding the relationships between humans and complex technology. For example, in the aviation industry, human factors specialists studied the elements influencing flight deck design, contributing to a culture shift that promoted the importance of safety in such complex socio-technical systems. From the mid-1970s to the mid-1980s, much HFE attention focused on process control systems. Industrial events such as the Three Mile Island nuclear incident and the Bhopal industrial disaster reminded us that, despite the successful integration of HFE in some domains, the process control industry had partially failed to implement HFE methodologies. A series of inquiries followed these incidents, resulting in several regulatory documents outlining how HFE must be systematically incorporated in system design (Meister 1999).

Human Factors and Healthcare

Applications of HFE principles to the domain of healthcare are relatively new. Although some references to human factors in healthcare date back to the 1980s and early 1990s (Leape 2004), it was the work of James Reason in 1995 that extended the scope of HFE practice to healthcare systems.

Reason reviewed HFE contributions to the domain of healthcare and introduced the concepts of active and latent failures. The former are errors committed by users of a system (e.g., a doctor administers the wrong medication). The latter are errors created at the organizational level of the design (e.g., inadequate procedures, incomplete training, poor labelling choices) and may lie dormant for days, months or even years until triggered by a collection of local factors. Among other things, Reason's work emphasized the importance of considering team and organizational factors in the design of safety-critical systems, as well as avoiding a culture of "blaming the users" and embracing a culture of understanding the root causes of adverse events (Reason 1995).

One of the early empirical demonstrations of the significant impact of HFE design techniques on medical systems was the redesign of the user interface of a commercially available patient-controlled analgesia (PCA) pump (Lin et al. 1998). The authors conducted an evaluation comparing their redesign to the original design. The results showed a significant decrease in programming time, a lower mental workload and fewer errors. These findings clearly demonstrated the applicability of HFE techniques in the healthcare domain and highlighted important design considerations for medical equipment.

It is important to understand that the use of HFE frequently extends to non-technological problems (Vicente 1998), such as hand hygiene, where human behaviour imperils safe practice. HFE often shows that the design of environments affects performance, cognition and behaviour: poor lighting can cause a nurse to misread a label, personal conflict can inhibit critical clinical communication, and a poorly located sink can result in low adherence to hand hygiene.

The Institute of Medicine's report To Err Is Human (Kohn et al. 2000) was a further catalyst for the use of HFE in the healthcare domain. It led to a considerably better understanding of the actions needed to solve safety problems in the healthcare system.

A decade of great expansion in the field of HFE in healthcare followed. As healthcare became a domain of study for human factors practitioners, HFE principles became valuable tools for healthcare professionals. Many regulations, standards and guidelines (e.g., the Food and Drug Administration [FDA] regulations on human factors and medical devices, and the American National Standards Institute [ANSI]/Association for the Advancement of Medical Instrumentation [AAMI] HE75 standard on HFE and the design of medical devices) resulted from this increased attention and are now in place to help the healthcare industry catch up with other domains in which HFE principles and regulations have been common practice for many years. With this in place, what have we learned and what progress has been made in the past 15 years?

Our Understanding of Human Factors Today

HFE is still new to many in healthcare who seek methods for improving safety and efficiency. Certainly in recent years, other methods such as checklists (Gawande 2007) and Lean methods (De Koning et al. 2006) have received more attention.

Even as the discipline of HFE has made significant progress in addressing patient safety problems in healthcare, it often remains misunderstood and unfamiliar. The term human factors is somewhat of a misnomer, sometimes interpreted by the uninitiated as attributing the cause of an adverse event to human factors, that is, to someone's actions. However, regardless of the definition of human factors used, the discipline fundamentally rejects the notion that humans are primarily at fault when they make errors in the use of a socio-technical system. The premise of this defence is that these systems are constructs of human invention and, hence, should be designed for humans to use. Any error made by the user is thus attributable to the design of the system; this recognizes that some aspect of human cognition, performance or behaviour was not fully considered in order to avoid the circumstance that led to the error (Figure 1).


[Figure 1]

System Design versus Behaviour Change

As healthcare was accepting the sea change in patient safety toward a "no blame" and just culture (Frankel et al. 2006), there was also a realization that progress in addressing patient safety issues had been limited (Classen et al. 2011; Landrigan et al. 2010; Leistikow et al. 2011). Through the lens of human factors, this outcome was not surprising. Many prior strategies presumed behaviour change on the part of the practitioner, but dependence on perfect human performance to ensure patient safety is both scientifically and practically unreliable. Humans err.

Human factors practitioners tend toward a more systemic approach to mitigating human error: the only truly reliable method for designing and creating socio-technical work environments involves minimizing both the possibility of human error and its potential impact when error does occur. This means designing systems that elicit desired behaviours and help reduce errors (i.e., promote human behaviour shaping) as opposed to designing systems that force behaviour changes (Vicente 1998). Despite this view, in healthcare we continue to rely on interventions that seek to improve user performance through greater training and/or behaviour change in order to reduce adverse events and minimize their impact.

Dependency on human behaviour change as a means of mitigating use errors is illustrated by the current popularity of checklists (Gawande 2007; Hales and Pronovost 2006). Although checklist use has recently made headlines for its ability to reduce adverse events in settings such as the operating room and intensive care (Haynes et al. 2009; Pronovost 2006), it remains unclear whether an intervention so fundamentally reliant on human behaviour will be sustainable in the long term without constant enforcement (Bosk et al. 2009). Are all healthcare organizations able to create a culture for the sustained use of checklists? If this solution applies only to organizations that have the leadership and resources to maintain such a culture, checklists – and other solutions reliant on human behaviour – cannot be considered a systemic solution. Given how rare serious adverse events are relative to the total volume of healthcare encounters, a solution that applies to only a fraction of organizations cannot fully address this safety issue.

The Hierarchy of Intervention Effectiveness (Figure 2), a risk management theory, rates interventions that rely on human behaviour toward the bottom of its scale in favour of technological interventions, which are viewed as more reliable (Institute for Safe Medication Practices 1999). This is not to suggest that human-based mitigation interventions (e.g., training, policy and checklists) are without value. Indeed, a scenario in which we are totally dependent on automation with little human intervention is not desirable, as a centralized locus of control could propagate errors easily across a system. Humans are still needed to make judgments at the point of care. What the hierarchy reinforces, however, is that no single mitigation strategy will totally eliminate the use errors that lead to adverse events. A tapestry of strategies, each with some scientific basis for success, will likely be more effective.


[Figure 2: The Hierarchy of Intervention Effectiveness]

Nonetheless, as healthcare providers, we tend not to create such strategies. We continue to seek silver-bullet solutions such as checklists, bar-coding and crew resource management (CRM), adapted from aviation. In medication administration in particular, the lack of holistic, systemic solutions has created a fragmented system of technologies, policies and training. No single company has a product that ensures continuity of information and workflow for the administration of medication. Oddly, there is often no single administrative entity within the hospital overseeing this either; instead, a mixture of medicine, pharmacy and nursing leadership attempts to ensure the safety of medication administration. What is required is that organizations transcend internal boundaries to produce a delivery system that offers users an integrated approach, rather than forcing them to deal with multiple disparate systems to perform their critical tasks (Cafazzo et al. 2009).

HFE in Evaluation

If we are to note any significant progress in the use of HFE in healthcare, it must be in the evaluation of technology, if not the improvement of its design. HFE methods have been particularly effective in post-market identification of design flaws that have led to use errors (Chagpar et al. 2006; Chan et al. 2010; Jessa et al. 2006; Trbovich et al. 2010a).

In 2011, the FDA issued an update to its regulatory approval guidelines (the first in 10 years) for device manufacturers, supporting a far more rigorous human factors evaluation process during the pre-market phase (FDA 2011). In particular, the guidelines are more prescriptive in defining product usability testing for final design validation to ensure a minimum of use errors.

The motivation for such strict new rules may be a growing awareness that human errors are often due to systemic issues. Specifically, recent high-profile disclosures of human error rates in medication administration using infusion pumps (ECRI Institute 2010; Trbovich et al. 2010a) and concerns over similar issues in radiation therapy delivery (Chan et al. 2010) have given regulators pause over how the industry is designing such products. However, some responsibility also lies in how we continue to purchase technology, as manufacturers argue that they are designing for the marketplace. Even today, healthcare organizations continue to demand the most advanced technology, with the latest features adorning already-crowded displays. Procurement processes evaluate technology using a checklist culture, rating product quality by the number and completeness of features. This can drive device manufacturers into a race to the bottom on product quality. Organizations opt for products that are bloated with features and far too complicated, creating an environment for forced, unintended errors.

The FDA's new guidelines will not entirely remedy the quality problems of healthcare technology. By emphasizing testing in the final stages of validation, they do little to ensure that human factors methods are extended across the entire development process. While the guidelines are a step in the right direction, they are also a missed opportunity to bring greater rigour of user-centred design into the earlier conceptual phases of product development.

At a local level, we can identify and mitigate risk by including HFE methods in the procurement process for high-risk technologies. Usability testing can be effectively applied to short-listed products to inform the procurement decision-making processes. The return on this modest investment of time and effort is the acquisition of technology that best suits the user needs of the organization (Cassano-Piché et al. 2010).

As well, computer systems that were intended to solve patient safety issues have revealed other potential sources of use errors (Koppel et al. 2005). HFE identified that software systems require even more attention than previously thought if they are to realize their potential (Beuscart-Zéphir et al. 2010). HFE has been used to mitigate interruptions of high-risk tasks (Trbovich et al. 2010b), to help develop team communication (Grogan et al. 2004) and to manage error at the organizational level (Reason 1997).

And this is where we sit. The full utility of human factors continues to be relegated to the evaluation aspects. We have become proficient at using human factors methods in identifying problems, often after it is too late. The questions we are faced with are whether we can find these problems sooner and how we can implement solutions once problems are identified.

Human Factors and the Design Solution

Human factors practitioners are often passionate in their determination to ensure the safety of systems. They can be combative, insisting that design flaws are at the root of adverse events and that failing to address these flaws directly will lead to more mishaps. They believe that local mitigation strategies such as additional training, new policy, reminders and checklists will not address the systemic problem. However, finding design solutions is not necessarily something human factors professionals practise. Methods for identifying problems are clear and widely accepted in science and practice, whereas methods for arriving at design solutions are much less so. One could argue that design is more a process of creativity and inspiration (which are ill defined), in contrast with the precision of science and engineering. Nevertheless, design is where HFE practitioners can make further impact in the area of patient safety. Fortunately, there is a body of work in HFE that addresses design science with rigour, and it is outlined below.

The Future of Design Practice: User-Centred Design and Ecological Interface Design

User-centred design (UCD) is one of the most common methods HFE practitioners use to create systems that prioritize ease of use. UCD defines an iterative process in which user testing of concepts and prototypes informs and optimizes the design of the system. It does not define how concepts and initial prototypes are arrived at; UCD defines the evaluation methods within the design process better than it does the process of design ideation and implementation. What is missing, then, is a rigorous framework to inform design, leading to design solutions that adapt technology to people rather than requiring people to adapt to technology.

Ecological interface design (EID) is one such framework for designing human-machine interfaces for complex systems (Vicente and Rasmussen 1992). The framework has been applied to a variety of domains, including healthcare (Vicente 2002). EID is based on the insight that major events in safety-critical systems occur when users encounter conditions unanticipated by the systems' designers. Hence, one of the goals of EID is to provide design principles for human-machine interaction that will support users in unanticipated conditions while preserving their ability to exercise normal control (Vicente and Rasmussen 1992).

The EID framework is relatively new to most HFE healthcare practitioners. Given the complexity of today's healthcare systems, it seems appropriate to consider EID as a systematic approach that could improve our understanding of such systems and lead to innovative design solutions. Recent applications of the framework in the domain of healthcare are encouraging. Examples include the use of EID to modernize the user interface of high-risk radiotherapy control systems (Chun et al. in press), as well as for systems for the detection, evaluation and treatment of cardiovascular risk (McEwen and Flach in press). These examples illustrate the viability of EID in healthcare and point to promising new applications.

With these methods established, HFE practitioners need to recognize that their value in ensuring patient safety will not be fully realized until they provide design solutions with scientific rigour, rather than simply evaluating systems and identifying issues.

Conclusions

Much of what we can learn from HFE lies not in the methods themselves but in the cultural lens HFE provides healthcare practitioners. Working nurses should not assume that, because they are having difficulty with a task, whether it involves technology or not, they are necessarily the problem. Nor should they assume that the designers of the system fully considered the complexity of their working environment, understood the multi-tasking that is often involved, or realized that they are frequently interrupted when performing such critical tasks.

The positive news is that providers are finding their voice with respect to the quality of the systems they use. They are more demanding of the tools employed to care for patients. Why can these systems not be as easy to use as the latest consumer electronics gadget?

HFE rigour has given healthcare better products, helped identify the poor ones and provided insight into how our behaviour is affected by our environment, among other critical safety issues. There are also more practitioners of HFE methods in healthcare settings who can act as advocates for providers by evaluating technologies, workflows and environments and identifying problems with them. The future lies in practitioners using more advanced HFE methods to create a path toward systemic solutions to the challenges in patient safety.

About the Author(s)

Joseph A. Cafazzo, PhD, PEng, leads Healthcare Human Factors at the Techna Institute, University Health Network, and is an assistant professor for the Institute of Health Policy, Management and Evaluation and the Institute of Biomaterials and Biomedical Engineering, Faculty of Medicine, University of Toronto, in Toronto, Ontario.

Olivier St-Cyr, PhD, is a member of Healthcare Human Factors at the Techna Institute, University Health Network, and is an adjunct professor in the School of Information Technology, York University, in Toronto, Ontario.

Acknowledgment

We would like to thank members of Healthcare Human Factors at the University Health Network for their assistance in providing input and background for this manuscript: Tara McCurdie, Wayne Ho, Varuna Prakash, Jennifer Jeon, Diane DeSousa, Svetlena Taneva Metzger, Emily Rose, Aarti Mathur and Cassie McDaniel.

References

Beuscart-Zéphir, M.-C., J. Aarts and P. Elkin. 2010. "Human Factors Engineering for Healthcare IT Clinical Applications." International Journal of Medical Informatics 79(4): 223–24.

Bosk, C.L., M. Dixon-Woods, C.A. Goeschel and P.J. Pronovost. 2009. "Reality Check for Checklists." The Lancet 374(9688): 444–45.

Cafazzo, J.A., P.L. Trbovich, A. Cassano-Piche, A. Chagpar, P.G. Rossos, K.J. Vicente et al. 2009. "Human Factors Perspectives on a Systemic Approach to Ensuring a Safer Medication Delivery Process." Healthcare Quarterly 12(Special Issue): 70–74.

Cassano-Piché, A., J.A. Cafazzo, A. Chagpar and A.C. Easty. 2010. "Choosing Safer Medical Technologies: How Human Factors Methods Can Help in the Procurement Process." Biomedical Instrumentation and Technology: Human Factors 44(Suppl. 1): 49–56.

Chagpar, A., J. Cafazzo and T. Easty. 2006. "Lessons Learned from a Comparative High Fidelity Usability Evaluation of Anesthesia Information Systems." Proceedings of the Human Factors and Ergonomics Society Annual Meeting 50(24): 2595–99.

Chan, A.J., M.K. Islam, T. Rosewall, D.A. Jaffray, A.C. Easty and J.A. Cafazzo. 2010. "The Use of Human Factors Methods to Identify and Mitigate Safety Issues in Radiation Therapy." Radiotherapy and Oncology 97(3): 596–600.

Chun, W., J. Jeon, J. Cafazzo and C.M. Burns. In Press. "Work Domain Analysis for Designing a Radiotherapy System Control Interface." Proceedings of the 2012 Symposium on Human Factors and Ergonomics in Health Care.

Classen, D.C., R. Resar, F. Griffin, F. Federico, T. Frankel, N. Kimmel et al. 2011. "'Global Trigger Tool' Shows That Adverse Events in Hospitals May Be Ten Times Greater Than Previously Measured." Health Affairs 30(4): 581–89.

De Koning, H., J.P.S. Verver, J. Van Den Heuvel, S. Bisgaard and R.J.M.M. Does. 2006. "Lean Six Sigma in Healthcare." Journal for Healthcare Quality 28(2): 4–11.

ECRI Institute. 2010. "Top 10 Technology Hazards for 2011." Health Devices November: 404–16.

FDA. 2011. Draft Guidance: Applying Human Factors and Usability Engineering to Optimize Medical Device Design. Center for Devices and Radiological Health. Accessed February 14, 2012. <http://www.fda.gov/MedicalDevices/DeviceRegulationandGuidance/GuidanceDocuments/ucm259748.htm>.

Frankel, A.S., M.W. Leonard and C.R. Denham. 2006. "Fair and Just Culture, Team Behavior, and Leadership Engagement: The Tools to Achieve High Reliability." Health Services Research 41(4, Pt. 2): 1690–709.

Gawande, A. 2007. "The Checklist." The New Yorker 83(39): 86–95.

Grogan, E.L., R.A. Stiles, D.J. France, T. Speroff, J.A. Morris, B. Nixon et al. 2004. "The Impact of Aviation-Based Teamwork Training on the Attitudes of Health-Care Professionals." Journal of the American College of Surgeons 199(6): 843–48.

Hales, B.M. and P.J. Pronovost. 2006. "The Checklist – a Tool for Error Management and Performance Improvement." Journal of Critical Care 21(3): 231–35.

Haynes, A.B., T.G. Weiser, W.R. Berry, S.R. Lipsitz, A.-H. S. Breizat, E.P. Dellinger et al. 2009. "A Surgical Safety Checklist to Reduce Morbidity and Mortality in a Global Population." New England Journal of Medicine 360(5): 491–99. DOI: 10.1056/NEJMsa0810119.

Institute for Safe Medication Practices. 1999. Medication Error Prevention "Toolbox." Horsham, PA: Author.

Jessa, M., R. Cooper, J. Cafazzo, R. Wax, A. Chagpar and T. Easty. 2006. "Human Factors Evaluation of Automatic External Defibrillators in a Hospital Setting." Proceedings of the Human Factors and Ergonomics Society Annual Meeting 50(10): 1099–102.

Kohn, L.T., J.M. Corrigan and M.S. Donaldson, eds. 2000. To Err Is Human: Building a Safer Health System. Washington, DC: National Academy Press.

Koppel, R., J.P. Metlay, A. Cohen, B. Abaluck, A.R. Localio, S.E. Kimmel et al. 2005. "Role of Computerized Physician Order Entry Systems in Facilitating Medication Errors." Journal of the American Medical Association 293(10): 1197–203.

Landrigan, C.P., G.J. Parry, C.B. Bones, A.D. Hackbarth, D.A. Goldmann and P.J. Sharek. 2010. "Temporal Trends in Rates of Patient Harm Resulting from Medical Care." New England Journal of Medicine 363(22): 2124–34.

Leape, L.L. 2004. "Human Factors Meets Health Care: The Ultimate Challenge." Ergonomics in Design 12: 6–12.

Leistikow, I.P., C.J. Kalkman and H. Bruijn. 2011. "Why Patient Safety Is Such a Tough Nut to Crack." BMJ 342: d3447.

Lin, L., R. Isla, K. Doniz, H. Harkness, K.J. Vicente and D.J. Doyle. 1998. "Applying Human Factors to the Design of Medical Equipment: Patient-Controlled Analgesia." Journal of Clinical Monitoring and Computing 14(4): 253–63.

McEwen, T.R. and J.M. Flach. In Press. "Development and Evaluation of an Ecological Display for the Detection, Evaluation, and Treatment of Cardiovascular Risk." Proceedings of the 2012 Symposium on Human Factors and Ergonomics in Health Care.

Meister, D. 1999. The History of Human Factors and Ergonomics. Mahwah, NJ: Lawrence Erlbaum Associates.

Pronovost, P. 2006. "An Intervention to Decrease Catheter-Related Bloodstream Infections in the ICU." New England Journal of Medicine 355: 2725–32.

Reason, J. 1995. "Understanding Adverse Events: Human Factors." Quality in Health Care 4(2): 80–89.

Reason, J.T. 1997. Managing the Risks of Organizational Accidents. Aldershot, United Kingdom: Ashgate.

Trbovich, P., V. Prakash, J. Stewart, K. Trip and P. Savage. 2010a. "Interruptions during the Delivery of High-Risk Medications." Journal of Nursing Administration 40(5): 211–18. DOI: 10.1097/NNA.0b013e3181da4047.

Trbovich, P.L., S. Pinkney, J.A. Cafazzo and A.C. Easty. 2010b. "The Impact of Traditional and Smart Pump Infusion Technology on Nurse Medication Administration Performance in a Simulated Inpatient Unit." Quality and Safety in Health Care 19(5): 430–34.

Vicente, K.J. 1998. "Human Factors and Global Problems: A Systems Approach." Systems Engineering 1(1): 57–69. DOI: 10.1002/(SICI)1520-6858(1998)1:1<57::AID-SYS6>3.0.CO;2-8.

Vicente, K.J. 2002. "Ecological Interface Design: Progress and Challenges." Human Factors 44(1): 62–78.

Vicente, K.J. and J. Rasmussen. 1992. "Ecological Interface Design: Theoretical Foundations." IEEE Transactions on Systems, Man, and Cybernetics 22(4): 589–606.
