Healthcare Quarterly

Healthcare Quarterly 12(Sp) August 2009: 49-54. doi:10.12927/hcq.2009.20966

Ensuring the Safety of Health Information Systems: Using Heuristics for Patient Safety

Christopher J. Carvalho, Elizabeth M. Borycki and Andre Kushniruk


Health information systems (HISs) are typically seen as a mechanism for reducing medical errors. However, there is evidence to suggest that technology can facilitate or induce medical errors. Therefore, it is crucial that we fully test systems prior to their implementation in real-world settings. Presently, evidence-based evaluation heuristics that are specific to HISs do not exist for assessing aspects of interface design that may facilitate errors.

A three-phase study was conducted to determine the utility of evidence-based heuristics in evaluating a human-technology interface (i.e., the Veterans Affairs Computerized Patient Record System [VA CPRS®]). Phase one consisted of a systematic review of the health informatics literature involving technology-facilitated or technology-induced error. Phase two involved reviewing the literature and generating a comprehensive list of 38 heuristics that could be used to evaluate an HIS for technology-induced errors. Lastly, phase three involved conducting a heuristic evaluation of the VA CPRS® system using evidence-based heuristics. Results from this work are discussed.

According to Baker et al. (2004), approximately 185,000 hospital admissions in Canada each year are associated with an adverse event. Several studies published over the past two decades have demonstrated the effectiveness of HISs in reducing the number of adverse events and improving healthcare practitioner performance (Ash et al. 2007). Largely based on this literature, healthcare administrators have decided to invest in HISs (e.g., computerized provider order entry [CPOE], pharmacy systems, decision support systems and medication administration systems) for their organizations to reduce the number of adverse events arising from medical errors. However, in the past five years, several researchers have documented the potential of HISs to introduce new types of errors, called technology-induced or -facilitated errors (Koppel et al. 2005; Kushniruk et al. 2005). The number of studies with these findings continues to grow. Therefore, it is crucial that any potential negative effects of HISs be considered and that methods be developed to ensure HISs are safe prior to their implementation.


Each day healthcare organizations move through the process of procuring large-scale HISs with the hope of implementing a solution that provides added safety for patients. As HISs directly influence patient care activities, it is extremely important for organizations to take the safety of these systems into account during the procurement process. At present, there are few methodologies that can be used to evaluate the safety of an HIS. Much of the work in the health informatics literature has focused upon developing or using methods (e.g., observational research) to identify technology-induced errors only after the implementation of an HIS into an organizational setting (Ash et al. 2007; Campbell et al. 2006; Koppel et al. 2005). During the HIS procurement process, there is a need for low-cost, targeted approaches that provide immediate feedback to healthcare decision-makers (e.g., chief information officers). Some researchers have attempted to develop methods such as clinical simulation (see Kushniruk et al. 2005, 2006 and "Toward an Integrated Simulation Approach for Predicting and Preventing Technology-Induced Errors in Healthcare: Implications for Healthcare Decision-Makers," page 88) that could be used during the procurement process.

Heuristic evaluation has been identified by a number of human factors specialists (e.g., Nielsen 1993, 2000) as an expedient, general methodology for evaluating the quality of user interfaces. Heuristic evaluation involves one or more analysts systematically stepping through a system or user interface, comparing the system's design against a set of heuristics and noting the conformance to or violation of these heuristics. Heuristics typically represent principles of good design (e.g., a consistent user interface layout, clear navigational links, etc.).
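The inspection loop described above — an analyst stepping through a system and recording conformance to or violation of each heuristic — can be sketched as a simple record-keeping structure. This is an illustrative sketch only; the heuristic strings and the `judge` function are invented for the example, not part of the authors' method.

```python
from dataclasses import dataclass

@dataclass
class HeuristicResult:
    heuristic: str   # the design principle being checked
    conforms: bool   # True if the interface satisfies it
    note: str = ""   # evidence recorded during the walkthrough

def evaluate(heuristics, judge):
    """Step through each heuristic, apply the analyst's judgement
    function and collect one result per heuristic."""
    return [HeuristicResult(h, *judge(h)) for h in heuristics]

# Hypothetical walkthrough: every heuristic is judged to conform.
outcomes = evaluate(
    ["System has clear log-on and log-off",
     "System displays medication status"],
    judge=lambda h: (True, "observed in demo walkthrough"),
)
violations = [r for r in outcomes if not r.conforms]
```

In a real evaluation the `judge` step is the analyst's expert inspection of the live interface; the value of the structure is simply that every heuristic yields an explicit, auditable conform/violate record.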

Unfortunately, there are few publications that provide examples of evidence-based heuristics (i.e., those derived from human factors research in healthcare involving error) that could be used to evaluate the ability of an interface to prevent technology-induced errors. The purpose of this paper is (1) to outline the development of evidence-based guidelines or heuristics that are potentially useful for organizations purchasing HISs and (2) to report on the outcomes of using those heuristics to evaluate the Veterans Affairs Computerized Patient Record System (VA CPRS®).

Development of Heuristics for Preventing Technology-Induced Errors

The heuristics were developed in three phases. In phase one, a systematic review of research articles published in MEDLINE and the Web of Science from 2005 onward was undertaken to identify published research outlining potential harms arising from the use of HISs. The key search terms were informatics medical errors, informatics induced errors, technology-induced errors, technology facilitated errors and CPOE errors. In phase two, a round-table discussion took place among three health informatics experts during which the findings from phase one were reviewed. The round-table discussion led to the generation of a comprehensive list of 38 evaluation heuristics (Figure 1). The 38 evaluation heuristics were then grouped according to four usability themes: workflow issues, content issues, safeguards and functional issues. Workflow issues refer to concerns that arise when the steps of a process do not run well from beginning to end (Haag et al. 2004), while content issues refer to the quality of the subject matter contained in the HIS. Safeguards include the presence of differing types of passive and active decision supports that prevent a medical error from occurring (e.g., alerts, reminders, laboratory reference range values). Functional issues focus on the functions that an HIS can perform (e.g., the ability to scroll through screens) (Preece et al. 1994). A more detailed description of the literature that was reviewed and that formed the basis for developing the heuristics is published elsewhere; for further details, see Carvalho et al. (2009).


Figure 1: Comprehensive list of 38 evaluation heuristics

  • System does not change the business process.
  • System does not impose parallel activity or sequential ordering.
  • System allows more than one person to view the records at the same time.
  • System has flexible screen sequences.*
  • System has clear log-on and log-off.
  • Information is consistent in computer and paper record in hybrid electronic-paper environments.
  • System medication information is on the computer and is compatible with the paper record.
  • System accommodates clinician physical activities.
  • User has ability to override the system during an emergency.
  • User requires a minimal number of clicks for entering medication orders.*
  • System lists medications in terms of priority where appropriate (e.g., stat meds should be found at top).
  • System displays medication status.*
  • System allows for customization of medication lists and synonyms to the health care organization.*
  • System clearly displays the date and time the medication was updated.*
  • System medication information should be guidelines-based.
  • System should limit or not use defaults for medications.*
  • System should provide information about the origins of the defaults to users (i.e., vendor suggested or health care organization suggested default).*
  • System rules are up to date (i.e., rules for alerts and reminders, etc.).
  • System provides information about each drug order on the same screen.
  • Heuristic 30
  • System information in the EHR is consistent with information found in other systems.
  • Heuristic 6
  • Heuristic 7
  • Heuristic 31
  • System conducts interaction checking (e.g., drug-drug, drug-diluent).
  • System checks for duplicated medications, IV medications and procedures.*
  • System alerts and reminders should be consistent with current organizational policies and procedures.*
  • System provides appropriate levels of record and record field locking.
  • System ensures allergy and reminder information does not lead to high false positives.
  • System displays normal ranges for medication doses.
  • System displays patient's room such that there could be no error in giving the wrong medication to the wrong patient.*
  • Heuristic 13
  • System allows for linkages between ordering and information technology discontinuation.
  • System allows for linkages between medication ordering, administration and discontinuation procedures.
  • System menus are scrollable and clearly marked.*
  • System signals the user when the first dose is to be given on all non-standard orders, procedures, medication orders, etc.
  • System allows the user to post notes and make annotations regarding patients' special conditions, etc.
  • System limits the free-text that other users may not be able to see.*
EHR= electronic health record
* Indicates heuristic that was used during the usability inspection of the VA CPRS®.


Using Heuristics to Evaluate an HIS

Phase three of this work involved using the generated heuristics to evaluate an HIS. In order for the researchers to evaluate the HIS in real-world conditions (i.e., those typical of analysts evaluating an HIS prior to its procurement), they developed an ecologically valid scenario (Haimson and Elfenbein 1985): an analyst using the heuristics to conduct an evaluation (Nielsen 1993) of a demonstration version of a system. (Demonstration versions of systems are often reviewed by healthcare decision-makers, as the installation of a full system during the procurement process would be too difficult to pursue [Preece et al. 1994].) A demonstration version of the VA CPRS® was used for the heuristic evaluation.

The VA CPRS® is among the best-known and most researched HISs in North America. It is associated with significant improvements in the quality of patient care. As well, it is known for its high usage rate among clinicians (Morgan 2005), and it is currently used in many countries around the world (e.g., Finland, Jordan, Germany, India) (Protti and Groen 2008). The analyst compared the VA CPRS® against each of the heuristics in turn and recorded the outcomes of the heuristic evaluation (Figure 2).


Figure 2. Outcomes of heuristic evaluation
1. System displays medication status. The system displays the medication's order date, stop date, expiry date and order status (the system displays all the information that indicates medication status).
2. User makes minimal number of clicks for entering medication orders. Approximately five clicks were needed to order a medication (this included opening a new order form, selecting the medication, the dose, and submitting the order).
3. System limits the free-text that other users may not be able to see. Some sections of the VA CPRS® did allow for free text.
4. System allows for customization of medication lists and synonyms to the healthcare organization. The VA system is a homegrown system. Therefore, it was assumed the synonyms were customized to each hospital.
5. System has flexible screen sequences. The system accommodates differing types of clinician tasks. (The tabs at the bottom of the screen give the clinician user the flexibility to choose a sequence of screens.)
6. System provides information about each medication order on the same screen. All medication information is located on the same screen. (The system provides a small summary field that displays the medication name and dose. The system also has functionality that can open up a more detailed report on the ordered medication and a complete medication history.)
7. System alerts and reminders should be consistent with current organizational policies and procedures. The VA system is a homegrown system. Therefore, it was assumed that all alerts were originally based upon the organization's policies and procedures.
8. System should limit or not use defaults for medications. For some medications, the system displayed several possible doses that could be selected from as well as several possible medication schedules that could be used. If a medication order is submitted with either the medication dose or schedule fields empty, an error message appears that asks the clinician user to enter a dose and a schedule. Defaults were not automatically added to these fields. Therefore, there is little opportunity to enter a wrong dose or schedule without the user selecting it by him or herself.
9. System should provide information about the origins of the defaults to users (i.e., vendor suggested or healthcare organization suggested default). The VA system does not use defaults in the medication ordering component of the system. The system does provide options that can be selected from (i.e., dose or schedule of medical administration). The origin of these options is not displayed. It is assumed the options have organizational origins as it is a homegrown system.
10. System menus are scrollable and clearly marked. Menus that exceed the assigned page limit are scrollable. The scroll bar appears on the right-hand side of the screen. The scroll bar is the only indicator that the field is scrollable. (Users who are not familiar with computers might experience difficulty using this feature.)
11. System clearly displays date and time the medication was updated. The system displays the medication order date and stop date.
12. System checks for duplicate medications, IV medications and procedures. If a user tries to order a medication, the system checks to see if the same medication has been ordered (i.e., a duplicate medication order). If a duplicate medication order occurs or if the patient is taking the medication, an alert will "pop up" indicating there is a duplicate order. The user can then accept the medication order or cancel it.
IV = intravenous; VA CPRS = Veterans Affairs Computerized Patient Record System.



After developing the evidence-based heuristics, we determined how well they highlighted the safety of an HIS by having an analyst use them to evaluate a demonstration version of the VA CPRS®. The analyst conducted the heuristic evaluation and recorded the findings for each developed heuristic (i.e., whether the system conformed to or violated the heuristic). A subset of only 12 of the 38 evidence-based heuristics could be applied using traditional heuristic evaluation approaches (see Figure 2 for the heuristics and the corresponding responses). Two of the 10 workflow issue heuristics, six of the 14 content issue heuristics, three of the eight safeguard issues and two of the six functional issues could be applied. Of the heuristics that were applied, the VA CPRS® system fulfilled the expectations of each (see Figure 2). The remaining 26 heuristics could not be applied in the context of an analyst sitting at a computer conducting a walkthrough. A simulation environment would need to be set up to test the system's implications for workflow using real-world tasks (Borycki et al. 2006).
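The applicability split reported above reduces to simple arithmetic — a worked restatement of the figures in the text, not new data:

```python
TOTAL_HEURISTICS = 38
STATIC_APPLICABLE = 12  # evaluable during an analyst walkthrough (Figure 2)

# Heuristics that cannot be checked statically and would need a
# simulation environment with real-world tasks.
needs_simulation = TOTAL_HEURISTICS - STATIC_APPLICABLE
static_coverage = STATIC_APPLICABLE / TOTAL_HEURISTICS
```

That is, 26 of the 38 heuristics — roughly two thirds of the set — could only be assessed dynamically.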

In order for the heuristics to be fully applied to the evaluation of the VA CPRS®, the analyst would need to understand the organization's plans in terms of implementing and customizing the system to the local organizational environment (e.g., the organization's intensive care unit, emergency room, medical units etc.). This would include developing a comprehensive understanding of the following:

  • Existing and planned changes to business or clinical processes and workflows (i.e., how the HIS will be used to automate and re-engineer clinical processes)
  • How the organization intends to implement the system (i.e., implementing the system all at once, pilot testing the system on a unit and implementing it on a unit-by-unit basis or incrementally implementing each component of the HIS over time) as this will impact upon clinical processes and workflows and will determine the system's impact upon users over time
  • The clinical guidelines and standards the organization is using
  • The technical standards involving other information systems that would influence the exchange of data with the system under consideration
  • Organizational plans to implement decision support in the clinical setting and its impact upon future clinical workflow (i.e., types of alerts and reminders)
  • The types of information and terminologies used by systems that would exchange data with the system that is being considered and how they may fit together
  • Organizational policies and procedures and how they could be represented in the system and used to support user information needs within the context of clinical workflow


Throughout the evaluation process, it became apparent that to evaluate and select an HIS, a number of issues needed to be attended to. Evidence-based heuristics alone, when applied by an analyst as part of a heuristic evaluation, do not provide sufficient information to fully evaluate the safety of an HIS. Based on the research on technology-induced errors, there is a need to evaluate HISs in the context of local organizational business or clinical processes, workflows, clinical guidelines, standards and organizational structures.

During the procurement process, heuristics can be used to conduct a preliminary assessment of the potential of a technology to facilitate medical errors. If a system passes the heuristic evaluation, then the system could be considered for a more dynamic evaluation involving clinical simulation within a local organizational context. Researchers have effectively used clinical simulations to test the safety of information systems using real-world scenarios, devices and information systems (see Kushniruk et al. 2005, 2006). Such simulations can be undertaken in an organization and can involve individuals with knowledge of clinical and organizational processes, guidelines, standards, policies and procedures as a second phase to evaluating systems safety (Kushniruk and Borycki 2006).
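The two-phase screening described here — an inexpensive static heuristic inspection first, with clinical simulation reserved for systems that pass — can be sketched as a simple gate. The function names and return strings below are hypothetical, chosen only to make the control flow concrete:

```python
def heuristic_inspection(violations):
    """Phase 1: a system passes the static walkthrough when the
    analyst recorded no heuristic violations."""
    return len(violations) == 0

def procurement_screen(violations, run_simulation):
    """Run the cheap static check first; only a passing system
    proceeds to the costlier clinical simulation (phase 2)."""
    if not heuristic_inspection(violations):
        return "rejected at heuristic evaluation"
    return run_simulation()

# Hypothetical usage: no violations were recorded, so the system
# moves on to the dynamic, simulation-based evaluation.
decision = procurement_screen([], lambda: "proceed to clinical simulation")
```

The design point is the ordering: the static check is cheap enough to apply to every candidate system, so it filters out clearly unsafe designs before any simulation resources are committed.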

In the process of system procurement, it is important to fully evaluate HISs using heuristics that are specific to safety. This evaluation should be completed prior to the purchase of an HIS to obtain a full understanding of the functions of the HIS within the organization's context. The ability to compare HISs is important when considering purchasing, implementing and customizing solutions. Standards need to be used to provide metrics that can aid in comparing these systems and thereby support HIS selection. These newly created heuristics can be used as an initial approach to testing and comparing HIS safety (Kushniruk et al. 2009).

Although these heuristics can provide initial insight into the safety of systems, there are limitations associated with their use. Heuristics provide a static view of an HIS. Heuristics do not take into account local organizational guidelines, standards, business and clinical processes, nor do they take into account the organizational context for performing work. Furthermore, dynamic testing may be needed involving the use of clinical simulations in which users perform representative tasks within the context of the organization (Borycki et al. 2009).


Terminology at a Glance

Computerized provider order entry (CPOE): Information system designed to support computerized input of physician orders.

Evidence-based heuristics: Heuristics (i.e., rules of thumb) based on empirical evidence.

Heuristic evaluation: A form of usability engineering involving inspection of a user interface or system by trained analysts (who compare the interface or system against a set of heuristics).

Technology-induced error: Error that inadvertently occurs as a result of using a technology (e.g., medication errors that result from using a system).

Usability engineering: A scientific set of methods that can be applied to systematically improve the usability of information systems.

Veterans Affairs Computerized Patient Record System (VA CPRS®): One of the world's most successful electronic patient record systems designed and deployed by the US Department of Veterans Affairs.



Currently, there is a significant desire among healthcare organizations to introduce HISs to hospitals and clinics. Most of the related research has found that the HISs that are currently being implemented are improving patient care. However, there is some research that shows that technology may be contributing to errors, rather than preventing them.

Many researchers use general heuristics to assess the usability of HISs. By developing heuristics grounded in the published research, a new set of evidence-based heuristics has been created for assessing HIS safety. The developed evidence-based heuristics proved to be effective in assessing the safety of the VA CPRS®. These heuristics are new and will need to be tested on multiple systems before they can be considered a standard for providing initial insight into system safety.

There are a number of other factors that need to be considered when assessing system safety. First, the knowledge level of the users must be considered since experienced and inexperienced users may have differing error rates. Also, user training affects error rates. Organizations must consider the time it will take users to gain sufficient experience with an HIS. Lastly, every HIS has its limitations, and these need to be fully understood before a system is implemented. These limitations can be found with the use of heuristics.

When an HIS is implemented, the organization and its users must fully understand its impact. With the use of evidence-based heuristics, HISs can be tested for their potential to inadvertently facilitate technology-induced errors. In this manner, organizations can select an HIS that will contribute to, not hinder, patient safety.

About the Author(s)

Christopher J. Carvalho, BSc, is working with Courtyard Group in Edmonton, Alberta.

Elizabeth M. Borycki, RN, PhD, is an assistant professor at the School of Health Information Science, University of Victoria, Victoria, British Columbia.

Andre Kushniruk, PhD, is a professor at the School of Health Information Science, University of Victoria.


Ash, J.S., D.F. Sittig, R.H. Dykstra, K. Guappone, J.D. Carpenter and V. Seshadri. 2007. "Categorizing the Unintended Sociotechnical Consequences of Computerized Provider Order Entry." International Journal of Medical Informatics 76(1): 21-27.

Baker, G.R., P.G. Norton, V. Flintoft, R. Blais, A. Brown, J. Cox, E. Etchells, W.A. Ghali, P. Hebert, S.R. Majumdar, M. O'Beirne, L. Palacios-Derflingher, R.J. Reid, S. Sheps and R. Tamblyn. 2004. "The Canadian Adverse Events Study: The Incidence of Adverse Events among Hospital Patients in Canada." Canadian Medical Association Journal 170(11): 1679−86.

Borycki, E.M., A.W. Kushniruk, S. Kuwata and H. Watanabe. 2009. "Simulations to Assess Medication Administration Systems." In B. Staudinger, V. Hoess and H. Ostermann, eds., Nursing and Clinical Informatics. Hershey, PA: Idea Group.

Borycki, E.M., A.W. Kushniruk, S. Kuwata and J. Kannry. 2006. "Use of Simulation Approaches in the Study of Clinician Workflow." AMIA Conference Proceedings 61−65.

Campbell, E.M., D.F. Sittig, J.S. Ash, K.P. Guappone and R.H. Dykstra. 2006. "Types of Unintended Consequences Related to Computerized Provider Order Entry." Journal of the American Medical Informatics Association 13(5): 547−56.

Carvalho, C.J., E.M. Borycki and A.W. Kushniruk. 2009. "Using Heuristic Evaluations to Assess the Safety of Health Information Systems." Studies in Health Technology and Informatics. 143: 297-301.

Haag, S., M. Cummings, D.J. McCubbrey, A. Pinsonneault and R. Donovan. 2004. Management Information Systems for the Information Age. Toronto, ON: McGraw-Hill.

Haimson, B.R. and M.H. Elfenbein. 1985. Experimental Methods in Psychology. Toronto, ON: McGraw-Hill.

Koppel, R., J.P. Metlay, A. Cohen, B. Abaluck, A.R. Localio, S.E. Kimmel and B.L. Strom. 2005. "Role of Computerized Physician Order Entry Systems in Facilitating Medication Errors." Journal of the American Medical Association 293: 1197−203.

Kushniruk, A.W. and E.M. Borycki. 2006. "Low-Cost Usability Engineering: Designing and Customizing Usable Healthcare Information Systems." Healthcare Quarterly 9(4): 98−100.

Kushniruk, A.W., E.M. Borycki, K. Myers and J. Kannry. 2009. "Selecting Electronic Health Record Systems: Development of a Framework for Testing Candidate Systems." In J.G. McDaniel, ed., Advances in Information Technology and Communications in Health (Vol. 143). Fairfax, VA: IOS Press.

Kushniruk, A.W., E.M. Borycki, S. Kuwata and J. Kannry. 2006. "Predicting Changes in Workflow Resulting from Healthcare Information Systems: Ensuring the Safety of Healthcare." Healthcare Quarterly 9: 114−18.

Kushniruk, A.W., M.M. Triola, E.M. Borycki, B. Stein and J.L. Kannry. 2005. "Technology-Induced Error and Usability: The Relationship between Usability Problems and Prescription Error when Using a Handheld Application." International Journal of Medical Informatics 74: 519−26.

Morgan, M.W. 2005. "The VA Advantage: The Gold Standard in Clinical Informatics." Healthcare Papers 5(4): 26−29.

Nielsen, J. 1993. Usability Engineering. Toronto, ON: Academic Press.

Nielsen, J. 2000. Designing Web Usability. Berkeley, CA: New Riders Publishing.

Preece, J., Y. Rogers, H. Sharp, D. Benyon, S. Holland and T. Carey. 1994. Human-Computer Interaction. Don Mills, ON: Addison-Wesley.

Protti, D. and P. Groen. 2008. "Implementation of the Veterans Health Administration VistA® Clinical Information System around the World." Healthcare Quarterly 11(4): 83−89.

