Background Computerised provider order entry (CPOE) is an important patient safety intervention that has encountered significant barriers to implementation. The usability of a CPOE system plays a significant role in its acceptance. The authors conducted a heuristic evaluation of a CPOE order set system to uncover existing usability issues prior to implementation.
Methods A heuristic evaluation methodology was used to evaluate the usability of a CPOE test order set system. The system was assessed against 10 heuristic principles, such as error prevention and helping users recognise, diagnose and recover from errors. Evaluators included a staff physician with extensive clinical experience and three engineers with expertise in heuristic evaluation methodology. The results of the heuristic evaluation were used to create a user centred design prototype.
Results 92 unique heuristic violations were found in the CPOE test order set system: 35 were identified by both the clinician and at least one engineer, and 57 of the 92 violations (62%) were found only by the clinician. All evaluators identified at least one violation of each of the 10 usability heuristics in their analysis of the CPOE system. A user centred design prototype was created to demonstrate changes that could improve usability.
Interpretation The CPOE test order set system had many usability heuristic violations. Many violations were found by a clinician with knowledge of the heuristic evaluation process. Implementation of the CPOE system was deferred and a new user centred design prototype was developed for future study. The authors recommend conducting heuristic evaluations early in the process of designing, selecting and implementing CPOE systems.
- Computerised provider order entry
- computerised order sets
- usability evaluation
- human factors engineering
- medical order entry systems
- human factors
- information technology
Computerised provider order entry (CPOE) systems have the potential to improve quality of care, decrease medication errors and reduce adverse drug events.1–7 Unfortunately, implementations of CPOE systems are often unsuccessful, and only 15% of hospitals in the USA have implemented CPOE for medications.8 Organisational barriers, such as increase in clinician work, unfavourable workflow issues, negative emotions and generation of new kinds of errors, are the focus of most research studies on CPOE implementation.4 8–12 Many studies only note the importance of system usability and its major role in provider acceptance and successful implementation.8 9 13–16 A consistent user-friendly interface enables users to accept CPOE despite significant changes in workflow.5 9 10 However, despite the importance of usability in successful CPOE implementations, there are few studies that directly focus on CPOE system usability17–23 and even fewer that conduct quantitative as well as qualitative evaluations24 25; an extensive literature review of CPOE usability identified important design considerations that, when neglected, often result in poorly designed systems.17 Importantly, the few studies of CPOE usability do not include end users (ordering providers) in their usability evaluations.
The user centred design method considers the needs and limitations of end users in each stage of the design process, using methods such as heuristic evaluations, cognitive walkthroughs, field studies, task analyses and usability testing.22 Heuristic evaluation is a widely adopted engineering method, developed by Jakob Nielsen, for evaluating the usability of a user interface design.26 27 Heuristic evaluation is often used as a preliminary usability assessment tool because it is rapid and relatively inexpensive.28 Heuristic evaluation involves a systematic examination of user interfaces or software programs by usability experts to evaluate their compliance with a recognised list of usability principles.29 This methodology is extensively used across industries and has been applied to uncover usability issues with medical technology.28 30 Typically, heuristic evaluations are performed by experts in usability, and do not involve representative end users of the system. Such evaluations may overlook usability problems if these experts lack the necessary knowledge of the clinical task.23 The authors addressed this potential limitation by including a representative end user in a heuristic evaluation.
Sunnybrook Health Sciences Centre planned to implement a vendor CPOE system. The authors wanted to evaluate the usability of the vendor's system prior to implementation. The focus of this study was on a key feature of CPOE systems, standardised medication order sets. Specialty-specific order sets conveniently group medication orders for the standardised treatment of specific conditions. Order sets are believed to save time for ordering providers and represent a significant opportunity to decrease variation in care and enhance compliance with treatment guidelines.2 They are able to do so by improving ordering efficiency, completeness and accuracy of orders,1 and reducing the ordering of unnecessary tests.31
The authors specifically involved an end user of the CPOE system in the heuristic evaluation to ensure that a broader range of usability concerns would be identified,30 and to better inform the design of a more user-friendly system prototype for future usability testing on a broader sample of end users.
Design and setting
Four independent evaluators performed Nielsen's method of heuristic evaluation32 on only the Sunnybrook CPOE test order set interface; no heuristic evaluation was performed on the existing paper order sets. The heuristic evaluation involved the assessment of the Sunnybrook CPOE test order set system against 10 usability heuristics to identify usability problems. Table 1 provides a list of the usability heuristics with brief descriptions.
The study was conducted at Sunnybrook Health Sciences Centre in the Information Services Usability Evaluation Laboratory. Sunnybrook is a 1200-bed academic hospital in Toronto, Canada with approximately 10 000 staff, physicians and volunteers. The hospital specialises in caring for Canada's war veterans, high-risk pregnancies, critically ill newborns, adults and the elderly, and treating and preventing cancer, cardiovascular disease, neurological disorders, orthopaedic and arthritic conditions and traumatic injuries. At Sunnybrook, all physician orders are written on paper. However, a limited subset of these orders, such as laboratory and radiology, are subsequently entered into an electronic patient care system (OACIS Clinical Care suite) and conveyed to the appropriate ancillary service. Occasionally, physicians enter these written orders into OACIS, but most written orders are entered into the system by nurses or administrative staff.
The CPOE test system included dummy patients that allowed the evaluators to completely simulate the ordering process.
Sunnybrook's CPOE team created the CPOE test platform (figure 1) using the OACIS Clinical Care Suite (Dinmar 2003, Ottawa, Canada, client-server version 7.0). OACIS has been in use at Sunnybrook for over 10 years for viewing selected clinical results and managing a limited number of orders, primarily laboratory tests and diagnostic imaging. The Sunnybrook CPOE technical team configured order sets within the OACIS system (figure 2) based on the content of existing pre-printed paper order sets (figure 3). This OACIS version is an older system that has been superseded by newer versions since this study was conducted. The focus in this study was on order sets used by the general medical service, the intended service for the pilot CPOE implementation.
To submit an order set in the CPOE test system, the user must first log in to OACIS and then select a patient from the available roster. The CPOE test system screen will then appear (figure 1) containing patient information and the three system panes. Users select orders from the catalogue pane by expanding the folders in the hierarchical tree; the order sets are grouped in folders by service under ‘Order Set’. To complete an order set, the user selects the order set by clicking and dragging its icon into the workspace pane. The orders contained within that order set will then be listed in the workspace (figure 2). When an item in the workspace pane is selected, its details will then be displayed in the detail pane. Each item is then individually selected in the workspace pane to view and complete the details for that order in the detail pane. Any additional orders that are not part of the set will need to be found and ordered from the catalogue pane. Once the necessary information is entered for each item in the workspace pane, the submit button in the toolbar will activate and the user can then press it to submit the order set. Help options are available in the menu bar and in the different tabs in the detail pane (ie, information about warnings, order status, order references, and supporting data).
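The workflow above can be summarised as a simple state model: the submit button activates only once every order in the set has had its details completed in the detail pane. The following Python sketch is purely illustrative (it is not the OACIS implementation, and all class, method and order names are assumptions):

```python
# Illustrative model of the CPOE order-set workflow described above.
# Hypothetical names throughout; not the vendor's C# or OACIS code.

class OrderSetSession:
    def __init__(self, order_names):
        # Dragging an order set into the workspace lists its orders,
        # each still awaiting details from the detail pane.
        self.details = {name: None for name in order_names}

    def complete_details(self, name, details):
        # Selecting an item in the workspace and filling in its detail pane.
        self.details[name] = details

    @property
    def submit_enabled(self):
        # The toolbar submit button activates only when every order
        # in the set has had its details entered.
        return all(v is not None for v in self.details.values())

session = OrderSetSession(["CBC", "Electrolytes", "Chest x-ray"])
session.complete_details("CBC", "once, routine")
print(session.submit_enabled)  # False: two orders still need details
```

The model makes the usability cost visible: because details live in a separate pane, the user must visit each order in turn, and nothing on the workspace headings indicates which orders remain incomplete.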
The user centred design prototype was created in C# based on the results of the usability evaluation and the 10 usability heuristics. The authors ensured that the new design adhered to the heuristic principles, although a heuristic evaluation of the new prototype was not subsequently performed. It is a completely new interface meant to demonstrate the appearance of an order set system that better adheres to usability design principles. An opening screen displays a button for each of the general internal medicine (GIM) order sets. When an order set is selected, the order set screen then appears (eg, when the ‘general internal medicine standard admissions’ button is pressed, the order set screen appears as in figure 4). When the order set is submitted, a pop-up message appears indicating that the order set was successfully submitted.
Four independent heuristic evaluations were completed on the CPOE test order set system. One participant was a staff physician with 22 years of clinical experience (including 7 years' experience with a similar order entry interface for ordering laboratory and radiology investigations) and limited experience with heuristic evaluations. The remaining participants were engineers with human factors training and experience with usability design and heuristic evaluation. One engineer was familiar with the OACIS CPOE test order set system prior to the evaluation.
Participants were shown how to log in to the CPOE test order set system, navigate to the order entry screen, and locate the GIM order sets prior to performing their evaluations. Each participant was given the list of 10 usability heuristics presented in table 1 to use as a guide in their evaluations of the CPOE test system. Participants were asked to explore the various system interfaces and document their observations and comments as they related to these principles. They were instructed to remain in the order entry screen and to only document observations and comments that pertained to the GIM order sets. Participants completed their heuristic evaluations of the CPOE test system independently, and were instructed to continue compiling heuristic violations until they felt that they had explored all order set functionality. No other specific tasks or procedures were used. Each evaluation took approximately 2 h.
From the 10 usability heuristics considered (table 1), the four evaluators found a total of 92 unique usability heuristic violations for the CPOE test order set system (table 2); a summary of the observations is provided in online appendix 1. The following heuristics were found to be violated most often: visibility of system status; consistency and standards; aesthetic and minimalist design; error prevention; and help users recognise, diagnose and recover from errors. Over half of the violations, 57 of the 92 (62%), were identified only by the staff physician.
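The reported counts follow from simple set arithmetic over the evaluators' independent findings. The sketch below uses hypothetical violation identifiers (not the study data) arranged to mirror the reported aggregation:

```python
# Hypothetical violation IDs, chosen only to reproduce the reported counts:
# the physician identified all 92 unique violations, 35 of which were also
# found by at least one engineer.
clinician = set(range(92))
engineers = set(range(35))

unique_total = clinician | engineers      # union of all findings
clinician_only = clinician - engineers    # found by the physician alone

print(len(unique_total))                                   # 92
print(len(clinician_only))                                 # 57
print(round(100 * len(clinician_only) / len(unique_total)))  # 62
```

Deduplicating findings this way, rather than summing per-evaluator tallies, is what yields the 92 unique violations reported in table 2.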
A user-friendly CPOE order set interface that better complies with these usability heuristics would include: the preservation of the current provider ordering model; the ability to review all orders in one screen; the use of only checkboxes, drop-down lists and free text fields for user inputs; and a simpler and aesthetically pleasing interface. The user centred design prototype CPOE order set system interface (figure 4) was designed to demonstrate how these changes may be incorporated into future system redesign. This new design was later evaluated in a separate usability study involving a randomised controlled trial with representative end users.33
The heuristic violations were addressed in the user centred design prototype (figure 4). First, all selected orders and their details were fully displayed on a single screen for easy review throughout the ordering process. This design solution addresses a problem with the layout of the CPOE test system which made it difficult for users to recognise ordering errors prior to submission; the system displays ordering information in a three-pane layout (figure 1) where the catalogue pane contains a list of all orders within the order set, orders are selected from the catalogue and subsequently shown in the workspace pane, and the detail pane displays the details of a selected order from the workspace pane. To review orders, the user would have to click on each individual order in the workspace pane to see its details in the detail pane before submitting the order set.
Consistency was maintained in how drug information is displayed on the system. This design solution addresses problems with the consistency and standards heuristic where drug information is displayed inconsistently in the CPOE test system during ordering tasks; for example, if a user selects morphine 1 mg orally every 4 h from the catalogue, it appears as ‘Morphine sulphate q4h po 1 mg’ in the workspace which may cause user confusion.
An obvious way was included to undo actions through manipulation of checkboxes, drop-down lists, and free text fields. This design solution addresses a violation of the user control and freedom heuristic where the CPOE test system fails to provide users with an obvious method to undo actions; users of the CPOE test system must learn to right click on an order then select ‘undo’ from the bottom of a long drop-down menu.
Finally, the number of system modes was reduced to increase visibility of the system's status. This design solution addresses a violation of the visibility of system status heuristic when different order entry system modes (eg, search mode and ordering mode) are virtually identical and easily confused in the CPOE test system. Users may be unable to complete an order if the wrong mode is in use. The redesign only uses one system mode that allows users to accomplish all order entry tasks.
These design solutions should alleviate the high cognitive load imposed on the user by the CPOE test system by reducing the amount of information users have to keep in working memory. For example, the CPOE test system currently requires users to remember the following: the current system mode; details of each order in the order set (this information is not available unless each item is individually selected and viewed in the detail pane); and the status of each order in the set (headings for orders in the order set do not indicate if they are complete or require further user input).
The usability of the CPOE test order set system was examined using 10 recognised usability heuristics. The system was found to violate all usability heuristics, with a total of 92 violations. The system did not match the real world; was inconsistent in some areas and did not make use of accepted standards; did not provide the user with a clear indication of its status; did not properly address or handle errors; was inflexible, inefficient and imposed high cognitive loads on the user; and was not simple or aesthetically pleasing. The authors concluded that the CPOE test system was not user friendly and needed to be redesigned prior to implementation.
There are no prior published heuristic evaluations of CPOE order set systems. The method used in this study uniquely included a clinician in the evaluation process who identified a significant number of concerns not identified by the engineers. Usability evaluations can fail to identify violations if the evaluators lack the necessary domain knowledge for user interfaces like CPOE.23 Similarly, CPOE systems that are designed with limited input from end users can fail to perform as intended and may suffer from poor usability. Only the end users, clinicians, can appropriately judge the presentation of content and preferred functionality of a complex system like CPOE. The results show a significant potential for improved CPOE system usability by more heavily involving clinicians in the research and development process for these systems.
The study has several limitations. Only one CPOE system was evaluated that was still under development, so the results cannot be generalised to other CPOE systems. The heuristic evaluation process is minimally standardised, so the ability to replicate this process is unknown. The reviewers worked independently; different results could be achieved if an engineer worked with a clinician during the heuristic evaluation process.
The results had an immediate impact. CPOE implementation has been deferred until the usability concerns are addressed. The results also led to the development of a user centred interface that has been given to the vendor with the study findings. This interface addressed many usability concerns, including lack of system status and feedback, ability to review orders prior to submission, and consistency with familiar user conventions and language. The vendor has incorporated these findings into future designs. The authors suspect that CPOE implementation would have failed given the number of usability issues discovered within a reasonably short period of time (ie, 92 unique violations within the first 2 h of use with four users). The authors recommend that any organisation considering CPOE implementation should conduct a heuristic evaluation. The authors also recommend that providers be involved in heuristic evaluations to ensure that necessary domain knowledge is applied. This heuristic evaluation provided Sunnybrook with a quick, inexpensive appraisal of the CPOE test system. However, to identify specific areas of improvement and to obtain quantitative measures, there should also be usability studies with a larger sample of representative end users.33
In addition to increasing provider acceptance and improving quality and safety of patient care, there are also potential financial benefits to improving CPOE usability. Budgets for annual maintenance (training and support) in hospital CPOE implementations have been estimated to be as high as $1.35 million for a 500-bed hospital.11 By implementing systems with better designs and improved usability, there is an enticing possibility to reduce this need for support and training.
In summary, a heuristic evaluation of a CPOE system was conducted prior to implementation. Many usability concerns were found that led to deferral of implementation and creation of a prototype user centred design interface. The authors recommend performing heuristic evaluations with end users early in the process of CPOE selection, design and implementation.
The authors thank the staff at Sunnybrook Health Sciences Centre who supported and participated in this research.
Funding This project was supported by a student grant to Ms Chan from Information Services, Sunnybrook Health Sciences Centre.
Competing interests Dr Etchells is currently the President of the OACIS User Group (OUG). OUG is a voluntary association of healthcare organisations that use the OACIS system. Dr Etchells receives no payments or inducements from the owners of OACIS. We provided a preliminary copy of our report to the makers of OACIS.
Provenance and peer review Not commissioned; externally peer reviewed.