Back in the mid-90s, doctors who bought EHR systems were trying to solve specific problems.   These early adopters would listen as I listed the many ways that EHR systems would change their lives, acknowledge the issues I raised, then buy an EHR anyway.   More often than not, however, they were happy with how things turned out because the problems they sought to solve were eliminated.

In academic medical centers, informaticists and clinicians who saw the potential of IT to improve care delivery drove adoption. Here, most systems were homegrown and, by extension, custom-designed. The success of those nascent EHR systems in academic and early-adopter private practices gave the impression that EHR systems were ready for wide adoption–eventually leading policy makers, cheered on by vendors and visionaries, to advocate for what became the federal EHR incentive programs. The big push started in 2003 and was fully realized in HITECH.

As is usually the case with complex systems, the downstream effects of widespread EHR adoption were not anticipated. Overly optimistic assumptions about the state of EHR technology and a lack of familiarity with the effort and creativity required to design and build EHR systems led to unrealistic MU-related feature creep. Some assumptions–that the HL7 V3 RIM was ready for building real-world interoperability solutions, or that anyone really knew what semantic interoperability was or looked like in real life–were obviously wrong. More recent assumptions (that usability research along with better user-centered design processes will solve EHR workflow and safety issues) will be debunked in good time. Disney aside, frogs do not turn into princes.

The chart metaphor is not the proper design inspiration for supporting the complete range of clinical processes. Until this is accepted, EHR usability research (with all of its problems) (1, 2) and the UCD processes that make use of that research will not result in more process-friendly systems.  Hopefully, research on the unintended consequences of EHR adoption will provide a much-needed reality check on the true complexities of clinical software (3, 4, 5, 6, 7).   

Whenever I read about safety issues, lower productivity, ransomware, or other EHR-related problems, Jurassic Park comes to mind. Putting dinosaurs and humans together is not a good idea, and neither is giving complex computer systems to organizations with little understanding of information security or process management. The rush to install systems and get an incentive payment was little different from setting out to see the dinosaurs with a hurricane approaching—that someone would be eaten was only a matter of time.

In 2006, Campbell et al. provided a list of CPOE-related issues that, thanks to wider EHR adoption, are now well known (3).

  • More/new work for clinicians
  • Unfavorable workflow issues
  • Never ending demands for system changes
  • Paper persistence
  • Changes in communication patterns and practices
  • Negative emotions
  • New kinds of errors
  • Changes in the power structure
  • Overdependence on technology

Too many clicks and productivity losses are major complaints. Workarounds are common. Customizations are frequent and can be troublesome to maintain (i.e., retaining them through product updates). Clinician burnout is increasing. Every item in the above list has been documented in the literature. Given that this list is from 2006, one would think that 11 years later, EHR issues would have decreased—except they have not.

In New Unintended Adverse Consequences of Electronic Health Records (4), Sittig and colleagues add six additional consequences to ponder. 

  • Complete clinical information unavailable at the point of care
  • Lack of innovations to improve system usability leads to frustrating user experiences
  • Inadvertent disclosure of large amounts of patient-specific information
  • Increased focus on computer-based quality measurement negatively affects clinical workflows and patient-provider interactions
  • Information overload from marginally useful computer-generated data
  • Decline in the development and use of internally-developed EHRs

Complete clinical information unavailable at the point of care
What is one to make of this new collection? Certainly, during my practice days, I spent many hours tracking down missing consultant reports, so that is not an EHR-only issue. The lack of seamless data exchange and interoperability is less an unintended consequence than it is the aftermath of an overactive imagination. As someone who has developed clinical software and tried to make sense of HL7 V3 RIM documentation, I would have happily volunteered back in 2004 that it was not going to be easy to implement.

Anyone who actually writes code knows that when building something that has never existed, it is best to start with a basic prototype to test one’s assumptions, learn from that, and repeat. The very thing NOT to do is have a committee come up with a complex specification, make it a standard, THEN see if it can be used in real-world settings. The usual engineering questions—How much does it cost to build version 1? What skills are needed to build it? How difficult is it to maintain? Is it easy to upgrade or change?—should be answered before blessing a standard.

Lack of innovations to improve system usability leads to frustrating user experiences
The lack of usability innovations is directly tied to the choice of the paper chart as a design metaphor instead of the processes that comprise clinical work. Applying usability research to EHR design requires a standard way of thinking about clinical processes and clinical work. No such standard exists for clinical workflows (see Workflows with Friends). Lacking standards, usability research is difficult to compare across projects and research groups (1, 2). If you have seen one UCD process, then you have seen one UCD process.

Inadvertent disclosure of large amounts of patient-specific information
Ransomware and an increasing number of breaches—‘nuff said.   Security remains an afterthought in both software design and organizational priorities.

Increased focus on computer-based quality measurement negatively affects clinical workflows and patient-provider interactions
MU insisted that computers should keep track of quality. This requires a lot of data collection, much of which, due to insufficient granularity, will prove to be nearly useless. However, that is not the main problem. The big problem is that EHR systems were forced into the role of data collection devices instead of clinicians’ assistants, adding merit to complaints that they turn clinicians into data entry clerks. There is nothing wrong with data collection. However, MU demanded too many changes to systems, too fast. Data collection requirements were added in the fastest way possible; usability and workflow concerns got lost in the death march.

Did MU stifle innovation?   I once interviewed for a high-level position at one of the top-five inpatient EHR vendors.  During the interview, I asked the person in charge of R&D how the company came up with new design ideas. The answer—they mostly responded to user requests for features.  This was long before MU…

Information overload from marginally useful computer-generated data
I question this item as a legitimate unintended consequence. Instead, I would say it is exactly meeting expectations. It is a natural consequence of thinking that a data-centric chart is a clinician productivity tool.   It is very much the logical conclusion of thinking that if some information is good, then all possible information is better.

Charts do not present information in the context of the clinician’s workflow or even based on personal preferences.  Process support requires process-centric designs.
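To make the contrast concrete, here is a minimal, hypothetical sketch (illustrative only—not from any actual EHR product) of what a process-centric representation might look like: a clinical task that carries its own state and knows which data it needs, so a UI could surface exactly the information relevant to the clinician's current step rather than a chronological pile of documents. All names (`ClinicalTask`, `TaskState`, the sample data) are invented for illustration.

```python
from dataclasses import dataclass, field
from enum import Enum

class TaskState(Enum):
    PENDING = "pending"
    IN_PROGRESS = "in_progress"
    BLOCKED = "blocked"      # e.g., waiting on a consult report
    COMPLETE = "complete"

@dataclass
class ClinicalTask:
    """A unit of clinical work, modeled as a process step rather than a document."""
    patient_id: str
    description: str
    required_data: list          # names of data items this step needs at hand
    available_data: dict = field(default_factory=dict)
    state: TaskState = TaskState.PENDING

    def missing_data(self):
        """Data the workflow still needs before this task can proceed."""
        return [d for d in self.required_data if d not in self.available_data]

    def ready(self):
        return not self.missing_data()

# A task "knows" its context, so the system can present only what this
# step of the workflow requires—unlike a chart, which files everything.
task = ClinicalTask(
    patient_id="MRN-0001",
    description="Review anticoagulation dosing",
    required_data=["latest_INR", "current_med_list"],
)
task.available_data["latest_INR"] = "2.4"
print(task.ready())          # False: med list still missing
print(task.missing_data())   # ['current_med_list']
```

The design point is the inversion: in a chart-centric system the clinician hunts through documents for the data a task needs; in a process-centric one, the task itself declares its data dependencies and the system does the hunting.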

Decline in the development and use of internally-developed EHRs
I want to lament this loss more deeply.  However, I have always been dismayed that so little of that pioneering work moved into the software engineering body of knowledge.   Given the long history of EHR development at major medical institutions, one would think that numerous books and articles would have been written about clinical software engineering principles and best practices. 

EHR pioneers built systems that helped their home institutions immensely, and a few innovations made it out of academia (MLMs, Blue Button, automated guideline languages). But for all the effort and money spent, why are we just now looking at usability research standards? Should these not have arisen at these institutions years ago? Should not clinical workflow standards and methods have been researched and promulgated at these sites? Surely, clinical workflow issues are not a 21st-century phenomenon.

Clinical software design and engineering
Clinical software design and engineering should garner much more attention within the academic informatics community.  It is almost as if clinical software design and development are seen as a “trade” of sorts and not a proper intellectual discipline.   We have many articles describing poor EHR usability. Yet, none takes the step of delving into the internal design (coding and architecture choices) to explain surface usability issues. Describing what is wrong is not enough, alone, to determine how to fix it.  Nope, no more than describing the anatomic and physiologic consequences of cancer tells one what goes wrong at the molecular level or how to cure the disease.   

Curing cancer requires research into genes, drugs, drug delivery systems, etc.   Creating better clinical systems requires research into computable process representations, data transformation algorithms, data structures for clinical concepts, formal workflow analysis methods, and object models, among other things.  This work has been abandoned to EHR vendors.   I am sure that many of these issues were tackled when creating homegrown EHR systems, but where is that knowledge now?   Why was it never put into books? Why so few journal articles?   Only recently has EHR design been of academic interest–one good thing that came out of SHARP.  Even so, internal design and software architecture articles are underrepresented (rare) in informatics journals.

As increasing reports of security breaches, safety concerns, usability complaints, and other unintended consequences demonstrate, current EHR systems need to improve.   Let’s start by agreeing that clinical care software systems are complex, and safe, reliable, usable systems require design expertise and sound engineering practices.  Millions of lives, including mine, are covered by EHR technology, and every patient should have some assurance of the quality of those systems.   

When complex systems affect millions of lives, they rightly become the focus of scientific and engineering investigations.  EHR systems have been around in some form or other for decades.  Why is there no Textbook of EHR Design and Development or a manual of Clinical Software Engineering Principles and Practices?   Where are the journal articles?  We have civil, aerospace, nuclear, electrical, and many other types of engineers.  Still, somehow, it is assumed that a complex clinical system is just a matter of a team of programmers, a database, a GUI, and a good UCD process.   

Clinical software requires much more in the way of formal scientific and engineering support.  Marketplace whims and feature requests will not accomplish this.   Until building clinical care systems is viewed in the same light as building commercial airliners or nuclear reactors, unintended consequences will increase, usability research will be difficult to compare, and ransomware will continue to be profitable.   

  1. Ratwani RM, Hettinger AZ, Fairbanks RJ. Barriers to comparing the usability of electronic health records. J Am Med Inform Assoc. 2016 Aug 29. [Epub ahead of print]
  2. Ellsworth MA, Dziadzko M, O’Horo JC, Farrell AM, Zhang J, Herasevich V. An appraisal of published usability evaluations of electronic health records via systematic review. J Am Med Inform Assoc. 2017 Jan;24(1):218-226.
  3. Campbell EM, Sittig DF, Ash JS, Guappone KP, Dykstra RH. Types of unintended consequences related to computerized provider order entry. J Am Med Inform Assoc 2006 Sep-Oct;13(5):547-56.
  4. Sittig DF, Wright A, Ash J, Singh H. New Unintended Adverse Consequences of Electronic Health Records. Yearb Med Inform. 2016 Nov 10;(1):7-12. 
  5. Borycki E, Dexheimer JW, Hullin Lucay Cossio C, Gong Y, Jensen S, Kaipio J, Kennebeck S, Kirkendall E. Methods for Addressing Technology-induced Errors: The Current State. Yearb Med Inform. 2016 Nov 10;(1):30-40.
  6. Cifuentes M, Davis M, Fernald D, Gunn R, Dickinson P, Cohen DJ. Electronic Health Record Challenges, Workarounds, and Solutions Observed in Practices Integrating Behavioral Health and Primary Care. J Am Board Fam Med. 2015 Sep-Oct;28
  7. Zheng K, Abraham J, Novak LL, Reynolds TL, Gettinger A. A Survey of the Literature on Unintended Consequences Associated with Health Information Technology: 2014-2015. Yearb Med Inform. 2016 Nov 10;(1):13-29.