Usability Research, User-Centered Design, and EHR Systems—Closing the Loop

by Jerome Carter on June 15, 2015

Since starting EHR Science, I have often lamented the lack of research of sufficient quality and breadth to guide clinical software development. When building software for a specific domain, such as clinical care, research is needed that addresses a range of outcomes: end-user effects, performance, reliability, maintainability, and other software quality metrics. For EHR systems, the interplay among software design, usability, safety, extensibility, and implementation ease has garnered the most criticism and attention. For software designers, then, tying design decisions to specific downstream metrics is a fundamental problem.

The arrival of HITECH brought with it a national conversation on EHR systems. Fortunately, research on the impact of EHR system use on clinicians has been a major component of that discussion. Documentation of workflow disruptions, workarounds, safety problems, and productivity losses is becoming more abundant. The quality of the studies is also improving.

Research on usability issues is particularly welcome because usability is not an easy concept to transform from user complaints to 0s and 1s. What one user finds difficult and obtuse, another may have no problem with.

My first encounter with Unix back in the mid-80s was a miserable affair. The Unix guru who insisted that I would “love” it seemed to think that spending hours learning how to use vi was somehow fun—no, not even close. Usability is not a one-size-fits-all proposition. Knowing this, how does one actually make software more usable, as is currently being demanded?

User-centered design (UCD) is the prevailing approach to creating usable software. Ideally, UCD occurs within a structured environment created to ensure that a consistent, high-quality feedback loop exists between users and software designers. When properly executed, UCD (usually) results in a better product. (I say usually because user feedback cannot overcome inappropriate architectural design choices such as hard-coded workflows.)

Knowing the role that UCD processes play in product development, it makes sense to ask how capable vendors are of using UCD as part of their software engineering practices. As it happens, Ratwani and colleagues at the National Center for Human Factors in Healthcare have recently produced the first research I am aware of on this topic (1). The researchers selected 11 EHR vendors ranging in size from 10 to more than 6,000 employees. Structured interviews were conducted to assess software development and UCD processes. Their analyses revealed three categories based on vendor UCD processes and practices.

Well-developed UCD
Vendors have a well-developed UCD process, including the infrastructure and expertise necessary to understand user requirements in context, utilize an iterative design process with early usability input, and conduct formative and summative testing on several different aspects of their product. Importantly, these vendors have developed efficient processes that allow them to integrate UCD with the rigorous software development timelines that they operate under.

Basic UCD
Vendors understand the importance of UCD but do not have a complete UCD process fully integrated with their EHR design and development. Some vendors in this group have already identified the specific process barriers they face and are working to overcome these barriers, while other vendors realize they have an incomplete UCD process but are still working to identify the specific barriers that are preventing a more complete UCD practice.

Misconceptions of UCD
Vendors do not have any UCD process in place although they believe they do through mechanisms like “web forums that allow [their] users to post suggested features and vote on these features.” Vendors in this group generally have a misconception about what constitutes UCD. One prominent misconception is the belief that having an infrastructure for responding to users’ feature requests and complaints is what constitutes UCD.

In this small sample of vendors, UCD perceptions and process maturity were nearly equally distributed among the three groups: four vendors each in the well-developed and basic groups, and three in the misconceptions group. Though group membership was to some extent related to revenue (all members of the well-developed group had greater than $100 million in revenue), one vendor with greater than $1 billion in revenue was in the basic category.

Vendors were also asked about the challenges they faced in practicing UCD. Responses were reported using the same vendor categories.

Well-developed UCD
Vendors in this category overwhelmingly described the difficulties of conducting detailed, contextually rich studies of workflow in different sub-specialty clinical disciplines. The investment required to conduct these studies is extensive; consequently, vendors focus on specialties that have the largest market audience.

Basic UCD
Vendors in this category require additional knowledge and resources on how to effectively and efficiently employ the UCD processes that they seek to implement, given their specific development cycles and resource constraints. Knowledge gaps include the number of participants needed for summative testing and appropriate testing procedures (e.g., training, experimental setup).

Misconceptions of UCD
Vendors in this category require education on the importance of UCD for the usability and safety of the product being developed. Importantly, the leadership needs to understand the business case for UCD to encourage investments in usability resources. One vendor stated, “Our product is used by thousands of people every day. So if it was that bad, it would already be out of the market.”

Unsurprisingly, the major challenge identified by the top vendor group was workflow research.

There are no standards for workflow analyses within health care. Further, training materials that emphasize flowcharts and swim lanes are great for discussions between people, but inadequate for capturing the complex combination of control-flow, information use, and resource interactions required for software design.

Software engineers are likely trained in UML, which means they are familiar with use cases, activity diagrams, state diagrams, and the like. These are fine for understanding internal software functioning, but they are not ideal for capturing complex clinical workflows. Finally, the seeming absence of workflow engines in EHR systems leads me to believe that few vendors know how to take advantage of the wealth of workflow research done over the last 20 years (see Workflow Patterns).
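To make the contrast concrete, here is a minimal sketch of my own (not taken from the paper or from any vendor's product) of what a workflow engine buys you. Three basic control-flow patterns from the Workflow Patterns catalog (sequence, parallel split, and synchronization) are expressed as data and interpreted by a tiny engine; the step names are hypothetical clinical tasks.

```python
from concurrent.futures import ThreadPoolExecutor

def run(step, tasks):
    """Execute a workflow step that is defined as data, not code."""
    kind = step["pattern"]
    if kind == "task":                       # a single unit of work
        tasks[step["name"]]()
    elif kind == "sequence":                 # pattern: sequence
        for child in step["steps"]:
            run(child, tasks)
    elif kind == "parallel":                 # patterns: parallel split,
        with ThreadPoolExecutor() as pool:   # then synchronization (join)
            list(pool.map(lambda child: run(child, tasks), step["steps"]))

# A hypothetical clinic-visit workflow expressed as data; a site could
# reorder or extend it without touching the engine above.
workflow = {
    "pattern": "sequence",
    "steps": [
        {"pattern": "task", "name": "check_in"},
        {"pattern": "parallel", "steps": [
            {"pattern": "task", "name": "record_vitals"},
            {"pattern": "task", "name": "review_meds"},
        ]},
        {"pattern": "task", "name": "physician_exam"},
    ],
}

tasks = {name: (lambda n=name: print(f"running {n}"))
         for name in ("check_in", "record_vitals",
                      "review_meds", "physician_exam")}

run(workflow, tasks)
```

The point is not the toy engine itself but where the workflow lives: as data, it can be studied, validated, and changed per site, which is precisely what a flowchart on a whiteboard or a hard-coded sequence cannot offer.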

What do we know about the efficacy of vendor-specific UCD processes in terms of addressing usability concerns for EHR systems? Not much, really. As the authors point out, determining the efficacy of a vendor’s UCD process requires a before/after analysis of a product’s usability concerns.

Importantly, the usability of the EHR itself was not examined; rather, our focus was on the integration of usability methods into the development process. An important next step will be to examine the usability of the EHR product to determine whether there is a relationship with the rigor of the UCD process being employed by the vendor.

What are the incremental effects on revenue or end-user satisfaction of moving from the Misconceptions group to the Well-developed group? No one knows that either. Though I expect some differences would become evident, there is no way to calculate the magnitude or significance of those differences. Since every EHR system is different and was created in a different software development environment, any attempt to draw conclusions may be unavoidably apples-to-oranges. Perhaps new MU3 certification requirements will shed some light here.

A major concern I have is the degree to which vendor UCD processes are permitted to guide architecture and design decisions. The assumption seems to be that well-developed UCD processes will necessarily lead to better EHR products. But what does “better” mean? If it means better than the previous version of a current product, the answer is undoubtedly yes (if the vendor cares about its users). Even so, if a “better” product is one that still has hard-coded workflows that users cannot configure, then the degree of actual improvement is not so certain. Assisting users in their work will always come down to workflow issues. UCD and the software it produces have to account for workflows, both the computable variety and those that exist in the real world.
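To illustrate the distinction, here is a toy sketch (hypothetical step names, not drawn from any actual EHR): in the first version the visit sequence is frozen in source code, while in the second it is configuration that a practice could edit.

```python
# Hypothetical documentation steps; simple stubs stand in for real EHR logic.
def chief_complaint(visit): print("capture chief complaint:", visit)
def history(visit):         print("capture history:", visit)
def exam(visit):            print("capture exam:", visit)
def plan(visit):            print("capture assessment/plan:", visit)

# Hard-coded workflow: the step order is frozen in source code, so a
# clinic that works differently must wait for a vendor release.
def document_visit_hardcoded(visit):
    chief_complaint(visit)
    history(visit)
    exam(visit)
    plan(visit)

# Configurable workflow: the same steps, but their order and inclusion
# come from data that a site or user could edit without new code.
STEP_REGISTRY = {
    "chief_complaint": chief_complaint,
    "history": history,
    "exam": exam,
    "plan": plan,
}

def document_visit(visit, steps):
    for name in steps:  # e.g., loaded from a per-site configuration file
        STEP_REGISTRY[name](visit)

document_visit("visit-001", ["chief_complaint", "exam", "plan"])
```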

The user interface is just as important as robust workflow support. Issues such as color, size, window placement, text entry, alert sounds, and any number of other aspects of the UI should be directly configurable by users. Absent significant end-user control, the best UCD process will necessarily target the “average” user, which is exactly what got us where we are today.
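As a rough sketch of what “directly configurable” could look like in practice (my own toy preferences model, not any product's API), per-user settings become data with sensible defaults rather than constants wired into the rendering code:

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class UIPreferences:
    # Hypothetical per-user settings; the defaults play the role of the
    # one-size-fits-all "average user" values baked into most products.
    font_size_pt: int = 11
    color_scheme: str = "light"
    alert_sound: str = "chime"
    note_panel_position: str = "right"

def load_preferences(path):
    """Load a user's saved preferences, falling back to the defaults."""
    try:
        with open(path) as f:
            return UIPreferences(**json.load(f))
    except FileNotFoundError:
        return UIPreferences()

prefs = load_preferences("user_123_ui.json")  # hypothetical settings file
print(asdict(prefs))
```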

UCD is a key aspect of creating usable software, but alone it cannot trump architecture and design limitations. If vendors are having difficulty mapping workflows AND their EHR systems have static UIs and lack the adaptability offered by workflow technology, how much improvement can we truly expect?

1. Ratwani RM, Fairbanks RJ, Hettinger AZ, Benda NC. Electronic health record usability: analysis of the user-centered design processes of eleven electronic health record vendors. J Am Med Inform Assoc. 2015 Jun 6.