What Do You Mean by EHR “Design” – Architectural, Detailed, User Interface, or Process?

by Jerome Carter on August 31, 2015

As I have written more about EHR design, I have noticed that the term has different meanings among the various groups focused on HIT. There are so many such groups, each with its own literature and jargon, that at times it is difficult to communicate, and the rising interest in user-centered design has made matters worse. Four major groups focus on EHR issues: human factors professionals, clinicians, informaticists, and software engineers. Within the first three groups, design usually refers to the user interface or user experience, while in the fourth it has two additional dimensions: the overall organization (architecture) and the internal makeup (detailed design) of a software system. These discrepancies in usage can lead to minor misunderstandings or to talking past one another, which is bad if the common goal is better clinical software.

Since we are talking about software, the Guide to the Software Engineering Body of Knowledge (SWEBOK) is a good place to consult for definitions. Here are the definitions offered for architectural and detailed design.

Software architectural design (sometimes called high-level design): develops top-level structure and organization of the software and identifies the various components.

Software detailed design: specifies each component in sufficient detail to facilitate its construction.

Architecture addresses important aspects of a system, such as security, connectivity, components, and performance. These are high-level issues that must be considered when creating any software system. For example, an app designed for a single user on a smartphone has different security and performance requirements than a multi-user, cloud-based system, and a distributed system may have software components that reside on different servers (e.g., a database server) while an iPad app is self-contained. Architectural design considers what the software must do, who will use it, and under what circumstances, and then determines whether the resulting software should be stand-alone, multi-user, or component-based, and what hardware is required to deliver the desired performance. Selecting an appropriate architecture is done by applying accepted architectural principles. Here are a few principles (with SWEBOK definitions) that I have discussed in prior posts; a short code sketch after the list shows how they look in practice.

Coupling and cohesion. Coupling is defined as “a measure of the interdependence among modules in a computer program,” whereas cohesion is defined as “a measure of the strength of association of the elements within a module”.

Decomposition and modularization. Decomposing and modularizing means that large software is divided into a number of smaller named components having well-defined interfaces that describe component interactions. Usually the goal is to place different functionalities and responsibilities in different components.

Separation of concerns. A concern is an “area of interest with respect to a software design.” A design concern is an area of design that is relevant to one or more of its stakeholders. Each architecture view frames one or more concerns. Separating concerns by views allows interested stakeholders to focus on a few things at a time and offers a means of managing complexity.
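To make these principles a bit more concrete, here is a minimal Python sketch of my own (not from SWEBOK); the class and method names, such as InMemoryMedicationRepository and InteractionChecker, are invented for illustration. Each class keeps a single, cohesive responsibility, and the two are coupled only through a narrow interface, so the storage mechanism can change without touching the clinical logic.

```python
# Minimal, hypothetical sketch of cohesion, loose coupling, and
# separation of concerns. Names are invented for illustration.

from dataclasses import dataclass
from typing import List, Protocol


@dataclass
class Medication:
    name: str
    dose_mg: float


class MedicationSource(Protocol):
    """Narrow interface: callers depend on this, not on storage details."""

    def active_medications(self, patient_id: str) -> List[Medication]:
        ...


class InMemoryMedicationRepository:
    """Data-access concern only (high cohesion); a database-backed version
    could replace this class without changing any calling code (low coupling)."""

    def __init__(self, records: dict):
        self._records = records

    def active_medications(self, patient_id: str) -> List[Medication]:
        return self._records.get(patient_id, [])


class InteractionChecker:
    """Clinical-logic concern only; it knows nothing about storage."""

    def __init__(self, source: MedicationSource):
        self._source = source

    def has_duplicate_therapy(self, patient_id: str) -> bool:
        names = [m.name for m in self._source.active_medications(patient_id)]
        return len(names) != len(set(names))


if __name__ == "__main__":
    repo = InMemoryMedicationRepository(
        {"pt-1": [Medication("lisinopril", 10), Medication("lisinopril", 20)]}
    )
    print(InteractionChecker(repo).has_duplicate_therapy("pt-1"))  # True
```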

Detailed design addresses software design at the level of programming code. Classes, design patterns, functions, methods, state diagrams, and data access are examples of detailed design concerns.
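As a small illustration of detailed design, here is a hypothetical sketch that turns a state diagram for a lab order into code. The states and transitions are invented for the example, not taken from any particular EHR.

```python
# Hypothetical detailed-design artifact: a state diagram for a lab order
# expressed directly in code. States and transitions are illustrative only.

from enum import Enum, auto


class OrderState(Enum):
    ORDERED = auto()
    COLLECTED = auto()
    RESULTED = auto()
    CANCELLED = auto()


# Allowed transitions: the code-level counterpart of the state diagram.
_TRANSITIONS = {
    OrderState.ORDERED: {OrderState.COLLECTED, OrderState.CANCELLED},
    OrderState.COLLECTED: {OrderState.RESULTED},
    OrderState.RESULTED: set(),
    OrderState.CANCELLED: set(),
}


class LabOrder:
    def __init__(self):
        self.state = OrderState.ORDERED

    def transition(self, new_state: OrderState) -> None:
        if new_state not in _TRANSITIONS[self.state]:
            raise ValueError(f"Cannot go from {self.state.name} to {new_state.name}")
        self.state = new_state


order = LabOrder()
order.transition(OrderState.COLLECTED)
order.transition(OrderState.RESULTED)  # CANCELLED would now be rejected
```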

User interface design and user experience form a third design area. SWEBOK offers the following general UI design principles.

Learnability. The software should be easy to learn so that the user can rapidly start working with the software.

User familiarity. The interface should use terms and concepts drawn from the experiences of the people who will use the software.

Consistency. The interface should be consistent so that comparable operations are activated in the same way.

Minimal surprise. The behavior of software should not surprise users.

Recoverability. The interface should provide mechanisms allowing users to recover from errors.

User guidance. The interface should give meaningful feedback when errors occur and provide context-related help to users.

User diversity. The interface should provide appropriate interaction mechanisms for diverse types of users and for users with different capabilities (blind, poor eyesight, deaf, colorblind, etc.).

Each of the three design aspects plays a role in every software system, and building better clinical systems requires attention to all three. The interplay among these aspects must be taken into account when discussing usability and user-centered design. Fixing a problem that annoys users may require revisiting each design aspect, which explains why seemingly simple problems may be hard to solve.

Consider a common problem such as slow response time. When a screen takes too long to load or update, it can be a real pain for users. Now, what is the cause of the slow update or load? Is it inadequate servers, poorly written programming code, a low-cohesion software component or module that is performing too many functions, or all three? Often, there is no easy way to tell. The ability of usability studies and UCD to improve software is determined by the extent to which a problem cuts across all design aspects. Usability issues that involve every design aspect will be more costly and time-consuming to resolve than those limited to the UI. This fact must be kept in mind when applying UCD and usability studies to legacy systems, and expectations must be tempered accordingly.
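One way to start narrowing down the cause is to instrument the code path by layer. The sketch below is a hypothetical illustration; the layer names and the fetch_labs, flag_abnormal, and render_summary functions are stand-ins for whatever a real system actually calls during a screen load.

```python
# Hypothetical sketch: time each layer of a screen load to see where the
# delay lives (data access, business logic, or rendering). All functions
# below are invented stand-ins.

import time
from contextlib import contextmanager

timings = {}


@contextmanager
def timed(layer: str):
    start = time.perf_counter()
    try:
        yield
    finally:
        timings[layer] = timings.get(layer, 0.0) + (time.perf_counter() - start)


def load_patient_summary(patient_id: str) -> str:
    with timed("data access"):
        labs = fetch_labs(patient_id)      # e.g., database or network call
    with timed("business logic"):
        flagged = flag_abnormal(labs)      # e.g., rules in a low-cohesion module
    with timed("rendering"):
        return render_summary(flagged)     # e.g., building the screen


# Stand-in implementations so the sketch runs.
def fetch_labs(pid): time.sleep(0.05); return [("K", 5.9), ("Na", 140)]
def flag_abnormal(labs): return [(n, v) for n, v in labs if n == "K" and v > 5.5]
def render_summary(flagged): return f"{len(flagged)} abnormal result(s)"


load_patient_summary("pt-1")
print(timings)  # shows which layer dominates the delay
```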

We are in the earliest stages of determining how current software design principles, precepts, and methods apply to clinical care systems. The paper chart is the main stumbling block preventing a critical (re)assessment of clinical software design principles. A static information archive has been used as the basis for EHR systems, leaving clinical processes out in the cold. As we are learning, processes are just as important as data. Unfortunately, legacy clinical software has made workflows implicit, and current UCD processes and usability research are mainly focused on legacy EHR systems. Legacy systems are poor at process support because they are designed to offer information, not to support processes. Fixing them will require revisiting all three design aspects along with the recognition of a fourth aspect – explicit process representation. Next-generation clinical software must allow explicit process representation and execution. Likewise, clinical software development practices must be able to unambiguously represent clinical processes from requirements to deployment.
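To give a sense of what explicit process representation might look like, here is a minimal, hypothetical sketch in which a referral process is plain data (steps with assigned roles) that a small engine can execute and inspect. The steps and roles are invented for illustration, not drawn from any real system.

```python
# Hypothetical sketch of explicit process representation: the clinical
# process is data, not logic buried in screens, so it can be displayed,
# audited, or revised without rewriting application code.

from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class Step:
    name: str
    role: str          # who performs the step
    done: bool = False


@dataclass
class Process:
    name: str
    steps: List[Step] = field(default_factory=list)

    def next_step(self) -> Optional[Step]:
        return next((s for s in self.steps if not s.done), None)

    def complete(self, step_name: str) -> None:
        for s in self.steps:
            if s.name == step_name:
                s.done = True
                return
        raise ValueError(f"Unknown step: {step_name}")


referral = Process("specialist referral", [
    Step("create referral", "physician"),
    Step("send records", "front office"),
    Step("confirm appointment", "front office"),
    Step("review consult note", "physician"),
])

referral.complete("create referral")
print(referral.next_step().name)  # "send records"
```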

Clearly, the HIT community has embraced the idea that processes are important. But, in doing so, it seems to be doubling down on trying to shoehorn process support into legacy systems whose designs are based on paper charts. Bad usability and poor interfaces are frequently cited as the reasons EHR systems are so disruptive to clinicians’ workflows. However, usability and interface issues are often symptoms of deeper problems. Design teams must address each of the four aspects of clinical software, and these teams must have a shared conceptualization of what the term “design” encompasses. Explicit representation of clinical processes and workflow technology must become part of the design discussion when addressing usability concerns, care coordination features, and CDS needs. Otherwise, we will continue to bump into the “workflow” elephant in the room. Design has four aspects, and all of them must be addressed…


{ 2 comments }

@BobbyGvegas September 1, 2015 at 1:17 PM

apropos of your “UI design principles”:

“…The AI community began by trying to model isolated human intelligence while the emerging community of human-computer interaction designers followed in Engelbart’s augmentation tradition. He had begun by designing a computer system that enhanced the capabilities of small groups of people who collaborated. Now Gruber had firmly aligned himself with the IA community. At the Stanford Knowledge Systems Laboratory, he had interviewed avionics designers and took their insights to heart. There had been an entire era of industrial design during which designers assumed that people would adapt to the machine. Designers originally believed that the machine was the center of the universe and the people who used the machines were peripheral actors. Aircraft designers had learned the hard way that until they considered the human-machine interaction as a single system, they built control systems that led to aircraft crashes. It simply wasn’t possible to account for all accidents by blaming pilot error. Aircraft cockpit design changed, however, when designers realized that the pilot was part of the system. Variables like attention span and cognitive load, which had been pioneered and popularized by psychologists, became an integral part first in avionics and, more recently, computer system design…”

Markoff, John (2015-08-25). Machines of Loving Grace: The Quest for Common Ground Between Humans and Robots (Kindle Locations 4720-4730). HarperCollins. Kindle Edition.


Jerome Carter September 1, 2015 at 4:59 PM

Thanks for this. Design has to take a more central role in HIT. One problem I have been wrestling with is determining the most efficient way to adapt/translate human factors research to clinical software development. Still searching…

