EHR adoption is a relative success story. I say relative because more clinicians use EHR systems than ever before, but not everyone is happy to be doing so. The effects of EHR adoption on patients have often been discussed, but mostly from the viewpoint of improved outcomes. The safety of EHR systems for patients is a relative unknown. Thankfully, credible research is beginning to appear that offers insights that can be used to both assess current systems and design safer offerings.
In An Analysis of Electronic Health Record Patient Safety Concerns, Meeks, Smith, et al. describe findings that shed light on the potential sources of EHR safety issues. The big deal about this paper, and the main reason I chose to discuss it, is that its findings are based on real-world data, not a survey or the opinions of experts.
The authors conducted a retrospective analysis of safety reports from VA healthcare facilities. The reports analyzed were the findings of investigations carried out under the auspices of the VA’s Informatics Patient Safety Office as described by the authors:
In conjunction with other patient safety initiatives such as sentinel event monitoring, root cause analysis, and proactive risk assessment, the VA created an Informatics Patient Safety (IPS) Office in 2005 to establish a mechanism for non-punitive, voluntary reporting of EHR-related safety concerns.
The IPS reporting system, which includes only health IT-related reports, is the foundation for a rigorous approach that includes not only incident investigation and analysis, but also feedback to reporters and developers of solutions to mitigate future risks to patients.
In all, 100 cases from August 2009 until May 2013 were analyzed and categorized by the type of safety issue raised. Four categories were identified:
| Safety concern category | Description |
|---|---|
| Unmet display needs | Mismatch between information needs and content display |
| Software modifications | Concerns due to upgrades, modification, or configuration |
| System-to-system interfaces | Concerns due to failure of interface between EHR systems or components |
| Hidden dependencies in distributed systems | One component of the EHR is unexpectedly or unknowingly affected by the state or condition of another component |
Unmet display needs, which were the most frequently cited concern, are described by the authors as follows:
This category represented a pattern of incidents in which human–EHR interaction processes did not adequately support the tasks of the end users. These incidents reflected a poor fit between information needs and the task at hand, the nature of the content being presented (e.g., patient-specific information requiring action, such as drug-allergy warnings or information required for successful order entry), and the way the information was displayed. As a result of this poor fit, the displayed information available to the end user failed to reduce uncertainty or led to increased potential for patient harm.
I would like to focus on unmet display needs for the remainder of the post, because this category illustrates what I consider a significant oversight in how user interactions with EHR systems are discussed. (The other categories are software engineering/quality assurance matters.)
Tying work to people and clinical care systems
The authors present their findings within an eight-dimensional socio-technical framework shown below.
- Hardware and software: the computing infrastructure used to power, support, and operate clinical applications and devices
- Clinical content: the text, numeric data, and images that constitute the 'language' of clinical applications
- Human-computer interface: all aspects of technology that users can see, touch, or hear as they interact with it
- People: everyone who interacts in some way with technology, including developers, users, IT personnel, and informaticians
- Workflow and communication: processes to ensure that patient care is carried out effectively
- Internal organizational features: policies, procedures, work environment, and culture
- External rules and regulations: federal or state rules that facilitate or constrain preceding dimensions
- System measurement and monitoring: processes to evaluate both intended and unintended consequences of health IT implementation and use
A socio-technical framework is a good tool for discussing concerns among stakeholders, and perhaps that is its main goal. From a software design standpoint, however, it describes issues at too high a level to be easily tied back to the underlying design faults in software systems. At some point, user problems must be translated into code, and the larger the gap between the problem description and the code causing the problem, the greater the likelihood of misunderstanding what the user is describing or requires. Yes, I know that software developers are supposed to be able to make the required translation between user requirements and code, but that is often easier said than done. Ergo, current systems…
For example, when I was working on the UAB EHR project, it took quite a while to explain drug-related concepts to the engineers. My engineers were very capable but, of course, knew nothing about drug-drug interactions or drug classes, or the difference between a side effect, an allergy, and an adverse reaction. Since our drug interaction code was written in-house, these concepts, as well as those of "medication list" and "prescription," had to be grasped by them at a level sufficient to write algorithms to manage them. My point is this: giving a software engineer requirements written at too high a level, or pointing out an irritating EHR issue, is in no way sufficient to ensure that they understand the problem at the level required to write code for it. (Yelling doesn't increase understanding either.) It is the rare software developer who has practiced medicine, nursing, or any clinical field, so getting better software means finding a better way of communicating requirements. A socio-technical framework is a good starting point for discussion and data collection, but something else is required to solve the problems uncovered, especially those concerned with user interactions.
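To make the point concrete, here is a minimal, hypothetical sketch (not the UAB system's actual model; all class, field, and drug names are illustrative) of the precision an engineer needs before "warn about problem drugs" can become code. Note that an allergy and a drug-drug interaction are distinct concepts, checked against different data:

```python
# Hypothetical sketch of the domain distinctions an engineer must encode.
# Names and the interaction table are illustrative only.
from dataclasses import dataclass, field
from typing import List, Set

@dataclass(frozen=True)
class Drug:
    name: str
    drug_class: str   # e.g., "penicillin", "NSAID"

@dataclass
class Patient:
    allergies: Set[str] = field(default_factory=set)        # drug classes
    medications: List[Drug] = field(default_factory=list)   # current med list

# Drug-drug interactions keyed by unordered class pairs.
INTERACTIONS = {frozenset({"NSAID", "anticoagulant"}): "increased bleeding risk"}

def check_order(patient: Patient, new_drug: Drug) -> List[str]:
    """Return warnings; allergy and interaction are separate checks."""
    warnings = []
    if new_drug.drug_class in patient.allergies:
        warnings.append(f"ALLERGY: patient is allergic to {new_drug.drug_class}s")
    for med in patient.medications:
        pair = frozenset({med.drug_class, new_drug.drug_class})
        if pair in INTERACTIONS:
            warnings.append(f"INTERACTION with {med.name}: {INTERACTIONS[pair]}")
    return warnings
```

Even this toy version forces decisions (Is an allergy recorded by drug or by class? Are interactions symmetric?) that a requirement like "show drug warnings" leaves unstated.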
User interactions with software systems are workflows. I doubt anyone would disagree with this. However, what seems to be lacking in the general perception of workflow is that any workflow consists of not only a series of tasks, but also the information created and consumed by those tasks as well as the people and software systems that execute them. Simply put, information needs are workflow needs, and the two cannot be separated.
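As a minimal sketch of that claim, with names that are mine rather than any standard's, a workflow can be modeled so that every task explicitly carries its performer and the information it consumes and produces:

```python
# A minimal sketch; class and field names are mine, not from any
# workflow standard. Each task binds its actor and its information needs.
from dataclasses import dataclass
from typing import List

@dataclass
class Task:
    name: str
    performed_by: str     # person or software component executing the task
    consumes: List[str]   # information the task needs as input
    produces: List[str]   # information the task creates

# A fragment of a hypothetical medication-ordering workflow.
order_entry = [
    Task("select drug", "clinician", ["formulary"], ["draft order"]),
    Task("check interactions", "EHR decision support",
         ["draft order", "medication list", "allergy list"], ["warnings"]),
    Task("sign order", "clinician", ["draft order", "warnings"], ["signed order"]),
]

def who_needs(workflow: List[Task], item: str) -> List[str]:
    """Which tasks consume a given piece of information?"""
    return [t.name for t in workflow if item in t.consumes]
```

Asking "which task needs the allergy list, and who performs it?" then becomes a query over the workflow itself, which is exactly the sense in which information needs are workflow needs.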
The interplay between processes and information is not a novel idea on my part; it is well documented within the domain of workflow patterns, which have been around since the early 2000s. Workflow patterns are based on graph theory and can be encoded in programming languages (see Workflow Patterns, Part I: A Pattern-based View of Clinical Workflows). In fact, the inscription language of colored Petri nets, CPN ML, is based on the functional programming language Standard ML.
Using workflow patterns, one can precisely capture the task to be completed, who will do it, and what information will be required. It is time to move away from workflows as simply something one captures in a swim lane or flow chart and accept that they are mathematical entities that can be captured and rendered unambiguously using workflow patterns. Workflow patterns can be rendered visually using tools such as YAWL, so they can be used in discussions with clinicians as well as developers. In fact, the first five socio-technical dimensions listed above could be captured using workflow patterns.
As the authors state, their findings are based on actual cases that occurred within the VA, providing an objective and much-needed, if limited, view of safety issues within EHR systems. Addressing these problems requires not only changes to the underlying software, but also a better way of analyzing and capturing user information needs in the context of the task at hand and in light of the resources required. Certainly this is a challenging undertaking, but fortunately we already have the tools for it. Socio-technical frameworks are great for getting a high-level view of what should and should not happen with clinical care systems. However, workflow patterns are better for capturing and analyzing the details.