Workarounds seem to be an unavoidable fact of using current EHR systems. The volume of research in this area is growing, and hopefully it will soon lead to better clinical care systems. A few weeks back, I came across an article by Carrington et al. that discussed the development of a quantitative instrument to measure how the unintended consequences (UCs) of EHR use affect nurses (1).
The authors set out with two research questions.
Research Question 1: From a pre-existing qualitative data set, what are the UCs encountered by nurses while using the EHR?
Research Question 2: Using these data, what process can be used to transfer qualitative data to scale development for psychometric testing?
The original data used for this paper came from qualitative research in an acute care setting that looked at how nurses communicated changes in patient status or clinical events. The researchers examined barriers and workarounds that occur during nurse-nurse communication. Typical barriers included issues such as poor network connections, computer lock-ups, and data entry time. Typical workarounds included using documentation shortcuts and hunting for a working computer. From the barriers and workarounds identified, questions were developed that could be used to assess EHR-related disruptions.
Here is an example of the questions developed for the Hardware Issues barrier.
| Category | Subcategory | Thematic Unit | Instrument Item |
| --- | --- | --- | --- |
| Hardware Issues | Network connection | Computer access is erratic | In your organization, how often do you see maintenance and update-type work being done? |
| Hardware Issues | Computer issues | Computer lock up | When I enter data into the EHR, I have to leave my station before I am finished. |
| Hardware Issues | Power loss | No backup during power loss | When I try to document or retrieve information, the computers don’t work because they are not turned on or they lose power. |
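To make the idea concrete, here is a minimal sketch of how items like those in the table above might be encoded so the same assessment could be administered and scored uniformly across settings. The field names and the 1-5 frequency scale are my own assumptions for illustration; they are not taken from the published questionnaire.

```python
from dataclasses import dataclass
from statistics import mean

@dataclass
class InstrumentItem:
    category: str       # e.g., "Hardware Issues"
    subcategory: str    # e.g., "Network connection"
    thematic_unit: str  # the barrier or workaround the item probes
    text: str           # the question shown to the respondent

# Two items transcribed from the table above (hypothetical encoding)
items = [
    InstrumentItem("Hardware Issues", "Network connection",
                   "Computer access is erratic",
                   "In your organization, how often do you see maintenance "
                   "and update-type work being done?"),
    InstrumentItem("Hardware Issues", "Computer issues",
                   "Computer lock up",
                   "When I enter data into the EHR, I have to leave my "
                   "station before I am finished."),
]

def category_score(responses: dict[str, int], category: str) -> float:
    """Average the 1-5 frequency responses for one barrier category."""
    scores = [responses[i.text] for i in items if i.category == category]
    return mean(scores)
```

With a structure like this, the same question pool could be reused across organizations and the category averages compared directly.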
An instrument like this would allow assessment of EHR issues to be done more quickly and uniformly in similar settings, which would be a boon to developers and researchers alike.
Reading this paper brought to mind one by Sittig and Singh that proposed a sociotechnical model for studying HIT and its consequences (2). They describe an eight-dimensional framework for evaluative purposes. The dimensions are:
| Dimension | Description |
| --- | --- |
| Hardware and Software Computing Infrastructure | This dimension of the model focuses solely on the hardware and software required to run the applications. The most visible part of this dimension is the computer, including the monitor, printer, and other data display devices along with the keyboard, mouse, and other data entry devices used to access clinical applications and medical or imaging devices. |
| Clinical Content | This dimension includes everything on the data-information-knowledge continuum that is stored in the system (i.e., structured and unstructured textual or numeric data and images that are either captured directly from imaging devices or scanned from paper-based sources). |
| Human Computer Interface | An interface enables unrelated entities to interact with the system and includes aspects of the system that users can see, touch, or hear. The hardware and software “operationalize” the user interface; provided these are functioning as designed, any problems with using the system are likely due to human-computer interaction (HCI) issues. |
| People | This dimension represents the humans (e.g., software developers, system configuration and training personnel, clinicians, and patients) involved in all aspects of the design, development, implementation, and use of HIT. |
| Workflow and Communication | This is the first portion of the model that acknowledges that people often need to work cohesively with others in the health care system to accomplish patient care. This collaboration requires significant two-way communication. The workflow dimension accounts for the steps needed to ensure that each patient receives the care they need at the time they need it. |
| Internal Organizational Policies, Procedures, and Culture | The organization’s internal structures, policies, and procedures affect every other dimension in our model. |
| System Measurement and Monitoring | This dimension has largely been unaccounted for in previous models. We posit that the effects of HIT must be measured and monitored on a regular basis. |
| External Rules, Regulations, and Pressures | This dimension accounts for the external forces that facilitate or place constraints on the design, development, implementation, use, and evaluation of HIT in the clinical setting. |
No matter how one assesses HIT use and interactions, a common thread usually emerges, and Sittig and Singh capture the essence of that thread quite well. The extra steps taken by Carrington and Gephart to create an objective instrument (the Carrington-Gephart Unintended Consequences of Electronic Health Records Questionnaire) point the way to making qualitative research easier to apply in practical situations. One possible use that jumped out at me was EHR selection.
EHR selection questions show up in my mailbox all the time (I don’t mind). My approach to system selection has always been:
- Decide what problems are to be solved.
- Do a gap analysis to find out what is needed to solve the problems.
- Analyze processes to see how they can be improved.
- Decide on process improvements.
- If software is part of the solution (software is not the answer to every problem):
  - Decide which features are needed.
  - Write test scripts for all processes analyzed.
  - Score software products based on 1) the number of desired features present, and 2) how well those features fit the proposed workflows. (Products with workflow engines should shine at this step.)
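The scoring step above can be sketched as a simple function that combines feature coverage with workflow fit. The weighting, data structures, and sample features here are hypothetical assumptions I am using for illustration, not a prescribed method.

```python
def score_product(desired_features: set[str],
                  product_features: set[str],
                  workflow_fit: dict[str, float],
                  feature_weight: float = 0.5) -> float:
    """Combine feature coverage and workflow fit into one 0-1 score.

    workflow_fit maps a feature to a 0-1 rating of how well it performed
    in the test scripts; features the product lacks contribute nothing.
    """
    present = desired_features & product_features
    coverage = len(present) / len(desired_features)
    fit = sum(workflow_fit.get(f, 0.0) for f in present) / len(desired_features)
    return feature_weight * coverage + (1 - feature_weight) * fit

# Hypothetical example: a product with two of three desired features
desired = {"e-prescribing", "order entry", "secure messaging"}
product_a = {"e-prescribing", "order entry"}
fit_a = {"e-prescribing": 0.9, "order entry": 0.6}
score = score_product(desired, product_a, fit_a)
```

A product with a workflow engine would presumably earn higher `workflow_fit` ratings on the test scripts, which is exactly where this scoring rewards it.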
This approach to selecting products has served me well. However, after reading Carrington’s paper, I now see the possibility for improving the selection process. The selection process outlined above is mostly an optimistic, feature-based approach. Needs are used to determine required features and test scripts are used to see how well those features perform. While optimism is good, acknowledging common issues related to EHR use would be equally helpful.
Purchasers who have never had an EHR have no way of knowing the types of issues actually using an EHR might cause. Without an idea of how an EHR can change work habits, new buyers lack the experience to ask important questions. Having a standard set of questions that accounts for known barriers and workarounds might both raise awareness and reduce buyer’s remorse. Of course, anyone can ask these types of questions now. The problem is that unless an organization has been down the road before, it is not likely to have a good question set. A standard instrument or even a question pool would make this wisdom more accessible for all.
Melding Sittig and Singh’s dimensions with Carrington and Gephart’s methods offers the possibility of transforming the wealth of research on workflow disruptions and workarounds into selection tools for new software or diagnostic tools for in-place systems! In a way, it is a type of sensitivity/specificity screening. Making sure a product has all required features and that those features complement the assumed/expected process steps is the sensitivity screen. Using workflow disruption and workaround screening questions constitutes the specificity component.
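To show how the sensitivity/specificity analogy might work in practice, here is a hedged sketch: the "sensitivity" screen checks that required features are present, while the "specificity" screen counts how many known barrier/workaround questions the product passes. The thresholds, function names, and sample questions are all assumptions of mine, not part of either paper.

```python
def sensitivity_screen(required: set[str], supported: set[str]) -> float:
    """Fraction of required features the product actually provides."""
    return len(required & supported) / len(required)

def specificity_screen(disruption_answers: dict[str, bool]) -> float:
    """Fraction of known-disruption screening questions the product passes.

    True means the known barrier or workaround does NOT occur with
    this product (e.g., no lock-ups during data entry).
    """
    return sum(disruption_answers.values()) / len(disruption_answers)

def passes_screen(required: set[str], supported: set[str],
                  disruption_answers: dict[str, bool],
                  sens_min: float = 0.8, spec_min: float = 0.7) -> bool:
    """A candidate must clear both screens to stay on the short list."""
    return (sensitivity_screen(required, supported) >= sens_min
            and specificity_screen(disruption_answers) >= spec_min)
```

A product could ace the feature checklist yet fail the disruption screen, which is precisely the kind of buyer's-remorse scenario the combined approach is meant to catch.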
A sensitivity/specificity approach to software selection might result in more satisfied customers. Providing EHR buyers a selection tool based on the latest research from the frontlines could be a huge help to software purchasers—especially those in small practices. It’s something to think about…
1. Carrington JM, Gephart SM, Verran JA, Finley BA. Development of an instrument to measure the unintended consequences of EHRs. West J Nurs Res. 2015 Mar 22. [Epub ahead of print]
2. Sittig DF, Singh H. A new sociotechnical model for studying health information technology in complex adaptive healthcare systems. Qual Saf Health Care. 2010 Oct;19 Suppl 3:i68-74.