This might seem like a silly question with an obvious answer, but is it really? The solution to any problem grows out of the environment in which it appears and the mindset in which it was conceived. In 1970, the answer to this question would have been a mainframe system. By 1981, after the Apple II and a few other microcomputers had been around for a few years, the answer for most people would still have been mainframes (or perhaps minicomputers) because microcomputers were still considered toys. When IBM released the IBM PC AT in 1984, microcomputers began to be taken seriously as computers, that is, as computers that could be used for real business applications. The arrival of reliable local area networking technology cemented the status of PCs as real business computers.
Initially, local area networks (LANs) were used to share printers, disk storage, and applications. However, as servers became more powerful and disk storage more dense and affordable, database management systems and sophisticated client/server software appeared.
Client/server computing lowered the cost of owning essential business applications such as accounting systems that once would have resided on a minicomputer or a mainframe. The tools for building client/server applications were reasonably priced compared to the minicomputer market. As might be expected, the arrival of a relatively easy-to-set-up network operating system in the form of Windows NT (1993) put pressure on the minicomputer market. Health technology companies were quick to jump on the Windows NT bandwagon, so that by 2000 most EHR systems for offices were Windows NT-based.
Of course, by 2000 the Internet had come along and changed the computing paradigm once again by making client/server computing global.
Each new computing platform gained traction because it offered functionality, flexibility, and affordability in a format that had not previously existed. Even so, the new platforms were often greeted with disdain because they were not “real computers.”
Looking back, one sees that solutions to information management problems, and even what counted as an information management problem, were both biased by the conceptualization of what a computer was and what it should be able to do.
Besides business imperatives such as cost, flexibility, and functionality, individualization and adaptability to personal tasks seem to have been major drivers for the acceptance of new computing platforms. Stated another way, the first tasks assigned to computers were those that made the most sense at a business level: accounting, inventory, and so on. However, with lower prices and more flexible systems, individual productivity became more important, which is where we are today. The end of this little history lesson brings me to the subject of this post: Are smartphones and tablets computers?
It is important to ask this question now because smartphones and tablets have evolved to the point where their relationship to other computing platforms today is very similar to the relationship microcomputers (i.e., PCs) had to mainframes and minis in the early 1980s. Knowing that the affordability and flexibility of client/server systems made it feasible for office-based clinicians to have their own EHR systems, one has to ask what the ultimate effects of smartphones and tablets will be if they follow a similar development trajectory.
When ARM-based tablets and smartphones appeared, neither had serious computing power. In fact, the iPad, the best-selling tablet, is classified as a media tablet by Gartner and other market analysis firms, a definition that treats such devices as entertainment gadgets rather than real computers. That definition, now four years old, is being seriously challenged.
Last year, something wonderful happened, to borrow a line from Arthur C. Clarke. Apple introduced a 64-bit chip, the A7, in the iPhone 5s, and it has demonstrated significant computing power compared to its 32-bit predecessors. In the tablet arena, the latest iPad Air 2 has a second-generation 64-bit chip, the A8X, whose performance on standard benchmarks rivals that of a Macintosh laptop from a few years ago. And a few years ago, laptops were considered real computers.
Personal computers had to mature for a few years before they could tackle real problems, and smartphones and tablets are following the same pattern. Among other things, mobile computers add new user interface options and portability to the computing mix in ways that no other computing platform can match. Solutions to clinical information management problems must now embrace mobile computing capabilities: touch-based interfaces, multimedia data management, communications functionality, and location/movement awareness. Without question, the iPad Air 2 is a real computer, and it and other tablets with similar specs can be used to solve real problems.
Most current EHR systems were designed well before tablets and smartphones existed, and many were born before the Internet really caught on. These EHR systems were designed back when LANs were state-of-the-art computing platforms, the cloud did not exist, Wi-Fi was painfully slow, and pointing was done with a mouse. The computing platform and development tools dictated how developers approached clinical information management problems.
The most recent rethinking of software designs came in the form of web-based systems. While browser-based applications solve some problems, they create others. For example, web applications are platform-independent, making it easier to deliver software solutions without worrying about much more than connection speeds (or quirky browsers—you know who you are). However, there is a trade-off when using browser-based applications. The Web is stateless, so applications have to create a way to remember users between interactions. In addition, one has no idea what will be available on any computer that accesses the application. For me, web apps are easier to distribute but harder to create. Building web apps has proven to be far more involved than building any client/server app I ever wrote. One thing I am growing to like about mobile systems is that creating software for them is similar to creating for a desktop. When designing an iPad app, one knows exactly what an iPad can do, and access to every feature is available.
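The statelessness problem can be sketched in a few lines of Python: because the server retains nothing between HTTP requests, it must issue a token at login and check it on every subsequent request. The function names and in-memory store below are illustrative assumptions, a minimal sketch of the idea rather than any particular framework's session machinery.

```python
import secrets

# Illustrative in-memory session store: token -> username.
# A real web app would use signed cookies, a database, or a cache.
_sessions = {}

def login(username):
    """Authenticate the user (elided here) and issue a session token."""
    token = secrets.token_hex(16)
    _sessions[token] = username
    return token

def handle_request(token):
    """Every request must carry the token; the server 'remembers' nothing else."""
    user = _sessions.get(token)
    if user is None:
        return "401 Unauthorized"
    return f"200 OK: hello, {user}"
```

The point of the sketch is that the "memory" lives entirely in the token the client sends back, which is exactly the bookkeeping a client/server or native mobile app gets for free from its persistent connection or local state.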
Looking at clinical care and its computing needs, I see requirements that are distinct when compared to standard business computing. Clinical data are varied and numerous. Clinical work consists of interacting with patients to obtain information, consulting information sources (e.g., chart, guidelines, articles, other clinicians), making decisions, recording information, and moving on. Support for clinical work requires large, searchable data stores, fast networks, sophisticated communications functionality, and portable computers capable of displaying text, pictures, sound and video. Tablets and smartphones are the first computers to meet all of these requirements.
Writing for mobile means stepping back from web and client/server applications and being willing to see a problem purely from the standpoint of mobile computing; that is, adopting a “mobile first” attitude.
Mobile first requires a willingness to rethink past approaches. At the top of the list is the use of cloud capabilities. Like mobile computers, the cloud is a new way of doing things. Building mobile applications that link to cloud storage and use APIs to interact with other applications is a new way of delivering functionality. There is no reason to have local terminology services if they can be obtained via a cloud application. The same is true of workflow engines or any other service that supports clinical work. Mobile first also means not taking a client/server app and putting a mobile face on it. That will not work any better than putting a browser interface on a standard desktop app. It might work to some extent, but the original design limitations will show through.
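As a minimal Python sketch of the terminology-service idea: the mobile app holds no local code tables and simply constructs requests against a cloud API. The `TerminologyClient` class, the base URL, and the `/lookup` endpoint are all hypothetical names invented for illustration, not any real service; the network call itself is omitted so only the shape of the design shows.

```python
from urllib.parse import urlencode

class TerminologyClient:
    """Hypothetical client for a cloud terminology service (illustrative only)."""

    def __init__(self, base_url):
        self.base_url = base_url.rstrip("/")

    def lookup_url(self, code_system, code):
        # Build the request a mobile app would send to the cloud service
        # instead of consulting a locally installed terminology database.
        query = urlencode({"system": code_system, "code": code})
        return f"{self.base_url}/lookup?{query}"

client = TerminologyClient("https://terminology.example.com/api")
```

The design choice being illustrated is that the app depends on an API contract, not on local data, so terminology updates happen in the cloud without touching the device.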
Until 64-bit chips arrived, the amount of computing power in mobile systems made them useful only for limited applications. However, Apple has shown with its A8X chip that tablets and smartphones are rapidly gaining sufficient computing power and communications capability to make serious clinical applications possible in a way that has never existed, and this is only the chip's second generation! The fourth generation will appear in 24 months, if Apple sticks to form. What will those systems be capable of doing?
How many EHR vendors will bite the bullet and start serious mobile-first projects? Few, I imagine, because if the past is prologue, most will cling to the prevailing wisdom that mobile devices are not real computers. And we know how that story ends…