Only three of the 20 electronic health-record systems rated by the American Medical Association and a group of researchers received perfect scores for usability: two made by Allscripts and one by McKesson Corp.
The AMA teamed up with MedStar Health's National Center for Human Factors in Healthcare to rate 20 commonly used EHRs by their adherence to user-centered design best practices.
The average score on the 15-point scale was 11.5 and the median was 12.5. The worst-performing product, from eClinicalWorks, scored a 5.
Vendors were supposed to incorporate user-centered design principles when developing systems eligible for the federal EHR incentive payment program.
This year, researchers at Washington-based MedStar reported that many EHR vendors failed to meet the program's user-centered design requirements, while others ignored optional best practices.
“Physician experiences documented by the AMA demonstrate that most EHR systems fail to support effective and efficient clinical work, and continued issues with usability are a key factor driving low satisfaction with many EHR products,” AMA President Dr. Steven Stack said in a news release about the ratings, which are available online free of charge. “Our goal is to shine light on the low bar of the certification process and how EHRs are designed and user-tested in order to drive improvements that respond to the urgent physician need for better designed EHR systems.”
To qualify their EHR systems for use by providers in the federal incentive payment program, vendors were asked to report to the Office of the National Coordinator for Health Information Technology at HHS how they met the program's user-centered design requirements. Those requirements included having multiple users actually test the systems for usability and score the results.
These reports are made available to the public via an ONC website, but they are not uniform in design or in the placement of key information. Further, the data was not scored against recognized user-centered design best practices, nor was it compiled in one easily accessible format for comparison with usability reports for other vendors' systems.
In their initial study, the MedStar researchers looked at the ONC filings on 50 EHRs. They found that nine in the sample (18%) had no public report on usability. Of the 41 vendors with reports on file, about a third failed to state the type of user-centered design process they used, a requirement for certification.
The ONC also has endorsed a National Institute of Standards and Technology recommendation that vendors use at least 15 participants when performing user-centered design testing. But the ONC didn't make that threshold mandatory, and 63% of the vendors in the MedStar sample had fewer than 15 participants in end-user testing.
The new rating scheme scores each vendor's EHR on a 15-point scale by comparing what the vendor reported to the ONC against best-practices criteria developed by participants from the AMA and the MedStar center.
“Every single vendor should be at 15 points,” said Raj Ratwani, scientific director of MedStar's Human Factors Center and a principal developer of the ratings system.
For the initial 20 vendors, Ratwani said, he and fellow researcher Dr. A. Zach Hettinger, the center's medical director, “went through every single use case these vendors used and how they approached user-centered design.” Dr. Michael Hodgkins, vice president and chief medical information officer for the AMA, and Matt Reid, the association's senior health IT consultant, also worked on the ratings framework.
The ONC did not respond to requests for comment on the new rating system. More than 400 EHR systems have been tested and certified for use in the federal incentive payment program, but only 20 have been rated so far, and Ratwani acknowledged the ratings are a work in progress.
“If we can spark a little bit of competition with this framework, then we're achieving our goal of improving these products,” he said.
The Electronic Health Records Association, a trade group representing 31 EHR vendors, panned the ratings as mostly useless to buyers.
“We note the acknowledgement in the AMA's press release that the framework is not intended to evaluate the perceived usability of an EHR, only the documentation submitted for certification,” the association said in an e-mail. “Given that the documentation presented for certification is a small subset of the usability evaluations and user-centered design techniques that a vendor may practice, this framework may not represent a comprehensive view of the user-centered design work done in the context of any given product.”