The pressure on hospitals and health plans to demonstrate quality is building to a crescendo, complicated by the fact that governments, payers and patients are all singing different tunes.
That's a problem when it comes to selecting a sophisticated system to meet differing definitions of cost-effectiveness and quality. There's so much variation in the data measured and the complexity involved, not to mention the cost, that it can bewilder executives trying to find the package that satisfies their demands.
"The provider community has felt a confusion and schizophrenia in the marketplace regarding quality-measurement systems," said Joseph M. DeLuca, general practice director of JDA, a San Francisco-based healthcare information-technology consulting firm.
On one hand, clinical users are looking at treatment variations and straining to obtain the fullest possible value of such systems for enhancing care. At the same time, Mr. DeLuca said, employers and regulators are interested in measures mainly as a way to check up on providers and ultimately base reimbursement decisions on care delivery and outcomes.
That can make for quite a spread in the capabilities desired in a quality-measurement system, and JDA has spent about six months researching and cataloging the data marketplace to identify the range of vendors and permit product comparisons.
Its study on quality-measurement systems costs $1,200 and includes a "buyers' guide" that compares the features and costs of 20 products, grouping them according to category of quality measurement: functional status, quality indicators and severity-of-illness indexing. Another section details each product's intended emphasis, operation, cost breakdown, reporting capabilities and the data yielded.
Quality data. Sizing up the specialty vendors is only half of today's equation, though. The study also sought to determine the capacity of "core" information systems to ferret out the data elements required for quality-measurement reports and eliminate the retrospective search for data.
The report goes into some technical detail on the capability of a dozen vendors to collect data elements on such topics as preventive and prenatal care, health conditions and risks, health status and personal characteristics.
Integrating quality-measurement data collection into the daily workings of the larger information system is important because it makes extra work unnecessary, said Mr. DeLuca. Without that link, clerks and nurses must fan out and flip through charts and printouts, pulling data bit by bit.
Besides the time and cost involved, the manual process has two other big disadvantages, he said. First, it risks missing or misinterpreting a lot of data, and that can skew results. Second, it limits the usefulness of the data because of the time lag between clinical events and their reporting.
Capturing information close to the actual event "helps take retroactive judgment out of the system" and helps improve care. Instead of being "massaged and analyzed" on a monthly or quarterly basis, data can be plowed back to reassess effectiveness of a course of care or a clinical policy, sometimes while a patient is in mid-treatment, said Mr. DeLuca.
Some problems. That said, the report concludes that information system vendors have a long way to go in bringing about seamless support. Although they provide technical flexibility to add elements that are collected from databases, few offer quality indicators or outcomes measurement in their standard software, the report said.
But part of the reason is vendors haven't been given clear requirements for the ideal set of data elements, because no consensus has emerged. And vendors aren't likely to commit resources to a project without such a consensus, Mr. DeLuca said. "Providers of health information systems have asked, 'Can we get to a common denominator of the top two or three systems, at least, so we only have to do this once?'"
That goal may have to evolve, he said, because right now the conflicting interests of clinicians and consumers and business groups can't be met with a single approach.
Some systems may use billing and diagnostic codes that are good enough for payers, but clinicians may want information collected at the point of care. Practitioners may be interested in the processes of care that improve the outcome for the greatest number of patients, while business coalitions are interested mainly in assessing how necessary and appropriate a procedure is and how soon it returns an employee to a productive state.
States involved. Some governments already have narrowed the choice for providers. Colorado, Iowa, Pennsylvania, Florida, Indiana and Utah all stipulate a system for comparative reporting of healthcare performance. Hospitals that want to head off further mandates will have to develop credible measurement capabilities, "and they've got to get there faster than the government would force them to be," said Mr. DeLuca.
The irony is that governments could saddle hospitals with a system that generates mounds of monitoring information but little data that clinicians and managers can use to reshape healthcare and bring about built-in cost savings.
The temporary solution, Mr. DeLuca said, may be multiple measurement systems working at the same time: one for government and employers, and one that clinicians believe in.