In 1988, the Joint Commission on Accreditation of Healthcare Organizations sponsored a national conference on clinical indicator development. In his address, JCAHO President Dennis O'Leary, M.D., took a poke at organizations that were rushing into the outcomes measurement race, telling the crowd that it takes years to develop a good indicator and those who thought otherwise were fooling themselves.
But it may be the JCAHO that's been doing the fooling while attempting to position itself as the designated quality monitor under healthcare reform.
A two-month investigation by MODERN HEALTHCARE has uncovered evidence that the JCAHO's own indicator program-despite nearly eight years of research and development and millions of dollars spent-has serious problems.
The investigation revealed that:
Many test-site hospitals found the system ineffective and costly.
Many businesses have found that the data generated by the system don't meet their needs.
Many potential philanthropic donors refused to fund the project.
Mandatory use of the system by accredited hospitals may be illegal.
Still, the JCAHO is moving ahead with the project upon which it has pinned its reputation and financial survival. In fact, budget projections reveal that fees generated from hospitals' use of the system could produce a windfall for the organization.
All this has taken a toll on the Oakbrook Terrace, Ill.-based accrediting agency, which is spending more and more of its time and money trying to convince hospitals, businesses and policymakers that the organization is doing the right thing. The toll includes the turnover of key staffers, some of whom say the JCAHO is leveraging hospitals' dependence on the organization to sell them an unproven product.
The JCAHO accredits about 5,300 hospitals, or more than 80% of hospitals nationwide, using its quality assurance and improvement standards. Accreditation isn't required under law, but it's sought after for several reasons:
Some payers won't contract with unaccredited hospitals.
Accreditation may be a requirement before hospitals can sell tax-exempt revenue bonds.
Hospitals use accreditation to attract physicians and executives.
Accredited hospitals automatically qualify for Medicare-the source of 40% of hospitals' patient revenues.
Indicator history. Clinical indicators are measures of outcomes or processes that are believed to be linked to quality of care. The most obvious example of a clinical indicator is mortality rate. This September will mark the eighth year of research and development of the JCAHO's indicators.
The indicators are designed as tools that accredited hospitals can use internally to improve the quality of care they provide. Accredited hospitals would feed their information into the JCAHO's indicator data base and receive back their indicator results and comparable data from other hospitals. Use of the system would become mandatory for accredited hospitals.
To date, the JCAHO is collecting, testing or developing at least 79 clinical indicators in 10 areas (See chart, p. 38). The JCAHO has set a maximum of 30 indicators to be used by hospitals when the system is required.
Hospital participation. The clinical, financial and political success of the JCAHO's indicator monitoring system depends on a high level of hospital participation. High participation makes indicator data more statistically valid; it makes it easier to attract other hospitals; and it makes it profitable.
That's why the JCAHO has launched an ambitious marketing effort to solicit hospitals to voluntarily participate in its indicator project in 1994 and 1995. But MODERN HEALTHCARE has uncovered information through interviews and internal documents that tells a different story about the indicator project.
In 1987, the JCAHO released a list of 17 hospitals that agreed to serve as "alpha" test sites for its anesthesia (now called perioperative) and obstetrics indicators. By 1993, 12 of the hospitals had dropped out of the program, according to a phone survey of the hospitals by MODERN HEALTHCARE. Only four hospitals remain in the program today. One hospital declined comment.
Another 17 hospitals signed up as alpha sites beginning in 1990 to test the cardiovascular, trauma and oncology-care indicators. By 1993, 10 of those hospitals had dropped out.
In total, 22 of the original 34 alpha site hospitals-nearly 65%-bailed out.
Quality assurance managers at the drop-out hospitals told MODERN HEALTHCARE that their institutions left the project because it was too complicated, labor intensive and costly to justify what little benefit they received.
"We were using 0.5 to 0.75 of an FTE (full-time equivalent) just to do the JCAHO stuff," said Brian Schreck, vice president for quality management and planning at St. Luke's Regional Medical Center in Boise, Idaho.
The 267-bed hospital pulled out of the project last fall after four years as an alpha site. The hospital's former president, E.E. Gilbertson, sat on the JCAHO's board in 1986 when it approved the project as part of the JCAHO's Agenda for Change program.
"We were a little naive going in," Mr. Schreck said. "We thought it might be useful having input into how the indicators would be developed." But, according to Mr. Schreck, the process proved time-consuming and expensive, and it duplicated many of the hospital's own data-collection efforts.
Jodi Mansfield, vice president for operations at Shands Hospital at the University of Florida in Gainesville, said, "We were telling them (the JCAHO) that the indicators were not workable, but they weren't listening."
The 536-bed hospital volunteered as an alpha site for three sets of indicators: oncology, medication use and infection control. It also served briefly as a "beta" site for the oncology indicators.
The hospital ran into a number of problems with the indicators. Data were extremely difficult to collect and they had to be collected separately because the JCAHO's specifications differed significantly from other external data requirements, such as those of the tumor registry operated by the American Cancer Society. But perhaps most important, the hospital's physicians didn't feel that the data helped them improve patient care, Ms. Mansfield said.
Ms. Mansfield said she and representatives from other hospitals with similar problems met with the JCAHO last spring. But JCAHO executives didn't take their concerns seriously, she said, and the hospital subsequently pulled out of the program in June.
The beta experience. The plight of the alpha hospitals carried over into the project's beta, or second, phase.
Of the original 451 beta sites, 102, or nearly 23%, dropped out of the project from 1990 to 1993, according to JCAHO records. And, of the 349 hospitals that completed the beta testing phase last year, 157, or nearly half, decided against participating in the voluntary phase of the indicator project that began in January.
The JCAHO also surveyed the beta hospitals that didn't sign up for voluntary testing. Among the reasons cited were that the staff resource requirements were too great, the beta data weren't as useful as they anticipated and the costs were too high.
But hospitals' reactions weren't limited to complaints about administrative and cost hassles. JCAHO documents show that only 34% of the beta hospitals said the perioperative indicators provided clues to improving care. Some 58% said the same thing about the obstetrics indicator data.
"Hospitals more advanced in quality improvement that had sophisticated performance monitoring systems in place identified the IMS information (as) inferior to existing internal informational systems," the American Hospital Association said in an internal Oct. 12, 1993, memorandum, obtained by MODERN HEALTHCARE.
The JCAHO admits that it has no empirical evidence to show a link between use of the indicators and improvements in patient care. In a June 10, 1993, report the JCAHO said its only proof is anecdotal.
"Anecdotal information will continue to be gathered using an ethnographic approach to data collection, as well as informal feedback from users of the indicators," the report said. "Over time, it will become feasible to statistically measure changes in rates following actions taken to improve care."
Recognizing the public relations need to release data linking its indicators with improvements in patient care, the agency began publishing upbeat stories in its bimonthly newsletter, Perspectives, about how hospitals used the indicators to improve care.
However, the names of the hospitals are confidential, so no independent verification is possible. The first account, published last fall, reported that only 24% of 326 surveyed beta sites identified opportunities to improve care using the perioperative and obstetrics indicators.
JCAHO reaction. The JCAHO says many of the test hospitals that didn't like the indicator system didn't realize what they were getting into.
"A lot of hospitals...signed up thinking that it was kind of coming in on the ground floor of the IMSystem, and found that there was not a ground floor yet. They were going to participate in building the ground floor," Dr. O'Leary told MODERN HEALTHCARE. "For those who had that level of expectation, this was hard work. It wasn't well-oiled."
The IMSystem is the formal name of the program being sold to hospitals.
Also, the JCAHO says the alpha and beta phases were designed to test the feasibility of collecting indicator data and sending them to the JCAHO. They weren't using the system as it's designed today. It would be unfair to judge the system by the experiences of the early hospitals, Dr. O'Leary said.
Last June, the JCAHO prepared a one-page handout explaining how "beta testing differs from an operational indicator monitoring system."
Either way, the test-site hospitals had more than ample opportunity to provide the JCAHO with feedback, Dr. O'Leary said. The hospitals repeatedly were surveyed about their views of the indicators, he said.
"The one thing the beta sites complain about the most is the number of surveys that we take," he said.
Still, as of March 4, only 131 hospitals across the country had signed up for voluntary testing this year. Dr. O'Leary attributed the low number to delays hospitals face in installing the software needed to collect the data.
Dr. O'Leary said about 2,300 hospitals have expressed interest in the voluntary phase of the program, including 21 hospital systems. He said he expected a "logarithmic jump" in hospital participation by mid-year.
In its marketing packet, the JCAHO says the median capital cost incurred by beta hospitals was $2,122, and their median annual cost of collecting and transmitting data was $4,845. That doesn't include the costs of other related products, which include educational guides and instructional videos.
States say no. Hospital opposition to the project peaked last year when some institutions enlisted the help of their state hospital associations and the AHA.
An AHA survey of 21 state hospital associations conducted last year found that 17 "strongly agreed" that the JCAHO should not require mandatory participation in the indicator program.
Many of the hospital associations that opposed mandatory use of the system have members that are facing stringent state data reporting requirements. They want the JCAHO to accept indicator data that their hospitals already are collecting. Other associations have their own outcome indicator projects under way, and mandatory participation in the JCAHO's program may reduce participation in their programs.
"If the system is good, hospitals will use it. If it's bad, why make them use it?" asked Vahe Kazandjian, vice president for research at the Maryland Hospital Association, which has had its own clinical indicator project going since 1985. More than 750 hospitals in 48 states are using MHA's indicators.
Dr. O'Leary called much of the opposition "phony," suggesting that the groups were acting out of self interest. He said the JCAHO would welcome the incorporation of better indicators into its system from other sources.
Still, the associations, through the AHA, were successful in pressuring the JCAHO to drop its plan to make use of the system mandatory in 1996.
"Our concerns were heard, and they backed off," said Thomas Granatir, director of quality and evaluation at the AHA's Hospital Research and Educational Trust. Mr. Granatir coordinated the state association efforts.
Last May, the JCAHO's 28 commissioners took the position that use of the indicators will become mandatory once the system is part of the accreditation process. They said that won't happen until the value of the system is proven. The tentative plan is to make the system mandatory in 1997.
Dr. O'Leary said the 1996 date was never firm. "Using the terms `mandatory' and `1996' was to get people's attention that this was coming, and I believe I succeeded," he said.
Dance with us. In September, the JCAHO sent letters to the core dozen or so associations that voiced their concerns and asked them to become "quality partners" in the IMSystem.
As a quality partner, associations would have several options for collaborating with the JCAHO. Under one option, the associations would collect the indicator data from hospitals and forward the information to the JCAHO. Under another, associations would receive state-level data reports.
If an association becomes a quality partner, its members would receive a 10% discount on JCAHO consulting services and a 20% discount on educational products. (The educational discount has since been lowered to 10%.)
In exchange, quality partners would conduct direct-mail campaigns to their members soliciting participation in the project and publish articles in their newsletters about the project.