Rhode Island's health department this week is set to begin reporting how well the state's 10 acute-care hospitals provide recommended care for three medical conditions--the clearest signal yet that a new push for clinical accountability in healthcare is gaining momentum.
Under a mandatory reporting program, which satisfies requirements of a 1998 state law, Rhode Island consumers will be able to peruse scores showing how often each hospital has adhered to guidelines for treating heart attack, heart failure and pneumonia.
Clinical guidelines for treating those conditions are shaping up as the focus of a first wave of federal, state and private efforts to reach consensus on clinical practices as markers of quality care.
The three conditions are among the most common reasons why people end up in a hospital, and quality-improvement experts have seized on the treatment guidelines as quantifiable factors that can be monitored to evaluate the quality of healthcare delivery.
From 'cookbook' to credible
Once derided by physicians as "cookbook medicine," clinical guidelines are gaining scientific credibility as researchers draw ever-tighter links to medical outcomes. But doctors' reluctance to part with longstanding latitude, along with the difficulty of organizing compliance with some guidelines, has made implementation a significant challenge for hospitals even as the evidence of benefit grows stronger.
The evidence level rose a notch last month when researchers at Duke University Medical Center, Durham, N.C., presented findings of a groundbreaking study that directly tied heart-attack guidelines to a patient's odds of dying.
The study of 257,000 patients documented 40% lower death rates as the reward for rigorously following a prescribed set of actions and interventions in the minutes and hours after a heart-attack victim enters a hospital. Those actions can be as simple as giving patients a dose of aspirin within 24 hours of a heart attack, or as challenging as performing an electrocardiogram on a suspected heart-attack victim within 10 minutes of arrival in the emergency department.
For quality-improvement forces engaged in a battle for the priorities of payers and healthcare executives, the findings demonstrate that clinical guidelines for heart care pay off.
"It's an important lesson on the whole notion of spending money on performance measures and reaping the value on outcomes," said Jerod Loeb, vice president of research and performance measurement at the Joint Commission on Accreditation of Healthcare Organizations. The study "highlights the importance of these (heart-attack) metrics specifically and the importance of quality measures in general," Loeb said.
Mounting evidence validates federally sponsored efforts to instill adherence to clinical guidelines in hospitals--and to link performance to reimbursement, said Kenneth Kizer, M.D., president and chief executive officer of the Washington-based National Quality Forum, a private organization charged with improving the nation's healthcare quality. "That's absolutely where this train is going," Kizer said.
The engine of the train is fueled by funding from the federal government, which contributed about $800,000 of the $1 million cost of developing and testing the hospital quality-measurement project in Rhode Island as well as about $1 million for the NQF to develop a national consensus on an initial set of standardized hospital-performance measures.
As the Rhode Island program debuts this week, other initiatives on accountability for clinical performance also are picking up speed.
A few days after hospital scores appear on a Web site in Rhode Island, the Centers for Medicare and Medicaid Services is expected to unveil its own pilot project for disclosing measures of hospital performance to the public, Modern Healthcare has learned. Medicare quality-improvement organizations in Arizona, Maryland and New York will conduct the tests.
In addition, the NQF's work on a set of hospital-performance measures, funded in part by a joint grant from HHS and the Agency for Healthcare Research and Quality, could be forwarded to the CMS by the end of January, Kizer said.
Also in January, the JCAHO will begin reviewing hospitals' clinical track records on treatment of heart attack, heart failure, pneumonia and pregnancy as part of its accreditation process. To be accredited, hospitals will have to adhere to guidelines covering care of at least two of those four conditions. Hospitals were required to start reporting data as of July 1 to information systems vendors authorized to participate in the JCAHO's Oryx program of performance measurement.
CMS Administrator Thomas Scully recently disclosed a new federal effort to link hospital reimbursements to quality of care (Sept. 16, p. 9).
Neither Scully nor other CMS officials would comment on the details of the new pilot program or plans for the incoming hospital measures. The NQF board, which represents employers, insurers and providers, is scheduled to vote on a set of up to 48 measures by Jan. 29, Kizer said.
A CMS spokesman told Modern Healthcare that "it would be premature for us to talk at all about anything that's going on with hospitals." Up to now the agency has concentrated on publishing quality information about nursing homes (April 22, p. 12).
But Kizer said that once the NQF's measures are delivered to CMS officials, "it's certainly our understanding and expectation that they will be using them" to evaluate hospitals.
Other payer heavyweights, such as the employer-backed Leapfrog Group, the Washington Business Group on Health and General Motors Corp. have indicated they will use the measures as a basis for selecting healthcare providers according to their clinical performance, Kizer said. "This is the future. This is where things are going," he said.
A sneak preview of what might be in store for the healthcare industry begins playing on Dec. 10 in Rhode Island, when hospital-specific performances on clinical guidelines become public.
The project implements the third phase of quality measurement required by a 1998 law called the Health Care Quality Reporting Act. The legislation is also known as the Fogarty Law for the former state senator--Lt. Gov. Charles Fogarty--who introduced it.
Previous projects involved reporting patient-satisfaction data for all the state's hospitals and clinical measures of quality for nursing homes.
The task of developing the hospital measures benefited from similar efforts by the JCAHO and the CMS to test approaches for collecting data and settling on measures that were relatively easy to report, said John Courtney, manager of performance measurement at Qualidigm, the contractor that created the hospital program for the state health department. The Middletown, Conn.-based firm is the QIO for Connecticut, but its CEO, Marcia Petrillo, also serves as executive director of Rhode Island Quality Partners, a separate QIO for Rhode Island.
In partnership with the Hospital Association of Rhode Island, Qualidigm signed up to participate in the JCAHO's pilot test of standardized core measures slated for inclusion in the Oryx program. Petrillo also pitched the project to the CMS as a way for the agency to kick-start its strategy for reporting on hospital performance.
The resulting federal funding paid about 80% of the costs for the JCAHO pilot in Rhode Island and for adapting it to the state's reporting requirement, Courtney said. In return, the state program pledged to "pass on lessons learned to the CMS," he said.
The QIO created a composite score of overall performance on the three medical conditions based on a total of 10 guidelines conforming to the JCAHO requirements. The first release of scores for individual hospitals will cover care provided from May 2001 to December 2001 and will be posted on a Web site. Printed versions of the results also will be available.
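Rolling several guideline-compliance measures into one number is commonly done with an "opportunity-based" composite: count every instance in which an eligible patient should have received a recommended intervention, then divide the number of times care was actually delivered by that total. A minimal sketch, assuming this approach; the measure names and counts are purely illustrative, not Rhode Island's actual methodology or data:

```python
# Hypothetical opportunity-based composite score: one common way to roll up
# adherence to multiple clinical guidelines into a single hospital score.
# The measures and counts below are illustrative only.

def composite_score(measures):
    """Each measure is a (patients_who_received_care, patients_eligible) pair.

    Returns total recommended care delivered divided by total opportunities.
    """
    delivered = sum(num for num, _ in measures)
    opportunities = sum(den for _, den in measures)
    return delivered / opportunities if opportunities else 0.0

# Illustrative heart-attack measures for one hospital:
# aspirin within 24 hours, beta-blocker on arrival, timely ECG.
example = [(180, 200), (148, 200), (74, 200)]
print(round(composite_score(example), 3))  # 402 of 600 opportunities -> 0.67
```

One design note: weighting by opportunities (rather than averaging the three per-measure rates) keeps a rarely applicable measure from swinging the overall score.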
Proxies for outcomes
The Duke study of heart-attack guidelines also used a composite score to group hospitals on their adherence to more than a dozen clinical guidelines. About half the measures included in the Duke study are common to the three principal quality initiatives in the healthcare industry: the core set of JCAHO Oryx measures, the NQF's proposal and a voluntary clinical guideline compliance effort run by Medicare's QIOs since 1999.
The news about heart-attack care lends strong support to efforts by the JCAHO, the NQF and others to evaluate and accredit hospitals based partly on how well they follow such established guidelines, said the JCAHO's Loeb. The Duke findings prove what medical experts have deduced all along, that "these measures are good proxies for outcomes," he said.
In an analysis of heart-attack care provided by 1,247 hospitals from June 2000 to June 2002, the Duke study calculated a death rate of 8.3% for facilities adhering the closest to care guidelines, compared with 15.3% for hospitals that lagged the most in providing a composite of recommended care (See chart, p. 16).
The results, presented last month at the 75th annual scientific session of the American Heart Association, were among the first to demonstrate that following guidelines established by the heart association and the American College of Cardiology can improve outcomes for heart-attack patients, said Eric Peterson, M.D., one of the Duke cardiologists conducting the study.
A spokesman for the medical center said the authors likely will seek to have the results published in a peer-reviewed medical journal, but nothing has been submitted yet. Among other considerations, new data for inclusion in the study continue to pour in from a national database, and a decision will have to be made on when to cut off the flow of new cases. The additional data will further enhance the credibility of the findings, the spokesman said.
Obstacles to acceptance
Until the results can be validated by peer review, it will be difficult for the healthcare industry to comment on the details of the findings, said Donald Nielsen, M.D., senior vice president of quality leadership at the American Hospital Association. The AHA is "very supportive of the use of evidence-based medicine" when it's been validated by the medical community, he said.
Despite clinical studies proclaiming the effectiveness of the guidelines, for 12 of the 15 interventions monitored in the Duke study, at least one in four eligible heart-attack patients failed to receive the recommended care.
For example, the clinical effectiveness of heart drugs such as beta-blockers has been demonstrated well enough to warrant their use in all but a few instances in which other patient problems might trigger complications. But researchers found that only 74% of patients in the study got beta-blockers in the first day of treatment, and there were wide swings in compliance between leading and lagging hospitals (See chart, p. 7).
"Even for well-accepted treatments, such as giving beta-blockers within the first 24 hours of a heart attack, patients treated at lagging hospitals have only a 50-50 chance of getting the drugs," Peterson said. "In contrast, at leading U.S. centers, nearly 82% of patients were given beta-blockers. This degree of variation in care seems unacceptable."
The problem is that many hospitals haven't set up a process that hits all the clinical bases without depending on explicit orders from physicians during a crisis, said David Schulke, executive vice president of the American Health Quality Association, the Washington-based umbrella organization for Medicare QIOs.
Physicians are aware that the guidelines are good medicine, but "they don't have the time and are not in a position to organize the workplace," Schulke said. "If the system is not set up to make this easy, some people will not get these interventions."
But setting up such a system isn't easy, said Donald Berwick, M.D., president and CEO of the Institute for Healthcare Improvement, a Boston-based not-for-profit. "There has to be will," he said, but providers also must pair that will with a sound organizational model and the skills to make the necessary changes in care processes.
"Protocols are a method. The objective is reliability, so you can make sure you do the same thing the right way every time," Berwick said.
Hospitals still have to overcome reluctance from doctors, whose medical training teaches that they are ultimately responsible for the well-being of a patient and that they "don't stop thinking" about every possible angle to treating the patient successfully, he said. "That creates a somewhat allergic reaction to standardization."
Gearing up for guidelines
Hospitals will have to find a way to counteract the reaction, because guidelines are inching their way into the to-do lists of hospitals. "As of this point, essentially all Joint Commission-accredited hospitals in the country are required to collect this data," Courtney said.
Studies that make the connection between interventions and outcomes "ratchet up the level of motivation" for hospitals to devote the necessary resources to "simple, easily executed systems for reliably distributing these interventions," Schulke said. "People need to learn to create the systems that enforce their intentions."
A strong correlation with patient benefit is ammunition for advocates of clinical-process improvement at the hospital level, such as Eric Harvey, a clinical pharmacist in Seattle. Harvey contacted Modern Healthcare for more information after the study was reported in the Nov. 18 edition of the Daily Dose electronic newsletter. Harvey worked at Virginia Mason Medical Center in Seattle for 10 years and helps write guidelines for Clineanswers, a Woodland Hills, Calif.-based vendor of clinical performance systems.
Harvey said he wants to get his hands on the study to help gain physician support for better adherence to heart-attack guidelines. "When the physician is confronted with a patient, he doesn't have time to collect information or review the guidelines at that moment," he said.
The simple administration of a drug such as aspirin requires little procedural planning, but the more planning a guideline demands, the lower the compliance among U.S. hospitals. For example, hospitals in the Duke study gave aspirin to heart-attack patients in a timely manner 86% of the time. But timely electrocardiograms were performed only 37% of the time, and administration of clot-busting drugs to restore coronary blood flow was accomplished within the critical first 30 minutes only 36% of the time.
It's these kinds of gaps that will show up on performance reports such as the imminent national performance measures, Kizer said. "Not everyone out there is doing this. The results are going to show differences."