The CMS is in only its first year of rewarding hospitals financially under Medicare for reporting clinical information. And legislation to set up a national system of databases on patient safety was signed into law just last month.
But one of the early champions of a philosophy of gathering and sharing sensitive information about how healthcare providers perform -- the Maryland Hospital Association's Quality Indicator Project -- recently concluded its 20th anniversary celebration.
"The real-world example that started this train down the track is the Maryland Quality Indicator Project," says Jerod Loeb, executive vice president for research at the Joint Commission on Accreditation of Healthcare Organizations. "It was conceived at a time when there wasn't much standardization in this realm."
In 1984, the leaders of seven Maryland Hospital Association member hospitals met and agreed to share data on seven metrics. The QI Project, or QIP, began collecting, slicing and dicing their data the following year and since then, it has adapted and thrived, discarding metrics that didn't work and adding a host of new ones to meet participants' needs.
Using a grant from the Robert Wood Johnson Foundation, the MHA began in 1987 to test its model of indicators and assessment tools for use by hospitals outside the state. Today, the QIP is a registered trademark with more than 225 metrics and more than 1,000 U.S. healthcare organizations using its data-mining services, as well as another 300 healthcare institutions in nine countries abroad.
The project has a full-time staff of 36 and posted operating revenue of $8.1 million for its fiscal year ended June 30, 2004, says Nell Wood, director of marketing and communication for the Maryland QIP.
Planning for the QIP came against the backdrop of -- and widespread dissatisfaction with -- a controversial first effort by the Health Care Financing Administration, predecessor agency to the CMS, to compile hospital mortality data, Wood says.
HCFA, which initially intended to gather information only for internal use by Medicare peer review organizations, was forced to release the data to the news media in 1986 under the federal Freedom of Information Act. (Compilation by the government of hospital mortality rates continues. HHS' Agency for Healthcare Research and Quality publishes inpatient mortality rates for 16 medical procedures and conditions based on claims data.)
The QIP also was initially set up to distribute data only for internal use by participants, Wood says.
In the beginning, it was tough enough getting hospitals to share data with one another, but "it was never intended to be public information," says Richard Davidson, who, before he became president of the American Hospital Association in 1991, served 26 years with the Maryland Hospital Association, including 22 years as president. Davidson worked on the QIP at its inception.
"It was intended to be self-help," Davidson says. "You couldn't participate unless you had a plan to do continuous education and improve."
After briefly reporting aggregated national data on its process measures to the public a few years ago, the QIP returned to its roots and no longer publicly discloses data under its own name, Wood says. But the QIP does help clients supply data to numerous state and national accountability programs, such as the Hospital Quality Alliance program and the JCAHO's Oryx core measures initiative.
Of the 50 or so organizations certified by the JCAHO to provide data for the 3,800 hospitals under Oryx, the QIP is the largest, with 460 facilities using its services, Loeb says.
"I think size in one way makes them unique," Loeb says. "It was a project conceived by and run out of the hospital association, and that makes them unique. And the vision, that makes them unique."
Davidson credits the late Baltimore lawyer and civic leader Eugene Feinblatt, who died in 1998, as the driving force behind the data exchange program. At the time, Feinblatt was a board member and past chairman of the board of the Maryland Hospital Association and past chairman of Sinai Hospital in Baltimore.
Feinblatt's philosophy was that "governance has the ultimate responsibility for everything that goes on in the institution," Davidson says. "He felt that as a trustee, he never really had the tools to answer the questions: Do we do what we do well? Are we doing the right thing?"
"One hospital can't do this," Davidson recalls Feinblatt telling him, "so why don't you look into how to do it?"
Spencer Foreman, then chief executive officer of Sinai Hospital and now president of Montefiore Medical Center in New York, headed the MHA committee that launched the project, Davidson says. Many of the initial measures were homegrown, but based on existing research.
"A bunch of clinical people sat around the table and forged the development of this," Davidson says. "There wasn't a cookie-cutter model."
"All the stakeholders around quality were in the room thinking about how to get to the bottom of (Feinblatt's) question," Davidson says. "It caught fire. The Joint Commission was just beginning to develop their Oryx program and they had more difficulty. They had to be perfect. We were this little group, and we didn't. It was a high-quality program because it was little, and we would maintain the quality. Initially, the growth of the program was purposefully suppressed."
Also, from the beginning, the vision was for the QIP to become what it is now, a means of measuring quality across the full continuum of care, including outpatient, inpatient, psychiatric and long-term care, Davidson says.
"That was a nice vision," he says. "Easy to say, but not easy to get to. ... They've been making progress ever since."
Maggie Eller, director of performance improvement at Calvert Memorial Hospital in Prince Frederick, Md., says her hospital has long recognized the value of the data project. Prince Frederick, on Maryland's Western Shore, was once a rural community but has become a suburb of Washington, D.C., which is 40 miles to the northwest. Calvert Memorial has 123 staffed beds and has used the QIP extensively since 1987.
"It is essentially our quality outcomes database," Eller says. "That's how we will be operating probably for the rest of my career. We have a single database and a single vendor that provides all of our needs, and that includes (JCAHO) core measures. They are incredibly responsive to their customers' needs."
Susan Dohony, vice president for performance improvement at Calvert Memorial, says, "Over time, we have participated with almost every quality measure that they have." One of the measures used the longest, about 15 years, has been throughput in the emergency department, Dohony says, and there the QIP's flexibility has been a benefit.
"The ED is usually the patient's first experience with the hospital, and people don't like to wait," Dohony says. "We want to compare ourselves with emergency rooms that see 30,000 patients a year with a Level II trauma center. That allows us to compare apples to apples. We had a significant number of patients staying for over six hours (above the norm), but that was OK because a number of them were psychiatric patients who tend to stay longer. So (the QIP) gives you the ability to drill down."
Adam Beck, director of performance improvement at Franklin Square Hospital Center in Baltimore, one of the seven founding QIP hospitals, says the breadth of the QIP metrics is an important benefit.
"One of the main things we use it for is to get at benchmark information for various indicators that are of specific interest to us," Beck says. "These are benchmarks that are not very easily obtainable from any other source ... your hospital's specific mortality rate for a specific DRG, using all-payer data, not just MedPAR (Medicare) data, from a sample across the nation, not just the state of Maryland."
Apples to apples
User support and data integrity are also key features, Beck says. "They have a very specific guide, or manual, that goes along with every measure. What's really great, they use conformance surveys just to make sure everyone is measuring things across the country, so you feel pretty confident everyone is measuring apples to apples.
"It's nice that it has that structure behind it," Beck says. "They've been great for our hospital."
Loeb says the future for the QIP and data organizations like it remains bright, even as healthcare adopts more clinical information technology. Half of the hospitals that the JCAHO accredits have 100 beds or fewer, so while a few large, tech-savvy hospitals, such as Evanston (Ill.) Northwestern or LDS Hospital in Salt Lake City, may be able to handle sophisticated data-mining in-house, most will continue to need services like those the QIP offers.
"Only when data collection becomes a byproduct of healthcare delivery are you going to see hospitals able to do this on their own," Loeb says. "It's going to take a long time before the value equation enables them to do it."