Methodist Medical Center of Illinois, Peoria, registered about five adverse events per 10,000 doses of medication before it mobilized all its doctors, nurses and information technology personnel to get that rate down. With a new chief executive officer at the helm in 1999, the 284-bed community hospital took seriously a challenge that had just been handed down by the Institute of Medicine in a shocking report on the prevalence of medical errors in U.S. healthcare titled To Err Is Human.
After a year of piloting a computerized system that uses bar codes at the bedside to check the accuracy of medication administration, the hospital took the system facilitywide in 2001. Within a year, the rate of adverse events involving medications fell to 2.3 per 10,000 doses.
Methodist's results met the central goal articulated for the healthcare industry in that 1999 IOM report: a 50% reduction in errors in five years.
But as the IOM's sobering call to action marks its fifth anniversary this month, achievements such as that medication-error reduction in Peoria are regarded by many as the exception rather than the rule.
"There's no evidence we've come anywhere near that (50% improvement) at this stage. We've only begun," says Donald Berwick, president and CEO of the Institute for Healthcare Improvement, or IHI, and a member of the committee that penned the report.
Pegging the number of deaths from medical errors in U.S. hospitals at 44,000 to 98,000 per year, a committee of experts buttonholed the healthcare community with a four-pronged plan of action to systematically design safety into the process of care. "We believe that with adequate leadership, attention and resources, improvements can be made," said William Richardson, speaking then as chairman of a 19-member committee of leaders in healthcare delivery, patient-safety research and consumer advocacy that authored the report.
The IOM committee called for an evaluation after five years to assess progress in making the nation's healthcare facilities safer. Several committee members were among those asked by Modern Healthcare how near U.S. healthcare organizations have come to the level of patient-safety improvement envisioned in 1999.
"Nowhere close," says Brent James, IOM committee member and executive director of the Institute for Health Care Delivery Research at Intermountain Health Care, Salt Lake City. "You can see it clearly in a few institutions, but it's not widespread. ... I thought we'd be a lot further down the road."
Berwick says he has seen "many more examples of organizations concerned about patient safety and doing something about it," but the overall response has fallen below expectations. "It's a public health emergency that remains unrecognized as one."
A rosier assessment comes from Carolyn Clancy, director of the Agency for Healthcare Research and Quality, or AHRQ, a grant-making arm of HHS intent on gathering scientific evidence to inform patient safety improvement. Clancy says there is "enormous interest among the professions" and within healthcare organizations. "Most in the healthcare system are fully engaged now," she says.
The actual state of the industry is probably anyone's guess, says Arthur Levin, director of the Center for Medical Consumers, New York, and an IOM committee member. "I don't think we know (whether medical errors have declined). We don't have a system of counting in place.
"I don't doubt that the IOM report has stirred the pot a lot and has made it a front-page concern," he says, though after five years, "The urgency isn't there. It's been driven back to the safety, quality, research gurus."
Levin says there are "beehives of activity in areas around the country," but in terms of assessing general progress, "We really don't have a clue. That's unfortunate."
Obstacles to candor
Experts and observers on the politics, psychology and science behind patient-safety issues say the healthcare industry's progress will continue to be slow and halting until the candid cooperation necessary for building a safe environment becomes less hazardous to clinicians and more valued by healthcare leadership.
Any such campaign needs information to go on, and the unavoidable first step in shaking information loose to prevent harmful mistakes tomorrow is for people to report mistakes today. Threats of repercussions from bosses or licensing agencies, not to mention malpractice actions in a litigious society, are just a sampling of the disincentives to volunteer the raw material of safety activities, experts say.
Recognizing this obstacle, the IOM report called for healthcare organizations to develop a "culture of safety," designed to detect and minimize hazards and the likelihood of error without attaching blame to individuals. But it also called for a nationwide mandatory public-reporting system, one that would protect the confidentiality of information on mistakes with no serious consequences but make public the errors that cause harm.
That's a mixed message in the campaign to coax out crucial information on the greatest dangers to patients, according to advocates of confidentiality. For a culture of safety to take hold, James says, "You have to make reporting safe."
Proponents of mandatory reporting say such a system would start quantifying the extent of the error problem, which is as much a mystery today as it was five years ago. The IOM committee, for example, had to take numbers from past academic studies in medical journals that used sampling methods, extrapolating the findings into a national error tally.
"They weren't telling the industry anything that wasn't known (in research circles)," Berwick says. "But the industry was stunned by what was reported."
Once the debate died down about whether the IOM overstated or understated the number of deaths from medical errors, momentum for addressing errors, whatever their magnitude, got bogged down in "diversionary arguments" about mandatory vs. voluntary reporting, says Levin, who favors the dual system the IOM laid out.
The mandatory option has since "dropped off the table," he says, lamenting that the lack of accountability for mistakes that kill or injure is not healthy for consumers. "You make systems this important mandatory. You need a stick. There's no evidence that voluntary systems work," Levin says.
That situation may change soon. Of the $165 million parceled out by the AHRQ in research grants in 2001 through 2003, $69.6 million went to 16 demonstration projects exploring different approaches to medical-error reporting systems. Healthcare must find a middle ground that recognizes the public's right to know while making it safe for clinicians to report adverse events, Clancy says.
In search of adverse events
Besides trying to determine the impact of various reporting approaches on the inclination of healthcare professionals to come forward, the AHRQ is seeking evidence that unfettered reporting actually leads to significant reductions in medical mistakes, a connection Clancy says is "plausible but not established" by scholarly findings. An ongoing study of intensive-care units in Michigan is among the research that promises to shed light on the relationship between reporting and safety improvement, she adds.
But in places such as west central Wisconsin and Salt Lake City, anecdotal results from initiatives at hospitals large and small have produced strong evidence of the connection.
At Luther Midelfort-Mayo Health System, a healthcare network of one hospital integrated with a physician group in Eau Claire, Wis., an emphasis on nonpunitive reporting resulted in a twelvefold increase in the number of errors recorded. Instead of dwelling on who was responsible, managers devised interventions in the process of care based on the types of errors brought to light (Eye on Info, a supplement to Modern Healthcare, May 28, 2001, p. 18).
To bolster that trove of information, Midelfort did its own detective work by examining patient charts by hand. What it found was a ratio of 233 actual or potential errors for every 100 charts reviewed, or two to three errors per patient. More than half were traced to incomplete communication of information on patient medications during admission, transfers from one department to another, or during discharge.
Several protocols were set up to fix that problem, and the result was an 83% reduction in potential adverse drug events during a six-month pilot on one hospital unit in 1998, a year before the IOM report landed. The protocols, with slight modifications where appropriate, were expanded to other areas of the hospital.
In 2005, the Joint Commission on Accreditation of Healthcare Organizations will require such "reconciliation" of drug information on hospitalized patients as one of its national patient-safety goals.
The no-penalty reporting and chart reviews also turned up a confusing lack of standards for administering insulin and the blood thinners heparin and warfarin. The insulin problem was leading to dangerous episodes of low blood sugar in diabetic patients, and a problem regulating blood thinners posed the threats of strokes or excessive bleeding. A standardization push led to a 50% decrease in complications in both areas.
At Intermountain's flagship LDS Hospital in Salt Lake City, a campaign begun more than a decade ago to turn evidence of adverse events into preventive solutions produced first a strikingly large body of evidence and then a level of improvement just as striking.
LDS long had been averaging about six serious reported adverse events a year, a low count that James attributed to widespread fear of retribution when such episodes were admitted and written down. But information professionals and key clinicians came up with a way to capture data signaling typical responses triggered by an adverse event, such as ordering drugs to counteract a narcotics overdose or reduce the severity of an allergic reaction.
By keeping track of such "triggers" and examining the circumstances under which they came into play, the hospital researchers were able to isolate 581 moderate and severe adverse events in 1990, James says. With the hundreds of events as ammunition for change, concerted efforts to change faulty processes of care cut that incidence to 437 by 1994 and 271 by 1997, a reduction of more than 50% compared with the baseline year.
Besides achieving the IOM goal for error reduction years before it was set, the high-tech hunt for hospital quality problems documented the stark difference between voluntary reporting systems and proactive efforts to document harm in the hospital. "That's the truth of the matter right there," James says of the LDS totals: incidents measured in hundreds rather than handfuls.
LDS turned that grasp of the possibilities for harm into a series of victories in the fight to ward off injury or complications, focusing first on the most severe. In the high-risk shock trauma respiratory ICU, for example, 40 severe adverse drug events, or ADEs, were recorded in 1990 using the more sophisticated methods of tracking them. By 1998 the ICU had just one serious ADE all year.
One program within that overall improvement effort had a particularly quick payback. A push to anticipate serious ADEs from allergic reactions succeeded in reducing the number of those incidents to eight in 1991 from 56 the year before.
The nonpunitive approach
Somewhere in the course of these and other patient-safety campaigns in healthcare organizations, talk began to turn away from error-reporting, and even error reduction. Arguing that there's a key distinction, some leaders have tossed the notion of error aside and replaced it with a focus on reporting adverse events during hospital care, whether preventable or unavoidable, and mobilizing to address the reasons they occurred.
"The (healthcare) system needs to be viewed with fresh eyes," says David Pryor, senior vice president of clinical excellence at Ascension Health, St. Louis, a 67-hospital Catholic system more than one year into a blameless approach to patient safety.
Striving to eliminate adverse outcomes instead of errors "creates the intellectual freedom to ask how to prevent a bad outcome," he says. When the focus is on whether a mistake was made, clinicians and administrators lose sight of whether something can be prevented and how, Pryor says.
Especially in the current legal environment, error-reporting makes hospital workers or physicians concerned about exposing themselves, and managers "spend a lot of time worrying, 'Was there an error?' rather than 'Was there an adverse event?' " he says.
There's no more outspoken proponent of this distinction than James Bagian, director of the VA National Center for Patient Safety, established in 1999 by the Veterans Affairs Department to lead its healthcare safety efforts. Bagian started work as the center's first director nine months ahead of the IOM report.
"Never ask who's responsible. That should never be part of the response," says Bagian, a former NASA shuttle astronaut who brought tenets of safe practices with him from the aerospace industry. "And you should wash the person's mouth out with soap who asks the question." In a VA handbook that provides guidance on minimizing adverse outcomes of care, "The word error doesn't appear, not even once."
That's a contrast to the theme played out in To Err Is Human, he says. Although the IOM succeeded in calling attention to the need for preventing harm in the hospital, Bagian says he was dismayed by the report's spotlight on errors committed by medical personnel. "It came out and I said, 'Man, this is the wrong message.' "
VA's model for safety improvement
The VA's well-recognized achievements in quality improvement during the past decade often are tied to a clinical information system it developed called Vista that's drawing renewed attention as a model for electronic health records. But the patient-safety effort formulated under Bagian's watch also has drawn its share of praise and recognition.
The IHI's Berwick puts the VA on a very short list of large healthcare organizations he says have made big strides in safety improvement. Using the VA's approach and advice, the American Hospital Association in 2003 developed a "tool kit" for the nation's hospitals aimed at identifying aspects of care at high risk for causing patient harm. The VA says its safety-training program has devotees overseas as well: Australia and Denmark already use it, and nations such as Canada, Japan, Sweden and the United Kingdom are preparing to adopt it.
Central to the approach is the starting point that errors cannot be eliminated-that on any given day and under the right set of circumstances, anyone can make a mistake, Bagian says. But constant feedback on faults in care processes can draw attention to dangerous circumstances, giving managers the power to anticipate problems or catch them early and prevent errors from leading to catastrophes, he says.
In all this, reporting is "fuel for the engine" but not a means of keeping score on safety, Bagian says. "To use (incident reporting) as a metric for how safe you are, it can't be done."
Incident reporting in the VA scheme fuels a steady stream of introspective investigations called root-cause analyses. The term was popularized by the JCAHO, which requires such analyses in response to so-called sentinel events in hospitals: serious adverse events resulting in major injury or death.
Unlike the JCAHO, however, the VA relies just as much on close calls as it does on the serious stuff to improve on care delivery. "In our lives we learn from close calls every day," Bagian says. "But in organizations we don't have any way to do this."
Close calls, in which nothing untoward happened but clearly could have, occur up to 300 times for every adverse event, he says. Yet they accounted for less than five-hundredths of a percent of total events reported to VA officials before the rollout of the new patient-safety approach in all VA hospitals in August 2000.
Within 10 months, reporting of adverse events rose thirtyfold, and close calls came to account for 50% of the incidents to which a root-cause analysis was applied.
"People think, 'No blood, no harm,' " Bagian says of the attitude toward a close call. "Because that's how people think, nobody is afraid to report it."
In addition to the internal framework, a second confidential reporting program open to all VA employees was instituted in 2001 as a "safety valve" to help ensure that vulnerabilities not picked up by the first method could be brought to the health system's attention through external sources.
But reporting was only the first half of the strategy. Something constructive also had to be done about what was reported. Of the root-cause analyses performed in VA hospitals before the program started, half concluded that nothing could be done about whatever problem was unearthed. Now the focus is on taking corrective action instead of throwing up hands in defeat.
Bagian gives the example of a conclusion that some patient falls are unavoidable. "Injuries from falls are preventable," he counters. "If you can't think of any prevention, you're not thinking hard enough."
Today fewer than 1% of such analyses conclude without at least one corrective action, and the average is three or four, he says.
It's that attitude that helps "high-reliability organizations," such as those in the aviation and nuclear power industries, focus on preventing harm to pilots or populations even though the activities themselves, flying and creating atomic energy, are inherently risky. The VA partnered with NASA to develop the supplemental reporting program.
Bagian, a physician, engineer and 17-year NASA veteran, has seen what happens when reporting systems fail. He helped investigate the explosion of the shuttle Challenger in 1986, and while with the VA he also helped with the probe of the shuttle Columbia's breakup on re-entry in February 2003.
He had flown on Columbia in June 1991, the second of his two trips into space. The first flight was aboard the orbiter Discovery in March 1989.
A call for leadership
Above all, the aviation culture was acutely aware of the inevitability of error and that no one was immune from committing one. "The old hands around there say, 'There are those that have, and there are those that will,' " Bagian says.
To carry the time-tested lessons to healthcare, leaders have to acknowledge the reality of error and work to plow such feedback into fail-safe procedures for patient safety, he says. "Necessary but not sufficient is leadership at the top level. ... The boss has to say it's important. How do they show it's important? They talk about it."
That means putting safety first on the agenda, ahead of budget and finance issues, and getting regular reports on improvement efforts. When it comes to a patient-safety push, executives "can't make it work by themselves. But they can extinguish it," Bagian says.
However, when healthcare executives are ducking for cover from medical malpractice threats and public backlash about caregivers doing something wrong to a patient, the courage to come forward with straight talk about medical mistakes can be in short supply, experts acknowledge.
"We continue to have the problem of honesty-truth-telling-in patient safety," Berwick says. Truth can be an adverse event itself for healthcare leaders: "In declaring they need to get safer, it's an implied admission that they're not safe enough," he says.
A first step for leaders is to start putting the infrastructure of patient safety in place, James says. While progress on safety has been limited during the past five years, preparations for a safety program have made inroads, with safety officers being "trained at a good clip" and now "moderately common" in hospitals, he says.
Safety officers have reached the executive suite in a handful of hospital systems. Just last week, Duke University Health System, Durham, N.C., named Karen Frush as its first chief patient-safety officer. Frush will be responsible for developing a comprehensive safety program and measuring the results.
Setting up a program by itself accomplishes nothing, James emphasizes. "There's a difference between hiring a patient-safety officer and doing something about patient safety," he says.
In Peoria, Methodist President and CEO Michael Bryant heralded the importance of a computer-assisted patient-safety initiative by forming an executive information-technology steering committee and chairing it himself. The panel included the entire senior executive team and a few members of the hospital board as well as representatives from hospital line management and physicians, says Tom Rippeto, the hospital's chief information officer.
The facility's chief nursing officer, Debbie Simon, recommended "marrying" the IT and clinical staffs to ensure buy-in and participation during the design and rollout of a bar-code-reading medication administration system. The resulting Center for Innovation and Clinical Advancement set the stage for nurse involvement in setting up a medication system from McKesson Corp. and supported other computer initiatives as integral parts of a strategic plan for clinical safety, Rippeto says.
The safety campaign, launched in 2000, included a "no-penalty reporting system" encouraged by incident-reporting forms, which called for not only reports of problems but also observations and recommendations independent of an adverse event, Bryant says. "It took about a year for people to understand that we were indeed serious" about the blameless approach, but now the culture has taken hold, he says.
The rate of reports is now in decline after proliferating for several years, which Bryant says is evidence that "we're now beginning to get the real reward for looking at all these (safety) issues before." Changes made in response to reported incidents are "becoming mainstreamed" in care processes and contributing to a lower level of vulnerability to mistakes, he says.
Bryant's role as CEO is to talk up the safety angle in all meetings with departments, ask constantly what they're doing about patient safety and keep it a top priority for all personnel. "I'm the chief advocate, chief cheerleader," he says. That includes personalizing the abstract notion of keeping patients safe, "to remind them that this is your mother, your father, your child, your grandmother."