Claims that health information technology will help the hospital industry cut costs are unsupported by facts, at least based on how computers have been used thus far, according to research to be published today.
The report, “Hospital Computing and the Costs and Quality of Care: A National Study,” was based on annual surveys of IT implementation at more than 4,000 hospitals, on Medicare cost reports, and on cost and quality databases developed by the Dartmouth Atlas project. Led by David Himmelstein, a physician and associate professor at Harvard Medical School, the study was published in the American Journal of Medicine.
During a period of rising IT usage, “hospitals' administrative costs increased slightly but steadily, from 24.4% in 2003 to 24.9% in 2007,” the report said, yet there was “no association between administrative costs and any quality measure.” Further, “we found no evidence that computerization has lowered costs or streamlined administration.”
In fact, “hospitals that increased their computerization more rapidly had larger increases in administrative costs,” the report said.
“More encouragingly,” however, the researchers found that “greater use of information technology was associated with a consistent though small increase in quality scores.”
The authors concluded: “Whatever the explanation, as currently implemented, health information technology has a modest impact on process measures of quality, but no impact on administrative efficiency or overall costs. Predictions of cost-savings and efficiency improvements from the widespread adoption of computers are premature at best.”