Advances in medical technology and pharmaceuticals in the past 25 years have proceeded at such a clip that it would take a bionic eye to focus on just the highlights. But don't worry, that's probably coming soon, too.
In the U.S. alone, consumption of medical and diagnostic devices tipped the scales at $71.4 billion last year, according to AdvaMed, a Washington-based trade group. From magnetic resonance imaging to positron emission tomography to linear accelerators, and from clot-busters to protease inhibitors to drug-coated stents, physicians and hospitals have taken a quantum leap in clinical capabilities. Genetic engineering and the mapping of the human genome carry the promise of even greater advances in prevention and treatment of diseases.
The explosion in options has been accompanied by a commensurate jump in costs. In fact, hospitals have to go to the bond markets for the financing to buy many of the big-ticket items.
Meanwhile, as increasingly sophisticated technologies slash the time needed to diagnose and treat problems, hospital inpatient days have declined in step. Yet study after study concludes that drugs and medical technology account for most of healthcare's cost growth.
So the $71.4 billion question remains: Are these technological advances really cost-effective, and is society willing to continue to pay for them if they aren't?