Even the beleaguered Veterans Affairs Department is getting into the act. It has launched an opt-in collaboration with Facebook that taps veterans' social media and mobile phone accounts to provide the VA with data to help it identify those at risk of suicide.
The era of big data has its perils, of course. The CMS, which now warehouses claims data for about 100 million Americans through Medicare, Medicaid and the Children's Health Insurance Program, is using big data analytics as the linchpin of its very active healthcare fraud-investigation campaign. The latest report from HHS' Office of Inspector General used big data to identify questionable patterns in clinical laboratory claims involving 20% of the $8.2 billion paid out for those services.
The McKinsey Global Institute recently estimated big data could save the healthcare sector $300 billion a year, with about two-thirds of those savings coming in the form of lower healthcare expenditures. From a provider perspective, big data's analytical sword clearly swings both ways.
None of this would have been possible without the more than $22 billion the federal government has poured into hospitals and physician practices since 2009 to acquire electronic health-record systems. But as that gusher of funding slows, let's not forget the promises made in exchange for the money. Achieving the next phase of EHR meaningful use requires making the data generated by the new systems transferable to other organizations. The buzzword is interoperability.
Alas, it isn't in any of the major players' self-interest to make their data interoperable. “Competing healthcare organizations that treat overlapping patient populations in a community may be reluctant to share relevant data, typically because each organization fears that others could use its data for competitive advantage,” an article in this month's Health Affairs noted.
It's not in the interest of many of the vendors who develop and install the systems either. The large players such as Epic, because of the depth and breadth of their marketplace penetration, can offer de facto interoperability—at least for those within their networks. They are well-positioned to use their proprietary software to freeze out smaller players and maintain their foothold in the market.
That's why the federal government must hold fast to the interoperability requirements included in the current rules. Pressures to relax the meaningful-use standards will no doubt mount this year as deadlines approach.
The Federal Trade Commission should use its antitrust review authority to maintain an open architecture for healthcare information. The Food and Drug Administration needs to develop better standards for the numerous data-generating devices entering the market to ensure the accuracy of that data.
The government also needs to take steps to assure the public that the era of big data will not violate the sanctity of personal health information. As the Health Affairs overview noted, “to harness big data's potential, consent may need to evolve from strict regulations for every potential use of data to a balance between personal control and informed sharing in the service of public health.”
The general public needs to be shown why that evolution won't translate into the end of privacy. Otherwise, providers and payers who want to harvest the potential of big data will become suspect in the eyes of Main Street America.
Follow Merrill Goozner on Twitter: @MHgoozner