Healthcare leaders need to expand their vision of what it means to pursue population health management to raise quality, lower costs, and improve the overall health of their patients.
Since its beginnings in the mid-1990s, the industry has defined population health management by looking inward. Health systems and physician practices focused on improving the health of their own patients. Payers used the term to describe efforts to improve the health of the people directly in their charge.
Enthusiasm for this style of population health management emerged alongside the rapid expansion of health information technology. In fact, the first mention of population health management in the medical literature came in a 1996 article in a now-defunct medical informatics journal. It defined the concept as the "treatment of chronic diseases and avoidance of acute disorders for targeted populations."
The tools for managing these targeted populations included "outcomes measurement and management, wellness/preventive programs, care management programs, and cost management." Clinicians could only deploy these approaches if they had access to computerized health records, gathered over their patients' lifetimes.
Progress toward creating those records has been slow largely because of conflicts of interest among the parties that need to cooperate in aggregating patient data. Health IT vendors, seeking market dominance, established proprietary systems without common standards. Providers didn't care to share their data with competing institutions, fearing interoperability would empower patients to leave for another provider.
Government, which had funded the diffusion of health IT, failed to insist on the easy exchange of data across institutions and platforms. The result was a proliferation of information silos, which in turn gave rise to a population health management consulting industry that aggregates patient data and trains providers to use its proprietary cloud-based platforms to target interventions for lowering costs and improving lives.
Modern Healthcare over the years has published numerous articles on these techniques. Frequent ER visitors have been identified. Case managers have been deployed. Medication adherence has been improved. Post-discharge follow-up has been enhanced. Many of these programs showed positive results. Yet the evidence that these strategies have lowered costs is scant.
Through it all, the orientation remained inward: on current patients, plan members and employees. Little thought was given to the societal context that generated the 5% of patients who accounted for half of all healthcare spending. Addressing the social conditions that put people on the path to chronic illness seemed to lie beyond the healthcare system's control.
What are those conditions? Unemployment and underemployment. Stagnant wages. Rising inequality in wealth and income. Inadequate housing. Expanding waistlines. Lagging insurance coverage. Social isolation.
All have contributed to what some are now calling the diseases of despair. For the first time in our history, we're seeing a decline in longevity among less educated groups. From substance abuse to gun violence to unaddressed behavioral health issues, America on some days seems in the midst of a society-wide nervous breakdown.
It would be unfair to say systems, physicians, insurers and communities are ignoring the social determinants of health. Some insurers and systems are building housing for the homeless. Hospitals are opening food pantries. Many institutions are creating community-based programs to combat asthma, diabetes and mental illness. Virtually every provider is engaged in local coalitions to address opioids and gun violence, which have risen to the top of the public policy agenda.
But healthcare systems, which are among the largest employers in most communities, can do a lot more. Their approach to community benefit should move well beyond merely tacking a few community grants onto the usual offering of charity care.