Northwell Health is trying to tackle disparities in maternal health with the help of an artificial intelligence chatbot.
The Northwell Health Pregnancy Chats tool, developed in collaboration with Conversa Health, guides patients through their prenatal and postpartum journeys while assessing social barriers and mental health issues.
The tool is part of an initiative within Northwell’s Center for Maternal Health that aims to reduce the maternal mortality rate, particularly among Black women. Major barriers include gaps in behavioral health, education and community resources, said Dr. Zenobia Brown, senior vice president of population health and associate chief medical officer at the New Hyde Park, New York-based health system.
Employing a "high-tech, high-touch" approach, the chatbot helps Northwell providers manage high-risk pregnant patients by deploying personalized education and patient assessments. The tool offers patients information relevant to each stage of pregnancy, such as blood pressure monitoring, prenatal tests, birth plans and lactation support, and regularly screens them for social and mental health needs.
The chatbot is integrated with Northwell’s care management team and can direct patients to relevant resources and alert providers if interventions are needed. When a patient indicates to the chatbot that they are having medical complications, the tool triggers a call from a Northwell representative or instructs the patient to visit the emergency department.
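The workflow described above amounts to a tiered triage rule: routine replies continue the education track, flagged social or mental health needs alert the care team, reported complications trigger a call, and urgent symptoms send the patient to the emergency department. The sketch below illustrates that logic in Python; the keyword tiers, function and enum names are hypothetical stand-ins for illustration, not Conversa Health’s actual implementation, which would rely on validated screening instruments rather than keyword matching.

```python
from enum import Enum, auto

class Action(Enum):
    """Possible escalation outcomes for a chatbot check-in."""
    CONTINUE_EDUCATION = auto()   # no flags; keep sending stage-based content
    NOTIFY_CARE_TEAM = auto()     # social or mental health need; route resources
    TRIGGER_CALL = auto()         # medical complication; a representative calls
    DIRECT_TO_ED = auto()         # urgent symptoms; go to the emergency department

# Hypothetical keyword tiers, for illustration only.
URGENT_SYMPTOMS = {"heavy bleeding", "severe headache", "chest pain"}
COMPLICATIONS = {"swelling", "high blood pressure", "reduced fetal movement"}
SCREENING_FLAGS = {"hopeless", "anxious", "no transportation", "food insecurity"}

def triage(patient_reply: str) -> Action:
    """Map a free-text check-in reply to an escalation action, most urgent first."""
    text = patient_reply.lower()
    if any(symptom in text for symptom in URGENT_SYMPTOMS):
        return Action.DIRECT_TO_ED
    if any(symptom in text for symptom in COMPLICATIONS):
        return Action.TRIGGER_CALL
    if any(flag in text for flag in SCREENING_FLAGS):
        return Action.NOTIFY_CARE_TEAM
    return Action.CONTINUE_EDUCATION

if __name__ == "__main__":
    print(triage("I've had a severe headache and swelling since yesterday"))
    # Action.DIRECT_TO_ED -- when tiers overlap, the most urgent one wins
```

Checking the most urgent tier first is the key design choice: a reply that mentions both an urgent symptom and a lesser flag should escalate to the highest applicable level rather than the first match encountered.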
“I could have someone calling moms three times a week and ask them questions about how they're doing. But it allowed us to kind of deploy a lot more touches using the technology than we could do with people,” Brown said.
Since its launch earlier this year, the A.I. chatbot has shown promising preliminary results, according to the health system. An internal survey revealed that 96% of users expressed satisfaction with their experience. In addition, the chatbot effectively identified patients experiencing complications and guided them toward appropriate care, Brown said.
For example, the chatbot identified a woman suffering from postpartum depression, even though she had not disclosed her symptoms during a previous mental health screening with her doctor. The patient confided to the chatbot that she was having suicidal thoughts, prompting the care team to respond with psychiatric and mental health support.
A.I.-powered chatbots can enhance healthcare interactions, offering more detailed and empathetic responses than traditional doctor-patient exchanges, according to a study University of California, San Diego researchers published in JAMA Internal Medicine in April.
“These chatbots are never tired,” said John Ayers, vice chief of innovation in the U.C. San Diego School of Medicine division of infectious disease and global public health, who co-authored the study. The findings suggest A.I. chatbots have the potential to increase patient satisfaction while alleviating administrative burdens on clinicians.
“We are using these like really fancy, cool tools to get back to the stuff we know absolutely works in healthcare, which is listening to your patient, letting them ask you lots of questions and getting them engaged with their care,” Brown said.
The approach also could increase reimbursement from insurers by letting doctors respond to more patient emails, Ayers said. However, to fully realize the technology’s potential, tools must be tailored to individual patient needs. Many chatbots on the market are designed to ease worker burnout and streamline patient management; for patients, such tools can feel like phone trees, he said. A chatbot should hand off to a real person when a patient requires more complicated assistance.
Bioethicists caution against regarding A.I.-powered chatbots as a definitive solution for patient engagement and have called for stronger oversight.
“Regulation has to come in some form,” said Bryn Williams-Jones, a bioethicist at the University of Montreal. “What form it will take is unclear because that thing that you're trying to regulate is evolving incredibly rapidly.”
To responsibly deploy the technology now, healthcare providers should clearly understand the methodology behind the software, fact-check its work and create accountability mechanisms to respond when something goes wrong, Williams-Jones said. These tools should be designed in line with standards of care and seek to avoid overutilization, he said.