“We feel that workplace violence, particularly in healthcare, was an epidemic before the pandemic. But then with the pandemic, it just got even more pronounced,” said Steve Miff, CEO of Parkland Center for Clinical Innovation, the health system's research institute. “When you’re at the hospital, it’s one of the most vulnerable times in your life. So, you can understand why it’s a setting that’s probably more primed for irrational behavior.”
A team from the research institute developed a predictive AI tool within its electronic health record to generate a risk assessment score that informs clinicians which patients are more likely to be violent.
The development of the AI tool comes as violence against doctors and nurses is on the rise. More than 80% of nurses said they experienced some form of workplace violence in 2022 and 2023, according to a February survey by National Nurses United. In a January survey by the American College of Emergency Physicians, 71% of emergency physicians said violence in the emergency department was worse in 2023 than in 2022.
The American Hospital Association has endorsed a bill that would make it a federal crime to attack healthcare workers in the process of doing their jobs.
Parkland has about 400 incidents per year that can include verbal threats, hair pulling, biting or hitting. Often, they are underreported by clinicians, Miff said.
“Just hearing the frontline staff stories is just heartbreaking because they're passionate about helping people and then they themselves become a victim,” Miff said.
The developers trained the predictive AI model using five years of data on violent incidents from multiple sources including the EHR, human resources systems and the Occupational Safety and Health Administration. The AI tool also accounts for the patient’s social drivers of health such as geographic information and other potential contributing factors, said Alex Treacher, senior data and applied scientist at Parkland Center for Clinical Innovation.
The risk assessment model, which is embedded into the EHR and sent as an alert to clinicians, tries to account for the varying factors when predicting a violent patient encounter. For instance, Miff cited the example of a patient with a nicotine habit who spends a few hours in the hospital where smoking is not allowed.
"That was one of the top 10 predictive factors," Miff said.
Physicians were integral in helping the innovation team develop the model, Treacher said. If clinicians receive an alert on a potentially violent patient, they use the Brøset violence checklist, a structured risk assessment tool that looks at the patient’s mood and language. With that information in hand, clinicians are able to take steps to minimize harm and de-escalate a potential situation.
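The Brøset Violence Checklist mentioned above is a published bedside instrument: six observed behaviors, each scored as absent (0) or present (1), with the sum guiding the response. A minimal sketch of that scoring, based on the published checklist items and commonly cited thresholds rather than Parkland's own implementation, might look like:

```python
# Sketch of Brøset Violence Checklist (BVC) scoring. Item names and
# thresholds follow the published instrument; this illustrates the
# checklist's logic, not Parkland's EHR integration.

BVC_ITEMS = [
    "confused",
    "irritable",
    "boisterous",
    "verbal_threats",
    "physical_threats",
    "attacks_on_objects",
]

def bvc_score(observations: dict) -> int:
    """Sum of present items, 0 through 6."""
    return sum(1 for item in BVC_ITEMS if observations.get(item, False))

def bvc_risk_level(score: int) -> str:
    """Map a BVC score to the commonly cited risk bands."""
    if score == 0:
        return "low"        # risk of violence is small
    if score <= 2:
        return "moderate"   # preventive measures advised
    return "high"           # immediate preventive measures required

# Example: a patient observed to be irritable and making verbal threats
obs = {"irritable": True, "verbal_threats": True}
print(bvc_score(obs), bvc_risk_level(bvc_score(obs)))  # -> 2 moderate
```

In practice the AI alert prompts the structured assessment; the checklist score, not the alert alone, drives the de-escalation steps.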
The AI tool was tested from October 2022 to August 2023. For every 1,000 patient-clinician interactions at Parkland, 7.1 violent events were correctly predicted, compared with 2.3 violent events missed. The AI tool generated 167 false alerts compared with 823 correct predictions of a non-violent patient.
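The reported pilot figures can be read as a confusion matrix. Assuming all four counts are on the same per-1,000-interaction basis (they sum to roughly 1,000), the implied sensitivity, specificity, and precision work out as follows:

```python
# Back-of-envelope metrics from the reported pilot figures, assuming
# all four counts are per 1,000 patient-clinician interactions:
#   7.1 violent events correctly predicted (true positives)
#   2.3 violent events missed              (false negatives)
#   167 false alerts                       (false positives)
#   823 correct non-violent predictions    (true negatives)
tp, fn, fp, tn = 7.1, 2.3, 167.0, 823.0

sensitivity = tp / (tp + fn)   # share of violent events the tool caught
specificity = tn / (tn + fp)   # share of non-violent encounters correctly cleared
precision   = tp / (tp + fp)   # share of alerts that were true alarms

print(f"sensitivity ~{sensitivity:.0%}")  # ~76%
print(f"specificity ~{specificity:.0%}")  # ~83%
print(f"precision   ~{precision:.1%}")    # ~4.1%
```

The low precision reflects how rare violent events are: even a tool that catches most of them will generate many false alerts, which is why the alert feeds into a structured human assessment rather than triggering action directly.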
The AI tool will continue to be tweaked as it is adopted across Parkland Health, Miff said. He said the tool has led to a higher comfort level among clinicians.
“We constantly hear how it feels like they have somebody watching their backs,” Miff said. “They tell us, ‘There’s an algorithm that we don’t know [about] but it’s running in the background and is watching our back.’”