Language norms used by providers in patient medical records have perpetuated racial and economic bias, pointing to a need for clinicians to be more intentional in exploring and articulating the root of a person's health problems.
Electronic health records carry negative patient descriptors that vary by race, insurance type and marital status, according to a report published this week in Health Affairs. The findings raise concerns about bias and stigma embedded in electronic health records and their potential to exacerbate healthcare disparities at a pivotal time in healthcare.
The study found that some patient populations had more than double the odds of having at least one negative descriptor in their clinical notes. Black patients had 2.54 times the odds of having stigmatizing language in their EHRs compared with white patients; people with Medicaid had 2.66 times the odds compared with those with commercial insurance; and unmarried patients had 2.12 times the odds compared with married people, according to the study findings.
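An odds ratio like those above compares the odds of an outcome in one group with the odds in another. The sketch below shows the arithmetic with hypothetical counts; the numbers are illustrative only and are not the study's data.

```python
# Illustrative odds-ratio calculation from a 2x2 table of counts.
# All counts below are hypothetical, not taken from the Health Affairs study.

def odds_ratio(exposed_with, exposed_without, unexposed_with, unexposed_without):
    """Odds ratio: (odds of the outcome in group A) / (odds in group B)."""
    odds_exposed = exposed_with / exposed_without
    odds_unexposed = unexposed_with / unexposed_without
    return odds_exposed / odds_unexposed

# Hypothetical example: 120 of 1,000 notes in group A contain a negative
# descriptor, versus 50 of 1,000 notes in group B.
print(round(odds_ratio(120, 880, 50, 950), 2))  # 2.59
```

Note that an odds ratio of 2.54 is not the same as being "2.54 times more likely": odds ratios overstate relative risk when the outcome is common, which is why the study's findings are phrased in terms of odds.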
Researchers conducted a literature review and consulted the Health Equity Commission of the Society of General Internal Medicine to identify and define common negative descriptors, said report co-author Michael Sun.
They then used machine learning to parse more than 40,000 clinical notes from 33,000 patient encounters at the University of Chicago Academic Medical Center between January 2019 and October 2020. Of the 18,500 patients included in the study, approximately 61% were Black, 30% were white, 6% were Hispanic or Latino and 3.5% were categorized as "other." In total, 8.2% of the patients had one or more negative descriptors in their medical history.
Fifteen negative descriptors were used to pinpoint stigmatizing language: non-adherent, aggressive, agitated, angry, challenging, combative, non-compliant, confront, non-cooperative, defensive, exaggerate, hysterical, unpleasant, refuse and resist. Sun said the prevalence of these terms across EHRs reflects a cultural norm of failing to properly articulate a patient's barriers to health.
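The study applied natural language processing at scale; a far simpler way to see the idea is plain keyword matching over the fifteen descriptors listed above. The sketch below is a minimal illustration, not the researchers' method, and the sample note text is hypothetical.

```python
import re

# The fifteen descriptors reported in the study. A real NLP pipeline would
# also handle negation, clinical context, and sentence structure, which this
# keyword sketch does not.
DESCRIPTORS = [
    "non-adherent", "aggressive", "agitated", "angry", "challenging",
    "combative", "non-compliant", "confront", "non-cooperative", "defensive",
    "exaggerate", "hysterical", "unpleasant", "refuse", "resist",
]

# Match each descriptor as a word stem, so "refused" or "resisting" also hit.
PATTERN = re.compile(
    r"\b(" + "|".join(re.escape(d) for d in DESCRIPTORS) + r")\w*\b",
    re.IGNORECASE,
)

def flag_note(text):
    """Return the sorted set of negative descriptors found in a note."""
    return sorted({m.group(1).lower() for m in PATTERN.finditer(text)})

# Hypothetical note text, for illustration only.
print(flag_note("Patient was agitated and refused medication."))
# ['agitated', 'refuse']
```

Counting patients with at least one flagged note, grouped by demographics, is what feeds the odds ratios the study reports.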
"There is a pattern of words that we're using that are shortcuts and we are doing a disservice to our patients by not affording them the full context, their full story," he said.
For example, a physician may label a patient "noncompliant" when the patient actually lacks health literacy and misunderstood what they were supposed to be doing. Understanding that difference can help put a patient back on a treatment plan that works for them and improves outcomes.
The next step in the research will be to explore the link between negative descriptors in a patient's electronic health record and clinical outcomes, Sun said. The report does not directly tie poor medical outcomes to implicit bias, but it cites other research, including a study that found doctors with high measures of implicit bias were more verbally dominant with Black patients and a report indicating that bias in healthcare is associated with lower levels of patient adherence.
The report also explains how electronic health records can perpetuate bias and stigma among clinicians. The authors cited a 2018 study that found medical providers were more likely to have a negative perception of a patient's pain when presented with a chart containing stigmatizing language, like "frequent flier."
"It would not be hard to imagine the different types of interactions they might be having," he said. "This will certainly be a follow-up area of study for us, but we anticipate that these descriptors are having some effect as far as the doctor-patient relationship, and also the many healthcare provider-to-patient relationships that will happen during a patient's hospital stay."
Researchers found that the use of stigmatizing language lessened in 2020. Sun said that after the COVID-19 pandemic began, and amid a national reckoning following the murder of George Floyd, clinicians were less likely to use a negative descriptor in an EHR. He said the findings illustrate clinicians' ability to check their biases and hesitate before using negative descriptors in their charts, especially when describing a patient of color or another marginalized identity. It could also reflect a growing interest among providers in addressing cultural incompetencies within their operations.
"It surprised us at first because we thought that the pandemic as a whole, as a stressful environment, would cause people to use more cognitive shortcuts or stereotypes, relying on bias or using bias a little bit more. I think it's really encouraging to find that it actually decreased during the pandemic," he said. "I hope that people think about this as an opportunity to tell a patient's full story, and provide them more compassionate and empathetic care. It is certainly within our grasp, it just takes a little bit more intention."