Translation and interpreting solutions rise to the top when evaluating industries where artificial intelligence (AI) holds great promise. Indeed, a plethora of AI-driven services are already relied on routinely for transcription applications, from converting speech to text to summarizing the essence of a conversation, a cognitive task AI performs particularly well in most cases.
However, while clinicians have used these types of linguistic capabilities for some time to quickly transcribe dictated notes, applying similar AI technology has yet to become standard practice for assisting patients with limited English proficiency (LEP).
While it is common knowledge that providers are bound by law to provide translation and interpretation services to patients in their language of choice, what does this truly look like today? Live interpreters can typically be found in hospitals, particularly the emergency room (ER), when a patient has an immediate issue. They may also be available to help with communications upon admission and discharge, either live or via phone or telehealth. However, beyond these specific touchpoints, most LEP patients are still, quite literally, on their own, whether trying to decipher a hospital menu to order a meal, or when simply asking for help. AI can and should be considered to fill obvious gaps, especially in non-emergent medical situations.
Of course, there are many reasons why healthcare organizations are leery of AI-assisted translation and interpretation, not the least of which involves errors and bias, often due to limited historical data. However, as the industry matures and AI datasets grow and become more accurate, these issues are expected to self-correct.
Here are the top three reasons why healthcare leaders should evaluate and adopt AI-assisted language solutions to better serve their non-English-speaking patient population:
- Cost: While it is true that AI requires greater computing power, technology ultimately tends to make services more cost-effective over the long term. By investing in AI-assisted translation and interpretation, providers can reach more people when and where they need it in their health journey, ultimately streamlining communications and avoiding preventable costs and overutilization of healthcare resources down the line.
- Efficiency: While there is no substitute for live interpreters in emotionally charged scenarios such as delivering a complex diagnosis, today’s healthcare environment lacks adequate live resources to cover all routine patient communication needs. Leveraging AI-assisted translation and interpretation is an efficient way to support patients across all touchpoints.
- Quality and engagement: Use of AI-assisted translation and interpreting services can help to improve quality and patient satisfaction survey scores. Patients and clinicians alike appreciate better communication, which builds trust and loyalty.
Along with helping to improve direct patient communications, AI is proving to be a useful tool for helping live interpreters hone their skills and presentation. For instance, as a listening ear on live interpreting sessions, AI can alert interpreters when they are speaking too quickly, or when they need to adjust their tone to convey greater empathy. AI can also monitor the accuracy of a live interpretation, helping ensure that the patient receives and understands the information the clinician intends to convey.
Just as important, AI can be used to gauge the patient’s overall understanding, both from their responses and their body language. While an interpreting session can be 100% accurate and match the tone of the clinician, it is entirely possible that patients may not fully understand the information, often for cultural reasons. AI can assess the patient’s tone, word choice and facial expressions, accounting for cultural nuances, to help improve their health literacy.
These AI-assisted enhancements help raise the quality of the interpreting experience and can meaningfully improve patient compliance and outcomes.
For healthcare administrators, AI can provide invaluable insights by analyzing local and regional data on non-English-speaking populations to better understand their overall needs and challenges, including social determinants of health (SDoH). Unfortunately, a patient who does not speak or comprehend English faces numerous health disparities, such as not knowing where to go for medical treatment or being readmitted to a hospital after release because of a lack of understanding of aftercare plans.
Analyzing an organization’s proprietary interpreting and translation data can reveal valuable insights like:
- Language mix and usage patterns, including why certain populations seek out specific specialties
- Varying lengths of conversations based on language type
- How the LEP population compares to the general English-speaking population
In the aggregate, this intelligence can help administrators manage care for LEP segments of the population more effectively and efficiently.
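The kinds of aggregate insights listed above can be illustrated with a minimal sketch. All field names and records below are hypothetical, invented for illustration only; they do not reflect any real interpreting platform's data model:

```python
from collections import defaultdict
from statistics import mean

# Hypothetical interpreting-session log; fields are illustrative only.
sessions = [
    {"language": "Spanish",  "specialty": "cardiology",   "minutes": 22},
    {"language": "Spanish",  "specialty": "primary care", "minutes": 14},
    {"language": "Mandarin", "specialty": "oncology",     "minutes": 31},
    {"language": "Mandarin", "specialty": "primary care", "minutes": 18},
    {"language": "Arabic",   "specialty": "cardiology",   "minutes": 27},
]

def language_mix(records):
    """Share of sessions per language (language mix and usage patterns)."""
    counts = defaultdict(int)
    for r in records:
        counts[r["language"]] += 1
    total = len(records)
    return {lang: n / total for lang, n in counts.items()}

def avg_minutes_by_language(records):
    """Average conversation length per language."""
    by_lang = defaultdict(list)
    for r in records:
        by_lang[r["language"]].append(r["minutes"])
    return {lang: mean(v) for lang, v in by_lang.items()}

print(language_mix(sessions))
print(avg_minutes_by_language(sessions))
```

In practice, an analysis like this would also join session data against demographic and SDoH datasets to compare the LEP population with the English-speaking population, but even these simple aggregations surface which languages drive demand and how conversation length varies by language.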
At GLOBO, we ultimately see language solutions as a communication challenge, not simply a matter of translating words accurately. We believe better communication is empathetic and even transcends language and cultural diversity. It is integral to every patient’s health and well-being and is essential to strategically support the roughly 10 to 15% of the population – more than 25 million Americans – who speak a language other than English at home. Considering that more than 350 languages other than English are spoken in the U.S. today, we believe that leveraging technology, including AI, isn’t just a nice “to do”; it is a “must-do” to improve access to care and better engage with all ethnicities.