President Joe Biden signed a sweeping executive order and invoked the Defense Production Act on Monday to establish the first set of standards on the use of artificial intelligence in healthcare and other industries.
As the hype, promise and usage of AI have grown in healthcare, health system leaders and developers have sought more concrete guardrails on its use, particularly for clinical purposes. Biden signed the order at an afternoon AI-focused event at the White House.
“We need to govern this technology,” Biden said. He characterized the order as “the most significant action any government anywhere in the world has ever taken on AI safety, security and trust.”
Here is what to know about the order and how it will affect healthcare.
What are the implications for healthcare?
The order calls on the Health and Human Services Department to establish an AI Task Force within 90 days that will, within a year, develop policies and frameworks on responsible deployment and use of AI and AI-enabled technologies in the health and human services sector. The task force will aim to create guidance on monitoring the safety and quality of AI-enabled technology, as well as how to incorporate equity when deploying the models.
HHS, along with the Veterans Affairs and Defense departments, is asked to establish a framework to help identify and capture clinical errors resulting from AI deployed in healthcare settings. The agencies also will identify specifications for a central repository to track associated incidents that cause harm.
By invoking the Defense Production Act, Biden is requiring companies developing, or intending to develop, potential dual-use foundation models to provide the federal government with information on the training and development of those models for cybersecurity purposes.
The order also aims to expand AI research in what the White House calls “vital” areas like healthcare. It calls on the Commerce Department to develop guidance on watermarking to identify AI-created content and will create a pilot through the National AI Research Resource that seeks to expand grants for research on AI use in healthcare.
Why is Biden focusing on AI?
Since the public release of OpenAI’s large language model ChatGPT, the hype concerning generative AI has taken hold in healthcare, among other industries. Health systems, digital health companies, insurers and other stakeholders are looking to capitalize on the technology.
EHR vendors such as Epic and Oracle Health, big tech companies such as Google, Amazon and Microsoft, and health systems such as Salt Lake City-based Intermountain Healthcare and Rochester, Minnesota-based Mayo Clinic are making investments in AI-related products. Health tech companies like GE Healthcare are going to be reliant on AI for revenue growth, according to an analysis from Moody’s Investors Service.
Health system executives see AI potentially replacing third-party vendors operating in areas like medical transcription and patient communications. For health systems dealing with tight financial margins and a challenging staffing environment, the potential exists for the technology to be a cost saver.
Is there any regulation of AI now?
The executive order comes one year after the Biden administration released the Blueprint for an AI Bill of Rights to help healthcare and other sectors navigate the potential perils of the technology.
Earlier this month, the Food and Drug Administration announced the creation of a Digital Health Advisory Committee to help explore scientific and technical issues related to digital health technologies that include AI. The FDA is charged with approving AI- and machine learning-enabled medical devices. The Office of the National Coordinator for Health IT released a proposed rule in April that would create technical transparency and risk-management requirements for a wide range of healthcare software systems including generative AI.
There is also interest in Congress in passing oversight legislation. Sen. Mark Warner (D-Va.) has called for AI regulation, and the Senate Health, Education, Labor, and Pensions Committee sought public comment on how HIPAA’s framework could be altered to better encompass AI data.
Separately, providers are partnering with large technology companies to explore and deploy early AI use cases. In August, Durham, North Carolina-based Duke Health partnered with Microsoft to study the reliability and safety of generative AI in healthcare. Google is a part of the Coalition for Health AI, a community of academic health systems, developers and other organizations seeking to harmonize standards and reporting for health AI.
What's the industry reaction to Biden's AI executive order?
Most industry insiders said it will take time to understand the true scope of the order.
Individual federal agencies are responsible for enacting and following through on many of the provisions in the executive order.
“A lot is left to the HHS AI Taskforce,” said Suchi Saria, an assistant professor at Johns Hopkins University and CEO of Bayesian Health, a healthcare AI platform. “The devil is going to be in the million details, as AI is a fast-evolving field.”
Saria said she hopes the HHS task force will tap into expertise from AI researchers and entrepreneurs. If the correct balance is struck, oversight could accelerate health AI adoption, she said.
Dr. Justin Norden, a partner at venture capital firm GSR Ventures, said companies that have taken the proper steps to test and check their algorithms should be able to make tweaks to comply with any regulation.
“The companies who are really putting thought and effort into testing, validation and deployment of AI solutions will be helped,” he said. “Companies who have not properly tested their solutions may have to go back to the drawing board.”
Norden said regulation might reduce the number of companies able to offer solutions in the marketplace, which in turn could make it easier for health systems to identify third-party vendors.
Jeremy Sherer, partner and co-chair of law firm Hooper, Lundy & Bookman’s digital health practice, was skeptical the executive order would slow development and adoption of technology that companies are attempting to bring into healthcare.
“Startups are ambitious, goal-oriented and eager to get their solutions to market,” Sherer said. “That’s part of the DNA in healthcare technology.”
Many digital health companies welcomed the guardrails.
“For us, this is very, very exciting,” said Dr. Michal Tzuchman Katz, CEO and co-founder at Kahun, a company that uses AI for its clinical reasoning engine. “It’s about time that we start to regulate, legislate and build legislation around how to ensure safety, security and trust in these systems. These are new models, and we know that they’re very powerful.”