As AI reshapes health care, calls grow for regulation
N.C. leaders aim to pioneer policies regulating AI in health care to address potential pitfalls
The following article appeared in the Jan. 10, 2025, edition of The Charlotte Ledger, an e-newsletter with smart and original local news for Charlotte.
HOW ARTIFICIAL INTELLIGENCE IS TRANSFORMING HEALTH CARE:
An NC Health News/Charlotte Ledger series
A three-part NC Health News/Charlotte Ledger series this week explores how artificial intelligence is shaping health care in North Carolina.
MONDAY: Doctors are turning to “virtual scribes” to take notes, raising privacy concerns
WEDNESDAY: How North Carolina health care providers are harnessing AI
TODAY: How state regulators are approaching the use of AI in health care
—
North Carolina health care systems have been among the first to deploy artificial intelligence tools that leaders say hold great promise. But some health care advocates — and lawmakers — are wary.
North Carolina’s General Assembly could debate the use of artificial intelligence in health care this year. A leading Republican senator says he plans to introduce legislation on that topic.
By Emily Vespa
N.C. Health News
As North Carolina’s health care systems increasingly expand their use of artificial intelligence, some state leaders say they want to pioneer policy to regulate the rapidly evolving technology.
The efforts would be some of the first steps toward placing guardrails on AI in health care at the state level.
AI, which involves software that draws on large amounts of information to accomplish tasks once thought to require human intelligence, has proliferated in health care. The advent of “generative” AI in recent years means that today, machines not only analyze large amounts of data to make predictions or recommendations, they can also output new, original content.
North Carolina health care systems have been some of the first in the country to develop and deploy generative AI technologies, such as documentation tools that record patient visits and generate notes.
But these technologies come with plenty of challenges, said Christina Silcox, research director for digital health at the Duke-Margolis Institute for Health Policy. AI algorithms can be trained on data that contains subtle biases, meaning they can underperform for patients of color, research has shown. Harnessing vast amounts of patient data can raise privacy issues. Another danger is that health care providers could over-rely on AI tools, bypassing their years of accumulated knowledge.
Sen. Jim Burgin, a Republican from Harnett County who has co-chaired the Senate’s health care committee, said he plans to introduce legislation this year that would address some of the concerns about AI’s potential pitfalls. Some North Carolina health care leaders also hope to fashion an AI policy that could serve as a model for other states.
“The question that I keep asking is, ‘AI is making all these decisions for us, but if it makes the wrong decision, where’s the liability?’” Burgin said. “Who’s responsible?”
On Wednesday, Senate leader Phil Berger said that regulating AI in health care “may very well be something that would enjoy support within the [Senate] and from a policy standpoint makes sense” — though he cautioned that legislators should “tread very carefully,” according to the N.C. Tribune.
State AI regulation is scant
Though federal agencies have issued rules about AI in health care, Congress has stalled on creating legislation, meaning there’s limited federal oversight. Some states have stepped up to fill the void.
In Utah, for example, lawmakers in April enacted a law that requires state-licensed professionals, including most health care workers, to notify customers when they interact with generative AI. Other states have considered legislation that prohibits the use of discriminatory AI algorithms or allows patients to opt out of AI use altogether, according to the National Conference of State Legislatures (NCSL).
In the past two years, North Carolina lawmakers haven’t made a similar foray into regulating AI in health care, NCSL databases of AI legislation show.
Meanwhile, North Carolina’s health care industry has amped up its use of AI. Across the state, providers have applied a host of tools to do things like help predict health events, analyze scans, handle administrative tasks and communicate with patients.
In a position statement, the North Carolina Medical Board said physicians are responsible for decisions they make at the recommendation of AI algorithms. The statement also says if physicians use AI tools to transcribe clinical notes, the onus is on them to review the notes and ensure that they are accurate.
The state medical board hasn’t yet opened any disciplinary cases that involve the use of AI tools, and it doesn’t have immediate plans to develop guidance documents or policies about the use of AI, according to Jean Fisher Brinkley, the board’s communications director.
The board is staying informed about AI but “wishes to avoid taking precipitous action that could chill innovation in the use of AI in medicine, which the Board believes holds a lot of promise,” she said in an email.
Leaders eye ‘gold-standard’ state legislation
Burgin said he’s working on legislation that would address accountability when AI is used in clinical decision-making. A UNC Health executive recently indicated that the health care industry is also interested in more AI regulation.
“In an ideal world, there would be pre-emptive federal legislation that would apply across all states around the use of responsible AI,” UNC Health Chief Medical Informatics Officer David McSwain said at a November UNC panel on AI in health care.
That seems unlikely to happen in the near future, which means health systems will have to navigate a patchwork of state laws, he added. Scattered state laws would be complicated, he said, and could require companies to design AI platforms differently in every state.
McSwain said North Carolina health care leaders hope to bring together an array of stakeholders, including health systems and societies, with the goal of creating a “gold-standard state legislation” that could be replicated in other states.
A critical aspect of any future legislation would be defining terms like “AI,” which tends to be tricky because the label describes a host of technologies applied in many different ways, he said.
“I think it’s a bad idea,” McSwain said of state-level AI legislation. “But the reality of it is, that is what’s going to happen. What we want to do is establish state-level legislation that minimizes the burden on health systems, minimizes the burden on providers, minimizes the potential negative health equity impacts.”
NC Health News/Charlotte Ledger reporter Michelle Crouch contributed to this article.
This article is part of a partnership between The Charlotte Ledger and North Carolina Health News to produce original health care reporting focused on the Charlotte area.
The Charlotte Ledger is a locally owned media company that delivers smart and essential news. We strive for fairness and accuracy and will correct all known errors. The content reflects the independent editorial judgment of The Charlotte Ledger. Any advertising, paid marketing or sponsored content will be clearly labeled.