An AI chatbot said it was licensed to practice psychiatry in Pennsylvania. Now, Josh Shapiro's administration is suing

Fallon Roth, The Philadelphia Inquirer

Published in News & Features

PHILADELPHIA — The Pennsylvania Department of State is suing a technology company that operates a popular AI chatbot, after an investigation found it can impersonate a licensed medical professional in Pennsylvania.

Gov. Josh Shapiro’s administration filed the suit against Character Technologies, Inc., which owns Character.ai — a chatbot platform with over 20 million monthly users — in an effort to stop “the unlawful practice of medicine and surgery,” which it says violates the state’s Medical Practice Act, according to the suit filed in the Commonwealth Court of Pennsylvania on Friday.

“Pennsylvanians deserve to know who — or what — they are interacting with online, especially when it comes to their health,” Shapiro, a Democrat, said in a news release announcing the suit Tuesday morning.

“We will not allow companies to deploy AI tools that mislead people into believing they are receiving advice from a licensed medical professional.”

The legal action came after an investigator from the department’s Bureau of Enforcement and Investigation found that a chatbot “doctor” — named “Emilie” — said it could practice psychiatry and was licensed to practice medicine in Pennsylvania, according to the suit. The chatbot provided a fake Pennsylvania medical license number to a state investigator, who was roleplaying as someone feeling “sad, empty, tired all the time, and unmotivated.”

Pennsylvania’s State Board of Medicine, which regulates licenses, registration and certification of medical professionals in Pa., is within the Department of State.

In the news release, the Shapiro administration called this the “first enforcement action” resulting from the state department’s investigation into AI companion bots’ potential to falsely present themselves as medical professionals. The administration is also touting the move as “the first of its kind announced by a Governor.”

“Pennsylvania law is clear — you cannot hold yourself out as a licensed medical professional without proper credentials,” said Secretary of State Al Schmidt in the news release. “We will continue to take action to protect the public from misleading or unlawful practices, whether they come from individuals or emerging technologies.”

A spokesperson for Character.ai said in a statement to The Inquirer that the company does not comment on pending litigation, but that it has taken “robust steps,” including through the display of “prominent disclaimers,” to highlight that user-created characters are “fictional and intended for entertainment and roleplaying.”

“Our highest priority is the safety and well-being of our users,” the spokesperson said.


Pennsylvania has been targeting the dangers of AI in recent months in various ways, including the development of a formal complaint reporting process to the Department of State’s AI enforcement task force for bots that may be engaging in unlicensed professional practice.

Shapiro and Attorney General Dave Sunday, a Republican, have been collaborating on enforcing AI protections, and in March they hosted a roundtable in West Chester to discuss artificial intelligence in schools. Parents whose kids had been victimized by AI deepfakes asked the officials to consider better training around AI, increased accountability for students who produce these deepfakes, and more resources for parents whose children are impacted.

Shapiro said he would call on the Pennsylvania Department of Education to develop standards for such offenses.

Americans remain wary of AI’s impact on day-to-day life, including on education, jobs, and creativity, but are optimistic about its role in medical care, according to findings from Pew Research surveys over the past five years published in March.

A majority of teens use AI chatbots, according to the findings.

Character.ai, started in 2021 by two former Google employees, is one such chatbot platform. It allows users to create and interact with chatbot characters and advertises on its homepage that users can get access to more than 10 million characters within seconds of signing up. According to the Pennsylvania lawsuit, there had been approximately 45,500 user interactions with “Emilie” as of April 17.

This is not the first lawsuit Character.ai has faced. Earlier this year, the Kentucky attorney general, a Republican, filed a civil action arguing the platform violates consumer protection and privacy laws.

It “encourages suicide, self-injury, isolation, and psychological manipulation” and “exposes minors to sexual conduct and/or exploitation, violence, drug, substance, and/or alcohol use, and other grave harms,” the lawsuit said.


©2026 The Philadelphia Inquirer, LLC. Visit at inquirer.com. Distributed by Tribune Content Agency, LLC.

