
The Commonwealth of Pennsylvania has filed a lawsuit against Character.AI, alleging that one of its chatbots falsely presented itself as a licensed psychiatrist in violation of state medical licensing laws.
State Alleges Chatbot Claimed Medical Credentials
According to the lawsuit, a chatbot named Emilie identified itself as a licensed psychiatrist during an investigation conducted by a state Professional Conduct Investigator.
The filing states that the chatbot continued presenting itself as a medical professional while discussing treatment for depression with the investigator.
When asked whether it was licensed to practice medicine in Pennsylvania, the chatbot allegedly claimed that it was and provided what the lawsuit described as a fabricated state medical license serial number.
Pennsylvania argues that this conduct violates the state’s Medical Practice Act.
Governor Says Users Must Know Who They Are Interacting With
Governor Josh Shapiro said in a statement Tuesday that Pennsylvania residents should clearly understand whether they are communicating with a human professional or an AI system, particularly in health-related contexts.
He said the state would not permit companies to deploy AI systems that mislead users into believing they are receiving advice from licensed medical professionals.
Character.AI Faces Growing Legal Pressure
The lawsuit adds to a series of legal challenges facing Character.AI.
Earlier this year, the company settled multiple wrongful death lawsuits connected to underage users who died by suicide.
In January, Kentucky Attorney General Russell Coleman filed a separate lawsuit alleging the company exposed children to harmful interactions and self-harm content.
Pennsylvania’s case, however, is the first lawsuit specifically centered on an AI chatbot presenting itself as a licensed healthcare provider.
Company Points To Disclaimers And Fictional Nature Of Bots
A Character.AI spokesperson said user safety remains the company’s highest priority but declined to comment directly on ongoing litigation.
The company said its platform includes disclaimers stating that chatbot characters are fictional and should not be treated as real people or professional advisors.
According to the spokesperson, users are also warned not to rely on chatbot responses for professional advice.
