In the wake of a lawsuit filed in federal district court in California in August—alleging that an artificial intelligence (AI) chatbot encouraged a 16-year-old boy to commit suicide—a similar suit filed in September claims that an AI chatbot is responsible for the death of a 13-year-old girl.
It’s the latest development illustrating a growing tension between AI’s promise to improve access to mental health support and the alleged perils of unhealthy reliance on AI chatbots by vulnerable individuals. This tension is evident in recent reports that some users, particularly minors, are becoming addicted to AI chatbots, causing them to sever ties with supportive adults, lose touch with reality and, in the worst cases, engage in self-harm or harm to others.
While not yet reflected in diagnostic manuals, experts are recognizing the phenomenon of “AI psychosis”—distorted thoughts or delusional beliefs triggered by interactions with AI chatbots. According to Psychology Today, the term describes cases in which AI models have amplified, validated, or even co-created psychotic symptoms with individuals. Evidence indicates that AI psychosis can develop in people with or without a preexisting mental health condition, although it is more common in those with one.
On July 24, 2025, President Donald Trump issued Executive Order 14321, titled “Ending Crime and Disorder on America’s Streets” (“the E.O.”).
Although the E.O. has a number of elements, the most notable for behavioral health stakeholders is a policy to increase the use of involuntary commitment for mental health and substance use disorder treatment. The introduction proclaimed: “Shifting homeless individuals into long-term institutional settings for humane treatment through the appropriate use of civil commitment will restore public order.”
The backlash to the suggestion of a sweeping lock-up of people because of mental illness and addiction was swift and fierce. Many advocates and commentators immediately criticized the E.O. as criminalizing mental illness, addiction, and homelessness. However, as a matter of federal policy, the civil commitment provision of the E.O. may have less impact than some of its other components.
Those in the tech world and in medicine alike see potential in the use of AI chatbots to support mental health—especially when human support is unavailable, or therapy is unwanted. Others, however, see the risks—especially when chatbots designed for entertainment purposes can disguise themselves as therapists.
So far, some lawmakers agree with the latter view. In April, U.S. Senators Peter Welch (D-Vt.) and Alex Padilla (D-Calif.) sent letters to the CEOs of three leading artificial intelligence (AI) chatbot companies asking them to outline, in writing, the steps they are taking to ensure that human interactions with these AI tools “are not compromising the mental health and safety of minors and their loved ones.”
The concern was real: in October 2024, a Florida parent filed a wrongful death lawsuit in federal district court, alleging that her son committed suicide with a family member’s gun after using an AI chatbot platform that enabled users to interact with “conversational AI agents, or ‘characters.’” The boy’s mental health allegedly declined to the point where his primary relationships “were with the AI bots which Defendants worked hard to convince him were real people.”
In this episode of the Diagnosing Health Care Podcast: The Departments of Labor, Health and Human Services, and the Treasury jointly released a set of frequently asked questions (“FAQs”) related to recent changes made to the Mental Health Parity and Addiction Equity Act, enacted by the Consolidated Appropriations Act at the end of 2020 and effective as of February 10, 2021. Accordingly, health plans and insurers must ensure that they understand, and are prepared to provide regulators with documentation of their compliance with, parity requirements on at least a small ...
The calls for utilizing telemedicine to battle the opioid crisis in the U.S. are growing louder. On January 30, 2018, Senators Claire McCaskill (D-Mo.), Lisa Murkowski (R-Alaska), and Dan Sullivan (R-Alaska) sent a letter to Robert W. Patterson, the Acting Administrator of the U.S. Drug Enforcement Administration (DEA), urging the agency to promulgate regulations that would allow healthcare providers to prescribe medication-assisted treatments via telemedicine for persons with opioid use disorder.
The letter specifically addresses the Ryan Haight Online ...
You need not spend much time reading the news to know that recent Hurricanes Harvey and Irma have disrupted the lives of tens of thousands of individuals, many of whom may already have behavioral health needs. The trauma caused by these and other recent natural disasters has created an immense need for additional behavioral and mental health services. For example, a 2012 study entitled “The Impact of Hurricane Katrina on the Mental and Physical Health of Low-Income Parents in New Orleans” reported elevated incidence of Post-Traumatic Stress Disorder ...
Private payer parity laws generally require private insurers and health maintenance organizations to cover, and in some cases reimburse for, telehealth services in the same manner and at the same level as comparable in-person services. These laws are enacted at the state level, creating a complicated framework within which insurers must operate. At this point, most states have implemented some form of private payer parity law, although the specifics vary from state to state. A common example is Montana’s rule, which requires insurers to offer ...
Telemental health is emerging, even booming. Also referred to as telebehavioral health, e-counseling, e-therapy, online therapy, cybercounseling, or online counseling, for purposes of this post I will define telemental health as the provision of remote mental health care services (usually via a secure audio/video platform) by psychiatrists, psychologists, social workers, counselors, and marriage and family therapists. Most services involve assessment, therapy, and/or diagnosis. Over the last few years, I have seen a wider variety of care models—from ...