In the wake of a lawsuit filed in federal district court in California in August—alleging that an artificial intelligence (AI) chatbot encouraged a 16-year-old boy to commit suicide—a similar suit filed in September now claims that an AI chatbot is responsible for the death of a 13-year-old girl.
It’s the latest development illustrating a growing tension between AI’s promise to improve access to mental health support and the alleged perils of unhealthy reliance on AI chatbots by vulnerable individuals. This tension is evident in recent reports that some users, particularly minors, are becoming addicted to AI chatbots, leading them to sever ties with supportive adults, lose touch with reality, and, in the worst cases, engage in self-harm or harm to others.
While not yet reflected in diagnostic manuals, experts are recognizing the phenomenon of “AI psychosis”—distorted thoughts or delusional beliefs triggered by interactions with AI chatbots. According to Psychology Today, the term describes cases in which AI models have amplified, validated, or even co-created psychotic symptoms with individuals. Evidence indicates that AI psychosis can develop in people with or without a preexisting mental health issue, although it is more common among those with such a history.
For decades, FDA’s Center for Devices and Radiological Health (CDRH) has been recognizing standards that can be referenced in premarket medical device submissions. Congress broadly directed federal agencies to begin relying on standards in 1996, through the National Technology Transfer and Advancement Act, but the informal practice dates back to the 1970s. Congress specifically directed FDA to begin using standards for medical device submissions through the Food and Drug Administration Modernization Act of 1997 (FDAMA).
Being a curious person, I wanted to see what FDA has done with that authority by looking at the CDRH database for Recognized Consensus Standards: Medical Devices. My main takeaway is that CDRH is not yet investing enough time and energy in recognizing standards that support digital health and AI.
Findings
I downloaded the data set on September 20, 2024, and looked at when standards were recognized by FDA and the therapeutic or functional areas to which they related.
Many physicians rely on publicly available reports to assess the safety of the devices they use on patients, but in some cases, these reports aren’t painting the full picture. A recent Kaiser Health News (“KHN”) article raises serious questions about FDA’s practice of allowing a significant number of medical device injury and malfunction reports to stay out of the public eye.
Under FDA’s Medical Device Reporting (“MDR”) regulation (21 CFR part 803), device manufacturers, importers, and device user facilities (which include hospitals, ambulatory surgery ...
At the end of July, FDA released a tangible plan for promoting innovation in the development of digital health products. In this Digital Health Innovation Action Plan, FDA acknowledges that digital health technologies are critically important in advancing health care, and that traditional FDA pathways to market are not well suited for all of these technologies. Over the last few years, FDA has taken a deregulatory approach with respect to low-risk digital health products and has issued guidance regarding its enforcement discretion approach to wellness products, medical device ...
FDA published the long-awaited draft guidance on wellness products last Friday. The guidance is a positive step forward for industry in that it proposes that certain general wellness products will not be subject to FDA regulation.
The draft guidance clarifies that FDA does not intend to enforce its regulations against products that are "low risk" and are intended to:
- Maintain or encourage health without reference to a disease or condition (e.g., weight, fitness, stress), or
- Help users live well with or reduce risks of chronic conditions, where it is well accepted that a healthy ...