Imagine going online to chat with someone and finding an account with a profile photo, a description of where the person lives, and a job title indicating she is a therapist. You begin chatting, discussing the highs and lows of your day and other intimate details of your life, because the conversation flows easily. Only the “person” with whom you are chatting is not a person at all; it is a “companion AI.”
Recent statistics indicate a dramatic rise in adoption of companion AI chatbots: 88% year-over-year growth, over $120 million in annual revenue, and 337 active apps (128 of them launched in 2025 alone). Adoption among youth is similarly pervasive: three of every four teens have used a companion AI at least once, and half use one routinely. In response to these trends, and to the potential negative impacts on mental health in particular, state legislatures are quickly stepping in to require transparency, safety, and accountability to manage the risks associated with this new technology, particularly as it pertains to children.