Imagine going online to chat with someone and finding an account with a profile photo, a description of where the person lives, and a job title indicating she is a therapist. You begin chatting, and because the conversation flows easily, you discuss the highs and lows of your day and other intimate details of your life. Only the “person” with whom you are chatting is not a person at all; it is a “companion AI.”
Recent statistics indicate a dramatic rise in the adoption of these companion AI chatbots: 88% year-over-year growth, over $120 million in annual revenue, and 337 active apps (including 128 launched in 2025 alone). Adoption among youth is similarly pervasive: three of every four teens have used a companion AI at least once, and half use one routinely. In response to these trends, and to the potential negative impacts on mental health in particular, state legislatures are quickly stepping in to require transparency, safety, and accountability to manage the risks associated with this new technology, particularly as it pertains to children.