Well before the latest government shutdown, the U.S. Department of Justice’s National Security Division (DOJ NSD) issued a final rule at 28 CFR Part 202 (“2025 Final Rule” or “Rule”) to help prevent “countries of concern” or “covered persons” from accessing U.S. government-related data and Americans’ bulk sensitive personal data. The 2025 Final Rule took effect in April—and after a 90-day safe harbor period, the DOJ began enforcement on July 8.
Six months after implementation—with the U.S. Senate now passing the BIOSECURE Act restricting certain biotech business with China—compliance remains key for affected stakeholders, including those exchanging personal health data. As we reported in July, the 2025 Final Rule implemented the prior administration’s Executive Order 14117 of February 28, 2024, by prohibiting and restricting “bulk” data transactions with countries that could threaten U.S. national security through the use of Americans’ sensitive personal data.
While the 2025 Final Rule remains largely untested, federal agencies and stakeholders alike have taken action to test the bounds of the Rule and, in some instances, expand applicability beyond 28 CFR Part 202. Below is a brief refresher of the key elements of the Rule and some recent developments.
In the wake of a lawsuit filed in federal district court in California in August—alleging that an artificial intelligence (AI) chatbot encouraged a 16-year-old boy to commit suicide—a similar suit filed in September is now claiming that an AI chatbot is responsible for the death of a 13-year-old girl.
It’s the latest development illustrating a growing tension between AI’s promise to improve access to mental health support and the alleged perils of unhealthy reliance on AI chatbots by vulnerable individuals. This tension is evident in recent reports that some users, particularly minors, are becoming addicted to AI chatbots, causing them to sever ties with supportive adults, lose touch with reality and, in the worst cases, engage in self-harm or harm to others.
While not yet reflected in diagnostic manuals, experts are recognizing the phenomenon of “AI psychosis”—distorted thoughts or delusional beliefs triggered by interactions with AI chatbots. According to Psychology Today, the term describes cases in which AI models have amplified, validated, or even co-created psychotic symptoms with individuals. Evidence indicates that AI psychosis can develop in people with or without a preexisting mental health issue, although the former is more common.
A recent enforcement action by the Federal Trade Commission (“FTC”) against 1Health.io—which sells “DNA Health Test Kits” to consumers for health and ancestry insights—serves as a reminder that the FTC is increasingly exercising its consumer protection authority in the context of privacy and data protection. This is especially true where the Health Insurance Portability and Accountability Act of 1996 (“HIPAA”) does not reach. The FTC’s settlement with 1Health.io highlights a wide range of privacy and security issues companies should consider, including best practices for updating privacy policies, data retention, configuration of cloud storage, and vendor management, especially when handling sensitive genetic data.
On December 1, 2022, the Office for Civil Rights (OCR) at the U.S. Department of Health and Human Services (HHS) published a bulletin warning that commonly used website technologies, including cookies, pixels, and session replay, may result in the impermissible disclosure of Protected Health Information (“PHI”) to third parties in violation of the Health Insurance Portability and Accountability Act of 1996 (“HIPAA”). The bulletin advises that “[r]egulated entities are not permitted to use tracking technologies in a manner that would result in impermissible disclosures of Protected Health Information (“PHI”) to tracking technology vendors or any other violations of the HIPAA Rules.” The bulletin was issued amid a wider national and international privacy landscape that is increasingly focused on regulating the collection and use of personal information through web-based technologies and software that may not be readily apparent to the user.
As employers continue their efforts to safely bring employees back to the workplace, many have moved beyond initial pre-entry wellness checks or questionnaires and are considering technology solutions that monitor social distancing and conduct contact tracing in real-time. Along with introducing these enhanced capabilities, the question of the privacy and security of employee personally identifiable information (“PII”) and protected health information (“PHI”) continues to loom.
In order to isolate and contain the spread of COVID-19, one critical component of an ...
On October 26, 2018, the Federal Trade Commission (FTC) announced that it will hold four days of hearings between December of 2018 and February of 2019 to examine the FTC’s authority to deter unfair and deceptive conduct in data security and privacy matters.[1] The two days of December hearings will focus on data security, while the two days of February hearings will focus on consumer privacy. This announcement comes as part of the agency’s Hearings on Competition and Consumer Protection in the 21st Century, an initiative that has already scheduled hearings on closely related ...
Recent comments by Federal Trade Commission (FTC) Commissioner Rohit Chopra should put companies on notice of increased enforcement actions across the board. During the “Privacy. Security. Risk.” Conference in Texas last week, Chopra shared his views on increasing enforcement, including the imposition of greater civil monetary penalties. “I’ve already raised concerns about settlements we do with no monetary penalties. I want to see monetary consequences for egregious breaking of the law,” said Chopra, as reported by the IAPP during a live ...
One thing's certain – the vast and growing supply of data contained in electronic medical records systems will play a significant role in improving the speed and efficiency of research into new treatments in the years to come. The challenge will be striking an appropriate balance between the unquestionable promise of this data to enable research – research that will enhance available treatments and save lives – and the rights of individual patients in the privacy of their health information. Attempts to strike that balance are at the heart of current legislative, regulatory ...
Tuesday, March 24, 2015 at 12:00 p.m. – 1:00 p.m. EDT
The past year has demonstrated that no organization is immune to security incidents that could affect its employees, customers, and reputation. Understanding the complex legal framework governing data privacy and developing a plan to mitigate risk can be the difference between an incident and a disaster.
Join Epstein Becker Green's Privacy & Security Practice for a comprehensive overview of data breach priorities impacting organizations that deal in electronic data. Presenters will identify strategies to prepare for and ...