Analyzing the New AI Companion Chatbot Laws
Businesses operating companion chatbots in California or New York are subject to new legal obligations, including providing notices to users and maintaining protocols to address user expressions of suicidal ideation or self-harm.
On January 1, 2026, California’s companion chatbot law (SB 243) took effect, having been signed into law by Governor Gavin Newsom on October 13, 2025. The law obligates companion chatbot operators to implement critical, reasonable, and attainable safeguards around the use of and interaction with “companion chatbots,” with a particular focus on protecting minors.
SB 243 follows New York’s AI Companion Models statute (N.Y. Gen. Business Law § 1700 et seq.), a similar companion chatbot law that took effect on November 5, 2025.
I. Background
With AI pervading nearly every aspect of our technological lives, chatbots are perhaps the most readily available form of AI-powered technology, serving use cases from customer service to healthcare. Companion chatbots go further: they are designed to engage in human-like interactions, adapting to learned end-user habits and preferences in ways that can foster emotional attachment between user and chatbot.
Stakeholders including the media, government, and users have raised concerns about how individuals interact with these chatbots and about the potential societal consequences. While other states, such as Colorado, Maine, Texas, and Utah, have enacted statutes requiring chatbots to disclose their AI nature under certain circumstances, New York was the first state to impose additional obligations on operators of companion chatbots.
Recognizing the increased use of companion chatbots for emotional or mental health support, Governor Kathy Hochul signed the New York law to ensure that their usage does not unintentionally put individuals, especially minors, at risk.
California’s law codifies additional protections for minors, reflecting growing concern about their seemingly unfettered use of companion chatbots. A July 2025 report by Common Sense Media found that 72% of teens have used AI companion chatbots at least once, and over half use them a few times a month. One in three teens reported using these chatbots for social interaction and relationships, and the same share reported feeling “uncomfortable” with something an AI companion chatbot had said or done.
The tragic case of 14-year-old Sewell Setzer, who formed an emotional relationship with an AI companion chatbot and subsequently took his own life, prompted California state Senator Steve Padilla to author SB 243, which is intended to add further protections as this technology evolves.
II. Comparison Summary of New York and California AI Companion Chatbot Statutes
The following table compares the definitions, legal obligations, and enforcement mechanisms outlined in the New York and California laws:
| | New York (effective November 5, 2025) | California (effective January 1, 2026) |
| --- | --- | --- |
| Operator | Any person, partnership, association, firm, or business entity that operates for or provides an AI companion to a user. | A person who makes a companion chatbot platform available to a user in the state. |
| Covered chatbot | “AI companion”: a system designed to simulate a sustained human or human-like relationship with a user. | “Companion chatbot”: an AI system with a natural language interface that meets a user’s social needs. |
| User notification obligations | Provide clear and conspicuous notice at the beginning of AI companion interactions that the user is not communicating with a human. | Notify minors that they are interacting with AI, including reminders to take breaks, and ensure the chatbot does not produce harmful content. |
| Transparency protocol obligations | Maintain protocols for detecting and addressing suicidal ideation or self-harm expressed by users. | Maintain protocols to prevent the production of harmful content and to refer users expressing suicidal ideation to crisis services. |
| Reporting obligations | None specified. | Starting July 1, 2027, report annual data to California’s Office of Suicide Prevention, including instances of crisis service referrals. |
| Enforcement mechanisms | Enforcement by the attorney general, with penalties for noncompliance. | Private right of action for individuals against noncompliant operators. |
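For illustration, the sketch below shows one way an operator’s chat loop might surface the notification obligations summarized above, in Python. Everything in it is a hypothetical assumption for demonstration: the message text, the `CompanionSession` class, and the reminder interval are not drawn from the statutes, and the actual cadence and presentation of notices should be taken from the laws themselves.

```python
# Minimal sketch, not legal guidance: surfacing AI-disclosure notices
# in a chat session. All names, messages, and intervals are assumptions.
import time

AI_DISCLOSURE = (
    "You are chatting with an AI companion, not a human. "
    "Responses are generated by software."
)
BREAK_REMINDER = "Reminder: you are talking to an AI. Consider taking a break."

# Hypothetical cadence; the statutes govern how often notices must recur.
REMINDER_INTERVAL_SECONDS = 3 * 60 * 60


class CompanionSession:
    def __init__(self, user_is_minor: bool):
        self.user_is_minor = user_is_minor
        self.session_started = False
        self.last_reminder = time.monotonic()

    def notices_for_next_reply(self) -> list[str]:
        """Return any compliance notices that should accompany the next reply."""
        notices = []
        if not self.session_started:
            # New York: clear and conspicuous notice at the beginning of the
            # interaction that the user is not communicating with a human.
            notices.append(AI_DISCLOSURE)
            self.session_started = True
        if self.user_is_minor:
            # California: minors must be reminded that they are interacting
            # with AI, including reminders to take breaks.
            now = time.monotonic()
            if now - self.last_reminder >= REMINDER_INTERVAL_SECONDS:
                notices.append(BREAK_REMINDER)
                self.last_reminder = now
        return notices
```

In a real product, the disclosure would also need to be rendered clearly and conspicuously in the interface itself, not merely appended to a message string.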
III. Potential Implications
Businesses should evaluate their chatbot offerings to determine whether they meet the definition of “operator” under the relevant companion chatbot statute. Steps toward compliance include:
- Implementing relevant transparency protocols (a minimal sketch follows this list).
- Ensuring that notices disclosing the chatbot’s artificial nature are clear and conspicuous.
- Creating measures to test and confirm the effectiveness of transparency protocols.
- Establishing procedures to modify protocols for continued compliance.
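As a rough illustration of the first and third items above, a transparency protocol might screen user messages for possible self-harm signals, return a crisis referral, and log each event so the protocol’s effectiveness can be tested over time. The `SafetyProtocol` class and keyword list below are deliberately crude, hypothetical stand-ins; an operator would need a validated detection method and clinically reviewed response language.

```python
# Minimal sketch, not legal or clinical guidance: screen messages for
# self-harm signals, return a crisis referral, and keep an audit log.
from dataclasses import dataclass, field
from datetime import datetime, timezone

CRISIS_REFERRAL = (
    "It sounds like you may be going through something serious. "
    "You can reach the 988 Suicide & Crisis Lifeline by calling or texting 988."
)

# Crude stand-in for a validated classifier.
SELF_HARM_SIGNALS = ("suicide", "kill myself", "self-harm", "end my life")


@dataclass
class SafetyProtocol:
    referral_log: list[dict] = field(default_factory=list)

    def screen(self, user_id: str, message: str) -> str | None:
        """Return a crisis referral if the message matches a signal, else None."""
        lowered = message.lower()
        if any(signal in lowered for signal in SELF_HARM_SIGNALS):
            # Each logged event supports both internal effectiveness testing
            # and, in California, annual referral reporting.
            self.referral_log.append({
                "user_id": user_id,
                "timestamp": datetime.now(timezone.utc).isoformat(),
                "trigger": "self_harm_signal",
            })
            return CRISIS_REFERRAL
        return None
```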
Businesses operating in California should take additional measures, including:
- Tracking user age to ensure compliance with minor-specific obligations.
- Developing mechanisms to track referrals to crisis services.
- Building out a process for annual reporting to the Office of Suicide Prevention, as in the aggregation sketch below.
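Building on the hypothetical referral log from the previous sketch, an annual reporting process might aggregate logged referrals along the following lines. The fields shown are assumptions about what such a report could contain, not the Office of Suicide Prevention’s prescribed format.

```python
# Minimal sketch: aggregate logged crisis referrals into an annual summary.
# Field names are illustrative assumptions, not a mandated report format.
from collections import Counter
from datetime import datetime


def annual_referral_report(referral_log: list[dict], year: int) -> dict:
    """Summarize crisis-service referrals for a single reporting year."""
    events = [
        event for event in referral_log
        if datetime.fromisoformat(event["timestamp"]).year == year
    ]
    return {
        "reporting_year": year,
        "total_referrals": len(events),
        "referrals_by_trigger": dict(Counter(e["trigger"] for e in events)),
        # Count distinct users, not raw events, to avoid double-counting.
        "distinct_users_referred": len({e["user_id"] for e in events}),
    }
```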