California Companion Chatbot Law Now in Effect
Key Takeaways
- A first-of-its-kind statute went into effect in California at the start of the year, imposing operational and reporting requirements related to companion chatbots.
- The law applies only to “companion chatbots” and excludes many customer service and business operations tools, video game chatbots, and voice-activated assistants.
- Operators of covered companion chatbots must implement user disclosures, suicide and self-harm safety protocols, and, when they know a user is a minor, additional safeguards.
- Beginning in 2027, operators of covered companion chatbots must submit annual reports describing crisis referrals and suicide- and self-harm-related safety protocols.
Introduction
Over recent years, states have increasingly experimented with regulating how chatbots and other AI-driven conversational tools are used in consumer-facing contexts. Early efforts focused largely on transparency, requiring businesses to disclose when users were interacting with automated rather than human agents.
California Companion Chatbot Law
The California Companion Chatbot Law, or California Senate Bill 243, went into effect on January 1, 2026. This law reflects a new phase of regulation: in addition to disclosure requirements, it imposes safety, governance, and reporting obligations. This new direction responds to concerns about how certain chatbots influence user behavior, emotional well-being, and decision-making over time, especially regarding minors.
What Types of Chatbots Are Covered?
The Companion Chatbot Law does not apply to all chatbots or conversational AI tools. Instead, it regulates a narrower category referred to as “companion chatbots.” These chatbots respond to users with adaptive, human-like responses and are designed to engage users to meet social or emotional needs. Chatbots that do not meet these criteria—either because they do not maintain a relationship with users across multiple interactions or because they are not capable of eliciting emotional or social engagement—fall outside the law’s definition and are not subject to its requirements.
The law explicitly excludes:
- Customer service chatbots: Used solely for customer service, business operations, or technical assistance.
- Video game chatbots: Those operating within video games, provided their responses are limited to the game itself and do not discuss mental health or self-harm.
- Voice-activated assistants: Devices that do not maintain a relationship across interactions or generate outputs likely to elicit emotional responses.
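For teams triaging a chatbot portfolio, the definition and exclusions above amount to a two-step screen: check the exclusions first, then the three definitional criteria. The sketch below expresses that screen in Python; the `ChatbotProfile` fields are hypothetical labels invented for illustration, not terms drawn from the statute, and the result is a rough first pass, not a legal determination.

```python
from dataclasses import dataclass


@dataclass
class ChatbotProfile:
    """Hypothetical self-assessment flags for a deployed chatbot.

    Field names are illustrative shorthand for the statutory criteria
    and exclusions described above, not statutory language.
    """
    adaptive_humanlike_responses: bool       # responds with adaptive, human-like output
    relationship_across_interactions: bool   # sustains a relationship over multiple sessions
    meets_social_emotional_needs: bool       # designed to engage social/emotional needs
    customer_service_only: bool              # excluded: service/business-ops/tech support
    in_game_only: bool                       # excluded: game-limited, no mental-health talk
    excluded_voice_assistant: bool           # excluded: no relationship/emotional engagement


def may_be_covered(bot: ChatbotProfile) -> bool:
    """Rough first-pass screen for SB 243 applicability (not legal advice)."""
    # Step 1: any explicit exclusion takes the bot out of scope.
    if bot.customer_service_only or bot.in_game_only or bot.excluded_voice_assistant:
        return False
    # Step 2: otherwise, all three definitional criteria must be met.
    return (bot.adaptive_humanlike_responses
            and bot.relationship_across_interactions
            and bot.meets_social_emotional_needs)
```

A bot that fails any one criterion, such as one that does not persist a relationship across sessions, would screen out even without an exclusion applying.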
Core Operational Requirements
The Companion Chatbot Law imposes operational obligations on operators of covered companion chatbots. Some apply to all users, while others apply only when the operator knows the user is a minor.
- Required disclosure: Operators must clearly notify users that they are interacting with an AI system if the user could be misled into thinking they are interacting with a human.
- Required safety protocols: Operators must maintain protocols to prevent the chatbot from producing content related to suicidal ideation, suicide, or self-harm, and to refer users who express such ideation to crisis service providers.
- Minor suitability disclosure: Operators must disclose that the companion chatbot may not be suitable for some minors.
- Additional requirements for minors: When the operator knows a user is a minor, it must provide notifications reminding the user that they are interacting with an AI and, at least every three hours during continuing interactions, encourage the user to take a break.
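The three-hour reminder for known minors is, in implementation terms, a recurring timer keyed to the session. A minimal sketch of that cadence check follows; the function name, parameters, and session model are assumptions for illustration, and a production system would also need to handle how the operator comes to "know" a user is a minor, which the sketch takes as a given flag.

```python
from datetime import datetime, timedelta
from typing import Optional

# Statutory cadence for known-minor users: a break reminder
# at least every three hours of continuing interaction.
BREAK_INTERVAL = timedelta(hours=3)


def needs_break_reminder(session_start: datetime,
                         last_reminder: Optional[datetime],
                         now: datetime,
                         user_is_known_minor: bool) -> bool:
    """Return True when the operator should surface the
    'you are talking to an AI / take a break' notice.

    Hypothetical helper: measures elapsed time since the last
    reminder (or session start, if none has been shown yet).
    """
    if not user_is_known_minor:
        return False
    anchor = last_reminder if last_reminder is not None else session_start
    return now - anchor >= BREAK_INTERVAL
```

In practice this check would run on each user turn, with `last_reminder` updated whenever the notice is actually displayed.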
Reporting and Transparency Obligations
Beginning July 1, 2027, operators must annually report to California’s Office of Suicide Prevention regarding:
- The number of crisis service referrals made in the previous year.
- Protocols to detect and respond to instances of suicidal ideation by users.
- Protocols to prohibit responses related to suicidal ideation or actions.
This information will be published by the Office of Suicide Prevention on its website.
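The statute specifies the content of the annual report but not a filing format. As a planning aid, the sketch below names the three required elements as fields of a record an operator might compile internally; the class name and field names are hypothetical, invented here for illustration.

```python
from dataclasses import dataclass, field, asdict


@dataclass
class AnnualSafetyReport:
    """Hypothetical internal record for the SB 243 annual report.

    Fields track the three reporting elements listed above; the
    structure itself is an assumption, not a prescribed format.
    """
    reporting_year: int
    crisis_referral_count: int                       # referrals made in the prior year
    ideation_detection_protocols: list = field(default_factory=list)
    response_prohibition_protocols: list = field(default_factory=list)


# Example: an empty record for the first reporting cycle.
report = AnnualSafetyReport(reporting_year=2026, crisis_referral_count=0)
payload = asdict(report)
```

Because the Office of Suicide Prevention will publish this information, operators may want the protocol descriptions drafted with public disclosure in mind.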
Enforcement and Liability Considerations
The Companion Chatbot Law creates a private right of action, allowing users to bring civil actions against operators who violate the law’s requirements, seeking injunctive relief, actual damages, and attorney’s fees.
Looking Ahead
Companies deploying chatbots should assess their functionality to determine if the Companion Chatbot Law applies and implement necessary operational changes. As a first-of-its-kind statute, it signals likely continued experimentation by states exploring AI regulation.
Organizations should also prepare for possible federal action: renewed federal interest in AI governance could produce rules that preempt state laws like this one. Companies should monitor both state and federal developments to stay compliant as the regulatory landscape evolves.