California’s SB 243: Ground-breaking law for “companion” AI chatbots
Published October 18, 2025

The Details
What the law covers
SB 243 defines a companion chatbot as an AI system with a natural-language interface that provides adaptive, human-like responses and is capable of meeting a user's social needs by sustaining a relationship across multiple interactions.
It applies when such a chatbot is made available to users in California and could mislead a "reasonable person" into believing they are interacting with a human.
Key operator obligations
- Provide a clear and conspicuous notice that the chatbot is artificial whenever a user might believe it is human. 
- For users known to be minors: disclose that the chatbot is AI and, during ongoing sessions, remind the user at least every three hours that they are interacting with AI and encourage a break (a minimal implementation sketch follows this list). 
- Prevent the chatbot from encouraging addictive engagement (e.g., rewards delivered at unpredictable intervals), from generating sexually explicit visual content, and from encouraging minors to engage in sexual conduct. 
- Implement and publish a protocol to detect and respond to a user's expressions of suicidal ideation or self-harm, including referring the user to crisis service providers. 
- Beginning July 1, 2027, annually report specified metrics (e.g., number of crisis referrals) to the state's Office of Suicide Prevention. 
- The law also creates a private right of action: any person injured by a violation may seek injunctive relief, damages (the greater of actual damages or $1,000 per violation) and attorney's fees. 
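To make the disclosure and break-reminder obligations concrete, here is a minimal sketch of how an operator might track them per session. This is illustrative only, not statutory language: the three-hour cadence comes from the bill, but every name here (CompanionSession, notices_due, the message text) is a hypothetical design choice, and any real implementation would need legal review.

```python
from datetime import datetime, timedelta, timezone

# The three-hour cadence comes from SB 243; everything else in this
# sketch is a hypothetical design choice, not statutory language.
REMINDER_INTERVAL = timedelta(hours=3)

AI_DISCLOSURE = "You are chatting with an AI companion, not a human."
BREAK_REMINDER = ("Reminder: you are interacting with an AI. "
                  "Consider taking a break.")

class CompanionSession:
    """Tracks one user's session and surfaces required notices."""

    def __init__(self, user_is_minor: bool):
        self.user_is_minor = user_is_minor
        self.last_reminder_at = datetime.now(timezone.utc)
        self.disclosed = False

    def notices_due(self) -> list[str]:
        """Return any disclosures owed before the next chatbot reply."""
        now = datetime.now(timezone.utc)
        notices = []
        if not self.disclosed:
            # Clear and conspicuous AI disclosure where a reasonable
            # person could otherwise believe the bot is human.
            notices.append(AI_DISCLOSURE)
            self.disclosed = True
        if self.user_is_minor and now - self.last_reminder_at >= REMINDER_INTERVAL:
            # For known minors: repeat the AI reminder at least every
            # three hours of ongoing interaction and encourage a break.
            notices.append(BREAK_REMINDER)
            self.last_reminder_at = now
        return notices
```

An operator's chat loop would call notices_due() before emitting each model response and prepend whatever it returns; what counts as a "clear and conspicuous" presentation is a question for counsel rather than code.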
Important dates & practical impact
- SB 243 was signed into law on October 13, 2025. 
- It becomes effective January 1, 2026. 
- Reporting obligations commence July 1, 2027 (a minimal record-keeping sketch follows this list). 
- Chatbot operators that offer services in California must assess whether their product falls under the law's definition of "companion chatbot" and, if so, move quickly to adjust disclosures, safety protocols, and auditing/recordkeeping.
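Because the annual report must include metrics such as the number of crisis referrals, operators may want to begin counting referrals well before the 2027 deadline. Below is a minimal record-keeping sketch; the names (ReferralLog, record_referral, annual_count) and the JSONL storage are illustrative assumptions, not anything the statute prescribes, and a vetted crisis-detection protocol would decide when a referral is actually logged.

```python
import json
from datetime import datetime, timezone
from pathlib import Path

class ReferralLog:
    """Append-only tally of crisis referrals for annual reporting.

    Hypothetical sketch only: a production system would need durable,
    privacy-reviewed storage and a vetted crisis-response protocol
    deciding when record_referral() is called.
    """

    def __init__(self, path: str = "referrals.jsonl"):
        self.path = Path(path)

    def record_referral(self, session_id: str, resource: str) -> None:
        """Log one referral event (pseudonymous ID, no raw user content)."""
        entry = {
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "session_id": session_id,
            "resource": resource,  # e.g., the crisis line the user was referred to
        }
        with self.path.open("a") as f:
            f.write(json.dumps(entry) + "\n")

    def annual_count(self, year: int) -> int:
        """Count referrals in a calendar year, for the annual report."""
        if not self.path.exists():
            return 0
        with self.path.open() as f:
            return sum(
                1 for line in f
                if json.loads(line)["timestamp"].startswith(f"{year}-")
            )
```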
Why this matters
- It sets a precedent: California is the first U.S. state to impose specific legal obligations on relational or “companion” AI systems.
- Broad scope: Even chatbot services not marketed as companions may fall within the definition if they meet the "social needs"/relationship-sustaining criteria. 
- Increased liability and compliance burden for operators: not only regulatory exposure but also potential civil suits under the private right of action.
- Strategic implications for developers, legal advisors and platform operators: review product design, user flows, content moderation, vendor contracts, and disclosures.
Disclaimer: The content on this page is for general informational purposes only and does not constitute legal advice or create an attorney-client relationship. Laws vary by jurisdiction and outcomes depend on your individual facts. If you have a specific legal question, consult a licensed attorney.
