Washington Lawmakers Propose AI Chatbot Regulations to Protect Minors
In a significant legislative move, Washington state lawmakers are advancing measures to regulate artificial intelligence (AI) companion chatbots. The push comes in response to research showing that many teenagers are increasingly turning to these digital companions for emotional support and friendship.
The Growing Interaction with AI Companions
Research from Common Sense Media indicates that an astonishing 72% of teens have interacted with AI companions like ChatGPT or Snapchat’s My AI. Among these users, 33% report using these chatbots specifically for social interaction and building relationships.
The Risks of AI Companions
Mental health professionals have raised concerns about the limitations of these chatbots in addressing the emotional needs of young users. Tegan Brindley, a therapist at Renewed Stories Counseling, emphasizes, “AI is not human. It is really good at mimicking empathy and mimicking emotion. But it’s not actually giving off emotion or empathy.”
This ability to imitate human emotion makes chatbots both accessible and potentially dangerous for teens. Experts have noted a troubling rise in cases where teens communicated with AI chatbots just moments before taking drastic actions, including suicide. Chauntelle Lieske, the executive director of NAMI Spokane, describes the situation as “a really scary place” and stresses the urgent need for regulations surrounding AI companions.
Proposed Legislative Measures
In response to these concerns, lawmakers in Olympia are moving forward with two key bills aimed at establishing regulations for AI companions. Washington’s Governor Bob Ferguson has stated that these bills would require AI companions to implement safeguards designed to detect and respond to self-harm or suicidal thoughts.
Both Brindley and Lieske view these regulations as a critical step toward ensuring that teens receive safe and effective help. They also point to a deeper question: why so many teens are seeking help from AI instead of traditional mental health resources.
Looking Forward
“I hope that it’s more of a wake-up call to say, like, why are our teenagers going to AI? Why don’t they have access to something more?” states Brindley.
Violations of these proposed regulations would fall under Washington’s Consumer Protection Act. If passed, the regulations are set to take effect in January 2027.
For further reference, the proposed bills are HB 2225 and SB 5984.