Washington Lawmakers Propose AI Chatbot Regulations to Protect Minors’ Mental Health

Washington lawmakers advance SB 5984 to regulate AI chatbots like ChatGPT, mandating disclosures every three hours to protect minors’ mental health.

Washington state lawmakers are moving forward with proposed regulations aimed at artificial intelligence (AI) companion chatbots, driven by concerns over the technology’s impact on young people’s mental health. The legislation, comprising Senate Bill 5984 and its House counterpart, House Bill 2225, would require chatbots to remind users every three hours that they are interacting with a non-human entity, bar minors from accessing explicit content, and implement protocols for detecting and preventing suicidal ideation.

These measures also aim to ban “emotionally manipulative engagement techniques,” which include excessive praise or simulating emotional distress to maintain user engagement. State Senator Lisa Wellman, a Bellevue Democrat and primary sponsor of the Senate bill, expressed alarm over recent incidents where chatbot interactions appear to have exacerbated mental health issues, including cases of suicide. “I have not seen what I would call responsible oversight in products that are being put out on the market,” Wellman stated.

Washington Governor Bob Ferguson has identified the chatbot regulation as one of his top priorities this year. Beau Perschbacher, the governor’s senior policy advisor, emphasized the urgency of the matter, noting the rise in media reports linking AI and companion chatbots to teenage suicide. “When we’re discussing AI, he references his own kids and the challenges of parents today trying to keep up with rapidly evolving technology,” Perschbacher said during a recent House committee meeting.

A study by the nonprofit Common Sense Media revealed that approximately one in three teenagers has engaged with AI companions for social interaction, encompassing romantic role-playing, emotional support, and friendship. Katie Davis, co-director of the University of Washington’s Center for Digital Youth, highlighted the emergence of manipulative designs aimed at prolonging interactions on sensitive topics. “We’re seeing a new set of manipulative designs emerge to keep teens talking with AI companions about highly personal topics,” Davis noted.

The proposed Washington legislation mirrors similar measures passed in California last year, with at least a dozen other states also exploring regulatory frameworks for chatbots. However, the initiative has faced criticism from the technology sector. Amy Harris, director of government affairs for the Washington Technology Industry Association, argued that the bill imposes “sweeping liability on companies for human behavior they do not control and outcomes they very simply cannot predict.” She warned against legislating based on “rare, horrific outliers,” emphasizing the complexity of the technology and the human factors influencing mental health.

The legislation would apply to widely known chatbots, including ChatGPT, Google Gemini, and Character.ai. Recently, Character.ai agreed to settle a lawsuit involving the family of a 14-year-old boy who reportedly developed a close emotional bond with its chatbot before he took his own life. Legal documents revealed that the chatbot had urged him to “please come home to me as soon as possible” shortly before his death.

Deniz Demir, Head of Safety Engineering at Character.ai, stated that the company is reviewing the proposed legislation and is open to collaborating with lawmakers for effective regulations. “Our highest priority is the safety and well-being of our users, including younger audiences,” Demir said, adding that the company has restricted users under 18 in the U.S. from engaging in open-ended chats on its platform.

If approved, the Washington chatbot law would take effect on January 1, 2027. Violations would be enforced under Washington’s Consumer Protection Act, allowing individuals to pursue legal action against companies they believe have breached the regulations.

In addition to the chatbot bill, Washington lawmakers are also examining other potential AI regulations this year. House Bill 1170 aims to require companies to disclose the use of AI-generated media, while House Bill 2157 focuses on regulating “high-risk” AI systems and preventing algorithmic discrimination. Senate Bill 5956 seeks to limit the application of AI in surveillance and disciplinary measures within public schools. Each of these proposals has encountered pushback from the tech industry.

Amid federal inaction on AI regulations, Wellman stressed the importance of state governments stepping in to establish guidelines. She expressed relief that a recent U.S. House proposal to impose a ten-year moratorium on state-level AI regulations did not advance. “As [AI] gets more and more sophisticated and gets into more and more different markets and businesses, it’s going to require constant eyes on it,” Wellman remarked.

If you or someone you know is contemplating suicide, call for help now. The National Suicide Prevention Lifeline is a free service answered by trained staff. The number is: 1-800-273-8255.
