New California Law Wants Companion Chatbots to Tell Kids to Take Breaks

By Karla T Vasquez


AI chatbots in California will need to remind some users that they are not people, under a new law signed by Governor Gavin Newsom on Monday.

The law, SB 243, also requires companion chatbot companies to maintain protocols for identifying and addressing users' suicidal ideation or self-harm. For users under 18, chatbots must provide a notification at least every three hours reminding users to take a break and that the bot is not human.

Newsom has signed several bills in recent weeks related to social media, artificial intelligence and other consumer technology issues. Another bill signed Monday, AB 56, requires warning labels on social media platforms, similar to those required for tobacco products. Last week, Newsom signed measures requiring websites to make it easy for internet browsers to tell them not to sell users' data, and prohibiting loud advertising on streaming platforms.

AI companion chatbots have drawn particular scrutiny from lawmakers and regulators in recent months. The Federal Trade Commission has opened an investigation in response to complaints from consumer groups and parents that the bots are harming children's mental health. OpenAI introduced new parental controls for its popular ChatGPT platform after the company was sued by parents who allege the chatbot played a role in their teenager's suicide.

"We've seen some truly horrific and tragic examples of young people harmed by unregulated technology, and we won't stand by while companies continue without necessary limits and accountability," Newsom said in a statement.




Replika, an AI companion developer, told CNET that it already has protocols for identifying self-harm as required by the new law, and that it is working with regulators and others to comply and to protect users.

"As one of the pioneers of AI companionship, we recognize our deep responsibility to lead on safety," Minju Song said in an emailed statement. Song said Replika uses content-filtering systems, community guidelines and safety systems that refer users to crisis resources when needed.

Read more: Using AI as a therapist? Professionals say why you should think again

A Character.AI spokesperson said the company "welcomes working with regulators and lawmakers in this emerging space, and will comply with laws, including SB 243." OpenAI spokesperson Jamie Radice called the bill a "meaningful step" for AI safety. "By establishing clear guardrails, California is helping shape a more responsible approach to AI development and deployment across the country," Radice said in an email.

One bill Newsom has not yet signed, AB 1064, would go further, barring developers from making companion chatbots available to children unless the chatbots are incapable of engaging in sexually explicit interactions, among other things.

