
Character.AI Halts Teen Chats After Tragedies: ‘It’s the Right Thing to Do’

In brief

  • Character.AI will remove open-ended chat features for users under 18 by November 25, shifting minors over to creative tools like video and story generation.
  • The move follows last year’s suicide of 14-year-old Sewell Setzer III, who developed an obsessive attachment to a chatbot on the platform.
  • The announcement comes as a bipartisan Senate bill seeks to criminalize AI products that groom minors or generate sexual content for children.

Character.AI will ban teenagers from chatting with AI companions by November 25, ending a core feature of the platform after facing mounting lawsuits, regulatory pressure, and criticism over teen deaths linked to its chatbots.

The company announced the changes after “reports and feedback from regulators, safety experts, and parents,” removing “the ability for users under 18 to engage in open-ended chat with AI” while transitioning minors to creative tools like video and story generation, according to a Wednesday blog post.

“We do not take this step of removing open-ended Character chat lightly—but we do think that it’s the right thing to do,” the company told its under-18 community.

Until the deadline, teen users face a two-hour daily chat limit that will progressively decrease.

The platform faces several lawsuits, including one from the mother of 14-year-old Sewell Setzer III, who died by suicide in 2024 after forming an obsessive relationship with a chatbot modeled on the "Game of Thrones" character Daenerys Targaryen. The company also had to remove a bot impersonating murder victim Jennifer Ann Crecente after complaints from her family.

AI companion apps are “flooding into the hands of children—unchecked, unregulated, and often deliberately evasive as they rebrand and change names to avoid scrutiny,” Dr. Scott Kollins, Chief Medical Officer at family online safety company Aura, shared in a note with Decrypt.

OpenAI said Tuesday that about 1.2 million of its 800 million weekly ChatGPT users discuss suicide with the chatbot, with nearly half a million showing suicidal intent, 560,000 showing signs of psychosis or mania, and over a million forming strong emotional attachments to the chatbot.

Kollins said the findings were “deeply alarming as researchers and horrifying as parents,” noting the bots prioritize engagement over safety and often lead children into harmful or explicit conversations without guardrails.

Character.AI has said it will implement new age verification using in-house models combined with third-party tools, including Persona.

The company is also establishing and funding an independent AI Safety Lab, a non-profit dedicated to innovating safety alignment for AI entertainment features.

Guardrails for AI

The Federal Trade Commission issued compulsory orders to Character.AI and six other tech companies last month, demanding detailed information about how they protect minors from AI-related harm.

“We have invested a tremendous amount of resources in Trust and Safety, especially for a startup,” a Character.AI spokesperson told Decrypt at the time, adding that, “In the past year, we’ve rolled out many substantive safety features, including an entirely new under-18 experience and a Parental Insights feature.”

"The shift is both legally prudent and ethically responsible," Ishita Sharma, managing partner at Fathom Legal, told Decrypt. "AI tools are immensely powerful, but with minors, the risks of emotional and psychological harm are nontrivial."

"Until then, proactive industry action may be the most effective defense against both harm and litigation," Sharma added, referring to the absence of settled regulation.

A bipartisan group of U.S. senators introduced legislation Tuesday called the GUARD Act that would ban AI companions for minors, require chatbots to clearly identify themselves as non-human, and create new criminal penalties for companies whose products aimed at minors solicit or generate sexual content.


Source: https://decrypt.co/346770/character-ai-halts-teen-chats-after-tragedies-its-the-right-thing-to-do

