OpenAI CEO Sam Altman announced on Tuesday a series of new user policies, including a promise to significantly change how ChatGPT interacts with users under 18.
“We prioritize safety ahead of privacy and freedom for teens,” the post said. “This is a new and powerful technology, and we believe minors need significant protection.”
The changes for underage users specifically target conversations involving sexual content or self-harm. Under the new policy, ChatGPT will be trained not to engage in flirtatious talk with underage users, and additional guardrails will be placed around discussions of suicide. If a minor uses ChatGPT to imagine suicidal scenarios, the service will attempt to contact the user’s parents or, in particularly severe cases, local police.
Unfortunately, these scenarios are not hypothetical. OpenAI is currently facing a wrongful death lawsuit from the parents of Adam Raine, who died by suicide after months of interactions with ChatGPT. Character.AI, another consumer chatbot maker, faces a similar lawsuit. While the risks are particularly urgent for underage users contemplating self-harm, the broader phenomenon of chatbot-fueled delusion has drawn widespread concern, especially as consumer chatbots have grown more capable of sustained, detailed conversations.
Alongside the content-based restrictions, parents who register a minor’s account will be able to set “blackout hours” during which ChatGPT is unavailable, a feature that did not previously exist.
The new ChatGPT policies arrive on the same day as a Senate Judiciary Committee hearing titled “Examining the Harm of AI Chatbots,” which Senator Josh Hawley (R-MO) announced in August. Adam Raine’s father is scheduled to speak at the hearing, among other witnesses.
The hearing is also expected to examine the findings of a Reuters investigation that surfaced policy documents apparently encouraging sexual conversations with underage users. Meta updated its chatbot policies in the wake of that report.
Identifying underage users will be a significant technical challenge, and OpenAI describes its approach in a separate blog post. The service is “building toward a long-term system to understand whether someone is over or under 18,” but in ambiguous cases the system will default to the more restrictive rules. For concerned parents, the most reliable way to ensure a minor is recognized as such is to link the teen’s account to an existing parent account. That link also allows the system to alert parents directly when the teen is believed to be in distress.
But in the same post, Altman emphasized OpenAI’s continued commitment to user privacy and to giving adult users broad freedom in how they choose to interact with ChatGPT. “We realize that these principles are in conflict,” the post concludes, “and not everyone will agree with how we resolve this conflict.”
If you or someone you know needs help, call 1-800-273-8255 to reach the National Suicide Prevention Lifeline. You can also text HOME to 741-741 for free, 24-hour support from the Crisis Text Line, or text or call 988. Outside of the US, visit the International Association for Suicide Prevention for a database of resources.
