ChatGPT to get an age-verification system aimed at identifying users under 18, following the tragic death of a teenager

The company will roll out an age-prediction system to limit underage access.
By Óscar Ontañón Docal
Published 2025-09-22

"Minors need significant protection." This is what Sam Altman said just a few days ago in a blog post. Now, we learn that OpenAI is preparing new safeguards for ChatGPT after a lawsuit linked the chatbot to the death of a teenager. According to the information provided, the company will introduce technology designed to estimate a user's age and restrict conversations if there is any doubt, defaulting to an under-18 experience. This will block sexual content, prevent flirtatious exchanges, and stop discussions of self-harm. In certain regions, users may also need to verify their identity with official documents. OpenAI said the move prioritizes safety for minors, even at the expense of adult privacy, while adults will continue to access more flexible interactions. What do you think about this? Of course, if you want to learn more details, you can do so through the following link. Go!

Sam Altman, American entrepreneur, investor, and chief executive officer of OpenAI. New York, US, 04 September 2025.
