"Minors need significant protection." This is what Sam Altman said just a few days ago in a blog post. Now, we learn that OpenAI is preparing new safeguards for ChatGPT after a lawsuit linked the chatbot to the death of a teenager. According to the information provided, the company will introduce technology designed to estimate a user's age and restrict conversations if there is any doubt, defaulting to an under-18 experience. This will block sexual content, prevent flirtatious exchanges, and stop discussions of self-harm. In certain regions, users may also need to verify their identity with official documents. OpenAI said the move prioritizes safety for minors, even at the expense of adult privacy, while adults will continue to access more flexible interactions. What do you think about this? Of course, if you want to learn more details, you can do so through the following link. Go!