YouTube announced on Tuesday that it is starting to roll out age-estimation technology in the US to identify teen users in order to provide a more age-appropriate experience. The company says it will use a variety of signals to determine a user's likely age, regardless of the birthday the user entered when they signed up.
When YouTube identifies a user as a teen, it will apply new protections and experiences, including disabling personalized advertising, safeguards that limit repeated viewing of certain types of content, and the activation of tools like screen time and bedtime reminders, among others.
These protections already exist on YouTube, but have only been applied to users verified as teens, not those who may have misrepresented their true age. For example, in 2023, YouTube began limiting repeated views of videos that could cause body image issues or that feature social aggression. The company has also been developing digital wellbeing tools since 2018.
If the new system incorrectly identifies a user as under 18 when they are not, YouTube says the user will be able to verify their age with a credit card, a government ID, or a selfie. Only users who have been verified through one of these methods, or who are determined to be over 18, will be able to view age-restricted content on the platform.
The machine learning-powered technology will begin rolling out to a small set of US users in the coming weeks and will then be monitored before a wider rollout, the company says.
Plans to introduce the age-estimation technology were announced in February as part of YouTube’s 2025 roadmap. The plans are also the latest step in the company’s efforts to make YouTube safer for younger users, following the 2015 launch of the YouTube Kids app and the 2024 rollout of supervised accounts. The features also arrive as social media platforms come under increased government scrutiny in the United States, where platform makers, including Apple and Google, have pitted their interests against those of large technology companies like Meta over who is responsible for age verification and child safety.
Meanwhile, a number of US states have taken matters into their own hands: more than a dozen states have passed or proposed laws regulating minors’ use of social media. Many of them require age verification or parental consent, including those in Louisiana, Arkansas, Florida, Georgia, Utah, Texas, Maryland, Tennessee, and Connecticut, among others. (However, some of these laws, such as those in Utah and Arkansas, are currently blocked in court and not enforceable, while others have yet to take effect.)
The United Kingdom also began enforcing age verification requirements this week under its Online Safety Act, passed in 2023.
YouTube does not share details of the signals it uses to infer a user’s age, but notes that it will consider data such as YouTube activity and the longevity of a user’s account to decide whether the user is under 18.
The new system will only apply to signed-in users, as signed-out users already lack access to age-restricted content, and it will be available across platforms, including web, mobile, and connected TV.
