Platforms must verify users' age starting this Tuesday; see how it will work
Starting Tuesday, social media and online services must implement age verification for users to better protect children and adolescents.
Beginning this Tuesday, social media platforms and other online services that may contain content inappropriate for children and adolescents are required to verify the age of their users. The new rule stems from the Digital Statute of the Child and Adolescent (ECA Digital), enacted in September 2025. Unlike simple self-declaration prompts such as "Yes, I am over 18", the law demands more robust mechanisms to ensure age-appropriate experiences for younger audiences.
The ECA Digital mandates that platforms designed for, or likely to be accessed by, children and adolescents adopt mechanisms to provide age-appropriate experiences. It also requires app stores and operating systems to implement measures that reliably assess a user's age or age group and relay that information to platforms such as social networks, which must then tailor the user experience accordingly.
The legislation, also known as the Felca Law because of the influencer's widely viewed video on the adultification of children, represents a significant step toward improving online safety for minors. Its implications are far-reaching: it may reshape how companies handle user data and confront the challenge of keeping children safe online. The effectiveness of these measures, however, will depend on how they are implemented and on the technologies developed to meet the new requirements.