Feb 24 • 13:08 UTC 🇧🇷 Brazil G1 (PT)

1 in every 5 young people saw unwanted nudity on Instagram, says Meta in lawsuit

A recent Meta report reveals that one in five young Instagram users has encountered unwanted nudity on the platform, as discussed in a legal case in California.

Meta, the parent company of Instagram, disclosed in a legal document that one in five young users on the platform has been exposed to unwanted nudity. The information surfaced during a lawsuit in California, highlighting ongoing concerns about the safety and privacy of adolescents on social media. The head of Instagram, Adam Mosseri, indicated that most explicit images are sent through private messages, underscoring the difficulty of monitoring such content while respecting privacy.

The report also revealed alarming statistics about the mental health impact on users, with about 8% of respondents aged 13 to 15 reporting exposure to self-harm or suicidal threats on the platform. This has raised significant concerns among parents and child safety advocates regarding Instagram's responsibility in protecting its younger audience from harmful content, especially given the potential influence of social media on adolescent behavior and well-being.

In response to these issues, Meta has committed to removing, by the end of 2025, any images or videos containing nudity or explicit sexual activity involving teenage users, except for medical or educational content. The company's actions are part of an effort to improve user safety, but stakeholders will be watching closely to see whether these measures effectively shield young users from inappropriate content.
