Feb 27 • 19:17 UTC 🇬🇧 UK Guardian

Instagram to alert parents if teens repeatedly search self-harm terms

Instagram will begin notifying parents when their teenagers repeatedly search for terms related to self-harm and suicide as part of a new initiative amid ongoing legal scrutiny of the platform's impact on youth.

Instagram announced the feature as part of a broader safety effort by its parent company, Meta, which faces ongoing legal challenges over its alleged failure to protect minors from harmful content. The announcement coincides with two major trials: one in Los Angeles examining claims that Meta's platforms are deliberately addictive, and another in New Mexico focused on the company's alleged failure to prevent the sexual exploitation of children.

The alerts will go to parents enrolled in Instagram's parental supervision program. Instagram has previously tried to address these issues by blocking certain content from appearing in teenage accounts' search results and by directing users to helplines; the new feature marks a more direct approach, involving parents in monitoring risky online behaviour.

The move comes in response to mounting pressure from families, school districts, and government bodies, many of which have sued Meta and other social media companies. The lawsuits contend that the platforms are intentionally designed to be addictive and have failed to adequately protect children from harmful content, including material that may worsen depression and eating disorders. The outcome of the trials may shape future regulation of child safety on social media.
