Feb 27 β€’ 08:37 UTC πŸ‡°πŸ‡· Korea Hankyoreh (KR)

Instagram to Immediately Notify Parents if Teens Search for 'Suicide and Self-Harm'

Instagram is introducing a service that alerts parents when teens repeatedly search for terms related to suicide or self-harm.

Instagram, operated by Meta, has announced a new feature that will notify parents if their teenage children repeatedly search for terms related to suicide or self-harm. The service is part of a 'Youth Management Program' that requires parental consent to participate. Notifications can be sent via email, text message, or WhatsApp, according to the preferences parents register. The alerts cover not only keywords related to suicide and self-harm but also phrases suggesting a youth's intent to self-harm.

The feature is set to roll out next week in the United States, the United Kingdom, Australia, and Canada, with availability extending to South Korea and other regions by the end of the year. According to Meta, the initiative aims to let parents intervene when a child's search activity indicates a need for assistance, giving them critical information to support their children in times of crisis.

Meta emphasizes its commitment to protecting teenagers from harmful content and enforces strict policies against promoting or glamorizing suicide or self-harm. The company says that while users are permitted to share personal experiences with such issues, it is working to ensure that content of this kind shared by other users is not visible to minors. A similar alert feature is also in development for AI conversations that touch on suicide and self-harm, part of Meta's ongoing effort to improve teenager safety on its social media platforms.