Meta activates parents' alert feature when searching for self-harm content
Meta introduces a new proactive safety feature that alerts parents if their teens repeatedly search for terms related to self-harm or suicide.
In a significant move to enhance digital safety for teenagers, Meta, the parent company of Facebook and Instagram, has announced a new proactive safety feature that sends immediate alerts to parents if their children frequently search for sensitive keywords associated with self-harm or suicide. The initiative reflects an effort to address growing concerns over the mental health of young users on social media platforms.
According to reports from the Associated Press, the new feature will be integrated into the parental control tools available on Instagram. Once the system detects repeated searches for specific sensitive keywords within a short period, parents will receive notifications via email, text message, or WhatsApp, making them aware of their child's possible struggles and providing an opportunity for intervention.
Moreover, the feature will not only notify parents but will also offer guidance compiled by mental health experts on how to initiate a calm and supportive conversation with their teenager while respecting their privacy. It is set to roll out next week in the United States, the United Kingdom, Canada, and other regions, signaling Meta's commitment to addressing the mental health crisis among adolescents and fostering a more supportive online environment.