Social Networks Bombard Children with Toxic Content. Regulation is Increasing, but the Algorithms Keep Running
The article reports an alarming rise in children contacting a crisis hotline over mental health problems linked to toxic content on social media, primarily TikTok and Instagram.
In the past year, more than 1,600 children and adolescents contacted a crisis hotline in Czechia citing problems related to social media. Among them was a child suffering psychological distress caused by suicide-themed online challenges. Challenges such as 'Blue Whale' and 'Red Dolphin' push participants toward self-harm or harming others, and platforms like TikTok and Instagram serve as the primary venues where children encounter them. The hotline's spokesperson disclosed that one caller was a child overwhelmed by fear stemming from these challenges.
The article highlights how children such as 14-year-old Helena encounter destructive themes prominently featured in social media videos. Helena describes how engaging with these videos sent her into a downward spiral, illustrating the psychological toll such content can take on younger audiences. Her case points to the broader problem: recommendation algorithms on these platforms keep serving toxic content back to users, amplifying harmful messages that can influence vulnerable youths.
As regulation aimed at addressing these harms increases, the article questions whether such measures can keep pace with the relentless operation of social media algorithms. That these algorithms keep running unchecked raises ethical concerns and underscores the urgent need for more robust regulatory frameworks to protect children's mental well-being from harmful social media content.