Recent findings by Ofcom have revealed a concerning trend of children in the UK being exposed to violent online content as early as primary school, highlighting how unavoidable such experiences have become for young people. The study involved interviews with 247 children, who reported encountering violent gaming, verbal discrimination, and physical altercations across social media, video-sharing platforms, and messaging apps. This exposure often occurs before children reach the platforms’ minimum age requirements.

A significant concern raised is the normalization among children of sharing videos of schoolyard and street fights, with motivations ranging from gaining online popularity to avoiding ostracization. Teenage boys in particular were noted to actively seek out and share violent content in order to fit in socially. Children aged 10 to 14 also described pressure to find this type of content amusing so as not to be excluded by their peers.

The research also shed light on children’s encounters with harmful content relating to suicide, self-harm, and eating disorders on social media. Such exposure contributes to the normalization of, and desensitization toward, these serious issues. Cyberbullying was identified as a pervasive problem affecting children’s emotional well-being and mental health, facilitated by direct messaging and commenting functions on online platforms.

In response to these findings, Ofcom has underscored the need for tech companies to shoulder responsibility and strengthen online safety measures for children, particularly in anticipation of new online safety laws. The regulator announced plans to consult later this spring on what it expects from the industry to ensure an age-appropriate online experience for children. This move echoes broader calls to protect young users from harm, amid criticism from organizations such as the NSPCC that tech platforms are failing in their duty of care. The research comes ahead of the implementation of the Online Safety Act, under which Ofcom will be responsible for enforcing stricter regulations on platforms that fall short in protecting children from harmful online content.