Meta, the parent company of major social media platforms including WhatsApp, Facebook, and Instagram, has lowered the minimum age required to use WhatsApp from 16 to 13 in the UK and EU. The decision triggered criticism from child safety groups and politicians, who accused the company of prioritizing profit over children's safety. The campaign group Smartphone Free Childhood and its co-founder Daisy Greenwell voiced concerns over the change, pointing to potential risks such as exposure to harmful content and cyberbullying. Conservative MP Vicky Ford criticized the move for lacking parental consultation, calling it irresponsible.

In response to concerns about online child safety, Prime Minister Rishi Sunak emphasized the importance of protecting young internet users, citing legislative measures such as the Online Safety Act, which is intended to hold platforms accountable. Defending its decision, WhatsApp stated that the new age limit aligns with international standards and that it has implemented protective measures against threats such as sextortion and image abuse.

Separately, Meta announced new safety measures on Instagram to combat sextortion and intimate image abuse. These include a nudity protection filter that automatically blurs explicit images in direct messages for users under 18 and gives them options to block senders or report inappropriate chats. The features are aimed at strengthening privacy and safety, particularly for younger users, and form part of broader efforts within the tech industry to counter evolving online threats and foster safer online spaces.