Meta, the parent company of WhatsApp, Facebook, and Instagram, has lowered the minimum age for using WhatsApp in Europe from 16 to 13. The change brings WhatsApp in line with other major platforms, including Facebook, Instagram, and Snapchat, which all set their minimum age at 13.

The decision has drawn significant concern and criticism from campaigners and child-safety experts. Smartphone Free Childhood, a campaign group, has vocally opposed the change, arguing that it undermines efforts to protect young users online and pointing in particular to the risks of increased exposure to harmful content and cyberbullying. Critics have accused Meta of prioritizing profit over the safety and well-being of children.

In response, WhatsApp has defended the change, saying it aligns the platform with global standards. Meta also pointed to new safety features, including a nudity protection filter on Instagram designed to protect users under 18 by blurring nude images, enabling recipients to block senders, and making it easier to report inappropriate content.

Given these growing concerns, there are calls for stronger regulation of minors' access to social media. The UK's Online Safety Act is part of these regulatory efforts, aiming to shield children from harmful online material. Mark Bunting, Ofcom's director of online safety strategy, has indicated that social media companies could face fines if they fail to comply with the forthcoming rules.

Meta's policy change has reignited debate over how best to balance young people's access to social media with the need for a safe online environment. Discussions are ongoing, with stakeholders including regulatory bodies and public advocates pushing for more stringent protections for younger users in the digital landscape.