Meta has lowered the minimum age for WhatsApp users in the U.K. and EU from 16 to 13. The change, which took effect on April 12, has drawn significant backlash from campaign groups.
One such group, Smartphone Free Childhood, strongly criticized the move, arguing that it runs counter to growing demands for major tech companies to prioritize child safety. The group warned that admitting users as young as 13 sends a misleading signal that the platform is safe for children, when teachers, parents, and experts believe otherwise.
Meta defended the decision, saying the new minimum brings the U.K. and EU in line with most other countries, and pointed to the protective measures it has put in place for younger users. In particular, the company highlighted new tools such as Nudity Protection, designed to guard against sextortion and intimate image abuse.
Specifically, Meta will soon introduce a feature in Instagram direct messages that automatically blurs images containing nudity and prompts users to reconsider before sending such content. The feature will be enabled by default for users under 18, an approach similar to Apple's Communication Safety system, which warns children before they receive or send explicit content.
Apple's system notifies children when content may be inappropriate and reassures them that they can choose not to view it. The image analysis runs entirely on the device, so photos are never sent to Apple for inspection, which strengthens both privacy and security.
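Neither Meta nor Apple has published implementation details, but the behavior described above, on-device classification, blurring enabled by default for under-18s, and the option to view the image anyway, maps onto a simple gating pattern. The Python sketch below is purely illustrative: every name in it is hypothetical, and the `flagged_as_nudity` stub stands in for whatever local ML model a real client would run.

```python
from dataclasses import dataclass
from typing import Optional

from PIL import Image, ImageFilter


@dataclass
class Recipient:
    age: int
    # None means the user never touched the setting,
    # so the age-based default applies.
    blur_nudity: Optional[bool] = None


def filter_enabled(user: Recipient) -> bool:
    """Blur is on by default for under-18s; adults must opt in."""
    if user.blur_nudity is not None:
        return user.blur_nudity
    return user.age < 18


def flagged_as_nudity(image: Image.Image) -> bool:
    """Stand-in for an on-device classifier.

    A real client would run a local ML model here, so the image
    is analyzed without ever being uploaded for inspection.
    """
    return False  # hypothetical model output


def prepare_incoming_image(image: Image.Image, user: Recipient) -> Image.Image:
    """Return a blurred copy of flagged images for protected users.

    The original is untouched, so the recipient can still choose
    to view it after the warning.
    """
    if filter_enabled(user) and flagged_as_nudity(image):
        return image.filter(ImageFilter.GaussianBlur(radius=30))
    return image


if __name__ == "__main__":
    teen = Recipient(age=14)
    incoming = Image.new("RGB", (640, 480))  # placeholder for a received photo
    displayed = prepare_incoming_image(incoming, teen)
```

The design choice worth noting, mirrored from the coverage above, is that the default is derived from the user's age rather than stored as a fixed preference, and that classification happens on the device before the image ever reaches the screen, not on a server.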
WhatsApp's latest update includes more than age restrictions and content protection; for a detailed look at its broader implications, readers can consult additional coverage.