Meta has moved swiftly to comply with Australia’s world-first legislation banning children under 16 from social media platforms. By mid-December 2025, just one day after the ban took effect, the company had removed 544,052 accounts from Instagram, Facebook, and Threads. Affected users were given a 14-day notice period to download their data before their profiles were permanently disabled. While Meta has complied with the legal requirements, the company maintains that the current approach is flawed and may push younger users toward less regulated corners of the internet where safety standards are weaker.
Meta and the debate over age verification
Meta has publicly criticized the structure of the ban, arguing that it places an undue burden on individual social media companies rather than addressing the root of the problem. The tech giant is advocating for a shift in responsibility, contending that age verification and parental approval should occur at the operating system or app store level. Under this proposal, Apple and Google would be responsible for verifying a user’s age when the user first sets up a device or attempts to download an app. Meta argues this would create a more consistent and secure environment for minors across all digital services, not just social media.
Meta and the OpenAge Initiative
Meta has launched the OpenAge Initiative, a non-profit organization dedicated to standardizing age verification processes across the tech industry. This move is part of the company’s broader effort to “engage constructively” with the Australian government while pushing for alternative safety frameworks. By promoting a unified standard, Meta hopes to prevent a fragmented landscape where different apps use varying levels of scrutiny to check user ages. The company continues to provide access to Facebook Messenger, which is exempt from the Australian ban, as a way for families to maintain contact despite the restrictions on broader social networking.