Meta, TikTok, and Snap commit to complying with new law prohibiting under-16s from opening accounts, despite concerns it could push younger users into ‘darker corners’ of the internet
Australia’s pioneering move to enforce a minimum age of 16 for social media accounts has secured compliance commitments from the world’s biggest platforms, including Meta, TikTok, and Snap. However, the tech giants have cautioned that enforcement will be fraught with “enormous challenges.”
The incoming law, which comes into effect on December 10, has forced the hand of platforms that initially sought exemptions, establishing a global test case for regulating how children access and use dominant social networks.
Giving evidence at a recent Senate hearing, Ella Woods-Joyce, TikTok’s Australia policy lead, confirmed the company’s position, stating, “TikTok will comply with the law and meet our legislative obligations.” Yet, she echoed a familiar industry argument: “experts believe a ban will push younger people into darker corners of the Internet where protections don’t exist.”
Similarly, Meta, the parent company of Instagram and Facebook, has pledged its best efforts to remove all Australian users under 16 from its platforms. Meta policy director Mia Garlick acknowledged the scale of the task, calling it an “enormous challenge,” with the central goal being “compliance with the law.”
As reported by InnovationAus, Garlick detailed that Meta plans to use a “waterfall approach” to age checks, implementing tiered verification based on risk factors, though the “precise machinations” are still being developed.
‘Dark Corners’ Argument Rebutted
The platforms’ pushback has involved familiar tactics, including Snapchat and YouTube agitating for exemptions by claiming their services are not primarily “social media,” and TikTok citing the difficulty of complying with such “novel legislation” on the current timeline—despite the law having been approved nearly a year ago.
The industry’s core argument—that regulation will merely drive users to the dark web or unregulated spaces—is a familiar line borrowed from the adult content industry’s defence against age verification for pornography.
However, critics argue this position fundamentally overstates the risk, especially where social media is concerned. Unlike niche, illicit content sites, social media platforms depend on a “critical mass” of users; building a viable alternative ‘Evilgram’ for teens is no simple task, as the struggles of networks like Mastodon, Vine, and Yik Yak demonstrate.
Furthermore, the argument is seen as moot given the nature of the problem the law seeks to solve. The push for age checks is rooted in the widely accepted view that the existing, widely used social platforms—Instagram, X, and others—are themselves the sources of harm, from destroying social lives to hosting concerning content.
The Australian eSafety Commissioner, Julie Inman Grant, has consistently rejected the platforms’ pleas for carve-outs, including YouTube’s call to reinstate an exemption for video-sharing sites. She has warned all major players that they are almost certain to be covered by the law and is expected to proceed with enforcement and fines for non-compliance after the December 10 deadline.
The world is now watching to see whether Australia can successfully pull off this world-first attempt to significantly regulate social media access for minors.