Sunday, May 10, 2026

7NEWS Spotlight: Big tech’s big fail on social media ban

Australia’s under-16 social media ban was meant to shut kids out. Five months on, most are still getting in.

The law came into force on December 10, requiring platforms to enforce strict age checks or face fines of up to $50 million for failing to take “reasonable steps.”

Now, the first major test of the ban suggests little has changed.


Commissioned by 7NEWS Spotlight, a YouGov survey of 1500 Australians aged 13 to 15 — the largest since the law began — shows 85 per cent are still using social media daily.

More than half — 52 per cent — say it’s still easy to access platforms, with most simply lying about their age.

For 51 per cent, there’s been no change. Another 22 per cent say their use has increased.

“I think it’s not surprising that young people would want to be on social media,” Paul Smith from YouGov says. “But it’s clear that the social media companies have not done anywhere near enough to get young people off social media.”

The top platforms remain unchanged: YouTube, TikTok, Instagram, Snapchat and Facebook.

The teen social media ban applies to platforms such as Facebook, Instagram, TikTok and Snapchat. Credit: AAP

But beneath the headline numbers, there are signs of minor progress.

Online bullying is down 9 per cent, while exposure to inappropriate and violent content has dropped 18 per cent.

Away from screens, habits are shifting too. Thirty per cent of teens say they’re spending more time on sports and activities, and 27 per cent report better sleep.

“Now, these are modest improvements,” Smith says, “but it shows that the ban has had a real impact in improving, in just six months, the lives of our young people.”

“It’s made good progress but there’s a long way to go.”

Significantly, though, parents appear to be stepping in where platforms haven’t.

Two thirds are now monitoring their children’s social media use, and 87 per cent of teens have discussed the ban at home.

“Parents have felt empowered by the ban,” Smith said. “And those 67 per cent of Australian parents who are monitoring their children are clearly making a difference in driving that improvement in the life experience of ‘all right, you’ve been on that long enough, now it’s time to go out and do something else’.”

Mother’s mission

For Emma Mason, the issue is deeply personal. Her daughter Tilly died by suicide after relentless bullying. Now, she’s campaigning for stronger protections.

Emma Mason got a standing ovation after telling world leaders about her daughter’s online bullying. Credit: Lukas Coch/AAP

She wants tech giants to do more to remove under-16s but says: “Success is that parents and teachers and schools are talking about this. Children are talking about this. And the children that are 10 and downwards will end up with a life in Australia where this is not the norm”.

She also believes enforcement is lagging but is confident it is coming.

“The government wasn’t going to wait to try and get the technology right because technology is constantly changing and it’s like trying to put a fence around a cyclone, trying to get everything right in time to get this law to work perfectly,” she said.

Emma Mason says she is “just a mum from Bathurst who is trying to change the world”. Credit: Steve Markham/AAP

“So I think what it says is there’s significant work to be done, but the work needs to be done by the social media companies who are continuing to allow this to happen.”

Internal documents exposed

Internal documents revealed in US lawsuits are now adding fuel to that argument.

In a Los Angeles courtroom, a case brought by a young woman known only as “Kayley” alleged platforms were built to hook users, fuelling addiction and a mental health crisis.

She claimed the algorithms didn’t just keep her scrolling — they hooked her, causing anxiety, depression and even suicidal thoughts.

She took on the tech giants: Instagram, owned by Meta, and YouTube, owned by Google.

After a six-week trial and nine days of deliberations, a California jury found Meta Platforms and Google liable.

During the discovery process in that and other lawsuits, internal documents revealed long-running concerns inside tech companies.

In a 2017 email, one employee wrote: “oh good, we’re going after <13 year olds now?”

A colleague replied: “zuck (Meta CEO Mark Zuckerberg) has been talking about that for a while,” prompting the response: “yeah it was gross the last time he mentioned it”.

Another internal document states: “Instagram is an inevitable and unavoidable component of teens’ lives. Teens can’t switch off from Instagram even if they want to”.

“It’s the disconnect between these documents that show the truth of what’s going on in these companies,” Mason said. “And then the face of Antigone Davis, head of safety for Meta, who stands in front of our government and says, ‘I don’t think social media has done harm to our children’.

Antigone Davis, Meta’s Head of Global Safety. Credit: AP

“I mean, it riles me so much because I just think, how can you as a company with that face say those things in circumstances where you well know the truth of the way you are functioning as a company?

“I think it’s such a disconnect. It’s a slap in the face to parents.”

In a 2020 exchange, one employee wrote: “oh my gosh y’all IG (Instagram) is a drug”.

A colleague responded: “Lol, I mean, all social media. We’re basically pushers.”

The verdict so far: some gains, but a long way to go, and a growing question over whether the platforms — not just the laws — will be forced to change.

Full responses from tech companies

Snap Inc

“The safety and wellbeing of young people has always been a priority at Snapchat. That’s why we continue to work with them and leading experts to build tools and features that support better, healthier experiences.

“Snapchat continues to implement reasonable measures consistent with the social media minimum age law, and we support its goal of improving online safety for young Australians. Age assurance remains a complex, industry-wide challenge, and we are actively improving our approach as we learn more. Since the legislation’s introduction, we have said there is a more effective way to deliver these protections, such as app store-level age assurance.”

Meta on Australia’s social media law

“We’ve fundamentally redesigned the experience for teens on our platforms. This includes Teen Accounts, which automatically places young people into the most protective experience, with restricted messaging, sensitive content filtering and overnight notification limits. These protections are on by default and cannot be turned off by teens.

“In Australia, where under-16s are banned from social media, these protections apply to 16- and 17-year-olds, ensuring that young people permitted on our platforms still have built-in safeguards.

“These default protections work alongside Family Centre, which gives parents additional tools to supervise their teen’s experience, including time limits, content restrictions and messaging oversight.

“While we have existing measures to find and remove accounts belonging to people we believe are underage, we are also building and deploying additional AI-powered systems that detect and remove underage accounts at the under-13 boundary globally, and at under 16 in Australia.

“We continue to believe the most effective approach to age assurance is age verification at the app store level giving parents a single, consistent place to manage their children’s access to all apps and services, not just social media platforms.”

Meta on the California verdict

“We respectfully disagree with the verdict and will appeal. Teen mental health is profoundly complex and cannot be linked to a single app. We will continue to defend ourselves vigorously as every case is different, and we remain confident in our record of protecting teens online. For over a decade, we’ve listened to parents, worked with experts and law enforcement, and conducted in-depth research to understand the issues that matter most. We use these insights to make meaningful changes like introducing Teen Accounts with built-in protections and providing parents with tools to manage their teens’ experiences. We’re proud of the progress we’ve made, and we’re always working to do better.”

TikTok

“The safety of our community is the highest priority for TikTok. In Australia, TikTok is a 16+ platform, and we continue to proactively detect and remove suspected underage users. If anyone sees an account they believe shouldn’t be on TikTok, we encourage them to report it in-app.”

Disclaimer: An author of this piece and producer of the 7NEWS Spotlight documentary worked at Meta for 18 months.

If you need help in a crisis, call Lifeline on 13 11 14. For further information about depression, contact beyondblue on 1300 22 4636, or talk to your GP, local health professional or someone you trust.
