UK under-16 social media ban: Lords vote and what next

Published on January 26, 2026

A UK ban on social media for under-16s has moved from “talking point” to live legislative territory.

On 21 January 2026, the House of Lords backed an amendment that would push the government towards banning under-16s from social media platforms, as part of the government’s schools bill. The government has indicated it will try to overturn the amendment in the Commons while running a consultation on potential changes of its own.

That sets up a politically awkward moment. Momentum at Westminster has grown in part due to Australia’s recent move to restrict under-16 access to major platforms, and because cross-party groups of MPs have been publicly urging action.

But here’s the catch: the internet is extremely good at routing around simplistic solutions.

As Pepper has highlighted in recent media commentary, this debate needs more nuance than a binary “ban or no ban” framing. The central question should not be “does it sound tough?” It should be “does it make children safer in practice?”

What the Lords actually backed

Peers supported an amendment that would require the government to decide which platforms should be unavailable to under-16s and to enforce that restriction through “highly effective” age checks, with a year to implement the approach.

Supporters argue the harms are urgent, linking teen social media use to mental health issues, classroom disruption, and exposure to harmful content.

Critics, including children’s charities and some peers, warn that a blanket ban risks unintended consequences, including pushing young people towards less regulated platforms and removing access to some of the positive aspects of social media.

What happens next in Parliament

Because this change was made in the Lords, it still needs to survive the Commons. The government has said it will not accept the amendment at this stage and is pursuing a consultation route, which makes the next Commons stages potentially contentious, especially with a number of MPs publicly sympathetic to a ban.

The enforcement problem: bans don’t delete risk, they move it

The hardest part of any under-16 ban is not announcing it. It’s enforcing it without creating bigger problems.

Age verification in the real world can be inconsistent and easier to bypass than policymakers expect. Selfie-based checks can be fooled with filters, and other methods can be sidestepped with simple workaround apps, so enforcement often isn’t watertight.

Pepper’s co-founder Beckii has raised a key concern: bans don’t remove risk, they often relocate it. If children are pushed off regulated, mainstream platforms, are we unintentionally driving them towards more dangerous, unregulated corners of the internet that sit outside UK law and are far harder to police? Those spaces often contain more extreme content and fewer safeguards.

That’s not a theoretical worry. It’s a known pattern in online safety: when one route is blocked without addressing the underlying behaviour and incentives, displacement is a real possibility.

Lessons from Australia: useful signal, not a plug-and-play template

Australia is being used as proof that a ban is “doable”, but the implementation details matter.

In practice, enforcement relies on platforms taking meaningful steps to prevent under-16 account access and on age assurance mechanisms that are both effective and proportionate. The experience so far underlines the same central challenge: if age checks are weak, the policy becomes symbolic; if they’re intrusive, privacy risks rise and public trust falls.

So yes, the UK should learn from Australia — but “learn from” is not the same as “copy-paste”.

The missing piece in the UK debate: children are not only consumers of content

Most proposals focus on children as users of social media. That’s only half the story.

Children are also participants in content, sometimes in monetised brand campaigns, family vlogging, and influencer marketing. And here the gap is stark: in the UK, children have structured protections in traditional industries like TV, film, theatre, and modelling. But when children are featured in social media content, those protections are far thinner and often voluntary.

That regulatory mismatch is becoming more urgent every year.

Where Pepper is focusing: the Responsible Kidfluence Code

This is why we created the Responsible Kidfluence Code, a voluntary framework designed to raise standards for brands, marketers, and parents involved in commercial content featuring children.

The Code prioritises children’s wellbeing, safety, privacy, and financial fairness. It’s about raising standards now, while pushing for better regulation long term.

Pepper also supports evidence-led policymaking through engagement with the APPG for Children’s Online Safety, helping keep the focus on practical safeguarding, not just political signalling.

What would a “make kids safer in practice” approach look like?

A ban might end up being part of the answer. But on its own, it is not a safety strategy. It’s a gate.

A practical approach blends enforcement with design and accountability, for example:

  • Privacy-preserving age assurance that doesn’t become a de facto identity dragnet
  • Stronger enforcement of existing child safety rules, with meaningful consequences when platforms fail
  • Feature-level protections targeting compulsive mechanics (infinite scroll, autoplay loops, aggressive recommendations)
  • Specific rules for commercial content involving children, so child safeguarding doesn’t depend on whether a set looks like a TV studio or a kitchen vlog

Why this matters to marketers and children’s brands

If the UK moves toward an under-16 social media ban (or even tighter age-gating and feature restrictions), it won’t just change how young people use platforms. It will change how brands can legally, safely, and credibly show up in culture.

For marketers, this lands in three big places:

1) Audience, targeting, and measurement get harder (fast)
Expect tighter age assurance, reduced access to interest-based targeting for young users, and more “unknown age” audiences as platforms become cautious. That means less precision, patchier reporting, and more reliance on contextual placement, first-party data, and brand-led content strategies that don’t depend on targeting minors.

2) Creator partnerships will need stronger child-safety standards
Even if children can’t hold accounts, children can still appear in content (family creators, talent-led campaigns, brand shoots, UGC-style ads). That raises the bar on safeguarding, privacy, consent, and fair treatment. Brands will need clearer checks around when a child is featured, how their data is handled, what’s disclosed, and what protections are in place on set and in post-production.

3) The risk shifts from “compliance” to “credibility”
Public scrutiny is rising. A campaign can be technically within platform rules and still look irresponsible to parents, regulators, and the press. The reputational cost of getting this wrong is now comparable to a major product recall: rapid escalation, intense emotion, and long memory.

What good looks like right now
This is exactly why frameworks like the Responsible Kidfluence Code matter: they give brands and agencies a practical standard for safer, fairer child-involved content while regulation catches up, helping marketers move from “Are we allowed?” to “Is this responsible, defensible, and in the child’s best interests?”

Final takeaway

The Lords vote shows the political appetite for an under-16 social media ban is real, but the hard part is not announcing a ban. It is making children safer in practice.

If enforcement is weak, the policy becomes symbolic and easy to bypass. If it is heavy-handed, it risks privacy harm and displacement to less regulated spaces. Either way, brands and marketers should prepare now. Expect tighter age-gating, messier targeting and measurement, and much higher expectations around child safety and fairness in any content featuring children.

The smartest move is not waiting for the law to force better behaviour. It is raising standards immediately so that child wellbeing, privacy, and responsible commercial practice are built in, not bolted on.

