
UK Online Safety Act: What It Means for Your Privacy in 2026

Lunyb Security Team · 10 min read

The UK Online Safety Act is the most sweeping piece of internet regulation Britain has ever passed. Designed to protect children and tackle illegal content, it also reshapes how platforms handle your messages, your identity and your data. For everyday users, the consequences for privacy are significant — and often misunderstood.

This guide breaks down, in plain English, what the Online Safety Act actually requires, how Ofcom is enforcing it, and what it means for your privacy when you browse, message and share links online in 2026.

What is the UK Online Safety Act?

The Online Safety Act 2023 is a UK law that places legal duties on online services — including social media platforms, search engines, messaging apps and file-sharing sites — to protect users from illegal and harmful content. Ofcom is the regulator responsible for enforcement, with the power to issue fines of up to £18 million or 10% of global annual turnover, whichever is greater.

The Act came into force in stages, with the most consequential duties — illegal content, child safety and age assurance — becoming enforceable through 2025 and into 2026. It applies to any service with a meaningful UK user base, regardless of where the company is based.

The three main categories of regulated services

  1. User-to-user services — social networks, forums, messaging apps, link-sharing platforms.
  2. Search services — general-purpose search engines.
  3. Services likely to be accessed by children — anything from gaming platforms to adult sites, which face additional age-verification duties.

The Core Duties That Affect Your Privacy

Most public debate has focused on harmful content, but the Act's day-to-day impact on privacy comes from four specific duties imposed on platforms.

1. Illegal content duties

Platforms must proactively identify and remove illegal content — including terrorism material, child sexual abuse imagery, fraud, and content encouraging self-harm. In practice, this means more automated scanning of what you post, share or link to.

2. Age assurance and age verification

Services likely to be accessed by children, and all pornography providers, must use "highly effective" age checks. This can include facial age estimation, photo ID uploads, credit card checks or third-party digital ID providers. The result: more services are now asking British users to prove who they are before granting access.

3. Risk assessments and transparency reporting

Larger "Category 1" platforms must publish annual risk assessments and transparency reports. While positive for accountability, these require platforms to retain more data about user behaviour to demonstrate compliance.

4. The encryption-scanning power (Section 121)

The most controversial provision allows Ofcom to require services to use "accredited technology" to scan private messages for child sexual abuse material — even in end-to-end encrypted services. The government has said it will not use this power until it is "technically feasible" to do so without breaking encryption, but the legal power remains on the books.

How the Act Changes Your Online Experience

If you live in the UK, you've almost certainly noticed changes already. Here's a snapshot of what's different in 2026 compared with pre-Act browsing.

| Activity | Before the Act | Under the Online Safety Act |
| --- | --- | --- |
| Visiting an adult site | Self-declared age check | Mandatory ID, facial scan or card verification |
| Using Reddit, X, Discord | Open access | Age estimation for certain content categories |
| Private messaging | Encrypted, untouched by platforms | Potential client-side scanning under Section 121 |
| Posting links | Light moderation | Proactive scanning for illegal-link signals |
| Anonymous accounts | Widely permitted | Category 1 platforms must offer user verification options |

The Privacy Trade-Offs Explained

The Online Safety Act is built on a tension: protecting users from harm often requires collecting more information about them. Below are the main trade-offs every UK internet user should understand.

Age verification means more identity data in circulation

To watch an over-18 video or access certain forums, you may now hand over a passport scan, driving licence, or biometric facial estimation. Even when platforms use third-party verifiers, you are creating new copies of sensitive identity data — and new targets for hackers.

Reputable providers use "double-blind" architectures, where the verifier doesn't know which site you're visiting and the site doesn't see your ID. But not every provider meets that standard.
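To make the "double-blind" idea concrete, here is a deliberately simplified Python sketch of the token flow: the verifier sees your ID but issues only an unlinkable "over-18" token, and the site checks that token without ever seeing your identity. This is a toy illustration of the architecture, not any real provider's protocol (the key handling and token format are invented for the example; real systems use asymmetric or zero-knowledge cryptography rather than a shared key).

```python
import hashlib
import hmac
import secrets

# Hypothetical shared verification key, distributed to sites out of band.
# Real double-blind systems would use public-key or zero-knowledge schemes.
VERIFIER_KEY = secrets.token_bytes(32)

def issue_token(passed_age_check: bool):
    """Verifier side: sees the ID document, returns only a blind token.

    A fresh nonce per check means two tokens from the same person
    cannot be linked to each other, or back to the ID.
    """
    if not passed_age_check:
        return None
    nonce = secrets.token_bytes(16)
    sig = hmac.new(VERIFIER_KEY, b"over18:" + nonce, hashlib.sha256).digest()
    return nonce + sig

def site_accepts(token: bytes) -> bool:
    """Site side: validates the signature; learns nothing about the user."""
    nonce, sig = token[:16], token[16:]
    expected = hmac.new(VERIFIER_KEY, b"over18:" + nonce, hashlib.sha256).digest()
    return hmac.compare_digest(sig, expected)
```

The key property is separation: the verifier never learns which site the token is presented to, and the site never handles identity documents at all.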

Encryption is legally vulnerable, even if technically intact

End-to-end encryption (used by WhatsApp, Signal, iMessage and others) is not banned. But Section 121 creates a legal mechanism that could, in future, require client-side scanning — software on your device that checks messages before they're encrypted. Privacy advocates argue this fundamentally breaks the security model of E2EE.
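The structural point of client-side scanning can be shown in a few lines of Python. This is a toy sketch of the concept only, not any real or "accredited" technology: the message is checked against a blocklist of hashes on the device, before encryption ever happens, so the encryption itself is untouched while the plaintext is still inspected.

```python
import hashlib

# Invented blocklist for illustration; real systems use large databases
# of perceptual hashes, not a single SHA-256 digest.
BLOCKLIST = {hashlib.sha256(b"known-illegal-sample").hexdigest()}

def send_message(plaintext: bytes, encrypt):
    """Hash-check the plaintext on-device, then encrypt and 'send'.

    Returns None if the message is flagged -- the check runs before
    encryption, which is exactly why critics say it bypasses E2EE.
    """
    digest = hashlib.sha256(plaintext).hexdigest()
    if digest in BLOCKLIST:
        return None  # flagged before encryption ever happens
    return encrypt(plaintext)

# Stand-in "encryption" for the sketch; a real app would use E2EE here.
ciphertext = send_message(b"hello", lambda m: m[::-1])
```

Note that nothing about the encryption step changed; the privacy concern is entirely about what runs on your device before that step.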

More data retention, longer logs

To comply with risk assessments and respond to Ofcom requests, platforms are retaining more behavioural data — what you click, what you share, who you talk to. This data may be anonymised, but anonymisation has well-documented limits.
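Those limits are easy to demonstrate. The toy linkage check below, using invented records, shows how a handful of "harmless" quasi-identifiers (postcode area, birth year, device type) can single out individuals even after names are stripped; this is the standard re-identification argument, not a claim about any specific platform's logs.

```python
from collections import Counter

# Invented "anonymised" behavioural records -- no names, yet still risky.
records = [
    {"postcode": "SW1A", "birth_year": 1987, "device": "iPhone"},
    {"postcode": "SW1A", "birth_year": 1987, "device": "Android"},
    {"postcode": "M1",   "birth_year": 1990, "device": "iPhone"},
]

def unique_fraction(rows, keys):
    """Fraction of rows whose quasi-identifier combination is unique,
    i.e. potentially re-identifiable by linking to another dataset."""
    combos = Counter(tuple(r[k] for k in keys) for r in rows)
    unique = sum(1 for r in rows if combos[tuple(r[k] for k in keys)] == 1)
    return unique / len(rows)
```

With only postcode and birth year, one of the three records is already unique; adding device type makes every record unique. Real behavioural logs have far more columns than this.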

Chilling effects on speech

Privacy isn't just about data — it's about the freedom to think and speak without surveillance. Several smaller forums (including some hobbyist and LGBTQ+ communities) have shut down or geo-blocked UK users rather than face compliance costs. Wikipedia has publicly questioned whether it can comply without breaching its own principles.

What the Act Does NOT Do

Misinformation about the Act spreads quickly, so it's worth being precise about what's not in the law.

  • It does not ban encryption. WhatsApp and Signal remain legal and operational in the UK.
  • It does not require a national ID for general browsing. Age checks apply to specific content categories, not the open web.
  • It does not criminalise individual users for most categories of "harmful but legal" content — the duties fall on platforms.
  • It does not give the government direct access to your messages. Any scanning powers flow through Ofcom and accredited technology, not GCHQ or the Home Office directly.

How to Protect Your Privacy Under the Act

You can't opt out of a UK law, but you can make sensible choices that minimise how much of your identity and behaviour ends up in third-party databases.

1. Use age-verification providers with strong privacy practices

When an age check is unavoidable, prefer providers that offer:

  • On-device facial age estimation (data never leaves your phone).
  • Zero-knowledge tokens — you receive a "yes, over 18" credential rather than sharing your ID with every site.
  • Clear data-deletion timelines (ideally minutes, not months).

2. Be deliberate about which platforms hold your real identity

Category 1 platforms must offer user verification, but verification should be opt-in. Don't verify accounts you don't need to.

3. Use privacy-respecting tools for links and sharing

When you share content, the metadata around your links — who clicked, from where, on what device — is often more revealing than the link itself. Privacy-focused URL shorteners like Lunyb minimise tracking and avoid building advertising profiles on your audience. We've covered the options in our 2026 buyer's guide to URL shorteners and an honest review of Lunyb.

4. Keep your messaging encrypted — and your apps updated

Signal, WhatsApp and iMessage remain end-to-end encrypted in the UK. Keep them updated; if any provider is ever compelled to introduce client-side scanning, they will likely publish notices or — as Signal has stated — withdraw from the UK market entirely rather than comply.

5. Understand your data rights

The Online Safety Act sits alongside the UK GDPR. You retain the right to access, correct and delete personal data held by platforms, and to complain to the Information Commissioner's Office (ICO) if you believe a platform is over-collecting.

Pros and Cons of the Online Safety Act From a Privacy Lens

Pros

  • Forces platforms to take child safety seriously, with real financial penalties.
  • Mandates transparency reports, giving researchers and the public more visibility.
  • Provides a legal route to remove non-consensual intimate imagery and cyberflashing.
  • Establishes a clear regulator (Ofcom) with technical expertise.

Cons

  • Section 121 creates a legal pathway to undermine end-to-end encryption.
  • Mandatory age checks expand the surface area for identity-data breaches.
  • Compliance costs push smaller, community-run platforms out of the UK market.
  • Automated content scanning is prone to over-removal of legitimate speech.
  • "Likely to be accessed by children" is broadly defined and affects general-purpose services.

How the UK Approach Compares Internationally

The UK is not alone — but its approach is distinctive.

| Jurisdiction | Key law | Encryption stance | Age verification |
| --- | --- | --- | --- |
| UK | Online Safety Act 2023 | Legal power to mandate scanning (dormant) | Mandatory for adult content and high-risk services |
| EU | Digital Services Act | Strongly protected; CSAM scanning proposal stalled | Voluntary, with EU digital identity wallet emerging |
| US | Section 230 + state laws (e.g. Texas, Utah) | Constitutionally protected speech context | State-by-state for adult content |
| Australia | Online Safety Act 2021 | Encryption broadly preserved | Trialling age checks for social media under-16s |

The Road Ahead: What to Watch in 2026 and Beyond

Several developments will shape how the Act actually affects your privacy over the next two years:

  1. Ofcom's Codes of Practice — these define what "reasonable" compliance looks like and are still being finalised across content categories.
  2. Legal challenges — civil liberties groups including Open Rights Group and the Electronic Frontier Foundation continue to challenge specific provisions.
  3. Technological feasibility of "safe" scanning — the encryption-scanning power activates only if Ofcom certifies a technology as effective. No such technology currently exists in the view of most cryptographers.
  4. Interaction with the EU AI Act and UK AI regulation — automated moderation tools will increasingly fall under overlapping rulebooks.
  5. Possible amendments — the Act has already been amended once; future governments may narrow or widen its scope.

Frequently Asked Questions

Does the UK Online Safety Act break encryption on WhatsApp or Signal?

Not currently. The Act gives Ofcom legal power to require scanning technology, but the government has stated it will not exercise that power until a method exists that doesn't undermine encryption. Signal has said it would leave the UK rather than implement client-side scanning, and WhatsApp has made similar statements.

Do I have to upload my passport to use the internet in the UK?

No. Age verification applies to specific services — primarily pornography, gambling and some social platforms when accessing adult content. General browsing, shopping and most everyday services do not require ID. Where checks are required, alternatives such as facial age estimation and credit card verification are usually offered.

Can the UK government read my private messages under the Act?

No, not directly. The Act doesn't give intelligence agencies new powers — those sit under the Investigatory Powers Act. The Online Safety Act regulates platforms, requiring them to address illegal content. Any scanning would be performed by the platform under Ofcom oversight, not by the government reading your messages.

What happens if a website ignores the Online Safety Act?

Ofcom can fine non-compliant services up to £18 million or 10% of global annual turnover. In serious cases, it can apply for court orders requiring UK internet service providers, payment processors and ad networks to block or de-monetise the service. Senior managers can also face criminal liability for certain breaches relating to child safety.

How can I share links privately without feeding analytics platforms?

Use a privacy-respecting URL shortener that doesn't build advertising profiles, doesn't sell click data, and offers clear data-retention policies. Tools like Lunyb are designed with this in mind. For a broader comparison, see our best URL shorteners guide and our review of Rebrandly in 2026.

Final Thoughts

The UK Online Safety Act is neither the dystopia some headlines suggest nor the harmless child-protection measure ministers describe. It is a serious shift in the regulatory architecture of the internet — one that genuinely tackles some harms while expanding the surface area for identity data, automated surveillance and centralised content control.

For the average British user, the practical advice is straightforward: be selective about which services hold your identity, prefer privacy-respecting tools where you have a choice, keep your encrypted messengers updated, and exercise your data rights under UK GDPR. Privacy in 2026 is no longer a default — it's a practice.

Protect your links with Lunyb

Create secure, trackable short links and QR codes in seconds.

Get Started Free
