Age Verification, Privacy, and the Online Safety Act — Are We Ready?

Jul 17, 2025

As the Online Safety Act rolls into enforcement, it brings with it a sharp new focus on age verification. But can the UK’s digital infrastructure handle it — without sacrificing privacy?

On 25 July 2025, the UK takes its next big step in online child safety enforcement. Under Part 5 of the Online Safety Act (OSA), providers of pornographic content will be legally required to implement “highly effective” age assurance methods — either age verification, age estimation, or both. It’s a long-awaited regulatory push to prevent under-18s from accessing explicit material online.

But as Ofcom’s consultation unfolds, it’s clear: this isn’t just a compliance box to tick. It’s a litmus test for how we balance safety with digital rights — privacy, identity, and data protection.

What the OSA Actually Demands

This isn’t soft-touch regulation. Under Section 81, providers must ensure that children are “not normally able to encounter” pornographic content. The age checks used must be “highly effective at correctly determining whether or not a user is a child.”

This is about more than putting up age-gates or asking for a date of birth. It demands sophisticated systems that actually work — with consequences for failure.

But here’s the challenge: most traditional verification tools fall short on either effectiveness or privacy.

The Age Assurance Toolbox — And Why It’s Flawed

Let’s take a quick look at the options on the table — and why none are quite the silver bullet yet:

  • Facial Age Estimation
    Fast and frictionless, but struggles with edge cases (15 vs 18?). Raises questions around the storage of facial data.

  • Open Banking & Credit Checks
    Offers real ID links, but sharing financial data for content access feels excessive and risks excluding unbanked users.

  • Photo ID Matching with Liveness
    Effective — if privacy-preserving. However, this depends on the secure handling of ID documents, which many providers aren't equipped for.

  • Mobile Network Checks
    Better than nothing, but only works if the account matches the actual user.

  • Email, Cookies, or Device Metadata
    Easy to spoof. Often meaningless. Not fit for purpose.

The Case for Continuous Biometric Assurance

At YEO Messaging, we believe the future lies not in static, one-off checks, but in continuous verification.

With our patented Continuous Facial Recognition (CFR), access is bound to a real, present, verified user: not once at login, but throughout the entire session.

Unlike one-off verifications, CFR ensures minors can’t gain access via shared devices or borrowed logins. It eliminates the all-too-common workaround of an older sibling or friend “verifying” and then handing over the screen.

And because YEO’s facial data is processed on-device, not stored in the cloud, we meet the OSA’s call for privacy by design.
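The session model described above can be sketched in a few lines. This is an illustrative sketch only, not YEO's actual implementation: the frame source and the matcher (`matches_enrolled_user`) are hypothetical stand-ins for on-device camera capture and face matching, and in this model no biometric data ever leaves the device.

```python
from typing import Callable, Iterable

def continuously_verified(
    frames: Iterable[object],
    matches_enrolled_user: Callable[[object], bool],
) -> bool:
    """Grant access only while every periodically sampled frame matches
    the enrolled, age-verified user; a single mismatch ends the session."""
    for frame in frames:
        if not matches_enrolled_user(frame):
            return False  # device handed to someone else: lock the session
    return True

# Demo with boolean stand-in "frames": the third periodic check fails,
# e.g. the verified user hands the screen to someone else mid-session.
print(continuously_verified([True, True, False], lambda f: f))  # False
```

The key design point is that verification is a property of the whole session, not of the login event: a one-off check returns once and is forgotten, whereas here access lapses the instant a check fails.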

Privacy-Respecting, Child-Protecting

Here’s what any compliant and ethical age assurance strategy must offer:

✅ Real-time, verified identity
✅ Frictionless user experience
✅ No invasive data trails
✅ Built-in transparency and accountability
✅ Localised control over data — not global data harvesting

OSA compliance shouldn’t come at the cost of trust. At YEO, we’re proving it doesn’t have to.

What Comes Next?

With the 25 July 2025 deadline fast approaching for regulated providers, this is a moment of reckoning — and an opportunity.

If the UK gets this right, it could set a global benchmark for safe, private, and effective age assurance.

But it’ll take more than ticking boxes. It’ll require platforms to rethink how identity and access control work — and embrace a new standard.

YEO is ready.

Are you?
