Author: Sarah-Jane Cecen
Key contacts: Declan Goodwin, Rachelle Sellek
Did you know that almost a quarter of children aged 5 to 7 in the UK already own a smartphone? By the age of 11 this figure rises above 90%. Add in iPads, gaming consoles, and social media, and children are deeply embedded in the digital world.
The Online Safety Act 2023 aims to respond to this reality. It imposes legal duties on platforms, app developers, and online services to ensure that children are not exposed to harmful or inappropriate content. This is not a superficial compliance exercise. It is a strategic, legal, and reputational issue for every business operating in the digital space.
The consequences of failure are severe: fines of up to £18 million or 10% of global turnover (whichever is greater), service suspension, and even criminal liability for senior managers.
Why effective age assurance is essential for Online Safety Act compliance
Children move quickly, and they are often more tech-savvy than the systems designed to protect them. Ofcom has been clear that “highly effective” age assurance is now mandatory. A simple pop-up asking users to confirm they are over 18 no longer meets regulatory expectations.
Which online services are in scope under the Online Safety Act?
The Online Safety Act doesn’t just target the big-name platforms. It applies to any service that allows user-generated content or online interaction. This includes:
- Social media platforms (large and small).
- Video-sharing and streaming services.
- Gaming platforms and chat functions inside games.
- Search engines.
- Messaging and community apps where users can interact or share content.
One-to-one emails, SMS messages, internal business tools, and calls are out of scope. But if your service allows users to post, share, or interact in any way, the OSA will apply.
For businesses outside the UK, there’s no hiding behind geography. The Act has extraterritorial reach. If your service is accessible in the UK or targets UK users, you are in scope, whether your HQ is in London or Los Angeles.
Why businesses cannot afford a tick-box approach to OSA compliance
The era of tick-box thinking is over. Here’s why:
- Legal risk: Fines of up to £18 million or 10% of global turnover. Token attempts at compliance won’t save you. Ofcom is already investigating over 30 adult sites for inadequate age verification.
- Operational challenges: Moderation, reporting, and robust age assurance require tech, policy, and governance. No shortcuts.
- Commercial consequences: Third-party developers and content providers must share responsibility. Contracts and agreements need updating to reflect this.
- Reputation: Parents, regulators, and advertisers reward trust. Fail here, and social media backlash alone could cause lasting commercial damage.
Why basic age gates are no longer enough
We all know the scene: a pop-up appears — “Are you over 18?” — a child taps “Yes”, and they’re in. But Ofcom has made it crystal clear: these ‘measures’ are not sufficient. The law now requires ‘highly effective’ age assurance.
The first major deadlines have already passed:
- 16 March 2025: in-scope services were required to have completed their illegal content risk assessments and begun implementing safety measures.
- 16 April 2025: services likely to be accessed by children had to complete a children’s access assessment, and where under-18s are likely users, a full children’s risk assessment.
What businesses must do to comply with the Online Safety Act
The OSA sets out a roadmap. Every provider within scope, whether a platform, gaming service, or search engine, must:
- Assess risk: Document where illegal or harmful content could appear, who could access it, and what systems you have in place.
- Put in proportionate safety measures: That means content moderation, reporting systems, complaints mechanisms, and design features that work.
- Clarify terms of service: Users (and regulators) must be able to see how your platform protects them, how illegal content is handled, and how complaints are escalated.
- Keep records: Ofcom expects evidence – your assessments, your decisions, your mitigation steps.
- Embed governance at board level: This is not an IT department problem. It’s a strategic, boardroom issue.
- Meet enhanced duties if large or multi-risk: for these services the bar is higher, with extra obligations such as automated hash-matching for child sexual abuse material and more frequent audits.
Beyond compliance: Thinking outside the tick box
Parents, regulators, advertisers, and investors are watching closely. A superficial pop-up or disclaimer won’t protect children, and it won’t protect you from the impacts of the OSA.
By contrast, businesses that get ahead of the curve will gain a competitive edge. Those that go beyond tick boxes will be seen as leaders in online safety, trusted by users, regulators, and the market alike.
How we help businesses meet Online Safety Act requirements
At Acuity Law, we help businesses turn compliance into strategy:
- Audit & Risk Reviews: Practical assessments to identify exposure and solutions.
- Policy & Contract Updates: Embedding compliance into your agreements with third-party developers and content providers.
- Age Assurance Guidance: Clear advice on what qualifies as ‘highly effective’ in practice.
- Governance Frameworks: Putting compliance on the board agenda, with evidence regulators will accept.
We work with businesses not just to meet the OSA’s requirements but to use compliance as a tool to build trust, resilience, and competitive advantage.
Act now to stay ahead of OSA requirements
The regulatory clock is ticking. Speak to our Commercial & Technology team to ensure your platform, processes, and governance are fully aligned with the Online Safety Act. We’ll help you close compliance gaps before they become business risks.