What does Ofcom’s new safety code mean for tech firms?

Tech companies could face fines or UK-wide bans if they fail to protect young users.

Written by Alice Martin
Published on 28 April 2025

Ofcom has announced 40 new measures for tech companies under the Online Safety Act, which will come into force on 25 July. The move marks the next step in regulating young people’s internet use in the UK.

The measures follow earlier rules introduced in March, such as requiring businesses to assign someone responsible for online safety and giving users the ability to report inappropriate material.

The latest measures are designed to protect under-18s online, as part of a wider global push to hold tech platforms more accountable for user safety. Below, we’ll explain what’s in the code and how to know whether your business needs to follow it.

What are the new Ofcom rules?

Ofcom’s new Codes of Practice include 40 rules that tech companies must follow to keep young people safe online. The rules were designed in consultation with 27,000 children and 13,000 parents.

Taking a ‘safety-first’ approach, here are some of the rules Ofcom has outlined for tech firms to follow:

Filtered content — tech companies must configure their algorithms to filter harmful content out of children’s feeds
Age checks — risky online platforms must implement strict age checks to determine which users are under the age of 18.
This means that children can be protected from inappropriate content while adults retain access
Easier reporting — online platforms must make it straightforward for young people to flag content and make complaints, and providers must respond in a timely and appropriate manner

These are just three of the 40 measures in the new Codes of Practice, announced on 24 April. Owen Bennett, Head of International Safety at Ofcom, wrote on LinkedIn: “Our new measures – which are informed by the views and experiences of close to 30,000 children – spell out how companies can meet those responsibilities. If companies fail to comply with their new duties, Ofcom has the power to impose fines and – in very serious cases – apply for a court order to prevent the site or app from being available in the UK.”

How will platforms be affected?

Tech firms will need to change the way they operate in line with the new rules from 25 July. Companies that offer online communities, chat features, ecommerce functionality, or user-generated content need to evaluate their safety measures.

By 24 July, they must record an assessment of the risks their services pose to children, then apply the rules set out in Ofcom’s Codes of Practice to mitigate the identified risks. Ofcom can request a copy of the assessment at any time, so it’s important to comply.

Compliance will look different from company to company, depending on the online services provided. An online community app, for example, would need to implement age checks at sign-up and make group chats optional to join. A fitness platform, meanwhile, may need to monitor the chat function during live-streamed workouts more rigorously for harmful content.

In line with the Codes of Practice, tech platforms may need to redesign their services to add the tools or features necessary for compliance.
This may include features such as content reporting, filtering systems, and a safer approach to instant messaging.

Stricter rules around online content may feel like more red tape for smaller firms, as they will likely have to invest in redesigning their services to align with the new measures. Even so, it’s important to comply: otherwise, Ofcom may impose fines or block access to your site or app in the UK.

What do online small businesses need to know?

Beyond compliance, the new rules represent an important step toward creating a safer online environment for young people.

As it becomes increasingly normal for children to inhabit digital spaces, platforms must ensure that content is age-appropriate, uphold clear community guidelines, and offer accessible reporting tools to protect young users.

For small online businesses, this is also a valuable opportunity to build trust with younger audiences and their parents. Taking a transparent, safety-first approach can not only strengthen your reputation and customer loyalty, but also help you stay on the right side of Ofcom’s regulations.