
Ofcom warns online firms face UK ban if they fail to comply with new protections for kids

Ofcom warned it will act if online providers do not address risks to children


New measures have been outlined by UK media regulator Ofcom to better protect children from harm when they are online, with tech firms that fail to comply facing fines and, in the most serious cases, having their services blocked in the UK.

Providers of online services that are likely to be accessed by UK children, including social media and gaming platforms, are now legally required to protect children from content that is harmful to them.

Ofcom has said the newly finalised measures, introduced under the Online Safety Act, will ensure children in the UK have safer online lives, and it has promised tough consequences for tech firms that don’t clean up their platforms in the coming months.

Providers of online services that may be used by children now have to complete and record children’s risk assessments by July 24.

Subject to the codes completing the parliamentary process in time, from July 25, online services will need to take the safety measures set out or use other effective measures to protect child users from harmful content.

Ofcom said it is ready to take enforcement action if providers do not act promptly to address the risks to children.

The Protection of Children Codes & Guidance published today build on the rules that Ofcom has already put in place to protect all users, including children, from illegal harms such as grooming and sexual exploitation.

In May 2024, Ofcom published proposals about the steps providers should take to address content harmful to children. Since then, it has been consulting with companies, children’s safety campaigners and other organisations, as well as children and their guardians.

If companies fail to comply with their new duties, the watchdog has the power to impose fines and – in very serious cases – apply for a court order to prevent the site or app from being made available in the UK.

Meanwhile, Ofcom is also consulting on proposals to extend the measures requiring services to let users block and mute accounts and disable comments to a wider range of services.

This is because Ofcom now considers it would be proportionate for these measures to apply to certain smaller services likely to be accessed by children.

Terry Green, social media partner at law firm Katten Muchin Rosenman UK LLP (Katten), said: “The 40 practical measures outlined by the regulator include much-needed steps to prevent children from encountering the most harmful content relating to the likes of self-harm, eating disorders, porn and suicide, while also protecting them from misogynistic, violent, hateful or abusive material, online bullying and dangerous challenges.

“The codes now demand a ‘safety-first’ approach when tech companies design and operate their services in the UK. These include producing safer feeds, conducting effective age checks, taking fast action to tackle harmful content, giving children more choice and support in their online experience, making it easier to report and complain, and ensuring that strong governance is respected.”
