
Ofcom Publishes Online Safety Codes: Tech Platforms Given Three Months to Comply


The U.K.’s communications regulator Ofcom has officially released the first set of online safety codes of practice, marking a significant step toward implementing the country’s Online Safety Act.

Social media platforms, adult websites, and other tech companies now face a compliance deadline of March 16, 2025, to address and mitigate illegal harms on their platforms or risk severe penalties, including fines of up to £18 million or 10% of global revenue.


The Online Safety Act, which passed in October 2023, places stringent new requirements on platforms to reduce harmful and illegal content, such as child sexual abuse material (CSAM), terrorism-related material, and online fraud. This week’s publication by Ofcom serves as the first formal guidance on how platforms must comply with the Act.

Dame Melanie Dawes, Ofcom’s Chief Executive, stressed the significance of this shift in regulatory oversight: “For too long, sites and apps have been unregulated, unaccountable, and unwilling to prioritize people’s safety over profits. That changes from today,” Dawes said. “The safety spotlight is now firmly on tech firms, and it’s time for them to act. We’ll be watching the industry closely to ensure firms match up to the strict safety standards set for them under our first codes and guidance.”

The newly published codes focus on illegal harm risk assessments, content moderation, and safety measures. Ofcom emphasized that platforms must take proactive steps to ensure user safety while improving tools to combat illegal activity. Platforms in scope include major players such as Meta, YouTube, OnlyFans, X (formerly Twitter), and Google.

By March 16, 2025, all relevant platforms must complete risk assessments to understand the impact of illegal content on users. From March 17, 2025, they must begin implementing measures to address these risks. Failure to comply could result in fines or, in severe cases, website blocking in the U.K.

Key measures outlined by Ofcom include:

  • Senior Accountability for Safety: Platforms must assign a senior executive responsible for ensuring compliance and safety standards are met.
  • Improved Moderation and Reporting: Tech firms must enhance content moderation practices, train staff to handle illegal content, and make reporting and complaints mechanisms easier to access and use. Robust performance targets will be required to ensure the quick removal of illegal material.
  • Child Safety Protections: Platforms where users can interact must ensure children’s profiles, locations, and connections are hidden by default. Non-connected accounts should not have direct messaging access to minors. Additionally, children must be provided with clear information about online risks when sharing personal data.

Ofcom has also indicated further requirements will roll out in Spring 2025, covering measures such as blocking accounts sharing CSAM, using AI to tackle illegal harms, and preventing the sharing of non-consensual intimate images through hash-matching and URL detection.

The penalties for failing to comply are significant. Ofcom can impose fines of up to £18 million or 10% of global annual turnover—whichever is greater. For platforms that persistently violate safety standards, Ofcom retains the power to block access to their sites within the U.K.

This regulatory pressure places tech giants in a difficult position, particularly amid ongoing debates around privacy and encryption. Critics argue the Act could weaken end-to-end encryption protections, while free speech advocates, such as X owner Elon Musk, have previously voiced resistance to similar regulatory measures.

Tim Henning, Director of the Association of Sites Advocating Child Protection (ASACP), urged tech companies to embrace the guidelines, noting the U.K.’s approach strikes a balance between safety and access.

“Unlike some ‘child protection’ proposals in the U.S. and elsewhere that are thinly veiled attempts at the outright prohibition of pornography, the U.K. is leaving the door open for responsible platforms and providers to serve the needs of consenting adults,” Henning stated. “We hope that implementing an economically feasible and successful approach to online child protection by the U.K. will inspire other jurisdictions to find more balanced ways of meeting this universal need.”

While the Online Safety Act is widely praised for addressing the urgent need to combat online harms, critics have expressed concerns about its implementation challenges. Balancing privacy protections with content moderation, especially in encrypted spaces, remains a contentious issue.

Additionally, there are questions about the effectiveness of enforcement, particularly for smaller platforms that lack the resources of tech giants. Digital rights organizations argue that sweeping regulations risk unintended consequences, such as over-censorship and disproportionate burdens on niche platforms.

Despite these concerns, Ofcom maintains that its codes will improve user safety, especially for children, while holding tech companies to account. “This is just the start,” Dawes noted, promising further requirements to strengthen protections in 2025.

The compliance timeline is clear: platforms have three months to assess the risks of illegal content and ensure compliance systems are ready. Ofcom has signaled it will closely monitor progress, with enforcement action likely for non-compliance after the March deadline.

For tech companies, this marks the beginning of a new era of accountability under the U.K.’s robust online safety framework—one that could influence similar legislation worldwide. As the legislation takes hold, all eyes will be on whether platforms rise to meet the challenge or face the consequences of non-compliance.