Social media given “last chance” to tackle illegal posts
Online platforms must begin assessing whether their services expose users to illegal material by March 16, 2025, or face financial penalties, as the Online Safety Act (OSA) takes effect.
Ofcom, the regulator that enforces internet safety law in the UK, published its final code of practice for how companies deal with illegal online content on Monday.
Platforms have three months to carry out risk assessments identifying potential harms on their services, or they could be fined up to 10% of their global turnover.
Ofcom chief Dame Melanie Dawes told BBC News this was the industry’s “last chance” to make changes.
“If they don’t start seriously changing the way they run their services, then I think those demands for things like a ban on children using social media are going to get stronger,” she said.
“I am now asking the industry to act, and if they do not, they will hear from us with enforcement action from March.”
Under Ofcom’s rules, platforms will need to determine if, where and how illegal content may appear on their services, and the ways in which they will prevent it from reaching users.
According to the OSA, this includes content related to child sexual abuse material (CSAM), controlling or coercive behaviour, severe sexual violence, and promoting or facilitating suicide and self-harm.
But critics say the law fails to address a wide range of harms to children.
The Molly Rose Foundation – set up in memory of teenager Molly Russell, who took her own life in 2017 after being exposed to images of self-harm on social media – said the OSA had “deep structural issues”.
Andy Burrows, its chief executive, said the organization was “surprised and disappointed” by the lack of specific, targeted measures for platforms on dealing with suicide and self-harm material in Ofcom’s guidance.
“Strong regulation remains the best way to tackle illegal content, but it is simply unacceptable for the regulator to take a piecemeal approach to direct threats to life,” he said.
Children’s charity NSPCC has also expressed its concerns.
“We are very concerned that some of the largest services will not be asked to remove the most egregious forms of illegal content, including child sexual abuse material,” said the organization’s acting chief, Maria Neophytou.
“Today’s proposals will at best lock in the status quo, and at worst create a loophole that means services can avoid addressing abuse in private messages without fear of enforcement.”
The OSA became law in October 2023, after years of debate among politicians over its details and scope, and campaigns by people concerned about the impact of social media on young people.
Ofcom began consulting on its illegal content codes in November of that year, and says it has now “strengthened” its guidance for technology companies in several areas.
Ofcom says its rules include greater clarity around requirements for removing intimate image abuse content, and more guidance on how to identify and remove material related to women being forced into sex work.
It also includes child safety features, such as ensuring that social media platforms stop suggesting people befriend children’s accounts, and providing warnings about the risks of sharing personal information.
Some platforms must also use a technology called hash matching to detect child sexual abuse material (CSAM) – a requirement that now applies to smaller file hosting and storage sites.
Hash matching is where media is given a unique digital signature that can be checked against hashes of known content – in this case, databases of known CSAM.
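As a rough illustration of the matching step, here is a minimal, hypothetical Python sketch that assumes an exact-match scheme: it computes a SHA-256 hash of a file and looks it up in a locally held set of known hashes. Real CSAM-detection systems typically rely on perceptual hashes (such as Microsoft’s PhotoDNA) supplied through shared industry databases, which can still match images after resizing or re-encoding, so this is only an outline of the idea, not how production systems work.

```python
import hashlib
from pathlib import Path

# Hypothetical set of hashes representing known flagged content.
# Real deployments query maintained, shared databases and usually use
# perceptual hashes rather than an exact cryptographic digest.
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def file_hash(path: Path) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def is_known_content(path: Path) -> bool:
    """True if the file's hash matches an entry in the known-content set."""
    return file_hash(path) in KNOWN_HASHES
```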
Many major tech companies have already introduced safety measures for teenage users, including controls that give parents more oversight of their social media activity, in an effort to address risks to teens and pre-empt regulation.
For example, on Facebook, Instagram, and Snapchat, users under the age of 18 cannot be discovered in search or messaged by accounts they do not follow.
And in October, Instagram also started blocking some screenshots in direct messages, in an attempt to combat sextortion – which experts have warned is on the rise, and often targets young people.
Technology Minister Peter Kyle said Ofcom’s publication of its codes was an “important step” towards the government’s goal of making the internet safer for people in the UK.
“These laws represent a fundamental reset of society’s expectations of technology companies,” he said.
“I expect them to deliver and I will be watching closely to make sure they do.”
Concerns have been raised throughout the OSA’s journey about its rules applying to a wide range of diverse online services – with campaigners also frequently warning of the privacy implications of platform age verification requirements.
Parents of children who died after being exposed to illegal or harmful content have previously criticized Ofcom for moving at a “snail’s pace”.
The illegal content codes issued by the regulator will still need Parliament’s approval before they come into full force on March 17.
But platforms are being told now to assume the codes will pass through Parliament without issue, and that they must take measures to prevent users from accessing banned material by that date.