Revitalised Online Safety Act should help fact checkers turn the tide on misinformation

25 October 2024 | Team Full Fact

One year on from the Online Safety Act becoming law, it’s clear that the vast majority of misinformation tackled by fact checkers remains out of its reach. But it could have been so much better.

The previous Government’s 2019 Online Harms White Paper, which had treated online disinformation and misinformation as a type of harm, had proposed that “companies will need to take proportionate and proactive measures to help users understand the nature and reliability of the information they are receiving, to minimise the spread of misleading and harmful disinformation and to increase the accessibility of trustworthy and varied news content.” All this was to have been supported by a code of practice. 

However, by the time the Act eventually received Royal Assent in October 2023, these promising proposals had disappeared. Instead, we were left with a focus only on people who post knowingly false information intended to cause “physical or psychological harm” to “a likely audience”, rather than any attempt to deal with the wider harms to society caused by misinformation. 

The false communications offence is a case in point. We have raised concerns that this offence may be workable in specific harassment cases, but is too vague to be applied at internet scale. That concern appears to be borne out by the lack of prosecutions since the offence came into force in January: we are aware of just one successful prosecution, of a man caught red-handed live-streaming about a non-existent riot, and one high-profile arrest that led to no charge. The underlying challenge is that, in our experience, misinformation is often deliberately designed not to be outright false, but to create a false impression in the minds of its audience. Indeed, if this offence is given greater prominence by being designated a “priority offence”, there is a risk that internet companies become over-zealous in taking down posts suspected of being false, impinging on freedom of expression. That would be a step too far. We prefer a content-neutral approach, in which companies inform users and help them make up their own minds, to interventions that restrict what people can see and share. 

Meanwhile, the only other mentions of misinformation in the Act relate to a new Ofcom committee on disinformation and misinformation (which has yet to be established, though it is expected by the end of 2024) and to enhanced media literacy duties for the regulator.

As a result, much of the misinformation Full Fact sees is out of the Act’s scope. We need to move towards a regime in which internet companies have a legal duty to tackle the full range of misleading and harmful information spreading on their platforms.


Green shoots of reform?

Yesterday saw the first green shoots of reform: clause 123 of the Government’s newly published Data (Use and Access) Bill sets out proposals for “researchers” to be given access to internet companies’ data, by means of a new clause 154A in the Online Safety Act. This could be good news for the fight against misinformation, if the kind of work that we do falls within the scope of such access. But urgent clarity is needed on whether this provision will actually enable fact checkers to do their job more effectively, or whether it will be yet another missed opportunity. Beyond access to data, however, further reform is needed.

A crucial omission is any specific measure to tackle harmful health misinformation, such as vaccine conspiracy theories and fake cancer cures, which the Covid-19 pandemic demonstrated can cause clear and serious harm. We want to see the Act strengthened to mitigate the reach and impact of health misinformation, and to ensure all platforms set out clear policies to address it in their terms of service.

The August unrest across England following the murders in Southport also made clear that the Act’s lack of provision for information incidents poses a real risk to public safety. Ofcom, together with fact checkers, news organisations, service providers, law enforcement and community groups, needs clearer roles and systems to manage situations where online misinformation can spill rapidly into the real world with dangerous consequences.

Full Fact stands ready to work with the government and parliamentarians to strengthen the Online Safety Act and ensure that the UK is closer to a better, safer, and more resilient information environment by this time next year.
