The Online Safety Act and Misinformation: What you need to know
What is the Online Safety Act?
The Online Safety Act 2023 (OSA) introduced new rules requiring internet companies to protect their users from harm on their platforms, including by tackling and removing illegal material online and by better protecting children. The OSA made Ofcom the UK regulator for online safety.
The Act creates a new duty of care for online platforms, requiring them to take action against illegal or “legal but harmful” content from their users. Illegal content must be taken down; users must be given the ability to filter out “legal but harmful” content. Platforms that fail in this duty are liable to fines of up to £18 million or 10% of their annual turnover, whichever is higher.
Ofcom is empowered to block access to particular websites that fall foul of the regulations. However, the Act obliges large social media platforms not to remove, and to preserve access to, journalistic or “democratically important” content. This means that content posted on social media by news organisations, political parties and politicians cannot be removed - including any user comments beneath such posts.
The OSA also created a new “false communications offence” for sending a message (including posting on social media) that conveys information the sender knows to be false, with intent to cause “non-trivial psychological or physical harm” to a likely audience. Again, there are journalistic and “democratically important” exclusions from this offence. When this offence was originally proposed, we had significant concerns about it - from both a freedom of speech and a burden of proof perspective.
Throughout 2024 and beyond, Ofcom is consulting on and developing the regulation. Once the regulation has been implemented, Ofcom will continue to oversee these changes and, where relevant, sanction companies that do not follow the new rules.
How did Full Fact campaign to change the Online Safety Bill?
Across 2022 and 2023, Full Fact worked with MPs and Peers from across the political spectrum to improve the legislation. We helped to table amendments that would better protect us all from harmful health misinformation, improve freedom of speech, and end internet companies’ ability to make unaccountable decisions for UK internet users from offices on the other side of the world. Some of these made it into law; some didn’t.
Full Fact was instrumental in making sure that Ofcom’s media literacy duties were updated to include social media and search platforms. This means that Ofcom is required to help the public establish the reliability, accuracy and authenticity of information online, and to understand how to better protect themselves from misinformation.
Full Fact also campaigned to include protections from health misinformation in the Online Safety Bill. We originally secured a commitment from the then Government to address online health misinformation, but it reneged on this promise. Despite campaigning by Full Fact, leading health charities, and members of the House of Lords, the Government did not listen, leaving the public with no protections in the new regulations.
Does the Online Safety Act do enough to protect us from harmful misinformation?
The OSA should have been a pivotal moment in the way we tackle the harms caused by misinformation. However, the final Act falls short of the former Government’s original aim of making the UK “the safest place to be online.”
The OSA contains no credible plan to tackle the harms from online misinformation, which continues to leave the public vulnerable and exposed to online harms. The only references to misinformation in the Act concern setting up a committee to advise Ofcom, and changes to Ofcom’s media literacy policy.
The Act does not address health misinformation, which the Covid-19 pandemic demonstrated can be seriously harmful. It also sets out no new provisions to tackle election disinformation (unless it amounts to a foreign interference offence), nor misinformation spread during ‘information incidents’, when information spreads quickly online, such as during terror attacks or the August 2024 riots following the Southport murders. The OSA also does not extend to most harms from generative AI misinformation.
Finally, and most importantly, the Act does not ensure that researchers and fact checkers have timely access to data from online platforms and search engines about misinformation and disinformation circulating on their platforms. Fact checkers can do a much better job when they have better access to data about what the most harmful content is, who is seeing it, and how it is spreading.
Fact checkers need increased access to data that is not currently shared with us, to help us decide what is most important to check each day. Platforms have dashboards of this kind of data, showing which trends, topics and narratives are forming. We can have more impact if we are better directed to the most important claims.
However, right now the platforms are moving in the opposite direction and shutting down services designed to help fact checkers.
If the Government is serious about tackling misinformation, it must step in to require internet companies to provide this kind of access.
Does the Online Safety Act protect freedom of expression online?
Under the OSA’s approach to protecting freedom of expression online, internet companies will have to decide what content is not allowed on their platforms, set this out in their terms of service, and apply those terms consistently when managing content that breaches them.
However, there is a lack of regulatory oversight of what companies include in their terms of service and how they enforce them. This will neither prevent misinformation from spreading nor protect freedom of expression.
Will Full Fact continue to campaign to tackle misinformation?
Full Fact has long campaigned for regulation that tackles misinformation and protects freedom of expression online, and we will continue to do so: finding solutions to the spread of misinformation is central to our mission. We proactively raise the issue with decision makers at every opportunity, and run public campaigns when appropriate.
Since the Act received royal assent, our first priority has been working with Ofcom as it consults on and develops the regulation, including the Advisory Committee on Disinformation and Misinformation and the new media literacy measures. We will monitor internet companies’ efforts to comply with the rules and work to make sure Ofcom runs the new regime effectively.