How does the Full Fact team choose what to fact check?

3 April 2025 | Nasim Asl

As fact checkers, one of the questions we’re most often asked is: “How do you choose what to fact check?” 

It’s a fair question. There are dozens of interviews with politicians every day, from the morning broadcast round to scheduled political TV programmes, and when Parliament is sitting there are hours of footage from the House of Commons, the House of Lords and various committees. Thousands of articles are published every day in newspapers and online, and millions upon millions of social media posts are created and shared. 

Given the sheer volume of content, it’s obviously not possible for our team to go through more than a tiny fraction of it. But we do our best to spot, research and fact check the most significant and noteworthy claims.

This article is part of the #FactsMatter campaign, which is highlighting the important work we do at Full Fact and why we believe it matters. Over the course of the campaign we’ll be talking about how we check facts, the challenges we face in getting to the heart of evidence and the difference we can make when we do so. 

We’re asking people to share what we publish, sign up to our newsletter and tell the world why #FactsMatter more than ever. Find out more about the campaign and how you can support it here.


How we look for claims

Each morning, a member of the team follows broadcast media coverage of senior politicians as they go from studio to studio being interviewed on the topics of the day. We also run a ‘monitoring rota’, in which colleagues are responsible for scouring newspapers, websites and social media for claims. 

Our bespoke AI tools also help us find claims to check, especially claims that match those we’ve already fact checked (‘repeat claims’). They allow us to monitor at scale, and in particular to more effectively keep an eye on the transcripts of some TV broadcasts, as well as radio and podcasts. 

Despite this technological help, it’s still labour-intensive to find the right claims to look at. And once we’ve found them, we still need to research whether these claims are correct or incorrect. Sometimes something that seems like a factual claim is actually an opinion, and we can’t fact check those.

Choosing what to check

Unfortunately we don’t have the resources to write about every single incorrect claim we see. So how do we decide which to focus on? 

It’s not an exact science, but there are some important things we take into consideration. 

The prominence of a claim can determine whether we cover it over another. So we’re more likely to write about an incorrect claim by the Prime Minister on a national radio station than an incorrect claim made by a local councillor on local radio. In an ideal world we’d be able to cover both, but if a choice between the two has to be made, we’ll more often than not cover the claim with the wider reach and the higher-profile claimant. 

We’re also more likely to fact check a claim if we’ve seen it repeated often, or believe it will be in future. We’ve seen this with our coverage of a common error about the number of patients on NHS waiting lists, for example. We’ve seen the number of patients on waiting lists confused with the number of cases well over 50 times. Because we’ve already done extensive work on this, it’s usually relatively straightforward for us to remind politicians and the media of the important distinction between these statistics. 

We also weigh up the significance of a claim, and how much harm we think it may pose to the public, institutions, specific groups of people or the democratic process itself. 

Our health team deals with this in quite an obvious way—misinformation about vaccines or unreliable medical treatments can clearly pose direct physical harm to those who believe it to be true and act on it. And misinformation about political and social topics can also cause harm. We saw this in the aftermath of the Southport stabbings last summer, and see it often with claims made about politicians or the political process that ultimately serve to undermine trust in our democratic institutions. 

Finally, there are a whole host of other considerations too. How wrong or misleading is a claim? Can we intervene to get a claim corrected, or does it help us build an evidence base about a wider problem? Are we checking claims from a wide range of sources, and across the political spectrum?

Can we fact check a claim as part of our Meta partnership and add credible information directly to posts on social media? Is it going viral? Is it connected to a breaking news event? Is a claim novel—or on a topic we’ve not covered recently? Have our readers asked for information on it? Is it something we think our audience will want to read about?

We’re aware these decisions can involve difficult judgements and trade-offs, and given the almost unlimited number of claims to check (and the very limited number of fact checkers we have), we can only do our best. But we’re always grateful to readers who make suggestions or flag claims to us that we may otherwise not have spotted. Please do get in touch if there’s something you think we should take a look at.
