Research update #3: who shares ‘false news’ and what makes a good graph

3 April 2019 | Amy Sippitt

The best way to tackle misinformation is to understand as much as possible about where and how it spreads.

In the third in a series, our Research Manager Amy Sippitt takes a look at some of the latest findings and updates about the spread of misinformation and the culture of factchecking.

More evidence on belief updating…

Belief updating during the 2016 US Presidential elections

In a paper now accepted for publication, Brendan Nyhan, Ethan Porter, Jason Reifler and Tom Wood found that—even at the peak of the 2016 US presidential election—corrections of claims made by Donald Trump reduced misperceptions about crime and unemployment trends among both Republicans and Democrats.

They tested two claims: one made in Trump’s nomination acceptance speech, that violent crime had increased substantially (studied several weeks afterwards), and one made by Trump in the first 2016 presidential debate, that jobs were moving from the US to Mexico (studied that evening).

Misperceptions decreased among both Clinton and Trump supporters, compared to those who saw the claim without any corrective statement.

The first study was tested on around 3,000 Mechanical Turk (MTurk) participants, and a further 1,200 from a nationally representative sample. The second study was conducted with 1,500 MTurk participants.

Caveats: the study only tested claims made by Donald Trump, so it can only speak to partisan reactions among Trump supporters—we don’t know how Clinton supporters would have responded to corrections of claims made by their own side.

Read more.

Factcheck-style text found to update beliefs in Australian sample

In an experiment involving around 400 participants in Australia, researchers found that factcheck-style text assessing claims made by two prominent Labor and Liberal party politicians updated people’s beliefs.

Each text was 41–69 words long, written in the style of a factcheck conclusion.

The research also varied whether respondents saw factchecks showing the politician making an equal number of accurate and inaccurate claims (four factchecks of each), or mostly inaccurate claims (four factchecks of inaccurate claims and one of an accurate claim). Reported support for the politician was lower among respondents shown the disproportionately inaccurate set.

However the study’s authors—Michael Aird, Ullrich Ecker, Briony Swire, Adam Berinsky and Stephan Lewandowsky—did a follow-up study in the US, where they say they found barely any difference in support. This hasn’t been published yet.

Caveats:

This was a small sample, and the researchers say the statements tested may not have reflected each party’s key policies. The study also didn’t track attitudes and beliefs over time, which is particularly important for how these effects might play out in real life.

Read more.  

For more on this, read our review of research on the "backfire effect" where we've written more about these and other studies on belief updating. 

Sharing articles from false news domains was rare during the 2016 US election—older people did it slightly more…

… Says a study by Andrew Guess, Jonathan Nagler and Joshua Tucker of 3,500 people surveyed via YouGov, 38% of whom agreed to link their Facebook sharing history (49% of those who said they had a Facebook profile).

Their focus is on “false or misleading content intentionally dressed up to look like news articles, often for the purpose of generating ad revenue”.

Using BuzzFeed’s list of the most engaged-with false news articles during the 2016 US election campaign, the study found the vast majority of Facebook users didn’t share any articles from these domains—despite sharing plenty of links overall: only 3% of respondents shared 10 links or fewer during the campaign period, while 26% shared 10 to 100 links and 61% shared 100 to 1,000.

Only 8.5% of respondents with data shared at least one article from a false news domain. Those aged over 65 were most likely to have done so, including after controlling for ideology, education and other characteristics.

Caveats:

We don’t know the composition of participants’ news feeds, so it may be that older people’s news feeds are very different to younger people’s. And since around half of respondents with a Facebook profile chose not to link their data, we can’t rule out that those who did differ from those who didn’t.

Read more.

  • Brendan Nyhan has also summarised his research with colleagues on false news sites during the 2016 and 2018 US election campaigns, making the case for “Why fears of fake news are overhyped”. Read more.
  • Rasmus Kleis Nielsen has pointed to other findings for older people, including that they are less likely to know social media relies on algorithms, and less likely to remember the brands behind stories they click on. Read more. There is also other evidence suggesting older Americans may share more factchecks. Read more.

In brief

  • The science behind good charts, by the Financial Times. Read more. This great interactive piece tests your interpretation of a range of graphs and infographics. For example, it highlights how columns or bars are more useful than pie charts for comparing values, especially when the values are similar.
  • Why do people fall for misinformation? Read this New York Times piece by Gordon Pennycook and David Rand.
  • Facebook deactivation may increase wellbeing, decrease news knowledge, and reduce polarisation, says a new study based on a randomised experiment of people willing to deactivate their Facebook account in return for a reward. Read more.
  • Factchecks may help media outlets increase trust in journalism, but only when combined with opinion pieces about the importance of journalism, says a study of mostly college grads exposed to a single news portal. Read more.
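The FT point about bars versus pies is easy to demonstrate for yourself. Here is a minimal sketch—assuming Python with matplotlib, and using made-up values—that plots the same set of near-equal numbers both ways; the bar chart makes the ranking obvious while the pie slices look almost identical:

```python
# Sketch of the bar-vs-pie comparison: values and labels are hypothetical.
import matplotlib
matplotlib.use("Agg")  # render off-screen, no display needed
import matplotlib.pyplot as plt

labels = ["A", "B", "C", "D"]
values = [23, 25, 26, 26]  # deliberately close together

fig, (ax_bar, ax_pie) = plt.subplots(1, 2, figsize=(8, 3))

# Bars share a common baseline, so small differences in length are visible.
ax_bar.bar(labels, values)
ax_bar.set_title("Bar: ranking is clear")

# Pie slices encode the same data as angles; near-equal slices are hard to rank.
ax_pie.pie(values, labels=labels)
ax_pie.set_title("Pie: ranking is ambiguous")

fig.savefig("bar_vs_pie.png")
```

This is the same reasoning the FT piece relies on: position and length along a common scale are easier to judge by eye than angle or area.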
