What was claimed
A video shows Mr Starmer promoting an investment scheme for UK residents.
Our verdict
False. The video is not genuine and appears to have been altered using artificial intelligence.
A video circulating on Facebook appears to show Labour leader Sir Keir Starmer promoting an investment scheme. However, this video is not genuine and the audio on it appears to have been altered using artificial intelligence (AI).
The video starts with footage of what is supposedly Mr Starmer talking directly to the camera before showing a montage of different clips as the audio plays over the top. He says he is promoting a “new platform that offers a unique opportunity for residents of the United Kingdom to earn passive income”.
It has been shared with the caption: “A special initiative by the UK government for all British citizens! Act fast before slows fill up!”, and overlaid text says: “Earn up to £40,000 monthly! Start today with just £250 and make £1,000 on your very first day!”
However, a spokesperson for Mr Starmer has confirmed to Full Fact that this video is not genuine.
We could not find any reports by either the government or reliable sources describing such an investment scheme.
It appears that the footage is actually a deepfake video created by cloning Mr Starmer’s voice. Deepfake content refers to the use of artificial intelligence to create original images, audio and videos that can be used to convincingly imitate real people.
The deepfake appears to have used video from Mr Starmer’s real 2023 New Year address, which has different audio.
The fake audio also repeatedly says “pounds” before the corresponding number, for example “pounds 35,000” rather than “35,000 pounds”. This would be a very unlikely mistake for a native English speaker, such as Mr Starmer, to make multiple times.
Mike Russell, founder of the audio production company Music Radio Creative and a certified audio professional with more than 25 years of experience, told Full Fact that “the use of ‘pounds’ really gives it away”.
However, he warned that this type of deepfake audio can be “easy for anyone to make” and that voice cloning is becoming increasingly “hard to detect”.
In the video, the area around Mr Starmer’s mouth appears blurry or smudged. Dr Dominic Lees, convenor of the University of Reading’s Synthetic Media Research Network, previously told Full Fact that this is a telltale sign that a video has been poorly lip-synced.
Referring to a different deepfake video allegedly showing BBC presenters promoting an investment scheme, he explained how AI “finds it very difficult to generate a natural look in the teeth so often leave this blurry and out-of-focus.”
We’ve written about other audio clips for which we found no evidence of authenticity, such as one supposedly recording Mr Starmer swearing at his staff and another in which Sadiq Khan appears to call for ‘Remembrance weekend’ to be postponed.
The emergence of realistic deepfake material highlights the growing challenge that new technology poses for verification, and the difficulty of ensuring an effective and proportionate response from social media platforms to such content.
Misinformation spreads quickly online so it’s especially important to consider whether something is likely to be genuine before sharing.
Image courtesy of Rwendland
This article is part of our work fact checking potentially false pictures, videos and stories on Facebook. You can read more about this—and find out how to report Facebook content—here. For the purposes of that scheme, we’ve rated this claim as false because it appears to be a deepfake and a spokesperson for Mr Starmer confirmed to Full Fact it is not real.