
Starmer Liverpool audio reshared ahead of the U.K. election is likely voice-cloned

By: Naledi Mashishi

July 4, 2024

(Source: Jakub Porzycki/NurPhoto)

In the lead-up to the U.K. general election on July 4, 2024, an audio clip claiming to depict Labour leader Keir Starmer insulting Liverpool was reshared on Facebook.

The audio clip originally circulated on social media in October 2023 and was reshared on July 1, 2024. In it, a voice resembling Starmer's can be heard saying, "Why do we have to have a conference here every single year? I f*****g hate Liverpool."

The audio clip was posted with a screenshot from a source whose name has been cropped out. It reads, "I have obtained *ANOTHER* secret recording of Keir Starmer at conference, this time appearing to take aim at the city of Liverpool." 

We found the Facebook post using Logically Facts® Accelerate, a tool that enables proactive discovery of fact-check-worthy content on platforms. 

The screenshot in the video appears to be of a post on X (formerly known as Twitter), indicating that the claim most likely originated on the platform. There is also a watermark at the bottom right corner indicating that the video was posted on TikTok by a pro-Reform party account. However, the video does not appear on the TikTok account. 

It is not immediately clear where the audio clip is from, who recorded it, or when it was recorded. Logically Facts has previously analyzed an audio clip purporting to be of Starmer and found a high likelihood that it was a deep fake. Like that clip, this one may seem genuine at first listen, but there are clues that it is likely fake. And experts say that it uses a type of AI audio generation, known as voice cloning, that is particularly difficult to debunk. 

What we found

To the untrained ear, the audio clip sounds extremely convincing, with few noticeable distortions. But Oli Buckley, a professor of cyber security at Loughborough University, told Logically Facts that the Starmer audio clip "definitely sounds" AI-generated.

"The tone is a little stilted and the emphasis on certain words isn’t consistent, the tone and emphasis does not always meet the meaning or context of the words," he explained. "While you can get an AI to generate a voice it can do a really good job of making it sound like the person but it doesn’t have the contextual awareness of the tone that should be conveyed."

The audio clip first appeared online in October 2023 during the Labour Party conference. The clip was posted onto X alongside a second audio clip, which appeared to depict Starmer abusing party staff members. The fact-checking organization Full Fact reported that there was no evidence that the latter clip was genuine. Both Labour and Conservative party members said that the latter audio clip was fake at the time, and the X account that originally posted the audio clips has since been suspended.

Voice-cloned audio AI is difficult to fact-check

AI audio clips are difficult to verify independently because they can be edited to sound convincing. Dr. Dominic Lees, an associate professor of filmmaking at the University of Reading, told Logically Facts that this clip appears to be a voice clone, "one of the most difficult forms of AI manipulation to debunk."

"Voice cloning is easy and accessible using many available Generative AI Tools," he explained. "This one has been mixed cleverly with background noise from a conference or event, which makes it difficult to clearly analyze the voice of the fake Keir Starmer."

Voice clones can be so deceptive that they can evade some online tools developed to detect deep fakes. The Deepfake Total tool, developed by the Fraunhofer Institute for Applied and Integrated Security, analyzes potentially AI-generated audio using percentage scores. It gave the clip a score of 0.9 percent, meaning it read the clip as likely authentic. 

However, Dr. Nicolas Müller, one of the research scientists behind the tool, told Logically Facts that the background noise may be why the clip is difficult to detect. "[The noise] will help fool the deepfake-total platform, which works well on clear audio. It will perform not so well when there is a poor signal, i.e. when most of the audio consists of non-speech," he explained. "Thus, a low score of 0.9 does not, in this case, have a big significance."

Other detection tools, however, have assessed the audio more decisively. One of them, Itisaar, rates potentially AI-generated content on a confidence scale from 0 to 1, with 1 representing the highest level of certainty, and uses three models in its analysis. It classified the Starmer audio clip as fake, with a confidence score of 0.97.

Update: This article was updated on July 5, 2024, to correctly reflect the university where Oli Buckley is based.

Follow Logically Facts' coverage and fact-checking of the U.K. general election here.
