As members of the UK’s largest opposition party gathered in Liverpool for their party conference—probably their last before the UK holds a general election—a potentially explosive audio file started circulating on X, formerly known as Twitter.

The 25-second recording was posted by an X account with the handle “@Leo_Hutz” that was set up in January 2023. In the clip, Sir Keir Starmer, the Labour Party leader, is apparently heard swearing repeatedly at a staffer. “I have obtained audio of Keir Starmer verbally abusing his staffers at [the Labour Party] conference,” the X account posted. “This disgusting bully is about to become our next PM.”

It’s unclear whether the audio recording is real, AI-generated, or recorded using an impersonator. British fact-checking organization Full Fact said it is still investigating. “As we’re talking now, it can’t be validated one way or the other. But there are characteristics of it that point to it being a fake,” says Glen Tarman, Full Fact’s head of advocacy and policy. “There’s a phrase which appears to be repeated, rather than [using] a different intonation the second time it’s used, and there’s a few glitches in the background noise.”

Audio deepfakes are emerging as a major risk to the democratic process, as the UK—and more than 50 other countries—move toward elections in 2024. Manipulating audio content is becoming cheaper and easier, while fact-checkers say it’s difficult to quickly and definitively identify a recording as fake. These recordings could spend hours or days floating around social media before they’re debunked, and researchers worry that this type of deepfake content could create a political atmosphere in which voters don’t know what information they can trust.

“If you are listening to a sound bite or a video online with this seed of doubt about whether this is genuinely real, it risks undermining the foundation of how debate happens and people’s capacity to feel informed,” says Kate Dommett, professor of digital politics at Sheffield University.

X’s manipulated media policy states that video or audio that has been deceptively altered or manipulated should be labeled or removed. Neither has happened to the post, and X did not reply to WIRED’s request for comment on whether the platform has investigated the recording’s authenticity.

Starmer’s team has yet to comment. But several MPs from the ruling Conservative Party called the recording a deepfake. “There’s a fake audio recording of Keir Starmer going around,” MP Tom Tugendhat said on X. “The last 30 years of public life has seen a catastrophic undermining of faith in institutions, for good and bad reasons,” Matt Warman, another Conservative MP, posted. “But today’s Sir Keir Starmer deepfake is a new low, supercharged by AI and social media. Democracy is under real threat—technology to verify content is essential.”
