A Russian Propaganda Network Is Promoting an AI-Manipulated Biden Video

Experts tell WIRED that Russian disinformation campaigns are increasingly turning to generative AI.
Two robotic arms pushing cutouts of Joe Biden’s and Donald Trump’s faces. Photo-illustration: Anjali Nair; Getty Images

In recent weeks, as so-called cheapfake video clips suggesting President Joe Biden is unfit for office have gone viral on social media, a Kremlin-affiliated disinformation network has been promoting a parody music video featuring Biden wearing a diaper and being pushed around in a wheelchair.

The video, called “Bye, Bye Biden,” has been viewed more than 5 million times on X since it was first promoted in mid-May. It depicts Biden as senile, wearing a hearing aid, and taking a lot of medication. It also shows him handing money to a character who seems to represent illegal migrants while denying it to US citizens until they put on costumes mimicking the Ukrainian flag. Another scene shows Biden opening the front door of a family home with a Confederate flag on the wall and letting migrants come in and take over. Finally, the video contains references to the stolen-election conspiracies pushed by former president Donald Trump.

The video was created by Little Bug, a group that mimics the style of Little Big, a real Russian band that fled the country in 2022 following Russia’s invasion of Ukraine. The video features several Moscow-based actors—who spoke with Russian media outlet Agency.Media—but also appears to use artificial intelligence technology to make the actors resemble Biden and Trump, as well as Ilya Prusikin, the lead singer of Little Big.

“Biden and Trump appear to be the same actor, with deepfake video-editing changing his facial features until he resembles Biden in one case and Trump in the other case,” says Alex Fink, an AI and machine-vision expert who analyzed the video for WIRED. “The editing is inconsistent, so you can see that in some cases he resembles Biden more and in others less. The facial features keep changing.”

An analysis by True Media, a nonprofit founded to tackle the spread of election-related deepfakes, found with 100 percent confidence that AI-generated audio was used in the video. It also assessed, with 78 percent confidence, that AI was used to manipulate the faces of the actors.
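True Media’s internal methodology is not public, so the following is only a minimal sketch, assuming a common design in which several specialized detector models each score a modality and their probabilities are averaged into the kind of per-modality confidence figures quoted above. The detector names and scores here are invented for illustration.

```python
# Hypothetical sketch: aggregating per-model deepfake-detector scores
# into a single confidence per modality. The detector names below are
# invented; real services may weight or calibrate scores differently.
from statistics import mean

def modality_confidence(scores: dict[str, float]) -> float:
    """Average detector probabilities (0..1) into one confidence value."""
    return mean(scores.values())

# Invented example scores for the two modalities discussed above.
audio_scores = {"spectrogram_cnn": 1.00, "vocoder_artifact_model": 1.00}
face_scores = {"blend_boundary_model": 0.71, "landmark_jitter_model": 0.85}

print(f"AI-generated audio: {modality_confidence(audio_scores):.0%}")  # 100%
print(f"face manipulation: {modality_confidence(face_scores):.0%}")    # 78%
```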

Fink says the crude nature of the deepfake technology on display suggests that the video was created in a rush, using a small number of training iterations of a generative adversarial network (GAN) to create the Biden and Trump characters.
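To make the “small number of iterations” point concrete, here is a toy GAN training loop in PyTorch, run on synthetic vectors rather than faces. It is a sketch of the general technique, not the video’s actual pipeline: stopping after only a handful of steps leaves the generator undertrained, producing the kind of run-to-run inconsistency Fink describes.

```python
# Toy GAN loop illustrating why too few training iterations yield
# unstable, inconsistent outputs. Synthetic vectors stand in for faces.
import torch
import torch.nn as nn

torch.manual_seed(0)
latent_dim, data_dim = 16, 64

G = nn.Sequential(nn.Linear(latent_dim, 128), nn.ReLU(), nn.Linear(128, data_dim))
D = nn.Sequential(nn.Linear(data_dim, 128), nn.ReLU(), nn.Linear(128, 1))
opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
bce = nn.BCEWithLogitsLoss()

real = torch.randn(256, data_dim)  # stand-in for real face features

for step in range(200):  # a rushed run: far too few steps to converge
    # Discriminator step: learn to separate real samples from generated ones.
    fake = G(torch.randn(64, latent_dim)).detach()
    batch = real[torch.randint(0, 256, (64,))]
    d_loss = bce(D(batch), torch.ones(64, 1)) + bce(D(fake), torch.zeros(64, 1))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # Generator step: try to fool the discriminator.
    g_loss = bce(D(G(torch.randn(64, latent_dim))), torch.ones(64, 1))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()

# Stopping here leaves G undertrained: its samples shift noticeably from
# batch to batch, mirroring facial features that "keep changing."
```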

It is unclear who is behind the video, but “Bye, Bye Biden” has been promoted by the Kremlin-aligned network known as Doppelganger. The campaign posted tens of thousands of times on X and was uncovered by Antibot4Navalny, an anonymous collective of Russian researchers who have been tracking Doppelganger’s activity for the past six months.

The campaign began on May 21, and almost 4,000 posts promoting the video in 13 languages have since appeared on X, pushed by a network of almost 25,000 accounts. The Antibot4Navalny researchers concluded that the posts were written with the help of generative AI. The video has been shared 6.5 million times on X and has been viewed almost 5 million times.

Among the prominent accounts sharing the video was Russian Market, which has 330,000 followers and is operated by the Swiss social media personality Vadim Loskutov, who is known for praising Russia and criticizing the West. The video was also shared by Tara Reade, who accused Biden of sexually assaulting her in 1993 and defected to Russia in 2023 in a bid for citizenship.

The video, researchers tell WIRED, was also manipulated in a bid to avoid detection online. “Doppelganger operators trimmed the video at arbitrary points, so they are technically different in milliseconds and therefore are likely considered as distinct unique videos by abuse-protection systems,” the Antibot4Navalny researchers tell WIRED.
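The mechanics are simple to demonstrate. The sketch below assumes, for illustration, that a platform fingerprints uploads by hashing the file bytes; trimming even a few milliseconds from the start produces a completely different digest, so each copy registers as a new video. (Real abuse-protection systems often use perceptual hashes instead, but arbitrary trims perturb those fingerprints too.)

```python
# Minimal sketch: why millisecond trims defeat naive duplicate detection.
# Assumes a dedup system that fingerprints uploads by hashing file bytes.
import hashlib

def fingerprint(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()[:16]

video = bytes(range(256)) * 4096   # stand-in for an encoded video file
trimmed = video[48:]               # drop a few leading bytes, i.e. a few ms

print(fingerprint(video))            # one digest
print(fingerprint(trimmed))          # an entirely different digest
print(fingerprint(video) == fingerprint(trimmed))  # False: seen as unique
```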

“This one is unique in its ambiguity,” Fink says. “It’s maybe a known Russian band, but maybe not; maybe a deepfake, but maybe not; maybe it has references to other politicians, but maybe not. In other words, it is a distinctly Soviet style of propaganda video. The ambiguity allows for multiple competing versions, which means hundreds of articles and arguments online, which leads to more people seeing it eventually.”

As the Kremlin ramps up its efforts to undermine the US election in November, it is increasingly clear that Russia is willing to use emerging AI technologies. A new report published this week by the threat intelligence company Recorded Future highlights the trend, revealing that a Kremlin-linked campaign has been using generative AI tools to push pro-Trump content on a network of fake websites.

The report details how the campaign, dubbed CopyCop, used AI tools to scrape content from real news websites, repurpose it with a right-wing bias, and republish it on a network of fake websites with names like Red State Report and Patriotic Review. The sites purport to be staffed by more than 1,000 journalists, all of whom are fake personas invented by AI.

The topics pushed by the campaign include errors Biden has made during speeches, Biden’s age, poll results showing a lead for Trump, and claims that Trump’s recent criminal trial and conviction were “impactless” and “a total mess.”

It is still unclear how much impact these sites are having; a review by WIRED of social media platforms found very few links to the network of fake websites CopyCop has created. But the campaign has shown that AI can supercharge the dissemination of disinformation, and experts say it is likely just the first step in a broader strategy involving networks like Doppelganger.

“Estimating the engagement with the websites themselves remains a difficult task,” Clément Briens, an analyst at Recorded Future, tells WIRED. “The AI-generated content is likely not garnering attention at all. However, it serves the purpose of helping establish these websites as credible assets for when they publish targeted content like deepfakes [which are] amplified by established Russian or pro-Russian influence actors with existing following and audiences.”