Authors: Will Oremus, Andrea Jiménez
Russian disinformation doesn’t always work.
Out of 135 Kremlin-aligned propaganda posts analyzed in a new report, 134 more or less fell flat with social media users, getting liked and reshared mostly by a network of bots. But one went big – and with artificial intelligence making disinformation cheaper to produce, experts say that single success probably makes the whole campaign a win in the eyes of its shadowy, likely Russian funders.
The post, a fake news video making false claims about the U.S. Agency for International Development (USAID), hit the jackpot when Elon Musk reposted it on X.
The campaign of propaganda posts is called Operation Overload, also known as Matryoshka. It typically pairs real images with AI-generated voice-overs and the logos of real news organizations or institutions to create false or misleading “news clips” that support Russian talking points. In a report shared with the Tech Brief ahead of its publication Tuesday, researchers at the nonprofit Institute for Strategic Dialogue (ISD) examined the content of those posts, their aims and how they spread.
ISD found that the network, which is presumed but not confirmed to be backed by the Russian government, pivots from one hot news topic to the next with the aim of influencing the discourse unfolding on social media. Its messaging focused on a familiar Russian objective: “weakening NATO countries’ support for Ukraine and disrupting their domestic politics.”
Most of Operation Overload’s posts went nowhere, but one managed to hook one of the political world’s biggest fish.
Posts that pushed “extravagant lies” such as fake assassination plots and pedophilia accusations seemed to go mostly ignored, the report found, even though X has removed fewer than 20 percent of them.
It was a less flashy fake that ultimately got traction.
The video, which purported to be a clip from the cable TV channel E! News, falsely claimed that USAID had paid millions of dollars for Hollywood celebrities, such as Angelina Jolie, Sean Penn and Ben Stiller, to travel to Ukraine in hopes of boosting President Volodymyr Zelensky’s popularity. On Feb. 5, it was posted by a right-leaning anonymous X account with more than 700,000 followers, then reposted by Musk, whose 220 million followers make him the platform’s loudest voice.
The video was later debunked by users of X’s crowdsourced fact-checking program, Community Notes, who pointed out that both E! News and Stiller had declared it bogus. Nonetheless, it remained on X as of Monday, having garnered more than 4 million views. X did not respond to a request for comment.
The fakes have at least three aims.
One is to reinforce Russian talking points among those primed to believe them. In this case, that included Musk, who was in the midst of a run of X posts casting aspersions on USAID after his Department of Government Efficiency dismantled the agency.
The Kremlin and its media allies have been trying to demonize USAID for more than a decade, said Nina Jankowicz, CEO of the American Sunlight Project, a nonprofit that researches disinformation. DOGE’s controversial dismantling of the agency offered an opening to reach a receptive American audience with a message that linked the concept of wasteful USAID spending with the notion that Zelensky is unpopular – another Russian talking point.
Another aim is to undermine the credibility of the news organizations and other institutions whose content the posts impersonate, said Joseph Bodnar, senior research manager at ISD.
“Once you’re fooled by a video with a certain outlet’s logo on it, you’re more likely to second-guess the next video you come across from the same outlet,” he said.
Finally, Operation Overload in particular appears intended at least in part to simply waste the time of those institutions and the independent fact-checkers who are tasked with debunking such claims. The more bogus posts, the more resources required to knock them all down.
The report shows how Russian disinfo can be a “numbers game” – and how AI is tilting that game in the trolls’ favor.
“If one out of 100 videos gets traction, that’s a win that achieves its goal,” Bodnar said.
That point is underscored in a paper published in April by academic researchers including Darren Linvill, co-director of Clemson University’s Media Forensics Hub. They found that AI enables bad actors to create more content, faster, without compromising its perceived credibility among the users who run across it.
In other words, Linvill said, “They throw a lot of spaghetti at the wall.” But they also have ways of increasing the chances some will stick. For instance, he said the fake E! News video that Musk reposted was seeded first with a network of pro-Russian influencers who have real, human followings.
Tracking these sorts of campaigns, let alone combating them, is getting harder, experts say.
Musk’s X and Mark Zuckerberg’s Meta have shut down tools for researchers and pulled back from moderating online content in recent years amid a Republican-led push against online content moderation, which many on the right view as censorship. Research on disinformation has been chilled by a wave of lawsuits, congressional hearings and online harassment campaigns targeting independent researchers.
The administration of President Donald Trump has disbanded government units focused on tracking and countering foreign disinformation, such as the State Department’s Global Engagement Center. Last month it began canceling National Science Foundation grants for research on misinformation and disinformation.
While congressional Republicans accused the GEC of suppressing protected speech, Bodnar said it played a critical role in identifying state-backed disinformation campaigns, which platforms and independent researchers could then link to other operations. Its demise means researchers will more often be left to infer the involvement of Russia and other state actors in the operations they’re tracking – as in the case of Operation Overload.
Russian disinformation operations, for their part, show no signs of slowing down.
With USAID gutted, the operation has lately turned some of its attention to elections in Germany, Poland and other European countries, Bodnar said.
Last month, The Washington Post’s Joseph Menn reported on a different approach by Russian agents to disseminate propaganda, in that case by seeding AI chatbots with Kremlin lies. It’s a way that disinformation published online could leach into the mainstream even if no humans intentionally read or share it.