Disinformation has been part of the American political landscape for years now — and even on the eve of the 2024 election, it may still be accelerating.
Multiple Russian disinformation campaigns targeting the U.S. elections have ramped up their efforts to sow discord about the electoral process and hot-button domestic issues, according to exclusive findings shared with POLITICO by a Russian disinformation research group.
The research is by Antibot4Navalny, a group of anonymous volunteers. (The group says at least some of its team is based in Russia — a rarity in the world of disinfo research.) The group, which has been cited by Wired, The New York Times and the Recorded Future intelligence company, began tracking Russian troll farms in 2018. This is the first U.S. election it has monitored, a researcher said in an interview on Signal.
The researcher shared findings based on a study of two major Russia-backed disinformation campaigns targeting the U.S. ahead of Election Day. One campaign, called Matryoshka, spread fake news that the FBI had apprehended groups committing ballot fraud, as well as false narratives that U.S. authorities are preparing for civil war and that prisons in swing states rigged inmate voting, according to Antibot4Navalny. A second Russian-backed campaign called Doppelganger has been operating for years and was sanctioned by the EU; in this election, it aimed its posts at undermining Vice President Kamala Harris, the researcher said.
Together, the findings show that Moscow-backed campaigns are posting more frequently in the final days before the election, and that they’re closely tracking the top news in the U.S. to tailor their message. Antibot and several other researchers say the meddling is almost certain to continue after Election Day.
This first appeared in Digital Future Daily, POLITICO’s afternoon newsletter about how tech and power are shaping our world.
As with previous campaigns, and in keeping with decades of Russian propaganda efforts, the goal isn’t just to support one political party: It’s to destabilize the American process overall. “It is not per se getting Trump elected that is the end goal of the Russian state — but to increase partisan divides, anxiety and fear; all to make the U.S. elites more focused on domestic issues while paying less attention to aid to Ukraine,” Antibot4Navalny wrote to POLITICO.
State election officials and tech firms are rushing to counter the ever-shifting Russian threat.
Adrian Fontes, Arizona’s secretary of state, told the POLITICO Tech podcast recently that the disinformation had gotten “a lot more intense than it was in 2020, because in 2020, the bad guys were not as organized. They didn’t have their tactics honed out. They kind of practiced in 2022 for what they’re doing this time around. We’re a lot better at responding as well.”
A Microsoft spokesperson confirmed the company has been closely tracking both Matryoshka and Doppelganger.
“History has shown them to be nimble and capable of inserting deceptive content and distributing it rapidly at key moments of audience confusion,” the spokesperson wrote.
The two campaigns highlighted by Antibot4Navalny are part of a broader landscape of foreign targeting of U.S. elections. Last week, a 20-second video on X showed a Haitian migrant saying he intended to vote for Harris in two Georgia counties. U.S. intelligence agencies debunked the video, with Georgia’s Republican secretary of state Brad Raffensperger calling it “obviously fake” and “likely” a production of Russian troll farms.
The disinformation campaigns are unlikely to stop on Election Day. In October, the Office of the Director of National Intelligence said it expected “foreign actors to continue to conduct influence operations through inauguration denigrating U.S. democracy, including by calling into question the results of the election.”
Antibot4Navalny says it expects both of the influence operations it is tracking to continue well past Election Day.
Jon Bateman, an expert on tech and global influence at the Carnegie Endowment for International Peace, said that meddling in the sensitive period around elections and before the next president is sworn in could have an outsized impact.
“Maybe there’s some kind of AI-generated or false content that is fairly readily debunked by mainstream authorities, but is enough of a fig leaf to allow a group of people in Congress to refuse to certify an election, for example,” said Bateman. “Just something that can kind of muddy the waters and be exploited purposefully.”