“Biden calls Trump a ‘threat to the nation,’” Russian state-run media outlet Sputnik International posted, sharing a video of Biden’s recent speech with its more than 400,000 followers. “The day after Trump gets shot…coincidence?”
A series of sensationalist posts portrayed the US as a nation in decline and on the brink of civil war. Russian state media amplified posts claiming the US had become a Third World country. Chinese state media published cartoons casting the US as an “exporter of violence.” And Iranian posts spread false claims that the shooter had ties to Antifa, a loose-knit movement of far-left activists that former president Trump and Republicans have previously blamed for political violence.
The frantic news cycle that followed the shooting was a gift to adversaries who had spent years honing digital strategies to exploit such crises for political gain: the lack of immediate information about the shooter, the graphic images of a bloodied former president in broad daylight, and the conspiracy theories circulating widely across the country created an ideal environment for influence operations.
“Any domestic crisis can be picked up and exacerbated by state actors, who will try to use it for their own purposes,” said Renée DiResta, former research manager at the Stanford Internet Observatory and author of “The Invisible Rulers: The People Who Turn Lies into Reality.”
Graham Brookie, vice president of technology programs and strategy at the Atlantic Council, said foreign adversaries have jumped at the opportunity to portray the United States as a “violent and destabilizing actor at home and abroad.”
While some government accounts have openly stoked such narratives on X, researchers have also observed activity in more private channels, with Brookie saying on Sunday that Kremlin proxies on the messaging service Telegram were “having a good time.”
Russia has used state media to spread negative news about the United States for decades, a tactic that accelerated with the growth of English-language outlets and social media. But after the invasion of Ukraine, some platforms began blocking or labeling the state-run outlets RT and Sputnik.
In response, Russia has stepped up its efforts to create unlabeled propaganda, including through ordinary “verified” blue-check accounts on X, influencers on Telegram and other platforms, and placements in unaffiliated media. The deniability makes the messages more credible, even when their content overlaps with what state-run outlets publish.
X did not immediately respond to a request for comment.
The widespread impact of online foreign influence in U.S. elections was first felt in 2016, when Russia used social media to target conservatives with alarmist messages about immigration, minorities and crime, while posing as Black activists to stoke anger over police violence. China has employed some of the same tactics since then, according to researchers and intelligence officials.
Microsoft reported in April that the Chinese government was using fake accounts to bombard people with questions about hot-button topics like drug abuse, immigration and racial tensions. Posing as US voters, the accounts sometimes asked followers about their support for US presidential candidates.
“We know that Russia has historically viewed these events as an opportunity to spread conspiracy theories, and they appear to be continuing to carry out operations such as impersonating Americans,” Kate Starbird, a professor at the University of Washington and longtime researcher of online rumors and disinformation, said Tuesday.
The surge in posts about the shooting comes at a time when foreign interference has become increasingly widespread and difficult to track. While a variety of foreign powers are involved in the activities, advances in artificial intelligence have made it easier for smaller actors to translate messages into English, create sophisticated imagery and make fake social media accounts look more authentic.
X has seen a proliferation of Russian and Chinese accounts posting about hot-button political issues, including the decay of U.S. cities and the migrant crisis at the Texas border. Earlier this year, it saw a surge in propaganda accounts promoting China’s views ahead of Taiwan’s elections. And last week, U.S. and allied officials identified nearly 1,000 fake X accounts using artificial intelligence to spread pro-Russian propaganda.
Russian diplomatic accounts have been amplifying critical remarks by Kremlin spokespeople on X and other social media since Saturday’s shooting, said Melanie Smith, director of U.S. studies at the Institute for Strategic Dialogue. Chinese state media has taken a more neutral tone, focusing on claims that Secret Service failures led to the violence, she said.
Chinese state media outlet Global Times published a cartoon early Sunday showing a hammer with the word “political violence” written on it falling on a map of the U.S. “Looking to the future, if the U.S. fails to change the current situation of political polarization, political violence is likely to intensify,” the account tweeted.
#opinion Looking to the future, if the United States fails to change the current climate of political polarization, we may see an increase in political violence, further exacerbating the vicious cycle between these two phenomena. https://t.co/nveRG1rkIx
— Global Times (@globaltimesnews) July 15, 2024
Some foreign actors have boldly accused enemies of orchestrating the attack on Trump. For example, Russian-backed X accounts suggested without evidence that Ukraine and the U.S. defense industry may have been involved, in an effort to stop Trump from cutting off aid to Ukraine and canceling lucrative military contracts.
“Trump may be hindering the defense industry with his ‘America First’ policy,” one German-language post read. “The industry and military lobby have always had very long arms.”
“Trump’s election would mean the end of the arms race,” one French-language post read, “so look for who stands to benefit.”
The accounts are being tracked by the Russian activist research group Antibot4Navalny.
“It’s possible that Ukrainian special forces are behind this, on the orders of the White House,” American journalist John Valori said in an interview with the Russian state TV channel Solovyov Live that was promoted on Telegram, according to a translation by the anti-misinformation firm NewsGuard.
Valori also suggested, without evidence, that the suspect had ties to Antifa, as did Iranian state media. As of Wednesday, the FBI had not been able to determine a motive. Investigators said it appeared that Thomas Matthew Crooks, a 20-year-old nursing home employee from suburban Pittsburgh, acted alone.
Over the past two years, social media platforms have scaled back efforts to counter foreign misinformation and have curtailed communications about it with the U.S. government. As The Post previously reported, the FBI recently resumed communications with the companies, shortly before the US Supreme Court rejected a conservative challenge that sought to block such communications as unconstitutional government interference with protected speech.
Platforms like Meta have teams that quietly identify and respond to foreign influence operations, but the company, like X and YouTube, has weakened or eliminated policies and programs to combat political misinformation and limited independent researchers’ access to tools that could help them root out such networks.
“I worry that with the changes in recent years, we’re slowly losing that window into this activity,” Starbird said.
Meta did not immediately respond to a request for comment.
Brian Fishman, who formerly led Facebook’s work against dangerous individuals and organizations and co-founded the trust and safety company Cinder, said such teams are typically built out in the months leading up to an election but may not be prepared for a crisis like this one so early in the political cycle.
“The danger here is that threats to our political process don’t just happen on Election Day,” Fishman said.
Naomi Nix contributed to this report.