The Russian Disinformation Threat: Active Campaigns in 2024
This is the second of two pieces by Kettering Foundation Senior Fellow Alexander Vindman addressing Russian disinformation threats to American democracy.
Having examined the ideology and strategic interests behind Russian disinformation operations, we now turn our attention to several recent campaigns targeting American voters. Many of these operations are ongoing and will remain a challenge for policymakers in the lead-up to the election. The Department of Justice has recently unveiled a major Russian disinformation campaign that includes a large operation known as the “Good Old USA Project.” Operations that use social media personalities and copycat websites as vectors for disinformation also continue to challenge American voters and US democracy.
The Doppelganger Operation
One main actor in Russia’s propaganda efforts is the RRN—short for “Reliable Recent News” and formerly known as “Reliable Russian News.” The Doppelganger campaign is orchestrated by the Russian companies Struktura and Social Design Agency. These entities are known for creating domains and websites that at first glance appear to be credible media outlets. An example is “Reuters.cfd” rather than “Reuters.com.” These copycat websites post headlines and articles that push pro-Russian disinformation and talking points while presenting themselves as credible sources. The posts are often shared with unsuspecting social media users through screenshots. Elon Musk’s X is believed to be the main vector for this disinformation campaign, and more than 300 copycat domains are believed to be involved in the operation. As of July 2024, the Doppelganger campaign was largely focused on France, Germany, the United States, and Ukraine. Doppelganger articles have incorporated right-wing and left-wing perspectives on divisive issues such as immigration, the Ukraine-Russia war, NATO, and American foreign policy.
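Because these copycat domains typically reuse a trusted outlet’s name under an unfamiliar top-level domain, even a simple programmatic check can flag the pattern. The Python sketch below is purely illustrative and assumes a small, hand-picked list of outlet names and their genuine top-level domains; it is not drawn from any actual detection system described in this piece.

```python
from urllib.parse import urlparse

# Illustrative sample of outlet names mapped to their genuine top-level
# domains; a real detection system would draw on a far larger registry.
KNOWN_OUTLETS = {
    "reuters": "com",
    "washingtonpost": "com",
    "spiegel": "de",
}

def is_lookalike(url: str) -> bool:
    """Flag URLs that pair a known outlet's name with the wrong TLD,
    e.g. 'reuters.cfd' instead of 'reuters.com'."""
    host = (urlparse(url).hostname or "").lower()
    labels = host.split(".")
    if len(labels) < 2:
        return False
    name, tld = labels[-2], labels[-1]
    return name in KNOWN_OUTLETS and tld != KNOWN_OUTLETS[name]

print(is_lookalike("https://reuters.cfd/article"))    # True: copycat pattern
print(is_lookalike("https://www.reuters.com/world"))  # False: genuine domain
```

Real-world detection efforts are considerably more sophisticated, but the underlying heuristic, a trusted name paired with the wrong domain suffix, is the same signal a careful reader can check by eye.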
Lies Around the Assassination Attempt
No political group or ideology is safe from disinformation operations. However, conservative voters remain a particularly popular target for Russian disinformation campaigns. The aftermath of the July 2024 assassination attempt against Donald Trump presents a recent example. Russian, Chinese, and Iranian propaganda accounts united in a campaign to call America “a violence exporter” and to attribute the attempted murder to “antifa” or even to an insider in the current administration. This framing is consistent with the narrative Russian officials have been sharing on their public social media pages. Maria Zakharova, spokesperson for the Ministry of Foreign Affairs of the Russian Federation, accused the Biden administration of creating favorable conditions for the assassination attempt, claiming, “exactly 2 months ago I noticed that in the United States they are literally encouraging incitement of hatred toward political opponents.” By portraying the United States as a country facing active, organized internal threats of political violence, Russian media has sown fear and paranoia within the electorate and further solidified the atomization and echo chambers of conservative social media. Additionally, by portraying the assassination attempt as an organized plot within the government rather than the act of a single individual, Russian media has manufactured a false crisis of political legitimacy in an effort to damage the reputation of the United States abroad.
AI-Powered Informational Warfare
The development and spread of AI technologies have made it easier and more efficient to disseminate propaganda through sources affiliated with the Russia Today news agency. Prigozhin’s bot farm no longer has to cram rows of desktops into a dusty Saint Petersburg factory when AI can do the work of a full team of trolls from a single device. One of the most widely reported examples of AI-powered disinformation is CopyCop, a Russian government-aligned influence network. CopyCop registered over a hundred new websites in May 2024 alone, all of which used AI-generated journalist personas. The content these personas publish consists of slight alterations of material from preexisting Russian-affiliated outlets and conservative American media, including the Epoch Times and the New York Post. After targeting Biden by highlighting his age and exaggerating mistakes he made in speeches, CopyCop’s latest disinformation content aims to present Vice President Harris as a far-left ideologue or a communist. Because of the speed of this operation and the nature of AI-generated content derived from preexisting media, it is becoming harder for policymakers and researchers to trace the origins and spread of this disinformation.
The “Good Old USA” Project
According to an ongoing Department of Justice investigation, Russian actors have sought to spread disinformation among American voters by financing and coordinating video production on American social media platforms with the goal of weakening support for Ukraine. In addition to targeting voters in swing states, the operation is designed to target Hispanic and Jewish voters as well as “gamers”—an umbrella term for young people who spend much of their time on the internet and may be unemployed or underemployed. According to an indictment filed in the Southern District of New York, two employees of the Russia Today media outlet spent roughly $10 million in exchange for editorial direction over a group of social media influencers and content creators believed to be part of Tenet Media. This operation demonstrates the Russian government’s ability and willingness to expand disinformation operations beyond previously observed social media bot activity, as well as its use of Russia Today as a vehicle for illicit financing and editorial direction of disinformation assets.
Combating Disinformation
Technological innovation and the evolution of disinformation tradecraft have made sorting truth from falsehood increasingly difficult. At the same time, years of political polarization have made citizens generally unwilling to believe news they don’t agree with. In some cases, polarization has generated skepticism toward counter-disinformation efforts, as individuals may ignore warnings when false stories confirm their biases. Beyond recommending that consumers get their news from trusted, reputable sources and take steps to verify the authenticity of the websites and accounts they engage with, one viable means of fighting disinformation is for Americans to focus their attention on reporting from local journalists with an established media presence. While bad actors can mimic the appearance of major media outlets and artificial intelligence can generate profiles of nonexistent writers, it is much harder for an adversarial state to replicate the trust between an effective journalist and their audience. Most importantly, Americans must temper their reactions to sensationalist and inflammatory content on social media, and concerned citizens should contact their representatives to demand greater oversight and protections against disinformation and election interference.
Alexander Vindman is a retired US Army lieutenant colonel and the former director for European Affairs on the White House’s National Security Council. Vindman leads the national security think tank Institute for Informed American Leadership and is an executive board member for the Renew Democracy Initiative, a senior fellow at the Kettering Foundation, and a senior advisor to VoteVets. His best-selling memoir is titled Here, Right Matters.
From Many, We is a Charles F. Kettering Foundation blog series that highlights the insights of thought leaders dedicated to the idea of inclusive democracy. Queries may be directed to fmw@kettering.org.
The views and opinions expressed by contributors to our digital communications are made independent of their affiliation with the Charles F. Kettering Foundation and without the foundation’s warranty of accuracy, authenticity, or completeness. Such statements do not reflect the views and opinions of the foundation which hereby disclaims liability to any party for direct, indirect, implied, punitive, special, incidental, or other consequential damages that may arise in connection with statements made by a contributor during their association with the foundation or independently.