Wrecking the Public Sphere: Democracy Is Undermined by Bad Actors, Not Platforms


By Simone Chambers 

Healthy democratic opinion formation relies on an open, free, and undistorted public sphere of political communication. We live in an age of big data that can be used to microtarget and manipulate us. Algorithmic curation systematically fails to up-rank truth, while corporations profit from the personal data collected and sold by search engines and social media platforms. Many have commented on the dangers these new information technologies pose to democracy. For the most part, those dangers are the product of intentional human action exploiting technology rather than of the technology itself. Authoritarian, antidemocratic forces are deliberately building strategies to undermine the key elements of a functional democracy. Rather than directly crushing civil society and social movements, as authoritarian regimes once did, these forces use the new possibilities opened up by the internet to thwart democracy at home and in established constitutional democracies around the world. They can effectively wreck, rather than suppress, the public sphere.

Below are five common strategies that authoritarian forces use to undermine democracy. Many of the examples are drawn from Russian internet campaigns aimed at weakening Western democracies, but all of these strategies have also been used by domestic political actors for much the same purpose.

Disinformation 

Of the five strategies, circulating false information and baseless claims on the internet is often considered the most dangerous threat to democracy. This is especially true when false information intentionally and directly undermines democratic procedures, for example, by spreading lies about the trustworthiness of the voting system. This type of disinformation, however, has put the public on high alert and placed pressure on platforms and public authorities alike to expose and mitigate it.

Destabilization through Polarization 

Bad actors not only attack democracy directly by undermining faith in democratic procedures but also attack it indirectly by seeking to destroy trust and any modicum of unity among citizens. One primary tactic is the use of malinformation, which is akin to malware. Malinformation is information that may be true but is deployed to inflict harm on a person, organization, or country. One example is Redfish, a Russian-funded social media venture that was kicked off many mainstream platforms in 2023 for its propaganda campaigns. Redfish produced English-language content for Americans that showcased protests, racial inequalities, political violence in the US, and other stories designed to divide the American electorate. It did not disseminate outright disinformation but rather pushed incendiary stories designed to heighten and fuel the divisions within American society.

Propaganda 

Classic totalitarian propaganda stresses top-down, ideologically clear messages that secure obedience. Modern political propaganda shifts the focus from ideological indoctrination to building loyalty toward individual leaders. Obedience and allegiance to a leader are secured through online, often deceptive, information campaigns. These campaigns depict a particular leader as the only person capable of solving political and social problems and, at the same time, as unfairly maligned by critics and opposition forces. This strategy rests on a gap in political knowledge between an “informed elite” and the public, a gap that must be sustained and widened to maintain power.

Noise, Chaos, Disorder, and Uncertainty 

Another online strategy used by authoritarians is the intentional flooding of the public sphere with contradictory and confusing information. This was a staple of Russia’s Internet Research Agency until it closed in 2023. The agency created a number of false websites that purported to be related to Black Lives Matter, including Black Matters, Blacktivist, and Black4Black, all with the goal of sowing confusion and distorting public discussion. Following the Russian playbook, the alt-right agitator Steve Bannon said, “The real opposition is the media. And the way to deal with them is to flood the zone with shit.” The intention is to generate a level of cacophony that makes it impossible to sort truth from untruth and too arduous to deliberate about the best path forward.

Fake Fake News 

Fake fake news is the rhetorical, often cynical, partisan attack on real facts, fact-based journalism, opposition, and pluralistic contestation, accusing them all of being fake news and misinformation. This narrative strategy aims to undermine the reliability of public information and to depict the press as corrupt and biased and journalists as evil and vindictive. The strategy extends to the academic study of disinformation. Many misinformation research programs have scaled back or shut down under the pressure of legal attacks claiming that these programs are partisan attempts to curb and limit conservative speech. But because there is strong evidence of more disinformation on the right than on the left, solid research into the accuracy of digital communication will likely identify more conservative sources of misinformation, a finding that is then cast as proof of partisan bias.

Focus on the Actors, Not the Platforms 

A vibrant, well-functioning public sphere with open, free, and often critical political communication is anathema to authoritarianism. With the digitalization of political communication, however, authoritarians no longer need to suppress the public sphere directly. They can instead ruin it so that it becomes more difficult for all of us to engage in it. The result has been a blossoming of serious authoritarian threats within democracies. Hiding behind the charade of free speech and pointing to votes and majorities, these agents do not have to engage in brute suppression; they can be effective through information pollution tactics, many of which cannot be stopped by regulation and transparency alone. These are not algorithms controlling our lives; these are political agents with agendas. The struggle remains much the same as in previous eras: friends versus foes of democracy. It is the man and not the machine that we need to worry about.


Simone Chambers is professor and chair of political science at the University of California, Irvine. 

From Many, We is a Charles F. Kettering Foundation blog series that highlights the insights of thought leaders dedicated to the idea of inclusive democracy. Queries may be directed to fmw@kettering.org.