Politics, Propaganda, and the New Weapons of Mass Manipulation

Isaac Gilles
13 min read · Mar 18, 2020
Michael Kerbow, “Hollow Pursuits”

In almost every act of our daily lives, whether in the sphere of politics or business, in our social conduct or our ethical thinking, we are dominated by the relatively small number of persons…who understand the mental processes and social patterns of the masses. It is they who pull the wires which control the public mind.

-Edward Bernays, Propaganda

From analysing behavioural data to A/B testing and from geotargeting to psychometric profiling, political parties are using the same techniques to sell political candidates to voters that companies use to sell shoes to consumers. The question is, is that appropriate? And what impact does it have not only on individual voters, who may or may not be persuaded, but on the political environment as a whole?

-Varoon Bashyakarla et al, Personal Data: Political Persuasion

The 20th century saw the application of science to warfare to craft a set of unprecedentedly destructive weapons. Some of these weapons drew strength from their visibility: what made the machine gun so powerful, beyond its unparalleled rate of fire, was its optics and acoustics. It was a weapon made to send a signal. Other weapons, meanwhile, drew upon secrecy and illusion: when chemical weapons were first introduced, for example, a large part of their lethality came from the fact that they were seemingly innocuous. By the time enemy combatants realized the deadly intent behind the clouds or the fumes enveloping them, it was already too late.

The 21st century will mark the subtle, secretive application and proliferation of data-driven behavioral sciences throughout the political landscape, much as nuclear weapons proliferated through the physical landscape in the second half of the 20th century. In the last decade, politically motivated manipulation and disinformation campaigns have presented a noxious threat to the health of democracies across the globe; as our lives are increasingly entwined with — and dictated by — digital infrastructure, the use of data to misinform citizens and manipulate their behavior will only worsen. Micro-targeted, localized, and automated messaging is fast becoming the norm within political and commercial communication. These developments — and the future they foretell — force us to consider our own data as a weapon of mass manipulation to be used against us, often without our explicit consent. As data is paired with insights from the behavioral sciences to influence our decision-making in a manner that is increasingly brazen and yet also opaque, the notions of truth and choice that underpin popular sovereignty and democratic rule will undergo a metastatic erosion. In the decades to come, we will see a new kind of warfare: the battle for power over our digital lives, which extend our physical lives within a democracy that encompasses both. This battle will revolve around a new iteration of a familiar weapon: propaganda.

In A Chronology and Glossary of Propaganda in the United States, Richard Alan Nelson defines propaganda as follows: “a systematic form of purposeful persuasion that attempts to influence the emotions, attitudes, opinions, and actions of specified target audiences for ideological, political or commercial purposes through the controlled transmission of one-sided messages (which may or may not be factual) via mass and direct media channels” (232–233). Nelson’s definition provides an account with which to scrutinize emerging digital political persuasion techniques. To begin, I turn to the scholarship of Varoon Bashyakarla et al in Personal Data: Political Persuasion. Bashyakarla and his co-authors lay out a triadic framework for understanding data as a weapon of political propaganda: data as a political asset; as a form of political intelligence; and as a tool for political influence (7). The three components are interrelated, with each building off of the last: taken together and reinforced by scholarship in the field, they present a compelling case for the argument that data-driven political propaganda campaigns should be considered a weapon at the whims of state and non-state actors.

Bashyakarla et al begin with a description of data as a political asset, writing that it consists of “valuable sets of existing data on potential voters exchanged between political candidates, acquired from national repositories or sold or exposed to those who want to leverage them. This category includes a wide range of methods for gathering or accessing data, including consumer data, data available on the open internet, public data and voter files” (7). This concept is similarly explored by David Nickerson and Todd Rogers in “Political Campaigns and Big Data,” in which the authors describe how publicly available voter files and census data provide information like a voter’s age, gender, contact information, address, average income and education for the voter’s area, and perhaps most importantly, their voting record. This record is not of which candidates they voted for, but of whether or not they have voted in previous elections, which speaks to their propensity to cast future votes (57). The same concept is addressed by Ira Rubinstein in “Voter Privacy in the Age of Big Data,” wherein Rubinstein writes that “unbeknownst to most citizens, and certainly without their informed consent, presidential candidates, the major parties, and a cadre of data consultants have amassed huge political dossiers on every American voter, which are subject to few if any privacy regulations” (936). These dossiers represent not just the power data holds as an asset, but moreover, reflect the ways in which data is used as political intelligence.

After describing how political datasets are accumulated — through a mixture of publicly accessible databases, data brokers and consultants, census data, and more — Nickerson and Rogers describe how the data can then be leveraged for intelligence. They write that a “capable campaign data analyst who is familiar with the properties of the variables available in voter databases can generate highly accurate predictive scores for citizens” (Nickerson and Rogers 59). This predictive modeling draws comparisons to the commercial advertising industry: as Rubinstein elaborates, “online advertising and voter microtargeting have much in common: a vast data infrastructure, the ability to construct detailed individual profiles, and the application of statistical modeling techniques to identify and persuade targeted consumers or voters” (901). Campaigns are well aware of these commonalities, and in many cases, resort to these comparisons as an implicit argument that if supermarkets and fashion retailers can employ these tactics, their campaigns ought to be able to as well.
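To make Nickerson and Rogers' notion of a "predictive score" concrete, here is a minimal, hypothetical sketch: a logistic model that turns voter-file fields into a turnout propensity. The feature names, weights, and voters are all invented for illustration; real campaign models are fit on millions of records.

```python
# Hypothetical sketch of a campaign "predictive score": a logistic model
# over voter-file fields. All features, weights, and voters are invented.
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def turnout_score(voter, weights, bias):
    """Weighted sum of voter-file features, squashed to a 0-1 propensity."""
    z = bias + sum(weights[k] * voter[k] for k in weights)
    return sigmoid(z)

# Toy model: past turnout dominates, echoing the point above that a
# voter's record of *whether* they voted predicts future participation.
weights = {"age_decades": 0.15, "voted_last_midterm": 1.4,
           "voted_last_general": 1.8}
bias = -2.5

habitual = {"age_decades": 6.8, "voted_last_midterm": 1, "voted_last_general": 1}
sporadic = {"age_decades": 2.3, "voted_last_midterm": 0, "voted_last_general": 0}

print(round(turnout_score(habitual, weights, bias), 2))
print(round(turnout_score(sporadic, weights, bias), 2))
```

A campaign would sort its file by such scores to decide whom to mobilize, whom to persuade, and whom to ignore.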

The political director of Mitt Romney’s 2012 presidential bid stated that, “Target anticipates your habits, which direction you automatically turn when you walk through the doors, what you automatically put in your shopping cart… We’re doing the same thing with how people vote” (Duhigg). As campaigns acquire more and more data about a desired set of voters, they can use that data to gain insights into how they should structure their campaigns on both the micro- and macro-level. As Rubinstein writes, “in a data-driven campaign, [voters’] response data is translated into a standardized format, uploaded to a computerized voter file, combined with other data in a massive database, shared as the campaign sees fit, and retained indefinitely in a manner that allows predictive data analysis for voter microtargeting purposes” (899). In other words, the intelligence that campaigns, political parties, and governments gather through collecting datasets can be used to understand how best to tactically target messaging at the individual level while also aggregating the knowledge gathered by those insights and by testing that messaging to inform the strategic operations of the persuasion campaign as a whole (Nickerson and Rogers 53). As Bashyakarla et al describe, data as political intelligence enables the “accumulation of insights into voter opinions… [which] can then be used to form positions, decide which areas to campaign in, or how to pitch a speech to a certain community” (35). They give the example of A/B testing, in which a campaign can develop two distinct messages and test their performance — based off of criteria like how many clicks the messages receive and how long individuals spend reading them — on a micro-targeted portion of voters to then inform the messaging decision of the campaign as a whole (Bashyakarla et al 38–43). 
In other words, voters are used as guinea pigs who provide valuable, actionable insights that save time and money and who allow campaigns to engage in low-risk, high-reward experimentation. It is in the transition from the use of data as a tool for intelligence to the application of that intelligence to sway behavior that we come to understand data as a form of political influence.
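The A/B testing workflow Bashyakarla et al describe can be sketched with a standard two-proportion z-test: two message variants are sent to small voter samples, and their click-through rates decide which message the campaign rolls out broadly. The counts below are invented for illustration.

```python
# Illustrative A/B test on two message variants, with invented counts.
from math import sqrt, erf

def two_proportion_z(clicks_a, n_a, clicks_b, n_b):
    """Two-proportion z-test: is variant B's click rate genuinely higher?"""
    p_a, p_b = clicks_a / n_a, clicks_b / n_b
    p_pool = (clicks_a + clicks_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # One-sided p-value from the normal CDF.
    p_value = 1 - 0.5 * (1 + erf(z / sqrt(2)))
    return z, p_value

# Variant A vs. variant B, each shown to a small micro-targeted sample.
z, p = two_proportion_z(clicks_a=120, n_a=2000, clicks_b=165, n_b=2000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

A small p-value tells the campaign, cheaply and quickly, that variant B should carry the broader persuasion effort.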

Paired together, the predictive models and personalized profiles described above become something else entirely: ingredients in a sophisticated mechanism for altering the behavior of voters, oftentimes using information about them without their consent or knowledge. This trend will only intensify over time with the application of new technologies: in the words of Nickerson and Rogers, “as campaign data analytics becomes more common, sophisticated, and mature, it will likely move away from judgment-based regressions to regressions based on customized machine learning algorithms” (61). This finding is reinforced by Bartlett et al in “The Future of Political Campaigning,” where the authors write that data gathered or purchased by campaigns can be input into analytics programs that spit out “highly precise, contextually relevant and inherently actionable insights as to the motivating drivers behind a customer’s predicted behaviour at machine learning speed and scale” (10). These insights behind what a customer or voter is likely to do can then be paired with psychographic profiles, which use an individual’s behavior to draw inferences about their personality traits. These tactics — labeled “psychometrics” — take consumer psychology and behavioral sciences to their logical, if dystopian, end: as Bashyakarla et al describe, “Psychometric profiling takes… [consumer psychology] a step further by mining vast quantities of personal data, which political strategists can use to tailor their communications to have greater influence on political opinions and voter preferences” (104). 
Because our digital lives and habits can be used to “quickly and accurately predict [or confirm] sexual orientation, ethnicity, religious and political views, personality traits, intelligence, happiness, use of addictive substances, parental separation, age and gender” (Bartlett et al 21), and because this information can be paired with information from our voter files like our voting history, our geographic location, and our likely level of wealth and education (due to average levels in our area), psychometric profiling represents a distinct threat of not just propaganda, but hyper-individualized propaganda. In the words of Matz et al in “Psychological Targeting as an Effective Approach to Digital Mass Persuasion,” “extrapolating from what one does to who one is is likely just the first step in a continuous development of psychological mass persuasion” (12717–12718). This evidence demonstrates the end point of the triadic framework of data manipulation: data is used as a political asset to gather political intelligence that is employed as a tool of propaganda to influence the behavior of voters, whether in persuading them to vote or not vote, donate or not donate, spread awareness or spread misinformation.
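As a toy illustration of the kind of inference psychometric profiling relies on, the sketch below maps invented behavioural signals to trait estimates through hand-picked weights. An actual system would learn such weights from large labeled datasets, but the shape of the computation is similar.

```python
# Hypothetical psychometric inference: behavioural signals mapped to
# trait estimates via weights. Signals and weights are invented, not
# drawn from any real model.
signals = {"follows_philosophy_pages": 1, "likes_extreme_sports": 0,
           "posts_late_at_night": 1, "uses_party_hashtags": 0}

# Per-trait weights over observed signals, as a trained model might hold.
trait_weights = {
    "openness":     {"follows_philosophy_pages": 0.6, "posts_late_at_night": 0.2},
    "extraversion": {"likes_extreme_sports": 0.5, "uses_party_hashtags": 0.4},
}

def infer_traits(signals, trait_weights):
    """Score each trait as a weighted sum of observed behavioural signals."""
    return {trait: sum(w * signals.get(s, 0) for s, w in weights.items())
            for trait, weights in trait_weights.items()}

profile = infer_traits(signals, trait_weights)
print(profile)
```

A campaign holding such a profile could then pitch novelty to the high-openness voter and community to the high-extraversion one, which is precisely the "tailored communications" Bashyakarla et al warn about.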

As our digital infrastructure continues to expand, so too will the use of psychometric profiles in combination with extensive data gathering to persuade voters through micro-targeted, localized messaging that relies on emotional triggers — i.e. using knowledge of a voter’s fears or hopes to refine messaging that reflects those emotional states. While some voters may realize the extent to which their data is vulnerable, they may not realize that data consultants have “developed the ability to match the data in national voter files with online, cookie-based profiles and… [are] offering new services capable of delivering targeted ads to voters at any web site they might visit” (Rubinstein 899). As the Internet of Things — the interrelated set of devices that we rely on in more and more facets of our daily lives — grows, Bartlett et al write that the “direction of travel is towards customer segments of one. Companies and marketers are increasingly seeking to target consumers on an individual and personalised basis, across their [interconnected suite of] devices, based on widely available data about them from multiple sources” (30, emphasis added). To substantiate this claim, Bartlett and his co-authors point to the example of a data consultancy, one of many, that claims to have “matched 155 million registered voters to their ‘email addresses, online cookies, and social handles’, in addition to their ‘400 segmentation filters’ which ‘combine demographic, geographical, cultural and interest-based data to create the precise profiles you need’” (30).
This evidence points to a future in which a personality profile is built about you based off of your voter information and your online habits; the assumptions about your personality will then be tested and refined through machine learning algorithms that target you with messaging across your phone, your laptop, your television, your podcasts, and more; as those algorithms learn more about your preferences, they will continue to segment you, reaching you with hyper-personalized propaganda meant to sway your behavior by (a) telling you what you want to hear, (b) tailoring a message to align with your personality, and (c) subtly misinforming or misleading you. This is a future in which “the candidate knows everything about the voter, but the media and the public know nothing about what the candidate really believes. It is, in effect, a nearly perfect perversion of the political process” (Gertner, emphasis added). As the political process is frayed and distorted by the machinations of digital political propaganda, the fabric of democracy itself will become increasingly exposed to risk.
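Mechanically, the voter-file-to-cookie matching that Rubinstein and Bartlett et al describe is a join on a shared identifier, often a normalised, hashed email address. The sketch below illustrates the idea; the names, addresses, and cookie IDs are invented.

```python
# Hedged sketch of voter-file-to-ad-profile matching via hashed email.
# All names, emails, and cookie IDs here are invented.
import hashlib

def email_key(email):
    """Normalise and hash an email so two datasets can be joined
    without exchanging raw addresses (a common ad-tech practice)."""
    return hashlib.sha256(email.strip().lower().encode()).hexdigest()

voter_file = [{"name": "A. Voter", "email": "a.voter@example.com",
               "voted_2020": True}]
ad_profiles = {email_key("A.Voter@example.com "): {"cookie_id": "c-1138",
               "segments": ["outdoorsy", "local-news-reader"]}}

# Join: each matched voter can now be reached across devices by cookie ID.
matched = [{**v, **ad_profiles[email_key(v["email"])]}
           for v in voter_file if email_key(v["email"]) in ad_profiles]
print(matched[0]["cookie_id"])
```

Once the join succeeds, the voter's offline record (turnout history, registration) travels with their online identity, enabling the cross-device targeting described above.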

Evidence of digital political propaganda is easy to rationalize away. As discussed above, many of the propagandists compare their work to commercial advertising, offering an implicit argument along the lines of “if they are doing it too, how bad can it be?” While it is beyond the scope of this discussion to explore the evidence that commercial propaganda is itself pernicious to the health and well-being of society, one way to move past this deflection by comparison is to look at the secrecy surrounding digital political marketing and persuasion techniques. An examination of the language and the secrecy around these technologies calls to mind the extensive classification regimes that accompanied the development of biological weaponry in the early 20th century, which Harris and Paxman describe in A Higher Form of Killing as follows: “each side camouflaged the existence of these stocks [of weapons] with great secrecy for fear that the enemy would discover them” (121). When it comes to the tools of digital political manipulation, the same hypervigilant secrecy policies obtain. According to Nickerson and Rogers, “campaigns view their analytic techniques as secret weapons to be kept out of the hands of opponents” (Nickerson and Rogers 53, emphasis added). Rubinstein reaches the same findings in “Voter Privacy in the Age of Big Data”, pulling directly from the statements of campaign members themselves. For example, he writes that, “‘we have no interest in telling our opponents our digital strategy,’ said one Obama spokesperson; another said: ‘They are our nuclear codes’” (897, emphasis added). It is no accident that the analogy drawn by the spokesperson is to the hypersecrecy associated with nuclear development efforts. 
The nuclear projects were shrouded by a veil of secrecy and opacity: in fact, Eric Schlosser argues in Command and Control that “secrecy is essential to the command and control of nuclear weapons… They are not classified by government officials; they are classified as soon as they exist” (465). The same argument about secrecy can be made based on all the available evidence about digital manipulation tactics: they are tools of political propaganda designed to be used covertly so that the general public is either unaware of their existence or is unquestioning of the extent to which they are pernicious.

Secrecy and opacity are key elements in ensuring that the public remains unaware or apathetic. Writing of the 2012 presidential election, Ira Rubinstein points out that neither the candidates nor the political parties provided the electorate with a “clear and concise description of campaign data practices in their entirety or what choices they had (if any) to (1) grant, limit, or withdraw consent about the collection, use, and disclosure of their personal data; (2) access, correct, or request the deletion of data about them; or (3) make an inquiry or lodge a complaint” (891). As a result, voters are often unaware that their data is being used (Bashyakarla et al 28); unable to determine the source of the targeted messaging they receive (Bartlett et al 38); unable to opt out from experimentation with their data for purposes like A/B testing (Bashyakarla et al 38); and incapable of exercising their rights to amend and control how their data is used against them (Bartlett et al 41). This evidence suggests that not only are weapons of political propaganda capable of exerting influence over the political system and the behavior of voters, but moreover, that voters are powerless to control how those weapons are developed and applied. Here, again, the covert and authoritarian development of nuclear weaponry comes to mind.

The relationship between science and war in the 20th century was one of massive, unprecedented physical destruction. Hundreds of thousands of lives were lost to emerging technologies developed by scientists, oftentimes operating with the utmost secrecy. In the 21st century, the destruction wrought by developments in science will no longer be physical, but rather digital; it is not lives that will be lost to the effects of digital political propaganda, although as noted in “The Global Disinformation Disorder”, these tools “will certainly enable authoritarian regimes to manufacture consensus, automate suppression, and undermine trust in the liberal international order” (Bradshaw and Howard 1). The destruction enabled by the application of neuroscience, psychology, behavioral economics, and sophisticated data science to political advertising will instead be the fragmentation of the concepts of truth and choice. In this world, the political process will be deeply distorted by campaigns that profile and relentlessly manipulate us, persuading us with each micro-targeted, geographically refined message and each meticulously crafted piece of misinformation to hand over ownership of the political infrastructure of our own democracies. The new warfare will take place in our heads and on our curated, fragmented world of screens. Welcome to the new weapons of mass manipulation.

Works Cited:

Bartlett, Jamie, et al. “The Future of Political Campaigning”. Demos, July 2018. ico.org.uk/media/action-weve-taken/reports/2259365/the-future-of-political-campaigning.pdf

Bashyakarla, Varoon, et al. “Personal Data: Political Persuasion Inside the Influence Industry. How it works.” Tactical Tech, March 2019. cdn.ttc.io/s/tacticaltech.org/Personal-Data-Political-Persuasion-How-it-works.pdf

Bradshaw, Samantha and Philip N. Howard. “The Global Disinformation Disorder: 2019 Global Inventory of Organised Social Media Manipulation.” Working Paper 2019.2. Oxford, UK: Project on Computational Propaganda. comprop.oii.ox.ac.uk/wp-content/uploads/sites/93/2019/09/CyberTroop-Report19.pdf

Duhigg, Charles. “Campaigns Mine Personal Lives to Get Out Vote.” The New York Times, 13 Oct. 2012.

Gertner, Jon. “The Very, Very Personal Is the Political.” The New York Times Magazine (2004): 42.

Harris, Robert, and Jeremy Paxman. A Higher Form of Killing: The Secret History of Chemical and Biological Warfare. Random House, 2002.

Matz, Sandra C., et al. “Psychological Targeting as an Effective Approach to Digital Mass Persuasion.” Proceedings of the National Academy of Sciences 114.48 (2017): 12714–12719.

Nelson, Richard Alan. A Chronology and Glossary of Propaganda in the United States. Greenwood Publishing Group, 1996.

Nickerson, David W., and Todd Rogers. “Political Campaigns and Big Data.” Journal of Economic Perspectives 28.2 (2014): 51–74.

Rubinstein, Ira S. “Voter Privacy in the Age of Big Data.” Wis. L. Rev. (2014): 861.

Schlosser, Eric. Command and Control: Nuclear Weapons, the Damascus Accident, and the Illusion of Safety. Penguin, 2013.
