Political Manipulation and Internet
Advertising Infrastructure
Matthew Crain and Anthony Nadler
ABSTRACT
Disinformation and other forms of manipulative, antidemocratic communication
have emerged as a problem for Internet policy. While such operations are not
limited to electoral politics, efforts to influence and disrupt elections have cre-
ated significant concerns. Data-driven digital advertising has played a key role in
facilitating political manipulation campaigns. Rather than stand-alone incidents,
manipulation operations reflect systemic issues within digital advertising markets
and infrastructures. Policy responses must include approaches that consider digital
advertising platforms and the strategic communications capacities they enable. At
their root, these systems are designed to facilitate asymmetrical relationships of
influence.
Keywords: Disinformation, political advertising, infrastructure, targeted advertis-
ing, social media
This research was supported in part by the Government of Canada. The views expressed here are
the authors’ own.
8. As Full Fact points out, a “moral panic” around fake news could prompt overreactions that
threaten free speech. Full Fact.
9. Shane and Blinder.
10. Bradshaw and Howard.
11. U.S. v. Internet Research Agency.
12. Bradshaw and Howard.
13. Ibid.
14. Ghosh and Scott, “Russia’s Election Interference.”
15. McNair.
16. Kaye; Kim et al.; Valentino-DeVries.
17. Google, “Changing Channels,” 12.
social feeds, mobile apps, websites, and other channels. Highly segmented
message targeting, through digital advertising, can help spur “organic”
amplification and generate human assets for information operations.18
Major ad platforms have typically operated as open marketplaces, avail-
able to any advertiser who meets basic quality standards. Responding to
controversies, platforms have tightened restrictions in recent years, imple-
menting various protocols for advertiser authentication and restricting
access to ad services for certain groups. For example, Facebook now requires
advertisers in certain countries to “confirm identity and location before
running political ads and disclose who paid for them.”19 As we discuss below,
while such policies represent first steps against political manipulation, influence
operations can circumvent them with relative ease. The scope and implementation
of these systems, which vary significantly, require careful regulatory scrutiny.
Digital ad infrastructure provides three key interlocking communica-
tion capacities.20 The first is the capacity to use consumer monitoring to
develop detailed consumer profiles. The second is the capacity to target
highly segmented audiences with strategic messaging across devices and
contexts. The third is the capacity to automate and optimize tactical ele-
ments of influence campaigns. Numerous technical means have been developed
to enable these capacities.21 Like all advertising formats, digital ad spaces are
designed to offer advertisers many choices. However, we see these three capacities
as essential, top-level features built into the digital infrastructure that enable
today’s data-driven advertising practices.
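To make these capacities concrete, here is a minimal sketch in Python. Every class name and field is a hypothetical illustration of the pattern, not any platform’s actual data model or API:

```python
# A minimal, hypothetical sketch of the three interlocking capacities
# described above. All names and fields are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class ConsumerProfile:
    # Capacity 1: monitoring data aggregated into a detailed profile.
    user_id: str
    inferred_interests: list = field(default_factory=list)
    devices: list = field(default_factory=list)

@dataclass
class AudienceSegment:
    # Capacity 2: a highly segmented target definition that can be
    # applied across devices and contexts.
    required_interests: frozenset

    def matches(self, profile: ConsumerProfile) -> bool:
        return self.required_interests <= set(profile.inferred_interests)

def best_variant(response_rates: dict) -> str:
    # Capacity 3: automated optimization, reduced here to selecting the
    # message variant with the highest observed response rate.
    return max(response_rates, key=response_rates.get)
```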
around characteristics and traits that have not been self-disclosed by the
targets.30 Such inferences have been made available to target, or exclude,
politically sensitive groups for social media ad campaigns.31
Microtargeting
36. Enwemeka.
37. Ghosh and Scott, “Digital Deceit I.”
38. HubSpot, “What Is Deep Learning?” https://blog.hubspot.com/marketing/what-is-deep-learning.
39. Kaptein et al.; Berkovsky, Kaptein, and Zancanaro, 18.
variations to see which are the most effective. Advertisers can use such tools
to determine what issues resonate with particular targets as well as test for
fears or prejudices that can be invoked to influence political behavior.
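As an illustration of the mechanics, a bare-bones split test might look like the following sketch, in which serve_ad and the audience are hypothetical stand-ins rather than any platform’s real interface:

```python
# Minimal sketch of split (A/B) testing message variations. The
# serve_ad(variant, target) callable is a hypothetical stand-in that
# returns 1 if the target responded (click, share, etc.), else 0.
import random

def split_test(variants, audience, serve_ad):
    results = {v: [0, 0] for v in variants}  # variant -> [impressions, responses]
    for target in audience:
        v = random.choice(variants)          # randomly assign a variant
        results[v][0] += 1
        results[v][1] += serve_ad(v, target)
    # Return the variant with the highest observed response rate.
    return max(results, key=lambda v: results[v][1] / max(results[v][0], 1))
```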
These systems bring significant speed and cost advantages, allowing adver-
tisers to quickly and efficiently tailor their efforts to meet particular strategic
objectives. Campaigns can be optimized for individual behaviors like clicks
and video views, but they can also be tuned to elevate particular conversations
or promote social interaction.40 As standard practice, digital market-
ing campaigns are coordinated across multiple platforms and channels and
paid advertising is often deployed in conjunction with other promotional
techniques. Tools such as social media management services enable advertis-
ers to operate complex multiplatform campaigns and use automated deci-
sion-making systems to “optimize persuasive power for every dollar spent.”41
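A rough sketch of the kind of automated decision-making involved follows, here as a simple epsilon-greedy allocator that shifts budget toward whichever channel is currently producing the most responses per dollar; the channels, response model, and allocation rule are our illustrative assumptions, not any vendor’s actual system:

```python
# Hypothetical sketch of automated cross-platform budget allocation in
# the spirit of "optimizing persuasive power for every dollar spent."
# observe(channel, amount) is an assumed stand-in returning the number
# of responses that the spend produced on that channel.
import random

def allocate_budget(channels, spend_unit, rounds, observe, epsilon=0.1):
    stats = {c: {"spend": 0.0, "responses": 0} for c in channels}
    for _ in range(rounds):
        if random.random() < epsilon:
            channel = random.choice(channels)  # occasionally explore
        else:                                  # otherwise exploit the best rate
            channel = max(channels, key=lambda c:
                          stats[c]["responses"] / max(stats[c]["spend"], 1e-9))
        stats[channel]["spend"] += spend_unit
        stats[channel]["responses"] += observe(channel, spend_unit)
    return stats
```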
43. Ariely.
44. Calo; Shaw.
45. PHD Media.
46. Some of the landmark contributions to this line of critique include Williamson; Packard;
Ewen; McClintock.
47. Zuboff; Ghosh and Scott, “Digital Deceit I.”
They exploited social unrest and human cognitive biases. The divi-
sive propaganda Russia used to influence American thought and
steer conversations for over three years wasn’t always objectively false.
The content designed to reinforce in-group dynamics would likely
have offended outsiders who saw it, but the vast majority wasn’t hate
speech. Much of it wasn’t even particularly objectionable. But it was
absolutely intended to reinforce tribalism, to polarize and divide . . . 55
Digital ad systems offer a great advantage for such efforts over mass audi-
ence print and broadcast media. First, microtargeting allows advertisers
61. Albright.
62. Penzenstadler, Heath, and Guynn.
63. Facebook Business.
64. Google. “Political Content.”
65. UK House of Commons, “Disinformation and ‘Fake News’: Interim Report,” 37.
66. Ravel, Woolley, and Sridharan.
67. Ibid.
68. Turow.
Discussion of Transparency
Major tech companies such as Facebook and Google have already started
to implement their own policies requiring certain types of political ads in
some countries to include a disclaimer naming a sponsor and to go through
a verification process. These verification processes, however, have proved
feeble. Just before the 2018 midterm election, a VICE news investigation
team “applied to buy fake ads on behalf of all 100 sitting U.S. senators,
including ads ‘Paid for by’ Mitch McConnell and Chuck Schumer. All 100
sailed through the system, indicating that just about anyone can buy an
ad identified as ‘Paid for by’ a major U.S. politician.”71 Even if measures
are put in place to prevent advertisers from impersonating elected officials,
as long as sponsors are easily able to create front groups that provide lit-
tle information about their donors, such disclaimers do little to provide
meaningful information to citizens, journalists, researchers, or regulators.
In certain circumstances, there are legitimate concerns that requiring
identification of donors for political ads could chill speech. We recommend
policymakers weigh these tradeoffs carefully, but because targeted advertising
relies on users’ personal data, we think there is more justification for
limiting anonymous speech by large donors
in this area than others. Alternatively, policymakers could require that
platforms specifically ask users if they are willing to allow their data to be
used by political groups that do not disclose all major donors. We suspect
that requiring such explicit permission from users would effectively end
the practice of anonymously funded targeted ads.
Requiring ad platforms to stringently verify the identity of sponsor-
ing organizations and their financing is a separate issue from requiring
sponsors to make their donors public. Making ad sales contingent upon
verification is a crucial step to preventing undisclosed foreign influence
operations from using targeted advertising. If the burdens of a rigorous
verification process significantly disadvantage small advertisers, policy-
makers could consider whether to place spending thresholds below which
advertisers could use a less rigorous process, though ad platforms would
need to take steps to prevent abuse of this leniency.
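As a toy illustration of such a tiered scheme, the threshold amount and tier names below are invented for the sketch; the real values would be policy decisions:

```python
# Hypothetical sketch of tiered sponsor verification keyed to ad spend.
SPEND_THRESHOLD = 5_000  # assumed dollars per election cycle, purely illustrative

def required_verification(cumulative_spend: float) -> str:
    """Below the threshold, a lighter process spares small advertisers;
    above it, identity and financing must be rigorously verified before
    further ads are sold. Platforms would still need to detect sponsors
    splitting spend across shell accounts to abuse the lighter tier."""
    return "basic" if cumulative_spend < SPEND_THRESHOLD else "rigorous"
```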
Data Rights
71. Turton.
72. Information Commissioner’s Office; Bradshaw, Neudert, and Howard.
73. Ghosh and Scott, “Digital Deceit II”; Greenspon and Owen.
74. Ghosh and Scott, “Digital Deceit II,” 22.
75. General Data Protection Regulation, Article 4(11).
76. Woodrow Hartzog, “Policy Principles for a Federal Data Privacy Framework in the
United States,” U.S. Senate Committee on Commerce, Science and Transportation (2019).
77. General Data Protection Regulation, Article 4(11).
78. Dillet.
79. Ravel, Woolley, and Sridharan, 14.
80. Chester and Montgomery.
81. Rothchild.
82. Centre for International Governance Innovation.
83. The sample was 6,387 adults in France, Germany, the United Kingdom, and the United
States. RSA Security.
84. UK House of Commons, “Disinformation and ‘Fake News’: Final Report.”
85. Bradshaw, Neudert, and Howard; McCann and Hall.
possible to give individuals control over how their data is used by different
advertisers.
One implementation of granularity would apply consent not only
to the platforms that provide advertising infrastructure services, but to
every advertiser that uses those platforms to profile and target individuals.
Rather than simply asking individuals for blanket consent to all manner of
targeted advertising (as Facebook has attempted to do under the GDPR86),
permission could be obtained by individual advertisers on a platform-by-
platform basis. For example, XYZ Political Action Committee (PAC) could
be required to obtain consent from individuals before targeting them on
Facebook, regardless of whether the PAC imports its own database of
supporters or simply uses Facebook’s baked-in ad targeting systems.87 If
the PAC then wanted to reach those same individuals on another platform,
further consent could be required. If split testing were used in any of these
instances, separate and distinct consent could be mandated as well.
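A minimal sketch of how granular consent could be keyed is given below; the (user, advertiser, platform, purpose) tuple and all names are our own illustration rather than a reading of the GDPR:

```python
# Hypothetical sketch of granular consent: permission is recorded per
# (user, advertiser, platform, purpose) rather than as a blanket grant.
CONSENTS = set()

def grant(user, advertiser, platform, purpose):
    CONSENTS.add((user, advertiser, platform, purpose))

def may_target(user, advertiser, platform, purpose="profile_targeting"):
    # Split testing would need its own distinct purpose entry,
    # e.g. purpose="split_testing".
    return (user, advertiser, platform, purpose) in CONSENTS

# A PAC may target a user on one platform only after explicit consent;
# reaching the same user on another platform requires further consent.
grant("user123", "XYZ_PAC", "platform_a", "profile_targeting")
assert may_target("user123", "XYZ_PAC", "platform_a")
assert not may_target("user123", "XYZ_PAC", "platform_b")
```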
We contend that granular consent aligns with the spirit of GDPR’s
purpose specification requirement. Guidelines from the Article 29 Data
Protection Working Party (the precursor to the European Data Protection Board)
suggest that data processors should “consider introducing a process of
granular consent where they provide a clear and simple way for data
subjects to agree to different purposes for processing.”88 Current inter-
pretations seem to understand targeted advertising as a single category
of purpose. We argue that advertising contains a spectrum of purposes
dependent upon advertiser identities, objectives, and targeting mecha-
nisms. Policy should recognize that important distinctions exist between
an ad campaign that uses profile data to target individuals about a divisive
social issue and a consumer product campaign that uses demographics to
reach a broad audience.
Granular consent extends the basic principle that people must be
informed in order to make choices about how their data is used. This
86. In addition to seeking blanket consent from its users, Facebook has also “bundled” con-
sent to advertising within its more general terms of service provision. At the time of this writing,
privacy regulators in several EU countries are investigating this issue as it pertains to Facebook
and other major ad platforms.
87. To the best of our knowledge, the GDPR is not clear on whether consent is required to be
obtained by advertisers that use the built-in targeting capacities of an ad platform like Facebook.
88. Article 29 Data Protection Working Party, “Guidelines on Automated Individual
Decision-Making and Profiling for the Purposes of Regulation 2016/679,” October 3, 2017,
https://ec.europa.eu/newsroom/article29/item-detail.cfm?item_id=612053.
under the rationale that market forces will diversify the tech services land-
scape and give consumers more choices that enhance privacy and reduce
manipulative targeting.
In addition to market-based approaches, policymakers should con-
sider more direct forms of intervention into the data-driven advertising
capacities that are most susceptible to abuse. As Hartzog notes, rather than
offloading risk to consumers through transparency guidelines and consent
mechanisms, “strong rules limiting collection and storage on the front end
can mitigate concern about the privacy problems raised through data ana-
lytics, sharing, and exploitation.”95
The UK House of Commons final report on Disinformation and Fake
News proposes “re-introducing friction into the online experience.”96
While that report focuses on slowing down user interactivity “to give peo-
ple time to consider what they are writing and sharing,” we propose that
incorporating friction into ad targeting systems could be an effective means
to tamp down advertising-supported political manipulation. Proposals in
this area generally seek to
1. Limit advertisers’ capacities to find and target vulnerabilities
2. Mitigate the tendency of online political advertising toward niche targeting
that can amplify social segmentation, by creating incentives that encourage
campaigns to address a broad and heterogeneous public sphere
If political advertising requires elevated codes of transparency and data
rights in order to meet public interest goals, then policymakers should also
consider higher public interest standards for the tools and techniques of
political influence operations. Such an approach draws from and extends
GDPR-style privacy regulation, which as Bradshaw et al. note, “has gaps in
coverage and enforcement that limit its effectiveness to address all problems
associated with social media manipulation and data-driven targeting.”97
Proposals under the category of public interest ad regulation include
1. Political profiling and targeting could be inhibited by strong data min-
imization standards such as those mandated by the GDPR. Key com-
ponents of data minimization are “collecting personal data only when
it is absolutely needed; deciding if some types of data should never be
95. Hartzog, “Policy Principles for a Federal Data Privacy Framework in the United States.”
96. UK House of Commons, “Disinformation and ‘Fake News’: Final Report.”
97. Bradshaw, Neudert, and Howard.
collected; keeping data only for as long as necessary; and limiting access
to only those who truly need it.”98
2. Advertising profile information, or certain categories therein, could be
subject to firm expiration dates. In such cases, “old data” would routinely
be expunged from storage systems.99 (A minimal sketch of such routine
expiry appears after this list.) This would limit advertisers’ ability to
develop profiles over long periods of time and could shift advertising
away from intermittent communication toward more periodic contact with
trusted entities. The GDPR includes rules intended to limit data storage,
though they appear to give wide discretion to data processors/controllers.
3. Policymakers should closely scrutinize specific advertising techniques
that present clear opportunities for abuse and convene multistakeholder
discussions about their social benefits and costs. Policymakers should
move to constrain profiling and targeting practices that are found to
present unacceptable levels of political risk. Lookalike targeting,100 geo-
targeting,101 cross-device tracking, third-party data brokering,102 split
testing, and microtargeting are among the techniques that deserve
heightened regulatory review.
4. Policymakers should commission or undertake research to consider pol-
icies that could encourage political advertisers to forego microtargeting
and address broad and heterogeneous constituencies. Potentially, such
a result could come from policies that greatly restrict data collection or
targeting capacities. Yet, policymakers might consider more direct routes
to counteracting the economic incentives that push political advertis-
ing toward digital microtargeting. Policies could include direct requirements
that no more than a certain percentage of an ad platform’s political
advertising may meet well-defined criteria for microtargeting.
Other policies could include additional burdens on funding used for
microtargeted political advertising, such as not allowing tax-deductible
nonprofit funds to be used for microtargeted ads.
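The routine expiry suggested in item 2 might be sketched as follows; the ninety-day window and the store layout are assumptions made only for illustration:

```python
# Hypothetical sketch of firm expiration dates on profile data (item 2
# above): observations older than the retention window are routinely
# expunged from the store.
import time

RETENTION_SECONDS = 90 * 24 * 3600  # assumed 90-day retention limit

# user_id -> list of (unix_timestamp, attribute) observations
profiles = {}

def purge_expired(now=None):
    cutoff = (now if now is not None else time.time()) - RETENTION_SECONDS
    for user_id, observations in profiles.items():
        profiles[user_id] = [(ts, a) for ts, a in observations if ts >= cutoff]
```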
106. “IPA to Call for Moratorium on Micro-Targeted Political Ads Online,” accessed
March 5, 2019, https://ipa.co.uk/news/ipa-to-call-for-moratorium-on-micro-targeted-political-
ads-online#.
107. Ibid.
108. Singer.
Bibliography
Angwin, Julia, and Terry Parris Jr. “Facebook Lets Advertisers Exclude Users by Race.”
ProPublica, October 28, 2016. Accessed March 15, 2019. https://www.propublica.org/
article/facebook-lets-advertisers-exclude-users-by-race.
Angwin, Julia, Madeleine Varner, and Ariana Tobin. “Facebook Enabled Advertisers to Reach
‘Jew Haters.’” ProPublica, September 14, 2017. Accessed March 15, 2019. https://www.
propublica.org/article/facebook-enabled-advertisers-to-reach-jew-haters.
Ariely, Dan. Predictably Irrational: The Hidden Forces That Shape Our Decisions. New York:
Harper Collins, 2008.
Beckett, Lois. “Trump Digital Director Says Facebook Helped Win the White House.”
The Guardian, October 9, 2017, sec. Technology. https://www.theguardian.com/
technology/2017/oct/08/trump-digital-director-brad-parscale-facebook-advertising.
Berkovsky, Shlomo, Maurits Kaptein, and Massimo Zancanaro. “Adaptivity and Personalization
in Persuasive Technologies.” In Proceedings of the Personalization in Persuasive
Technology Workshop, Persuasive Technology 2016, edited by R. Orji, M. Reisinger,
M. Busch, A. Dijkstra, A. Stibe, and M. Tscheligi, Salzburg, Austria, April 5, 2016.
Bey, Sebastian, Giorgio Bertolin, Nora Biteniece, Edward Christie, and Anton Dek.
“Responding to Cognitive Security Challenges.” NATO STRATCOM Centre
of Excellence, January 2019. Accessed March 15, 2019. https://stratcomcoe.org/
responding-cognitive-security-challenges.
Bodine-Baron, E., T. Helmus, A. Radin, and E. Treyger. Countering Russian Social Media
Influence. Santa Monica, CA: Rand Corporation, 2018. Accessed March 15, 2019. https://
www.rand.org/content/dam/rand/pubs/research_reports/RR2700/RR2740/RAND_
RR2740.pdf.
Bradshaw, S., and P. Howard. Challenging Truth and Trust: A Global Inventory of Organized
Social Media Manipulation. Computational Propaganda Research Project, Oxford
Internet Institute, 2018. Accessed March 15, 2019. http://comprop.oii.ox.ac.uk/research/
cybertroops2018/.
Bradshaw, S., L.-M. Neudert, and P. Howard. Government Responses to Malicious Use of Social
Media. NATO STRATCOM Centre of Excellence, 2018. Accessed March 15, 2019.
https://comprop.oii.ox.ac.uk/research/government-responses/.
Calo, Ryan. “Digital Market Manipulation.” George Washington Law Review 82, no. 4 (August
2014): 995–1051.
Centre for International Governance Innovation (CIGI). “2018 CIGI-Ipsos Global Survey on
Internet Security and Trust,” 2018. Accessed March 5, 2019. https://www.cigionline.org/
internet-survey-2018.
Chester, Jeff, and Kathryn C. Montgomery. “The Role of Digital Marketing in Political
Campaigns.” Internet Policy Review 6, no. 4 (December 31, 2017). Accessed March 15,
2019. https://policyreview.info/articles/analysis/role-digital-marketing-political-
campaigns.
Das, Sauvik, and Adam D. I. Kramer. “Self-Censorship on Facebook.” Facebook Research,
July 2, 2013. Accessed March 15, 2019. https://research.fb.com/publications/
self-censorship-on-facebook/.
Davies, Jessica. “WTF Is a Persistent ID.” Digiday, March 8, 2017. Accessed March 15, 2019.
https://digiday.com/marketing/wtf-persistent-id/
Dean, Sam. “Facebook Decided Which Users Are Interested in Nazis—and Let Advertisers Target
Them Directly.” Los Angeles Times, February 21, 2019. Accessed March 15, 2019. https://
www.latimes.com/business/technology/la-fi-tn-facebook-nazi-metal-ads-20190221-
story.html.
Dillet, Romain. “French Data Protection Watchdog Fines Google $57 Million under the
GDPR.” TechCrunch, January 21, 2019. Accessed March 15, 2019. https://techcrunch.
com/2019/01/21/french-data-protection-watchdog-fines-google-57-million-under-
the-gdpr/.
DiResta, Renee, Kris Shaffer, Becky Ruppel, David Sullivan, Robert Matney, Ryan Fox,
Jonathan Albright, and Ben Johnson. “The Tactics & Tropes of the Internet Research
Agency.” New Knowledge, 2018. Accessed March 15, 2019. https://cdn2.hubspot.net/
hubfs/4326998/ira-report-rebrand_FinalJ14.pdf.
The Electoral Commission. Digital Campaigning: Increasing Transparency for Voters. United
Kingdom, 2018. Accessed March 15, 2019. https://www.electoralcommission.org.uk/__
data/assets/pdf_file/0010/244594/Digital-campaigning-improving-transparency-for-
voters.pdf
Englehardt, Steven, and Arvind Narayanan. “Online Tracking: A 1-million-site Measurement
and Analysis,” October 27, 2016. Accessed March 15, 2019. http://randomwalker.info/
publications/OpenWPM_1_million_site_tracking_measurement.pdf.
Enwemeka, Zeninjor. “Under Agreement, Firm Won’t Target Digital Ads around Mass.
Health Clinics.” WBUR, April 4, 2017. Accessed March 15, 2019. http://www.wbur.org/
bostonomix/2017/04/04/massachusetts-geofencing-ads-settlement.
Estrin, J., and S. Gill. “The World Is Choking on Digital Pollution.” Washington Monthly,
January 13, 2019. Accessed March 15, 2019. https://washingtonmonthly.com/magazine/
january-february-march-2019/the-world-is-choking-on-digital-pollution/.
European Commission. “High Representative of the Union for Foreign Affairs and Security
Policy.” Action Plan against Disinformation (No. JOIN(2018) 36 final), 2018a.
Accessed March 15, 2019. https://ec.europa.eu/commission/sites/beta-political/files/
eu-communication-disinformation-euco-05122018_en.pdf
———. Report on the Implementation of the Communication “Tackling Online Disin‑
formation: A European Approach” (No. COM(2018) 794/3), 2018b. Accessed March 15,
2019. https://ec.europa.eu/commission/sites/beta-political/files/eu-communication-
disinformation-euco-05122018_en.pdf
Ewen, Stuart. Captains of Consciousness: Advertising and the Social Roots of the Consumer Culture.
New York: Basic Books, 2008.
Facebook Business. “Getting Authorized to Run Ads Related to Politics or Issues of National
Importance.” Advertiser Help Center. Accessed September 9, 2018. Accessed March 15,
2019. https://www.facebook.com/business/help/208949576550051.
Fridkin, Kim L., and Patrick J. Kenney. “Variability in Citizens’ Reactions to Different Types
of Negative Campaigns.” American Journal of Political Science 55, no. 2 (2011): 307–25.
Full Fact. Tackling Misinformation in an Open Society. 2018. Accessed March 15, 2019.
https://fullfact.org/media/uploads/full_fact_tackling_misinformation_in_an_open_
society.pdf
Ghosh, D., and B. Scott. Digital Deceit I: The Technologies behind Precision Propaganda on the
Internet. New America Foundation, January 2018a. Accessed March 15, 2019. https://
www.newamerica.org/public-interest-technology/policy-papers/digitaldeceit/.
———. Digital Deceit II: A Policy Agenda to Fight Disinformation on the Internet. New
America Foundation, 2018b. Accessed March 15, 2019. https://shorensteincenter.org/
digital-deceit-ii-policy-agenda-fight-disinformation-internet/
———. “Russia’s Election Interference Is Digital Marketing 101.” The Atlantic, February 19, 2018c.
Accessed March 15, 2019. https://www.theatlantic.com/international/archive/2018/02/
russia-trump-election-facebook-twitter-advertising/553676/.
Google. “Changing Channels: Building a Better Marketing Strategy to Reach Today’s Viewers,”
February 2018a. Accessed March 15, 2019. https://services.google.com/fh/files/misc/
changing_channels_a_marketers_guide_to_tv_and_video_advertising.pdf
———. “Political Content—Advertising Policies Help.” Accessed September 16, 2018b. Accessed
March 15, 2019. https://support.google.com/adspolicy/answer/6014595?hl=en
Graves, Christopher, and Sandra Matz. “What Marketers Should Know About Personality-
Based Marketing.” Harvard Business Review, May 2, 2018. Accessed March 15, 2019.
https://hbr.org/2018/05/what-marketers-should-know-about-personality-based-
marketing.
Greenspon, E., and T. Owen. Democracy Divided: Countering Disinformation and Hate
in the Digital Public Sphere. University of British Columbia: Public Policy Forum,
2018. Accessed March 15, 2019. https://ppforum.ca/publications/social-marketing-
hate-speech-disinformation-democracy/
Hartzog, Woodrow. “Opinions—The Case Against Idealising Control.” European Data Protection
Law Review 4, no. 4 (2018): 423–32. doi:10.21552/edpl/2018/4/5.
Hill, Kashmir. “‘Do Not Track’ Privacy Tool Doesn’t Do Anything.” Gizmodo, October 15,
2018. Accessed March 15, 2019. https://gizmodo.com/do-not-track-the-privacy-tool-
used-by-millions-of-peop-1828868324.
House of Commons (United Kingdom). Digital, Culture, Media and Sport Committee.
Disinformation and ‘Fake News’: Interim Report (No. HC 363), 2018. Accessed March 15,
2019. https://publications.parliament.uk/pa/cm201719/cmselect/cmcumeds/363/363.pdf
House of Commons of Canada. Standing Committee on Access to Information, Privacy and
Ethics. Democracy under Threat: Risks and Solutions in the Era of Disinformation and
Data Monopoly, 2018.
McCann, D., and M. Hall. Blocking the Data Stalkers. New Economics Foundation, 2018.
Accessed March 15, 2019. https://neweconomics.org/uploads/files/NEF_Blocking_Data_
Stalkers.pdf
McClintock, Anne. “Soft-Soaping Empire: Commodity Racism and Imperial Advertising.” In
Travellers’ Tales: Narratives of Home and Displacement, edited by Jon Bird, Barry Curtis,
Melinda Mash, Tim Putnam, George Robertson, and Lisa Tickner, 129–52. London:
Routledge, 2005.
McNair, Corey. “Global Ad Spending Update.” eMarketer, November 20, 2018. Accessed March
15, 2019. https://www.emarketer.com/content/global-ad-spending-update.
Morris, Steven. “British Army Ads Targeting ‘Stressed and Vulnerable Teenagers.’” The Guardian,
June 8, 2018. Accessed March 15, 2019. https://www.theguardian.com/uk-news/2018/
jun/08/british-army-criticised-for-exam-results-day-recruitment-ads.
Nadler, Anthony, Matthew Crain, and Joan Donovan. “Weaponizing the Digital Influence
Machine: The Political Perils of Online Ad Tech.” Data & Society Research
Institute, October 17, 2018. Accessed March 15, 2019. https://datasociety.net/output/
weaponizing-the-digital-influence-machine/.
Nielsen, Rasmus Kleis, and Sarah Anne Ganter. “Dealing with Digital Intermediaries: A Case
Study of the Relations between Publishers and Platforms.” New Media & Society 20, no.
4 (April 1, 2018): 1600–17. doi:10.1177/1461444817701318.
Oremus, Will. “Facebook Says a ‘Clear History’ Tool Will Hurt Its Advertising Business. Good.”
Slate, February 27, 2019. Accessed March 15, 2019. https://slate.com/technology/2019/02/
facebook-clear-history-button-real-wow.html.
Packard, Vance. The Hidden Persuaders. New York: David McKay Company, 1957.
Penzenstadler, Nick, Brad Heath, and Jessica Guynn. “We Read Every One of the 3,517 Facebook
Ads Bought by Russians. Here’s What We Found.” USA Today, May 13, 2018.
PHD Media, “New Beauty Study Reveals Days, Times and Occasions When U.S. Women Feel
Least Attractive.” Cision PR Newswire, October 2, 2013. https://www.prnewswire.com/
news-releases/new-beauty-study-reveals-days-times-and-occasions-when-us-women-feel-
least-attractive-226131921.html.
Pickard, Victor. “Break Facebook’s Power and Renew Journalism.” The Nation, April 18,
2018. Accessed March 15, 2019. https://www.thenation.com/article/break-facebooks-
power-and-renew-journalism/.
———. “The Violence of the Market.” Journalism 20, no. 1 (January 1, 2019): 154–58.
doi:10.1177/1464884918808955.
Ravel, A. M., S. C. Woolley, and H. Sridharan. Principles and Policies to Counter Deceptive Digital
Politics. Maplight; Institute for the Future, 2019. Accessed March 15, 2019. https://s3-us-
west-2.amazonaws.com/maplight.org/wp-content/uploads/20190211224524/Principles-
and-Policies-to-Counter-Deceptive-Digital-Politics-1-1-2.pdf
Reilly, Michael. “Is Facebook Targeting Ads at Sad Teens?” MIT Technology Review, May 1, 2017.
Accessed March 15, 2019. https://www.technologyreview.com/s/604307/is-facebook-
targeting-ads-at-sad-teens/
Riek, Blake M., Eric W. Mania, and Samuel L. Gaertner. “Intergroup Threat and Outgroup
Attitudes: A Meta-Analytic Review.” Personality and Social Psychology Review 10, no. 4
(November 1, 2006): 336–53. doi:10.1207/s15327957pspr1004_4.
Roese, Neal J., and Gerald N. Sande. “Backlash Effects in Attack Politics.” Journal of Applied
Social Psychology 23, no. 8 (1993): 632–53. doi:10.1111/j.1559-1816.1993.tb01106.x.
Rothchild, John. “Against Notice and Choice: The Manifest Failure of the Proceduralist
Paradigm to Protect Privacy Online (or Anywhere Else).” Cleveland State Law Review 66,
no. 3 (May 15, 2018): 559.
RSA Security. “RSA Data Privacy & Security Survey 2019: The Growing Data Disconnect
between Consumers and Businesses.” February 6, 2019. https://www.rsa.com/content/dam/en/
misc/rsa-data-privacy-and-security-survey-2019.pdf.
Schechner, Sam, and Mark Secada. “You Give Apps Sensitive Personal Information. Then They
Tell Facebook.” Wall Street Journal, February 22, 2019, sec. Tech, accessed March 15, 2019.
https://www.wsj.com/articles/you-give-apps-sensitive-personal-information-then-they-tell-
facebook-11550851636.
Shane, Scott, and Alan Blinder. “Democrats Faked Online Push to Outlaw Alcohol in Alabama
Race.” The New York Times, January 7, 2019, sec. U.S., accessed March 15, 2019. https://
www.nytimes.com/2019/01/07/us/politics/alabama-senate-facebook-roy-moore.html.
Shaw, Tamsin. “Invisible Manipulators of Your Mind.” The New York Review of Books, April
20, 2017. Accessed March 15, 2019. http://www.nybooks.com/articles/2017/04/20/
kahneman-tversky-invisible-mind-manipulators/.
Singer, Natasha. “‘Weaponized Ad Technology’: Facebook’s Moneymaker Gets a Critical Eye.”
New York Times, August 16, 2018. Accessed March 15, 2019. https://www.nytimes.
com/2018/08/16/technology/facebook-microtargeting-advertising.html.
Solon, Olivia, and Sabrina Siddiqui. “Forget Wall Street—Silicon Valley Is the New
Political Power in Washington.” The Guardian, September 3, 2017, sec. Technology,
accessed March 15, 2019. https://www.theguardian.com/technology/2017/sep/03/
silicon-valley-politics-lobbying-washington.
Solove, Daniel J. “Introduction: Privacy Self-Management and the Consent Dilemma
Symposium: Privacy and Technology.” Harvard Law Review 126 (2012): 1880–1903.
Speicher, Till, Muhammad Ali, Giridhari Venkatadri, Filipe Nunes Ribeiro, George Arvanitakis,
Fabrício Benevenuto, Krishna P. Gummadi, Patrick Loiseau, and Alan Mislove. “Potential
for Discrimination in Online Targeted Advertising.” Proceedings of Machine Learning
Research 81 (2018): 1–15. http://proceedings.mlr.press/v81/speicher18a/speicher18a.pdf.
Stamos, A. “How the U.S. Has Failed to Protect the 2018 Election—and Four Ways to Protect
2020.” Lawfare, August 22, 2018. Accessed February 12, 2019. https://www.lawfareblog.
com/how-us-has-failed-protect-2018-election-and-four-ways-protect-2020.
Tiku, Nitasha. “Facebook Is Steering Users Away from Privacy Protections.” Wired,
April 18, 2018. Accessed March 15, 2019. https://www.wired.com/story/
facebook-is-steering-users-away-from-privacy-protections/
Tufekci, Zeynep. “Engineering the Public: Big Data, Surveillance and Computational Politics.”
First Monday 19, no. 7 (2014). Accessed March 15, 2019. http://firstmonday.org/article/
view/4901/4097
Turow, Joseph. The Daily You: How the New Advertising Industry Is Defining Your Identity and
Your Worth. New Haven, CT: Yale University Press, 2012.
Turton, William. “We Posed as 100 Senators to Run Ads on Facebook. Facebook Approved
All of Them.” Vice News, October 30, 2018. Accessed March 15, 2019. https://
news.vice.com/en_ca/article/xw9n3q/we-posed-as-100-senators-to-run-ads-on-
facebook-facebook-approved-all-of-them.
United States Federal Trade Commission. “Cross Device Tracking: An FTC Staff Report,”
January 2017. Accessed March 15, 2019. https://www.ftc.gov/system/files/documents/
reports/cross-device-tracking-federal-trade-commission-staff-report-january-2017/ftc_
cross-device_tracking_report_1-23-17.pdf.
Vaidhyanathan, Siva. Antisocial Media: How Facebook Disconnects Us and Undermines Democracy.
New York: Oxford University Press, 2018.
Court Case
U.S. v. Internet Research Agency, 18 U.S.C. §§ 2, 371, 1349, 1028A (U.S. Dist., D.C., 2018).