Digital Deception Decoder 
May 11, 2019

Produced by MapLight and the DigIntel Lab at the Institute for the Future, the Decoder is a newsletter to help you track the most important news, research, and analysis of deceptive digital politics. Each week, we'll send you coverage of the webs of entities that seek to manipulate public opinion, their political spending ties, and the actors working to safeguard our democracy. Know someone who might be interested? Ask them to sign up!

  • This week, the DigIntel Lab published new research on “The Human Consequences of Computational Propaganda,” compiling eight case studies of the effects of digital disinformation on specific groups during the 2018 midterm elections. The case studies examined online manipulation of public sentiment towards climate science, immigrants, the Latinx community, Muslim Americans, Black women gun enthusiasts, moderate Republicans, and women’s reproductive rights. Check out editors Katie Joseff and Sam Woolley’s executive summary for key themes and recommendations, as well as coverage in BuzzFeed by Craig Silverman and Jane Lytvynenko.
  • Margaret Sessa-Hawkins and I compiled a guide to Facebook, Twitter, and Google’s political ad transparency initiatives, highlighting the ways these systems can be used and abused. On a related note, Judd Legum’s work to expose misleading political advertising via Facebook’s ad library continues—this week, with a fascinating examination of pro-Trump advertising by The Epoch Times, a 501(c)(3) nonprofit tied to Falun Gong.
  • In The New York Times, Google CEO Sundar Pichai has joined the bandwagon of tech leaders calling for a focus on privacy in product design and law. The Guardian’s Julia Carrie Wong explores the competitive posturing around privacy by tech companies, observing acerbically that “In Silicon Valley ‘privacy’ is in 2019 what reclaimed wood was in 2010: a must-have design feature that signals a certain degree of authenticity and hipness and could also double as a weapon in a pinch.”
  • Shareholder activism: On May 30, Facebook’s shareholders will vote on whether to retain Mark Zuckerberg as board chair (he holds the unusually powerful dual role of CEO and board chair of his own company). Emily Birnbaum reports for The Hill that, in advance of this meeting, Color of Change and Majority Action have launched a campaign to persuade shareholders to oust Zuckerberg from the chairmanship. Meanwhile, in The New York Times, Cecilia Kang provides insight into debates at the FTC over whether to hold Zuckerberg personally accountable, in addition to fining Facebook.
  • Also calling for accountability from Zuckerberg and Facebook is his dorm-room-era co-founder Chris Hughes, who contends in a New York Times op-ed that Facebook should be broken up and that an independent regulatory body should be charged with overseeing the tech industry, including privacy protections and guidelines for content moderation. In The Verge, Nick Statt shares Facebook’s response that it would be wrong to break up a “successful American company.” Also in The Verge, Adi Robertson attempts to parse what Hughes means when he suggests that a regulatory agency should establish “guidelines for acceptable speech on social media,” and what the free speech implications of such a mandate would be.
  • For the World Economic Forum, Public Knowledge’s Harold Feld discusses a new e-book that seeks to define what a digital platform is and argues that countries should create a “comprehensive, sector-specific” regulator to tackle all related issues.
  • At Associated Press, Desmond Butler and Barbara Ortutay reveal that Facebook has been auto-generating celebratory video content for Islamist, neo-Nazi, and white supremacist groups. AP’s investigation suggests that, despite Facebook’s assertions to the contrary, these groups continue to flourish on the platform.
  • At Vox, Jane Coaston unpacks Facebook’s attempts to position itself, somewhat contradictorily, as both a platform and a publisher. Harvard Law’s Jonathan Zittrain offers a useful correction to the article’s explanation of Section 230 of the Communications Decency Act: Facebook’s protection from liability under that law is precisely what allows the platform to set its own guidelines for what can and cannot appear on the site, contrary to the common misconception that Facebook would have to adopt “publisher” status before it could make such editorial decisions.

We want to hear from you! If you have suggestions for items to include in this newsletter, please email hamsini@maplight.org. - Hamsini Sridharan

Brought to you by MapLight and the DigIntel Lab
Copyright © 2019 MapLight, All rights reserved.


Want to change how you receive these emails? You can update your preferences or unsubscribe from this list.