Digital Deception Decoder 
December 11, 2019

Produced by MapLight, the Decoder is a newsletter to help you track the most important news, research, and analysis of deceptive digital politics. Each week, we'll send you coverage of the webs of entities that seek to manipulate public opinion, their political spending ties, and the actors working to safeguard our democracy. Know someone who might be interested? Ask them to sign up!

  • Fakebook: In the San Francisco Chronicle, MapLight President Daniel G. Newman takes Facebook to task for allowing politicians to lie to voters. “Facebook continues to fact-check and ban false political ads from sources besides candidates, like independent political committees. What is the difference between a super PAC paying to spread lies and a candidate paying to spread lies?” Newman argues that tech companies like Facebook need to be held accountable for their role in allowing digital deception to spread in our democracy.
  • Counterspeech: In the Washington Post, Tony Romm and Isaac Stanley-Becker examine Facebook’s latest suggestion: that it will label ads from politicians as not fact-checked. For Reuters, Kanishka Singh reports on a letter from the DNC to Sheryl Sandberg decrying the company’s response to disinformation as inadequate. Finally, for the Columbia University Press blog, media policy scholar Philip M. Napoli (Duke University) discusses why Facebook’s reliance on broadcast-era ad policies and counterspeech is misguided.
  • For the Columbia Journalism Review’s disinformation-themed issue, Errin Haines (Associated Press) digs into the disturbing pattern of online disinformation targeting Black voters and candidates. Haines points out, “For journalism to adequately cover how disinformation is used against the Black electorate, a solution is having Black journalists on the story.” Also in CJR, Bob Moser describes domestic disinformation targeting the 2020 presidential candidates.
  • An investigation by The Guardian and Queensland University of Technology researchers has uncovered a covert operation, apparently based in Israel, that has taken over several far-right Facebook pages in the U.S. and other countries in order to spread Islamophobic content — and profit from the engagement of more than 1 million followers. Posts by this group of pages have disproportionately targeted Muslim congresswomen Ilhan Omar (D-MN) and Rashida Tlaib (D-MI).
  • Two new reports from NATO StratCom shine further light on the business end of social media manipulation and political trolling. Jonathan Corpus Ong (UMass-Amherst) and Jason Vincent A. Cabañes (De La Salle University-Manila) share the results of three years of ethnographic research on the people who produce political disinformation, providing insight into four labor models: in-house staff; advertising and PR; clickbait; and state-sponsored. And Sebastian Bay and Rolf Fredheim reveal the results of an experiment testing major platforms’ ability to detect social media manipulation, in which they purchased fake engagement from “Manipulation Service Providers.” As Alberto Nardelli reports for BuzzFeed, the platforms did not perform particularly well.
  • Fact check: In OneZero, Will Oremus explores the implications and limitations of a recent study by the nonprofit Avaaz, according to which as many as 158.9 million people may have viewed the most viral false political Facebook posts of 2019. Oremus is cautious about the findings, but does note that “Facebook’s fact-checking efforts, however sincere, appear to be overmatched by the dynamics of its platform.”
We want to hear from you! If you have suggestions for items to include in this newsletter, please email us. - Hamsini Sridharan


Copyright © 2019 MapLight, All rights reserved.
