Digital Deception Decoder 
August 31, 2018

MapLight and the Digital Intelligence (DigIntel) Lab at the Institute for the Future have teamed up on a free newsletter to help you track the most important news, research, and analysis of deceptive digital politics. Each week, we'll send you coverage of the webs of entities that seek to manipulate public opinion, their political spending ties, and the actors working to safeguard our democracy. Know someone who might be interested? Ask them to sign up!

  • Writing for The Guardian, Andrew Smith forays into the world of unpredictable algorithms and their (too predictable) ramifications for society. Quoting UCSD science studies scholar Lilly Irani: “The choices algorithm designers and policy experts make are presented as objective, where in the past someone would have had to take responsibility for them. [...] The choice to use algorithms to automate certain kinds of decisions is political too.” On a similar note, legal scholar Frank Pasquale argues in Real Life Mag that “algorithmic accountability” needs to involve social, critical race, and feminist theory because “No algorithmic system can circumvent the necessary and endless conversations that these ultimately political and moral questions demand.”
  • In Politico, legal scholar Laurent Sacharoff discusses a civil rights case that raises important questions about how to deal with bots and fake accounts on social media. Sacharoff argues that while malicious uses of bots need to be curbed, automated accounts can also serve as powerful tools for civil rights activists and researchers monitoring algorithmic discrimination online.
  • Red herring: This week, President Trump stoked fears of political bias at social media companies by suggesting (on Twitter) that Google rigs search results against conservatives. Rob Verger debunks this claim for PopSci, demystifying how search engines work. And in WIRED, media manipulation expert Renee DiResta contends that “We need to hold tech companies accountable—for irresponsible tech, not evidence-free allegations of censorship—and demand transparency into how their algorithms and moderation policies work.” (See similar commentary from Slate’s Will Oremus and Microsoft researcher Tarleton Gillespie.)
  • AI bodies: These three videos offer food for thought regarding developments in AI and their implications for democracy: 1. UC Berkeley researchers have found a way to copy movements—like dance—from one video subject to another. 2. BuzzFeed reporter Charlie Warzel tricks his mother using an AI clone of his voice. 3. Earlier this year, researchers from MIT and Stanford reported significant gender and skin tone bias in top facial recognition systems; lead author Joy Buolamwini unpacks what that means for black women in this powerful spoken word piece.
  • In a Reuters exclusive, Jack Stubbs and Christopher Bing report that Iran’s online political manipulation efforts extend far beyond last week’s findings, encompassing a shadowy network of websites and social media accounts called the International Union of Virtual Media. Twitter has now removed 770 Iranian accounts, up from the 284 initially reported. In The Hill, Ali Breland discusses the likelihood that foreign influence campaigns on social media will remain a persistent problem for our democracy through the midterms and beyond.
  • Speaking of the midterms: A conservative dark money group is now the top political advertiser on Google, reports Bloomberg’s Ken Doyle. As of August 29, the nonprofit One Nation, which does not disclose its donors, had spent over $1.1 million on ads attacking Democratic candidates and supporting Republicans, none of which has been disclosed to the FEC due to gaps in campaign finance law.
  • At NiemanLab, Laura Hazard Owen recaps a new study from Duke, Brigham Young, and NYU researchers which suggests that exposure to opposing political views on social media might actually increase polarization, especially among Republicans—with some important caveats. (Full study here).

We want to hear from you! If you have suggestions for items to include in this newsletter, email them to - Hamsini Sridharan

Brought to you by:
MapLight     DigIntel Lab
Copyright © 2018 MapLight, All rights reserved.
