Digital Deception Decoder 
August 10, 2018

MapLight and the Digital Intelligence (DigIntel) Lab at the Institute for the Future have teamed up on a free newsletter to help you track the most important news, research, and analysis of deceptive digital politics. Each week, we'll send you coverage of the webs of entities that seek to manipulate public opinion, their political spending ties, and the actors working to safeguard our democracy. Know someone who might be interested? Ask them to sign up!

  • In the Wall Street Journal (paywalled), Deepa Seetharaman and Robert McMillan examine how imposter Facebook accounts such as “Resisters,” “Black Elevation,” and “Aztlan Warriors” infiltrated social media organizing for real political protests and rallies—likely to drive political polarization and destabilize trust in legitimate political expression. DigIntel Lab’s Sam Woolley asks, “What’s real grass-roots activity versus fake grass-roots activity?” noting the difficulty of distinguishing digital manipulation from organic speech and organizing. According to a new analysis published by Tony Romm and Elizabeth Dwoskin from the Washington Post, more than 40,000 people expressed interest in attending rallies linked to Resisters.
  • This week, major technology platforms finally took action against right-wing conspiracy theorist Alex Jones and Infowars, reports Jack Nicas for The New York Times. Apple and YouTube made the first moves in removing content, followed by Spotify and Facebook (the latter after dancing around the issue for weeks). Twitter remains the lone holdout, with CEO Jack Dorsey tweeting that Jones hadn’t yet violated the company’s rules. CNN’s Oliver Darcy confronted Twitter with numerous examples of apparent violations by Jones and Infowars (posts that were then mysteriously deleted). Of course, as Max Fisher points out for The New York Times, the case is hardly unique—companies like Facebook have long fumbled on hate speech and incitement throughout the developing world.
  • The Atlantic Council’s Digital Forensic Research Lab went down the rabbit hole of investigating an arm of the 2016 Russian trolling operation that appears to be distinct from the Internet Research Agency’s efforts. This second operation, which used a number of fake personas, including that of “Alice Donovan,” was run by Russia’s military intelligence agency GRU and focused on mobilizing black Americans against Hillary Clinton and promoting Russian military propaganda.
  • AI arms race: DARPA researchers have developed tools for catching deepfake videos using artificial intelligence, reports Will Knight for the MIT Technology Review. While researchers were willing to discuss certain techniques, they're keeping others under wraps in an attempt to stay ahead of forgers.
  • Perhaps the U.S. can counter disinformation by drawing on tactics that have worked in anti-smoking efforts, argue the Brookings Institution’s Alina Polyakova and the Atlantic Council’s Geysha Gonzalez. They call for a “whole-of-society approach” that integrates regulation, trusted messengers, civil society, and education.
  • Writing for Quartz, Nikhil Sonnad contends that Facebook’s problems stem from a systemic disregard for individual human beings. “There are certain things you do not in good conscience do to humans. To data, you can do whatever you like.”
  • Given the glaring need for ethics in tech, the Institute for the Future and the Omidyar Network have teamed up to launch the Ethical OS, a toolkit to help technologists anticipate the potential long-term harms of their creations, including categories of risk such as disinformation and propaganda.

We want to hear from you! If you have suggestions for items to include in this newsletter, email them to hamsini@maplight.org. - Hamsini Sridharan

Brought to you by: MapLight and the DigIntel Lab
Copyright © 2018 MapLight, All rights reserved.

