Since Facebook first set out its “blueprint for content governance” in November 2018, we’ve followed the development of its Oversight Board closely.
Now, everything is more or less in place: the first slate of members is announced, some questions have been answered, and an appropriately sober-hued Twitter page has been unveiled to the public. We went to Evelyn Douek—expert on all things content governance—to get her reflections on what it all means; listen to our conversation here.
The initial media response was predictably mixed. John Naughton’s Guardian comment piece is a good representation of the sceptical position. For those who see the Board as a good faith—if imperfect—attempt to address the content governance challenge, questions remain about how it will operate in practice. (We set out some of these in our response to the final Charter.)
Shortly after the Board’s launch, a bigger story emerged: Twitter’s decision to impose a content warning on a tweet by US President Donald Trump which “incited violence”. Facebook didn’t follow suit on the equivalent post on its platform; CEO and founder Mark Zuckerberg explained his reasoning in a public statement 18 hours later, following a “productive” phone conversation with the President. In response, some Facebook employees staged a virtual “walk-out” and called on the Board to rule on the decision.
The Board isn’t currently operational; it’s set to begin hearing cases later this year. But even if it had been up and running, could it have done anything?
Under the current bylaws, the Board—at least in its initial form—only has the authority to review cases in which Facebook has taken down content, not cases where it has left content up—a limitation consistently raised by Evelyn Douek. A wider remit (including the power to rule on “left up” content, the platform’s wider policies, and advertising) has been promised at an unspecified point in the future—which could be years from now. After this week, Facebook may want to look at bringing that forward...