On March 12th, a group of Assembly Student Fellows joined evelyn douek in conversation over Zoom. douek is a lecturer on law and an S.J.D. candidate at Harvard Law School, where she studies the global regulation of online speech, the institutional design of private content moderation, and comparative free speech law and theory. While our conversation focused on Facebook’s Oversight Board, we also discussed content moderation writ large and, briefly, the recent tumult between news publishers and Facebook in douek’s native Australia. The article below is a summary of our conversation. Quotation marks indicate where douek is quoted directly. The headings in bold introduce new segments of the discussion, prompted by questions from Assembly Student Fellows.
— Danny Wilson
We began by discussing the seven cases decided by the Oversight Board since it began reviewing selected Facebook content moderation decisions in January. One case was rendered ineligible for a decision because the relevant post had been deleted by a user. Of the six remaining, the Board upheld Facebook’s decision to remove a piece of content in one case; in the other five, it overturned Facebook’s removal decision and required restoration of the relevant post. (The Oversight Board’s website details all of its decisions.)
douek noted that these decisions — in the main, overturning content removals and mandating the restoration of posts — are consistent with interpretations of international human rights law that take a robust approach to upholding free speech. To douek, the Board’s decision history, albeit still narrow, is evidence that it is placing significant emphasis on freedom of expression.
Yet the Board has what douek described as a skewed jurisdiction. It can only evaluate cases where Facebook has taken a post down, and can only offer judgment on (1) whether Facebook should restore the post and (2) whether Facebook should have removed the post when it did. The Board’s decisions have been “fact-specific, context-specific, hair-splitting,” douek noted, “like judicial decisions rather than comporting with the way that Facebook actually works.”
douek believes that the dominant view of the Oversight Board is that it is a PR stunt on Facebook’s part, an effort to hinder future regulation.
Yet she views the idea that the Board will stop regulation as a strawman. She described herself as “cautiously optimistic” about the Board, noting that its members are conscious of its limitations and are considering how to widen its scope. A more systemic impact might involve pushing to understand the decisions of Facebook’s algorithms — although this is a task that is easy to articulate and hard to accomplish (one can’t simply open the hood of an algorithm to see what is happening inside).
The trickiness of “looking at the content moderation algorithm” is at the core of tensions around the Oversight Board. douek noted that the Board’s decisions don’t always match the reality of content moderation, which is not a juridical process but instead the result of actions by algorithms and overstretched human moderators. She believes that the Board can, however, build up legitimacy and forcing mechanisms with Facebook over time, which contributes to her optimism.
In the process of making decisions, the Board creates an implicit mechanism for disclosure. While it can only issue binding decisions with respect to the individual piece of content under consideration, the Board has started to issue non-binding policy recommendations. This, douek noted, “creates a dialogue between the Board and Facebook that is information-forcing,” allowing scholars and other observers to learn more about Facebook’s moderation practices. douek emphasized that conversations on content moderation are often “untethered from reality” given how rarely commentators grapple with the issue of scale. She noted that the Oversight Board reinforces an approach that suggests moderation decisions are like constitutional cases, when in reality there’s no way for humans to review every post.
Over time, the Oversight Board may effectively set precedent, both for Facebook and other platforms.
The Oversight Board has surprisingly little Facebook-specific branding, which douek sardonically described as “Orwellian.” The platforms often “fall in line behind each other” in moderation decisions broadly, because moving as a group shields them from individual controversy. This is a form of collaboration douek has described as “content cartel creep,” or the possibility of “arrangements between platforms to work together to remove content or actors from their service without adequate oversight.” These observations lead douek to her idea that the future is ‘content moderation as a service’ (a play on Software as a Service products), given how many emerging companies will have to grapple with moderation — and how far from their desired focus scaling moderation teams might be.
douek noted that the Oversight Board is far from her primary interest, but that her expertise is a function of how quickly the Board moved to the forefront of discussions on content moderation and disinformation. She has begun to turn her focus to India, the world’s largest democracy.
In India, Twitter and the government are locked in a showdown, in part precipitated by government calls to remove posts tied to recent mass protests by small farmers. Twitter, according to douek, is “showing spine” by upholding the free speech rights of Indian protestors.
The Indian market is a major focal point for scholars like douek who follow the platforms and their moderation decisions closely. It’s a growth space for companies like Twitter, and douek noted that the greatest threats to free speech still come from nation-states — which means that in India, platforms must determine when they will obey local laws, and when they will disobey them in the name of international human rights.
Australia recently passed a law that would “make big tech pay for news,” ostensibly designed to reduce the power of Facebook and Google. Facebook initially responded by banning the sharing of news from Australia, a move it then rescinded after negotiations with the Australian government.
douek argued that it is “possible to think two things at once”: first, that Australia’s media ecosystem is “shot,” reinforcing the need to think through new business models for local media, and second, that Australia is approaching this problem in a “stupid” way.
Rather than compel platforms to pay for news, douek argued, the government could simply tax them more and support local journalism through government budgets. Facebook “couldn’t care less about Australia,” which is a small market — a fact that made its initial retaliatory decision less painful. douek pointed out that the Australian press is disproportionately controlled by Rupert Murdoch, an ally of the current Australian government.
Assembly Student Fellows Nick Anway, Mark Haidar, David Stansbury and Danny Wilson joined Berkman staff members Zenzele Best, Hilary Ross and Sarah Newman for this conversation.
Further reading
evelyn douek on the foundation of the Oversight Board, from the University of Chicago Law Review blog
evelyn douek on “content cartel creep” (referenced above), from the Knight First Amendment Institute
Kate Klonick on the novel aspects of the Oversight Board, from the Yale Law Journal