Teaching Simulation: What can you say and share online?

In the spring of 2021, student fellows worked in small teams, called Collaboration Groups, to focus on specific areas within disinformation. One group sought to understand better how decisions around content policy play out, particularly in instances of “deplatforming.” 

Ultimately, they explored this through the creation of a simulation targeted at students in the 10th and 11th grades. The students are asked to consider content moderation scenarios that parallel the decisions before the Oversight Board, a quasi-judicial body discussed in this conversation with evelyn douek, a lecturer at Harvard Law School. 

Their teaching plan is below. Guidance for instructors, the prompts that can be provided to students, and discussion questions are included, structured around two successive class sessions. 

What can you say and share online? 

Model “Oversight Board” Simulation (Age Group: 10th-11th grade) 

Prepared by Paul Tembo, Rachel Gibian and David Stansbury 

Introduction: Content Moderation and the COVID-19 Pandemic

During a global health crisis, when cures are desperately needed but the science isn’t moving as quickly as we would like, what health claims can you share online? How can you tell the difference between helpful advice and harmful conspiracy theories? And crucially, how do you know what constitutes misinformation… if all the facts aren’t available yet?

Lesson Instructions

Class 1: In small groups of 3-6 students, you’ll be asked to debate the perspectives of three different characters: a French man named Adam; a global technology company, Facebook; and the independent body that must judge between them: the Oversight Board. 

Class 2: Then, we’ll come together as a class to decide on the right course of action: should Adam’s post have been allowed on Facebook’s platform? It is up to the Oversight Board to make this decision and to offer recommendations for how content like Adam’s should be moderated in the future. Once we’ve made a decision as a class, we’ll reveal how the actual Oversight Board decided to weigh in. Do you agree with their decision?


Note for instructors: If desired, you can assign a character for each student to play in their small group discussion: either Adam, Facebook or the Oversight Board.

1. Adam’s Perspective: A Scandal in France

By October 2020, Adam had had enough. Months into the COVID-19 pandemic, he was losing hope and frustrated at the lack of solutions proposed by the French government. When would the lockdowns end? And when would a vaccine or remedy be available? In Adam’s view, the French authorities were far more concerned with policing the behavior of their people than with proposing potential cures to the virus. 

Adam wasn’t going to sit around and wait. Instead, he decided to seek answers online and connect with like-minded individuals by joining several Facebook groups about COVID-19. As he looked into COVID-19 cures, he was frustrated by the different ways the French government reacted to different proposed solutions. He noticed that an antiviral medication called remdesivir was authorized as a drug against COVID-19, but the combination of hydroxychloroquine (HCQ) and azithromycin wasn’t allowed. Wasn’t the evidence about each of these drugs similarly thin, given how little research existed on how to treat COVID-19? Why were some medications sanctioned and others not? The discrepancies didn’t sit right with Adam. 

What’s more, as he continued his research, Adam discovered the work of Didier Raoult, a professor of microbiology at the Faculty of Medicine of Marseille, who claimed that the combination of HCQ and azithromycin could work against COVID-19 if administered early to patients. These drugs could be considered miracles, potentially curing COVID-19 in a matter of days. Adam imagined how these medicines could put an end to the pandemic, and was excited to learn that they were already being used in other countries to treat the virus. Why was the French government holding back? It was scandalous.

Moved by these questions, Adam posted in a COVID-19 Facebook group with 500,000 members. In his post, he shared a video of Prof. Didier Raoult, along with an effusive caption:

“It’s a scandal! The French government refuses to authorize hydroxychloroquine combined with azithromycin for use against COVID-19, but authorizes and promotes remdesivir? Raoult’s cure is being used in other countries already. What does society have to lose by prescribing a harmless drug in an emergency situation like this pandemic? Hydroxychloroquine works and can be administered early, before patients get very sick, unlike remdesivir. The French Government needs to act.”  

The reactions rolled in quickly. Soon, Adam’s video had reached an impressive 50,000 views, with nearly 900 reactions, most of which were “angry,” just like him. Adam was proud to see that the post was shared by 600 people, and he read through all 300 comments that came in. But just as he was getting inspired by his successful post, Facebook removed the content for violating its Community Standard on Violence and Incitement. 

Questions to consider from Adam’s perspective:

  • France is understood as the country of liberté (freedom). Why should Adam’s right to free expression be limited online?
  • While the combination of hydroxychloroquine and azithromycin wasn’t legal in France, it was being tested in different countries in October of 2020. If Facebook is a global platform, how do we know what health claims can or can’t be posted online? Should there be national limits to online speech?
  • Does Facebook’s removal of Adam’s post prove his point: that authorities are excessively intervening and policing individuals’ thoughts?

2. Facebook’s Perspective: Real World Harm

It has been a tricky couple of years for Facebook and its policy team. Especially since the 2016 US presidential election, the company has been subject to increased scrutiny and condemnation, seemingly from all sides. People have criticized both the way that Facebook manages what content people see in their newsfeeds via its recommendation algorithm and what the company allows users to post on the platform. In some cases, content shared on Facebook incited violence, with examples in places like India,[1] Germany,[2] and Myanmar.[3] Since then, Facebook has been trying to be more proactive in dealing with these problems on its platform, and its CEO, Mark Zuckerberg, has regularly been called to testify before the US Congress about the steps the company is taking to make progress.

The coronavirus pandemic that broke out in 2020 posed a new set of problems for the company. It was important that people be able to share their views and information about what was happening. The platform could be an important way for people to learn how to protect themselves and how to handle the pandemic, including getting advice from official bodies and governments. But, there was also a risk that people might deliberately or accidentally share information about the virus – how it spread, its symptoms and cures – that was not true. 

Early in the pandemic, Facebook’s Vice President of Global Affairs and Communications, Nick Clegg, said that the company would not allow information to circulate on the platform that could lead to “real world harm.” If information was misleading but unlikely to lead to physical harm, Facebook said it would instead reduce how often that information appeared in people’s newsfeeds.[4]

But with 1.8 billion people using Facebook every day, it is impossible for the company to make detailed decisions about every post that each user makes. Facebook also thinks it is important for people to be able to express themselves on the platform, so it does not want to limit speech if it does not have to. Some argue that for a society to be free, people must be able to say objectionable, even offensive, things. However, the company has a policy that posts that could credibly incite real-world violence[5] and hatred[6] will not be allowed.

In Adam’s case, Facebook reviewed his post and judged that it was misinformation about the coronavirus: health experts had recommended that the drugs Adam mentioned not be used to treat it. As part of the company’s commitment to stopping the spread of incorrect information about the virus, the policy team decided to take the post down.

In addition, the team was worried that someone who used the drugs Adam had mentioned to treat themselves for COVID-19 might get hurt. The drugs could have adverse effects, perhaps making the person even sicker. So, in keeping with the company’s commitment to prevent the spread of information that might cause real, physical harm, the team also thought it was appropriate to take the post down.

Questions to consider from Facebook’s perspective:

  • We have received lots of criticism in the last few months about our liberal approach to what people can and cannot say on Facebook. It is very important for us as a company that people can talk about things that matter to them. So was it a good idea for us to take Adam’s post down?
  • It makes our job much easier if we can have clear rules that help us decide about what people can and cannot post on Facebook. With Adam’s case in mind, can you suggest some rules?
  • Adam is a Facebook user from France. Does it matter to us as a company where our users come from? Should our rules differ depending on where a user lives, or should we have one set of rules for all our users? 

3. Oversight Board’s Perspective: Seeking Justice and Balance

In 2020, Facebook appointed the first members of its Oversight Board. They are a diverse group of people from around the world who are experts in a range of subjects, from law and journalism to human rights. They are independent of Facebook and can review the decisions Facebook makes about whether to remove content that its users post. Some people have said that the Oversight Board is like Facebook’s “Supreme Court,” where the ultimate decisions are made about what is acceptable on the platform. In order for the Board to examine an issue, it must already have been considered twice by Facebook: first in an initial decision and then in an appeal. If a user is still not satisfied with Facebook’s decision, they can sometimes appeal to the Oversight Board. A decision might be eligible for appeal if Facebook does not think the user has clearly broken the law of the country they are in, and if the user’s post does not put other people in danger. 

The mission of the Oversight Board is to use its independent judgment to support people’s right to free expression and to ensure that this right is adequately respected. The Board’s decisions to uphold or reverse Facebook’s content decisions are binding: Facebook has to implement them unless doing so could violate the law. Many of the decisions the Board has made so far have also included instructions for Facebook to update the rules it uses for deciding which posts to take down, so that similar future cases can be handled by the Facebook team without going through the full appeal process.

When making its decisions, the Oversight Board considers many things, including international standards that the countries of the world have agreed upon at the United Nations. These include the Universal Declaration of Human Rights, which in Article 19 makes a clear statement about people’s right to freedom of expression:

“Everyone has the right to freedom of opinion and expression; this right includes freedom to hold opinions without interference and to seek, receive and impart information and ideas through any media and regardless of frontiers.”[7]

In practice, different countries have different laws governing what is acceptable speech. These range from countries like the USA, where the First Amendment to the Constitution means there are very few limitations on speech, to countries like Germany, where it is illegal to incite hatred against national, religious, racial, or ethnic groups; such incitement is known as “hate speech.”[8] At the extreme are countries like North Korea and Eritrea, where saying or writing anything that opposes the people in power is likely to land you in jail, or worse.[9]

In Adam’s case, Facebook removed his post because the company believed it violated its rules about misinformation and might put people in imminent harm. The Oversight Board considered this and asked whether Adam was really putting people in imminent danger with his post. Among other things, it weighed whether people were likely to believe Adam, and whether people could actually harm themselves as a result of his post: did people have easy access to the drugs Adam had mentioned? If not, how could they put themselves in danger? As it turns out, the drugs Adam had mentioned were only available in France with a prescription, so most people did not have easy access to them. 

The Board also considered whether Adam’s right to freedom of expression had been violated. This freedom is an important part of international human rights law, and it is important for Facebook to protect it. Adam’s post did not recommend that people actually take the drugs he was discussing; it seemed to be mainly about pressuring the government into changing its position on which drugs to use to treat COVID-19.

The other thing the Board considered was how easy it was for Facebook’s users to know what is acceptable and unacceptable to post on the platform. The rules that applied in Adam’s case, covering misinformation and putting people in imminent danger, were spread across multiple places in documents called “Community Standards.” In some cases, these Community Standards did not match what Facebook had told the media the rules were concerning misinformation on its platform. So how could Adam, or anyone else, reasonably know what the rules actually were? 

Questions to consider from the Oversight Board’s perspective:

  • Facebook has an important job to protect freedom of speech. Are they right to have taken down Adam’s post, or are his freedoms being violated?
  • Are there any special cases where the right of freedom of expression should be limited, especially in cases like this where people are trying to understand important information about their health in the middle of a global pandemic?
  • Are there any suggestions we can make to Facebook about changes they can make to their rules and how they are presented, so that future cases like this are clearer to decide about, and so that users like Adam can more easily understand them?



Note for instructors: After regrouping as a class on the different opinions and ideas that came up in small group discussions, you can introduce the information below as a new discussion prompt for the class. Does the Oversight Board’s decision reflect what students expected? Why or why not? 

The Oversight Board’s Decision

The Oversight Board ruled against Facebook, arguing that Facebook should not have removed Adam’s post. 

In its decision, the Board noted that Adam’s post was mostly focused on governmental policy and inaction, not medical advice. Moreover, his comments didn’t constitute “imminent harm” because, while they did mention a potential “cure” for COVID-19, the medications he recommended couldn’t be accessed without a prescription in France. It was therefore unlikely that people would be at immediate risk because of Adam’s post, since most individuals wouldn’t be able to procure the “remedies” he discussed.

The Oversight Board also underlined that Facebook’s misinformation policy, especially when it comes to health, was very vague. Because of this lack of clarity, the Board recommended that Facebook review its policies around “imminent harm,” a concept found in Facebook’s current Violence and Incitement Community Standard. Overall, Facebook’s rules appear to users as a “patchwork,” making it hard to know what they can and cannot post. The Board therefore encouraged Facebook to work towards a more comprehensive and holistic policy, one that would be easier for users to understand and for Facebook to enforce.

Finally, the Board found that Facebook had unfairly limited Adam’s freedom of expression, which goes against international human rights standards. As the Board noted in its decision, there were several intermediate steps Facebook could have taken rather than deleting Adam’s post: it could have added a fact-checking label or guided users towards authoritative information. Such labels and tools are less extreme and less likely to infringe on users’ rights to free expression. 

Questions to discuss as a class:

  • Do you agree with the Oversight Board’s decision? Why or why not?
  • Do you agree with the Oversight Board’s interpretations of “imminent harm” and “freedom of speech”?
  • The Oversight Board also recommended that Facebook clarify its policies around misinformation. With Adam’s post in mind, what changes would you recommend to revise Facebook’s policies? 

[1] https://time.com/5712366/facebook-hate-speech-violence/

[2] https://www.technologyreview.com/2018/08/21/2339/evidence-is-piling-up-that-facebook-can-incite-racial-violence/

[3] https://www.nytimes.com/2018/10/15/technology/myanmar-facebook-genocide.html

[4] https://www.npr.org/2020/03/25/821591134/how-facebook-wants-to-handle-misinformation-around-the-coronavirus-epidemic

[5] https://www.facebook.com/communitystandards/credible_violence

[6] https://www.facebook.com/communitystandards/objectionable_content

[7] https://www.un.org/en/about-us/universal-declaration-of-human-rights

[8] https://www.politico.eu/article/germany-hate-speech-internet-netzdg-controversial-legislation/

[9] https://cpj.org/reports/2019/09/10-most-censored-eritrea-north-korea-turkmenistan-journalist/