On April 9th, a group of Assembly Student Fellows joined Nat Gyenes in conversation over Zoom. Gyenes is the Director of the Digital Health Lab at Meedan, a non-profit technology company that builds software to “strengthen global journalism, digital literacy, and accessibility of information,” and was formerly a researcher affiliated with the Berkman Klein Center. Her interests include public health misinformation, from COVID-19 to HIV. She has studied content moderation infrastructure and its intersection with public health, and works to incorporate physicians and public health officials into journalism and content moderation advisory structures.
The article below weaves together our conversation with Gyenes along with recent discussion of vaccine hesitancy coincident with the rollout of the COVID-19 vaccines.
— Danny Wilson, with contributions from Valentina Vargas
Major organizations — including media outlets and social media platforms — often will not fact-check a piece of information if they are unable to get someone with topic-area expertise to validate, verify, or explain it. This, according to Gyenes, means that public health content may not be integrated into content moderation systems at platforms like Facebook. Gyenes has identified what she views as a central question for research: for which topic areas does a fact-checking organization lack reliable access to expertise? And, in order to address those gaps, how can content moderation decisions be made in closer collaboration with public health experts?
The effects of misinformation in public health are not universal. Some pieces of misinformation disproportionately affect certain populations. Posts about the nutritional effects of certain foods might indirectly target, or unduly reach, groups where those foods are central to culinary identity. Other groups might have historically rooted distrust of vaccines and other government-supported medical interventions (Black Americans, for example, saw heightened levels of medical mistrust after incidents like Tuskegee, although it is important to note that recent analysis does not show this to be the case with COVID-19 vaccines). The challenge becomes particularly acute in a comparative context: many content moderation decisions and guidelines are made in the United States, where moderators may have little insight into the entrenched public health attitudes and cultural norms of other countries.
Gyenes has worked on concrete proposals to bring public health experts and content moderators closer together. In response to the COVID-19 pandemic, Meedan’s Health Desk initiative puts a team of public health experts on speed dial, ready to respond to any question a fact-checker might have. Crucially, efforts like the Health Desk provide reliable expertise to community newsrooms that, given the resource-strapped nature of local journalism, would otherwise have no access to a specialist.
The COVID-19 pandemic has invited public health misinformation at a global scale. Gyenes noted how difficult it has been for public health officials to communicate the limitations of policy recommendations, especially in circumstances where new evidence was rapidly emerging. This leaves journalists even more exposed to the challenge of balancing differences in scientific opinion when faith in ‘experts’ is tenuous. For example, an outlet might be left noting that guidelines from organizations like the CDC and the WHO may not line up with emerging evidence, as we saw with mask-wearing recommendations early in the pandemic.
The idea that neither misinformation nor ‘best practices’ are universal, but instead depend on community context, guides Gyenes’s thinking about misinformation around COVID-19 and vaccines. Gyenes cited the work of Heidi Larson, an anthropologist at the London School of Hygiene and Tropical Medicine, whose research focuses on the factors that “affect uptake of health interventions and influence policies.” Larson has recommended a shift in language away from terms like vaccine “skepticism” toward a “spectrum of confidence.” In her recent book Stuck: How Vaccine Rumors Start and Why They Don’t Go Away, Larson, writing prior to the start of the coronavirus pandemic, argues, “The point is that we need a more holistic, context-aware, and dynamic engagement between publics and those who develop the technologies and determine the policies and which depend on public cooperation for their success.” Larson points out that vaccine “questioning” is often associated with feelings of “lost dignity and distrust” between publics subject to immunization campaigns and experts.
Larson’s view that the language experts use to describe the public’s confidence in a vaccine needs to shift stems from the idea that terms like “vaccine resistance” often serve to patronize the very populations who are hesitant. Patients, equipped with access to the internet, often go online to investigate, or verify, information they have received from their physicians. “Publics,” Larson writes, “have mobilized themselves, empowered by new digital media to speak their unfettered views and organize themselves, and they have access to global online audiences. They are engaged, but on their terms.” Larson’s analysis echoes Gyenes: “The real challenge is that many of the vaccine narratives, both pro and con, are embedded in websites or social networks whose primary focus is on other issues.”
In March of this year, reporting by the Washington Post echoed Larson’s argument. The Post’s coverage described an internal study by Facebook which found that many posts promoting vaccine hesitancy fell into Facebook’s “gray area” — posts that did not count as “outright false or misleading statements about COVID-19 vaccines,” but were instead expressions of concern, or instances of misinformation, that “may be causing harm in certain communities, where it has an echo chamber effect.” These findings amplify Gyenes’s point that misinformation is community-specific, and that fact-checking without adequate community context or access to public health resources is particularly difficult.
Gyenes closed our conversation by citing the work of Theresa Amobi, a researcher at the University of Lagos, who has argued that COVID-19 misinformation often reflects a desire to return to the status quo (imagine, for example, a social media post describing insignificant coronavirus risk from dining indoors or attending a packed concert — two of the activities considered most dangerous during the pandemic). Sharing posts of this kind may not signal a desire to see harm done to others, but instead can be understood as a form of wish fulfillment, creating a sense of reassurance that we can resume the activities we might miss the most.
Discussing a “spectrum of confidence” for COVID-19 vaccines may seem like a luxury given the urgency of widespread immunization — but earning public buy-in for vaccinations is essential, and may be a necessary condition for achieving herd immunity. COVID-19 vaccine-related misinformation serves as a meaningful case study in how responding to misinformation is rarely straightforward, and how sharing misinformation is often an effort by those spreading it to reclaim agency. Given the desire for autonomy over one’s body and health decisions, and the significant restrictions on activity during the pandemic, the roots of vaccine misinformation defy easy characterization and assignment of blame.