Additional reports a month later raised concerns about the prevalence of vaccine hesitancy, which in some cases may amount to misinformation, in comments, which employees said Facebook's systems were less equipped to moderate than posts. One of the March 2021 reports stated that "our ability to detect vaccine hesitancy in comments is poor in English and basically nonexistent elsewhere." These documents were among the disclosures made to the Securities and Exchange Commission and provided to Congress in redacted form by Facebook whistleblower Frances Haugen. The redacted versions were reviewed by a consortium of news organizations, including SME.
In the early stages of the pandemic, the World Health Organization (WHO) began calling the spread of Covid-19 misinformation an "infodemic," amid a flood of social media posts pushing conspiracy theories about the virus's origins, dangerous health advice, and false claims about vaccines. The organization asked big tech companies for a direct line so that it could flag posts on their platforms that could pose a threat to people's safety.
Many comments on Facebook raised questions or legitimate concerns about vaccines, but the volume of hostile anti-vaccine comments was so high that, in some cases, health organizations did not even want Facebook's free help. One March 2021 internal report stated that the rate of vaccine-hesitant comments on Facebook posts was so high that "authoritative health actors like UNICEF or the WHO will not use the free ad spend that we provide to them to promote pro-vaccine content, because they don't want to encourage the anti-vaccine commenters who swarm their Pages."
Documents show Facebook employees were concerned that the company's AI systems could detect misinformation in posts but not in comments, even though comments may be more likely to contain vaccine-hesitant material.
"While the overall risk from hesitancy in comments may be higher than that from posts, we have under-invested in preventing hesitancy in comments as compared with our investment in content," another March 2021 report stated.
"One flag from UNICEF was a disparity between FB & IG," one comment stated, quoting the agency: "One way we manage these situations on Instagram is through pinning top comments. Pinning allows us to highlight our most important comment (which will almost always link to useful vaccine information) and highlight other top comments which support vaccination." UNICEF, the WHO and other agencies did not respond to requests for comment.
A Facebook spokesperson said the company had made improvements on the issues raised in the internal memos. She said: "We approach misinformation in comments using policies that help us remove or reduce the visibility of false or potentially misleading information, while also promoting reliable information and giving people control over the comments on their posts. There is no one-size-fits-all solution to stopping the spread of misinformation, but we are committed to developing new tools and policies that make comments sections safer."
Among other efforts since the pandemic began, Facebook, along with fellow social media giants Twitter and YouTube, has added Covid-19 misinformation to its "strike" policy, under which users who post violating content can be suspended and potentially removed. The platforms also began labeling Covid-19-related content to direct users toward authoritative sources.
Facebook halted the public release of a "transparency report" earlier this year after it revealed that the most-viewed link on the platform in the first quarter of 2021 was a news story claiming a doctor died from the coronavirus vaccine. The New York Times reported that the social media giant had pulled the report.
The author of the post wrote that these groups were full of Covid-19 misinformation, and noted that content from them was prominently featured in the Facebook feeds of "the tens of millions of Americans who are now members of them."
A Facebook spokesperson told SME that the company has added safety controls to groups since the May 2020 internal post.
In July 2021, President Biden said that platforms like Facebook were "killing people" by spreading Covid-19 misinformation. Biden later walked back that claim, but not before a Facebook executive posted a strongly worded rebuke of the President.
"At a time when COVID-19 cases are rising in America, the Biden administration has chosen to blame a handful of American social media companies," wrote Guy Rosen, Facebook's vice president of integrity. "While social media plays an important role in society, it is clear that a whole-of-society approach is needed to end this pandemic. And facts, not allegations, should help inform that effort."
However, a February 2021 internal report on the prevalence of anti-vaccine and vaccine-hesitant comments suggested that "anti-vax sentiment" was overrepresented in Facebook comments relative to the wider populations of the United States and the United Kingdom.
"This overrepresentation could convey that it is normal to be hesitant regarding the Covid-19 vaccination and encourage greater vaccine hesitancy," the report said.