
Facebook will remove misinformation about covid-19 vaccines

by Abby Ohlheiser, MIT Technology Review

The news: Facebook will remove "false claims that have been debunked by public health experts" about covid-19 vaccines, it has announced. In a post, the company outlined how it plans to apply its existing ban on covid misinformation, which is intended to screen out posts that could lead to "imminent physical harm," as countries around the world move closer to acquiring and rolling out vaccines. The removals will apply to both Facebook and Instagram.

Effective vaccines are coming: The success of covid-19 vaccines is seen as critical to overcoming the pandemic, with a number of candidates in late-stage testing. Earlier this week the UK became the first country to approve a vaccine, granting authorization to use the treatment developed by Pfizer and BioNTech and saying that the first doses could be given to patients within days.

What is Facebook removing? The policy announcement isn't comprehensive, but it gives a few examples of what would be removed from the site:

"This could include false claims about the safety, efficacy, ingredients or side effects of the vaccines. For example, we will remove false claims that COVID-19 vaccines contain microchips, or anything else that isn't on the official vaccine ingredient list. We will also remove conspiracy theories about COVID-19 vaccines that we know today are false: like specific populations are being used without their consent to test the vaccine's safety."

So is this a big deal? Yes and no. It's important that Facebook is addressing how it will handle misinformation about vaccinations with more specifics, particularly as we enter what could be the most important public health moment in modern history. Misinformation about vaccines has long thrived on Facebook, and so anything it announces in terms of a ban or major crackdown has the potential to be very consequential.

The but" here is also important and multifaceted. Facebook's policies are only as effective as their enforcement. With health misinformation in particular, these bans will succeed in their aims only if they are effectively carried out within the many private groups on Facebook where false health claims are promoted and amplified. This has been an issue with the platform's previous attempts to crack down on damaging falsehoods.

Uneven enforcement: Even after Facebook began rolling out policies to limit the spread of vaccine misinformation in 2019 (by restricting recommendations of groups and hashtags promoting such messages, for example), the anti-vaccine ecosystem continued to thrive in private spaces on the site. Since the pandemic began, however, Facebook has been more aggressive about removing some health misinformation, citing its policy against content that could lead to imminent physical harm. A few weeks ago, Facebook banned prominent anti-vaccine personality Larry Cook, and an enormous Facebook group he ran, for violating its policies about the QAnon conspiracy theory.
