
Facebook needs 30,000 of its own content moderators, says a new report

by Charlotte Jee
from MIT Technology Review (#54F4Q)

Imagine if Facebook stopped moderating its site right now. Anyone could post anything they wanted. Experience seems to suggest that it would quite quickly become a hellish environment overrun with spam, bullying, crime, terrorist beheadings, neo-Nazi texts, and images of child sexual abuse. In that scenario, vast swaths of its user base would probably leave, followed by the lucrative advertisers.

Yet despite its importance, moderation isn't treated that way. The overwhelming majority of the 15,000 people who spend all day deciding what can and can't be on Facebook don't even work for Facebook. The whole function of content moderation is farmed out to third-party vendors, who employ temporary workers on precarious contracts at more than 20 sites worldwide. These workers have to review hundreds of posts a day, many of them deeply traumatizing.

Errors are rife, despite the company's adoption of AI tools to triage posts according to which ones require attention. Facebook itself has admitted to a 10% error rate, whether that means incorrectly flagging posts to be taken down that should be kept up, or vice versa. Given that reviewers have to wade through three million posts per day, that equates to 300,000 mistakes daily.

Some errors can have deadly effects. For example, members of Myanmar's military used Facebook to incite genocide against the mostly Muslim Rohingya minority in 2016 and 2017. The company later admitted that it had failed to enforce its own policies banning hate speech and the incitement of violence.
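The 300,000-mistakes figure follows directly from the two numbers the article cites (a 10% error rate applied to three million reviewed posts per day); a quick back-of-the-envelope check:

```python
# Illustrative arithmetic only; both inputs are figures quoted in the article.
daily_reviewed_posts = 3_000_000  # posts reviewed per day
error_rate = 0.10                 # Facebook's admitted 10% error rate

daily_errors = int(daily_reviewed_posts * error_rate)
print(daily_errors)  # 300000
```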

If we want to improve how moderation is carried out, Facebook needs to bring content moderators in-house, make them full employees, and double their numbers, argues a new report from New York University's Stern Center for Business and Human Rights.

"Content moderation is not like other outsourced functions, like cooking or cleaning," says report author Paul M. Barrett, deputy director of the center. "It is a central function of the business of social media, and that makes it somewhat strange that it's treated as if it's peripheral or someone else's problem."

Why is content moderation treated this way by Facebook's leaders? It comes down at least partly to cost, Barrett says. His recommendations would be very costly for the company to enact, most likely in the tens of millions of dollars (though to put this into perspective, Facebook makes billions of dollars of profit every year). But there's a second, more complex reason: the activity of content moderation just doesn't fit into Silicon Valley's self-image. "Certain types of activities are very highly valued and glamorized: product innovation, clever marketing, engineering ... the nitty-gritty world of content moderation doesn't fit into that," he says.

He thinks it's time for Facebook to treat moderation as a central part of its business. He says that elevating its status in this way would help avoid the sorts of catastrophic errors made in Myanmar, increase accountability, and better protect employees from harm to their mental health.

It seems an unavoidable reality that content moderation will always involve being exposed to some horrific material, even if the work is brought in-house. However, there is much more the company could do to make it easier: screening moderators better to make sure they are truly aware of the risks of the job, for example, and ensuring they have first-rate care and counseling available. Barrett thinks that content moderation could be something all Facebook employees are required to do for at least a year as a sort of "tour of duty" to help them understand the impact of their decisions.

The report makes eight recommendations for Facebook:

  • Stop outsourcing content moderation and raise moderators' station in the workplace.
  • Double the number of moderators to improve the quality of content review.
  • Hire someone to oversee content and fact-checking who reports directly to the CEO or COO.
  • Further expand moderation in at-risk countries in Asia, Africa, and elsewhere.
  • Provide all moderators with top-quality, on-site medical care, including access to psychiatrists.
  • Sponsor research into the health risks of content moderation, in particular PTSD.
  • Explore narrowly tailored government regulation of harmful content.
  • Significantly expand fact-checking to debunk false information.

The proposals are ambitious, to say the least. When contacted for comment, Facebook would not discuss whether it would consider enacting them. However, a spokesperson said its current approach means "we can quickly adjust the focus of our workforce as needed," adding that it "gives us the ability to make sure we have the right language expertise, and can quickly hire in different time zones, as new needs arise or when a situation around the world warrants it."

But Barrett thinks a recent experiment conducted in response to the coronavirus crisis shows change is possible. Facebook announced that because many of its content moderators were unable to go into company offices, it would shift responsibility to in-house employees for checking certain sensitive categories of content.

"I find it very telling that in a moment of crisis, Zuckerberg relied on the people he trusts: his full-time employees," he says. "Maybe that could be seen as the basis for a conversation within Facebook about adjusting the way it views content moderation."
