Facebook's content moderation a mess, employees outraged, contractors have PTSD: Reports
In a message to employees and on its official public blog, Facebook is today defending its choice to outsource content moderation to contractors.
Facebook has about 15,000 content reviewers, almost all of whom do not work for Facebook, but are contractors at staffing firms like Accenture and Cognizant.
From Bloomberg News' overview of the company's latest woes:
The company's decision to outsource these operations has been a persistent concern for some full-time employees. After a group of content reviewers working at an Accenture facility in Austin, Texas complained in February about not being allowed to leave the building for breaks or answer personal phone calls at work, a wave of criticism broke out on internal messaging boards. "Why do we contract out work that's obviously vital to the health of this company and the products we build," wrote one Facebook employee. (Bloomberg News viewed dozens of messages about the topic, on the condition that it not publish the names of people involved.)
A Facebook spokeswoman said there has been no change in policies at the facility in Austin, and that it has been working with Accenture to ensure practices comply with Facebook policies. Accenture did not respond to a request for comment.
The pressure on the company doesn't seem likely to subside. Over the years, a stream of media reports has detailed one of the Internet's most dystopian jobs. The most recent example came on Monday, when The Verge published a lengthy account from several Cognizant employees working in Phoenix. They described the trauma of being presented with an endless procession of graphic violence and disturbing sexual activities, and said the restrictive working conditions further aggravated their stress.
Legally, Facebook believes it is insulated from much of what goes on in outsourcing centers like the ones in Austin and Phoenix. Selena Scola, a content moderator working for a company called Pro Unlimited, sued Facebook in September, saying it was responsible for her post-traumatic stress disorder. In a court filing, the company responded by saying that Scola had no right to sue, because she was an independent contractor. It argued that any harm she suffered was either her own fault, or the fault of unnamed third parties. The case is pending.
But legal cover isn't the only consideration. Over the weekend, Facebook circulated an explanation on internal message boards trying to dispel employee concerns, and detailing how it planned to address questions about how staffing companies treat their employees. The message, posted publicly to Facebook's blog on Monday, was written by Justin Osofsky, Facebook's vice president of global operations. It said that outsourcing content review was the only way it could scale quickly enough to meet its needs. "Given the size at which we operate and how quickly we've grown over the past couple of years, we will inevitably encounter issues we need to address on an ongoing basis," he wrote.
Facebook Grappling With Employee Anger Over Moderator Conditions [bloomberg.com]
Don't miss the shocking related piece at The Verge by Casey Newton, highlighted among conversations on Twitter below.
Employees are developing PTSD-like symptoms after they leave the company, but are no longer eligible for any support from Facebook or Cognizant. "I'm fucked up," one moderator who now has PTSD and generalized anxiety disorder told me.
- Casey Newton (@CaseyNewton) February 25, 2019
This is a must-read from @CaseyNewton, who knows this topic so well, right from the gripping first line: "The panic attacks started after Chloe watched a man die.": The secret lives of Facebook moderators in America - The Verge https://t.co/lgZxZAGOqH
- Kara Swisher (@karaswisher) February 25, 2019
Today I want to tell you what it's like to be a content moderator for Facebook at its site in Phoenix, Arizona. It's a job that pays just $28,800 a year - but can have lasting mental health consequences for those who do it. https://t.co/quieXG1Bm9 pic.twitter.com/cwtHXIqgol
- Casey Newton (@CaseyNewton) February 25, 2019
I cannot stress this enough: These companies do not care. https://t.co/5YTGwL82BI
- Ben Collins (@oneunderscore__) February 25, 2019
The Facebook mods who don't quit from the trauma are radicalized by it https://t.co/tHyxuNscu4 pic.twitter.com/beMi9ciPFv
- Andy Campbell (@AndyBCampbell) February 25, 2019
Of all the disturbing parts of this @CaseyNewton piece about Facebook content reviewers -- and there are many -- the one about people slowly coming to believe the conspiracy theories sticks with me https://t.co/ulDx3PEaWa
- Joshua Brustein (@joshuabrustein) February 25, 2019