Rights Groups Demand Facebook Set Up Real Due Process Around Content Moderation

For quite some time now, when discussing how the various giant platforms should manage the nearly impossible challenges of content moderation, one argument I've fallen back on again and again is that they need to provide real due process. This is because, while there are all sorts of concerns about content moderation, the number of false positives that lead to "good" content being taken down is staggering. Lots of people like to point and laugh at these, but any serious understanding of content moderation at scale has to recognize that when you need to process many, many thousands of requests per day, often involving complex or nuanced issues, many, many mistakes are going to be made. And thus, you need a clear and transparent process that enables review.
A bunch of public interest groups (including EFF) have now sent an open letter to Mark Zuckerberg, requesting that Facebook significantly change its content removal appeal process, to be much clearer and much more accountable. The request first covers how clear the notice should be concerning what content caused the restriction and why:
Notice: Clearly explain to users why their content has been restricted.
- Notifications should include the specific clause from the Community Standards that the content was found to violate.
- Notice should be sufficiently detailed to allow the user to identify the specific content that was restricted, and should include information about how the content was detected, evaluated, and removed.
- Individuals must have clear information about how to appeal the decision.
And then it goes into many more details on how an appeal should work, involving actual transparency, more detailed explanations, and knowledge that an appeal actually goes to someone who didn't make the initial decision:
Appeals: Provide users with a chance to appeal content moderation decisions.
- The appeals mechanism should be easily accessible and easy to use.
- Appeals should be subject to review by a person or panel of persons not involved in the initial decision.
- Users must have the right to propose new evidence or material to be considered in the review.
- Appeals should result in a prompt determination and reply to the user.
- Any exceptions to the principle of universal appeals should be clearly disclosed and compatible with international human rights principles.
- Facebook should collaborate with other stakeholders to develop new independent self-regulatory mechanisms for social media that will provide greater accountability.
Frankly, I think this is a great list, and am dismayed that the large platforms haven't implemented something like this already. For example, we recently wrote about Google deeming our blog post on the difficulty of content moderation to be "dangerous or derogatory." In that case, we initially got no further information beyond that claim, and the appeals process was totally opaque. The first time we appealed, the ruling was overturned (again, with no explanation), and a month later, when that article got dinged again, the appeal was rejected.
After we published that article, an employee from the AdSense team eventually reached out to us to explain that it was "likely" that some of the comments on that article were what triggered the problems. After we pointed out that there were well over 300 comments on the article, we were eventually pointed to one particular comment that used some slurs, though the comment used them to demonstrate the ridiculousness of automated filters, rather than as derogatory epithets.
However, as I noted in my response, my main complaint was not Google's silly setup, but the fact that it provided no actual guidance. We were not told that a comment was to blame until our published article prompted someone higher up on the AdSense team to reach out. I pointed out that it seemed only reasonable for Google to tell us specifically which of its rules it felt we had violated and which content was the problem, so that we could then make an informed decision. Similarly, the appeals process was entirely opaque.
While the reasons that Google and Facebook have not yet created this kind of due process are obvious (it would be kinda costly, for one), it does seem like such a system will be increasingly important, and it's good to see these groups pushing Facebook on this in particular.
Of course, earlier this year, Zuckerberg had floated an idea of an independent (i.e. outside of Facebook) third party board that could handle these kinds of content moderation appeals, and... a bunch of people freaked out, falsely claiming that Zuckerberg wanted to create a special Facebook Supreme Court (even as he was actually advocating for having a body outside of Facebook reviewing Facebook's decisions).
No matter what, it would be good for the large platforms to start taking these issues seriously, not only for reasons of basic fairness and transparency, but because it would also help make the public more comfortable with how this process works. When it is, as currently constructed, a giant black box, it leads to a lot more anger and conspiracy thinking over how content moderation actually works.
Update: It appears that shortly after this post went out, Zuckerberg told reporters that Facebook is now going ahead with creating an independent body to handle appeals. We'll have more on this once some details are available.