YouTube Warns That, Thanks To Covid-19, It's Handing Over More Content Moderation To The Machines And They Might Suck

Content moderation at scale is impossible to do well in the best of times, but the solutions that at least keep it from devolving into a total mess almost always involve humans and technology working together. But what do you do when the humans are sick, self-isolating, or quarantined? While I imagine some may be able to work from home, it's a difficult time to expect anyone to be at full productivity. So YouTube has made it clear that it's turning over more content moderation decisions to the machines, knowing full well that some of those decisions are going to be bad:
Our Community Guidelines enforcement today is based on a combination of people and technology: Machine learning helps detect potentially harmful content and then sends it to human reviewers for assessment. As a result of the new measures we're taking, we will temporarily start relying more on technology to help with some of the work normally done by reviewers. This means automated systems will start removing some content without human review, so we can continue to act quickly to remove violative content and protect our ecosystem, while we have workplace protections in place.
As we do this, users and creators may see increased video removals, including some videos that may not violate policies. We won't issue strikes on this content except in cases where we have high confidence that it's violative. If creators think that their content was removed in error, they can appeal the decision and our teams will take a look. However, note that our workforce precautions will also result in delayed appeal reviews. We'll also be more cautious about what content gets promoted, including livestreams. In some cases, unreviewed content may not be available via search, on the homepage, or in recommendations.
And, of course, this is absolutely the right choice to make -- indeed, it's the only choice to make given the circumstances. But it's yet another reminder of how fragile the whole system is, and how impossible the demands are, when people insist that humans review everything posted to social media. Either way, don't be surprised to hear many more stories of bad content moderation decisions, not just on YouTube but elsewhere, in the coming weeks. I still imagine people will scream and yell and take those decisions personally, but they should at least recognize that part of the problem may be that the humans are all kinda preoccupied with more important things right now.
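To make the tradeoff in YouTube's announcement concrete, here's a minimal sketch (in Python) of the kind of threshold-based routing it describes: auto-remove at a lower bar, issue strikes only at a very high bar, and keep unreviewed borderline content out of search and recommendations. Every name, function, and number here is a hypothetical illustration, not YouTube's actual system or API.

```python
# Hypothetical sketch of threshold-based moderation routing.
# Nothing here reflects YouTube's real internals; names and
# thresholds are invented for illustration only.
from dataclasses import dataclass
from enum import Enum, auto


class Action(Enum):
    REMOVE_WITH_STRIKE = auto()  # high confidence the content is violative
    REMOVE_NO_STRIKE = auto()    # removed by automation, appealable, no strike
    LIMIT_PROMOTION = auto()     # stays up, withheld from search/recommendations
    ALLOW = auto()


@dataclass
class Video:
    video_id: str
    violation_score: float  # classifier confidence, 0.0-1.0 (hypothetical)


# Hypothetical thresholds: with human reviewers unavailable, the removal
# bar is lowered (accepting more false positives), but the strike bar
# stays very high, matching the "high confidence" language in the quote.
STRIKE_THRESHOLD = 0.98
REMOVE_THRESHOLD = 0.80
PROMOTE_THRESHOLD = 0.40

# Appeals and borderline cases wait here for (delayed) human review.
human_review_queue: list[str] = []


def route(video: Video) -> Action:
    """Decide what automation does with a video when human review is scarce."""
    if video.violation_score >= STRIKE_THRESHOLD:
        return Action.REMOVE_WITH_STRIKE
    if video.violation_score >= REMOVE_THRESHOLD:
        # Removed without human review; the creator can appeal, but the
        # appeal sits in a queue until reviewers are back.
        human_review_queue.append(video.video_id)
        return Action.REMOVE_NO_STRIKE
    if video.violation_score >= PROMOTE_THRESHOLD:
        # Unreviewed borderline content stays up but isn't surfaced on
        # the homepage, in search, or in recommendations.
        return Action.LIMIT_PROMOTION
    return Action.ALLOW
```

The design choice is visible in the gap between the two upper thresholds: lowering the removal bar is what produces the "videos that may not violate policies" YouTube warns about, while keeping the strike bar high limits the lasting damage from those inevitable mistakes.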