The Impossibility Of Content Moderation: YouTube's New Ban On Nazis Hits Reporter Who Documents Extremism, Professor Teaching About Hitler

by
Mike Masnick
from Techdirt on (#4GQQE)

So just as the recent big content moderation mess was happening on YouTube, the company announced that it had changed its policies to better deal with violent extremism and supremacism on the platform:

Today, we're taking another step in our hate speech policy by specifically prohibiting videos alleging that a group is superior in order to justify discrimination, segregation or exclusion based on qualities like age, gender, race, caste, religion, sexual orientation or veteran status. This would include, for example, videos that promote or glorify Nazi ideology, which is inherently discriminatory. Finally, we will remove content denying that well-documented violent events, like the Holocaust or the shooting at Sandy Hook Elementary, took place.

The timing of this announcement was seen as curious (or, at the very least, poorly timed), as it came just hours after YouTube had refused to take down Steven Crowder's account (see the earlier post linked above). That wasn't an identical situation, but it was analogous enough that plenty of people commented on the juxtaposition.

In making the announcement, YouTube correctly noted that this new bit of line-drawing could create problems, including for researchers and organizations tracking hate and extremism:

We recognize some of this content has value to researchers and NGOs looking to understand hate in order to combat it, and we are exploring options to make it available to them in the future. And as always, context matters, so some videos could remain up because they discuss topics like pending legislation, aim to condemn or expose hate, or provide analysis of current events. We will begin enforcing this updated policy today; however, it will take time for our systems to fully ramp up and we'll be gradually expanding coverage over the next several months.

But within hours of the new policy rolling out, we were already seeing how difficult it is to implement without taking down content that probably deserves to remain up. Ford Fischer, a reporter who tracks extremist and hate groups, and whose work is regularly cited, noted that his own channel had been demonetized.

Within minutes of @YouTube's announcement of a new purge it appears they caught my outlet, which documents activism and extremism, in the crossfire.

I was just notified my entire channel has been demonetized. I am a journalist whose work there is used in dozens of documentaries. pic.twitter.com/HscG2S4dWh

- Ford Fischer (@FordFischer) June 5, 2019

Fischer then discusses the specific videos that YouTube says are the reason for the action -- and they do include Holocaust denialism, but for the sake of documenting it, not promoting it:

The only other one flagged was raw video of a speech given by Mike Peinovich "Enoch." While unpleasant, this documentation is essential research for history.

Indeed, this exact footage was used in a PBS documentary I associate produced, which MLKIII presented at the premiere. pic.twitter.com/O5p1tPoFnH

- Ford Fischer (@FordFischer) June 5, 2019

And this gets, once again, to the very problem of expecting platforms to police this kind of speech. The exact same content can mean very different things in different contexts. In some cases, it may be used to promote odious ideology. In other cases, it's used to document and expose that ideology and the ignorance and problems associated with it.

But how do you craft a policy that can distinguish one from the other? As YouTube is discovering (in truth, it probably already knew this), the answer is that you don't. Any policy ends up creating some sort of collateral damage, and pressure from well-meaning people tends to push platforms toward ever-greater takedowns. But if, in the process, we end up sweeping the documentation under the rug, that's a problem as well.

Here's another example: right after YouTube's new policy was put in place, a history teacher found that his own YouTube channel was banned. Why? Because he hosted archival footage of Hitler:

YouTube have banned me for 'hate speech', I think due to clips on Nazi policy featuring propaganda speeches by Nazi leaders. I'm devastated to have this claim levelled against me, and frustrated 15yrs of materials for #HistoryTeacher community have ended so abruptly.@TeamYouTube

- Mr Allsop History (@MrAllsopHistory) June 5, 2019

"My stomach fell," Allsop told BuzzFeed News via email. "I'm a history teacher, not someone who promotes hatred. I share archive footage and study materials to help students learn about the past."

Once again, it often sounds easy to say something like "well, let's ban the Nazis." I'd even argue it's a reasonable goal for a platform to have a blanket "no Nazis" policy. But the reality is that implementation is not nearly as easy as many people believe. The end result can be that archival and documentary footage gets blocked, and that could have serious long-term consequences if part of our goal is to educate people about why the Nazis were bad.

Of course, none of this should come as a surprise to anyone who's been dealing with these issues over the past couple of decades. Early attempts to ban "porn" also took down information on breast cancer. Attempts to block "terrorist content" have repeatedly taken down people documenting war crimes. This kind of thing happens over and over and over again and believing that this time will magically be different is a fool's errand.


