Insanity: Theresa May Says Internet Companies Need To Remove 'Extremist' Content Within 2 Hours
It's fairly stunning just how much people believe that it's easy for companies to moderate content online. Take, for example, this random dude who assumes it's perfectly reasonable for Facebook, Google and Twitter to "manually review all content" on their platforms (and since Google is a search engine, I imagine this means basically all public web content that can be found via its search engine). This is, unfortunately, a complete failure of basic comprehension about the scale of these platforms and how much content flows through them.

Tragically, it's not just random Rons on Twitter with this idea. Ron's tweet was in response to UK Prime Minister Theresa May saying that internet platforms must remove "extremist" content within two hours. This is after the UK's Home Office noted that it sees links to "extremist content" remaining online for an average of 36 hours. Frankly, 36 hours seems incredibly low. That's pretty fast for platforms to discover such content, make a thorough analysis of whether or not it truly is "extremist content," and figure out what to do about it. Various laws on takedowns usually include statements about a "reasonable" amount of time to respond -- and while there are rarely set numbers, the general rule of thumb seems to be approximately 24 hours after notice (which is pretty aggressive).
But for May to now be demanding two hours is crazy. It's a recipe for widespread censorship. Already we see lots of false takedowns from these platforms as they try to take down bad content -- we write about them all the time. And when it comes to "extremist" content, things can get particularly ridiculous. A few years back, we wrote about how YouTube took down an account that was documenting atrocities in Syria. And the same thing happened just a month ago, with YouTube deleting evidence of war crimes.
So, May calling for these platforms to take down extremist content in two hours confuses two important things. First, it shows a near total ignorance of the scale of content on these platforms. There is no possible way to actually monitor all of this stuff. Second, it shows a real ignorance about the whole concept of "extremist" content. There is no clear definition of it, and without a clear definition, wrong decisions will be made. Frequently. Especially if you're not giving the platforms any time to actually investigate. At best, you're going to end up with a system of weak AI flagging certain things, and then low-paid, poorly trained individuals in far-off countries making quick decisions.
And since the "penalty" for leaving content up will be severe, the incentives will all push towards taking down the content and censorship. The only pushback against this is the slight embarrassment if someone makes a stink about mistargeted takedowns.
Of course, Theresa May doesn't care about that at all. She's been bleating on about censoring the internet to stop terrorists for quite some time now -- and appears willing to use any excuse and make ridiculous demands along the way. It doesn't appear she has any interest in understanding the nature of the problem, as it's much more useful to her to blame others for terrorist attacks on her watch than to actually do anything legitimate to stop them. Censoring the internet isn't a solution, but it allows her to cast blame on foreign companies.