
Be Careful What You Wish For In Asking Silicon Valley To Police Speech Online

by Mike Masnick

We live in a weird moment right now where any piece -- no matter how misleading or unhinged -- seems able to find a place to be published so long as it blames basically everything on the big internet companies and demands that they do more (or sometimes less) to stop bad stuff from happening online. There are still a few brave souls out there pointing out how problematic all of this might be, and thankfully the EFF's executive director, Cindy Cohn, has taken to the pages of Wired to explain why asking internet companies to stifle speech online could backfire in a really big way. She notes that it's a reasonable emotional reaction to mass murdering assholes posting screeds on 8chan to want to shut the site down entirely, but that doing so comes with serious costs as well.

The hope -- for some it may be a belief -- is that eliminating online speech forums will help prevent future violence. This is understandable. Everyone wants to live in a country where they are safe in their local stores, at festivals, and in other public places. The vile ideas driving shooters whose actions have caused unspeakable pain and loss are in plain view on 8chan, and the thought that we could just make them go away has strong appeal.

But this is also a critical moment to look closely at what is being proposed and pay attention to the potential consequences for us all. We all rely on our internet communities for connection, information, and organizing against violence. The same mechanisms used to eliminate online forums hosting objectionable speech are all too often used to silence marginalized voices of people who have important things to say, including those drawing attention to hate and misogyny. Rules prohibiting violence have already taken down Syrian YouTube channels documenting human rights violations, while Facebook discussions by black Americans about racism they have experienced have been removed as hate speech.

She then discusses the two key tools that people have proposed for dealing with such speech online: deplatforming and increasing liability under Section 230 of the Communications Decency Act. Both would have significant costs.

After the 2009 shootings at Fort Hood in Texas, we saw calls to ban forums where Muslims gathered to speak. We've seen hate speech prohibitions in companies' terms of service used to silence conversations among women of color about their experiences being harassed. We've seen regulations on violent content result in the erasure of vital documentation of human rights abuses in Egypt and Kashmir, and domestic law enforcement brutality here in the United States. We've seen efforts to convince upstream providers to block information about problems with electronic voting machines and actions to protect the environment.

Both strategies also assume that we want to double down on the idea that representatives from private companies -- generally underpaid and traumatized content moderation contractors, but also the creators of unmoderated forums like 8chan -- should be the primary deciders about what gets to be on the internet. They also assume that there is global agreement about what should be allowed and what should be banned.

Indeed, the fact that increasing liability under Section 230 might cut against efforts to deplatform bad actors is something rarely discussed by those who support both moves. I know it's no fun to point out that the "easy" solutions people say will solve everything aren't easy and won't solve much at all, but kudos to Cindy for doing so.


