Substack Realizes Maybe It Doesn’t Want To Help Literal Nazis Make Money After All (But Only Literal Nazis)

by Mike Masnick, from Techdirt

Last year, soon after Elon completed his purchase of (then) Twitter, I wrote up a "20 level speed run" of the content moderation learning curve. It seems like maybe some of the folks at Substack should be reading it these days?

As you'll recall, last April, Substack CEO Chris Best basically made it clear that his site would not moderate Nazis. As I noted at the time, any site (in the US) is free to make that decision, but those making it shouldn't pretend that it's based on any principles, because the end result is likely to be a site full of Nazis, and that tends not to be good for business: other people you might want to do business with might not want to be on a site that welcomes Nazis.

Thus, it should not have been shocking when, by the end of the year, Substack had a site with a bunch of literal Nazis. And, no, we're not just talking about people with strong political viewpoints that lead their opponents to call them Nazis. We're talking about people who are literally embracing Nazism and Nazi symbols.

And Substack was helping them make money.

Even worse, Substack co-founder Hamish McKenzie put out a ridiculous, self-serving statement pretending that their decision to help monetize Nazis was about civil liberties, even as the site regularly deplatformed anything about sex. At that point, you're admitting that you moderate, and then it's just a question of which values you moderate for. McKenzie was claiming, directly, that they were cool with Nazis, but sex was bad.

The point of the content moderation learning curve is not to say that there's a right way or a wrong way to handle moderation. It's just noting that if you run a platform that allows users to speak, you have to make certain calls on what speech you're going to allow and what you're not going to allow - and you should understand that some of those choices have consequences.

In the case of Substack, some of those consequences were that some large Substack sites decided to jump ship. Rusty Foster's always excellent "Today in Tabs" switched over to Beehiiv. And then, last week, Platformer News, Casey Newton's widely respected newsletter with over 170,000 subscribers, announced that if Substack refused to remove the Nazi sites, it would leave.

Content moderation often involves difficult trade-offs, but this is not one of those cases. Rolling out a welcome mat for Nazis is, to put it mildly, inconsistent with our values here at Platformer. We have shared this in private discussions with Substack and are scheduled to meet with the company later this week to advocate for change.

Meanwhile, we're now building a database of extremist Substacks. Katz [the journalist whose Atlantic article documented the Nazi newsletters on Substack] kindly agreed to share with us a full list of the extremist publications he reviewed prior to publishing his article, most of which were not named in the piece. We're currently reviewing them to get a sense of how many accounts are active, monetized, display Nazi imagery, or use genocidal rhetoric.

We plan to share our findings both with Substack and, if necessary, its payments processor, Stripe. Stripe's terms prohibit its service from being used by any business or organization that "a. engages in, encourages, promotes or celebrates unlawful violence or physical harm to persons or property, or b. engages in, encourages, promotes or celebrates unlawful violence toward any group based on race, religion, disability, gender, sexual orientation, national origin, or any other immutable characteristic."

It is our hope that Substack will reverse course and remove all pro-Nazi material under its existing anti-hate policies. If it chooses not to, we will plan to leave the platform.

As a result of those meetings, Substack has now admitted that some of the outright Nazis actually do violate "existing" rules, and will be removed.

Substack is removing some publications that express support for Nazis, the company said today. The company said this did not represent a reversal of its previous stance, but rather the result of reconsidering how it interprets its existing policies.

As part of the move, the company is also terminating the accounts of several publications that endorse Nazi ideology and that Platformer flagged to the company for review last week.

The company will not change the text of its content policy, it says, and its new policy interpretation will not include proactively removing content related to neo-Nazis and far-right extremism. But Substack will continue to remove any material that includes "credible threats of physical harm," it said.

As law professor James Grimmelmann writes in response: "As content moderation strategies go, 'We didn't realize until now that the Nazis on our platform were inciting violence' perhaps raises more questions than it answers."

Molly White, who remains one of the best critics of tech boosterism, also noted that Substack's decisions seemed likely to piss off the most people possible, by first coddling the Nazis (pissing off most people who hate Nazis), and then pissing off the people who cheered on the "we don't moderate Nazis" stance.

In the end, Substack is apparently removing five Nazi newsletters. As White notes, this will piss off the most people possible. The people who want Substack to do more won't be satisfied and will be annoyed it took pointing out the literal support for genocide for Substack to realize that maybe they don't want literal Nazis. And the people who supported Substack will be annoyed that Substack was "pressured" into removing these accounts.

Again, there are important points in all of this, and it's why I started this post off by pointing to the speed run post at the beginning. You can create a site and say you'll host whatever kinds of content you want. You can create a site and say that you won't do any moderation at all. Those are valid decisions to make.

But they're not decisions that are in support of "free speech." Because a site that caters to Nazis is not a site that caters to free speech. Because (as we've seen time and time again), such sites drive away people who don't like being on a site associated with Nazis. And so you're left in a situation where you're really just supporting Nazis and not much else.

Furthermore, for all of McKenzie's pretend high-minded talk about "civil liberties" and "freedom," it's now come out that he had no problem at all putting his fingers on the scale by assembling a list of (mostly) nonsense peddlers to sign a letter in support of his own views. McKenzie literally organized the "we support Substack supporting Nazis" letter-signing campaign. Which, again, he's totally allowed to do, but it calls into question his claimed neutrality in all of this. He's not setting up a "neutral" site to host speech. He's created a site that hosts some speech and doesn't host other speech. It promotes some speech, and doesn't promote other speech.

Those are all choices, and they have nothing to do with supporting free speech.

Running a private website is all about tradeoffs. You have to make lots of choices, and those choices are difficult and are guaranteed to piss off many, many people (no matter what you do). For what it's worth, this is still why I think a protocol-based solution should beat a centralized solution every time, because with protocols you can set up a variety of approaches and let people figure out what works best, rather than relying on one centralized system.

Substack is apparently realizing that there were some tradeoffs to openly supporting Nazism, and will finally take some action on that. It won't satisfy most people, and now it's likely to piss off the people who were excited about Nazis on Substack. But, hey, it's one more level up on the content moderation speed run.
