
Elon Discovers When Content Moderation Makes Sense: When He Can Use It To Protect Racist Bigots From Being Called Out

by Mike Masnick from Techdirt on (#6KMKZ)

Right after Elon took over Twitter, we published what we had hoped would be a useful "speed run" through the content moderation learning curve that most platforms figure out along the way. We've seen other "free speech!" platforms learn these basic lessons, though not always quickly enough to survive.

The basic idea is that no one actually wants to be on a platform that doesn't enforce some fairly basic rules. But the actual rules can vary a lot.

And while Elon Musk continues to insist to this day that he wants his site to be the platform for all "legal" speech, he has an odd way of showing it, regularly banning or suppressing perfectly legal speech that he just doesn't like.

The latest one, though, is particularly telling. For years, an extraordinarily awful (in all the most awful ways) comic called "Stonetoss" has made the rounds among all the worst people. It had some similarities to an earlier cartoon, which was equally awful, called Red Panels. Both comics were focused on the most bigoted, awful shit: anti-LGBTQ, antisemitic, racist nonsense. Some researchers figured out that the person behind Stonetoss (who went by the name "Stonetoss" online) was also behind Red Panels and was actually a dude in Texas named Hans Kristian Graebener. And, not surprisingly, he's expressed support for literal neo-Nazis (so, no, this isn't one of those cases where people overreact in calling someone a supporter of neo-Nazis).

And Elon decided to suspend anyone who mentioned Hans Kristian Graebener's name from ExTwitter. This came after Graebener himself appealed to Musk, asking for the thread revealing his name to be deleted in the name of "free speech" (lol).

On Thursday, the Stonetoss account appealed to X users who have a "direct line" to Musk, X's owner, to help to get the thread deleted. Musk has, in the past, shared an altered version of a Stonetoss cartoon about the collapse of society. "If Elon's idea of a 'free speech' website is one where people can be intimidated into silence, the outcome will be a site where the Stasi will drive out all dissent," Stonetoss wrote. The account also tagged Musk and offered to share a list of people to target.

In a subsequent post, Stonetoss said this appeal was not about him but about "other artists."

"This is about others I know personally," Stonetoss wrote. "There is a whole ecosystem of artists out there who cannot (or have stopped) making art because of people on twitter organized to punish them IRL for doing so." The cartoonist also added that sales of his plush toy were "going gangbusters" since his alleged identity was revealed.

Won't somebody please think of the poor, oppressed neo-Nazi cartoonists?

Now, this is kind of funny, because as many people have pointed out over the years, the reason you have rules on sites and trust & safety teams is not "political censorship," but rather to avoid harassment and abuse that drives people from the site. So, now, suddenly, this reactionary creep who has regularly made fun of others for wanting safe spaces and the horrible "woke" view that people shouldn't be harassed is worried about people who will stop "making art" because "people on twitter organized to punish them." Oh, really?

And, of course, Musk took it further and started banning anyone mentioning Graebener.

Hours later, the thread posted by the account associated with the Anonymous Comrades Collective was deleted, and the account was suspended. On Friday, dozens of users, including a number of researchers and journalists, began discussing the incident and posting some of the details of the research, including Graebener's name.

X locked down many of these accounts and ordered them to delete the offending tweet to get full access to their accounts back. Among those targeted were Jared Holt, a senior research analyst at the Institute for Strategic Dialogue, who covers right-wing extremism; Hannah Gais, a senior research analyst at Southern Poverty Law Center; and Steven Monacelli, an investigative journalist for the Texas Observer. (WIRED has also published Monacelli's work.)

X also imposed a ban on sharing the link to the Anonymous Comrades Collective blog detailing its research. WIRED verified this on Monday morning by attempting to post the link, only to be met with a pop-up message that read: "We can't complete this request because this link has been identified by X or our partners as being potentially harmful."

Free speech!

The whole thing quickly turned into a Streisanding for Graebener with tons of people posting about him on social media, and articles like this:

[Screenshot of a news article about Graebener]

In some ways this is reminiscent of when Elon banned the ElonJet account (which he had promised to leave intact) and then started banning any account that merely mentioned ElonJet's existence. After people criticized that, Elon retconned an excuse, about how ElonJet was "doxxing" (it wasn't) and it was publishing "assassination coordinates" (it wasn't).

But, at least in that case, you could argue that there was some information (location of a plane) that some might consider kinda intrusive, even though it's public information, required by law to be available. But here, we're just talking about someone's name.

ExTwitter quietly changed their privacy policy as more people complained. They added in this line as a violation:

  • the identity of an anonymous user, such as their name or media depicting them

Of course, if you go just a few lines below that, to the section "What is not a violation of this policy?", you also find out that it says the following are "not in violation of this policy:"

sharing information that we don't consider to be private, including:

  • names

Ah. Well.

As I noted on our recent Ctrl-Alt-Speech episode, we now have Schrödinger's privacy policy. Mentioning names both is and isn't allowed. It's a quantum superposition of content moderation that only collapses when observed by Musk himself.

And, of course, it's notable that while he insisted these kinds of things were a problem of the "woke mind virus" that had infected Twitter's old employees, it seems only fair to point out that Elon himself appears to be infected by racist brainworms, leading to his decision to protect this bigoted asshole against the clearly legal speech of revealing the dude's name.

That shouldn't be much of a surprise, though. As Greg Sargent at the New Republic recently highlighted, Elon seems to keep diving deeper and deeper into extremist conspiracy nonsense around the "great replacement theory" that has resulted in real-world violence.

It shouldn't be any real surprise, then, that as Musk has embraced the kinds of Nazi-adjacent ideas that Stonetoss has also been promoting for years, he would use his understanding of content moderation to not actually protect marginalized groups, but to protect those pushing for further marginalization and harm.

Musk is turning his platform into a cozy nook for neo-Nazis. He's rolled out the welcome mat and fluffed the pillows, while making it clear that those who might want to push back on fascism and bigotry are not welcome at all. His moderation practices appear even more biased and arbitrary than the old Twitter's. It's just that they're in favor of the worst fucking people in the world. But sure, tell us more about how you're a 'free speech absolutist,' Elon. We're all ears.
