
Researchers Confirm: Content Moderation Appears To Target Dangerous Nonsense, Not Political Ideology

by Mike Masnick, from Techdirt

Going back many, many years, we've written about how the public narrative that the large social media networks engage in "anti-conservative bias" in their content moderation policies is bullshit. Because it is. And now we have yet another scientific study to prove this.

The first time we covered it was in response to a ridiculous "study" from MAGA darling and "unremarkable racist" Richard Hanania. He released a ridiculously embarrassing study claiming to "prove" anti-conservative bias in Twitter suspensions. It was based on a self-collected set of just 22 examples of accounts suspended from Twitter, in which Hanania noted that "21 of the 22 accounts were Trump supporters." What he left out of his analysis was that a bunch of those "Trump supporters" were... out and out neo-Nazis, including (I'm not joking) the American Nazi Party account.

Since then, many other actual studies have called bullshit on the claims of anti-conservative bias in moderation. Indeed, the evidence has suggested that both Twitter and Facebook adjusted their rules to tolerate even greater levels of rule violations from MAGA supporters, just to avoid the appearance of anti-conservative bias. That is, their bias was actually pro-MAGA, in that they loosened the rules for Trump-supporting accounts, allowing them to break the rules more frequently.

This is what people mean when they talk about "working the refs." So much of the whining and complaining about how everyone is "biased" against "conservatives" (though I'd argue the MAGA movement is hardly "conservative") is really about making sure that anyone in a position of gatekeeping or arbitration gives them more leeway to break the rules, simply to avoid the appearance of bias.

That means that the constant accusations that everyone (mainstream media, social media, etc.) is unfairly biased against the MAGA movement get us the exact opposite: an unfair bias that gives MAGA folks a pass on breaking not just the rules, but general societal norms like... not contesting the results of a presidential election.

Two years ago (just as Elon Musk was gearing up to acquire Twitter to fight back against what he insisted was "bias" in their moderation policies), we wrote about a preprint of a study by a group of researchers, including David Rand, Mohsen Mosleh, Qi Yang, Tauhid Zaman, and Gordon Pennycook.

This week, an updated version of that study has finally been published in the prestigious journal Nature. Its findings are pretty clear: content moderation does not appear to be focused on ideology, but does target potentially dangerous disinformation. The simple reality is that the MAGA world is way, way, way, way more likely to post absolute fucking nonsense.

We first analysed 9,000 politically active Twitter users during the US 2020 presidential election. Although users estimated to be pro-Trump/conservative were indeed substantially more likely to be suspended than those estimated to be pro-Biden/liberal, users who were pro-Trump/conservative also shared far more links to various sets of low-quality news sites - even when news quality was determined by politically balanced groups of laypeople, or groups of only Republican laypeople - and had higher estimated likelihoods of being bots. We find similar associations between stated or inferred conservatism and low-quality news sharing (on the basis of both expert and politically balanced layperson ratings) in 7 other datasets of sharing from Twitter, Facebook and survey experiments, spanning 2016 to 2023 and including data from 16 different countries. Thus, even under politically neutral anti-misinformation policies, political asymmetries in enforcement should be expected. Political imbalance in enforcement need not imply bias on the part of social media companies implementing anti-misinformation policies.

I think it's important that these researchers point out that they even had "groups of only Republicans" rate the quality of the news sources that the MAGA world was pushing.

Often, in discussions around bias in other contexts, there are debates about whether it makes more sense to aim for equality of opportunity or equality of outcomes. This is often demonstrated with some variation of the graphic below, created by the Interaction Institute for Social Change, which has become quite a meme and comes up in lots of culture war discussions.

[Image: the Interaction Institute for Social Change's "equality vs. equity" graphic]

But, in many ways, the debate over social media moderation and bias is just a different form of that same argument (though, in some weird ways, with the viewpoints reversed from the usual conservative/liberal split). On issues of bias in opportunity, the "traditional" (grossly generalized!) view is that "conservatives" want equality of opportunity (the left side of the picture) and "liberals" prefer equality of outcomes (the right side).

When it comes to social media moderation, the roles seem somewhat reversed. The MAGA world insists that since they get moderated more often, showing that the outcomes are "uneven," it proves an unfair bias.

Yet, as this study shows, if the inputs (i.e., the likelihood of sharing absolutely dangerous bullshit nonsense) are uneven, then of course the outputs will be uneven.
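To make that arithmetic concrete, here's a tiny back-of-the-envelope sketch. The numbers are invented for illustration, not taken from the study: apply one identical, politically neutral per-violation rule to two groups that differ only in how often they share low-quality links, and the suspension rates come out lopsided.

```python
# Toy illustration with invented numbers (not the study's data): one
# politically neutral rule, applied identically to two groups with
# different base rates of low-quality sharing.

posts_per_user = 100    # assumed posting volume, identical for both groups
p_per_violation = 0.01  # neutral rule: 1% chance each bad share is sanctioned

# Hypothetical fractions of posts that are low-quality links.
share_rates = {"heavy_sharers": 0.08, "light_sharers": 0.02}

for group, rate in share_rates.items():
    bad_posts = posts_per_user * rate
    # Probability of being suspended at least once.
    p_suspended = 1 - (1 - p_per_violation) ** bad_posts
    print(f"{group}: {p_suspended:.3f}")

# Same rule, same enforcement: the heavy-sharing group ends up suspended
# roughly four times as often, purely because its inputs differ.
```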

And that's true even after working the refs. When the MAGA world is so committed to pushing blatantly false misinformation, some of which could cause real harm that a platform might not want to host, the outcome may still be that they end up getting suspended more often, even when sites like Facebook bend over backwards to give MAGA folks more leeway to violate the rules.

The study makes that clear. It notes that the greatest predictor of getting suspended was not "are you conservative?" but "are you sharing bullshit?" People who supported Trump but didn't share nonsense were less likely to be suspended. People who supported Biden (in 2020) but did share nonsense were more likely to be suspended.

The determining factor here was sharing nonsense, not political ideology. It's just that Trump supporters shared way more nonsense.

[Figure from the study]

The researchers also explore what would happen if a "totally neutral anti-misinformation policy" were implemented. And... they found nearly identical results:

Using this approach, we find that suspending users for sharing links to news sites deemed to be untrustworthy by politically balanced groups of laypeople leads to higher rates of suspension for Republicans than Democrats... For example, if users have a 1% chance of getting suspended each time they share a low-quality link, 2.41 times more users who shared Trump hashtags would be suspended compared with users who shared Biden hashtags (d=0.63; t-test, t(8,998) = 30.1, P<0.0001). Findings are equivalent when basing suspension on expert assessments of the 60 news sites, or when correlating predicted suspension rate with ideology (0.31<r<0.39, depending on ideology measure; P<0.0001 for all); ...

[....]

These analyses show that even in the absence of any (intentional) disparate treatment on the part of technology companies, partisan asymmetries in sanctioned behaviours will lead to (unintentional) disparate impact whereby conservatives are suspended at greater rates. From a legal perspective, political orientation is not a protected class in the USA and thus neither form of disparate treatment is illegal (although potentially still normatively undesirable). Although disparate impact may reasonably be considered to constitute discrimination in some cases (for example, employment discrimination on the basis of job-irrelevant factors that correlate with race), in the present context reducing the spread of misinformation and the prevalence of bots are legitimate and necessary goals for social media platforms. This makes a normative case for disparate impact on the basis of political orientation.
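For readers who want to see the shape of that counterfactual exercise, here's a rough simulation sketch. Only the 1%-per-share suspension rule comes from the quoted passage; everything else (the group means, the share-count distribution, the group labels) is an assumption made up for illustration, not the researchers' actual data or code.

```python
import random

random.seed(0)
P_SUSPEND_PER_SHARE = 0.01  # the neutral rule from the quoted passage

def fraction_suspended(mean_bad_shares, n_users=10_000):
    """Fraction of simulated users suspended at least once under the rule."""
    suspended = 0
    for _ in range(n_users):
        # Invented distribution of low-quality shares per user.
        shares = int(random.expovariate(1 / mean_bad_shares))
        if random.random() < 1 - (1 - P_SUSPEND_PER_SHARE) ** shares:
            suspended += 1
    return suspended / n_users

# Hypothetical group means, chosen only to show the asymmetry.
rate_a = fraction_suspended(mean_bad_shares=12)  # heavier low-quality sharing
rate_b = fraction_suspended(mean_bad_shares=5)   # lighter low-quality sharing

print(f"group A suspended: {rate_a:.3f}")
print(f"group B suspended: {rate_b:.3f}")
print(f"ratio: {rate_a / rate_b:.2f}")  # > 1 despite a perfectly neutral rule
```

The specific ratio this toy model spits out doesn't matter; the point is that any ratio above 1 here emerges entirely from the inputs, since the rule itself contains zero ideological signal.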

This shouldn't be surprising to folks who have followed this space for a while. Indeed, it confirms a lot of what many of us have been saying for years. But it's certainly nice to have the data to support the findings.
