
YouTube demonetizing videos where LGBTQ keywords are said

by Rob Beschizza

YouTube denies that it punishes users simply for being queer or using queer terms. But users of the platform are putting that claim to the test and finding many such phrases that lead to automatic demonetization.

"We tested 15,296 words against YouTube's bots, one by one, and determined which of those words will cause a video to be demonetized."

It seems quite damning, especially the tests showing videos being remonetized after removing LGBTQ keywords.

Above, "YouTube Analyzed" publicizes the list of terms suspected to trigger the bot. ("This list should not be viewed as "banned words on youtube... This list should be used as a reference, when trying to figure out why a video is [demonetized]")

Below, a clever youngster posted a video in which he utters just a few LGBTQ keywords: "LGBTQ, queer, lesbian, gay, bisexual, homosexual". Though he isn't monetized in the first place, he reports that the video soon acquired an "ineligible for monetizing" icon that doesn't appear on his other uploads.

Last month, The Verge reported on YouTubers who sued the company over discrimination that YouTube claims does not exist. Pink News found similar outcomes in a survey.

In a statement, YouTube denied that words describing the LGBT+ community cause videos to be demonetized. The company said it is "constantly evaluating our systems to help ensure that they are reflecting our policies without unfair bias".

A denial, but also an aftertaste of "we don't entirely know what our automated systems are up to."

The underlying implication is that YouTube has created a system that excludes content advertisers don't want to be associated with, perhaps a rushed response to the 2017 moral panic over the grotesque trash YouTube was running ads against. Now that this system exists, advertisers get a crude tool for content exclusion, and YouTube gets a lesson in unintended consequences.

Tech companies sometimes like to hide behind the suggestion that algorithms, computers making automated decisions, can't be bigoted. This is an example that makes clear how empty that argument is, and how an automated process can baldly reflect human bigotry.

Whether they feel responsible for their code's decisions is immaterial. They will be held responsible for it, one way or another.

Source: https://boingboing.net/