Fox Used to Hate Disinformation Experts. Now It’s Hiring One.

by Ken Klippenstein, The Intercept

Fox News, one of the most relentless critics of the war on disinformation, now has a new challenge: Its parent company is looking to build up its own internal capability to combat disinformation.

Last week, Fox Corporation issued a job posting for a "corporate trust and safety behavioral analyst" whose responsibilities would include "identifying misinformation/disinformation." The job aims to establish a content moderation system across Fox's businesses, which include Fox News, to fight disinformation. The corporation will work in close coordination with unnamed partners both inside and outside the company, the posting says. To this end, Fox intends to use pattern recognition, a key component of artificial intelligence, to identify "hostile users," the job description says.

The analyst, Fox says, would tend to the "ongoing community health and brand safety of Fox sites and apps that interact directly with users" in order to safeguard "... user communities." A background in "psychology, criminal justice, social media, gaming, news or media" is a plus, the job announcement says.

Fox did not respond to a request for comment about the job posting.

The corporate concern with disinformation contrasts sharply with Fox News's overwhelmingly critical coverage of anti-disinformation efforts that police what is posted on social media, efforts the network consistently equates with censorship.

When the Department of Homeland Security created a now-defunct Disinformation Governance Board in 2022, prominent Fox News hosts condemned the move in sensational terms. Fox News host Sean Hannity and then-host Tucker Carlson both called the Disinformation Governance Board a "Ministry of Truth," a reference to the propaganda ministry of a totalitarian state from George Orwell's dystopian novel "1984." Fox News's Brian Kilmeade echoed their remarks, saying that it "looks like the Biden administration is taking Orwell's work not as a warning but as their own manual."

Since the board story broke, the network has been obsessed with the disinformation battle. In the week following the revelation of the Disinformation Governance Board, 70 percent of Fox's one-hour segments referenced disinformation and the DHS official in charge of the board, according to a defamation lawsuit Nina Jankowicz has filed against Fox News. During 2022, Fox News mentioned Jankowicz over 300 times, the lawsuit states. (Asked about the lawsuit, Irena Briganti, a spokesperson for Fox News, said that the company has filed a motion to dismiss.)

Fox's corporate interest in disinformation differs from the federal government's. Fox is interested in "audience engagement" - a term that appears almost half a dozen times in the job posting.

Among the responsibilities the posting lists: "Helping deliver innovative technology solutions to support user safety and increase engagement."

Much of the debate about content moderation focuses on heady subjects like freedom of speech and the threat of state-sponsored foreign influence campaigns. But largely absent from the discussion is the simple fact that it's profitable for companies to remove content that might offend advertisers or audiences. And with advancements in AI technology, it is increasingly possible to do so at scale.

In addition to machine learning, Fox's job posting references two other terms common to AI: large language models and natural language processing. This technology makes it possible to autonomously sift through vast amounts of data, which previously would have required expensive human teams. As a result, content moderation is going to be cheaper to conduct than ever before.
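
Nothing in the posting spells out how such a system would be built, and the sketch below is not Fox's tooling. It is only a rough illustration of what automated triage with off-the-shelf natural language processing can look like: a publicly available zero-shot classifier scores sample posts against moderation labels, with the model name, labels, and example posts all assumed for the sake of the example.

    # Illustrative sketch only - not Fox's system. It shows how an
    # off-the-shelf NLP model can triage large volumes of posts for
    # human review. Model, labels, and sample posts are assumptions.
    from transformers import pipeline

    # Zero-shot classification scores text against arbitrary labels
    # without any task-specific training data.
    classifier = pipeline("zero-shot-classification",
                          model="facebook/bart-large-mnli")

    posts = [
        "Breaking: the moon landing footage was filmed in a studio.",
        "The game starts at 7 p.m. ET tonight on the main channel.",
    ]
    labels = ["misinformation", "harassment", "ordinary discussion"]

    for post in posts:
        result = classifier(post, candidate_labels=labels)
        top_label, top_score = result["labels"][0], result["scores"][0]
        # A production pipeline would route high-scoring items to human
        # moderators rather than acting on the model's output directly.
        print(f"{top_label} ({top_score:.2f}): {post}")

The appeal for a large publisher is throughput: a single model like this can screen far more posts than a human team could, which is the cost shift driving the trend the article describes.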

Fox is far from the only company taking advantage of the breakthroughs in AI to respond to disinformation.

"More than 95 percent of the hate speech that we take down is done by an AI and not by a person," Mark Zuckerberg, CEO of Facebook (now Meta), told Congress in 2021. "And I think it's 98 or 99 percent of the terrorist content that we take down is identified by an AI and not a person."

The federal government is increasingly turning to AI to identify foreign influence operations, according to the Biden administration's new budget request delivered to Congress last month.

For the most part, the rapid changes brought about by the explosion of AI technology have yet to enter into the disinformation debate.

"I am pro-disinformation because one man's disinformation is another person's fact," Fox News host Greg Gutfeld said in 2022.

Gutfeld may want to take that up with his employer.
