
Facebook studied how it polarizes users, then ignored the research

by Xeni Jardin

"Our algorithms exploit the human brain's attraction to divisiveness."

"64% of all extremist group joins are due to our recommendation tools"

GOP operative turned Facebook policy VP Joel Kaplan, who threw a party for Brett Kavanaugh upon his Supreme Court confirmation, killed any action on Facebook's internal findings, the Wall Street Journal reports.

Mark Zuckerberg and other top executives at Facebook shelved damning internal research into the social media platform's polarizing effect, which hampered efforts to apply its conclusions to products and minimize harm, reported the Wall Street Journal on Tuesday.

Facebook's own internal research discovered that most people who joined extremist groups did so as a result of Facebook's recommendation algorithms.

The company shelved the research and pressed on, making money and radicalizing Americans.

The Wall Street Journal report by Deepa Seetharaman and Jeff Horwitz is based on company sources and internal documents. One internal Facebook presentation slide from 2018 laid out the issue like this: "Our algorithms exploit the human brain's attraction to divisiveness."

"If left unchecked," it warned, Facebook would feed users "more and more divisive content in an effort to gain user attention & increase time on the platform."

Excerpt:

That presentation went to the heart of a question dogging Facebook almost since its founding: Does its platform aggravate polarization and tribal behavior?

The answer it found, in some cases, was yes.

Facebook had kicked off an internal effort to understand how its platform shaped user behavior and how the company might address potential harms. Chief Executive Mark Zuckerberg had in public and private expressed concern about "sensationalism and polarization."

But in the end, Facebook's interest was fleeting. Mr. Zuckerberg and other senior executives largely shelved the basic research, according to previously unreported internal documents and people familiar with the effort, and weakened or blocked efforts to apply its conclusions to Facebook products.

Facebook policy chief Joel Kaplan, who played a central role in vetting proposed changes, argued at the time that efforts to make conversations on the platform more civil were "paternalistic," said people familiar with his comments.

Read more:
Facebook Executives Shut Down Efforts to Make the Site Less Divisive

A core problem at Facebook is that one policy org is responsible for both the rules of the platform and keeping governments happy. https://t.co/EnAVxmI3hl

- Alex Stamos (@alexstamos) May 26, 2020

The concern predates the election. A 2016 slide showed extremist content thriving in more than one-third of large German political groups. "64% of all extremist group joins are due to our recommendation tools." "Our recommendation systems grow the problem."

- Deepa Seetharaman (@dseetharaman) May 26, 2020

The common ground" team at Facebook wasn't trying to change people's minds. They were trying to stop the vilification of the other side. And they came up with different product ideas to tone down the rhetoric on Facebook. pic.twitter.com/b5gcAo52p3

- Deepa Seetharaman (@dseetharaman) May 26, 2020

Facebook's own internal research found the majority of people joining extremist groups did so as a result of Facebook's recommendation algorithms https://t.co/eku7dAfe9n pic.twitter.com/jdDLnMGrUn

- Karissa Bell (@karissabe) May 26, 2020

Facebook's stated mission is "to bring the world closer together." Its own research showed the social network was instead driving polarization, tribalism, and extremism.

Rather than change course, Facebook's top executives opted to bury the research. https://t.co/9QqWtJULsq

- Will Oremus (@WillOremus) May 26, 2020

This is key: "Facebook could mitigate many of the problems. The company chose not to." https://t.co/cOw1EsTLs8

- Derek Powazek (@fraying) May 26, 2020

Twitter, YouTube, and Facebook KNOW what's broken about them. They KNOW what it's doing to us. And they KNOW how to fix it, or at least diminish the harms.

They CHOOSE NOT TO. Every day they wake up and choose not to put out the fire.

Do not ever let them pretend otherwise.

- Derek Powazek (@fraying) May 26, 2020

Republican-operative-turned-Facebook-exec Joel Kaplan helped kill an effort to reduce the spread of extremist content on the platform. https://t.co/fAedmT0Qis

- Noah Shachtman (@NoahShachtman) May 26, 2020

[via techmeme.com]
