
Documents Show 15 Social Media Companies Failed to Adequately Address Calls for Violence in 2021

by EditorDavid from Slashdot (#6828B)
The Washington Post has obtained "stunning new details on how social media companies failed to address the online extremism and calls for violence that preceded the Capitol riot." Their source? The bipartisan committee investigating the January 6, 2021 attack on the U.S. Capitol "spent more than a year sifting through tens of thousands of documents from multiple companies, interviewing social media company executives and former staffers, and analyzing thousands of posts. They sent a flurry of subpoenas and requests for information to social media companies ranging from Facebook to fringe social networks including Gab and the chat platform Discord."

Yet in the end the findings were written up in a 122-page memo that was circulated among the committee but not delved into in its final report. This was partly because the committee was "concerned about the risks of a public battle with powerful tech companies, according to three people familiar with the matter who spoke on the condition of anonymity to discuss the panel's sensitive deliberations."

The [committee staffers'] memo detailed how the actions of roughly 15 social networks played a significant role in the attack. It described how major platforms like Facebook and Twitter, prominent video streaming sites like YouTube and Twitch, and smaller fringe networks like Parler, Gab and 4chan served as megaphones for those seeking to stoke division or organize the insurrection. It detailed how some platforms bent their rules to avoid penalizing conservatives out of fear of reprisals, while others were reluctant to curb the "Stop the Steal" movement after the attack....

The investigators also wrote that much of the content shared on Twitter, Facebook and other sites came from Google-owned YouTube, which did not ban election fraud claims until Dec. 9 and did not apply its policy retroactively. The investigators found that its lax policies and enforcement made it "a repository for false claims of election fraud." Even when these videos weren't recommended by YouTube's own algorithms, they were shared across other parts of the internet. "YouTube's policies relevant to election integrity were inadequate to the moment," the staffers wrote.

The draft report also says that smaller platforms were not reactive enough to the threat posed by Trump. The report singled out Reddit for being slow to take down a pro-Trump forum called "r/The_Donald." The moderators of that forum used it to "freely advertise" TheDonald.win, which hosted violent content in the lead-up to Jan. 6....

The committee also spoke to Facebook whistleblower Frances Haugen, whose leaked documents in 2021 showed that the country's largest social media platform had largely disbanded its election integrity efforts ahead of the Jan. 6 riot. But little of her account made it into the final document.

"The transcripts show the companies used relatively primitive technologies and amateurish techniques to watch for dangers and enforce their platforms' rules. They also show company officials quibbling among themselves over how to apply the rules to possible incitements to violence, even as the riot turned violent."


Read more of this story at Slashdot.
