
How Patents Made Facebook’s Filter Bubble

by Mark Harris
from IEEE Spectrum

Photo: Andrew Caballero-Reynolds/AFP/Getty Images

When Mark Zuckerberg said "Move fast and break things," this is surely not what he meant.

Nevertheless, at a technological level the 6 January attacks on the U.S. Capitol could be contextualized by a line of patents filed or purchased by Facebook, tracing back 20 years. This portfolio arguably sheds light on how the most powerful country in the world was brought low by a rampaging mob nurtured on lies and demagoguery.

While Facebook's arsenal of over 9,000 patents spans a bewildering range of topics, at its heart are technologies that allow individuals to see an exclusive feed of content uniquely curated to their interests, their connections, and, increasingly, their prejudices.

Algorithms create intensely personal "filter bubbles," which are powerfully addictive to users, irresistible to advertisers, and a welcoming environment for rampant misinformation and disinformation such as QAnon, antivaxxer propaganda, and election conspiracy theories.

As Facebook turns 17 (it was "born" 4 February 2004), a close reading of the company's patent history shows how the social network has persistently sought to attract, categorize, and retain users by giving them more and more of what keeps them engaged on the site. In other words, hyperpartisan communities on Facebook that grow disconnected from reality are arguably less a bug than a feature.

Anyone who has used social media in recent years will likely have seen both misinformation (innocently shared false information) and disinformation (lies and propaganda). Last March, in a survey of over 300 social media users by the Center for an Informed Public at the University of Washington (UW), published on Medium, almost 80 percent reported seeing COVID-19 misinformation online, with over a third believing something false themselves. A larger survey covering the United States and Brazil in 2019, by the University of Liverpool and others, found that a quarter of Facebook users had accidentally shared misinformation. Nearly one in seven admitted sharing fake news on purpose.

A Tale of 12 Patents

Facebook has more than 9,000 patents. A crucial few scaled its ability to build informational isolation chambers for its users ("filter bubbles"), ones in which people could arguably evade commonplace facts and embrace entirely alternative realities. Lately, Facebook's patents have begun to address its echo-chamber-on-overdrive problem.
  1. 2001: Intelligent Information Delivery System (filed by Philips, purchased by Facebook in 2011)

    U.S. Patent No. 6,912,517 B2

  2. 2004: Facebook founded

  3. 2006: Communicating a Newsfeed of Media Content Based on a Member's Interactions in a Social Network Environment

    U.S. Patent No. 8,171,128 B2

  4. 2006: Providing a Newsfeed Based on User Affinity for Entities and Monitored Actions in a Social Network Environment

    U.S. Patent No. 8,402,094

  5. 2009: Filtering Content in a Social Networking Service

    U.S. Patent No. 9,110,953 B2

  6. 2011: Content Access Management in a Social Networking System for Externally Stored Content

    U.S. Patent No. 9,286,642 B2

  7. 2012: Inferring Target Clusters Based on Social Connections

    U.S. Patent No. 10,489,825 B2

  8. 2012: Facebook IPO

  9. 2013: Categorizing Stories in a Social Networking System News Feed

    U.S. Patent No. 10,356,135 B2

  10. 2015: Systems and Methods for Demotion of Content Items in a Feed

Publication No. US 2016/0321260 A1

  11. 2016: Quotations-Modules on Online Social Networks

    U.S. Patent No. 10,157,224 B2

  12. 2017: Contextual Information for Determining Credibility of Social-Networking Posts (Abandoned 2020)

    Publication No. US 2019/0163794 A1

  13. 2017: Filtering Out Communications Related to Unwanted Emotions on Online Social Networks (Abandoned 2019)

    Publication No. US 2019/0124023 A1

  14. 2017: Systems and Methods for Providing Diverse Content

    U.S. Patent No. 10,783,197 B2

[All patents are listed by filing year. Ten were granted two to seven years later; two were ultimately abandoned.]

"Misinformation tends to be more compelling than journalistic content, as it's easy to make something interesting and fun if you have no commitment to the truth," says Patricia Rossini, the social-media researcher who conducted the Liverpool study.

In December, a complaint filed by dozens of U.S. states asserted, "Due to Facebook's unlawful conduct and the lack of competitive constraints...there has been a proliferation of misinformation and violent or otherwise objectionable content on Facebook's properties."

When a platform is open, like Twitter, most users can see almost everyone's tweets. Therefore, tracking the source and spread of misinformation is comparatively straightforward. Facebook, on the other hand, has spent a decade and a half building a mostly closed information ecosystem.

Last year, Forbes estimated that the company's 15,000 content moderators make some 300,000 bad calls every day. Precious little of that process is ever open to public scrutiny, although Facebook recently referred its decision to suspend Donald Trump's Facebook and Instagram accounts to its Oversight Board. This independent 11-member "Supreme Court" is designed to review thorny content moderation decisions.

Meanwhile, even some glimpses of sunlight prove fleeting: After the 2020 U.S. presidential election, Facebook temporarily tweaked its algorithms to promote authoritative, fact-based news sources like NPR, a U.S. public-radio network. According to The New York Times, though, it soon reversed that decision, effectively cutting short its ability to curtail what a spokesperson called "inaccurate claims about the election."

The company began filing patents soon after it was founded in 2004. A 2006 patent described how to automatically track your activity to detect relationships with other users, while another the same year laid out how those relationships could determine which media content and news might appear in your feed.
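
The mechanics those two 2006 filings describe can be approximated in a few lines of Python. The sketch below is a hypothetical reconstruction, not Facebook's actual code: the interaction weights, field names, and scoring are all illustrative assumptions.

    from collections import Counter

    # Hypothetical weights for logged interactions; the filings leave
    # the exact weighting scheme open.
    INTERACTION_WEIGHTS = {"comment": 3.0, "message": 5.0, "like": 1.0}

    def affinity_scores(interaction_log):
        """Infer a per-friend affinity score from (friend_id, action) events."""
        scores = Counter()
        for friend_id, action in interaction_log:
            scores[friend_id] += INTERACTION_WEIGHTS.get(action, 0.5)
        return scores

    def rank_feed(stories, interaction_log):
        """Order candidate stories by the viewer's affinity for each author."""
        affinity = affinity_scores(interaction_log)
        return sorted(stories, key=lambda s: affinity[s["author_id"]], reverse=True)

    log = [("alice", "comment"), ("alice", "like"), ("bob", "like")]
    stories = [{"author_id": "bob", "text": "..."}, {"author_id": "alice", "text": "..."}]
    print(rank_feed(stories, log))  # Alice's story surfaces first

The feedback loop is visible even in this miniature: high-affinity authors win placement, placement invites more interactions, and the scores tighten further.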

In 2006, Facebook patented a way to "characterize major differences between two sets of users." In 2009, Mark Zuckerberg himself filed a patent that showed how "Facebook and/or external parties" could target "information delivery," including political news, that might be of particular interest to a group.

This automated curation can drive people down partisan rabbit holes, fears Jennifer Stromer-Galley, a professor in the School of Information Studies at Syracuse University. "When you see perspectives that are different from yours, it requires thinking and creates aggravation," she says. "As a for-profit company that's selling attention to advertisers, Facebook doesn't want that, so there's a risk of algorithmic reinforcement of homogeneity, and filter bubbles."

In the run-up to Facebook's IPO in 2012, the company moved to protect its rapidly growing business from intellectual property lawsuits. A 2011 Facebook patent describes how to filter content according to biographic, geographic, and other information shared by a user. Another patent, bought by Facebook that year from the consumer electronics company Philips, concerns an "intelligent information delivery system" that, based on someone's personal preferences, "collects, prioritizes, and selectively delivers relevant and timely" information.
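
That "collects, prioritizes, and selectively delivers" pipeline can be sketched the same way. Everything here (the profile fields, the scoring, the function names) is a hypothetical illustration of the pattern, not the patent's actual method:

    # Illustrative collect/prioritize/deliver pipeline.
    # Profile fields and scoring are assumptions for demonstration only.

    def matches_profile(item, profile):
        """Collect: keep items consistent with the user's declared attributes."""
        return (item["region"] == profile["region"]
                or item["topic"] in profile["interests"])

    def deliver(items, profile, limit=10):
        """Prioritize by declared interest and recency; deliver only the top few."""
        relevant = [i for i in items if matches_profile(i, profile)]
        relevant.sort(key=lambda i: (i["topic"] in profile["interests"],
                                     i["timestamp"]),
                      reverse=True)
        return relevant[:limit]

In a scheme like this, an item matching neither the user's region nor interests is simply never delivered, which is the selective filtering at issue.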

In recent years, as the negative consequences of Facebook's drive to serve users ever more attention-grabbing content emerged, the company's patent strategy seems to have shifted. Newer patents appear to be trying to rein in the worst excesses of the filter bubbles Facebook pioneered.

The word "misinformation" appeared in a Facebook patent for the first time in 2015, for technology designed to demote objectionable material that "degrades user experience with the news feed and otherwise compromises the integrity of the social network." A pair of patents in 2017 described providing users with more diverse content from both sides of the political aisle and adding contextual tags to help rein in misleading "false news."
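
Demotion, as that 2015 filing frames it, suppresses rather than removes: a flagged story keeps circulating, just with its rank score scaled down. A minimal sketch, with a penalty factor that is purely an assumption (the filing specifies no particular value):

    DEMOTION_FACTOR = 0.1  # illustrative value; the filing does not fix one

    def demote(ranked_stories, is_objectionable):
        """Rescale, rather than remove, any story a classifier flags."""
        for story in ranked_stories:
            if is_objectionable(story):
                story["score"] *= DEMOTION_FACTOR
        return sorted(ranked_stories, key=lambda s: s["score"], reverse=True)

Nothing is deleted under such a scheme; a flagged item simply drops out of the visible top of the feed.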

Such tags, using information from independent fact-checking organizations, could help, according to a study by Ethan Porter, coauthor of False Alarm: The Truth About Political Mistruths in the Trump Era (Cambridge University Press, 2019). "It's no longer a controversy that fact-checks reliably improve factual accuracy," he says. "And contrary to popular misconception, there is no evident exception for controversial or highly politicized topics."

Franziska Roesner, a computer scientist and part of the UW team, was involved in a similar, qualitative study last year that also gave a glimmer of hope. "People are now much more aware of the spread and impact of misinformation than they were in 2016 and can articulate robust strategies for vetting content," she says. "The problem is that they don't always follow them."

Rossini's Liverpool study also found that behaviors usually associated with democratic gains, such as discussing politics and being exposed to differing opinions, were associated with dysfunctional information sharing. Put simply, the worst offenders for sharing fake news were also the best at building online communities; they shared a lot of information, both good and bad.

Moreover, Rossini doubts the very existence of filter bubbles. Because many Facebook users have more, and more varied, digital friends than in-person connections, she says most social media users are "systematically exposed to more diversity than they would be in their offline life."

The problem is that some of that diversity includes hate speech, lies, and propaganda that very few of us would ever seek out voluntarily, but that goes on to radicalize some.

"I personally quit Facebook two and a half years ago, when the Cambridge Analytica scandal happened," says Lalitha Agnihotri, formerly a data scientist at the Dutch company Philips, where in 2001 she was part of the team that filed the intelligent-information-delivery patent Facebook acquired in 2011. "I don't think Facebook treats my data right. Now that I realize that IP generated by me enabled Facebook to do things wrong, I feel terrible about it."

Agnihotri says that she has been contacted by Facebook recruiters several times over the years but has always turned them down. "My 12-year-old suggested that maybe I need to join them, to make sure they do things right," she says. "But it will be hard, if not impossible, to change a culture that comes from their founder."

This article appears in the February 2021 print issue as "The Careful Engineering of Facebook's Filter Bubble."
