
Upload Filters And The Internet Architecture: What's There To Like?

by
Konstantinos Komaitis and Farzaneh Badiei
from Techdirt

In August 2012, YouTube briefly took down a video that had been uploaded by NASA. The video, which depicted a landing on Mars, was caught by YouTube's Content ID system as a potential copyright infringement case but, like everything else NASA creates, it was in the public domain. Then, in 2016, YouTube's automated algorithms removed another video, this time a lecture by a Harvard Law professor, which included snippets of various songs ranging from 15 to roughly 40 seconds. Of course, use of copyrighted material for educational purposes is perfectly legal. Examples of unwarranted content takedowns are not limited to these two. Automated algorithms have been responsible for taking down perfectly legitimate content relating to marginalized groups, political speech, or information that documents war crimes.

But the over-blocking of content through automated filters is only one part of the problem. A few years ago, automated filtering was somewhat limited in popularity, being used by a handful of companies; over the years, however, it has increasingly become the go-to technical tool for policy makers wanting to address any content issue -- whether copyrighted material or any other form of objectionable content. In particular, in the last few years, Europe has been championing upload filters as a solution for the management of content. Although never explicitly mentioned, upload filters started appearing as early as 2018 in various Commission documents but became a tangible policy tool in 2019 with the promulgation of the Copyright Directive.

Broadlyspeaking, upload filters are technology tools that platforms, such asFacebook and YouTube, use to check whether content published by theirusers falls within any of the categories for objectionable content.They are not new - YouTube's Content ID system dates back to2007; they are also not cheap - YouTube's Content ID has cost areported$100 million to make. Finally, they are ineffectiveasmachine learning tools will always over-block or under-block content.

But even with these limitations, upload filters continue to be the preferred option for content policy making. Partly, this is because policy makers depend on online platforms to offer technology solutions that can scale and can moderate content en masse. Another reason is that eliminating content through takedowns is perceived to be easier and to have an instant effect. In a world where more than 500 hours of content are uploaded hourly on YouTube and 350 million photos are posted daily on Facebook, technology solutions such as upload filters appear more desirable than the alternative of leaving the content up. A third reason is the computer-engineering bias of the industry: typically, when you build programmed systems, you follow a pretty much predetermined route -- you identify a gap, build something to fill that gap (and, hopefully, make money in the process) and then iteratively fix bugs in the program as they are uncovered. Notice that in this process, the question of whether the problem is best solved by building software is never asked. This has been the case with upload-filter software.

As online platforms become key infrastructure for users, however, the moderation practices they adopt are not only about content removal. Through such techniques, online platforms undertake a governance function, which must ensure the productive, pro-social and lawful interaction of their users. Governments have depended on platforms carrying out this function for quite some time but, over the past few years, they have become increasingly interested in setting the rules for social network governance. To this end, there is a growing trend of regional and national policies that mandate upload filters for content moderation.

What is at stake?

The use of upload filters, and the legislative efforts to promote them and make them compulsory, is having a major effect on Internet infrastructure. One of the core properties of the Internet is that it is based on an open architecture of interoperable and reusable building blocks. In addition to this open architecture, technology building blocks work together collectively to provide services to end users. At the same time, each building block delivers a specific function. All this allows for fast and permissionless innovation everywhere.

User-generated-content platforms are now inserting automated filtering mechanisms deep in their networks to deliver services to their users. Platforms with significant market power have convened a forum called the Global Internet Forum to Counter Terrorism (GIFCT), through which approved participants (but not everyone) collaborate to create shared upload filters. The idea is that these filters are interoperable amongst platforms, which, prima facie, is good for openness and inclusiveness. But allowing the design choices of filters to be made by a handful of companies turns them into de facto standards bodies. This provides neither inclusivity nor openness. To this end, it is worrisome that some governments appear keen to empower, and perhaps anoint, this industry consortium as a permanent institution for anyone who accepts content from users and republishes it. In effect, this makes an industry consortium, with its design assumptions, a legally required and permanent feature of Internet infrastructure.
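The GIFCT's best-known mechanism is a shared hash database that member platforms both contribute to and query. The sketch below is a simplified stand-in, assuming a plain cryptographic hash where real deployments use perceptual hashing so that near-duplicates also match; the class and method names are invented for illustration.

```python
# Simplified stand-in for a shared hash database: participating platforms
# contribute hashes of content they have labelled, and every member checks new
# uploads against the pooled set. Real deployments use perceptual hashes so
# near-duplicates also match; a plain SHA-256 is used here only for illustration.
import hashlib

class SharedHashDatabase:
    """A pooled blocklist maintained by consortium members (hypothetical API)."""

    def __init__(self) -> None:
        self._hashes: set[str] = set()

    def contribute(self, content: bytes) -> None:
        """A member platform registers content it has decided to block."""
        self._hashes.add(hashlib.sha256(content).hexdigest())

    def matches(self, content: bytes) -> bool:
        """Every member filters uploads against the same shared list."""
        return hashlib.sha256(content).hexdigest() in self._hashes

shared_db = SharedHashDatabase()
shared_db.contribute(b"content platform A decided to remove")

# Platform B never saw the original decision, yet it inherits the outcome:
print(shared_db.matches(b"content platform A decided to remove"))  # blocked everywhere
print(shared_db.matches(b"unrelated video"))                       # allowed

# The consortium's choices (what gets hashed, how it is hashed, who may query
# the database) become the de facto standard for every platform that plugs in.
```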

Convening closed consortiums like the GIFCT, combined with governments' urge to make upload filters mandatory, can violate some of the most important Internet architecture principles: ultimately, upload filters are not based on collaborative, open, voluntary standards but on closed, proprietary ones, owned by specific companies. Therefore, unlike traditional building blocks, these upload filters end up not being interoperable. Smaller online platforms will need to license them. New entrants may find the barriers to entry too high. This, once again, tilts the scales in favor of large, incumbent market players and disadvantages an innovator with a new approach to these problems.

Moreover, mandating GIFCT tools, or any other technology, determines the design assumptions underpinning that upload filter framework. Upload filters function as a sort of panopticon device that is operated by social media companies. But if the idea is to design a social media system that is inherently resistant to this sort of surveillance, then upload filters are not going to work, because users' communications are shielded from the platform's view. In effect, mandating GIFCT tools further determines what sort of system design is acceptable or not. This makes the regulation invasive because it undermines the "general purpose" nature of the Internet, meaning some purposes get ruled out under this approach.

The current policy objective of upload filters is twofold: regulating content and taming the dominance of certain players. These are legitimate objectives. But, as technology tools, upload filters fail on both counts: not only do they have limitations in moderating content effectively, but they also cement the dominant position of big technology companies. Given the costs of creating such a tool and the requirement for online platforms to have systems that ensure the fast, rigorous and efficient takedown of content, a trend is emerging in which smaller players depend on the systems of bigger ones.

Ultimately, upload filters are an imperfect and ineffective solution to our Internet and social media governance problems. They do not reduce the risk of recidivism: they eliminate individual pieces of content, not their recurrence. Aside from the fact that upload filters cannot solve societal problems, mandated upload filters can adversely affect Internet architecture. Generally, the Internet's architecture can be impacted by unnecessary technology tools, like deep packet inspection, DNS blocking or upload filters. These tools produce consequences that run counter to the benefits expected of the Internet: they compromise its flexibility and do not allow the Internet to continuously serve a diverse and constantly evolving community of users and applications. Instead, they require significant changes to the networks in order to support their use.

Overall, there is a real risk that upload filters become a permanent feature of the Internet architecture and online dialogue. This is not a society that any of us should want to live in - a society where speech is determined by software that will never be able to grasp the subtlety of human communication.

Konstantinos Komaitis is the Senior Director, Policy Strategy at the Internet Society.

Farzaneh Badiei is the Director of the Social Media Governance Initiative at Yale Law School.
