
Make the internet better by empowering users, not by demanding that platforms implement automated filters

by Cory Doctorow

In the wake of the Senate's predictably grandstanding "Protecting Innocence in a Digital World" hearing (on how to keep kids from online harms), my EFF colleagues Elliot Harmon and India McKinney have posted an excellent, thoughtful rebuttal to proposals to segregate a "kid internet" from an "adult internet" in order to ensure that kids don't see "harmful" things.

They start by pointing out that there is no consensus on what constitutes "harmful material": a ban on kids seeing "any pictures of human genitalia" (proposed by Sen. John Kennedy, R-LA) would block Our Bodies, Ourselves and even Where Did I Come From? (as a parent, I've given both to my daughter, who is now 11).

Attempts to determine what is and is not "safe for kids" have resulted in a grotesque string of laughable failures, from Tumblr's hilariously terrible filter to SESTA/FOSTA, an "anti-sex-trafficking" law that has led to a resurgence in street prostitution, violence against sex-workers, and a golden age of pimping, as sex-workers seek out physical protection now that they can't use the internet to screen clients.

Rather than requiring platforms to block material that might be harmful, Harmon and McKinney propose that we should empower users: allowing kids and their parents to use third-party services and tools that filter, block, and sort the materials the Big Tech platforms serve to them, so they can make up their own minds about what is and is not appropriate.

Particularly interesting is their critique of a proposal to put "kid material" in separate silos where there is no tracking for behavioral advertising purposes: "Platforms must take measures to put all users in charge of how their data is collected, used, and shared, including children, but cleanly separating material directed at adults and children isn't easy. It would be awful if a measure designed to protect young Internet users' privacy made it harder for them to access materials on sensitive issues like sexual health and abuse. A two-tiered Internet undermines the very types of speech for which young Internet users most need privacy."

The final part of their critique, defending Section 230 of the Communications Decency Act (which has been targeted by Lindsey Graham and other conservative lawmakers), is a must-read: the platforms would probably prefer to keep CDA 230 intact, but if sacrificing it means raising the cost of entry so that they need never fear a nascent competitor, they'll happily bargain it away in exchange for a perpetual Internet Domination License.

During the hearing, Stephen Balkam of the Family Online Safety Institute provided an astute counterpoint to the calls for a more highly filtered Internet, calling to move the discussion "from protection to empowerment." In other words, tech companies ought to give users more control over their online experience rather than forcing all of their users into an increasingly sanitized web. We agree.

It's foolish to think that one set of standards would be appropriate for all children, let alone all Internet users. But today, social media companies frequently make censorship decisions that affect everyone. Instead, companies should empower users to make their own decisions about what they see online by letting them calibrate and customize the content filtering methods those companies use. Furthermore, tech and media companies shouldn't abuse copyright and other laws to prevent third parties from offering customization options to people who want them.

The Key to Safety Online Is User Empowerment, Not Censorship [Elliot Harmon and India McKinney/EFF Deeplinks]
