ExTwitter’s Last-Minute Update To Kids Online Safety Act Still Fails To Protect Kids—Or Adults—Online
Last week, the Senate released yet another version of the Kids Online Safety Act, written, reportedly, with the assistance of X CEO Linda Yaccarino in a flawed attempt to address the critical free speech issues inherent in the bill. This last-minute draft remains, at its core, an unconstitutional censorship bill that threatens the online speech and privacy rights of all internet users.
Update Fails to Protect Users from Censorship or Platforms from Liability
The most important update, according to its authors, supposedly minimizes the impact of the bill on free speech. As we've said before, KOSA's "duty of care" section is its biggest problem, as it would force a broad swath of online services to make policy changes based on the content of online speech. Though the bill's authors inaccurately claim KOSA only regulates the designs of platforms, not speech, the harms it enumerates (eating disorders, substance use disorders, and suicidal behaviors, for example) are not caused by the design of a platform.
KOSA is likely to actually increase the risks to children, because it will prevent them from accessing online resources about topics like addiction, eating disorders, and bullying. It will result in services imposing age verification requirements and content restrictions, and it will stifle minors from finding or accessing their own supportive communities online. For these reasons, we've been critical of KOSA since it was introduced in 2022.
This updated bill adds just one sentence to the "duty of care" requirement: "Nothing in this section shall be construed to allow a government entity to enforce subsection (a) [the duty of care] based upon the viewpoint of users expressed by or through any speech, expression, or information protected by the First Amendment to the Constitution of the United States." But the viewpoint of users was never impacted by KOSA's duty of care in the first place. The duty of care is a duty imposed on platforms, not users. Platforms must mitigate the harms listed in the bill, not users, and it is the platform's ability to share users' views that is at risk, not the ability of users to express those views. Adding that the bill doesn't impose liability based on user expression doesn't change how the bill would be interpreted or enforced. The FTC could still hold a platform liable for the speech it contains.
Let's say, for example, that a covered platform like Reddit hosts a forum created and maintained by users for discussion of overcoming eating disorders. Even though the speech contained in that forum is entirely legal, often helpful, and possibly even life-saving, the FTC could still hold Reddit liable for violating the duty of care by allowing young people to view it. The same could be true of a Facebook group about LGBTQ issues, or of a post about drug use that X showed a user through its algorithm. If a platform's defense were that this information is protected expression, the FTC could simply say that it isn't enforcing the law based on the expression of any individual viewpoint, but based on the fact that the platform allowed a design feature (a subreddit, Facebook group, or algorithm) to distribute that expression to minors. The new sentence is a superfluous carveout for user speech and expression, which KOSA never penalized in the first place; platforms would still be penalized for distributing that expression.
It's particularly disappointing that those in charge of X (likely a covered platform under the law) had any role in writing this language, as the authors have failed to grasp the world of difference between immunizing individual expression and protecting their own platform from the liability that KOSA would place on it.
Compulsive Usage Doesn't Narrow KOSA's Scope
Another of KOSA's issues has been its vague list of harms, which have remained broad enough that platforms have no clear guidance on what is likely to cross the line. This update requires that the harms of "depressive disorders and anxiety disorders" have "objectively verifiable and clinically diagnosable symptoms that are related to compulsive usage." The latest text's definition of compulsive usage, however, is equally vague: "a persistent and repetitive use of a covered platform that significantly impacts one or more major life activities, including socializing, sleeping, eating, learning, reading, concentrating, communicating, or working." This doesn't narrow the scope of the bill.
It should be noted that there is no clinical definition of "compulsive usage" of online services. As in past versions of KOSA, this update cobbles together a definition that sounds just medical, or just legal, enough that it appears legitimate, when in fact the definition is devoid of specific legal meaning, and dangerously vague to boot.
How could the persistent use of social media not significantly impact the way someone socializes or communicates? The bill doesn't even require that the impact be a negative one. Comments on an Instagram photo from a potential partner may make it hard to sleep for several nights in a row; a lengthy new YouTube video may impact someone's workday. Opening a Snapchat account might significantly impact how a teenager keeps in touch with her friends, but that doesn't mean her preference for that over text messages is "compulsive" and therefore necessarily harmful.
Nonetheless, an FTC weaponizing KOSA could still hold platforms liable for showing content to minors that it believes results in depression or anxiety, so long as it can claim the anxiety or depression disrupted someone's sleep, or even just changed how someone socializes or communicates. These so-called "harms" could still encompass a huge swath of entirely legal (and helpful) content about everything from abortion access and gender-affirming care to drug use, school shootings, and tackle football.
Dangerous Censorship Bills Do Not Belong in Must-Pass Legislation
The latest KOSA draft comes as the incoming nominee for FTC Chair, Andrew Ferguson (who would be empowered to enforce the law, if passed), has reportedly vowed to protect free speech by "fighting back against the trans agenda," among other things. As we've said for years (and about every version of the bill), KOSA would give the FTC, under this or any future administration, wide latitude to decide what sort of content platforms must prevent young people from seeing. Just passing KOSA would likely result in platforms taking down protected speech and implementing age verification requirements, even if it's never enforced; the FTC could simply announce the types of content it believes harm children and use the mere threat of enforcement to force platforms to comply.
No representative should consider shoehorning this controversial and unconstitutional bill into a continuing resolution. A law that forces platforms to censor truthful online content has no place in a last-minute funding bill.
Republished from the EFF's Deeplinks blog.