Big Tech sues California, claims child-safety law violates First Amendment
In the second half of 2022 alone, many services, from game platforms designed for kids to apps like TikTok and Twitter that cater to all ages, were accused of endangering young users by exposing minors to self-harm and to financial and sexual exploitation. Some kids died, their parents sued, and some tech companies were shielded from those legal challenges by Section 230. As regulators and parents continue to scrutinize how kids get hooked on favorite web destinations that could put them at risk of serious harm, pressure has mounted on tech companies to take more responsibility for protecting child safety online.
In the United States, shielding kids from online dangers remains a duty largely left to parents, and some tech companies would prefer to keep it that way. But by 2024, a first-of-its-kind California online child-safety law is set to take effect, designed to shift some of that responsibility onto tech companies. California's Age-Appropriate Design Code Act (AB 2273) will require tech companies to design products and services with child safety in mind, mandating age verification and limiting features like autoplay or the discoverability of minors' accounts through friend-finding tools. That won't happen, however, if NetChoice gets its way.
This week, the tech industry trade association, whose members include Meta, TikTok, and Google, sued to block the law, arguing in its complaint that the measure is not only potentially unconstitutional but also poses harms to minors that the law allegedly overlooks.