School Monitoring Software Sacrifices Student Privacy For Unproven Promises Of Safety
Imagine your search terms, keystrokes, private chats, and photographs being monitored every time they are sent. Millions of students across the country don't have to imagine this deep surveillance of their most private communications: it's a reality that comes with their school districts' decision to install AI-powered monitoring software such as Gaggle and GoGuardian on students' school-issued machines and accounts. As we demonstrated with our own Red Flag Machine, however, this software flags and blocks websites for spurious reasons and often disproportionately targets disadvantaged, minority, and LGBTQ youth.
The companies making the software claim it's all done for the sake of student safety: preventing self-harm, suicide, violence, and drug and alcohol abuse. While that is a noble goal, given that suicide is the second highest cause of death among American youth aged 10-14, no comprehensive or independent studies have shown an increase in student safety linked to the use of this software. Quite the contrary: a recent comprehensive RAND research study shows that such AI monitoring software may cause more harm than good.
That study also found that how to respond to alerts is left to the discretion of the school districts themselves. Lacking resources to deal with mental health, schools often refer these alerts to law enforcement officers who are untrained and ill-equipped to deal with youth mental health crises. When police respond to youth who are having such episodes, the resulting encounters can lead to disastrous results. So why are schools still using the software when a congressional investigation found "a need for federal action to protect students' civil rights, safety, and privacy"? Why are they trading away their students' privacy for a dubious-at-best marketing claim of safety?
Experts suggest it's because these supposed technical solutions are easier to implement than the effective social measures that schools often lack the resources to put in place. I spoke with Isabelle Barbour, a public health consultant who has experience working with schools to implement mental health supports. She pointed out that there are considerable barriers to families, kids, and youth accessing health care and mental health supports at a community level. There is also a lack of investment in supporting schools to effectively address student health and well-being. This leads to a situation where many students come to school with unmet needs, and those needs impact their ability to learn. Although there are clear and proven measures that work to address the burdens youth face, schools often need support (time, mental health expertise, community partners, and a budget) to implement these measures. Edtech companies market largely unproven plug-and-play products to educational professionals who are stretched thin and seeking a path forward to help kids. Is it any wonder that schools sign contracts which are easy to point to when questioned about what they are doing about the youth mental health epidemic?
One example: in its marketing to school districts, Gaggle claims to have saved 5,790 student lives between 2018 and 2023, according to shaky metrics it designed itself. All the while, the company keeps the inner workings of its AI monitoring secret, making it difficult for outsiders to scrutinize and measure its effectiveness.
We give Gaggle an F

Reports of the errors and the inability of the AI flagging to understand context keep popping up. When the Lawrence, Kansas school district signed a $162,000 contract with Gaggle, no one batted an eye: it joined a growing number of school districts (currently ~1,500) nationwide using the software. Then, school administrators called in nearly an entire class to explain photographs Gaggle's AI had labeled as "nudity," because the software wouldn't tell them which images it flagged:
"Yet all students involved maintain that none of their photos had nudity in them. Some were even able to determine which images were deleted by comparing backup storage systems to what remained on their school accounts. Still, the photos were deleted from school accounts, so there is no way to verify what Gaggle detected. Even school administrators can't see the images it flags."
Young journalists within the school district raised concerns about how Gaggle's surveillance of students impacted their privacy and free speech rights. As journalist Max McCoy points out in his article for the Kansas Reflector, newsgathering is a constitutionally protected activity, and "those in authority shouldn't have access to a journalist's notes, photos and other unpublished work." Despite having renewed Gaggle's contract, the district removed the surveillance software from the devices of student journalists. Here, a successful awareness campaign resulted in a tangible win for some of the students affected. While ad-hoc protections for journalists are helpful, more is needed to honor all students' fundamental right to privacy against this new front of technological invasions.
Tips for Students to Reclaim Their Privacy

Students struggling with the invasiveness of school surveillance AI may find some reprieve by taking measures and forming habits to avoid monitoring. Some considerations:
- Consider any school-issued device a spying tool.
- Don't try to hack or remove the monitoring software unless specifically allowed by your school: it may result in significant consequences from your school or law enforcement.
- Instead, turn school-issued devices completely off when they aren't being used, especially while at home. This will prevent the devices from activating the camera, microphone, and surveillance software.
- If not needed, consider leaving school-issued devices in your school locker: this will avoid depending on these devices to log in to personal accounts, which will keep data from those accounts safe from prying eyes.
- Don't log in to personal accounts on a school-issued device, if you can avoid it; we understand that sometimes a school-issued device is the only computer some students have access to. Rather, use a personal device for all personal communications and accounts (e.g., email, social media). Maybe your personal phone is the only device you have to log in to social media and chat with friends. That's okay: keeping separate devices for separate purposes will reduce the risk that your data is leaked or surveilled.
- Don't log in to school-controlled accounts or apps on your personal device: that can be monitored, too.
- Instead, create another email address on a service the school doesn't control which is just for personal communications. Tell your friends to contact you on that email outside of school.
Finally, voice your concern and discomfort with such software being installed on devices you rely on. There are plenty of resources to point to, many linked to in this post, when raising concerns about these technologies. As the young journalists at Lawrence High School have shown, writing about it can be an effective avenue to bring up these issues with school administrators. At the very least, it will send a signal to those in charge that students are uncomfortable trading their right to privacy for an elusive promise of security.
Schools Can Do Better to Protect Students' Safety and Privacy

It's not only the students who are concerned about AI spying in the classroom and beyond. Parents are often unaware of the spyware deployed on the school-issued laptops their children bring home. And when a privately-owned shared computer is logged into a school-issued Google Workspace or Microsoft account, a parent's web searches will be available to the monitoring AI as well.
New studies have uncovered some of the mental detriments that surveillance causes. Despite this, and the array of First Amendment questions these student surveillance technologies raise, schools have rushed to adopt these unproven and invasive technologies. As Barbour put it:
"While ballooning class sizes and the elimination of school positions are considerable challenges, we know that a positive school climate helps kids feel safe and supported. This allows kids to talk about what they need with caring adults. Adults can then work with others to identify supports. This type of environment helps not only kids who are suffering with mental health problems, it helps everyone."
We urge schools to focus on creating that environment, rather than subjecting students to ever-increasing scrutiny through school surveillance AI.
Reposted from the EFF's Deeplinks blog.