More Suicide Resource Orgs Found To Be Monetizing Sensitive User Data

Last February, a report in Politico found that Crisis Text Line, one of the nation's largest nonprofit support options for the suicidal, had been monetizing user data. More specifically, the nonprofit was collecting all sorts of data on "customer interactions" (ranging from how frequently certain words are used to the type of distress users are experiencing), then sharing that data with its for-profit partner.
That partner then made money by selling that data to data brokers. This was okay, the companies claimed, because the data collected was "anonymized," a term that study after study after study has shown means little and doesn't actually protect your data.
Now The Markup has another report showing how websites for mental health crisis resources across the country routinely collect sensitive user data and share it with Facebook. More specifically, their websites contain the "Meta Pixel," which can send data (including names, user ID numbers, email addresses, and browsing habits) to the social media giant:
The Markup tested 186 local crisis center websites under the umbrella of the national 988 Suicide and Crisis Lifeline. Calls to the national 988 line are routed to these centers based on the area code of the caller. The organizations often also operate their own crisis lines and provide other social services to their communities.
The Markup's testing revealed that more than 30 crisis center websites employed the Meta Pixel, formerly called the Facebook Pixel. The pixel, a short snippet of code included on a webpage that enables advertising on Facebook, is a free and widely used tool. A 2020 Markup investigation found that 30 percent of the web's most popular sites use it.
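To make concrete what "a short snippet of code" means here: Meta publishes a standard base embed that site operators paste into their pages. The sketch below approximates that publicly documented snippet; the pixel ID is a placeholder, and real deployments may add further `fbq('track', …)` calls that report specific visitor actions.

```html
<!-- Approximate Meta Pixel base embed. "PIXEL_ID_PLACEHOLDER" stands in
     for the site's real pixel ID; this is an illustration, not a
     verbatim copy of any crisis center's code. -->
<script>
  // Defines a stub fbq() queue, then asynchronously loads Meta's
  // fbevents.js tracking script into the page.
  !function(f,b,e,v,n,t,s){if(f.fbq)return;n=f.fbq=function(){n.callMethod?
  n.callMethod.apply(n,arguments):n.queue.push(arguments)};if(!f._fbq)f._fbq=n;
  n.push=n;n.loaded=!0;n.version='2.0';n.queue=[];t=b.createElement(e);
  t.async=!0;t.src=v;s=b.getElementsByTagName(e)[0];
  s.parentNode.insertBefore(t,s)}(window,document,'script',
  'https://connect.facebook.net/en_US/fbevents.js');

  // Ties this page's traffic to a specific advertiser account, then
  // reports the visit to Facebook.
  fbq('init', 'PIXEL_ID_PLACEHOLDER');
  fbq('track', 'PageView');
</script>
```

Once loaded, the script runs in the visitor's browser on every page that includes it, which is why a crisis center can end up transmitting browsing activity to Facebook without any deliberate decision to share that data.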
We recently noted how many of the states that have been freaking out about TikTok also have tech embedded in their websites that shares all kinds of sensitive data with data brokers or companies like Facebook. Often they're just using websites developed by third parties from templates, and aren't fully aware that their website is even doing this. Other times they know and just don't care.
But it's a continuing example of the kind of thing that simply wouldn't be as common if we had even a basic national privacy law for the internet era, one that required just the slightest bit of due diligence, especially from nonprofits and companies that operate in particularly sensitive arenas.
But for decades the U.S. government, at the direct behest of numerous industries, prioritized making money over human safety, brand trust, or even marketplace health. And we keep paying for it, in direct and indirect ways alike, with scandals that will only get worse now that the authoritarian assault on women's reproductive healthcare has come squarely into frame.