Surprise: Telehealth Startups Playing Fast And Loose With Sensitive User Medical Data

From the Internet of very broken things to telecom networks, the state of U.S. privacy and user security is arguably pathetic. It's 2022 and we still don't have even a basic privacy law for the Internet era, in large part because over-collection of data is too profitable for a wide swath of industries, which in turn lobby Congress to do either nothing or the wrong thing.
Sensitive medical data, supposedly held to a higher standard, isn't much of an exception. The Markup and STAT this week published an interesting joint report showcasing how many telehealth startups routinely play fast and loose with consumer data. Numerous telehealth websites were found to share sensitive data with ad networks, including which new medications you're taking and what health issues you're experiencing:
On 13 of the 50 websites, we documented at least one tracker (from Meta, Google, TikTok, Bing, Snap, Twitter, LinkedIn, or Pinterest) that collected patients' answers to medical intake questions. Trackers on 25 sites, including those run by industry leaders Hims & Hers, Ro, and Thirty Madison, told at least one big tech platform that the user had added an item like a prescription medication to their cart, or checked out with a subscription for a treatment plan.
Once this data makes its way into advertising networks, it inevitably gets collated into "anonymized" profiles of individuals, which research routinely shows aren't actually that anonymous. All it takes is a few additional snippets of data found elsewhere (often available courtesy of a parade of breaches, hacks, and leaks) before individual users can be identified.
A recent Mozilla report also found that most mental health and prayer apps similarly have pathetic privacy and security standards. And numerous reports have pointed out how the "new and improved" privacy standards, heavily hyped by tech giants like Apple, are often performative.
As The Markup report makes clear, existing privacy regulations like the Health Insurance Portability and Accountability Act (HIPAA) were not built for telehealth, so much of this sloppy handling of consumer data falls through the cracks. Most consumers, meanwhile, operate from the false belief that this data is far more protected than it actually is:
"Individually, we have a sense that this information should be protected," said [Andrew] Mahler, who is now vice president of privacy and compliance at Cynergistek, a health care risk auditing company. "But then from a legal and a regulatory perspective, you have organizations saying ... 'technically, we don't have to.'"
U.S. regulators occasionally crack down on bad behavior in this sector, such as when the FTC sued data broker Kochava last August for failing to adequately protect data showing whether consumers had visited a reproductive health clinic or addiction recovery center. But even post-Roe, with the over-collection of location data taking on life-or-death stakes, the FTC routinely lacks the staff and funding to take such action with any real consistency in a market full of bad actors.
And it lacks the staff and resources because it has become zealous dogma, particularly on the right, to lobotomize all meaningful U.S. regulatory oversight (whether of privacy or anything else), then put on dumb, hollow performances any time a company abuses the cavalier private data environment those same policymakers created through their greed and apathy (see: the myopic fixation on TikTok, and only TikTok).
Inevitably there will be a medical data privacy scandal so massive that it forces the culture to truly own the fact that it has prioritized money over consumer and market health, privacy, and safety for decades. But even then, it's a steep uphill climb to get a comically corrupt Congress to craft even the most modest of guardrails.