
Two Years Post-Roe: A Better Understanding Of Digital Threats

by Mike Masnick, from Techdirt

It's been a long two years since the Dobbs decision overturned Roe v. Wade. Between May 2022, when the draft opinion leaked, and the decision that June, there was a mad scramble to figure out what the impacts would be. Beyond the obvious peril of stripping away half the country's right to reproductive healthcare, digital surveillance and mass data collection caused a flurry of concerns.

Although many activists fighting for reproductive justice had been operating under the assumption of little to no legal protection for some time, the Dobbs decision was, for most, a sudden and scary revelation. Everyone implicated in that moment understood, at least in part, the stark difference between pre-Roe 1973 and post-Roe 2022: living under the most sophisticated surveillance apparatus in human history presents a vastly different landscape of threats. Since 2022, some suspicions have been confirmed, new threats have emerged, and our risk assessments have overall grown smarter. Below, we cover the most pressing digital dangers facing people seeking reproductive care, and ways to combat them.

Digital Evidence in Abortion-Related Court Cases: Some Examples

Social Media Message Logs

A case in Nebraska resulted in a woman, Jessica Burgess, being sentenced to two years in prison for obtaining abortion pills for her teenage daughter. Prosecutors used a Facebook Messenger chat log between Jessica and her daughter as key evidence, bolstering the concerns many had raised about using such privacy-invasive tech products for sensitive communications. At the time, Facebook Messenger did not have end-to-end encryption.

In response to criticisms about Facebook's cooperation with law enforcement that landed a mother in prison, a Meta spokesperson issued a frustratingly laconic tweet stating that "[n]othing in the valid warrants we received from local law enforcement in early June, prior to the Supreme Court decision, mentioned abortion." They followed this up with a short statement reiterating that the warrants did not mention abortion at all. The lesson is clear: although companies do sometimes push back against data warrants, we have to prepare for the likelihood that they won't.

Google: Search History & Warrants

Well before the Dobbs decision, prosecutors had already used Google Search history to indict a woman for her pregnancy outcome. In this case, it was keyword searches for misoprostol (a safe and effective abortion medication) that clinched the prosecutor's evidence against her. Google acquiesced, as it so often has, to the warrant request.

Related to this is the ongoing and extremely complicated territory of reverse keyword and geolocation warrants. Google has promised that it would remove from user profiles all location data history related to abortion clinic sites. Researchers tested this claim and it was shown to be false, twice. Late in 2023, Google made a bigger promise: it would soon change how it stores location data to make it much more difficult, if not impossible, for Google to provide mass location data in response to a geofence warrant, a change we've been asking Google to implement for years. This would be a genuinely helpful measure, but we've been conditioned to approach such claims with caution. We'll believe it when we see it (and refer to external testing for proof).

Other Dangers to Consider

Doxxing

Sites propped up to dox healthcare professionals who offer abortion services are about as old as the internet itself. Doxxing comes in a variety of forms, but a quick and loose definition of it is the weaponization of open source intelligence with the intention of escalating to other harms. There has been a massive increase in hate groups abusing public records requests and data broker collections to publish personal information about healthcare workers, and the doxxing websites hosting such material are updated frequently. Doxxing has led to steadily rising material dangers (targeted harassment, gun violence, and arson, to name a few) over the past few years.

There are some piecemeal attempts at data protection for healthcare workers in more protective states like California (one which we've covered). Other states may offer some form of an address confidentiality program that provides people with proxy addresses. Though these can be effective, they are not comprehensive. Since doxxing campaigns are typically coordinated through a combination of open source intelligence tactics, they present a particularly difficult threat to protect against. This is especially true for government and medical industry workers whose information may be exposed through public records requests.

Data Brokers

Recently, Senator Wyden's office released a statement about a long investigation into Near Intelligence, a data broker company that sold geolocation data to The Veritas Society, an anti-choice think tank. The Veritas Society then used the geolocation data to target individuals who had traveled near healthcare clinics that offered abortion services and delivered pro-life advertisements to their devices.

That alone is a stark example of the dangers of commercial surveillance, but it's still unclear in what other ways this type of dataset could be abused. Near Intelligence has filed for bankruptcy, but it is far from the only, or the most pernicious, data broker company out there. This situation bolsters what we've been saying for years: the data broker industry is a dangerously unregulated mess of privacy threats that needs to be addressed. It not only contributes to the doxxing campaigns described above, but essentially creates a backdoor for warrantless surveillance.

Domestic Terrorist Threat Designation by Federal Agencies

Midway through 2023, The Intercept published an article about a tenfold increase in the federal designation of abortion-rights activist groups as domestic terrorist threats. This casts a massive shadow of risk over organizers and activists at work in the struggle for reproductive justice. The digital surveillance capabilities of federal law enforcement are more sophisticated than those of typical anti-choice zealots. Most people in the abortion access movement may not have to worry about being labeled a domestic terrorist threat, but for some that is a reality, and strategizing against it is vital.

Looming Threats

Legal Threats to Medication Abortion

Last month, the Supreme Court heard oral arguments challenging the FDA's approval of and regulations governing mifepristone, a widely available and safe abortion pill. If the anti-abortion advocates who brought this case succeed, access to the most common medication abortion regimen used in the U.S. would end across the country, even in those states where abortion rights are protected.

Access to abortion medication might also be threatened by a 150-year-old obscenity law. Many people now recognize the long-dormant Comstock Act as a potential avenue to criminalize procurement of the abortion pill.

Although the outcomes of these legal challenges are yet to be determined, it's reasonable to prepare for the worst: if there is no longer a way to access medication abortion legally, there will be even more surveillance of the digital footprints prescribers and patients leave behind.

Electronic Health Records Systems

Electronic Health Records (EHRs) are digital transcripts of medical information meant to be easily stored and shared between medical facilities and providers. Since abortion restrictions are now dictated on a state-by-state basis, the sharing of these records across state lines presents a serious matrix of concerns.

As some academics and privacy advocates have outlined, the interoperability of EHRs can jeopardize the safety of patients when reproductive healthcare data is shared across state lines. Although the Department of Health and Human Services has proposed a new rule to help protect sensitive EHR data, it's currently possible that data shared between EHRs can lead to prosecutions over reproductive healthcare.

The Good Stuff: Protections You Can Take

Perhaps the most frustrating aspect of what we've covered thus far is how much is beyond individual control. It's completely understandable to feel powerless against these monumental threats. That said, you aren't powerless. Much can be done to protect your digital footprint, and thus, your safety. We don't propose reinventing the wheel when it comes to digital security and data privacy. Instead, rely on the resources that already exist and re-tool them to fit your particular needs. Here are some good places to start:

Create a Security Plan

It's impossible, and generally unnecessary, to implement every privacy and security tactic or tool out there. What's more important is figuring out the specific risks you face and finding the right ways to protect against them. This process takes some brainstorming around potentially scary topics, so it's best done well before you are in any kind of crisis. Pen and paper works best. Here's a handy guide.

After you've answered those questions and figured out your risks, it's time to locate the best ways to protect against them. Don't sweat it if you're not a highly technical person; many of the strategies we recommend can be applied in non-tech ways.

Careful Communications

Secure communication is as much a frame of mind as it is a type of tech product. When you are able to identify which aspects of your life need to be spoken about more carefully, you can then make informed decisions about who to trust with what information, and when. It's as much about creating ground rules with others about types of communication as it is about normalizing the use of privacy technologies.

Assuming you've already created a security plan and identified some risks you want to protect against, begin thinking about the communication you have with others involving those things. Set some rules for how you broach those topics, where they can be discussed, and with whom. Sometimes this might look like the careful development of codewords. Sometimes it's as easy as saying "let's move this conversation to Signal." Now that Signal supports usernames (so you can keep your phone number private), as well as disappearing messages, it's an obvious tech choice for secure communication.

Compartmentalize Your Digital Activity

As mentioned above, it's important to know when to compartmentalize sensitive communications to more secure environments. You can expand this idea to other parts of your life. For example, you can designate different web browsers for different use cases, choosing those browsers for the privacy they offer. One might offer significant convenience for day-to-day casual activities (like Chrome), whereas another is best suited for activities that require utmost privacy (like Tor).

Now apply this thought process towards what payment processors you use, what registration information you give to social media sites, what profiles you keep public versus private, how you organize your data backups, and so on. The possibilities are endless, so it's important that you prioritize only the aspects of your life that most need protection.
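If it helps to make the browser separation concrete, below is a minimal sketch of a launcher script that opens a URL with a browser chosen by context: a dedicated everyday profile for casual browsing, and the Tor Browser for anything sensitive. The command names and profile path here are assumptions that vary by operating system and install, so treat this as an illustration of the compartmentalization idea rather than a recommendation of specific tools.

#!/usr/bin/env python3
"""Illustrative sketch: open a URL with a browser chosen by context.

The commands and paths below are assumptions and will differ per system.
"""
import subprocess
import sys

# Each context maps to its own browser command. A separate --user-data-dir
# keeps cookies, history, and logins for everyday browsing walled off.
BROWSERS = {
    "everyday": ["chromium", "--user-data-dir=/home/me/.profiles/everyday"],
    # The Tor Browser manages its own isolation; "torbrowser-launcher" is a
    # common Linux launcher, but the command differs per platform.
    "sensitive": ["torbrowser-launcher"],
}

def open_url(context: str, url: str) -> None:
    """Launch the browser mapped to `context` and hand it the URL."""
    subprocess.Popen(BROWSERS[context] + [url])

if __name__ == "__main__":
    # Example: python3 open_in_context.py sensitive https://example.org
    open_url(sys.argv[1], sys.argv[2])

The same split-by-context thinking applies whether or not you script it; a small launcher just removes the friction of remembering which profile to use.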

Security Culture and Community Care

Both tactics mentioned above incorporate a sense of community when it comes to our privacy and security. We've said it before and we'll say it again: privacy is a team sport. People live in communities built on trust and care for one another; your digital life is imbricated with others in the same way.

If a node on a network is compromised, it will likely implicate others on the same network. This principle of computer network security is just as applicable to social networks. Although traditional information security often builds from a paradigm of "zero trust," we are social creatures and must work against that idea. It's more about incorporating elements of shared trust and pushing for a culture of security.

Sometimes this looks like setting standards for how information is articulated and shared within a trusted group. Sometimes it looks like choosing privacy-focused technologies to serve a community's computing needs. The point is to normalize these types of conversations, to let others know that you're caring for them by attending to your own digital hygiene. For example, when you ask for consent to share images that include others from a protest, you are not only pushing for a culture of security, but normalizing the process of asking for consent. This relationship of community care through data privacy hygiene is reciprocal.

Help Prevent Doxxing

As touched on in the other dangers to consider section above, doxxing can be a frustratingly difficult thing to protect against, especially when it's public records that are being used against you. It's worth looking into your state-level voter registration records, whether that information is public, and how you can request that it be redacted (success may vary by state).

Similarly, although business registration records are publicly available, you can appeal to websites that mirror that information (like Bizapedia) to have your personal information taken down. This is of course only a concern if you have a business registration tied to your personal address.

If you work for a business that is susceptible to public records requests revealing personal sensitive information about you, there's little to be done to prevent it. You can, however, apply for an address confidentiality program if your state has one. You can also do the somewhat tedious work of scrubbing your personal information from other places online (since doxxing often draws on a combination of information sources). Consider subscribing to a service like DeleteMe (or follow a free DIY guide) for a more thorough process of minimizing your digital footprint. Collaborating with trusted allies to monitor hate forums is a smart way to unburden yourself from having to look up your own information alone. Sharing that responsibility with others makes it easier to do, and makes it easier to plan as a group for prevention and incident response.

Take a Deep Breath

It's natural to feel bogged down by all the thought that has to be put towards privacy and security. Again, don't beat yourself up for feeling powerless in the face of mass surveillance. You aren't powerless. You can protect yourself, but it's reasonable to feel frustrated when there is no comprehensive federal data privacy legislation that would alleviate so many of these concerns.

Take a deep breath. You're not alone in this fight. There are guides for you to learn more about stepping up your privacy and security. We've even curated a special list of them. And there is Digital Defense Fund, a digital security organization for the abortion access movement, which we are grateful and proud to boost. And though it can often feel like privacy is getting harder to protect, in many ways it's actually improving. With all that information, and by continuing to trust your communities and push for a culture of security within them, safety is much easier to attain. With a bit of privacy, you can go back to focusing on what matters, like healthcare.

Originally published to the EFF Deeplinks blog.
