Data Privacy Matters: Facebook Expands Encryption Just After Facebook Messages (Obtained Via Search Warrant) Used To Charge Teen For Abortion

In the wake of the Dobbs decision overturning Roe v. Wade, there has been plenty of attention paid to the kinds of data that companies keep on us, and how they could be exposed, including to law enforcement. Many internet companies seemed somewhat taken by surprise regarding all of this, which is a bit ridiculous, given that (1) they had plenty of time to prepare for this sort of thing, and (2) it's not like plenty of us haven't been warning companies about the privacy problems of having too much data.
Anyway, this week, a story broke that is re-raising many of these concerns: it's come out that a teenager in Nebraska has been charged with having an illegal abortion, after Meta turned over messages from Facebook Messenger pursuant to a search warrant, which was approved following an affidavit from Norfolk Police Detective Ben McBride.
This is raising all sorts of alarms, for all sorts of good reasons. While many are blaming Meta, that's somewhat misplaced. As the company notes (and as you can confirm by looking at the linked documents above), the search warrant that was sent to the company said it was an investigation into the illegal burning and burial of a stillborn infant, not something to do with abortion. Given that, it's not difficult to see why Meta provided the information requested.
Of course, there's a bigger question here: why should Meta even have access to that information in the first place? And, it appears that Meta agrees. Just days after this all came out, the company announced that it is (finally) testing a more fully encrypted version of Messenger (something the company has been talking about for a while, but which has proven more complicated to implement). The new features include encrypted backups of messages, as well as making end-to-end encrypted chats the default for some users.
While the timing is almost certainly a coincidence, many observers are making the obvious connection to this story.
While the Nebraska story is horrifying in many ways, it's also a reminder of why full end-to-end encryption is so incredibly important, and how leaving unencrypted data with third parties means your data is always, inherently at risk.
Arguably, Facebook should have encrypted its messaging years ago, but it's been a struggle for a variety of reasons, as Casey Newton laid out in a fascinating piece. Facebook has certainly faced technical challenges and (perhaps more importantly) significant political pushback from governments and law enforcement who like the ability to snoop on everyone.
But part of the problem is also the end users themselves:
The first is that end-to-end encryption can be a pain to use. This is often the tradeoff we make in exchange for more security, of course. But average people may be less inclined to use a messaging app that requires them to set a PIN to restore old messages, or displays information about the security of their messages that they find confusing or off-putting.
The second, related challenge is that most people don't know what end-to-end encryption is. Or, if they've heard of it, they might not be able to distinguish it from other, less secure forms of encryption. Gmail, among many other platforms, encrypts messages only when a message is in transit between Google's servers and your device. This is known as transport layer security, and it offers most users good protection, but Google - or law enforcement - can still read the contents of your messages.
Meta's user research has shown that people grow concerned when you tell them you're adding end-to-end encryption, one employee told me, because it scares them that the company might have been reading their messages before now. Users also sometimes assume new features are added for Meta's benefit, rather than their own - that's one reason the company labeled its stored-message feature "secure storage," rather than "automatic backups," so as to emphasize security in the branding.
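The distinction Newton draws between transport encryption and end-to-end encryption is worth making concrete. Below is a minimal, illustrative sketch in Python (using the cryptography package's Fernet with a pre-shared key purely for demonstration, not Meta's actual protocol, which uses Signal-style key exchange): with transport-only encryption the provider decrypts at its edge and stores readable plaintext it can hand over under a warrant, while with end-to-end encryption the provider only ever stores ciphertext it cannot read.

```python
# Toy illustration of transport-only vs. end-to-end encryption.
# Requires: pip install cryptography
from cryptography.fernet import Fernet


class Server:
    """Stands in for a messaging provider's storage."""

    def __init__(self):
        self.stored_messages = []

    def receive(self, payload: bytes):
        # Whatever arrives here is what the provider (or a warrant) can see.
        self.stored_messages.append(payload)


# --- Transport-only encryption (TLS-style) --------------------------------
# The link is encrypted in transit, but the server decrypts at its edge and
# stores plaintext, so it can comply with a warrant by handing over readable
# messages.
server = Server()
server.receive(b"meet at 8pm")           # plaintext at rest on the server
print(server.stored_messages[0])          # b'meet at 8pm' -- readable

# --- End-to-end encryption (pre-shared key, for illustration only) --------
# Real messengers negotiate keys between devices; here we simply assume the
# sender and recipient already share a key the server never sees.
shared_key = Fernet.generate_key()
sender = Fernet(shared_key)
recipient = Fernet(shared_key)

e2e_server = Server()
e2e_server.receive(sender.encrypt(b"meet at 8pm"))        # ciphertext at rest
print(e2e_server.stored_messages[0])                       # opaque token
print(recipient.decrypt(e2e_server.stored_messages[0]))    # only endpoints can read it
```

The point of the sketch is simply that in the second model, anything the provider turns over is unreadable without keys that only the endpoints hold.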
It's also interesting to note that Casey's piece says Meta's user research found that most users don't consider encrypting their own data much of a priority; they're just not that concerned. This does not surprise me at all: we've now had decades of revealed preferences showing that, contrary to what many in the media suggest, most people don't actually care that much about their privacy.
And, yet, as this story shows, they really should. But if we've learned anything over the past couple of decades, it's that no amount of horror stories about exposed data will convince the majority of people to take proactive steps to better secure their data. So, on that front, it's actually a positive move that Meta is pushing forward with effectively moving people over to fully encrypted messaging - hopefully in a user-friendly manner.
Data privacy does matter, and the answer has to come from making strong encryption widely available in a consumer-friendly form - even when that's a really difficult challenge. Laws are not going to protect privacy. Remember, governments seem more interested in banning or breaking end-to-end encryption than encouraging it. And while perhaps the rise of data abuse post-Dobbs will expand the number of people who proactively seek out encryption and take their own data privacy more seriously, history has shown that most people will still take the most convenient way forward.
And that means it's actually good news that Facebook is finally moving forward with efforts to make that most convenient path... still end-to-end encrypted.