
Bizarre Magistrate Judge Ruling Says That If Facebook Deletes An Account, It No Longer Needs To Keep Details Private

by
Mike Masnick
from Techdirt on (#5QGMR)

There have been a bunch of slightly wacky court rulings of late, and this recent one from magistrate judge Zia Faruqui is definitely up there on the list of rulings that make you scratch your head. The case involves the Republic of Gambia seeking information on Facebook accounts that were accused of contributing to the ethnic genocide of the Rohingya in Myanmar. This situation was -- quite obviously -- horrible, and it tends to be the go-to story for anyone who wants to show that Facebook is evil (though I'm often confused about how people often seem more focused on blaming Facebook for the situation than the Myanmar government, which actually carried out the genocide...). Either way, the Republic of Gambia is seeking information from Facebook regarding the accounts that played a role in the genocide, as part of its case at the International Court of Justice.

Facebook, which (way too late in the process) did shut down a bunch of accounts in Myanmar, resisted demands from Gambia to hand over information on those accounts noting, correctly, that the Stored Communications Act likely forbids it from handing over such private information. The SCA is actually pretty important in protecting the privacy of email and messages, and is one of the rare US laws on the books that is actually (for the most part) privacy protecting. That's not to say it doesn't have its own issues, but the SCA has been useful in the past in protecting privacy.

The ruling here more or less upends interpretations of the SCA by saying that once an account is deleted, it's no longer covered by the SCA. That's... worrisome. The full ruling is worth a read, and you know you'll be in for something of a journey when it starts out:

I come to praise Facebook, not to bury it.

Not quite what you expect from a judicial order. The order lays out the unfortunately gory details of the genocide in Myanmar, as well as Facebook's role in enabling the Myanmar government to push out propaganda and rally support for its ethnic cleansing. But the real question is how all of this impacts the SCA. As the judge notes, since the SCA was written in 1986, it certainly didn't predict today's modern social media, or the questions related to content moderation, so this is a new issue for the court to decide. But... still. The court decides that because an account is disabled... the communications are no longer "stored." Because [reasons].

The Problem Of Content Moderation

At the time of enactment, Congress viewed ECS and RCS providers as mail/package delivery services. See Cong. Rsch. Serv., R46662, Social Media: Misinformation and Content Moderation Issues for Congress (2021), https://crsreports.congress.gov/product/pdf/R/R46662. This view failed to consider content moderation; mail/package delivery services have neither the ability nor the responsibility to search the contents of every package. Yet after disinformation on social media has fed a series of catastrophic harms, major providers have responded by taking on the de facto responsibility of content moderation. See id. "The question of how social media platforms can respect the freedom of expression rights of users while also protecting [users] from harm is one of the most pressing challenges of our time." ...

This Court is the first to consider the question of what happens after a provider acts on its content moderation responsibility. Is content deleted from the platform but retained by the provider in "backup storage?" It is not.

That obviously seems like a stretch to me. If the company still retains the information, then it is clearly in storage. Otherwise, you've just created a massive loophole: any platform can expose the private communications of someone if it first disables their account.

The court's reasoning, though, gets at the heart of the language of the SCA and how it protects both "any temporary, intermediate storage of a wire or electronic communication incidental to the electronic transmission thereof" and "any storage of such communication by an electronic communication service for purposes of backup protection of such communication." It says the first bit can't apply because these communications had reached their "final destination" and were no longer temporary. And it can't be "backup" since the original content had been deleted, therefore there couldn't be any "backup."

Congress's conception of "backup" "necessarily presupposes the existence of another copy to which this [backup record] would serve as a substitute or support." Id. Without an original, there is nothing to back up. Indeed the lifespan of a backup is necessarily tied to that of the underlying message. "Where the underlying message has expired . . . , any copy is no longer performing any backup function. An [ECS] that kept permanent copies of [deleted] messages could not fairly be described as 'backing up' those messages."

But... I think that's just wrong. Facebook retaining this data (while blocking the users from accessing it themselves) is clearly a "backup." It's a backup in case there is a reason why, at some future date, the content needs to be restored. Under the judge's own interpretation, if you back up your hard drive, but then the drive crashes, your backup is no longer a backup, because there's no original. But... that's completely nonsensical.

The judge relies on (not surprisingly) a case in which the DOJ twisted and stretched the limits of the SCA to get access to private communications:

Nearly all "backup storage" litigation relates to delivered, undeleted content. That case law informs and supports the Court's decision here. Although there is no binding circuit precedent, "it appears that a clear majority of courts have held that emails opened by the intended recipient (but kept on a web-based server like Gmail) do not meet the [backup protection] definition of 'electronic storage.'" Sartori v. Schrodt, 424 F. Supp. 3d 1121, 1132 (N.D. Fla. 2019) (collecting cases). The Department of Justice adopted this view, finding that backup protection "does not include post-transmission storage of communications." U.S. Dep't of Just., Searching and Seizing Computers and Obtaining Electronic Evidence in Criminal Investigations, 123 (2009), https://www.justice.gov/sites/default/files/criminal-ccips/legacy/2015/01/14/ssmanual2009.pdf. The Gambia argues for following the majority view's limited definition of backup storage. See Sartori, 424 F. Supp. 3d at 1132; ECF No. 16 (Pet'r's Resp. to Surreply) at 5-6. If undeleted content retained by the user is not in backup storage, it would defy logic for deleted content to which the user has no access to be in backup storage.

As for Facebook's argument (which makes sense to me) that its entire reason for retaining the accounts shows that they are backups, the judge just doesn't buy it.

Facebook argues that because the provider-deleted content remains on Facebook servers in proximity to where active content on the platform is stored, both sets of content should be protected as backup storage. See Conf. Tr. at 76. However, the question is not where the records are stored but why they are stored. See Theofel, 359 F.3d at 1070. Facebook claims it kept the instant records as part of an autopsy of its role in the Rohingya genocide. See Conf. Tr. at 80-81. While admirable, that is storage for self-reflection, not for backup.

The judge also brushes aside the idea that there are serious privacy concerns with this result, mainly because the judge doesn't believe Facebook cares about privacy. That, alone, is a pretty weird basis for ruling on this issue.

Finally, Facebook advances a policy argument, opining that this Court's holding will have "sweeping privacy implications -- every time a service provider deactivates a user's account for any reason, the contents of the user's communications would become available for disclosure to anyone, including the U.S. government." .... Facebook taking up the mantle of privacy rights is rich with irony. News sites have entire sections dedicated to Facebook's sordid history of privacy scandals.

So... because Facebook doesn't have a great history regarding the protection of privacy... we can make it easier for Facebook to expose private communications? What? And even if it's true that Facebook has made problematic decisions in the past regarding privacy, that's wholly separate from the question of whether or not it has a legal obligation to protect the privacy of messages now.

Furthermore, the judge insists that even if there are privacy concerns, they are "minimal":

The privacy implications here are minimal given the narrow category of requested content. Content urging the murder of the Rohingya still permeates social media. See Stecklow, supra (documenting "more than 1,000 examples . . . of posts, comments, images and videos attacking the Rohingya or other Myanmar Muslims that were on Facebook" even after Facebook apologized for its services being used to "amplify hate or exacerbate harm against the Rohingya"). Such content, however vile, is protected by the SCA while it remains on the platform. The parade of horribles is limited to a single float: the loss of privacy protections for de-platformed content. And even that could be mitigated by users joining sites that do not de-platform content.

Yes, in this case. But this ruling could set a precedent for accessing a ton of other private communications as well, and that's what's worrying. It's absolutely bizarre and distressing that the judge doesn't bother to think through the implications of this ruling beyond just this one case.

Prof. Orin Kerr, one of the foremost experts on ECPA and the SCA, notes that this is both an "astonishing interpretation" and "stunning."

Also, it's a stunning interpretation in its consequences. Under the op, the most fundamental rule of Internet privacy -- that your e-mails and messages are protected from disclosure -- is largely meaningless. A provider can just delete your account and hand out your messages.

- Orin Kerr (@OrinKerr) September 24, 2021

The entire ruling is concerning -- and feels like yet another situation where someone's general disdain for Facebook and its policies (a totally reasonable position to take!) colored the analysis of the law. And the end result is a lot more dangerous for everyone.
