Facebook Asked To Change Terms Of Service To Protect Journalists

There are plenty of things to be concerned about regarding Facebook these days, and I'm sure we'll be discussing them for years to come, but the Knight First Amendment Institute is asking Facebook to make a very important change as soon as possible: creating a safe harbor for journalists who are researching public interest stories on the platform. Specifically, the concern is that basic tools used for reporting likely violate Facebook's terms of service, which could expose reporters to CFAA liability for violating those terms. From the letter:
Digital journalism and research are crucial to the public's understanding of Facebook's platform and its influence on our society. Many of the most important stories written about Facebook and other social media platforms in recent months have relied on basic tools of digital investigation. For example, research published by an analyst with the Tow Center for Digital Journalism, and reported in The Washington Post, uncovered the true reach of the Russian disinformation campaign on Facebook. An investigation by Gizmodo showed how Facebook's "People You May Know" feature problematically exploits "shadow" profile data in order to recommend friends to users. A story published by ProPublica revealed that Facebook's self-service ad platform had enabled advertisers of rental housing to discriminate against tenants based on race, disability, gender, and other protected characteristics. And a story published by the New York Times exposed a vast trade in fake Twitter followers, some of which impersonated real users.
Facebook's terms of service limit this kind of journalism and research because they ban tools that are often necessary to it: specifically, the automated collection of public information and the creation of temporary research accounts. Automated collection allows journalists and researchers to generate statistical insights into patterns, trends, and information flows on Facebook's platform. Temporary research accounts allow journalists and researchers to assess how the platform responds to different profiles and prompts.
Journalists and researchers who use tools in violation of Facebook's terms of service risk serious consequences. Their accounts may be suspended or disabled. They risk legal liability for breach of contract. The Department of Justice and Facebook have both at times interpreted the Computer Fraud and Abuse Act to prohibit violations of a website's terms of service. We are unaware of any case in which Facebook has brought legal action against a journalist or researcher for a violation of its terms of service. In multiple instances, however, Facebook has instructed journalists or researchers to discontinue important investigative projects, claiming that the projects violate Facebook's terms of service. As you undoubtedly appreciate, the mere possibility of legal action has a significant chilling effect. We have spoken to a number of journalists and researchers who have modified their investigations to avoid violating Facebook's terms of service, even though doing so made their work less valuable to the public. In some cases, the fear of liability led them to abandon projects altogether.
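To make the letter's first technique concrete: "automated collection" of public information typically just means aggregating public posts into statistics rather than reading them one by one. Here is a minimal, hypothetical sketch of the kind of aggregate insight involved: counting how often a topic appears in public posts per day. The data is hard-coded for illustration; none of this is from the letter or any real collection tool.

```python
from collections import Counter
from datetime import date

# Hypothetical sample of public posts. In a real project these would come
# from automated collection of publicly available pages; here they are
# hard-coded purely for illustration.
posts = [
    {"day": date(2018, 8, 1), "text": "new housing ad targeting options"},
    {"day": date(2018, 8, 1), "text": "election misinformation spreading fast"},
    {"day": date(2018, 8, 2), "text": "more misinformation in my feed today"},
    {"day": date(2018, 8, 2), "text": "cat pictures"},
]

def daily_mentions(posts, keyword):
    """Count posts per day whose text mentions `keyword` (case-insensitive)."""
    counts = Counter()
    for post in posts:
        if keyword.lower() in post["text"].lower():
            counts[post["day"]] += 1
    return dict(counts)

print(daily_mentions(posts, "misinformation"))
# {date(2018, 8, 1): 1, date(2018, 8, 2): 1}
```

The point of the sketch is that the output is a trend line over many posts, not any individual user's data, which is what makes this kind of collection valuable for public-interest reporting.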
This is a big deal, as succinctly described above. We've talked in the past about how Facebook has used the CFAA to sue useful services and how damaging that is. But the issues here involve actual reporters trying to better understand aspects of Facebook, in which there is tremendous and urgent public interest, as the letter lays out. Also, over at Gizmodo, Kashmir Hill has a story about how Facebook threatened them over their story investigating Facebook's "People You May Know" feature, showing that this is not just a theoretical concern:
In order to help conduct this investigation, we built a tool to keep track of the people Facebook thinks you know. Called the PYMK Inspector, it captures every recommendation made to a user for however long they want to run the tool. It's how one of us discovered Facebook had linked us with an unknown relative. In January, after hiring a third party to do a security review of the tool, we released it publicly on Github for users who wanted to study their own People You May Know recommendations. Volunteers who downloaded the tool helped us explore whether you'll show up in someone's People You Know after you look at their profile. (Good news for Facebook stalkers: Our experiment found you won't be recommended as a friend just based on looking at someone's profile.)
Facebook wasn't happy about the tool.
The day after we released it, a Facebook spokesperson reached out asking to chat about it, and then told us that the tool violated Facebook's terms of service, because it asked users to give it their username and password so that it could sign in on their behalf. Facebook's TOS states that, "You will not solicit login information or access an account belonging to someone else." They said we would need to shut down the tool (which was impossible because it's an open source tool) and delete any data we collected (which was also impossible because the information was stored on individual users' computers; we weren't collecting it centrally).
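The PYMK Inspector itself is on Github; as a rough illustration of what "capturing every recommendation" involves, here is a minimal sketch (not Gizmodo's actual code, and using made-up names) that diffs successive snapshots of a recommendation list to find newly surfaced people, which is the kind of signal that let a reporter spot an unknown relative.

```python
def new_recommendations(previous, current):
    """Return names in the current PYMK snapshot absent from the previous one.

    Order of appearance in `current` is preserved, since when a name
    first surfaces can itself be informative.
    """
    seen = set(previous)
    return [name for name in current if name not in seen]

# Two hypothetical snapshots of a user's recommendation list, a day apart.
monday = ["Alice Example", "Bob Example"]
tuesday = ["Alice Example", "Carol Example", "Bob Example", "Dave Example"]

print(new_recommendations(monday, tuesday))
# ['Carol Example', 'Dave Example']
```

Note that, as in the real tool, everything here runs locally on the user's own recommendation data; nothing is collected centrally.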
The proposal in the letter is that Facebook amend its terms of service to create a "safe harbor" for journalism. While Facebook recently agreed to open up lots of data to third party academics, it's important to note that journalists and academics are not the same thing.
The safe harbor we envision would permit journalists and researchers to conduct public-interest investigations while protecting the privacy of Facebook's users and the integrity of Facebook's platform. Specifically, it would provide that an individual does not violate Facebook's terms of service by collecting publicly available data by automated means, or by creating and using temporary research accounts, as part of a news-gathering or research project, so long as the project meets certain conditions.
First, the purpose of the project must be to inform the general public about matters of public concern. Projects designed to inform the public about issues like echo chambers, misinformation, and discrimination would satisfy this condition. Projects designed to facilitate commercial data aggregation and targeted advertising would not.
Second, the project must protect Facebook's users. Those who wish to take advantage of the safe harbor must take reasonable measures to protect user privacy. They must store data obtained from the platform securely. They must not use it for any purpose other than to inform the general public about matters of public concern. They must not sell it, license it, or transfer it to, for example, a data aggregator. And they must not disclose any information that would readily identify a user without the user's consent, unless the public interest in disclosure would clearly outweigh the user's interest in privacy.
There are a few more conditions in the proposal, including not interfering with the proper working of Facebook. The letter includes a draft amendment as well.
While some people may hesitate at anything that seems to carve out different rules for a special class of people, I appreciate that the approach here is focused on carving out a safe harbor for journalism rather than journalists. That is, as currently structured, anyone engaged in acts of journalism could qualify for the safe harbor; there is no silly requirement about being attached to a well-known media organization or anything like that. The entire setup seems quite reasonable, so now we'll see how Facebook responds.