Facebook's '15 Months of Fresh Hell' detailed deliciously by WIRED
'Scandals. Backstabbing. Resignations. Record profits. Time Bombs. In early 2018, Mark Zuckerberg set out to fix Facebook.'
Welp. That didn't work.
The cover story of WIRED's May issue is a rip-snorting, 12,000-word takedown of Facebook.
Never would have predicted this more than a decade ago, when I first wrote about Facebook (it was still a college networking website) and I was a contributor to Wired. The future is weird.
For the past year, the biggest story in tech has been the meltdown and mayhem at Facebook. So @fvogelstein and I dug in, spoke with 65 current + former employees there, and learned some rather interesting things. https://t.co/3UuFVtMdZz
- Nicholas Thompson (@nxthompson) April 16, 2019
The magazine hits newsstands April 23, and I'll be wanting a paper copy of this one.
WIRED's editor in chief Nicholas Thompson and editor at large Fred Vogelstein spoke to 65 current and former Facebook employees for this story about life and work there over the past year.
Plenty of new information, including previously unreported dirt on the split between Facebook and Instagram. Notably, Sheryl Sandberg comments on Cambridge Analytica.
"It's ultimately a story about the biggest shifts ever to take place inside the world's biggest social network," Thompson and Vogelstein say in a statement sent around to media today, touting WIRED's big story.
"But it's also about a company trapped by its own pathologies and, perversely, by the inexorable logic of its own recipe for success."
TRUSTWORTHINESS
Why would a company beset by fake news stick a knife into real news? And what would Facebook's algorithm deem trustworthy? Would the media executives even get to see their own scores?
Facebook didn't have ready answers to all of these questions; certainly not ones it wanted to give. The last one in particular, about trustworthiness scores, quickly inspired a heated debate among the company's executives at Davos and their colleagues in Menlo Park. Some leaders, including [VP of Global Communications, Elliot] Schrage, wanted to tell publishers their scores. It was only fair. Also in agreement was Campbell Brown, the company's chief liaison with news publishers, whose job description includes absorbing some of the impact when Facebook and the news industry crash into one another.
But the engineers and product managers back at home in California said it was folly. Adam Mosseri, then head of News Feed, argued in emails that publishers would game the system if they knew their scores. Plus, they were too unsophisticated to understand the methodology, and the scores would constantly change anyway. To make matters worse, the company didn't yet have a reliable measure of trustworthiness at hand.
Heated emails flew back and forth between [Davos,] Switzerland and Menlo Park. Solutions were proposed and shot down. It was a classic Facebook dilemma. The company's algorithms embraid choices so complex and interdependent that it's hard for any human to get a handle on it all. If you explain some of what is happening, people get confused. They also tend to obsess over tiny factors in huge equations. So in this case, as in so many others over the years, Facebook chose opacity. Nothing would be revealed in Davos, and nothing would be revealed afterward. The media execs would walk away unsatisfied.
**
If you want to promote trustworthy news for billions of people, you first have to specify what is trustworthy and what is news. Facebook was having a hard time with both. To define trustworthiness, the company was testing how people responded to surveys about their impressions of different publishers. To define news, the engineers pulled a classification system left over from a previous project, one that pegged the category as stories involving "politics, crime, or tragedy."
That particular choice, which meant the algorithm would be less kind to all kinds of other news (from health and science to technology and sports), wasn't something Facebook execs discussed with media leaders in Davos. And though it went through reviews with senior managers, not everyone at the company knew about it either. When one Facebook executive learned about it recently in a briefing with a lower-level engineer, they say they "nearly fell on the fucking floor."
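To make the mechanics concrete, here's a toy Python sketch of how a definition of "news" that narrow interacts with a trust-based ranking boost. Every name, score, and weight below is hypothetical (this is not Facebook's code); it just illustrates why stories outside "politics, crime, or tragedy" would never see the benefit of a publisher's trustworthiness score.

```python
# Illustrative sketch only: a toy model of how a narrow "news" classifier
# can starve whole categories of a trust-based ranking boost. All names,
# scores, and weights here are hypothetical.

NEWS_TOPICS = {"politics", "crime", "tragedy"}  # the leftover classifier's definition of "news"

# Hypothetical survey-derived trust scores per publisher (0.0 to 1.0)
PUBLISHER_TRUST = {"wire_service": 0.9, "tabloid": 0.3}

def rank_score(base_engagement: float, topic: str, publisher: str) -> float:
    """Boost a story by publisher trust, but only if it counts as 'news'."""
    if topic in NEWS_TOPICS:
        # Trusted publishers get up to a 2x boost on "news" stories.
        return base_engagement * (1.0 + PUBLISHER_TRUST.get(publisher, 0.0))
    # Health, science, tech, and sports stories never reach the boost branch,
    # so a publisher's trust score does nothing for them.
    return base_engagement

# Same publisher, same engagement: only the "politics" story gets the boost.
print(rank_score(100.0, "politics", "wire_service"))  # 190.0
print(rank_score(100.0, "science", "wire_service"))   # 100.0
```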
CAMBRIDGE ANALYTICA
...some people at Facebook worried that the story of their company's relationship with Cambridge Analytica was not over. One former Facebook communications official remembers being warned by a manager in the summer of 2017 that unresolved elements of the Cambridge Analytica story remained a grave vulnerability. No one at Facebook, however, knew exactly when or where the unexploded ordnance would go off. "The company doesn't know yet what it doesn't know yet," the manager said. (The manager now denies saying so.)
**
And so it was that a confused and fractious communications team huddled with management to debate how to respond to the Times and Guardian reporters. The standard approach would have been to correct misinformation or errors and spin the company's side of the story. Facebook ultimately chose another tack. It would front-run the press: dump a bunch of information out in public on the eve of the stories' publication, hoping to upstage them. It's a tactic with a short-term benefit but a long-term cost. Investigative journalists are like pit bulls. Kick them once and they'll never trust you again.
Facebook's decision to take that risk, according to multiple people involved, was a close call. But on the night of Friday, March 16, the company announced it was suspending Cambridge Analytica from its platform. This was a fateful choice. "It's why the Times hates us," one senior executive says. Another communications official says, "For the last year, I've had to talk to reporters worried that we were going to front-run them. It's the worst. Whatever the calculus, it wasn't worth it."
**
"Those five days were very, very long," says Sandberg, who now acknowledges the delay was a mistake. The company became paralyzed, she says, because it didn't know all the facts; it thought Cambridge Analytica had deleted the data. And it didn't have a specific problem to fix. The loose privacy policies that allowed [researcher Aleksandr] Kogan to collect so much data had been tightened years before. "We didn't know how to respond in a system of imperfect information," [Sandberg] says.
Facebook's other problem was that it didn't understand the wealth of antipathy that had built up against it over the previous two years. Its prime decisionmakers had run the same playbook successfully for a decade and a half: Do what they thought was best for the platform's growth (often at the expense of user privacy), apologize if someone complained, and keep pushing forward. Or, as the old slogan went: Move fast and break things. Now the public thought Facebook had broken Western democracy. This privacy violation, unlike the many others before it, wasn't one that people would simply get over.
Finally, on Wednesday, the company decided Zuckerberg should give a television interview. After snubbing CBS and PBS, the company summoned a CNN reporter whom the communications staff trusted to be reasonably kind.
READ ONLINE: "15 Months of Fresh Hell Inside Facebook."
Disappointing that Cambridge Analytica's unlawful conduct still gets papered over in these various recapitulations. The findings of ICO to Parliament rarely ever get mentioned in the press. THEY BROKE UK LAW IN THE US ELECTIONS.
https://t.co/DF7Vlf75Ts
https://t.co/OBgsVAB6nh pic.twitter.com/6IdILXlWCl
- David Carroll (@profcarroll) April 16, 2019
Speculation over whether Cambridge Analytica was effective is a fool's errand. There are no data nor control groups to prove anything either way. Why do folks take that bait when the real debate is whether it was lawful or not? The real story is how they may get away with it.
- David Carroll (@profcarroll) April 16, 2019
Here's a rare article from March 2018 that actually characterized Cambridge Analytica accurately as a military-grade PSYOPs shop for plutocrats that got into Big Data and Facebook targeting while flouting the law. https://t.co/cjd4YE2J9j pic.twitter.com/fevX84hKFT
- David Carroll (@profcarroll) April 16, 2019
"It would front-run the press: dump a bunch of information out in public on the eve of the stories' publication, hoping to upstage them. Investigative journalists are like pit bulls. Kick them once and they'll never trust you again."
Google's PR shop did this to workers before. https://t.co/zxuu7ZD18r
- Liz Fong-Jones (方禮真) (@lizthegrey) April 16, 2019
This is an incredible deep dive into the last 15 months of hell at Facebook. Anyone who runs social knows what's been happening to our feeds - this is what was happening inside Facebook during that time. https://t.co/8eZI9aAQRu
- Meghann Farnsworth (@mtfarnsworth) April 16, 2019
My favorite part of this is the name of Sheryl Sandberg's private conference room:
"Only Good News" https://t.co/83nnH763Q2- John Paczkowski (@JohnPaczkowski) April 16, 2019
This is pretty nuts. From the big Wired feature:
Last year, Facebook's engineers classified news as only being "politics, crime, or tragedy," which apparently vanished every other kind of journalism on the platform? https://t.co/ondxSGewqD pic.twitter.com/EfaFewU3s7
- Ryan Broderick (@broderick) April 16, 2019