Facebook Derangement Syndrome: The Company Has Problems, But Must We Read The Worst Into Absolutely Everything?
Since the whole Facebook/Cambridge Analytica story broke, we've been pointing out that there are many, many valid concerns about things Facebook has done, but people seem to be freaking out about things it didn't actually do. That's bad, because freaking out about the wrong things will make things worse, not better. Indeed, that seems to be the direction things are heading.
One thing I've noticed in having this discussion a few times now, both online and off, is that there appears to be a bit of Facebook derangement syndrome going on. It seems to go something like this: Facebook did some bad things concerning our privacy, and therefore every single possible thing that Facebook does or Mark Zuckerberg says must have some evil intent. This is silly. Not only is it obviously wrong, but (more importantly) it makes it that much more difficult to have a serious discussion about the actual mistakes of Facebook and Zuckerberg, and to find ways to move forward productively.
I'll give one example of this in practice, because it's been bugging me. Back in January, in the podcast we had with Nabiha Syed about free speech and the internet, where the question of platform moderation came up, I brought up an idea I've discussed a few times before. Noting that one of the real problems with platform moderation is the complete lack of transparency and due process, I wondered whether an independent judicial-type system could be set up to determine whether an account truly violated a site's policies. As I noted in the podcast, there could clearly be some problems with this (our own judicial system is costly and inefficient), but I still think there may be something worth exploring there. After all, one reason why so many people get upset about internet companies making these kinds of decisions is that they don't know why the decisions are being made, and there's no real way to appeal. An open judicial system of sorts could solve at least some of those problems, bringing both transparency and due process to the issue.
And while I've talked about this idea a few times before, I've never seen anyone else appear to take it seriously... until I was surprised to see Zuckerberg suggest something similar in his interview with Ezra Klein at Vox. That interview has been criticized for being full of softball questions, which is pretty fair criticism. But I still found this part interesting:
Here are a few of the principles. One is transparency. Right now, I don't think we are transparent enough around the prevalence of different issues on the platform. We haven't done a good job of publishing and being transparent about the prevalence of those kinds of issues, and the work that we're doing and the trends of how we're driving those things down over time.
A second is some sort of independent appeal process. Right now, if you post something on Facebook and someone reports it and our community operations and review team looks at it and decides that it needs to get taken down, there's not really a way to appeal that. I think in any kind of good-functioning democratic system, there needs to be a way to appeal. And I think we can build that internally as a first step.
But over the long term, what I'd really like to get to is an independent appeal. So maybe folks at Facebook make the first decision based on the community standards that are outlined, and then people can get a second opinion. You can imagine some sort of structure, almost like a Supreme Court, that is made up of independent folks who don't work for Facebook, who ultimately make the final judgment call on what should be acceptable speech in a community that reflects the social norms and values of people all around the world.
Huh. That's almost exactly what I suggested. Again, I see some potential problems with this kind of setup and am not 100% convinced it's the best idea -- but it does solve some of the very real existing problems. But the knee-jerk "everything Zuckerberg says must be bad" crowd kinda took this statement and ran with it... straight into a wall. Here's what Laura Rosenberger, a former high-level government staffer, tweeted in response to that part of Zuck's interview:

In that tweet, she says:
This is terrifying. Facebook essentially sees itself becoming a system of world governance, complete with its own Supreme Court.
So, first of all, this gets what Zuckerberg said exactly backwards. Indeed, it takes a special kind of "must-hate-on-everything-he-says" attitude to misread a statement about being more transparent and more accountable to an outside set of arbitrators, and turn it into "Facebook wants to build its own Supreme Court." I mean, he literally says it should be an outside panel, made up of people who don't work for Facebook, reviewing Facebook's decisions, and she turns it into "Facebook's own Supreme Court."
But, of course, her tweet got tons of retweets, with lots of people agreeing and chipping in comments about how Zuckerberg is a sociopath and dangerous and whatnot. And, hey, he may very well be those things, but not for what he said here. He actually seemed to be recognizing the very real problem of Facebook having too much power to make decisions that have a huge impact, and he actually seemed open to the idea of giving up some of that power to outside arbitrators, and doing so in a much more transparent way. Which is the kind of thing we should be encouraging.
And, instead, he gets attacked for it.
If that's what happens when he actually makes a potentially good suggestion that results in more transparency and due process, then why should he bother to keep trying? Instead, he can do what people keep demanding he do, and become an even more powerful middleman, with even less transparency and more control over everyone's data -- which he could now do in the name of "protecting your data."
So can we please get past this Facebook derangement syndrome where people are so quick to read the worst into everything Facebook does or Zuckerberg says that we actively discourage the good ideas and push him towards even worse ideas?