Editor's Soapbox: Encryption By Analogy
Last week, US President Obama said the sort of thing we'd usually give to the "idiot boss" character in one of our stories. From Ars Technica:
"I am not a software engineer," Obama made his beliefs wholly clear. "You cannot take an absolutist view on this. If your view is strong encryption no matter what and we can and should create black boxes, that does not strike the balance that we've lived with for 200 or 300 years. And it's fetishizing our phones above every other value. That can't be the right answer."
That was when I decided to write this article, and as it turns out, I wasn't the only one so inspired. John Oliver's "Last Week Tonight" also weighed in on the subject.
Our readers are a technical audience, so I don't think I need to use this soapbox to preach to the choir. I think we all recognize that in security, "an absolutist view" is required. Any security that can be bypassed by a third party can be bypassed by any third party, or more colorfully: "Two people can share a secret if one of them is dead." And of course, the hard reality is that encryption is math and anybody can do it, and you can't outlaw math.
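To underline the "encryption is math" point, here's a toy Python sketch of a one-time pad- the message is made up, and this is an illustration, not production crypto. "Locking" and "unlocking" are nothing more exotic than XORing bytes:

    import secrets

    def xor_bytes(data: bytes, key: bytes) -> bytes:
        # XOR each message byte with the matching key byte.
        return bytes(d ^ k for d, k in zip(data, key))

    message = b"attack at dawn"              # toy plaintext
    key = secrets.token_bytes(len(message))  # random key as long as the message

    ciphertext = xor_bytes(message, key)     # encrypt
    recovered = xor_bytes(ciphertext, key)   # XOR with the same key decrypts
    assert recovered == message

No law can stop anyone from performing that XOR; the only secret is the key itself.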
I think President Obama's statement, though, highlights a more prosaic issue that impacts our lives as IT professionals. Obama, here, is standing in the role of upper-level management- someone with power who is simultaneously ignorant. In this case, it's downright terrifying, because Obama is in a position where his power can cause serious harm. In our daily lives, the damage caused by this kind of ignorance is... well, hit the random article button on the sidebar. Odds are, you'll land on something that covers it.
Here's what I want to know: how does this happen, and what can we do about it? Why do non-technical people fail to grasp important issues, and how can we do a better job helping them?
For the purposes of this article, I'm going to be generous- with all of my examples, I'm going to assume the person I'm talking about is sincere and attempting to make the correct choice. I will assume no one is being willfully ignorant, manipulative, or purposefully harmful.
Intuition and Common-Sense

Astronomer and science-popularizer Neil deGrasse Tyson put his foot in it last week, when he tweeted, "If there were ever a species for whom sex hurt, it surely went extinct long ago." Now, he's an extremely smart man, and he has done great things to help the public grapple with some serious cosmic questions- but this statement is wrong. Just ask bed-bugs.
This statement falls into a class of ideas that are intuitively correct, but actually wrong: it sounds perfectly reasonable, yet it isn't true.
Take a random person and ask them: "I flipped a coin ten times, and it came up heads every time. Is it more or less likely that the next flip will also be heads?" Most people will get it wrong, because we intuitively know that eleven heads in a row is very unlikely- but every individual coin flip is a strict 50/50 proposition. Our bad intuition when it comes to probability can be seen in the Monty Hall Problem, or in the continued existence of Las Vegas.
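If the math alone doesn't convince someone, a simulation might. Here's a throwaway Python sketch (the trial count is arbitrary) that generates runs of eleven flips, keeps only the runs that open with ten straight heads, and checks how often the eleventh flip is also heads:

    import random

    matched = 0     # runs that opened with ten straight heads
    heads_11th = 0  # of those, runs where flip #11 was also heads

    for _ in range(2_000_000):
        flips = [random.choice("HT") for _ in range(11)]
        if flips[:10] == ["H"] * 10:
            matched += 1
            heads_11th += flips[10] == "H"

    # Ten-heads runs are rare (about 1 in 1,024), but among them the
    # eleventh flip still lands heads about half the time.
    print(f"{heads_11th}/{matched} = {heads_11th / matched:.3f}")

The streak tells you nothing about the next flip, because the coin has no memory.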
When the President says, "Hey, if we have a warrant, we should be able to search your phone," that's intuitively correct (that's the whole point of a warrant), but actually wrong. As previously stated, encryption that can be broken isn't encryption worth having.
Wishful Thinking

Related, but subtly different, is when our intuition starts feeding into how we think the world should work. Every few months, a journalist hears about the Alcubierre Drive and runs an article on it. Everyone's going, "Yay, warp drives!" until some party-pooping physicist points out that to build one you'd need matter with negative mass, and what does that even mean?
The rebuttal to the party-pooper is pretty much the same: "They said (breaking the sound barrier|heavier-than-air flight|going to the moon|some other really hard thing) was impossible." This may be rooted in intuition, but it grows in a different direction: a world with warp drives is cooler than one without, and I want to live in that world.
Donald Trump's entire Presidential campaign this year has been a mixture of reality-show "oh snap" moments and wishful thinking. His plan, for example, to stop illegal immigration by building a wall across the Mexican border ignores the technical challenges, the economic challenges, the ecological consequences, and the reality that walls can be bypassed in a number of ways. The obstacles and impracticality are ignored in favor of a "wouldn't it be cool if..." feeling.
In the John Oliver clip, a recurring theme he touches upon is the refrain, "If we put a man on the moon, I don't see why it's so hard to make encryption that the government can spy on!" There are a lot of wishes in that: a wish for a world where good guys can read anybody's mail but bad guys never can, and a world where the government is always a good guy. That's arguably a better world than the one we live in, but the statement blows right past the real-world impossibility of the demand.
Now, at this point, it is both easy and tempting to let this article pivot into a full taxonomy of logical fallacies and cognitive blind spots. The thing is, I don't care about the ways in which people are wrong as much as I care about why they reach these wrong conclusions. And I'm not concerned with being exhaustive. Instead, I want to talk about one more cause- one that we are often complicit in when we talk to non-technical people.
The Argument from Analogy

Have you ever found yourself trying to explain something to your boss by saying, "Well, it's like waiting in line at the bank- instead of having a separate queue for each teller, it's faster if they have one queue for all of the worker threads"? That's an analogy, and it's a powerful way to explain complicated concepts.
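To be fair, that analogy maps onto real code fairly directly. Here's a minimal, hypothetical Python sketch of the single-shared-queue pattern (all the names are invented): one line of customers feeding several teller threads.

    import queue
    import threading

    tasks = queue.Queue()  # one shared line for every teller

    def teller(name: str) -> None:
        while True:
            customer = tasks.get()
            if customer is None:  # sentinel: this teller goes home
                break
            print(f"{name} serves {customer}")
            tasks.task_done()

    # Three tellers pull from the same line, so nobody gets stuck
    # behind one slow customer in a dedicated queue.
    tellers = [threading.Thread(target=teller, args=(f"teller-{i}",)) for i in range(3)]
    for t in tellers:
        t.start()

    for i in range(10):
        tasks.put(f"customer-{i}")

    tasks.join()          # block until every customer has been served
    for _ in tellers:
        tasks.put(None)   # dismiss the tellers
    for t in tellers:
        t.join()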
Scientists use this all the time. Most of you have probably seen a demonstration or visualization like this one, which uses a rubber mat to show how gravity warps spacetime. And of course, Schrödinger's rather famous cat has been alive and dead so many times.
Of course, both of those analogies do a really bad job of actually explaining the phenomenon. They're simplistic, and they obscure a lot of details in the name of getting the point across- in technical terms, they're leaky abstractions.
And these kinds of leaky abstractions are built right into our technical terms. Like, for example, oh... I don't know... keys? An encryption key is analogous to a real key, but I've never signed something with my house key. Yet, in John Oliver's segment, we hear a police officer argue that since there's no door or safe they couldn't break open, encryption should be the same way.
How many times have we heard of encryption described in terms of locks and safes?
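The lock-and-safe analogy leaks precisely because a key isn't a shaped piece of metal- it's just a number, and "signing" with it is just arithmetic. Here's a hedged sketch using Python's standard hmac module (the key and message are invented for illustration):

    import hashlib
    import hmac

    key = b"just-bytes-not-brass"  # the "key" is only data
    message = b"pay alice $100"

    # "Signing" is computing a keyed hash; nothing physical to pick or drill.
    tag = hmac.new(key, message, hashlib.sha256).hexdigest()

    # Verification recomputes the tag; without the key, there's no "lock" to force.
    assert hmac.compare_digest(tag, hmac.new(key, message, hashlib.sha256).hexdigest())

There's no door here for a battering ram; the only way in is to have the number.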
Analogies are powerful, but with great power comes great responsibility. Analogies provide an illusion of knowledge- by mapping a complex, difficult-to-understand problem domain onto a realistic(ish), easy-to-grasp analog, people can feel like they understand something complicated, even though they don't.
What Can We Do?

At this point, I feel a little like I'm standing in the corner, looking at a freshly painted floor. I've laid out a challenge that can't be resolved by just a little essay on an IT humor site. How do we fight technical ignorance in our co-workers, our friends and family, and our elected officials?
It's not just a matter of educating people. You can re-run the original Carl Sagan Cosmos as much as you like, but there will still be people who deny evolution and the scale of the universe. Statistics have never convinced a single anti-vaxxer they were wrong. Conspiracy theorists are never swayed by little things like evidence.
I think, though, that there's a common thread connecting my three points above. Intuition, wishful thinking, and analogies all involve setting and then confirming our expectations. So, if nothing else, breaking those expectations is something that helps. Comedy is always useful here, precisely because it's "surprising".
More generally, taking people's assumptions and showing them the situations where those assumptions fail is a great way to get them to learn. The challenge is that this requires us to understand their assumptions in the first place. It's often very hard for an expert in one area to forget what they know and approach a situation with a "beginner's mind". It also requires some skills that the IT field generally doesn't prioritize: empathy and active listening.
When I see things like this battle over encryption, I think that it's less important to educate people about the correct details, and more important to break their incorrect preconceptions.
I'm curious to hear from our readers on this, so click the comments link or shoot us a message via email. A future edition of my hand-drawn videos will likely tackle the specific issue of encryption, and I'd love to hear some ideas about that.