To start off this blog, I will take a look at a topical example of technology intersecting with the everyday world: the recently-proposed Burr-Feinstein Compliance with Court Orders Act of 2016, which has been variously called the “anti-encryption bill,” the “backdoor bill,” and various other names (some not printable in this family-friendly forum!). Ramping up the relevance to me is that Richard Burr, one of the sponsors of this bill, is my senator from North Carolina. Burr and Feinstein proposed this bill as a response to the Apple/FBI argument over breaking into the iPhone of one of the San Bernardino shooters, and the question I’ll look at here is whether this bill matches up with how technology works. To skip to the punch line, from a technical standpoint this bill is about as high on the clueless scale as you can get.
A wide array of technical experts, writers, and pundits have rightly slammed this bill, ranging from Bruce Schneier, a highly prominent security researcher and writer who observed that “The person who wrote this either has no idea how technology works or just doesn’t care,” to Julian Sanchez of the Cato Institute, who wins extra points from me for using an Inigo Montoya meme from The Princess Bride in one of his excellent pieces. Looking at various people’s objections, some are philosophical and some are technical. I’m not sure I can add anything new to the conversation at this point, but I hope that highlighting a few issues will benefit anyone who comes across this article. I will focus on technical aspects of the proposed bill, and leave the philosophical questions of whether the intentions behind it are reasonable to another time. The two main technical questions are “Can legislation effectively control a technology like encryption?” and “If commonly-used products are forced to comply with this proposed law, will we be more secure or less secure?” Unfortunately, the answers that pretty much every technical expert agrees on are “no, it’s impossible to control encryption” and “this would make us less secure.”
What does the bill say?
Before seeing what is in the bill, let’s look at the name: the Compliance with Court Orders Act of 2016. Wow, what a great idea! People should have to comply with court orders, once the judicial system arrives at a definitive decision (including resolution of any reasonable appeals or challenges). Rule of law! Apple pie! (But not the technology Apple, who Feinstein doesn’t seem to like very much.) But, of course, people already have to comply with court orders — it’s a “court order” after all, not a “court suggestion,” and that’s what the whole contempt of court thing is about, and many people learn the hard way that yes, Virginia, you really do need to obey the court. While hiding behind an apple pie title, what the bill is really about is forcing companies to design their products so that courts can ask for certain things that they might not otherwise be able to ask for.
So what does the bill mandate? It says that any provider of “communication services and products (including software)” must, upon receiving a court order for information or data, “provide such information or data to such government in an intelligible format.” This then presupposes that any such technology must be capable of providing information in an intelligible format. Specifically, Apple must design the iPhone in such a way that they are able to decrypt your communications, so that in the future they can respond to such requests from a court. Simply put, no one can provide you with a tool that the provider can’t break.
The bill is essentially requiring a backdoor in communication products: a way that someone who is not involved in the communication can get access to the content of the conversation. The “front door” for access is what the communication participants use: an account password, phone PIN, decryption key — things that the legitimate user uses. A “backdoor” then is a way for someone else to get access without going through the accepted and visible controls of the front door — you’re sneaking in and around other security measures. That’s what this bill mandates.
Now let’s look at the two main technical issues.
Why legislation can’t control encryption
Non-technical people sometimes think encryption is some highly complex magical incantation that only a privileged few can put into products. Here’s a secret: That’s just not true — some strong encryption algorithms are in fact outrageously simple, using basic math (sure, maybe not math that you use every day, but drop-dead simple for any mathematician). Consider the RSA algorithm, which is widely used in very security-sensitive situations. Don’t take my word for it though, use Firefox to go to https://www.bankofamerica.com (to pick a random example), click on the green padlock indicating a secure connection, drill down into “Bank of America Corporation” and then the “More Information” button. Under “Technical Details” you’ll see something like this:
[Screenshot: the “Technical Details” pane, showing the connection’s cipher suite]
See the “RSA” in there? That’s one of the algorithms that is keeping the information of banking customers safe, and it is in fact a very secure algorithm given appropriately-sized encryption keys. So what sort of wizardry is this? Surely this is a horribly complex use of tricky mathematics that no one would be able to do without purchasing a special product, right? Here’s the RSA encryption algorithm, written in Python, a popular programming language:
ciphertext = pow(plaintext, e, modulus)
Yes, that’s the whole, entire thing. Of course, you have to encode your message (the plaintext) as a number, but everything in a computer is a number anyway, so that’s basically done for you. And you have to pick an appropriate modulus, which is a little trickier, but not much. In fact, going back to the “More Information” window for Bank of America, click on “View Certificate” and then the “Details” tab, and finally click on “Subject’s Public Key.” That should show you the modulus that is Bank of America’s key. If I create a number like that and don’t let you (or the FBI or anyone else) know what the prime factorization of the number is, you won’t be able to break this encryption. It’s backdoor-free and completely immune to any court order.
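To make the pieces concrete, here is a minimal sketch of the whole round trip: key generation from two primes, then encryption and decryption. The specific numbers are illustrative choices of mine, absurdly small by design; real RSA uses primes hundreds of digits long, plus a padding scheme such as OAEP, and this sketch assumes Python 3.8+ for the modular-inverse form of pow.

```python
p, q = 61, 53                # two secret primes (far too small for real use)
n = p * q                    # the public modulus, n = 3233
phi = (p - 1) * (q - 1)      # only computable if you know p and q
e = 17                       # public exponent
d = pow(e, -1, phi)          # private exponent (modular inverse, Python 3.8+)

plaintext = 42                      # a message, encoded as a number
ciphertext = pow(plaintext, e, n)   # encrypt: c = m^e mod n
recovered = pow(ciphertext, d, n)   # decrypt: m = c^d mod n
assert recovered == plaintext       # the round trip is exact
```

Everything an eavesdropper sees is n, e, and the ciphertext; recovering d requires knowing p and q, which is exactly the factorization secret described above.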
The code above doesn’t use anything written specifically for cryptography. “pow” performs a basic mathematical operation called modular exponentiation that doesn’t necessarily have anything at all to do with encryption. Every math major learns about this in an undergraduate abstract algebra class, and it’s probably safe to say that there are people in every country on Earth (including Syria, North Korea, and China) who could use this to make a secure encryption system in short order. Al Qaeda has been using its own encryption software, called “Mujahideen Secrets,” for over a decade. For some odd reason, I don’t believe that Al Qaeda will change their behavior and use products that provide access to “intelligible information” just because the Burr-Feinstein bill told them to. So much for this bill foiling the terrorists.
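To illustrate why key size is the whole game here, the following sketch (the function name and numbers are my own) plays attacker: it factors a tiny modulus by trial division and rebuilds the private key from scratch. Against a toy key this takes microseconds; against a real 2048-bit modulus, no publicly known method comes anywhere close.

```python
def crack(n, e, ciphertext):
    # Brute-force factor n -- only feasible because n is tiny.
    p = next(f for f in range(2, n) if n % f == 0)
    q = n // p
    d = pow(e, -1, (p - 1) * (q - 1))  # rebuild the private exponent
    return pow(ciphertext, d, n)       # decrypt without being given the key

n, e = 3233, 17          # a toy public key (3233 = 61 * 53)
c = pow(42, e, n)        # an intercepted ciphertext
print(crack(n, e, c))    # prints 42 -- the attack succeeds at this scale
```

The same few lines run forever on a properly sized modulus, which is the entire security argument: the algorithm is public and trivial, and only the size of the secret primes stands between an attacker and the plaintext.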
Of course, the U.S. doesn’t control the world’s technology, and you don’t have to resort to terrorist-produced software to get secure communication tools. This law (or any other law) can do nothing to stop that. Bruce Schneier recently made a list of 546 non-U.S. encryption products, produced in 54 different countries. None of those products would be subject to any U.S. backdoor requirements, and it would be impossible to stop a U.S. citizen (or anyone else) from using these unrestricted products. Unless the enforcers monitor every single Internet communication, police-state style, it would be impossible to tell if someone were installing or using one of these products.
We’ve actually been through very similar arguments, back in the 1990s. Back then, the government tightly controlled the export of encryption technologies, including software. When you downloaded a web browser, like Netscape, you selected either the “U.S.-only” version or the “International” version, where the International version used weakened cryptography (trivial to crack with 2016 technology, and only slightly less difficult to crack 20 years ago). Eventually sanity prevailed — people pointed out the absurdity of restricting the export of cryptography, when anyone in the world could download very secure encryption software from a server in Finland or some other country. One of the more interesting protests at that time was the “munitions T-shirt,” a T-shirt that included an implementation of the RSA algorithm on it, with the warning message that “This shirt is classified as a munition and may not be exported from the United States, or shown to a foreign national.” In the end, the argument that probably carried the day with politicians in Washington was that U.S. companies were losing business, because foreign companies could (and did) provide technology that U.S. companies were restricted from providing. The U.S. was going to lose out on the economic boom of the digital economy, and there wouldn’t even be any security gain because people were obtaining secure products anyway — from foreign companies. All of those arguments are just as valid today as they were 20 years ago, but somehow there are politicians who still think restrictions are a sensible idea.
Why backdoors make us less secure
Finally, consider a world in which this law is in place in the U.S., and for some reason people keep using U.S. technologies. If the technology includes a way for someone to get an “intelligible” version of encrypted communication, there must be some secret information that enables this decryption. How long would such a secret remain secret? There has been speculation that law enforcement and intelligence agencies could submit thousands of decryption requests to Apple every year, which would mean multiple requests every day. How many people would need access to the secret in order to handle these requests? How secure would the integrity of the requests be? The track record of other systems that depend on keeping a widely-shared master secret safe does not inspire confidence.
Further, consider that if it is possible for the U.S. government to compel a technology provider to decrypt private communications, then it is equally possible for the government of any country in which that provider does business to compel the same decryption. A U.S. law like this would announce to the world, in big bold letters, that “we have the technology to break the security of this product.” Even if you trust the U.S. government to use this power only to protect the physical safety of U.S. citizens, there are clearly governments that would use it to crack down on political dissidents or others who pose a danger not to people but to the regime.
Technological backdoors can’t distinguish who is using them. As far as the technology is concerned, the backdoor will work the same for a good government, a tyrannical government, or a criminal who discovers the secret to the backdoor. The FBI’s eventual cracking of the San Bernardino iPhone, without Apple’s help, is disturbing for this same reason: As far as the technology is concerned, there’s no difference between the FBI being able to do this and a criminal being able to do it. A weakness is a weakness is a weakness. And once you weaken a product by including a backdoor, it’s just waiting to be exploited.
Conclusions
The bottom line here is that this law attempts to do something that is impossible, and passage of the law would make everyone less secure. There is simply no way to make a technology that allows just the “good guys” to break the technology of just the “bad guys,” since technology has no notion whatsoever of “good” vs. “bad.”
I want technology companies to focus on securing their products, and not thinking about how to weaken that security in certain situations. Providing strong security is a hard enough technological problem to solve, without having to do it with one hand tied behind your back.