
The Manhattan District Attorney’s Office recently published a report on smartphone encryption and public safety. The office argues that the best way to balance privacy and safety is to require companies like Apple and Google to be able to decrypt users’ smartphones upon presentation of a valid warrant. The report is here.
In episode 535 of Security Now, security expert Steve Gibson agreed with the proposal, saying that this is the right compromise. Leo Laporte disagreed. I was shocked to hear Steve endorse this, and I hope he’ll reconsider. I’m writing this essay to explain why this proposal not only has legal problems, but is also fundamentally incompatible with both open source software and the principle that users own their devices.
Law Enforcement’s Problem
To understand the issue, it’s important to understand the problem that law enforcement agencies are trying to address — finding digital evidence of crimes.
A warrant has traditionally provided law enforcement with the authority to go wherever they need to go and do (almost) whatever they need to do to find the evidence that they already have probable cause to believe exists.
From the perspective of law enforcement, the increasing role that digital records play in our lives and the ease with which average citizens can now apply strong encryption are problematic. There are now digital spaces on computers and smartphones where a search warrant doesn’t help them very much, because they can’t break the encryption in any reasonable time period (it could take thousands of years).
What law enforcement wants is for a warrant to enable them to search a computer or smartphone in the same way that they can search a room or a safe. It’s a reasonable request, but I argue that this is neither legally nor technically feasible.
The Problem with Law Enforcement’s Problem
Law enforcement has done an excellent job of convincing people that modern encryption has diminished their ability to find criminals, and that this puts us all in danger. I argue that this is not the case.
Encryption is not New
Encryption is not a new discovery. We’ve been encrypting messages almost since we’ve had written language. The Caesar Cipher is a prominent example; Julius Caesar is known to have used it to encrypt military messages to protect their contents.
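For fun, here’s a minimal sketch of the Caesar Cipher in Python, shifting each letter by a fixed amount:

```python
def caesar(text: str, shift: int) -> str:
    """Shift each letter by a fixed amount, wrapping around the alphabet."""
    out = []
    for ch in text:
        if ch.isalpha():
            base = ord('A') if ch.isupper() else ord('a')
            out.append(chr((ord(ch) - base + shift) % 26 + base))
        else:
            out.append(ch)  # leave spaces and punctuation alone
    return "".join(out)

print(caesar("ATTACK AT DAWN", 3))   # -> "DWWDFN DW GDZQ"
```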
Not only is encryption ancient, it’s a fundamental part of human expression. Children and adults alike use it regularly. We write words backwards. We substitute shapes or numbers for words or letters. We speak in Pig Latin for fun, and to prepare for bedtime without children understanding us. These are all examples of encryption.
When criminals keep incriminating written logs for financial transactions or times and places, they may use some form of hand-written encryption such that a casual observer would not realize the purpose of the writing. Criminals may talk about a “stack of pancakes” when they mean “box of bullets” to prevent others from overhearing.
My point here is that law enforcement has always had to deal with encryption. It is not at all a new discovery, as they would have us believe. Computers have simply made it easier to use.
Physical Places Still Exist
A warrant still authorizes law enforcement to search a home, bug a phone, or track a vehicle. Crimes always leave evidence, and only a small part of it, if any, may be on an encrypted device. You’re likely to find far more information in a person’s home than on their smartphone.
If law enforcement cannot find sufficient evidence of a crime after following a suspect, searching his home and workplace, speaking with everyone he called, and reviewing his financial records, then there’s generally no reason to believe that decrypting his smartphone will turn up anything different.
Encryption Does not Prevent All Access
Look at how the FBI caught Ross William Ulbricht (known as the Dread Pirate Roberts, the founder of the drug-trafficking Silk Road), despite the fact that he used full-disk encryption on his laptop. They waited for him in the library. As soon as he sat down and decrypted his computer, the FBI jumped out and grabbed him. They got full access to his laptop, entirely bypassing the encryption.
There are still open questions about how they identified him as a suspect, but the arrest is a perfect example of how law enforcement can legally arrest someone and bypass their encryption, without any need for a backdoor or a key in escrow.
Cloud Providers
We live in a connected world, and the vast majority of our sensitive digital data is synced to cloud providers. Our email and contacts may be kept by Google. Our photos may be kept by Facebook. Our text messages may be kept by Verizon. Even our location is known to at least four different companies at all times. All of these cloud providers are subject to court orders, without any need for special access to encrypted devices.
Making their Job Easier
At this point I have argued that encryption is not a new discovery, that in many cases law enforcement does not need any form of special access to bypass encryption, and that most of the data contained in an encrypted smartphone is also available elsewhere.
The real reason that law enforcement is fighting against encryption is simple: their job would be much easier if they could get around it. It’s natural for anyone to wish their job were easy, but in the case of law enforcement, this is something we cannot allow.
What law enforcement really wants is to go fishing. That is, to collect data on a grand scale and have computers automatically sort through it to identify things that may indicate a crime.
In order to protect our God-given and constitutionally-acknowledged rights, the job of law enforcement must be a difficult one. In the United States, we hold that our rights are supremely important. While giving law enforcement special access to encrypted devices would catch a few more criminals, it would not be worth sacrificing our right to privacy.
We must remember that making law enforcement an easy job is the quickest path to tyranny.
This Particular Proposal
Apple and Google have both enabled full-disk encryption on iOS and Android phones as a default setting. Apple and Google do not have direct access to these devices any longer (only to the corresponding cloud accounts where data may be backed up).
Due to this state of affairs, the report from the Manhattan District Attorney’s Office proposes federal legislation:
The federal legislation would provide in substance that any smartphone manufactured, leased, or sold in the U.S. must be able to be unlocked, or its data accessed, by the operating system designer.
There are a lot of reasons, both legal and technical, why this won’t work. I’m going to dig into all of them.
Legal Problems
Where would the authority for such a federal statute come from?
The Commerce Clause gives the federal government the authority to “regulate Commerce… among the several States,” and “with foreign Nations.” Because smartphones are part of interstate and foreign commerce, a federal statute regulating smartphones would comfortably fall within the power of Congress to regulate activities “that substantially affect interstate commerce.”
Naturally, the Commerce Clause. I am not a lawyer, but I have studied constitutional law. This is typical logic for modern politicians: it’s produced in one state and sold in multiple states, therefore the federal government can legislate anything it wants about it!
It would be hilarious if it weren’t such a common abuse. Interstate commerce is when an entity in one state trades with an entity in another state. The classic example is a semi-trailer truck crossing a state line with goods: interstate commerce begins when the goods are loaded onto the truck and ends when they are unloaded in the other state. The Commerce Clause gives the federal government the authority to regulate the manner in which goods are traded between people in one state and people in another. This means transport conditions, import tariffs, railroads, trucking lanes, and so on.
It does not give them the authority to legislate how consumers use those goods, or to mandate that the government have special access to them. Using the Commerce Clause to mandate special government access to consumer devices is clearly unconstitutional, given an educated reading of both the Commerce Clause and the Necessary and Proper Clause.
Technical Problems
The “operating system designer” would be required to decrypt the device and provide the data to law enforcement following a search warrant.
In the case of Apple and iOS, this is straightforward. iOS is a proprietary operating system written and owned by Apple, exclusively for hardware sold by Apple. Apple is clearly the “operating system designer.”
What about Android? Android is open source. It uses the Linux kernel, which is under the GPLv2 license, and the user space code is under the Apache 2.0 license. No one owns it; that’s the whole point of open source. Aside from stock Android, there are many other versions; CyanogenMod is the most common alternative, but there are several dozen. So there is no single “operating system designer” on whom law enforcement could serve a court order.
Even carriers put their own version of Android on the phones they sell. But for the sake of discussion, let’s assume that Google is somehow responsible for all Android variants.
How Encryption Works
To see the implications of this proposed legislation, it’s important to understand how full-disk encryption is implemented.
Full-disk encryption works by translating all data saved to or read from a drive. When you save data, the encryption layer first encrypts it and then writes it. When you read data back, it reads the encrypted data, decrypts it, and then gives you the decrypted version.
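Here’s a toy illustration in Python, using the third-party cryptography package. Real systems encrypt fixed-size sectors with a mode like AES-XTS, but the encrypt-on-write, decrypt-on-read principle is the same:

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

disk_key = AESGCM.generate_key(bit_length=256)

def write_block(plaintext: bytes) -> bytes:
    """Encrypt before the data ever touches the drive."""
    nonce = os.urandom(12)
    return nonce + AESGCM(disk_key).encrypt(nonce, plaintext, None)

def read_block(stored: bytes) -> bytes:
    """Decrypt on the way back out."""
    nonce, ciphertext = stored[:12], stored[12:]
    return AESGCM(disk_key).decrypt(nonce, ciphertext, None)
```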
As with any method of encryption, you need a key to feed into the encryption algorithm along with the data. The key is subject to the following properties:
- If the key is lost, the encrypted data can no longer be recovered.
- If the key is weak, the encrypted data could be accessed by guessing the key.
- If the key were to change, all the encrypted data would need to be decrypted with the old key and then re-encrypted with the new key, which is not feasible.
Full-disk encryption is generally accessed by providing a password, although it could also be something like a fingerprint. Passwords are subject to the following properties:
- They are weak (easily guessable, not random).
- They must be changeable.
Looking at these two sets of properties, it’s clear that encrypting a drive directly with a password is not ideal. To address this, full-disk encryption systems use the following scheme:
- A strong symmetric key is generated.
- All saved data will be encrypted with that symmetric key.
- The key is encrypted with the user’s password.
- The encrypted version of the key is stored in a special location on the drive.
I’ve simplified it a bit, but those are the key points. In actuality, you’d use a key derivation function like PBKDF2 to turn the password into something stronger before using it.
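Here’s a minimal sketch of that scheme in Python, again using the cryptography package. The function name and header layout are illustrative only, not any vendor’s actual implementation:

```python
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def provision(password: str):
    disk_key = AESGCM.generate_key(bit_length=256)  # strong symmetric key
    salt = os.urandom(16)
    kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32,
                     salt=salt, iterations=600_000)
    wrapping_key = kdf.derive(password.encode())    # PBKDF2: password -> key
    nonce = os.urandom(12)
    wrapped_key = AESGCM(wrapping_key).encrypt(nonce, disk_key, None)
    # Only (salt, nonce, wrapped_key) go in the drive's header;
    # the plaintext disk_key never touches the drive.
    return disk_key, (salt, nonce, wrapped_key)
```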
To turn on the device, the following steps take place:
- The user provides their password.
- The key is read from the drive and decrypted.
- The decrypted key is remembered as long as the device is on.
- When any data is read, the key is used to decrypt it on the fly.
This is why the data is only protected when the device is powered off; so long as it’s on, the decrypted key is still in memory and anyone holding the phone can access everything (assuming the screen is unlocked).
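Continuing the sketch above, here’s roughly what that boot-time unlock looks like:

```python
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def unlock(password: str, header):
    salt, nonce, wrapped_key = header
    kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32,
                     salt=salt, iterations=600_000)
    wrapping_key = kdf.derive(password.encode())
    # AESGCM raises InvalidTag here if the password (and hence the
    # derived key) is wrong.
    disk_key = AESGCM(wrapping_key).decrypt(nonce, wrapped_key, None)
    return disk_key  # held only in RAM while the device is powered on
```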
Special Access
So how would this key escrow proposal work? To comply with the proposed law, Apple and Google would need access to the key used to decrypt the phone’s drive. The key must be different for each phone; otherwise, the key that decrypts my phone would also open everyone else’s. So rather than storing one copy of the encrypted key, you store two:
- A strong symmetric key is generated.
- All saved data will be encrypted with that symmetric key.
- The key is encrypted with the user’s password.
- The key is separately encrypted with Apple’s public key.
- Both encrypted keys are stored in a special location on the drive.
In this system, a user still decrypts their phone like normal. The phone’s data is still protected against theft. But now Apple can use their private key to access the device too, much like a user using a password.
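A sketch of the second wrap, with a hypothetical vendor_public_key standing in for Apple’s key:

```python
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding

def escrow_wrap(disk_key: bytes, vendor_public_key) -> bytes:
    """Encrypt the same disk key to the vendor's RSA public key."""
    return vendor_public_key.encrypt(
        disk_key,
        padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                     algorithm=hashes.SHA256(), label=None))
# Both wrapped copies are stored in the header. The vendor's matching
# private key can recover disk_key without ever knowing the password.
```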
At first glance, this seems like a good compromise. Our devices would remain well protected; a stolen phone would still be reasonably secure. Only Apple or Google could access it, and only given both physical access to the phone and a court order.
The Problems
But there’s a big problem here. What is stopping me from removing Google’s version of the key? What if I write a bunch of zeros to that section of the drive, erasing their copy of the key? Then I would once again be the only one who could decrypt the phone.
To comply with the law, Google would have to prevent me from doing that. But how? The phone is mine, and I have control over it. They can’t stop me from erasing their copy of the key.
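To make the point concrete, here’s an illustration with an invented device path and header layout; with root access, a few lines would do it:

```python
# Illustration only: the device path, offset, and length are made up,
# not any real on-disk format.
ESCROW_OFFSET = 4096   # assumed location of the vendor's wrapped key
ESCROW_LENGTH = 256    # assumed size of that field

with open("/dev/block/crypto_header", "r+b") as header:  # needs root
    header.seek(ESCROW_OFFSET)
    header.write(b"\x00" * ESCROW_LENGTH)  # vendor's copy is gone for good
```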
The only solution to this problem is for them to take away my control. They’d have to restrict what I can do with it, and that means restricting what software I can run on it.
Apple could actually do this if they wished, because they already deny users that control. But Android’s open source license prevents Google from doing the same. Any code they add to generate and protect their copy of the key could be removed by an average user; you only have to flash the phone with a version of Android that doesn’t include that code, like CyanogenMod.
Android phones would then have to prevent you from modifying the operating system. This practice is known as tivoization: using hardware restrictions to prevent open source software from functioning in an open manner.
It’s obvious at this point that any such proposal is fundamentally incompatible with open source. It requires proprietary operating systems and locked-down devices. It removes the power of choice from consumers.
We’ve already seen what happens when consumers don’t control their own devices. I was forced to retire my previous Android phone because my carrier wasn’t releasing Stagefright patches and the locked bootloader wouldn’t let me install a different version of Android. Carrier control over devices is a huge security problem, and it’s the biggest challenge facing Android today.
With cell phone unlocking being legal and the FCC encouraging wireless providers to unlock phones, a law that effectively requires all phones and tablets to be locked down is a clear step backward.
And contrary to what the report asserts, this proposal would impose a significant burden on “operating system designers,” who would not only need to maintain an additional public key infrastructure, but also implement complex code to attempt to prevent deletion of their key.
What about Computers?
This proposal is specifically about smartphones and tablets, but there is little or no difference between a tablet and a laptop these days, with many devices that claim to be both. What if we apply this to all computers?
The same conclusion applies here. It would be illegal for hardware manufacturers or “operating system designers” to sell a computer that could be encrypted in such a way that the “operating system designer” could not recover the data.
This gets absurd quickly. Not only would we be unable to install Linux, BSD, Illumos, Plan 9, or any of the great variety of open source operating systems, but there’s also a big problem with removable drives. What prevents a criminal from attaching an external USB drive and storing encrypted data there?
Encryption is Math
There is a fundamental problem with any attempt to grant law enforcement special access to encrypted data: encryption is just math. It’s not a product or a service or a physical thing you can control. It wasn’t invented; it was discovered. You can no more legislate special access to encryption than you can legislate the value of pi.
This is why any attempt to give law enforcement special access to encrypted data is doomed to failure. Even if this proposal were made law, it would be trivial to create an app that stores one’s own encrypted data to the drive. Law enforcement could decrypt the drive, but all they would find is more encrypted data.
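Such an app would be trivial to write. Here’s a sketch using the cryptography package’s Fernet recipe:

```python
# Even after the vendor unwraps the disk key, a forensic image of the
# drive shows only this second layer of ciphertext.
from cryptography.fernet import Fernet

app_key = Fernet.generate_key()               # known only to the user
token = Fernet(app_key).encrypt(b"my private notes")
with open("notes.bin", "wb") as f:
    f.write(token)                            # what law enforcement would see
```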
If they ban all apps that do encryption, then we lose web browsers, as well as any app that connects to ‘the cloud’. Even then, users could sideload their own encryption apps, bypassing the app store.
The short answer is that it’s not possible to do what law enforcement has requested. No matter what scheme you come up with, it can be easily bypassed, even by a user with little technical knowledge (as with rooting or jailbreaking a phone). It would be just as anti-consumer and just as ineffective as DRM.
Default Settings
So what if, rather than mandating that “operating system designers” must be able to access the data, we reduce this to a best-effort approach? They must be able to access data from a device using default settings, but are not held accountable if the user has manually removed their keys or bypassed their access mechanism.
This removes most of the issues I’ve raised about the proposal, but it also removes any benefit that special access would provide to law enforcement. Anyone who felt that protecting their privacy from government access was worth the extra effort would remove the second key. It would be very similar to rooting or jailbreaking a phone.
In the end, the criminals that the government is targeting would be immune to law enforcement’s special access, which would make this proposal pretty pointless.
Conclusion
Once again, what law enforcement has asked for just isn’t possible. No amount of beating up Silicon Valley nerds is going to make it any less so.
The right answer is for law enforcement to rely on traditional methods, which have only gotten easier to employ. If they need access to an encrypted device, they can arrest the suspect while the device is decrypted. Even without the devices themselves, they still have access to our most important data, which lives with cloud providers. And they have more metadata than they could ever hope to sort through.
They already have their golden age of surveillance. It’s time they stopped asking for more.
Bio: I’m an engineer and open source advocate living near Cambridge, MA. I do a lot of freelance work and work from home most days.
My areas of focus include electronics, PCB design, microcontrollers, robotics, Linux, BSD, system administration, web design, and RF communications.
I’ve been to Antarctica twice, doing satellite communication work for NASA. I’m a licensed amateur radio operator, callsign KK4DOP.
In my spare time, I enjoy hiking, photography, and running BSD servers on computers other people throw away.