Last month Apple and Google announced improved encryption schemes for their mobile devices. Once in place, no one else, including the companies themselves, will be able to extract locally stored data without the user’s passcode, even with physical access to the device. This was great news for users worried about the troves of intimate data stored on their devices. But for the FBI, the move became a flashpoint in a decades-old campaign of fearmongering aimed at convincing companies to leave their products intentionally flawed and unprotected.
The FBI and its allies in the Department of Justice and other law enforcement agencies have accused tech companies of building roadblocks that hamper police investigations, warning of a world where encryption allows terrorists, kidnappers and other horsemen of the infopocalypse to run wild. During a speech at the Brookings Institution on Oct. 17, FBI Director James Comey suggested that if companies such as Apple won’t build back door access into their products for law enforcement, the government should force them to.
“If the challenges of real-time interception threaten to leave us in the dark, encryption threatens to lead all of us to a very dark place,” he said. “Are we so mistrustful of government and of law enforcement that we are willing to let bad guys walk away?” In an even more hysterical statement, John J. Escalante, chief of detectives for the Chicago Police Department, declared that the new iPhone “will be the phone of choice for the pedophile.”
But Apple and Google’s improved encryption has nothing to do with mistrusting the government or letting bad guys get away. It has to do with a choice between building secure technology and building defective technology that leaves users vulnerable in order to help authorities combat a problem that doesn’t actually exist.
The fallacy of going dark
The FBI’s dire warnings should evoke a strong sense of déjà vu. U.S. law enforcement has been crying wolf about “going dark” for decades, asserting that strong, widespread encryption will cripple its ability to catch criminals. In many ways, it’s a repeat of the crypto wars of the 1990s, when the U.S. government attempted to introduce something called the Clipper Chip, NSA-designed hardware that would have effectively created a federal skeleton key for any device with the chip installed.
What the government wanted was what industry observers call “nobody but us,” a security model based on universal access keys and pathways known only to the “good guys.” But apart from the problem of who the good guys are and whether we can trust them, forcing companies to introduce secret doors means the same doors exist for any sophisticated criminal or foreign government to find and exploit.
Security experts agree that over time, the chance that even a well-hidden back door will be found by one or more adversaries approaches certainty. There’s nothing hypothetical about it: Recent history is full of prominent examples of so-called lawful intercept systems being subverted by outside attackers. During a high-profile incident in 2004 known as the Athens affair, an intruder — widely believed to be the United States — exploited the interception capability of a Greek telecommunications network to illegally wiretap more than 100 phones, including those belonging to high-level government officials. A similar scandal occurred in 2010, when Google revealed that some of its systems had been penetrated by Chinese hackers via an interception system the U.S. government had forced the company to implement to comply with surveillance orders. Cryptography guru Bruce Schneier explained it best, writing for CNN, “Encryption protects against cybercriminals, industrial competitors, the Chinese secret police and the FBI. You’re either vulnerable to eavesdropping by any of them, or you’re secure from eavesdropping from all of them.”
If anything, the FBI’s characterization of encryption as a golden ticket for criminals is more laughably hyperbolic today than it was 20 years ago. Through NSA whistleblower Edward Snowden and others, we now know that police and the U.S. government have countless ways of accessing our private data without a warrant — from using Stingray cellphone interceptors, which capture data from thousands of nearby devices, to the process of parallel construction, which allows law enforcement agencies to use data on American citizens “incidentally” collected by the NSA’s foreign dragnets as evidence while obscuring its true source. And that says nothing of PRISM or other top-secret intelligence programs that serve the NSA’s collect-it-all mission. The FBI’s argument that encrypted phones swing the pendulum too far toward privacy when Congress still hasn’t passed a single law curbing mass surveillance is at best tone-deaf and at worst disingenuous.
In fact, annual reports from the Administrative Office of the U.S. Courts show that the number of cases in which encryption thwarted cops is still minuscule. Of the thousands of wiretaps authorized each year (the vast majority of which are drug-related, not the scary terrorism and kidnapping scenarios U.S. officials have put forth), encryption stymied an investigation only nine times in 2013 and four times in 2012. Before that, the number was zero every year going back to 2000, when the data started being recorded. That’s because in most cases, encryption is circumvented rather than broken; the most common reason encryption fails is that it’s poorly implemented in the software that uses it, not that the underlying algorithm is flawed.
These programming errors are common, and the FBI knows it. With Silk Road, the anonymous online black market the FBI shut down last year, a misconfigured login screen apparently allowed agents to pinpoint the physical location of the site’s servers despite the layers of protective encryption used by the anonymity-providing Tor network, though that account remains controversial. In the case of the iPhone, there are products available on the market right now from companies such as Cellebrite and Guidance Software designed to exploit flaws and harvest data from phones. When those don’t work, PIN-cracking robots programmed to punch in every possible passcode combination can gain access to a device in under 24 hours. And then there’s still all the data uploaded to iCloud and other third-party services, which doesn’t have the same warrant protections as data stored locally on your phone. In other words, obtaining data with a warrant is still very much possible; the cops just can’t rely on Apple anymore as their one-stop phone-unlocking service.
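The math behind those PIN-cracking robots is simple: a four-digit numeric passcode has only 10,000 possible values, so a rig that can keep submitting guesses (by resetting or sidestepping the retry counter) will exhaust them within a day. The toy sketch below illustrates the idea; the device model, the secret value and the timing figures are hypothetical stand-ins, not a real attack on iOS.

```python
import itertools
from typing import Optional

# Toy model of a locked device. In reality the check happens inside the
# phone's hardware, and the robot physically taps in each guess.
SECRET_PIN = "4821"  # hypothetical example value

def device_accepts(pin: str) -> bool:
    return pin == SECRET_PIN

def brute_force_four_digit() -> Optional[str]:
    # A four-digit numeric passcode has exactly 10 ** 4 = 10,000 candidates.
    for digits in itertools.product("0123456789", repeat=4):
        pin = "".join(digits)
        if device_accepts(pin):
            return pin
    return None

print(brute_force_four_digit())  # -> "4821"
# Even at a sluggish ~8 seconds per physical attempt, 10,000 guesses take
# roughly 22 hours in the worst case, which is why the robots finish within
# a day, and why long alphanumeric passcodes resist this kind of attack.
```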
So what is the FBI complaining about, exactly?
The way officials describe it, you’d think Apple is thumbing its nose at law enforcement by removing back door access it had granted. In reality, the company is simply fixing security flaws that allowed data to be read without a device’s passcode. Under previous versions of iOS, only certain classes of stored data were encrypted with a key derived by combining a secret device key with the user’s passcode; others, such as photos, contacts and text messages, were keyed to the device alone and could be extracted without the passcode. As Cato researcher Julian Sanchez put it for Just Security, “The FBI is not really complaining that Apple has closed a back door, because there never was a back door in the conventional sense. Rather, they’re complaining that Apple used to have badly designed security and now they’ve fixed it.”
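For a sense of what combining a device secret with the user’s passcode means in practice, here is a schematic sketch. It is not Apple’s implementation: the names, the iteration count and the use of PBKDF2 are illustrative assumptions, and on a real iPhone the entanglement happens inside dedicated hardware that never reveals the device key.

```python
import hashlib
import os

# Illustrative stand-in for the per-device hardware secret; on a real phone
# this key is fused into the silicon and never leaves it.
DEVICE_UID_KEY = os.urandom(32)

def derive_class_key(passcode: str) -> bytes:
    """Entangle the user's passcode with the device-unique secret using a
    deliberately slow key-derivation function, so guesses can't be tried
    quickly, and can't be tried at all off the device."""
    return hashlib.pbkdf2_hmac(
        "sha256",            # underlying hash
        passcode.encode(),   # what the user knows
        DEVICE_UID_KEY,      # what only this physical device holds (the salt)
        200_000,             # illustrative iteration count to slow guessing
        32,                  # 256-bit output key
    )

example_key = derive_class_key("4821")  # hypothetical passcode
# Data wrapped with a key like this is unreadable without both the passcode
# and the physical device. Classes keyed only to the device secret, as some
# were before iOS 8, could be handed over without the passcode.
```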
This kind of mentality would be unthinkable outside the digital realm. Imagine the government saying the locks on your safe or front door need to be defective just so it’s more convenient for cops to get inside if they ever need to. That is what the FBI wants to be able to do with your phone.
Fortunately, there seems to be little appetite for Comey’s proposal on Capitol Hill. Internally, officials in Barack Obama’s administration have even admitted that picking a fight with the likes of Apple and Google is a bad idea. But we need to make clear that a world where catching bad guys trumps digital safety is not one we want to live in.