For months, the FBI, the National Security Agency and an alphabet soup of other spooky agencies have been lashing out at tech companies that have responded to former NSA contractor Edward Snowden’s surveillance revelations by starting to protect customers with stronger encryption. But it’s increasingly obvious that the government’s crypto panic is powered by fear, not facts.
Last week at the RSA security conference in San Francisco, Homeland Security Secretary Jeh Johnson begged Silicon Valley companies to give the government access to encrypted communications, asking the crowd to “imagine the problems if well after the advent of the telephone, the warrant authority of the government to investigate crime had extended only to the U.S. mail.”
“Imagine an America where federal, state and municipal law enforcement agencies cannot access critical communications, even when legally authorized to do so,” begins a recent Wall Street Journal blog post written by Amy Hess, the FBI’s executive assistant director. “Imagine the injustice if a suspected criminal can hide incriminating communications without fear of discovery by the police or if information that could exonerate an innocent party is inaccessible.”
The reason the FBI, Homeland Security and other agencies want us to imagine these frightening scenarios is that their encryption problem is just that: imaginary. It’s built on the false premise that making encryption more accessible will allow criminals to shield themselves from the law. The only solution, the government says, is for companies to put backdoors into their devices and apps, which by definition means installing defects that make our data more vulnerable to criminals and spies.
One need look only at what law enforcement agencies are doing in secret to see that these predictions of digital anarchy are pure fantasy.
Earlier this month, Motherboard reporter Lorenzo Franceschi-Bicchierai discovered that the Drug Enforcement Administration has been buying hacking tools from an Italian company, Hacking Team, through a shell company based in Maryland. The software, Remote Control System, is a remote host-based interception suite that allows police to infect devices, steal passwords, intercept Skype calls and even monitor targets in real time through their webcams. Researchers have discovered that it, along with a competing product called FinFisher, has been used to spy on journalists and activists in Morocco, Ethiopia, the United Arab Emirates and other countries with notoriously poor human rights records.
Here’s how Hacking Team advertises the software (emphasis added):
You cannot stop your targets from moving. How can you keep chasing them? What you need is a way to bypass encryption, collect relevant data out of any device and keep monitoring your targets wherever they are, even outside your monitoring domain. Remote Control System does exactly that.
These kinds of tools aren’t new, but their recent prevalence as commercial products underscores how government agencies are increasingly utilizing hacker techniques. The FBI has been in the hacking business for more than a decade, and it recently won new powers to hack computers even when their user and location are unknown. This despite the fact that in 2013, a judge in Texas rejected an FBI request to send spyware to an unknown suspect’s computer, saying the agency offered “little more than vague assurances” that it wouldn’t intrude on innocents in the process.
From a practical standpoint, these tactics make sense. Encryption protects data using impossibly complicated math, and it’s infinitely easier to solve complicated math problems by stealing the answers than by cracking the code. The strongest encryption in the world won’t save you if someone can get inside your computer and steal your encryption keys, and products such as Remote Control System and FinFisher are giving those capabilities to police and governments around the globe.
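To make that arithmetic concrete, here's a short Python sketch. The trillion-guesses-per-second rate and the 128-bit key size are assumed figures for illustration, not numbers from any specific case: even under those generous assumptions, exhaustively searching a modern keyspace takes longer than the age of the universe, while a stolen key works instantly.

```python
# Back-of-the-envelope: why attackers steal keys instead of cracking them.
# Assumes a hypothetical attacker testing one trillion keys per second.
KEY_BITS = 128
GUESSES_PER_SECOND = 10**12
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

keyspace = 2**KEY_BITS  # total possible 128-bit keys
# On average an attacker finds the key after searching half the keyspace.
years_to_search_half = keyspace / 2 / GUESSES_PER_SECOND / SECONDS_PER_YEAR

print(f"Average brute-force time: {years_to_search_half:.2e} years")
```

That works out to roughly 10^18 years, which is why spyware that simply lifts the key from an infected device is so much more attractive to investigators than any attack on the math itself.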
It might also explain why U.S. agencies are still unable to show a single case in which encryption has crippled a criminal investigation. According to annual reports presented to Congress since 1997, encryption wasn’t an obstacle to government wiretaps even once until 2012. Of the 3,576 wiretaps authorized in 2013, the government was bested by encryption in only nine cases. None of those cases involved terrorists, kidnappers or any of the other cyberbogeymen the FBI keeps warning about, and there’s no indication that encryption alone prevented any crimes from being solved.
So either government agencies are being incredibly modest or they’re simply hiding the fact that encryption isn’t a real problem because they already have the means to circumvent it.
Of course, giving police hacking powers presents a whole new set of problems. When should they be allowed to break into someone’s computer? How would a judge ensure that they’re hacking the right device and that innocent bystanders won’t be affected? How long should a police or government agency be allowed to exploit a commercial software vulnerability for hacking purposes?
Hacking isn’t the only way police can get access to encrypted communications. In most cases, a court will simply compel a suspect to surrender their passwords or encryption keys. And the four-digit PIN that protects your iPhone or Android can be easily cracked in a matter of days.
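The PIN claim is easy to see: a four-digit PIN has only 10,000 possible values. The sketch below is illustrative (the PIN, and hashing with plain SHA-256, are assumptions for the example); real phones deliberately slow each guess down with key stretching and retry limits, which is why cracking takes days rather than milliseconds.

```python
import hashlib

# A four-digit PIN offers only 10,000 possibilities (0000-9999).
# Hypothetical captured hash of the PIN "7291"; real devices use slow
# key-derivation functions and retry limits, not bare SHA-256.
stored = hashlib.sha256(b"7291").hexdigest()

def crack_pin(target_hash):
    """Try every four-digit PIN until one matches the stored hash."""
    for n in range(10_000):
        guess = f"{n:04d}".encode()
        if hashlib.sha256(guess).hexdigest() == target_hash:
            return guess.decode()
    return None

print(crack_pin(stored))  # recovers the PIN almost instantly
```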
When it comes to encrypted messaging apps such as WhatsApp and Signal, another option for the government is to force companies to send a fake key to the target. Even though the companies can’t read their users’ messages, they still control the system that distributes the keys needed to encrypt them. That means the FBI could compel WhatsApp to send a suspect an FBI key instead of an intended recipient’s, allowing agents to decrypt the message.
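The key-substitution scenario can be modeled in a few lines. This is a toy sketch, not WhatsApp's actual architecture: "encryption" here is just a label, and the key names are invented. The point it illustrates is that whoever runs the key-distribution server decides which key the sender encrypts to.

```python
# Toy model of compelled key substitution. The names (bob_pubkey,
# fbi_pubkey) are hypothetical; "encryption" is simulated as labeling.
legit_keys = {"alice": "alice_pubkey", "bob": "bob_pubkey"}

def key_server(recipient, compelled_swap=None):
    # A compelled provider can silently hand out a different key.
    if compelled_swap and recipient in compelled_swap:
        return compelled_swap[recipient]
    return legit_keys[recipient]

def send(sender, recipient, plaintext, compelled_swap=None):
    key = key_server(recipient, compelled_swap)
    return {"to": recipient, "encrypted_with": key, "body": plaintext}

# Normal case: only Bob's private key can read the message.
msg = send("alice", "bob", "hi")
# Compelled case: Alice believes she encrypted to Bob, but the message
# is readable by whoever holds the matching private key.
intercepted = send("alice", "bob", "hi", compelled_swap={"bob": "fbi_pubkey"})
print(msg["encrypted_with"], intercepted["encrypted_with"])
```

Defenses against exactly this attack exist, such as apps letting users verify key fingerprints out of band, but they only work if users actually check.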
These aren’t perfect solutions, but their targeted nature undoubtedly makes them better options than forcing tech companies to build backdoors for police. Security experts have warned again and again that you can’t create “golden keys” for the FBI that will be safe from Chinese hackers and Russian credit card thieves — a backdoor for one can be found and exploited by all.
The FBI keeps plugging its ears and saying there’s a way to make backdoors work. But so far, its only ideas are fantasies. Take the split key escrow system, in which a “trusted third party” such as the FBI holds a portion of the keys needed to decrypt data. Cryptographers rejected this concept nearly two decades ago.
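The math behind splitting a key is genuinely simple; a minimal 2-of-2 split can be done with XOR, as the sketch below shows (this is a generic textbook construction, not any specific government proposal). Cryptographers' objection was never to this arithmetic but to the escrow infrastructure around it: databases of shares that must be stored, guarded and queried at scale, each one a target.

```python
import secrets

# Minimal 2-of-2 split-key escrow via XOR secret sharing.
key = secrets.token_bytes(16)       # the device's real 128-bit key
share_a = secrets.token_bytes(16)   # random share held by escrow agent A
# Agent B's share is chosen so the two shares XOR back to the key.
share_b = bytes(k ^ a for k, a in zip(key, share_a))

# Either share alone is statistically random and reveals nothing;
# both together recover the key exactly.
recombined = bytes(a ^ b for a, b in zip(share_a, share_b))
assert recombined == key
```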
Testifying before Congress on Wednesday, Matt Blaze, the cryptographer who famously discovered flaws in the NSA’s proposed Clipper Chip key escrow system, said:
Harsh technical realities make such an ideal solution effectively impossible, and attempts to mandate one would do enormous harm to the security and reliability of our nation’s infrastructure, the future of our innovation economy and our national security.
Amazingly, when Rep. Blake Farenthold, R-Texas, asked the panel of experts at the hearing whether anyone thought it was possible to build secure crypto backdoors, no one — including the FBI’s own expert witness — raised their hand. That law enforcement groups continue to ignore this broad consensus proves that their position relies on scaremongering and distortions.
These agencies need to accept that they can’t have their cake and eat it too. Criminals have always taken steps to avoid being caught, and if we’ve learned anything from the FBI’s takedown of the online drug bazaar Silk Road, it’s that even the strongest encryption and anonymity tools can’t stop people from making mistakes.
In asking for backdoors, the government is simply trying to double down on surveillance powers while putting the security of law-abiding citizens at risk — and inviting other countries to come knocking for golden keys of their own.