The debate over whether tech companies should be required to break encrypted Internet communications for national security reasons has heated up since the Dec. 2 attack in San Bernardino, California, and reports on Friday that suspects in the Nov. 13 Paris attacks used the encrypted platform WhatsApp to plan their deadly rampage. Presidential candidates and lawmakers on both sides of the aisle have called for new laws that force tech companies to grant law enforcement greater access to encryption, with Ohio Gov. John Kasich going so far at Tuesday night's Republican presidential debate as to name encryption the “enemy.”
But most digital security experts say that, beneath the tough talk, many politicians do not seem to appreciate the implications of what they are asking. Tech companies provide encryption for the benefit of all their users, not just those planning armed attacks. If a “key” or “backdoor” to encrypted communications were created, it would be highly valuable not only to law enforcement but also to hackers, foreign governments and even groups like ISIL.
To be sure, there is considerable evidence that ISIL supporters make use of encryption to communicate with each other. The group has even circulated documents to its followers that rank applications from most encrypted — including Telegram and WhatsApp — to least. The shooters in the San Bernardino attack, FBI officials said Wednesday, expressed their support for “jihad” in private communications online two years prior to the attack. Law enforcement officials have suggested that these red flags went unnoticed because companies like Facebook and WhatsApp refuse to break encryption for fear of compromising user privacy.
FBI Director James Comey argues that law enforcement should be able to monitor a suspect’s online communications if there is adequate evidence for a warrant, a procedure similar to the one used for a phone tap. He has endorsed the “backdoor” proposal, reportedly the subject of a bill being drafted in Congress, which would require companies to create and hold onto a secret “key” to be turned over to law enforcement only on court order. Comey has downplayed privacy concerns, saying that adding a “backdoor” is fundamentally a “business model question” for Silicon Valley.
But digital security experts strongly disagree, arguing that breaking encryption carries significant risks. As Matt Blaze, a computer scientist at the University of Pennsylvania and a leading critic of the “backdoor” approach, explained to Politico this week: “Just as the local police department might want to decrypt a phone of a criminal suspect, so would the Chinese or the Russian or the Iranian intelligence agencies like to be able to do exactly the same thing.” He added, “If it were possible to hold onto this sort of database and really be assured that only good guys get access to it, we might have a different discussion than we’re having. Unfortunately, we don’t know how to build systems that work that way. We don’t know how to do this without creating a big target and a big vulnerability.”
More generally, the “backdoor” proposal is out of step with trends in recent years, which have seen tech companies become much more sensitive to accusations of complicity in government surveillance — ever since the Edward Snowden leaks brought their past cooperation with National Security Agency data collection to light. Apple CEO Tim Cook, for example, has recently described user privacy as a human right and signaled his strong aversion to compromising encryption.
Still others question the very premise of the debate — that encryption has played a critical role in recent attacks. Comey has said that one of the gunmen in a thwarted attack on a Muhammad-drawing cartoon exhibit in Garland, Texas, in May exchanged more than 100 messages with an “overseas terrorist” that the FBI has been unable to read because they are encrypted. Other than that example, however, digital security experts argue there is relatively limited evidence of a link between encryption and terror attacks. Even in the case of Paris, the attackers also made use of unencrypted communications, including regular cell phones.
In the case of the San Bernardino shooters, Tashfeen Malik and Syed Rizwan Farook, for example, lawmakers seem to be assuming that the couple’s “private” messages could have been decrypted and flagged prior to the attack, or at least prior to Malik’s entrance into the U.S. on a fiancée visa. But it isn’t clear that the couple would have used these platforms to share their views in the first place had they not believed the platforms were secure. And as a married couple living in a house together, they also wouldn’t have needed encryption to plan the actual attack.
In that light, the debate over encryption “doesn’t seem like a tailored response to the threat” but rather “something law enforcement have been trying to pursue” for some time, said Andy Sellars, a fellow at Harvard Law School’s Berkman Center for Internet and Society and an advocate for digital rights. “Encryption is how we bank online, share medical records, private emails, etc.,” he added, so the “backdoor” lawmakers are proposing “could compromise our daily activities in a way ISIS never could.”