By now, you probably know that in mid-February, the FBI asked Apple to create software that could bypass encryption on an iPhone relevant to the San Bernardino mass shooting. Apple wrote a letter to customers explaining why it would not. What you might not know is why this matters to you — and to everyone. I talked to some people who could shed some light: Avi Rubin and Matthew D. Green, professors at Johns Hopkins University; Cindy Cohn of the Electronic Frontier Foundation; and Christopher Soghoian, a TED Fellow and lawyer at the American Civil Liberties Union. I’ve synthesized their views here.
1. Apple doesn’t want you to trust it. And that’s good.
This sounds counterintuitive, but it’s the very foundation of the company’s argument. “Apple has developed a really unique threat model,” says Matthew Green from Johns Hopkins. “A threat model details what attackers you want to protect against, and that will inform how you are going to deal with them,” he explains. By including itself in its own threat model, Green says, Apple limits its own ability to bypass fundamental security measures. And that’s by design.
“It’s completely appropriate that Apple has created technology that takes it out of the equation,” says Cindy Cohn of EFF. In other words, Apple has realized that if it inserts itself as a third party into your digital life, it will have access to many personal, private details. And so, as a responsible technology company, it devised a way to avoid ever revealing that information. What the FBI is essentially asking is that the company undo this work and insert itself into everyone’s lives.
2. Breaking the San Bernardino phone means more phones will be broken.
If Apple were to create a version of iOS that would allow this one iPhone to be broken, that same version could and likely would be used to break other phones. “Apple would have to digitally sign the software,” says ACLU’s Christopher Soghoian. “This would allow it to be installed on other devices,” he says — and at that point there is really no way to know how and where it might be used.
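To see why a signature travels beyond one phone, here’s a minimal Python sketch of the kind of check a device’s boot loader performs, using the third-party cryptography package as a stand-in. The key names and image contents are invented, and Apple’s real signing scheme is more involved; the point is that a signature made over the software alone says nothing about which device it may run on.

```python
# Illustrative only: why a signed OS image isn't tied to one phone unless
# the signed data includes a device identifier. The `cryptography` package
# stands in for Apple's actual signing scheme, which differs in detail.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

apple_key = Ed25519PrivateKey.generate()   # stands in for Apple's signing key
boot_rom_key = apple_key.public_key()      # baked into every device at the factory

os_image = b"hypothetical iOS build with security limits removed"
signature = apple_key.sign(os_image)       # signed once, by Apple

def device_accepts(image: bytes, sig: bytes) -> bool:
    """What each device checks: only that Apple signed the image.
    Nothing here names a particular phone, so every device says yes."""
    try:
        boot_rom_key.verify(sig, image)
        return True
    except InvalidSignature:
        return False

print(device_accepts(os_image, signature))  # True, on any device that trusts the key
```

Binding the image to a single phone would require something device-specific inside the signed data itself; whether and how that kind of safeguard holds up is exactly what this dispute turns on.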
“Technology doesn’t know the consequences of how it will be used,” agrees EFF’s Cohn. Once such software is created, she suggests, it could be used by anyone for illegal or nefarious purposes.
Already, the FBI has said it has at least 12 other phones it wants to unlock, and there are perhaps 175 or more iPhones in the possession of other law enforcement agencies that could be broken with compromised software.
“Right now there is a lot of friction to overcome, in the event that the FBI wants to break a device,” says Green. “But if that friction is lowered, and compromised software exists, then there is no way to know how it might be used in other settings.” Apple, meanwhile, argues that this friction is precisely what protects users from abuse, by governments and malicious actors alike.
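A concrete way to picture that friction: the escalating delays and the wipe-after-ten-failures rule that stand between a guesser and the passcode. The sketch below uses made-up delay values, not Apple’s actual schedule; removing safeguards like these is effectively what the FBI’s request amounts to.

```python
import time

class PasscodeGate:
    """Toy model of passcode-guessing friction. The delay schedule is
    invented for illustration; iOS's real safeguards differ in detail."""
    DELAYS = {5: 60, 6: 300, 7: 900, 8: 3600}  # seconds of cool-down after Nth failure
    WIPE_AFTER = 10                             # failures before keys are destroyed

    def __init__(self, passcode: str):
        self._passcode = passcode
        self._failures = 0
        self._wiped = False

    def try_unlock(self, guess: str) -> bool:
        if self._wiped:
            raise RuntimeError("device wiped: encryption keys destroyed")
        if guess == self._passcode:
            self._failures = 0
            return True
        self._failures += 1
        if self._failures >= self.WIPE_AFTER:
            self._wiped = True  # data is now unrecoverable
        else:
            time.sleep(self.DELAYS.get(self._failures, 0))
        return False
```

A four-digit passcode has only 10,000 possibilities. With the wipe rule, an attacker gets ten tries; without it and without the delays, a script exhausts every combination in minutes.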
3. No company should ever be compelled to produce compromised code.
Imagine you had 500 pounds of shredded paper waste that could be used as evidence in a criminal case. Legally, you’d have to hand over the shreds, but you could never be asked to re-create the original documents. As Cindy Cohn puts it: “If you have evidence in your possession, then you have to turn it over. But you do not ever have to manufacture evidence.”
Meanwhile, America’s First Amendment protects people from “forced speech.” “A person cannot be compelled to have ‘Live Free or Die’ on their license plate, or be compelled to say the Pledge of Allegiance in school,” says Cohn. Similarly, because code is a kind of speech (as determined by the Bernstein v. US Department of Justice case argued by the EFF in the mid-1990s), Apple cannot be compelled to produce computer code it does not wish to create.
4. If the All Writs Act of 1789 applies here, it applies everywhere.
“The legal precedent is much scarier than the technical precedent,” says Cohn of this battle, with the government citing the All Writs Act of 1789 and trying to use it to force Apple to attack its own security. As Neil Richards and Woodrow Hartzog wrote recently in The Guardian, “this little law is a piece of Swiss Army knife legislation that the FBI is trying to turn into a giant sword, out of all proportion to what it is supposed to do.” And if it’s successful, the same approach could be used against any company to attack its security. “It could be used to compromise the Internet of Things, cars, or devices like the Nest thermostat,” Cohn says. “Imagine if the government wanted to stop your car or turn off all the lights in your house. That’s a real possibility if we allow this precedent.”
5. Law enforcement has access to plenty of data. This case is a red herring.
Both Green and Rubin agree that there is a growing volume of data available to law enforcement — much more than even just a few years ago. And they argue that the sensational San Bernardino case is being used to try to gain access to still more. “Law enforcement already has access to encrypted ‘data in motion,’ services like iMessage and such, but now they want access to encrypted ‘data at rest,’ which is data on phones,” says Green. What does this mean? That this is really an attempt both to expand the power of law enforcement and to leverage public opinion.
“I’m an impact litigator, and I know [a case selected for its potential broad impact] when I see one,” says Cohn of the FBI’s decision to focus on this one case. After all, who wouldn’t want to help solve a high-profile terrorism case? Yet we shouldn’t be distracted. “There are already many constitutional limitations on evidence available to law enforcement,” she adds. “We have attorney-client privilege, husband-wife privilege. Sometimes the right answer is ‘no.’”
“It seems to me that the FBI is attempting to ride public sentiment and the opportunity that the media exposure of this situation has created to set a precedent where Apple is forced to comply with law enforcement requests that undermine the privacy protections that Apple has made it a point to provide to their customers,” says Rubin. “They have so much information from other sources, and it is not clear just what they are looking for on this phone that they can’t get from the phone carriers and the computers and cloud accounts.”
6. American competitiveness is at risk if the FBI wins.
In February 2016, Pew Research conducted a study showing that 51% of Americans approve of the FBI’s stance in this case. Nonetheless, the idea that US law enforcement could put the security and privacy of every iPhone user worldwide at risk raises serious questions. For instance: if the FBI wins, would Apple even remain a US-based company? Would it develop a global encryption strategy that sidesteps US regulation altogether?
It’s happened before. “So-called ‘crypto-with-a-hole’ strategies were developed in the 1990s to help [American-based multinational companies] bypass US crypto export restrictions,” says Cohn. At the time, US export rules barred companies from shipping strong encryption abroad, so some simply built their products with a “hole” where the encryption should go, then added the crypto in another country with different laws. She says: “There’s no reason such strategies could not be used again.”
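In code terms, a crypto-with-a-hole design is just a product built against an encryption interface it doesn’t ship. Here’s a rough Python sketch with invented names and a deliberately toy cipher; no real product would be this simple.

```python
class Cipher:
    """The 'hole': the interface the product is built against,
    with no actual encryption shipped from the US."""
    def encrypt(self, plaintext: bytes) -> bytes:
        raise NotImplementedError("crypto module not included")

class Product:
    """Shipped worldwide without crypto, so export rules don't apply."""
    def __init__(self, cipher: Cipher):
        self.cipher = cipher  # slot filled later, in another jurisdiction

    def send(self, message: bytes) -> bytes:
        return self.cipher.encrypt(message)

# Built and distributed from a country with different laws:
class XorCipher(Cipher):
    """Toy stand-in for a real cipher module plugged into the hole."""
    def __init__(self, key: int):
        self.key = key

    def encrypt(self, plaintext: bytes) -> bytes:
        return bytes(b ^ self.key for b in plaintext)

app = Product(XorCipher(key=0x5A))
print(app.send(b"hello"))  # encrypted without any US-origin crypto
```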
Some have said application-level solutions like encrypted messaging apps are partially an answer, going some way toward protecting the user even if the operating system is vulnerable. But Rubin maintains that operating system security is critical. “The OS protects the keys, and the applications have to bootstrap off of something secure,” he says.
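Here’s a sketch of what that bootstrapping looks like, in Python. The function unwrap_app_key is a hypothetical stand-in for platform keystore services such as the iOS Keychain; the point is that the app’s key never exists independently of the OS that releases it.

```python
import hashlib
import os

# In reality this secret lives inside the OS and secure hardware,
# never exposed to applications directly.
OS_MASTER_SECRET = os.urandom(32)

def unwrap_app_key(app_id: str) -> bytes:
    """Hypothetical OS keystore call. An honest OS releases the key only
    under the right conditions; a compromised OS runs this same code for
    whoever asks, and every app built on it falls with it."""
    return hashlib.sha256(OS_MASTER_SECRET + app_id.encode()).digest()

# The messaging app's "independent" encryption still starts here:
app_key = unwrap_app_key("com.example.securemessenger")
```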
But the reality is that whatever happens in this case, many companies are likely already working on products that offer stronger security and privacy protections. Apple itself is reportedly working on an iPhone that would be even harder to break, potentially encrypting parts of the operating system and rendering the approach the FBI is currently proposing impossible. That means this is by no means the end of anything; rather, it’s just another salvo in a long-running technological battle.