An interesting little development is going on in the tech/privacy world and, depending on who you believe, it may be a turning point for the better or for the worse.
After the San Bernardino shootings, the FBI seized the iPhone used by shooter Syed Rizwan Farook. The FBI has a warrant to search the phone’s contents, and because it was Farook’s work phone, the FBI also has permission from the shooter’s employer, the San Bernardino County Department of Public Health, to search the device. Legally, the FBI can and should search this phone. That’s not up for debate. If the FBI gets a warrant to search a house and the people who own it say okay, there’s no ambiguity about whether it can search the house.
But if the FBI comes across a safe in that house, the warrant and permission do not mean it can force the company that manufactures the safe to create a special tool for opening its safes, especially a tool that would make other safes completely useless as secure storage. That’s the situation that Apple’s dealing with here.
The FBI obtained an order from a California district court on Tuesday requiring Apple to provide “reasonable technical assistance” in cracking Farook’s passcode. The court order doesn’t flat-out demand that Apple unlock the phone, which is an iPhone 5C running iOS 9. Instead, the judge is asking Apple to create a new, custom, terrorist-phone-specific version of its iOS software to help the FBI unlock the phone. Security researcher Dan Guido has a great analysis of why it is technically possible for Apple to comply and create this software. (It would not be if Farook had used an iPhone 6, because Apple built a special security protection called the Secure Enclave into its newer phones that cannot be manipulated by customizing iOS.)
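To make the technical stakes concrete: on these devices the encryption key is derived by “tangling” the passcode with a unique hardware key (the UID) that never leaves the chip, so every guess has to run on the phone itself. Below is a loose, illustrative Python sketch of that idea; the names, salt handling, and iteration count are placeholders of mine, not Apple’s actual implementation, which runs inside the hardware AES engine.

```python
import hashlib
import os

# Illustrative stand-in: on a real device the UID is fused into the silicon
# and can never be read out by software, only used by the AES engine.
DEVICE_UID = os.urandom(32)

def derive_unlock_key(passcode: str, iterations: int = 50_000) -> bytes:
    """Sketch of passcode 'tangling': the derived key depends on both the
    passcode and the device-unique UID, so guessing can't be farmed out to
    faster hardware. The iteration count sets the per-guess cost (calibrated
    to roughly 80ms on the actual phone)."""
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), DEVICE_UID, iterations)

# Every candidate passcode must go through this derivation on the phone itself;
# the FBI can't copy the flash storage and crack it offline.
print(derive_unlock_key("1234").hex())
```

That on-device requirement is exactly why the FBI needs Apple-signed firmware rather than just a copy of the data.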
Apple quickly said it would fight the judge’s order. Chief executive Tim Cook called it “an unprecedented step which threatens the security of our customers,” and said the order “has implications far beyond the legal case at hand.” He published a message emphasizing that the company can’t build a backdoor for one iPhone without screwing over security for the rest:
In today’s digital world, the “key” to an encrypted system is a piece of information that unlocks the data, and it is only as secure as the protections around it. Once the information is known, or a way to bypass the code is revealed, the encryption can be defeated by anyone with that knowledge.
The government suggests this tool could only be used once, on one phone. But that’s simply not true. Once created, the technique could be used over and over again, on any number of devices. In the physical world, it would be the equivalent of a master key, capable of opening hundreds of millions of locks — from restaurants and banks to stores and homes. No reasonable person would find that acceptable.
Apple, Google and other technology firms in recent years have stepped up encryption — allowing only the customers to have “keys” to unlock their devices — claiming improved security and privacy is needed to maintain confidence in the digital world.
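Cook’s “master key” point is easy to demonstrate with any off-the-shelf symmetric cipher: whoever holds the key decrypts everything, and the math makes no distinction between a warrant and a thief. Here is a minimal sketch using the Fernet recipe from the widely used Python cryptography package (the sample plaintext is invented for illustration):

```python
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # the entire secret lives in this one value
token = Fernet(key).encrypt(b"personal messages, photos, location history")

# Anyone holding the key decrypts without restriction. The ciphertext cannot
# tell a warrant-bearing agent apart from an attacker who stole the key.
print(Fernet(key).decrypt(token))
```

Once that key, or a tool that bypasses it, exists anywhere, the security of every lock it fits depends on it never leaking.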
This has sparked a national discussion on weighing security against privacy. Not a new debate; we have been having it since 9/11. But this one involves our smartphones, so everyone seems to have a strong opinion. Republican candidates are coming down on the side of national security, in a way that sits somewhat at odds with the anti-big-government stance they often take. Again, nothing new there.
Let’s see if we can’t shake out this tree a little.
First off, here is the actual order. Magistrate Judge Sheri Pym, a former federal prosecutor, relied on the All Writs Act, passed in 1789 (one of the first federal laws ever). It has been used many times in the past by the government to require a third party to aid law enforcement in its investigation.
The order would require Apple to create custom firmware, to be loaded onto this specific phone, that makes brute-force passcode guessing possible. (Among other things, it would disable the auto-erase function that wipes the device after too many failed attempts, allow passcodes to be submitted electronically rather than tapped in by hand, and remove the escalating delays between guesses.)
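To see why those protections matter, some back-of-the-envelope arithmetic helps. The one cost the custom firmware cannot remove is the key-derivation time itself, roughly 80ms per guess on this hardware according to Guido’s analysis. The sketch below (plain Python, no Apple specifics) shows the worst case once the guess limit and delays are gone:

```python
# With auto-erase and escalating delays disabled, only the key-derivation
# cost bounds the search (~80ms per guess on the 5C, per Guido's analysis).
ATTEMPT_COST = 0.080  # seconds per guess

for digits in (4, 6):
    keyspace = 10 ** digits
    hours = keyspace * ATTEMPT_COST / 3600
    print(f"{digits}-digit passcode: {keyspace:,} guesses, worst case {hours:.1f} hours")

# 4-digit passcode: 10,000 guesses, worst case 0.2 hours
# 6-digit passcode: 1,000,000 guesses, worst case 22.2 hours
```

A simple numeric passcode becomes at most a day’s work, which is precisely why the ten-failed-attempts wipe exists; a long alphanumeric passphrase would still be out of reach.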
The significant thing about this case is that the FBI, absent any enabling legislation, has gone and found itself a judge to order a company to do something.
Think about that: ordering a company to do something. That is arguably new in the current FBI approach.
The Apple case is remarkable in that it couches what the court views as “reasonable assistance” as, in essence, breaking your own products. Apple has quite rightly made the point that not only would this undermine the company’s security, and therefore its customers’ privacy, but that if it creates an exploit for the FBI, the vulnerability will end up being used by the likes of Putin and various repressive regimes.
Facebook, Twitter and Google have all voiced support for Apple’s fight against a court order that Apple says would make iPhones less secure, and it is not hard to understand why: they simply cannot run a global business if they are seen to be doing too many special favors for one government, the United States.
But is this really about privacy? Do we as individuals really care about these things? Let’s face it: we are now just little motors chuntering around, creating metadata exhaust trails. The current conflict is not an argument about our privacy rights, since we seem content to leave ourselves all over the place (Facebook, Twitter, etc.). Rather, this might be a fight between governments and firms over how best to pin us down and hoover up the effluent we leave behind. You can see why they might all be getting testy about who gets what.
So I tend to think this is less about Apple preserving the privacy of iPhone owners, and more about it not wanting to be seen in international quarters as subservient to the American government. What will happen to the foreign markets of Google and Facebook and Apple and Android if it is widely believed that one American judge can order these giant companies to invade one person’s privacy?
This is about the Benjamins just as much as about the privacy rights of people.