Cybersecurity

Demystifying Apple’s FAQ – A Rebuttal

By Blair Reeves
Monday, February 29, 2016, 12:24 PM

To say that Apple has great marketing would be quite an understatement. Apple is widely recognized as having one of the most effective marketing and PR operations in their—or perhaps any—industry. Stories of people lining up to buy virtually any new Apple product, sight unseen, for hundreds of dollars are now utterly routine. Loyalty to Apple’s brand is nearly unmatched, and accordingly it ranks as the most valuable in the world.

Apple tapped that reservoir of adulation as it took its case against the FBI public, publishing a “Message to our Customers” on the front page of its website. This letter effectively—and conveniently—frames the issue as a simple one of encryption and democracy, and accuses the FBI of attempting to undermine the security and safety of millions of Americans and Apple customers around the world. In its anti-authoritarian posturing and celebration of encryption, Apple touched on tribal markers for the tech community—and Silicon Valley took notice. From Google to Facebook, and from Box to Twitter, the tech industry’s leading voices weighed in to support Tim Cook, as did much of the industry rank-and-file. The tech press and its pundits praised Cook’s clarity and resolve and Apple’s brave stand against unwarranted government intrusion. Such is the prize of framing the debate before it has even started.

Predictably, the debate has begun to resolve into a sadly familiar pattern: techies versus the government and law enforcement, with both sides rallying respective supporters. Chalk it up to an East Coast vs. West Coast thing or blame it on the echo chambers of our respective Twitter interest graphs, but the clash has not, thus far, involved much reasoned engagement between the two. The insularity of Silicon Valley from the rest of the country, both in professional and social networks, doesn’t help. And the tendency for technology activists to often present their personal ideologies as scientific fact—and accordingly conflate technical choices and political ones—has conditioned many DC policymakers to take Silicon Valley’s tantrums with some grains of salt.

All this said, I’m nonetheless surprised by the vehemence with which my colleagues in the tech industry—I am a technical product manager for cloud analytics software—have rallied to Apple’s cause without stopping to consider the wider public policy implications, at least not those beyond protecting the American tech industry. And I would venture that no small part of this is a response to the way Apple has masterfully defined the debate. By Apple’s narrative, the FBI now seeks an onerous set of unreasonable and impractical demands, the technical difficulty and impact of which it does not understand; and in doing so, the government undermines both American civil liberties and personal safety. Unsurprisingly, Apple emerges from its own narrative as both victim and savior.

But our policy decisions must be based in reality, and Apple’s framing of fact is anything but. Its PR aside, Apple is choosing—both here and in other courts—to challenge multiple court orders directing the company to assist in unlocking phones involved in serious crimes. But Apple is not suing the federal government for relief, not lobbying Congress to pass new laws, not even calling for a good old-fashioned letter-writing campaign. Instead, Apple is using the narrow venue of declining to cooperate as its mechanism to force its desired outcome. While there is nothing wrong with Apple asserting its rights in court, the unilateral manner in which it is proceeding directly prevents our democratically elected government from pursuing evidence in real cases involving real crimes against real people. And it also deprives us all of the right to decide for ourselves, via our elected representatives, where this important balance between protecting civil liberties and maintaining public safety should be struck.

Apple recently unveiled an FAQ page offering what can fairly be characterized as self-serving “answers to your questions about Apple and security.” You can read Apple’s complete answers on that page. Here, I offer some additional background they’ve left out.

 

Why is Apple objecting to the government’s order?

Apple: It is too hard and carries too many security risks. Also, it could set a precedent that we don’t like and who knows where it could lead? If we do this, in a dystopian future that resembles an X-Files episode, the government could make us embed nefarious surveillance features too.

What the FBI is asking Apple to provide is not an “entirely new operating system” in the sense of rebuilding iOS from scratch. It is, if anything, a relatively minor change that would require a trivial amount of effort for a company that spends $8 billion each year on R&D. By Apple’s own estimates, it would take 6-8 engineers less than a month. And recall that the capability the FBI is asking for is actually a reinstatement of access that was possible until relatively recently; Apple has worked assiduously to remove its ability to provide such access, and is reportedly accelerating plans to make it even harder.

Apple returns again and again to the term “backdoor,” in a relatively blatant attempt to define this debate—favorably—in terms much larger than the case in San Bernardino. But the government is not requesting a “backdoor” in any traditional sense of the word. The FBI is seeking a way to preserve access to encrypted iPhones (via a custom iOS firmware image) in the presence of a valid warrant granted pursuant to well-defined evidentiary standards, exactly as it has with other forms of communication and personal records. What the government seeks is a preservation of the status quo, based on long-standing legal agreements about how law enforcement interfaces with reasonable expectations of privacy. It’s reasonable to revisit those issues periodically, but this is hardly a first step down a dark, shadowy road of unknowable consequences.

And Apple’s suggestion that this will lead to bizarre government-ordered surveillance features—well in excess of anything a court has ever authorized—is downright silly.

 

Is it technically possible to do what the government has ordered?

Apple: Yes. But it’s too dangerous to attempt.

Everyone agrees gaining access to the device is technically possible. The level of danger it poses is, of course, a matter of opinion. Note that Apple is not a disinterested party here. The company has a direct interest—as a matter of strategy and resources—in not being subject to law enforcement requests for assistance.

The dangers if Apple prevails here are quite clear. Should Apple win, and should the company be legally able to keep its devices “dark” to law enforcement, criminals of all stripes will have their marching orders: use iPhones exclusively and turn off iCloud backup. It’s not a bad new market for Apple, but it would be a public safety disaster. The Justice Department and police from across the country (and the world) currently rely on evidence obtained from iPhones, with Apple’s assistance, in preventing, investigating, and prosecuting all manner of crimes, from petty theft to murder, kidnapping, and (yes) terrorism. Whatever the ultimate balance, it must be struck with the understanding that what is proposed here is to slam the door shut on those devices and throw away the key. It is altogether possible (if not outright probable) that this would have calamitous effects on the ability to investigate and prosecute crimes.

Of course, there are other countries that seek to investigate as crimes activities that the American system (both our laws and our values) explicitly protects, like the freedoms of speech, assembly, and protest. This is an important and difficult challenge for any global company, particularly in technology, to meet (just ask Google!). In the present situation, however, it is not within the FBI’s remit to consider. The FBI has an assigned role and a mandate to diligently pursue its lawful mission. And significant evidence is emerging to counter Apple’s claim that it can or will use technology to resist rights-violating regimes—indeed, the company is only too happy to conform to mandated cooperation with security authorities in China.

 

Could Apple build this operating system just once, for this iPhone, and never use it again?

Apple: Well, sort of, but not really, because hackers could steal it.

Apple is correct that the only way to absolutely “guarantee” a digital asset is never stolen is to not possess it in the first place. But this is a rhetorical trick. Plenty of companies already manage to protect extremely sensitive data from cyberattacks, not least Apple itself. While the technology at issue is slightly different, Apple held the keys to decrypt iPhones until just a year and a half ago—and that software was no easier or harder to steal than the one at issue today. Recognizing the security implications, the government request goes to some lengths to accommodate the company, permitting Apple engineers to install the software and perform the extraction entirely within Apple facilities, and even to destroy the custom OS after its use. The suggestion that it is just too difficult for Apple to keep this software out of attackers’ hands strains credulity and is contrary to all available historical evidence.

Moreover, the fact that “law enforcement agents around the country… have hundreds of iPhones they want Apple to unlock” goes to show how essential smartphones have become as a source of evidence in modern investigations. The very people enthusiastically reordering our lives around modern smartphone technology are equally eager to lock law enforcement out of it.

Of course, the phone in the California case will not be the last iPhone that the FBI or other law enforcement agencies will need to access. Apple—like any company—must plan for some degree of law enforcement assistance, exactly as it always has in the past. Having itself created the conditions that make meeting its basic, well-defined legal responsibilities more burdensome, Apple can hardly complain that those burdens are now too heavy. And if Apple locked out law enforcement only as an unintended side effect of its own security decisions, then it surely cannot simply shrug now, or worse, claim outrage at requests for new assistance.

 

Has Apple unlocked iPhones for law enforcement in the past?

Apple: No. But actually, yes.

Before iOS 8, Apple was able to do this and did so. Whether they “unlocked” the phone or bypassed the locking mechanism is a matter of semantics. In fact, as William Bratton and John Miller point out in their NYT Opinion piece, Apple routinely assisted law enforcement in this manner until late 2014. Since then, Apple has specifically engineered its operating system to make it harder for the company to comply with valid requests from law enforcement. The resulting quandary is the inevitable result of Apple’s deliberate product choices which it has made for years despite being expressly warned about this situation.

 

The government says your objection appears to be based on concern for your business model and marketing strategy. Is that true?

Apple: Heavens, no.

Apple makes money—a whole lot of it—by serving its customers and meeting their demands. For several years now, Apple has explicitly made “privacy” a key marketing stick with which to beat its chief competitor, Google. Certainly, a stated commitment to protecting customer privacy is vital to Apple’s brand and continuing business strategy. Apple’s CEO and employees may be expressing genuinely held personal convictions, but the legal and public-relations campaign on which Apple, the corporation, has embarked is without question motivated by its business concerns.

 

Is there any other way you can help the FBI?

Apple: It’s out of our hands. Also, don’t forget the FBI is incompetent.

Apple has framed the issue thus: We are only able to do that which we are willing to do—technical reality aside—and we’ve already given the help we felt like giving. This approximates the reasoning my five-year-old niece gives for not wanting to put on pants. Apple is now opposing court orders in, as the WSJ reports, at least a dozen other cases across the country, many of which involve phones for which Apple still voluntarily maintains the capacity to comply.

So, yes, there seems to be much more Apple could do.

 

What should happen from here?

Apple: There should be a commission to decide these issues.

This is actually a great idea. Of course, it should not delay Apple’s immediate legal responsibility to furnish assistance to the FBI and other law enforcement agencies. But the questions at stake in this case are truly important, and the technology at issue has arguably advanced enough that new legislation is warranted. Ideally, Congress would update and clarify the existing statutes.

Whatever Silicon Valley says—and as I’ve written elsewhere—the issues at stake in this debate are not fundamentally technical. Rather, they are political: how should our understanding of the limits of law enforcement adapt to keep up with the role of technology in modern life? What sort of code counts as “speech?” (I recommend this outstanding primer.) And which civil liberties are sacrosanct and inviolable even in the face of valid public safety needs? Traditionally, our answer to that last question has been quite limited; should smartphones, or all digital data, be treated under different rules? There is a legitimate and needed debate to be had.

But the current dysfunction in the Congress—and in an election year, no less—makes it difficult to imagine something substantial emerging from a “commission” in the foreseeable future. I’m dubious as to whether a Congress which now routinely teeters on the brink of shutdown over merely funding the government is actually capable of brokering a productive engagement among the various stakeholders to weigh the appropriate balance of technical security specifications and law enforcement needs.

Moreover, I would venture a guess that this is Apple’s calculation too. Call me cynical, but this would seem like an effort by Apple to prolong a public debate, while opposing any legislation that might actually clarify access and simultaneously accelerating its product development efforts on encryption towards the point of no (practical) return. It would be difficult to square Apple embarking on such a dangerous and irresponsible course with the company’s lofty language and love of country. But it would be a reasonable guess that Tim Cook’s strategy may be simply to make un-decryptable smartphones a fait accompli before the FBI, the courts or the legislature can force Apple to do otherwise. And we should call that what it is: Apple arrogating to itself the power to dictate to Americans what their civil liberties will be, rather than allowing them to decide for themselves. We will all be poorer, and probably less safe, as a result.

If we want to lash law enforcement to the technology of 1995, then let that choice be clear. But let it be ours, not Apple’s. And let’s not pretend that there are no tradeoffs involved in making it. That is magical thinking.