Encryption

Encryption as Living Will: Think Before You Drink the Kool-Aid

By Benjamin Wittes
Tuesday, April 26, 2016, 9:07 AM

A grieving father in Italy has written to Apple’s chief executive, Tim Cook, to beg him to unblock his dead son’s iPhone so he can retrieve the photographs stored on it.

If the US tech giant fails, he said, he would turn to the Israeli mobile forensics firm that reportedly helped the FBI crack the iPhone used by gunman Syed Farook in the San Bernardino attack in December.

“Don’t deny me the memories of my son,” architect Leonardo Fabbretti wrote.

Fabbretti’s son, Dama, who was adopted from Ethiopia in 2007, was diagnosed with bone cancer in 2013 after a skiing accident and died in September aged 13 after a series of operations and chemotherapy sessions failed to cure him.

“I cannot give up. Having lost my Dama, I will fight to have the last two months of photos, thoughts and words which are held hostage in his phone,” he said in the letter, sent on 21 March.

“I think what’s happened should make you think about the privacy policy adopted by your company. Although I share your philosophy in general, I think Apple should offer solutions for exceptional cases like mine.”

Fabbretti said he had given his son an iPhone 6 nearly nine months before his death, which he used all the time. “He wanted me to have access, he added my fingerprint ID,” he told AFP. “Unfortunately, it doesn’t work if the phone is turned off and on again.”

Agence France Presse


The travails of people like Leonardo Fabbretti don’t seem to count for much in the encryption wars. Fabbretti, like the FBI, ultimately had to turn to a third party to hack his son’s iPhone—a company called Cellebrite—and recent reports suggest that the company may be able to get into the phone.

Be that as it may, the case, and others like it, give the lie to the idea that the encryption debate stacks law enforcement and intelligence interests, on the one hand, against personal privacy and security interests, on the other. FBI Director Jim Comey has often talked about the case of Brittney Mills, the 29-year-old pregnant Baton Rouge woman who was murdered by an unknown killer—leaving only an impenetrable iPhone.

As NPR summarizes:

Barbara Mills [the victim’s mother] saw Apple CEO Tim Cook on TV the other day, talking about the rights of consumers. To privacy activists, he's a hero. To her, he's not.

"You still trying to protect consumers, but what about the victims who used your product?" she says. "They were faithful, too. They paid their bills."

According to family members, Brittney Mills kept a diary on her phone, in some app, which could be very useful to investigators. They haven't been able to name a single suspect yet.

Sitting at the conference table in his office, East Baton Rouge District Attorney Hillar Moore explains just how thin the murder scene was: "The daughter heard someone knock on the door and heard her mom speak to somebody, who she was not able to identify. After the shots rang out is when the daughter ran for safety."

Moore says the little girl ran into the bathroom and locked the door.

Brittney Mills lived on the ground floor of a small apartment complex in a nice part of town. None of the apartments — not No. 3, where she lived, or Nos. 1 or 2, had any sign of a forced entry.

"The critical thing is she opens the door," Moore explains.

And it looks like the shooter didn't enter the apartment, didn't rummage inside. "No gun left, no gun found. We really are desperate to try to get into the phone, just to see if there's anything else there," he says.

Like fingerprints, like DNA evidence, phone data figure prominently into criminal cases.

Investigators were able to get AT&T, the mobile carrier, to provide a call log — every number that called or texted Brittney, or that she contacted — but not what was said inside a text.

Apple turned over data stored on iCloud — like, 15,000 pages worth of data, according to prosecutors. But the account stopped backing up months before the murder, and that could be for any number of reasons, so the data ended up being outdated.

Back in February, I had a brief but interesting Twitter exchange with Jacob Appelbaum of WikiLeaks fame about this case. Appelbaum and I were talking past one another about the history of industry cooperation with FBI surveillance requests, when he said the following:

[Embedded tweet not reproduced.]

The exchange has stuck with me in the months since because Appelbaum essentially described it as a moral good that Mills’s privacy settings bind her mother after her death. By extension, Dama Fabbretti’s privacy settings should bind his father, even though they were apparently never meant to keep his father out of the phone. In this view, privacy settings constitute a kind of living will. And at least as far as Appelbaum is concerned, it’s a fully binding living will, one that holds even if your posthumous security interests might, say, militate in favor of having your murder investigated.

I have to say that I admire Appelbaum’s purity on this point. He’s clearly thought through the implications of end-to-end security and full-device encryption and decided that privacy in the limited sense of perfect impenetrability of his data is the most important form of security. Appelbaum strikes me as the kind of guy who probably reads through all those click-through agreements with particular care and thinks through the security implications of every step he takes. The New Yorker once described his security precautions as follows:

He insisted on being interviewed in the club’s sauna, where another naked man was lying down. This seemed to be Appelbaum’s way of insuring that I wasn’t hiding any surveillance devices. We had contrary ideas about privacy: I keep my phone with me during interviews, but I don’t like discussing personal matters in front of strangers with my clothes off. After fifteen minutes, the pages of my notebook were soaked in sweat, and I asked to move the venue. We continued talking in an adjacent room, where numerous men wrapped in towels lounged on benches as Appelbaum told me the story of his life as an activist for anonymity.

So let's give him the benefit of the doubt: he has likely given considerable thought to the implications of his security choices.

My problem is that I very much doubt that Brittney Mills did the same. And I suspect that she, not he, is closer to the average consumer, who likely does not turn on his or her iPhone with a full understanding of what Apple’s “security” really means and how inimical to his or her security interests it could turn out to be. I doubt that many people will read the above exchange and think, “Ah yes, Brittney Mills clearly meant to have Apple protect her privacy by blocking an investigation of her murder, and her living will in that regard should be respected by all the power that math and Apple’s engineering can bring to bear on the problem.” I suspect most people are closer to Fabbretti on this when he says, "Although I share your philosophy in general, I think Apple should offer solutions for exceptional cases like mine."

This divide between Apple’s behavior and holistic consumer security would be less of a problem if device encryption were not a default setting, but by making it so, Apple is making a security choice for tens of millions of users. It is a security choice that says that the parents of a minor child may not be able to access that child’s phone if the kid is in trouble. It is a security choice that says if you get murdered or kidnapped, your communications are secure against being a tool in your rescue or in the delivery of justice to your family. It is a security choice that says that if you forget your own passcode, there is no way to recover or reset it, or to retain your data. As Apple puts it,

If you enter the wrong passcode in to an iOS device six times in a row, you'll be locked out and a message will say that your device is disabled.

Pick a way to erase your device

Unless you made a backup before you forgot your passcode, there isn't a way to save your device's data. You'll need to erase your device, which deletes all of your data and settings.

[Memo to Apple’s marketing copyeditors: “into” is one word, not two.]
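To make concrete what that support text describes, here is a minimal sketch of the lockout policy it implies. The function and its names are my own invention, not Apple’s code; real devices also add escalating time delays and hardware-backed key erasure that this toy omits.

```python
# Conceptual sketch (not Apple's implementation) of the policy in the
# quoted support text: six wrong passcode entries disable the device,
# after which the only recourse is to erase it -- and without a prior
# backup, the data is gone.

MAX_ATTEMPTS = 6  # per the quoted support text

def attempt_unlock(stored_passcode: str, guesses: list) -> str:
    failures = 0
    for guess in guesses:
        if guess == stored_passcode:
            return "unlocked"
        failures += 1
        if failures >= MAX_ATTEMPTS:
            return "disabled"  # only recourse: erase the device
    return "locked"  # still usable, but not yet opened

print(attempt_unlock("1234", ["0000"] * 6))      # six failures: disabled
print(attempt_unlock("1234", ["0000", "1234"]))  # right on the second try
```

The point of the sketch is that the policy is absolute: there is no administrator, manufacturer, or next-of-kin path around the sixth failure.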

Though there is a strong bias among many tech professionals in favor of the purity of end-to-end encryption and against emergency access to devices for anyone, for many users these defaults are simply not rational security choices, not even in the post-Snowden world.

The cult of end-to-end encryption has other security costs too. One of the reasons I like Gmail is that it has particularly good spam filtering; this is not possible without Google’s reading my plaintext. In a world of end-to-end encryption, it is also far harder to screen for malware signatures.
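The signature-screening point can be made concrete with a toy sketch. Nothing below is a real scanner or real encryption; the XOR keystream is a stand-in for end-to-end encryption, and the "signature" is hypothetical. It shows only that once a server stores ciphertext rather than plaintext, a straightforward signature match finds nothing.

```python
# Toy illustration of why server-side scanning needs plaintext: the
# same bytes that match a malware signature in the clear no longer
# match once they are encrypted before reaching the server.
import hashlib

SIGNATURE = b"EVIL_PAYLOAD"  # hypothetical malware signature

def server_scan(stored_bytes: bytes) -> bool:
    # What a provider like Google can do when it holds plaintext.
    return SIGNATURE in stored_bytes

def toy_encrypt(plaintext: bytes, key: bytes) -> bytes:
    # XOR against a hash-derived keystream -- a crude stand-in for
    # real end-to-end encryption, just to scramble the bytes.
    stream = hashlib.sha256(key).digest()
    keystream = (stream * (len(plaintext) // len(stream) + 1))[:len(plaintext)]
    return bytes(p ^ k for p, k in zip(plaintext, keystream))

message = b"hello EVIL_PAYLOAD world"
print(server_scan(message))                        # True: plaintext matches
print(server_scan(toy_encrypt(message, b"key")))   # False: ciphertext hides it
```

In an end-to-end world the scan has to move to the endpoints, which is exactly the shift in responsibility the paragraph above is worried about.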

Here's the bottom line: I actually don’t want to be fully responsible for my own security. I don’t want to have to conduct my meetings naked in a sauna (and believe me, neither do my interlocutors). One of the solutions to that problem is to have relatively trusted intermediaries holding and processing my data. I do not best serve my own aggregate privacy and security hygiene by preventing this.

So yes, I use end-to-end encryption for certain limited purposes where data security is of the highest importance: communications with reporters in authoritarian countries, for example. And I increasingly use it against my will because it is the default setting on programs like WhatsApp, iMessage and FaceTime that I use for other reasons. But if and when someone murders me, I want my iPhone accessible both to my family and to law enforcement.

And in my opinion, if Apple wants to write a living will for its consumers, it should put a big red warning on its phone and privacy policies making clear what its default settings really mean.