Going Dark

Trust, Apple, and the First Amendment

By Andrew Keane Woods
Tuesday, February 23, 2016, 5:44 PM

Last week, I floated the idea that Apple might argue that being asked to assist the government in unlocking an iPhone constituted compelled speech. I did not explore the idea in any depth or offer any thoughts about its viability; this post offers a rough first cut.

Here’s the short version: Apple might establish a First Amendment claim if it can convince a court that digitally signing new software is an expressive act that signals a particular value or belief – and that, I think, might require a court to evaluate the role of trust in a cryptographic system. Alternatively, Apple might claim that the government is compelling it to act in a way that would erode its credibility; this is not a claim about chilling speech, but about cheapening it.

Before going any further, two huge caveats.  First, there is a lot we don’t know about the technological options available to Apple to comply with this order (and available to the FBI if Apple resists).  So I’m going to make some assumptions along the way that may turn out to be wrong – assumptions that would affect the legal analysis.  Second, this is a very preliminary analysis that I offer up as food for thought – I haven’t talked this through with cryptographers or First Amendment scholars.  So buyer beware. 

That said, let’s examine two possible First Amendment arguments that Apple might make.

The first is about compelling a statement of belief.  In Barnette, the Supreme Court held that students could not be compelled to salute the flag or recite the pledge of allegiance.  Significantly for Apple, the Court found that even non-speech actions, like saluting a flag, can be expressions under the First Amendment.  So Apple may argue that by being forced to write and sign new code, it is being impermissibly compelled to speak, much like the students in Barnette.

But are they? 

If the government asks you to open a lock, you won’t have many First Amendment arguments to make because opening the lock is not an expressive action of the sort protected by the First Amendment.  Even if it were a voice-activated safe, the government could compel you to say “open sesame!”  The fact that the safe is opened by speech does not automatically bring the First Amendment into play.  You might have a stronger claim if the government asked you to open the safe by saying the words “I pledge allegiance to the flag…,” because suddenly the government is mandating the very thing that was not allowed in Barnette.

Or are they?

In Barnette, what bothered the Court was the “compulsion of students to declare a belief.”  It is possible that a court would look at this (very hypothetical) voice-activated safe and say that in this context, reciting the pledge of allegiance is not really a statement of belief, but rather just a passphrase or a key – an action with little expressive value.  Actions can be constitutionally relevant speech acts (saluting the flag); conversely, not all speech acts merit First Amendment protection (“Fire!”). 

Last week, I mentioned that code can be seen as speech.   I have since seen this phrase tweeted a number of times.  Let’s be clear:  saying that “code is speech” means very little in this context.  Is it constitutionally protected speech?  Is it speech that can nonetheless be regulated for other compelling reasons?  These are the more relevant questions.  In Bernstein, the 9th Circuit found that the government’s prohibition on publishing source code was an unconstitutional restraint on speech.  But that holding was later withdrawn by an en banc panel (the case petered out and died); and in any event, it was a prior restraint case, not a compelled speech case.

Much more important than whether code is speech in some generic sense is whether this particular code (or Apple’s use of its cryptographic signature) is a constitutionally relevant expression. 

So is forcing Apple to write special code and sign it a “statement of a belief” of the sort that concerned the Barnette court?  I suppose the argument would go something like this:  Computer security is about identity.  Trust is the coin of the realm, and trust is earned by expressions that either encourage or discourage confidence in one’s identity.  If this sounds flighty, go scan the introductory syllabus for a crypto class.  Much of it reads like sociology or political theory.  How is identity constructed?  How is trust established?  Do we present ourselves the same way to everyone, or only to a trusted inner circle?  How do our communications change based on how much trust we have in a given system? 

In this domain, if Apple’s digital identity is its signed software that says to the world, “this is me – this is what makes me different from her,” that starts to sound like the kind of thing that shouldn’t be mandated by government.  Dictating when and how Apple speaks to a device may not be a statement of belief, per se, but it is the court telling a private citizen how to self-identify. 

The government is regularly involved in regulating speech, so let’s distinguish a few things.  Under the Central Hudson doctrine, the government has much wider latitude to regulate commercial speech.  Indeed, the government regularly puts restrictions on the claims marketers can make (e.g.: You can’t market your product as X when it’s really Y.)  Now, Apple’s claim would be about compelled speech, which Central Hudson isn’t concerned with.  Still, the government plainly has the authority to tell firms to disclose harms caused by their products (e.g.: You must clearly tell your customers that cigarettes are bad for their health).  But could the government mandate that a firm put its corporate logo on a billboard or website supporting a particular message?  Arguably, those are expressions of one’s identity, of the sort that the First Amendment was designed to protect.  Indeed, there is a line of cases regarding compelled subsidies, in which food producers challenged programs that required them to contribute money to federal programs that advertised on their behalf.  In the most recent case, Johanns v. Livestock Marketing Association, the Supreme Court allowed the government’s speech (on behalf of American beef), but noted that strict scrutiny might apply if the government’s speech implies that someone endorses the message when in fact they do not.

But not all expressions of one’s identity are protected.  A criminal can’t successfully resist a government order to unlock his trunk by saying, “I’m a thug; my identity is wrapped up in resisting the government.  If you ask me to open this trunk, that would be bad for my image.  First Amendment, man.”  That wouldn’t fly.  But what if the government asked the thug to post a sign on his lawn that says “I love law enforcement”?  Would that be allowed?  Perhaps not.  Now the state is using the private citizen in order to make a statement; the message is not incidental to the compelled action but the very goal.  Courts have been especially skeptical of that kind of compelled messaging. 

In Wooley v. Maynard, the Supreme Court held unconstitutional a New Hampshire regulation requiring all license plates to include the motto “Live Free or Die.”  The court noted: “We are thus faced with the question of whether the State may constitutionally require an individual to participate in the dissemination of an ideological message by displaying it on his private property in a manner and for the express purpose that it be observed and read by the public. We hold that the State may not do so.”

While there is much language in this opinion that might help Apple, this holding seems to create an additional hurdle.  Not only would the firm need to establish that a belief or ideology is being promoted by the government’s order, but also that the government’s order is issued “for the express purpose” of making Apple the vehicle for expressing some message (whatever it is). 

What is that message, again?  It might lie somewhere in between the thug’s specious argument that he can’t be forced to comply with a search warrant because it sends a pro-law enforcement message, on the one hand, and the religious objections to the message “Live Free or Die” on the other.  We are in fact talking about a law enforcement operation.  But we’re not talking about a thug making a one-off statement.  Apple might be able to point to a longstanding course of conduct that expresses a clear and consistent set of beliefs about the importance of computer security and the risks of government surveillance – matters of faith for some.  

I wonder if Apple doesn’t have another First Amendment argument, too.  If the government enforces this order, Apple could argue that the firm will have less freedom to speak in the future.  By signing this particular copy of iOS, that is, Apple and its engineers are forced to say something to a device – and possibly others – that is inconsistent with their earlier statements.  As a result of the government order, users won’t know whether to trust Apple’s digital signature.  If I can’t know whether Apple’s signed software is truly Apple’s or the government speaking through Apple, I might be less inclined to listen.  Apple’s credibility is eroded, and as a result its speech rights are harmed.  This isn’t the government chilling speech; it’s cheapening it.

I’m not aware of cases where the government mandated speech that made someone less credible.  But can the government make you express something that is false?  Arguably not.  Unfortunately, the best-known case on this point essentially dodged the question.  A South Dakota law required abortion providers to tell patients that suicide is a “known risk” of abortion, despite the fact that there is no evidence to this effect.  Planned Parenthood challenged the law on First Amendment grounds.  As Professor Dorf points out, the case was resolved (and the law upheld) because the court found that the law did not in fact require a misleading statement.  The question of whether the government can compel a lie was not resolved. 

Of course, even if it were unconstitutional to compel a lie, that rule would have its limits.  If someone tweeted “I will never help law enforcement,” and then law enforcement compelled their assistance, they would be forced to provide it.  They could not say, “but by complying with your order, I will be made a liar!  No one will trust me!”  But Apple’s case is different.  For one, Apple is a custodian of a digital space where people communicate and store their most valuable things.  This might lead a court to be warier of government regulations mandating that Apple take particular positions – especially if those positions weaken trust in a security environment where everything turns on it. 

Both arguments turn on the substantive content of the expression that Apple claims it is being compelled to make.  This may be a deeper, theoretical (even metaphysical) question about the substantive message conveyed when one robot says to another: “Hi, it’s me. Trust me.”