Apple v. FBI

Apple CEO Tim Cook is in the news again for publicly protesting a court order directing Apple to assist the FBI in unlocking an iPhone used by one of the San Bernardino terrorists. Tim Cook has made several strong statements against compromising the security of Apple products to facilitate law enforcement, and taken at face value this looks like another iteration of the ongoing encryption backdoor debate. Indeed, the timing seems almost suspiciously appropriate, as FBI Director James Comey is also in the news for his comments to the Senate on encryption's impact on law enforcement. I've discussed the encryption debate previously, but I'm highlighting this particular scenario because it strikes me as quite a different question than Tim Cook is letting on, and it warrants some discussion.

The Story So Far

For those who haven't been following the encryption debate too closely, here's a quick recap: starting with iOS 8, Apple announced it would encrypt iPhones using full device encryption, meaning that everything on the iPhone is encrypted, with the key to decrypt that information stored only on the iPhone itself. To decrypt the phone, you either need a copy of the cryptographic key (around 64 characters) or access to the passcode/passphrase the user uses to unlock the phone. (Apple has never stored the passcode/passphrase.) The shift to full device encryption was viewed negatively by US law enforcement, who previously relied on Apple's copy of the key to decrypt devices (with a warrant).
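To make the key-versus-passcode distinction concrete, here is a minimal sketch (my own simplification, not Apple's actual key hierarchy) of how a decryption key can be derived from both a per-device secret and the user's passcode, so that no usable copy of the key ever needs to leave the phone:

```python
# Simplified illustration only -- NOT Apple's real implementation. The point is
# that the decryption key is derived from the user's passcode plus a secret
# unique to the device, so neither piece alone (and no off-device party,
# including the manufacturer) is enough to reconstruct the key.
import hashlib
import os

DEVICE_UID = os.urandom(32)  # stands in for a per-device secret fused into hardware

def derive_key(passcode: str, device_uid: bytes) -> bytes:
    """Mix the passcode with the device secret to produce a 256-bit key."""
    # A real device would run this derivation inside tamper-resistant hardware
    # with a deliberately slow cost factor; 100,000 PBKDF2 iterations stands in
    # for that cost here.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), device_uid, 100_000)

key = derive_key("1234", DEVICE_UID)
print(key.hex())  # re-derivable only with both the passcode and this device's secret
```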

The majority of the debate thus far has been over keys: since Apple no longer retains a copy of the keys, neither Apple nor law enforcement agencies can use those keys to decrypt phones. Congress and the Obama administration both considered mandating that phone manufacturers retain a copy of the key, a requirement that came to be known colloquially as "mandatory backdoors." (This would effectively require Apple to reprogram iOS.) However, the backlash from the tech sector and civil liberties advocates was strong enough to pretty much end that debate. The Obama administration is moving away from mandatory backdoors, and indeed Congress is currently considering legislation to prevent the states from enacting their own mandatory backdoor laws. Nevertheless, the FBI still frequently argues that full device encryption hampers investigations, and it is the reason they cannot access the phone of one of the San Bernardino terrorists.

Or at least, they can’t do so easily.

After all, there is still the passcode; the FBI doesn't need the key if they can just guess the passcode. There are a limited number of possibilities (albeit a very high number), so they can just brute-force it, which literally means "try every possible combination." But Apple, being the savvy programmers they are, went one step further and added the option to limit the number of guesses at unlocking the phone, after which the phone would forget its own key, making the contents effectively unreadable. (They also impose a time delay after failed attempts, and typically require the passcode to be entered by hand.) The FBI's request in this case targets these secondary defense mechanisms: they don't want the key; they want Apple to help them get around the restrictions on guessing. With unlimited guesses, cracking the passcode is just a matter of time.
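As a rough illustration (my own toy code and numbers, not the FBI's tooling or Apple's firmware), here is why the guess limit is the part that matters: an exhaustive search of a short numeric passcode is trivial with unlimited attempts, but a ten-attempt wipe limit stops it cold.

```python
# Toy brute-force sketch -- illustrative only. A 4-digit passcode has just
# 10,000 possibilities, so unlimited guessing finds it almost instantly;
# a 10-attempt limit (after which the phone erases its key) defeats the attack.
from itertools import product

def brute_force(target, digits, max_attempts=None):
    """Try every numeric passcode in order; return the winning attempt number."""
    for attempt, combo in enumerate(product("0123456789", repeat=digits), start=1):
        if max_attempts is not None and attempt > max_attempts:
            return None  # the device would have wiped itself before we got here
        if "".join(combo) == target:
            return attempt
    return None

print(brute_force("7391", digits=4))                   # found within 10,000 tries
print(brute_force("7391", digits=4, max_attempts=10))  # None: the guess limit wins
```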

So to clarify, this is a different question from the mandatory backdoors debate. The FBI is not asking Apple to alter its software; they are asking Apple to develop an exploit for the existing software. As popularly characterized, they are asking Apple to "hack the iPhone."

All Writs Considered

I'll start with the law. Normally, for issues like this one, the best place to start is with CALEA, the Communications Assistance for Law Enforcement Act. CALEA can be summarized as the "help us wiretap digital communications act," and is the primary justification for forcing telecom companies to alter their infrastructure to make technology more amenable to government surveillance. CALEA was similarly unpopular with civil liberties advocates, but it has been the law since 1994. (Of note, CALEA supports the civil liberties advocates in the mandatory backdoors debate.) But CALEA doesn't cover this case. Absent specific statutory authority, courts are forced to fall back on their more generalized judicial powers, which have historically covered things like analog phone wiretaps, pen registers, customer records, and so forth. Basically, if a company already has the information, or can easily get it, courts can typically order the company to comply notwithstanding the lack of specific statutory authority.

This is why the FBI is relying instead on the All Writs Act, the judicial catch-all for authorizing court orders. (Although frequently cited as dating from 1789, the current form of the Act was passed in 1911, with minor changes in 1948 and 1949.) The All Writs Act authorizes judges to "issue all writs necessary or appropriate in aid of their respective jurisdictions and agreeable to the usages and principles of law." I'll skip the history lesson on English writ law. Instead, the important things to know are that the Supreme Court generally interprets the All Writs Act pretty broadly, although the Court has specifically said that "unreasonable burdens may not be imposed," and that the Act is not controlling if another statute governs the issue. On both counts, this case is fuzzy. (The FBI's request is technically feasible, and while CALEA probably should cover this, I'm doubtful that it does.)

While use of the All Writs Act for this purpose strikes me as quite a stretch from any of the common law writs I'm aware of, and the closest appellate precedent involves pen registers from the 1980s, there is plenty of precedent for courts ordering companies to cooperate with law enforcement, and this isn't the first time the FBI has used the All Writs Act to request help unlocking a phone. Ultimately I suspect it will boil down to how burdensome the request is; while that is debated, my understanding is that Apple's complaint rests more on principle than on difficulty.

Hack Yourself Before You Wreck Yourself

So putting aside the murky legality, how does this fit into the larger debate about mandatory backdoors? I would argue that asking Apple to hack its own device is a separate issue entirely from mandating that Apple make its products hackable. (It might still be bad policy, but it would be for separate reasons.) This isn’t asking Apple to program in a backdoor; it’s asking them to exploit an existing backdoor. Theoretically speaking, if Apple can currently do it, someone else could as well. (In reality, this would probably be absurdly difficult for anyone not in Apple’s position.) But the distinction is nonetheless important, because the primary argument against mandatory backdoors is that they fundamentally weaken the security of the system, whereas this is only asking Apple to exploit its already insecure system. (Apple’s argument seems to be relying more on the idea that once they develop the exploit, others may get their hands on it.)

Viewed from another perspective, this could be seen as a good thing for security. If Apple does learn about an exploitable vulnerability in its own systems, nothing in this court order prevents them from fixing it in a subsequent patch. Companies already employ white hat hackers to test their systems for vulnerabilities; I don't see why exploiting one of those vulnerabilities on a known criminal's phone is particularly worrying. And even if the vulnerability isn't fixable, simply refusing to help the government doesn't guarantee that someone else won't find and exploit it.

I’ve seen some commentary worrying that this makes law enforcement hacking too easy, because they are essentially conscripting Apple to do it for them, although I’m not sure how persuasive this is. For one, I don’t like relying on practical protections in lieu of substantive ones: the limit on government hacking shouldn’t simply be how easy it is. It’s better to make these hacks easy and rare than common and cumbersome. This line of reasoning assumes a failure of process, and decides the appropriate remedy is to hamstring efficacy. (It’s like saying it’s too easy to get a warrant, so all police officers must execute warrants with one hand tied behind their back.) And besides, law enforcement enlisting the help of a safe manufacturer to help break into a criminal’s safe is functionally equivalent to this case, and yet seems far less egregious. Ultimately I think this argument isn’t sure what it’s upset about.

As is often the case, much of the substance of Apple’s argument (and those of Apple’s supporters) relies on an assumption of government overreach and abuse, as well as the ever-present slippery slope. While not necessarily unjustified, their rhetoric heavily invokes the encryption debate, where the argument is much stronger, and fails to adequately justify why this case creates a similar threat to the security of Apple’s customers. Viewed from a practical standpoint, the likelihood of this exploit actually impacting anyone else’s cybersecurity looks very low.

This Message Will Self-Destruct in 10 . . .

Just briefly, I think it's worth mentioning that the security feature the FBI wants to circumvent (the iPhone's automatic deletion of data after too many failed unlocking attempts) is arguably less important than the integrity of the encryption itself. Although an effective method for countering brute-force attacks, automatic deletion comes at a fairly substantial cost: self-immolation of all of your data. It is an extreme solution, akin to a burglar alarm that burns down your house to prevent anything from being stolen. Such extremity makes me question whether it is something the average user would actually want, particularly since most iPhone thieves will wipe the phone regardless; this certainly seems like a feature that primarily benefits criminals. (Although in my hypothetical the thief would be stealing your entire house, so maybe burning it down makes sense in scenarios where you know you won't get any of it back.) While my mind isn't settled on the issue, targeting this type of feature seems like it might strike a better balance in the privacy vs. security debate.

I'll close by raising some of my standard talking points about these encryption debates. The big one is the potential international ramifications of any US policy that compromises tech security. Mandating that all phones be open to surveillance by US law enforcement will make them less popular internationally (although phone companies may just develop multiple models to meet the requirements of each market), and the financial impact of the regulation is potentially substantial. While most acute with regard to mandatory backdoors, it may still be an issue in this case. There is also the perennial problem of efficacy: no matter what concessions Apple makes with regard to security, there will always be ways for savvy criminals to reinforce the security of their devices. Getting Apple to hack into a device isn't helpful when the underlying content is separately encrypted. But then again, saying it might not work isn't a great argument for not even trying.

Viewed on the merits, I think the biggest concern is probably the lack of clear guidelines about when law enforcement can use this power. While getting a tech company to assist in recovering data seems perfectly legitimate, the All Writs Act really doesn't seem like the appropriate avenue for doing so, and it would be preferable to have an Act of Congress clarifying if and when these types of requests can be made. As long as companies aren't being compelled to alter their products to make them less secure, I don't think this raises the same issues as mandatory backdoors, and it shouldn't be the cause for concern many are making it out to be. Unless, that is, your primary concern is the potential overuse of the All Writs Act.

Until next time.

-Scott

 
