Nothing breeds success like success.
In the recent FBI-versus-Apple court case (you know which one we mean), the US judiciary ordered Apple to cook up an iPhone backdoor to sidestep the very security it had baked into the iPhone in the first place.
Apple, of course, refused, for many reasons.
One reason was this: going ahead, even for a court order tied to a single, high-profile terrorism case in which 14 people were murdered in San Bernardino, California, would establish a general backdoor precedent.
That, in turn, might well lead to a world in which the only thing between your mobile phone and all the data on it would be the procedural legalisms of the backdoor policy.
Which would be fine if we could trust everyone to abide by the process, in the same way that you wouldn’t need an anti-virus if you only ever downloaded software from a website that contained the text, “Our downloads are virus-free, honest.”
In the end, the FBI seems to have extracted the data it was after in some other way, and the court case was dropped.
Loosely speaking, that’s both a positive and negative outcome.
The plus is that both sides got what they wanted without setting a precedent; and the negative is that the precedent was never actually decided, so we’ll probably get to go through all of this again.
Of course, that plus itself has both a positive and negative outcome, as astute Naked Security commenters have pointed out:
4caster: Thank goodness it was the FBI who managed to crack the iPhone. That is their job, and I hope they succeed in catching the criminals. They can’t be expected to tell Apple or anyone else how they did it. At the same time, it will be Apple’s job to close the security gap before less scrupulous people can use it.
Wayne Andersen: Well, the phone got unlocked. This means there is some unknown third party out there that has some unknown procedure for unlocking it. Kinda gives you a warm fuzzy feeling, Not!
The upside of this downside is that it seems likely that the “crack” used by the FBI was a tricky and physical one, perhaps involving disassembling the device and even desoldering or piggybacking its flash storage chips.
So, for all that an unintentional backdoor is now known to be available, it’s probably not a simple point-and-click matter:
David Pottage: If that is the case, then that is the kind of back door that I am fairly relaxed about. Technically the FBI was able to breach Apple’s security, but in practice, the technique is not remotely exploitable, is expensive and time consuming, and cannot be done without the knowledge of the suspect. The costs and time involved mean that it can only be used against serious criminal suspects, and can’t be used for mass surveillance. The fact that it destroys the phone in the process means that it can’t be covertly used against political opponents without alerting them.
Nevertheless, the FBI succeeded, or at least said it succeeded, in getting what it wanted, even though it’s not saying what it actually wanted, nor how it got there.
As a result, we’ve already heard of another iPhone in a US murder case being referred to the FBI:
Faulkner County [Arkansas, US] Prosecuting Attorney Cody Hiland said the FBI agreed to the request from his office [on] Wednesday afternoon, [30 March, 2016]. A judge on Tuesday agreed to postpone the trial of 18-year-old Hunter Drexler so prosecutors could ask the FBI for help. Drexler’s trial was moved from next week to June 27. Drexler and 15-year-old Justin Staton are accused of killing Robert and Patricia Cogdell at their home in Conway, 30 miles north of Little Rock, in July.
There’s an interesting difference here, even though both involve capital (death penalty) offences.
In the San Bernardino case, the only people who could reasonably have been expected to supply the passcode to comply with a search warrant were dead, shot in a gun battle after their murder rampage.
In the Arkansas case, the defendant is still alive.
So he could, at least in theory, unlock the phone for investigators, in the same way that he might be expected to step away from his car to allow it to be lawfully searched, and could be compelled to do so if he resisted.
Of course, he could just refuse to put in his password, but that still changes the game.
In some countries, even where there is a statutory right to silence to avoid self-incrimination, laws exist that criminalise the refusal to hand over data when required by law. (That doesn’t necessarily mean giving away your password, merely getting things to the point that your data can be accessed.)
That’s similar to having a law to criminalise a refusal to submit to a breath or blood test for drink-driving, on the grounds that if you didn’t have to take the test unless you knew you were sober, then the test would be pointless.
A law for mandatory data disclosure in a case like this, assuming it were constitutional, would have interesting consequences: how to set the penalty?
As harsh as the maximum for the offence to which the data relates?
Or lower, on the grounds that the data alone is unlikely to prove the entire case?
Or higher, to discourage treating refusal as a sort of default plea bargain?
Image of cracked iPhone by ymgerman / Shutterstock.com.
5 comments on “FBI already called in to unlock another murder case iPhone”
Parse the argument fine enough and I can build an argument for strip-searching you to see if you are wearing your “Friday” labeled undies like a good boy.
At some point, this will devolve into an argument about “rights” versus “privileges”. Driving a motor vehicle is not a right, it’s a privilege. Privacy (at its most basic) is a right in many Western countries.
This is kind of worthless, anyway, as encryption is getting cheaper and quicker. Already, today, you can buy a 32-bit controller with hardware encryption (and on-the-fly encode/decode) for 50 bucks. Who is the FBI going to try to convince to allow access to the data? Don’t think anything but the passphrase can let you in, so if they kill them, no access. Maybe they should stop killing the suspects, at least until they talk with them… 🙂
It seems to be common practice for law enforcement to break and enter. Permission in hand, no questions asked.
Why penalize someone for something they can’t do? For example, “My other self isn’t available at the moment”, or in a case of amnesia, or if you have as many pass phrases as I do…
Whatever happened to innocent until proven guilty?
What if the FBI didn’t break the encryption, and just pretended they did?
I guess we’ll hear a groan of disappointment from Arkansas…