FBI cracks *that* iPhone

Big news!

The Grand Final of cryptographic lawsuits is over, abandoned shortly before the last quarter of play.

You know what we’re referring to, of course: the FBI versus Apple court case that saw the US judiciary tell Apple, “You must provide the FBI with a backdoor to circumvent iPhone security in order to assist a criminal investigation.”

Regulators and law enforcement in many jurisdictions have been pressing for mandatory computer security backdoors for years, claiming that modern-day encryption software is so strong that crooks, terrorists and foreign spies can hide behind it and thereby evade both detection and prosecution.

Therefore, say anti-encryption campaigners, we need to have backdoors programmed into security products by law, so that duly-authorised officials can sidestep security when necessary.

FBI versus Apple

Greatly simplified, that was the case that the FBI made against Apple in its recent lawsuit, and the court agreed.

Rather than going to court for a blanket right to unlock iPhones in the future, the FBI chose a specific iPhone connected with a particularly abhorrent crime: the 2015 mass shooting in San Bernardino, California.

Two shooters were involved, a man and his wife, who murdered 14 people before fleeing from the police in a rented vehicle, only to end up dead themselves after trying to shoot it out from inside the car.

The couple had apparently destroyed their own mobile phones before undertaking the attack, but the husband’s work phone, technically the property of San Bernardino County, was bagged by the FBI to see what investigative intelligence it could reveal, if any.

That’s what led to the court case, when the FBI found itself up against the iPhone’s lock screen passcode.

If you’ve configured your own iPhone for security, you’ve probably set a passcode of your own, so that your device can’t easily be put to evil use if it’s stolen and sold on to crooks, terrorists or foreign spies for further criminal activity.

After all, strong encryption is nowadays much more about protecting us from the Bad Guys than the other way around. (You only need to look at the number and the magnitude of recent data breaches to see why we need data protection more than ever.)

A cop who seizes a passcode-protected iPhone could try guessing the most likely passcodes, but it’s a slow process that is designed so that it can only be carried out on the iPhone itself.

In theory, at least, you can’t just copy the iPhone’s memory and storage and attack the passcode offline at any speed you like.
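
To get a feel for why that matters, here is a back-of-envelope sketch. The figures are assumptions for illustration only: roughly 80ms per guess on the device, where the key derivation is deliberately entangled with the hardware and slowed down, versus millions of guesses per second if the encrypted data could somehow be attacked offline on commodity cracking hardware.

```python
# Back-of-envelope comparison: on-device guessing vs a hypothetical offline attack.
# Both rate figures below are illustrative assumptions, not Apple's published numbers.

ON_DEVICE_SECONDS_PER_GUESS = 0.08       # key derivation deliberately slowed on the device
OFFLINE_GUESSES_PER_SECOND = 10_000_000  # what offline cracking hardware might manage

def worst_case(digits: int) -> str:
    keyspace = 10 ** digits                                        # all-numeric passcode
    on_device_hours = keyspace * ON_DEVICE_SECONDS_PER_GUESS / 3600
    offline_seconds = keyspace / OFFLINE_GUESSES_PER_SECOND
    return (f"{digits}-digit code: ~{on_device_hours:.1f} hours on the device, "
            f"~{offline_seconds:.3f} seconds offline")

for digits in (4, 6):
    print(worst_case(digits))
```

Even a six-digit code that would take the best part of a day of non-stop on-device guessing falls in a fraction of a second if the same job can be done offline, which is why the “on the iPhone only” design decision matters so much.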

Robots to the rescue

Robotic passcode testing devices exist that can type in passcodes much faster, more accurately, and for far longer than a human can manage…

…but iPhones have a secondary security setting that makes the device wipe all its data after 10 incorrect guesses.

That’s the same sort of protection used by financial institutions at ATMs, whereby the cashpoint machine swallows your bank card after three mistakes, so that a crook who steals your card has only a tiny chance of getting it to work.
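
As a rough illustration of how that sort of lock-out policy hangs together, here is a minimal sketch of a passcode gate that enforces escalating delays and then wipes after 10 failures. The delay schedule and the wipe behaviour below are assumptions made for the sketch, not Apple’s documented implementation.

```python
import time

# Minimal sketch of a lock-out policy: escalating delays, then a wipe after 10 failures.
# The delay schedule is an assumption for illustration, not Apple's documented behaviour.
DELAYS_AFTER_FAILURE = {5: 60, 6: 5 * 60, 7: 15 * 60, 8: 15 * 60, 9: 60 * 60}
WIPE_THRESHOLD = 10

class PasscodeGate:
    def __init__(self, correct_code: str) -> None:
        self._code = correct_code
        self._failures = 0
        self.wiped = False

    def try_code(self, guess: str) -> bool:
        if self.wiped:
            raise RuntimeError("device wiped: encryption keys destroyed")
        if guess == self._code:
            self._failures = 0          # success resets the counter
            return True
        self._failures += 1
        if self._failures >= WIPE_THRESHOLD:
            self.wiped = True           # point of no return: the data is gone
        else:
            time.sleep(DELAYS_AFTER_FAILURE.get(self._failures, 0))  # enforced cool-down
        return False
```

The crucial point is the combination: the delays stop a robot from racing through the keyspace, and the wipe puts a hard ceiling of 10 attempts on the whole exercise.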

Apparently, the FBI had already changed the murderer’s Apple ID password in order to access data he might have stored in Apple’s iCloud, instead of first trying to induce the locked-but-still-working iPhone to sync its local data, and only then going after the data in the cloud.

Thus the lawsuit, intended to force Apple to assist against its own judgment by deliberately “backdooring” its way around existing security measures, using a technique that critics said was far too general, and would create a dangerous precedent for all security vendors in the future.

It’s one thing to work with existing security vulnerabilities to recover protected data, but quite another, says the #nobackdoors argument, to create and deploy a purposeful security vulnerability that didn’t exist before.

Simply put, you can’t make security stronger for the future by deliberately weakening it.

After all, if you do deliberately weaken it, you create a hole that potentially puts everyone at risk, notably the law-abiding amongst us.

(Crooks don’t really care about backdoors. They’re crooks, so they’re unlikely to comply with any laws that require the use of backdoored encryption. They’ll just use unencumbered encryption software instead of or as well as the deliberately-weakened tools foisted on the rest of us.)

Peace with honour?

So, why has the FBI abandoned its lawsuit, at least for now?

According to media outlets the world over, a spokesperson for the US Department of Justice gave this reason:

As the government noted in its filing today [Monday 2016-03-28], the FBI has now successfully retrieved the data stored on the San Bernardino terrorist’s iPhone and therefore no longer requires the assistance from Apple required by this Court Order. The FBI is currently reviewing the information on the phone, consistent with standard investigatory procedures.

What next?

What we don’t know, and the FBI isn’t saying, is exactly what was retrieved, or how.

  • Perhaps the passcode was 0000 or 2580, and the FBI got lucky?
  • Perhaps the lock-out limit on guessing wasn’t turned on and so the FBI had thousands of tries, not just 10?
  • Perhaps the iPhone had enough unencrypted data left in RAM (rather than encrypted in flash storage) to help the investigation, or to leak the passcode?
  • Perhaps the FBI had a way of re-writing the iPhone’s RAM and flash storage to allow 10 guesses over and over again, albeit slowly? (There’s a conceptual sketch of that idea just after this list.)
  • Perhaps the FBI purchased an existing zero-day vulnerability in iOS to allow it to bypass security through an unintentional backdoor?
  • Perhaps the FBI recovered the passcode via some smart image processing, using fingerprint grease stains on the screen to infer the digits that had been used?

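The “rewrite the storage” possibility is often described as NAND mirroring: image the flash storage (including the failure counter), spend a few guesses, then restore the image and repeat. The sketch below is purely conceptual and runs against a simulated device; the class, the counter layout and the numbers are hypothetical, and a real attack would involve physically re-imaging the flash chips.

```python
# Conceptual sketch of the NAND-mirroring idea: snapshot the storage, spend a few
# guesses, restore the snapshot so the failure counter never reaches the wipe limit.
# SimulatedDevice and all figures here are hypothetical, for illustration only.
import copy

class SimulatedDevice:
    def __init__(self, passcode: str) -> None:
        self.storage = {"failure_counter": 0}   # stands in for the flash chips
        self._passcode = passcode

    def try_code(self, guess: str) -> bool:
        if self.storage["failure_counter"] >= 10:
            raise RuntimeError("wiped")
        if guess == self._passcode:
            return True
        self.storage["failure_counter"] += 1
        return False

def mirror_attack(device: SimulatedDevice, digits: int = 4) -> str | None:
    keyspace = 10 ** digits
    snapshot = copy.deepcopy(device.storage)            # "image" the flash
    for start in range(0, keyspace, 9):                 # at most 9 guesses per cycle
        for n in range(start, min(start + 9, keyspace)):
            if device.try_code(f"{n:0{digits}d}"):
                return f"{n:0{digits}d}"
        device.storage = copy.deepcopy(snapshot)        # restore the saved image
    return None

print(mirror_attack(SimulatedDevice("7341")))           # never trips the 10-guess wipe
```
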
What we can assume is this:

  • This court case is over, but the principle of legally-mandated backdoors is no closer to being settled.
  • Apple will be trying to figure out what the FBI did, and how to patch it.

After all, if Apple can figure out how the Feds got in, then so can anyone else, especially if information about how it was done leaks out. (Security through obscurity is a poor basis for cryptographic protection.)

As a result, the rest of us will want the hole fixed once it’s known…

…and so, we have to assume, will any number of regulatory bodies around the world, such as Information Commissioners’ Offices.

Ironically, privacy and data security regulators are widely tasked with using legal pressure to reduce the number of security holes that could lead to data breaches, privacy violations, and identity theft.