In the heated debate over encryption, backdoors, and a locked iPhone at the center of a terrorism investigation, facts have sometimes been overwhelmed by rhetoric.
Yesterday, we got to hear a lot of facts (and some rhetoric too) in a Congressional hearing about whether the US government can legally compel Apple to defeat its own security measures to unlock an iPhone at the request of law enforcement.
FBI Director James Comey and Apple General Counsel Bruce Sewell appeared before the House Judiciary Committee, where they presented prepared testimony and answered questions from lawmakers for several hours.
Cyrus Vance, Jr., district attorney for New York, and Dr. Susan Landau, a respected expert on cryptography, also testified before the committee.
Comey, who has talked for almost two years about the problem of criminals and terrorists “going dark” through the use of encrypted communications and devices, acknowledged in his testimony that “we’ve been talking past each other in the tech community and the government.”
In the interest of finding facts we can agree on, let’s review what’s happened in the past few weeks and look at what new information we learned from yesterday’s Congressional hearing in Washington.
Quick overview – how we got to this point
Governments have been demanding backdoors to defeat encryption for decades, but the temperature of the debate has risen in recent months, after the terrorist attacks in Paris last November and a mass shooting in San Bernardino, California, by self-proclaimed followers of Islamic State.
In pursuit of its investigation into the San Bernardino terrorist attack, the FBI wants to access data on an iPhone that was possessed by one of the shooters (who is now dead), although the device actually belonged to his employer, San Bernardino County.
The iPhone is protected by a passcode that nobody involved knows; the device is encrypted, and the data on it cannot be read without that passcode.
The US prosecutors in this case sought a court order, which was granted two weeks ago, compelling Apple to create a new version of iOS that the FBI could load onto the shooter’s iPhone.
This new code would allow the FBI to “brute force” the passcode by making unlimited guesses until it finds the right one.
Why the FBI can’t get data off the iPhone
The San Bernardino shooter’s device is an iPhone 5c running the most recent version of Apple’s mobile operating system, iOS 9, and it is protected with a four- to six-digit passcode on the lockscreen.
Since iOS 8, released in September 2014, encryption has been enabled by default, and Apple does not store the encryption key.
As Apple made clear in its response to the court order, it has no way to unlock the device without creating new code to defeat its own security.
Because the iPhone is running iOS 9, there are two protections in place that Apple designed to prevent the compromise of the passcode: an auto-erase function that deletes the device’s data after 10 failed passcode attempts; and a timing delay between passcode guesses.
Without these limitations, the FBI could brute force the passcode in 26 minutes, Comey said.
A third capability the FBI is seeking – alongside disabling the auto-erase function and removing the delays between guesses – is a way to submit passcode attempts electronically, rather than tapping them in on the touchscreen.
In yesterday’s hearing, Comey admitted that the FBI made a “mistake” in the hours after the attack, when it told San Bernardino County officials to change the shooter’s Apple ID password in order to access his iCloud account.
By doing so, they inadvertently cut off any possibility of the iPhone backing up its data automatically through a known Wi-Fi network.
Apple said it could have assisted the FBI in accessing that data through iCloud if it had first asked for technical assistance, which Apple has been providing in this and other cases.
Is the FBI seeking access to just this one iPhone?
Apple says this is about more than just a single iPhone in the San Bernardino case – and that is undeniably true.
The FBI has also asked for an iPhone to be unlocked in a New York drug case, although a judge ruled earlier this week that the law the government relied on does not give it the authority to compel Apple’s help.
According to the judge’s ruling in that case, there are a total of 12 pending cases in federal courts where the US government is seeking access to a locked and encrypted iPhone.
Yet there are potentially thousands more cases at the state and local level where law enforcement is in possession of locked iPhones.
Vance, the New York district attorney, testified that his office alone has 205 devices it wants unlocked, and there are another 100 in Texas and many more cases in other states.
Apple believes that going forward with the FBI’s solution for unlocking one iPhone would set a precedent that other courts would use to require a similar solution in other cases.
When asked whether the San Bernardino case would set such a precedent, Comey appeared to agree, saying: “sure, potentially.”
Comey also said the code the FBI is seeking could only be used on an iPhone 5c, and would not work on later iPhones such as the iPhone 6 or 6s.
But Sewell said that “nothing would preclude it from being used on any iPhone that’s out there.”
Why is Apple refusing to write the code?
In its legal brief to the California court, Apple argues that code is equivalent to speech, and believes that the court’s order to create code that does not exist is a violation of the First Amendment, because it would be “compelled speech.”
Apple also argues that creating the code would be “forced labor,” and a violation of the Fifth Amendment.
Beyond that, Apple says creating the code is “too dangerous” because it creates a backdoor that could be exploited by hackers or nation states, potentially putting millions of innocent iPhone users at risk.
Comey said in his testimony that he has “a lot of faith” that Apple can protect the code from falling into the wrong hands.
But in her testimony, encryption expert Landau said it is impossible to guarantee the security of the code, because it would be “the target of organized crime and nation states.”
Landau co-authored an influential paper with other cryptographers and security experts laying out a detailed technical rebuttal to the effectiveness of backdoors.
She said the government’s request for a tool to get around the passcode and encryption on an iPhone is “a security mistake” for several reasons.
One reason: requiring Apple to unlock many iPhones would mean setting up a new process – using the unique ID of each phone and digitally signing the code before loading it onto each device.
“It becomes a routine process and routine processes get subverted,” Landau said.
Where do we go from here?
Comey said he believes there can be a solution that will satisfy law enforcement and the tech companies, but such a solution does not exist today.
The FBI could look for its own backdoor to the iPhone without Apple’s assistance, but Comey said it hasn’t found a way in and other US government agencies (such as the NSA) did not offer a solution either.
In the meantime, these cases will continue to play out in the courts.
Congress could pass a law that requires Apple and other technology companies to create code such as what the FBI is seeking; or it could pass a law that does the opposite, prohibiting the creation of any backdoors.
At this point, however, a political solution seems to be as far off as a technological one.
Image of phone hacking courtesy of Shutterstock.com.
8 comments on “Apple and FBI testify in hearing on locked iPhone: What we learned”
A question: if the firmware is preventing access, and possibly erasing the data, why not disassemble the phone to get at the memory directly? Yes, it would still be encrypted, but with access and time, and without the OS/firmware protection, you’d think the encryption could be broken.
If all you’ve got is a raw data dump of the encrypted device, you have to try all 2^256 different AES keys, one at a time, until you find the one that unscrambles the data. Good luck with that 🙂
There’s a shortcut, which is to read the actual encryption key out of Apple’s “Secure Enclave,” a tamper-proof storage module that’s part of the fingerprint sensor hardware. And that is very, very hard – unless you know the passcode.
You can’t just connect up wires and read data out of the secure enclave, any more than you can read the next two-factor authentication number out of your 2FA token…and if you try too hard to hack it, it will simply forget everything it knew.
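The arithmetic behind that “good luck” is worth spelling out. A quick sketch – the guessing rate below is a deliberately absurd assumption, far beyond any real hardware:

```python
# Expected effort to exhaust the AES-256 keyspace by trial and error.
KEYSPACE = 2 ** 256                    # all possible AES-256 keys
GUESSES_PER_SECOND = 10 ** 18          # wildly optimistic assumed rate
SECONDS_PER_YEAR = 365 * 24 * 3600

years = KEYSPACE / GUESSES_PER_SECOND / SECONDS_PER_YEAR
print(f"{years:.2e} years to try every key")   # ~3.67e+51 years
```

For comparison, the universe is roughly 1.4 × 10^10 years old.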
I think this is one of the items the representatives were getting confused about: the difference between breaking the actual encryption and brute-forcing a passcode of defined length. From my perspective, the FBI director, Apple’s legal counsel and the security expert all did a great job, as did a lot of the Reps, who asked some very good questions. Some lines of questioning were clearly not thought out, but most were good, and mostly bipartisan, which I found interesting. The South Carolina (maybe North?) Rep needs to back off if he wants people to help him, though.
Apple claims that they do not store the key on the device?
After all the admissions of people being able to bypass security on any phone, you would think that one of these hackers would come forward to the FBI and ask for a deal in payment for unlocking the phone. Just a thought!
Just out of curiosity, does Apple actually encrypt the stored data? With the way an iPhone is built, it’s difficult at best to get to the point of access required for a physical connection. It just doesn’t seem like they would, but… On my home machine the disks are encrypted, which makes them a tough target.
I saw on the news this morning that some director in the government is siding with Apple.
Data on iPhones does get encrypted – all of it, by default now. Apple even put in specialized hardware to do it. There’s a unique AES key burned into the hardware of every phone that, practically speaking, cannot be read from the device without destroying it in the process. The user’s passcode does the initial unlocking of the device, which opens up access to the key, so to speak. It’s more complicated than that, but that’s the gist of it. (This kind of key pairing isn’t uncommon, either. Ransomware can use an RSA key to secure the AES key that encrypted the data being ransomed, and legit software like 1Password uses the user’s password hash (with salt) to unlock a much larger, random AES key that does the actual bulk of the encryption. It’s good because it makes getting just a dump of the raw data pretty useless, just like on an iPhone.)
So, yeah, the data is all encrypted and if you try to access it directly, you’ll have a looooooooong time to go trying to break it.
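For readers who want a feel for the key-wrapping idea described above, here is a toy sketch in Python. The names and parameters are purely illustrative, not Apple’s actual implementation – the real design also entangles a per-device hardware key inside the Secure Enclave, which is exactly what this sketch leaves out:

```python
# Toy illustration of key wrapping: a slow, salted KDF stretches a
# short passcode into a wrapping key, which protects the big random
# key that actually encrypts the data.
import hashlib, os, secrets

def wrap_master_key(passcode: str, salt: bytes, master_key: bytes) -> bytes:
    # Slow KDF: each guess costs 100,000 hash iterations, so guessing
    # even a short passcode offline is deliberately expensive.
    wrapping_key = hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, 100_000)
    # Toy "wrap" via XOR; real systems use a proper key-wrap construction.
    return bytes(a ^ b for a, b in zip(master_key, wrapping_key))

def unwrap_master_key(passcode: str, salt: bytes, wrapped: bytes) -> bytes:
    wrapping_key = hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, 100_000)
    return bytes(a ^ b for a, b in zip(wrapped, wrapping_key))

salt = os.urandom(16)
master_key = secrets.token_bytes(32)      # the big random AES-256 key
wrapped = wrap_master_key("1234", salt, master_key)

assert unwrap_master_key("1234", salt, wrapped) == master_key   # right passcode
assert unwrap_master_key("0000", salt, wrapped) != master_key   # wrong passcode
```

The XOR “wrap” is a stand-in for a real construction such as AES Key Wrap (RFC 3394); the point is only that a dump of the raw flash gives you the wrapped key, which is useless without the passcode.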
Sorry, but I don’t believe the FBI’s excuse. The vast majority of the data on this phone had to traverse ISP servers unencrypted if this guy was talking to anyone, so the NSA probably already has it. Also, brute-forcing all the AES keys is still within the realm of possibility, according to many retired NSA employees; it’s just a matter of sharing with the FBI and diverting resources. This is nothing but a political game: the FBI wants NSA powers, but it doesn’t have the military budget. The number one rule in computer forensics is always to make a backup first and work from the backup! Had the FBI initially contacted Apple, this would not be an issue!