How much of your personal data on your iPhone or iPad would you be willing to bet law enforcement or a hacker can grab from your device, even if you’ve encrypted it?
How about all of it?
A “backdoor” that Apple built into iOS for developers can be used to spy on iPhones and iPads by governments, law enforcement, or cyber criminals, according to forensics researcher Jonathan Zdziarski.
For the backdoor to be exploited by a spy, your iDevice needs to be synced to another computer via a feature called iOS pairing.
Once your iDevice is paired to your PC or Mac, they exchange encryption keys and certificates to establish an encrypted SSL tunnel, and the keys are never deleted unless the iPhone or iPad is wiped with a factory reset.
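Those pairing records live on the host computer as ordinary property-list files (on OS X, reportedly under /var/db/lockdown/, named by the device's UDID). Here's a minimal sketch of why a stolen record matters, using only Python's standard plistlib. The field names shown (HostID, HostPrivateKey, EscrowBag and so on) are the ones commonly reported for lockdown pairing records, so treat the exact set as an assumption rather than a specification:

```python
import plistlib

# Long-lived credentials commonly reported in an iOS lockdown pairing
# record (assumed field set; real records vary by iOS version).
SENSITIVE_FIELDS = [
    "HostID",
    "SystemBUID",
    "HostCertificate",
    "HostPrivateKey",
    "RootCertificate",
    "RootPrivateKey",
    "EscrowBag",
]

def summarize_pairing_record(raw: bytes) -> dict:
    """Parse a pairing-record plist and report which long-lived
    credentials it contains, without exposing the secrets themselves."""
    record = plistlib.loads(raw)
    return {field: field in record for field in SENSITIVE_FIELDS}

if __name__ == "__main__":
    # Synthetic stand-in for /var/db/lockdown/<UDID>.plist
    fake = plistlib.dumps({
        "HostID": "A1B2C3D4-0000-0000-0000-000000000000",
        "HostCertificate": b"-----BEGIN CERTIFICATE-----",
        "HostPrivateKey": b"-----BEGIN RSA PRIVATE KEY-----",
        "EscrowBag": b"\x00\x01",
    })
    print(summarize_pairing_record(fake))
```

The point of the sketch: everything needed to impersonate the trusted computer sits in one small file on the host, which is why copying it is enough and why only a factory reset on the device side invalidates it.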
That means a hacker could plant spyware on your computer to steal the pairing keys, allowing them to locate and connect to your device via Wi-Fi.
Because iPhones and iPads automatically connect to Wi-Fi networks with names they recognize, an attacker could then set up a hotspot using a spoofed network name to get your device to connect, and grab all your data.
In his talk at the HOPE X hacker conference on 18 July, Zdziarski argued that Apple's backdoors give access to personal data beyond what developers or Apple itself need.
Noting that the Snowden leaks revealed the National Security Agency (NSA) had exploited backdoors in iPhone, Android and BlackBerry devices, Zdziarski also suggested that the NSA may have used Apple's backdoors for easy access to iPhones and iPads.
Apple issued a statement to reporters, acknowledging the access through pairing.
But what Zdziarski described as a backdoor, Apple calls “diagnostic functions” – Apple said developers and IT departments need them for “troubleshooting.”
Apple’s statement also flatly denies any cooperation with the NSA, or government agencies “from any country.”
We have designed iOS so that its diagnostic functions do not compromise user privacy and security, but still provides needed information to enterprise IT departments, developers and Apple for troubleshooting technical issues.
A user must have unlocked their device and agreed to trust another computer before that computer is able to access this limited diagnostic data. The user must agree to share this information, and data is never transferred without their consent.
As we have said before, Apple has never worked with any government agency from any country to create a backdoor in any of our products or services.
On his own blog, Zdziarski explained further that he doesn’t think Apple is in cahoots with the NSA, but he said these features (or bugs) should not be in iOS.
Apple’s seeming admission to having these back doors, however legitimate a use they serve Apple, unfortunately has opened up some serious privacy weaknesses as well.
I think at the very least, this warrants an explanation and disclosure to the some 600 million customers out there running iOS devices.
The lack of disclosure of these security loopholes is a bit puzzling, but Apple seems to have, at least, done the disclosing part now.
Will Apple back down?
Will the programmers in Cupertino be instructed to remove the libraries, or perhaps limit their use to developers debugging their apps?
Chances are that’s not going to happen, not least because Apple obviously went to some trouble to get all this stuff working in the first place.