Forget ransomware, forget the Internet of Things, forget all the other computer security stories of recent days…
…except for the red-hot topic of 2016, #nobackdoors.
Simply put, IT backdoors are deliberately programmed weaknesses that give you a way to sidestep computer security when it suits you.
A bit like hiding a spare key to your house under the doormat, in case you lose your regular key while you’re out shopping.
You know you’re making a mockery of the good-quality lock you bought to give you better security in the first place…
…but, hey, as long as no one thinks to look under the mat, you should be OK.
Sadly, everyone knows to look under the doormat, so your well-chosen lock is as good as useless.
That’s exactly the same risk that we face if we accept programmatic backdoors in computer security products.
And it’s why, whenever we write about backdoors on Naked Security, our readers generally groan in collective dismay, leaving comments along the lines of, “What were they thinking?” or “Why did anyone ever imagine that could end well?”
Examples of tricks used to implement password backdoors include:
- Programming a hard-wired, “secret” password into the authentication software so that there is always a guaranteed way in.
- Getting device vendors to generate two passwords for every unit sold. You get one of them, which you can change, but the vendor keeps the other one somewhere, and you can neither change it nor delete it.
- Deliberately weakening an encryption algorithm so that it’s just secure enough to stop an average attacker from cracking it, but just weak enough that a serious adversary, such as the NSA or the PLA, could crack it if needed.
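The first of these tricks is easy to sketch in code. Here's a minimal, entirely hypothetical example (the function, the "secret" password and the user database are invented for illustration, and real systems would hash passwords rather than compare them directly):

```python
# Hypothetical sketch of a hard-wired backdoor in an authentication
# routine. The "master" password is baked into the code itself, so
# anyone who extracts it from the shipped binary, or sees the source
# leak, can log in to EVERY account, forever.

MASTER_PASSWORD = "s3cret-service-key"   # invented for this sketch


def check_login(username: str, password: str, user_db: dict) -> bool:
    # The backdoor: this branch succeeds for any account at all...
    if password == MASTER_PASSWORD:
        return True
    # ...while ordinary users go through the regular check.
    return user_db.get(username) == password


users = {"alice": "correct horse battery staple"}
print(check_login("alice", "correct horse battery staple", users))  # True: normal login
print(check_login("alice", "s3cret-service-key", users))            # True: the backdoor
print(check_login("bob", "s3cret-service-key", users))              # True: even for accounts that don't exist!
```

That last line is the "key under the doormat" problem in miniature: the lock still works, but only for people who haven't yet looked under the mat.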
All of these approaches carry obvious and massive risks:
- Hard-wired passwords are like a key under the doormat. As soon as someone reveals the secret, all security bets are off.
- Vendor-stored passwords are simply a technological “sword of Damocles” hanging over your head. At any time, some or all of the password database could be stolen in a data breach, sold off by crooked insiders, or acquired by court order. You simply can’t tell what security you have, if any.
- Weakened encryption systems get weaker over time as computers get faster. Cracking times fall year-by-year until they’re within reach of the average cybercrime gang, and ultimately even of a determined loner at home.
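To put that last risk into rough numbers, here's a toy back-of-envelope model. It assumes a 56-bit keyspace (DES-sized) and an attacker whose guessing rate doubles every two years; the starting rate of a billion guesses a second is an assumption for illustration, not a measurement of any real adversary:

```python
# Toy model: how long a deliberately weakened cipher survives as
# attackers get faster. Assumes a 56-bit effective keyspace and an
# attacker whose throughput doubles every two years. All figures are
# illustrative, not a real cryptanalysis.

keyspace = 2 ** 56                  # total keys to try
rate_today = 1e9                    # guesses/second today (assumed)
seconds_per_year = 365 * 24 * 3600

for years_from_now in (0, 6, 12, 18):
    rate = rate_today * 2 ** (years_from_now / 2)       # doubling every 2 years
    avg_crack_years = (keyspace / 2) / rate / seconds_per_year
    print(f"year +{years_from_now:2d}: ~{avg_crack_years:10.4f} years to crack on average")
```

On these assumptions, a keyspace that takes over a year to search today falls to under a day within two decades, which is exactly why "just weak enough for the NSA" quietly becomes "weak enough for everybody".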
In the plainly spoken words of the Information Technology Industry Council: “Weakening security with the aim of advancing security simply does not make sense.”
We agree, and that’s why we’ve published our own #nobackdoors page right on the Sophos website.
Standing up for #nobackdoors is especially important right now, as Apple prepares to fight a US court order that as good as demands that the company come up with a backdoor to allow the FBI to access a passworded iPhone that’s part of a serious criminal investigation.
It’s a socially and emotionally charged case, because the FBI only wants to “backdoor” a single iPhone, and it’s one that was used by Syed Rizwan Farook.
Farook isn’t around to reveal the password himself: he was shot dead, along with his wife, after killing 14 people and seriously wounding 22 in a mass shooting in San Bernardino, California, on 2 December 2015.
Nevertheless, Apple is determined to stand its ground, arguing that to create a programmatic backdoor, even in a dramatic case like this, would open a password-cracking Pandora’s Box.
To backdoor one iPhone would effectively betray all of Apple’s many millions of law-abiding customers, and pave the way for similar writs against other American companies and their customers.
Unsurprisingly, other American companies, including Google, WhatsApp and Microsoft, are backing Apple and saying, #nobackdoors.
And so is Sophos, because weakening security with the aim of advancing security simply does not make sense.
18 comments on “Sophos says: #nobackdoors!”
Totally agree. I also think the industry should be calling out the administration for their obvious political theater and posturing. This is hardly a topic worthy of discussion when the facts are presented.
So WhatsApp is security-minded! Interesting…
While the backdoor issue may be debatable (frankly, I can think of numerous ways it could be done by a manufacturer), the issue with Apple is disingenuous, as the FBI is not requesting a permanent backdoor. It is looking to change some of the security features which disallow multiple password entries, among other features, all of which can be done at Apple HQ without FBI interference. This is not about backdoors, this is about a court order, and I personally hope Apple and Cook get sued from here to there for contempt of court.
“It is looking to change some of the security features” — that’s called adding a back door to the security. Even if the back door only gets you into the garage instead of into the house proper, it’s still a back door. And software written to target a specific (previous generation) phone doesn’t suddenly get unwritten as soon as it has done its work. The FBI will need access to it, the Apple development team will need access to it, and it will likely live on in backups and future All Writs requests from government agencies worldwide.
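In code terms, the protection the FBI wants switched off looks roughly like this. This is a made-up sketch of a retry limit with auto-wipe, not Apple's actual implementation, and the ten-attempt limit is just the commonly reported iOS setting:

```python
# Rough sketch of the kind of retry-limit protection at issue
# (invented logic, not Apple's real code): after too many wrong
# PINs, the device wipes itself, so brute-forcing the PIN is useless.
# "Changing this security feature" means removing exactly this check.

MAX_ATTEMPTS = 10


class Device:
    def __init__(self, pin: str):
        self._pin = pin
        self._failures = 0
        self.wiped = False

    def try_unlock(self, guess: str) -> bool:
        if self.wiped:
            return False
        if guess == self._pin:
            self._failures = 0
            return True
        self._failures += 1
        if self._failures >= MAX_ATTEMPTS:
            self.wiped = True   # data is gone; brute force stops here
        return False


d = Device("1234")
for guess in (f"{i:04d}" for i in range(10)):   # ten wrong guesses
    d.try_unlock(guess)
print(d.wiped)               # True: the device has wiped itself
print(d.try_unlock("1234"))  # False: even the right PIN no longer works
```

A version of the firmware without that `if self._failures >= MAX_ATTEMPTS` branch would let anyone holding the phone try all 10,000 four-digit PINs at leisure — which is why "just change a feature" and "build a backdoor" describe the same thing.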
But you’re right; the issue isn’t really about backdoors, it’s about the government using the All Writs Act to compel speech (because the courts have already determined that writing software is speech).
As for the court order itself, there’s been no contempt of court. Apple has a week to respond to the order, and it is expected that it will respond by appealing the decision. This will send the case to the appeals court. This is the way the law is supposed to work.
This is not about terrorism.
This is about totalitarianism.
“the FBI only wants to “backdoor” a single iPhone, and it’s one that belonged to Syed Rizwan Farook.”
Except that the terrorist never owned that iPhone; his employer did. And his employer (the owner of said iPhone) approves of the FBI gaining access to its contents. #nobackdoors: we just need Apple to cooperate with the federal court order.
I’ll adjust the word “belonged” to “was used by,” which is the point I wanted to make.
11:27am PST. Was there just an update from the DoJ to compel Apple?
The news I saw said that the court extended Apple’s deadline to respond. Apple now has until 26 Feb 2016.
This is really embarrassing for the US. The Feds have the phone, but they are so technologically backward that they cannot get what they want, so they have to try to beg, shame or bully Apple into being their consultant, despite the billions we have spent on Homeland Security.
And for what? It is not going to undo what happened and is very unlikely to bring anyone else to justice.
I am sure that in North Korea and China they are all having a good laugh over this.
Meanwhile the TSA can continue to grope travelers’ crotches and paw through Granny’s luggage in case she might have > 3ozs of White Shoulders.
So you think these tech companies should be allowed to hide terrorist activity? On the one hand everyone is screaming about protection from terrorism yet they won’t allow law enforcement with proper warrants to access a known terrorist’s mobile? I see this as contributing to terrorist activity, not diminishing it.
Short answer: no.
Long answer: Apple isn’t hiding terrorist activity; they’re arguing that the government shouldn’t be able to compel them to write custom tools to break carefully designed security protocols in their product.
This would be like the FBI demanding that Microsoft write a patch for Windows that disables tamper protection. Or, to make it more analog, this would be like the FBI demanding a vault maker create a tool that disables the time-lock feature on their vaults — so the FBI can use it to get into a specific vault to see if a terrorist’s safety deposit box holds anything of interest.
The question that nobody in favour of this hack seems willing to answer is how you keep this tool away from anyone with ISIS sympathies once you’ve made it. These things have a terrible record of being leaked to malefactors.
Under the current plans, if the government hack goes ahead, then you must assume that ISIS can access the iPhones of anybody they take hostage.
In this case, you could argue that the FBI’s request to “do it in RAM” solves the disclosure problem…sort of…but there’s the bigger problem of, “What next?” If the backdoored phone revealed a single GPGed file, for example, would the GPG team be next in the firing line behind Apple? Heck, if we aren’t careful, this approach of court-mandated one-off backdoors could be used for what is almost a DDoS attack where we turn on each other…
Even though any terrorist (domestic or foreign) should be stopped, there is a limit. There are many great comments here about why Apple should stand its ground.
Please allow me to give the greatest reason why Apple should stand its ground. It’s a quote by that old guy on your $100 bill…
“They who can give up essential liberty to obtain a little temporary safety deserve neither liberty nor safety.”
Guys, this really is a “heads I win, tails you lose” debate.
Do I want the FBI to be able to access that despicable, cowardly terrorist’s SMSes and emails, so that they can find out who collaborated with them before more innocent people and children are murdered in cold blood? Absolutely.
Do I want them to then have free access to all our data? Definitely not.
Does anyone actually have the answer? In all probability, no, not right now, no matter how loudly certain imbeciles scream that they know exactly what is right for everyone else on the planet, or beyond for that matter.
Perhaps a subsidiary of the Supreme Court to decide on each case individually?
I really do not know, but it is something for which a solution must be found, and sooner rather than later.
Is all Sophos software open source, so that we can verify that there are indeed no backdoors?
Same goes for Apple. Until Apple shows the source code there’s little reason for us to trust them.
The public fight against the FBI tells us nothing about what secret deals Apple has with governments.
I back Apple’s stance. The FBI says it’s a “once only”, but no one can uninvent what has been invented. And who is going to pay for it? It needs to be designed, coded and tested before going to the FBI, and that costs money to produce.