Apple CEO Tim Cook sticks to his guns: “No encryption backdoors”

Apple CEO Tim Cook appeared on CBS’s 60 Minutes TV show last night.

As you can probably imagine, the topic of encryption came up, in particular the issue of what are known as backdoors.

Bluntly put, a backdoor is a deliberate security hole – for example, an undocumented master decryption key – that is knowingly added to a software product.
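To see why an “undocumented master decryption key” is so dangerous, here’s a deliberately toy sketch (hypothetical code, not real cryptography) of a product that escrows every per-message key under a hard-coded vendor key. Anyone who learns `MASTER_KEY` — vendor, crook, or spy — can read everything:

```python
import os
import hashlib

MASTER_KEY = b"vendor-master-key"  # the backdoor: a hypothetical hard-coded key

def _xor(data: bytes, key: bytes) -> bytes:
    # Toy stream cipher: XOR with a SHA-256-derived keystream (illustration only).
    stream = b""
    i = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + i.to_bytes(4, "big")).digest()
        i += 1
    return bytes(a ^ b for a, b in zip(data, stream))

def encrypt(plaintext: bytes, user_key: bytes) -> dict:
    msg_key = os.urandom(32)
    # The backdoor in action: the per-message key is wrapped twice --
    # once for the user, and once under the vendor's master key.
    return {
        "ct": _xor(plaintext, msg_key),
        "wrapped_for_user": _xor(msg_key, user_key),
        "wrapped_for_vendor": _xor(msg_key, MASTER_KEY),
    }

def decrypt(blob: dict, key: bytes, via: str = "user") -> bytes:
    field = "wrapped_for_user" if via == "user" else "wrapped_for_vendor"
    msg_key = _xor(blob[field], key)
    return _xor(blob["ct"], msg_key)

blob = encrypt(b"meet at noon", b"my secret passphrase")
assert decrypt(blob, b"my secret passphrase") == b"meet at noon"
# The same hole works for anyone who has the master key, good guy or bad:
assert decrypt(blob, MASTER_KEY, via="vendor") == b"meet at noon"
```

The point of the sketch: the backdoor isn’t a separate, controllable channel — it’s just a second key that decrypts the same ciphertext, and its security rests entirely on a single secret staying secret forever.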

Some backdoors are there as a temporary convenience, for example to speed things up during development, a bit like wedging your real-life back door open while you’re shuttling the garbage out into the yard.

But temporary software backdoors have a way of getting forgotten, and ending up in production builds, which is a strong argument for avoiding backdoors in the first place, convenience notwithstanding.

Some backdoors are there as a “feature”, for example so that the support desk can help you more quickly if you are on the road and forget your password, without needing to read you a lengthy, one-off recovery code that you have to type in within a limited time.

But backdoors like this soon end up widely known, and widely misused, which is a strong argument for avoiding backdoors in the first place, convenience notwithstanding.

Lastly, some backdoors are requested by law enforcement or a country’s regulators, supposedly as an aid in fighting crime.

The claim is that strong encryption that can’t be cracked gives criminals and terrorists an unfair advantage, because it means they can communicate without fear of their conversations being eavesdropped on or investigated.

Unbreakable encryption, say its detractors, is as good as contempt of court, because crooks can laugh at search warrants that they know can’t be enforced.

But Tim Cook told 60 Minutes that he doesn’t agree:

Here’s the situation…on your smartphone today, on your iPhone. There’s likely health information, there’s financial information. There are intimate conversations with your family, or your co-workers. There’s probably business secrets and you should have the ability to protect it. And the only way we know how to do that, is to encrypt it. Why is that? It’s because if there’s a way to get in, then somebody will find the way in. There have been people that suggest that we should have a backdoor. But the reality is if you put a backdoor in, that backdoor’s for everybody, for good guys and bad guys.

Which is a strong argument for avoiding backdoors in the first place, convenience notwithstanding.

Indeed, mandatory cryptographic backdoors will leave all of us at increased risk of data compromise, possibly on a massive scale, by crooks and terrorists…

…whose illegal activities we will be able to eavesdrop on and investigate only if they too comply with the law by using backdoored encryption software themselves.

In other words, Tim Cook is right: if you put in cryptographic backdoors, the good guys lose for sure, while the bad guys only lose if they are careless.

We know this because we have tried enforcing mandatory backdoors before, and it did not end well.

In the 1990s, for example, the US had laws requiring American software companies to use deliberately weakened encryption algorithms in software for export.

The US legislators intended that these export-grade ciphers would make it safe to sell cryptographic software even to potential enemies because their traffic would always be crackable.

But the regulations ended up affecting Americans in a double-whammy:

  • International customers simply bought non-US products instead, hurting US encryption vendors.
  • EXPORT_GRADE ciphers lived on long after they were no longer legally required, leaving behind vulnerabilities such as FREAK and LOGJAM that potentially put all of us at risk.
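The weakness behind attacks like FREAK is simple arithmetic: export-grade rules capped RSA keys at 512 bits, short enough that an attacker can factor the modulus and reconstruct the private key. Here’s a scaled-down sketch (a toy modulus, so trial division finishes instantly — real attacks need serious computing power, but the principle is identical):

```python
from math import isqrt

def factor(n: int) -> tuple:
    # Trial division: only feasible because the modulus is deliberately weak.
    for p in range(2, isqrt(n) + 1):
        if n % p == 0:
            return p, n // p
    raise ValueError("no factor found")

def crack_rsa(n: int, e: int, ciphertext: int) -> int:
    p, q = factor(n)                    # step 1: factor the weak modulus
    d = pow(e, -1, (p - 1) * (q - 1))   # step 2: derive the private exponent
    return pow(ciphertext, d, n)        # step 3: decrypt the captured traffic

# A toy "export-grade" key: n = 61 * 53 = 3233, public exponent e = 17.
n, e = 61 * 53, 17
secret = 42
ct = pow(secret, e, n)                  # what a passive eavesdropper captures
assert crack_rsa(n, e, ct) == secret    # recovered without ever being given a key
```

That is why a legally mandated key-length cap is a backdoor in all but name: the “way in” is baked into the mathematics, available to anyone willing to spend the compute, long after the law that required it has gone.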

Those who cannot remember the past are condemned to repeat it.

💡 LEARN MORE – To encrypt or not to encrypt? We explore the issues ►

💡 LEARN MORE – The FREAK bug, a side-effect of weakened encryption ►

💡 LEARN MORE – The LOGJAM bug, another side-effect of weakened encryption ►

Image of open doors courtesy of Shutterstock.