Android developers - just how much can we trust them to do web security properly?

Filed Under: Android, Cryptography, Featured, Google, Privacy

Six German academics have just published a very detailed and systematic paper about web security on Android.

Catchily entitled Why Eve and Mallory Love Android: An Analysis of Android SSL (In)Security, the paper sets out, amongst other things, to answer the question, "Just how well-informed are Android developers, and how much can we trust them to do web security properly?"

As you can imagine from the title, the answer is, "Not enough."

By the way, in cryptographic documentation, Alice and Bob are always the two parties who want to communicate (longhand for A and B), while Eve is the eavesdropper, and Mallory (who is sometimes known as Mallet) is the malicious man-in-the-middler.

→ A man-in-the-middle (MITM) attack is devious but simple. I trick you into connecting to me, instead of, say, to your bank. You do a transaction, but I suck up all the data: username, account number, token code, the lot. I then immediately use this data, while it's still valid, to transact with your bank. Except that I pay the money to myself.

So, if the Eves and Mallories of the world really love Android, as the authors claim, that's bad news.

I won't try to encapsulate the intricacies of the whole paper here - it's worth reading in its entirety, not in some newsily abbreviated form - but I am going to discuss some of its findings.

Here's what interested me: the authors downloaded 13,500 apps from the Google Play Store. They looked at apps that used HTTPS (HTTP over SSL, or secure HTTP), with a view to finding out how many apps that bothered with encryption actually bothered to do it properly.

→ Just to remind you: SSL is intended to deliver a security trifecta of confidentiality (the data transmitted is encrypted), integrity (the data hasn't been tampered with) and, through a system of digital certificates, authenticity (you really are talking to the right server). It's the digital certificates that stop MITM attacks: used correctly, the certificates mean that Mallory can't pretend to be your bank. He can get in the middle, but you'll notice.
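To see what "used correctly" looks like in code: the stock Java/Android HttpsURLConnection already performs certificate-chain validation and hostname checking by default, so the safest move is simply not to override those defaults. A minimal sketch (the URL is just an example, and nothing here actually opens a network connection):

```java
import java.net.URL;
import javax.net.ssl.HttpsURLConnection;

class DefaultTls {
    // Open an HTTPS connection using the platform defaults.
    // Left alone, HttpsURLConnection validates the certificate chain
    // (authenticity and integrity), and the default HostnameVerifier
    // checks that the certificate matches the host being contacted.
    static HttpsURLConnection open(String url) throws Exception {
        HttpsURLConnection conn =
                (HttpsURLConnection) new URL(url).openConnection();
        // Crucially: do NOT call conn.setSSLSocketFactory(...) or
        // conn.setHostnameVerifier(...) with home-made "accept anything"
        // replacements -- that is exactly the mistake discussed below.
        return conn;
    }
}
```

Note that openConnection() merely constructs the connection object; no handshake happens until you actually read from it, which is why leaving the defaults in place costs nothing.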

What the authors discovered was bemusing. 790 of the apps went to all the trouble of using SSL, but accepted any certificate at all. For all you could tell, your data might be perfectly secure all the way to a crook's website.
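The paper doesn't reprint the offending code, but the "accept any certificate" mistake almost always takes the same well-known form: an X509TrustManager whose checking methods are left empty. This sketch is illustrative, not taken from any specific app:

```java
import java.security.cert.X509Certificate;
import javax.net.ssl.X509TrustManager;

// ANTI-PATTERN -- do not ship this. A trust manager whose check
// methods never throw CertificateException "trusts" every certificate,
// so SSL still encrypts the traffic but no longer proves who is on
// the other end: Mallory's certificate is accepted just like the bank's.
class TrustAllManager implements X509TrustManager {
    @Override
    public void checkClientTrusted(X509Certificate[] chain, String authType) {
        // empty: any client certificate passes
    }

    @Override
    public void checkServerTrusted(X509Certificate[] chain, String authType) {
        // empty: any server certificate passes -- this is the bug
    }

    @Override
    public X509Certificate[] getAcceptedIssuers() {
        return new X509Certificate[0];
    }
}
```

Typically a manager like this gets wired in via SSLContext.init(), often just to silence certificate errors against a self-signed test server during development, and then ships to production unchanged.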

284 of the apps did slightly better: they insisted on an approved certificate (one that was issued by an approved Certification Authority, or CA), but didn't care which site it had been issued for.

→ SSL certificates deliberately have a website name knitted into them, precisely so that a crook can't create one for his own site and then use it as if it belonged to someone else. You're supposed to check that the site and its certificate match to prevent this sort of masquerading.
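The matching anti-pattern behind the 284 "right CA, wrong site" apps is a HostnameVerifier that skips the name check entirely. Again, a hypothetical sketch of the mistake, not code from any particular app:

```java
import javax.net.ssl.HostnameVerifier;
import javax.net.ssl.SSLSession;

// ANTI-PATTERN -- do not ship this. The verifier is asked whether the
// certificate presented in this SSL session really belongs to the host
// we meant to contact; answering "true" unconditionally means a valid
// CA-issued certificate for evil.example.net is accepted when talking
// to bank.example.com.
class AllowAllVerifier implements HostnameVerifier {
    @Override
    public boolean verify(String hostname, SSLSession session) {
        return true; // never compares names -- this is the bug
    }
}
```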

There were lots of other problems, too, notably that most apps using HTTPS didn't provide any signal that they were doing so.

With nothing to differentiate between an encrypted connection and an insecure one, there's nothing for security-savvy users to watch out for.

For years we've been trying to teach ourselves to watch out for signals such as the browser padlock, and to wean ourselves off the habit of clicking past certificate warnings.

Then - a cynic might conclude - along comes a raft of non-browser apps on Android to set us back a decade or so.

The authors suggest a number of remedies, including:

  1. Expecting developers to learn from what browser makers are doing, and to spend time and effort finding and clearly informing users about security problems.
  2. Expecting Google (and other app market providers) to raise the bar for Play Store apps by vetting apps for weak web security coding.
  3. Using an automatic verification tool to help with both of the above.

In good, salesy style, the authors just happen to have written a software product for point (3): MalloDroid.

The good news is that they're planning to release it as an online service, and it sounds as though it'll be free. Presumably you'll upload an app of interest to their server and it will tell you whether it contains any cryptographically inept web functionality.

That'll be a good starting point.

I like the fact that on Android, the barrier to entry for software developers is lower (and the freedom and variety for users greater) than on Apple's iOS. It feels more egalitarian and less controllingly commercial.

But I don't like the fact that the barrier to entry - even in the official market, the Play Store - is as low as this paper suggests. You can be open and egalitarian without being sloppy.



8 Responses to Android developers - just how much can we trust them to do web security properly?

  1. Myself · 645 days ago

    Less than 8% of problematic apps? Doesn't seem so bad.

    • Paul Ducklin · 645 days ago

      It's not 8% of problematic apps - it's 8% of all apps selected (popularity was used as a filter IIRC), even including those that don't use the network at all. Doesn't seem so good to me.

      Remember - these are apps endorsed by inclusion in the Play Store.

      The paper goes on to examine 100 of the problematic apps in more detail, revealing yet more woes in a small but non-trivial sample of real-world apps, including ones related to online financial transactions.

  2. @secolive · 645 days ago

    Quoting @txs, "developers will never ever produce secure code". If the code/application is not reviewed from a security standpoint in some way, it is insecure (just like an app which hasn't been tested does not work). Period.

    • Gavin · 644 days ago

      It's a generalization, but an all-too-accurate one. I couldn't agree more -- which is why the authors' Recommendation #2 is so needed:

      "Expecting Google (and other app market providers) to raise the bar for Play Store apps by vetting apps for weak web security coding."

      Unless there is improved enforcement of basic security (implementing SSL vaguely correctly should be considered pretty basic if you're writing web apps) the situation will likely not improve.

      Developers are like all other human beings and -- on average -- work to the lowest common denominator of what is required.

  3. Pete · 644 days ago

    This article touches a raw nerve in that it raises the issue of authenticity. It's a pet peeve I've had for years, and one that seems to have fallen on deaf ears. At least, no one has jumped to implement it yet.

    Consider the problem of email spam. I don't like it, and I know many others don't, but it seems that the vast majority of people don't care, judging by their email habits. Anyhow, what if everyone could elect to receive mail only from senders who sign their messages with identity-trusted certificates? (OK...obviously, this is already a fantasy, but stay with me on this...)

    What if those who host SMTP servers refused to send messages that weren't signed by identity-trusted certs? Wouldn't that obliterate a large amount of email spam? Granted, that wouldn't eliminate the stuff that comes from junk servers whose sole purpose is to send spam. But what if my ISP's or other mail hosts' servers wouldn't even accept incoming messages that weren't signed by identity-trusted certs? Wouldn't that reject virtually all of the rest of it?

    I say "virtually" because the potential for bogus or stolen certs always exists, but I'm not talking about perfection here; I'm talking about getting rid of the vast majority of the junk. A huge amount of the junk I receive comes from jerks who obviously don't want their identity known.

    Why wouldn't ISPs want to educate their users, make it easy for them to get identity-trusted certs, and provide a vastly more secure environment for them? Wouldn't it pay in recovered bandwidth and reduced storage requirements if the vast bulk of the junk were extinct?

    I don't have enough information to do a cost analysis, but it seems to me that the only people who could possibly object would be those who are inclined to send garbage in the first place.

  4. njorl · 644 days ago

    Could Android offer all the tricky SSL stuff on its side of the API? Perhaps the provided interface to the internet could try to use HTTPS by default. Apps could be flagged to say whether they're allowed to fall back to a non-encrypted connection.

  5. Sum Guy · 636 days ago

    It seems with every new app I install, my phone runs worse. The adware that comes with these apps is a virus in my opinion. I guess since I installed it, the Sophos app sees it as malware. The permissions these apps require are unnecessary, but I guess it is the cost of free. But the paid versions of any given app don't change the resources used, from what I have seen.

    I am not happy with Android's fragmentation. I won't buy an iPhone due to Apple's patent trolling. I think all the phone operating systems out there still suck and it is too early to be trolling. I may try Windows Phone 8 next, but I am sure that will be a disappointment too. The only thing that will make me happy is when I can use my phone to run a desktop environment natively. With current specs being close to 2003's $400 PCs, I can see that being a possibility now. I am hoping the Surface release will lead to this. I just hope Apple doesn't screw it up with troll suits like they always do.



About the author

Paul Ducklin is a passionate security proselytiser. (That's like an evangelist, but more so!) He lives and breathes computer security, and would be happy for you to do so, too. Paul won the inaugural AusCERT Director's Award for Individual Excellence in Computer Security in 2009. Follow him on Twitter: @duckblog