Starbucks got into and out of privacy trouble over the past week.
The brouhaha started when a US researcher named Daniel Wood went public on the Full Disclosure security mailing list, reporting a rather serious data leakage problem in the Starbucks iOS mobile app.
We’ve written (and spoken) about the shortcomings of mobile apps before.
Even organisations that take security seriously when you interact with them in your browser, such as financial institutions, have been found wanting when it comes to their mobile apps.
Starbucks, it seems, fell into the same trap.
Wood didn’t say so in his Full Disclosure posting, but one news report suggested he went public to push Starbucks to act after getting the run-around for a couple of months from Starbucks’ customer service.
The main part of Wood’s complaint was that Starbucks was sloppy with the data it wrote into its log files, dumping usernames and passwords in plaintext.
Of course, app logs can be very handy, notably if something goes wrong with your software.
Mobile apps are online most of the time, so you can easily and efficiently collect logfiles after a crash.
Logfiles are really useful when you are debugging software that is used by millions of people in a myriad of different configurations on thousands of different networks.
Indeed, it looks as though the main purpose of the Starbucks logfile was to help the company’s developers, because the logs are created by a third-party software component called, amusingly, Crashlytics.
As Wood found out, even if you protected the Starbucks app with a PIN (thus inhibiting someone who snatched your phone from firing up the app and paying for their own coffee at a click), the logfiles were unencrypted and accessible to attackers who got hold of your device.
Of course, the most effective crash-busting logs are those that include useful information leading up to the crash, rather than merely what could be extracted from the app’s memory image after it has imploded.
In other words, even if an app doesn’t crash, it may be quietly squirreling away a hoard of information, including, in the case of Starbucks, your username and password.
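The logging side of the fix is conceptually simple: scrub credentials before anything reaches the log. Here is a minimal sketch in Python (rather than the app’s own Objective-C); the `RedactingFilter` class and the field names are our own illustrative choices, not anything from the Starbucks code:

```python
import logging
import re

# Hypothetical field names; a real app should redact every credential-bearing key.
SENSITIVE_KEYS = ("password", "username", "token")

class RedactingFilter(logging.Filter):
    """Masks the values of sensitive key=value pairs before a record is emitted."""
    _pattern = re.compile(
        r"(%s)=([^\s&]+)" % "|".join(SENSITIVE_KEYS), re.IGNORECASE
    )

    def filter(self, record):
        # Rewrite the message in place, keeping the key but hiding the value.
        record.msg = self._pattern.sub(r"\1=[REDACTED]", str(record.msg))
        return True  # keep the record, just with scrubbed content

logger = logging.getLogger("app")
logger.propagate = False  # don't let an unscrubbed copy reach other handlers
handler = logging.StreamHandler()
handler.addFilter(RedactingFilter())
logger.addHandler(handler)

logger.warning("login failed: username=jdoe&password=latte123")
# emits: login failed: username=[REDACTED]&password=[REDACTED]
```

The point is that redaction happens centrally, at the logging layer, so no individual debug statement scattered through the codebase can accidentally leak a credential.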
But usernames and passwords constitute PII (personally identifiable information) and ought not to be stored unencrypted, not least in this case because anyone who acquires them can freely spend your money at Starbucks.
Starbucks offers an “auto-reload” feature that will grab money from your payment card to replenish your Starbucks account if it runs low, so that you will never be caught short at the checkout with a steaming Grande in your hand. If you have that feature turned on, your financial risk from a security blunder in the mobile app is, obviously, much greater.
Starbucks, it seems, has published an update to the app, and no longer logs the offending information.
Sadly, the company didn’t manage to avoid the now-traditional cliches, leading with the words, “Your security is incredibly important to us,” and claiming that the software update was produced “out of an abundance of caution.”
We’ll disagree with both of those statements, because they don’t explain why anyone thought it was appropriate to store the user’s raw password in cleartext in the first place, for any purpose.
Here’s our advice:
- If your app must store actual passwords, for example so that users don’t have to type them in every time they grab a cup of coffee, use a secure storage mechanism such as the Apple Keychain (as the updated Starbucks app now apparently does).
- Never allow decrypted passwords to be written to disk, even to temporary files, or sent across the network except over a secure connection such as SSL/TLS.
- Never store the passwords for an online service on the server, even encrypted. You simply don’t need to. Use a salt-hash-stretch technique instead, and store one-way hashes that will validate passwords, but can’t be reversed to recover the actual password.
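The salt-hash-stretch advice above can be sketched in a few lines. This is an illustration in Python using PBKDF2 from the standard library; the function names and iteration count are our own choices for the example, not a description of any particular service:

```python
import hashlib
import hmac
import os

ITERATIONS = 200_000  # the "stretch": deliberately slow down brute-force guessing

def hash_password(password, salt=None):
    """Return (salt, hash). A fresh random salt defeats precomputed rainbow tables."""
    if salt is None:
        salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return salt, digest

def verify_password(password, salt, stored):
    """Recompute the hash from the candidate password and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return hmac.compare_digest(candidate, stored)

# The server stores only (salt, digest); the password itself is never kept.
salt, stored = hash_password("latte123")
assert verify_password("latte123", salt, stored)
assert not verify_password("espresso", salt, stored)
```

Because the stored value is a one-way hash, even a crook who steals the whole database still has to guess passwords one expensive iteration-count at a time, rather than simply reading them out.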
Further information
Learn more about SSL in our Techknow podcast, Understanding SSL.
Learn more about server-side safe password storage in our Serious Security article, How to store your users’ passwords safely.
Moral of the story – don’t use these ‘apps’ at all as they are all potentially unsafe. Solution, good old cash.
So What? It’s hardly national security stuff. Yeah, they made a mistake, they then fixed it as soon as it was pointed out to them. Hardly worth all this hate.
This sort of melodramatic response is NOT going to help foster a healthy attitude toward security researchers within large companies. The next time this happens, they’ll look at what happened when Starbucks behaved responsibly and say “well that didn’t work for them, let’s try something else.”
Indeed, we should let everything go and just say, hahaha, until it’s important. Since that sends a great message about how to behave. Is the backlash a little overkill? Only in regards to misleading articles and by people who don’t understand the smallish scope of this mistake. Still, in the end, Starbucks could have prevented this by not making the mistake in the first place.
Not every incident needs to deal with national security. All of these other smaller cuts attack the fabric and expectation of doing security correctly at any level. And I’m sure someone has learned from Starbucks’ mistake.
Are you serious? *Melodrama*? (I don’t know whether to be indignant or very slightly flattered 🙂)
Firstly, I think I was pretty conciliatory in the opening sentence: “Starbucks got into and out of privacy trouble over the past week.”.
Secondly, this is the sort of mistake that oughtn’t to be made. Why was the password being logged at all, even if it was only supposed to be for crash analysis? Someone knowingly wrote the code to put it there, even though they knew they ought not to have. We need to aim higher than that.
Thirdly, we really have to talk companies out of using confusing doublespeak when they write about their mistakes. (I didn’t mention it above, but calling this a “theoretical vulnerability” is plain wrong: it is, as you admit, plainly and unambiguously, a mistake – a genuine vulnerability, that is undeniably exploitable.)
Fourthly, this sort of thing *is* national security stuff, not least because the sort of organisation that is into surveillance just loves this kind of programming mistake, and so do the cybercrooks. And if you can’t trust the official “app for that,” what can you trust?
I guess my confusion at your comment is that if this really is a case of “they made a mistake and fixed it immediately,” why didn’t they just come out and say that?
It’s not “an abundance of caution” to fix an app that leaks passwords in this way, any more than it is an “abundance of caution” to go and put your car through its compulsory annual safety test after you get busted by the cops for letting the roadworthy expire.
The ‘melodrama’ was aimed at the response as a whole, not you directly (sorry that wasn’t clear) although calling this ‘National security stuff’ might be on the way to melodramatic!
On that last analogy, you have to have physical access to someone’s device to pull this off, i.e. they have to have already been compromised. And even then it only gives you access to someone’s password for a coffee app, not life-or-death stuff like the roadworthiness of a car.
A better analogy would be putting a lock on your bicycle when it’s in your garage in case someone breaks into your house. Which probably would be ‘an abundance of caution’
Counts as a bit more than just ‘a mistake’. A multi-national corporation with massive resources released an app that breaks one of the fundamental principles of data protection by storing passwords in plaintext. The end result of that could have been users losing their money.
Somebody in that company designed the app to behave that way, and somebody else signed off on that design decision. From a company this big, with the resources to do this sort of thing the right way, it’s inexcusable.
We need to make a fuss about this kind of thing. Companies need to know that if they cut corners and play roulette with our data and our money that they will suffer bad publicity and loss of revenue. If we just brush things like this aside then they have no incentive to do it right first time.
“Wood didn’t say so in his Full Disclosure posting, but one news report suggested he went public to push Starbucks to act after getting the run-around for a couple of months from Starbucks’ customer service.”
If that is true it is their own fault, but funny how they try to avoid blame. Paul does seem to be generous in his point of view.
Mr Wood has commented below, linking to his own blog where he discusses the timetable of his findings. It was more than one but less than two months.
You might argue he was a little hasty in his disclosure, but he didn’t directly tell you how to get the data, and he did say you have to get hold of the device, so it’s not quite like he published a Metasploit plugin that could mount an unpatched attack from afar.
As I said above, how much better IMO (and shorter!) if they’d just said, “Ooooer. That stuff shouldn’t be in the logfile, even though the crooks would have to steal your phone first to read it. We’ll take it out. Thank you so much for the heads up,” and then fixed it.
@Paul: Yup, I agree with that! Giving a business 1 month to respond before going to FD is reasonable. (In fact, I’m not sure this issue is important enough to have not just gone FD from the start, but I’m also closer to being pro-FD than pro-HushHush.)
There probably will be some changes in the customer service side of things as well, maybe a dedicated abuse/security point of contact. It doesn’t need to be staffed with security people, but at least some POC whose SOP is to move it up the chain to more technical folks.
If Starbucks had just responded early on and said, “Thanks, we’ll look into it. By the way, we classify this as a relatively low risk issue, but an issue nonetheless,” Mr. Wood may have still gone to FD with the findings after the fix (and should, really), but at least it wouldn’t have been a PR issue at all.
Enjoyed your post on this Paul. I’ve tried to clear some things up at my blog: http://securitycrush.blogspot.com/
I’m glad you covered the “theoretical vulnerability” wording. I mentioned it in an earlier comment as the wrong thing to say, but probably ought to have pointed it out in the article (though two “callouts” from the Starbucks posting were probably enough 🙂)
I don’t think this issue is a giant deal, but it is an important and useful example that everyone can learn from…I reckon there are plenty of mobile app coders right now saying, “There, but for the grace of God, go I.”
The thing is that if you don’t log passwords, your log collection code can’t upload them. If you don’t upload them, you don’t have them. If you don’t have them, people can’t point a finger at you for having them, and *you* win, too.
After all, it was Starbucks that chose the words, “Your security is incredibly important.” Not wrong.
Please tell me people aren’t allowing ‘apps’ to access their bank accounts. It’s just… stupid.
Sadly we live in an age where people do incredibly stupid things all in the name of slight convenience.
The “we need an app for everything” demand has companies rushing these apps through that usually end up being either highly suspicious or highly incompetent at basic security.
“Your security is incredibly important to us” might well be true, although it still doesn’t explain why it wasn’t important enough for Starbucks to implement enough oversight of their app coding so that storing PII in plain text never happened in the first place. But chalk that one up to “lessons learned” (hopefully) and move on.
The statement about “theoretical vulnerabilities” is another matter entirely. It’s the worst kind of PR spin-slime. There was nothing “theoretical” about the vulnerability. It was there. It was real. The fact that no one might have exploited it doesn’t make it any less real. That’s the difference between “vulnerability” and “attack”.
I think Starbucks knows that such a mistake is inexcusable by any reasonable standard. But they’ve only compounded their culpability by not having the cojones to fess up, and pouring on the “theoretical” spin-slime. Shame on them.