PayPal, one of the first companies to offer a bug-reporting program, announced on Thursday that it’s sweetening the deal with bounties.
Michael Barrett, PayPal’s chief information security officer, said in a blog posting that he was initially leery of the concept, but the positive experiences of internet behemoth brethren that pay bounties – Facebook, Google, Mozilla and Samsung – have changed his mind.
His words:
"I originally had reservations about the idea of paying researchers for bug reports, but I am happy to admit that the data has shown me to be wrong - it's clearly an effective way to increase researchers' attention on Internet-based services and therefore find more potential issues."
How sweet is the deal going to be for white-hat security researchers? Nobody knows, since Barrett didn’t offer PayPal’s bug pricing scheme.
But PayPal will assuredly take note of what companies are offering, as well as the fact that they’re expanding their programs’ initial scopes. To wit:
- In May, Google boosted its maximum reward from $3,133.70 to $20,000 and added a $10,000 payment for SQL injection bugs or for what it deems to be “significant” authentication bypass or data leak vulnerabilities.
- In 2010, Mozilla increased its payout to $3,000 for each eligible security bug. It also broadened the program’s scope to include not only Firefox and Thunderbird but also Firefox Mobile and any Mozilla services that rely on those products.
- Facebook pays at least $500 for security hole reports. Last summer, Facebook chief security officer Joe Sullivan reported that the company had paid out over $40,000 within the first three weeks of the company’s decision to pay bounties.
Using the same secure reporting process with PGP encryption that it previously used for unpaid reports, PayPal plans to categorize reported bugs into one of four categories:
- XSS (cross-site scripting)
- CSRF (cross-site request forgery)
- SQL injection
- Authentication bypass
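To make the first and third categories concrete, here's a minimal sketch of a SQL injection flaw and its standard fix. The table, names and data are purely hypothetical, not anything from PayPal's systems:

```python
import sqlite3

# In-memory demo database (hypothetical schema, purely illustrative).
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE users (name TEXT, secret TEXT)")
db.execute("INSERT INTO users VALUES ('alice', 's3cret')")

def lookup_unsafe(name):
    # Vulnerable: attacker-controlled input is spliced into the SQL string.
    query = "SELECT secret FROM users WHERE name = '%s'" % name
    return db.execute(query).fetchall()

def lookup_safe(name):
    # Fixed: parameterized query, so the driver treats the input as data.
    return db.execute("SELECT secret FROM users WHERE name = ?", (name,)).fetchall()

# A classic injection payload makes the WHERE clause always true,
# dumping every row instead of none:
payload = "' OR '1'='1"
print(lookup_unsafe(payload))  # leaks all secrets
print(lookup_safe(payload))    # no user has that literal name, so nothing leaks
```

Bugs in the other two categories (XSS, CSRF) follow the same pattern of attacker input crossing a trust boundary, just in the browser rather than the database.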
And how is PayPal’s attitude about all this?
When Facebook launched its bug bounty program last summer, Sophos’s Paul Ducklin got a bit prickly – justifiably – over the wording in the company’s Responsible Disclosure Policy.
Of course, as Ducklin pointed out at the time, you can’t expect a company to pay for bugs if a researcher doesn’t give it time to fix the bug before it’s publicly posted.
But yikes, Facebook got heavy-handed with somewhat threatening language:
If you give us a reasonable time to respond to your report before making any information public and make a good faith effort to avoid privacy violations, destruction of data and interruption or degradation of our service during your research, we will not bring any lawsuit against you or ask law enforcement to investigate you.
Does PayPal’s bounty program use such a heavy stick? Yes, but the tone’s a bit more muted:
To encourage responsible disclosure, we commit that - if we conclude that a disclosure respects and meets all the guidelines [outlined in the policy] - we will not bring a private action or refer a matter for public inquiry.
In other words, PayPal says, in slightly less in-your-face language than Facebook, we won’t set our lawyers on you or ask the police to investigate you if you refrain from, say, DDoSing us or poking your nose into other users’ data without getting their OK first.
Well, fine. There are obviously quite a few lawyers hammering out wording for this kind of program.
And regardless of the legalese, it’s a positive thing when companies reward responsible researchers as opposed to punishing them.
Take, for example, security researcher Patrick Webster. Last fall, he alerted his investment fund company of a glaring security lapse.
They thanked him with a legal threat and notice that he just might be billed for the security fix.
Instead of displaying such a lack of gratitude, PayPal joined its peers (and accomplished a first in the world of financial services companies, its CISO believes) by acknowledging security researchers for their hard work and stepping up to recompense them for it.
Kudos. Let’s hope the trend continues to grow.
Fist of dollars image courtesy of Shutterstock.
If only these companies used the money to bring in security trainers, promote security thinking, and ease the first-to-market pressure on their developers, we wouldn't have anywhere near as many bugs to find…
Which would you rather have – being told the wheel has fallen off and you're heading for the ditch, or being warned during the design phase that the wheel is likely to fail, giving you the chance to correct it?
Without a written agreement between you (or your company) and one of these companies, signed by an executive officer with the power to enter into it and duly notarized, I wouldn't give anything to any of them. Otherwise they can claim they were already working on it, that they found it themselves, or any number of other excuses to renege on paying you for your work. BEWARE: you can be ripped off by these companies.
I just stumbled across Bruce Schneier's writings about the vulnerability market and am rethinking the thing entirely. Interesting read: http://www.schneier.com/crypto-gram-1206.html.
The market is larger than we realize, he says: "It's not just software companies, who sometimes pay bounties to researchers who alert them of security vulnerabilities so they can fix them. And it's not only criminal organizations, who pay for vulnerabilities they can exploit. Now there are governments, and companies who sell to governments, who buy vulnerabilities with the intent of keeping them secret so they can exploit them."
This new market, he said, "perturbs the economics of finding security vulnerabilities. And it does so to the detriment of us all."
Perhaps public disclosure is best, as he argues, since it forces companies to fix them rather than sweeping them under the rug. While I'm glad PayPal is paying rather than suing, I'm no longer sure bounty programs are in our best interest, considering all this.
For a different point of view you should listen to Brad Arkin, director of product security and privacy at Adobe, who had some interesting things to say about Bruce Schneier's article.
You can hear Brad talk about it in the Risky Business podcast: http://risky.biz/RB243
Thanks, Graham. Yet more excellent perspective.