Approximately one in six Amazon S3 storage buckets (of the 12,328 identified) are leaking sensitive data and company secrets, claims a new report.
Amazon Simple Storage Service (S3) is a web services interface for storing and retrieving static data from Amazon’s cloud that gives developers a way to store and access, for example, server backups, company documents, web logs, and publicly visible content, including images and PDFs.
Such content is organized into “buckets”, accessible at predictable URLs.
Here’s the type of bucket information to which any interested party (or e-scumbag prone to network attack or black market vending) has free and open access when users have set the buckets to public, according to Rapid7 Senior Security Consultant Will Vandevanter:
- Personal photos from a medium-sized social media service
- Sales records and account information for a large car dealership
- Affiliate tracking data, click-through rates, and account information for an ad company’s clients
- Employee personal information and member lists across various spreadsheets
- Unprotected database backups containing site data and encrypted passwords
- Video game source code and development tools for a mobile gaming firm
- PHP source code including configuration files, which contain usernames and passwords
- Sales “battlecards” for a large software vendor
Those are just some of the materials that Vandevanter, assisted by HD Moore and inspired by Robin Wood, gathered from 1,951 buckets they found open to public scrutiny.
The sheer number of files made it unrealistic to test the permissions of every single object, so the team took a random sampling instead. “All told, we reviewed over 40,000 publicly visible files, many of which contained sensitive data,” Vandevanter writes.
That’s roughly one in six buckets “left open for the perusal of anyone that’s interested,” he says.
Defining the security risks, Vandevanter says, is a no-brainer:
A list of files and the files themselves – if available for download – can reveal sensitive information. The worst case scenario is that a bucket has been marked as ‘public’, exposes a list of sensitive files, and no access controls have been placed on those files.
In situations where the bucket is public, but the files are locked down, sensitive information can still be exposed through the file names themselves, such as the names of customers or how frequently a particular application is backed up.
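To illustrate the point about file names (this example is not from the report): a public bucket answers listing requests with plain XML, so even when the objects themselves are locked down, anyone can read their key names. A minimal sketch, using a hypothetical listing response:

```python
import xml.etree.ElementTree as ET

# Hypothetical ListBucket response from a public bucket; the element
# namespace is the one S3 uses for these documents.
SAMPLE = """<?xml version="1.0" encoding="UTF-8"?>
<ListBucketResult xmlns="http://s3.amazonaws.com/doc/2006-03-01/">
  <Name>example-bucket</Name>
  <Contents><Key>backups/db-2013-03-01.sql.gz</Key></Contents>
  <Contents><Key>customers/acme-corp-invoice.pdf</Key></Contents>
</ListBucketResult>"""

NS = {"s3": "http://s3.amazonaws.com/doc/2006-03-01/"}

def list_keys(xml_text):
    """Return every object key named in a bucket-listing response."""
    root = ET.fromstring(xml_text)
    return [el.text for el in root.findall("s3:Contents/s3:Key", NS)]

# Even if downloading these files is forbidden, the names alone hint at
# backup schedules and customer identities.
print(list_keys(SAMPLE))
```

Note how the key names alone leak a backup cadence and a customer name, exactly the exposure Vandevanter describes.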
This isn’t Amazon’s fault, mind you. Amazon S3 buckets’ default setting is private. Buckets set to “public” will list all files and directories to anybody who asks.
By default, buckets will have one of two predictable, publicly accessible URLs, Vandevanter says.
Using the two URL syntaxes he provides, users will either be denied access by a private bucket, or they’ll be peering into the first 1,000 objects stored in a public bucket.
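The two syntaxes are S3’s standard path-style and virtual-hosted-style endpoints. As an illustration (the bucket name here is hypothetical), they can be constructed like this:

```python
def bucket_urls(bucket):
    """Return the two predictable URL forms for an S3 bucket name."""
    return (
        "http://s3.amazonaws.com/%s" % bucket,      # path-style
        "http://%s.s3.amazonaws.com/" % bucket,     # virtual-hosted-style
    )

# A private bucket answers these URLs with an AccessDenied error;
# a public one returns XML listing up to its first 1,000 objects.
path_style, virtual_style = bucket_urls("example-bucket")
print(path_style)
print(virtual_style)
```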
As for remedies, this one is straightforward, Vandevanter writes: “Check your buckets. If they’re open, determine whether the content is something you don’t mind exposing.”
If your content should be kept private, check out Amazon’s tutorial on how to lock it down.
Unfortunately, making a bucket private now won’t scrub copies of its contents that have already been cached by Google’s index.
Vandevanter recommends the Internet Archive’s Wayback Machine for finding previously open buckets. He also used a modified version of @mubix’s Metasploit module to find “a few hundred” currently private buckets that were previously open.
Here are Amazon’s instructions on how to manage access control lists for objects in buckets.
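What makes a bucket or object public boils down to its access control list: a grant to S3’s AllUsers group means anyone on the internet can read. A sketch of spotting such a grant in an ACL document (the sample is hypothetical, with the XML namespace omitted for brevity):

```python
import xml.etree.ElementTree as ET

# The group URI that S3 uses to mean "everyone".
ALL_USERS = "http://acs.amazonaws.com/groups/global/AllUsers"

# Hypothetical, simplified ACL response for a world-readable bucket.
SAMPLE_ACL = """<?xml version="1.0"?>
<AccessControlPolicy>
  <AccessControlList>
    <Grant>
      <Grantee><URI>http://acs.amazonaws.com/groups/global/AllUsers</URI></Grantee>
      <Permission>READ</Permission>
    </Grant>
  </AccessControlList>
</AccessControlPolicy>"""

def public_permissions(acl_xml):
    """Return the permissions this ACL grants to the AllUsers group."""
    root = ET.fromstring(acl_xml)
    perms = []
    for grant in root.iter("Grant"):
        if grant.findtext("Grantee/URI", default="") == ALL_USERS:
            perms.append(grant.findtext("Permission"))
    return perms

# A non-empty result means the bucket is open to the world.
print(public_permissions(SAMPLE_ACL))
```

Removing that grant (or never adding it) is what “locking down” a bucket amounts to.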
Again, Amazon clearly isn’t at fault, as Vandevanter points out: buckets are private by default, and the company has published plenty of resources to instruct users on how to keep data safe.
A spirited discussion on Slashdot questions a) whether Vandevanter is vulnerable to prosecution over the classic security technique of testing doorknobs to see which are open and b) whether this is news at all, rather than a case of RTFM (Caution: Link may be NSFW).