Digital Shadows recently published a “Too Much Information” report, which found that 1.5 billion files were, essentially, freely available to the public via file-sharing services. The report looked at exposed data across Amazon S3 buckets, rsync, SMB, FTP servers, misconfigured websites, and Network Attached Storage (NAS) devices.
All sorts of data are, of course, stored in the cloud. Perhaps the most sensitive categories of data at risk analyzed in the report were medical files, confidential employee payroll and tax return files, and exposures of sensitive intellectual property.
The cloud service providers aren’t to blame – alas, we can only blame ourselves, as users of these services, for not properly securing the data we store within them. The particular concern is that we tend to assume cloud services come with inherent, built-in security, and the trap we fall into is taking that assumed security for granted. As a consequence, we may wind up inadvertently leaving loads of data freely open and publicly available.
Computer Business Review wrote a great piece on this report, which is worth a read. Amazon S3 buckets remain an ongoing source of exposure due to user carelessness, the report notes:
“Amazon Simple Storage Service, or S3, is storage that is designed to make web-scale computing easier for developers. Buckets are the fundamental container in Amazon S3 for data storage – with each object typically stored and retrieved using a unique developer-assigned key. Privacy, in short, is set by default and careless use to blame.”
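To make the mechanics in that quote concrete, a bucket’s public exposure can be inspected and locked down from the AWS CLI. The sketch below is illustrative rather than prescriptive: `my-bucket` is a placeholder name, and running these commands assumes you have the AWS CLI configured with credentials that carry the relevant S3 permissions.

```shell
# Inspect the bucket's ACL: look for any grants to the
# "AllUsers" (public) or "AuthenticatedUsers" groups.
aws s3api get-bucket-acl --bucket my-bucket

# Enable S3 Block Public Access on the bucket, which overrides
# public ACLs and bucket policies, current and future.
aws s3api put-public-access-block \
  --bucket my-bucket \
  --public-access-block-configuration \
  BlockPublicAcls=true,IgnorePublicAcls=true,BlockPublicPolicy=true,RestrictPublicBuckets=true
```

Because new buckets are private by default, the first command mainly catches buckets whose settings were loosened later; the second ensures a careless ACL or policy change cannot quietly re-expose the data.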
In cases such as this, a little user education can go a long way to ensuring better security. Here are a few basics to ensure that the data you’re storing in the cloud is, in fact, secured against attackers whose goal it is to obtain access and use that data to exploit you or your organization: