New pricing will enable enterprises to save up to 92% as access needs change
Dominant cloud services provider Amazon Web Services (AWS) is continuing to drive down storage costs, this time by making it significantly less expensive for enterprises to store and access the files they need less frequently.
The Lowdown: The company is reducing the price of its Amazon Elastic File System (EFS) Infrequent Access (IA) with Lifecycle Management, a move that officials said will help users save up to 92% on the cost of storing files as their access needs change.
The Details: The price drop means that users can store and access files natively in a file system at an effective cost of as little as 8 cents per gigabyte of data per month. EFS is a fully managed cloud-native file system for Linux-based workloads that can be used with AWS services and on-premises systems. The elastic storage capabilities in EFS mean that it automatically grows and shrinks as files are created and deleted and demand changes.
With the price reduction, EFS IA with Lifecycle Management lets enterprises save up to 92% on file storage costs compared with EFS Standard. When Lifecycle Management is enabled on a file system, files that have not been accessed within the period set by the customer's policy are automatically moved to the cost-optimized EFS IA storage class rather than remaining in the faster Standard tier. Those less-used files remain immediately available and within the same namespace, though with slightly higher latency.
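To illustrate how such a policy is attached, here is a hedged sketch using the AWS SDK for Python (boto3) and its `put_lifecycle_configuration` call; the file system ID is a placeholder, and the 30-day threshold is one of the transition periods EFS supports:

```python
def enable_lifecycle_management(file_system_id, policy):
    """Attach an EFS Lifecycle Management policy so that files untouched
    for the configured period move to the Infrequent Access (IA) class."""
    import boto3  # imported here so the sketch loads without the SDK installed
    efs = boto3.client("efs")
    efs.put_lifecycle_configuration(
        FileSystemId=file_system_id,
        LifecyclePolicies=policy,
    )

# Files not accessed for 30 days transition to EFS IA automatically.
ia_policy = [{"TransitionToIA": "AFTER_30_DAYS"}]

# enable_lifecycle_management("fs-12345678", ia_policy)  # requires AWS credentials
```

The files stay at the same paths in the file system; only the underlying storage class (and its per-gigabyte price) changes.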
When the EFS IA storage class was introduced in February, the cost was 4.5 cents/GB per month. It has since dropped to 2.5 cents/GB per month. AWS' EFS IA with Lifecycle Management is available now.
The Impact: Storage costs are a key concern as enterprises move more of their business to the public cloud, which is why so many vendors and cloud providers are looking for ways to help reduce those costs. That includes the cost of data that's not accessed often and just sits in storage. According to IDC analysts and AWS' own analysis of usage patterns, only about 20% of stored data is in active use. The remaining 80% is infrequently accessed.
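The quoted figures can be reproduced with simple arithmetic. This sketch assumes an EFS Standard price of 30 cents/GB-month (a typical us-east-1 figure not stated in this piece) alongside the 2.5 cents/GB-month IA price:

```python
STANDARD = 0.30   # EFS Standard, $/GB-month (assumed us-east-1 price)
IA = 0.025        # EFS Infrequent Access after the price drop, $/GB-month

# Per-file saving for data moved to IA: the roughly 92% AWS cites.
savings = 1 - IA / STANDARD
print(f"IA savings vs Standard: {savings:.0%}")

# Blended cost for the 80/20 access pattern cited above:
# 20% of data on Standard, 80% on IA.
blended = 0.20 * STANDARD + 0.80 * IA
print(f"Effective blended cost: ${blended:.2f}/GB-month")
```

The blended figure is where the "as little as 8 cents per gigabyte per month" effective price comes from: a fleet that keeps 20% of its data hot and lets the other 80% age into IA pays about $0.08/GB-month overall.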
Background: In March, AWS rolled out S3 Glacier Deep Archive, a service for long-term data storage needs that enables organizations to use the cloud for archiving rather than legacy, on-premises tape storage devices. The service, which works with backup applications from the likes of Commvault and Veritas, offers cold storage for only $1 per terabyte of data per month, which officials said is significantly less expensive than tape and off-site storage options.
The Buzz: “As storage grows, the likelihood that a given application needs access to all of the files all of the time lessens, and access patterns can also change over time,” said Steve Roberts, senior technical evangelist at AWS. “Two common drivers for moving applications to the cloud are to maximize operational efficiency and to reduce the total cost of ownership, and this applies equally to storage costs. Instead of keeping all of the data on hand on the fastest-performing storage, it may make sense to move infrequently accessed data into a different storage class/tier, with an associated cost reduction. Identifying this data manually can be a burden, so it’s also ideal to have the system monitor access over time and perform the movement of data between storage tiers automatically, again without disruption to your running applications.”