Written by Mark Molyneux, CTO of EMEA at Cohesity
Every company will need more storage as its data grows by an average of 40 to 50 percent each year. At the same time, storage prices are rising, whether on-premises or in the cloud.
Several factors are making data more expensive to store. The war in Ukraine has driven up energy prices and, among other factors, pushed inflation in industrialized countries above nine percent in 2022, its highest level since the 1980s. Global technology market analysts such as Canalys expect public cloud providers, especially in Europe, to raise their prices by at least 30 percent to account for rising energy costs.
At the same time, data volumes continue to grow rapidly, as a report by ESG shows. For every TB of production data, companies need an additional four TB of storage for secondary data, which they store for privacy and other non-production reasons. Other factors will ensure this growth does not slow. IBM storage evangelist Shawn Brume, for example, predicts that autonomous driving by more than 48 million vehicles on US roads will generate 23 exabytes of data for deep storage. These emerging services will, by default, generate massive amounts of data that organizations will need to store for 20 to 30 years before it becomes valuable.
As a result, organizations face storing ever more data while prices for new storage resources climb, as the US government's December 2022 Producer Price Index shows: the price of computer storage rose 1.1 percent in December after rising 3.9 percent in November. Companies therefore appear to be spending more money on storage simply to make room for data growth.
If, as Gartner expects, IT budgets increase by an average of 2.4 percent this year, the bottom line is that CIOs urgently need to find ways to reduce costs, because inflation alone will more than offset that budget growth. At the same time, it is important not to jeopardize forward-looking steps towards more digitization and agile IT despite the economic pressure, because these new services open up additional revenue streams and help organizations engage customers in a modern way.
Renaissance of data reduction
The data explosion is well documented, and it shows how important every form of smart data reduction technology has become. On a data management platform with a hyperscale architecture, all data is automatically compressed, while deduplication algorithms look for redundant data structures that they can replace with small placeholders.
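To illustrate the underlying idea, here is a minimal sketch of hash-based deduplication. It uses fixed-size chunks for simplicity; production platforms typically use variable-length chunking and a distributed chunk store, so this is an illustration of the principle, not any vendor's implementation:

```python
import hashlib

def deduplicate(data: bytes, chunk_size: int = 4096):
    """Split data into fixed-size chunks; store each unique chunk once
    and record an ordered list of hash references to rebuild the data."""
    store = {}   # hash -> chunk bytes, stored only once
    recipe = []  # ordered hashes describing the original byte stream
    for i in range(0, len(data), chunk_size):
        chunk = data[i:i + chunk_size]
        digest = hashlib.sha256(chunk).hexdigest()
        store.setdefault(digest, chunk)  # duplicates become placeholders
        recipe.append(digest)
    return store, recipe

def restore(store, recipe) -> bytes:
    """Rebuild the original data from the chunk store and recipe."""
    return b"".join(store[h] for h in recipe)

# Highly redundant input: 100 copies of the same 4 KiB block
data = (b"A" * 4096) * 100
store, recipe = deduplicate(data)
print(len(data), sum(len(c) for c in store.values()))  # 409600 vs 4096
assert restore(store, recipe) == data
```

The redundant input shrinks to a single stored chunk plus lightweight references, which is why reduction rates climb with how repetitive the secondary data is.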
The big question is how much these technologies actually achieve in practice, outside of optimized laboratory conditions. Cohesity commissioned ESG to evaluate actual reduction rates across more than 3,000 Cohesity customers in the field.
The December 2022 Cohesity Data Cloud Data Management Platform report shows:
- 89% of the 3,000 customers achieved a data reduction of 96x or greater, with many achieving significantly higher reduction rates.
- 39% of the companies surveyed reported that their data volume grows by 20% or less each year, 31% said between 21% and 50% annually, and 28% said it grows by more than 50% each year.
- More than three-quarters (77%) of the companies surveyed operate at least three data centers today, and almost two-thirds (63%) plan to run at least six data centers within five years to accommodate this volume of data and power new services.
Data reduction can make storing data dramatically more cost-effective, because these mechanisms automatically shrink secondary data in the background as soon as it is generated, without anyone having to initiate the process. This pays off immediately once you look closely at the costs.
Although hardware is generally becoming cheaper because disks deliver more capacity per euro, operating costs drive the total price up. According to an analysis by Nasuni, storing one TB of file data costs $3,351 a year. Data reduction stretches existing storage resources, so investments in new capacity can be postponed.
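A back-of-the-envelope calculation makes the effect concrete. The sketch below combines Nasuni's $3,351 per TB per year figure with the 96x reduction rate from the ESG study; the 500 TB workload is a hypothetical example, not a figure from either report:

```python
def annual_storage_cost(logical_tb: float, reduction_ratio: float,
                        cost_per_tb_year: float = 3351.0) -> float:
    """Yearly cost of the physical capacity left after data reduction.
    Default cost is Nasuni's cited $3,351 per TB of file data per year."""
    physical_tb = logical_tb / reduction_ratio
    return physical_tb * cost_per_tb_year

# Hypothetical 500 TB of secondary data, with and without 96x reduction
baseline = annual_storage_cost(500, reduction_ratio=1)   # $1,675,500
reduced = annual_storage_cost(500, reduction_ratio=96)   # ~$17,453
print(f"savings: ${baseline - reduced:,.0f} per year")
```

Even if real-world rates land well below 96x, the arithmetic shows why reduction ratios translate almost directly into deferred storage spend.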
Synergies for enhanced cyber resiliency
Organizations should consolidate their disparate application data silos onto a single, centralized data management platform, such as Cohesity's, built on a scalable hyperconverged file system. The stored data is then automatically processed by the deduplication and compression functions to achieve the highest reduction rates across the organization.
To protect stored data, the platform takes the Zero Trust model further by enforcing strict access rules and multi-factor authentication. It also encrypts data automatically, both in transit and at rest, to strengthen defenses against cyber threats such as ransomware. And it generates immutable backup snapshots that cannot be altered by any external application or unauthorized user.
AI-driven algorithms analyze these backup snapshots for signs of possible anomalies. Findings can be passed on to security automation tools from vendors such as Cisco or Palo Alto Networks to examine the potential incident in more detail.
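One simple signal such algorithms watch is the daily change rate between snapshots. The toy detector below flags a day whose change rate far exceeds the recent baseline; it is a deliberately crude stand-in for the much richer signals (entropy shifts, file-type churn, mass deletions) a real platform would feed into its models:

```python
from statistics import mean

def flag_anomalies(daily_change_gb, window=5, factor=3.0):
    """Flag each day whose backup change rate exceeds `factor` times
    the mean of the preceding `window` days. A sudden spike is a
    classic early indicator of ransomware encrypting data in bulk."""
    flagged = []
    for i in range(window, len(daily_change_gb)):
        baseline = mean(daily_change_gb[i - window:i])
        if baseline > 0 and daily_change_gb[i] > factor * baseline:
            flagged.append(i)
    return flagged

# A week of daily snapshot change rates in GB; the last day spikes,
# as it might when ransomware rewrites large parts of a file system.
changes = [12, 14, 11, 13, 12, 15, 240]
print(flag_anomalies(changes))  # [6] -- the spike on day 7
```

Because backups see every workload's data, even a baseline check like this can surface an attack that individual endpoints miss, which is why handing the finding to a SOAR tool for triage is the natural next step.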
Finally, a modern data management platform such as Cohesity's also delivers deeper insights through integrated classification. Organizations can better understand their compliance risks by gaining visibility into their dark data, which according to Gartner accounts for between 55% and 80% of the data a company stores. They can then decide with confidence whether to keep certain records or delete them without risk.
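At its simplest, classification means scanning stored content for patterns that indicate sensitive records. The sketch below uses two illustrative regex rules; the pattern names and rules are examples for this article only, and production classifiers combine far broader rule sets with ML-based detection:

```python
import re

# Illustrative patterns only, not a production rule set
PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}

def classify(text: str) -> dict:
    """Return the count of sensitive-data matches per category."""
    return {name: len(rx.findall(text)) for name, rx in PATTERNS.items()}

sample = "Contact jane.doe@example.com, card 4111 1111 1111 1111."
print(classify(sample))  # {'email': 1, 'credit_card': 1}
```

Run across dark data, even counts like these tell an organization where regulated records hide, and therefore which data sets are safe to delete.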
All these synergies of a modern data platform enhance cyber resilience, reduce operating and storage costs, and help organizations manage their growing data volumes over the long term.