175 zettabytes.
That is the amount of data the International Data Corporation (IDC) predicts the world's datasphere will hold by 2025. With most of that data set to live in the cloud, storage infrastructure around the world is primed for a shift far larger than anything we saw in the last decade. But 175 ZB?! That's a whopping 61% compound annual growth rate (CAGR)!
Let me help you put that number in perspective, in case you are not as awed as I am.
1 zettabyte is 1,000,000,000,000 gigabytes. That's 12 zeros. Now multiply that number by 175 and you get 175 trillion gigabytes.
If data were a company, it would be the #1 stock on every investor's list for eternity.
“If one were able to store 175 ZB onto Blu-ray discs, then you’d have a stack of discs that can get you to the moon 23 times,” explains David Reinsel, senior vice president at IDC. “Even if you could download 175 ZB on today’s largest hard drive, it would take 12.5 billion drives. And as an industry, we ship a fraction of that today.”
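If you want to sanity-check those numbers yourself, here's a minimal sketch in Python. One assumption to flag: "today's largest hard drive" is taken to be 14 TB, since that is roughly what the 12.5-billion-drive figure implies for 175 ZB.

```python
# Quick sanity check on the 175 ZB figures (decimal units: 1 ZB = 10**21 bytes).
# Assumption: "today's largest hard drive" is ~14 TB, which is roughly what the
# 12.5-billion-drive figure implies.

ZB = 10**21   # bytes in a zettabyte
GB = 10**9    # bytes in a gigabyte
TB = 10**12   # bytes in a terabyte

datasphere_bytes = 175 * ZB

gigabytes = datasphere_bytes / GB           # total datasphere expressed in GB
drives_needed = datasphere_bytes / (14 * TB)  # number of 14 TB drives required

print(f"{gigabytes:.3e} GB")       # 1.750e+14 -> 175 trillion gigabytes
print(f"{drives_needed:.2e} drives")  # 1.25e+10 -> 12.5 billion drives
```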
The pandemic has changed things for everyone. These changes are most likely permanent, transforming the way people shop, game, access services, and interact with businesses. According to a Marketplace Pulse study, eCommerce is up more than 44 percent year over year, compared to an annual growth rate of between 11 and 17 percent from 2011 to 2019.
👆 Read that as more data, more frequent surges, higher IOPS requirements, and an incessant demand for more secure and seamless storage services. And more of everything it takes to keep that infrastructure running.
The datasphere has three locations: the core (traditional and cloud data centers), the edge (enterprise-hardened infrastructure such as cell towers and branch offices), and the endpoint (PCs, smartphones, and IoT devices).
IDC predicts that by 2025, the amount of data stored in the core (data centers and the cloud) will be more than double the amount stored on endpoint devices, putting roughly 49% of the world's stored data in public cloud environments.
Let's hold that thought for a second: what this directly translates to is that cloud storage infrastructure needs to grow at the same pace as the rising demand for seamless, high-performing user-facing applications.
That makes bolstering the public cloud the single most critical task ahead for giants like AWS, Azure, and GCP. But until the day they manage to scale their entire existing infrastructure, there are solutions like Lucidity that can help you drastically improve the performance of your existing storage through smart allocation and dynamic expansion.
Want to know more about how data is going to transform infrastructure demands on the public cloud? Subscribe to the Lucidity newsletter and we'll send you timely updates on the biggest transformations in the world of data, storage, and DevOps.