This is part of Solutions Review’s Premium Content Series, a collection of contributed columns written by industry experts in maturing software categories. In this submission, Scality Chief Product Officer Paul Speciale offers several long-term public cloud storage obstacles to know.
Organizations were forced to change gears fast due to the events of the last 2.5 years. They had to accelerate their enterprise cloud strategies, which included adopting or expanding remote and hybrid work capabilities. The cloud proved its business value by enabling organizations to maintain operations. It’s no surprise, then, that Gartner found that almost 70 percent of cloud-using enterprises plan to increase cloud spending. And the analysts at Canalys recently found that worldwide spending on cloud infrastructure services grew 33 percent year over year to $62.3 billion in the second quarter of 2022.
Many enterprises rapidly created short-term cloud frameworks, some employing a near lift-and-shift approach. While this was a quick fix, it’s not the best way to approach long-term cloud transformation. And for many organizations, the public cloud may not be the best fit for every purpose. It’s a good option for some applications, and can even be cost-effective for short-term data storage or for bursty applications that don’t run 24×7. However, it doesn’t always provide enough control over the infrastructure, it can be inefficient, and it sometimes falls short on performance and security.
What’s needed is a solution that can combine the best of both worlds – one that’s scalable, secure, and cost-effective while providing access to your data via both legacy applications and your new cloud/cloud-native applications. This is where unbreakable cloud storage for data centers fits in. What do we mean by this? Read on.
The Obstacles of Long-Term Public Cloud Storage
Data is a key asset for today’s businesses, with more of it being created, and retained for the long term, than ever before. An unbreakable cloud data storage solution goes the full distance in protecting your data, with capabilities to guard against planned and unplanned events such as common component failures (in a large system, these will happen regularly simply based on statistical failure rates). It also gives you the ability to maintain service during system upgrades and while growing (scaling), and to ensure data integrity is never compromised.
Data Security
Don’t assume that cloud providers will simply take care of data security. Your organization still has to play an active role in ensuring the right mechanisms are used, and it can be confusing to determine where your responsibility actually lies when securing a public cloud deployment. What you gain in convenience from the public cloud can come at the cost of control over your own security policies and procedures. To this point, 57 percent of IT pros who responded to a recent Enterprise Strategy Group (ESG) study said they believe 20 to 50 percent of their sensitive data already stored in the public cloud probably isn’t properly secured.
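As a concrete illustration of that shared responsibility: locking down public access to a storage bucket is typically the customer’s job, not the provider’s. A minimal sketch, assuming an S3-style API (the bucket name is hypothetical, and the final API call is shown only as a comment):

```python
# Sketch: the settings a customer -- not the provider -- must apply to keep
# a bucket from being exposed publicly (S3-style "block public access").
def public_access_block_params(bucket: str) -> dict:
    """Assemble put_public_access_block arguments that deny all public access."""
    return {
        "Bucket": bucket,  # hypothetical bucket name supplied by the caller
        "PublicAccessBlockConfiguration": {
            "BlockPublicAcls": True,        # reject new public ACLs
            "IgnorePublicAcls": True,       # ignore any existing public ACLs
            "BlockPublicPolicy": True,      # reject public bucket policies
            "RestrictPublicBuckets": True,  # cut off existing public policies
        },
    }

params = public_access_block_params("finance-archive")
# With real credentials this would be applied via:
#   import boto3
#   boto3.client("s3").put_public_access_block(**params)
```

The point of the sketch is that none of these four switches are flipped for you by default in every deployment; auditing them remains the customer’s side of the shared-responsibility model.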
Data Residency and Data Sovereignty
Maybe you don’t want to be beholden to Amazon Web Services (AWS) or another single large public cloud vendor. You want control over your own data and the freedom to migrate it to another platform whenever you wish. That’s the goal of data residency: being able to store an organization’s data in a geographic location of its choice. Related to this is data sovereignty, in which the data is not only stored in a designated location but is also subject to the laws of the region where it’s physically stored. When you commit to a major public cloud, you typically give up much of your control over both.
There is increasing concern about this; a recent report from the CapGemini Research Institute found that 68 percent of organizations surveyed are concerned about a lack of transparency and control over what’s done with the data in the cloud.
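One small, concrete piece of data residency is refusing provider defaults at bucket-creation time. The sketch below assumes an S3-style API (bucket and region names are illustrative) and pins a new bucket to an explicitly chosen region:

```python
# Sketch: pin a bucket to a chosen region instead of a provider default.
# In the S3 API, every region except us-east-1 must be named explicitly
# via CreateBucketConfiguration when the bucket is created.
def residency_bucket_params(name: str, region: str) -> dict:
    """Assemble create_bucket arguments for an explicitly chosen region."""
    params = {"Bucket": name}
    if region != "us-east-1":  # us-east-1 is the implicit default location
        params["CreateBucketConfiguration"] = {"LocationConstraint": region}
    return params

# e.g. keep EU customer data on EU soil (names are hypothetical):
params = residency_bucket_params("eu-customer-data", "eu-central-1")
# With real credentials: boto3.client("s3").create_bucket(**params)
```

Note that bucket placement is only the starting point: replication rules, backups, and provider support access can all still move data across borders, which is exactly the transparency gap the survey above highlights.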
Lockdown and Lock-in
When it comes to the public cloud, a significant challenge is cloud vendor lock-in, much like the hardware/software lock-in of earlier eras. Many organizations favor hybrid or multi-cloud approaches, but vendor lock-in does not always make this possible: proprietary APIs and steep data egress fees can make moving large datasets to another platform slow and expensive.
Meeting the Challenges with a New Approach
Ransomware and other cybersecurity incidents are proliferating at breakneck speed. The reality is that today, ransomware is a when, not an if: the chance of it hitting your organization is absolutely real and needs to be anticipated. You need a solution that can protect your data even in the event of an attack. Object storage systems provide data immutability to ensure that your stored data can’t be deleted or overwritten by a ransomware attack.
In other words, even if bad actors get in, your valuable information can remain safe and you can more easily recover from the incident. And while some clouds offer data immutability, not all of them do, and none of them do so the same way. Unbreakable cloud storage in your data center gives you additional freedom by decoupling you from cloud-specific approaches and from big-vendor lock-in.
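One common form of this immutability in S3-compatible object stores is object-lock retention: a write-once, read-many (WORM) flag set at write time. A minimal sketch, assuming an Object-Lock-enabled bucket (the bucket and key names are hypothetical, and the final API call is shown only as a comment):

```python
# Sketch: parameters for a WORM write on an S3-compatible store. Once stored
# with a COMPLIANCE retention date, the object version cannot be overwritten
# or deleted until that date passes -- even by the account's root user --
# which is what keeps ransomware from encrypting or purging backups in place.
from datetime import datetime, timedelta, timezone

def worm_put_params(bucket: str, key: str, body: bytes, days: int) -> dict:
    """Assemble put_object arguments for an immutable (WORM) write."""
    retain_until = datetime.now(timezone.utc) + timedelta(days=days)
    return {
        "Bucket": bucket,
        "Key": key,
        "Body": body,
        "ObjectLockMode": "COMPLIANCE",  # retention cannot be shortened or removed
        "ObjectLockRetainUntilDate": retain_until,
    }

params = worm_put_params("backups", "db/2022-10-06.dump", b"...", days=90)
# With real credentials: boto3.client("s3").put_object(**params)
```

The design choice to hedge on here is the lock mode: COMPLIANCE is absolute for the retention window, while the alternative GOVERNANCE mode lets specially privileged users lift the lock early, a weaker guarantee against an attacker who steals admin credentials.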
A Long-Term Solution for Long-Term Storage
Now is the right time to look at long-term solutions for your business. The short-term cloud strategies you may have deployed early in the pandemic-induced switch to remote work must be adapted for long-term requirements. There are more apps, which means more data and the need for more storage. The right storage solutions for today’s landscape are designed to be hardware-agnostic, so they can span and absorb the never-ending stream of new technologies and avoid obsolescence. That means you can trust them to work for the long haul, rather than becoming yet another piece of outdated technology that fails to meet your ongoing needs and has to be replaced.
- 4 Long-Term Public Cloud Storage Obstacles to Know - October 6, 2022