Resiliency Now! Addressing Past Data Mistakes and Planning the Future Today

Solutions Review’s Expert Insights Series is a collection of contributed articles written by industry experts in enterprise software categories. In this feature, Sonya Duffin of Veritas makes the case for why enterprises need to address past data mistakes, plan for the future, and embrace resiliency now.

Whether strategically planned or duct-taped together out of necessity, the number of organizations managing a multi-cloud strategy has skyrocketed in recent years. Eighty-nine percent of global cloud decision-makers report they have a multi-cloud approach. This could mean a siloed business unit deploying to AWS and a rogue team using Google collaboration tools while the rest of the company’s secondary data storage sits on Azure, all while the sales team stores customer information in Salesforce. The evolution of continuous data sprawl has left IT leaders navigating the unanticipated complications that come with multi-cloud complexity, and organizations must weigh cloud storage scalability against cost-cutting pressures. In today’s economic downturn, it is tempting to cobble together a solution that works fine for today while ignoring the long-term business needs that require a more significant investment. For example, the reliance on data collaboration tools has increased data storage costs significantly, but they are a necessity for remote work.

Reducing the number of cloud collaboration tools might be a quick fix to cut spending, but the cost of infrastructure complexity isn’t just monetary. Organizations must build business resiliency into their cloud strategies and consider the larger business risks of negative environmental impact, cyber threats like ransomware attacks, and compliance issues, or they open themselves up to serious repercussions later, which can be even more costly.

The Fight for Sustainable Data Reduction Is an Uphill Battle

When organizations expand their infrastructure to run a new cloud workload, they often find it difficult to keep track of what data they are storing, especially if they have automatic snapshot policies that result in the duplication of data, some of which is obsolete or trivial. This forces IT teams to continue expanding compute and storage resources that they might not actually need. More storage resources allocated to an organization’s workloads means more energy is needed for cloud data centers, which can have a negative impact on the environment. Scope 3 emissions, those that an organization is indirectly responsible for, are under increased scrutiny as stakeholders take a more comprehensive look at an organization’s sustainability efforts.
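To make the snapshot duplication point concrete, here is a minimal back-of-the-envelope sketch in Python. The dataset size, retention window, and daily change rate are illustrative assumptions, not figures from Veritas or the EPA:

```python
# Illustrative only: how a retention policy turns one dataset into many stored copies.
PRIMARY_TB = 100          # size of the primary dataset, in terabytes (assumed)
RETENTION_DAYS = 30       # daily snapshots kept for 30 days (assumed policy)
DAILY_CHANGE_RATE = 0.02  # fraction of data that changes each day (assumed)

# Naive policy: every daily snapshot is a full copy of the dataset.
full_copies_tb = PRIMARY_TB * RETENTION_DAYS

# Optimized policy: one baseline copy plus daily incremental changes.
incremental_tb = PRIMARY_TB + PRIMARY_TB * DAILY_CHANGE_RATE * RETENTION_DAYS

print(f"Full-copy snapshots:   {full_copies_tb:,.0f} TB stored")  # 3,000 TB
print(f"Incremental snapshots: {incremental_tb:,.0f} TB stored")  # 160 TB
```

Even under these modest assumptions, full-copy snapshots inflate stored data by roughly the length of the retention window, which is exactly the kind of unnoticed duplication that drives up both cost and energy use.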

The U.S. Environmental Protection Agency (EPA) suggests that storing just 1 petabyte (1 million gigabytes) of unoptimized backup data in the cloud for one year could create as much as 3.5 metric tons of CO2 waste. It is predicted that 100 zettabytes (100 million petabytes) will be stored in the cloud by 2025. Using the EPA’s estimate, this would equal 350 million metric tons of CO2. To put this into perspective, a tree can only absorb about 48 pounds of CO2 per year, meaning it would take more than 16 billion trees to absorb the current cloud storage carbon footprint.
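Those figures are easy to sanity-check. The sketch below redoes the arithmetic, taking only the EPA’s 3.5-metric-tons-per-petabyte estimate and the 48-pounds-per-tree absorption figure from above; everything else is simple unit conversion:

```python
# Back-of-the-envelope check of the cloud storage CO2 figures above.
CO2_TONS_PER_PB = 3.5        # EPA estimate: metric tons of CO2 per petabyte-year
CLOUD_STORAGE_ZB = 100       # predicted cloud storage by 2025, in zettabytes
PB_PER_ZB = 1_000_000        # 1 zettabyte = 1 million petabytes
LBS_PER_METRIC_TON = 2204.62
TREE_CO2_LBS_PER_YEAR = 48   # CO2 absorbed by one tree per year, in pounds

total_pb = CLOUD_STORAGE_ZB * PB_PER_ZB
total_co2_tons = total_pb * CO2_TONS_PER_PB
trees_needed = total_co2_tons * LBS_PER_METRIC_TON / TREE_CO2_LBS_PER_YEAR

print(f"Total CO2: {total_co2_tons / 1e6:.0f} million metric tons")  # ~350
print(f"Trees needed: {trees_needed / 1e9:.1f} billion")             # ~16.1
```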

While soaring storage costs are the most obvious repercussion of cloud complexity, the environmental effects of an IT strategy cannot be ignored in light of increased sustainability regulations and stakeholder scrutiny. As a result, organizations must prioritize gaining full visibility into their data storage needs and optimizing their cloud storage management, particularly if they have Scope 3 emission reduction goals to reach.

Heightened Ransomware Threats Loom as IT Departments Scramble to Avoid Mistakes

As IT infrastructure becomes more intricate, data management becomes more challenging. When short-staffed IT teams are responsible for managing an increasing number of clouds, systems, and applications, even the smallest flaw creates a ransomware attack vector for bad actors. Ransomware attacks have increased threefold since 2020, and attackers continue to look for ways to make attacks more efficient and more profitable. IT leaders face an evolving threat as attackers’ techniques grow in sophistication and innovation, such as testing AI-powered tools like ChatGPT to write malware and create encryption tools. If organizations are not prepared to detect and recover from an attack quickly, the result can be hundreds of thousands of dollars in business disruption, recovery efforts, and lost productivity. Organizations also risk reputation damage and regulatory repercussions if ransomware is not addressed as part of a holistic multi-cloud strategy.

Additionally, the EU’s new Digital Operational Resilience Act, which applies to any financial services company globally that does business within the European Union, could result in additional repercussions if an organization doesn’t follow its resilience testing and threat intelligence sharing standards. Companies should start planning in anticipation of similar policies in the United States as regulators look to reduce cyber threats and ransomware risk in key industries.

Meeting Compliance Standards Gets More Complicated

With rumors of a federal-level data privacy bill and state legislatures enacting their own compliance laws, it is difficult for organizations to keep up with these standards if their data is not fully visible. With the California Privacy Rights Act (CPRA) now operative, there’s even more pressure on organizations to meet a heightened standard of compliance to ensure any personally identifiable information (PII) remains protected. When utilizing multiple clouds for data computing and storage, data can often get “lost” in the cloud, meaning IT professionals aren’t able to keep track of their important data. This ultimately makes it harder for organizations to comply with data privacy regulations. Considering that an average of 50 percent of a company’s data is dark, PII can be mishandled accidentally, especially when it’s lurking in image, audio, or video files. If they don’t account for the possibility of non-compliant data in the cloud, organizations risk repercussions like hefty fines and increased scrutiny from government agencies monitoring data privacy compliance.

Mitigating Risk and Reducing Cloud Costs for a Resilient Future

Even while managing a multi-cloud deployment, especially in a large multinational organization, IT departments can take control and gain better visibility into their data with the right strategy. AI and ML give organizations visibility into their data and the observability to take control of dark data, allowing leaders to make more informed decisions about data lifecycle management. This helps reduce the storage footprint, which lowers cloud spend and environmental impact and helps maintain data privacy compliance.

The upfront cost of multi-cloud data management is just part of the story. The real cost of cloud complexity is the additional risk introduced to a company’s business resiliency and regulatory planning. While the upfront investment may be higher, organizations can mitigate these risks and reduce costs in the long run with cloud strategies that account for sustainability, ransomware protection, and privacy compliance.

