Solutions Review’s Expert Insights Series is a collection of contributed articles written by industry experts in enterprise software categories. In this feature, Fivetran Senior Director of Product Marketing Alexa Maturana-Lowe offers a comprehensive resource on maximizing IT infrastructure ROI while minimizing costs.
Despite growth in IT budgets over the past few years, the continued economic slowdown in 2023 is posing challenges with inflation and layoffs at tech companies that grew too fast during the pandemic. When times are tight, we as leaders may have the reflex to reduce spending across the board, trimming a percentage across the top and treating more things like a cost center. But if Chief Data Officers can reorient their perspective, and lean into analytics, they can help their organizations outperform competitors and emerge from this turmoil in a strong place to grow.
The challenge is that when market conditions are changing this quickly, the best way to see what’s happening is to look closely at the data that reveals today’s market forces and evolving consumer preferences. Gartner says IT spending will normalize in 2023 at 6% growth, following a 10% jump in 2021 and only 3% growth last year. To reduce costs and improve agility amid this uncertainty, companies have been accelerating the transition from on-prem ownership of IT assets to cloud-based SaaS subscriptions. Now, some leaders are tempted to cut budgets or deprioritize infrastructure entirely by postponing these initiatives. Where will that leave businesses as they struggle to prepare for new growth while also keeping the lights on today?
IT Infrastructure ROI
Focusing on data-supported decisions is a better approach in this market than relying on experience alone, especially when the conditions that formed that experience have changed. With all the shifts during the pandemic, you can’t count on 20 years of instincts to steer you right. Customer behavior is changing. You have to find the data that matters for today and tomorrow, and that means more analysis of what exactly has changed, by identifying the patterns that deviate from the past. When macroeconomic and societal signals are as confusing as they are today, data is a much more reliable guide than personal experience or legacy dashboards.
These economic unknowns are an opportunity to evaluate how first-party data is helping your business. How can you treat data as a guide to how your market has changed or where to find new markets? What patterns can inform your marketing? How are fundamental changes to your business affecting your product strategy?
If you’re in a place where you can’t rely on historical data because things have changed so much, start breaking down the components of your analysis to find the outliers that are throwing things off. You may be getting stale answers because your team is asking yesterday’s questions. Have your data team run analyses that can reveal unexpected patterns by looking at anomaly detection, deviations in trends and key driver analysis. These steps will not only help identify what’s changed, but also what matters.
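The anomaly detection described above doesn’t require heavy machinery to get started. As a minimal sketch (the metric, window size, and threshold here are illustrative assumptions, not from the article), a trailing z-score can flag periods that deviate sharply from recent history:

```python
# Minimal anomaly-detection sketch: flag periods where a metric
# deviates sharply from its trailing history (illustrative only).
from statistics import mean, stdev

def flag_anomalies(series, window=8, threshold=2.0):
    """Return indices where a value sits more than `threshold`
    standard deviations away from the trailing-window mean."""
    anomalies = []
    for i in range(window, len(series)):
        history = series[i - window:i]
        mu, sigma = mean(history), stdev(history)
        if sigma > 0 and abs(series[i] - mu) / sigma > threshold:
            anomalies.append(i)
    return anomalies

# Example: steady weekly orders, then a sudden drop in week 10.
weekly_orders = [100, 102, 98, 101, 99, 103, 100, 97, 102, 101, 60]
print(flag_anomalies(weekly_orders))  # the drop at index 10 is flagged
```

In practice a data team would reach for more robust methods (seasonal decomposition, key driver analysis), but even a simple deviation check like this surfaces where yesterday’s assumptions have broken.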
Reset Your Expectations for What Data Teams Deliver
There are probably certain reports you’ve been using for years, and your team may spend a lot of time updating dashboards. Now is the time to see how good your data team really is by asking them to look beyond their traditional reporting and find opportunities to change business processes. This often leads to cost savings (bonus!) as data professionals identify waste like duplicate data storage costs. You can usually save money on data warehousing if you move some data to cold storage. But first you need to understand how all the data you collect can best serve your business.
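The cold-storage savings mentioned above usually start with a simple inventory: which data is old enough that nobody queries it hot? A rough sketch (partition names, dates, and the 90-day cutoff are hypothetical assumptions) might look like:

```python
# Illustrative sketch: identify table partitions old enough to be
# moved to cheaper cold storage. The 90-day cutoff is an assumption;
# pick a threshold based on your actual query patterns.
from datetime import date, timedelta

def cold_storage_candidates(partitions, today, max_age_days=90):
    """Return names of partitions whose date is older than the cutoff."""
    cutoff = today - timedelta(days=max_age_days)
    return [name for name, part_date in partitions if part_date < cutoff]

partitions = [
    ("events_2023_01", date(2023, 1, 31)),
    ("events_2023_04", date(2023, 4, 30)),
]
print(cold_storage_candidates(partitions, today=date(2023, 5, 18)))
# ['events_2023_01']
```

Most warehouses and object stores can automate the actual tiering once you know which data qualifies; the analysis above is the part your data team should own.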
Focus On Your Broader Business Strategy
Rising material costs are a concern, but don’t just buy cheaper supplies; look at trends to see how market changes are affecting your profits. Take a bakery facing rising flour prices. It could simply buy cheaper flour, but that could hurt sales and reputation down the road. A deeper analysis might instead point to eliminating a low-volume line of macadamia nut cookies, for example, because macadamia prices doubled. The objective is not to make your core product worse with across-the-board cuts. The objective is to improve the most profitable parts of your business today while trimming unprofitable products, and this holds whether you’re selling services or manufactured goods.
Cut Back on Complicated Moonshots
Amazon has invested heavily in hardware devices over the past decade, with tremendous success for Kindle and Alexa. But it has also launched everything from indoor security drones to talking sunglasses to interactive projectors. Today, Amazon is cutting back on market-building and streamlining its core business, even trimming Alexa. When you do need to cut costs, it’s best to return to the requirements that drove those expensive investments and see whether they still hold. If you’re cutting a moonshot AI program, maybe you can also cut back on the streaming platform that feeds it, rationalize the architecture that multicasts your data, and so on.
Simplify Your Systems
One of the biggest challenges with gleaning insights from machine learning is that most data models are not intelligible: it’s not easy to link raw data to predicted actions. If one source of information feels off, you can’t check it without hunting through multiple tools. This level of complexity obscures what’s important. The best approach is to build data pipelines so all sources of data are captured in one place automatically. Then simplify your architecture so these insights are accessible to non-specialists at a glance. Streamline your processes so they are less finicky and brittle; then you can put those items on autopilot and focus your data team on guiding you through this environment.
Reduce Costs Thoughtfully
Years of chasing hot trends in data and machine learning, like real-time recommendations, robotic process automation and generative AI can rapidly complicate your IT systems, often with duplicate data pipelines feeding siloed applications. If you’re reducing costs from these long-range investments, take care unwinding them to ensure you’re capturing all the potential cost savings from stacked decisions. You don’t want to damage the core capabilities that give you the power to make data-supported decisions that are relevant for today’s shifting market.
When you’re facing a business environment where long-term trends have gone topsy-turvy, it’s only reasonable to focus on tactical action items. But your instincts from the past may not fit business behaviors and patterns today. To guard against this, dig deep into your data and the mechanics of your business. This will help you understand where and how you should cut budgets so you’ll be in a stronger position for whatever happens next.
In Part 2 of this article, I want to dig a little deeper into ways for IT teams to simplify data access, streamline systems, and save money without cutting off the insights that a strong data team can deliver.
But first, I want to follow up on my point about “moonshots.” The explosion of generative AI, from GPT-3 to DALL-E, is driving a seismic shift for many organizations as they jump at the opportunity to automate customer support, design tasks, or marketing copy. Many software companies are adding AI capabilities anywhere they can to stay competitive. For employers looking to trim headcount, it’s very tempting to look at how AI can support marketing or design teams. But today, this type of AI content is really a hype train, not a key business driver. Companies that try to jump on this train may find that these new technologies come with significant costs and few short-term gains. I think it’s safe to sit on the sidelines of generative AI for a while without suffering long-term consequences. Look at Apple’s approach to markets: it is not always first, but it usually waits until there’s an opportunity to use commodity parts and combine them in new ways to drive innovation and success. I think AI will significantly change online interactions, but we’re not there yet. Instead, look at how your business is operating today.
Uncovering Insight Requires Focus
One big roadblock to maximizing the value of a data team is often just legacy processes and reports. I mentioned dashboards before; this is a great place to refocus your metrics. Break traditional dashboards into their components so you can track what’s changing rapidly and which parts of the business are operating normally. One of the easiest wins is to start building automated data pipelines so your data team can focus on higher-value activities.
There’s often a gap between business decision-makers and the people actually working with the data, and to be honest, many of those people may not even understand how executives use the data. Here’s one place where a “scarcity” mindset can help you out. Find out what questions are top of mind for executives and what data is needed to answer them. Then cut back on any dashboards or reports that don’t serve those questions.
Data-Driven Means Data-Informed
Along with making sure your data team is looking more deeply at which specific components of the business are driving growth, you also need to make data accessible to more people so everyone can use it to drive better outcomes. Simplify the architecture and operations of your data stack so it’s approachable for non-specialists. This means making sure data pipelines don’t break easily, centralizing data storage so teams can get to what they actually need, and putting more of the data management on autopilot so teams can focus on finding the insights that will drive your business forward. Stakeholder teams shouldn’t need a data boot camp to do their jobs; they should be able to quickly find what they need and dig in deeper.
Infrastructure Should Improve Speed
Unfortunately, many companies feel the need to put up roadblocks in the form of access requests or specific data pulls to manage who can use specific data sources. A better approach is to focus on keeping data clean and cataloged so employees can access what they need on their own. When data is categorized properly, it’s much easier to add guardrails around privileged data like Personally Identifiable Information (PII) to ensure proper access, without putting up a roadblock to an entire source of data.
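The guardrail idea above can be mechanically simple once the catalog exists. As a sketch (the column names, roles, and masking scheme are hypothetical, not from the article), a catalog of PII columns lets you mask sensitive fields for most users instead of blocking the whole dataset:

```python
# Sketch of catalog-driven PII guardrails (column names and the
# masking rule are hypothetical): mask cataloged PII columns rather
# than denying access to the entire data source.
PII_COLUMNS = {"email", "phone", "ssn"}  # would come from a data catalog

def redact_row(row, user_can_view_pii=False):
    """Return a copy of `row` with PII columns masked for users
    who lack PII access; privileged users see the raw values."""
    if user_can_view_pii:
        return dict(row)
    return {col: ("***" if col in PII_COLUMNS else val)
            for col, val in row.items()}

order = {"order_id": 1042, "region": "EMEA", "email": "a@example.com"}
print(redact_row(order))
# {'order_id': 1042, 'region': 'EMEA', 'email': '***'}
```

Most modern warehouses offer column-level masking policies that implement this same pattern natively; the point is that access control follows the catalog, not the other way around.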
When you look at data operations and business intelligence today, many analytics reports are focused on tracking small changes and making incremental improvements. Think about a public company trying to push its earnings per share from $2.10 to $2.15. This is a standard task for a data team. But when revenue jumped or dropped by 50% in a quarter, which happened to many companies after the pandemic, trying to squeeze out a couple of percentage points on the bottom line is missing the point.
Instead, challenge your data teams to do what humans do best: take a step back, look at what is important for the business today, and consider how you can combine data in new ways to make sense of these changing markets. But you need access to the data first.
- Maximizing IT Infrastructure ROI While Minimizing Costs - May 18, 2023