96 Data Management Predictions from 52 Experts for 2023

Solutions Review editors received data management predictions from experts for 2023, part of the 4th-annual BI Insight Jam.

As part of Solutions Review’s fourth-annual #BIInsightJam, we called for the industry’s best and brightest to share their data management predictions for 2023. The experts featured here represent the top data management providers with experience in this niche. These predictions have been vetted for relevance and for their ability to add business value. They are the best of the dozens we received, and we believe they are actionable and may impact a number of verticals, regions, and organization sizes.

Note: Data management predictions are listed in the order we received them.

Data Management Predictions from Experts for 2023


Christian Buckner, SVP Data Analytics and IoT at Altair

Big data isn’t dead yet

“Providers will attempt to get ahead of trends, and we will see many start to advertise that “big data is dead.” Instead, many organizations are leaning into “smart data” for greater insights. But despite the advertisements, big data will continue to play an important role in business operations — for now. The key is to make sure you have easy-to-use, self-service tools in place that enable cleansing, verifying, and prepping of the data that can then be plugged into a data analytics model for valuable results and smart decisions. The companies that turn their big data into smart data will be the ones that benefit from the new ways of thinking about data.”
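
To make the idea of turning raw big data into “smart data” concrete, here is a minimal, hypothetical Python sketch of the kind of self-service cleansing, verification, and prep step the author describes. The column names and rules are illustrative assumptions, not part of the original prediction.

```python
import pandas as pd

def prep_smart_data(raw: pd.DataFrame) -> pd.DataFrame:
    """Cleanse, verify, and prep raw records before analytics (illustrative)."""
    df = raw.drop_duplicates()

    # Cleanse: normalize identifiers and drop rows missing required values.
    df["customer_id"] = df["customer_id"].astype(str).str.strip()
    df = df.dropna(subset=["customer_id", "order_total"])

    # Verify: enforce simple sanity rules before the data reaches a model.
    assert (df["order_total"] >= 0).all(), "negative order totals found"

    # Prep: derive the features a downstream analytics model would consume.
    df["order_date"] = pd.to_datetime(df["order_date"], errors="coerce")
    df["order_month"] = df["order_date"].dt.to_period("M").astype(str)
    return df
```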

Andy Palmer, Co-Founder and CEO at Tamr

The future of data lakes

“For years, data lakes held the promise of taming data chaos. Many organizations dumped their ever-growing body of data into a data lake with the hope that having all their data in one place would help bring order to it. But data lakes are overhyped and often lack proper governance. And without clean, curated data, they simply do not work. That’s why many organizations that implemented data lakes are realizing that what they actually have is a data swamp.

Having clean, curated data is valuable. That’s a fact. But dirty data swamps are not, and organizations must prioritize accurate and integrated data, develop a strategy for eliminating data silos, and make cleaning data everyone’s responsibility.”

Nick Halsey, CEO at Okera

Privacy regulations will continue to proliferate, requiring a proactive approach

“Anxiety about the proliferation of data privacy regulations – around the world and within U.S. states – will ratchet up in 2023. Driven by both the fear of fines and damage to brand reputation, companies progressing on their compliance journey will shift their concern from simply how to comply to how to arbitrate among different regulations. A common approach will be to fulfill the technical requirements for one major regulation, perhaps CCPA or GDPR, then layer in the required capabilities for other regulations as needed. This wait-and-see approach toward regulatory compliance will result in companies falling further behind while risks continue to increase – if they don’t act decisively in the coming year.”

Governance goes real-time

“When we think of governance, we usually think about putting a policy in place – what role can access what data – and having the system allow or disallow user access based on that policy. State-based policies change the game. Some regulations restrict not only who can access what data, but also where the authorized users are allowed to be when they attempt to access the data. Other regulations restrict access depending on the date, time, system status, and other variables. This combination of state-based regulations and variables implies a more refined data access policy, placing a new layer of requirements on governance systems. The policy, no longer static, must react to these variables in real time. In 2023, we will see increasing pressure on enterprises and vendors to put the tools in place that enable real-time, state-based policy enforcement.”
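
As a rough illustration of the real-time, state-based enforcement described above, the hypothetical Python sketch below evaluates an access request against role, requester location, and time of day. The specific attributes, datasets, and rules are assumptions for illustration only, not any vendor's actual policy engine.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class AccessRequest:
    user_role: str       # who is asking
    user_country: str    # where they are right now
    dataset: str         # what they want to read
    requested_at: datetime

def is_allowed(req: AccessRequest) -> bool:
    """Evaluate a state-based policy at request time (illustrative rules)."""
    # Static policy: only analysts may read the claims dataset.
    if req.dataset == "claims" and req.user_role != "analyst":
        return False
    # State-based policy: this dataset may only be read from EU locations.
    if req.dataset == "claims" and req.user_country not in {"DE", "FR", "IE"}:
        return False
    # State-based policy: no access outside a 06:00-22:00 UTC window.
    hour = req.requested_at.astimezone(timezone.utc).hour
    return 6 <= hour < 22

print(is_allowed(AccessRequest("analyst", "DE", "claims",
                               datetime.now(timezone.utc))))
```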

Nik Acheson, CDO at Okera

Data subject access requests get supercharged

“With more breaches becoming public, policy makers are being forced to represent a frustrated consumer base and hold companies more accountable. As such, we’re continuing to see a boom in policies, regulations, and permissibility, with corporate executives being held accountable for not following best practices.

In 2023, new technologies, along with attention from the legal community, will pick up steam, enabling individuals to gain greater visibility into and control over what, where, and how their data is being used. At the same time, this will cripple many enterprises that still struggle with over-provisioning of data, lack of full visibility, and legacy patterns operating in contemporary distributed data environments.”

Data contracts become more real and the business finally gets involved

“Too many engineering teams are struggling to maintain data quality, access, and track usage patterns. While many businesses have in-function analytics professionals collaborating with core enterprise analytics teams, data and/or analytics engineering professionals are still navigating data domains and certifying data coming out of data build tools.

In 2023, the continued proliferation of data is going to finally force the business to take more ownership of data, not just in use and interpretation, but also in the patterns of how it is managed and provisioned. Distributed stewardship will become a reality, and the best way to enable this will be with tools that are not built for engineers, but with data contracts that clearly map ownership, use, dependencies, and so on. This will become more visible as features in data catalogs and/or in a few emerging startups, since Confluence will not cut it at scale.”

Larger separation of companies embracing data mesh vs. those locking down

“As we saw with GDPR in the early days, data access requests lead to two dominant responses: accelerated digital and data innovation and modernization versus deletions and lockdowns. Much as during the Covid pandemic, some companies liquidated assets and struggled to survive, while others embraced the new digital landscape and realized that things will not be the same and that customers are going to change their behaviors, buying patterns, and interests. In 2023, companies looking to survive long-term will need to address the evolving policies around data permissibility, usage, and visibility or be hit with lower valuations and legal troubles. Customers will increasingly demand more accountability and respond with their purchasing behavior towards companies they no longer trust.”

Increased industry cross-platform connection

“It was only a few years ago that a CEO at a fast-growing unicorn company told me not to use another technology company, another unicorn that is now operating at the center of the data universe. This CEO noted that they’ll never integrate with them, that they would force me to use their native tools, and that the company would either build it better or buy them. Given my dependence on this particular company as a critical part of my architecture, he was right—I had no choice at the time but to oblige. In the few years since, I continue seeing connected technology communities growing and collaboration happening between leaders. These leaders openly discuss concerns and shared customers’ needs, and they deeply care about the combined success. And I see companies continuing to focus and expand on niche areas with a bias for open architectures.

In FY23, we will see much more public sharing of “modern data stacks,” with technology companies promoting one another, others improving abstraction layers, and enterprises continuing to go all-in on modularity and building intentional maturity routes that deliver improved business and customer outcomes faster. Data proliferation and democratization get dangerous when data and systems fail to operate across the full spectrum of visibility, and in FY23 we will see technology companies continue to take steps toward customer success via enablement and more open connectedness.”

Armon Petrossian, Co-Founder and CEO at Coalesce

The rise of Data-as-a-Product

“In 2023, data-as-a-product will reach maturity, resulting in increased quality of and trust in data at companies. This will lead to more robust data organizations within enterprises, which will in turn drive increased need for data modeling technologies and for data teams and engineers.”

Satish Jayanthi, Co-Founder and CTO at Coalesce

The return of data modeling

“In 2023, industry veterans who spent nearly a decade calling for thoughtfulness in building fundamental data infrastructure instead of rushing to build buzzworthy products will get their “I told you so” moment. Data modeling is making a comeback, alongside the realization that without the infrastructure to deliver high-quality data, businesses will not get very far towards the promise of predictive analytics, machine learning/AI, or even making truly data-driven decisions.”

The rise and fall of everything-as-code

“In 2023, as budgets likely continue to tighten, a trend will emerge towards seeking optimization and productivity. Rather than continuing to grow teams, companies that are forced to do more with less will look towards ways to automate data processes that they once did manually. That is good news for platforms and tools that enable automation, are simple to use, and free up time spent on repetitive tasks to focus instead on creating impact for the business.”

Stephen Cavey, Chief Evangelist at Ground Labs

Automation rises

“A quiet revolution is taking place as companies unite long-siloed systems and processes for the first time. Many firms can now adopt a “data-first” strategy to measure, optimize, secure, and automate every possible process. Implementing this strategy is a significant and gradual effort — far greater than any single team can handle alone. While the business will set the direction, define the standards, choose the tools and provide the governance, end-users will continue to play a key part in ensuring the security and management of their data with the help of automation. Within the next 10 years, security technologies and data management will be automated by default, rather than by exception.”

Ryan Welsh, Founder and CEO at Kyndi

Businesses will finally benefit from their unstructured data

“IDC estimates that 80 percent of all data is unstructured, or free-form, making it difficult to assess and derive insights from. Organizations struggle to extract relevant insights when they search for answers in text data, mainly because the search tools they are using are not capable of effectively and efficiently processing unstructured data.

Recognizing the immense value that is being left on the table, organizations in 2023 will apply practical methods to dramatically improve efficiency and unlock the value that has been elusive for so long. Remote and hybrid work have exacerbated the pain of unsatisfying search outcomes because so many employees work from their own locations and access information at different hours, making information sharing within an organization a major challenge. You can’t simply reach out to your colleague sitting next to you for answers whenever you think necessary. In the coming year, expect to see employees turning to Natural Language Search tools to find relevant information across all structured and unstructured sources.”

Ravi Mayuram, CTO at Couchbase

Cloud databases will reach new levels of sophistication in an era where fast, personalized, and immersive experiences are the goal

“From a digital transformation perspective, it’s about modernizing the tech stack to ensure that apps are running without delay – which in turn gives users a premium experience when interacting with an app or platform. Deploying a powerful cloud database is one way to do this. There’s been a massive trend toward going serverless, and using cloud databases will become the de facto way to manage the data layer. In the next year, we will also see the decentralization of data as it moves closer to the edge to offer faster, more dependable availability. Additionally, we’ll start to see the emergence of AI-assisted databases to enable teams to do more with less. The proliferation of data will only continue, making AI-assisted databases a critical strategy for making the data lifecycle more operationally efficient for the business.”

Chris Lubasch, Chief Data Officer & RVP DACH at Snowplow

Modern data stack

“It was a year of fast-moving discussions around the modern data stack. Lots of new vendors popped up, and major ones like Snowflake and Databricks continue their journey to take over many technical components, despite the challenging economic situation. But at the same time, voices emerged questioning the modern data stack itself, whose decoupled approach often leads to many tools and high costs, to say nothing of the complexity of getting it all to work together. Discussions around the ‘postmodern data stack’ (just one of many terms) have started, and we’re all eager to see where this will lead us in the coming years.”

Bassam Chahine, Senior Consultant at Instaclustr by NetApp

Data management best practices

“2023 will see enterprises searching to wring ever-greater cost efficiency from their data layer technology, as economic uncertainty drives budget constraints. As a best practice, organizations should use this as an opportunity to explore their options among the many proven (and enterprise-ready) open-source data technologies available that can directly replace more expensive proprietary (or open core) alternatives. Completely, 100 percent open-source data-layer technologies, such as Apache Cassandra, Apache Kafka, Postgres, Redis, and more, offer world-class availability, scalability, and performance to meet the most data-intensive use cases.

But just as importantly, open-source data-layer technologies in their fully open-source versions offer the freedom for enterprises to own their own code while paying zero in licensing fees. By contrast, open core offerings are increasingly an open con, taking true open-source projects and repackaging them along with add-ons that are supposed to justify expensive licensing costs. Open core solutions are often passed off as having open-source portability, but the entire business model is based on capturing enterprises with vendor and technical lock-in. As a best practice in the coming year, enterprises should focus on recognizing and avoiding open core in their data layer, increasing flexibility while stretching their budgets.”

Brian Dunagan, CTO at Retrospect

Data management

“Freedom and flexibility will become the mantra of virtually every data management professional in the coming year. In particular, data management professionals will seek data mobility solutions that are cloud-enabled and support data migration, data replication and data synchronization across mixed environments including disk, tape and cloud to maximize ROI by eliminating data silos. We will likewise see an uptick in solutions that support vendor-agnostic file replication and synchronization, are easily deployed and managed on non-proprietary servers and can transfer millions of files simultaneously – protecting data in transit to/from the cloud with SSL encryption.”

Deepak Mohan, EVP of Engineering at Veritas Technologies

More scrutiny on cloud budgets in 2023

“According to Veritas research, 94 percent of organizations are overspending on cloud and are going over their allocated cloud budgets by an average of 43 percent. As the amount of data continues to grow year over year, so does the cost of storing it in the cloud, which is becoming harder to justify. Though most companies have advanced their business strategies through cloud adoption, CEOs and boards will increasingly demand transparency surrounding the ROI of cloud spend.

With many economists predicting a continued downturn next year, we expect scrutiny on IT spending to intensify further in 2023, which will put pressure on IT leaders to justify their cloud budgets while identifying new ways to reduce data volumes. This could lead to more effective data storage and management strategies, such as deduplication techniques to reduce storage consumption.”
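
As a rough illustration of the kind of deduplication technique mentioned above, the hypothetical Python sketch below stores file chunks by content hash so identical blocks are kept only once. The chunk size and in-memory storage layout are assumptions made purely for illustration.

```python
import hashlib

class DedupStore:
    """Toy content-addressed store: identical chunks are stored only once."""

    def __init__(self, chunk_size: int = 4 * 1024 * 1024):
        self.chunk_size = chunk_size
        self.chunks: dict[str, bytes] = {}      # content hash -> chunk data
        self.files: dict[str, list[str]] = {}   # filename -> ordered hashes

    def put(self, name: str, data: bytes) -> None:
        hashes = []
        for i in range(0, len(data), self.chunk_size):
            chunk = data[i:i + self.chunk_size]
            digest = hashlib.sha256(chunk).hexdigest()
            self.chunks.setdefault(digest, chunk)  # deduplication happens here
            hashes.append(digest)
        self.files[name] = hashes

    def get(self, name: str) -> bytes:
        return b"".join(self.chunks[h] for h in self.files[name])
```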

Low code/no code applications will create compliance issues

“Low code/no code application development has been instrumental in democratizing application development across companies. In 2023, low code/no code adoption will become mainstream, and non-technical employees (citizen developers) across any organization will have the power to create their own app. While this will significantly alleviate the burden on IT teams, it will also create a big compliance risk for organizations. Because citizen developers don’t have the same experience in implementing security and privacy, most of the applications they develop won’t be adequately protected and protection policies may be inaccurately applied. As a result, not only will organizations face compliance issues, their applications may also create new vulnerabilities for bad actors to exploit.”

More edge devices mean more vulnerabilities

“Gartner predicts that by 2025, more than 50 percent of enterprise-managed data will be created and processed outside the data center or cloud. As more data processing moves to the edge, it complicates IT architecture and increases the attack surface. What’s more, enterprises often don’t apply the same level of protection to the edge as they do in the data center or the cloud, often due to skills and staffing shortages. To fully protect the enterprise, each of these edge devices needs to be protected and backed up. On top of that, organizations need to determine which data coming from edge devices is critical versus non-critical in order to manage storage and protection costs, given the added scrutiny on IT budgets.”

Haoyuan Li, CEO at Alluxio

Multi-cloud adoption is accelerating as organizations’ data strategies evolve

“As more organizations evolve their data strategies in 2023, multi-cloud data infrastructure adoption is accelerating and will become the new norm. Organizations are expected to embrace this trend and ensure their cloud applications are portable regardless of cloud provider. More organizations will transform cloud computing into an undifferentiated commodity and ease application burden. They aim to realize flexibility, security, and agility while simplifying their operations.”

Big models for AI are driving innovations in specialized infrastructure and solutions

“Over the past few years, AI and deep learning have become mainstream and reached the same maturity level as data analytics. Big models, from OpenAI’s DALL-E 2 image generation model to Google’s LaMDA conversation agent, are expected to dominate the landscape in 2023. Billions of files will be used to train big models for longer periods of time, requiring more specialized infrastructure and solutions. Next-generation AI infrastructure will be developed to handle the scale.”

From centralized Hive catalog to open table formats in data lakes

“With data lakes becoming the primary destination for a growing volume and variety of data, having a table format for data stored in a data lake is a no-brainer. More organizations have now realized that Hive catalogs have become the central bottleneck. In the cloud-native era, decentralized open data table formats are popular, especially in large-scale data platforms. In 2023, we can expect to see more enterprise data being stored in open table formats as Apache Iceberg, Hudi, and Delta Lake are rapidly adopted.”
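
For readers unfamiliar with open table formats, here is a minimal, hypothetical PySpark sketch that writes to an Apache Iceberg table through a local catalog. The catalog name, warehouse path, and schema are assumptions, and the Iceberg Spark runtime package must be available to Spark for this to run.

```python
from pyspark.sql import SparkSession

# Assumes the iceberg-spark-runtime package has been added to the Spark classpath.
spark = (
    SparkSession.builder.appName("iceberg-demo")
    .config("spark.sql.catalog.local", "org.apache.iceberg.spark.SparkCatalog")
    .config("spark.sql.catalog.local.type", "hadoop")
    .config("spark.sql.catalog.local.warehouse", "/tmp/iceberg-warehouse")
    .getOrCreate()
)

# Create an Iceberg table and append a few rows; any engine that speaks
# Iceberg (Spark, Trino, Flink, and so on) can then read the same table files.
spark.sql(
    "CREATE TABLE IF NOT EXISTS local.db.events (id BIGINT, kind STRING) USING iceberg"
)
spark.createDataFrame([(1, "click"), (2, "view")], ["id", "kind"]) \
    .writeTo("local.db.events").append()
```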

Demand for simplified data access and data sharing is on the rise

“Data has become increasingly distributed as the amount of data grows. In 2023, organizations will have an ever-increasing need to manage their scattered data wherever it exists. Furthermore, data sharing across organizations and platforms will become more critical. It will be necessary for organizations to develop and implement a data strategy for managing and sharing distributed data across regions, organizations, clouds and platforms.”

Angel Viña, CEO and Founder at Denodo

As recession looms, companies will look to optimize infrastructure cost

“Whether North America is in recession or not, companies are actively cutting costs, and reducing IT infrastructure has always been an easy choice for CEOs. While compute and storage costs continue to fall with cloud usage, organizations can still face huge bills given their heavy investments in data and analytics infrastructure. Thanks in part to the breadth of choices of storage, compute, and applications, companies often take a rip-and-replace strategy to modernize their data and analytics efforts. That approach is not only costly, but it can often lead to disruption in IT operations. In 2023, more companies will see IT focusing on modern, non-disruptive ways to update their IT infrastructure, whether their data resides entirely in one cloud, multiple clouds, or in a hybrid environment including on-premises.”

While multi-cloud gets real, FinOps in cloud becomes necessary

“For many companies, strategic data assets are spread across multiple clouds and geographical locations, whether that is because various business units or locations have their preferred cloud service provider (CSP), or because mergers and acquisitions have led these assets to reside within different cloud providers’ boundaries. As more data continues to move to the cloud, and different geographies see prominence of certain cloud providers over others, there is accelerated adoption of multi-cloud architecture among multinational corporations. Currently, there is no easy way to manage and integrate data and services across these different CSPs. Failure to address this problem always results in data silos and a fragmented approach to data management, leading to data access and data governance complications.

Also, and contrary to popular belief, cloud costs are increasingly becoming a material expense, driven by factors such as the sheer volume of data and the related egress charges. For many organizations, cloud investments do not deliver the economic and business benefits intended. As a result, they are leveraging FinOps to provide a framework for controlling cloud costs and usage, identifying cost vs. value, and understanding how to optimally manage spend across modern hybrid and multi-cloud environments. In the coming year, expect FinOps to gain momentum as a critical initiative to help companies better manage their hybrid-cloud and multi-cloud spend.”

Accelerated adoption of data fabric and data mesh

“Over the past two decades, data management has gone through cycles of centralization vs. decentralization, including databases, data warehouses, cloud data stores, data lakes, etc. While the debate over which approach is best has its own proponents and opponents, the last few years have proven that data is more distributed than centralized for most organizations. While there are numerous options for deploying enterprise data architecture, 2022 saw accelerated adoption of two data architectural approaches – data fabric and data mesh – to better manage and access distributed data. There is an inherent difference between the two: data fabric is a composable stack of data management technologies, while data mesh is a process orientation for distributed groups of teams to manage enterprise data as they see fit. Both are critical to enterprises that want to manage their data better. Easy access to data, and ensuring it’s governed and secure, is important to every data stakeholder — from data scientists all the way to executives. After all, it is critical for dashboarding and reporting, advanced analytics, machine learning, and AI projects.

Both data fabric and data mesh can play critical roles in enterprise-wide data access, integration, management, and delivery when constructed properly with the right data infrastructure in place. So in 2023, expect a rapid increase in adoption of both architectural approaches within mid-to-large-size enterprises.”

Augmentation of data quality, data preparation, metadata management, analytics

“While the end result of many data management efforts is to feed advanced analytics and support AI and ML efforts, proper data management itself is pivotal to an organization’s success. Data is often called the new oil, because data- and analytics-based insights are constantly propelling business innovation. As organizations accelerate their usage of data, it’s critical for companies to keep a close eye on data governance, data quality, and metadata management. Yet, as the volume, variety, and velocity of data continue to grow, these various aspects of data management have become too complex to manage at scale. Consider the amount of time data scientists and data engineers spend finding and preparing the data before they can start utilizing it. That is why augmented data management has recently been embraced by various data management vendors: with the application of AI, organizations are able to automate many data management tasks.

According to some of the top analyst firms, each layer of a data fabric (namely data ingestion, data processing, data orchestration, data governance, and so on) should have AI/ML baked into it to automate each stage of the data management process. In 2023, augmented data management will find strong market traction, helping data management professionals focus on delivering data-driven insights rather than being held back by routine administrative tasks.

While these are the five most important trends in our mind, there are other areas of data and analytics practice that will shape how digital businesses not only survive but thrive in 2023 and beyond. The last two to three years have definitely taught us that digital business is not really a fallback option for when the world cannot meet in person; it is where the future lies. Hopefully your organization can gain some insights from this article as you lay out your digital business plan.”

Brian Anderson, CEO at Nacelle

Data flow becomes the engineering focal point

“Due to the significant growth in online commerce, many new best-of-breed vendors are focusing on solving just one problem and making that solution 10x better than what the out-of-the-box, monolithic platforms offer. This trend will become increasingly beneficial to merchants as it allows them to compose a stack of solutions that best meets their needs.

Today, most vendors offer an API, yet forward-thinking CTOs know that an API is not enough; data flow to and from the network of individual best-of-breed vendor solutions becomes the key to success. In 2023, shrewd architecture work will implement best practices in this distributed world, and engineering patterns like data normalization, event replays, data transformations, and abstraction will become the norm.”
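
As one small, hypothetical illustration of the data-normalization pattern mentioned above, the Python sketch below maps payloads from two imaginary best-of-breed vendors onto a single canonical order record. The vendor names and field names are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class CanonicalOrder:
    order_id: str
    amount_cents: int
    currency: str

def normalize(vendor: str, payload: dict) -> CanonicalOrder:
    """Translate vendor-specific payloads into one canonical shape."""
    if vendor == "checkout_vendor":
        return CanonicalOrder(
            order_id=payload["id"],
            amount_cents=int(round(payload["total"] * 100)),
            currency=payload["currency"].upper(),
        )
    if vendor == "marketplace_vendor":
        return CanonicalOrder(
            order_id=payload["orderRef"],
            amount_cents=payload["amountMinor"],
            currency=payload["ccy"],
        )
    raise ValueError(f"unknown vendor: {vendor}")

print(normalize("checkout_vendor", {"id": "A1", "total": 19.99, "currency": "usd"}))
```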

Lior Gavish, Co-Founder and CTO at Monte Carlo

Data contracts

“Designed to prevent data quality issues that occur upstream when data-generating services unexpectedly change, data contracts are very much en vogue. Why? Partly because changes made by software engineers can unknowingly create ramifications that affect the downstream data pipeline, and partly because the rise of data modeling gives data engineers the option to deliver data into the warehouse pre-modeled. 2023 will see broader data contract adoption as practitioners attempt to apply these frameworks.”
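
To make the idea concrete, here is a minimal, hypothetical Python sketch of a data contract that pins the schema an upstream service promises to emit, plus a check a pipeline could run before loading. The table, owner, and columns are illustrative assumptions, not a specific vendor's contract format.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class DataContract:
    table: str
    owner: str               # accountable team
    columns: dict[str, str]  # column name -> expected Python type name

ORDERS_CONTRACT = DataContract(
    table="orders",
    owner="checkout-team",
    columns={"order_id": "str", "amount_cents": "int", "created_at": "str"},
)

def validate(record: dict, contract: DataContract) -> list[str]:
    """Return a list of contract violations for one incoming record."""
    problems = []
    for col, expected in contract.columns.items():
        if col not in record:
            problems.append(f"missing column: {col}")
        elif type(record[col]).__name__ != expected:
            problems.append(
                f"{col}: expected {expected}, got {type(record[col]).__name__}"
            )
    return problems

print(validate({"order_id": "A1", "amount_cents": "12"}, ORDERS_CONTRACT))
```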

Data monetization

“In lean times, data teams have more pressure than ever to align their efforts with the bottomline. Data monetization is a mechanism for data teams to directly tie themselves to revenue. It also allows for the addition of data insights and reporting to products, a differentiator within an increasingly competitive marketplace.”

Infrastructure as code

“Modern data operations require hyper-scalable cloud infrastructures, but constantly provisioning and maintaining these services can be tedious and time-consuming. IaC allows data engineers to create more seamless data pipeline infrastructure that is easier to provision, deprovision, and modify – critical when budgets are tight and headcount is limited.”
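
As a rough sketch of infrastructure as code for a data pipeline, the hypothetical Python/Pulumi example below declares a storage bucket and exports its name; running it under `pulumi up` provisions the resource, and `pulumi destroy` deprovisions it. The resource names and tags are assumptions, and any IaC tool (Terraform, CloudFormation, CDK) would serve equally well.

```python
import pulumi
import pulumi_aws as aws

# Declare the pipeline's landing bucket declaratively rather than clicking
# through a console; the desired state lives in version control.
raw_bucket = aws.s3.Bucket(
    "raw-events",
    force_destroy=True,  # allow clean teardown of test stacks
    tags={"team": "data-eng", "env": pulumi.get_stack()},
)

pulumi.export("raw_bucket_name", raw_bucket.id)
```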

Data reliability engineering

“All too often, bad data is first discovered by stakeholders downstream in dashboards and reports instead of in the pipeline – or even before. Since data is rarely ever in its ideal, perfectly reliable state, data teams are hiring data reliability engineers to put the tooling (like data observability platforms and data testing) and processes (like CI/CD) in place to ensure that when issues happen, they’re quickly resolved and their impact is conveyed to those who need to know before the CFO finds out.”
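
Below is a minimal, hypothetical example of the kind of pipeline check a data reliability engineer might wire into CI/CD: it fails fast on stale or null-ridden data before a dashboard ever sees it. The thresholds and column name are illustrative assumptions.

```python
import pandas as pd

def check_reliability(df: pd.DataFrame, loaded_at_col: str = "loaded_at") -> None:
    """Raise early if data looks stale or badly incomplete (illustrative thresholds)."""
    # Freshness: the newest record should be less than 2 hours old.
    newest = pd.to_datetime(df[loaded_at_col], utc=True).max()
    if pd.Timestamp.now(tz="UTC") - newest > pd.Timedelta(hours=2):
        raise AssertionError(f"stale data: newest record loaded at {newest}")

    # Completeness: no column should be more than 5 percent null.
    null_rates = df.isna().mean()
    bad = null_rates[null_rates > 0.05]
    if not bad.empty:
        raise AssertionError(f"columns exceeding 5% nulls: {list(bad.index)}")
```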

Alexander Lovell, Head of Product at Fivetran

2023 will be put up or shut up time for data teams

“Companies have maintained investment in IT despite wide variance in the quality of returns. With widespread confusion in the economy, it is time for data teams to shine by providing actionable insight because executive intuition is less reliable when markets are in flux. The best data teams will grow and become more central in importance. Data teams that do not generate actionable insight will see increased budget pressure.”

We will see companies taking a deeper look at cloud data warehouse costs

“Information on total cost of ownership is critical to effectively managing cost for data teams because driving down costs one at a time will undermine the efficacy of the whole system. The burden of proof shifts to data teams to demonstrate efficiency with total cost of ownership and that data insights are driving excess value to the business.”

Meera Viswanathan, Lead Product Manager at Fivetran

The modern data stack

“Companies will be looking to standardize their data stack at an increasing rate in 2023 because of too many tools being used across the company. Further, it’s crucial they make decisions based on the right data – especially with things like revenue being under an increasingly critical eye. Data trust and visibility will be more important than ever due to these factors.” 

Data governance

“Data enablement will be important, but data governance might be an afterthought in 2023 – primarily because governance is typically a failed initiative and seen as insurance only when required. The most successful enterprises will be defined by those that enable their data teams to have quick and easy access to trusted data.

There will be more standardization in the governance market in 2023, with metadata exchange formats being defined.”

Balaji Ganesan, Co-Founder and CEO at Privacera

Data security

“We’ve recently seen many new vendors in the data security space looking to solve several challenges: what exactly is in the data that enterprise-level companies have and use, which data is (or isn’t) sensitive, how to ensure the right people have access to that data, and how to ensure they remain compliant with increasing regulation. This shows no sign of slowing down.”

Mona Rakibe, Co-Founder and CEO at Telm.ai

Data quality

“Today most companies have become data-driven in their decisions and have embraced a modern data architecture to accelerate that path. With this shift, the importance of data quality (DQ) is also rising rapidly. DQ has started becoming its own function, with product managers at the head of this function, spearheading the mission to raise the value of data. To ensure they win from day one, DQ product managers hold data quality to the same rigor and standards as any other product. There is an MVP to showcase the highest and most impactful value, followed by incremental progress. There is constant value assessment and prioritizing of data improvements on the most pressing issues, and there is a proper product life cycle – design, usability, engineering, quality control, performance, and user trust and adoption.”

Ryan Splain, Director of Customer Success at ZL Technologies

Company data culture

“With terms like ‘quiet quitting’ and ‘the Great Resignation’ rising in popularity over the last few years, we’ll see a definitive shift in the way employers use and analyze employee data. In the past, data has traditionally been used by employers for business analysis, but looking ahead to 2023, there is a ripe opportunity for organizations to harness unstructured data to understand employee sentiment, workplace culture, and productivity levels, and to gain a better sense of the commitment of their current workforce.”

Unstructured data

“With the economy in a state of flux, more and more organizations are looking to hire project-based contractors or part-time employees. With this, there is an opportunity for organizations to leverage unstructured data to glean insights and information that is key to making decisions in support of a target objective on a case-by-case basis. Unstructured data uncovers the temporal, spatial and/or elemental relationship of employee data that cannot be gleaned from traditional structured data sources. This information will help organizations understand the performance and productivity of full-time vs. contract employees and make informed decisions about hiring needs (where full-time positions are needed vs. contract work, etc.).”

Stijn Christiaens, Co-Founder and Chief Data Citizen at Collibra

Companies embracing data culture will see the biggest ROI

“As the datasphere continues to explode and companies more clearly understand the value of data, 2023 will be all about the people behind it. In other words, the companies that see the greatest success in 2023 will be those who fully embrace the building of a data culture, giving more people access to data and prioritizing data literacy across their organizations. We will start to see more organizations move towards a decentralized and democratized approach to data.”

Madalina Tanasie, CTO at Collibra

Data mesh will continue to grow in popularity

“The future of cloud adoption relies on a migration to a data mesh architecture. In the past few years, data mesh has emerged as a promising trend in enterprise tech as more organizations look to decentralize data and prioritize domain ownership. Moving into 2023, we will see more companies electing to implement data mesh in order to drive cloud adoption and avoid disturbances to application performance.”

Kirk Haslbeck, VP of Data Quality at Collibra

Data observability is on the agenda for C-suite discussion

“As more and more companies begin to treat data as a product, we will start to see an emphasis on data democratization, giving everyone in an organization access to data to help streamline processes. For this reason, companies will start to more formally train employees on data observability, putting a heavy focus on data quality to ensure data is up-to-date and relevant across the organization.”

Jennifer Kuvlesky, Director of Product Marketing at Snow Software

Proper data framework essential to expanding SaaS Offering

“We’re going to see the continued expansion of SaaS functionality, or companies acquiring functionalities, to bundle offerings and integrate to drive expansion revenue in the coming year. However, you need to have the data framework in place to be able to integrate these acquisitions and for the capabilities to work together. To effectively sell bundled offerings, you need an effective data and presentation layer to simplify things for the customer. Salesforce recently announced this transformation with their new Salesforce Genie data layer to make app integrations faster.

Technology intelligence companies will seek to do this using data integration to bring silos of data together to provide more value for IT teams. We see this in various industries, such as marketing where there are a lot of disparate applications, but you can’t get value from them because the data is not integrated. Companies need to make their investments more usable by other areas of the business by correlating and integrating their tools and data to gain additional perspective.”

Gur Steif, President of Digital Business Automation at BMC Software

DataOps adoption

“In our conversations with large enterprise companies, a persistent challenge continues to be efficiently embedding complex data pipelines into critical enterprise applications to improve customer experiences.  They struggle to do so at scale in production, particularly across complex hybrid cloud environments. To improve data-driven application and business outcomes in the coming year, organizations will accelerate adoption of DataOps best practices and continue to invest in application and data workflow orchestration platforms.  To get the data-driven agility they desire, they realize giving their Ops and Data teams the freedom to deliver innovation within a secure, agile and operationally robust framework is a must.  Those that can successfully operationalize their data and analytics projects at scale will see greater success rates leading to improved business outcomes.”

Shireesh Thota, SVP of Engineering at SingleStore

Ushering an era of unified databases

“2023 is going to be the year of unified databases. Unified databases are designed to support large amounts of transactional and analytical workloads at the same time. This allows a simplified and flexible data architecture for companies to process massive workloads. 

In 2023, we will witness a convergence of specialized databases that will be built on four primary characteristics: distributed, shared-nothing architecture, cloud-native, multi-model and relational foundation. Organizations will need one platform to transact and reason with data in milliseconds in a hybrid, multi-cloud environment.

2022 saw many popular vendors move in this direction, and it will pick up a significant pace in the coming year.”

End in sight for legacy databases

“2023 will see an acceleration of the end for legacy databases. With the world moving towards real-time unified databases, speed has been an important differentiator, and legacy systems can’t keep up anymore with the real-time nature we are seeing in this digital services economy. We have seen this trend in industries like finance up until this point, but it’s now becoming apparent to business leaders across sectors that the digital revolution begins with the tech stack that holds your company together: the database.

We are ushering in an era of unified, simple, modern data in real time. Without this, your company will likely not see 2024.”

Distributed SQL gets a refresh and mainstream treatment

“Distributed SQL as it is now defined will be challenged. The current cohort doesn’t include real-time, unified/HTAP  capabilities. Along with the convergence of relational and non-relational, the increasing customer needs will force the convergence of transaction and analytical capabilities at scale, in what shall become the better version of distributed SQL. This trend will disproportionately accelerate the other trends!”

Mike Waas, Founder and CEO at Datometry

The return of SQL

“We are seeing a major sea change in the NoSQL space. In a reversal of their previous thrust to disavow all things SQL, all remaining NoSQL databases are now eager to add SQL and SQL-like language extensions to their products. 2023 might well become the year of the SQL-Again database.”

Jeff Tao, Founder, CEO and Core Developer at TDengine

Open-source digital transformation of traditional industries

“Open source and open systems will become even more critical over the next year. In particular, traditional industries like manufacturing in the United States will look to open systems to rebuild infrastructure to become more modern, cost-effective, and globally competitive. Open systems will allow traditional industries not to be locked in by legacy vendors and allow them to stay on the pulse of cutting-edge tools and technologies like AI, ML, AR, and more. They will remove data silos, allowing data to be shared easily internally or with outside partners for better analysis.

I predict that this transformation will happen in two ways. First, through embracing the cloud: with cloud-based open systems, engineers will be able to share data more easily and take full advantage of modern data processing, analytics tools, and the elasticity of the cloud to reduce operating costs. Second, democratizing infrastructure by embracing open-source projects will open traditional industries like manufacturing and automation to a larger developer community ecosystem.”

Tomer Shiran, Co-Founder and CPO at Dremio

Snowflake will become a niche technology as legacy providers’ costs rise

“In 2023 Snowflake will become more of a niche technology. With Snowflake’s costs increasing on average 71 percent year over year, based on their earnings report, customers are getting to a point where they can no longer afford to continue that kind of exponential increase in costs. Because of this, customers are going to be much more cautious about what they put in there, and will put up walls of approvals and rules regarding who’s allowed to use and access what. 

With companies becoming more careful in this regard, they will be looking for open alternatives. The demand to make data accessible and to become data driven is still there, and data’s still growing very fast. But, customers need systems that are able to do that at scale, and customers need them to be cost efficient. The industry is moving towards those types of systems.”

As companies need to save money, expensive vendor lock-in is out

“Our prediction is that the economy will enter a recession in 2023. Therefore, naturally, companies are going to be more cautious about how much they spend. Companies will also look for solutions where the cost is more predictable. Snowflake is the opposite of both of those, so it’s going to hurt them.

Additionally, when using open data architecture, like Dremio, you’re minimizing the number of copies of data you have. Thus, you’re minimizing how many data pipelines you need to have – and that cuts costs tremendously. All companies are going to want efficiency across all aspects of their businesses, and certainly efficiency of data infrastructure and how you do analytics is going to be part of that.”

Lakehouses will take over and leave warehouses in the past

“Enterprise adoption of lakehouse technology is skyrocketing. We’re seeing lots of movement in companies embracing open-source file and table formats. Will data warehouses go away in a year? We can’t say yes for sure, but trends are pointing in that direction.”

Donnie Berkholz, SVP of Product Management at Percona

Data ownership, sovereignty, and control will continue to expand

“Rules on data privacy and digital sovereignty are continuing to expand. Following on from the GDPR, CCPA and EU rules on data privacy, more countries have adopted these rules and regulations to protect their citizens. Countries want to prevent too much control over data by foreign companies. For the EU, this includes looking at how to manage this when US companies effectively own the cloud computing market, and what this means for the future.

This is a problem for businesses that have to operate across regions and countries, as they will have more restrictions on where they can and can’t process their data. Open-source database communities are responding to this – for example, PostgreSQL 15 launched this year with its improvements to Logical Replication, so you can set limits and geo-fence subsets of your data, restricting them to specific locations so they can’t be replicated outside where they are needed.”
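
As a hypothetical illustration of the PostgreSQL 15 capability described above, the Python sketch below uses psycopg2 to create a publication with a row filter so that only EU rows are eligible for replication to an EU subscriber. The connection details, table, and filter column are placeholders for illustration.

```python
import psycopg2

# Connection details are placeholders; replication privileges are assumed.
conn = psycopg2.connect("dbname=appdb user=replicator host=primary.example.internal")
conn.autocommit = True

with conn.cursor() as cur:
    # PostgreSQL 15 row filters: only EU customer rows join this publication,
    # so a subscriber outside the EU region never receives that data.
    cur.execute("""
        CREATE PUBLICATION eu_customers
        FOR TABLE customers WHERE (region = 'EU')
    """)
```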

PostgreSQL will continue to take over the world

“PostgreSQL continues to grow as a project and as a community. It will eventually take over the position that MySQL holds in the DB-Engines ranking and become the most popular open-source database, but this will take a while. There are lots of new projects being launched that base themselves on PostgreSQL and then offer their own spin on top.

The reason for this is that it is easy to make PostgreSQL do what you want it to, and the license it is released under makes it possible to build businesses on this as well. For users, it is simple to implement and the community is a strong one.”

Database reliability engineering will make a comeback

“Following on from the success of Site Reliability Engineering in the past five years, there was a move to apply the same methodology to database management.  However, Database Reliability Engineering (DBRE) did not catch on in the same way. For many companies, their existing database teams were enough, or they wanted to shift their approach to the cloud.  

However, the DBRE approach seems to be picking up again now. More people want to apply those lessons to how they manage database instances, reducing overheads and improving resiliency. The growth of database deployments on Kubernetes is partly responsible for this new wave of interest, so there should be more demand for DBREs in 2023.”

Lenley Hensarling, Chief Product Officer at Aerospike

Real-time data

“Despite an uncertain global economy, real-time data will continue to grow at 30%+ in 2023 as the need for an accurate, holistic, real-time view of a business increases. Enterprises will examine how to leverage real-time data to mitigate risk and find more value in margins and operational costs. For example, financial institutions compute risk models every five minutes to understand how to navigate change, rather than every few hours, which is critical in fluctuating global economies. Delayed responses can cost millions. 

Mitigating risk with real-time data will become more mainstream, expanding beyond the financial services sector to find more value in operations costs. Numerous industries like healthcare and manufacturing already use real-time data in day-to-day operations, but new industries will start to leverage the gains from instantaneously available data to control costs.”

Ed Macosky, Chief Innovation Officer at Boomi

CIOs will rely on low-code to offset skill shortages

“To combat IT skill shortages while addressing mounting pressures to keep costs in line, some businesses will take on low-cost applications that serve niche needs, creating numerous data silos that lead to quality and governance issues. To address this, we will see more CIOs partner with their C-Suite peers and adopt low-code iPaaS/automation tools that provide governance and security, while still fulfilling businesses’ needs.”

Rajesh Raheja, Chief Engineering Officer at Boomi

CIOs, chief data officers, and data managers will need to confront the cost of modernization

“Modernization and digital transformation involve many long-term initiatives like migrating major systems to the cloud or constantly integrating existing data and systems in order to meet business process needs. These can be high-cost initiatives, but are more and more necessary in today’s digital world. To minimize the risks of failures with these initiatives, executives should look for ways to balance their long-term vision with the short-term ROI of modernization, like working with a data integration platform or automating the integration process.”

Data-driven enterprises will explore new ways to deliver customer value through modern integration systems

“The proliferation of cloud-based systems and increase in SaaS applications will create massive opportunities for data-driven enterprises. We are seeing more data regulations and an explosion of IoT devices on the market. These factors will drive more focus on data governance and new potential uses for data-driven processes that will help meet the growing consumer expectations for custom, targeted service.”

Software engineering will mature to meet other engineering standards

“No doubt software is eating the world; however, the standards for building software are not up to par with other industries’ engineering standards, such as those used in manufacturing. As software is increasingly embedded throughout our daily lives, such as in cars and aircraft, it is now responsible for the safety of human lives. This means there will be more focus on software quality and design processes, on more mature DevOps and DevSecOps, and on security. Standards will evolve to provide full transparency on not only what the software does, but how it does it.”

Ben Haynes, Co-Founder and CEO at Directus

The rise of hybrid “bring-your-own-database” (BYODB) cloud deployments

“The benefits of moving certain data-driven projects to the cloud are undisputed — quicker deployment, reduced infrastructure and maintenance costs, built-in support and SLAs, and instant scalability when you need it. However, there will always be use case obligations that require keeping data on-premises, including performance, security, regulatory compliance, local development, and air-gapped hardware (to name a few). A more flexible solution is for modern data vendors to support hybrid “bring-your-own-database” (BYODB) cloud deployments in addition to the more common on-premises and fully-managed cloud service options. This new approach will catch on in the years ahead, allowing data to be kept in situ and unaltered but remotely connected to SaaS services that layer on top from nearby data centers. This provides all the benefits of the cloud, while still allowing for full authority and control over the company’s most precious resource… its data.”

Modern no-code and low-code solutions will follow a bottom-up approach

“No-code and low-code platforms like AirTable have been instrumental in democratizing company data. However, while they provide highly intuitive facades for non-technical business users, their top-down architecture is extremely limiting or inaccessible to engineers. While quickly adoptable, these band-aid services have an unconsidered backend that is unable to scale and therefore needs to be replaced over time. In the coming years, modern NC/LC solutions will follow a bottom-up approach that lays a foundational data layer comprised of powerful developer tools, performant APIs, tailored data stores, and an unopinionated tech stack. True data democratization can’t be achieved without equally enabling both non-technical and highly technical users.”

Organizations will work toward a “hub and spoke” approach to microservices

“There’s no question that the microservice approach is superior in most ways to legacy monolithic architectures. However, there are a number of downsides to a vast matrix of microservices. Overall complexity leads to a data ecosystem that is difficult to understand and maintain, requires many costly licenses, and forces a steep learning curve for user training and onboarding. These microservices don’t perfectly bookend each other, leaving gaps in capabilities that need to be filled with custom code and logic — and data is siloed across disparate platforms, with tenuous integrations. Moving forward, organizations will work toward a more balanced “hub and spoke” approach: they will turn to solutions that lay a complete and solid data foundation covering business needs (the “hub”) while still integrating with microservices to allow specialization as needed. This more balanced solution will avoid the over-rotation to microservice complexity.”

Marinela Profi, Data Scientist at SAS Software

Data management becomes automated with AI

“We continue to see organizations struggling to keep up with the speeds and feeds of their data, spending 80 percent of their time simply wrangling data and only 20 percent performing analysis and modeling. Over the next decade, one of the largest impacts AI can make toward overcoming this information overload is automating data management processes so customers can spend 80 percent of their time performing analysis and deploying more models into production.”

Jay Upchurch, EVP and CIO at SAS Software

Data becomes an organic entity with a life of its own

“Business leaders will embrace data as an asset with a lifecycle to make data democratization and data productization within a data mesh environment more feasible, creating an organic and fluid data portrait. Data will be tracked like inventory, with consideration given to possible data productization strategies and to opportunities to include data as a facet of the delivered end product. Understanding the shape and breadth of data at any point in time – whether that’s at the time a software package is deployed or when a decision is made based on the data – is something modern software users will expect.”

Enterprises move from traditional data warehouses to real-time data storage

“In 2023, we will continue to see movement away from traditional data warehousing to storage options that support analyzing and reacting to data in real time. Organizations will lean into processing data as it becomes available and storing it in a user-friendly format for reporting purposes (whether that’s as a denormalized file in a data lake or in a key-value NoSQL database like DynamoDB). Whether a manufacturer monitoring streaming IoT data from machinery, or a retailer monitoring ecommerce traffic, being able to identify trends in real time will help avoid costly mistakes and capitalize on opportunities when they present themselves.”
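
As a small, hypothetical illustration of processing data as it arrives and landing it in a report-friendly key-value store such as DynamoDB, here is a Python/boto3 sketch. The table name, keys, event shape, and threshold are assumptions, and AWS credentials plus an existing table are required.

```python
import boto3

# Assumes an existing DynamoDB table named "machine-events" with a
# partition key "machine_id" and sort key "ts", and configured AWS credentials.
table = boto3.resource("dynamodb").Table("machine-events")

def handle_event(event: dict) -> None:
    """Process one streaming reading and store it in a report-friendly shape."""
    table.put_item(
        Item={
            "machine_id": event["machine_id"],
            "ts": event["timestamp"],  # ISO-8601 string
            # Stored as a string to sidestep DynamoDB's float restrictions.
            "temperature_c": str(event["temperature_c"]),
            "status": "overheat" if event["temperature_c"] > 90 else "ok",
        }
    )

handle_event({"machine_id": "m-42",
              "timestamp": "2023-01-05T10:15:00Z",
              "temperature_c": 95.2})
```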

Matt Carroll, Co-Founder and CEO at Immuta

CISOs will need to become the enablers – not the bottlenecks – of the modern data stack

“The rapid shift of data from on-premises to the cloud is spurring one of the greatest cybersecurity challenges to date. Despite most CISOs having a full arsenal of tools for protecting data in the cloud, the proliferation of cloud players such as Snowflake, Databricks, Google BigQuery, Amazon Redshift, and other cloud-based SaaS solutions has accelerated data sharing to a breaking point. Traditional approaches that worked for on-premises environments just can’t keep up with the exponential growth in the number of users, data sources, and policies that must be governed, managed, and secured in today’s environment. 

As a result, in 2023 we’ll see a major shift in data security architecture, forcing CISOs to roll up their sleeves and put controls into place around this budding “Modern Data Stack.” This will include proper access controls that effectively balance access and security, along with continuous monitoring of business intelligence and data science activities for anomaly detection. At the same time, how we think about monitoring will have to change – zero trust won’t work using traditional approaches because there are too many endpoints. At the end of the day, monitoring within the modern data stack must evolve to keep pace with the speed of data.”

The rise of the data processing agreement (DPA)

“How organizations process data within on-premises systems has historically been a very controlled process that requires heavy engineering and security resources. However, using today’s SaaS data infrastructure, it’s never been easier to share and access data across departments, regions, and companies. With this in mind, and as a result of the increase in data localization/sovereignty laws, the rules as to how one accesses, processes, and reports on data use will need to be defined through contractual agreements – also known as data processing agreements (DPA). 

In 2023, we’ll see DPAs become a standard element of SaaS contracts and data sharing negotiations. How organizations handle these contracts will fundamentally change how they architect data infrastructure and will define the business value of the data. As a result, it will be in data leaders’ best interest to fully embrace DPAs in 2023 and beyond. These lengthy documents will be complex, but the digitization of DPAs and the involvement of legal teams will make them far easier to understand and implement.”

No-copy data exchanges will take hold

“In 2023, as data sharing continues to grow, and data and IT teams are strapped to keep up, no-copy data exchanges will become the new standard. As organizations productize their modern data stack, there will be an explosion in the size and number of data sets. Making copies before sharing just won’t be feasible anymore. In 2023, enterprises will flock to established platforms, like Snowflake’s Data Exchange and Databricks’ Delta Sharing protocol, to make it easier to securely share and monetize their data.”
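
For a sense of what no-copy sharing looks like in practice, here is a minimal, hypothetical Python example using the open-source delta-sharing client to read a table a provider has exposed via the Delta Sharing protocol. The profile file path and the share, schema, and table names are placeholders.

```python
import delta_sharing

# A provider issues a profile file containing the sharing server endpoint and a
# bearer token; no data is copied to the consumer until the table is read.
profile = "config.share"  # placeholder path to the provider-issued profile
table_url = f"{profile}#acme_share.sales.orders"  # share.schema.table (placeholder)

# Load the shared table directly into a pandas DataFrame.
df = delta_sharing.load_as_pandas(table_url)
print(df.head())
```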

Sophie Stalla-Bourdillon, Senior Privacy Counsel and Legal Engineer at Immuta

Getting access to data does not necessarily mean being in a position to derive useful insight

“In this data deluge, the successful organizations will be those able to crack the data governance dilemma by leveraging both self-executing policies, such as access control and obfuscation, and auditing capabilities, with a view to reducing time to data. They will discard meaningless pre-approval workflows and federate data governance by making data owners the key players: data owners will be both domain experts and data stewards.”
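A toy example of a self-executing policy that combines access control with obfuscation (purely illustrative, not any vendor's implementation) might look like the following, where sensitive columns are hashed for roles that lack clearance:

```python
# A minimal sketch of a "self-executing" policy: every authorized user gets
# the row back, but sensitive columns are obfuscated unless the requester
# holds a permitted role. Column names and roles are assumptions.
import hashlib

SENSITIVE_COLUMNS = {"email", "ssn"}              # assumed: defined by the data owner
PERMITTED_ROLES = {"data_steward", "compliance"}  # assumed: roles allowed to see raw values

def apply_policy(row: dict, requester_role: str) -> dict:
    """Hash sensitive values for non-privileged roles so joins still work."""
    if requester_role in PERMITTED_ROLES:
        return row
    return {
        col: hashlib.sha256(str(val).encode()).hexdigest()[:12] if col in SENSITIVE_COLUMNS else val
        for col, val in row.items()
    }

print(apply_policy({"customer_id": 42, "email": "a@example.com"}, requester_role="analyst"))
```

Hashing rather than redacting keeps the obfuscated values consistent and joinable, which is one reason it is a common obfuscation choice.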

Cloud-based data analytics become mainstream

“Data-driven organizations will free up resources to set up trusted, secure analytics environments that prevent data breaches and mitigate compliance risks. While the COVID-19 pandemic accelerated the ingestion of healthcare data into secure analytics environments, other verticals will quickly follow a similar path. Organizations at the forefront of this move will build multi-cloud data analytics environments and will have to abstract or federate their governance layer to adequately govern data across multiple locations.”

Vishal Singh, Head of Data Products at Starburst

The future is decentralized, and agility and collaboration will be key

“Despite attempts to house data in a central location over the past few decades, the reality is that data remains distributed, and it will continue to exist across different regions, clouds, domains, and/or business units. So, how do organizations succeed in an environment where centralized strategies are being advocated for but distributed data is the reality? 

In 2023, agility and collaboration will be the keys to successful decentralized data management. The organizations that succeed will do so by enabling data-driven insights to be easily curated and shared across the enterprise. A foundational component of this will be the widespread adoption of the concept of treating data as a product.

Rather than requiring users to go elsewhere to answer questions about the data they are working with, Data Products provide a self-service component that fills in the gaps between data creation and consumption. In the coming year, we’ll see more organizations laying these foundations by democratizing data access and/or strengthening their Data Mesh framework through Data Products.”

Lior Gavish, Co-Founder and CTO at Monte Carlo

Data contracts

“Designed to prevent data quality issues that occur when upstream, data-generating services unexpectedly change, data contracts are very much en vogue. Why? Because software engineers can unknowingly create downstream ramifications when their updates ripple through the data pipeline, and because the rise of data modeling gives data engineers the option to deliver data into the warehouse pre-modeled. 2023 will see broader data contract adoption as practitioners attempt to apply these frameworks.”
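In practice, a data contract is often just a schema enforced at the pipeline boundary. A minimal sketch, assuming illustrative field names and using pydantic for validation:

```python
# A minimal sketch of a data contract enforced at the pipeline boundary:
# producers agree to a schema, and any record that violates it is rejected
# before it can corrupt downstream models. Field names are illustrative.
from pydantic import BaseModel, ValidationError

class OrderEvent(BaseModel):  # the "contract" for a hypothetical orders topic
    order_id: str
    customer_id: str
    amount_cents: int
    currency: str

def validate_batch(records: list[dict]) -> tuple[list[OrderEvent], list[dict]]:
    """Split incoming records into contract-compliant events and rejects."""
    accepted, rejected = [], []
    for record in records:
        try:
            accepted.append(OrderEvent(**record))
        except ValidationError:
            rejected.append(record)  # route to a dead-letter queue / alerting
    return accepted, rejected

good = {"order_id": "o1", "customer_id": "c9", "amount_cents": 1250, "currency": "USD"}
bad = {"order_id": "o2", "amount_cents": "twelve"}  # missing fields, wrong type
print(validate_batch([good, bad]))
```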

Data monetization

“In lean times, data teams have more pressure than ever to align their efforts with the bottom line. Data monetization is a mechanism for data teams to directly tie themselves to revenue. It also allows for the addition of data insights and reporting to products, a differentiator within an increasingly competitive marketplace.”

Infrastructure as code

“Modern data operations require hyper-scalable cloud infrastructures, but constantly provisioning and maintaining these services can be tedious and time-consuming. IaC allows data engineers to create more seamless data pipeline infrastructure that is easier to provision, deprovision, and modify – critical when budgets are tight and headcount is limited.”
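As a small illustration of IaC applied to pipeline infrastructure, here is a sketch using Pulumi's Python SDK; it assumes `pulumi` and `pulumi_aws` are installed and AWS credentials are configured, and the resource names are arbitrary:

```python
# A minimal IaC sketch: pipeline infrastructure (an ingest stream and a
# staging bucket) declared as code, so it can be provisioned, modified, or
# torn down on demand instead of being clicked together by hand.
import pulumi
import pulumi_aws as aws

# Kinesis stream for incoming events (24-hour retention).
stream = aws.kinesis.Stream("ingest-stream", shard_count=1, retention_period=24)

# S3 bucket used as a staging area before loading into the warehouse.
bucket = aws.s3.Bucket("staging-bucket", force_destroy=True)

# Export identifiers so downstream jobs can reference them.
pulumi.export("stream_name", stream.name)
pulumi.export("bucket_name", bucket.bucket)
```

Because the stack is plain code, the same definition can be provisioned or deprovisioned with a single command, which is the cost and headcount argument being made here.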

Chris Gladwin, Co-Founder and CEO at Ocient

Hyperscale will become mainstream

“Data warehouse vendors will develop new ways to build and expand systems and services. Some leading-edge IT organizations are now working with data sets that comprise billions and trillions of records. In 2023, we could even see data sets of a quadrillion rows in data-intensive industries such as adtech, telecommunications, and geospatial. Hyperscale data sets will become more common as organizations leverage increasing data volumes in near real-time from operations, customers, and on-the-move devices and objects.”

Pipelines will get more sophisticated

“A data pipeline is how data gets from its original source into the data warehouse. With so many new data types—and data pouring in continuously—these pipelines are becoming not only more essential, but potentially more complex. In 2023, users should expect data warehouse vendors to offer new and better ways to extract, transform, load, model, test, and deploy data. And vendors will do so with a focus on integration and ease of use.”
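Whatever tooling vendors ship, the underlying steps stay recognizable. A minimal, generic sketch of the extract / transform / test / load flow in pandas (paths and column names are illustrative):

```python
# A generic pipeline sketch, not any vendor's product: extract raw data,
# transform it, run lightweight quality tests, then load a warehouse-friendly file.
import pandas as pd

def extract(path: str) -> pd.DataFrame:
    return pd.read_csv(path)                    # could be a file, API, or stream

def transform(df: pd.DataFrame) -> pd.DataFrame:
    df = df.dropna(subset=["order_id"])         # drop malformed rows
    df["amount"] = df["amount"].astype(float)   # normalize types
    return df

def test(df: pd.DataFrame) -> None:
    assert df["order_id"].is_unique, "duplicate order_id found"
    assert (df["amount"] >= 0).all(), "negative amount found"

def load(df: pd.DataFrame, path: str) -> None:
    df.to_parquet(path, index=False)            # columnar format for the warehouse

if __name__ == "__main__":
    frame = transform(extract("raw_orders.csv"))
    test(frame)
    load(frame, "orders_clean.parquet")
```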

Data complexity will increase, while data analysis will be continuous

“Next-generation cloud data warehouses must be versatile: able to support multimodal data natively to ensure performance and flexibility in the workloads they handle, and able to analyze new and more complex data types, including semi-structured data. Technology strategists have long sought to utilize real-time data for business decision-making, but architectural and system limitations have made that a challenge, if not impossible. Emerging use cases such as IoT sensor networks, robotic automation, and self-driving vehicles are generating ever more real-time data, which needs to be monitored, analyzed, and utilized.”

Frank Liu, Director of Operations at Zilliz

Vector databases take hold to unleash the value of untapped unstructured data

“As businesses embrace the AI era and attempt to make full use of its benefits in production, there is a significant spike in the volume of unstructured data, which takes all sorts of forms that need to be made sense of. To cope with the challenge of extracting tangible value from unstructured data, vector databases–a new type of database management technology purpose-built for unstructured data processing–are on the rise and will take hold in the years to come.”

The synergy between structured and unstructured data

“Despite the exponential growth of unstructured data, structured data will still carry substantial value in the future. And it’s almost inevitable for organizations to deal with both structured and unstructured data simultaneously to realize maximum business growth. 

Incumbent solutions originally engineered to deal with structured data for traditional analytics can extend their processing capabilities to unstructured data through plug-ins, like “native vector search” in ElasticSearch 8.0 and “vector similarity search” in Redis 6.0. For AI applications known for intensive unstructured data, a purpose-built solution like a vector database shines, complemented by hybrid search functionality that supports filtering based on tags, attributes, and other metadata.”
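The hybrid pattern reduces to "filter on structured attributes, then rank by vector similarity." A minimal sketch with NumPy and made-up data, standing in for what a purpose-built vector database does at much larger scale:

```python
# A toy hybrid search: apply a structured tag filter, then rank the surviving
# items by cosine similarity to the query embedding. Data is synthetic.
import numpy as np

def hybrid_search(query_vec, items, required_tag, top_k=3):
    """items: list of dicts with an 'embedding' (np.ndarray) and 'tags' (set)."""
    candidates = [it for it in items if required_tag in it["tags"]]  # structured filter
    if not candidates:
        return []
    matrix = np.stack([it["embedding"] for it in candidates])
    # cosine similarity between the query and every candidate embedding
    sims = matrix @ query_vec / (np.linalg.norm(matrix, axis=1) * np.linalg.norm(query_vec))
    ranked = np.argsort(-sims)[:top_k]
    return [(candidates[i]["id"], float(sims[i])) for i in ranked]

rng = np.random.default_rng(0)
catalog = [{"id": f"img_{i}", "embedding": rng.normal(size=128),
            "tags": {"outdoor"} if i % 2 else {"indoor"}} for i in range(10)]
print(hybrid_search(rng.normal(size=128), catalog, required_tag="outdoor"))
```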

Heterogeneous computing will be leveraged to supercharge performance

“The CPU prevails as the hardware of choice for existing solutions thanks to its widely recognized cost efficiency. But as the proliferation of AI gives rise to more diverse applications, certain scenarios demand levels of performance (high throughput) that only GPU-accelerated solutions can achieve, such as billion-scale image search and video analysis at Meta.”

Sean Knapp, CEO at Ascend.io

Data Lakes and Warehouses will converge as data infrastructure vendors of all sizes try to differentiate themselves through innovation

“We are currently in a rare moment in the evolution of the data infrastructure industry as users are increasingly seeing competitive parity features emerge from major players like Snowflake, Databricks, etc. In turn, this will create an even stronger buyer’s market as vendors continue to innovate and differentiate themselves, resulting in greater industry integration, interoperability and a standardization of best practices. In 2023 and beyond, data warehouses, data lakes and other similar infrastructure technologies will see notable consolidation as buyers sift through vendors and features to find the most value for their data stack while removing redundancies and the need to build and manage their own bespoke platforms.”

Older concepts like catalogs and MDM will resurface and become a battleground for data infrastructure vendors

“As companies face tool and metadata fragmentation, there will be a revisiting of legacy enterprise disciplines such as catalogs and master data management (MDM). These terms or reimagined versions of them such as “data lineage” will come back into vogue and will be aggressively espoused by all sizes of data vendors from Amazon, Databricks and Snowflake to startups. In 2023, independent catalog providers will benefit from the increased awareness, but also face stiffer competition from vertically integrated offerings.”

Businesses will take incremental steps toward data mesh and fabric adoption

“While the tech industry as a whole may still be debating what defines each of these (as well as what differentiates them), there is undeniable interest among organizations to gain the benefits of these specific architectures, namely speed and agility at scale. In 2023 and the years ahead, while most data teams will not have the resources to dive headfirst into working with a new mesh or fabric architecture, we will see them incrementally work their way towards the adoption of these solutions, often first by defining new best practices and strategies that support these future cutting-edge architectures.”

The data engineer role will continue down the path of specialization

“Over the past couple of decades, we’ve seen the role of the software engineer split into a variety of other roles such as infrastructure, backend, frontend, mobile, and even product engineer. The role of the data engineer will follow a similar path, and we will see continued growth not only in today’s “data engineer” (i.e. the full stack engineer), but expansion in the “analytics engineer” (i.e. the frontend engineer), as well as the “data platform engineer” (i.e. the infrastructure engineer). This specialization will enable organizations to tap into a greater body of talent and skills, as well as aligning jobs with what is of greatest interest to their developers.”

Developer platforms will overtake loosely connected tools as teams tire of integrations

“While only a handful of players continue to dominate the data infrastructure stack (Snowflake, Databricks, and the big 3 clouds), businesses have been using a hodgepodge of tools upstack for tasks like data ingestion, transformation, orchestration, management, and observability. In 2023, businesses will reach their breaking point as they tire of assembling and managing these upstack tools and don’t see enough returns on those investments. As they become increasingly frustrated by the inherent inefficiency of the fragmentary model, businesses will start to consolidate their vendor tools in order to prioritize developer team productivity and ease of maintenance.”

Brian Spanswick, CISO at Cohesity

Security, data management, and storage

“As we think about the data security, data management, and storage industries in 2023, two dominant trends stand out. The first is an increased focus on preventing and reducing the impact of cyberattacks such as ransomware. The second is that market conditions and budgets will continue to be dynamic, putting greater emphasis on the need to adopt modern platform solutions that offer the most cost-effective and efficient strategies for data security and management.”

Data security

“Cybersecurity and data security will become even more important for organizations, resulting in the merging of data security, cybersecurity, and data management. The convergence of these three areas will be necessary to combat increasingly sophisticated cyberattacks such as ransomware. Leaders in these areas will partner and build AI-powered, integrated solutions providing customers with end-to-end security that helps prevent a breach.”

Danny Sandwell, Senior Solutions Strategist at Quest Software

New data sovereignty laws will spur businesses to make data more visible and interoperable 

“We expect to see businesses take a more proactive role in creating their own data governance policies amid the current wave of regulatory action. The current global patchwork of data sovereignty and privacy laws has made it more complicated than ever for businesses to create consistent policies on data sharing, integration and compliance. This will continue to have a significant impact on organizations’ ability to maximize the use of data across their IT infrastructure, unless they put together clear plans for data integration and governance. In 2023, the passing of more data sovereignty and sharing laws will spur businesses to invest in getting visibility into their data and creating clear plans for sharing and integration across their IT landscape.”

Iddo Gino, CEO at Rapid

APIs will drive democratization of data

“APIs make it easy to adjust, transform, enrich, and consume data. Traditionally, it took hundreds of highly paid engineers to manage that data and data scientists to understand the algorithms. In 2023, we will see a shift toward API technologies that manage data as a way to gain insights and control data-related costs, which means people will no longer need highly developed engineering skills to harness the power of data.”
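As a simple illustration of the consumption model described here, enriching a record becomes a single HTTP call rather than an in-house pipeline; the endpoint, parameters, and response shape below are hypothetical:

```python
# A minimal sketch of API-driven data enrichment. The URL, auth scheme, and
# response fields are placeholders, not a real provider's API.
import requests

API_URL = "https://api.example.com/v1/company-enrichment"  # hypothetical endpoint
API_KEY = "YOUR_API_KEY"

def enrich_company(domain: str) -> dict:
    """Fetch firmographic attributes for a company domain via a data API."""
    response = requests.get(
        API_URL,
        params={"domain": domain},
        headers={"Authorization": f"Bearer {API_KEY}"},
        timeout=10,
    )
    response.raise_for_status()
    return response.json()  # e.g. {"industry": ..., "employee_count": ...}

# record = enrich_company("example.com")
```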

Hillery Hunter, GM, Cloud Industry Platforms & Solutions; CTO at IBM Cloud

Preparing for evolving regulations and data sovereignty

“As organizations strive to meet the demands of today’s digital-first customers, they are modernizing at a faster rate than ever before. But at the same time, they need to balance innovation with growing regulatory requirements and data sovereignty laws. With regulatory requirements like DORA heating up across the globe, compliance will be top of mind for business leaders in the year ahead – and concerns will be even greater for those in highly regulated industries such as financial services and those handling client personal information. In fact, more than half of business and technology leaders believe ensuring compliance in the cloud has been too difficult according to a recent study by IBM. As the proliferation of cloud technologies gains more oversight from regulators and introduces emerging operating models for the industry like Sovereign Cloud, organizations will need to adopt technologies that allow them to drive innovation while adhering to growing requirements in 2023 and beyond.”

Protecting data as global ransomware threats grow

“As organizations strive to meet the need for instant gratification demanded by today’s digital-first customers, they also need to stay ahead of growing ransomware threats. While it’s important to ensure platforms and services are easy for customers to use, it’s even more critical to ensure they are impenetrable to bad actors that want access to sensitive financial data or wish to hold data ransom. In 2023, finding balance will be key as businesses – especially those in highly regulated industries – modernize and strive to reduce third- and fourth-party risk. As a result, we’ll see more organizations adopt business transformation strategies that are designed to handle their most mission-critical workloads and ensure data remains protected, especially through cyber vault technologies.”

Michael Krause, Data Science Director at Beyond Limits

The fate of forgotten data

“The trend towards better data management, which we have already seen emerging out of companies like Snowflake, AWS, and Databricks, will strengthen in 2023 as the industry prioritizes sustainability. Many companies have not invested in good data management practices and underlying IT infrastructure, and are therefore not well positioned to take advantage of AI technologies. To accelerate their digital transition, businesses will look to adopt platforms that automatically manage a large part of the data value chain for them.”

Dan Spurling, SVP of Product Engineering at Teradata

Data reduction

“There is an exponentially increasing amount of data, but I believe we will see the rise of solutions that deduce the meaningful bits of data from the overall mass of data collected, or even reduce the footprint of data using new technologies beyond current classic data storage techniques.”

