44 Analytics & Data Science Predictions from 24 Experts for 2024

For our 5th annual Insight Jam LIVE!, Solutions Review editors sourced this resource guide of analytics and data science predictions for 2024 from Insight Jam, our new community of enterprise tech experts.

Note: Analytics and data science predictions are listed in the order we received them.

Analytics and Data Science Predictions from Experts for 2024


Rahul Pradhan, Vice President of Product and Strategy at Couchbase

Real-time data will become the standard for businesses to power generative experiences with AI; data layers should support both transactional and real-time analytics

“The explosive growth of generative AI in 2023 will continue strong into 2024. Even more enterprises will integrate generative AI to power real-time data applications and create dynamic and adaptive AI-powered solutions. As AI becomes business critical, organizations need to ensure the data underpinning AI models is grounded in truth and reality by leveraging data that is as fresh as possible.”

“Just like food, gift cards and medicine, data also has an expiration date. For generative AI to truly be effective, accurate and provide contextually relevant results, it needs to be built on real-time, continually updated data. The growing appetite for real-time insights will drive the adoption of technologies that enable real-time data processing and analytics. In 2024 and beyond, businesses will increasingly leverage a data layer that supports both transactional and real-time analytics to make timely decisions and respond to market dynamics instantaneously.”

Expect a paradigm shift from model-centric to data-centric AI

“Data is key in modern-day machine learning, but it needs to be addressed and handled properly in AI projects. Because today’s AI takes a model-centric approach, hundreds of hours are wasted on tuning a model built on low-quality data.”

“As AI models mature, evolve and increase, the focus will shift to bringing models closer to the data rather than the other way around. Data-centric AI will enable organizations to deliver both generative and predictive experiences that are grounded in the freshest data. This will significantly improve the output of the models while reducing hallucinations.”

Multimodal LLMs and databases will enable a new frontier of AI apps across industries

“One of the most exciting trends for 2024 will be the rise of multimodal LLMs. With this emergence, the need for multimodal databases that can store, manage and allow efficient querying across diverse data types has grown. However, the size and complexity of multimodal datasets pose a challenge for traditional databases, which are typically designed to store and query a single type of data, such as text or images.”

“Multimodal databases, on the other hand, are much more versatile and powerful. They represent a natural progression in the evolution of LLMs to incorporate the different aspects of processing and understanding information using multiple modalities such as text, images, audio and video. There will be a number of use cases and industries that will benefit directly from the multimodal approach including healthcare, robotics, e-commerce, education, retail and gaming. Multimodal databases will see significant growth and investments in 2024 and beyond — so businesses can continue to drive AI-powered applications.”
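To make the multimodal idea concrete, here is a minimal sketch of cross-modal retrieval over a single collection. It assumes a shared embedding space across text and images; embed_text and embed_image are illustrative placeholders rather than any vendor’s API, and a production system would use a cross-modal encoder plus a vector-capable database.

```python
# Minimal cross-modal retrieval sketch: one collection, many modalities.
# embed_text()/embed_image() are hypothetical stand-ins for a real encoder
# (e.g. a CLIP-style model) that maps both modalities into one vector space.
import numpy as np

DIM = 64

def embed_text(text: str) -> np.ndarray:
    # Placeholder: hash bytes into a fixed-size, normalized vector.
    vec = np.zeros(DIM)
    for i, byte in enumerate(text.encode("utf-8")):
        vec[i % DIM] += byte
    return vec / (np.linalg.norm(vec) + 1e-9)

def embed_image(pixels: np.ndarray) -> np.ndarray:
    # Placeholder: fold pixel values into the same DIM-dimensional space.
    vec = np.resize(pixels.astype(float).ravel(), DIM)
    return vec / (np.linalg.norm(vec) + 1e-9)

# Records of different modalities live side by side in one collection.
collection = [
    {"id": "note-1", "modality": "text",  "vector": embed_text("chest x-ray report: no anomalies found")},
    {"id": "img-7",  "modality": "image", "vector": embed_image(np.random.rand(32, 32))},
]

def search(query: str, k: int = 5):
    # A single similarity search spans every modality in the collection.
    q = embed_text(query)
    scored = [(float(q @ rec["vector"]), rec["id"], rec["modality"]) for rec in collection]
    return sorted(scored, reverse=True)[:k]

print(search("x-ray with no anomalies"))
```

The point is the single collection: text, image, audio, and video records can sit behind one index and answer one query, which is what traditional single-type databases struggle to do.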

Nima Negahban, CEO and Co-Founder at Kinetica

Generative AI turns its focus towards structured, enterprise data

“Businesses will embrace the use of generative AI for extracting insights from structured numeric data, enhancing generative AI’s conventional applications in producing original content from images, video, text and audio. Generative AI will persist in automating data analysis, streamlining the rapid identification of patterns, anomalies, and trends, particularly in sensor and machine data use cases. This automation will bolster predictive analytics, enabling businesses to proactively respond to changing conditions, optimize operations, and improve customer experiences.”

English will replace SQL as the lingua franca of business analysts

“We can anticipate a significant mainstream adoption of language-to-SQL technology, following successful efforts to address its accuracy, performance, and security concerns. Moreover, LLMs for language-to-SQL will move in-database to protect sensitive data when utilizing these LLMs, addressing one of the primary concerns surrounding data privacy and security. The maturation of language-to-SQL technology will open doors to a broader audience, democratizing access to data and database management tools, and furthering the integration of natural language processing into everyday data-related tasks.”
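As a rough illustration of what guarded language-to-SQL can look like, here is a minimal sketch. generate_sql is a hypothetical stand-in for the LLM call (in-database or otherwise), and the guardrail that executes only single SELECT statements reflects the accuracy and security concerns Negahban raises; it is not any specific product’s behavior.

```python
# Language-to-SQL sketch: natural-language question -> generated SQL -> guarded execution.
import sqlite3

SCHEMA = "CREATE TABLE orders (id INTEGER, region TEXT, amount REAL, order_date TEXT)"

def generate_sql(question: str, schema: str) -> str:
    # Hypothetical: a real system would prompt an LLM with the question and schema.
    # Hard-coded here so the sketch runs on its own.
    return "SELECT region, SUM(amount) AS revenue FROM orders GROUP BY region"

def answer(question: str) -> list:
    sql = generate_sql(question, SCHEMA).strip().rstrip(";")
    # Guardrail: allow only a single SELECT statement, nothing else.
    if not sql.lower().startswith("select") or ";" in sql:
        raise ValueError(f"Refusing to run generated SQL: {sql!r}")
    conn = sqlite3.connect(":memory:")
    try:
        conn.execute(SCHEMA)
        conn.executemany(
            "INSERT INTO orders VALUES (?, ?, ?, ?)",
            [(1, "EMEA", 120.0, "2024-01-05"), (2, "APAC", 80.0, "2024-01-06")],
        )
        return conn.execute(sql).fetchall()
    finally:
        conn.close()

print(answer("What is revenue by region?"))  # e.g. [('APAC', 80.0), ('EMEA', 120.0)]
```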

Vasu Sattenapalli, CEO at RightData

NLP-Powered Analytics Will Be the Next Wave of Self Service

“Analytics have been stuck in dashboards, but dashboards will no longer be the only way to consume business insights. Voice and Generative AI will enter the analytics space, where you can ask questions of your data verbally and get a response back in minutes, if not seconds. Imagine even pulling out your phone with an app specific to your organization’s data and being able to access a world of insights. It’s coming!”

Shawn Rogers, CEO and Fellow at BARC

AI is driving innovation in data management, especially through automation and speed

“Having strength at this core level of your data stack is critical for AI success. NLP and conversational UIs will open the door for the true democratization of analytics. It’s an exciting time for data and insights.”

Bernie Emsley, CTO at insightsoftware

CTOs will need to bring even more collaboration and education to the C-suite

“Over the past few years, the CTO role has become the bridge between the tech-savvy and the business-savvy, charged with enabling the right solutions to create the best overall business outcomes. This comes with its communication challenges, as the CTO needs to navigate how to translate tech into an ROI for the organization’s board and C-suite. In 2024, the ability to educate their C-level colleagues will become even more important as artificial intelligence (AI) technologies become commonplace. The CTO will not only need to collaborate with the tech side of the business to understand what is realistically possible in the realm of AI, but will also need to communicate its potential on a business level – from both an employee productivity and a product standpoint.”

Strong data engines will make financial data movement possible

“Financial organizations are just starting to realize the potential their data holds, using it for guidance in financial planning and analysis, budgetary planning, and more. However, much of this data is still siloed, and we have reached the point where these organizations have so much of this data that they need to start thinking about how it can bring value to the company or risk losing their competitive advantage. In 2024, we will see finance organizations seek to classify and harmonize their data across repositories to enable new solutions. In response, data engines, data platforms, and data lakes will be just a few tools that will become crucial to understanding and utilizing such data effectively. As a result, we can expect to see the growth of fintech applications that enable this aggregated data analysis, reporting, and visualization.”

Joy Allardyce, General Manager, Data & Analytics at insightsoftware

A continual shift to cloud resources

“The continued push to re-architect technology landscapes to a cloud/SaaS approach will prevail, and many organizations that have made large bets ($1B+ contracts) on the cloud will find they can’t innovate fast enough to deliver on those commitments. Some, on the other hand, don’t see it as a migration for cost, but as an opportunity to modernize and transform how they use data in their business.”

The rise and adoption of AI

“AI, like all reporting projects, is only as good as the data it has access to and the prompts used to make a request. With the push for AI, many are still stuck getting their data foundations established so that they can take advantage of AI. To avoid pilot purgatory, the key is to start with an outcome (use case) in mind that shows a quick win and demonstrable value, rather than a one-off project.”

Democratizing data

“While the notion of centralized data management is a trend, the reality is that departments still own their data AND have domain expertise. How organizations can adopt a democratized and open fabric but employ the right data governance strategies to support faster innovation and adoption will be crucial. Doing so will only further support the adoption of AI, which requires strong domain knowledge for value to be truly extracted.”

Andy Oliver, Director of Marketing at CelerData

Java will continue to be used for a great many legacy and even current systems and applications

“Java, though showing its age and looking slower in today’s environments, will continue to be used for a great many legacy and even current systems and applications, regardless of the low level of support and leadership from Oracle.

The challenge with implementing real-time data has been more about storage than anything else. I think in the past people were obsessed with real-time versus batch. Sometimes it seems like a choice between something that’s big enough but too slow vs. something that’s fast enough but too small.

However, real-time and batch will come together to meet the demands of growing user numbers, and we will see more unified analytical database technologies for functions and insights that demand real-time analysis.

Not everything will need to move over to real-time, though – there are plenty of things where there’s no good reason to do it.

I think we’re going to see most of the nonsense shake out from operational AI if it can really learn and stick to core organizational needs, and be deployed responsibly and effectively. That’s where VCs are going to focus in the future; the rest will keep falling by the wayside.”

Casey Ciniello, Product Owner and Marketing Manager at Infragistics

More Businesses Will Rely on Predictive Analytics to Make Decisions in 2024

“Making decisions based on gut instinct is a thing of the past as organizations are fully realizing the power of analytics to make data-driven decisions, evidenced by the number of software platforms incorporating embedded analytics. Analytics will be all encompassing in 2024 as we become reliant on data for everything from everyday business research such as inventory and purchasing to predictive analytics that allow businesses to see into the future. Predictive analytics will drive businesses forward by helping them make informed, data-driven decisions, improve productivity, and increase sales/revenue — rather than merely reacting in response to events that have already taken place.”

Justin Borgman, Co-Founder and CEO at Starburst

Two hot topics, data products & data sharing, will converge in 2024

“Data sharing was already on the rise as companies sought to uncover monetization opportunities, but a refined method to curate the shared experience was still missing. As the lasting legacy of data mesh hype, data products will emerge as that method. Incorporating Gen AI features to streamline data product creation and enable seamless sharing of these products marks the pivotal trifecta moving data value realization forward.”

Mike Carpenter, VC Advisor for Lightspeed Venture Partners

AI to Drive Real-Time Intelligence and Decision Making

“Next year will be foundational for the next phase of AI. We’ll see a number of new innovations for AI, but we’re still years away from the application of bigger AI use cases. The current environment is making it easy for startups to build and prepare for the next hype cycle of AI. That said, 2024 is going to be the year of chasing profitability. Due to this, the most important trend in 2024 will be the use of AI to drive real-time intelligence and decision-making. This will ultimately revolutionize go-to-market strategies, derisk investments, and increase bottom-line value.”

Brian Peterson, Co-Founder and Chief Technology Officer at Dialpad

Influx of data talent/AI skills 

“As businesses continue to embrace AI, we’re going to see not only an increase in productivity but also an increase in the need for data talent. From data scientists to data analysts, this knowledge will be necessary in order to sort through all the data needed to train these AI models. While recent AI advancements are helping people comb through data faster, there will always be a need for human oversight – employees who can review and organize data in a way that’s helpful for each model will be a competitive advantage. Companies will continue looking to hire more data specialists to help them develop and maintain their AI offerings. And those who can’t hire and retain top talent – or don’t have the relevant data to train on to begin with – won’t be able to compete.

Just like we all had to learn how to incorporate computers into our jobs years ago, non-technical employees will now have to learn how to use and master AI tools in their jobs. And, just like with the computer, I don’t believe AI will eliminate jobs; rather, it will shift job functions around the use of the technology. It will make everyone faster at their jobs, and will pose a disadvantage to those who don’t learn how to use it.”

The commoditization of data to train AI

“As specialized AI models become more prevalent, the proprietary data used to train and refine them will be critical. For this reason, we’re going to see an explosion of data commoditization across all industries. Companies that collect data that could be used to train chatbots (take Reddit, for example) sit on an immensely valuable resource. Companies will start competitively pricing and selling this data.”

Wayne Eckerson, President at Eckerson Group

“Within five years, most large companies will implement a data product platform (DPP), otherwise known as an internal data marketplace, to facilitate the publication, sharing, consumption, and distribution of data products.”

Helena Schwenk, VP, Chief Data & Analytics Officer at Exasol

FinOps becomes a business priority, as CIOs analyze price/performance across the tech stack

“Last year, we predicted that CFOs would become more cloud-savvy amidst recession fears, and we watched this unfold as organizations shifted to a “do more with less” mentality. In 2024, FinOps – the practice of bringing financial governance to cloud IT operations – will take hold as the business takes aim at preventing unpredictable, sometimes chaotic cloud spend and gains assurance from the CIO that cloud investments are aligned with business objectives.

As IT budgetary headwinds prevail, the ability to save on cloud spend represents a real opportunity for cost optimization for the CIO. One of the most important metrics for achieving this goal is price/performance, as it provides a comparative gauge of resource efficiency in the data tech stack. Given most FinOps practices are immature, we expect CIOs to spearhead these efforts and start to perform regular price/performance reviews. 

FinOps will become even more important against the backdrop of organizations reporting on ESG and sustainability initiatives. Beyond its role in forecasting, monitoring, and optimizing resource usage, FinOps practices will become more integral to driving carbon efficiencies to align with the sustainability goals of the organization.” 
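The price/performance reviews described above can start as simple arithmetic: normalize each configuration’s measured throughput by its hourly cost and compare. A minimal sketch follows; the warehouse names and benchmark numbers are hypothetical placeholders, not real vendor figures.

```python
# Toy price/performance comparison: queries per dollar = throughput / hourly cost.
configs = {
    "warehouse_small": {"cost_per_hour": 8.0,  "queries_per_hour": 1200},
    "warehouse_large": {"cost_per_hour": 32.0, "queries_per_hour": 3600},
}

for name, c in sorted(configs.items()):
    queries_per_dollar = c["queries_per_hour"] / c["cost_per_hour"]
    print(f"{name}: {queries_per_dollar:.0f} queries per dollar")
```

Tracking the same ratio over time, and alongside carbon estimates, is one concrete way a FinOps practice can tie cloud spend back to business and sustainability goals.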

AI governance becomes C-level imperative, causing CDOs to reach their breaking point

“The practice of AI governance will become a C-level imperative as businesses seek to leverage the game-changing opportunities it presents while balancing responsible and compliant use. This challenge is further emphasized by the emergence of generative AI, adding complexity to the landscape. 

AI governance is a collective undertaking, demanding collaboration across functions to address the ethical, legal, social, and operational implications of AI. Nonetheless, for CDOs, the responsibility squarely rests on their shoulders. The impending introduction of new AI regulations adds another layer of complexity, as CDOs grapple with an evolving regulatory landscape that threatens substantial fines for non-compliance, potentially costing millions.

This pressure will push certain CDOs to their breaking point. For others, it will underscore the importance of establishing a fully-resourced AI governance capability, coupled with C-level oversight. This strategic approach not only addresses immediate challenges, but strengthens the overall case for proactive and well-supported AI governance going forward.”

Florian Wenzel, Global Head of Solution Engineering at Exasol

Expect AI backlash, as organizations waste more time and money trying to ‘get it right’

“As organizations dive deeper into AI, experimentation is bound to be a key theme in the first half of 2024. Those responsible for AI implementation must lead with a mindset of “try fast, fail fast,” but too often, the people in these roles do not understand the variables they are targeting, lack clear expected outcomes, and struggle to ask the right questions of AI. The most successful organizations will fail fast and quickly rebound from lessons learned. Enterprises should anticipate spending extra time and money on AI experimentation, given that most of these practices are not rooted in a scientific approach. At the end of the year, clear winners of AI will emerge if the right conclusions are drawn.

With failure also comes greater questioning around the data fueling AI’s potential. For example, data analysts and C-suite leaders will both raise questions such as: How clean is the data we’re using? What’s our legal right to this data, specifically if used in any new models? What about our customers’ legal rights? With any new technology comes greater questioning, and in turn, more involvement across the entire enterprise.”

Nick Elprin, Co-Founder and CEO at Domino Data Lab

An army of smaller, specialized Large Language Models will triumph over giant general ones

“As we saw during the era of “big data” — bigger is rarely better. Models will “win” based not on how many parameters they have, but based on their effectiveness on domain-specific tasks and their efficiency. Rather than having one or two mega-models to rule them all, companies will have their own portfolio of focused models, each fine-tuned for a specific task and minimally sized to reduce compute costs and boost performance.”

Generative AI will unlock the value and risks hidden in unstructured enterprise data

“Unstructured data — primarily internal document repositories — will become an urgent focus for enterprise IT and data governance teams. These repositories of content have barely been used in operational systems and traditional predictive models to date, so they’ve been off the radar of data and governance teams. GenAI-based chatbots and fine-tuned foundation models will unlock a host of new applications of this data, but will also make governance critical. Companies that have rushed to develop GenAI use cases without having implemented the necessary processes and platforms for governing the data and GenAI models will find their projects trapped in PoC purgatory, or worse. These new requirements will give rise to specialized tools and technology for governing unstructured data sources.”

Kjell Carlsson, Head of Data Science Strategy and Evangelism at Domino Data Lab

Predictive AI Strikes Back: Generative AI sparks a traditional AI revolution

“The new hope around GenAI drives interest, investment, and initiatives in all forms of AI. However, the paucity of established GenAI use cases and the lack of maturity in operationalizing GenAI mean that successful teams will allocate more than 90% of their time to traditional ML use cases that, despite the clear ROI, had hitherto lacked the organizational will.”

GPUs and GenAI Infrastructure Go Bust

“Gone are the days when you had to beg, borrow and steal GPUs for GenAI. The combination of a shift from giant, generic LLMs to smaller, specialized models, increased competition in infrastructure, and quickly ramping production of new chips accelerated for training and inference means that scarcity is a thing of the past. However, investors don’t need to worry in 2024, as the market won’t collapse for at least another year.”

Forget Prompt Engineer: LLM Engineer Is the Least Sexy, but Best Paid, Profession

“Everyone will need to know the basics of prompt engineering, but it is only valuable in combination with domain expertise. Thus the profession of “Prompt Engineer” is a dud, destined, where it persists, to be outsourced to low-wage locations. In contrast, as GenAI use cases move from PoC to production, the ability to operationalize GenAI models and their pipelines becomes the most valuable skill in the industry. It may be an exercise in frustration since most will have to use the immature and unreliable ecosystem of GenAI point solutions, but the data scientists and ML engineers who make the switch will be well rewarded.”

GenAI Kills Quantum and Blockchain

“The unstoppable combination of GenAI and Quantum Computing, or GenAI and Blockchain? Not! GenAI will be stealing all the talent and investment from Quantum and blockchain, kicking quantum even further into the distant future and leaving blockchain stuck in its existing use cases of fraud and criminal financing. Sure, there will be plenty of projects that continue to explore the intersection of the different technologies, but how many of them are just a way for researchers to switch careers into GenAI and blockchain/quantum startups to claw back some of their funding?”

Arina Curtis, CEO and Co-Founder at DataGPT

Data and Business Teams Will Lock Horns Onboarding AI Products

“While business user demand for AI products like ChatGPT has already taken off, data teams will still impose a huge checklist before allowing access to corporate data. This tail-wagging-the-dog scenario may be a forcing function to strike a balance, and adoption could come sooner rather than later as AI proves itself as reliable and secure.”

Businesses Big and Small Will Prioritize Clean Data Sets

“As companies realize the power of AI-driven data analysis, they’ll want to jump on the bandwagon – but won’t get far without consolidated, clean data sets, as the effectiveness of AI algorithms is heavily dependent on the quality and cleanliness of data. Clean data sets will serve as the foundation for successful AI implementation, enabling businesses to derive valuable insights and stay competitive.”
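As a small illustration of the cleaning work this implies, the sketch below deduplicates and normalizes a toy customer table with pandas before it would ever reach a model; the column names and values are invented for the example.

```python
# Minimal data-cleaning sketch: drop missing keys, normalize text, deduplicate.
import pandas as pd

raw = pd.DataFrame({
    "customer": ["Acme ", "acme", "Globex", None],
    "revenue":  [1000.0, 1000.0, 2500.0, 300.0],
})

clean = (
    raw.dropna(subset=["customer"])                                       # drop rows missing the key field
       .assign(customer=lambda d: d["customer"].str.strip().str.lower())  # normalize casing/whitespace
       .drop_duplicates()                                                 # remove exact duplicates
)

print(clean)  # two rows remain: one "acme", one "globex"
```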

Doug Kimball, CMO at Ontotext

Shift from How to Why: Enter the Year of Outcome-based Decision Making

“In 2024, data management conversations will experience a transformative shift and pivot from “how” to “why.” Rather than focusing on technical requirements, discussions next year will shift to a greater emphasis on the “why” and the strategic value data can bring to the business. Manufacturers recognize that data, once viewed as a technical asset, is a major driver of business success. Solution providers that deal with these needs are also seeing this change, and would be wise to respond accordingly.

In the coming year, data strategy and planning will increasingly revolve around outcomes and the value/benefit of effective data management, as leaders better understand the key role data plays in achieving overarching business objectives. Manufacturers will also reflect on their technology spend, particularly investments that have yielded questionable results or none at all. Instead of technical deep dives into intricacies like data storage and processing, crafting comprehensive data strategies that drive lasting results will be the priority.

Next year, manufacturers will move beyond technical deep-dives and focus on the big picture. This strategic shift signals a major change in the data management mindset for 2024 and beyond, ideally aligning technology with the broader objectives of the business such as driving growth, enhancing customer experiences, and guiding informed decision-making.”

Christian Buckner, SVP, Data Analytics and IoT at Altair

AI Fuels the Rise of DIY Physics-based Simulation 

“The rapidly growing interaction between Data/AI and simulation will speed up the use of physics-based simulation and extend its capabilities to more non-expert users.”

Mark Do Couto, SVP, Data Analytics at Altair

AI Will Need to Explain Itself

“Users will demand a more transparent understanding of their AI journey with “Explainable AI” and a way to show that all steps meet governance and compliance regulations. The White House’s recent executive order on artificial intelligence will put heightened pressure on organizations to demonstrate they are adhering to new standards on cybersecurity, consumer data privacy, bias and discrimination.”

Molham Aref, Founder and CEO at RelationalAI

2024: the Rise of the Data Cloud to Advance AI and Analytics 

“While data clouds are not new, I believe there will be a continued emergence and a clear distinction made between data clouds and compute clouds in 2024. With compute clouds like AWS or Azure, we have had to assemble and stitch together all the components needed to work with AI. With data clouds like Snowflake or Microsoft Fabric, by contrast, users have it all pre-packaged in a single platform, making it much easier to run analytics on the data needed to build AI systems. The rise of data clouds will offer a better starting point for data analytics, Artificial Intelligence (AI), and Machine Learning (ML).”

Dhruba Borthakur, Co-Founder and CTO at Rockset

In 2024, Enterprises Get A Double Whammy from Real-Time and AI – More Cost Savings and Competitive Intelligence 

“AI-powered real-time data analytics will give enterprises far greater cost savings and competitive intelligence than before by way of automation, and enable software engineers to move faster within the organization. Insurance companies, for example, have terabytes and terabytes of data stored in their databases, things like documentation if you buy a new house and documentation if you rent. 

With AI, in 2024, we will be able to process these documents in real-time and also get good intelligence from this dataset without having to code custom models. Until now, a software engineer was needed to write code to parse these documents, then write more code to extract the keywords or values, and then put them into a database and query it to generate actionable insights. The cost savings to enterprises will be huge because thanks to real-time AI, companies won’t have to employ a lot of staff to get competitive value out of data.”
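The pattern Borthakur describes, replacing hand-written parsing code with a model that returns structured fields, can be sketched as follows. extract_fields is a hypothetical stand-in for the model call, and the lease document and its fields are invented for illustration.

```python
# Document-to-database sketch: a model extracts structured fields, SQL does the rest.
import sqlite3

def extract_fields(document_text: str) -> dict:
    # Hypothetical: a real system would send the document plus a field schema
    # to a model and parse its structured (e.g. JSON) response.
    return {"doc_type": "lease", "monthly_rent": 1850.0, "tenant": "J. Smith"}

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE docs (doc_type TEXT, monthly_rent REAL, tenant TEXT)")

fields = extract_fields("…full text of a rental agreement…")
conn.execute("INSERT INTO docs VALUES (:doc_type, :monthly_rent, :tenant)", fields)

print(conn.execute("SELECT doc_type, AVG(monthly_rent) FROM docs GROUP BY doc_type").fetchall())
conn.close()
```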

The Rise of the Machines Powered by Real-Time Data and AI Intelligence

“In 2024, the rise of the machines will be far greater than in the past as data is becoming more and more “real-time” and the trajectory of AI continues to skyrocket. The combination of real-time data and AI makes machines come to life, as they start to process data in real time and make automatic decisions!”

Zandra Moore, CEO at Panintelligence

“The AI rush will continue into 2024, at least in the SaaS sector, whose products are the gateway through which most people and businesses will access AI. More than half of SaaS companies plan to progress new AI innovations by the end of 2024.”

“Following 2023’s Generative AI spree, AI strategies will shift in 2024. The focus is moving to more savvy innovation. 2024 will be the year of ‘pragmatic AI’. Our research indicates that SaaS companies will embrace Deep Learning, Predictive Analytics and Causal AI in 2024.”

“While one in six vendors are currently testing new Generative AI functionality ahead of planned launches, more than a quarter are testing Predictive Analytics to help users predict future outcomes based on historical data.”

“Causal AI, which helps understand data relationships and decision-making processes, also looks to gain prominence, addressing the need for transparent AI models. The number of SaaS vendors using this technology will double in 2024.”

“The number of SaaS vendors using Deep Learning technologies could also double. Almost a fifth of SaaS vendors are testing neural networks capable of learning complex patterns and representations ahead of target launch dates next year.”

Jonathan Friedmann, CEO at Speedata

“Since 2023 was all about the mainstreaming of AI and the crushing demand for specialized infrastructure (even from the category leaders), in 2024, we will see a reckoning for capacity to support other specialized workloads. The collective crisis standing in the way of business innovation is no longer just big data and the quality, compliance, and privacy concerns that come with it. It’s now big-time processing to unblock the teams, initiatives, and workloads within each particular domain. 

We have seen the GPU boom. But what comes next? Faced with enormous capacity constraints – including data center space and energy, as well as budget and performance – enterprises will have to strongly consider their future needs to efficiently and strategically do more with less. In the next year, we’ll start to see the shift to dedicated hardware for dedicated workloads to accelerate processing and break the cycle of scaling the footprint of generic compute across every conceivable industry and endeavor.”

Register for Insight Jam (free) to gain exclusive access to best practices resources, DEMO SLAM, leading enterprise tech experts, and more!
