By Nenshad Bardoliwalla
Since the birth of the information technology paradigm, its role in business has been on a constant growth curve. In the early days, IT was about automating manual processes, like order entry. This was followed by optimizing business processes with the emergence of packaged applications – first for back-office and finance functions, and then for customer-facing ones with CRM and SFA. Today, it is about the data we collect – not just from our packaged applications, but from informal sources like our websites and mobile apps, and from millions of devices with embedded sensors transmitting observations on machines, people, weather, and anything we can meaningfully measure.
The undeniable result is that the value of IT has become deeply intertwined with every function and process that makes up your business. But while the technology landscape has undergone massive change and disruption, how one pays for the information technologies a business needs has not always kept pace. Traditional pricing was based on a perpetual licensing model: you had to estimate what technology, and how much of it, you would need – usually for the next three to five years. Once decided, you purchased the capacity and paid up front. If any of your estimates proved too ambitious, you were stuck sitting on unused software and hardware. Not only did this disconnect the cost of the technology from the real business value derived, it created a situation in which every CIO questioned the value IT could bring.
About ten years ago, new pricing models emerged that started changing this equation. Amazon with AWS and numerous Software as a Service (SaaS) vendors (most notably Salesforce) brought two powerful changes: (1) buyers could purchase software or hardware on an annual subscription basis, and (2) the service was delivered via the cloud, so there was no longer a need to install hardware or software in your own data center. The change was phenomenally disruptive, but it was the downturn of 2008 that became the proving ground for how this benefits the business. If, for example, sales staff had to be reduced due to the downturn, you could decrease your Salesforce subscription accordingly; traditional (perpetual) purchases of SAP CRM or other solutions could not be scaled back in similar fashion. AWS did the same to the likes of HP and IBM selling data center hardware.
Today, we are on the cusp of yet another massively disruptive trend: true consumption-based pricing. Businesses are demanding more granular pricing, and consumption-based models seem to be the direction the market is going. Expect the concept to continue in 2019 thanks in part to the following:
- Infrastructure led the way: AWS and Azure have paved the way on the compute and storage side. Why would you pay to “rent” processing in the cloud when you are not using it? Today, organizations pay only for the minutes their clusters are up and working; if they stop using them, costs drop to zero or near zero.
- Analytics vendors are following quickly: On the software side, upstarts in the data market like Snowflake and Databricks are following suit. In both cases, businesses can now have their data processing environment metered so they are charged only by the hour of usage. Storage in the cloud costs nearly nothing, so an organization can easily store all its data and keep it available around the clock. When analytical jobs need to be run, you simply spin up the processing engines, which run on the data already stored in the cloud. Upon completion of the job, the processing resources simply go away, and you are charged only for the actual resources consumed.
- Major dependence on Kubernetes and other enabling technologies: At the heart of this notion of paying only for engines while they are running lie very sophisticated technologies that must spin up and tear down processing resources across distributed compute environments. Virtual machines delivered the first generations of this capability, but they proved too slow to start and stop and remained complex to manage. In the Hadoop world the job fell to YARN; in the latest approach, containerization of these workloads, Kubernetes is winning the battle.
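The pricing shift described in the bullets above boils down to simple arithmetic: pay per minute of cluster time rather than a fixed up-front sum. Here is a minimal sketch of that comparison; the per-minute rate and license cost are invented for illustration and do not reflect any vendor's actual pricing.

```python
# Hypothetical illustration of consumption-based vs. up-front capacity cost.
# PER_MINUTE_RATE and FIXED_ANNUAL_LICENSE are assumed figures, not real prices.

PER_MINUTE_RATE = 0.05          # assumed on-demand rate per cluster-minute ($)
FIXED_ANNUAL_LICENSE = 50_000   # assumed up-front cost of pre-purchased capacity ($)

def consumption_cost(minutes_used_per_day: float, days: int = 365) -> float:
    """Pay only for the minutes the cluster is actually running."""
    return minutes_used_per_day * days * PER_MINUTE_RATE

# An analytics team whose clusters run two hours a day pays for just
# those minutes; idle time (the other 22 hours) costs nothing.
yearly = consumption_cost(minutes_used_per_day=120)
print(f"Consumption-based: ${yearly:,.0f}/yr vs. fixed ${FIXED_ANNUAL_LICENSE:,}/yr")
```

The key property is that `consumption_cost(0)` is zero: stop using the clusters and the bill stops too, which a perpetual license can never do.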
Beware of the surprise bill at the end of the month by aligning with the right initiatives
One of the side effects of a consumption-based pricing model is the surprise bill at the end of the month. Much like the careless use of mobile phones and the resulting surprise bill from out-of-country roaming charges, we have already heard of nasty surprises from Spark jobs left running over a weekend that ran up monstrous bills. The good news is that the early entrants already provide a recipe for solving this, in the form of prepaying and AWS’ Spot Pricing options. Much as we are used to doing with our mobile phones, one can estimate requirements up to a year out and prepay at rates deeply discounted relative to normal minute-to-minute pricing.
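The prepay trade-off described above can be sketched in a few lines. The on-demand rate and the discount level here are illustrative assumptions, not actual AWS or vendor pricing; the point is only that a committed estimate buys a cheaper rate, while unused prepaid capacity is a sunk cost.

```python
# Hypothetical sketch of prepaying vs. pay-as-you-go consumption pricing.
# ON_DEMAND_RATE and PREPAY_DISCOUNT are assumed values for illustration only.

ON_DEMAND_RATE = 0.10    # assumed pay-as-you-go rate per compute-minute ($)
PREPAY_DISCOUNT = 0.60   # assumed discount for committing a year in advance

def on_demand_cost(minutes: float) -> float:
    """Pure consumption pricing: pay the full rate, but only for what you use."""
    return minutes * ON_DEMAND_RATE

def prepaid_cost(committed_minutes: float) -> float:
    """Prepay the year's estimate at a discount; unused minutes are sunk cost."""
    return committed_minutes * ON_DEMAND_RATE * (1 - PREPAY_DISCOUNT)

# A team estimating 100,000 compute-minutes for the year compares both paths:
estimate = 100_000
print(f"On-demand: ${on_demand_cost(estimate):,.0f}")
print(f"Prepaid:   ${prepaid_cost(estimate):,.0f}")
```

As with a mobile plan, the discount only pays off if the usage estimate is honest: overestimate badly and the prepaid minutes sit unused, which is exactly the perpetual-license trap in miniature.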
One of the big advantages of consumption-based pricing is that it provides very granular costs that can be tied to very specific business activities. Something that does not return the right value can simply be stopped; something that does add value can get more funding and more resources. And delivering it via the cloud means there is no lag between the need to increase or decrease capacity and the actual delivery of that service. While one can debate which kinds of applications best fit a consumption-based pricing model, it is clear that data and analytical workloads fit the bill perfectly and will be among the areas to benefit most from this new pricing paradigm in 2019. New analytical projects usually start, run for a few weeks or months, and shut down once the insights are developed, and analysts themselves spend only a few hours a day in the data and analytical tools, devoting the rest of the day to other tasks such as presenting the results or building them into presentations.
As CPO for Paxata, Nenshad sets the product vision and strategy. Nenshad is an executive and thought leader with a proven track record of success leading product strategy, product management, and development in business analytics. He formerly served as VP for Product Management, Product Development, and Technology at SAP. Nenshad is also the lead author of Driven to Perform: Risk-Aware Performance Management From Strategy Through Execution.