
BARC Perspective: Microsoft Fabric
What happened?
Microsoft has announced Microsoft Fabric as its latest update and bundling of its data and analytics product stack. The product is in preview status today, although early adopters have been part of an ongoing pre-beta program.
Why is it important?
Microsoft has gained a significant share of the business intelligence market since the 1990s with SQL Server and, over the last decade, mainly with Power BI. In recent years, Azure, Synapse, and other components have been the focus of a renewal of the data and analytics strategy to implement modern data management and, in particular, to secure a substantial share of the AI market opportunity for Microsoft.
The latest announcement reconfigures the components, and although only one smaller component, “Data Activator,” is genuinely new, the MS Fabric offering is sure to put new pressure on the data & analytics market.
What is interesting about it?
- The modernization of data management is progressing rapidly. Driven by the trend toward the Cloud, Artificial Intelligence, and an increasingly dynamic environment, many major providers are rebuilding or modernizing their backends with concepts such as Data Mesh, Data Lakehouse, or Data Fabric. Microsoft is finalizing its transformation with its newly branded “MS Fabric” offering.
- Through the announcement, Microsoft, as one of the dominant vendors in the data & analytics market, unites its products in one stack – not only from a licensing perspective, but also with a unified UI and easier interaction between the individual components. In recent months, Qlik’s acquisition of Talend and SAP’s product announcement have produced two other large providers that have likewise announced stronger integration of the front-end and back-end sides of their offerings.
- Microsoft Fabric is entering the market with a more aggressive licensing model designed to give midmarket companies, in particular, access to a complete modern data & analytics stack. Until now, the rather ambitious pricing model, especially around the core component “Synapse” for modern data management and data warehousing, has been an obstacle to widespread use. The new licensing model is purely capacity-based, which makes it harder to calculate the cost of ownership in advance.
- It appears that, after the good experiences with Power BI as an inexpensive entry-level product, Microsoft also wants to leverage that market position to bring more users to its platform on the cloud data management and data warehouse side. One hope is undoubtedly to bring more use cases to the platform, especially amid the current AI hype, through an initially more affordable and complete offering, and to monetize them later through significant consumption of Azure compute.
- The concept of a “fabric” has been hyped in the data & analytics market for some time. Microsoft is taking the bold step of using a potentially short-lived buzzword in the product name. This endorsement from Microsoft substantially upgrades the term “fabric” in the D&A industry.
- The engine behind MS Fabric has been completely rebuilt compared to Synapse, following modern paradigms of separating cloud storage from compute. The goal is to scale and compete with vendors like Databricks or Snowflake while providing comparable flexibility.
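The storage/compute separation mentioned above can be sketched conceptually: data resides in one durable storage layer, while stateless compute nodes are scaled up or down independently and read from it on demand. This is an illustrative sketch of the paradigm, not Fabric’s actual implementation; all names are hypothetical.

```python
# Conceptual sketch of storage/compute separation: one durable storage
# layer, with stateless compute nodes scaled independently of the data.
# Illustrative only; not how any specific vendor implements it.

shared_storage = {"table_a": list(range(10))}  # durable, always-on layer

class ComputeNode:
    """A stateless worker: holds no data of its own, only a storage handle."""
    def __init__(self, storage):
        self.storage = storage

    def query_sum(self, table):
        return sum(self.storage[table])

# Compute can be scaled up (more nodes) or down to zero without
# touching or moving the stored data.
nodes = [ComputeNode(shared_storage) for _ in range(3)]
print([n.query_sum("table_a") for n in nodes])  # every node sees the same data
```

The key property the sketch shows is that adding or removing `ComputeNode` instances never copies or invalidates the data in `shared_storage`, which is what lets storage and compute be billed and scaled separately.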
Background and technological fit
MS Fabric brings together components that have already worked well together in the past:
- Azure Data Factory for data integration
- Synapse for cloud-based data warehousing and engineering
- Various services for machine learning and AI
- Purview for data governance
- Power BI as a widespread front-end for dashboarding and reporting
OneLake is, on the one hand, a rebranded version of Azure Data Lake Storage (ADLS) with an enhanced user interface; on the other hand, it includes new virtualization functionality, such as the integration of AWS S3 data. This functionality is called “Shortcuts” and will likely be extended to other cloud technology providers such as Google or SAP. It appears that Microsoft is driving OneLake in the direction of an overarching data virtualization engine.
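The shortcut idea can be illustrated with a minimal sketch: a virtual path inside one lake resolves, at read time, to data held in an external store, without that data ever being copied in. This is purely conceptual and does not represent OneLake’s actual API; all class and path names are hypothetical.

```python
# Conceptual sketch of a "shortcut": a virtual path that resolves to an
# external data source on access, rather than a physical copy.
# All names are illustrative; this is NOT OneLake's actual API.

class VirtualLake:
    def __init__(self):
        self._local = {}      # data physically stored in the lake
        self._shortcuts = {}  # virtual path -> callable fetching external data

    def put(self, path, data):
        self._local[path] = data

    def add_shortcut(self, path, fetch_external):
        # Register a pointer to external data (e.g. an S3 bucket);
        # nothing is copied until the path is actually read.
        self._shortcuts[path] = fetch_external

    def read(self, path):
        if path in self._local:
            return self._local[path]
        if path in self._shortcuts:
            return self._shortcuts[path]()  # resolved only on access
        raise KeyError(path)

lake = VirtualLake()
lake.put("/sales/2023", [100, 200])
# Hypothetical external source standing in for, say, an S3 object:
lake.add_shortcut("/partner/orders", lambda: [7, 8, 9])
print(lake.read("/partner/orders"))  # served from the external source
```

The design point is that consumers address everything through one namespace (`lake.read(...)`), while ownership and storage of the shortcut targets remain with the external system.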
“Data Activator” has also been announced as a new component for defining rule-based event processing and automating workflows. This functionality was previously supported in Azure environments through scripting, but Data Activator offers a new, end-user-oriented low-code toolset with more flexibility.
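Rule-based event processing of this kind can be sketched in a few lines: a rule pairs a condition with an action, and each incoming event is checked against all registered rules. This is a conceptual illustration only, not Data Activator’s actual interface; all names and the alerting logic are hypothetical.

```python
# Minimal sketch of rule-based event processing: each rule pairs a
# condition with an action; every event is checked against all rules.
# Illustrative only; not Data Activator's actual interface.

from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    name: str
    condition: Callable[[dict], bool]
    action: Callable[[dict], None]

def process(event: dict, rules: list[Rule]) -> list[str]:
    """Run all matching rules' actions; return the names of rules that fired."""
    fired = []
    for rule in rules:
        if rule.condition(event):
            rule.action(event)
            fired.append(rule.name)
    return fired

alerts = []  # stand-in for a notification channel (e-mail, Teams, ...)
rules = [
    Rule("high-temperature",
         condition=lambda e: e.get("temp", 0) > 30,
         action=lambda e: alerts.append(f"Overheating: {e['temp']}")),
]

print(process({"sensor": "s1", "temp": 35}, rules))  # ['high-temperature']
print(process({"sensor": "s1", "temp": 20}, rules))  # []
```

A low-code tool essentially lets business users author the `condition` and `action` parts through a UI instead of writing the scripting glue themselves.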
MS Fabric could be seen primarily as a new commercial bundle of familiar components, but there are two other interesting aspects:
- First, a user interface that will be unified step by step and is intended to significantly simplify interaction between the components; a precise assessment will only be possible from the public beta phase, when the integration is largely complete. The integration of generative AI capabilities throughout the product portfolio has also been announced as an important part of the unified user interface (and beyond).
- Second, a new pricing model based on combined usage and billing within a single Microsoft tenant. Once activated, all components of MS Fabric are available and ready to use.
Potential negative effects for customers
- Bundling products has no added value in itself, and the deeper technical integration could even slow the development of the individual components for some time. Microsoft has an excellent track record here in the data & analytics space, especially from the integration of Power Query, Power View, and Power Pivot into Power BI years ago. On the other hand, some of the bundled components (e.g., Purview for data governance and data intelligence) still have functional gaps compared to major competitors that need to be closed.
- Microsoft is trying hard to demonstrate the continued openness of the platform for partners, and this is underpinned by press releases from partners such as Informatica. Nevertheless, customers with best-of-breed strategies could suffer if cooperation with third parties receives less focus in the future.
- A stronger bundling of products in a cloud provider’s stack increases the lock-in effect and makes customers more susceptible to subsequent price increases.
- Although MS Fabric follows a multi-cloud approach and includes functionality to integrate data from AWS and Google Cloud, the level of integration is rather basic. Customers with multi-cloud strategies could be pushed to standardize their data on Azure.
- For existing Synapse users, MS Fabric is a logical upgrade path, but the migration seems to require a lot of manual effort. It remains to be seen whether Microsoft will offer migration tools in the near future.
Potential positive effects for customers
- Existing users of individual components like Power BI get an easier path to evaluate and try out additional components and benefit from better integration of components and a user interface that feels familiar.
- MS Fabric tries to simplify implementation tasks that used to be pure “data engineering” work and to push them closer to the business user by integrating them into the Power BI workspace. This could help companies scale the data engineering process.
- From a technical perspective, the integrated approach of MS Fabric will help users move from simple data management applications to more complex data science scenarios.
- It appears that the new licensing model will be more affordable, especially for mid-size companies, although we must wait for the detailed pricing announcements.
- Companies gain more flexibility in using their existing Azure commitments for different MS Fabric components.
- The administration of the components will be easier with a centralized approach within one tenant.
Strategic outlook
In general, the market trend is back toward the standardization of stacks by the larger vendors. Since its inception, the data and analytics market has seen swings between “best-of-breed” approaches and vendor-specific stacks. After a prolonged period of more mixed and “self-service” approaches, centralization of the data and analytics infrastructure seems to be coming back. This is driven mainly by the mega-vendors, which try to push as many workloads as possible into their single cloud environments.
Such a development has both positive and negative effects for customers. Greater integration makes it easier for users to implement new and more complex use cases. On the other hand, the lock-in effect with individual providers becomes stronger. Nevertheless, the providers strive to demonstrate openness and ensure interoperability. It will be interesting to observe how practicable multi-cloud concepts evolve in the data & analytics world in the future.
We therefore believe that “Data Fabric” solutions will become more functionally rich and widespread, as they are better able to deal with multi-cloud environments than pure Data Lakehouse solutions. However, as long as the mega-vendors are doing very good business migrating customers into their “single” cloud, we expect new Fabric features to trickle in at a slow to medium pace.
The transformation of the data & analytics infrastructure from the concept of the good old data warehouse – which still has its justification in some scenarios – to new data management concepts is, in any case, in full swing.