Data Integration Buyer's Guide

The History of Data Integration: A 90-Second Primer

The following is an excerpt from Solutions Review’s Data Integration Vendor Map, a visual map of the top 24 providers of Integration Platform as a Service (iPaaS), enterprise application integration, and self-service data preparation tools.

Without data integration, accurate analytics are impossible to achieve. Imagine trying to make a decision based on incomplete data. The less information available, the more likely a decision leads to an undesirable outcome. Now scale that challenge up: decisions involve millions of dollars, hundreds of data sources, and terabytes of data. To steer a business correctly, data integration needs to carry a heavy burden.


Data integration grew out of extract, transform, and load (ETL) tools, which were designed simply: take data from one source, transform that data until it’s in a form another application can recognize, and then load it into that application. A basic example would be taking raw financial data from a stack of receipts, organizing it in a spreadsheet, and loading it into an accounting program.
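To make that flow concrete, here is a minimal ETL sketch in Python. The receipts.csv file, its vendor/amount/date columns, and the SQLite "expenses" table are hypothetical stand-ins for the receipts and the accounting program, not any particular product.

```python
import csv
import sqlite3

def extract(path):
    """Extract: read raw receipt rows from a CSV file."""
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

def transform(rows):
    """Transform: normalize each receipt into the shape the target expects."""
    return [
        (row["vendor"].strip().title(), float(row["amount"]), row["date"])
        for row in rows
        if row.get("amount")  # drop receipts with no amount recorded
    ]

def load(records, db_path="ledger.db"):
    """Load: insert the cleaned records into the target store."""
    con = sqlite3.connect(db_path)
    con.execute(
        "CREATE TABLE IF NOT EXISTS expenses (vendor TEXT, amount REAL, spent_on TEXT)"
    )
    con.executemany("INSERT INTO expenses VALUES (?, ?, ?)", records)
    con.commit()
    con.close()

if __name__ == "__main__":
    load(transform(extract("receipts.csv")))
```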

As time went on, enterprises began to both generate more data and expect more from it. In response, data integration tools have become both more user-friendly and more granular. Administrators can choose to migrate data either in transactions or in batches, they can transform and filter data during run time, and they can even set up conditions where the data integration platform handles errors automatically.
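A rough sketch of what those options look like, assuming hypothetical record fields, a generic load callback, and a dead-letter list; commercial platforms typically expose the same choices as configuration rather than code.

```python
def transform_record(record):
    """Run-time transform: reshape one record, filtering out voided ones."""
    if record.get("status") == "void":
        return None  # filtered out at run time
    return {"id": record["id"], "amount": round(float(record["amount"]), 2)}

def migrate(records, load, batch_size=1, on_error=None):
    """Move records to the target via `load`, one at a time (batch_size=1)
    or in larger batches; bad records go to `on_error` instead of
    stopping the whole job."""
    batch = []
    for record in records:
        try:
            cleaned = transform_record(record)
        except (KeyError, ValueError, TypeError):
            if on_error:
                on_error(record)  # automatic error handling
            continue
        if cleaned is None:
            continue
        batch.append(cleaned)
        if len(batch) >= batch_size:
            load(batch)
            batch = []
    if batch:
        load(batch)

# Example: batch loading with errors routed to a dead-letter list.
dead_letters = []
migrate(
    [{"id": 1, "amount": "19.99"}, {"id": 2}],  # second record is malformed
    load=lambda rows: print("loaded", rows),
    batch_size=100,
    on_error=dead_letters.append,
)
```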

Businesses now face a bewildering array of choices among the various types and features of data integration platforms, and an organization may need pieces of one but not another. Which platform is best for a particular business?

What about ETL?

Formerly the market standard for moving data from source to target, ETL (Extract, Transform, and Load) tools are increasingly taxed by the growth in unstructured data volumes coming from non-traditional sources. For some, ETL may be the most direct way to unify enterprise data assets, but for others, it can create a bottleneck that contributes to latency. Thankfully, integration tools branch out in many different directions to offer functionality that meets a wide variety of analysis styles.

Organizations in specific verticals may choose to store all of their data in a single repository (a data lake) and run self-service data preparation on it there. Still, some use cases may warrant federating data so that a virtual copy can provide a unified view of all enterprise data. In other scenarios, fine-tuning the relationship between applications so that they can communicate with one another is most important.

