In database management, extract, transform, load (ETL) refers to three separate functions combined into a single programming tool. Fragmentation in the legacy Data Integration market made us wonder whether traditional integration tools were becoming obsolete before our very eyes. With this in mind, we asked the crowd whether they believed Data Integration as we’ve known it was dying. To our surprise, the answer was a resounding no: legacy tools are still in use across many verticals as enterprises prepare for the next wave of data tools.
In a recent presentation at Spark Summit EU, ING’s Chapter Lead in Analytics, Bas Geerdink, spoke to this very topic, recommending a migration from ETL to Apache Spark for data processing and movement. Geerdink, who is also a certified Spark developer, argues that ETL has seen no real technological or market evolution in recent years, unlike BI and the data warehouse. ETL tools don’t seem to have a major role in the future outside of niche use cases, with the slideshow even referring to these solutions as “ETL Hell.” Draw your own conclusions, and click through the presentation to learn more.
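For readers less familiar with the pattern under discussion, the three functions that ETL combines can be sketched in a few lines of plain Python. This is a minimal illustration of the extract/transform/load pattern itself, not ING's pipeline or a Spark job; the sample data, table name, and cleaning rules are all hypothetical.

```python
import csv
import io
import sqlite3

# Hypothetical sample data standing in for a source system.
RAW_CSV = """id,name,amount
1,alice,10.50
2,bob,-3.00
3,carol,7.25
"""

def extract(raw):
    """Extract: read rows from the source (here, an in-memory CSV)."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows):
    """Transform: normalize names and drop invalid (negative) amounts."""
    return [
        {"id": int(r["id"]), "name": r["name"].title(), "amount": float(r["amount"])}
        for r in rows
        if float(r["amount"]) >= 0
    ]

def load(rows, conn):
    """Load: write the cleaned rows into the target store (SQLite here)."""
    conn.execute("CREATE TABLE payments (id INTEGER, name TEXT, amount REAL)")
    conn.executemany("INSERT INTO payments VALUES (:id, :name, :amount)", rows)
    return conn.execute("SELECT COUNT(*) FROM payments").fetchone()[0]

conn = sqlite3.connect(":memory:")
loaded = load(transform(extract(RAW_CSV)), conn)
print(loaded)  # 2 rows survive the transform step
```

In a framework like Spark, the same three stages become reads, DataFrame transformations, and writes distributed across a cluster, which is the substance of the migration Geerdink recommends.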