Talend Unveils Apache Beam-Powered Big Data Prep Tool

Talend has announced the first Apache Beam-powered solution for Big Data preparation. Beam, currently a top-level Apache project, is a unified programming model for building both batch and streaming data processing pipelines that are portable across a variety of runtime platforms. Talend’s new tool is self-service in nature and enables users to access, cleanse, and analyze large data sets.
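The core idea behind Beam's unified model is that the same transformation logic runs unchanged over a bounded (batch) source or an unbounded (streaming) source. The plain-Python sketch below illustrates that idea only; the function names are illustrative and are not the Apache Beam API, which expresses the same pattern with `PCollection`s and `PTransform`s.

```python
# Plain-Python analogy for Beam's unified batch/streaming model:
# one transform, two kinds of sources. Not the Apache Beam API.

def clean(record):
    """Trim whitespace and lowercase a raw record."""
    return record.strip().lower()

def run_pipeline(source):
    """Apply the same transform regardless of whether the
    source is a bounded list or a streaming-style generator."""
    return [clean(r) for r in source]

batch_source = ["  Alice ", "BOB"]                 # bounded data set
stream_source = (r for r in ["  Carol ", "DAVE"])  # unbounded-style generator

print(run_pipeline(batch_source))   # ['alice', 'bob']
print(run_pipeline(stream_source))  # ['carol', 'dave']
```

In Beam proper, the portability extends one step further: the same pipeline definition can execute on different runners (for example Spark, Flink, or Google Cloud Dataflow) without being rewritten.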

Talend Data Preparation powered by Apache Beam was first introduced in January as part of the Winter ’17 release of Talend’s integration platform, and it signals the vendor’s commitment to this data processing technology. Talend has collaborated with Google and others on the development of Apache Beam since 2015, making several contributions to the Beam community over the last two years. Moving forward, Apache Beam will become a key element of the Talend Data Fabric integration stack.

Talend Chief Technology Officer Laurent Bride commented on the new release: “Modern businesses need better access to clean, actionable data in order to support real-time insight across their organizations. However, given the current rate of technology innovation, IT leaders often worry that the investments made today will too quickly become obsolete and an obstacle to advancement tomorrow. We believe Apache Beam represents the future because it mitigates the need to rewrite applications as new innovations are introduced, systems are moved to the cloud, or integration styles need to be altered.”

Data preparation is a hot topic today because modern technologies and practices are finally giving users and IT an alternative to the traditionally slow, manual, and tedious steps of getting data ready for analytics. Data preparation covers a range of processes that begin with the ingestion of raw structured and unstructured data. Subsequent processes improve data quality and completeness, standardize how the data is defined for communities of users and applications, and perform the transformation steps that make it suitable for analytics.
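The stages described above (ingest, standardize, improve quality, transform) can be sketched with the Python standard library. The raw CSV, field names, and cleaning rules below are hypothetical examples, not part of Talend's product:

```python
import csv
import io

# Hypothetical raw input: inconsistent casing and a missing value.
raw = "Name,AGE\nAlice,34\nbob,\ncarol,29\n"

# Ingest: parse the raw data into records.
rows = list(csv.DictReader(io.StringIO(raw)))

# Standardize: lowercase field names, strip stray whitespace from values.
rows = [{k.lower(): (v or "").strip() for k, v in r.items()} for r in rows]

# Improve quality and completeness: drop records missing a required field.
rows = [r for r in rows if r["age"]]

# Transform for analytics: cast types, normalize name casing.
prepped = [{"name": r["name"].title(), "age": int(r["age"])} for r in rows]

print(prepped)  # [{'name': 'Alice', 'age': 34}, {'name': 'Carol', 'age': 29}]
```

A self-service prep tool automates steps like these at scale; the point of the sketch is only to show what each stage does to the data.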

Read the full press release.
