Ola Mayer, Director of Product Marketing at Attunity – a rapidly growing data integration provider – recently wrote a thoughtful two-part blog post on how companies with solid legacy systems can deal with the challenges of data integration in a rapidly changing modern data environment. In the first post, Ms. Mayer outlines the most basic issue: “Many enterprise-grade software systems have reliably served the needs of their businesses, and the people who use them, for years and even decades. A tremendous amount of business data and processes is tied up in legacy systems, ranging from mainframes to custom applications to other applications lacking accessible interfaces.”
As Ms. Mayer points out, these legacy systems present a double-edged sword for data integration: “One benefit of these systems is their reliability and high performance. As a result, organizations are reluctant to abandon them for new technologies. In addition, the data residing in legacy systems is extremely valuable to the business, used in vital initiatives including business intelligence (BI) and analytics. These initiatives often support mission-critical operations, including marketing, HR, customer support, finance, logistics, and more, contributing to an organization’s competitive advantage.”
“Although legacy data is an indispensable resource,” she continues, “IT teams struggle to find efficient and cost-effective ways to access and leverage it for business purposes.”
This leads to the four main challenges facing these organizations:
- The cost and complexity of migrating to newer platforms is prohibitive. IT analysts estimate that the cost to replace business logic is about five times that of reuse, and that’s not counting the risks involved in wholesale replacement. In an era where IT teams must do more with less, modernizing the IT infrastructure and migrating legacy platforms and databases to newer technologies may simply not be possible.
- Accessing data in legacy systems is challenging. IT teams recognize that timely access to data in legacy systems is essential for internal customers, but accessing that information and integrating it with other database systems can be very difficult.
- Data refreshes from legacy systems may be too slow for BI and analytics purposes. Long delays in legacy system data refreshes have a negative impact on the lines of business. Analysts must work with outdated information or wait for unacceptably long periods of time to get updated data.
- Accessing data in legacy systems is often a burden for IT teams. In many organizations, business users send the IT department their information requests for BI and analytics tasks. Accessing legacy system data to meet these needs can take a long time due to limited human resources and a lack of expertise.
In the second post, Ms. Mayer outlines a strategic approach to the problem.
“The IT industry is responding to challenges via ‘legacy modernization’ and ‘legacy transformation’. This is the act of reusing and refactoring existing core business logic by providing new user interfaces or by selectively moving data from the legacy systems to the modern data warehouse systems (where data is integrated and analyzed with data coming from ‘modern sources’). Understanding and mapping your legacy systems to a modern data warehouse structure is a key success factor for a data integration project. This is where Attunity can help companies to overcome the challenge of legacy system modernization.”
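The “selectively moving data” step Ms. Mayer describes is often implemented as incremental extraction: copying only the rows that changed since the last sync, rather than reloading the whole legacy table. A minimal sketch of that pattern using a timestamp watermark, with in-memory SQLite databases standing in for the legacy source and the warehouse (all table and column names here are hypothetical, not Attunity's actual product behavior):

```python
import sqlite3

def sync_changed_rows(source, warehouse, last_sync):
    """Copy rows modified since the previous sync (timestamp-watermark CDC)."""
    rows = source.execute(
        "SELECT id, name, updated_at FROM customers WHERE updated_at > ?",
        (last_sync,),
    ).fetchall()
    warehouse.executemany(
        "INSERT OR REPLACE INTO dim_customers (id, name, updated_at) "
        "VALUES (?, ?, ?)",
        rows,
    )
    warehouse.commit()
    # New watermark: latest change we copied, or the old mark if nothing changed.
    return max((r[2] for r in rows), default=last_sync)

# In-memory stand-ins for a legacy database and a modern warehouse.
source = sqlite3.connect(":memory:")
source.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT, updated_at TEXT)")
source.executemany(
    "INSERT INTO customers VALUES (?, ?, ?)",
    [(1, "Acme", "2014-05-01"), (2, "Globex", "2014-05-10")],
)

warehouse = sqlite3.connect(":memory:")
warehouse.execute("CREATE TABLE dim_customers (id INTEGER PRIMARY KEY, name TEXT, updated_at TEXT)")

# Only rows updated after the watermark are moved; the watermark advances.
watermark = sync_changed_rows(source, warehouse, "2014-05-05")
```

The key design point is that each run touches only the changed rows, which is what makes frequent BI refreshes feasible against a legacy system that cannot tolerate repeated full-table scans. Production tools replace the timestamp column with log-based change data capture, but the watermark idea is the same.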