
How to Integrate Data from Legacy Systems into a Modern Data Environment

Precisely Editor | July 16, 2020

There are some very good reasons why many of the world’s most transaction-intensive companies are still running critical business applications on mainframes and using legacy data. Those legacy systems are highly reliable, secure, and scalable. In industries such as banking, insurance, and healthcare, these qualities are extremely important. 

However, there are also drawbacks: mainframes lack the flexibility offered by distributed computing, and accessing data from those systems can be expensive and cumbersome. Very often, legacy data requires on-the-fly transformation to be compatible with modern open-standard systems. Moreover, working with mainframe data requires specialized skills, and the talent pool of experienced mainframe IT workers has been shrinking in recent years. 

As a result of these challenges, many companies still treat mainframe data as if it needs to reside in a silo, performing batch updates to synchronize mainframe data with external platforms or simply living with the limitations of siloed information. Those approaches have become less and less palatable over time. Fortunately, there are better alternatives.

Let’s look at some use cases and best practices for integrating data from legacy systems.

Use case 1: Real-time analytics and legacy data

For the world’s leading enterprises, big data is the key that unlocks competitive insights. Companies that run core transactional functions on a mainframe face a conundrum: how can they provide a comprehensive view of the enterprise if they don’t have timely access to their core transactional data?

With an array of different systems including mobile applications, e-commerce, CRM, shared services, trading platforms, and more, companies must be able to aggregate data from multiple sources and report on it in real time.

In today’s business climate, responsiveness is critically important. Day-old information just isn’t good enough. This is especially true during periods of disruptive change when economic behavior is shifting rapidly and business leaders need to respond quickly. Companies that can see trends right away and react swiftly will outperform those who can’t.

Use case 2: Security and audit compliance

For the IT department, timely discovery and rapid response to security events are critical. For IT Operations Analytics (ITOA) and Security Information and Event Management (SIEM), real-time visibility is non-negotiable. A majority of respondents to Precisely’s State of the Mainframe for 2018 survey placed high importance on real-time analysis of security alerts and audit readiness. Many respondents were seeking to stream SMF and log data to platforms such as Splunk, Hadoop, or Spark for an enterprise-wide view of IT operations and security. With the passage of the European Union’s GDPR (General Data Protection Regulation) and the likely passage of similar legislation in other countries, compliance with security and privacy standards is more important than ever.
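
To make this more concrete, here is a minimal sketch in Python of forwarding a parsed mainframe security record to Splunk’s HTTP Event Collector (HEC). The endpoint URL, token, sourcetype, and field names are placeholders chosen for illustration only; they are not part of any specific product.

```python
# Minimal sketch: forward one parsed mainframe security record to Splunk HEC.
# The URL, token, and field names below are placeholders -- substitute values
# from your own environment.
import json
import requests

SPLUNK_HEC_URL = "https://splunk.example.com:8088/services/collector/event"  # placeholder host
SPLUNK_HEC_TOKEN = "00000000-0000-0000-0000-000000000000"                    # placeholder token

def send_security_event(record: dict) -> None:
    """Post one security-relevant record (e.g., a parsed SMF entry) to Splunk."""
    payload = {
        "event": record,
        "sourcetype": "mainframe:smf",  # assumed sourcetype naming convention
        "source": "zos-lpar-01",        # placeholder source identifier
    }
    response = requests.post(
        SPLUNK_HEC_URL,
        headers={"Authorization": f"Splunk {SPLUNK_HEC_TOKEN}"},
        data=json.dumps(payload),
        timeout=10,
    )
    response.raise_for_status()

send_security_event({"smf_type": 80, "userid": "JSMITH", "event": "logon_failure"})
```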


Interoperability is imperative. Customers have come to expect self-service applications as a matter of course. These applications are built on modern platforms that use open standards and operate across distributed environments. Mainframe data needs to interact with those systems in real time. Fraud detection, similarly, requires real-time access to data. To deliver these kinds of solutions, businesses need an efficient and secure process for streaming data from the mainframe to distributed platforms, without creating data processing bottlenecks. Again, real-time access is an absolute requirement.
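
To illustrate the streaming half of this picture, the sketch below publishes a single change record to a distributed platform over an encrypted connection, using the open-source kafka-python client. The broker address, topic name, certificate paths, and record fields are assumptions made for the example; they do not describe any particular product’s pipeline.

```python
# Sketch: publish one mainframe change record over TLS using kafka-python.
# Broker, topic, certificate paths, and fields are illustrative placeholders.
import json
from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers=["broker1.example.com:9093"],  # placeholder broker
    security_protocol="SSL",                         # encrypt data in transit
    ssl_cafile="/etc/pki/ca.pem",
    ssl_certfile="/etc/pki/client-cert.pem",
    ssl_keyfile="/etc/pki/client-key.pem",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

# A hypothetical transaction record already converted to open formats.
producer.send("mainframe.transactions", {"account": "1234", "amount": 250.00, "type": "debit"})
producer.flush()
```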

The good news is that there are some powerful tools for mastering these challenges. Legacy systems can continue to deliver reliable performance, but they no longer need to operate as information silos. 

For businesses seeking to modernize their IT landscape, here are some key practices to keep in mind:

Rein in complexity with legacy data formats

Compared to modern applications, legacy systems were designed with different priorities in mind. Scalability, security, and reliability were paramount in the mainframe era, and they are just as important today. Interoperability – enabled by open standards and integration tools – was a secondary consideration. Native mainframe data formats can be a problem because they are often incompatible with modern platforms. COBOL copybooks and variable-length records, for example, can present significant challenges to efficient data transfer.
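
To see why, consider the small Python sketch below, which decodes a hypothetical record described by a COBOL copybook: a 10-byte EBCDIC name field followed by a COMP-3 (packed decimal) balance. The layout is invented for illustration; real copybooks are far larger and often include REDEFINES and OCCURS DEPENDING ON clauses, which is exactly why hand-written parsers become hard to maintain.

```python
# Sketch: decode a hypothetical copybook layout --
#   CUST-NAME  PIC X(10)            (EBCDIC text)
#   BALANCE    PIC S9(5)V99 COMP-3  (4-byte packed decimal)
import codecs

def unpack_comp3(raw: bytes, scale: int = 2) -> float:
    """Decode a COMP-3 field: two digits per byte, final nibble is the sign."""
    nibbles = []
    for byte in raw:
        nibbles.append((byte >> 4) & 0x0F)
        nibbles.append(byte & 0x0F)
    sign = nibbles.pop()                             # 0xD marks a negative value
    value = int("".join(str(n) for n in nibbles))
    if sign == 0x0D:
        value = -value
    return value / (10 ** scale)

record = bytes.fromhex("d1d6c8d5e2d4c9e3c840") + bytes.fromhex("0012345c")
name = codecs.decode(record[0:10], "cp037").rstrip()  # EBCDIC code page 37 -> text
balance = unpack_comp3(record[10:14])
print(name, balance)                                  # JOHNSMITH 123.45
```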

Computing efficiency, likewise, is a key consideration. Data processing bottlenecks can drive higher CPU usage, resulting in increased operating costs and slower performance. 

Resolving these problems with custom code is cumbersome and relies on a hard-to-find skillset. Industry-standard tools, on the other hand, can manage this complexity with much lower risk and without the need to maintain custom code over time.

Make “real-time” your mantra

Real-time is no longer just an aspiration; in today’s world, it is expected. Many companies have relied on batch updates to populate data warehouses, data lakes, and external applications. As a consequence, the business runs on day-old information. 

For interoperability use cases such as fraud detection or customer self-service, batch updates simply don’t work. For everything else, batch latency means that business managers cannot react as quickly as they should. That translates to a competitive disadvantage.
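
As a rough sketch of the alternative, the Python example below consumes change events as they arrive from a streaming platform, rather than waiting for a nightly batch job. The topic name, broker address, and event fields are hypothetical.

```python
# Sketch: react to change events as they arrive instead of loading a nightly batch.
# Topic, broker, and field names are hypothetical.
import json
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "mainframe.cdc.accounts",                        # hypothetical CDC topic
    bootstrap_servers=["broker1.example.com:9092"],  # placeholder broker
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    auto_offset_reset="latest",
)

for message in consumer:
    change = message.value
    # Apply each insert/update/delete downstream as it happens, so analytics
    # and self-service applications always see current data.
    print(change["operation"], change["table"], change["key"])
```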

Big data, AI, and machine learning can drive innovation, but if they operate on outdated or inaccurate data, the results could be off target. Data quality, likewise, is essential for driving meaningful results from such systems. Real-time access across multiple systems, combined with effective data quality, translates to high-value business insights that drive competitive advantage.

Security must be built-in

For well over a decade, countless news stories have shone a spotlight on security and data governance issues. Companies that fail to adequately address these challenges risk failing compliance audits or, worse, exposing confidential information to bad actors.

In Precisely’s 2018 survey, 53 percent of respondents said they lack full visibility into the movement of their data within the organization. With an increased focus on data governance and compliance audits, a security-first design is paramount.

Custom-coded integrations must not only ensure that data is delivered accurately and efficiently, but they must also adhere to best practices for security. For IT managers, this raises a question of risk: will anything slip through the cracks?

Purpose-built integration tools, in contrast, encapsulate best practices in security and data governance, ensuring that IT departments have clear visibility into the movement of data within their organizations.

Break down the silos

For IT managers, the mandate has never been stronger. To enable innovation, ensure service excellence, and comply with security standards, enterprises must bridge the gap between legacy data silos and their distributed computing platforms.

With purpose-built tools for integration and data quality, companies can eliminate data silos and modernize their IT landscape while preserving the strengths upon which their legacy systems are built. 

If your company is contemplating building a modern data environment, Precisely can help. To learn how, please download our eBook: How to Build a Modern Data Architecture with Legacy Data.