The Role of APIs in Mainframe Modernization

October 4, 2021
Mainframes are historically clunky systems. They are highly secure and capable of enormous processing throughput, yet they are complex to manage and difficult to extract data from. This reality is at odds with modern cloud-native infrastructure, which emphasizes distributed computing and portable microservices. So, how can existing mainframes adapt to this connected, data-driven economy?
I recently met with Dr. Alex Heublein, president at Adaptigent, to talk about the state of legacy modernization. According to Heublein, many mainframes are here to stay for the foreseeable future. However, enterprises continue to get value from these systems by adding an integration layer on top of them. By placing a REST API layer on top of a mainframe, enterprises can finally open up these systems, facilitating bidirectional communication with newer cloud-native services, enabling phase migration and edging closer to real-time processing.
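To make this concrete, below is a minimal sketch of such a REST facade written in Python with Flask. The /accounts endpoint, the ACCTINQ program name and the call_cics_transaction() helper are hypothetical placeholders; in practice, an integration product or vendor connector would supply the actual bridge to the host.

```python
# Minimal sketch: a REST facade that hides a mainframe transaction behind JSON.
# The "ACCTINQ" program and call_cics_transaction() are hypothetical placeholders.
from flask import Flask, jsonify

app = Flask(__name__)

def call_cics_transaction(program: str, payload: dict) -> dict:
    """Hypothetical bridge to the host. A real integration gateway or
    vendor connector would invoke the COBOL program here; a canned
    response keeps the sketch runnable end to end."""
    return {"program": program, "status": "OK", **payload}

@app.route("/accounts/<account_id>", methods=["GET"])
def get_account(account_id: str):
    # Cloud-native consumers see plain JSON; the host program and its
    # record formats stay hidden behind this endpoint.
    return jsonify(call_cics_transaction("ACCTINQ", {"account_id": account_id}))

if __name__ == "__main__":
    app.run(port=8080)
```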
The software industry changes every day. “Nothing in the tech field is permanent,” said Heublein, yet some legacy technologies will persist for years to come. In terms of mainframe adoption, Heublein sees a couple of categories of users. First are the large financial services organizations, such as large banks and insurance companies, which are heavily invested in their mainframe architectures and will likely continue to rely on mainframe environments for policy management and core infrastructure. “I think there’s a lot of good reasons for that,” explained Heublein. “If you want to process billions of transactions a day, [mainframe] is still a good option.”
That said, regulations and market pressures are driving a global movement toward open banking. PSD2 mandates open banking in Europe, for example, and industry consortia like Financial Data Exchange (FDX) are directing it in the U.S. market. These initiatives are encouraging financial services firms to transform their legacy environments and are spurring cloud migration, especially for applications that aren’t mission-critical.
Heublein pointed out a few different patterns in enterprise mainframe modernization. One approach is rewriting or re-architecting old COBOL applications and keeping them on-premises. Another is porting and re-platforming the application: moving a COBOL app to a Windows or Linux system, for example. Finally, some enterprises are shifting straight to the cloud. For solutions that don’t need to be on the mainframe, moving to cloud-native can free up capacity.
Distributed cloud technology is helping businesses scale to the right level. Yet, it’s difficult to jump off the mainframe entirely, noted Heublein. “Very rarely can you take all your stuff and move it all at once,” he said. “It’s a tough pill to swallow.” It’s especially difficult to move core banking infrastructure to a distributed environment—an instant shift could introduce security risks. For this reason, he recommended a ‘phase migration’ approach in which individual components are slowly migrated. For example, an insurance provider may shift their claims engine, policy engine and ratings engine to the cloud one at a time.
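To illustrate, here is a rough sketch of phase migration routing in the style of the strangler fig pattern, where each component flips from the mainframe to the cloud independently. The component names, flags and URLs below are invented for illustration.

```python
# Sketch of phase migration routing: each component moves independently,
# and callers are directed to whichever side currently owns it.
# All names, flags and URLs here are illustrative.
MIGRATED = {"claims": True, "policy": False, "ratings": False}

CLOUD_BASE = "https://api.cloud.example.com"    # hypothetical cloud services
MAINFRAME_BASE = "https://mf-gateway.internal"  # hypothetical mainframe API layer

def route(component: str, path: str) -> str:
    """Return the base URL for a component based on its migration status."""
    base = CLOUD_BASE if MIGRATED.get(component, False) else MAINFRAME_BASE
    return f"{base}/{component}{path}"

print(route("claims", "/submit"))  # claims engine already migrated -> cloud
print(route("policy", "/lookup"))  # policy engine still behind the mainframe layer
```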
As a result of phase migration, however, engineers must simultaneously support both on-premises and cloud-based applications. They must also ensure bidirectional communication so that these systems can exchange data. This is where a REST API transactional layer becomes necessary, Heublein explained. Such a layer can enable bidirectional communication between the mainframe and cloud-native applications, abstracting complexity when the mainframe calls out to the modern world and transforming inbound cloud data into a format mainframe technologies can process.
Abstracting complexity out of the mainframe integration process will be the key to building out connectivity and improving developer usability. This could involve a codeless development approach in which a COBOL subroutine calls external systems, Heublein said. Automatic restructuring of data will also be necessary to enable fluent bidirectional communication.
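As a rough sketch of what that restructuring involves, the example below flattens a JSON payload into the kind of fixed-width, padded record a COBOL copybook would describe. The field names, widths and PIC clauses are invented for illustration.

```python
# Sketch: flatten JSON from a cloud service into a fixed-width record
# matching an (invented) COBOL copybook layout.
def json_to_fixed_record(claim: dict) -> str:
    """Render a claim as a fixed-width record: alphanumeric fields are
    space-padded, the numeric field is zero-padded with no decimal point."""
    policy_no = str(claim["policy_no"]).ljust(10)[:10]  # PIC X(10)
    claimant = str(claim["claimant"]).ljust(30)[:30]    # PIC X(30)
    amount = f"{int(claim['amount_cents']):011d}"       # PIC 9(11), cents
    return policy_no + claimant + amount

record = json_to_fixed_record(
    {"policy_no": "POL-123", "claimant": "J. Smith", "amount_cents": 250000}
)
assert len(record) == 51  # 10 + 30 + 11 characters
```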
So, what is the end result of putting an integration layer on top of mainframe systems? According to Heublein, using such an abstraction layer yields four key business benefits.
One major concern is security. Opening up mainframe environments that house core financial infrastructure must be done with extreme care, especially as API attacks remain the number-one rising threat in the software landscape. Any organization building out new integrations will require a secure layer from an outbound standpoint. Heublein said this often involves placing an API management layer or gateway in front of the integration layer. This system handles authorization and functions like a firewall between the internal and external worlds. It can also enable logging and help prevent denial-of-service attacks, he explained.
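A bare-bones sketch of those gateway responsibilities (token-based authorization, logging and a simple rate limit to blunt denial-of-service attempts) might look like the following. The token set and limits are placeholders; a production gateway would delegate to a real identity provider and a dedicated API management product.

```python
# Sketch of gateway checks in front of the integration layer: authorization,
# logging and a naive per-client rate limit. Values are placeholders.
import logging
import time
from collections import defaultdict

logging.basicConfig(level=logging.INFO)
VALID_TOKENS = {"demo-token"}  # stand-in for a real identity provider
RATE_LIMIT = 100               # requests per client per minute (illustrative)
_history = defaultdict(list)   # client_id -> recent request timestamps

def gateway_check(client_id: str, token: str) -> bool:
    """Admit a request only if it is authorized and under the rate limit."""
    if token not in VALID_TOKENS:
        logging.warning("rejected %s: invalid token", client_id)
        return False
    now = time.time()
    recent = [t for t in _history[client_id] if now - t < 60]
    if len(recent) >= RATE_LIMIT:
        logging.warning("rejected %s: rate limit exceeded", client_id)
        return False
    recent.append(now)
    _history[client_id] = recent
    logging.info("admitted %s", client_id)
    return True

print(gateway_check("partner-app", "demo-token"))  # True
print(gateway_check("partner-app", "bad-token"))   # False
```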
Legacy systems are entrenched in older technology and carry lots of technical debt, forcing operations to move at a slower pace. Maintainers are also culturally a bit different from the newer breed of cloud-native engineers. “Mainframe people are not … fast,” said Heublein.
DevOps folks want to build, test and promote projects to production very quickly and, arguably, this agility is necessary to satisfy rising digital innovation demands. Improving integration capabilities for on-premises systems could enable businesses to better utilize preexisting investments while also improving agility.
Low-code/no-code capabilities could also increase collaboration potential, Heublein added, which could bridge the gap between on-premises engineers and fast-moving business folks. When adopting low-code/no-code tools for mainframe integration, he cautioned against general-purpose citizen developer frameworks. Whereas universal integration layers can work well for many situations, he stressed that a very specific boundary box is best for building APIs for mission-critical applications.
How is your organization approaching modernization for legacy systems? Let us know in the comments below!