automation: modernizing the mainframe for devops · 2018-09-04 · industry standard, open-source...

AUTOMATION: Modernizing the Mainframe for DevOps By Don Macvittie



AUTOMATION: Modernizing the Mainframe for DevOps

By Don Macvittie


Industry standard, open-source tooling for cross-enterprise DevOps #CABrightside

With only 7 percent of mainframe developers under 30 [1], organizations face a steep skills gap maintaining the system of record that hosts 70 percent of corporate data [2] while driving personalized, data-driven digital experiences across mobile, cloud and distributed platforms.

Empower developers and modernize your mainframe with CA Brightside, the first cross-enterprise DevOps solution designed for teams to control, script and develop for the mainframe like any other cloud platform.

Learn more by visiting ca.com/brightside

Copyright © 2018 CA. All rights reserved. CS200-388494_0818

[1] Gary Beach, "Making Mainframes Cool Again," Wall Street Journal CIO Journal, November 2017, https://blogs.wsj.com/cio/2017/11/14/making-mainframes-cool-again/

[2] Ray Overby, "Most Businesses Overlook One Common Mainframe Security Vulnerability," Infosecurity Magazine, https://www.infosecurity-magazine.com/opinions/overlook-common-mainframe


Most of us have always lived in a world where mainframes did the bulk of the data processing. Introduced for commercial use in the 1950s, mainframes have seemingly always been around to do the heavy lifting. Even IBM's "new" z series is nearly two decades old (though, of course, the technology under the name has changed in that time).

The reasons why mainframes have stayed relevant, and even grown in use, are many, but an obvious one is simple: they work. If they didn't, mainframes today would be relegated to the scrap heap of history with so many other high-tech items; we'd have found the time and resources to replace them if they didn't do what we need.

What we continue to need is reliable processing for high volumes of data. In an IT shop where a mainframe sits, it is almost always the system of record, with good reason: Mainframes are more stable and reliable than other platforms. In just one example, IBM mainframes alone account for $7.7 trillion (USD) in credit card transactions a year.

The thing is, however, that the very tools and processes that make the mainframe so reliable can also cause problems when they bump up against the world of DevOps. And that is a problem: Modern client applications that require access to mainframe data and subsystems are being developed using DevOps and agile methodologies, making the mainframe the slowest link in the release chain.

We're going to explore some steps that will help make mainframe development better suited to the DevOps world. Our focus is on automation that might enable moving the mainframe fully into DevOps, but the point is to at least make it more responsive, so that the greater DevOps process spends less time waiting on mainframe updates than it does today.


DEVELOPMENT


Development has seen a world of change over time, and sometimes it feels (to outsiders) like the mainframe has not kept up. It has, and many shops are taking advantage of those changes, but many are not yet. If mainframe development is still happening on a green screen with separate version control and build/release cycles measured in months, it is time to look around and see where the world has moved.

INTEGRATION WITH CORPORATE SYSTEMS

Development is, generally speaking, development. Every language and platform combination has its differences, but they all share a large number of commonalities, too. One of the first steps to automation for a mainframe team is to move source code into a state-of-the-art version control system, or "source code repository," as we're calling them today. Most traditional mainframe source code management (SCM) tools do not support full build automation, but build automation is where the world has gone.

Increasingly, mainframe vendors are adding support for SCM tools such as Git so that mainframe code starts to look more like everyone else’s code from an automation perspective.

As an enabling technology, we recommend you consider one of these tools. Once you have connectors to use modern SCM on the mainframe, not only will newer developers be more comfortable working with mainframes, but automation efforts based on source and builds will be far easier.
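One practical hurdle for any Git bridge is naming: partitioned dataset (PDS) members have no directories or file extensions. A minimal sketch of the kind of mapping such a connector performs (the dataset names and extension convention here are illustrative, not from any specific product):

```python
def member_to_path(dataset: str, member: str, ext: str = ".cbl") -> str:
    """Map a PDS member reference like PROD.COBOL(PAYROLL) to a
    Git-friendly file path such as prod/cobol/PAYROLL.cbl.

    Dataset qualifiers become directories; the member becomes a file
    with a conventional extension so diff, blame and CI tooling treat
    it like any other source file.
    """
    qualifiers = dataset.lower().split(".")
    return "/".join(qualifiers) + "/" + member.upper() + ext


# Hypothetical examples of members headed for a repository
print(member_to_path("PROD.COBOL", "PAYROLL"))          # → prod/cobol/PAYROLL.cbl
print(member_to_path("DEV.JCL", "BUILD01", ".jcl"))     # → dev/jcl/BUILD01.jcl
```

Once members map cleanly onto paths, every downstream automation step (commit hooks, CI triggers, code review) works the same way it does for any other codebase.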

DEV ENVIRONMENT ALLOCATION AUTOMATION

The ability to develop locally, and only after development and unit testing move those changes to the system of record, is relatively standard on every platform. In older mainframe environments, though, "locally" is actually a private remote area on the mainframe. Newer development tools will emulate mainframe functionality on a development machine well enough to develop and perform unit testing before committing changes, just like every other platform out there. The ability to spin up a dev environment brings the mainframe slightly into modern application development trends and is worth checking out if you are still actively developing on the mainframe.

BUILD AUTOMATION

Most DevOps shops are doing build automation with Jenkins or one of the other continuous integration (CI) tools. The goal is to validate the code on a regular basis and find errors early. Many DevOps shops are actually set up to perform a build whenever new code is checked in. Combined with the local development environments mentioned above, this makes for a powerful build/test system that allows development and unit testing in a simulated local environment, and then does a test build when the source is in its actual environment. This functionality is available on the mainframe once newer SCM and development environment allocation tools are in use.
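The check-in-triggered flow described above reduces to running ordered stages and stopping at the first failure. A toy sketch of that control flow (the stage names are illustrative; a real shop would delegate this orchestration to Jenkins or another CI server):

```python
from typing import Callable, List, Tuple


def run_pipeline(stages: List[Tuple[str, Callable[[], bool]]]) -> Tuple[bool, List[str]]:
    """Run named build stages in order, stopping at the first failure,
    the way a CI server does on each check-in."""
    log = []
    for name, stage in stages:
        ok = stage()
        log.append(f"{name}: {'ok' if ok else 'FAILED'}")
        if not ok:
            return False, log  # later stages never run
    return True, log


# Illustrative stages; real ones would call the SCM and build tools.
passed, log = run_pipeline([
    ("checkout", lambda: True),
    ("compile", lambda: True),
    ("unit-test", lambda: True),
])
```

The value is not the loop itself but the contract it enforces: no stage runs unless everything before it passed, which is what lets a failed mainframe compile stop a bad change before it reaches the test build.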


NEWER HARDWARE, NEWER BINARIES

It is true that some of the mainframe applications out there are 50 years old, and many are 20 or 30 years old. The compilers used to build them targeted a set of hardware that is just as old. Backward compatibility is pretty astounding when you consider the advances in computing technology that mainframes have adopted versus the age of the applications that must run on newer hardware. But newer hardware has features that older applications just cannot take advantage of. Often, older software is run in a constrained or emulated mode to make certain it doesn't break when a new mainframe is delivered. While nothing as complex as retargeting hardware is simple, there are tools available to help retarget applications.

GO TO THE SOURCE

As newer applications take advantage of the wealth of data, security and subsystems built on mainframe technology, some applications that were written long ago for a few dozen or a few hundred users are now seeing thousands of users (normally by proxy; one web server can represent thousands of users by itself). This can create performance issues. If an application was designed for low usage, cranking it up to high usage can result in unexpected slowdowns. This is an application design and usage growth problem, not a mainframe problem. But there are steps that can be taken to make those mainframe applications and subsystems more responsive, all based around updating which hardware the binaries target.

If a shop has older applications and newer hardware, rebuilding is always the best option. It isn’t simple—some changes require rewriting pieces of code—but new compilers are targeted at optimizing for new hardware, making the reworking of code worthwhile. And while you’re making adjustments to aim at new hardware, processing bottlenecks can be looked at also.

NO SOURCE? NO PROBLEM

Okay, not exactly "no problem," but how about, "We've got you covered"? Whether it is a purchased application that is no longer supported, a contracted application for which source was never provided, or an in-house application whose source has (for whatever reason) gone AWOL, the category of tools known as binary optimization can help. Some of the hardware changes can be handled at the binary level, simply changing the compiled code to take advantage of new hardware features by looking for patterns and updating them. If a module or application is performing poorly under the stress of new agile clients, the ability to run binary optimization on it and get better (sometimes significantly better) performance will, at a minimum, offer more time to decide how to address the performance gap. In the best scenarios, it will alleviate the need to do anything more.
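Conceptually, pattern-based binary optimization is a peephole rewrite: find instruction sequences the old compiler emitted and substitute equivalents suited to newer hardware. The toy below works on mnemonic strings rather than real machine code, purely to illustrate the scan-and-substitute idea; real tools operate on object code and verify semantics before rewriting anything:

```python
def peephole_rewrite(instrs, patterns):
    """Replace known instruction sequences with faster equivalents.

    instrs:   list of instructions (here, illustrative mnemonics)
    patterns: list of (old_sequence, new_sequence) pairs

    Scans left to right; at each position the first matching pattern
    wins, otherwise the instruction passes through unchanged.
    """
    out, i = [], 0
    while i < len(instrs):
        for old, new in patterns:
            if instrs[i:i + len(old)] == old:
                out.extend(new)
                i += len(old)
                break
        else:
            out.append(instrs[i])
            i += 1
    return out


# A load/add/store triple collapses to a single hypothetical INC
optimized = peephole_rewrite(
    ["LOAD A", "ADD 1", "STORE A", "NOP"],
    [(["LOAD A", "ADD 1", "STORE A"], ["INC A"])],
)
print(optimized)  # → ['INC A', 'NOP']
```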


Want to bring your legacy systems into the world of agile and DevOps?

Learn how to release digital services at least 50% faster by:

• Extending core (legacy) systems to the digital world using microservice-enabled APIs

• Automating manual API creation with one tool any developer can use

• Creating REST APIs in minutes and delivering new services in days or weeks

• Bypassing complex architecture layers using specialized software built for this purpose

OPEN INNOVATION

Leverage Legacy Systems at the Speed of DevOps

Get the White Paper

www.OpenLegacy.com/DevOps

Page 8: AUTOMATION: Modernizing the Mainframe for DevOps · 2018-09-04 · Industry standard, open-source tooling for cross-enterprise DevOps #CABrightside With only 7 percent of mainframe

TEST

INTEGRATION WITH CORPORATE SYSTEMS

Automated testing is a standard for DevOps teams. While automated testing is the standard for some mainframe teams also, it is rare that the two use integrated systems. However, if an organization is adapting mainframe development to its DevOps architecture, integration must happen.

Corporate project roll-up tools also need integration into a full DevOps environment. Reporting needs to be accurate and timely. Transcribing is the traditional solution to this disconnect, but automated processes that can update project management tools or portfolio management tools are essential to end-to-end DevOps success. Transcription is more error-prone and slower to update than an automated system, and the overall DevOps architecture should not be slowed by mainframe status updates.

UNIT TEST AUTOMATION

Unit tests serve two primary purposes in a DevOps environment: to enable the developer to validate that a given change behaves as expected and didn't change surrounding code behavior, and to "shift left" and catch unwanted behavioral changes through automated testing at check-in.

Unit tests are nothing new, but deep integration into test environments—and, by extension, DevOps—is. Certainly, the DevOps portion is new in the mainframe world. Mainframe development and test tool vendors have been integrating their software with continuous test tools, but make certain you have a plan to get unit test automation on the mainframe into the overall DevOps process.
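As a concrete example of the kind of unit test worth automating at check-in, here is a small decoder for IBM's packed-decimal (COMP-3) format with tests attached. The field layout (two digits per byte, sign in the final nibble) is the standard one, but the function itself is an illustrative sketch, not any vendor's code:

```python
import unittest


def unpack_comp3(data: bytes) -> int:
    """Decode an IBM COMP-3 (packed decimal) field into a Python int.

    Each byte holds two 4-bit digits; the low nibble of the last byte
    is the sign (0xD means negative, 0xC or 0xF non-negative).
    """
    digits = []
    for b in data:
        digits.append(b >> 4)
        digits.append(b & 0x0F)
    sign = digits.pop()  # low nibble of the last byte is the sign
    value = int("".join(str(d) for d in digits))
    return -value if sign == 0x0D else value


class TestUnpackComp3(unittest.TestCase):
    def test_positive(self):
        # 12345 packed as 0x12 0x34 0x5C
        self.assertEqual(unpack_comp3(bytes([0x12, 0x34, 0x5C])), 12345)

    def test_negative(self):
        # -987 packed as 0x98 0x7D
        self.assertEqual(unpack_comp3(bytes([0x98, 0x7D])), -987)

# Run with: python -m unittest <module>
```

Tests like these run in seconds on a developer workstation, which is exactly what makes the shift-left model viable for mainframe data handling code.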

TEST ENVIRONMENT AUTOMATION

Generating test environments is a problem that all DevOps environments have to tackle. The truth is, the test environment for application X is unlikely to be a clone of the test environment for application Y, and might be completely different; it will be completely different if the applications target different platforms.

Including the mainframe (or any new platform) as part of overall test environment automation can create unique problems. Tools are available to automate the mainframe part of the process, and there are plug-ins to integrate the mainframe part of the process with the overall DevOps toolchain.

An added benefit of having these tools is developer self-service. The ability to spin up a test environment and test changes is difficult in the mainframe environment, but tools to automate the process make it far easier and allow for more developers to access test environments at the same time.
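Self-service only works if environment allocation is tracked, so two developers don't land on the same test region at once. A minimal in-memory sketch of that bookkeeping (the environment names and pool size are invented for illustration; real tools persist this state and provision on demand):

```python
class EnvPool:
    """Track a fixed pool of test environments for developer self-service."""

    def __init__(self, names):
        self.free = list(names)
        self.in_use = {}  # env name -> owner

    def acquire(self, owner):
        """Hand the next free environment to a developer, or None if exhausted."""
        if not self.free:
            return None  # pool exhausted; caller must wait or request capacity
        env = self.free.pop(0)
        self.in_use[env] = owner
        return env

    def release(self, env):
        """Return an environment to the pool when testing is done."""
        if self.in_use.pop(env, None) is not None:
            self.free.append(env)


pool = EnvPool(["TESTA", "TESTB"])
env = pool.acquire("dev1")   # → "TESTA"
pool.release(env)            # back in the pool for the next developer
```

Exposing the same acquire/release operations through the DevOps toolchain is what turns scarce mainframe test capacity into a self-service resource.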


DEPLOY


In many ways the mainframe is ahead of most of IT in the deployment department, because the overall architecture is centralized. But in many other ways it is not, because the mainframe is purposely compartmentalized. In a nutshell, this means mainframe deployments aren't any more or less complex than other deployments, just different.

When building an end-to-end DevOps environment, it will be necessary to include the mainframe. Make certain any DevOps deployment tools chosen include plug-ins that meet your mainframe deployment requirements.

INTEGRATIONS WITH CORPORATE SYSTEMS

Tools to deploy applications into the runtime environment will need to work with corporate systems such as application release automation and continuous deployment. While deployment tools for the mainframe have always been around, a higher level of automation and integration is required in a DevOps environment. Check corporate toolsets for integration points and, if necessary, consider using tool APIs to perform an in-house integration.

APPLICATION DEPENDENCY EVALUATION

One of the problems with mainframe applications has been determining what is impacted by a given deployment. Mainframes host large, complex systems and applications, and for applications that have been around for a while, it can be painful to evaluate all of the involved subsystems to trace dependencies.

Automated tools for dependency evaluation have come a long way, and using them benefits not just deployment planning but also new developer training. Automated dependency tools make it easier to determine which subsystems provide which functionality and which databases the larger application requires to run, and this knowledge can feed a corporate DevOps initiative.
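At its simplest, static dependency evaluation scans source for call sites and builds a graph. The sketch below handles only literal COBOL CALL statements; real tools also resolve dynamic calls, copybooks, JCL job steps and database plan references:

```python
import re

# Matches static COBOL calls of the form: CALL 'MODNAME'
CALL_RE = re.compile(r"CALL\s+'([A-Z0-9]+)'", re.IGNORECASE)


def call_graph(sources):
    """Build a program -> called-modules map from COBOL source text.

    sources: dict mapping program name to its source as a string.
    Returns a dict mapping each program to a sorted list of the
    modules it calls statically.
    """
    graph = {}
    for name, text in sources.items():
        graph[name] = sorted(set(CALL_RE.findall(text)))
    return graph


# Illustrative source snippet, not a real program
sources = {"PAYROLL": "CALL 'TAXCALC'.\n    CALL 'AUDITLOG'.\n    CALL 'TAXCALC'."}
print(call_graph(sources))  # → {'PAYROLL': ['AUDITLOG', 'TAXCALC']}
```

Even a graph this crude answers the deployment question ("what calls the module I'm changing?") and doubles as a map for developers new to the codebase.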

DEPLOY ENVIRONMENT AUTOMATION

Just as automation of test environments is important to rapid testing and consistent environment creation, deploy environment automation (often with the exact same automation toolset) allows continuous deployment methodologies to be extended to the mainframe.

The ability to define an environment and create/configure/deploy within that environment with the push of a button brings the unique portions of mainframe deployment into line with the automation available on other platforms.

It also allows for monitoring of available deployment resources in a manner that can integrate into the overall DevOps landscape. Knowing what is available and how fast it can be allocated enables DevOps inclusion in the future while allowing more informed resource management today.


OTHER


The growth of mainframe usage, combined with a shrinking pool of specialized mainframe staff, has put an emphasis on onboarding tools and ways to make the mainframe more accessible. Eclipse-based development tools are increasingly the norm, along with other tools to help bring new developers to the mainframe.

ONBOARDING/SOURCE ANALYSIS

The available tools for source code analysis have come a long way. It is now possible to diagram all of the interactions and subsystems of a mainframe program with an automated toolset. The resulting analysis can be examined in a variety of formats, including graphically, and gives developers an overall view of the system they are working on, along with the impacts to other systems.

Any tool that makes it easier to understand the overall software architecture and the specific interactions of the system being worked on is worth the time. Developers new to the environment and/or mainframe programming can use these tools to get up to speed faster and introduce fewer inadvertent errors caused by subsystem interdependencies.

API GENERATION

Modern API generation toolsets allow for the exposure of entry points in many mainframe subsystems, from CICS to DB2 and IMS. By allowing access to the data on the system of record, these tools make the life of a client/server developer easier and open the mainframe to the DevOps environment by putting the APIs into the DevOps stream. Automated generation makes this access far easier to create than earlier approaches allowed.

Most of these tools utilize a Java-based application server to access mainframe data or entry points via CRUD APIs. This application server and the resulting generated API code are then part of the DevOps process. While this does not bring the mainframe itself into the DevOps world (code changes on the mainframe still follow the existing process), it does bring mainframe data and entry points into the DevOps world.

In the microservices version of API generation, a single API (or a small group of APIs) is generated at a granular level to run independently of other APIs. This allows more flexibility in reusing and integrating just the calls and data that a client application needs. As an easy example, an API that exposes a given DB2 table in a standalone microservice can simply be included in each new project that needs the same table's data.
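Stripped of the web-framework wiring, a generated table microservice reduces to CRUD operations over one table. A framework-agnostic sketch, with an in-memory dict standing in for the DB2 table a real generator would bind to (the key and column names are illustrative):

```python
class TableService:
    """Minimal CRUD service over one table, in the spirit of generated
    microservice APIs. A real generated service would map these four
    operations onto REST verbs and a database connection."""

    def __init__(self):
        self.rows = {}  # primary key -> row dict

    def create(self, key, row):
        if key in self.rows:
            raise KeyError(f"{key} already exists")
        self.rows[key] = dict(row)
        return self.rows[key]

    def read(self, key):
        return self.rows.get(key)  # None if not found

    def update(self, key, changes):
        self.rows[key].update(changes)
        return self.rows[key]

    def delete(self, key):
        return self.rows.pop(key, None) is not None


svc = TableService()
svc.create("42", {"name": "ACME", "region": "EAST"})
svc.update("42", {"region": "WEST"})
```

Because each generated service owns exactly one table's operations, a new client project pulls in only the services for the data it actually touches.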

LINUX

It is old hat to run a Linux partition on the mainframe, and doing so offers the best of both worlds. Allowing modern toolsets to work on one partition while the rest of the mainframe chugs along allows for modernization in place. It also offers an integration point for bringing mainframe subsystems and data into corporate DevOps processes. Having a Linux partition that runs web and app servers that access and expose mainframe resources is a huge step toward modernization.


SUMMARY

Far from being the "old" system that is living in maintenance mode and waiting to die, mainframes are being brought into a more modern development environment through these and other tools. This has the three-fold advantage of making the mainframe more accessible to new developers, automating processes that are generally considered slow, and avoiding the prohibitive expense of trying to replace complex systems on a different platform.

But a fair number of mainframe environments have not yet started taking steps to bring their mainframes into the modern corporate development/DevOps environment. The mainframe is a processing powerhouse. Any steps that make that power more accessible to the broader IT environment are a good idea, and there are plenty of tools available to help. The mainframe and its software are resources you already have (and have paid for), so take steps to get the most out of that investment.

Don Macvittie is a 20-year veteran leading a new technology consulting firm focused on the dev side of DevOps, cloud, security and application development.


http://www.devops.com
https://twitter.com/devopsdotcom
https://www.facebook.com/devopscom