
5 of the Hottest IT Trends (and How Mainframes Play a Role)

Christopher Tozzi | June 19, 2020

Mainframes were developed long before anyone was thinking about the trends and technologies that define modern computing. Concepts like open source software were decades away from emerging when the first mainframes were deployed in the 1950s.

Other modern IT trends and technologies, such as Docker, continuous delivery, and DevOps, did not emerge until more than a half-century after mainframes came into use.


Mainframes and modern tech

But mainframes have more in common with today’s IT technologies and methodologies than you might think. Consider the following trends and platforms that are popular today and their connections to mainframes:

  • Open source. Black Duck says that the open source software model, under which software source code is freely shared, is now the “default” approach to software development at a majority of organizations. While open source’s massive popularity is a relatively recent phenomenon, open source platforms have been common on mainframes for years. Linux has long been popular on mainframes, and other open source projects, like the Hercules emulator, allow traditional mainframe operating systems to run on commodity hardware.
  • Big data. Big data analytics has become a key part of the way many organizations do business, and it was only with the emergence of platforms like Hadoop and Spark that many companies began thinking about big data in a modern way. But mainframes were collecting large amounts of data long before “big data” became a buzzword. (And vendors like Precisely have long offered data integration solutions for moving mainframe data to modern analytics platforms; the first sketch after this list shows what that hand-off involves.)
  • Cloud computing. Terms like cloud computing and Software-as-a-Service became popular only about a decade ago. But in a sense, mainframes were being used to build clouds decades ago – and they are still an important part of the infrastructure that makes private clouds possible at many organizations. That’s because one of the main purposes of mainframes is to collect data and run applications that users access remotely – the core idea behind the cloud computing model.
  • Docker containers. Docker containers, which allow system administrators to run applications inside isolated environments, are revolutionizing the way applications are hosted and deployed. Running Docker on mainframes gets little discussion, but you can and should: Docker works on any Linux-based operating system, including mainframe variants like Linux on z (see the second sketch after this list).
  • Continuous delivery. Most programmers today adhere to the idea that software works best when changes are written, tested, and deployed on a continuous basis, a practice known as continuous delivery. As a relatively new concept, continuous delivery may not seem like something that meshes with mainframes. But it does: mainframe code can be versioned, built, and deployed through the same kinds of pipelines used for distributed applications.
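
To make that mainframe-to-analytics hand-off concrete, here is a minimal Python sketch that converts a fixed-width, EBCDIC-encoded mainframe extract into CSV for ingestion by platforms like Hadoop or Spark. The file name and record layout here are hypothetical, and real extracts, defined by COBOL copybooks and often containing packed-decimal fields, are considerably more involved; that complexity is exactly what dedicated integration tools handle.

```python
import csv

# Hypothetical fixed-width record layout: (field name, offset, length).
# Real layouts come from COBOL copybooks and often include packed-decimal
# (COMP-3) fields, which need special unpacking not shown here.
LAYOUT = [("account_id", 0, 10), ("customer_name", 10, 30), ("balance", 40, 12)]
RECORD_LENGTH = 52

def ebcdic_to_csv(ebcdic_path: str, csv_path: str) -> None:
    """Decode fixed-width EBCDIC records (code page 037) and write CSV."""
    with open(ebcdic_path, "rb") as src, open(csv_path, "w", newline="") as dst:
        writer = csv.writer(dst)
        writer.writerow([name for name, _, _ in LAYOUT])
        while True:
            record = src.read(RECORD_LENGTH)
            if len(record) < RECORD_LENGTH:
                break  # end of file (or a truncated trailing record)
            text = record.decode("cp037")  # cp037 is IBM EBCDIC (US/Canada)
            writer.writerow([text[start:start + length].strip()
                             for _, start, length in LAYOUT])

if __name__ == "__main__":
    ebcdic_to_csv("accounts.dat", "accounts.csv")
```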
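
And to illustrate the Docker point: the Docker engine exposes the same API on every architecture, so tooling written against it on x86 hardware works unchanged on a mainframe. This sketch uses the Docker SDK for Python (the `docker` package on PyPI) and assumes the engine is already running on a Linux on z host; the `alpine` image is just an example of a multi-architecture image that resolves to its s390x build automatically.

```python
import docker  # pip install docker

# Connect to the local Docker engine through its default socket.
client = docker.from_env()

# Report the engine's architecture; on Linux on z this is "s390x".
print("Engine architecture:", client.info().get("Architecture"))

# Run a throwaway container and capture its output. The same two lines
# work identically on an x86 laptop and a mainframe LPAR.
output = client.containers.run("alpine", ["uname", "-m"], remove=True)
print("Container reports:", output.decode().strip())
```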


So, there you have it. If you thought your mainframes were good only for sitting in dusty corners while IT innovation happens elsewhere, think again. The trends that define modern computing today apply to mainframes as much as they do to commodity infrastructure – and, in many cases, mainframes were the places where innovations like open source and the cloud established roots before they went mainstream.

For a seamless experience integrating mainframe data with the rest of your infrastructure, check out Precisely Connect.

Read our white paper “Why Legacy and Traditional Data is a Goldmine for AI and Analytics” to learn about the benefits of incorporating legacy data in your analytics, AI and machine learning initiatives.