Mainframe is not on most people’s lists of the hottest words in tech, and mainframes may seem disconnected from modern IT trends. Yet the latest practices and innovations are being applied to mainframes. Here’s how.
Mainframes were developed long before anyone was thinking about the trends and technologies that define modern computing. Concepts like open source software were decades away from emerging when the first mainframes were deployed in the 1950s.
Other modern IT trends and technologies, such as Docker, continuous delivery, and DevOps, did not emerge until more than a half-century after mainframes came into use.
Mainframes and IT Trends
But mainframes have more in common with today’s IT technologies and methodologies than you might think. Consider the following trends and platforms that are popular today and their connections to mainframes:
1. Open source
Black Duck says that the open source software model – under which software source code is freely shared – is now the “default” approach to software development at a majority of organizations. While open source’s massive popularity is a relatively recent phenomenon, open source platforms have been common on mainframes for years. Linux has long run on mainframe hardware, and other open source projects, like Hercules, emulate mainframe hardware so that traditional mainframe operating systems can run on commodity machines.
2. Big Data
Big data analytics have become a key part of the way many organizations do business. It was only with the emergence of platforms like Hadoop and Spark that many companies began thinking about big data in a modern way.
But mainframes were being used to collect large amounts of data long before “Big Data” became a thing. (And vendors like Syncsort have long offered data integration solutions for accessing and integrating mainframe data with Hadoop data lakes and modern analytics platforms like Splunk.)
3. Cloud computing
Terms like cloud computing and Software-as-a-Service became popular only about a decade ago. But in a sense, mainframes were being used to build clouds decades ago – and they are still an important part of the infrastructure that makes private clouds possible at many organizations. That’s because one of the main purposes of mainframes is to collect data and run applications that users access remotely – the core idea behind the cloud computing model.
4. Docker containers
Docker containers, which allow system administrators to run applications inside isolated containers, are revolutionizing the way applications are hosted and deployed. While there has been little discussion of running Docker on mainframes, you can – and should. Docker runs on any Linux-based operating system, including mainframe variants like Linux on z.
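To make that concrete, here is a minimal sketch of a Dockerfile that could be built and run on a Linux on z host. The base image and installed package are illustrative assumptions – any image built for the s390x (IBM Z) architecture would serve.

```dockerfile
# Hypothetical sketch: a container image for a Linux on z (s390x) host.
# The base image name and package are illustrative, not prescriptive.
FROM s390x/ubuntu:latest

# Install whatever tooling the containerized application needs.
RUN apt-get update && apt-get install -y curl

# Print the machine architecture; on a mainframe host this reports s390x.
CMD ["uname", "-m"]
```

The point is that the workflow – write a Dockerfile, build, run – is identical to Docker on commodity x86 hardware; only the image architecture differs.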
5. Continuous delivery
Most programmers today adhere to the idea that software works best when changes are written, tested and deployed on a continuous basis. This practice is known as continuous delivery. As a relatively new concept, continuous delivery may not seem like something that meshes with mainframes. But it does, as I’ve written on this site previously.
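As a sketch of what continuous delivery can look like for mainframe code, consider a pipeline definition in the style of GitLab CI. The stage names, scripts, and runner tag below are all hypothetical assumptions, not a prescribed setup:

```yaml
# Hypothetical CI/CD pipeline sketch for a mainframe application.
# Stage names, script paths, and the "zlinux" runner tag are illustrative.
stages:
  - build
  - test
  - deploy

build:
  stage: build
  tags: [zlinux]              # assumes a CI runner on a Linux on z system
  script:
    - ./scripts/compile.sh    # compile the application

test:
  stage: test
  tags: [zlinux]
  script:
    - ./scripts/run-tests.sh  # run the automated test suite

deploy:
  stage: deploy
  tags: [zlinux]
  when: manual                # gate production deploys behind an approval
  script:
    - ./scripts/deploy.sh     # push the build to the target environment
```

The mechanics – build, test, and deploy triggered on every change – are the same as on any other platform; only the target environment is a mainframe.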
So, there you have it. If you thought your mainframes were good only for sitting in dusty corners while IT innovation happens elsewhere, think again. The trends that define modern computing today apply to mainframes as much as they do to commodity infrastructure – and, in many cases, mainframes were the places where innovations like open source and the cloud established roots before they went mainstream.
Learn more about how mainframes and Big Data trends are merging to create new opportunities. Download the Bloor Spotlight: Big Data and The Mainframe, Issues and Opportunities today!