Mainframe Skills + Hadoop Talent = Job Opportunity and Megabucks

Worried About the Squeeze in Your Mainframe Career? Try Hadoop on for Size

The reports of the death of the mainframe have been greatly exaggerated. Over 20 years ago, “experts” began to write obituaries for Big Iron, yet still ‒ after the dawn of cloud computing, the advent of many new programming languages, and the rise of Big Data and analytics ‒ the mainframe keeps chugging along. It handles some 60 to 80 percent of the world’s most critical transactions (depending on whom you ask), amounting to about 30 billion transactions every day.

Mainframe + Big Data = The Holy Grail of IT

If you have skills in both the mainframe and Hadoop/Big Data, you’re like the Holy Grail of the IT world.

The mainframe offers unprecedented reliability and unparalleled security. There is COBOL code on mainframes that has been running for 40-plus years with no more than routine maintenance, most of it never so much as sniffed at by an intruder, let alone hacked. Suffice it to say, with about 70 percent of all Fortune 500 companies heavily invested in z Systems mainframes, Big Iron ain’t going anywhere anytime soon.

At the same time, however, Big Data is becoming vitally important to the business. Mountains of valuable unstructured data can’t be properly digested by the mainframe, meaning lots of mainframe data needs to be offloaded to a system like Hadoop for analytical purposes. This is essential for Big Data operations to support critical business intelligence, marketing campaigns, supply chain management and more.

Where does that leave you, the career mainframer? Well, you’re in a unique position to become one of the most valuable employees on planet Earth. Mainframe talent is becoming notably difficult to recruit, as colleges have stopped teaching COBOL and other mainframe skills in favor of more modern languages. Rarer still are specialists who have both mainframe and Hadoop knowledge. That isn’t just rare ‒ that’s the Holy Grail of the IT universe.

ETL Process: Extract, Transform and Load

Syncsort’s DMX-h makes it much simpler to offload mainframe data into Hadoop. But businesses still need experts to navigate the trickier waters of mainframe-to-Hadoop integration.
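
To make the “transform” in ETL concrete, here is a minimal sketch, in plain Python rather than DMX-h, of the kind of conversion an offload job has to perform: decoding fixed-width EBCDIC records, including a COMP-3 (packed decimal) field, into a CSV that Hadoop tools can ingest. The file names and record layout are hypothetical stand-ins for what a real COBOL copybook would define.

```python
import codecs
import csv

# Hypothetical 30-byte fixed-width mainframe record layout, for
# illustration only; a real layout comes from the COBOL copybook.
ACCT_ID = slice(0, 10)    # PIC X(10), EBCDIC text
NAME = slice(10, 25)      # PIC X(15), EBCDIC text
AMOUNT = slice(25, 30)    # PIC S9(7)V99 COMP-3, 5 packed bytes

RECLEN = 30

def unpack_comp3(raw: bytes, scale: int = 2) -> float:
    """Decode an IBM packed-decimal (COMP-3) field.

    Each byte holds two decimal nibbles; the final nibble is the
    sign (0xD = negative; anything else is treated as positive here).
    """
    digits = []
    for b in raw:
        digits.append(b >> 4)
        digits.append(b & 0x0F)
    sign = -1 if digits[-1] == 0x0D else 1
    value = int("".join(str(d) for d in digits[:-1]))
    return sign * value / (10 ** scale)

def ebcdic_to_str(raw: bytes) -> str:
    # cp037 is the common US/Canada EBCDIC code page.
    return codecs.decode(raw, "cp037").strip()

with open("accounts.dat", "rb") as src, \
        open("accounts.csv", "w", newline="") as dst:
    writer = csv.writer(dst)
    writer.writerow(["acct_id", "name", "amount"])
    while (rec := src.read(RECLEN)):
        if len(rec) < RECLEN:  # ignore a trailing partial record
            break
        writer.writerow([
            ebcdic_to_str(rec[ACCT_ID]),
            ebcdic_to_str(rec[NAME]),
            unpack_comp3(rec[AMOUNT]),
        ])
```

Even this toy example hints at why specialists are in demand: someone has to know that the bytes are EBCDIC, that the amount is packed decimal with an implied decimal point, and what the copybook actually says.
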

If you’d like to take your $70,000 to $80,000 per-year salary up to $100,000 to $200,000 per year, here are some skills that top companies are screaming for these days:

  • ETL ‒ Accessing and integrating mainframe data into Hadoop is not an easy task. Syncsort’s DMX-h greatly simplifies the process, but businesses still need specialists with talent in both mainframes and Hadoop to make things go well.
  • Security ‒ While the mainframe is inherently as secure as any machine on the planet, Hadoop is not. Out of the box, the Hadoop ecosystem is notoriously insecure: authentication, encryption and fine-grained access control all have to be configured deliberately. Businesses desperately need experts with security skills when offloading Big Iron into Big Data.
  • Mixed shop ‒ Most businesses are not in a position to put the ax to their mainframe operations in order to take on Hadoop and Big Data. Their only realistic move is to access and integrate their mainframe data into Hadoop, Spark, streaming platforms and the like for analytics (a minimal sketch follows this list), allowing Big Iron to keep munching away at their mission-critical transactions. A player in this arena who can maneuver seamlessly between Hadoop and the mainframe is the gladiator who takes the wooden sword ‒ the Roman symbol of a champion who has earned freedom.
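
As a mixed-shop illustration, here is a minimal PySpark sketch that picks up where the offloaded data lands in HDFS and runs a simple aggregation. The HDFS path and column names are hypothetical, carried over from the offload sketch above.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Hypothetical HDFS location where the offloaded mainframe data lands.
OFFLOAD_PATH = "hdfs:///data/offload/accounts.csv"

spark = SparkSession.builder.appName("mainframe-offload-analytics").getOrCreate()

# Read the CSV produced by the offload job.
df = spark.read.csv(OFFLOAD_PATH, header=True, inferSchema=True)

# A simple piece of business intelligence the transactional mainframe
# would never be asked to do: total amounts per account, largest first.
(df.groupBy("acct_id")
   .agg(F.sum("amount").alias("total_amount"))
   .orderBy(F.desc("total_amount"))
   .show(10))

spark.stop()
```

Schema inference and show() are fine for a sketch; a production job would declare the schema from the copybook and write Parquet for downstream tools.
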

If you’re ready to start accessing and integrating all that Big Iron data into Big Data platforms like Hadoop, visit Syncsort to see our Big Data solutions.

Authored by Christy Wilson

Syncsort contributor Christy Wilson began writing for the technology sector in 2011, and has published hundreds of articles related to cloud computing, big data analysis, and related tech topics. Her passion is seeing the fruits of big data analysis realized in practical solutions that benefit businesses, consumers, and society as a whole.