This post on cognitive computing is an update of an article that originally appeared on the Dancing Dinosaur blog.
You probably think of the z as a never-fail OLTP workhorse with rock-solid security. Or maybe you think of it as a scalable, fast analytics machine able to process and analyze massive amounts of data using native Hadoop and Spark on the z, without the delay and overhead of additional ETL.
Now it is time to also think of the z as a cognitive computing workhorse, able to process complex cognitive problems natively on the box and to participate in machine-to-machine learning. The mainframe is emerging as a cognitive machine, and IBM is making its cognitive capabilities available on premises only on the z System. Every other platform has to access IBM's cognitive capabilities in the cloud.
Based on my interview with Donna Dillenberger, IBM Fellow, IBM Enterprise Solutions, there are three ways to get IBM cognitive computing solutions: the IBM Cloud, Watson, or the z System. The z, however, is the only platform she noted that IBM supports for cognitive computing on premises (sorry, no Power). The z may represent the peak of programmatic computing.
As IBM noted, the future lies in cognitive computing. Cognitive apparently has become the company’s latest strategic imperative, seeming to trump its previous strategic imperatives: cloud, analytics, Big Data, and mobile. As I noted in a previous piece, maybe only security, which quietly slipped in as a strategic imperative sometime in 2016, can rival cognitive, at least for now.
IBM describes itself as a cognitive solutions and cloud platform company. So, why does the z get to run cognitive on premises? Dillenberger’s answer: the bulk of an enterprise’s critical data resides on the z. Running cognitive on the z puts the processing adjacent to the data to ensure the fastest, lowest latency results.
Need for Cognitive Computing
You need cognitive computing, insists Dillenberger. It is the only way to move beyond the constraints of programmatic computing. Cognitive can take you past keyword-based searches, which merely return a list of places where an answer might reside, to an intuitive, conversational way to discover a set of confidence-ranked possibilities.
Getting to cognitive computing should be straightforward for a z shop. You don't program a cognitive system, notes Dillenberger. At most, you train it, and even then the cognitive system will do the heavy lifting itself. "Just use what the cognitive system thinks is best," she adds.
Dillenberger's conversations with early adopters suggest a two-fold payback: the value of the revealed insights, plus a quick technical win for z data centers that run analytics locally. In such cases you can realize up to 3x the performance, Dillenberger reported. Even when pulling data from other locations, you still run 2x faster.
When the data and IBM's cognitive system both reside on the z, you save big. "ETL consumes huge amounts of MIPS. When the client did it all on the z, it completely avoided the costly ETL process," Dillenberger noted. The client reported savings of $7-8 million a year by completely bypassing the x86 layer and ETL and running Spark natively on the z.
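To make the pattern concrete, here is a minimal Python sketch of the idea, not IBM code and not Spark itself: it contrasts an ETL-style pipeline, which copies every record out to a staging area before analyzing it, with in-place analytics, which applies the same transform while streaming over the source and never materializes the intermediate copy. The function names and the doubling transform are illustrative assumptions.

```python
# Conceptual sketch only (not IBM's implementation). It contrasts the
# ETL pattern (extract a full copy, then analyze) with the in-place
# pattern Dillenberger describes: analyze the data where it lives.

def etl_then_analyze(source):
    """ETL style: materialize a transformed copy of every record
    (the extract/transform/load step), then aggregate the staged copy."""
    staged = [record * 2 for record in source]  # full copy held in staging
    return sum(staged)

def analyze_in_place(source):
    """In-place style: apply the same transform while streaming over
    the source, so no intermediate copy is ever built."""
    return sum(record * 2 for record in source)

if __name__ == "__main__":
    data = range(1_000_000)
    # Both paths compute the same answer; only the data movement differs.
    assert etl_then_analyze(data) == analyze_in_place(data)
```

The two functions return identical results; what the z shops in the article eliminated is the staging copy and the cycles spent building it.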
As Dillenberger observed, cognitive computing on the z is here now, able to deliver a payback fast, and an even bigger payback going forward as you execute on the insights it reveals. And, you already have the z, the only on-premises way to run IBM’s Cognitive System.
Download our latest eBook, “Mainframe Meets Machine Learning,” to learn about the most difficult challenges and issues facing mainframes today, and how the benefits of machine learning could help alleviate some of these issues.