Syncsort CEO Josh Rogers talks future of Data Management on theCUBE

Video: CEO Josh Rogers on What’s Next for Syncsort and the Future of Data Integration, Data Quality & Data Management

Syncsort CEO Josh Rogers recently sat down with theCUBE co-hosts John Furrier and George Gilbert to talk about the company’s new path after acquiring Trillium Software and what that means for its future plans around data integration, data quality and data management.

In a nutshell, the secret to Syncsort’s success is applying its long-standing technical expertise in mainframe and legacy systems to solving data management challenges in both traditional and next-generation Big Data environments.

Benefits of Syncsort’s Data Integration Approach

Co-host Furrier compliments Rogers on Syncsort’s unique data integration strategy – “locking down mainframe and legacy systems and then transforming into a modern data company.” Furrier notes that this approach differs from that of Silicon Valley organizations, which tend to go after new, innovative ideas first and then find themselves up against the hurdle of incorporating legacy data.

When asked if this was “a genius move or accidental,” Rogers confirms that it was part of the organization’s plan to manage data at scale and help customers process that data to get more out of their business analytics. He further explains that Syncsort’s competitive advantage is in its unique ability to apply its mainframe and legacy systems expertise and technical talent to solving data integration challenges across both traditional and next-gen Big Data environments.

Related: Tendü Yogurtcu Talks Strata Trends and More during theCUBE #BigDataSV Appearance

Rogers adds that legacy systems aren’t going away; organizations need expertise on both sides, which is why Syncsort has developed Big Data proficiency, applying the same disciplines used in legacy environments to new ones. Syncsort also bridges both sides through its Big Iron to Big Data solutions, which connect mainframe and legacy EDW data with Big Data analytics.

Trillium – the Gold Standard in Data Quality

When asked about Syncsort’s recent acquisition, Rogers emphasizes that Trillium Software is the gold standard in data quality solutions for large, complex global enterprises – a leader in Gartner’s Magic Quadrant for over a decade. He stresses how important it is for organizations to be able to understand issues with their data and then establish business rules that improve its quality, so they can trust its accuracy.
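
To make the business-rule idea concrete, here is a minimal, hypothetical sketch of profiling a dataset against simple quality rules. The sample records, rule names and regular expression are illustrative assumptions, not Trillium’s actual product API.

```python
import re

# Hypothetical example records; in practice these would come from a
# profiling pass over a customer table.
records = [
    {"id": 1, "email": "ana@example.com", "country": "US"},
    {"id": 2, "email": "not-an-email",    "country": ""},
]

# Simple business rules of the kind a data quality tool lets you define;
# each rule returns True when the record passes.
rules = {
    "email_is_well_formed": lambda r: re.match(r"[^@\s]+@[^@\s]+\.[^@\s]+$", r["email"]) is not None,
    "country_is_present":   lambda r: bool(r["country"].strip()),
}

# Profile the data: count failures per rule so issues can be quantified
# before deciding how to cleanse or standardize.
for name, rule in rules.items():
    failures = [r["id"] for r in records if not rule(r)]
    print(f"{name}: {len(failures)} failing record(s) {failures}")
```

Quantifying failures per rule is the profiling step Rogers describes: you first measure where the data breaks the rules, then decide how to cleanse it.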

Related: Trust the Data: Don’t Run Your Business on Alternative Facts

The process of building trust in data, he points out, is very relevant to the movement and transformation of data that Syncsort has a long-standing reputation for providing. “When you think about the development and maturity of Big Data environments like Hadoop – organizations want to do analytics and need to trust that data. The need for organizations to have the ability to apply profile and quality rules in that environment is an underserved market today.”

Rogers adds that Trillium is a respected firm with best-in-market capabilities in this space, capabilities the company will continue to provide and can now apply in next-generation environments.

Download Now: Bringing Big Data to Life - What the Experts Say

What’s New in the Data Management Landscape

When co-host Gilbert asks about areas beyond data quality, Rogers asserts that organizations are looking to metadata management to support a number of key business initiatives. Business leaders are reviewing different styles of data movement, such as Change Data Capture (CDC), for moving data from source systems to Hadoop; they also want to be able to move incremental changes on an ongoing basis at the speed of business.
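
As a simplified illustration of the incremental pattern CDC tooling automates, the sketch below extracts only rows changed since a saved watermark. The table, column names and values are invented for illustration; production CDC products typically read the database’s transaction log rather than querying a timestamp column.

```python
import sqlite3

# A toy source system: a table with a last_modified timestamp column.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE orders (id INTEGER, amount REAL, last_modified TEXT)")
src.executemany("INSERT INTO orders VALUES (?, ?, ?)", [
    (1, 10.0, "2017-03-01T09:00:00"),
    (2, 25.5, "2017-03-02T14:30:00"),
])

def extract_changes(conn, watermark):
    """Return rows modified after the watermark, plus the new watermark."""
    rows = conn.execute(
        "SELECT id, amount, last_modified FROM orders "
        "WHERE last_modified > ? ORDER BY last_modified",
        (watermark,),
    ).fetchall()
    new_watermark = rows[-1][2] if rows else watermark
    return rows, new_watermark

# First run: full load from an empty watermark; later runs pick up only deltas.
watermark = ""
changes, watermark = extract_changes(src, watermark)
print(changes)   # both rows on the initial load
changes, watermark = extract_changes(src, watermark)
print(changes)   # [] -- nothing has changed since the last watermark
```

Log-based CDC avoids the repeated table scans this query-based approach requires, and it also captures hard deletes, which a watermark query cannot see.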

In addition, Master Data Management (MDM) offers a gold standard of reference data that organizations can use to drive analytical capabilities, including visualization and predictive analytics. Rogers clarifies that the best way to enable this – and get maximum value out of next-generation environments that offer more flexibility, scale and a better cost structure – is to leverage various engines to harness broader data sets from enterprise-wide environments.
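
For a sense of how master reference data feeds analytics, here is a small, hypothetical sketch that enriches raw transactions with “golden” customer records. The IDs, fields and matching logic are invented for illustration and do not reflect any specific MDM product.

```python
# Hypothetical "golden" customer records, as an MDM hub might publish them.
master_customers = {
    "C-100": {"name": "Acme Corp",  "segment": "Enterprise"},
    "C-200": {"name": "Globex Inc", "segment": "Mid-Market"},
}

# Raw transactions referencing customers, not always cleanly.
transactions = [
    {"customer_id": "C-100", "amount": 1200.0},
    {"customer_id": "C-999", "amount": 80.0},   # no master record exists
]

# Enrich transactions with master data so downstream analytics
# (visualization, predictive models) share one consistent view.
for txn in transactions:
    golden = master_customers.get(txn["customer_id"])
    txn["segment"] = golden["segment"] if golden else "UNMATCHED"

print(transactions)
```

The “UNMATCHED” bucket is the interesting part in practice: it surfaces records that need data quality work before the analytics can be trusted.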

There are two ways to deliver those capabilities: either build them from scratch, which is a long and complex process, or take tested, proven, best-in-market engines and integrate them deeply into next-generation environments for faster time to value.

Rogers confirms that this is the unique approach Syncsort is taking, whereas large platform players have challenges and limitations in trying to make their architectures and code generation approaches relevant in the Hadoop environment.

Rogers notes that machine learning is also of great interest. As a power player in data quality and integration, Syncsort sees machine learning being used, for instance, to let business rules learn as they process datasets and to track how the data changes over time. He adds that Syncsort is also seeing growth in cloud adoption, with more and more organizations implementing hybrid cloud and on-premises environments.
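
As one possible illustration of a “learning” business rule, the sketch below adapts an outlier threshold to the data it has already processed instead of relying on a hand-set limit. The class, window size and threshold are hypothetical assumptions, not a Syncsort feature.

```python
import statistics

class AdaptiveAmountRule:
    """A validation rule whose notion of 'normal' is learned from the data."""

    def __init__(self, sigmas=3.0):
        self.history = []
        self.sigmas = sigmas

    def check(self, amount):
        """Flag amounts far from what the rule has seen so far."""
        ok = True
        if len(self.history) >= 10:          # wait for a minimal baseline
            mean = statistics.mean(self.history)
            stdev = statistics.stdev(self.history) or 1.0
            ok = abs(amount - mean) <= self.sigmas * stdev
        self.history.append(amount)          # the rule keeps learning
        return ok

rule = AdaptiveAmountRule()
for amount in [100, 102, 98, 101, 99, 103, 97, 100, 102, 98, 5000]:
    if not rule.check(amount):
        print(f"flagged: {amount}")          # 5000 is far from the learned baseline
```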

Finally, organizations are recognizing that, to get a payoff on investments in next-generation analytics infrastructure, they need to run mission-critical workloads, and to do that they have to be able to manage the environment – a capability Syncsort has been delivering for four years.

For more valuable insights into the future of data management, read the Syncsort eBook, “Bringing Big Data to Life: What the Experts Say.”

Authored by Michael Kornspan

Director, Corporate Communications
