
Podium & MapR: Executive Interview Part II

This is Part Two of a recent interview between our CEO, Paul Barth, and MapR’s V.P. of Technology, Crystal Valentine. Part One is here.

MapR and Podium first began working together in late 2015, teaming up to serve a Fortune 1000 company that was looking to modernize its approach to enterprise data management. It was on this customer assignment that the combined MapR/Podium value became clear.

Better Together

For organizations seeking greater performance, faster time to answers, and enterprise-ready functionality, Podium and MapR’s combined expertise represents the best modern-day foundation for managing big data and optimizing business agility. Furthermore, the combined value and application of Podium and MapR represents more than a competitive advantage – it’s where the industry is going. Data and analytics programs are now embracing diversity, formalizing information as an asset, scaling governance with trust, removing data location as a constraint, and reinventing service delivery with AI and machine learning. These dramatic changes will help data-driven and analytics leaders thrive in the digital age.

Make Legacy Data Cool Again - Could big data technologies and modern data management platforms and architectures breathe new life into legacy and mainframe applications? If so, how?

Valentine: This is a great topic because it echoes the themes that Paul and I have been discussing: enterprises that would have been called “data-driven” for the last 20 or 30 years tend to have some pretty important and robust applications in production today. It’s a situation where you don't want to throw the baby out with the bathwater just because there is a new set of technologies out there that you could leverage. If you have legacy applications and existing legacy datasets, those have real value and are impacting your business.

There's no reason to get rid of them. So what we're seeing is an approach that takes those legacy applications and then asks the question, "How could I improve the efficiency of this application?" And here, I mean the functionality, the precision. It might be a customer engagement application that we're now able to make hyper-efficient because we can collect data at a much more granular level and engage correctly, in real time, with the individuals who are interacting with the website. Or it can be something like a credit card fraud-prevention application. Historically, credit card transactions were deemed fraudulent or not on the basis of business rules defined by an analyst. Business rules are incredibly fast to process, so they could be used as part of a real-time application, determining whether a particular credit card transaction is fraudulent in a matter of milliseconds. But the correctness of those rules wasn't 100%. Sometimes they would be overly conservative, and sometimes they would allow fraudulent transactions to transpire. And I think what's worse is that the rule sets grew over time to be quite large and cumbersome for a human to manually curate and maintain. That really precluded agility. If a new credit card fraud scheme was detected out in the world, it might be difficult to modify the rule sets quickly enough to enable the credit card company to prevent future fraudulent transactions from taking place.
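The contrast Valentine draws can be illustrated with a toy sketch (every rule, field name, weight, and threshold below is hypothetical, invented for illustration, and not from any vendor's product): a hand-curated rule list is cheap to evaluate in milliseconds but must be edited by a human for each new fraud scheme, whereas a learned scoring function can be updated by retraining on new labeled data.

```python
import math
from dataclasses import dataclass

@dataclass
class Transaction:
    amount: float
    country: str
    merchant_category: str

# Hand-curated business rules: extremely fast to evaluate, but every new
# fraud scheme means a human editing this list, which grows cumbersome.
RULES = [
    lambda t: t.amount > 5000,                 # unusually large purchase
    lambda t: t.country not in {"US", "CA"},   # outside home region
    lambda t: t.merchant_category == "wire_transfer" and t.amount > 1000,
]

def rules_flag(t: Transaction) -> bool:
    """Flag as suspicious if any rule fires (no model needed)."""
    return any(rule(t) for rule in RULES)

# A learned alternative: a toy logistic score. In practice the weights
# would come from retraining on labeled transactions, so adapting to a
# new scheme is a retrain rather than a manual rule edit.
WEIGHTS = {"amount": 0.0004, "foreign": 1.2, "wire": 0.8}  # hypothetical

def model_score(t: Transaction) -> float:
    z = (WEIGHTS["amount"] * t.amount
         + WEIGHTS["foreign"] * (t.country not in {"US", "CA"})
         + WEIGHTS["wire"] * (t.merchant_category == "wire_transfer")
         - 3.0)  # bias term
    return 1.0 / (1.0 + math.exp(-z))  # probability-like fraud score

tx = Transaction(amount=6200.0, country="US", merchant_category="retail")
print(rules_flag(tx), round(model_score(tx), 3))
```

The rule path gives a hard yes/no in one pass over a short list; the model path gives a graded score that a downstream system can threshold, which is closer to the "modern dynamic scoring" Barth mentions below.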

Barth: Furthermore, to this point, I was talking with Josh Rogers, who's the CEO of Syncsort, and their stated strategy is “from big iron to big data.” That means they've actually acquired a number of companies and technologies that harvest important, meaningful data from mainframes into big data platforms. Because at the end of the day, a lot of the economy is transacted on mainframes. And his statement to me is that that is actually continuing to grow in the double digits annually. He believes the mainframe is a long-term platform because of its robustness, its efficiency, and the decades of business logic built into it that don't make sense to migrate.

But at the same time, to Crystal's point, just one flag added to a record in a real-time experience that's going through a transaction system could provide a significant amount of value, whether it's fraud detection as she suggested, or next-best-product, or those types of things. And increasingly, that small amount of information in the mainframe is going to come from a backend that is using machine learning, modern dynamic scoring, and the things that Crystal suggested. For a large established company, this is a continuum that will continue to be supported, and that will make it very valuable.

Platforms, Point Products, and Open Source - Savvy IT leaders often make a series of strategic bets as they invest in and deploy products from a calculated mix of dominant players, startups, and narrow specialists. How should considerations around the platform approach versus point products, and open source versus proprietary solutions, play into those "bets" by IT organizations?

Valentine: There are two different questions in there that I can address separately. On the question of a platform approach versus point products, that goes back to the conversation we had previously about agility. What we've seen over the last, say, 15 years is that a crop of really interesting and innovative new technologies has emerged from a combination of academia, the open source community, and innovative startups. The vast majority of them are pretty narrow in scope, and adopting these point products also requires that they be integrated with a whole host of other point products. Today, what we're finding is that applications require multiple different data management and processing technologies working together in concert. A typical next-gen application today doesn't just run on a single relational database; it relies on a relational database, maybe a NoSQL database, a streaming engine, a couple of streaming analytics products, and maybe others.

The challenge with leveraging point products is that integrating them tends to be very time-consuming, and the resulting application can be somewhat fragile and brittle, and difficult to extend over time. In our experience working with customers, when they talk about agility, they talk about time to market. Having a unified platform that can support all of the different data types and all of the different processing engines they're leveraging is really a case where the benefits compound over time. Building that first application is easier on a unified platform than on an integration of point products. And building the second application gains incrementally more efficiency than the previous one, so it's a compounding of efficiencies over time as you build out a suite of multiple applications that might share datasets.

Barth: There are a number of these issues that I view as red herrings. The questions about agility and scale and managing risk really come down to, "Do you have a coherent and cohesive platform in which you can register and connect components with each other, so that you can operate and move from custom development to lights-out management?" That's why we strongly believe the platform approach is going to be superior, except for some fringe applications, for the broad adoption of data across the enterprise. Our view is that, just as MapR has a terrific unified storage and processing platform, Podium has a unified data management platform on top of it. And you can plug that into any data source, into legacy systems and modern systems. You can build out end-user wrangling and data preparation, all within a managed and aware context that is self-managing over time.

And the reality is, the open source community is going to continue to do what it does best: taking on new computing paradigms and new technologies that have a reasonable standards base, like SQL or cloud systems. But at the end of the day, to get to a business-ready system, I believe the platforms delivered by commercial vendors are going to be dedicated to that next layer. And that's where we're going to see some mitigation of risk, because of that dedication.