IBM z17 Aims to Make More Possible

Ross Mauri, GM IBM Z & LinuxONE, shares insights on the highly anticipated z17, emphasizing IBM's focus on AI and hybrid cloud innovations to maximize business agility.

Forget the mainframe of yesterday: IBM's new z17 is here.

Patrick Moorhead and Daniel Newman sat down with Ross Mauri, General Manager, IBM Z & LinuxONE at the z17 launch event to explore how this latest generation is purpose-built for today's data-intensive, AI-driven world, addressing the challenges of scale, security, and agility that define modern business.

Key takeaways include:

🔹 Built for Today's Demands: The z17 physical server is engineered to handle the massive transaction volumes and data loads of today's digital world, delivering the performance, scale, and security that enterprises require.

🔹 AI at the Core: IBM is infusing AI throughout the z17 platform, enabling clients to leverage powerful new capabilities for fraud detection, risk assessment, and other critical applications.

🔹 Hybrid Cloud by Design: z17 is a full participant in the hybrid cloud ecosystem, supporting containerized applications, seamless data sharing with hyperscalers, and consistent operations across diverse environments.

🔹 Modernizing the Mainframe Experience: IBM is committed to making the mainframe more accessible and user-friendly, empowering both seasoned experts and new talent with modern tools and workflows.

Learn more at IBM.

Watch the full video at Six Five Media, and be sure to subscribe to our YouTube channel, so you never miss an episode.

Disclaimer: Six Five On The Road is for information and entertainment purposes only. Over the course of this webcast, we may talk about companies that are publicly traded, and we may even reference that fact and their equity share price, but please do not take anything that we say as a recommendation about what you should do with your investment dollars. We are not investment advisors, and we ask that you do not treat us as such.

Transcript

Patrick Moorhead: The Six Five is On The Road here for z17 announcement day. We are at IBM's One Madison building, newly minted. I can even smell the fresh paint here. I love being here. I'm a former, recovering product guy. I love product announcements. And we are here to talk about the z17. Daniel, how are you doing, my friend?

Daniel Newman: Yeah, good morning. This new flagship location is great. Been here a couple of times since it opened up, Pat. And it's got that kind of, not only that new smell, but it really has that feel of the new, growing, exciting, transformational company that is IBM. I mean, we saw a few years ago, Arvind came in, said it's going to be about hybrid cloud, it's going to be about AI. We're seeing it start to kind of transcend, transform this business, this industry. And it's been exciting. So it's good to be here. And yes, Pat, you are an old historic product guy. And launch days are always a whole lot of fun.

Patrick Moorhead: Can't get it out of my system for sure, but I can't imagine a better guy to talk about the z17 than Ross Mauri, General Manager of IBM Z. How are you, Ross?

Ross Mauri: I'm great, Patrick, thank you very much. It's an exciting day for us.

Patrick Moorhead: Totally it is. And some people think, oh, the work just starts, hey, you're done, right? You've launched this thing. But we all know that's not the case. But thanks for coming on the show again.

Ross Mauri: Absolutely. I'm excited to be here. A lot of innovation and effort by my team went in over many years to get to this day. And so that's why it's so exciting for all of us, and for what we're about to unveil for our clients, for sure.

Daniel Newman: Yeah, these launches are really, I was kind of trying to say, like you're sort of shifting what's going on in the industry. And of course there's this mainframe meeting the cloud, meeting AI, meeting the moment that you've been talking about for a few generations now. But with z17 there's a lot happening. But let's take a little walk down history, walk down memory lane. You know, you've been here for several Z launches now. Mainframes still handle something like 70% of all the transactions, you know, in the world right now. Talk a little about the history, the build-up to z17.

Ross Mauri: Well, I'm not going to go back all the way in history.

Daniel Newman: Come on, come on, start with one.

Ross Mauri: But I just.

Patrick Moorhead: We have an hour.

Ross Mauri: I just think that, you know, if you just look at the past five years, right? z15 and z16 here, the world has changed, right? The pandemic changed things. But digital transformation is what really changed businesses and consumers, you know, I would say experiences, right. And now we've got AI on the scene. And so we've been putting out mainframes that have really met our clients' needs. They need super secure, super scalable transaction and data systems. I mean, it's just that basic, super secure being very important, right? Powerful, scalable, able to take, you know, stock market spikes or Black Fridays that occur on Thursday in the wrong month or whatever, right. So the idea is that we're building these systems to suit our customers' needs: the banking needs, the insurance needs, retailers, airline reservation systems, governments, central banks. We've been meeting their needs. But what to me has really changed is this digital revolution, this digital transformation, this driving up of transactions, has really allowed the mainframe to flourish. And now with AI, I think it's a real game changer.

Patrick Moorhead: Yeah, I mean, I think we first met around z14, z13, I forget the exact time sometimes. You can mark time with Z launches here. But at your investor day you talked a lot about the design principles and the success of z16. And it is funny, you know, probably five, I mean actually not five years ago, there was always this "we're going to replace the mainframe" thing. Okay, but can you talk about the design principles that go into it, that really explain its longevity of value? You talked a little bit about two or three things, and maybe do the double click on that.

Ross Mauri: Sure. I think the basics are high performance and high scale, because these workloads are incredible. I mean, you think about a credit card company or a bank payment system, they could be doing 20, 30, 40, 50,000 transactions per second. It's a lot of people doing a lot of shopping or whatever they're doing, right? But these transaction rates are real and are sustained. We have many banks that do well over a billion transactions a day in 24 hours. So scalability and performance are key. What's the next thing? Security. Keep the data secure, keep the system secure, right? Keep everything about it secure. I mean, it's the country's economy flowing through our systems, and not just our country: many countries around the world, their economies are flowing through our systems, so we have to keep it secure. Those are like table stakes. High availability. I mean, things happen. There are power outages, there are natural disasters, people make mistakes. I mean, all kinds of things happen. Systems have to keep running. So high availability is key. What we've now moved into, though, in my sense, is that some of these ilities are really coming to the forefront. And with AI now on the scene, that to me is creating this ability to take all this data that's captured through all this transaction processing and do things with it that couldn't be done before. That's the difference. And I think z16, you mentioned the investor day. I'm glad you went to investor day. I'm glad that Arvind highlighted z16. It is the most successful program in IBM's history from a mainframe point of view. Glad to be a part of that. Glad that our clients were so thrilled that they're expanding their capabilities. And what's changed? I mean, the cloud's been around for more than a decade now. And as you said, people have been kind of writing off the mainframe for a couple decades now, right? And who would have thought, in the era of cloud, with cloud growing like this, shouldn't mainframes just be disappearing? Well, mainframe growth is phenomenal, right? I mean, it's 5x growth that's going on here. And it's z/OS workloads, it's Linux workloads that are growing again. So why? Our customers are highly intelligent, very big corporations that again have to spend their shareholder money wisely, and it's because these systems are rock solid and they do fit-for-purpose computing like nothing else in terms of transaction processing and data. So we're building on all those principles I talked to you about, the ilities, as you like to call them, injecting AI now, and I know we're going to talk a little bit more about that. To me, that's been the game changer these past three years. We kind of surprised the world that we embedded an AI inference accelerator in the microprocessor. And people were like, what are we going to do with that? And then we said, well, here's what you can do with it. And now they're like, oh, I like that, and I want a lot more.

Daniel Newman: Yeah, I'm going to hit you up on that in just a second. I do want to say, having come to Investor Day, it was really good to see this highlighted. Both Pat and I have talked endlessly after various earnings about just the significant contributions that Z makes every quarter, especially during launch periods. But one of the things that did change with z16 was you created a bit more of a longer-term, scalable impact to revenue, where it wasn't kind of that big wave and then fall-off, which has been transformational when you talk about a company that has a certain growth profile, certain profit profile and of course the way your investors invest in a company. So that's been really, really good. Now as you pivot to AI, you sort of started teasing. Pat and I both love chips, so we love the fact that you're doing something there. But again, in the end it's about the use. I mean, we really take for granted, Ross, that this stuff just works all the time. I mean, when we swipe our card to put gas in the car, to buy dinner, breakfast, whatever it is, we just expect it to work. Behind the scenes, there's a lot going on there. So where does AI sort of start to create new use cases, and how are your clients sort of reacting to the power of the possibility of what you can do with z17?

Ross Mauri: So let's just start with z16 quickly, because when we launched z16 we had about 10 use cases, right? Our clients now have identified and are using more than 250 use cases. So we had a little bit of an idea of what clients could do. They adopted the technology. Why did we have sustained growth throughout the cycle and not just a bump like it used to be? More demand. It's demand based, it's growth based. It's for higher resilience and it's for more transactions and more clients around the world. So that's been part of the difference here. Now as we get into z17 and the use cases, the use cases are all over the place, from health care to banking and credit card fraud. There's a very, very wide range of use cases. Our sweet spot is banking. That's where our sweet spot is. Over 30% of the use cases are actually fraud. When you say the word fraud, it's a short five-letter word, but there are hundreds of different types of fraud detection that need to go into all the business processes within a bank, within a credit card processor, a payment processor, a claims adjudicator. And so it's these use cases for fraud detection and prevention that are saving banks now hundreds of millions of dollars, right? That's across an industry. And so that to them was a big game changer. They could go from doing partial fraud detection, maybe catching a small percentage, to doing 100% of their transactions regardless of how high the transaction rate is, and doing a pretty good job of fraud, I mean, you know, getting maybe 85, 90, 95% of the fraud out of the system, letting those transactions run and just grabbing the ones or most of the ones that they thought were fraudulent. Now with z17, we'll talk a little bit about the technology, but from a use case point of view, we're allowing something called multimodal AI to occur, so you can run multiple different types of models against a fraud case in a transaction and make that determination, close that window to such a small point that you know something's fraudulent or you don't, because it's that gray case in the middle that the banks are still toying with. So there's lots of use cases. We pick on fraud just because it's easy for everyone to understand. And again, there's a lot of demand for it.

Daniel Newman: Just really quickly, when you talk about these fraud detection and multiple models, you're talking about something that happens in almost zero latency too. And I think that's really important for everyone out there to understand: you're basically running multiple models against transactions so that, you know, that time that I accidentally clicked that wrong link and gave my debit card and made a payment on a fake PayPal, it's able to quickly see that this is happening. And by the way, thousands, tens of thousands of these types of transactions are happening per second and have to be detected. That's pretty, pretty big.

Ross Mauri: I mean, the new z17 can do 450 billion inferences a day. 450 billion. So just do the math and break it down. A transaction takes 4 to 6 milliseconds, right? So you have to do full fraud detection, and all the other processing might be multiple reads, could be multiple writes, all within that transaction. So all of our AI inferencing has to go on within 1 millisecond to make that whole window.
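
To make those numbers concrete, here is a rough back-of-the-envelope breakdown. The daily total and the latency window are the figures Mauri cites; the per-second arithmetic is a sketch we have added for illustration.

```python
# Back-of-the-envelope math for the inference budget described above.
inferences_per_day = 450e9            # 450 billion inferences per day (cited figure)
seconds_per_day = 24 * 60 * 60        # 86,400 seconds in a day

per_second = inferences_per_day / seconds_per_day
print(f"~{per_second / 1e6:.1f} million inferences per second on average")  # ~5.2 million/s

# Each transaction completes in roughly 4-6 ms end to end (cited figure),
# and the AI scoring step has to fit in about 1 ms of that window.
transaction_window_ms = (4, 6)
ai_budget_ms = 1
print(f"AI scoring budget: ~{ai_budget_ms} ms out of a "
      f"{transaction_window_ms[0]}-{transaction_window_ms[1]} ms transaction")
```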

Patrick Moorhead: Well, there's even legal requirements for that that a lot of people just kind of paper over. And availability.

Ross Mauri: Yes, yeah, absolutely. So I think the power of, say, doing a machine learning inference for fraud and then backing it up with an encoder large language model, something like BERT Large or RoBERTa, one of those large language models, and basically doing the fraud detection two different ways and then comparing the results, that's what the banks are looking to us for. Now, we've worked with a number of enterprises around the world that have guided the requirements that drove this into z17. And we've got over 100 clients right now waiting for us to come out with this multimodal support.
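
For readers who want to picture the "score it two ways, then compare" pattern Mauri describes, here is a minimal, purely illustrative sketch. The model stand-ins, thresholds, and transaction fields are assumptions made up for this example; they are not IBM's actual fraud models or APIs.

```python
# Hypothetical sketch of multimodal fraud scoring: a fast classical model and a
# BERT-style encoder model score the same transaction, and the results are compared.
# Every name, threshold, and heuristic below is illustrative, not a real IBM interface.

def classical_fraud_score(txn: dict) -> float:
    """Stand-in for a traditional ML model (e.g. a gradient-boosted tree ensemble)."""
    # Toy heuristic purely for illustration.
    risky = txn["amount_usd"] > 5000 and txn["country"] != txn["home_country"]
    return 0.95 if risky else 0.10

def encoder_llm_fraud_score(txn: dict) -> float:
    """Stand-in for an encoder LLM (BERT Large / RoBERTa style) scoring the same data."""
    # Toy heuristic purely for illustration.
    return 0.80 if "unfamiliar_merchant" in txn.get("flags", []) else 0.20

def decide(txn: dict, block_at: float = 0.9, review_at: float = 0.6) -> str:
    fast = classical_fraud_score(txn)
    deep = encoder_llm_fraud_score(txn)
    score = max(fast, deep)                 # take the more suspicious of the two views
    if score >= block_at:
        return "block"                      # confidently fraudulent: stop the transaction
    if score >= review_at or abs(fast - deep) > 0.3:
        return "review"                     # the gray zone, or the two models disagree
    return "approve"                        # both models agree it looks legitimate

# Example: a large purchase from an unexpected country gets blocked.
print(decide({"amount_usd": 8000, "country": "XX", "home_country": "US", "flags": []}))
```

The point in the transcript is that both scores have to come back inside the roughly one-millisecond budget discussed above, which is why the inferencing runs on the chip itself rather than on a remote service.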

Patrick Moorhead: Yeah. So it is launch day and we kind of talked a little bit about z17. We're kind of getting at it. But I want to ask about z17 in the context of cloud and we talked about hybrid cloud and AI. I think when most people in tech, when they think of a mainframe, they may not know its hybrid cloud capabilities and really how it fits into this overall IBM strategy of hybrid cloud and AI. Can you double click on that for us?

Ross Mauri: I can try. So hybrid cloud means a couple different things. Let me just try to parse it up a little bit. Lots of people think hybrid cloud means a containerized system, like Kubernetes and maybe Red Hat OpenShift, right? So you have containerized application development where the containers can be transported and run on different systems in different environments, depending on the workload characteristics. Z is a full player in an OpenShift world, a full player there. For some people, hybrid cloud means ensuring that you can share data between clouds. So we've got a whole API strategy, and APIs up and down the system software stack, so that you can kick off transactions, you can gather data, and you can do it in a very high speed and efficient way to connect with a client. In fact, we've got six patterns that our clients use to connect with the hyperscalers, making it very easy to connect with a hyperscaler to share data back and forth. So it's kind of one big, long processing engine, if you will: mainframe and hyperscaler working together in unison. Then some people think hybrid cloud means application development and moving into a modern application DevOps environment with a CI/CD pipeline. All of that software is available for use on the mainframe, or you can develop elsewhere and bring your applications in. All of that software is available. And then the final thing would be operations, observability, performance management done across the enterprise's full infrastructure: hyperscaler, public cloud, private cloud, z cloud, open, all those things together. We have that software as well. So we participate in the hybrid cloud at all levels. And I went through those four kinds of spaces, Patrick, because people say hybrid cloud and then, in their minds, they usually go to one of them.
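
As one small illustration of the "kick off transactions, share data with a hyperscaler" pattern in that answer, here is a hedged sketch of a cloud-side service calling a mainframe-hosted transaction over REST. The host, path, payload fields, and token handling are hypothetical placeholders, not a documented IBM API.

```python
# Hypothetical sketch: a cloud-hosted service invoking a mainframe-hosted payment
# transaction through a REST API. The endpoint and fields are illustrative only.
import requests

MAINFRAME_API = "https://zsystem.example.com/api/v1/payments"   # placeholder host and path

def submit_payment(account: str, amount_cents: int, token: str) -> dict:
    """Post one payment request and return the parsed JSON response."""
    resp = requests.post(
        MAINFRAME_API,
        json={"account": account, "amountCents": amount_cents},
        headers={"Authorization": f"Bearer {token}"},
        timeout=2.0,   # keep the round trip inside a tight latency budget
    )
    resp.raise_for_status()
    return resp.json()   # e.g. {"status": "approved", "fraudScore": 0.02} in this sketch
```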

Patrick Moorhead: So I'm glad you did.

Ross Mauri: We play in all of them.

Patrick Moorhead: Yeah, I'm glad you did, because usually I caveat first with, "Here's my definition of it, what I mean." But I'm glad you expanded the definition to pretty much everybody's definition of hybrid cloud. And I think people are still amazed that Z runs Linux, it runs containers, and you can leverage a lot of the work that you've done throughout the enterprise with transportability in there. And that, by the way, that's my definition of hybrid cloud. So I'm glad you hit that first.

Daniel Newman: Yeah, it's interesting. When I started this I kind of talked about how Arvind brought everything back to hybrid cloud and AI. And it's great that you kind of explained that. It's also really important to note that you've done all this with security in mind, and, not to circle back to that, because you sort of made that point earlier, but I think a lot of times when people start to go, oh, you're connecting your mainframe to the cloud, isn't that sort of breaking the whole purpose of the mainframe? Kind of. But there's been a lot of work that's gone into building the right encryption to be able to move data to and from. So what you're really doing is creating a modern cloud environment in many ways, one that runs with all the security and scalability profiles that a mainframe has, which is incredible. And I mean, that's why 70%, that's why people aren't moving off, and that's why those titles they love to use in articles are something like "the mainframe is dead, long live the mainframe." Exactly. We're in one of those.

Patrick Moorhead: Oh, I've seen some hyperscalers take a few runs at it, and now they're...

Daniel Newman: All friends, they've all face planted. It just doesn't work. Like you can't move off of it. Ross, we got just a minute left. Thank you so much for spending the time and covering this with us. Just kind of a takeaway for everybody out in the audience right now is kind of what are the one or two things that you really think they should be most excited about as they look to this opportunity to move and to upgrade and to buy, advance and continue their partnership with Z?

Ross Mauri: I think the biggest thing is, we mentioned machine learning and predictive AI, but on this mainframe there is also going to be generative AI for assistants, and assistants running agents. So think of chatbots of the most advanced kind. So not only will we be doing traditional machine learning on the mainframe, with its security and its robustness, right, and its RAS, but we'll be doing generative AI things like watsonx Code Assistant for Z, or now the new watsonx Assistant: a whole new way for a system programmer or system administrator to be able to deal with the mainframe, to be able to set it up through a fully modern interface. And you know, the old system programmers like me, you know, gray hair, but especially for the new generation that wants to come in, that learns differently, wants to get on board in different ways than perhaps I did in the 80s, they want to do it in a much more online, ask-questions, do-things manner. They can do that. So I think the power of AI and generative AI, again with assistants and agents, is going to be fully unleashed. And we've got some specialized hardware now in the mainframe to enable that at performance scale and with security.

Daniel Newman: Well, Ross, I want to thank you so much for joining us. Big day. I know you're excited. T shirt on.

Ross Mauri: Yes. You know, I might tell them two.

Patrick Moorhead: I want one of these.

Daniel Newman: Yeah, we're gonna, we're gonna get one of these, you know, Silicon nerds, right? Hey, it's good. Hardware's cool again.

Ross Mauri: It is.

Daniel Newman: It's great to be in infrastructure. Infrastructure is cool.

Patrick Moorhead: I always thought it was. I always thought chips were cool. Just had to get the rest of the industry to come along.

Daniel Newman: Well, you know, we called it, that semiconductors would eat the world. And here we are. Ross Mauri, thank you so much for joining us on this launch day. Congratulations. What a moment.

Ross Mauri: My pleasure. Thank you, Daniel. Thank you, Patrick.

Patrick Moorhead: Thanks.

Daniel Newman: And thank you everybody for tuning into this episode of the Six Five. We are On The Road here at IBM z17 launch day. Hit subscribe. Join us for all of our great content and coverage here at the event. And of course, all of the Six Five. We appreciate you being part of our community. We'll see you all later.
