From Promise to Profits: A Hybrid Approach to Unleash AI’s Power at Scale - Six Five Media at NVIDIA GTC
Hillery Hunter, CTO at IBM, shares insights on AI scalability, the challenges in adopting GenAI, and IBM's unique approach to AI models and its partnership with NVIDIA.
How can companies start to unlock true ROAI?
At NVIDIA GTC 2025, host Patrick Moorhead and Hillery Hunter, CTO of IBM Infrastructure and GM of Innovation, discuss the practical realities of enterprise AI adoption and use cases. Hillery shares IBM’s approach to helping customers navigate the journey from experimental phases to large-scale, impactful deployments.
Key takeaways include:
🔹Scaling for ROI: Forget endless POCs. IBM sees 2025 as the year enterprises finally unlock the true potential of AI, moving from experimentation to tangible business value.
🔹Open Source & Tailored Solutions: IBM's commitment to open-source models, like the efficient Granite family, empowers businesses to customize AI solutions, addressing the crucial gap between generic models and enterprise-specific data.
🔹Data Governance: Data privacy and policy are paramount. IBM's platform-centric approach, leveraging Red Hat and open-source, ensures consistent AI deployment across all environments, from the cloud to the mainframe.
🔹Strategic Partnerships for Enterprise Needs: IBM's collaboration with NVIDIA focuses on delivering practical, enterprise-grade solutions, including advanced storage and cloud integration, to solve real-world business challenges.
Learn more at IBM. Watch the full video above and be sure to subscribe to our YouTube channel, so you never miss an episode.
Disclaimer: Six Five On The Road is for information and entertainment purposes only. Over the course of this webcast, we may talk about companies that are publicly traded, and we may even reference that fact and their equity share price, but please do not take anything that we say as a recommendation about what you should do with your investment dollars. We are not investment advisors, and we ask that you do not treat us as such.
Patrick Moorhead:
The Six Five is On The Road here at NVIDIA GTC 2025 in San Jose, California. It has been an incredible event so far, and unsurprisingly we're talking a lot about AI. Imagine that. I feel like this is all we've been talking about for the last two years, but it's great stuff. And one of the core themes here is interesting. I don't always say "hey, I told you so," but even 10 years ago I said the cloud would be hybrid, and guess what? AI is hybrid as well. And it makes sense. This is what enterprises want. They want workloads split between on-prem, private cloud, and even the public cloud, because that just completely makes sense. I am in the IBM booth, and hey, I want to introduce Hillery. It's great to see you.
Hillery Hunter:
Good to see you again.
Patrick Moorhead:
Welcome to the Six Five. You and I haven't done a video together but we've been on a lot of conference calls together. So it's great to actually get out and do this together.
Hillery Hunter:
That's what these shows bring: the opportunity to get together.
Patrick Moorhead:
No, no, totally. And you can hear the booth behind us. I was in your booth a couple weeks ago at Mobile World Congress, literally playing ping pong and seeing the power of watsonx. Like, how can that be possible? Who's tracking my ball? I lost to somebody I didn't want to lose to, and they gave me a full LLM readout on all the things I did well and all the things I didn't do so well. But I just thought it was cool. It couldn't fit in here, otherwise I might be playing ping pong right now. So anyway, let's actually get down to talking through this. AI is a journey, as we've seen with everything with enterprises. You don't just start and stop and decide to do something that changes everything overnight. We saw experiments, we saw POCs, and now we're finally starting to think about scale, and I think, and my research firm thinks, 2025 will be the year it starts delivering ROI. But listen, you talk to a lot of clients, probably more than I do. What are you experiencing? What are they experiencing through their AI journey?
Hillery Hunter:
Yeah, I think everything we're seeing is very aligned to what you just described. To maybe put a few numbers behind it, though: we periodically do these IBM Institute for Business Value (IBV) studies that look at how the C-suite is feeling about adoption. And we've definitely seen a marked uptick, from 2023 data that in some cases showed only twenty-something percent with real confidence that AI was about to be deployed, to the latest edition of the survey at the end of last year, where well over 70% feel like the time is now. And I think your example is a great place to start this conversation, right? This sports environment for understanding and analyzing and providing feedback is just the tip of the iceberg of what we're seeing. This is all industries, all businesses. In many cases it starts with internal transformation opportunities around digital labor and contracts and all this other kind of stuff that no one wants to be doing anyway; people can then do more high-value work because of it. And it extends all the way into the enterprise and enterprise data sets, and even out to the edge, where most enterprises are interfacing with their customers. It's powerful. We're seeing it not only in our own experience, in things like customer support with higher NPS, but our clients that are implementing solutions in those kinds of areas are reporting the same thing: a much faster time to resolution, higher NPS. Why not?
Patrick Moorhead:
Right. And I remember Arvind giving some really good early proof points, even on an HR chatbot, which I thought was incredible. He was one of the first CEOs to really go public with all of this, and I thought it was really impressive. So I'm in tech, I've been doing tech for a long time, and you have too. We both know that, hey, you can't do enterprise generative AI things like agents by swiping a credit card and going to ChatGPT. It's just not that simple. So what are some of the challenges your clients are having? I know we've moved on a little bit, but what's keeping them from scaling and just doing this?
Hillery Hunter:
Yeah, we've had a lot of focus there, and you see it in our release of the most recent Granite family. Just a couple weeks ago we announced the Granite 3.2 model family, where not only with the core models, but also the reasoning capability and the multimodality, we are keeping the model sizes relatively small, relatively compact. Our Guardian model, which helps you ensure the trust and the safety of the model, is down to 5 billion parameters, with 800 million active parameters in a mixture-of-experts kind of configuration. So we really are trying to help our clients build based on open-source AI that is at an affordable size in terms of the model. It still is accelerated in many cases by NVIDIA GPUs, but it's accelerated in order to get that fast response time and such. Being able to create AI that then enables enterprise scale is, I think, one piece of it, but the other thing is data: data privacy, data policy, all of that. And so I like to say the conversation needs to be not only about which AI, but about the how of the AI, which AI platform. I think that's also a very active conversation that folks are having. And our strategy together with Red Hat is to leverage open source, but make the capabilities available across all different types of environments. For us, that spans even into the AI conversation being on premises, on systems like the mainframe, because that's where there's a need to do in-transaction fraud detection or other things that AI capabilities can really help with. So the AI conversation, I think, is pervasive, and we're trying to help people get past that barrier, get to the valuable data, the places that are going to have the highest ROI in their implementations, and cross that bridge from experimentation into production, because there's an ROI and a return from it.
Patrick Moorhead:
Yeah, I'm glad you brought up Red Hat, because again, not all of my prognostications are right; I like to think the important ones are. And quite frankly, 10 years ago I said, listen, the cloud is going to be hybrid, and it is. People want a consistent way to do things across on-prem, sovereign cloud, and the public cloud, and even things like enterprise SaaS. One of the biggest conversations that has come up as a challenge, not necessarily to hybrid cloud but to hybrid AI, is data and data management. And it seemed easy, from a "we knew this would be a challenge" perspective. I mean, when I started school, and probably you too, it was garbage in, garbage out. That was a long time ago. Well, it seems to be amplified with generative AI, and in particular agents. Unlike previous AI, we're mixing data: hey, ERP and PLM and CRM. And then it gets very complex. You gave a talk with Visa on some of their data management challenges, and I was wondering, can you talk through that use case for data management? Because it's literally the number one thing we research now.
Hillery Hunter:
Yeah, Visa talked yesterday about the context of their industry. They, like most of our enterprise clients, are the stewards of really sensitive information, right? Everything that they're doing as a company, how they do it, consumer information, the financial information, all of it, everything a company like that does is not only regulated but sensitive, and they want to be good stewards of it. So, very much along the lines of the hybrid cloud conversation, one of the things they spoke about is that they made a decision to build out an AI environment on premises quite intentionally, in part for data protection reasons. And then the next step becomes how you bring that AI conversation, and generative AI creation and such, into harmony with the rest of what you were doing from an analytics perspective. For them, that meant having a common data model and data management. So we have a great solution with them where they're using IBM Spectrum Scale storage to keep high utilization of their NVIDIA DGX-based cluster for training, providing high-throughput, high-bandwidth storage capabilities to really leverage the value of what they have from a compute perspective, but then also using our AFM (Active File Management) technology in that product to help with data management and consistency across their traditional analytics environment and their AI cluster. So I think it speaks to many of the themes that you were bringing up.
Patrick Moorhead:
So let's go to models, and I'm glad we didn't start there. Sometimes everybody wants to start there; I get it. IBM was one of the first to say multi-model is the way to go. You have your own models, and you support other models like Llama. But let me ask you this: what is your vision for models across the infrastructure, models that scale with your clients across a multitude of platforms, including Z?
Hillery Hunter:
Yeah, there was a statistic that last year 66% of the models that were released were open source, and we're definitely in that camp of open source and optionality. Fundamentally, the strategy you mentioned is that yes, we are creating models and open-sourcing them under an Apache 2.0 license, but part of that is to enable customization. Another figure is that 1% or less of enterprise data is represented in those models off the shelf. Of course they don't have enterprise data in them; that's all behind firewalls. So the process and the conversation, I think, is both about model optionality, not just our models but those of other partners and open-source capabilities, and about customization of models to include enterprise data. For us, that's where our RHEL AI (Red Hat Enterprise Linux AI) capabilities around InstructLab, for model alignment to include your data and knowledge and skills, come into play. And then, from an overall perspective of governance, one of the things we announced at the conference here is the inclusion of NIM-based capabilities as an option for how you deploy and what you use within watsonx.data and our governance capability, watsonx.governance. So again, from a governance perspective we're very open. We understand people are going to have AI in different places, have different model choices, have customized models, and they're going to want a single platform for governing all of that. So I think platform is the common word that you will consistently hear us use. And that means partnering with many different strategic vendors from a SaaS and data source perspective, partnering with different folks from a model perspective, and ultimately helping enterprises have a consistent view across that full landscape.
Patrick Moorhead:
Yeah, I mean optionality is winning the day, which, surprise, is what enterprises want. One thing I really do appreciate about the Granite models is how efficient they are, and how they're not trained on stuff that makes no sense for business. I hope that continues well into the future. So listen, we're at NVIDIA's big show here and we haven't talked about NVIDIA yet. What are you doing with NVIDIA that differentiates you? Because a lot of people are partnering with NVIDIA here.
Hillery Hunter:
Yeah, well, a lot of what we're doing, and I think some of the difference in the tone of our conversation versus some of the rest of the show floor, is that we're very enterprise oriented. In our press release today, we talked about things across all facets of the IBM business, spanning from our software portfolio, the things I mentioned around the watsonx inclusion of NIM-based capabilities, into our infrastructure with our content-aware storage. Again, there's a key problem in AI, and it's recency, right? Oftentimes AI isn't the best way to get information on what just happened in the stock market in the last hour, and enterprises need to be up to date; they need to be providing their employees the most up-to-date information. Integrating the ability to vectorize data as soon as the file hits the storage system enables you to have AI that's searchable and up to date, up to the minute. So that kind of thing is in our hardware portfolio. In our cloud, we announced as well the addition of the next generation of H200 GPUs. And then there's consulting, and the consulting conversation again is very much about the enterprise. Many in the C-suite are saying they're not totally confident they have all the skills, and we've got 75,000 gen AI certified practitioners ready to go and build these kinds of things, like we as a company have been doing in our own internal transformation. So it's a very enterprise-focused conversation for us.
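For readers who want to picture the vectorize-on-ingest pattern Hillery describes, here is a minimal, hypothetical Python sketch. It is not IBM's content-aware storage implementation: the watch directory, the embed() stand-in, and the in-memory index are illustrative placeholders; a production system would call a real embedding model and a vector database.

# Illustrative sketch only: vectorize files as they land in a directory so they
# become searchable immediately. The embedding function is a stand-in for a
# real embedding model, and the dict is a stand-in for a vector database.
import hashlib
import time
from pathlib import Path

import numpy as np

WATCH_DIR = Path("./incoming")       # hypothetical landing directory
index: dict[str, np.ndarray] = {}    # filename -> embedding (in-memory stand-in)
seen: set[str] = set()

def embed(text: str, dim: int = 256) -> np.ndarray:
    """Stand-in embedding: deterministic pseudo-random vector seeded by the text."""
    seed = int.from_bytes(hashlib.sha256(text.encode()).digest()[:8], "big")
    rng = np.random.default_rng(seed)
    v = rng.standard_normal(dim)
    return v / np.linalg.norm(v)

def ingest_new_files() -> None:
    """Vectorize any file that has appeared since the last scan."""
    for path in WATCH_DIR.glob("*.txt"):
        if path.name not in seen:
            index[path.name] = embed(path.read_text(errors="ignore"))
            seen.add(path.name)

def search(query: str, top_k: int = 3) -> list[tuple[str, float]]:
    """Return the files whose embeddings are closest to the query embedding."""
    q = embed(query)
    scores = [(name, float(q @ vec)) for name, vec in index.items()]
    return sorted(scores, key=lambda s: s[1], reverse=True)[:top_k]

if __name__ == "__main__":
    WATCH_DIR.mkdir(exist_ok=True)
    while True:
        ingest_new_files()           # new documents are searchable on arrival
        print(search("latest market update"))
        time.sleep(5)

The point of the pattern is simply that documents become searchable the moment they land, rather than waiting for a periodic batch re-indexing job.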
Patrick Moorhead:
I'm not asking you to pre announce anything, but what are some of the things that we should think about next? What can we expect next from either IBM or IBM and Nvidia?
Hillery Hunter:
Yeah. I think you're going to continue to see us focusing, in all aspects of our company, on addressing this platform conversation. We've been very consistent in that since the beginning, and looking forward: basing AI on open source and on pervasively deployable platforms, continuing to help people rapidly build and create AI capabilities, and fundamentally getting over that hump from concept to profit. We titled our talk yesterday "From Promise to Profits" with good reason, because that's been our own journey around efficiency and internal transformation. And I think that's the step we're really looking to help more and more clients make this year.
Patrick Moorhead:
Yeah. So, Hillery, wow, great conversation. I appreciate it. Something tells me we're going to be seeing each other soon at an IBM event. Can't wait. But no, seriously, thank you so much for doing this. It's our first public conversation, but I'm sure we will be doing a lot of Zoom calls together. I really appreciate you coming on the show.
Hillery Hunter:
Thanks so much.
Patrick Moorhead:
This is Pat Moorhead, in the IBM booth at GTC 2025 in San Jose, California. We're talking hybrid cloud and AI. It's great, one of my favorite topics: the combination of the two, AI and hybrid cloud. Great stuff. Tune in to all the IBM content we have out there and all of the GTC content we have as well. Hit that subscribe button and take care.