Power is the New Name of the Game

The proliferation of artificial intelligence and the drive toward electrification have led to an unprecedented surge in energy needs. Generating additional electricity might appear to be the straightforward answer, but the real solution lies in improving efficiency through power innovation.

Hassane El-Khoury will offer his perspective on the looming global issue of escalating energy requirements and discuss how advancements in power technology will be crucial in reducing the strain on our power grids.

  • The convergence of AI and electrification is creating a critical tipping point in global energy demands, necessitating a dual focus on meeting these demands and improving energy efficiency, particularly in AI data centers and electrification processes.
  • Power semiconductors are emerging as a crucial technology for optimizing energy conversion and management, serving as the foundation for electric vehicles, renewable energy, smart grids and AI data centers, and are key to enhancing the efficiency and potential of electrification.
  • As industries incorporate more computing and complexity, the demand for power increases; thus, innovation in power technology, particularly through advanced power semiconductors, is essential for sustainable growth in these sectors.

Transcript

Daniel Newman:
Hey, everyone. Welcome back to the Six Five Summit. Excited for this next session that we have, Hassane El-Khoury. Hassane is the president and CEO of onsemi. Hassane, I want to welcome you to the event. It's been a great day so far. I can't wait to have this conversation with you.

Hassane El-Khoury:
Thank you. Thank you. Glad to be here.

Daniel Newman:
We've got a big number of items related to AI that people are focused on, but one of the big items that maybe doesn't always get the attention it should is electrification and power management requirements. I'm reading reports out there that show in certain regions the available energy for data centers is reaching near zero. There are parts of the world where the demand for AI and the scale of data center build-outs to do AI, not to mention the electrification of vehicles at the edge and all these things that you play in, are all going to be utilizing so much energy.

Yet I don't hear us talking about it as much. I hear us talking about building larger models, trillions of parameters, and putting AI on every device, every phone, every PC, and more in the data center, in new data centers. Just give me the quick background. This is something you're really focused on at onsemi. How are you guys thinking about this problem right now and how big of a problem is this?

Hassane El-Khoury:
I'll start by saying it is a big problem. You hear a lot of the headlines. All the focus is on how much these GPUs and the systems that are coming up consume. Nobody is talking about, okay, that's all great, now what? And that's just the cloud perspective. Now add to that the strain already on the grid from electrification. As you add more and more vehicles on the road, you're going to have that intensity, and on top of that there are areas, or times of the year, whether it's air conditioning or cooling larger facilities, where the grid becomes even more strained.

All of these combined are creating a power crunch. Now what do we do about that? There are two parts to it. The first is how do we get power to the building? That's the grid-related part. There are a lot of technologies we're involved in there, and I'll go a little bit into that. Then, when you get into the building, how do you best utilize that power? That's where the efficiency comes in. What we look at from an onsemi perspective is what I call the sustainable ecosystem, because all of these have to be related.

It starts with energy generation, energy distribution, and energy consumption, whether that's an EV or the cloud. All of these have to work together and all of them have to be developed together if one is to sustain the other. Meaning, if you don't have a charging network or the grid is not ready, EVs are not going to be adopted. If you don't have renewable energy or a grid that supports the AI deployment, the AI deployment is not going to reach the scope or extent it otherwise could.

It's an ecosystem, and as with any ecosystem, everything is interrelated. Getting power to the building, back to that first part, there are technologies we also participate in, such as energy storage systems, where you bring the power close to the consumption site. You remove a lot of the intensity from the grid itself through what I'd call micro-gridding. You still have to get the power there, whether it's renewable or grid power, but at least now you're in proximity. Then, once you go inside the building, that's where a lot of the semiconductor content also comes in.

Let me just put it in perspective. A 1% efficiency improvement, which may not seem like a lot from a power perspective, can save enough energy to reduce consumption by a terawatt-hour over a year. Just think about the extent of the power conversion chain getting from the grid all the way to the GPU. That 1% of saved energy can be redeployed, either to power homes or to power another AI data center that can achieve the output we need.
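As a rough illustration of that 1% figure, here is a minimal back-of-envelope sketch; the baseline energy flow and the per-home consumption are assumptions chosen for the example, not figures from the conversation.

```python
# Back-of-envelope only; baseline_twh and kWh-per-home below are illustrative assumptions.
def efficiency_savings_twh(baseline_twh: float, improvement: float) -> float:
    """Annual energy saved (TWh) for a given fractional efficiency improvement."""
    return baseline_twh * improvement

# Assume roughly 100 TWh/year flowing through the conversion chain in question.
saved_twh = efficiency_savings_twh(baseline_twh=100.0, improvement=0.01)  # -> 1.0 TWh

# An average US home uses on the order of 10,000 kWh per year, so 1 TWh
# corresponds to roughly 100,000 such homes powered for a year.
homes_powered = saved_twh * 1e9 / 10_000  # convert TWh to kWh, divide by kWh per home
print(f"{saved_twh:.1f} TWh saved ≈ {homes_powered:,.0f} homes for a year")
```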

Daniel Newman:
Hassane, I want to talk a little bit about power semiconductors in just a moment, but I did want to ask you how much of a compromise you see between getting power and energy right and actually being able to proliferate these technologies. I mean, we're at a very small percentage of electrification in terms of vehicles on the road. We're seeing it grow and it's been very positive: more new designs, new entrants, new participants, and exciting versions of these cars.

People that maybe weren't even thinking about electrification are like, "Wow, that looks great, drives great." It's all heading in the right direction. At the same time, I hear this insatiable appetite in the market for next-generation transformer models, LLMs, not to mention other AI that we've somewhat forgotten about but that is really important for doing a whole lot of different tasks. Can we balance this?

Is this something that can be achieved? It's kind of been odd to me how we were very focused on sustainability, and then all of a sudden AI comes out and you're hearing a lot less about sustainability. It almost seems like a diametrically opposed force right now for the tech industry: trying to manage carbon and at the same time executing on what customers want from AI.

Hassane El-Khoury:
Yeah, I'd say, look, I don't think it's one or the other. I think we can achieve both. The question, the variable here, is time, not the level, meaning how fast can we deploy these things? Look, the industry already is, and has been, solving this problem of renewable energy as a power source, even for the electrification of vehicles. I talked about micro-gridding. Look, we all know the grid is not going to be able to support it as is.

If we sit here and talk about how the grid is going to be upgraded in order to support electrification and the AI load, it's probably not going to happen in my lifetime. Therefore, the industry is going about it a different way because of time, and the time is now. The consumption is now. AI happened very quickly, and we're already on a trajectory that we need to accelerate. So rather than a grid upgrade, that's where micro-gridding comes in.

Now, whether you micro-grid by having a power plant next to an AI data center, or by capturing renewable energy into an energy storage system that's also local, what you want to get rid of is the distance and the losses that come with it, so you maximize the path from energy generation to energy consumption while minimizing losses. Again, I mentioned the example of 1%.

1% at every step of the conversion chain adds up, so you can have a bigger deployment with the same energy you would otherwise have used. That is what we need to focus on. I don't think it's one or the other. I don't think we have to sacrifice sustainability and the climate initiatives just because we need, and everybody wants, and it's very helpful to get, the AI and the expansion of our models and so on.
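To show how those per-stage percentages compound, here is a minimal sketch; the number of conversion stages and the stage efficiencies are assumptions chosen for illustration, not onsemi figures.

```python
# Illustrative only; the stage count and per-stage efficiencies below are assumptions.
from math import prod

def chain_efficiency(stage_efficiencies):
    """Overall efficiency of a series of power-conversion stages."""
    return prod(stage_efficiencies)

stages = 5                                     # e.g., grid -> facility -> rack -> board -> GPU (assumed)
baseline = chain_efficiency([0.96] * stages)   # ~81.5% end to end
improved = chain_efficiency([0.97] * stages)   # ~85.9% end to end

# Roughly 5% more usable power delivered for the same energy drawn from the grid.
extra = (improved - baseline) / baseline
print(f"baseline {baseline:.1%}, improved {improved:.1%}, gain {extra:.1%}")
```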

Daniel Newman:
Hassane, let's pivot for a minute here, and I appreciate you running that down. I think there are different schools of thought. Given where your business is focused, you are really trying to handle a load problem, while a lot of these other semiconductor players are all talking power and performance per watt. Those metrics haven't changed and they won't change. When I hear Jensen talking about 72 Blackwells in a rack and you start looking at the power consumption of one rack, and then you think about the scale of these systems, you're like, "Oh, my gosh, what are we going to do?"

Talk about power semiconductors, the role you expect to play, and what onsemi is doing to provide a pathway where we don't have to compromise between having the best capabilities and world-leading AI technology and, at the same time, being responsible. And beyond the part of it that's wanting to do good, part of it is just doing what we have to do to keep this tech proliferating while managing our grid, because I don't think you or I are going to be happy on a 100-degree summer day if our air conditioning isn't working.

Hassane El-Khoury:
Yeah, and by the way, you can think of a rack as requiring as much power as an electric vehicle, a premium electric vehicle. We're talking hundreds of kilowatts at this point. To me, it's an entity of consumption, whether it's a car or a rack, because you have to develop the technology for both the same way. Now, from a semiconductor perspective, what are the things we can do to help with that? It goes back to efficiency.

Every power unit that you don't waste between the entry to the rack and the GPU is a power unit you can add for another GPU, if that makes sense. Save enough of it and you can power another GPU, and therefore efficiency is the primary metric. That's where we come in. That sits at the base of the technology itself. Now, people talk about efficiency, and I've always talked about it. Even in the automotive domain, people think about semiconductor efficiency in terms of, call it, wide bandgap.

It could be silicon carbide or it could be silicon, but that's not the only place efficiency comes from. You have to have the best technology at the die level, but if you don't have the proper packaging to dissipate the heat, you are going to lose efficiency even with the best semiconductor device. I look at power technology as both die and package, and you have to have both. You have to design them together, so that when they land on a board you're not wasting in the chip itself the energy you saved coming into the chip.

You save 1% from the technology, then you waste it as heat. Guess what? The output hasn't changed. Therefore, you have to have both, and we are heavily investing in both. That's point number one. Then there's the packaging itself. Of course, it's very easy to say you can have a complicated, large package, but what happens to real estate? In those racks you're landlocked, and for anything that's landlocked, real estate is very expensive. How do you maximize real estate along with power density? That's where you have packaging and constructs that help with that.

Daniel Newman:
It's really interesting that you say that about power and being landlocked, Hassane, because I've talked a lot about how much power we can get to the racks, and then the racks themselves, and by the way, the power per rack is going up by orders of magnitude. As I sum this up, and we only have a minute left, I really appreciate you running me through this. This is a substantial problem. It's not only a problem looking for a solution; it needs a solution.

We cannot do these two things at different speeds. We have to progress power management and build power semiconductors that deliver efficiency to these racks, to these vehicles, basically edge to cloud to device, at a really rapid pace. If we keep growing these models and we keep growing the rollouts of these AI solutions and we don't think about power, we're going to come to the edge of this pretty quickly.

Hassane El-Khoury:
That's right. That's right, and it's our responsibility as an industry, but also for onsemi as a leader in power semis. You're seeing more and more products that we're coming out with, like our T10 MOSFET or the 650-volt silicon carbide, both on silicon and silicon carbide, in order to stay ahead of it.

What we're working on now is the next generation for AI, in order to get more out of the rack space that we get, because, look, racks have not changed. They maybe got a little taller, but they didn't get bigger in width or depth. Therefore, we have to fit more into them, and at the core that comes from the technology we're doing here at onsemi.

Daniel Newman:
I want to thank you so much for joining me here at this year’s Six Five Summit. I have a feeling that this is not the last time we’re going to be talking about power when it comes to the growth of the overall data center industry, electrification of vehicles, edge to cloud solutions.
I think in every part the power and performance-per-watt metrics will not go away, but the need for innovation like what you're doing at onsemi, to help deliver on the potential of these technologies, is only going to become more and more important. I can't wait to keep in touch, to keep tracking, following, and providing our analysis of what you're doing. So far we're very impressed, Hassane, and hope you'll come back and join us here again on the Six Five or at next year's Six Five Summit.

Hassane El-Khoury:
Great, thank you.

Daniel Newman:
Everyone out there, thanks so much for tuning into this session. We’ve got so much more for you. I’m going to send it back to the studio.
