
MaRS Climate Impact: Power demands and the AI industry

How tech needs are reshaping industries and impacting climate goals

There’s a clear competitive opening in AI, albeit one nearly out of reach for smaller players, to feed a seemingly insatiable demand. Since ChatGPT announced itself two years ago and made generative AI a ubiquitous application (with astounding speed) and a household word, it’s become increasingly clear the AI race is all about power: processing power and electrical power, with demand at scales and costs once considered unimaginable, and unmeetable.

Doomsayers continue to forecast the imminent collapse of the AI industry, and of ChatGPT in particular, but the market is not backing away. Customer growth in this almost impossibly complex environment at companies like ServiceNow and Alexi continues to defy precedent and expectations. What does this exploding demand mean for emissions targets?

The market is already moving to meet demand, readily provisioning options of unknown impact, not only on other markets and industries but on the global climate. If emissions targets, and specifically net zero by 2050, have any hope of being met, a hope that is increasingly remote, nuclear will have to play an increasingly significant role in powering AI’s inevitable growth.

Long-held concerns about nuclear power seem to be dissolving in the face of AI demand. Constellation Energy just announced the once nearly unthinkable: the possible refurbishing of Three Mile Island for Microsoft’s energy needs. “Small reactors,” believed to be less risky than conventional plants, are underway at Google and Meta, among others. “The U.S. will need up to 900 GW of new clean, firm power generation capacity to reach net-zero emissions by 2050,” and “nuclear power is a proven option that could be deployed to meet this growing demand,” the Department of Energy said in October. (Applications are due in January for up to $900 million in funding.)

Why? From a recent TechTarget report on big tech’s forecasted use of nuclear power:

“Data centers are already consuming vast amounts of power, roughly 2% to 3% of the total U.S. power consumption, and are estimated to reach 9% by 2030, according to the Electric Power Research Institute’s study. The country’s aging power grid struggles in certain instances to meet demand now.

“Despite this, chipmakers continue to increase their power usage, because in computing, power equals performance. Recently, Elon Musk built an AI cluster of 100,000 Nvidia graphics processing units, each drawing 1,000 watts of power. That’s 100 megawatts (MW) in all, enough to power a small city. The data center, based in Memphis, Tenn., has a capacity of 150 MW … expect further strain on the nation’s power infrastructure as AI advances. A Goldman Sachs report calculated a 160% increase in data center power consumption by 2030, noting some large data centers already consume 100 MW, the equivalent of powering 80,000 homes for an hour.”
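The arithmetic behind those figures is easy to check. Here is a minimal sketch using only the numbers quoted in the report (100,000 GPUs at roughly 1,000 W each), plus an assumed average household draw of about 1.2 kW; the household figure is our assumption, not the report’s:

```python
# Sanity check of the cluster figures quoted above. The household figure is
# an assumption (~1.2 kW average continuous draw, i.e. ~10,500 kWh per year).

GPUS = 100_000          # cluster size cited in the report
WATTS_PER_GPU = 1_000   # ~1,000 W per GPU, as cited
HOUSEHOLD_KW = 1.2      # assumed average draw per home

cluster_mw = GPUS * WATTS_PER_GPU / 1_000_000
homes_equivalent = cluster_mw * 1_000 / HOUSEHOLD_KW

print(f"Cluster draw: {cluster_mw:.0f} MW")                          # 100 MW
print(f"Equivalent average load of ~{homes_equivalent:,.0f} homes")  # ~83,000
```

At that assumed household load, 100 MW works out to roughly 83,000 homes’ worth of average demand, in line with the 80,000-home figure in the report.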

We aren’t going to get to nuclear quickly, and AI processing does not become more efficient with scale: energy demand still grows linearly with compute. Demand will skyrocket; it already is. Nuclear will take time to catch up, so gas is seen as the interim stopgap, one in which Canada has just become a dominant player with Enbridge’s recent purchase of three American natural gas companies, discussed by Enbridge Gas President Michele Harradence in remarks to the Empire Club in Toronto.


Against this backdrop, the MaRS Climate Impact conference in Toronto brought together thought leaders to discuss AI’s growing energy footprint and its implications for the planet. European cloud provider OVHcloud was a panel participant. OVHcloud now has more than 400,000 servers spread across over 43 data centers on 4 continents, and is a global player and Europe’s leading cloud service provider, with more than 1.6 million customers. I spoke with Germain Masse, Product Marketing Manager, Artificial Intelligence & Data, and Katya Guez, Startup Program Manager, at OVHcloud; part one of our discussion follows.

JE: When blockchain and crypto first came into prominence, there was a lot of talk about just how much energy they consumed. Hashing, solving and registering were seen as huge energy consumers. What’s the comparison to AI?

GM: It’s a good comparison. I would say that at the beginning, the hype around crypto mining seemed a bit similar to AI. But in the end, the main issue with crypto didn’t really materialize, to be honest; it didn’t demand as much power as feared. Seeing Bitcoin reach a new level, we may need to ask this question again, but it depends; it’s very tied to the use case. And I don’t want to say that blockchain is only used for crypto mining, for cryptocurrency; it can be useful for other use cases. But compared to AI, it’s not so huge. It’s not so big.

What is interesting, and also frightening, is the fact that AI is, I think, here for a long time; it’s not just hype. It will come down a bit in the following months or year, to be honest, but it will stay for a long time. It means that the way we were doing software in the past, with code and so on, will change for a long time, and we will include AI within our devices, within our software, more and more. We are just at the beginning. We only have one main supplier of chips, which is Nvidia. We are questioning a lot of things. Big tech companies are planning for a huge amount of power. It’s so huge in Europe, where we know the market very well and have a very big footprint, but we don’t have the energy available, the resources, to meet the forecasted need, even with nuclear or hydro or whatever.

I’m pretty sure that we will see very impressive increases in the efficiency of the chips and of the AI models themselves. We’ve seen huge improvement over two years, even from one year ago. You needed a very big model trained with hundreds of billions of parameters, and now you have the same or even better results with a model ten times smaller, and a smaller model requires fewer resources and consumes less electricity. So we are at the beginning. So yes, we are not going in the right direction for the climate, to be honest, and that’s a big issue, but I think it’s not as bad as it appears, and it is improving.
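That ten-times-smaller point can be made concrete with a common rule of thumb: a dense transformer’s forward pass costs about two FLOPs per parameter per generated token, so inference energy scales roughly linearly with model size. A minimal sketch, where the model sizes and the hardware-efficiency figure are illustrative assumptions, not OVHcloud measurements:

```python
# Rough model: inference FLOPs per token ~ 2 * parameter count (dense model).
# The accelerator efficiency below is an assumed round number, not a benchmark.

FLOPS_PER_JOULE = 2e11  # assumed effective efficiency of a modern GPU

def joules_per_token(params: float) -> float:
    """Approximate inference energy per generated token."""
    return 2 * params / FLOPS_PER_JOULE

large, small = 175e9, 17.5e9  # a large model vs. one ten times smaller
for name, p in [("large", large), ("small", small)]:
    print(f"{name}: {joules_per_token(p):.3f} J/token")
# Energy per token drops ~10x with the smaller model, all else being equal.
```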

JE: Sure. It seems too that you were talking about what were essentially point-in-time, very hot technologies of their day, like cloud versus on-premise, blockchain, et cetera. AI is an all-encompassing technology, and it’s evolving very rapidly as well. So right now we’re really talking about the power consumption of generative AI. What do you anticipate as AI evolves? Is it the agentic model? I don’t even want to breathe AGI, because I think it’s so far out, and I might be wrong about that. But how do you forecast the dimensions of AI and what they’re going to look like in three to five years? And how do you factor that into your planning?

GM: It’s too early to know exactly what will happen with generative AI. We are currently wondering if, for example, the AI market will be limited to four, five or six giant companies building very, very big multimodal models, I mean models able to handle speech, to recognize images and so on, and potentially to generate video. To be honest, that would not be so good for OVHcloud, but more globally for IT, because when you reduce the competitive landscape, it’s often not a good way to do things. There is another idea where, like we did in software with open source, we could have smaller models that people can fine-tune and improve, or aggregate smaller models themselves, and build new models and software based on those bricks. That way would be better, because smaller, more specialized models are more efficient. It gives power to many different people and helps with collaboration, with finding new ways of doing things.

So if we go in the direction of having plenty of models, we could imagine that companies will start to train their own models, or fine-tune their own models. People will create new models and new models and new models, and that could also be a problem, because in terms of environmental impact, training is huge. If you are not making heavy use of a pre-trained model, you will have spent a lot of energy training it and not much using it. For usage versus training, I can’t give a percentage, because it depends on the usage. For example, training a GPT-4 model, one of the latest from OpenAI, requires tens of thousands of GPUs, each consuming almost one kilowatt, running for weeks, potentially months. It’s a huge amount of energy. But some scientists have concluded that because OpenAI’s models are so massively used, the energy needed to train the model ends up negligible compared to the usage. The opposite is what we see right now in open source: people are training or fine-tuning and so on, but the models are not really used, so the issue is on the training side. We have an issue in both cases, and I cannot say which one is better right now. And obviously, as you said, growth changes the dynamics quite a bit as well.
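That training-versus-usage trade-off lends itself to a back-of-the-envelope calculation. Every figure here is an illustrative assumption (cluster size, run duration, per-query inference energy), not OpenAI’s actual numbers:

```python
# When does cumulative inference energy overtake training energy?
# Every constant below is an illustrative assumption.

TRAIN_GPUS = 20_000    # "tens of thousands" of GPUs, per the interview
GPU_POWER_KW = 1.0     # almost one kilowatt each, per the interview
TRAIN_DAYS = 90        # "weeks, potentially months"
WH_PER_QUERY = 3.0     # assumed energy per inference query, in watt-hours

training_kwh = TRAIN_GPUS * GPU_POWER_KW * TRAIN_DAYS * 24
breakeven_queries = training_kwh * 1_000 / WH_PER_QUERY

print(f"Training energy: {training_kwh:,.0f} kWh")         # ~43 million kWh
print(f"Break-even at ~{breakeven_queries:,.0f} queries")  # ~14 billion
```

Under these assumptions, it takes on the order of ten billion queries before inference energy overtakes training energy: a massively used model amortizes its training cost to almost nothing per query, while a fine-tuned model served to a handful of users never reaches break-even and its footprint stays dominated by training.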

JE: I want to come back to a couple of things you said on the panel, and just now, actually: the idea of competition versus collaboration. Is competition inefficient now, in that it would be better to be working with others on a few big models? And I’m talking about better for the climate here, versus everybody trying to build their own things. To what extent do you think efficiencies could be gained if it were more collaborative? I’m not really talking about nuclear, because that’s at such a scale it has to be collaborative, but maybe we’re getting there too. What are your thoughts on that?

GM: I definitely think there is a lot of collaboration at the scientific level right now; all the papers are coming out. Google invented the way we currently train LLMs, and everybody is using their research. So there is a lot of collaboration in the scientific space. What we don’t have right now is collaboration, or competition, in the hardware: we don’t have a lot of major players. In the chip and cloud industries there are a few very big actors, and that’s very difficult; it requires a huge amount of money and very deep technical skills to create chips, or even to create these models. I often mention open source. When we did open source software, we were using developers’ brain time: they could decide to contribute to a project in their free time, or potentially even on corporate time, with the cooperation of the company, because it was improving the software. That’s only time. In the case of AI, you need money: money to train, to use the GPUs, to buy the GPUs, and to pay the electricity bill at the end of the month. That’s a very big issue, so we need to find a way, collaboration slash competition. Honestly, I believe that competition is also good. The situation right now with Nvidia, for example, is not satisfying, and we haven’t solved the issue of the cost of training. So perhaps at some point, and this is already the case in some countries, governments should make resources or infrastructure available for researchers and people building open source models, in order to help them improve and collaborate on AI models.

Germain Masse, OVHcloud

Part Two on pricing models and nuclear impact to come next week

Jennifer Evans, http://www.b2bnn.com
principal, @patternpulseai. author, THE CEO GUIDE TO INDUSTRY AI. former chair @technationCA, founder @b2bnewsnetwork #basicincome activist. Machine learning since 2009.