Business

AI’s insatiable energy use drives electricity demands

GPUs, used to train large language models and respond to ChatGPT queries, require more energy than the average microchip and give off more heat

A data center in San Jose, California. NYT file photo
A few weeks ago, I joined a small group of reporters for a wide-ranging conversation with Bill Gates about climate change, its causes and potential solutions. When the topic turned to the issue of just how much energy artificial intelligence was using, Gates was surprisingly sanguine.

“Let’s not go overboard on this,” he said during a media briefing on the sidelines of an event he was hosting in London.

AI data centers represent a relatively small additional load on the grid, Gates said. What’s more, he predicted that insights gleaned from AI would deliver gains in efficiency that would more than make up for that additional demand.

In short, Gates said, the stunning rise of AI will not stand in the way of combating climate change. “It’s not like, ‘Oh, no, we can’t do it because we’re addicted to doing chat sessions,’” he said.

That’s an upbeat assessment from a billionaire with a vested interest in the matter. Gates is a big-time climate investor. He is also the former head of Microsoft and remains a major stockholder in the company, which is at the center of the AI revolution.

And while it’s too early to draw a definitive conclusion on the issue, a few things are already clear: AI is having a profound impact on energy demand around the world, it’s often leading to an uptick in planet-warming emissions, and there’s no end in sight.

AI data centers have a big appetite for electricity. The graphics processing units, or GPUs, used to train large language models and respond to ChatGPT queries require more energy than the average microchip and give off more heat.

With more data centers coming online almost every week, projections about how much energy will be required to power the AI boom are soaring.

One peer-reviewed study suggested AI could make up 0.5 per cent of worldwide electricity use by 2027, or roughly what Argentina uses in a year. Analysts at Wells Fargo suggested that US electricity demand could jump 20 per cent by 2030, driven in part by AI.

And Goldman Sachs predicted that data centers would account for 8 per cent of US energy usage in 2030, up from just 3 per cent today.

“It’s truly astronomical potential load growth,” said Ben Inskeep, the program director at Citizens Action Coalition, a consumer watchdog group based in Indiana that is tracking the energy impact of data centers.

Microsoft, Google, Amazon and Meta have all recently announced plans to build new data centers in Indiana, developments that Inskeep said would strain the grid.

“We don’t have enough power to meet the projected needs of data centers over the next five to 10 years,” he said. “We would need a massive build-out of additional resources.”

Tech giants are scrambling to get a grip on their energy usage. For a decade now, those same four companies have been at the forefront of corporate efforts to embrace sustainability.

But in a matter of months, the energy demands from AI have complicated that narrative. Google’s emissions last year were 50 per cent higher than in 2019, largely because of data centers and the rise of AI. Microsoft’s emissions also jumped for the same reasons, up 29 per cent last year from 2020. And Meta’s emissions jumped 66 per cent from 2021 to 2023.

In statements, Google and Microsoft both said that AI would ultimately prove crucial to addressing the climate crisis, and that they were working to reduce their carbon footprints and bring more clean energy online. Amazon pointed to a statement detailing its sustainability efforts.

There are two ways for tech companies to meet the demand: tap the existing grid, or build new power plants. Each poses its own challenges.

In West Virginia, coal-fired power plants that had been scheduled to retire are being kept online to meet the energy needs of new data centers across the border in Virginia.

And across the country, utilities are building new natural-gas infrastructure to support data centers. Goldman Sachs anticipates that “incremental data center power consumption in the US will drive around 3.3 billion cubic feet per day of new natural gas demand by 2030, which will require new pipeline capacity to be built.”

At the same time, the tech giants are working to secure a lot more power to fuel the growth of AI.

Microsoft is working on a $10 billion plan to develop renewable energy to power data centers. Amazon has said it used 100 per cent clean energy last year, though experts have questioned whether the company’s accounting was too lenient.

All that new low-carbon power is great. But because the tech companies themselves are consuming that electricity to power new AI data centers, pushing up overall demand, it is not making the grid any cleaner.

The energy demands from AI are only getting more intense. Microsoft and OpenAI are reportedly planning to build a $100 billion data center. Initial reporting suggests it may require 5 gigawatts of power, roughly the output of five nuclear reactors.

And at the same time that companies are building more data centers, many of the chips at the heart of the AI revolution are getting more and more power hungry. Nvidia, the leader in AI chips, recently unveiled new products that draw significantly more power from the grid.

The AI boom is generating big profits for some companies. And it may yet deliver breakthroughs that help reduce emissions. But, at least for now, data centers are doing more harm than good for the climate.

“It’s definitely very concerning as we’re trying to transition our current grid to renewable energy,” Inskeep said. “Adding a massive amount of new load on top of that poses a grave threat to that transition.” — The New York Times
