
Can the climate survive the insatiable energy demands of the AI arms race?

The boom in artificial intelligence has pushed share prices of major technology companies to new highs, but at the expense of the sector’s climate ambitions.

Google admitted on Tuesday that the technology threatens its environmental goals after revealing that data centers, a core part of AI infrastructure, have helped increase its greenhouse gas emissions by 48% since 2019. Google said the “significant uncertainties” around achieving its goal of net-zero emissions by 2030 – reducing the total amount of carbon emissions for which the company is responsible to zero – include “uncertainty about the future environmental impacts of AI, which are complex and difficult to predict.”

Can the industry rein in AI’s environmental costs, or will it press on regardless because the prize for superiority is so large?


Why does AI pose a threat to the environmental goals of technology companies?

Data centers are a central component for training and running AI models such as Google’s Gemini or OpenAI’s GPT-4. They house the servers that process the vast amounts of data underpinning AI systems, and those servers require large amounts of electricity to operate, which, depending on the energy source, releases CO2. There is also “embedded” CO2 from manufacturing and transporting the necessary equipment.

According to the International Energy Agency, total electricity consumption by data centers could double from 2022 levels to reach 1,000 TWh (terawatt hours) in 2026 – equivalent to the energy needs of Japan. Research firm SemiAnalysis expects data centers to consume 4.5% of global energy production by 2030, driven by artificial intelligence. Water consumption also plays a significant role: according to one study, AI could account for up to 6.6 billion cubic meters of water withdrawal by 2027 – almost two-thirds of England’s annual water consumption.
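The comparisons above can be sanity-checked with back-of-envelope arithmetic. The sketch below uses only the figures quoted in this article; the implied baselines it derives (the 2022 electricity figure, England’s annual water use) are inferences from those figures, not independently sourced numbers.

```python
# Back-of-envelope check of the data-centre figures quoted above.

TWH_2026 = 1000            # projected data-centre electricity demand, TWh (IEA)
TWH_2022 = TWH_2026 / 2    # "could double from 2022 levels" implies ~500 TWh

AI_WATER_BN_M3 = 6.6       # projected AI water withdrawal by 2027, billion m^3
# "almost two-thirds of England's annual water consumption"
england_water_bn_m3 = AI_WATER_BN_M3 / (2 / 3)

print(f"Implied 2022 baseline: ~{TWH_2022:.0f} TWh")
print(f"Implied England annual water use: ~{england_water_bn_m3:.1f} bn m^3")
```

Running this gives an implied 2022 baseline of roughly 500 TWh and an implied English water consumption of roughly 10 billion cubic meters a year, both plausible orders of magnitude for the claims quoted.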


What do experts say about the impact on the environment?

A recent UK government-backed report on AI safety says the carbon intensity of the energy source used by tech companies is “a key variable” in calculating the environmental cost of the technology. However, it also says a “significant proportion” of training AI models still relies on fossil fuels.

In fact, technology companies are scrambling to secure renewable energy contracts to meet their environmental goals. Amazon, for example, is the world’s largest corporate buyer of renewable energy. But some experts argue that this is pushing other energy users toward fossil fuels because there isn’t enough clean energy for everyone.

“Not only is energy consumption growing, but Google is also struggling to meet this increased demand from sustainable energy sources,” says Alex de Vries, founder of Digiconomist, a website that monitors the environmental impact of new technologies.


Is there enough renewable energy for everyone?

To reduce fossil fuel consumption in line with climate goals, governments around the world aim to triple renewable energy resources by the end of the decade. But the ambitious pledge, agreed at last year’s climate talks at COP28, is already being challenged. Experts fear that the sharp increase in energy demands of AI data centers could make it even more unattainable.

The IEA, the global energy watchdog, has warned that under current government plans the world will fall short of tripling renewable capacity by 2030, even though global renewable energy capacity grew at its fastest rate in 20 years in 2023.

The answer to AI’s hunger for energy may be for technology companies to invest more in building new renewable energy projects to meet their growing electricity needs.


How quickly can we build new renewable energy projects?

Onshore renewable energy projects such as wind and solar farms can be built relatively quickly – development can take less than six months. However, slow planning regulations in many developed countries and a global backlog in grid connections can add years to the process. Offshore wind farms and hydroelectric plants face similar hurdles, on top of construction times of two to five years.

This has raised doubts about whether renewable energy can keep pace with the proliferation of AI. According to the Wall Street Journal, major tech companies have already tapped a third of U.S. nuclear power plants to provide low-carbon electricity to their data centers. But without investing in new energy sources, these deals would deprive other consumers of low-carbon electricity, leading to more fossil fuel use to meet overall demand.


Will AI’s power consumption continue to rise forever?

Normally, the law of supply and demand would suggest that as AI uses more electricity, energy costs will rise, forcing the industry to conserve. But the unique nature of the industry means that the world’s largest companies may instead choose to absorb the spike in electricity prices, burning billions of dollars in the process.

The largest and most expensive data centers in the AI sector are those used to train “frontier AI,” systems like GPT-4o and Claude 3.5 that are more powerful and capable than any other. The market leader has changed over the years, but OpenAI is generally at the forefront, battling for position with Anthropic, the maker of Claude, and Google’s Gemini.

Competition for the top spot is widely seen as winner-takes-all, and there is little to stop customers from switching to the latest market leader. So if one company spends $100 million on a training run for a new AI system, its competitors must decide whether to spend even more themselves or drop out of the race altogether.

Worse still, in the race to create so-called “AGI” – artificial general intelligence that can do anything a human can – it might be worth spending hundreds of billions of dollars on a single training run, if it gives your company a monopoly on a technology that, as OpenAI says, could “take humanity to a higher level.”


Won’t AI companies learn to use less electricity?

Every month, there are new breakthroughs in AI technology that allow companies to do more with less. In March 2022, for example, a DeepMind project called Chinchilla showed researchers how to train groundbreaking AI models with radically less computing power by changing the ratio between the amount of training data and the size of the resulting model.

However, this did not result in the same AI systems using less electricity; rather, the same amount of electricity was used to develop even better AI systems. In economics, this phenomenon is known as the “Jevons paradox,” named after the economist who observed that James Watt’s improvement of the steam engine, which allowed for significantly less coal consumption, instead led to a huge increase in the amount of fossil fuels burned in England. When the price of steam power plummeted after Watt’s invention, new uses were discovered that would not have been worthwhile if steam power had remained expensive.
