A.I. Could Soon Need as Much Electricity as an Entire Country

By The New York Times
Published on October 10, 2023

OpenAI’s ChatGPT exploded onto the scene nearly a year ago, reaching an estimated 100 million users in two months and setting off an A.I. boom. Behind the scenes, the technology relies on thousands of specialized computer chips. And in the coming years, they could consume immense amounts of electricity.
A peer-reviewed analysis published Tuesday lays out some early estimates. In a middle-ground scenario, by 2027 A.I. servers could use between 85 and 134 terawatt-hours (TWh) annually. That’s similar to what Argentina, the Netherlands and Sweden each use in a year, and is about 0.5 percent of the world’s current electricity use.
“We don’t have to completely blow this out of proportion,” said Alex de Vries, the data scientist who did the analysis. “But at the same time, the numbers that I write down — they are not small.”
The electricity needed to run A.I. could boost the world’s carbon emissions, depending on whether the data centers get their power from fossil fuels or renewable sources.