ChatGPT is a colossal power hog
A team at BestBrokers has analyzed the energy consumption involved in using ChatGPT. It makes for alarming reading (in English).
OpenAI just announced the new GPT-5.2 model, its most advanced artificial intelligence model to date, which is said to improve general intelligence, coding, and long-context understanding.
More advanced AI models, however, also come at a higher cost, with the electricity consumption of ChatGPT, in particular, going through the roof.
To put this into context, BestBrokers has completed a comprehensive analysis, revealing that ChatGPT’s power needs for answering user questions alone have reached a massive 17 TWh a year.
As AI systems scale at breakneck speed, their energy appetite is ballooning just as quickly, straining power grids, pushing up carbon emissions, and raising uncomfortable questions about the environmental cost of intelligence on demand.
ChatGPT is a prime example: each query is estimated to consume 18.9 watt-hours, more than 50 times the energy used by a standard Google search (0.3 Wh).
To illustrate the real scale of this, the team at BestBrokers calculated the model’s total electricity consumption over a full year of responding to user prompts and worked out what it would cost (using the average U.S. commercial electricity rate of $0.141 per kWh as of September).
The math paints a stark picture: ChatGPT’s annual energy use amounts to 17.23 terawatt-hours (TWh), comparable to the annual consumption of a relatively small nation such as Puerto Rico or Slovenia.
The annual electricity needed to answer user queries would be enough to power New York City for 113 days, or nearly 4 months. At the latest commercial electricity prices, that translates into an estimated $2.42 billion in annual power costs, solely to keep the model answering questions.
- The newest iterations of ChatGPT and other frontier models are delivering remarkable gains in reasoning and generative accuracy, but their energy demands are becoming impossible to ignore, says Alan Goldberg, data analyst and author at BestBrokers, and he continues:
- Training runs for state-of-the-art systems now require tens of gigawatt-hours, and inference isn’t cheap either: each query from the most advanced models can use an order of magnitude more power than older architectures. Efficiency improvements are real, but they’re being outpaced by surging global usage and escalating model size.
- What’s especially striking is how little transparency surrounds the true electricity footprint of these systems. Without clearer reporting and more aggressive optimisation standards, AI’s rapid progress risks creating infrastructure and environmental pressures that the industry is still reluctant to fully acknowledge. Tech’s next breakthrough may need to be efficiency itself.
ChatGPT in perspective
ChatGPT’s annual energy needed to answer user prompts (17.23 TWh) could supply electricity to these nations for:
• China: 15 hours
• United States of America: 1 day and 10 hours
• India: 3 days and 2 hours
• Russian Federation: 5 days and 6 hours
• Japan: 6 days and 4 hours
• Brazil: 8 days and 6 hours
• South Korea: 10 days and 1 hour
• Canada: 10 days and 2 hours
• Germany: 12 days and 9 hours
• France: 13 days and 11 hours
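These durations can be sanity-checked by inverting them: dividing a full year by each duration and scaling the 17.23 TWh total accordingly recovers the annual national consumption each line implies. A minimal sketch, using four countries from the list above as a sample:

```python
# Annual ChatGPT energy for answering prompts, per the article (TWh).
CHATGPT_TWH = 17.23
HOURS_PER_YEAR = 365 * 24

# Hours of national electricity supply that 17.23 TWh covers,
# taken from the list above (days converted to hours).
hours_supplied = {
    "China": 15,
    "United States": 34,   # 1 day 10 hours
    "India": 74,           # 3 days 2 hours
    "Germany": 297,        # 12 days 9 hours
}

# Inverting the comparison gives each country's implied annual demand.
implied = {c: CHATGPT_TWH * HOURS_PER_YEAR / h for c, h in hours_supplied.items()}
for country, twh in implied.items():
    print(f"{country}: ~{twh:,.0f} TWh/year")
```

The implied figures (roughly 10,000 TWh for China and around 4,400 TWh for the U.S.) are broadly in line with published national consumption statistics, suggesting the list is internally consistent.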
18.9 Wh per query
ChatGPT uses around 18.9 Wh (0.0189 kWh) per query, according to recent research by the University of Rhode Island’s AI lab. With 810 million active weekly users asking an average of 22 questions each week, it ends up consuming about 17.228 billion kWh annually.
At the average U.S. commercial electricity rate as of September 2025, that adds up to a whopping $2.42 billion in energy costs.
On a daily basis, this translates to more than 2.5 billion requests, consuming over 47.2 million kilowatt-hours of energy. With the average U.S. and Western European household consuming around 29 kWh per day, the energy ChatGPT needs every year could power all households in the U.S. for more than four and a half days.
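The annual total can be reproduced from the inputs the article quotes, using the 18.9 Wh per-query figure from the opening section along with the user and pricing numbers above:

```python
# Figures quoted in the article; treat them as BestBrokers' estimates.
WH_PER_QUERY = 18.9        # watt-hours per ChatGPT query
WEEKLY_USERS = 810e6       # active weekly users
QUERIES_PER_WEEK = 22      # average questions per user per week
RATE_USD_PER_KWH = 0.141   # avg. U.S. commercial rate, September 2025

queries_per_year = WEEKLY_USERS * QUERIES_PER_WEEK * 52
kwh_per_year = queries_per_year * WH_PER_QUERY / 1000
twh_per_year = kwh_per_year / 1e9
annual_cost = kwh_per_year * RATE_USD_PER_KWH

daily_queries = queries_per_year / 365
daily_kwh = kwh_per_year / 365

print(f"{twh_per_year:.2f} TWh/year")   # ≈ 17.5 TWh; the quoted 17.23 TWh reflects rounding upstream
print(f"${annual_cost / 1e9:.2f}B/year in electricity")  # ≈ $2.4-2.5B, the same order as the quoted $2.42B
print(f"{daily_queries / 1e9:.2f}B queries/day, {daily_kwh / 1e6:.1f}M kWh/day")
```

The small gap between this back-of-the-envelope figure and the article’s 17.23 TWh comes from rounding in the published inputs; the conclusion is unchanged.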
For further context, the energy ChatGPT consumes in a year could fully charge about 238 million electric vehicles, each with an average battery capacity of 72.4 kWh. With an estimated 6.5 million EVs on U.S. roads as of mid-2025, the annual electricity for answering prompts could fully charge all of these vehicles at least 36 times over.
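The EV comparison follows directly from the same annual total; a quick check using the battery and fleet figures the article cites:

```python
CHATGPT_KWH = 17.228e9    # annual kWh for answering prompts (article figure)
EV_BATTERY_KWH = 72.4     # average battery capacity assumed in the article
US_EV_FLEET = 6.5e6       # estimated EVs on U.S. roads, mid-2025

full_charges = CHATGPT_KWH / EV_BATTERY_KWH
fleet_charges = full_charges / US_EV_FLEET

print(f"~{full_charges / 1e6:.0f} million full charges")      # ≈ 238 million
print(f"every U.S. EV charged ~{fleet_charges:.1f} times")    # ≈ 36.6, i.e. "at least 36"
```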
Read the full release from BestBrokers here