Concerns about the rising energy consumption of artificial intelligence (AI) have intensified, as a new analysis reveals that the latest reasoning models demand significantly more energy than their predecessors. This trend raises alarms that the energy requirements and carbon footprint associated with AI could escalate more rapidly than previously anticipated.
As AI tools become woven into daily life, the electricity required to run them has become a focal point of concern. Early worries centered on the substantial energy cost of training large models, but most of the demand now comes from inference: answering user queries.
Research conducted by teams from Hugging Face and Salesforce highlights that the newest generation of AI models, which process information step by step before delivering answers, are notably more power-intensive. The study indicates that certain models consume up to 700 times more energy when their reasoning functionality is activated.
“We should be smarter about the way that we use AI,” stated Sasha Luccioni, a research scientist at Hugging Face and co-lead of the project, in comments to Bloomberg. “Choosing the right model for the right task is important.”
This analysis is part of the AI Energy Score project, which seeks to establish standardized metrics for evaluating AI energy efficiency. Each model undergoes testing across ten tasks using custom datasets and the latest generation of graphics processing units (GPUs). Researchers assess the number of watt-hours consumed to respond to 1,000 queries.
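The article does not detail the project's measurement harness, but the metric itself is simple: total energy drawn while serving a batch of queries, scaled to 1,000. Below is a minimal Python sketch of how that could be measured on a single GPU; the pynvml power-sampling approach and the `generate(prompt)` callable standing in for a model call are illustrative assumptions, not the AI Energy Score tooling.

```python
import threading
import time
import pynvml  # NVIDIA Management Library bindings for reading GPU power draw


def watt_hours_per_1000_queries(prompts, generate, gpu_index=0, interval_s=0.1):
    """Sample GPU power while running the queries, then scale the total to 1,000 queries.

    `generate` is a hypothetical callable that runs one model query; this sketch
    ignores CPU/RAM energy and multi-GPU setups.
    """
    pynvml.nvmlInit()
    handle = pynvml.nvmlDeviceGetHandleByIndex(gpu_index)
    joules = 0.0
    running = True

    def sampler():
        nonlocal joules
        while running:
            watts = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000.0  # mW -> W
            joules += watts * interval_s
            time.sleep(interval_s)

    thread = threading.Thread(target=sampler)
    thread.start()
    for prompt in prompts:
        generate(prompt)  # one model response per query
    running = False
    thread.join()
    pynvml.nvmlShutdown()

    watt_hours = joules / 3600.0  # 1 Wh = 3,600 J
    return watt_hours * (1000.0 / len(prompts))
```

Normalizing to 1,000 queries keeps scores comparable across models even when the test sets behind each task contain different numbers of prompts.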
Models receive a star rating from one to five, similar to energy efficiency labels found on consumer products in various countries. However, this benchmarking process is limited to open or partially open models, leaving many leading closed models from major AI labs untested.
In the latest update to the project’s leaderboard, researchers examined reasoning models for the first time. They discovered that, on average, these models consume 30 times more energy than those without reasoning capabilities or with their reasoning modes disabled, with some models demonstrating energy usage hundreds of times greater.
The increased energy consumption stems from the nature of AI reasoning. These models are text generators, and every token of text they produce consumes energy. Rather than jumping straight to an answer, reasoning models “think aloud”, generating a long internal chain of text as they work through a problem. That extra text can swell the total output dramatically, and energy use rises with it.
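As a back-of-the-envelope illustration of that scaling, the sketch below treats a query's energy as roughly proportional to the number of tokens generated, including the hidden reasoning trace. The per-token figure and token counts are invented for illustration and are not drawn from the study.

```python
# Illustrative assumption: average energy per generated token, in watt-hours.
ENERGY_PER_TOKEN_WH = 0.0003


def query_energy_wh(answer_tokens, reasoning_tokens=0):
    """Energy scales with everything the model generates, not just the visible answer."""
    return (answer_tokens + reasoning_tokens) * ENERGY_PER_TOKEN_WH


concise = query_energy_wh(answer_tokens=200)
reasoning = query_energy_wh(answer_tokens=200, reasoning_tokens=6000)
print(f"concise: {concise:.2f} Wh, reasoning: {reasoning:.2f} Wh "
      f"({reasoning / concise:.0f}x more)")
```

With these made-up numbers the reasoning query comes out roughly 30 times more expensive, in the same ballpark as the average gap the researchers report, though real figures vary widely from model to model.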
Predicting which models will be most prone to this energy-intensive behavior is not straightforward. Model size was traditionally the main indicator of energy use, but for reasoning models, how verbose they are while reasoning is often a stronger predictor. That unpredictability is part of why standardized benchmarks for energy efficiency matter.
Previous studies, including one published in June in Frontiers in Communication, have similarly found that reasoning models can produce up to 50 times the CO₂ emissions of models designed for concise responses. Despite their inefficiency, reasoning models offer heightened capabilities, which complicates the choice for users.
“Currently, we see a clear accuracy-sustainability trade-off inherent in LLM technologies,” remarked Maximilian Dauner, a researcher at Hochschule München University of Applied Sciences, who led the earlier study. “None of the models that kept emissions below 500 grams of CO₂ equivalent achieved higher than 80 percent accuracy on answering the 1,000 questions correctly.”
As the understanding of energy impacts associated with reasoning models continues to evolve, persuading users to forgo these powerful tools may prove challenging. The balance between enhanced capabilities and sustainability remains a pressing issue in the ongoing discourse surrounding the future of AI technology.