OpenAI has unveiled its latest artificial intelligence model, GPT-5.2, which the company claims represents a significant leap forward in general intelligence, coding capability, and long-context understanding. In a notable move, OpenAI also appointed George Osborne, the UK's former Chancellor of the Exchequer, as its managing director, a decision that has drawn attention given his limited environmental credentials.
However, these advancements come at a considerable cost. ChatGPT's power requirements in particular have surged dramatically: the model's energy consumption for responding to user queries is now estimated at an astonishing 17 terawatt-hours (TWh) annually. That figure is comparable to the yearly electricity consumption of small countries such as Puerto Rico and Slovenia, underscoring growing concern over the energy demands of AI systems.
This annual electricity requirement would be sufficient to power New York City for over 113 days, or the entire United Kingdom for about 20 days. The power-hungry data centers housing these models add to the problem, carrying a substantial carbon footprint of their own. As AI technologies rapidly evolve, their energy demands are growing in step, straining power grids and driving up carbon emissions. Each ChatGPT query consumes approximately 18.9 watt-hours, roughly 60 times the 0.3 watt-hours of a typical Google search.
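These comparisons can be sanity-checked from the article's own figures. The minimal Python sketch below computes the query-level ratio and inverts the city and country claims to show what baseline consumption they imply; the 17.228 billion kWh annual total is taken from the analysis cited later in this piece, and the implied baselines are back-of-envelope results, not independently sourced data.

```python
# Back-of-envelope checks of the per-query and city/country comparisons,
# using only figures quoted in the article.

ANNUAL_KWH = 17.228e9       # ChatGPT's estimated annual consumption (~17 TWh)
WH_PER_QUERY = 18.9         # per ChatGPT query
WH_PER_GOOGLE_SEARCH = 0.3  # per typical Google search

# Ratio of a ChatGPT query to a Google search (~63x).
print(f"ChatGPT query vs Google search: {WH_PER_QUERY / WH_PER_GOOGLE_SEARCH:.0f}x")

# Invert the "powers NYC for 113 days" claim to see what annual NYC
# consumption it implies (~56 TWh, roughly in line with published figures).
implied_nyc_twh = ANNUAL_KWH / 113 * 365 / 1e9
print(f"Implied NYC annual consumption: {implied_nyc_twh:.1f} TWh")

# Same inversion for the "UK for about 20 days" claim (~314 TWh).
implied_uk_twh = ANNUAL_KWH / 20 * 365 / 1e9
print(f"Implied UK annual consumption: {implied_uk_twh:.0f} TWh")
```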
Recent analysis by BestBrokers estimates the total annual electricity cost of operating ChatGPT at $2.42 billion, based on the average US commercial electricity rate of $0.141 per kilowatt-hour as of September. That staggering figure reflects the cost of serving user interactions alone.
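The cost figure follows directly from multiplying the annual consumption by the quoted rate, as this minimal sketch shows (the product lands at about $2.43 billion, matching the reported $2.42 billion to rounding):

```python
# Annual electricity cost for ChatGPT at the quoted US commercial rate.
ANNUAL_KWH = 17.228e9   # annual consumption cited in the article (~17 TWh)
USD_PER_KWH = 0.141     # average US commercial rate as of September

annual_cost = ANNUAL_KWH * USD_PER_KWH
print(f"Estimated annual electricity cost: ${annual_cost / 1e9:.2f}B")  # ~$2.43B
```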
To put this into perspective, the electricity consumed by ChatGPT in a year could power several major nations for short periods: China for 15 hours, the United States for 34 hours, and India for just over three days. Researchers from the University of Rhode Island's AI lab have corroborated that ChatGPT consumes about 0.0189 kWh (18.9 Wh) per query. With approximately 810 million active users making an average of 22 queries per week, this translates into an annual energy demand of about 17.228 billion kWh.
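The annual total is straightforward multiplication of the usage figures above. A minimal sketch of that arithmetic follows; note that with the rounded 18.9 Wh per-query figure the product comes out near 17.5 billion kWh, so the article's 17.228 billion kWh implies a slightly lower per-query value of roughly 18.6 Wh.

```python
# Reconstructing the annual energy estimate from the cited usage figures.
ACTIVE_USERS = 810e6            # approximate active users
QUERIES_PER_USER_PER_WEEK = 22  # average queries per user per week
KWH_PER_QUERY = 0.0189          # 18.9 Wh per query
WEEKS_PER_YEAR = 52

queries_per_year = ACTIVE_USERS * QUERIES_PER_USER_PER_WEEK * WEEKS_PER_YEAR
annual_kwh = queries_per_year * KWH_PER_QUERY

print(f"Queries per year: {queries_per_year:.3e}")             # ~9.27e11
print(f"Annual energy: {annual_kwh / 1e9:.2f} billion kWh")    # ~17.51
```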
On a daily basis, ChatGPT processes more than 2.5 billion requests, consuming roughly 47 million kilowatt-hours of energy. To contextualize this, its annual consumption could theoretically power every household in the United States for more than four and a half days. It could also fully charge around 238 million electric vehicles, assuming an average battery capacity of 72.4 kWh. With approximately 6.5 million electric vehicles on US roads as of mid-2025, ChatGPT's annual energy demand could recharge the entire fleet at least 36 times.
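The electric-vehicle comparison works the same way; here is a short sketch under the article's stated assumptions about battery capacity and fleet size:

```python
# How many full EV charges ChatGPT's annual consumption represents.
ANNUAL_KWH = 17.228e9   # ChatGPT's estimated annual consumption
EV_BATTERY_KWH = 72.4   # average EV battery capacity cited above
US_EV_FLEET = 6.5e6     # EVs on US roads as of mid-2025

full_charges = ANNUAL_KWH / EV_BATTERY_KWH
print(f"Full charges: {full_charges / 1e6:.0f} million")       # ~238 million
print(f"Charges per US EV: {full_charges / US_EV_FLEET:.1f}")  # ~36.6
```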
As demand for AI technologies continues to grow, the environmental implications of such energy-intensive systems raise pressing questions about sustainability and responsible development. The rapid scaling of AI capabilities must be balanced against a commitment to reducing carbon footprints and accounting for broader environmental effects. OpenAI's advancements may represent a leap forward in intelligence, but the energy cost of sustaining that intelligence highlights a vital area of attention as the industry evolves.