In an era where artificial intelligence (AI) is rapidly evolving, scholars at Vanderbilt University are challenging the prevailing notion that energy disclosures could hinder its development. In a forthcoming article, Michael P. Vandenbergh, Ethan Thorpe, and Jonathan Gilligan argue that increased transparency regarding AI’s energy consumption could enhance efficiency and competition while benefiting the environment.
While consumers increasingly seek energy efficiency data when making purchasing decisions, accurate information about the electricity usage of AI models remains scarce. Industry advocates often contend that AI’s potential to boost workplace efficiency and address environmental challenges should take precedence over regulatory measures, including mandatory energy disclosures.
Vandenbergh, Thorpe, and Gilligan assert that greater transparency would empower consumers to make informed choices, foster competition among AI developers, and optimize the performance of data centers and AI programs, ultimately reducing energy demands. They express concern that as AI technology advances, the energy requirements associated with it could expand significantly. For example, a one-megawatt data center may consume around 7 million gallons of water annually for cooling and indirectly necessitate approximately 10 million gallons for the power plants that supply it with electricity. In 2023, U.S. data centers consumed an estimated 176 terawatt hours of electricity, enough to power 16.3 million homes, resulting in 61 million metric tons of carbon dioxide emissions.
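The 2023 figures above can be sanity-checked with back-of-envelope arithmetic. The script below is an illustrative sketch: the three input constants are the numbers reported in the article, while the derived values (per-home usage and implied grid carbon intensity) are our own calculations, not the authors’.

```python
# Sanity check of the reported 2023 U.S. data center figures.
# Inputs are taken from the article; derived values are our own arithmetic.

DATA_CENTER_TWH = 176    # reported annual electricity consumption, 2023
HOMES_POWERED = 16.3e6   # reported equivalent number of homes
CO2_MEGATONNES = 61      # reported emissions, million metric tons of CO2

kwh_total = DATA_CENTER_TWH * 1e9                   # 1 TWh = 1e9 kWh
kwh_per_home = kwh_total / HOMES_POWERED            # implied annual use per home
g_co2_per_kwh = CO2_MEGATONNES * 1e12 / kwh_total   # implied grid carbon intensity

print(f"Implied use per home: {kwh_per_home:,.0f} kWh/year")
print(f"Implied grid intensity: {g_co2_per_kwh:.0f} g CO2/kWh")
```

The implied values, roughly 10,800 kWh per home per year and about 347 g CO2 per kWh, are broadly consistent with EIA estimates of average U.S. household electricity use and the U.S. grid’s average carbon intensity, suggesting the article’s figures are internally coherent.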
Currently, AI is responsible for about 15 percent of total data center demand, a figure the authors predict could triple by 2028. They highlight the energy consumption of various AI applications, including advanced reasoning models, AI chatbots, and image and video generation.
Despite these predictions, Vandenbergh, Thorpe, and Gilligan note significant uncertainty in measurements of AI’s overall power consumption. They emphasize that little is known about the energy required for specific AI tasks, largely because of a complex interplay of variables unique to each AI model, such as program code, hardware, data center location, and the model’s size and complexity. This variability complicates efforts to establish energy efficiency benchmarks, especially as proprietary algorithms and cutting-edge chips come into play.
Looking at existing tools to estimate AI energy consumption, the authors evaluated four different AI footprint calculators and found them to be highly inconsistent. Despite attempts to standardize input variables, the calculators produced estimates that varied widely, with the highest being 58 times greater than the lowest for the same model. This variability suggests that while these calculators might serve as preliminary tools for policymakers and consumers, their unreliability could undermine the push for greater transparency in the industry.
Vandenbergh, Thorpe, and Gilligan also address arguments against energy disclosure regulations. Critics often argue that the inherent utility of AI technology outweighs its environmental impacts or that workplace efficiency gains provide an overall net benefit. While they acknowledge that tasks such as coding may be executed more efficiently by AI than human workers, they highlight that without transparent data on actual energy usage, it remains unclear if an AI model or a human would consume more energy in performing the same task.
The authors contend that transparency does not inherently act as a barrier to rapid development and could offer significant advantages for consumers, the industry, and the environment. Access to energy usage and environmental impact data would enable conscientious consumers to select AI models that align with their values while providing a competitive edge to companies that prioritize energy efficiency. By clarifying the energy trade-offs between general-purpose and task-specific AI models, consumers could make more informed decisions about which technologies to adopt.
Public disclosures could also benefit data center operators by allowing them to manage energy costs more effectively while maintaining service levels. Vandenbergh, Thorpe, and Gilligan predict that these savings could be channeled back into companies that focus on developing more efficient AI models, thereby reducing overall energy consumption and carbon emissions.
In today’s “deregulatory era,” the authors acknowledge that substantial government action toward transparency is unlikely, especially given the current administration’s commitment to broad deregulation in the tech sector. Nevertheless, they remain cautiously optimistic that some AI developers will voluntarily disclose energy usage data, potentially catalyzing a broader industry shift toward transparency.
Ultimately, Vandenbergh, Thorpe, and Gilligan argue that enhanced energy transparency is essential for maximizing AI’s net benefits, regardless of whether this shift emerges from regulatory frameworks, consumer advocacy, or self-interest within the industry. They posit that aligning the needs of users, developers, and operators through transparency can help mitigate AI’s considerable environmental footprint.