The Energy Consumption of AI: Understanding the Impact
As artificial intelligence becomes an integral part of daily life, many are left pondering its environmental effects. Recently, Sam Altman, the CEO of OpenAI, shed light on an intriguing statistic: a typical ChatGPT query consumes approximately 0.34 watt-hours of energy. To put this into perspective, that equates to the energy used by a conventional oven in just over one second or a high-efficiency light bulb in a couple of minutes.
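As a sanity check on those comparisons, the arithmetic can be sketched in a few lines of Python. The oven and bulb wattages below are assumed values (roughly 1.2 kW for a conventional oven, 10 W for a high-efficiency LED bulb), not figures from OpenAI; only the 0.34 watt-hour per-query number comes from Altman’s statement.

```python
QUERY_WH = 0.34   # Altman's reported energy per ChatGPT query (Wh)
OVEN_W = 1200.0   # assumed conventional oven power draw (W)
BULB_W = 10.0     # assumed high-efficiency LED bulb power draw (W)

def runtime_seconds(power_w: float, energy_wh: float = QUERY_WH) -> float:
    """Seconds a device at power_w runs on energy_wh (1 Wh = 3600 J)."""
    return energy_wh * 3600.0 / power_w

oven_s = runtime_seconds(OVEN_W)          # 1.02 s: "just over one second"
bulb_min = runtime_seconds(BULB_W) / 60.0  # 2.04 min: "a couple of minutes"
print(f"oven: {oven_s:.2f} s, bulb: {bulb_min:.2f} min")
```

With those assumed wattages, the figures line up with the article’s framing: about one second of oven use, or about two minutes of LED light.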
With 800 million weekly active users of AI models like ChatGPT and this number steadily increasing, the cumulative energy consumption from these queries raises substantial questions. However, experts argue that Altman’s figure lacks the necessary context. Without knowing how this average was calculated, including whether image generation is counted or how much energy is expended in cooling servers and training models, the statistic remains ambiguous.
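The scale of the aggregate is easy to illustrate. The per-user query rate below is a purely hypothetical assumption (the article gives only the user count and the per-query figure), so the result is an order-of-magnitude sketch, not a measured total.

```python
QUERY_WH = 0.34                    # reported energy per query (Wh)
WEEKLY_USERS = 800_000_000         # reported weekly active users
QUERIES_PER_USER_PER_WEEK = 10     # hypothetical assumption

weekly_wh = QUERY_WH * WEEKLY_USERS * QUERIES_PER_USER_PER_WEEK
weekly_gwh = weekly_wh / 1e9       # convert Wh -> GWh
print(f"~{weekly_gwh:.2f} GWh per week")  # ~2.72 GWh per week
```

Even at a modest ten queries per user per week, the total runs to gigawatt-hours weekly, which is why the missing context behind the 0.34 Wh figure matters.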
The Urgency of Environmental Transparency in AI
Climate scientist Sasha Luccioni, who leads climate research at Hugging Face, regards Altman’s claim with skepticism. “He could have pulled that out of his ass,” she notes, emphasizing the need for OpenAI to provide clearer information about this energy measurement. The absence of concrete context underscores a larger problem in the AI industry: the lack of transparency surrounding environmental impacts.
A recent analysis conducted by Luccioni and her colleagues highlights a concerning trend: 84 percent of large language model (LLM) usage in May 2025 came from models with no environmental disclosures. Consumers are engaging with AI technologies without a clear understanding of their carbon footprints, similar to the way we assess fuel consumption in cars. It’s mind-boggling that, in our advanced technological age, metrics on energy efficiency and emissions largely remain unregulated and undisclosed.
As society grapples with climate change, the energy demand driven by AI is growing. A widely circulated statement claimed that a single ChatGPT query uses ten times the energy of a Google search. Tracing this statement back, it originates from a comment made by John Hennessy, chairman of Alphabet. It’s a quote from a corporate figure speculating about another company’s energy consumption, an unreliable premise at best. Yet, this thinly sourced statistic continues to be quoted in various reports.
A Call for Regulatory Action and Research
The staggering reliance on unverified energy consumption figures points to a critical gap in AI governance. Luccioni argues for the urgent need for regulatory frameworks that mandate transparency in environmental impacts related to AI technologies. As businesses integrate generative AI into their operations, the potential for increased carbon emissions is significant. This underscores the necessity of monitoring and standardizing energy usage across the industry.
As research in this field expands, more concrete data will help illuminate the true energy landscape of AI. It will empower consumers to make informed decisions and prompt companies to consider their environmental responsibilities seriously. AI’s remarkable capabilities must not come at the expense of our planet.
With the rapid evolution of technology, the dialogue about energy efficiency must keep pace. As developers and users amplify their pleas for efficiency, both the private sector and regulators must foster transparency that reflects the contemporary climate challenges we face. Only then can society strike a balance between leveraging AI advancements and protecting our environment for future generations.