
How Much Energy Does AI Use? The People Who Know Aren't Saying
Jun 19, 2025, 6:00 AM
A growing body of research attempts to put a number on AI's energy use, even as the companies behind the most popular models keep their carbon emissions a secret.
'People are often curious about how much energy a ChatGPT query uses,' Sam Altman, the CEO of OpenAI, wrote in an aside in a long blog post last week. The average query, Altman wrote, uses 0.34 watt-hours of energy: 'About what an oven would use in a little over one second, or a high-efficiency lightbulb would use in a couple of minutes.'
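As a rough sanity check of that comparison, the arithmetic is easy to reproduce. This is a back-of-the-envelope sketch, not an OpenAI calculation; the appliance wattages are assumptions:

```python
# Back-of-the-envelope check of the 0.34 Wh-per-query comparison.
# The appliance wattages are illustrative assumptions, not OpenAI figures.

QUERY_WH = 0.34       # claimed energy per average ChatGPT query, in watt-hours
OVEN_WATTS = 1000     # assumed draw of an electric oven (~1 kW)
LED_WATTS = 10        # assumed draw of a high-efficiency LED bulb

def runtime_seconds(watts: float, energy_wh: float) -> float:
    """Seconds a load of `watts` can run on `energy_wh` watt-hours."""
    return energy_wh * 3600 / watts

print(f"Oven: {runtime_seconds(OVEN_WATTS, QUERY_WH):.1f} seconds")         # ~1.2 s
print(f"LED bulb: {runtime_seconds(LED_WATTS, QUERY_WH) / 60:.1f} minutes")  # ~2.0 min
```

Under those assumptions the numbers do line up with Altman's phrasing; the open question, as the experts below note, is what the 0.34 figure itself includes.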
For a company with 800 million weekly active users (and growing), the question of how much energy all these queries are using is becoming increasingly pressing. But experts say Altman's figure doesn't mean much without more public context from OpenAI about how it arrived at this calculation, including what counts as an 'average' query, whether it includes image generation, and whether Altman is counting additional energy use, like that from training AI models and cooling OpenAI's servers.
As a result, Sasha Luccioni, the climate lead at AI company Hugging Face, doesn't put too much stock in Altman's number. 'He could have pulled that out of his ass,' she says. (OpenAI did not respond to a request for more information about how it arrived at this number.)
As AI takes over our lives, it's also promising to transform our energy systems, supercharging carbon emissions right as we're trying to fight climate change. Now, a new and growing body of research is attempting to put hard numbers on just how much carbon we're actually emitting with all of our AI use.
This effort is complicated by the fact that major players like OpenAI disclose little environmental information. An analysis submitted for peer review this week by Luccioni and three coauthors makes the case for more environmental transparency in AI models. Using data from OpenRouter, a leaderboard of large language model (LLM) traffic, she and her colleagues find that 84 percent of LLM use in May 2025 went to models with zero environmental disclosure. That means consumers are overwhelmingly choosing models with completely unknown environmental impacts.
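The underlying calculation is simple to sketch, though the real analysis relies on actual OpenRouter traffic data that isn't reproduced here. In this hypothetical example, the model names, token counts, and disclosure flags are made up; the point is that the share is weighted by usage, not by counting models:

```python
# Hypothetical sketch of a traffic-weighted disclosure share, in the spirit of
# the OpenRouter analysis. All rows below are invented examples, not real data.

traffic = [
    # (model, tokens served in the month, any environmental disclosure?)
    ("proprietary-model-a", 5_000_000_000, False),
    ("proprietary-model-b", 2_500_000_000, False),
    ("open-model-with-eco-card", 1_000_000_000, True),
    ("open-model-no-card", 500_000_000, False),
]

total_tokens = sum(tokens for _, tokens, _ in traffic)
undisclosed_tokens = sum(tokens for _, tokens, disclosed in traffic if not disclosed)

print(f"Traffic to models with zero disclosure: {undisclosed_tokens / total_tokens:.0%}")
```

A single heavily used, undisclosed model can dominate the result, which is why the 84 percent figure describes usage rather than how many disclosed models exist.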
'It blows my mind that you can buy a car and know how many miles per gallon it consumes, yet we use all these AI tools every day and we have absolutely no efficiency metrics, emissions factors, nothing,' Luccioni says. 'It's not mandated, it's not regulatory. Given where we are with the climate crisis, it should be top of the agenda for regulators everywhere.'
As a result of this lack of transparency, Luccioni says, the public is being exposed to estimates that make no sense but are taken as gospel. You may have heard, for instance, that the average ChatGPT request takes 10 times as much energy as the average Google search. Luccioni and her colleagues traced this claim back to a public remark that John Hennessy, the chairman of Alphabet, Google's parent company, made in 2023.
A claim made by the board chair of one company (Alphabet, Google's parent) about the product of another company he has no relation to (OpenAI) is tenuous at best. Yet, Luccioni's analysis finds, this figure has been repeated again and again in press and policy reports. (As I was writing this piece, I got a pitch citing this exact statistic.)
'People have taken an off-the-cuff remark and turned it into an actual statistic that's informing policy and the way people look at these things,' Luccioni says. 'The real core issue is that we have no numbers. So even the back-of-the-napkin calculations that people can find, they tend to take them as the gold standard, but that's not the case.'
One way to peek behind the curtain for more accurate information is to work with open source models. Some tech giants, including OpenAI and Anthropic, keep their models proprietary, meaning outside researchers can't independently verify their energy use. But other companies make some parts of their models publicly available, allowing researchers to more accurately gauge their emissions.
A study published Thursday in the journal Frontiers in Communication evaluated 14 open-source large language models, including two Meta Llama models and three DeepSeek models, and found that some used as much as 50 percent more energy than others in the dataset when responding to the researchers' prompts. The 1,000 benchmark prompts submitted to the LLMs included questions on topics such as high school history and philosophy; half were formatted as multiple choice, with only one-word answers allowed, while half were submitted as open prompts, allowing a freer format and longer answers. Reasoning models, the researchers found, generated far more thinking tokens (measures of internal reasoning produced while the model works out its answer, and a hallmark of higher energy use) than more concise models. These models, perhaps unsurprisingly, were also more accurate on complex topics. (They also had trouble with brevity: During the multiple-choice phase, for instance, the more complex models would often return answers of multiple tokens, despite explicit instructions to answer only from the range of options provided.)
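The paper reports energy use per prompt for each model. Its exact instrumentation isn't reproduced here, but one common approach to getting such a number is to sample the GPU's reported power draw while a response is generated and integrate over time. Below is a minimal sketch using NVIDIA's NVML Python bindings; the inference call is a placeholder, and this captures whole-GPU draw only, ignoring CPU, memory, and facility overhead:

```python
# Rough per-prompt energy estimate: sample GPU power via NVML while a model
# generates, then integrate power over elapsed time to get watt-hours.
# `generate_answer` is a placeholder for whatever inference call is measured.

import threading
import time

import pynvml  # pip install nvidia-ml-py

def measure_energy_wh(generate_answer, prompt: str, interval_s: float = 0.1):
    pynvml.nvmlInit()
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)
    samples, stop = [], threading.Event()

    def sampler():
        while not stop.is_set():
            samples.append(pynvml.nvmlDeviceGetPowerUsage(handle) / 1000)  # mW -> W
            time.sleep(interval_s)

    thread = threading.Thread(target=sampler)
    thread.start()
    start = time.time()
    answer = generate_answer(prompt)  # placeholder inference call
    elapsed = time.time() - start
    stop.set()
    thread.join()
    pynvml.nvmlShutdown()

    avg_watts = sum(samples) / max(len(samples), 1)
    return answer, avg_watts * elapsed / 3600  # energy in watt-hours
```

Multiplying the result by a grid's carbon intensity converts watt-hours into emissions; that step is where the location questions discussed below come in.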
Maximilian Dauner, a PhD student at the Munich University of Applied Sciences and the study's lead author, says he hopes AI use will evolve toward matching each query with the least energy-intensive model that can handle it. He envisions a process where smaller, simpler questions are automatically directed to less energy-intensive models that will still provide accurate answers. 'Even smaller models can achieve really good results on simpler tasks, and don't have that huge amount of CO2 emitted during the process,' he says.
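Dauner's routing idea can be illustrated with a toy dispatcher. The model names and the length-and-keyword heuristic here are hypothetical; a production router would more likely use a learned classifier or a cheap first-pass model:

```python
# Toy sketch of query routing: send simple prompts to a small model and
# escalate complex ones to a large reasoning model. The model names and the
# heuristic are illustrative assumptions, not any vendor's actual design.

COMPLEX_HINTS = ("prove", "derive", "step by step", "analyze", "compare")

def route(prompt: str) -> str:
    text = prompt.lower()
    looks_complex = len(prompt.split()) > 60 or any(hint in text for hint in COMPLEX_HINTS)
    return "large-reasoning-model" if looks_complex else "small-efficient-model"

print(route("What year did the Thirty Years' War end?"))                   # small-efficient-model
print(route("Analyze the causes of the Thirty Years' War step by step."))  # large-reasoning-model
```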
Some tech companies already do this. Google and Microsoft have previously told WIRED that their search features use smaller models when possible, which can also mean faster responses for users. But generally, model providers have done little to nudge users toward using less energy. How quickly a model answers a question, for instance, has a big impact on its energy use—but that's not explained when AI products are presented to users, says Noman Bashir, the Computing & Climate Impact Fellow at MIT's Climate and Sustainability Consortium.
'The goal is to provide all of this inference the quickest way possible so that you don't leave their platform,' he says. 'If ChatGPT suddenly starts giving you a response after five minutes, you will go to some other tool that is giving you an immediate response.'
However, there are myriad other considerations to take into account when calculating the energy use of complex AI queries, because it's not just theoretical; the conditions under which queries are actually run in the real world matter. Bashir points out that physical hardware makes a difference when calculating emissions. Dauner ran his experiments on an Nvidia A100 GPU, but Nvidia's H100 GPU, which was specially designed for AI workloads and which, according to the company, is becoming increasingly popular, is much more energy-intensive.
Physical infrastructure also makes a difference when talking about emissions. Large data centers need cooling systems, light, and networking equipment, which all add on more energy; they often run in diurnal cycles, taking a break at night when queries are lower. They are also hooked up to different types of grids—ones overwhelmingly powered by fossil fuels, versus those powered by renewables—depending on their locations.
Bashir compares studies that look at emissions from AI queries without factoring in data center needs to lifting up a car, hitting the gas, and counting revolutions of a wheel as a way of doing a fuel-efficiency test. 'You're not taking into account the fact that this wheel has to carry the car and the passenger,' he says.
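Those overheads and grid differences are often folded into two published multipliers: a facility's power usage effectiveness (PUE) and its grid's carbon intensity. The sketch below shows how much they can swing the footprint of the very same per-query energy figure; the PUE and grid numbers are assumptions for illustration, not measurements:

```python
# Illustrative only: how facility overhead (PUE) and grid mix change the carbon
# footprint of the same per-query energy figure. All numbers are assumptions.

QUERY_WH = 0.34   # claimed server-side energy per query (unverified)
PUE = 1.2         # assumed facility overhead: cooling, lighting, networking

GRID_CARBON_G_PER_KWH = {   # assumed grid carbon intensities
    "mostly renewables": 50,
    "mixed grid": 400,
    "mostly fossil fuels": 700,
}

facility_kwh = QUERY_WH * PUE / 1000  # total facility energy per query, in kWh
for grid, intensity in GRID_CARBON_G_PER_KWH.items():
    print(f"{grid}: {facility_kwh * intensity:.3f} g CO2 per query")
```

Under these assumptions the same query is roughly 14 times more carbon-intensive on a fossil-heavy grid than on a mostly renewable one, which is why location and disclosure matter as much as the raw watt-hour number.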
Perhaps most crucially for our understanding of AI's emissions, open source models like the ones Dauner used in his study represent a fraction of the AI models used by consumers today. Training a model and updating deployed models take a massive amount of energy, and those are figures that many big companies keep secret. It's unclear, for example, whether the light bulb statistic about ChatGPT from OpenAI's Altman takes into account all the energy used to train the models powering the chatbot. Without more disclosure, the public is simply missing much of the information needed to start understanding just how much this technology is impacting the planet.
'If I had a magic wand, I would make it mandatory for any company putting an AI system into production, anywhere, around the world, in any application, to disclose carbon numbers,' Luccioni says.
Paresh Dave contributed reporting.
