
The environmental cost of a ChatGPT query, according to OpenAI's CEO
The environmental impact of ChatGPT has been a subject of much debate since the creation of OpenAI's famous AI chatbot. — AFP
What is the environmental impact of using large language models such as ChatGPT? It's difficult to say, although several studies on the subject have already been conducted. OpenAI co-founder and CEO Sam Altman has now provided a very precise estimate, but how does it stack up against other experts' calculations?
What's the environmental cost of a single query on ChatGPT? This question has been on many people's minds since the creation of OpenAI's famous AI chatbot and, more generally, the advent of large language models (LLMs). It's a question that Sam Altman, CEO of OpenAI (the company behind ChatGPT), recently answered in a post on his personal blog. "People are often curious about how much energy a ChatGPT query uses; the average query uses about 0.34 watt-hours, about what an oven would use in a little over one second, or a high-efficiency lightbulb would use in a couple of minutes. It also uses about 0.000085 gallons of water; roughly one-fifteenth of a teaspoon," writes the CEO. He adds: "As datacenter production gets automated, the cost of intelligence should eventually converge to near the cost of electricity."
However, Sam Altman's estimate does not address the increasingly widespread use of these rapidly expanding tools. When asked directly, ChatGPT itself points out that while a single query may have a smaller environmental impact than most common digital activities, the footprint can become significant when multiplied by billions of daily interactions.
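Altman's per-query figure only becomes meaningful at scale. As a rough sketch (the one-billion-queries-per-day volume is a hypothetical round number used for illustration, not an OpenAI-reported statistic), the arithmetic looks like this:

```python
# Back-of-envelope scaling of Altman's per-query energy estimate.
# Assumption: 1 billion queries/day is a hypothetical round number.
WH_PER_QUERY = 0.34                # watt-hours per query (Altman's figure)
QUERIES_PER_DAY = 1_000_000_000    # illustrative volume

daily_mwh = WH_PER_QUERY * QUERIES_PER_DAY / 1_000_000   # Wh -> MWh
yearly_gwh = daily_mwh * 365 / 1_000                     # MWh -> GWh

print(f"{daily_mwh:.0f} MWh per day")    # 340 MWh per day
print(f"{yearly_gwh:.0f} GWh per year")  # 124 GWh per year
```

Even a tiny per-query cost thus compounds into utility-scale consumption once the tool is used at internet scale.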
4,300 return flights between Paris and New York
Two years ago, Greenly (an app that allows companies to assess their CO2 emissions in real time) estimated that the overall carbon footprint of the first version of ChatGPT could be around 240 tonnes of CO2e, equivalent to 136 round trips between Paris and New York City. Training alone was estimated to account for 99% of total emissions, or 238 tCO2e per year. In detail, operating electricity accounts for about two-thirds of that footprint (i.e. 160 tCO2e), followed by server manufacturing (68.9 tCO2e) and refrigerant gas leakage (9.6 tCO2e), the report says.
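For readers who want to check the sums, the components of the Greenly breakdown (all figures taken from the report as cited above) add up as follows:

```python
# Sanity check on the Greenly breakdown (all figures in tCO2e per year).
electricity = 160.0   # operating electricity
servers = 68.9        # server manufacturing
refrigerant = 9.6     # refrigerant gas leakage

total = electricity + servers + refrigerant
print(f"total: {total:.1f} tCO2e")                       # total: 238.5 tCO2e
print(f"electricity share: {electricity / total:.0%}")   # electricity share: 67%
```

The sum matches the reported 238 tCO2e figure, with electricity responsible for roughly two-thirds of it.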
A more recent analysis also conducted by Greenly on the overall environmental cost of the new version of ChatGPT estimates that if an organization uses the tool to respond to one million emails per month, ChatGPT-4 would generate 7,138 tonnes of CO2e per year, divided between training and use of the model. This would be equivalent to 4,300 round-trip flights between Paris and New York.
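Greenly's email scenario can also be reduced to per-unit figures; these are derived back-of-envelope numbers, not values quoted in the report itself:

```python
# Derived per-unit figures from Greenly's scenario:
# 1 million emails/month handled by ChatGPT-4 -> 7,138 tCO2e/year.
tco2e_per_year = 7_138
emails_per_year = 1_000_000 * 12
flights = 4_300                     # round trips Paris-New York

kg_per_email = tco2e_per_year * 1000 / emails_per_year
t_per_flight = tco2e_per_year / flights

print(f"{kg_per_email:.2f} kg CO2e per email")          # 0.59 kg CO2e per email
print(f"{t_per_flight:.2f} tCO2e per round trip")       # 1.66 tCO2e per round trip
```

The implied 1.66 tCO2e per transatlantic round trip is consistent with the 136-flights equivalence given for the earlier 240 tCO2e estimate.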
US researchers at the Massachusetts Institute of Technology have estimated that training a single large AI language model can generate emissions equivalent to five times those of an average US car over its entire life cycle (including manufacturing).
The environmental cost of these rapidly expanding technologies is therefore a crucial issue. It is with this in mind that smaller AI models, which are more efficient, cheaper and less energy-intensive, are now gaining ground. – AFP Relaxnews
