Generative artificial intelligence tools have become a reliable copilot in many people’s lives, whether they’re drafting wedding vows or replying to work emails. But a growing body of research suggests that every problem AI solves comes with an unintended environmental cost.
Every word in an AI prompt is broken into strings of numbers called token IDs and sent to massive data centers—some larger than football fields—powered by coal or natural gas plants. There, stacks of large computers churn through dozens of rapid calculations to generate a response.
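For readers curious what those token IDs actually look like, here is a minimal Python sketch using OpenAI’s open-source tiktoken tokenizer purely as an illustration; each model family uses its own tokenizer, and none of the systems in the study are tied to this example.

```python
# Illustration only: how a short prompt becomes a list of integer token IDs.
# tiktoken is OpenAI's open-source tokenizer; other models use their own schemes.
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")
prompt = "Please write a short thank-you note to my coworker."

token_ids = enc.encode(prompt)
print(token_ids)                          # a list of integers, one per token
print(len(token_ids), "tokens sent off for processing")
```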
According to a frequently cited estimate from the Electric Power Research Institute, the whole process can consume up to 10 times as much energy as a standard Google search.
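As a back-of-the-envelope illustration of that multiplier (the roughly 0.3 watt-hours often quoted for a conventional web search is an outside estimate, not a figure from this study), the arithmetic works out like this:

```python
# Rough arithmetic behind the "up to 10 times" comparison.
# Both figures are commonly cited estimates, not measurements from the study.
search_wh = 0.3           # approximate energy of one conventional web search, in watt-hours
ai_multiplier = 10        # EPRI's upper-bound multiplier for an AI prompt
prompt_wh = search_wh * ai_multiplier

daily_prompts = 20
daily_wh = prompt_wh * daily_prompts
print(f"One AI prompt: ~{prompt_wh:.1f} Wh")
print(f"{daily_prompts} prompts a day: ~{daily_wh:.0f} Wh, or {daily_wh / 1000:.2f} kWh")
```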
So how much harm does each prompt you feed an AI cause? To find out, German researchers tested 14 large language model (LLM) AI systems with both multiple-choice and free-response questions. Complex questions produced up to six times more carbon dioxide emissions than questions with simple answers.
Moreover, the study found that more capable LLMs with greater reasoning abilities generated up to 50 times more carbon emissions than simpler systems when answering the same question.
Maximilian Dauner, a PhD student at Hochschule München University of Applied Sciences and the first author of the Frontiers in Communication paper released on Wednesday, stated, “This shows us the tradeoff between energy consumption and the accuracy of model performance.”
When processing token IDs, these smarter, more energy-intensive LLMs typically rely on tens of billions more parameters (the internal weights and biases used in their calculations) than smaller, more condensed models.
It can be compared to a neural network in the brain: the more connections between neurons, the more thinking you can do to reach an answer, Dauner said.
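A rough way to see why parameter count translates into energy use is the common rule of thumb that generating one token takes on the order of two floating-point operations per parameter. The sketch below uses illustrative model sizes and that approximation; neither comes from Dauner’s paper.

```python
# Rule-of-thumb sketch: per-token compute grows roughly in proportion to parameter count.
# The "2 FLOPs per parameter per token" factor is a common approximation, not a study result.
def flops_per_token(num_parameters: float) -> float:
    return 2.0 * num_parameters

small_model = 1e9       # a hypothetical 1-billion-parameter model
large_model = 70e9      # a hypothetical 70-billion-parameter model

ratio = flops_per_token(large_model) / flops_per_token(small_model)
print(f"The larger model does roughly {ratio:.0f}x more arithmetic per generated token")
```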
What you can do to reduce your carbon footprint
Dauner said the lengthy answers many AI models are programmed to give for complex questions also drive up energy demands. Ask an AI chatbot to solve an algebra problem, for example, and it may walk you through every step it took to reach the solution.
AI models also put a lot of effort into being polite, Dauner said, especially when the user is nice and says things like “please” and “thank you.” But that only lengthens their answers, and each additional word takes more energy to produce.
Because of this, Dauner advises users to be more direct with AI models: either say you don’t need an explanation at all, or specify the length of response you want and keep it to one or two sentences.
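In practice, that advice simply means writing the length constraint into the prompt itself. The wording below is a hypothetical illustration, not phrasing prescribed by the study:

```python
# Hypothetical prompts illustrating Dauner's advice to be direct;
# the exact wording is an illustration, not taken from the study.
verbose_prompt = "Could you please solve 3x + 7 = 22 and walk me through your reasoning? Thank you!"
concise_prompt = "Solve 3x + 7 = 22. Give only the final answer, no explanation."

# The second prompt invites a one-line reply instead of a step-by-step walkthrough.
print(concise_prompt)
```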
Sasha Luccioni, the climate lead at AI startup Hugging Face, said in an email that the most important takeaway from Dauner’s work is that not all AI models are created equal. Users hoping to shrink their carbon footprint can be more deliberate about which model they choose for a given task.
Task-specific models are often smaller and more efficient, and just as good at handling the context-specific tasks they were built for, Luccioni said.
If you are a software engineer tackling hard coding problems every day, an AI model built for coding may be warranted. But for the typical high school student, leaning on sophisticated AI tools for homework is like using a nuclear-powered digital calculator.
Even within the same AI company, different model options can vary in reasoning capability, so look into which features best match your goals, Dauner advised.
For basic tasks, Luccioni recommends reaching for a phone calculator or an online encyclopedia whenever possible.
Why it’s hard to measure AI’s environmental impact
It has been difficult to quantify AI’s effects on the environment.
Energy consumption can vary depending on the hardware used to run AI models and the user’s proximity to local energy grids, according to the study. That’s one reason the researchers chose to present carbon emissions as a range, Dauner said.
In addition, many AI companies don’t share information about their energy use, or details such as server size and optimization techniques that could help researchers estimate energy consumption, said Shaolei Ren, an associate professor of electrical and computer engineering at the University of California, Riverside, who studies AI’s water use.
Claiming that AI uses some average amount of water or energy is simply meaningless, Ren said; each model has to be examined on its own, and then its usage analyzed task by task.
One way AI companies could be more transparent, Dauner said, is by disclosing the amount of carbon emissions associated with each prompt.
If people knew more about the average (environmental) cost of generating a response, Dauner said, they might start asking themselves whether it is really necessary to turn themselves into an action figure just because they are bored, or whether they have nothing better to do than have ChatGPT tell them jokes.
Still, as more companies push to integrate generative AI tools into their systems, Luccioni said, users may not have much control over when or how they use the technology.
Web search does not require generative AI, and no one asked for AI chatbots on social media or in messaging apps, Luccioni said. The rush to build them into every piece of technology is deeply troubling, she added, because it has real consequences for the planet.
Less information about how AI uses its resources means fewer options for consumers, Ren said, adding that regulatory pressure for greater transparency is unlikely to arrive in the US anytime soon. Instead, the best hope for more energy-efficient AI may be the simple cost savings of using less energy.
All things considered, Ren said he remains optimistic about the future, because many software engineers are working hard to improve resource efficiency. The fact that other industries also consume a lot of energy does not make AI’s environmental impact any less of an issue, he said, and it is something we should certainly pay attention to.
The-CNN-Wire & © 2025 Cable News Network, Inc., a Warner Bros. Discovery Company. All rights reserved.