
What Is the Environmental Cost of AI Worldwide?

  • Writer: Stéphane Guy
  • Jan 17
  • 6 min read

Artificial intelligence is becoming an invisible pillar of our daily lives. It optimises our routes, anticipates our searches, manages connected devices, and sometimes even writes our texts. But behind this apparent “magic” lies a very physical reality: complex computations executed by machines that are extremely resource-intensive. Every time an AI answers a question, solves a problem, or generates content, it performs billions of operations on specialised servers. These operations come with a significant energy cost. They require heavy infrastructure running continuously, cooled by powerful systems, and powered by electricity sources that are not always renewable. This raises a fundamental question: how does AI work from an energy perspective? What actually happens behind the scenes when you ask ChatGPT a simple question or request an image from Midjourney?


Image generated by AI

In short

  • AI relies on massive data centers. These facilities house thousands of servers running computations 24/7. They require large amounts of electricity and constant cooling systems.

  • Training AI models is extremely energy-intensive. Training a model like GPT-4 can require weeks of computation across hundreds of processors, resulting in a substantial carbon footprint.

  • Everyday usage also has an impact. Each query consumes energy. Multiplied by millions of users, this operational consumption becomes significant and adds to the initial training cost.

  • AI also consumes large volumes of water for cooling. Some models have reportedly required hundreds of thousands of liters of water for training alone. The impact varies depending on local climate and cooling technology.

  • Solutions exist, but adoption remains limited. Algorithmic optimization, renewable energy, and waste heat recovery are known levers, yet few actors commit to them in a transparent or binding way.


How AI Works: A Resource-Hungry System


Behind a Single Query: Servers, Energy, and Heat


When you ask a question to an AI system, you are interacting with a pre-trained model hosted on a server inside a data center. Each request triggers a complex chain of computations to interpret, process, and generate a response. These operations produce heat, which must be dissipated.


As a result, large amounts of electricity are required, both to power the servers and to cool them. Cooling is typically handled through air-based systems or liquid cooling. In some cases, the heat is reused to warm nearby buildings, but such initiatives remain marginal.*


*YouTube: Des data-centers qui chauffent les habitations, c'est possible grâce à l'énergie fatale (data centers heating homes with waste heat)


Training: An Energy Sink


Training is the phase during which an AI model learns to improve its performance. It involves analysing billions of data points to identify patterns and refine responses, and the larger the model, the longer and more expensive the process becomes. As models keep growing, so does the energy cost of training them. Scientists estimate that training the BLOOM model emitted roughly ten times more greenhouse gases than an average French citizen does in an entire year. Looking ahead, the growth in electricity consumption from data centers, cryptocurrencies, and AI between 2022 and 2026 could equal the total annual electricity consumption of a country like Sweden or Germany.*


*Polytechnique Insights: Generative AI: energy consumption soars


Importantly, these figures only account for model training. To this must be added the energy consumed each time the AI is used once it becomes available to the general public.


Photo by imgix on Unsplash.

AI’s Carbon Footprint: Numbers and Comparisons


AI consumes large amounts of electricity and therefore emits significant quantities of CO₂. While it remains difficult to precisely quantify emissions for individual models, some estimates exist. For example, a short conversation of a few messages with GPT-4 is estimated to generate around 272 grams of CO₂. At that rate, making about ten queries per day for a year could result in nearly one metric ton of CO₂ emissions—roughly half of the annual carbon budget recommended under the Paris Agreement.*
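The arithmetic behind that estimate is easy to verify. Here is a minimal back-of-envelope check, taking the cited 272 g per conversation as given and assuming a Paris-aligned personal budget of roughly 2 tonnes of CO₂ per year (a commonly used order of magnitude, not a figure from the cited source):

```python
# Back-of-envelope check of the figures above. The 272 g/conversation
# value comes from the cited estimate; the ~2 t/year personal budget
# is an assumed order of magnitude for a Paris-aligned lifestyle.
G_PER_CONVERSATION = 272        # grams of CO2 per short GPT-4 exchange
QUERIES_PER_DAY = 10
DAYS_PER_YEAR = 365

annual_kg = G_PER_CONVERSATION * QUERIES_PER_DAY * DAYS_PER_YEAR / 1000
print(f"Annual footprint: {annual_kg:.0f} kg CO2")            # ~993 kg, close to 1 t

PARIS_BUDGET_KG = 2000
print(f"Share of a 2 t budget: {annual_kg / PARIS_BUDGET_KG:.0%}")  # ~50%
```

The multiplication confirms the article's figure: ten short conversations a day add up to just under a tonne of CO₂ over a year.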


*Vert Eco: Électricité, eau, minéraux, CO2 : on a tenté de mesurer l’empreinte écologique de ChatGPT


And this estimate only concerns ChatGPT, without accounting for image or video generation tools. As AI usage grows, so too does its energy consumption—and its carbon emissions.


The Water Cost: The Overlooked Impact


Beyond CO₂ emissions, another critical resource is often overlooked: water. Data centers rely heavily on water for cooling AI infrastructure.


Microsoft has reported that water consumption in AI-related data centers increased by 34%, a figure that clearly reflects the rapid expansion of these facilities. In 2022 alone, the scaling of AI models reportedly consumed 6.4 billion liters of water: equivalent to roughly 2,500 Olympic-size swimming pools.*


*Le Monde Informatique: La consommation d'eau liée à l'IA générative inquiète


This growing water usage is causing concern among municipalities hosting data centers. In Des Moines, Iowa, for example, local authorities reported that in July 2022, at the end of GPT training, Microsoft pumped 34.5 million liters of water, representing 6% of the city’s total water capacity.*


*Ibid.


As a result, some municipalities are refusing to authorize additional data centers and are asking major tech companies to reduce their water consumption. This has led to the emergence of terms such as “water positive,” notably at Microsoft, to describe commitments to making data centers more water-responsible. To reach this goal, companies are exploring various solutions, such as doubling training speed to reduce data center operating time—and therefore overall resource usage.


Can AI Be Made More Environmentally Friendly?


This raises a key question: can artificial intelligence be made more environmentally sustainable? By design, AI is energy-intensive, but several approaches exist to reduce its impact.


  • Mixture-of-experts architectures: Rather than running one monolithic model for every request, the model is divided into specialized sub-models ("experts") that are activated only when needed. DeepSeek, a Chinese AI competitor to ChatGPT, uses this approach: depending on the request (mathematics, translation, etc.), only the relevant sub-models run instead of the entire system, which significantly reduces energy consumption per query.


  • Quantization: Quantization stores a model's weights, and sometimes its computations, at lower numerical precision, for example 8-bit integers instead of 32-bit floating point. By compressing the model in this way, it reduces memory and energy requirements. The technique is still being refined, however, and aggressive quantization can degrade output quality in some cases.


  • Smarter integration of AI and infrastructure: Reducing AI’s environmental footprint also requires better integration into existing systems. This includes reusing waste heat from data centers to heat buildings, or shifting computational workloads to off-peak hours, similar to energy-saving habits already adopted in daily life.
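To make the quantization idea above concrete, here is a minimal sketch of symmetric 8-bit quantization using NumPy. The weight values are illustrative, not taken from any real model: float32 weights are mapped to int8, shrinking storage fourfold at the cost of a small rounding error.

```python
import numpy as np

# Toy example of symmetric int8 quantization (illustrative weights).
weights = np.array([-0.51, 0.03, 0.27, 0.89], dtype=np.float32)

scale = np.abs(weights).max() / 127           # map the largest weight to +/-127
quantized = np.round(weights / scale).astype(np.int8)
dequantized = quantized.astype(np.float32) * scale

print(quantized)                               # stored in 4 bytes instead of 16
print(np.abs(weights - dequantized).max())     # rounding error, at most scale/2
```

Production systems use more elaborate schemes (per-channel scales, calibration data, quantization-aware training), but the energy argument is the same: smaller numbers mean less memory traffic and cheaper arithmetic per query.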

Photo by Arif Riyanto on Unsplash.

A Necessary Transition to Renewable Energy


Today, data centers are still largely powered by fossil fuels, which contributes heavily to their carbon footprint. But could cleaner energy sources realistically meet AI’s needs? In 2024, researchers demonstrated that it is possible to build a solar-powered, self-sufficient AI system capable of operating even under low energy input conditions.*


*CNRS: Alimenter l’intelligence artificielle aux énergies naturelles


In theory, then, large language models could be supplied by renewable energy. The remaining question is whether major tech companies will actually implement such solutions at scale.


Conclusion: Is AI Environmentally Sustainable?


The development of AI carries a real and growing ecological cost: electricity consumption, CO₂ emissions, and increasing pressure on water resources. Yet most users remain largely unaware of these impacts.


Should we slow the race toward ever-larger models? Promote more frugal, localized, and responsible AI systems? Or impose strict environmental standards on digital giants?


Solutions do exist. But given how rapidly AI usage is spreading, it is legitimate to ask whether they will be enough to curb the growing ecological and energy cost of artificial intelligence.


FAQ


  1. Does artificial intelligence consume a lot of energy?

    Yes. AI relies on large data centers that run continuously and require significant electricity for both computation and cooling, making energy consumption one of its primary environmental costs.

  2. Is generative AI worse for the environment than traditional software?

    Generally, yes. Generative AI models require intensive training and powerful hardware, resulting in higher energy use and carbon emissions compared to conventional digital services.

  3. How much CO₂ does a single AI query emit?

    Estimates vary, but a short interaction with a large language model can emit several hundred grams of CO₂, depending on the model size, data center efficiency, and energy source.

  4. Why does AI consume so much water?

    Water is widely used to cool data centers. As AI workloads increase, so does water demand, particularly in regions where cooling relies on evaporative systems.

  5. Can AI become environmentally sustainable?

    Partially. Techniques such as model optimization, renewable energy use, and waste heat recovery can reduce impact, but widespread adoption remains limited across the industry.


© 2025 by 360°IA.
