The unseen environmental burden of artificial intelligence
Every time a user poses a new question to ChatGPT, an unseen process is set in motion that produces a surge of carbon emissions – a fact most users are unaware of.
Unlike a basic web search, which processes each query independently, large language models like ChatGPT reprocess the entire conversation history with each new message: every word, every sentence, every “token”. As the conversation grows, so does the computational load, and with it energy consumption and emissions.
Increased Calculations, Increased Pollution
Language models like ChatGPT are built on the so-called “transformer” architecture, whose attention mechanism compares every token with every other token, so the number of calculations grows with the square of the conversation length. Specifically:
- A conversation of 1,000 words requires about 1 million attention computations.
- A conversation of 10,000 words requires about 100 million computations.
- A conversation of 100,000 words requires about 10 billion computations.
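The figures above follow directly from the quadratic nature of self-attention: a context of n tokens implies roughly n × n pairwise comparisons. A minimal sketch (the function name is illustrative, not from any real library):

```python
# Self-attention compares every token in the context with every other
# token, so pairwise attention computations scale as n squared.

def attention_pairs(num_tokens: int) -> int:
    """Approximate number of pairwise attention computations
    for a context of num_tokens tokens."""
    return num_tokens * num_tokens

for n in (1_000, 10_000, 100_000):
    print(f"{n:>7} tokens -> {attention_pairs(n):,} computations")
# → 1,000 tokens yield 1,000,000; 100,000 tokens yield 10,000,000,000
```

Growing the conversation tenfold thus multiplies the attention cost a hundredfold, which is why long chats are disproportionately expensive.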
Research involving 14 large artificial intelligence models found that a question’s complexity correlates directly with the pollution it generates: a complex question can cause up to six times more emissions than a simple one, and the most advanced models consumed up to 50 times more energy than simpler systems to produce the same answer.
In fact, a single ChatGPT query is estimated to use five times more electricity than a Google search – roughly the energy needed to power a light bulb for 20 minutes.
The Threat of Overloading Data Centers
A report from the University of Cambridge warns that the rapid proliferation of AI could lead to a 25-fold increase in the tech sector’s energy consumption by 2040. This surge could place immense pressure on power grids and jeopardize global climate change objectives.
Already, companies like Google and Microsoft are witnessing a drastic increase in carbon emissions, despite their pledges towards “zero emissions”. Google, which has claimed to be carbon neutral since 2007 through offsets, conceded in its 2023 sustainability report that it no longer maintains “operational neutrality” due to the growth of AI.
Are We Heading Towards a More Polluted Future?
At present, data centers are responsible for approximately 1.5% of global emissions. However, by 2040, this percentage could escalate to a staggering 8%, overtaking the total emissions from global aviation.
Goldman Sachs forecasts that by 2030, data centers in the United States alone will account for 8% of the country’s total electricity consumption, a significant rise from 3% in 2022.
However, Bhargav Srinivasa Desikan, the lead author of a report from Cambridge, criticizes major tech companies for their lack of transparency about AI’s energy requirements – an information gap that makes it difficult for politicians and regulators to grasp the full picture.
As AI continues to rapidly evolve and the demand for “smart” solutions grows, a pressing question arises: Can we truly call it “innovation” if we fail to consider its environmental impact?