Artificial Intelligence (AI) presents itself as a sleek, digital marvel, but its environmental footprint tells a different, resource-intensive story. The technology's simple interface and instant responses mask a colossal appetite for water and energy, raising serious sustainability concerns.
The Invisible Cost Behind Every AI Chat
Since its explosive debut in late 2022, ChatGPT has defined a new era of on-demand intelligence. However, every smooth conversation with a machine is powered by a physically demanding process. Modern AI systems are built on vast neural networks trained by processing unimaginably large datasets—trillions of words, images, and numbers. This training relies on Graphics Processing Units (GPUs), chips that perform thousands of calculations at once but generate intense heat.
To prevent hardware failure, this heat must be removed, requiring significant energy for cooling. In many data centres, this is done through evaporative cooling systems, where air is blown through water-soaked pads, leading to substantial water loss. A single typical ChatGPT query is estimated to consume about 0.32 millilitres of water. When multiplied by the billions of daily interactions globally, the total water footprint becomes staggering.
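The scale of that multiplication can be sketched with a quick back-of-the-envelope calculation. The per-query figure comes from the estimate above; the daily query volume is an assumed round number for illustration, not a measured statistic.

```python
# Back-of-the-envelope estimate of daily water use from AI chat queries.
# WATER_PER_QUERY_ML comes from the estimate in the text; QUERIES_PER_DAY
# is an assumed round figure, used only to illustrate the scaling.

WATER_PER_QUERY_ML = 0.32          # estimated water per query, millilitres
QUERIES_PER_DAY = 1_000_000_000    # assumed: one billion queries per day

total_ml = WATER_PER_QUERY_ML * QUERIES_PER_DAY
total_litres = total_ml / 1000     # 1 litre = 1,000 millilitres

print(f"Daily water footprint: {total_litres:,.0f} litres")
# Under these assumptions: 320,000 litres per day
```

Even at a fraction of a millilitre per query, the assumed billion daily interactions translate into hundreds of thousands of litres of water every single day.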
Growing Demand Meets Scarce Resources
The problem is accelerating. The near-constant launch of new AI models increases the demand for resources. Researchers from the Massachusetts Institute of Technology estimate that training a single large language model (LLM) can use several million litres of fresh water. Furthermore, the International Energy Agency projects that by 2030, AI data centres could account for as much as 8% of global power demand.
As water scarcity intensifies, the industry is exploring alternatives. Deborshi Barat, head of public policy at S&R Associates, points to emerging solutions. "Technologies like liquid and immersion cooling show promise," he says. He also suggests recovering waste heat for industrial use, shifting to air-based cooling, and strategically locating data centres in less water-stressed regions.
Pathways to a More Sustainable AI Future
Experts argue that instead of a complete infrastructure overhaul, incremental steps can yield significant benefits. Sushant Singh from the International Institute for Sustainable Development notes that smaller language models can often match the accuracy of larger ones while being far less resource-intensive.
Software-level optimisations are also key. Barat highlights that model compression and carbon-aware scheduling of AI tasks can reduce computational loads and energy draw. However, Singh believes regulation will be crucial. "Every enterprise wants its own LLM, which reuses parts of existing models and consumes more resources," he explains. He proposes regulatory checks on model size or token counts.
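The idea behind carbon-aware scheduling is simple: deferrable AI workloads, such as batch training runs, are shifted to hours when the grid's carbon intensity is lowest. A minimal sketch, using a hypothetical hourly forecast rather than a live grid API:

```python
# Minimal sketch of carbon-aware scheduling. The hourly forecast of grid
# carbon intensity (gCO2 per kWh) is hypothetical example data; production
# schedulers would pull live figures from a grid operator and also weigh
# job deadlines and cluster capacity.

forecast = {
    0: 420, 3: 390, 6: 310, 9: 250,     # intensity falls as solar ramps up
    12: 180, 15: 210, 18: 380, 21: 430, # evening peak served by fossil plants
}

def cleanest_hour(intensity_by_hour):
    """Return the hour with the lowest forecast carbon intensity."""
    return min(intensity_by_hour, key=intensity_by_hour.get)

print(cleanest_hour(forecast))  # 12 — run the deferrable job at midday
```

The same computation runs either way; shifting *when* it runs changes how much of its electricity comes from low-carbon sources.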
Ultimately, while energy use may be inevitable, its source can be cleaner. "We can at least ensure it's clean," Singh asserts, advocating for mandates requiring data centres to power their operations solely with renewable or non-polluting energy like solar, hydro, or nuclear power. The challenge lies in balancing relentless AI innovation with the planet's finite resources.