AI Data Centers: Energy and Water Use Under Scrutiny

The rapid expansion of artificial intelligence is raising critical questions about the environmental cost of powering and cooling the massive data centers that underpin this technology. While some claims of extreme resource consumption have been disputed, the underlying issues remain significant. AI’s energy and water demands are increasing, putting pressure on already strained systems. This is not just an abstract concern; it’s a practical problem that requires urgent attention as AI becomes more integrated into daily life.

Debates Over Resource Consumption

Recent debate has centered on the accuracy of water-usage claims, particularly those surrounding OpenAI’s ChatGPT. CEO Sam Altman dismissed estimates of 17 gallons of water per chatbot query as “totally fake,” saying that OpenAI has moved away from evaporative cooling methods. That defense is complicated, however, by the fact that 56% of data centers still rely on evaporative cooling, a process that consumes substantial amounts of water. A 2026 report by Xylem and Global Water Intelligence projects that AI water consumption will surge nearly 130% by 2050.

The discussion over water use highlights a larger problem: the lack of transparency in how tech companies track and report environmental impact. Without verified data from OpenAI, Meta, and Google, it’s difficult to assess the full extent of their resource consumption.

The Scale of the Problem: Water Usage

Data centers are water-intensive facilities. Two Google data centers in Council Bluffs, Iowa, alone consumed 1.4 billion gallons of water in 2024. Meta’s facilities used approximately 1.39 billion gallons in 2023. These figures illustrate the sheer scale of water demand, even as companies like OpenAI claim to be shifting toward more sustainable practices.
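To put the Council Bluffs figure in perspective, a rough conversion into daily use and household-equivalents helps. This is a minimal sketch using the article’s 1.4 billion gallons/year number; the ~300 gallons/day per household baseline is an approximate U.S. average assumed here for illustration, not a figure from the article.

```python
# Back-of-envelope scale check for 1.4 billion gallons of water per year.
ANNUAL_GALLONS = 1_400_000_000
HOUSEHOLD_GALLONS_PER_DAY = 300  # assumed approximate average US household use

daily_gallons = ANNUAL_GALLONS / 365          # annual total -> gallons per day
household_equiv = daily_gallons / HOUSEHOLD_GALLONS_PER_DAY

print(f"{daily_gallons:,.0f} gallons/day, roughly {household_equiv:,.0f} households")
```

Under these assumptions, the two facilities draw on the order of 3.8 million gallons a day, comparable to the daily water use of a small city’s worth of households.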

The need for cooling is driven by the immense heat generated by AI training and operation. Just as smartphones and laptops throttle or fail when they overheat, powerful servers slow down or suffer damage without proper thermal management. The choice between water-intensive evaporative cooling and closed-loop systems, which recirculate water but typically draw more electricity for chillers, will shape how sustainable AI development becomes.

Energy Demands and Alternatives

AI also places a significant strain on energy grids. Generative AI chatbots consume more power than traditional search engines, with a single query requiring up to 10 times the electricity of a Google search. Google’s own data shows that a median Gemini text prompt uses 0.24 watt-hours of energy, while AI-generated videos demand far more.
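The per-query numbers above become more tangible when aggregated. The sketch below multiplies the article’s 0.24 Wh/prompt figure by a hypothetical daily prompt volume (one billion prompts/day is an assumption chosen for illustration, not a reported number).

```python
# Back-of-envelope: daily energy for AI text prompts.
WH_PER_PROMPT = 0.24                 # median Gemini text prompt (per the article)
PROMPTS_PER_DAY = 1_000_000_000      # hypothetical volume, for illustration only

daily_wh = WH_PER_PROMPT * PROMPTS_PER_DAY
daily_mwh = daily_wh / 1_000_000     # watt-hours -> megawatt-hours

print(f"{daily_mwh:,.0f} MWh per day")
```

At these assumptions, text prompts alone would consume roughly 240 MWh per day, before counting the far more demanding video-generation workloads.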

The industry is exploring renewable alternatives, with OpenAI investing in solar and battery storage. Other major tech players like Meta, Microsoft, and Amazon have also expanded their solar power use. However, these renewable sources currently supplement, rather than replace, reliance on fossil fuels in most data center grids.

The Path Forward: Transparency and Sustainability

The debate around AI and resource consumption is evolving from speculation to data-driven scrutiny. Communities and policymakers are demanding greater transparency and sustainable practices to ensure AI’s growth doesn’t come at the expense of local resources. Balancing technological innovation with environmental responsibility is no longer optional; it’s essential. As AI continues to advance, the industry must prioritize sustainable cooling solutions, renewable energy adoption, and open reporting of its environmental footprint to mitigate its impact.