The Hidden Cost of Intelligence: How AI’s Energy Appetite Challenges Corporate ESG Policies

Elie Azzi | Client Relationship Partner

Artificial Intelligence (AI) is often described as the engine of the modern digital economy, a powerful force transforming how we work, create, and compete. But as the corporate world races to embed AI into every process and platform, an uncomfortable truth is surfacing: intelligence, it turns out, is energy hungry.

From the outside, AI feels immaterial, a virtual assistant that replies instantly, a chatbot that never sleeps, a model that analyses terabytes of data in seconds. But behind the friendly interface sits an enormous physical infrastructure: warehouses of computers stacked floor to ceiling, cooled by vast air and water systems, consuming electricity around the clock.

This is not a minor environmental footnote. According to the International Energy Agency (IEA), data centres worldwide used roughly 415 terawatt-hours (TWh) of electricity in 2024, around 1.5% of total global consumption. By 2030, this figure could more than double to approximately 945 TWh, propelled largely by the surge in AI adoption. To make that relatable, that’s about the same as Japan’s total annual power use or the electricity needed to supply 350 million average UK homes.

And that’s before considering water. Research from the University of California, Riverside (Ren et al., Making AI Less Thirsty, 2023) estimates that AI-driven data centres could consume between 4.2 and 6.6 billion cubic metres of water annually by 2027, enough to meet the yearly water needs of around 33 million UK households. In simple terms: every second, AI uses roughly as much water as one family consumes in an entire year.
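Those household comparisons can be checked with simple arithmetic. The sketch below uses the projection quoted above plus an assumed figure of roughly 130 cubic metres of water per UK household per year; both comparisons come out at the same order of magnitude as the claims in the text.

```python
# Back-of-envelope check of the water comparisons above (Ren et al., 2023).
SECONDS_PER_YEAR = 365 * 24 * 3600  # ~31.5 million seconds

# Projected annual AI data-centre water use by 2027, in cubic metres.
low, high = 4.2e9, 6.6e9

# Assumed typical UK household water use: ~130 m3 per year.
household_m3_per_year = 130

# How many households could the low estimate supply for a year?
households_millions = low / household_m3_per_year / 1e6
print(f"{households_millions:.0f} million households")  # ~32 million

# Water drawn per second at the high estimate, in "household-years".
per_second_households = high / SECONDS_PER_YEAR / household_m3_per_year
print(f"{per_second_households:.1f} household-years per second")
```

The per-second figure lands between one and two household-years, consistent with the "one family per second" framing.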

These numbers illustrate a growing paradox. Many of the same corporations driving the AI revolution are also those most committed to ambitious Environmental, Social, and Governance (ESG) targets. They’ve pledged net-zero emissions, water positivity, and circular operations, yet their new digital engines may quietly undermine those very goals.

The AI–ESG Dilemma

Corporate boards today face a unique sustainability challenge. AI is no longer a niche investment; it’s a strategic imperative. Whether it’s automating customer service, accelerating product design, forecasting demand, or optimising logistics, AI promises higher efficiency, lower costs, and better decision-making.

But these benefits come with a hidden carbon and water cost. Training a large AI model, such as a language model used in search or enterprise platforms, can emit hundreds of tonnes of CO₂, depending on where it’s hosted and how it’s powered. Even daily AI use (“inference”) across millions of queries can create significant continuous energy demand.

Ironically, AI is also being deployed to help sustainability, for example, by improving renewable energy forecasts, managing grid load, or optimising supply chains to reduce waste. This creates a fascinating duality: AI may contribute to emissions in the short term, even as it helps cut them elsewhere in the economy.

Executives are therefore asking a difficult question: Does AI’s efficiency dividend offset its environmental footprint?

The answer, at least for now, is not quite, but the gap is closing.

Efficiency Gains: Real but Outpaced

AI companies and cloud providers are racing to make data centres more efficient. The focus is on two key metrics:

  • PUE (Power Usage Effectiveness): the ratio of a data centre’s total energy use to the energy consumed by its computing equipment alone — the closer to 1.0, the less is lost to cooling and overheads.
  • WUE (Water Usage Effectiveness): how much water is consumed per unit of computing energy, mostly for cooling.

Both metrics have improved steadily, with leading operators experimenting with zero-water cooling, heat recycling, and renewable energy contracts. Microsoft, for example, has pledged to eliminate water use in new AI-focused data centres, and Google has committed to 24/7 carbon-free energy by 2030.

Yet despite these efforts, overall demand continues to rise faster than efficiency gains. Each new generation of AI models is exponentially larger and more complex. What’s saved per server is quickly erased by the sheer scale of adoption. The world’s appetite for AI continues to grow faster than the industry’s ability to make it cleaner.

The Growing Footprint and Its Tangible Scale

To make this easier to picture, let’s put AI’s impact in everyday terms:

  • By 2030, AI-related electricity demand could reach 945 TWh per year, roughly three times the UK’s total annual electricity consumption.
  • During that same period, the associated water use could equal over 4 billion cubic metres per year, the equivalent of filling 1.6 million Olympic swimming pools annually.

At this scale, the sustainability challenge is not theoretical; it’s immediate.
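The electricity and water comparisons above can be reproduced with round numbers. The sketch below assumes an average UK home uses about 2,700 kWh of electricity per year and an Olympic pool holds about 2,500 cubic metres; both are assumed figures for illustration, not sourced measurements.

```python
# Sanity-checking the 2030 comparisons with assumed round figures.
ai_demand_twh = 945          # projected AI-related demand by 2030 (IEA)

# Assumed average UK home electricity use: ~2,700 kWh per year.
uk_home_kwh = 2700
homes_supplied_millions = ai_demand_twh * 1e9 / uk_home_kwh / 1e6  # TWh -> kWh
print(f"{homes_supplied_millions:.0f} million homes")  # ~350 million

# Assumed Olympic pool volume: ~2,500 m3; 4 billion m3 of water a year.
pools_millions = 4e9 / 2500 / 1e6
print(f"{pools_millions:.1f} million pools")  # 1.6 million
```

Both results match the figures quoted earlier in the article, which is a useful habit when republishing third-party statistics.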

Where AI Could Pay Back Its Environmental Cost

So, can AI ultimately pay back its environmental cost?

The encouraging answer is that AI’s biggest environmental benefits lie outside the data centre, in how it’s applied. When deployed intelligently, AI can deliver dramatic sustainability improvements across industries:

  • Energy Systems: AI improves renewable forecasting, reduces grid instability, and helps balance supply and demand more effectively, allowing higher penetration of solar and wind energy.
  • Manufacturing and Logistics: Predictive analytics reduce waste, optimise production, and cut transport emissions.
  • Buildings and Cities: AI-driven systems can lower heating and cooling use by up to 20%, saving both energy and water.
  • Agriculture: Machine learning optimises irrigation and fertiliser use, increasing yields while reducing environmental impact.

These downstream benefits can, over time, outweigh AI’s own energy footprint, if AI use is focused on efficiency-enhancing, rather than purely convenience-driven, tasks.

That means companies need to treat AI compute like capital: a finite, valuable resource that must earn its environmental return.

The Role of Responsible Usage

Another layer of the conversation concerns how AI is used. Many people assume that once a model is trained, using it is relatively energy-light. In reality, every query still draws power, and the cost rises with the length and complexity of each interaction.

To explore this further, we spoke to our AI Expert Hugh Abbott, a specialist in AI prompt programming, who shed light on how user behaviour itself influences AI’s environmental footprint.

Interview with Our AI Expert Hugh Abbott

Hugh explained that every time you send a message to a large language model, the entire conversation history is re-processed. “That means each new query doesn’t just process what you’ve written, it also processes everything you’ve said before. The longer the conversation, the more computation and energy are required.”

He explained that the computational cost grows quadratically with the length of the prompt: double the text, and you need four times the compute. It’s a built-in property of the transformer architectures that power modern AI.

To help illustrate, he made it simple: “A 10-minute conversation with a big model uses about 5 watt-hours of energy, roughly the same as running a small LED bulb for half an hour. That sounds trivial, but scale it up across millions of users chatting all day, and you’re lighting up cities.”
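Hugh’s figures can be turned into a quick worked example. The numbers below are assumptions for illustration only — a 10 W LED bulb and ten million daily users are chosen for round arithmetic, not measured values.

```python
# Scaling the illustrative "~5 Wh per ten-minute chat" figure.
wh_per_session = 5

# An assumed small LED bulb draws ~10 W, so 5 Wh is 30 minutes of light.
led_watts = 10
minutes_of_light = wh_per_session / led_watts * 60
print(minutes_of_light)  # 30.0

# Assumed: ten million users, one session a day, for a year.
sessions_per_year = 10_000_000 * 365
gwh_per_year = wh_per_session * sessions_per_year / 1e9  # Wh -> GWh
print(f"{gwh_per_year:.2f} GWh per year")  # 18.25 GWh
```

Even at this modest assumed scale, individually trivial sessions aggregate into gigawatt-hours — which is the point of the “lighting up cities” remark.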

When asked whether smarter prompts (shorter, more precise instructions) could meaningfully reduce energy use, he nodded. “Theoretically, yes. The less unnecessary text you send, the less the model has to process. But do I think people will change their habits? Probably not. It’s like me deleting files on my old 13-megabyte Mac to save some space: possible, but nobody bothers when storage feels free.”

The deeper issue, he noted, is user detachment. “You don’t see the electricity bill when you use AI. It’s bundled into your apps and your devices. Until people feel the cost financially or ethically, they won’t act differently.”

He also highlighted how AI provokes an emotional response unlike other technologies. “There are three themes people can’t resist: AI will take all the jobs, AI will turn into the Terminator, and AI will consume too much energy. The last one is probably the most real.”

Intriguingly, our AI Expert also touched on the future of data centres and suggested that one of the more imaginative solutions might not be so far-fetched. He welcomed the idea of building data centres in space, calling it “feasible and within human capability.”

“With new technologies drastically lowering the cost per kilo to launch facilities into orbit, and the potential for limitless solar energy, it’s not impossible to imagine the first space-based data centre in our lifetime,” he said. “It could be one of the boldest and cleanest frontiers of sustainable computing.”

Still, he stressed that industry leaders on Earth are already incentivised to make AI greener. “It’s good for the planet and good for the pocket,” he said.

“The future of AI sustainability will depend as much on how intelligently we use it as on how intelligently we build it.”

A Corporate Roadmap for Responsible AI Growth

What does all this mean for corporate strategy? For most organisations, the solution isn’t to slow down AI adoption; it’s to integrate environmental intelligence into AI governance from the start.

  1. Measure and Disclose: Treat AI workloads as part of your carbon and water accounting. Include them in sustainability reports and set internal efficiency targets per model or per query.
  2. Select Cleaner Providers: Choose cloud partners that can verify renewable energy sourcing and water-efficient cooling. Location matters: running the same AI workload on the UK’s relatively low-carbon grid can produce roughly a third of the emissions it would in regions with higher carbon intensity.
  3. Design Efficiently: Encourage teams to optimise prompts, use smaller models where possible, and schedule training runs during periods of low-carbon electricity supply.
  4. Reward Sustainable Innovation: Tie cost savings from efficiency improvements to sustainability KPIs; in short, make efficiency profitable.
  5. Educate Users: Internal campaigns encouraging “responsible AI use” (shorter prompts, minimal context, conscious use) can make a meaningful cumulative impact.
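Point 3’s suggestion to schedule training during low-carbon periods can be sketched in a few lines. The hourly intensity figures below are invented placeholders; a real implementation would pull forecasts from a grid carbon-intensity data service rather than a hard-coded table.

```python
# A minimal sketch of carbon-aware scheduling (roadmap point 3): choose
# the lowest-carbon window for a deferrable training run.
# Hour-of-day -> forecast grid carbon intensity (gCO2/kWh), assumed values.
forecast = {
    0: 120, 4: 90, 8: 210, 12: 260, 16: 300, 20: 180,
}

def greenest_hour(forecast: dict) -> int:
    """Return the hour with the lowest forecast carbon intensity."""
    return min(forecast, key=forecast.get)

start = greenest_hour(forecast)
print(f"Schedule the training run for {start:02d}:00")
```

The same pattern generalises: any deferrable, energy-hungry job (batch training, index rebuilds) can be shifted toward the cleanest forecast window without changing the work itself.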

In essence, sustainability in AI is no longer a back-end engineering issue; it’s a leadership responsibility.

A Future Balanced Between Progress and Planet

AI has already become a defining force of our era, as central to the 2020s as the internet was to the 1990s. Its potential to accelerate innovation, insight, and productivity is extraordinary. But its environmental impact cannot be ignored.

The same creativity that allows us to build machines that think must now be channelled into making them sustainable. Whether through greener data centres, responsible user behaviour, or even ambitious ventures like space-based computing, the next chapter of AI must align technological ambition with planetary boundaries.

As our AI Expert put it, the balance is possible, “if we make intelligence responsible before it becomes indispensable.”

Sources:
International Energy Agency (IEA, 2025); Microsoft Environmental Sustainability Report (2024); Ren et al., Making AI Less Thirsty (2023); BloombergNEF (2025); Carbon Brief (2025); OpenAI Energy Estimates (2024); Cambridge Bitcoin Electricity Index (2025).

About the author, Elie Azzi.

Elie brings 11 years of experience in Marketing and Business Development, spanning financial services, insurance, and consulting. A natural problem-solver, he is known for optimising processes and delivering innovative solutions for clients.