Analyst Gartner’s most recent forecast of datacentre electricity consumption suggests that datacentres are likely to require approximately 1,200TWh (terawatt-hours) of electricity by 2030, a 20% increase on its forecast a year earlier.

According to Gartner, power consumption of artificial intelligence (AI)-optimised servers that use graphics processing units (GPUs) is expected to rise to around 156GW (gigawatts), reflecting both the scale and pace of AI infrastructure adoption.

During a keynote presentation at the Microsoft AI Summit in London, which took place at the end of February, Microsoft CEO Satya Nadella spoke about AI energy efficiency in terms of the amount of electricity consumed to process snippets of information – known as tokens – that constitute the phrases and keywords that form a natural language query submitted to a generative AI (GenAI) engine.

As the company continues to expand the Microsoft Azure cloud and AI datacentres, Nadella said: “We are making sure that we have renewable energy powering all of our datacentre footprint. We have 100% renewable power today that is powering all of Azure, and we’re very proud to build that base and essentially stimulate renewable energy around the world and in the UK.”

The smallest measurable unit of work in the AI world is the token, and, at least from Nadella’s perspective, the goal is not only to reduce the energy needed to process a token, but to do so in a cost-effective manner. As such, IT decision-makers need to be cognisant of both the absolute processing cost and the carbon footprint for AI workloads.
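That framing – cost and carbon per token – can be sketched as a back-of-envelope calculation. The figures below are purely illustrative assumptions (per-token energy, electricity price and grid carbon intensity are not given in this article); the point is the shape of the arithmetic, not the numbers.

```python
# Back-of-envelope cost and carbon per token.
# ALL constants below are assumptions for illustration, not sourced figures.

JOULES_PER_TOKEN = 0.5        # assumed energy to process one token
PRICE_PER_KWH_GBP = 0.25      # assumed electricity price (GBP/kWh)
GRID_G_CO2_PER_KWH = 200      # assumed grid carbon intensity (gCO2/kWh)

def per_million_tokens(joules_per_token: float) -> tuple[float, float]:
    """Return (cost in GBP, emissions in gCO2) for one million tokens."""
    kwh = joules_per_token * 1_000_000 / 3_600_000  # joules -> kWh
    return kwh * PRICE_PER_KWH_GBP, kwh * GRID_G_CO2_PER_KWH

cost, co2 = per_million_tokens(JOULES_PER_TOKEN)
print(f"1M tokens ~ {cost:.3f} GBP, {co2:.1f} gCO2 under these assumptions")
```

Swapping in an organisation's own measured figures turns the same two-line calculation into the joint cost-and-carbon view of AI workloads that Nadella's framing implies.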

As Shane Herath, chair of Eco-Friendly Web Alliance, notes: “If we are to avoid a future where AI growth is decoupled from our planetary boundaries, we must move beyond the idea that hyperscalers are the sole curators of the carbon footprint.”

Herath believes that true sustainability requires a recalibrated landscape where enterprises and individuals become active participants in a “digital diet”.

Daniel Smith, CEO of Astralis Technology, warns: “Every AI model trained, every dataset retained indefinitely, every compute-intensive workload spun up without scrutiny contributes incrementally to the overall footprint. Multiply that across thousands of organisations and the cumulative effect is substantial.”

Smith urges IT leaders to “do their bit”, which means assessing their AI requirements. For Smith, IT leaders need to make informed choices about whether their organisation genuinely needs any given AI workload to run continuously, and then make a genuine assessment of the AI models being deployed. He adds: “Are we optimising model size and training frequency, or defaulting to brute force compute?”

Beyond AI itself, Smith urges IT leaders to consider their organisation’s legacy systems and data estates. He says IT leaders should consider whether these are being rationalised or whether AI capabilities are just being laid on top of them.

“Environmental accountability in AI is not about restraint for its own sake,” he says. “It is about intelligent demand management and applying the same discipline to compute consumption that many organisations already apply to financial spend or cyber risk.”

Smith recommends that IT leaders reassess their organisation’s sustainability roadmaps given the rise in usage of enterprise AI. What they should not do, according to Smith, is defer or suspend them to build out the organisation’s AI strategy unhindered by environmental concerns.

“Too often, sustainability strategies are treated as parallel initiatives that are well-intentioned, but secondary to ‘core’ digital transformation. AI changes that equation. It amplifies both the opportunity and the risk,” he says.

In other words, sustainability metrics should influence architectural decisions rather than merely being used to satisfy the reporting needs for environmental impact and sustainability key performance indicators.

AI datacentre planning

The expansion of UK datacentre capacity is unfolding in an increasingly chaotic and uncoordinated manner. According to Luke Sperrin, senior practice lead for energy at Digital Catapult, planning authorities have been inundated with simultaneous applications, with more than 60 separate planning applications for the construction of new datacentres filed in England and Wales in 2025. This, he says, is creating significant local strain and signalling a lack of national oversight.

Sperrin warns that the geography of datacentre deployment is similarly imbalanced, with the largest clusters concentrated around London Docklands and Slough, two of Europe’s most mature and interconnected digital hubs.

“As AI servers become more power‑dense, datacentre connection requests – often sized to reflect anticipated final capacity – are placing increasing demands on electricity networks, prompting providers to explore alternative solutions that may carry environmental trade-offs,” he says.

There is a lack of standardised carbon accounting for digital workloads, which, for Sperrin, means their environmental impact remains opaque and poorly quantified.

An alternative interface for human-computer interaction

One of the topics the Microsoft chief discussed during his keynote was how the user experience conversation has moved on from a slick graphical user interface (GUI) to a simple command line prompt, where the real capability sits behind a powerful GenAI model that interprets language in a way that feels more natural to a human.

But as Herath points out, there is a hidden cost behind every GenAI prompt: “The energy gap between a standard web search and an AI-generated query has become a chasm. While a traditional Google search might draw a negligible amount of power, a single interaction with a generative AI model can consume 10 times that amount.


“If that query includes image or video generation, the energy draw spikes further. Generating one high-resolution AI image can consume the equivalent of half a smartphone charge.”

For most people, these costs remain invisible. Herath warns that when a user prompts an AI to “summarise this email” or “draw a cat in a dinner jacket”, these simple phrases trigger a cascade of high-density compute in a facility often hundreds of miles away. “This creates a rebound effect – it is because the technology feels free and effortless [that] we use it frivolously,” he adds.
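Herath's comparison can be made concrete with a rough daily tally. The baseline figures here are assumptions (the article gives ratios, not absolute numbers): a web search is taken as 0.3Wh and a full smartphone charge as 15Wh, with the article's own multipliers – an AI query at 10 times a search, an AI image at half a phone charge – applied on top.

```python
# Rough daily energy tally for mixed usage.
# Baseline figures are ASSUMPTIONS; multipliers follow the quotes above.

SEARCH_WH = 0.3                  # assumed energy per web search
AI_QUERY_WH = SEARCH_WH * 10     # "10 times that amount"
PHONE_CHARGE_WH = 15.0           # assumed full smartphone charge
IMAGE_WH = PHONE_CHARGE_WH / 2   # "half a smartphone charge"

def daily_energy_wh(searches: int, ai_queries: int, images: int) -> float:
    """Total energy (Wh) for one day's usage under the assumptions above."""
    return (searches * SEARCH_WH
            + ai_queries * AI_QUERY_WH
            + images * IMAGE_WH)

# 20 searches, 20 AI queries and 2 generated images:
print(daily_energy_wh(searches=20, ai_queries=20, images=2))  # 81.0 Wh
```

Even with these modest assumptions, the AI queries and images dominate the total – the "rebound effect" Herath describes is visible in the arithmetic.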

Who pays?

While the United Nations’ Sustainable Development Goal (SDG) 12 advocates the “efficient use of natural resources”, the current AI economy encourages high-volume, low-intent consumption.

The real cost of AI infrastructure is no longer hidden. As Craig Wentworth, principal analyst at TechMarketView, observes, for much of the past decade, cloud economics allowed energy consumption to be abstracted away from enterprise decision-making. Hyperscalers invested at scale, efficiencies improved and sustainability narratives focused on relative gains versus on-premises infrastructure.

“AI changes that equation because its workloads change the scale, timing and concentration of energy demand,” he says. “Unlike earlier waves of cloud adoption, AI infrastructure drives sustained high-intensity compute, exacerbates peak demand pressures, and accelerates the need for grid reinforcement and transmission upgrades.”

Public investment in energy infrastructure has always underpinned economic development, and AI datacentres are increasingly framed as critical national infrastructure (CNI). But as Wentworth points out, once AI infrastructure becomes visible at this level, the question of who pays becomes unavoidable. “Simply treating AI growth as a public good does not absolve private actors of responsibility,” he adds.

Should Microsoft, Google and Amazon cover the full societal cost of their datacentres? Herath believes they need to pay their fair share. As an example, he says that Microsoft is already supporting rate structures in places such as Wisconsin that charge very large customers the full cost of the power they require, which prevents the financial burden of grid upgrades from falling on local families.

However, Herath adds: “There is a moral hazard in letting the user – whether a global bank or an individual hobbyist – off the hook. If the environmental burden is entirely internalised by the provider, the user has no incentive to change their behaviour.”

As the conversation around who pays for AI’s environmental impact evolves, it is likely that ordinary people, who are now getting started with tools such as ChatGPT, will be drawn into the conversation.

If they are charged a fee for usage, that could effectively kill off the adoption of AI queries as a replacement for free internet searches. But there is an environmental cost either way, so perhaps what is needed is greater public awareness of AI’s significant carbon footprint.
