Part 4 of 5 in a blog series looking at AI’s journey since the release of ChatGPT in November 2022.
In the last instalment of this blog series, we saw how the real framing of the AI sustainability question is this: “Does AI reduce the total energy required to achieve meaningful outcomes?”
To answer this question, we need to look at more than server racks, and take a systems perspective. The total energy cost must take into account human labour, organisational processes, and the physical world in which humans and machines co-exist. This makes it much harder to assess.
The first thing to note is that reducing the cost per computation doesn’t necessarily reduce overall cost, because of the rebound effect (sometimes called the Jevons paradox). Cheaper compute makes it cost-effective to run more tasks, explore more scenarios, generate more content, and demand more infrastructure. The net effect of falling energy cost per operation can be to increase absolute energy demand.
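A back-of-envelope calculation makes the rebound effect concrete. All figures below are invented for illustration, not measurements: the point is simply that when demand grows faster than per-operation efficiency improves, total energy use rises.

```python
def total_energy(operations, joules_per_op):
    """Total energy in joules for a given workload."""
    return operations * joules_per_op

# Baseline (hypothetical): 1 million operations at 10 J each.
before = total_energy(1_000_000, 10.0)   # 10 MJ

# Efficiency improves 5x (10 J -> 2 J per operation), but cheaper
# compute triggers 8x more demand (hypothetical growth factor).
after = total_energy(8_000_000, 2.0)     # 16 MJ

# Per-operation cost fell, yet absolute energy demand rose.
assert after > before
```

Whether the rebound outweighs the efficiency gain in practice depends entirely on how demand responds, which is why per-operation metrics alone can’t answer the sustainability question.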
However, when AI genuinely replaces repetitive cognitive labour – the human grunt work of searching, integrating, cross-checking, drafting, and reconciling – the potential energy savings at a societal level are enormous. The baseline for this aspect of the calculation is the human brain. Without romanticising human cognition or rejecting automation, it’s vital to recognise that millions of years of evolution have made the brain astonishingly energy-efficient. It runs on roughly 20 watts, a fraction of the power consumed by even a mid-range laptop. Yet it perceives, integrates, abstracts, judges, and invents in ways machines still cannot match.
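To put that 20-watt figure in perspective, here is a rough comparison over a working day. The brain’s ~20 W is the figure cited above; the laptop figure is an assumed typical load, not a measurement.

```python
BRAIN_W = 20    # approximate power draw of the human brain (from the post)
LAPTOP_W = 60   # assumed draw of a mid-range laptop under load

# Energy consumed over an 8-hour working day, in watt-hours.
brain_wh = BRAIN_W * 8     # 160 Wh
laptop_wh = LAPTOP_W * 8   # 480 Wh

print(f"Brain: {brain_wh} Wh/day, laptop: {laptop_wh} Wh/day "
      f"(~{laptop_wh / brain_wh:.0f}x)")
```

And that laptop, unlike the brain, is only the client end: the data-centre hardware serving its AI queries draws orders of magnitude more.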
This means that AI’s true energy saving potential lies in supporting human decision-making, not replacing it. And there’s a great operational fit. Unlike modern software, including AI, the brain didn’t evolve to exhaustively compute every possibility. It evolved to prioritise what matters, ignore what doesn’t, and make good-enough decisions quickly. AI will be most sustainable – and most transformative – when it helps humans focus on judgement, creativity, and care, rather than on repetitive, low-value work.
There is also a third aspect that we need to consider. Looking beyond individual tasks, the sustainability opportunity isn’t merely in cutting the energy cost of providing a response to an LLM query. It’s in reducing the total energy cost across the collaborative processes required to get anything done in the world:
- Consultations and meetings
- Back-and-forth iterations
- Compliance checks
- The impact of unexpected events and delays
AI has the potential to reduce total energy consumption by reducing human friction. This would be a genuine sustainability win that goes far beyond infrastructure optimisation. But to unlock that win, we have to uncover the hidden energy in the world of human work and look closer at the meaning of efficiency.
To learn more, continue following this blog series.

