Part 3 of 5 in a blog series looking at AI’s journey since the release of ChatGPT in November 2022.
In this article, I’ll start looking at another issue with modern AI: the electricity it consumes.
The part of the story that’s easy to grasp is the huge data centre build-out. You can assess the impact of construction in square footage, GPU power draw in megawatts (or, increasingly, gigawatts), and cooling systems in running costs. But there’s another kind of energy footprint, one that is far harder to measure, potentially far larger, and fundamentally dependent on the reliability issues discussed in earlier articles in this series.
This is the energy cost of human time – the countless hours people spend on work that AI is increasingly being called on to replace, accelerate, or augment. When we look at AI’s environmental impact purely through the lens of hardware electricity usage, we miss the broader system costs of doing human work the old way, the subtle ways AI reshapes energy consumption across society now, and how it could potentially do so much more efficiently in future.
Human mental work is energy-heavy. Humans compute using food, warm buildings, transport networks, and social infrastructure. When safety engineers, auditors, or analysts spend days or weeks manually reconciling datasets and documents, that’s not just time lost. It’s a flow of embodied energy through every part of their work environment:
- The calories they burn and the food production system that supplied them
- The lights, screens, and heating in the office
- The buildings that house them
- The transport they use to get there
- The downstream healthcare, housing, and services that sustain them
And then there’s the energy cost of delay. A patient waiting longer for a diagnosis consumes resources in the form of extended hospital stays, support services, caregiver time, anxiety and its physiological toll. A logistics network slowed by hours costs fuel and friction. The energy cost of people waiting for a solution piles up in ways that aren’t visible on a data centre’s power bill.
When AI can resolve a complex analytical or decision task in seconds, the direct compute energy can look large. But if that AI also collapses delays, reduces repeated work, and prevents needless travel or meetings, then the total system energy may be far lower than the old way of doing things.
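That comparison can be made concrete with a back-of-envelope calculation. The sketch below is illustrative only: the per-person-hour embodied-energy rate, the AI compute cost, and the task durations are all placeholder assumptions I’ve invented for the example, not measured figures.

```python
# Illustrative only: every number here is a placeholder assumption,
# not a measured value.

def task_energy_kwh(hours: float, people: int,
                    embodied_rate_kwh_per_person_hour: float) -> float:
    """Embodied energy of human work time: office, transport,
    food system, and supporting services, rolled into one
    assumed per-person-hour rate."""
    return hours * people * embodied_rate_kwh_per_person_hour

# Hypothetical scenario: two analysts spend three working days
# (24 person-facing hours each side of the ledger below) manually
# reconciling documents, versus one AI run plus an hour of human review.
human_only = task_energy_kwh(hours=24, people=2,
                             embodied_rate_kwh_per_person_hour=2.0)

ai_compute_kwh = 5.0  # assumed direct compute cost of the AI run
ai_assisted = ai_compute_kwh + task_energy_kwh(
    hours=1, people=1, embodied_rate_kwh_per_person_hour=2.0)

print(f"human-only: {human_only} kWh, AI-assisted: {ai_assisted} kWh")
```

The point of the sketch isn’t the specific numbers, which are made up, but the shape of the comparison: once you count embodied human-time energy on both sides, the AI’s direct compute can be a small term in a much larger system total.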
So, the real framing of the AI sustainability question is this: “Does AI reduce the total energy required to achieve meaningful outcomes?”
In the next article of this series, I’ll start looking at the answer, and we’ll see how it depends on the discussions in previous articles.