Efficient AI that consumes more: the return of Jevons paradox

The more efficient AI becomes, the more it gets used, and the more energy it consumes. Jevons paradox, formulated in 1865, is back at the center of the tech debate. Where will the energy come from to power the AI wave?

Gaetano Castaldo
07 Jul 2025
AI Sustainability Energy #AI #Jevons paradox #nuclear energy #sustainability #data center #energy efficiency
[Image: conceptual illustration of Jevons paradox applied to artificial intelligence, with efficiency generating greater energy consumption]

In the debate around artificial intelligence, there's a lot of talk about models, efficiency, and plummeting costs. Much less about energy. Yet, if we look carefully, AI risks bringing back to center stage an old economic paradox from the 19th century: Jevons paradox.

The paradox that comes from coal

William Stanley Jevons, studying the coal industry, noticed something counterintuitive: making steam engines more efficient did not reduce coal consumption, it increased it. The more efficient an engine becomes, the more economical it is to run, so the more it gets used. Result: total consumption grows.

Today we're replicating the same pattern with AI.

The efficiency that multiplies use

Models become lighter, hardware more powerful, energy efficiency improves. The cost to run a single "call" to an advanced model has crashed compared to a few years ago. This makes AI much more accessible: it integrates everywhere, in every process, in every app.

But precisely because it costs less and is more powerful, it tends to be used more and more, by more companies, for more things.

The paradox is here: every operation consumes less, but the number of operations explodes. Efficient AI doesn't automatically reduce overall consumption, it can actually increase it significantly. Some studies explicitly talk about a "digital Jevons paradox": Jevons paradox applied to the world of data centers and artificial intelligence.
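The arithmetic behind this rebound effect is simple enough to sketch. The following snippet uses purely illustrative numbers (they are assumptions, not measurements from any study): if per-operation energy falls 10x but cheaper inference drives usage up 50x, total consumption still rises 5x.

```python
# Illustrative rebound-effect arithmetic. All figures below are
# assumptions chosen to show the mechanism, not real measurements.

def total_energy(energy_per_op_wh: float, ops: int) -> float:
    """Total energy in watt-hours: per-operation cost times operation count."""
    return energy_per_op_wh * ops

# Before: an expensive model, used sparingly.
before = total_energy(energy_per_op_wh=3.0, ops=1_000_000)

# After: 10x more efficient per call, but 50x more calls
# because low cost makes it worth embedding everywhere.
after = total_energy(energy_per_op_wh=0.3, ops=50_000_000)

print(f"before: {before:,.0f} Wh")        # 3,000,000 Wh
print(f"after:  {after:,.0f} Wh")         # 15,000,000 Wh
print(f"ratio:  {after / before:.1f}x")   # 5.0x
```

Whether the rebound overshoots like this depends on how elastic demand is with respect to cost, which is exactly the empirical question the "digital Jevons paradox" studies try to answer.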

The uncomfortable question: where will the energy come from?

If we accept this diagnosis, the next question is inevitable: where will all the energy come from to power this wave of computing? And that's where nuclear enters the picture.

Large digital players are already moving. Long-term agreements to buy energy from existing or reactivated nuclear plants, twenty-year contracts to dedicate entire facilities to data center demand: these are clear signals. If AI becomes critical infrastructure, we'll need stable, dispatchable electrical sources with low emissions to support it.

Renewables + nuclear: the necessary mix

Renewables – solar, wind, storage – are and will remain fundamental. But their intermittent nature makes them, at least for now, difficult to use as the sole pillar supporting continuous, inflexible demand like data centers.

Hence the renewed interest in "next-generation" nuclear, more modular, safer, integrated in hybrid energy mixes alongside renewables.

This doesn't mean ignoring the challenges: high initial costs, long implementation timelines, waste management, social acceptance. But ignoring the issue would be equally shortsighted. If AI continues to expand at its current pace, the discussion about where and how to produce energy will become as central as the one about models and algorithms.

Efficiency isn't enough: we need rules

The core message is that we can't rely on technology efficiency alone to "save us" on the energy and environmental front. In fact, precisely because AI becomes more efficient, it risks being used everywhere, pushing consumption up and creating new capacity needs.

We need two things in parallel:

  1. A decisive acceleration toward clean, abundant, stable sources
  2. Rules, incentives, and best practices that direct AI use toward scenarios that are genuinely useful and sustainable, not toward pure "use for the sake of using"

A challenge that goes beyond technology

The challenge isn't just technical, it's political, economic, and social. It concerns how we organize work, update skills, plan infrastructure that will be with us for decades.

In the end, Jevons paradox applied to AI reminds us of a simple truth: every time we make something more efficient and cheaper, we end up using more of it.

It's up to us to decide whether to accompany this dynamic with an energy strategy worthy of the task, or to suffer it, hoping that technology alone does the job for us.


Key Takeaways:

  • AI efficiency leads to greater use, not reduced consumption (Jevons paradox)
  • Data centers require stable, dispatchable energy sources
  • Next-generation nuclear is returning to the debate as a complement to renewables
  • Rules and incentives are needed to direct AI use toward sustainable scenarios
  • The energy challenge of AI is political, economic, and social, not just technological

Gaetano Castaldo · Sole 24 Ore

Founder & CEO · Castaldo Solutions

Digital transformation consultant with enterprise experience. I help Italian SMEs adopt AI, CRM, and IT architectures with measurable results in 90 days.
