Plain AI in plain English

Looking ahead · the climate angle

AI & energy

30-second gist~30s read

Each individual chat is small — about the same energy as a few Google searches. But the totals are big and growing. Datacentres training and serving AI use serious electricity, and most of the new demand is being met with whatever's available, including reopened gas plants.

The "should I worry?" answer for an individual user is: a little. For the industry: a lot.

If you want more

The numbers, as best we know~1 min

A single chatbot response uses on the order of a few watt-hours, roughly 5-10× a regular Google search. Generating a batch of ten images uses more still. Training a frontier model draws electricity on the scale of a small town's usage, for weeks.
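Those per-query figures make a rough personal footprint easy to sketch. The per-search and per-chat values below are illustrative assumptions chosen to be consistent with the "few watt-hours, roughly 5-10× a search" range above, not measurements of any particular service:

```python
# Back-of-envelope personal AI energy footprint.
# Both per-query figures are assumptions, not measured values:
WH_PER_SEARCH = 0.3   # assumed energy per ordinary web search, in watt-hours
WH_PER_CHAT = 2.0     # assumed energy per chatbot response, in watt-hours

# Sanity check: the assumed chat/search ratio sits inside the 5-10x range.
ratio = WH_PER_CHAT / WH_PER_SEARCH   # ~6.7

# A fairly heavy personal habit: 20 chatbot responses a day, every day.
chats_per_day = 20
annual_kwh = chats_per_day * WH_PER_CHAT * 365 / 1000   # Wh -> kWh

print(f"{ratio:.1f}x a search; ~{annual_kwh:.1f} kWh per year")
# ~14.6 kWh a year: on the order of a day or two of a typical
# household's electricity use, which is why the individual answer
# is "a little" rather than "a lot".
```

Under these assumptions even a heavy chat habit lands in the low tens of kilowatt-hours a year; the industry-scale concern comes from multiplying small numbers like this by billions of queries.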

The International Energy Agency projected in 2024 that global datacentre electricity use could roughly double by 2026, driven mostly by AI. The exact figure varies by who's reporting and how they count, but the direction is consistent: up, fast.

What you can do~30s
  • Don't agonise over individual chats. The footprint is small. Worry about the bigger habits.
  • Avoid generating images you don't need. Images cost much more than text.
  • Pick services with public sustainability disclosures. The big providers are increasingly transparent. Google, Microsoft, and Anthropic publish carbon-intensity numbers; smaller services often don't.
  • Push your employer to ask questions of its suppliers. Big enterprise contracts can move providers toward cleaner energy in a way personal users can't.