It frustrates me how common it is for articles about AI energy usage to cite figures for datacenter usage overall. That's understandable, since AI companies aren't transparent enough about their power usage, but society uses datacenters for a whole lot of things that aren't "AI".
Comments
Bitcoin is high energy. Bitcoin whales don’t want to change the algorithm because doing so would obsolete the specialist hardware they own.
This topic attracts a lot of idiotic stridency. I see a lot of bullshit and it's very clearly two way traffic.
Others see it differently and care more about megacorporations than people, so they defend the corporations and insult the people.
https://itkservices3.com/posts/datacentres_LLMs.html
On days when most power comes from wind and solar (which is happening more and more), the carbon cost of AI is effectively zero.
I suggest reading this article: https://en.wikipedia.org/wiki/Carbon_footprint
Q1: If the energy wasn't used for computing, how would it have been used?
Q2: If DCs are using clean energy, does that force others to use dirty energy?
Q3: How much is the data center build-out raising electricity rates for people besides the tech companies and their users?
A1: CA and TX both had days of 100% renewable power in 2024. The grid can't absorb excess power, so generators have to be curtailed.
A2: No, it's not a zero-sum game.
A3: Zero. DCs get industrial rates, which are substantially lower than residential rates. Two things keep individual electricity prices up: greed and regulation.
Yet usage is still reported in terms of carbon rather than horsepower or kWh.
So a data center powered by solar is discussed as a carbon producer, when in reality its role in carbon generation is so small as to be meaningless 😐
The absence of those figures leads a critical public to assume that ALL data center energy usage growth is because of "AI"
Both of these things are given way more attention and rage than they actually merit vs say… transportation or industry or concrete or agriculture
¹ https://suffolklitlab.org/protective-randomness-artificial-intelligence/#860250ad-e2a2-46ca-9b31-bca1f1e3265c-link
² https://greenproductionguide.com/wp-content/uploads/2021/04/SPA-Carbon-Emissions-Report.pdf
³ https://aiindex.stanford.edu/wp-content/uploads/2023/04/HAI_AI-Index-Report-2023_CHAPTER_2.pdf
It's weirdly hard to describe the searches they work for though - I see it as more or less anything that could be answered by consulting a year old copy of Wikipedia
It's environmentally untenable.
People who don't find them useful think their energy use is a complete waste
People (like myself) who use them every day are much more accepting of their energy costs
And individual responsibility is severely limited, when we know that individual behavior as it relates to climate truly does not move the needle at *all.* What does is corporate regulation, innovation, and new standards.
I care that people have accurate information that they can use to guide their choices
"Should society regulate the use of these tools?" can be considered independently of how individuals can decide whether or not to use them given that they are currently available
Plenty of people choose not to eat meat
Investment in AI companies comes from investment professionals, investing money given to them by... "everybody". If a union hands retirement money to a hedge fund, that's union members choosing investments. That's union members building data centers for AI.
(If you know it's a tech bubble, I hope you're betting against it! Short the AI companies and make your fortune!)
I don't play video games, but that doesn't mean I think the energy use devoted to them is a waste.
Those one-off costs are at least shared by everyone who uses the model once it has been trained - but new models are being trained all the time
My notes on that here: https://simonwillison.net/2025/Jan/12/generative-ai-the-power-and-the-glory/
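To see how that amortization works, here's a rough sketch. Every number below is hypothetical, chosen only to illustrate the arithmetic of spreading a one-off training cost across lifetime inference:

```python
# Amortizing a one-off training cost over inference.
# Both figures are hypothetical, for illustration only.
training_gwh = 50            # assumed energy for one training run
lifetime_queries = 100e9     # assumed queries served over the model's life

training_wh = training_gwh * 1e9              # GWh -> Wh
per_query_wh = training_wh / lifetime_queries
print(per_query_wh)  # → 0.5 Wh of training energy added per query
```

The point of the comment stands either way: the more a trained model is used, the smaller the amortized training cost per query, but constant retraining keeps resetting that numerator.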
Best current energy estimate for a day of ChatGPT use is equivalent to driving an average car the length of a tennis court:
https://engineeringprompts.substack.com/p/does-chatgpt-use-10x-more-energy
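As a sanity check, the comparison roughly pencils out with a few assumed inputs: a per-query figure in the ballpark of the linked estimate, a hypothetical daily query count, a 30 mpg gasoline car (33.7 kWh per gallon is the EPA's gasoline-energy equivalence), and a regulation 23.77 m court:

```python
# Back-of-envelope check of "a day of ChatGPT = driving a tennis court".
# All inputs are assumptions, not measurements.
WH_PER_QUERY = 0.3        # assumed Wh per ChatGPT query
QUERIES_PER_DAY = 50      # hypothetical heavy daily use

car_kwh_per_km = 33.7 / 30 / 1.609   # 30 mpg car, 33.7 kWh/gallon, 1.609 km/mile
tennis_court_km = 23.77 / 1000       # regulation court length

chat_wh = WH_PER_QUERY * QUERIES_PER_DAY            # 15.0 Wh
drive_wh = car_kwh_per_km * tennis_court_km * 1000  # ~16.6 Wh
print(round(chat_wh, 1), round(drive_wh, 1))  # → 15.0 16.6
```

The two figures land within a couple of Wh of each other, which is about as close as back-of-envelope estimates get.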
with inference time compute (e.g. o1, o3), faster models can squeeze in more “thinking” before the first emitted token
so faster = smarter = cheaper = less energy
Energy efficiency of compute has been multiplied by 20,000,000,000...
The power required to run AI will soon be thousands of times less than today 🤷‍♂️
You have to understand that processing a string of tokens is unbounded in cost here, unlike a fixed search query.
Inference is still getting cheaper. No, it won't become as cheap as search is now, but this is why basically all the big tech companies are buying nuclear power.
The buildout is happening.
https://www.barrons.com/articles/microsoft-stock-price-ai-data-centers-8574ae32
The answer doesn't lie in the hardware but in the software: LLMs are just not a good algorithm.
And for me, “this form of compute may someday use too much electricity” has not crossed it yet.
However I’ve come to the conclusion that tuning this out is the right choice, for me anyway.
My opinion is that, whatever the motivation of the initial arguments, they distill to something not quite correct
Growth in per capita human energy use is: prosperity!
AGI: an existential risk if not actively benevolent towards humans.
If we skip over agriculture & concrete production and dozens of other CO2 sources in order to fight about small, culturally polarizing *consumer end uses* of 🔌⚡️, I don’t think we’re helping.
And DCs are entirely electric; there's no oil or gas required. They can run on clean energy.