I’ve read claims that AI queries require a lot of energy. Today I heard another claim on the Nerdland Podcast (a popular science podcast here in Belgium): “letting ChatGPT write an email of 100 words requires 70 Wh” (if you’re interested, that’s said at 00:28:05 in this episode).
I thought to myself: that’s a lot of energy. 70 Wh is 252,000 Ws (70 W * 3600 s). Assume that it takes 10 seconds to write that email; then it requires 25,200 W of power, or about 25 kW. That’s way more than the theoretical maximum I can get here at home from the power grid (9 kW).
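Here is that back-of-envelope calculation as a small Python snippet; the 10-second generation time is my own assumption:

```python
# Back-of-envelope check of the 70 Wh claim.
claimed_energy_wh = 70        # claimed energy for a 100-word email (Wh)
generation_time_s = 10        # my assumption for the generation time (s)

energy_ws = claimed_energy_wh * 3600      # 1 Wh = 3,600 Ws (joules)
average_power_w = energy_ws / generation_time_s

print(f"{energy_ws:,} Ws over {generation_time_s} s = {average_power_w:,.0f} W")
# Output: 252,000 Ws over 10 s = 25,200 W
```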
So I decided to do some quick & dirty tests with my desktop computer and my powermeter.
First test: measure everything.
Step 1: starting up my desktop computer (connected to my powermeter) and waiting for the different services to start up required 2.67 Wh of electrical energy:

Step 2: I opened a command prompt, started Ollama, typed a query to generate an email, and waited for the result. By then, the required electrical energy of my desktop computer (since starting up) was 3.84 Wh:

So step 2 took 57 seconds (00:02:36 minus 00:01:39) and required 1.17 Wh (3.84 – 2.67). That’s way less than 70 Wh.
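The same delta calculation in Python, using the meter readings and timestamps from above:

```python
# Duration and energy of step 2, derived from the two measurements.
startup_wh = 2.67         # meter reading after step 1 (startup)
after_query_wh = 3.84     # meter reading after step 2 (Ollama query)

start_s = 1 * 60 + 39     # timestamp 00:01:39
end_s = 2 * 60 + 36       # timestamp 00:02:36

print(f"duration: {end_s - start_s} s")                    # duration: 57 s
print(f"energy:   {after_query_wh - startup_wh:.2f} Wh")   # energy:   1.17 Wh
```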

Second test: just measure the query.
I restarted my computer and started Ollama. Then I started my powermeter, pasted my query, and waited for the answer:

That took 3 seconds and required 0.236 Wh:
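For reference, 0.236 Wh over 3 seconds corresponds to an average power draw of about 283 W for the whole machine:

```python
# Average power draw of the desktop while processing the query.
energy_wh = 0.236
duration_s = 3

average_power_w = energy_wh * 3600 / duration_s
print(f"average power: {average_power_w:.0f} W")   # average power: 283 W
```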

Notice that I have not just measured the electrical energy consumption of Ollama processing my query, but the total electrical energy consumption of my desktop computer while Ollama was processing my query.
0.236 Wh for a computer running Ollama and processing a query is very different from 70 Wh for ChatGPT processing a query: the claim is almost 300 times more. So even though my test here is just anecdotal and I’m using another LLM than ChatGPT, I will assume that 70 Wh is a gross overestimation.
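The ratio between the claim and my measurement:

```python
# Factor between the 70 Wh claim and my 0.236 Wh measurement.
claimed_wh = 70
measured_wh = 0.236

print(f"{claimed_wh / measured_wh:.0f} times more")   # 297 times more
```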
FYI: when asking Google “what is the electrical energy consumption of chatgpt processing a query”, I find results mentioning between 1 and 10 Wh. That’s closer to my tests than the 70 Wh claim.
Quickpost info
Thank you for the insightful post on the electric energy consumption of LLMs. One question that comes to mind regards the energy used during the training phase of these models. Of course, this would have to be divided by the number of times a model is used.
Comment by Anonymous — Monday 14 October 2024 @ 9:43
That’s something interesting to think about, thanks for the suggestion.
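As a rough illustration of that amortization, with made-up round numbers (not measurements):

```python
# Amortizing a hypothetical training cost over a hypothetical query count.
training_energy_kwh = 10_000_000    # made-up figure: 10 GWh for training
lifetime_queries = 10_000_000_000   # made-up figure: 10 billion queries

training_wh_per_query = training_energy_kwh * 1_000 / lifetime_queries
print(f"training overhead per query: {training_wh_per_query} Wh")
# training overhead per query: 1.0 Wh
```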
Comment by Didier Stevens — Tuesday 15 October 2024 @ 16:33
The most energy is consumed when training the model, not when the model is queried.
What people tend to forget is that the training of new LLM iterations never ends, because the holy grail of AI has not been found yet (or it’s just about market share). While LLM version x.y is in use, its creator is already training the next version of the LLM. And training the next version always needs a considerable amount of extra energy compared to the previous version.
Comment by Anonymous — Sunday 10 November 2024 @ 9:54
That is true. Nevertheless, there are many claims that LLM queries require more energy than classic web search queries. It’s not clear to me if that is true as well.
Do you have numbers for training?
Comment by Didier Stevens — Monday 11 November 2024 @ 8:32