This is a bugfix version.
base64dump_V0_0_26.zip (http)
MD5: CD4370499288015C7EE13B59CB062129
SHA256: 3EEB76875ECCA782293D4486286F8155D1BB04DF23E3D3433E36C6373389B81D
I have a classic wired doorbell at home: the transformer, powered from 230 V mains, produces 12 V on its secondary winding. The circuit on that secondary winding powers an electromechanical doorbell via a pushbutton. The bell rings (“ding-dong”) when the button is pushed, closing the circuit.
Since losses occur in all transformers, I wanted to know how much my doorbell transformer consumes in standby (doorbell not ringing). The primary winding is always energized, because the pushbutton (a normally-open switch) is in the circuit of the secondary winding.
I made the measurements on the primary winding: 3.043 W. That’s more than I expected, so I double-checked, and noticed I had forgotten this:

There’s a small incandescent light-bulb in the doorbell button. That consumes power too!
Second set of measurements, after removing the light-bulb: 1.475 W.
So with the light-bulb, my doorbell consumes 3 W 24/7, and 1.5 W without it.
1.5 W is very similar to the standby consumption of linear power supplies. Since energy experts here in Europe advise replacing linear power supplies with switched-mode power supplies, I wonder why they never mention doorbells … Replacing your doorbell is not as easy as replacing a (USB) charger (it would best be done by an electrician), so that might explain it, but on the other hand, there are enough energy experts proposing impractical solutions.
3 W amounts to 26.28 kWh over a whole year. In my case, that’s a cost of €5.89 (total cost: electricity plus taxes). I could cut this in half just by removing the incandescent light-bulb.
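A quick check of that annual figure. The €0.224/kWh rate is not a quoted tariff: it is implied by dividing my total cost by my consumption.

```python
# Annual energy and cost of a 3 W standby load.
standby_w = 3.0
hours_per_year = 24 * 365

energy_kwh = standby_w * hours_per_year / 1000
print(f"{energy_kwh:.2f} kWh per year")  # 26.28 kWh per year

# Total price per kWh implied by the figures above (assumption, not a quoted tariff).
price_eur_per_kwh = 0.224
print(f"EUR {energy_kwh * price_eur_per_kwh:.2f} per year")  # EUR 5.89 per year
```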
Should I do this? Well, the decision has already been taken for me: I dropped the light-bulb while it was still hot, and the impact broke the filament …
For comparison: 3 W is at least three times the individual standby consumption of appliances like our TV, fridge, freezer, …
Yet another comparison: asking an LLM to write an email requires less energy (< 0.3 Wh) than my doorbell consumes over a period of an hour (3 Wh).
This small update brings support for ZIP 2.0 via the pyzipper module.
strings_V0_0_10.zip (http)
This is a post for version updates 0.0.8 and 0.0.9.
Added command officeprotection and option -j for pretty.
xmldump_V0_0_9.zip (http)
This small update brings support for ZIP 2.0 via the pyzipper module and fixes a /ObjStm parsing bug.
pdf-parser_V0_7_10.zip (http)
This small update brings support for ZIP 2.0 via the pyzipper module.
pdfid_v0_2_9.zip (http)
A friend asked me whether I had used a GPU for my tests described in blog post “Quickpost: The Electric Energy Consumption Of LLMs”, because he had tried running an LLM on a machine without a GPU, and it was too slow.
I did a quick test: I just redid the previous test, but without GPU (by setting environment variable CUDA_VISIBLE_DEVICES=-1).
Answering my query took 17 seconds, and required 1.13 Wh (again, for the whole PC).
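For comparison with the GPU run, the 17 seconds and 1.13 Wh measured above imply an average power draw during the CPU-only run; a quick back-of-the-envelope check:

```python
# Measured values from the CPU-only test above.
energy_wh = 1.13   # electrical energy for the whole PC, in watt-hours
duration_s = 17    # time to answer the query, in seconds

# Average power = energy / time; convert watt-hours to watt-seconds (joules) first.
average_power_w = energy_wh * 3600 / duration_s
print(f"{average_power_w:.0f} W average power draw")  # 239 W average power draw
```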


I’ve read claims that AI queries require a lot of energy. Today I heard another claim on the Nerdland Podcast (a popular science podcast here in Belgium): “letting ChatGPT write an email of 100 words requires 70 Wh” (if you’re interested, that’s said at 00:28:05 in this episode).
I thought to myself: that’s a lot of energy. 70 Wh is 252,000 Ws (70 Wh * 3600 s/h). Assume it takes 10 seconds to write that email; then it requires 25,200 W of power, or 25 kW. That’s way more than the theoretical maximum I can get here at home from the power grid (9 kW).
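That sanity check, spelled out in a few lines (the 10-second generation time is the assumption from above):

```python
# Sanity check of the 70 Wh claim, using the numbers from the text.
claimed_energy_wh = 70   # claimed energy per 100-word email
generation_time_s = 10   # assumed time to write that email

energy_ws = claimed_energy_wh * 3600             # convert Wh to watt-seconds (joules)
implied_power_w = energy_ws / generation_time_s  # average power during generation

print(energy_ws)        # 252000 Ws
print(implied_power_w)  # 25200 W, i.e. 25.2 kW
```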
So I decided to do some quick & dirty tests with my desktop computer and my powermeter.
First test: measure everything.
Step 1: starting up my desktop computer (connected to my powermeter) and waiting for the different services to start up required 2.67 Wh of electrical energy:

Step 2: I opened a command prompt, started Ollama, typed a query to generate an email, and waited for the result. By then, the electrical energy required by my desktop computer (since starting up) was 3.84 Wh:

So step 2 took 57 seconds (00:02:36 minus 00:01:39) and required 1.17 Wh (3.84 – 2.67). That’s way less than 70 Wh.
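The subtraction and the implied average power, spelled out (values are the meter readings and timestamps from above):

```python
# Energy and average power for step 2, using the meter readings above.
total_wh = 3.84     # meter reading after the query
baseline_wh = 2.67  # meter reading after startup (step 1)
duration_s = 57     # 00:02:36 minus 00:01:39

step2_wh = total_wh - baseline_wh
average_power_w = step2_wh * 3600 / duration_s
print(f"{step2_wh:.2f} Wh, {average_power_w:.1f} W average")  # 1.17 Wh, about 74 W
```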

Second test: just measure the query.
I restarted my computer and started Ollama. Then I started my powermeter, pasted my query, and waited for the answer:

That took 3 seconds and required 0.236 Wh:

Notice that I did not measure just the electrical energy consumed by Ollama processing my query: I measured the total electrical energy consumption of my desktop computer while Ollama was processing my query.
0.236 Wh for a computer running Ollama and processing a query is very different from 70 Wh for ChatGPT processing a query: that’s almost 300 times more. So even though my test here is just anecdotal and I’m using a different LLM than ChatGPT, I will assume that 70 Wh is a gross overestimation.
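Putting the two figures side by side (0.236 Wh measured here, 70 Wh claimed; the 3-second duration is from the second test above):

```python
# Compare the claimed ChatGPT figure with the measurement from the second test.
claimed_wh = 70.0    # claimed energy per email
measured_wh = 0.236  # measured energy for my query (whole PC)
duration_s = 3       # duration of the second test

print(f"ratio: {claimed_wh / measured_wh:.0f}x")                  # about 297x
print(f"average power: {measured_wh * 3600 / duration_s:.1f} W")  # 283.2 W
```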
FYI: asking Google “what is the electrical energy consumption of chatgpt processing a query”, I find results mentioning between 1 and 10 Wh. That’s closer to my tests than the 70 Wh claim.