Didier Stevens

Wednesday 20 November 2024

Update: base64dump.py Version 0.0.26

Filed under: My Software,Update — Didier Stevens @ 20:04

This is a bugfix version.

base64dump_V0_0_26.zip (http)
MD5: CD4370499288015C7EE13B59CB062129
SHA256: 3EEB76875ECCA782293D4486286F8155D1BB04DF23E3D3433E36C6373389B81D

Sunday 3 November 2024

Quickpost: The Electric Energy Consumption Of A Wired Doorbell

Filed under: Hardware,Quickpost — Didier Stevens @ 0:00

I have a classic wired doorbell at home: the 230V powered transformer produces 12V on its secondary winding. The circuit on that secondary winding powers an electromechanical doorbell via a pushbutton. The bell rings (“ding-dong”) when the button is pushed (closing the circuit).

Since losses occur in all transformers, I wanted to know how much my doorbell transformer consumes in standby mode (doorbell not ringing). The primary winding is always energized, as the pushbutton (a normally-open switch) is on the circuit of the secondary winding.

I made the measurements on the primary winding: 3,043 Watt. That’s more than I expected, so I double-checked, and noticed I had forgotten this:

There’s a small incandescent light-bulb in the doorbell button. That consumes power too!

Second set of measurements after removing the light-bulb: 1,475 Watt.

So with the light-bulb, my doorbell consumes 3 Watt 24/7, and 1,5 Watt without it.

1,5 Watt is very similar to the standby consumption of linear power supplies. As energy experts here in Europe advise replacing linear power supplies with switched-mode power supplies, I wonder why they never mention doorbells … Replacing your doorbell would not be as easy as replacing a (USB) charger though (it would best be done by an electrician), so that might explain it, but on the other hand, there are enough energy experts proposing impractical solutions.

3 Watt is 26,28 kWh for a whole year. In my case, that’s a cost of €5,89 (that’s total cost: electricity plus taxes). I could reduce this by half, just by removing the incandescent light-bulb.
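
To make the arithmetic explicit, here is a minimal Python sketch. The price per kWh is not a separate measurement: it is simply derived from the €5,89 and 26,28 kWh figures above.

# Back-of-the-envelope check of the doorbell figures above.
# The price per kWh is an assumption, derived from EUR 5.89 / 26.28 kWh.

HOURS_PER_YEAR = 24 * 365          # 8760 hours
PRICE_PER_KWH_EUR = 5.89 / 26.28   # implied total price (electricity + taxes)

for label, watt in [("with light-bulb", 3.0), ("without light-bulb", 1.5)]:
    kwh_per_year = watt * HOURS_PER_YEAR / 1000
    cost_eur = kwh_per_year * PRICE_PER_KWH_EUR
    print(f"{label}: {kwh_per_year:.2f} kWh/year, about EUR {cost_eur:.2f}/year")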

Should I do this? Well, the decision has already been taken for me: I dropped the light-bulb while it was still hot, and the impact broke the filament …

For comparison: 3 Watt is at least three times higher than the individual standby consumption of our appliances like TV, fridge, freezer, …

Yet another comparison: asking an LLM to write an email requires less energy (< 0,3 Wh) than my doorbell consumes in an hour (3 Wh).


Quickpost info

Saturday 2 November 2024

Update: strings.py Version 0.0.10

Filed under: My Software,Update — Didier Stevens @ 8:28

This small update brings support for ZIP 2.0 via the pyzipper module.
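
For readers unfamiliar with pyzipper, the sketch below illustrates reading a password-protected ZIP file with that module. This is not code taken from strings.py; the filename and password are hypothetical examples.

# Minimal illustration of reading a password-protected ZIP with pyzipper.
# Not code from strings.py; filename and password are hypothetical examples.
import pyzipper

with pyzipper.AESZipFile('sample.zip') as zf:   # also handles legacy ZipCrypto archives
    zf.setpassword(b'infected')                 # common password convention for samples
    for name in zf.namelist():
        data = zf.read(name)
        print(name, len(data))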

strings_V0_0_10.zip (http)
MD5: F98C9D646A83322BC9226673D79FFE2D
SHA256: 7C062616C95DE5DDF0792A8CE9CA0CCA14FF43A8786DCED043193B729361BB59

Update: xmldump.py Version 0.0.9

Filed under: My Software,Update — Didier Stevens @ 7:45

This post covers version updates 0.0.8 and 0.0.9.

These updates add a new command, officeprotection, and a new option, -j, for the pretty command.

xmldump_V0_0_9.zip (http)
MD5: 6EC24845F61FE3F9AC111BFEC69B53C7
SHA256: B1F3F6B153367AEF83C42B8002E7EA8A650B7E7092D97ACA288F2B62A93D4B9D

Update: pdf-parser.py Version 0.7.10

Filed under: My Software,Update — Didier Stevens @ 7:21

This small update brings support for ZIP 2.0 via the pyzipper module and fixes a /ObjStm parsing bug.

pdf-parser_V0_7_10.zip (http)
MD5: 2EB627850B215F3B9D1532880DA4E8DB
SHA256: 17F9EA0B4CADF0143AA52E1406EEC7769DA1B860375440D8492ADC113300CDFD

Update: pdfid.py Version 0.2.9

Filed under: My Software,Update — Didier Stevens @ 7:19

This small update brings support for ZIP 2.0 via the pyzipper module.

pdfid_v0_2_9.zip (http)
MD5: 57C5AE391116B79E1F90FFF7BBB36331
SHA256: 1FC540C9EB9722C1E430262DFF64F39606A7B4838DDE9F70EE3C56526EDEF5FF

Friday 1 November 2024

Overview of Content Published in October

Filed under: Announcement — Didier Stevens @ 14:42
Here is an overview of content I published in October:

Blog posts:
SANS ISC Diary entries:

Tuesday 8 October 2024

Quickpost: The Electric Energy Consumption Of LLMs – No GPU

Filed under: Quickpost — Didier Stevens @ 0:00

A friend asked me if I had used a GPU for my tests described in blog post “Quickpost: The Electric Energy Consumption Of LLMs”, because he had tried running an LLM on a machine without a GPU and found it too slow.

I did a quick test: I redid the previous test, but without the GPU (by setting environment variable CUDA_VISIBLE_DEVICES=-1).
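
As an illustration of that mechanism, here is a minimal Python sketch that hides all CUDA GPUs via CUDA_VISIBLE_DEVICES before invoking Ollama. The model name and prompt are placeholders, not the ones I used, and the variable has to be visible to the process that actually loads the model.

# Sketch: rerun the Ollama query with all CUDA GPUs hidden.
# The model name and prompt are placeholders; Ollama must be installed
# and the model pulled beforehand.
import os
import subprocess

env = os.environ.copy()
env['CUDA_VISIBLE_DEVICES'] = '-1'  # an invalid device ID hides all CUDA GPUs

subprocess.run(
    ['ollama', 'run', 'mistral', 'Write an email of 100 words.'],
    env=env,
    check=True,
)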

Answering my query took 17 seconds and required 1.13 Wh (again, for the whole PC), compared to 3 seconds and 0.236 Wh with the GPU.


Quickpost info

Sunday 6 October 2024

Quickpost: The Electric Energy Consumption Of LLMs

Filed under: Quickpost — Didier Stevens @ 19:17

I’ve read claims that AI queries require a lot of energy. Today I heard another claim on the Nerdland Podcast (a popular science podcast here in Belgium): “letting ChatGPT write an email of 100 words requires 70 Wh” (if you’re interested, that’s said at 00:28:05 in this episode).

I thought to myself: that’s a lot of energy. 70 Wh is 252,000 Ws (70 W * 3600 s). Assuming it takes 10 seconds to write that email, that would require 25,200 W of power, or 25 kW. That’s way more than the theoretical maximum I can get here at home from the power grid (9 kW).
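
The same sanity check in a few lines of Python; the 10-second duration is just the assumption from the paragraph above.

# Sanity check of the 70 Wh claim: what power would it imply?
claimed_energy_wh = 70
claimed_energy_ws = claimed_energy_wh * 3600        # 252,000 Ws (joule)
assumed_duration_s = 10                             # assumption: 10 s to write the email
implied_power_w = claimed_energy_ws / assumed_duration_s
print(f"Implied power: {implied_power_w:.0f} W")    # 25200 W, about 25 kW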

So I decided to do some quick & dirty tests with my desktop computer and my powermeter.

First test: measure everything.

Step 1: starting up my desktop computer (connected to my powermeter) and waiting for the different services to start up required 2.67 Wh of electrical energy.

Step 2: I opened a command prompt, started Ollama, typed a query to generate an email, and waited for the result. By then, the required electrical energy of my desktop computer (since starting up) was 3.84 Wh.

So step 2 took 57 seconds (00:02:36 minus 00:01:39) and required 1.17 Wh (3.84 – 2.67). That’s way less than 70 Wh.

Second test: just measure the query.

I restarted my computer and started Ollama. Then I started my powermeter, pasted my query, and waited for the answer.

That took 3 seconds and required 0.236 Wh.

Notice that I have not just measured the electrical energy consumption of Ollama processing my query, but I measured the total electrical energy consumption of my desktop computer while Ollama was processing my query.

0.236 Wh for a computer running Ollama and processing a query is very different from 70 Wh for ChatGPT processing a query: that claim is almost 300 times higher. So even though my test here is just anecdotal and I’m using a different LLM than ChatGPT, I will assume that 70 Wh is a gross overestimation.
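
The ratio behind “almost 300 times”, computed explicitly:

# Ratio between the 70 Wh claim and the 0.236 Wh measured here (whole PC).
print(70 / 0.236)  # roughly 297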

FYI: asking Google “what is the electrical energy consumption of chatgpt processing a query”, I find results mentioning between 1 and 10 Wh. That’s closer to my tests than the 70 Wh claim.


Quickpost info

Wednesday 2 October 2024

Overview of Content Published in September

Filed under: Announcement — Didier Stevens @ 0:00
Here is an overview of content I published in September:

SANS ISC Diary entries:
