There are powerstrips with a switch that lights up when the switch is turned on. Like this one:
These switches (certainly on older models) often use a neon lamp as the light source.
I measured the electric energy consumption of a powerstrip with switch on and neon lamp burning (without anything plugged into the powerstrip’s outlets).
It consumed 7.8582 Wh over 24 hours, so it drew 0.327 W on average.
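Converting measured energy to average power is a plain division; here it is sketched in Python (the yearly extrapolation is my own addition, assuming the strip stays switched on all year):

```python
# Average power drawn, from energy measured over a period.
def average_power_w(energy_wh: float, hours: float) -> float:
    """Average power in watts = energy in Wh divided by elapsed hours."""
    return energy_wh / hours

# The powerstrip's neon lamp: 7.8582 Wh measured over 24 hours.
avg_w = average_power_w(7.8582, 24)
print(f"{avg_w:.3f} W")                 # ≈ 0.327 W

# Extrapolated to a full year (8760 hours), assuming it stays on:
yearly_kwh = avg_w * 8760 / 1000
print(f"{yearly_kwh:.2f} kWh/year")     # ≈ 2.87 kWh
```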
FYI: although the switch is turned on in the above picture, you don’t see the neon lamp burning.
That’s because the AC power here in Belgium is 230 V at 50 Hz.
50 Hz means that the current crosses 0 A 100 times per second, and the neon lamp does not light up around these zero crossings.
So the picture above was taken at a moment when the lamp wasn’t lit because the current was (almost) 0 A.
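As a rough sketch of how much of the time the lamp is dark: a neon lamp only conducts while the instantaneous mains voltage exceeds its maintaining voltage. The 90 V figure below is a typical ballpark for neon indicator lamps, not a value measured on this switch:

```python
import math

V_RMS = 230.0                    # Belgian mains RMS voltage
V_PEAK = V_RMS * math.sqrt(2)    # ≈ 325 V instantaneous peak
V_MAINTAIN = 90.0                # assumed neon maintaining voltage (ballpark)

# Fraction of each half-cycle where |v(t)| = V_PEAK * |sin(ωt)| stays
# below the maintaining voltage, i.e. the lamp is dark around each
# zero crossing of the 50 Hz sine wave.
dark_fraction = (2 / math.pi) * math.asin(V_MAINTAIN / V_PEAK)
print(f"lamp dark about {dark_fraction:.1%} of the time")
```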
I will go into more details in an upcoming blog post.
I have not yet tested powerstrips that use LEDs instead of neon lamps, because all the powerstrips with LEDs that I have also include a built-in USB charger, which draws power too.
I did the following test: overnight, I let the fridge run for 12 hours. It contained an aluminum can filled with water at room temperature (around 17 °C).
I used a power meter to measure the electric energy consumption, and a multimeter with a thermocouple (type K) to measure the water temperature. The thermocouple was at the bottom of the water, not touching the bottom of the can.
The USB fridge consumed 60.717 Wh over that period, and the water temperature (at the bottom) was around 14.7 °C when I stopped the test. After the test, I moved the thermocouple to the top of the water, and there the temperature was 16.9 °C.
My multimeter logged the temperature every 60 seconds, resulting in this chart:
Notice that for the first 12 minutes the temperature rises a bit, and then starts to drop (I’ll do more experiments to try to figure out why it rises first). Once the cooling starts, it gradually slows down. Around 8 hours 45 minutes into the test, the water temperature reaches 14.80 °C and from then on barely changes.
The can is coolest at the bottom, as can be observed in this thermal image:
More pictures:
You don’t get much cooling from this USB fridge for the amount of energy it takes. I didn’t RTFM, so maybe its purpose is not to cool a can from ambient temperature down to a nice cool drink, but to keep a can that’s already been cooled in a real fridge cool while it’s sitting on your desk.
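A rough back-of-the-envelope calculation supports that impression. Assuming a standard 330 ml can (the volume isn’t stated above) and using the specific heat of water, the heat actually removed is tiny compared with the electrical energy consumed:

```python
# Rough estimate of the heat the USB fridge removed from the water.
# Assumption: a standard 330 ml can (volume not stated in the post).
mass_kg = 0.330              # 330 ml of water ≈ 0.330 kg
c_water = 4186               # specific heat of water, J/(kg·K)
delta_t = 17.0 - 14.8        # cooled from ~17 °C down to ~14.8 °C

heat_removed_j = mass_kg * c_water * delta_t
heat_removed_wh = heat_removed_j / 3600   # 1 Wh = 3600 J
electrical_wh = 60.717                    # measured electrical consumption

print(f"heat removed: {heat_removed_wh:.2f} Wh")           # ≈ 0.84 Wh
print(f"ratio: {heat_removed_wh / electrical_wh:.1%}")     # ≈ 1.4%
```

So under these assumptions, only around one percent of the electrical energy ended up as heat removed from the water.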
Its average standby electrical power consumption is 236.46 mW. Standby means: I plug the adapter into an electrical outlet (230 V) without connecting any device for charging.
I imagine that for a travel adapter, standby consumption is not that important, as one would use it only occasionally.
I took a lemon, inserted a zinc and a copper piece of metal (a couple of centimeters deep), and connected an electronic load to draw 1 mA of current.
I let it run for a couple of hours until no more measurable current flowed.
The electronic load dissipated 0.034 Wh of electrical energy over that period. Hence, we can assume that the lemon battery delivered 0.034 Wh.
I’m sure the lemon battery could deliver more energy, by “resetting” it: cleaning the electrodes, inserting them in another place in the lemon, …
After a bit of searching the web, I’m going to assume that a typical smartphone nowadays has a 10 Wh battery. So we would need 294 times (10 Wh / 0.034 Wh) the electrical energy delivered by my lemon battery to charge a smartphone.
Except that the 0.9 V the lemon battery delivers is far from enough to charge via the USB interface. We need 5 V, so 5.555… lemon batteries connected in series.
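Putting numbers on that (ignoring all conversion losses, which would make it worse in practice):

```python
# How many lemon batteries would it take to charge a smartphone?
# (Ignoring all conversion losses.)
lemon_wh = 0.034     # energy one lemon cell delivered in my test
lemon_v = 0.9        # voltage of one lemon cell
phone_wh = 10.0      # assumed typical smartphone battery capacity

cells_for_energy = phone_wh / lemon_wh   # to match the energy
cells_in_series = 5.0 / lemon_v          # to reach USB's 5 V

print(f"energy: {cells_for_energy:.0f} lemon-battery discharges")  # 294
print(f"voltage: {cells_in_series:.2f} cells in series")           # 5.56
```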
On the screenshot above, you can also see that 37 mAh was measured. Notice that you cannot compare this to the mAh rating of a (smartphone) battery, because the two values involve different voltages.
Comparing this to a button cell like a CR2032 (Dutch Wikipedia article, because there’s no English Wikipedia article): the CR2032 has a 225 mAh electrical charge (on average) and an average discharge voltage of 2.0 V. That’s 225 mAh * 2.0 V = 450 mWh. Or 13 times more than my lemon battery (34 mWh).
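The conversion in that comparison, as a quick sketch: mAh ratings are only comparable at the same voltage, so convert both to mWh first.

```python
# mAh ratings are only comparable at the same voltage; convert to mWh.
def mwh(mah: float, volts: float) -> float:
    """Energy in mWh from charge in mAh and (average) voltage in V."""
    return mah * volts

cr2032_mwh = mwh(225, 2.0)   # button cell: 225 mAh at 2.0 V avg discharge
lemon_mwh = 34               # 0.034 Wh measured from the lemon battery

print(f"CR2032: {cr2032_mwh:.0f} mWh, lemon: {lemon_mwh} mWh")
print(f"ratio: {cr2032_mwh / lemon_mwh:.0f}x")   # ≈ 13x
```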
Here are more pictures of the lemon after the experiment (one week later):
They consume considerably less standby power than linear power supplies, like this one:
These contain a transformer to step down from a high voltage (AC) to a low voltage (AC), followed by some electronic components, for example a diode bridge and capacitors, to convert the low-voltage AC into DC.
I tested this old power supply I had lying around, and it consumed 1.6836 Wh when tested with my power meter during one hour:
That’s 14.75 kWh over a year, which is about 10 times more than the worst switched-mode power supply I tested here.
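Extrapolating that idle draw to a year, and putting a price on it (the €0.40/kWh rate below is my assumption, not a figure from my measurements):

```python
# Idle draw of the linear supply: 1.6836 Wh in one hour, i.e. 1.6836 W.
idle_w = 1.6836
yearly_kwh = idle_w * 8760 / 1000   # 8760 hours in a year
price_eur_per_kwh = 0.40            # assumed electricity price
cost_eur = yearly_kwh * price_eur_per_kwh

print(f"{yearly_kwh:.2f} kWh per year")   # ≈ 14.75 kWh
print(f"≈ €{cost_eur:.2f} per year just for staying plugged in")
```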
So, if you are planning to follow the advice of energy experts here in Europe (and watch out, quite a few are not experts at all, just echo chambers) to unplug chargers you don’t use, in order to reduce your electric energy consumption and save money, consider the following points:
Start with your linear power supplies: they consume the most. (A tip to recognize them: they are heavy compared to switched-mode ones, because of the transformer, and they are old.)
If you are going to do this daily, take mechanical wear and tear into account: on the pins of the power plug, on the cables, …
To avoid that extra wear and tear, you can plug your power supplies into a powerstrip with a switch.
I have a laptop power brick that regularly causes the power plug to spark when I plug it into a socket. That’s also something you want to avoid.
TLDR: reducing the sound volume level of our TV has no (significant) impact on its electric energy consumption, but reducing the back-lighting does.
Here in Belgium, mainstream media is full of news with tips to reduce energy consumption.
Some good tips, some bad tips … That’s mainstream media for you 🙂
Recently, there was an article with the following tip: “reduce the sound volume level of your TV to save energy” … (I’m not linking to this article).
It is true that a speaker (and the audio amplifier) requires power, and that there is a positive correlation between electric energy consumption and sound volume level. Large speakers can draw quite a lot of current…
But I was a little doubtful that lowering the sound volume level of our TV by a few clicks would have a significant, measurable impact. Some time ago I had already made measurements, and our TV drew 120 W at most. So I did not expect a big impact.
Anyway, one has to measure to know whether there is a (significant) impact or not.
We have a 55 inch QLED Samsung TV from 2018. The test protocol I worked out is the following: start to play a long movie (LoTR) and measure the electric energy consumption during one hour exactly (with a GW Instek GPM-8310 digital power meter). Don’t touch the TV or remote while testing is going on, and make sure that no dynamic settings are enabled that can influence the electric energy consumption (like ambient light based brightness control).
I measured at 3 sound volume levels: 20, 19 and muted. And I did this twice.
Here are the results:
Sound level   Electric energy consumption (Wh)
20            117.74
19            117.74
0 (muted)     117.66
For our TV, there’s no difference between a sound volume level of 20 and 19.
And by completely muting the TV, we save 0.08 W. That’s a very small amount. To put it in perspective, we would have to watch 125 hours of muted TV to power a 10 W LED light bulb for 1 hour.
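That perspective math, spelled out:

```python
# Energy saved by muting: 117.74 Wh vs 117.66 Wh over one hour → 0.08 W.
saved_w = 117.74 - 117.66

# How long must the TV stay muted to "fund" a 10 W LED bulb for 1 hour?
led_bulb_wh = 10.0                      # 10 W bulb burning for 1 hour
hours_needed = led_bulb_wh / saved_w
print(f"{hours_needed:.0f} hours of muted TV")   # 125 hours
```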
Of course, that’s for our TV. If you have a TV with a powerful soundbar and extra speakers, your measurements will be totally different.
While going through all the settings of our TV, I noticed one thing: the back-lighting setting was set to its maximum (20).
I reduced the back-lighting to 10 and measured again. That made a significant change: 77.666 Wh instead of 117.74 Wh (both at sound volume level 20, our usual setting). That’s a 34% reduction in electric energy consumption. A significant reduction, but don’t forget that the back-lighting setting happened to be at its maximum.
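The numbers behind the back-lighting change (the 4 hours of viewing per day in the yearly estimate is my assumption, not something I measured):

```python
# Back-lighting 20 → 10: 117.74 Wh/h down to 77.666 Wh/h.
full_w, reduced_w = 117.74, 77.666
reduction = 1 - reduced_w / full_w
print(f"{reduction:.0%} reduction")   # 34%

# Yearly savings, assuming ~4 hours of TV per day (my assumption):
yearly_saved_kwh = (full_w - reduced_w) * 4 * 365 / 1000
print(f"≈ {yearly_saved_kwh:.0f} kWh saved per year at 4 h/day")
```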
We will keep it like that for the moment, and see if we still enjoy watching TV.
I tested a small powerbank that I have, and it’s very inefficient.
It takes 10.07 Wh to charge:
And it delivers 5.95 Wh when I discharge it (5 V at 250 mA).
So I got back only 59% of the energy I put in.
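The round-trip efficiency calculation:

```python
# Round-trip efficiency of the powerbank: energy out over energy in.
energy_in_wh = 10.07    # measured while charging the powerbank
energy_out_wh = 5.95    # measured while discharging it at 5 V

efficiency = energy_out_wh / energy_in_wh
print(f"{efficiency:.0%} round-trip efficiency")   # 59%
```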
This powerbank is quite old; it might have become this inefficient over the years. Web searches tell me that you should get at least 85% efficiency.
Although this powerbank still works fine, and is very handy to me because of its small form factor, I’ll see if I can get a more efficient one with a similar form factor.
It’s a 30+ year old multimeter, and it had become very dirty from all the dust it collected while I used it in a home renovation project years ago. It was still functional, so I kept using it like that for years.
But recently, after discovering YouTube “restoration videos”, I got the idea to open it up and clean it.
The result was very good. Until I used it for the first time afterwards to measure a 230 V cable. Then there was a big flash inside the casing, and all the lights went out.
This is how it looks now (notice the black soot marks on the orange plastic):
And the burned diodes:
What went wrong? The meter has aluminum foil inside to shield the electronics:
I was not careful enough when I put it back, and it shorted two connectors:
I have 2 Bosch 18V “power for all” chargers. A normal charger (AL 1830 CV) and a fast charger (AL 1880 CV).
Measuring the power consumption of these 2 chargers in standby mode (plugged into a 230 V outlet, but no battery connected) with a GPM-8310 power meter, I obtained the following results: