Standby/No-Load Power Consumption: Are we doing it right?

Energy efficiency has always been a concern with electrical appliances. With the increasing adoption of ever-cheaper appliances, energy rating labelling initiatives and minimum energy performance standards (MEPS) for a range of products have been put into place to try to ensure that (properly imported) products reaching the market in Australia have acceptable electrical efficiencies. Legislation has also been put into force banning less efficient conventional incandescent globes of higher wattages, and this will soon tighten to exclude halogen bulbs.

Global initiatives, such as the US EPA's Energy Star and the IEA's One Watt Initiative, have tried to clamp down on standby and no-load power consumption, sometimes termed "vampire power". Official protocols for determining standby power consumption have been codified in IEC 62301 (2011) Ed. 2.0. As products are often designed for sale across the globe, the introduction of such schemes overseas has resulted in flow-on benefits to other countries, as products are often designed to meet the most stringent requirements to ensure they are marketable worldwide.

However, despite these initiatives, what has really spurred consumers into making better choices are increases in energy prices, along with real-time energy consumption meters like the efergy and Kill-A-Watt. These help to inform consumers about their usage and provide some guidance toward changing their energy usage habits, which is the most effective way to reduce power consumption.

Despite consumers now actively playing a role in trying to optimize their energy efficiency, it seems that it can be quite easy to make the wrong choices based on misleading information or an incomplete understanding of the problem.

The Wrong Information?

Would it surprise you to learn that the devices that monitor your power consumption are actually misinforming you to some degree? It may be hard to believe at first, but I can definitely confirm that this is the case for many of the consumer-grade power meters on the market. It boils down to the fact that they were built down to a price as indicative devices, and can easily be fooled.

For example, the efergy clamp-based power meter that I ran at home for over four years was known to attribute about 70W of usage to an idling microwave. A similar thing happened when tested with a Kill-A-Watt style meter, which claimed 66W. Likewise, the three idling air conditioner units, with no compressor heater windings, claimed a consumption of close to 200W in total. Could this really be?

In fact, the answer was no. I had flipped off the circuit breakers for the air conditioners for over half a year. A 200W indicated load, running 24/7, would have equated to a 4.8kWh saving per day on the bill. The actual saving on the bills turned out to be nothing. Likewise, the supposedly 70W idling microwave was cool to the touch. 70W is not an inconsequential amount of power, and anything dissipating 70W should be pretty hot after a while. Think of most desktop computers: that's a real 70W load.
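
As a quick back-of-the-envelope check, here's a minimal Python sketch of that arithmetic (the 200W figure is the meter's indication, not a real measurement):

```python
# Sanity check: a genuine 200 W continuous load would be unmissable on a bill.
indicated_w = 200
kwh_per_day = indicated_w * 24 / 1000           # 4.8 kWh/day
cost_per_day = kwh_per_day * 0.25               # at AU$0.25/kWh
print(f"{indicated_w} W around the clock = {kwh_per_day} kWh/day "
      f"(about AU${cost_per_day:.2f}/day)")     # no such change showed on the bills
```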

This misinformation comes about due to something known as power factor, which is commonly explained as a phase mismatch between the consumed current and the voltage, resulting in "reactive" power. This is an oversimplification, as a low power factor can also arise from non-sinusoidal, peaky current waveforms, otherwise known as high crest factor loads (very common amongst devices in standby). These two effects work together to make simple power meters, which are better suited to "basic" resistive loads, inaccurate.

Power factor is a dimensionless quantity, ranging between 0 and 1, representing the ratio of real power (i.e. actual consumed power) to apparent power (the product of RMS voltage and RMS current, which combines the real and reactive components). A high power factor normally indicates that the consumed current matches the voltage profile well, and that power isn't being taken in one part of the cycle and returned in another, causing cable resistance losses along the way. Reactive power can come about due to poor circuit design with no regard for power factor, or even cheap passive power factor correction circuitry. Passive power factor correction is normally designed to tune out the reactive component of a given load, but if that load varies (i.e. goes into standby), the correction circuit can actually make things worse, resulting in a poor power factor during standby periods.

The other quantity is crest factor, which describes how peaky the load is relative to its RMS value. Resistive loads, which don't interact with the AC waveform, should have a crest factor of approximately 1.414 (the square root of two), as their current follows the input voltage. Newer devices, using more efficient switch-mode converters, do not follow the input voltage throughout the whole cycle; instead, they draw current only at the "peaks" of the cycle, where the line voltage exceeds the voltage across their internal filter capacitor. This results in high crest factors of 3-20, requiring measurement equipment that can handle such peaks without clipping them.
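
To make the two quantities concrete, here's a minimal Python sketch (synthetic waveforms, not measurements from my testing) comparing a resistive load with a crudely modelled capacitor-input switch-mode load:

```python
import numpy as np

# One 20 ms cycle of 230 V RMS, 50 Hz mains, sampled at 10 kHz (synthetic).
fs = 10_000
t = np.arange(0, 0.02, 1 / fs)
v = 325 * np.sin(2 * np.pi * 50 * t)

def metrics(v, i):
    real = np.mean(v * i)                                        # real power (W)
    apparent = np.sqrt(np.mean(v**2)) * np.sqrt(np.mean(i**2))   # apparent power (VA)
    pf = real / apparent                                         # power factor
    cf = np.max(np.abs(i)) / np.sqrt(np.mean(i**2))              # current crest factor
    return real, apparent, pf, cf

# Resistive load: current tracks voltage, so PF ~ 1 and CF ~ 1.414.
i_resistive = v / 529                        # roughly a 100 W heater

# Peaky load: current flows only near the voltage peaks, as with a
# rectifier feeding a filter capacitor (very crude model).
i_peaky = np.where(np.abs(v) > 320, np.sign(v) * 0.5, 0.0)

for name, i in (("resistive", i_resistive), ("peaky", i_peaky)):
    p, s, pf, cf = metrics(v, i)
    print(f"{name:10s} P={p:6.1f} W  S={s:5.1f} VA  PF={pf:.2f}  CF={cf:.2f}")
```

A meter that can't capture the narrow current pulses of the "peaky" load without clipping will misreport both the RMS current and the real power.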

Basic meters may claim to report power factor, but that doesn't mean they are equipped to produce accurate readings of devices with a poor power factor. Peaky devices can also trip up these meters, resulting in inconsistent over- or under-reporting of consumed power by quite a margin. In my experience, on a meter that reports power factor, the lower the power factor, the less reliable the measured result. Based on experience, I'm not inclined to trust readings below a power factor of 0.4 on your average Kill-A-Watt style meter.

Also, rather counterintuitively, because of this power factor issue, it is possible for turning on an appliance to reduce the metered power consumption. That, by conservation of energy, should be impossible, yet I've witnessed it on several occasions. The reason seems to be that the appliance improves the overall power factor of the house, increasing the "base" consumption, reducing the current crest factor and making the meter more accurate. Under no real circumstances does turning on an appliance reduce your power consumption or bills, so don't be misguided there.

The meters are also relatively inaccurate for low power loads, and for measuring standby consumption, especially on modern devices with a low standby draw. Measuring standby is precisely why many people buy these meters in the first place! Instead, they find the meter reports 0.00W most of the time, with an occasional one-reading spike to some random figure, then 0.00W again. Such readings are not particularly meaningful, and leaving the meter for hours to integrate them produces a figure which is hardly accurate in any sense; it is more the integration of random noise. Short of performing modifications, which may be dangerous if improperly made, users may well come to the wrong conclusion about which power adapter consumes more standby power.

Righting the Wrongs

Surely, there must be a way to measure standby power correctly, right? Indeed there is, and it is specified in IEC 62301 (2011) Ed. 2.0. This publication gives guidance on how to make proper standby power consumption measurements, and it is quite onerous, requiring expensive test equipment and specific test conditions to ensure repeatability and accuracy.

I was lucky to be one of the RoadTesters for element14's test of the Tektronix PA1000, an AU$5,500 power analyzer which offers full IEC-compliant standby testing as one of its inbuilt features. I had quite a few issues with the analyzer during and after the review, but after more than a year of working with Tektronix, the firmware has been improved and most major issues rectified. Standby compliance testing, at least, was one of the things working somewhat properly during my review.

It was only through doing the review that I realized just how onerous standby power testing is. The equipment has to be warmed up to achieve stable results, the room must be kept within the right range of temperature and humidity, and the power supply has to be at nominal voltage and frequency within +/- 1%, with its crest factor checked as well. As it turns out, the mains power within the house could not meet the voltage requirement alone, even with a Variac to adjust it, and could never meet the crest factor (i.e. shape of the sinusoidal mains voltage waveform) requirement either.

As a result, I ended up using a cheap pure sine wave inverter, which also has its own drawbacks, namely a long one-hour warm-up period before its voltage stabilized to within 1% during the test runs. The test runs themselves were not "plug in and read a number" experiments; each took 15 minutes or longer, involving linear regression slopes to determine statistically whether the consumption was stable, and averaging out burst consumption over time. Uncertainty limits were calculated as well (which, in the review, were too wide and were later tightened with no change to the test itself, because the instrument was found to be more accurate than initially specified).
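
For the curious, the stability check is conceptually simple; a rough Python sketch of the idea follows (the 1%-per-hour slope limit is my illustrative assumption, not the figure from IEC 62301, which defines its own criteria):

```python
import numpy as np

def is_stable(times_h, power_w, rel_slope_limit=0.01):
    """Fit a linear regression to power-vs-time samples and accept the run
    only if the slope is small relative to the average power level."""
    slope, _ = np.polyfit(times_h, power_w, 1)   # slope in W per hour
    mean_p = np.mean(power_w)
    return abs(slope) <= rel_slope_limit * mean_p, slope, mean_p

# e.g. fifteen one-minute average-power readings over a 15-minute run
t = np.arange(15) / 60.0                         # hours
p = 0.48 + 0.002 * np.random.randn(15)           # watts (synthetic data)
ok, slope, mean_p = is_stable(t, p)
print(f"mean = {mean_p:.3f} W, slope = {slope:+.4f} W/h, stable = {ok}")
```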

Given such onerous requirements to get an accurate result, it's not something you can expect a householder to do, nor even an electronics hobbyist. Regardless, in the review, I tabulated over 120 results from various devices and products I had around the house, most of them somewhat older products, in the hope of finding some truth when it comes to standby energy.

Indeed, what I found deserved its own posting, but I didn't have the time. I was half-wishing that someone would come across the lump of data I put out there and write it up for me. But in the year and a half that has passed since then, no such thing has happened, so I'm back to actually finishing what I started.

The Right Choice?

This is the interesting part, as this is where results which may be counter-intuitive to some, or at least not immediately obvious, get presented along with some of my concerns.

Is it standby power?

The first problem is simply one of semantics: what is classified as standby power? Officially, standby power is the power consumed when a device is switched off and not performing its primary function, but sitting in some sort of idle state, which may include the ability to be turned on remotely.

By this definition, a mobile phone charger which is plugged in but not actually charging a phone is technically not a form of standby power, but instead a form of no-load power. A device intended to switch appliances on and off by remote control, or say a mechanical timer, has no standby mode as such, as its primary function is to turn appliances on and off.

What about a TV with a PVR recording mode that is operational when the TV is turned off? It's considered a low-power mode of operation, but as it's still performing part of its primary function, it might not be considered standby. Likewise, a Zigbee smart globe which is switched off, ready to be turned back on by remote command, could be considered standby at first glance; but since it participates in networking, which it does (as Zigbee is a mesh networking protocol), it could be considered a low-power mode instead.

Why are these semantics important? Mainly because some of the standby/no-load requirements only apply when devices are in standby or no-load conditions. Where they are in a low-power state, they might be exempt from these requirements.

As a result, the user may think the device is in standby because they're not actively using it, but this is no guarantee that the power consumption is of a standby power magnitude (<1W). But let's not confuse ourselves: for the purposes of this article, I will consider standby to be any device in its lowest power state which is not a complete hardware off.

Is it worth switching or unplugging?

This question is one that I get asked occasionally, and it's a tough one to answer simply, as it depends on your motivations. Let's make it clear up-front that unplugging or switching off a device definitely saves energy and will definitely save money, but it introduces inconvenience. Whether this is worth it for a given user is really up to their values.

[Table: yearly energy and cost for various standby loads at AU$0.25/kWh]

Most people like to judge whether something is worth doing by how much money they could save. As a result, I've computed this small guide-table to go with the results, which tells you how much money a given standby load costs, and how much energy it consumes, over a whole year at AU$0.25/kWh. At the 1W "limit" of standby power for more modern devices, the cost is $2.19 per year. Of course, the actual savings depend on the device in question.
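
The arithmetic behind the table is straightforward; a small Python helper (using the same AU$0.25/kWh tariff) reproduces it:

```python
TARIFF = 0.25  # AU$/kWh, as used in the guide-table

def yearly(watts, tariff=TARIFF):
    """Energy and cost of a constant load running 24/7 for a year."""
    kwh = watts * 24 * 365 / 1000
    return kwh, kwh * tariff

for w in (0.011, 0.465, 1.0, 5.0):   # example loads from this article
    kwh, cost = yearly(w)
    print(f"{w*1000:6.0f} mW -> {kwh:5.2f} kWh/yr -> AU${cost:5.2f}/yr")
```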

Mobile Phone Charger

Almost everybody has one, and many phones now have reminder alerts that say "Battery Full. Unplug charger to save energy." Well, is it worth heeding that advice? From my testing, the actual power consumed by various phone chargers, some old, some new, is as follows:

[Table: no-load power consumption of various phone chargers]

The most efficient charger was the newest, consuming just about 11mW. Unplugging this charger from the wall will only save you a measly 2c/year, and I'd bet the inconvenience of doing so is not worth the hassle; replacing a worn socket or switch would cost you a lot more. The second best result was the LG charger, the second newest, at about 24mW, which is still around 5c/year. The worst was an old Nokia, coming in at 465mW or about $1 a year, which might still be barely worth the effort.

As you can see, from a pure financial standpoint, it's hard to justify unplugging your mobile charger at all, and modern chargers have become so efficient that their consumption is small enough to be almost negligible. The progress is staggering: the newest charger uses only 1/46th of the power of the oldest in standby. There is a trade-off, as some people do keep older chargers in service, but overall the power wastage is still considerably small, so it's probably no big thing to worry about.

Power Adapter

[Table: no-load power consumption of various power adapters]

The long list of power adapters shows a much more mixed result, with numerous supplies exceeding the 1W standby limit. The reason is that many of them are older adapters which pre-date the requirements (the majority of the failing ones are not listed below) and are often linear supplies (e.g. Dick Smith Electronics M9560, Icom BC-110V, Panasonic N0JDCE000001, Realvision SA35-060-0400D, Uniden AAD-92S). That's not to say that newer linear supplies cannot meet the standby requirements.

That being said, many of the newer devices comfortably meet, and indeed better, the <1W and <0.5W requirements. The Asian Power Devices WA-series and K-tec devices are fairly popular with external hard drives; the Asus, BenQ, Dell and HP PPP-series adapters are common amongst more modern laptops. This tells us that the majority of your power adapters are costing less than $1/year in their no-load state. Plugging a device into the power adapter, however, will increase its no-load power somewhat.

Monitors

This is another one I get asked a lot of questions about: whether it's good enough to leave the monitor in the "orange light" standby state, or whether to turn it off fully with the power button. Sadly, most of the tested monitors are somewhat dated, but they still prove a point:

[Table: standby power consumption of various monitors, LED on versus off]

In all cases, leaving the LED on increased the power consumption, but by varying amounts, from about 70mW all the way up to 600mW. This equates to a difference of about $0.11-1.10 per year per device on your bills. At the low end of the spectrum, it's easy to see how people would just pay the price for convenience.

Outsmarting the system

Some people have come around to the idea that switching off the power is great for saving some energy and money, however little, but they are too lazy to do it at the wall. Instead, they decide to do one of the following:

  • Buy a powerboard with individual switches
  • Use a timer to switch the devices automatically
  • Use a remote-control switch of sorts

While good in theory, this can rapidly come undone when the consumption of the switching device itself is not taken into account.

Powerboards

[Image: powerboard with individually switched outlets]

Generally, people think of powerboards as not consuming any power, and that is true with the exception of any powerboard with an LED or neon indicator. Some models have an indicator per switch, lit whenever that outlet is on; others have one switch for the whole board, lit when the board is on; and others have an indicator that is always on whenever power is supplied.

In this case, I tested one board with just one neon switch turned on, and its consumption was 175mW. That is more than the majority of idling modern phone chargers, just to run a light that says the switch is on. This is not an obvious outcome, and as a result, you should choose no-frills powerboards with plain switches and no indicators if saving energy and money is your aim.

Furthermore, this easily explains why modern power adapters have been eschewing the power-on LED: it consumes energy of a similar magnitude to the standby energy of the best chargers, essentially doubling the standby consumption.

Timers

[Image: mechanical plug-in timer]

If you think you’re being smart, think again. Mechanical timers consume much more than most single idling appliances, with over 1W consumed. Electronic timers may be better alternatives, but I don’t have any of them at this stage.

Remote Control

Unfortunately, I don’t have many of these devices, but I recently tested the Belkin WeMo Switch with an eye to see how its standby consumption worked out. Ultimately, its output-off consumption was 1190.3mW and the output-on consumption was 1645.8mW, making it slightly more power consuming than a regular timer, and again, a poor choice to save energy as its standby consumption may be more than the consumption of the devices you are trying to switch.

As a result, if you’re using such switches, do consider only using them where you have a large number of appliances controlled by the switch, and even then, use them sparingly as they can easily otherwise undo any savings you might accrue. Remember that these in-line devices are often chewing power even when the switched device is off, and will present a 24/7 load.

In other cases, people have been deceived by plug-in devices which claim to reduce your energy bills simply by being plugged into the wall. Such a device is generally nothing more than a crude power factor correction device, which can falsely reduce the indicated power readings on low-quality power meters that do not correctly measure low power factor loads. The "saved" energy is reactive power, for which home consumers are not billed and which is not "real" delivered power anyway; such devices have also been known to catch fire or fail catastrophically.

The problem with linear transformers

With the introduction of MEPS and energy saving initiatives, the old-fashioned linear wall-wart "brick" power supply has given way to newer, lighter, more efficient switch-mode supplies. The reason for this move might not be obvious to the average consumer, so I decided I should probably touch on it.

To simplify, an old-fashioned wall-wart consisted of a (normally) E-I laminated core transformer that converts mains voltage AC into low voltage AC, which is then rectified and smoothed by capacitors to produce low voltage DC. Such transformers can be quite expensive to manufacture due to the iron involved.

These transformers got warm, or even hot, while plugged in and idling. One reason is that the core material is conducive to eddy current losses, where the changing magnetic field induces electrical currents in the core, which are lost as heat through electrical resistance. Another is the resistance loss in the windings themselves, which are made from many turns of enameled copper wire, itself also expensive. One of the bigger sources of loss is magnetizing current: as the iron core has hysteresis and remanence, each half of the AC cycle tries to magnetize it in the opposite direction, losing some energy in overcoming the hysteresis and the residual magnetic field in the core.

Later transformers were typically cost-optimized by reducing the amount of iron in the core. This had the unfortunate side effect of pushing the core into magnetic saturation, essentially reaching the limit of the magnetic field within the core, resulting in a dramatic drop in permeability and a much less controlled current spike at the peak of the incoming AC waveform. The result is no-load power that increases dramatically as a function of input voltage.

[Graph: no-load power consumption of several linear transformers versus input voltage]

This can be seen in a voltage sweep of several linear transformers in my collection. Different transformers have different idle losses depending on their design and construction, but the worst transformer, running at 240V instead of 230V (a 10V increase), sees an increase in no-load consumption of 330mW, which is fairly sizeable. In places with higher-than-normal mains voltage, you can easily be paying more in standby energy for these older linear transformers due to this cost-cutting measure.

As a result, in larger commercial installations with magnetically-ballasted lighting, mains voltage optimization is sometimes used, although with marginal benefit, as the losses in the transformer that lowers the mains voltage offset the energy saved by such a system. That being said, a long time ago, someone pointing out that utility suppliers were increasing the voltage to increase energy consumption was laughed off the stage, but there is some truth to it.

Switch-mode power supplies instead rectify the incoming AC straight into DC; they are wide-voltage and relatively insensitive to input power quality, as they are fully regulated. The DC is then chopped up into higher frequency AC, which is handled more efficiently by smaller transformers with less weight and material, at the cost of complexity. These devices consume nearly constant power across a wide range of voltages, overcoming this limitation of linear supplies as well.

Unfortunately, not all equipment is suited to switching supplies, as their outputs may have high frequency noise components; sensitive radio equipment is one example.

Newer is better? How can you tell?

In general, newer is better, for several reasons. Newer appliances are generally designed with the latest technology and can thus offer significant energy savings. This might include more efficient converter electronics inside the device, better seals and insulation in the case of refrigerators, or better mechanical design to optimize output efficiency. Newer appliances generally have to meet more stringent energy efficiency requirements in order to be imported, for example MEPS in Australia, or even 80Plus certification for computer power supplies.

However, one must be careful, as it's not guaranteed that a newer product is more efficient. In some cases, newer products with more bells and whistles add to the power consumption, and desired functionality upgrades (say, trading up to a larger TV) can actually worsen your consumption.

Looking at the nameplate wattage is not enough. Sadly, the nameplate wattage is derived from operating conditions which may not be uniform across models, and often represents maximum draw, whereas most users typically configure a device to their requirements, drawing significantly less power (e.g. a TV not running at maximum brightness). As a result, you could end up being misled: say a 140W TV with a lower maximum brightness, versus a 160W TV with a higher maximum brightness, could end up consuming something like 90W and 70W respectively when tuned to the same brightness. It would be a mistake to buy the 140W TV solely because it consumes less at its maximum setting.
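
A hypothetical worked example (my own illustrative numbers, matching the scenario above) shows how far nameplate-based reasoning can mislead:

```python
TARIFF = 0.25          # AU$/kWh
HOURS_PER_DAY = 5      # assumed viewing time

# (nameplate W, actual W at the user's chosen brightness) - hypothetical TVs
tvs = {"140W-nameplate TV": (140, 90), "160W-nameplate TV": (160, 70)}

for name, (nameplate, actual) in tvs.items():
    kwh = actual * HOURS_PER_DAY * 365 / 1000
    print(f"{name}: {kwh:.0f} kWh/yr, AU${kwh * TARIFF:.2f}/yr")
# The TV with the higher nameplate figure costs about AU$9/yr less to run here.
```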

That is where energy efficiency labelling based on testing (e.g. the Energy Rating stars label) is useful. These provide a figure and star-based rating based on laboratory testing under simulated real-life conditions which try to take the model-to-model performance variation out, and instead look at the power consumed to perform a particular task to a particular standard. Some of these have been found to underestimate real-life usage, but again, each consumer’s operating conditions will vary.

Of course, in general, unless the appliance is very old, it rarely makes sense to replace an appliance with a newer one for the energy savings alone, especially if saving money is your goal. This is especially the case for appliances with low utilization, for which differences in energy consumption don't accumulate quickly as savings. Furthermore, replacing appliances before the end of their nominal lifetime has a potentially negative impact on the environment, land-filling devices before their useful life is up and consuming further resources in a new device. There are exceptions, of course, especially old fridges and TVs where take-back schemes operate, and incandescent lighting where carbon abatement schemes were put into place.

On power adapters and other devices, the MEPS efficiency and idle load requirements are indicated by efficiency marks: roman numerals printed next to the words "efficiency level". For external power supplies, these are governed by AS4665.1 and AS4665.2; low voltage lighting transformers are covered by AS4879.1 and AS4879.2. All of these Australian standards have been recommended to the IEC for adoption as IEC standards.

These efficiency marks differ for each category of device, so an external power supply marked IV is not of the same efficiency as a lighting transformer marked IV. The full table is provided in Appendix A of the respective standards, and it gets quite messy for power supplies, where nameplate power, AC-AC or AC-DC output type and natural-log functions are all involved. Ultimately, a higher roman numeral indicates better efficiency: V is better than IV, which is better than III, which is better than II.
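
To give a feel for the shape of those formulas, here's an illustrative Python sketch. The coefficients below are indicative of the natural-log curves used for marks IV and V on AC-DC external supplies with 1-49W nameplate outputs; they are my assumption for illustration, and the authoritative tables live in Appendix A of the respective standards:

```python
import math

# Illustrative only: (ln coefficient, offset, efficiency floor for >= 49 W).
# Real values differ by device category and output type - check the standard.
MARKS = {
    "IV": (0.09, 0.49, 0.85),
    "V":  (0.0626, 0.622, 0.87),
}

def min_efficiency(mark, nameplate_w):
    """Minimum average efficiency for a given mark and nameplate output."""
    a, b, high = MARKS[mark]
    return high if nameplate_w >= 49 else a * math.log(nameplate_w) + b

for mark in ("IV", "V"):
    row = ", ".join(f"{p}W: {min_efficiency(mark, p):.1%}" for p in (5, 12, 36, 65))
    print(f"mark {mark} -> {row}")
```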

I think it’s rather odd that even though devices that need MEPS testing are labelled with efficiency marks, these marks are often only printed on the device itself. How is a consumer to know, without opening the box, whether a router includes a power supply unit with a III mark, or a V mark? Over the operating lifetime, the difference between these efficiency levels could result in a significant difference. It’s a sad fact that you just don’t know for sure, and all you know is that you’re not going to be getting something below II mark grading.

Imported trouble

Despite our "globalized" society, there are often situations of unfair pricing which lead to importing products from overseas and buying "grey imports". These products are sometimes "regionalized" for their country of sale, resulting in products with the wrong-language instruction manual, for example.

But to regulators and efficiency standards, this is where non-compliant imported devices can come unstuck. In my experience, most major-brand products stick to one or very few variations of a device for the whole world, which means that aside from not having the right label with the right logos, the device is still very much as efficient as it otherwise would be, and safe to operate.

It’s the low-cost unbranded stuff, typically from China, that you really have to be concerned about. In a drive to minimise components to minimise costs, the resulting devices have been known to cause safety problems leading to electrocution, low-quality power output with poor regulation and high ripple, as well as low efficiency. These unbranded products are not only a potential hazard, they can also be very poor performers efficiency wise. I have personally seen very questionable construction methods including insufficient isolation, lack of fusing, wire/solder scraps, and incorrect components with insufficient voltage ratings being used regularly in such adapters. It might be wise to replace any of these power adapters with more reputable ones with the necessary safety marks.

It’s all in the habits

I suppose the take-home message is that a change in habits is likely to have a bigger impact on your electricity usage than a choice of products based on their standby power alone. For high-utilization devices, which are active for long periods, choosing a device with lower active power might make sense. But as most items in the home may be idle for long periods, it seems that the energy efficiency initiatives have succeeded in reducing standby power consumption to rather low levels.

Despite this, it’s important to remember that these currents will all add up, and by having 20 x 0.5W devices idling at all times is the same as leaving on a 10W energy saving globe at all times. As a result, wherever practical, it is good to make a habit of unplugging unused devices to reduce the standby costs, as well as eliminate risk of fire and damage to devices due to surges. Even if it doesn’t save much energy, every bit of energy saved can contribute positively to the condition of the environment.

More importantly, the bigger message is to avoid leaving appliances in their active modes longer than necessary. Not watching that TV? Turn it off. Not using that room? Turn off the light. The gap between active mode and standby mode consumption has grown due to the push to reduce standby power levels; accordingly, even if a device is left on standby for the majority of the day, active power still makes up the majority of its consumption in many cases. Making sure you reduce the active mode consumption can therefore have a larger impact than chasing the standby power alone.

[Table: standby energy balance calculations for typical usage scenarios]

This table shows what I would consider typical application-case power. For the mobile phone charger, most people use it 2-3 hours a day, which results in about 10-15% of lifetime consumed energy being standby energy (i.e. energy saved when unplugged). The proportion is higher for a lightly used laptop, which may see 20-45% of its energy going to standby, so it can make sense to unplug one that is mostly in standby; heavier users are unlikely to see as much benefit. As for a television, the figure can range from 27-64% of energy in standby, mainly because I've accounted for a larger standby load of ~5W in the case of Smart TVs which remain network-connected and recording in the background. Despite this being a low-power mode, it contributes significantly, although the user probably wouldn't want to unplug the TV if they wish to enjoy all that their device has to offer. This is the price of more modern, connected, always-on, always-ready devices and upgrading to "better" products.
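
To show how those percentages come about, here's a minimal sketch with assumed (not measured) powers and duty cycles; the exact figures will vary with each device and user:

```python
def standby_share(active_w, active_h_per_day, standby_w):
    """Fraction of daily energy spent in standby for a device that is
    active for some hours and in standby the rest of the time."""
    active_e = active_w * active_h_per_day
    standby_e = standby_w * (24 - active_h_per_day)
    return standby_e / (active_e + standby_e)

# (active W, active h/day, standby W) - assumed figures for illustration
scenarios = {
    "phone charger":       (5.0, 2.5, 0.1),
    "lightly used laptop": (40.0, 2.0, 1.0),
    "smart TV, recording": (60.0, 4.0, 5.0),
}
for name, args in scenarios.items():
    print(f"{name}: {standby_share(*args):.0%} of energy is standby")
```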

Conclusion

When it comes to saving energy, it doesn't take much brains to know that unplugging a device or turning it off at the wall is the way to go. However, depending on how the device is used, doing so may not actually save you as much money as you would have hoped, mainly due to the success of standby energy saving initiatives in forcing manufacturers to produce more efficient devices, and to progress in technology. In fact, the inconvenience of doing this can drive people to technological solutions which end up costing them more energy. Ultimately, saving energy doesn't have to involve expensive fancy gadgets; in fact, fancy gadgets may be quietly working against you without you knowing.

While I didn’t test many of the newest and more modern devices, the message seems to be that the monetary cost of standby energy is not as big as it once was, due to the adoption of switching converter supplies. As before, a change in habit with particular attention to how your devices are used in their active modes is still likely to provide more financial savings. Good habits will pay off consistently, and so will having good choices when it comes time to replace your equipment.

It would be nice to better understand how accurate (or inaccurate) cheap power meters are, but even the logistics of performing such measurements are not trivial. Generating the necessary waveforms at a range of powers to allow for comparison is daunting, and the measurements would take a day or more to collect and analyze. Maybe I will take on this exercise sometime in the future. However, experience from owning and using a proper piece of equipment suggests that those meters are both useful and misleading at the same time, so don't be too concerned if you are reading implausible values; it could just be that the appliance you are testing draws a highly complex current waveform of the sort a cheap power meter cannot accurately measure. Such meters still remain useful for identifying gross, large loads which may have inadvertently been left on (e.g. a stove, iron, etc.).

UPDATE: I have since tested the accuracy of a Kill-a-Watt style meter, and the results aren’t exactly great.
