Clocks, Microwaves, and the Limits of Fermi Problems

I don't have anything all that new to say about last night's Cosmos reboot, and I'm leaving for scenic Madison, WI today to attend DAMOP, so I don't have a great deal of time. Kate did mention something over dinner last night, though, that's a good topic for a quick blog post.

Kate's a big listener of audiobooks and podcasts, including The Naked Scientists podcast, and she mentioned something they said in responding to a question about charging phones and the cost of electricity:

I think my favourite one is a microwave oven. So, the clock on a microwave oven uses more electricity over the course of its lifetime than cooking food does. So, if you want your microwave oven to cost the least amount of money, don’t use it as a clock. Just turn it on when you want to use it.

That struck me as kind of weird, and it's the sort of thing that is amenable to a Fermi problem sort of plausibility argument. So, to Kate's chagrin, I grabbed a napkin and one of SteelyKid's markers, and did some math.

The tricky thing about discussions of electrical usage is always the units-- people get twisted around about what things mean. Most electrical things have ratings in watts, and so people will casually talk about electrical power as if that's the thing consumed. Power isn't a measure of stuff, though, power is a rate-- it's a measure of how much energy gets converted from electrical potential energy to some other form (light, heat, sound, microwaves) in a given amount of time. When you're talking about electricity used over the lifetime of something, what you need is not the power, but the total energy.

(This is why your electric bill is given in units of "kilowatt-hours." It's mixing units a little-- a kilowatt is 1000 joules per second, and an hour is 3600 seconds-- but a kilowatt-hour has dimensions of energy, and so is the proper measure of consumption.)
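To put a number on that, one kilowatt-hour is:

$latex 1 kWh = (1000 J/s)(3600 s) = 3,600,000 J $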

So, if you want to figure out the energy usage of your microwave compared to the clock in your microwave, what do you do? Well, the microwave itself has a rating in watts-- typically around 1000W for a household device, which is a nice round number to use in a Fermi-problem estimate. For estimation purposes, it's probably reasonable to assume that the microwave draws this full amount of power when it's running on high. But that's a very short time, compared to the clock, which is on 24/7, albeit at a much lower power level.

So, how do you compare the two? Well, one reasonable thing to do is to calculate an average power for the microwave when it's used for cooking. That is, we can calculate the total energy used over some time period, which is really concentrated in a few short bursts, but then divide it by the full duration of that period. Essentially, we pretend that rather than dissipating energy at 1000 J/s for a short time, it's dissipating energy at a much lower rate over a much longer time. Then we can see how that compares to the power usage of the clock.

So, what's the energy used by a microwave? Well, in Chateau Steelypips, our microwave sees a good deal of use in heating up leftovers and frozen vegetables and the like. I don't think it would be unreasonable to say that it's used for cooking about 10 minutes a day, which is a nice round number for Fermi problem purposes.

So, the average daily energy consumed by the Chateau Steelypips microwave running at 1000W for 10min a day is:

$latex E_{cooking} = (1000 J/s)(600 s) = 600,000 J $

There are 86,400 seconds in a day, so this works out to:

$latex P_{avg,cooking} = \frac{600,000J}{86,400s} \approx 7 J/s $

So, the ten minutes a day we spend cooking with our microwave is equivalent to a smaller appliance dissipating energy at the rate of seven watts. To determine whether the claim that the clock uses more energy than the microwave holds up, we need to compare this number to the average power drawn by the clock in 24/7 operation.
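If you'd rather not squint at marker on a napkin, here's the same estimate as a few lines of Python; the 1000 W draw and the ten minutes a day are the same guesses as above, not measurements:

    # Fermi estimate: average power of microwave cooking, spread over a full day
    cooking_power_W = 1000.0          # assumed draw while running on high
    cooking_time_s = 10 * 60          # assumed ten minutes of cooking per day
    seconds_per_day = 24 * 60 * 60    # 86,400 s

    cooking_energy_J = cooking_power_W * cooking_time_s      # 600,000 J
    average_power_W = cooking_energy_J / seconds_per_day     # ~6.9 W

    print(f"Daily cooking energy: {cooking_energy_J:,.0f} J")
    print(f"Average cooking power: {average_power_W:.1f} W")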

So, what's the average power of a microwave clock? I have no idea. Seven watts seems awfully high, though. My smart phone (a Moto X) is probably on the short side of a watt when I'm talking on it (using a similar calculation to this), and that's working a lot harder than just keeping an LED clock display going. The label on my bedside alarm clock says 4W, but like the microwave, I expect that's probably only when the screechy alarm is blaring, not the constant draw. If it were dissipating 4W worth of power all the time, I'd expect it to be warm to the touch, and it's not.

So, I'm skeptical that this claim is really true. But this problem also demonstrates the limits of the Fermi problem approach. All I'm really doing with markers on a napkin is setting a reasonable range for the problem. To calculate an actual answer would require a good deal of information that goes beyond the simplifying assumptions I'm making here-- that 1000W might be a maximum rating, not the actual power used; maybe it's really running at 800W. And maybe it's not ten minutes a day of cooking, but six. At which point, the average cooking power has come down by a factor of two, into the same ballpark as my alarm clock. And then maybe we're not typical of microwave owners-- we do have two small kids, which may well mean more microwaving of stuff than the average Briton considered by the Naked Scientists.

I still doubt this really holds up, but given the Fermi-problem estimate, all I can really say is that it continues to seem implausible, but it's not totally ridiculous. I doubt it, but it's a close enough thing that it would probably be reasonable to slap a meter on there and check.

Which I'm not going to do, because I'm getting on a plane to Wisconsin in a few hours, but it's the next logical step in the scientific approach.


Assume the clock uses about as much power as your wristwatch. Then ask how long you can run your microwave on a watch battery....

By John Novak (not verified) on 02 Jun 2014 #permalink

My guess is that they're reasoning from a possibly old estimate of the vampiric power draw from the transformer that works as the clock's DC power supply. (It sounds as if this kind of thing is more tightly regulated than it used to be.)

By Matthew McIrvin (not verified) on 02 Jun 2014 #permalink

For the record, my chagrin was only that you were doing this _right_ _then_ . . .

By Kate Nepveu (not verified) on 02 Jun 2014 #permalink

I think Matt @2 has the answer. If you have a new(ish) microwave, the power draw for the clock probably will be smaller than for older microwaves. My microwave is about ten years old; not coincidentally, that was when I had my kitchen remodeled. That's a reasonable guess for average age of microwaves; some thrifty folks keep theirs going for years (the microwave I replaced was well over a decade old), but others get new kitchens or replace their kitchen appliances more frequently, and of course new housing units are likely to have new microwaves installed. My guess, based on nothing more than a sense that the average Brit is less real estate obsessed than the average American, is that the average microwave over there is older than here, so their average power draw for the clock would be correspondingly higher. Also, older microwave ovens tend to be lower powered (going from memory, I replaced a 700 W unit with an 1100 W unit), although they may not have clocks (my old microwave did not). I'd also venture to guess that your cooking usage is significantly higher than would be typical in the UK, as the "convenience food" factor is less important in most other countries than in the US. So it's likely that for the average Brit, energy usage for the clock really is the same order of magnitude as for cooking.

By Eric Lund (not verified) on 02 Jun 2014 #permalink

I found someone who measured the power usage of a clock radio at between 3 and 4 watts: http://www.lesswaiting.com/alarm-clock-power-consumption.shtml I also have an old GE analog clock (a synchronous motor, so it's basically an AC cycle counter) at 2.5 W. So for standalone clocks, it sounds like 2-4 W is a reasonable range.
Or, looking at it another way, say you also use the microwave to cook and get about 20 minutes a day of high-power use, thus about 330 Wh per day. The clock is 72 Wh per day. So for a month (assuming a standard 30-day month), that's about 10 kWh per month for the cooking and 2.2 kWh per month for the clock.
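The same arithmetic as a quick Python sketch, with the 1000 W cooking power, 20 minutes a day, and 3 W clock all taken as the assumptions above:

    # Daily and monthly energy: ~20 min/day of cooking at 1000 W vs. a 3 W clock
    cooking_Wh_per_day = 1000 * 20 / 60   # ~333 Wh/day
    clock_Wh_per_day = 3 * 24             # 72 Wh/day

    days_per_month = 30                   # nominal 30-day month
    print(f"Cooking: {cooking_Wh_per_day * days_per_month / 1000:.1f} kWh/month")  # ~10.0
    print(f"Clock:   {clock_Wh_per_day * days_per_month / 1000:.1f} kWh/month")    # ~2.2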

@Matthew
I doubt that a modern microwave oven is a significant part of household energy cost. But there can be surprises... I was shocked to find out that my older Motorola TV cable box was using about 30 watts continuously, 24/7 -- almost half as much energy as my refrigerator, and costing me $50 a year in electricity.

With older mechanical-timer microwaves, the story is slightly plausible, assuming a motor-driven clock. However, a more modern microwave with buttons and a digital display will likely implement its clock on the same microprocessor that runs its timer and other functions. So the power difference is that of driving the display to show the time. It depends on display technology (vacuum fluorescent, LED, LCD plus backlight, etc.).

A largish LCD with an LED backlight would draw in the neighborhood of 800 mW. (I'm assuming 4 LEDs for the backlight -- the LCD panel would draw microamps -- at 20 mA each with resistive current limiting and a 5 V supply at 50% efficiency. I'm also assuming that with the clock disabled the backlight would be turned off -- if it's still on, putting up a clock display would need a few microwatts at most.)

A vacuum fluorescent might draw more, though looking at the spec sheet for a present-day Futaba (brand name) VF display I find it comparable to the backlit LCD (70 mA typical @ 5 V for a Futaba sixteen-character display -- it has a built-in inverter for the higher voltage it needs).

Plain LED digits would draw about 10 mA @ 5 V each, so a 4-digit clock would draw about half what the LCD backlight or vacuum fluorescent would.

So on a reasonably modern microwave (which would include my 15-year-old backlit LCD model) it's hard to see a draw of more than a watt. Of course, I'm often amazed at how poor the engineering is in consumer electronics (e.g., why does my 8-year-old Samsung HDTV draw 26 watts when it's turned off?), so it's likely there is a microwave out there that fulfills the claim.
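A rough Python sketch of the display-power arithmetic above; the currents, the 5 V supply, and the 50% supply efficiency are the guesses from this comment, not spec-sheet values:

    # Rough display-power estimates, assuming a 5 V supply at 50% efficiency
    V_SUPPLY = 5.0
    EFFICIENCY = 0.5

    lcd_backlight_W = 4 * 0.020 * V_SUPPLY / EFFICIENCY   # 4 LEDs at 20 mA -> ~0.8 W
    vfd_W = 0.070 * V_SUPPLY / EFFICIENCY                  # ~70 mA VFD -> ~0.7 W
    led_digits_W = 4 * 0.010 * V_SUPPLY / EFFICIENCY       # 4 digits at 10 mA -> ~0.4 W

    for name, watts in [("LCD backlight", lcd_backlight_W),
                        ("Vacuum fluorescent", vfd_W),
                        ("Plain LED digits", led_digits_W)]:
        print(f"{name}: ~{watts:.1f} W")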

By weirdnoise (not verified) on 02 Jun 2014 #permalink

My older in-wall microwave (which burned out two years ago) had a Nixie tube display, which I guesstimated at around 5 watts. It's also quite possible the vampire load from the clock plus electronics power supply adds a few more. So simply replacing this one old, poorly designed appliance saved on the order of one percent of total household electricity consumption.

The claim was probably correct at the time it was made, at least for some microwaves. It's probably not correct for modern ovens.

By Omega Centauri (not verified) on 02 Jun 2014 #permalink

Based upon measurements on a cheapo, non-Energy Star, Emerson 1100 W microwave I've had for a few years, using an inexpensive non-true-RMS Craftsman multimeter and a line splitter, I measured the resting current to be 0.024 A at 124 V.

Assuming a power factor close to one, that gets me a figure of roughly 3 W, which sounds about right. As I understand it, more modern Energy Star units use less than 1 W on standby.

As I figure it, assuming the unit actually draws 1100 W and I did the math right (subject to correction), if you use the microwave for less than 4 minutes a day, then the standby energy use exceeds the running energy use.

The way I figured it was that the standby power is about 1/366 of the full power. Take the 86,400 seconds in a day, divide by 366, then divide by 60 to get back to minutes, and you get just this side of 4 minutes. I estimate that I exceed that pretty much every day.

An interesting calculation: assuming $0.12/kWh, that 3 W running 24/7/365 is costing me about $3 a year. If I unplug it after every use, perhaps half a dozen times a day or more, I save at most those three dollars. Of course, that doesn't account for my time or the wear and tear on the plug and receptacle.
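Running those numbers through Python, with the nameplate 1100 W and the measured 0.024 A at 124 V taken at face value:

    # Standby vs. cooking break-even, from a measured 0.024 A resting current at 124 V
    standby_W = 0.024 * 124          # ~3 W, assuming a power factor near one
    cooking_W = 1100.0               # nameplate rating, assumed to be the actual draw

    standby_Wh_per_day = standby_W * 24
    breakeven_min_per_day = standby_Wh_per_day / cooking_W * 60   # ~3.9 minutes

    price_per_kWh = 0.12
    annual_cost = standby_W * 24 * 365 / 1000 * price_per_kWh     # about $3/year

    print(f"Standby draw: {standby_W:.2f} W")
    print(f"Break-even cooking time: {breakeven_min_per_day:.1f} min/day")
    print(f"Annual cost of standby: ${annual_cost:.2f}")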

So far that cheap microwave has been quite the bargain. I got it on deep discount because the box was torn up when it had literally fallen off the truck. I had to readjust the microswitches to get it to work right -- a bit of fiddling to get the door to close properly and for the safeties to know the door was properly closed.

Re #10: You make a good point. Any system with touch pads to activate it needs to have some electronics powered on at all times, waiting for the touch pad to be pressed. It's like the TV waiting for the remote control to turn it on (it takes some energy to keep the receiver online), or computers that don't have real power switches.

It's absolutely not a surprise that most appliances are electrical "vampires." (The 26 watts my TV and 21 watts my video receiver draw when turned off add up to more than all the lights (LEDs) in my living room when turned on -- which is why I have a switch that disconnects the TV and receiver entirely when not in use.) My calculation was based on the original premise, which was that having a clock on a microwave costs more energy than the few minutes of actual use it gets a day. That's probably not true. It's not the clock, but rather the other electronics, that creates the drain. It's misleading to have folks think that by turning off the clock on their appliance they are saving significant power, or that the clock is the reason it draws that power in the first place.

To actually save significant power, a consumer would have to have a switch that interrupts power fed to the device's power cord.

By weirdnoise (not verified) on 02 Jun 2014 #permalink

A lot of that standby load is a result of engineers trying to keep costs, component counts, and weight down. A single large transformer can be tapped multiple times to feed multiple circuits and voltages.

The downside of this is that a transformer optimized for that 1100 W load is not efficient when feeding a 12 V, 1 W clock/control circuit. It's not unexpected that the transformer dissipates 2 W as heat while feeding that 1 W load, with the 1 W going to a simple 12 V diode bridge and a DC-DC converter IC that feeds 250 mA of 5 V DC to a control and display circuit.

Remember that mark-ups on retail are 3 to 10x, so that $100 microwave has to be built, from mining of raw materials to trucking to the store, for somewhere between $10 and $33, because customers are very price-sensitive and not very concerned with cost of ownership. So ... yes, it could have been designed to use less energy.

Although there are certainly limits to Fermi problems (or rather "Fermi solutions"), this post is an interesting illustration of an aspect of their power which often goes unmentioned. Yes, the solution in Chad's post seems to have error bars that are too big to draw a clearcut conclusion. But it seems to me that the process has inspired others with more detailed knowledge to refine the solution. I learned some stuff this morning - mostly from the comments, but it seems to me that it was the original post that set things in motion. Ain't that great?

By Dr. Decay (not verified) on 02 Jun 2014 #permalink

Ah, where to begin.

I'd say the point of that Fermi question is, quite simply, to get you to think about the difference between average and instantaneous power. That makes a good teaching example. My only problem with your analysis is how little you use your microwave! You must dine out a lot.

Step outside the box of the original question and you should be thinking of an experimental tool like those watt meters you plug into the wall to monitor a single device. Get one at school and use it as a take-home lab.

One of the old barometer-height-of-building solutions would be to take your wife on a month-long vacation with the AC turned off and everything else unplugged and let the power company tell you the parasitic load from the microwave.

@13: An old microwave I took apart had two transformers, a massive one that was the majority of the weight of the beast, and a small one with multiple taps. IIRC, the big one was for the magnetron tube and the other ran everything else (but no clock).

@8: Nixie tubes? Was it steam powered?

Our microwave doesn't have a clock, but our electric stove does. That clock must run off of stepped-down 240 V.

By CCPhysicist (not verified) on 03 Jun 2014 #permalink

@15: Good points, especially about purchasing a plug-in wattmeter, which would remove all of the pseudoscience used in the estimations.

If one measures the actual power levels, then performs the energy calculations, it might seem beneficial to switch off specific devices when they're not in use. However, switching off a device in order to save the cost of electricity makes the huge leap of faith that regularly power cycling the devices will not significantly reduce their working life (due to premature component failure).

@10: I'd always assumed that the power factor was close to one. A plug-in wattmeter that also measures VA and power factor showed me how mistaken I was for low-powered devices and especially mistaken for devices operating in standby mode.

Low power factor devices do not directly incur cost to the consumer; it is the power losses they cause in the electricity generators and supply cables that incur cost to the supplier. This wasted energy also impacts the environment via increased generator fuel usage.

E.g. one of my phone chargers on standby is a supply load of 2 VA, but it consumes only 0.02 W. For the sake of the environment I should perhaps switch it off; for the sake of my electricity bill, longevity of the charger, and wear on the switch, I should perhaps leave it permanently powered. There isn't a correct answer because there is no suitable research data for any of my devices as used in a typical home environment.
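For scale, that charger's standby power factor works out to only about:

$latex PF = 0.02 W / 2 VA = 0.01 $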

Chad has indeed presented a really good example of a Fermi problem.

So I dragged out our 15-year-old microwave and my "Watts Up?" power meter, and plugged it in. Nada -- it read 0.0. Well, it may not work down to fractional watts, so I added a small incandescent lamp to the load, which read 48.2 watts. Unplugging and replugging the microwave produced a tiny surge, but it settled out to the same value: 48.2. The LED in the microwave display lit up (it was smaller and dimmer than I remembered -- probably only a single LED). I set the clock display, and as expected, saw no change.

Well, I suppose I could buy a more expensive power meter (this was an original made-in-the-USA Watts Up? I got a decade ago). But I feel safe in saying that heating a cup of water to boiling at an average of 1726 watts for 2:15 uses several times more energy than the idle microwave, at less than 1 watt (perhaps even less than 0.1 watt), would draw over the rest of the 24 hours.
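In numbers, taking the 1726 W reading and the 2:15 boil time at face value, and using 1 W as a deliberately generous bound on the idle draw:

    # Energy to boil a cup of water vs. a generous 1 W idle draw for the rest of the day
    boil_W = 1726.0
    boil_s = 2 * 60 + 15                       # 2:15

    boil_Wh = boil_W * boil_s / 3600           # ~65 Wh
    idle_Wh = 1.0 * (24 - boil_s / 3600)       # just under 24 Wh, even at a full watt

    print(f"Boiling: {boil_Wh:.0f} Wh; idle (1 W bound): {idle_Wh:.0f} Wh")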

So this microwave may be exceptional (kudos to the engineers at Panasonic if it is), but it shows that even pre-Energy Star appliances can be thrifty with energy.

By weirdnoise (not verified) on 07 Jun 2014 #permalink

Your calculation is more easily done in kilowatt-hours (and assuming a 360-day year, for round numbers). The result is of course the same whether you calculate J/day or kWh/year, but consumers may know the price of a kWh, and they certainly don't know a joule from a BTU.

So use over 360 days is:
- for heating: 1 kW * 10 min/day * 360 days/year = 3600 kW-min per year = (dividing by 60 min/h) 60 kWh per year
- for the clock/standby, if drawing 3 W continuously: 3 W * 24 h * 360 days/year = 25,920 Wh = 25.9 kWh, say 26 kWh

So heating use is about 60 kWh/year and a 3 W clock about 26 kWh/year, for a ratio of 26/60 = 43%. If you get electric power at $0.40/kWh, that is $24 for the microwave heating and $10.40 for the clock at 3 W. (Add 1.4% for a 365-day year.)
I suspect that most consumers won't care.
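The same yearly figures as a Python sketch, with the 1 kW for 10 minutes a day, the 3 W clock, the 360-day year, and the $0.40/kWh price all taken as the assumptions above:

    # Yearly energy and cost: 10 min/day of cooking at 1 kW vs. a 3 W clock
    days = 360
    cooking_kWh = 1.0 * (10 / 60) * days        # 60 kWh/year
    clock_kWh = (3 / 1000) * 24 * days          # ~25.9 kWh/year

    price = 0.40                                # assumed $/kWh
    print(f"Cooking: {cooking_kWh:.0f} kWh/yr, ${cooking_kWh * price:.2f}")
    print(f"Clock:   {clock_kWh:.1f} kWh/yr, ${clock_kWh * price:.2f}")
    print(f"Clock/cooking ratio: {clock_kWh / cooking_kWh:.0%}")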