Thanksgiving weekend is the traditional start of the Christmas lighting season at many homes, when the ladders come out and the strings of blinking colored bulbs go up.
But while holiday lighting displays make most people think of “peace on earth,” and all that stuff, it made me think of my electric bill. No wonder no one invites me to their holiday parties.
I thought figuring out how much electricity my Christmas lights use would be easy. My plan for testing the power use on my lights was to:
- Establish the baseline power consumption for the house by reading the energy monitor
- Plug in a string of lights to see how much the power use increased
- Subtract the difference and get the consumption for the lights
- Multiply that by the number of lights I’d plug in and multiply that number by my power rate
Turns out my problems began with Step 1.
I thought I had the house in a steady state, told everyone to keep their hands off the light switches, had the refrigerator off, etc. Yet I kept getting slight variations in the power monitor readings, ranging from 0.409 to 0.426 kilowatts. Yes, tiny variations, but when I plugged in a string of LED Christmas lights I couldn’t tell the difference, or at least couldn’t get a reliable reading.
The likely cause of the fluctuation, CenterPoint tells me, would be power consumed by my computer, cable TV box and a few other items, which could add or subtract a few watts at a time to the power usage. So I have to redo the test with just about everything in the house unplugged.
But there’s an easier way to figure this all out, even without that nifty power monitor: Simply read the label on the lights.
Most lights will have a label that lists three numbers: volts (V), hertz (Hz) and amps (often just an “A”). Multiply the volts by the amps and you get watts (V x A = W).
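That label math is simple enough to sketch in a couple of lines of Python (the voltage and amperage figures below are the ones from my own light strings, described next):

```python
def watts(volts, amps):
    """Power draw in watts, from the label: V x A = W."""
    return volts * amps

# My LED string's label: 120 V, 0.046 A
print(round(watts(120, 0.046), 2))   # 5.52 watts
```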
So, one of the four strings of LEDs that will adorn my roofline this year is rated at 120 volts (pretty much any set of lights you buy will be 120 volts) and 0.046 amps. That means it draws 5.52 watts.
My neighbors let me test one of the light strings they have used on their house for the past 10 years, which is labeled at 1.46 amps. That works out to 175.2 watts.
So if I run my one LED light string for, let’s say, 10 hours per day for all of December, that’s 1,711.2 watt-hours, or about 1.7 kilowatt-hours. At my current rate of 7.8 cents per kWh (month-to-month plan, so I’m sure it won’t last), that’s about 13 cents for the month.
My neighbor’s lights over that same period would use 54.3 kWh, or about $4.24 for the month.
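The whole month-of-December calculation can be wrapped up the same way. This is just a sketch of the arithmetic above, assuming 10 hours a day for 31 days at my 7.8-cents-per-kWh rate:

```python
def monthly_cost(watts, hours_per_day=10, days=31, cents_per_kwh=7.8):
    """Return (kWh used, dollars spent) for one light string over the month."""
    kwh = watts * hours_per_day * days / 1000.0   # watt-hours -> kilowatt-hours
    return kwh, kwh * cents_per_kwh / 100.0       # cents -> dollars

led_kwh, led_cost = monthly_cost(5.52)      # my LED string: ~1.71 kWh, ~$0.13
old_kwh, old_cost = monthly_cost(175.2)     # neighbor's string: ~54.3 kWh, ~$4.24
print(f"LED: {led_kwh:.2f} kWh, ${led_cost:.2f}")
print(f"Old: {old_kwh:.2f} kWh, ${old_cost:.2f}")
```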
Multiply that by the number of light strings that will be draped around the house and it adds up to real money. I guess I should tell my neighbor.
Once I get my house all wired up, I’ll retry the tests using the power monitor and compare it with the Volts x Amps calculation from the lighting labels. Stay tuned.