| Computers | |
|---|---|
| Desktop computer | 60-250 watts |
| On screen saver | 60-250 watts (no difference) |
| Sleep / standby | 1-6 watts |
| Laptop | 15-45 watts |

| Monitors | |
|---|---|
| Typical 17" CRT | 80 watts |
| Typical 17" LCD | 35 watts |
| Apple MS 17" CRT, mostly white (blank IE window) | 63 watts |
| Apple MS 17" CRT, mostly black (black Windows desktop with just a few icons) | 54 watts |
| Screen saver (any image on screen) | Same as above (no difference) |
| Sleeping monitor (dark screen) | 0-15 watts |
| Monitor turned off at switch | 0-10 watts |
To figure the cost, use this formula:

(Watts x Hours Used / 1000) x Cost per kilowatt-hour = Total Cost
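If you'd rather let a script do the arithmetic, here's a minimal Python sketch of that formula (the annual_cost helper is just illustrative, not part of any tool mentioned here):

```python
def annual_cost(watts, hours_per_year, dollars_per_kwh):
    """(Watts x hours used / 1000) x cost per kWh = total cost in dollars."""
    kilowatt_hours = watts * hours_per_year / 1000   # watt-hours -> kilowatt-hours
    return kilowatt_hours * dollars_per_kwh          # kWh x price = dollars per year
```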
For example, let's say you have a big high-end computer with a gaming-level graphics card and an old CRT monitor, and you leave them on 24/7. That's about 330 watts x 24 hours x 365 days/yr = 2,890,800 watt-hours, or 2891 kilowatt-hours. If you're paying $0.14 per kWh, you're paying $405 a year to run your computer.
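Plugging that first example into the annual_cost sketch above gives the same ballpark figure:

```python
# Roughly 330 watts running 24 hours a day, 365 days a year, at $0.14/kWh
print(round(annual_cost(330, 24 * 365, 0.14)))  # -> 405 (dollars per year)
```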
Let's try a different example: You have a computer that's less of an energy hog, like an iMac G5 20", which uses about 105 watts, and you're smart enough to turn it off when you're not using it. You use it for two hours a day, five days a week. That's ten hours a week, or 520 hours a year. So your 105 watts times 520 hours = 54,600 watt-hours. Divide by 1000 and you have about 55 kilowatt-hours (kWh). If you're paying 10¢ per kilowatt-hour, then you're paying about $5.50 a year to run your computer.
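And the second example, again using the annual_cost sketch from above:

```python
# About 105 watts for 520 hours a year, at $0.10/kWh
print(round(annual_cost(105, 520, 0.10), 2))  # -> 5.46, i.e. roughly $5.50 a year
```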
That's quite a range, $5.50 to $405 a year. It really depends on what kind of computer it is and how much you use it -- and especially whether you sleep it when you're not using it. Both of the examples above are extremes. I used to give only one example, somewhere in the middle, but then I'd see people on blogs and message boards misquoting it by writing, "Mr. Electricity says a computer costs about $150/yr. to run." No, that's not what I said. I said that was just an example. Your situation is almost certainly different, and you need to consider all the variables: what kind of computer it is, how much you use it, and most especially whether you leave it running all the time or sleep it when you're not using it.