Thread 55752 | Computer power usage | started 2005-03-18 21:55:00 by lazydog (148) | PC World Chat
Post 335489 | 2005-03-18 21:55:00 | lazydog (148)
Does anyone know roughly how much power a computer would use in a month if it's always turned on? Preferably a dollar value if possible. We just got our power bill and I'm trying to defend the computer ;) Cheers.
Post 335490 | 2005-03-18 23:02:00 | Murray P (44)
Can't give any idea without specifications: what peripherals are attached to the machine, how often they are used, and what the machine is used for (the load, and therefore the power consumption). For example, laser printers and CRT monitors consume significantly more power than inkjet printers and LCD monitors, respectively. If the machine is left on (including peripherals), are they set to go on standby or hibernate when not in use, and what power scheme is used? If you do a search of PF1, you'll find some figures for working this out; look for posts about a year old, IIRC.
Post 335491 | 2005-03-18 23:12:00 | lazydog (148)
OK, cheers.
Post 335492 | 2005-03-19 00:39:00 | Rimu (7639)
I think we figured out that mine costs about $10 per month. The exact details of the process have faded into the mists of time, but it goes something like this: it's an old Celeron 466 with no monitor. Even if it were drawing the maximum possible amount of power, it would consume about as much as two light bulbs (200 watts). Figure out how much it costs for 1 watt of electricity for 1 hour (look at your power bill or call your power company), then multiply: 200 (more like 400 for modern computers) × cost-of-watt-per-hour × hours-per-month = cost of running the computer. Most of the time your computer will not be drawing anywhere NEAR the maximum its power supply can provide, so this only gives you a maximum possible cost. The real value is probably about 75% of that, but I think you'll find the maximum possible is so small that the exact amount doesn't really matter... :2cents:
Post 335493 | 2005-03-19 01:26:00 | godfather (25)
It depends on the tariff, but power costs are often around 14c/kWh these days. 200 watts (0.2 kW) on 24/7 = 0.2 kW × 24 (hours) × 30 (days) × $0.14 = $20.16 per month. 400 watts is therefore $40.32 per month, etc. But the Celeron 466 sans monitor would probably be about 150 watts = $15/month.
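For anyone wanting to plug in their own numbers, here is a minimal sketch of the kilowatt-hour arithmetic the two posts above are using; the 14c/kWh tariff and the wattage figures are just the examples quoted in the thread, so substitute your own rate and draw:

```python
def monthly_cost(watts, rate_per_kwh=0.14, hours_per_day=24, days=30):
    """Estimate the monthly cost of a device left on continuously.

    rate_per_kwh defaults to the 14c/kWh tariff quoted in the thread;
    replace it with the rate from your own power bill.
    """
    kwh = watts / 1000 * hours_per_day * days
    return kwh * rate_per_kwh

# The figures worked out in the thread:
print(monthly_cost(200))  # 20.16 -- the two-light-bulb worst case
print(monthly_cost(400))  # 40.32 -- a bigger modern PSU at full draw
print(monthly_cost(150))  # 15.12 -- the Celeron 466 without a monitor
```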
Post 335494 | 2005-03-19 02:57:00 | george12 (7)
Much less. My (inefficient) linear regulator circuit leaves the computer drawing a total of 3.5 amps from the 12 V battery. That's only 42 W. Admittedly it's only a Celeron 466, but still, most PCs only use about 2/3 the rating of the PSU. The actual computer itself only uses 35 watts or so, but linear regulators are inefficient. Of course, I'm kind of talking on a DC tangent here. Back in the 240 V world, I have an old 400 W PSU that says it draws up to 5 amps from 240 V. Holy moly, that's 1200 W!! So it really does depend on the (greatly differing) efficiency of the PSU. I estimate $3 to $15 per month for most systems on average power settings.
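The relationship george12 is working from is just P = V × I. A quick check of the numbers in the post; note the ~83% efficiency figure is derived here from the 35 W and 42 W values given, not something the post states directly:

```python
def power_watts(volts, amps):
    """Electrical power: P = V * I."""
    return volts * amps

battery_draw = power_watts(12, 3.5)   # 42.0 W pulled from the 12 V battery
nameplate_max = power_watts(240, 5)   # 1200 W -- the label's worst case, not the typical draw
regulator_eff = 35 / battery_draw     # ~0.83: the linear regulator dissipates the rest as heat
print(battery_draw, nameplate_max, round(regulator_eff, 2))
```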
Post 335495 | 2005-03-19 03:47:00 | godfather (25)
OK, some definitive figures. These have been measured using a calibrated instrument. My 2 GHz P4, with no monitor, draws 144 watts with the PC idling (doing nothing but running Windows). Activity (HDD or CD, etc.) will increase this. That is ~$15 per month; less for a 466 Celeron. The "amps rating" of the PSU will also account for any 110 V setting, where the current will be double.
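Running the measured 144 W through the same tariff arithmetic used earlier in the thread confirms the ~$15 figure (again assuming the 14c/kWh rate quoted above), and the 110 V remark is the same P = V × I rearranged for current:

```python
idle_watts = 144
kwh_per_month = idle_watts / 1000 * 24 * 30   # 103.68 kWh
print(kwh_per_month * 0.14)                   # ~14.52, i.e. the ~$15/month quoted

# Roughly the same power at half the mains voltage needs roughly double
# the current, which is why PSU labels quote a higher amps rating at 110 V:
print(idle_watts / 240, idle_watts / 110)     # 0.6 A vs ~1.31 A
```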