This is a bit of an unusual question, and I wasn't really sure where to put it (Geeks Lounge or Hardware), but I decided to stick it here anyway.

I've recovered an old PC, an Acer Aspire desktop (not sure of the model, but it's certainly a small form factor... and poorly designed for heat dissipation, but that's another story), which I'm planning on turning into a Folding@Home server for Stanford's medical research project.

The problem is that I don't know how much it's going to cost to run. Ideally I'd want it running for as long as possible, but at 18 pence (27 cents) per kilowatt-hour it could get very expensive.

The power supply is rated at 19 V, 4.74 A output and 240 V, 1.5 A input. Does that mean it is only drawing 76 watts, or are my calculations wrong?

Thanks, and any help would be appreciated.


Remember that the power supply can provide up to that amount of power. Units usually draw less than the maximum, but let's assume the maximum for this exercise as a worst case.

Amps × Volts = Watts, so 19 × 4.74 = 90.06 watts.
So up to 90.06 watts × 24 hours = 2161.44 watt-hours each day.
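The same worst-case calculation as a quick script, if you want to plug in other PSU ratings (variable names are mine):

```python
# Worst-case draw from the PSU's rated output: amps x volts = watts.
volts = 19.0
amps = 4.74

watts = volts * amps      # 90.06 W maximum draw
daily_wh = watts * 24     # watt-hours consumed per day at full draw

print(f"{watts:.2f} W, {daily_wh:.2f} Wh/day")
# prints: 90.06 W, 2161.44 Wh/day
```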


Thanks for the maths CimmerianX. That would put it at about £142 per annum, so running it for half a day overnight would make sense: not only would it halve that figure, but in theory it should be cheaper still because of off-peak charges.
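For anyone checking the figure, a quick sketch of the cost maths (the 18p/kWh rate is from the original post; the half-day figure ignores any off-peak discount):

```python
# Rough annual running cost at the worst-case 90.06 W draw.
daily_kwh = 2161.44 / 1000   # 90.06 W x 24 h, converted to kWh
rate = 0.18                  # GBP per kilowatt-hour

full_time = daily_kwh * 365 * rate   # running 24/7
half_time = full_time / 2            # 12 h/day, before off-peak rates

print(f"24/7: GBP {full_time:.2f}, nights only: GBP {half_time:.2f}")
# prints: 24/7: GBP 142.01, nights only: GBP 71.00
```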
