Mon, 20 Oct 2003
Apparently a cluster of 1,100 Apple G5 computers requires "the same amount of electricity as 3,000 average sized homes." The article doesn't say exactly what an average-sized home uses, but this seems a little high.
I was under the impression that the CPUs used in the Macs were low-energy, efficient things. That works out to almost 3 average-sized homes' worth of energy for each computer. Don't most houses these days have a computer in them? Or are we averaging this out across all the homes, including mud huts in Africa? Maybe a typical computer uses half the energy of a home; I'll assume that below.
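Here's a quick back-of-the-envelope check. The 1.2 kW figure for an average home's continuous draw is my assumption (roughly 10,500 kWh a year, in the right ballpark for a US home), not a number from the article:

```python
# Back-of-envelope: how much power per computer does the article's
# claim imply? AVG_HOME_KW is an assumption, not from the article.
HOMES = 3000
COMPUTERS = 1100
AVG_HOME_KW = 1.2  # assumed average continuous draw per home

total_kw = HOMES * AVG_HOME_KW
per_computer_kw = total_kw / COMPUTERS
print(f"{total_kw:.0f} kW total, {per_computer_kw:.1f} kW per computer")
```

Over 3 kW per machine is an awful lot for a single computer, even counting cooling, which is why the claim smells high to me.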
Ok, cooling is probably a large percentage of the energy usage. 1,100 computers must generate a lot of heat, and the article goes on about it, but I don't believe it should take 5 watts of energy to move 1 watt of heat from a machine room into the atmosphere. If it does, that might explain why so much energy is currently used for air-conditioning.
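For what it's worth, a normal air conditioner works the other way round. A typical chiller has a coefficient of performance (COP) of roughly 3, meaning it moves about 3 watts of heat per watt of electricity; the COP figure is my assumption, not from the article:

```python
# Sanity check on cooling overhead, assuming a chiller COP of ~3
# (an assumption; real machine-room chillers vary).
cop = 3.0
watts_in_per_watt_moved = 1.0 / cop
print(f"~{watts_in_per_watt_moved:.2f} W of electricity per W of heat moved")
```

So cooling should add something like a third again on top of the computers' own draw, not five times it.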
If the G5 is efficient, and it really does need 3 homes per computer, I would hate to have built something like this with Intel CPUs; they must use even more unbelievable amounts of energy. Do supercomputers need this sort of power to get the same performance? 3,000 homes must be a small substation's worth of load; you would not get much time out of your average UPS for this sort of draw.
Maybe we should be looking into harnessing the waste heat from PCs, and using it in some sort of combined heat and power system. Even if we only got enough energy out of it to drive the air-con, that would probably be a good thing. How could we do this? Thermocouples?
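Thermocouples probably wouldn't get us far, sadly. Any heat engine is bounded by the Carnot limit, and PC exhaust air isn't very hot. The temperatures below are my assumptions for a warm exhaust stream versus room ambient:

```python
# Carnot limit on recovering electricity from PC waste heat.
# Temperatures are assumptions: ~40 C exhaust air vs ~20 C ambient.
t_hot = 313.0   # K, warm exhaust air
t_cold = 293.0  # K, room ambient
carnot = 1.0 - t_cold / t_hot
print(f"Carnot limit: {carnot:.1%}")
```

That's about 6% at best, and real thermoelectric devices only achieve a small fraction of the Carnot limit, so expect well under 1% of the waste heat back as electricity. Using the heat directly (warming the building, say) looks much more promising than converting it.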
Last updated: 14:25, 20 Oct 2003