> > All wattage consumed is expressed as either heat or is distributed
> > out of the room to other places, where it is eventually turned into
> > heat. The wattage consumed is a very good indicator of the heat
> > produced by the electronics in the room. Outgoing network signals
> > are very low power.
>
> What you are saying, then, is that *all* energy entering a system
> somehow turns into heat without loss. (Lol - actually, heat itself is
> the loss.)
>
> I wish this were true, because from the billing point of view this
> approach suggests that all the work done by the CPU, HDD, optical
> drives etc. is free of charge. Perhaps the real reason we plug
> electronic devices into the mains is to get heat. Furthermore, to
> support global warming.

Well, yeah, it all ends up as heat eventually. Most of the energy going
into a hard drive ends up as heat in the chips & resistors, the rest in
the spinning platters. Friction will cause them to stop eventually:
more heat. Stop building server rooms, you're speeding up the heat
death of the Universe.

Some people get smart and use the 'waste' heat from the server room to
warm the rest of the office. It's not really 'free' heat, as you've
already paid for it, but at least it's not wasted by being pumped out
the window to melt the snow off some trees. I use my PC as a heated
footstool, same concept. Not in summer though.

The most efficient device you have would be your electric kettle. Its
sole purpose in life is to turn electricity into heat, and it does that
very well. It would be 100% efficient except that you lose a little bit
in the cable, but never mind. Plonk a Stirling engine on top to capture
the waste heat; it can power your coffee grinder.

Tony

--
http://www.piclist.com PIC/SX FAQ & list archive
View/change your membership options at
http://mailman.mit.edu/mailman/listinfo/piclist