When I was a young experimenter, I thought the wattage rating on resistors meant they wouldn't get hot at the stated power dissipation level. As time went on, I found that the wattage rating meant something closer to "the device won't be damaged", or maybe "the temperature rise won't put the part out of spec", seeing as how temperature change alters characteristics.

Another eye-opener occurred when I was repairing data terminals, some having a defective rectifier bridge in their power supply. These rectifiers were a serious-looking block of metal, with epoxy fill, and four tabs sticking out. I looked up the rating on these things out of curiosity, and here's what it said:

  Max current (attached to a heat sink): 25A
  Max current (free air): 2A

At which point, "the repair guy was enlightened". And yeah, the defective ones tended to have loose bolts or little or no thermal grease under them.

Thermal physics is interesting to think about, especially as it works a lot like Ohm's law. Mark's right that the size of the thing isn't as important as it looks. Generating heat makes something's temperature rise, and it rises until whatever's carrying the heat away (which goes up with temperature) balances the heat being generated. It can escape through the air, through the leads to the board (then to the air), or onto a heat sink (and then to the air)... but it has to escape somehow.

There were these little metal clips you could put over a small (TO-5, TO-39, etc.) transistor to increase the surface area. Many times that was enough to keep them at a reasonable temperature. Maybe something like that for your diode would be enough. Then again, when the voltage-regulating part gets too hot, that's about when I begin to think, "Maybe I should find a better way to do this..."

-- 
http://www.piclist.com PIC/SX FAQ & list archive
View/change your membership options at
http://mailman.mit.edu/mailman/listinfo/piclist
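(A quick sketch of that Ohm's-law analogy, for anyone who wants numbers: temperature rise = power dissipated x thermal resistance, just as voltage = current x resistance. The theta values below are made-up illustrative figures, not from any datasheet.)

```python
def junction_temp(power_w, theta_ja_c_per_w, ambient_c=25.0):
    """Steady-state device temperature via the thermal Ohm's law:
    T_device = T_ambient + P * theta  (theta in degrees C per watt)."""
    return ambient_c + power_w * theta_ja_c_per_w

# Same 10 W of dissipation, very different outcomes:
free_air = junction_temp(10.0, theta_ja_c_per_w=40.0)  # no heat sink (assumed 40 C/W)
heatsunk = junction_temp(10.0, theta_ja_c_per_w=3.0)   # bolted to a sink (assumed 3 C/W)

print(free_air)  # 425.0 -- cooked, which is why that bridge said "2A in free air"
print(heatsunk)  # 55.0  -- comfortable, hence "25A on a heat sink"
```

Loose bolts or missing thermal grease just add extra thermal resistance in series, which is exactly why those bridges died.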