I've used an instrument to measure the power from a xenon lamp that comes down a piece of fiber optic cable. The instrument has a black disk with a slot cut around it maybe 3/4 of the way out, so the outer rim is joined to the center of the disk only by several "spokes" of the same material. There's a thermocouple across each of the spokes, wired in series to drive a meter. The thermocouple voltage is proportional to the temperature difference between the hot and cold junctions, and that temperature difference is proportional to the power delivered to the center of the disk (and to the thermal resistance of the spokes across which we're measuring it). There's a rough sketch of that arithmetic in the first P.S. below.

This was a dental curing light (used to cure composite fillings in teeth). The xenon lamp's output went through a filter that passed only blue light, so with a 300 W lamp we got about 2 W of light out the end. We got somewhere between 1 and 2 watts from a blue LED driven with 10 W.

This, of course, is a measure of broadband power. Often, though, we're interested in knowing how bright the light looks, taking into consideration the varying sensitivity of the human eye to different wavelengths. The most accurate method of measuring this that I know of is to use a spectrometer to gather a power measurement every 10 nm or so, multiply each reading by the corresponding coefficient of a photopic filter that emphasizes the wavelengths around green, and add up the results. The second P.S. sketches that sum. The instrument is typically calibrated with a $1,000 light bulb (a CIE "A-illuminant," as I recall).

They do this stuff where I work, but I don't directly work on it, so this is just what I've overheard.

Harold
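
P.S. For the thermopile, the chain is V = N*S*dT and dT = P*R_th, so P = V / (N*S*R_th). Here's a rough Python sketch of that; the junction count, Seebeck coefficient, and spoke thermal resistance below are placeholder guesses on my part (a real meter would just carry one calibrated volts-per-watt factor):

# Rough sketch of the thermopile arithmetic. The junction count,
# Seebeck coefficient, and spoke thermal resistance are placeholder
# guesses; a real meter carries one calibrated volts-per-watt factor.

SEEBECK_V_PER_K = 40e-6    # volts per kelvin per junction pair (type K ballpark)
N_JUNCTIONS = 8            # thermocouples in series across the spokes (guess)
R_TH_K_PER_W = 2.0         # thermal resistance, disk center to rim (guess)

def optical_power_watts(meter_volts):
    # V = N * S * dT  and  dT = P * R_th,  so  P = V / (N * S * R_th)
    delta_t_kelvin = meter_volts / (N_JUNCTIONS * SEEBECK_V_PER_K)
    return delta_t_kelvin / R_TH_K_PER_W

# 2 W on the disk center would read 8 * 40e-6 * 2.0 * 2 = 1.28 mV
print(optical_power_watts(1.28e-3))   # -> 2.0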
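
P.P.S. And the photopic sum, also in Python. The V(lambda) table is the standard CIE 1931 photopic luminosity function at 10 nm steps (typed from memory, so double-check it), and the example spectrum is made up just to show the arithmetic:

# Rough sketch of the photopic sum: luminous flux = 683 * sum over
# wavelengths of V(lambda) * P(lambda), with P in watts per 10 nm bin.

KM = 683.0  # luminous efficacy of 555 nm light, lm/W

V_LAMBDA = {  # wavelength (nm) -> V(lambda), CIE 1931 photopic, 10 nm steps
    400: 0.0004, 410: 0.0012, 420: 0.0040, 430: 0.0116, 440: 0.023,
    450: 0.038,  460: 0.060,  470: 0.091,  480: 0.139,  490: 0.208,
    500: 0.323,  510: 0.503,  520: 0.710,  530: 0.862,  540: 0.954,
    550: 0.995,  560: 0.995,  570: 0.952,  580: 0.870,  590: 0.757,
    600: 0.631,  610: 0.503,  620: 0.381,  630: 0.265,  640: 0.175,
    650: 0.107,  660: 0.061,  670: 0.032,  680: 0.017,  690: 0.0082,
    700: 0.0041,
}

def luminous_flux_lumens(spectrum_w):
    # spectrum_w: wavelength (nm) -> optical power in that 10 nm bin (W)
    return KM * sum(v * spectrum_w.get(wl, 0.0) for wl, v in V_LAMBDA.items())

# Example: 2 W spread evenly over the 450-490 nm "blue" bins
blue = {wl: 2.0 / 5 for wl in (450, 460, 470, 480, 490)}
print(luminous_flux_lumens(blue))   # ~146 lm -- blue light counts for little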