I've been pondering adjustable current sources to control LED brightness over the past few days, and my lack of experience with dynamic analog circuits leaves me with a number of questions.

The simplest case I can think of (likely too simple) is to put a capacitor in parallel with the LED/resistor of a typical PWM drive. A resistor of much smaller value would be used, largely as a shunt to feed the voltage drop back through an A->D input. The duty cycle of the PWM output is then adjusted until the voltage drop across the resistor matches the target, which in turn corresponds to the desired current. The most basic question is: does this even work in theory? If yes, will it work in practice, and is it at all efficient? Since I'd want several PWM outputs and they would need to be done in software, would the power drawn by running the uC at the required clock frequencies be greater than the heat dissipated by an analog control transistor? I realize there would be limits to the responsiveness of such a circuit, but I don't think they would matter for LED brightness control. I don't think I would use such a circuit in practice; I'm more interested in finding out whether or not I understand what is going on with the components. (A rough sketch of the control loop I have in mind is at the end of this message.)

An extension of this is something I would be interested in incorporating into an actual design. I'd like to run several groups (channels) of LEDs in series from a single Li-ion cell. This would mean using a boost regulator to convert ~3.6V up to 10-15V as an adjustable constant-current source. Is it reasonable to use a PIC as the switcher here? The rough idea would be to have separate boost circuits (separate inductors) for each channel. Can this be done efficiently in terms of battery consumption, or would I be better off with a dedicated external switcher? (The duty-cycle arithmetic I'm assuming is also sketched below.)

-p.
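
To make the first idea concrete, here is a minimal sketch in C of the software side of that feedback loop, under a pile of assumptions: a 10-bit ADC, a 5V reference, a 1-ohm shunt, and hypothetical PWM_SetDuty()/ADC_Read() hooks that stand in for whatever the real part provides.

/* Sketch of a software constant-current loop: the PWM output drives the
 * LED through the RC filter, a small shunt resistor senses current, and
 * the ADC reading of the shunt voltage is compared against a setpoint to
 * nudge the duty cycle up or down. All hardware names are placeholders. */

#define VREF_MV        5000UL   /* ADC reference, millivolts (assumed)   */
#define ADC_FULL_SCALE 1023UL   /* 10-bit converter (assumed)            */
#define R_SHUNT_MOHM   1000UL   /* 1.0 ohm shunt, in milliohms (assumed) */
#define SHUNT_CHANNEL  0        /* hypothetical ADC channel              */

/* Placeholder hardware hooks -- whatever the real part provides. */
extern unsigned int ADC_Read(unsigned char channel);
extern void PWM_SetDuty(unsigned int duty);

static unsigned int duty;       /* current PWM duty, 0..1023             */

/* Convert a desired LED current (mA) into the ADC count we expect across
 * the shunt: counts = I * R * FULL_SCALE / VREF.                        */
static unsigned int target_counts(unsigned int i_ma)
{
    unsigned long v_shunt_mv = (unsigned long)i_ma * R_SHUNT_MOHM / 1000UL;
    return (unsigned int)(v_shunt_mv * ADC_FULL_SCALE / VREF_MV);
}

/* Called periodically (e.g. from a timer interrupt): simple bang-bang
 * adjustment of the duty cycle toward the current setpoint.             */
void current_loop_step(unsigned int i_ma)
{
    unsigned int setpoint = target_counts(i_ma);
    unsigned int measured = ADC_Read(SHUNT_CHANNEL);

    if (measured < setpoint && duty < 1023)
        duty++;
    else if (measured > setpoint && duty > 0)
        duty--;

    PWM_SetDuty(duty);
}

Note that with a 1-ohm shunt and a 5V reference, 20 mA only produces a handful of ADC counts, which is part of why I'm unsure this is practical without amplification.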
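
For the boost idea, the arithmetic I'm assuming is the ideal continuous-conduction relation Vout = Vin / (1 - D), i.e. D = 1 - Vin/Vout. A throwaway sanity check with made-up numbers for one cell and one string:

#include <stdio.h>

/* Ideal boost-converter duty cycle in continuous conduction:
 * Vout = Vin / (1 - D)  =>  D = 1 - Vin / Vout.
 * Losses in the switch, diode and inductor push the real duty cycle
 * somewhat higher, but this gives the ballpark.                      */
static double boost_duty(double v_in, double v_out)
{
    return 1.0 - v_in / v_out;
}

int main(void)
{
    double v_cell   = 3.6;   /* nominal Li-ion cell voltage (assumed)  */
    double v_string = 12.0;  /* example 3-4 LED series string (assumed)*/

    /* Prints roughly 0.70, i.e. the switch is on ~70% of the time. */
    printf("duty = %.2f\n", boost_duty(v_cell, v_string));
    return 0;
}

So the switching PWM would sit around 70% duty and the current-sense loop would trim it from there, which is what makes me wonder whether a PIC can keep up per channel or whether a dedicated switcher is the sane choice.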