Hi Gerhard

I hate butting in with criticisms online, especially where my understanding
is poor (electronics is a hobby to me...). Still, either I can correct my
understanding, or correct the details on LEDs and voltages, etc. Please see
my comments at the end though...

Gerhard Fiedler wrote:
> Peter Todd wrote:
>
> [stuff snipped]
>
>>>> There is no law that the lower voltage on the bus should be 0V: I can
>>>> imagine a bus with, for instance, signalling levels 9V and 12V.
>>>
>>> Works for downstream only. You have some power to get rid of at the
>>> sender, though.
>>
>> Hmm... such a system would necessitate some sort of voltage regulator at
>> the slaves, unless the voltages were 5V and, say, 3V.
>>
>> What would the advantage be of such a system?
>
> Not sure what Wouter thought of, but for one you want to keep the voltage
> high enough so that the lower voltage plus any possible variations is
> still high enough. Additionally, you may be able to use the higher voltage
> and PWM the LEDs with a reduced on time, and with this reduce the average
> current. In your setup, it seems the processor currents are negligible and
> the major problem is the LED currents. So it may make sense to make the
> voltage as high as possible for the LEDs. The processor supply can be had
> easily and cheaply with a resistor and a zener (a, these days often
> forgotten, "sort of voltage regulator" :).

I understand you to be suggesting that if you increase the voltage supplied
to the nodes, there is a potential saving from a decrease in the average
current supplied to each node, realised by using a PWM scheme to pulse the
LEDs. My understanding is that this does not make sense...

The voltage across a conducting diode, or LED, is *always* (very nearly)
the forward voltage of the diode. The only *voltage* requirement for an LED
to operate is that the supply voltage is greater than the forward voltage.
The only other significant requirement is that the current through the LED
must not exceed its rating.

Take for example a red LED with a 2.5V forward voltage and a 20mA rating.
With a 5V supply you would need a 125R resistor, (5V - 2.5V) / 20mA, to
limit the current to 20mA; the voltage across the LED is still 2.5V (the
forward voltage). Increase the supply to, say, 12V and the equation stays
just as simple: (12V - 2.5V) / 20mA calls for a 475R resistor to limit the
current to 20mA.

If the LED has a higher current rating under pulsed PWM drive then, for
example, at 50% duty cycle you could double the peak current, and to do so
you would simply halve the resistors: about 62R for the 5V supply and about
240R for the 12V supply. In either case the average current along the wires
is identical in the 5V and 12V arrangements.

So there is no advantage in using a higher supply voltage than one that
already works. In fact, there is a distinct disadvantage, because the
series LED resistor has to dissipate more power at the higher voltage: in
the 5V case it dissipates 20mA * 2.5V = 50mW, but in the 12V system it
dissipates 20mA * 9.5V = 190mW.

In my understanding it would be a mistake to drive the LEDs at a higher
voltage unless it cannot be avoided.

Rolf
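
P.S. For anyone who wants to play with the numbers, here is a quick sketch
in C of the arithmetic above. It is nothing but Ohm's law; the figures
(2.5V forward voltage, 20mA, 5V and 12V supplies) are just the example
values from this mail, and the led_resistor() helper is purely my own
invention for illustration, so substitute your LED's datasheet values.

#include <stdio.h>

/* Series resistor and resistor dissipation for a resistor-limited LED.
 * R = (Vsupply - Vf) / I   and   P(resistor) = (Vsupply - Vf) * I.
 * Example values from the mail above; use your own datasheet numbers. */
static void led_resistor(double v_supply, double v_forward, double i_led)
{
    double v_drop = v_supply - v_forward; /* voltage the resistor must drop */
    double r = v_drop / i_led;            /* series resistance, in ohms     */
    double p = v_drop * i_led;            /* power burned in the resistor   */
    printf("%5.1fV supply at %4.1fmA: R = %6.1f ohm, resistor burns %6.1f mW\n",
           v_supply, i_led * 1000.0, r, p * 1000.0);
}

int main(void)
{
    const double vf = 2.5;   /* red LED forward voltage (example) */
    const double i  = 0.020; /* 20mA continuous rating            */

    led_resistor(5.0, vf, i);        /* -> 125.0 ohm,  50mW */
    led_resistor(12.0, vf, i);       /* -> 475.0 ohm, 190mW */

    /* 50% PWM at double the peak current: halve the resistors. Average
     * LED (and wire) current is unchanged; the dissipation printed here
     * is the peak figure, so halve it for the 50% duty-cycle average. */
    led_resistor(5.0, vf, 2.0 * i);  /* ->  62.5 ohm */
    led_resistor(12.0, vf, 2.0 * i); /* -> 237.5 ohm */

    return 0;
}

Note how the resistor soaks up everything above the forward voltage no
matter what the supply is, which is the whole point above: the higher
supply only buys you more heat in the resistor.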