Hi all,

My understanding of analog electronics is next to nothing, but it seems that this knowledge is essential to build a circuit of any sort, or at least a good one. So I have a question.

Let's say there's a power source with variable voltage, and across its terminals is a 10 ohm resistor. The power supply can only deliver a maximum of 1 A. If the voltage is gradually increased to the point where I = V/R > 1 A, what will happen? Won't Ohm's "law" be violated?

And in the case of a battery charger: a NiMH battery should be trickle charged (assume a constant current of a few milliamps, rather than a high-current pulse). The battery has a relatively low internal resistance. How can the current supplied to the battery be controlled precisely when the applied voltage is relatively constant?
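For concreteness, here is a quick sketch of the numbers I have in mind. The 10 ohm load and the 1 A limit are the figures from the first question; the battery internal resistance and the exact trickle current in the second part are only guesses to make the arithmetic concrete, not real data for any particular cell:

    # Sketch of the numbers behind the two questions above.
    # The 10 ohm load and 1 A limit come from the first question;
    # R_INTERNAL and I_TRICKLE below are only illustrative guesses.

    R_LOAD = 10.0    # ohms, resistor across the supply terminals
    I_MAX = 1.0      # amps, the supply's current limit

    # Voltage at which I = V/R first reaches the supply's limit.
    v_crossover = I_MAX * R_LOAD
    print(f"I = V/R reaches {I_MAX} A at V = {v_crossover} V")
    for v in (5, 10, 15):    # illustrative voltage sweep
        print(f"  V = {v:2d} V -> Ohm's law asks for {v / R_LOAD:.1f} A")

    # Battery-charger question: with a small internal resistance, a tiny
    # change in applied voltage means a large change in charging current.
    R_INTERNAL = 0.1      # ohms, assumed NiMH internal resistance (a guess)
    I_TRICKLE = 0.005     # amps, "a few milliamps" of trickle current
    delta_v = I_TRICKLE * R_INTERNAL
    print(f"Only {delta_v * 1000:.2f} mV across that resistance sets a "
          f"{I_TRICKLE * 1000:.0f} mA trickle")

The internal resistance obviously varies from cell to cell, so the last figure is just meant to show the scale of the problem, i.e. how little voltage margin there is to play with.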