Ohm's law can't be violated in this case. The current through the resistor is E/R. So, if you are outputting 10 volts and your load is 10 ohms, the current will be 1 amp. If you're only outputting 5 volts, then you'll draw 0.5 amps through the resistor. If you're outputting 20 volts, theoretically you would draw 2 amps, but the power supply is limited to 1 amp, so you can only draw 1 amp. The voltage will not go above this equilibrium point, and will stay at whatever voltage it takes to maintain the 1 amp current. In this case, 10 volts.

Jim

-----Original Message-----
From: piclist-bounces@mit.edu [mailto:piclist-bounces@mit.edu] On Behalf Of V G
Sent: Tuesday, November 23, 2010 2:45 PM
To: PICLIST
Subject: [EE] Question about constant current and Ohm's law

Hi all,

My understanding of analog electronics is next to nothing, but it seems that to build a circuit of any sort, this knowledge is essential. Well, to build a good circuit, anyway. So I have a question.

Let's say there's a power source with variable voltage. Across its terminals is a resistor of 10 ohms. The power supply can only supply a maximum of 1 amp. If the voltage is gradually increased to the point where I = V/R > 1 amp, what will happen? Will Ohm's "law" not be violated?

In the case of the battery charger example: a NiMH battery should be trickle charged (assume a constant current of a few milliamps, rather than a high-current pulse). The battery has a relatively low internal resistance. How can the current supplied to the battery be controlled precisely when the applied voltage is relatively constant?

--
http://www.piclist.com PIC/SX FAQ & list archive
View/change your membership options at
http://mailman.mit.edu/mailman/listinfo/piclist
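
For anyone who wants to see the arithmetic Jim describes spelled out, here is a minimal sketch in C (my own illustration, not from either post) of a current-limited supply driving a purely resistive load. The function name, the 1 A limit, and the 10 ohm load are assumptions taken from the example above, not a real supply's behavior in every detail:

/* Illustrative sketch only: a supply with a voltage setpoint and a 1 A
 * current limit into a resistive load. Names and values are assumed. */
#include <stdio.h>

#define I_LIMIT 1.0   /* assumed supply current limit, amps */

/* Returns the current that actually flows; writes the voltage the supply
 * settles at into *v_out. If V_set/R would exceed the limit, the supply
 * drops to whatever voltage keeps the current at the limit. */
static double load_current(double v_set, double r_load, double *v_out)
{
    double i = v_set / r_load;   /* Ohm's law: I = E/R */
    if (i > I_LIMIT)
        i = I_LIMIT;             /* constant-current (limit) mode */
    *v_out = i * r_load;         /* equilibrium output voltage */
    return i;
}

int main(void)
{
    double setpoints[] = { 5.0, 10.0, 20.0 };
    for (int k = 0; k < 3; k++) {
        double v;
        double i = load_current(setpoints[k], 10.0, &v);
        printf("set %.0f V -> %.1f A at %.0f V across 10 ohms\n",
               setpoints[k], i, v);
    }
    return 0;
}

Running this prints 0.5 A at 5 V, 1.0 A at 10 V, and for the 20 V setpoint still 1.0 A at 10 V: the supply can't push more than 1 amp through 10 ohms, so it sags to the 10 V equilibrium Jim mentions.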