On 23/11/2010 20:44, V G wrote:
> Hi all,
>
> My understanding of analog electronics is next to nothing. But it
> seems that to build a circuit of any sort, this knowledge is
> essential. Well, to build a good circuit anyway.
>
> So I have a question.
>
> Let's say that there's a power source with variable voltage. Across
> its terminals is a resistor of 10 Ohms. The power supply can only
> supply a maximum of 1 Amp.
>
> If the voltage is gradually increased, to the point where I = V/R > 1
> Amp, what will happen? Will Ohm's "law" not be violated?

No, Ohm's law is not violated. Once the supply hits its 1 A current
limit, it can no longer hold the set voltage: the terminal voltage
collapses to whatever value satisfies V = I*R, which here is 10 V
(1 A x 10 ohms). You simply cannot drive the voltage across the 10 ohm
resistor any higher than 10 V, because the current would then have to
exceed the supply's limit.

You can try this with a bench power supply: with a 1/2 W, 1 k resistor,
apply 20 V and you should measure 20 mA. Then set the current limit to
10 mA and the terminal voltage should drop to 10 V (make sure you don't
exceed the power rating of the resistor).

Simulating this sort of thing in, say, LTspice using its current and
voltage sources can be quite useful for learning purposes.

--
http://www.piclist.com PIC/SX FAQ & list archive
View/change your membership options at
http://mailman.mit.edu/mailman/listinfo/piclist
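As a footnote, the behaviour described above can be sketched in a few lines of code. This is an idealized model of a current-limited supply (the `operating_point` helper is just illustrative, not anything from LTspice or a real instrument):

```python
def operating_point(v_set, i_limit, r):
    """Terminal voltage and current for an idealized current-limited
    supply driving a resistor of r ohms.

    v_set   -- the voltage the supply is set to (volts)
    i_limit -- the supply's current limit (amps)
    r       -- load resistance (ohms)
    """
    i = v_set / r  # current Ohm's law would demand at the set voltage
    if i <= i_limit:
        return v_set, i              # constant-voltage mode
    return i_limit * r, i_limit     # current limiting: V collapses to I*R

# The scenario from the question: 10 ohm load, 1 A limit.
for v in (5.0, 10.0, 15.0, 20.0):
    vt, i = operating_point(v, 1.0, 10.0)
    print(f"set {v:5.1f} V -> terminal {vt:5.1f} V, {i:.2f} A")
```

Raising the set voltage past 10 V changes nothing at the terminals: the supply just sits at 10 V and 1 A, so I = V/R holds throughout.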