Remember that although you may not have an 'ohmic' or conductive path to ground, you are likely to be capacitively coupled to ground. When measuring the voltage from yourself to the 230V line with a high-impedance voltmeter you may see a small voltage. This is because you have created an AC voltage divider with several elements: the voltmeter (1 or 10 Mohm typical impedance), you (? ohms), and the capacitive coupling between you and ground (? ohms at 60 Hz).

Now, assuming for the sake of concrete numbers:

Rmeter = 1 Mohm
Ryou   = 0 ohm (worst case - highest current)
Rcap   = 19 Mohm
V      = 240V

Then your meter would show a voltage of:

1M / (1M + 19M) * 240V = 12V

Oh, and the current would be 240V / 20 Mohm = 12 microamps (should be quite safe).

Now, you can test this theory by placing a 1M resistor in parallel with the voltmeter. The voltage read on the meter should then be:

500K / (500K + 19M) * 240V = 6.15V

And this time the current is 240V / 19.5 Mohm = about 12.3 microamps.

Now, as far as your burning the LED lead: in order to get enough current to do that, you must have accidentally connected with a real ground somewhere, I would think. Please describe in more detail the physical arrangement of your experiment.

Bob Ammerman
RAm Systems
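The divider arithmetic above can be sketched in a few lines of Python. This is purely illustrative; the function and variable names are mine, and the component values are the worst-case assumptions from the post (Rmeter = 1 Mohm, Ryou = 0 ohm, Rcap = 19 Mohm, V = 240V):

```python
V_LINE = 240.0   # volts, line voltage assumed in the post
R_CAP = 19e6     # ohms, assumed impedance of the body-to-ground capacitive coupling
R_METER = 1e6    # ohms, typical DMM input impedance

def divider(r_meter: float, r_cap: float, v_line: float):
    """Return (meter reading in volts, loop current in amps) for a
    series divider: line -> body (0 ohm) -> meter -> capacitive coupling -> ground."""
    v_meter = r_meter / (r_meter + r_cap) * v_line
    i_loop = v_line / (r_meter + r_cap)
    return v_meter, i_loop

# Case 1: meter alone across you-to-line
v1, i1 = divider(R_METER, R_CAP, V_LINE)      # ~12 V, ~12 uA

# Case 2: 1 Mohm resistor in parallel with the meter -> 500 kohm effective
r_parallel = (R_METER * 1e6) / (R_METER + 1e6)
v2, i2 = divider(r_parallel, R_CAP, V_LINE)   # ~6.15 V, ~12.3 uA
```

Swapping in different guesses for R_CAP shows why the reading is so sensitive to how well you are coupled to ground.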