Maybe I'm doing something wrong, but here's what I'm trying to do: I'm trying to measure the voltage of a battery that can go as high as 16V. I'm using RA0 set as an analog input for the A/D, with Vdd as the reference. To reduce the voltage to compatible levels, I've connected the following voltage divider: from the + side of the battery, a 4.7K and then a 2.0K in series going to the - side.

Let's say the battery has 12V across its poles. I put my multimeter's + probe between the two resistors and the - probe on the - of the battery. It reads about 3.6V, perfect. Now I connect my circuit's ground to the ground of the battery being measured: still 3.6V.

The problem happens when I connect my RA0 pin to the divider (the same spot where my probe is). Immediately the voltage drops to 2.65V... why is that? 1V doesn't seem like a lot, but when I do the math to output the real battery voltage, of course it doesn't match anymore.

What am I doing wrong? I measured the voltage across RA0 and ground; it is not zero. I figured that perhaps it is because the pin is set as input and is floating? Help is very much appreciated.

Padu

--
http://www.piclist.com PIC/SX FAQ & list archive
View/change your membership options at
http://mailman.mit.edu/mailman/listinfo/piclist
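For reference, the divider math described above can be sketched like this. The resistor values (4.7K / 2.0K) are from the post; the 10-bit A/D resolution and Vdd = 5.0V are assumptions, since the post doesn't state them:

```python
# Voltage-divider math for the battery-measurement setup described above.
# Assumed (not stated in the post): 10-bit A/D (0..1023), Vdd = 5.0 V.

R1 = 4700.0  # ohms, from battery + to the RA0 node
R2 = 2000.0  # ohms, from the RA0 node to battery -

def divider_out(v_bat):
    """Expected voltage at the R1/R2 junction for a given battery voltage."""
    return v_bat * R2 / (R1 + R2)

def battery_voltage(adc_counts, vdd=5.0, adc_max=1023):
    """Recover the battery voltage from a raw A/D reading."""
    v_pin = adc_counts / adc_max * vdd
    return v_pin * (R1 + R2) / R2

print(divider_out(12.0))  # ~3.58 V, matching the multimeter reading
```

With a 16V maximum battery voltage, this divider puts about 4.78V on the pin, just inside a 5V reference range; the 2.65V reading reported above is what no longer fits this math.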