Had this problem a while ago and just didn't care because the application wasn't too important, but I'm coming up on a project in which temperature measurements will need to be within a few degrees...

For this test, I am using an LM35 (Centigrade temp sensor, 10mV/C output) and a 16F870's 10-bit ADC, and spitting the value out onto an LCD. I consistently get around 10C (8-10C in the range I can test) less than the actual output (by probing). Say, for instance, if I probe and get 0.334V (334mV), that is 33.4C, but my output is around 22C. That's just not right.

I'm wondering if the built-in ADCs are just plain horrible? Or am I doing something wrong? My scaling is as follows:
  0-5V for the voltage range (Vdd as reference)
  0-1023 for the input value range (10-bit ADC)
so it's simply (adcvalue * 5) / 1024 -> actual voltage, correct?

The output does of course rise when I heat the sensor and drop back down as it cools, so it's definitely getting *some* value from the ADC. I don't know of any offset register in the PIC; there are only 4 registers for the ADC in general, and none mention an offset I would need to set up.

Now, I realize I could use a more accurate reference, or a lower reference voltage to gain resolution, but right now I just want to get this working and move from there, and honestly, it should be much, much more accurate than this. Ideas? Pitfalls?

-- Nick Veys | nick@veys.com | www.veys.com
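
P.S. In case it helps, here is roughly what the conversion boils down to in C (a minimal sketch; read_adc() and the helper name are made up for illustration, assuming a right-justified 10-bit result and the 5V Vdd reference):

    /* Hypothetical helper: returns the 10-bit conversion result
       (0..1023) from ADRESH:ADRESL, right-justified. */
    unsigned int read_adc(void);

    /* Convert a raw ADC reading to tenths of a degree C for the LCD.
       With a 5V reference: millivolts = adcvalue * 5000 / 1024.
       The LM35 puts out 10mV per degree C, so the millivolt count is
       numerically the temperature in tenths of a degree
       (e.g. 334mV -> 33.4C). Integer math only, no floats. */
    long adc_to_tenths_degC(unsigned int adcvalue)
    {
        return ((long)adcvalue * 5000L) / 1024L;
    }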