I have to apologize for a silly mistake in my post of Sunday. In my eagerness to give a simple example, I divided an odd number by two and got an extra digit of resolution :-). What I got, of course, was an extra _bit_ of resolution. And it's not 10 readings per decimal digit, either: for uncorrelated noise, the useful resolution of an average scales as the square root of the number of readings (equivalently, the extra bits grow as half the log of the reading count), so it takes about 100 averaged readings to gain another decimal digit.

john perry
jperry@norfolk.infi.net
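
P.S. For anyone who wants to check the square-root law for themselves, here's a quick Monte Carlo sketch in Python. It's not from my earlier post; the true value, noise level, and trial count are made-up numbers for illustration, and it assumes independent Gaussian noise on each reading. It measures the spread of an n-reading average empirically and compares it with the sigma/sqrt(n) prediction -- note that n = 100 buys roughly 3.3 bits, i.e. about one decimal digit.

import math
import random

random.seed(1)         # reproducible runs

TRUE_VALUE = 2.37      # hypothetical quantity being measured
NOISE_SD = 0.5         # hypothetical std dev of a single reading's noise
TRIALS = 5000          # Monte Carlo trials per averaging depth

def one_reading():
    """One noisy reading: true value plus Gaussian noise."""
    return TRUE_VALUE + random.gauss(0.0, NOISE_SD)

def averaged_reading(n):
    """Average of n independent readings."""
    return sum(one_reading() for _ in range(n)) / n

for n in (1, 4, 16, 100):
    # Empirical std dev of the n-reading average across many trials.
    vals = [averaged_reading(n) for _ in range(TRIALS)]
    mean = sum(vals) / TRIALS
    sd = math.sqrt(sum((v - mean) ** 2 for v in vals) / TRIALS)
    # Resolution gained relative to one reading, in bits; theory says
    # sd shrinks like 1/sqrt(n), so the gain is 0.5 * log2(n) bits.
    extra_bits = math.log2(NOISE_SD / sd)
    print(f"n={n:4d}  sd={sd:.4f}  (theory {NOISE_SD / math.sqrt(n):.4f})"
          f"  extra bits ~ {extra_bits:.2f}  (theory {0.5 * math.log2(n):.2f})")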