Thomas Sefranek wrote:
> I think you have things backwards...
> You can not add range to an A2D with an amplifier.
>
> You want the BEST linearity in your amplifier...
> Specify what signal you intend to amplify.

hmmm. I think the point is not exactly "range", but "resolution", and that is what I believe he is looking for.

Suppose you have a signal to measure that can swing from zero to 1000mV, and suppose your 10-bit unipolar ADC reference voltage is 2000mV. It means that when the signal is at span, your ADC would be using only about half of its codes, around (01 1111 1111) 01FFh. So in reality your system is using only 9 bits for the whole range of the input signal, with a resolution of 1.953mV.

Now suppose you change the ADC to a unipolar 12-bit one, with the same reference voltage of 2000mV. When reading the 1000mV signal, even though this 12-bit ADC is only using 11 of its bits, it would be generating around (0111 1111 1111) 07FFh, which is 2 more bits (4 times better), or a resolution of 488uV.

Well, if you go back to the 10-bit ADC and amplify your input signal by 2, then at the input signal span the ADC would be receiving 2000mV, so it would be using all its 10 bits, doubling the resolution to 976uV compared to the original 1.953mV without the front-end amplifier.

The expression "ADC range", for me, means what the ADC can embrace with its bits, as a relation between the reference voltage and the input signal range. If you can extract more or fewer bits from an ADC by using a front-end amplifier, then you are changing the ADC range. Of course, it can also be done by just changing the reference voltage, whenever possible. That is exactly what "auto-range" measurement units do: they change either the input signal gain or the reference voltage.

We did it in the past with an NTC metering unit. The NTC "log" curve can be separated into 3 parts: steep, linear and long. Depending on how you amplify each of these parts, you can get a better transfer ratio between the signal and the output resolution; I mean, the closer to a 45-degree curve, the better. Our equipment offered an auto-range amplifier with 3 levels, selected by software, so at any point of the curve we had the best possible transfer ratio. We adapted the ADC range to different points of the NTC curve.

About the amplifier linearity: at low resolution (10 or 12 bits) and low processing speed, I would not worry at all. Any regular operational amplifier, the LM324 for example, would present a linearity much better than the ADC resolution error. At any point in the measured signal, the 976uV resolution, let's say 1mV, will only discriminate the signal in those steps. If the measured signal is 100.3mV, then amplified by 2 it would be 200.6mV, and the binary resolution would show it as 200mV or 201mV. The LM324 (not one of the best) would not generate a linearity error bigger than the 600uV or 400uV introduced by the ADC's coarse resolution.

We usually look for the BEST linearity in amplifiers when dealing with 24-bit ADCs, as we do in our equipment. At that level the resolution is pretty high, around 12nV, and there, yes, overall circuit linearity is very important, including of course the operational amplifiers.

Wagner Lipnharski - email: wagner@ustr.net
UST Research Inc. - Development Director
http://www.ustr.net - Orlando Florida 32837
Licensed Consultant Atmel AVR
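
To make the arithmetic above concrete, here is a small C sketch of my own (not from the original post); it only reproduces the LSB and gain math for the three cases discussed: 10 bits, 12 bits, and 10 bits with a gain-of-2 front end, all with a 2000mV reference and a 0..1000mV signal. The exact codes it prints (0x200, 0x800, 0x3FF) differ by one count from the "about 01FFh / 07FFh" quoted above, which is just rounding; the point is the extra bits.

    /* Sketch only: LSB = Vref / 2^bits, code at span = (Vin * gain) / LSB,
       input-referred resolution = LSB / gain. */
    #include <stdio.h>

    static void show(double vref_mV, int bits, double gain, double vin_mV)
    {
        long   max_code = (1L << bits) - 1;
        double lsb_mV   = vref_mV / (double)(1L << bits);   /* one ADC step at the pin */
        long   code     = (long)((vin_mV * gain) / lsb_mV); /* code at signal span     */
        if (code > max_code)
            code = max_code;                                /* ADC saturates at span   */
        printf("%2d bits, gain %.0f: LSB = %.3f mV, code at %4.0f mV in = 0x%03lX, "
               "input-referred resolution = %.3f mV\n",
               bits, gain, lsb_mV, vin_mV, code, lsb_mV / gain);
    }

    int main(void)
    {
        show(2000.0, 10, 1.0, 1000.0);  /* ~half the codes used, 1.953 mV steps */
        show(2000.0, 12, 1.0, 1000.0);  /* 2 more bits, 0.488 mV steps          */
        show(2000.0, 10, 2.0, 1000.0);  /* full 10-bit span, 0.977 mV steps     */
        return 0;
    }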
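
And a rough sketch of the kind of 3-level software auto-range described for the NTC unit. The function name, the gain ladder and the thresholds here are all invented for illustration (the original hardware is not described in that detail); the only idea taken from the post is: raise the gain while there is headroom, drop it before the ADC saturates, so the reading always sits near the top of the ADC range.

    /* Hypothetical auto-range helper; gain_steps, thresholds and the PGA
       interface are placeholders, not the original design. */
    #include <stdint.h>

    #define ADC_MAX         1023u                 /* 10-bit unipolar ADC        */
    #define HIGH_THRESHOLD  (ADC_MAX * 9u / 10u)  /* near span: reduce gain     */
    #define LOW_THRESHOLD   (ADC_MAX * 4u / 10u)  /* lots of headroom: raise it */

    static const uint8_t gain_steps[3] = { 1, 4, 16 };   /* placeholder ladder  */

    /* Given the last raw reading taken at gain_steps[current], return the
       gain index to program into the (hypothetical) front-end PGA for the
       next conversion. */
    static uint8_t next_gain_index(uint16_t raw, uint8_t current)
    {
        if (raw >= HIGH_THRESHOLD && current > 0)
            return current - 1;        /* signal too large for this range */
        if (raw <= LOW_THRESHOLD && current < 2)
            return current + 1;        /* signal small: use more gain     */
        return current;                /* already in the best range       */
    }

The gap between the two thresholds acts as hysteresis, so the unit does not hunt back and forth between two gain ranges when the signal sits near a range boundary.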