Hi all,

I'd like to select a scheme for digitizing an analogue signal that contains some Gaussian noise, and I'd like to eliminate that noise somehow. I know that if I take 100 samples of the noisy signal and average them, the signal-to-noise ratio improves by a factor of sqrt(100) = 10 (assuming the noise is white and the samples are uncorrelated).

The analogue signal to be digitized has a bandwidth of about 10 Hz. Sometimes it may be wider, but only the lowest 10 Hz is of interest. The signal swings over 0...150 mV and rides on a fixed offset somewhere between 0 and 5 V. The Gaussian noise has a density of 325 microvolts per root hertz.

If I used an ADC with a one-pole low-pass filter at 10 Hz, the equivalent noise bandwidth would be 1.57 * 10 Hz, so the noise voltage would be

    (325 uV/rtHz) * sqrt(1.57 * 10 Hz) = 1.29 mV RMS

So about 1.3 mV RMS would be added to my analogue signal? I'd like to reduce this noise down to 50 microvolts. If I want to do it with the same one-pole RC filter, the noise bandwidth has to shrink by a factor of (1290/50)^2, roughly 660, which puts the 3 dB point at about 0.015 Hz. The reciprocal of 0.015 Hz is about 66 seconds. Does this time mean something? Do I have to wait 66 seconds (or several RC time constants?) to reach the desired accuracy?

How should I choose the ADC and the antialiasing filter to reduce the noise? How about an FFT? Or an RMS-to-DC hardware converter?

If you can't answer, please recommend someone who can.

Laszlo Cser
cserl@freemail.hu
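
P.S. To make the arithmetic easier to check, here is a small Python sketch that just redoes the numbers above. The 325 uV/rtHz density, the single-pole response, and the 50 uV target are from my description; the 0.1% settling criterion at the end is only an assumption I picked for illustration.

import math

e_n    = 325e-6   # noise density, V/sqrt(Hz)
f_c    = 10.0     # one-pole corner frequency, Hz
target = 50e-6    # desired RMS noise, V

# A single pole passes white noise over an equivalent brick-wall
# bandwidth of (pi/2) * f_c = 1.57 * f_c.
enbw  = (math.pi / 2) * f_c
v_rms = e_n * math.sqrt(enbw)
print(f"noise with a 10 Hz pole: {v_rms * 1e3:.2f} mV RMS")      # ~1.29

# Noise bandwidth needed for the 50 uV target, and the
# corresponding corner frequency and RC time constant.
enbw_needed = (target / e_n) ** 2
f_c_needed  = enbw_needed / (math.pi / 2)
tau         = 1.0 / (2 * math.pi * f_c_needed)
print(f"required 3 dB corner: {f_c_needed * 1e3:.1f} mHz")       # ~15
print(f"RC time constant:     {tau:.0f} s")                      # ~11

# Settling to within 0.1% of a step takes ln(1000) ~ 6.9 time
# constants (0.1% is an assumed accuracy target, not a given).
print(f"settling to 0.1%:     {tau * math.log(1000):.0f} s")     # ~73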
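P.P.S. And a quick Monte-Carlo check of the averaging claim, again assuming white Gaussian noise and uncorrelated samples; the 1.29 mV per-sample figure is the one computed above.

import numpy as np

rng = np.random.default_rng(0)
noise_rms = 1.29e-3        # RMS noise of one raw sample, volts
N = 100                    # samples per averaged reading

# 100,000 averaged readings, each the mean of N noisy samples.
raw = rng.normal(0.0, noise_rms, size=(100_000, N))
avg = raw.mean(axis=1)

print(f"raw noise:      {raw.std():.3e} V RMS")
print(f"averaged noise: {avg.std():.3e} V RMS")
print(f"improvement:    {raw.std() / avg.std():.1f}x (expect sqrt(100) = 10)")

With 100 averages the improvement comes out very close to 10x, which is where the sqrt(N) rule of thumb comes from.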