> There has been some discussion on the list of various
> forms of eliminating bad data by averaging or using a
> median filter.
>
> One interesting method of improving an average is to
> throw away the highest and the lowest values.
> I call this the Olympic Method,

Interesting! I don't want to trash your idea, or your OlymPIC efforts at solving this problem, but here are some points about averaging that I've learned the hard way, by writing code that doesn't work.

Here's the other problem with averaging. Let's try a few randomly selected data points, and use the Olympic method (rumor has it that's also how they do sealed bids on construction jobs in Great Britain - Is that true, Brits?):

120, 150, 180, 100, 80, 250, 240, 120, 120, 110, 130, 120, 130, 140, 90, 80, 85, 111

We throw out the highest and lowest values, 250 and 80, then average the rest, getting 126.625. PICs, though, don't usually have the luxury of floating point math. The answer in a PIC is 126.

In a real application following a setpoint, this error can add up. The PIC's answer is ALWAYS less than the real floating point answer, so taking a moving average tends to "wind down". If you usually get an error of .5 LSB, then you'll wind down to zero after 512 measurements if you are trying to take a moving average. There are various methods to get moving averages, like weighting the old data vs. new data, adding the old data in as another data point, etc. etc. etc. The result is usually the same after a long period of time. I wrote a fine application that would always float down to the bottom of its setpoint range. Further attempts at writing rounding code, or doing more digits of math, introduce extra complexity and needless headaches.

Meanwhile, the median filter would pick out a value of 120 from this dataset, and would continue to pick out values near there. If we really see that many values at 120, that may be the "real" number we are looking for.
In my earlier extreme example, one screwy number can skew an average so that it does not look very much like the data being measured. Throwing out the highest number helps a little, and you can add more and more complexities to a "simple" averaging scheme until you are using floating point math and a Cray supercomputer - and you'll still get wrong answers. I maintain that 120 is a better picture of this data than 126.

-- Lawrence Lile

"An Engineer is simply a machine for turning coffee into assembler code."

Download AutoCad blocks for electrical drafting at:
http://home1.gte.net/llile/index.htm