If I understand the question properly, then I'm not sure this problem can be solved accurately. For example, if I run the logger for 1 second, I expect to have 60 samples spaced 16.7 ms apart. If I run it for 1.4 seconds, I expect 60 samples spaced 23.3 ms apart. If I run it for 2 seconds, I expect 60 samples spaced 33.3 ms apart, and so on.

How could you maintain an even sampling rate against something that is not constant? Time, in this case. If the operator can turn the logger off at ANY given time, how could there be 60 samples waiting for you when the logging interval was never known? If I start off with a 60-sample time frame of 100 ms, what is the next time frame? 101 ms? 110 ms? 200.000054 ms? Double the first? If I discard old samples that do not fit the current sample rate, they are lost forever, yet I may need some of them to match a future sample rate, or I may not. It sounds like an exploding database may form here, with the excruciating task of sorting it out on the fly.

Tony

Just when I thought I knew it all, I learned that I didn't.
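P.S. One way to sidestep the "exploding database" is to not resample on the fly at all: log everything at the logger's own fixed rate, and only when the operator hits stop, interpolate the raw buffer down onto 60 evenly spaced points. Here's a minimal sketch of that idea (the `resample` helper and the fake logged data are mine, just for illustration; this assumes you have RAM to keep the raw samples):

```python
def resample(samples, n):
    """Linearly interpolate a list of (time, value) pairs, sorted by
    time, onto n evenly spaced points spanning the same time range."""
    if len(samples) < 2 or n < 2:
        raise ValueError("need at least two samples and two output points")
    t0, t_end = samples[0][0], samples[-1][0]
    step = (t_end - t0) / (n - 1)
    out = []
    j = 0  # index of the raw segment currently containing t
    for i in range(n):
        t = t0 + i * step
        # advance until samples[j]..samples[j+1] brackets t
        while j < len(samples) - 2 and samples[j + 1][0] < t:
            j += 1
        (ta, va), (tb, vb) = samples[j], samples[j + 1]
        frac = (t - ta) / (tb - ta)
        out.append((t, va + frac * (vb - va)))
    return out

# Pretend the logger ran 1.4 s at a fixed 10 ms rate (141 raw samples),
# then the operator stopped it; collapse the buffer to 60 points.
raw = [(i * 0.01, float(i)) for i in range(141)]
evenly_spaced_60 = resample(raw, 60)
```

The trade-off is that the fixed raw rate caps your time resolution: a run shorter than 60 raw-sample periods gives you fewer real data points than output points, so the 60 values are interpolated, not measured.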