Hmmm... no, I wasn't thinking that (that it has to be synchronized). FWIW, I got this working, but I get occasional errors if the sampling period is any longer than 1/9th the bit period of the data. I suspect that most of this is due to latency from other interrupt routines in the code. It's not a problem for now, as there's no other significant processing in the code, but I'm cleaning up (and breaking up) some of that other code now.

Thanks.

Cheers,
-Neil.


Dave Tweed wrote:
>
> I think you missed the point of the scheme I'm outlining. It doesn't
> require the sampling clock to be synchronized with the data at all.
> The only requirement is that the sampling period be less than the time
> intervals you're trying to resolve, with sufficient margin based on
> things like CPU clock accuracy and sampling jitter.
>
> -- Dave Tweed
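
For reference, here is a minimal C sketch of the edge-timing scheme Dave describes, not code from the actual project. It assumes a periodic timer interrupt already configured to fire SAMPLES_PER_BIT times per nominal bit time; read_rx_pin() and store_bit() are placeholder names for the pin read and the downstream decoder, and the idle-high line level is an assumption:

    #include <stdint.h>

    /* Placeholder hardware hooks -- made up for this sketch, not
     * from any particular PIC header or the poster's code. */
    extern uint8_t read_rx_pin(void);     /* current input level, 0 or 1 */
    extern void    store_bit(uint8_t b);  /* consumes one decoded bit    */

    #define SAMPLES_PER_BIT 9   /* ISR ticks per bit time; well above 2x,
                                   leaving margin for clock error/jitter */

    static uint8_t  last_level = 1;  /* assumes an idle-high line */
    static uint16_t ticks;           /* sample periods since last edge */

    /* Called from the timer interrupt at the sampling rate. No
     * synchronization with the data is needed: run lengths between
     * edges are simply rounded to whole bit times. */
    void sample_isr(void)
    {
        uint8_t level = read_rx_pin();

        ticks++;

        if (level != last_level) {
            /* Round the run length to the nearest number of bit
             * times; that is how many identical bits went by
             * before this edge. */
            uint16_t nbits =
                (ticks + SAMPLES_PER_BIT / 2) / SAMPLES_PER_BIT;

            while (nbits--)
                store_bit(last_level);

            last_level = level;
            ticks = 0;
        }
    }

At 9x oversampling, each edge is located to within half a sample period of the true transition, which is exactly the margin Dave mentions for CPU clock accuracy and sampling jitter -- and why latency from other interrupt routines, as Neil observed, eats directly into that margin.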