> Anyway, if I understand correctly, you set a timer for some value
> smaller than the length of a bit, then sample the PIC every interrupt,
> right? But what if my timer interrupt lands right on a port change or
> something? Also I don't understand the INTF statement.

Well, first you need to know the format and timing of the data coming in. If it's a regularly timed data stream (i.e. the same length for a "0" or a "1"), then the convention is to sample each bit at least three times, although if you're confident about the timing at both ends (Tx and Rx) you could sample just once, right in the middle of the bit. This does leave you open to errors, though.

To keep it simple, say a data bit length is 6ms and you want to sample each bit 3 times. After the first edge detection (which is where INTF comes into it), you wait 1ms, sample, wait 2ms, sample, wait 2ms, sample. You then have 3 equidistant samples (at 1ms, 3ms and 5ms) which, if you have to, can be checked for a majority decision in the case of a noisy environment, or if you just want to be absolutely sure. Many micros sample 16 or more times per bit and then take a majority decision.

Getting the PIC's code timing in sync with the data stream should mean that nothing unexpected will happen whilst you are sampling. If the data is that important, it should be given the highest priority you can.

--
http://www.piclist.com hint: To leave the PICList
mailto:piclist-unsubscribe-request@mitvma.mit.edu