Here is how my software UARTs work, assuming a 10-bit frame: I trigger on the falling edge of the INT pin, so I get an interrupt. The edge is serviced immediately upon entry into the interrupt routine, so the software delay before the service code runs is known. At that point I set the bit count to 10, start TIMER1, and preset TIMER1 to a value that will cause its first interrupt to occur right in the middle of the start bit; then I turn OFF the INT pin (I only use it to locate the beginning of the serial frame).

Every time TIMER1 interrupts, I immediately capture the data bit state, which occurs approximately in the dead center of the bit, and count down the bit counter. When all 10 bits have been captured, I make sure the FIRST bit is LOW and the LAST bit is HIGH, then extract the 8 bits in between. I immediately turn off TIMER1 and turn the INT pin back on.

Because there is a gap from the middle of the stop bit until the next start bit begins, a small amount of timing error can easily be handled if a small EXTRA delay is issued. If it is NOT added, I might miss the start of the next serial frame.

--Bob

William Chops Westfield wrote:

> On Feb 3, 2005, at 9:25 AM, Bob Axtell wrote:
>
>> from [start bit] on through the frame until the end of the STOP bit,
>> timing is precise. The receiver UART locks on at the beginning of the
>> START bit and subjects the remaining bits to a rigid timing test until
>> the end of the STOP bit, then the UART is released from its rigid
>> timing.
>
> Presumably the "problem" is with software uarts that check the full stop
> bit, and don't have time to "unsynchronize" before the next start-bit
> starts.
>
> BillW

--
Note: To protect our network, attachments must be sent to
attach@engineer.cotse.net
1-866-263-5745 USA/Canada
http://beam.to/azengineer
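For illustration, here is a minimal C sketch of the receive scheme described at the top of this message. It is not Bob's actual code: every register hook and macro name (INT_ENABLE, TMR1_START, READ_RX_PIN, the preset constants, and so on) is a placeholder for whatever the target part really provides, and the preset values depend on the oscillator frequency and baud rate.

/*
 * Sketch of the interrupt-driven software UART receiver described above.
 * All hardware hooks below are assumed placeholders, not a real device API.
 */
#include <stdint.h>

#define FRAME_BITS 10                        /* start + 8 data + stop        */

/* --- hypothetical hardware hooks (assumptions) --------------------------- */
extern void    INT_ENABLE(void);             /* arm falling-edge INT pin     */
extern void    INT_DISABLE(void);            /* disarm INT pin               */
extern void    TMR1_START(uint16_t preset);  /* load TIMER1 and start it     */
extern void    TMR1_RELOAD(uint16_t preset); /* reload for the next bit time */
extern void    TMR1_STOP(void);
extern uint8_t READ_RX_PIN(void);            /* current line state, 0 or 1   */

/* HALF_BIT (minus the known ISR entry latency) lands the first TIMER1
 * interrupt mid-start-bit; ONE_BIT spaces later samples one bit apart.     */
extern const uint16_t TMR1_PRESET_HALF_BIT;
extern const uint16_t TMR1_PRESET_ONE_BIT;

static volatile uint8_t  bit_count;          /* bits still to capture        */
static volatile uint16_t shift_reg;          /* raw 10-bit frame, LSB first  */
static volatile uint8_t  rx_byte;            /* last good data byte          */
static volatile uint8_t  rx_ready;           /* set when rx_byte is valid    */

/* Falling edge on INT: start of a frame. */
void int_pin_isr(void)
{
    bit_count = FRAME_BITS;
    shift_reg = 0;
    TMR1_START(TMR1_PRESET_HALF_BIT);   /* first tick hits mid-start-bit    */
    INT_DISABLE();                      /* INT only locates the frame start */
}

/* TIMER1 interrupt: sample one bit at (approximately) its center. */
void timer1_isr(void)
{
    uint8_t bit = READ_RX_PIN();        /* capture the line state first     */
    TMR1_RELOAD(TMR1_PRESET_ONE_BIT);   /* next tick lands mid-next-bit     */

    shift_reg >>= 1;                    /* data arrives LSB first           */
    if (bit)
        shift_reg |= (1u << (FRAME_BITS - 1));

    if (--bit_count == 0) {
        TMR1_STOP();
        /* bit 0 = start (must be LOW), bit 9 = stop (must be HIGH) */
        if (((shift_reg & 0x001u) == 0) && (shift_reg & 0x200u)) {
            rx_byte  = (uint8_t)(shift_reg >> 1);   /* the 8 data bits      */
            rx_ready = 1;
        }
        INT_ENABLE();                   /* re-arm for the next start bit    */
    }
}

A main loop would simply poll rx_ready and pull bytes out of rx_byte. Since the last sample is taken in the middle of the stop bit, the half-bit of idle line that follows (plus any inter-frame gap) is the margin available for re-enabling INT before the next falling edge.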