> Oh yes, I see now. Unfortunately, it doesn't generate an interrupt.
> It's going to set about 100 usec after the last interrupt. (I'm doing
> 115200 baud.) That's too long to just spin and wait. (My nodes are
> pretty busy.)
>
> But the receiver (who's going to send the next message) does get an
> interrupt, and will be putting something on the wire very quickly,
> within a few microseconds.
>
> It's looking like a timer of about 100 usec is the best option. I
> would start the timer when I get the interrupt for the last
> transmitted byte, and disable the transmitter when the timer expires.

Well, here is another idea, based on my previous one:

In the transmitter:

(1) Send an additional byte at the end of each 'packet' (in response to TXIF).

(2) One BIT (not byte!) time later, disable the transmitter.

You would still need to delay between steps one and two, but only for about 10 usec instead of 100. Also, the exact delay is not particularly important, as long as it is long enough to ensure the entire start bit is transmitted before the transmitter is turned off. This might let you do some other work while waiting to disable the transmitter.

In the receiver:

(1) Expect an additional byte at the end of the packet. You will see a proper 'start' bit because of the delay between steps one and two above. The receiver may or may not detect a framing error, depending on the idle state of the link when no transmitter is active. This can be prevented by adding appropriate biasing resistors: a pulldown on one RS485 data line and a pullup on the other.

(2) Immediately after receiving the RXIF status for the additional byte, the receiver can enable its own transmitter and start blasting away.

In other nodes ('watchers'):

Other systems on the wire will see a continuous stream of bytes.

Bob Ammerman
RAm Systems

--
http://www.piclist.com PIC/SX FAQ & list archive
View/change your membership options at
http://mailman.mit.edu/mailman/listinfo/piclist