>> Hrm, this implies that to communicate without regard to clock
>> accuracy one could reduce to one bit per word, and then the
>> receiver could determine the rate by measuring a start bit time
>> ... is there a name for this form of protocol?

I've almost always heard it called "auto baud".

> I don't know how you can measure the start bit time if the 1st
> and or subsequent data bits are the same state.

In asynchronous serial communications, the stop bit forces a
transition at the leading edge of the start bit.  For it to work
properly, there also has to be a transition at the trailing edge of
the start bit.  So the low-order bit of the first character sent (the
one used to measure the start bit width) has to be a 1.  Conveniently,
the US ASCII character carriage return (0x0D) has this property.

There also has to be some mechanism to re-enable the automatic baud
rate detection when the transaction, session, time limit, or whatever
is complete.

Lee Jones

--
http://www.piclist.com hint: To leave the PICList
mailto:piclist-unsubscribe-request@mitvma.mit.edu