>> I have a potential application that must cope with 2 stop bits

> A "stop bit" is nothing more than a little bit of guaranteed idle time
> between characters.  It is included in the data because otherwise
> accurate interpretation of long strings of zero bytes would be
> impossible.  For example, the sequence of bytes [$00 $00 $00] would be
> sent as
>
>    "...11111110000000000000000000000000001111..."   [27 zeroes]
>
> whereas the sequence of bytes [$00 $00 $80] would be sent as
>
>    "...1111111000000000000000000000000001111..."    [26 zeroes]
>
> Accurate discrimination between [27 zeroes] and [26 zeroes] is not
> practical, and discrimination between, say, [270 zeroes] and
> [269 zeroes] would be all but impossible.

All my comments assume we're talking about serial lines using an EIA-232
(formerly RS-232), -422, -423, -485, current loop or similar interface.

Assuming that your diagrams are showing the voltage levels on an
_asynchronous_ serial line, your statements are wrong.

For async serial, you have completely ignored the start bits.  Your
sample of three all-zeros octets [$00 $00 $00], sent as a consecutive
block (no inter-character delay) on an async serial line, would appear
as follows:

   ...MT00000000PT00000000PT00000000PM...      <-- 1 stop bit
   ...MT00000000PPT00000000PPT00000000PPM...   <-- 2 stop bits

where M is the constant-voltage inter-character marking state, T is the
start bit, and P is the stop bit(s).  (A short sketch at the end of this
note prints these patterns, if you want to play with them.)

The stop bit ensures that the line is in a marking state prior to the
next start bit, and so guarantees that the receiver circuit can detect
the leading edge of the start bit.

The series of ...1111... you show at either end, surrounding the data
bits, implies that an async line has fixed time slots.  This is wrong.
There are no voltage transitions between characters.  Timing is all
predicated on the leading edge of the start bit.

If you are discussing synchronous serial lines, then the modem recovers
clock timing information from the encoded data stream or uses the clock
signal from the interface.  If the line did not have a modem, then the
timing pulses must be provided by some clock generator.  In any case,
the receiving end would certainly be able to discriminate between 270
and 269 bits -- be they all zeros, all ones, or any combination.

> Normally, the only time that stop bits pose a problem is when a
> receiver--by design--requires a certain amount of time between
> characters in which to "think" or "do things".

Or by mechanical necessity.  The only device requiring 2 stop bits that
I've worked with was the old (really old) Teletypes.  They ran at 110
baud and each character was 11 bits: 1 start bit, 7 data bits, 1 parity
bit, and 2 stop bits.  This gave 10 characters/second.  The marketing
literature quoted it as 100 words/minute.  :-)

It was a totally electro-mechanical device.  There was a motor inside
that rotated continuously and a shaft that rotated once per character
(if I recall the details correctly; it's been decades).  When the start
bit arrived, it energized a clutch which tied the shaft to the spinning
motor.  It needed 2 stop bits to let the shaft complete a rotation and
declutch in preparation for the next character.

Lee Jones
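
To make the async framing concrete, here is a small C sketch -- my own
illustration, not anything from the original article; the 8-data-bit,
no-parity format and the frame_byte name are my choices -- that prints
the line state for a string of bytes: marking as 1, spacing as 0, data
bits LSB first, and a configurable number of stop bits:

    #include <stdio.h>

    /* Print one async character: start bit (spacing), 8 data bits
     * LSB first, then the requested number of stop bits (marking).
     * '1' = mark, '0' = space. */
    static void frame_byte(unsigned char byte, int stop_bits)
    {
        int i;

        putchar('0');                          /* T: start bit         */
        for (i = 0; i < 8; i++)                /* data bits, LSB first */
            putchar(((byte >> i) & 1) ? '1' : '0');
        for (i = 0; i < stop_bits; i++)
            putchar('1');                      /* P: stop bit(s)       */
    }

    int main(void)
    {
        unsigned char msg[] = { 0x00, 0x00, 0x00 };  /* the [$00 $00 $00] example */
        int stop_bits = 2;                           /* try 1 or 2                */
        size_t i;

        printf("111");                         /* M: idle (marking) line */
        for (i = 0; i < sizeof msg; i++)
            frame_byte(msg[i], stop_bits);
        printf("111\n");                       /* M: idle line again     */
        return 0;
    }

With stop_bits set to 1 it prints the first diagram above (a single 1
between the all-zero characters); with 2 it prints pairs of 1s, which is
exactly the extra guaranteed idle time a 2-stop-bit device is asking for.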
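
And the Teletype arithmetic, for what it's worth: 110 baud divided by
11 bits per character gives the 10 characters/second.  The 6-characters-
per-word figure below is the usual telegraphy convention, my assumption
rather than anything out of a Teletype manual, but it is where the
100 words/minute marketing number comes from:

    #include <stdio.h>

    int main(void)
    {
        double baud          = 110.0;            /* signalling rate, bits/sec    */
        double bits_per_char = 1 + 7 + 1 + 2;    /* start + data + parity + stop */
        double chars_per_sec = baud / bits_per_char;        /* = 10  */
        double words_per_min = chars_per_sec * 60.0 / 6.0;  /* = 100 */

        printf("%.0f chars/sec, %.0f words/min\n",
               chars_per_sec, words_per_min);
        return 0;
    }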