Hi,

Dave Tweed wrote:
>Almost certainly, a portion of the start bit for the dummy byte will be
>transmitted before you can disable the transmit driver, which means that
>the rest of the receivers on the bus may interpret the now-floating bus as
>an input byte, especially if another node starts transmitting within one
>byte time. Seems risky to me.

Yes, I think you are correct, but it would have been nice :)

>On an 8051 system with RS-485 that I built, I had a set of software timers
>that were driven by an interrupt from the same hardware timer that
>generated the UART bit rate. When I put the last byte into the UART
>transmit buffer, I just started a software timer for 20 bit times, and when
>that timer expired, I turned off the transmitter. 20 bit times allows
>enough time for the last byte to go out, plus one full byte time of
>enforced idle time between packets.

Well, that's about how my current designs operate, but it requires an
available timer. On a PIC I use a dedicated timer for this (dedicated only
during transmission), which is started when the last byte is written to the
internal UART. When its interrupt fires, I deassert RTS. But I just feel
that there should be an easier way.

On one project I used an external R/C-based solution, but that had to be
designed for the worst case: the lowest bit rate, plus an allowance for
component tolerances. So it did not utilise the bus efficiently, but for
that project it was OK.

In Sweden we have a company called Westermo that makes all kinds of
interface gadgets; one is an RS-232<->RS-485 'converter' with 'automatic'
transmitter control. Actually, I think it uses a very small PIC, and you
have to set the baud rate and total number of bits with DIP switches to get
it working. In that case it only needs to sense the start bit and delay
accordingly.

/Tony

--
http://www.piclist.com hint: To leave the PICList
mailto:piclist-unsubscribe-request@mitvma.mit.edu