I'd appreciate it if anyone can throw some light on the "interrupt on start and stop bits" modes of the PIC16F SSP module (with SSPCON:SSPM<3:0> = 111x). I have not been able to find any proper documentation of this function in the PIC device data sheet, the Mid-Range Reference Manual or the application notes relating to I2C slave mode.

I'm using a PIC16F818 as an I2C bus slave in conjunction with a mature and well-characterised (non-PIC) master device. My basic slave code, written around the SSP state machine, works fine, but I want to add some protection against the slave locking up the bus, not least because there are silicon bugs which can apparently cause the PIC to jam under certain conditions.

I'm not keen, though, on the workaround suggested by Microchip in the "SSP Module Silicon/Data Sheet Errata" (DS80132C): "A time-out routine should be used to monitor the module's operation. The timer is enabled upon the receipt of a valid START condition; if a time-out occurs, the module is reset." Interpreted literally, this means that the SSP module could be reset in the middle of a valid slave operation which happens to fall around one time-out period after a previous one. I reckon that this could cause almost as many additional problems as it resolves.

My original idea was to use the "interrupt on start and stop bits" mode to detect receipt of a stop bit by the slave and halt the time-out timer, thus preventing a forced reset under normal conditions. However, my attempt to implement this doesn't work properly. An interrupt does occur for the stop condition (with SSPSTAT:P = 1), but I've been unable to clear this condition to prevent it from firing again on exit from the interrupt routine. I have tried clearing and then setting SSPCON:SSPEN to reset the SSP module after a P = 1 condition (expecting SSPSTAT to return to the S = 0, P = 0 state it has after a reset), but the P = 1 condition seems to persist.
It isn't possible to clear the P bit in SSPSTAT directly, so I'm not sure what else I can do. I'd guess that perhaps the "interrupt on start and stop bits" mode is not suitable for this sort of application, but without any documentation I'm unable to confirm this. Perhaps it is intended to support master/slave operation, in which case the response to a P = 1 interrupt might be to clear the SSP interrupt enable bit and start bit-banging as a master device in the foreground.

At the moment, I'm planning to implement a different I2C time-out strategy: run the SSP module in normal slave mode (SSPCON:SSPM<3:0> = 011x) and set up a timer interrupt with a 4 ms period, which is kicked off after any SSP interrupt. The timer interrupt routine polls the SSPSTAT:P bit and disables further timer interrupts once it is set (i.e. a stop condition has been seen and the bus has been released). If the P bit is still clear after 256 consecutive polls, then the SSP module undergoes a forced reset via SSPCON:SSPEN. This means that, if the bus becomes locked up, the slave will try to release itself after just over 1 second (256 x 4 ms = 1.024 s).

I'd be interested in any comments on this approach. At the moment, I don't see any disadvantages compared to the Microchip workaround, apart from a few extra words of code memory to implement the timer routine.

Many thanks in advance for any thoughts.

--
Ian Chapman
Chapmip Technology, UK

--
http://www.piclist.com PIC/SX FAQ & list archive
View/change your membership options at
http://mailman.mit.edu/mailman/listinfo/piclist