Dear Jon,

I have not used the 17C756. Project MicroSeeker sounds neat.

Regarding I2C, arbitration is done on a bit-by-bit basis, so the CPU needs to supervise each bit's progress. You therefore don't gain much from a byte-to-bit hardware interface, because you would still have to write software to manage each bit. An asynchronous UART is worth its silicon because it handles time-critical work like baud rate generation, and the data goes out over relatively long time periods. An I2C data transfer is on the order of microseconds and is far less time-critical, so a master gains little from hardware assistance. Anyone can legitimately claim to have an I2C interface if they have only got a pair of open-collector lines. If Microchip _have_ got I2C master duty done in byte-oriented hardware (which I doubt), this is no big bonus.

Having an I2C _slave_ interface done in hardware _is_ worth having in silicon, because it will sit around and transfer whole bytes before interrupting the CPU, instead of the CPU having to watch and respond to the I2C signals within 5 microseconds.

> I will need to do a software implementation of an I2C slave.
> I have heard this is a problem, but I'm not sure why.

Because the Microchip documentation and example code are crap, which is pretty shameful for a product conceived around 25 years ago. They don't go into it in anywhere near adequate detail. I have actually got my PIC16C65 working as both I2C slave and master, after much effort. I can send you the e-mails on the subject rather than repeat them here; that will save you a lot of time-consuming headaches.

Changing subject, tell us about your cool app. Is it going to be braving the North Sea (been there, done that) or just retrieving your soap from the bottom of your bath? :-)