Given a 16C54-class device, what is the minimum uncertainty/jitter with which a change on a general-purpose input can be propagated to a general-purpose output? My first thought is that a bit-test-and-skip/goto polling loop gives a three-instruction-cycle uncertainty, but I'm probably missing a trick. I don't care (within reason) about minimizing the latency itself, just the absolute uncertainty in the latency.

Dan Lanciani
ddl@danlan.com

--
http://www.piclist.com hint: The PICList is archived three different ways.
See http://www.piclist.com/#archives for details.
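For reference, the polling loop I have in mind is something like the following sketch (port and bit assignments are illustrative; I'm assuming the input on RA0 and the output on RB0):

```asm
wait    btfss   PORTA, 0    ; 1 cycle while RA0 is still low (no skip taken)
        goto    wait        ; 2 cycles, so the loop period is 3 cycles
        bsf     PORTB, 0    ; 2-cycle skip on btfss, then drive RB0 high
```

Since the input edge can arrive anywhere within the 3-cycle loop period, the edge-to-output delay varies by up to three instruction cycles, which is where my three-cycle uncertainty figure comes from.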