Some more articles:

http://www.vision-systems.com/display_article/286656/19/none/none/Feat/Latency-and-determinism-in-GigE-Vision-systems

"Different protocol architectures also impact latency and determinism. With standard Internet traffic, TCP is commonly implemented to ensure data are not corrupted or lost during the transmission event between devices. When the system detects corruption, it initiates a process to re-attempt the initial event. This significantly increases the amount of time required to complete the initial event, resulting in much higher jitter. Since GigE Vision implements the UDP protocol, the re-attempt process can be made optional, which keeps the jitter lower and permits better determinism for short maximum guaranteed response times."

If you use UDP, there is a possibility of packet loss. In that case, a lost packet never reaches the target at all, so its delivery time is unbounded. Isn't that non-deterministic in a sense? Does hard real-time (deterministic) allow packet loss? Does soft real-time allow packet loss? I know there are many applications that do not need hard real time and can tolerate packet loss (you just wait for the next packet, which carries the same data). Kind of confusing...
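For the soft-real-time case, here is a minimal sketch of the "just wait for the next packet" idea, in Python. The port number, the 4-byte sequence-number header, and the 10 ms deadline are all made up for illustration; the point is only that the receiver bounds its waiting time with a socket timeout and skips over gaps instead of asking for retransmission.

import socket
import struct

LISTEN_ADDR = ("0.0.0.0", 5000)   # hypothetical port
DEADLINE_S = 0.010                # hypothetical per-frame deadline: 10 ms

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(LISTEN_ADDR)
sock.settimeout(DEADLINE_S)       # bound the wait: a lost packet cannot stall us forever

last_seq = -1
while True:
    try:
        data, _ = sock.recvfrom(2048)
    except socket.timeout:
        # Soft real-time choice: declare the frame lost and move on,
        # instead of retransmitting as TCP would.
        continue
    if len(data) < 4:
        continue                  # malformed datagram: ignore it
    seq = struct.unpack("!I", data[:4])[0]   # assumed 4-byte big-endian sequence header
    if seq <= last_seq:
        continue                  # stale or duplicate datagram: drop it
    last_seq = seq                # a gap here means packets were lost; we accept that
    payload = data[4:]
    # ...hand payload to the application here...

The design choice in the sketch is the soft one: a missed datagram degrades quality but is not treated as a failure, whereas a hard-real-time design would have to bound the loss as well as the latency.

Xiaofan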