Xiaofan Chen wrote:
> I do not quite understand this. If the jitter is quantifiable, it would
> have a maximum value. So it has determinism. Right? Kind of confusing.

Not necessarily. Jitter may be quantified in statistical terms like
average, mean, distribution -- without quantifying a maximum value.

> I know there are many applications which do not need hard real time and
> allow packet loss (you just wait for the next packet which contains the
> same data).

There's no form of determinism that can guarantee "no packet loss" --
just cut the cable (or all cables), and there /will/ be packet loss :)

I don't think this ("deterministic") is a "deterministic" term, so to
speak :) Its meaning depends on the context and the author. The text you
cited in your last message contained the term "better determinism",
which implies that determinism is not an is/is-not characteristic but a
more/less characteristic. I don't think this is how many other people
use the term.

I also think that some authors confuse determinism with reliability.
IMO the former is a characteristic of the network protocol (and it
either is or is not present, given a set of assumptions), while the
latter is a probabilistic characteristic of the whole setup, which
includes such things as connectors, cabling (you can reduce the
reliability by cabling all over the factory floor with standard
connectors :), and so on.

Gerhard

-- 
http://www.piclist.com PIC/SX FAQ & list archive
View/change your membership options at
http://mailman.mit.edu/mailman/listinfo/piclist
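To illustrate the point about jitter being quantified statistically
without a maximum: a minimal Python sketch, with hypothetical latency
numbers drawn from an exponential distribution (an assumption for
illustration only). The mean and percentiles are well defined, yet no
individual sample has a guaranteed upper bound.

```python
import random
import statistics

random.seed(1)  # fixed seed so the sketch is repeatable

# Hypothetical per-packet latencies in milliseconds, drawn from an
# exponential distribution with mean 5 ms. The distribution's tail is
# unbounded: any statistic below is meaningful, but no finite maximum
# can be guaranteed in advance.
latencies = [random.expovariate(1 / 5.0) for _ in range(10_000)]

mean = statistics.mean(latencies)
p99 = statistics.quantiles(latencies, n=100)[98]  # 99th percentile

print(f"mean jitter:      {mean:.2f} ms")
print(f"99th percentile:  {p99:.2f} ms")
# The observed maximum is just the largest sample seen so far,
# not a deterministic bound on future samples.
print(f"observed max:     {max(latencies):.2f} ms")
```

The observed maximum grows as more samples arrive, which is exactly why
a quantified distribution does not by itself imply a hard bound, and
hence does not imply determinism in the hard-real-time sense.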