On Mon, Nov 3, 2008 at 6:27 PM, Gerhard Fiedler wrote:

>> I do not quite understand this. If the jitter is quantifiable, it would
>> have a maximum value. So it has determinism, right? Kind of confusing.
>
> Not necessarily. Jitter may be quantified in statistical terms like
> average, mean, distribution -- without quantifying a maximum value.

You are right. But in the automation segment, we often care only about the
worst-case values, not the typical values. (A small sketch of this
distinction follows at the end of this message.)

>> I know there are many applications which do not need hard real time and
>> allow packet loss (you just wait for the next packet, which contains the
>> same data).
>
> There's no form of determinism that can guarantee "no packet loss" -- just
> cut the cable (or all cables), and there /will/ be packet loss :)

Going back to the topic: are you saying that determinism has nothing to do
with guaranteeing no data loss? That feels a bit strange to me.

> I don't think this ("deterministic") is a "deterministic" term, so to
> speak :) It has a meaning that depends on the context and the author.

You are right. That is why I am starting to think that "deterministic"
means very little to me now. Still, latency and jitter are important
considerations.

> I also think that some authors confuse determinism with reliability.

So far I have not found any authors who do that.

Xiaofan
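
P.S. To make the statistical-versus-worst-case point concrete, here is a
minimal C sketch. The inter-arrival numbers and the 1 ms nominal period are
made up for illustration; the point is that the mean and standard deviation
look fine while one outlier dominates the worst case -- and even that is
only the observed maximum, not a guaranteed bound, which has to come from
the protocol design rather than from measurements.

    #include <math.h>
    #include <stdio.h>

    int main(void)
    {
        /* Hypothetical packet inter-arrival times in milliseconds,
           measured against a nominal 1.0 ms cycle (made-up numbers). */
        const double nominal = 1.0;
        const double arrival[] = { 1.02, 0.98, 1.01, 0.99, 1.00,
                                   1.03, 0.97, 1.00, 1.45, 1.01 };
        const int n = sizeof arrival / sizeof arrival[0];

        double sum = 0.0, sum_sq = 0.0, worst = 0.0;
        for (int i = 0; i < n; i++) {
            double jitter = fabs(arrival[i] - nominal);
            sum += jitter;
            sum_sq += jitter * jitter;
            if (jitter > worst)
                worst = jitter;
        }

        double mean = sum / n;
        double stddev = sqrt(sum_sq / n - mean * mean);

        /* Statistical view: looks fine on average. */
        printf("mean jitter    = %.3f ms\n", mean);
        printf("std deviation  = %.3f ms\n", stddev);

        /* Worst-case view: the single 0.45 ms outlier is what an
           automation system has to budget for. */
        printf("worst observed = %.3f ms\n", worst);

        return 0;
    }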