No matter how many processors you have, they'll all make the same wrong choice if you load the same program onto them. NASA's computers on the Space Shuttle have triple redundancy (or better), but one of the computers' programs was written by a different company than the rest, so the same flaw won't show up in all of them. That doesn't mean they will always make the right choice, but it's an improvement. Of course, this doesn't take into consideration the fact that the inputs may have problems, but on a software level this seems like a good idea.

-Tony

<< Just on this issue of redundancy, "tripling" the processors does NOTHING by itself. The general issue is that we need to get a group of processors to reach consensus about an event (i.e., "open the valve"). If we wish to tolerate "k" failures and have the system still work safely, the number of processors we need is 3k+1. This is a "you-can't-dodge-it" result from distributed-systems theory (the Byzantine generals problem). The best exposition of this is in an IEEE Computer issue from the early 1990s - can't locate it right now, old age has that effect :-).

The interested reader is encouraged to think about the NASA Space Shuttle, which is supposed to be fail-operational, fail-operational, fail-safe. That is, k=3. They have 5 processors. OOPS. >>

--
http://www.piclist.com hint: The list server can filter out subtopics (like ads or off topics) for you. See http://www.piclist.com/#topics
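A toy majority voter makes both points concrete. This is only a sketch, not a real Byzantine agreement protocol (a truly faulty node can report different values to different observers, which is exactly why 3k+1 processors and multiple rounds are needed); the `vote` function and the replica outputs here are invented for illustration:

```python
from collections import Counter

def vote(outputs):
    """Return the strict-majority value among replica outputs, or None."""
    value, count = Counter(outputs).most_common(1)[0]
    return value if count > len(outputs) // 2 else None

# Tolerating k = 1 arbitrary fault with 3k + 1 = 4 replicas:
# three correct replicas outvote one faulty one.
print(vote(["open", "open", "open", "close"]))  # open

# Tony's point about identical software: a common-mode bug corrupts
# every copy the same way, so the voter "agrees" on the wrong answer.
print(vote(["close", "close", "close"]))  # close - unanimous, and wrong
```

Simple output voting like this only masks independent faults; it cannot detect a flaw shared by every replica, which is why the Shuttle's backup flight software was written by a separate contractor.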