Hi,

I was glancing through some back issues of Circuit Cellar magazine and came across an article titled "Debugging, In-Circuit Style" in issue 109. Here are three passages from the beginning of the article that got me wondering about the subject:

"Thanks to the amazing Turing machine, embedded-system designers can create an endless array of different end products using the same simple, finite instruction set of their choice. This freedom means that a generic tool that supports the features of a specific processor can address the needs of developing a flight-control system or a toaster oven. For those of us in the business of creating flight-control systems, toaster ovens, or any other electronic systems, this is a good thing.

Arguably, the strongest advantage of basing a design on programmable digital logic, as opposed to fixed-function digital or analog devices, is the ability to bring these generic tools to bear.

There's no doubt that, in practice, every system must be brought to life in a test bed as unique as itself. Thanks to the theoretical advances of people like Mr. Turing and the practical experience of generations since, a lot of the work has been done for us."

Since I don't have a CEng degree, I don't know much about the Turing machine subject. I do have Roger Penrose's *The Emperor's New Mind: Concerning Computers, Minds, and the Laws of Physics*, but it's a bit hard for me to grasp. So, can you give me a quick explanation and some easy references on the topic?

Thanks.
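P.S. To show where my (possibly wrong) current understanding stands: as far as I can tell, a Turing machine boils down to a read/write head on a tape plus a finite table of state transitions. Below is a tiny C sketch of that idea as I understand it; the one-state "flip every bit" machine, the tape contents, and the 'B' blank marker are all my own invented example, not anything from the Circuit Cellar article. Is this the right picture?

/* A minimal sketch of a one-state Turing machine in C: a table-driven
 * simulator that scans right, flipping 0 <-> 1, and halts at the
 * first blank ('B'). Everything here is my own toy example. */
#include <stdio.h>

#define TAPE_LEN 16
#define HALT     -1

/* One transition: what to write, which way to move, next state. */
typedef struct {
    char write;  /* symbol to write at the head */
    int  move;   /* +1 = right, -1 = left, 0 = stay */
    int  next;   /* next state, or HALT */
} Rule;

/* rules[state][symbol]: indexed by current state and the symbol
 * under the head ('0', '1', or blank 'B'). */
static const Rule rules[1][3] = {
    /* on '0'        on '1'         on 'B' (blank) */
    { {'1', +1, 0},  {'0', +1, 0},  {'B', 0, HALT} }
};

static int symbol_index(char c)
{
    return (c == '0') ? 0 : (c == '1') ? 1 : 2;
}

int main(void)
{
    char tape[TAPE_LEN] = "0110100BBBBBBBB";  /* input, blank-padded */
    int head = 0, state = 0;

    while (state != HALT) {
        const Rule *r = &rules[state][symbol_index(tape[head])];
        tape[head] = r->write;   /* write a symbol */
        head      += r->move;    /* move the head */
        state      = r->next;    /* change state */
    }
    printf("final tape: %s\n", tape);
    return 0;
}

When I compile and run this, it prints "final tape: 1001011BBBBBBBB", i.e., each bit of the original "0110100" inverted. My (naive) takeaway is that the transition table is the "simple, finite instruction set" the article mentions, and everything else is just data on the tape; please correct me if I've got that wrong.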