Andre Abelian wrote:

> 1. I was working on a project that took me 1.5 months to get done.
> After going through lots of trouble, I later started using interrupts,
> and then all of our problems went away. My question is: how do you guys
> calculate or figure out that an interrupt is required?

In bigger processors, I usually have at least a timer interrupt. But in
general, I think this depends mostly on the application and somewhat on
the coding style.

I create a rough architecture of the application, based on the
requirements: what has to be done, how fast, with how much jitter, and
so on. With this, it's usually not difficult to see where interrupts
come in. The first time you do it, it takes a while, because you have to
compare polling against interrupts for every part. With time, you gain
experience, find out about your preferences, and the choice becomes
easier.

> 2. My second question is: how do you calculate how much clock speed is
> needed? Or should I always use 40 MHz? Most of the time I use 20 MHz.

Same as above: find out what the time-critical path is, and what clock
speed you need to get it done. With an interrupt-based architecture,
this is not as simple as counting instruction cycles; you need to take
the interrupts (which don't necessarily arrive at regular intervals)
into account.

One easy way to go about this is to start with a medium-high clock rate
(like your 20 MHz) and see how much wiggle room you have. If it's not
enough, you go higher. If it's a lot, you may go lower.

Gerhard

--
http://www.piclist.com PIC/SX FAQ & list archive
View/change your membership options at
http://mailman.mit.edu/mailman/listinfo/piclist