Hi all. What's the best way to generate interrupt-based stepper motor step delays in software? Say a stepper has 200 steps/rev and I want to run it between 0 and 5 revs per second; that's 0-1000 steps per second. Ideally I'd like a 10-bit value (1-1000), but even an 8-bit value (0-250) with a multiplication factor of 4 would do, generating an interrupt at the right frequency to drive the stepper. The scale has to be linear, or close to it.

At 1000 steps/sec that works out to one interrupt every 1 ms. At 1 step/sec it's a full second between interrupts. At values in between there are a bunch of unfriendly numbers. I could just use the equation: time between interrupts = 1 / (number of steps per second). But I'm having a hard time figuring out how I could use any of the timers to help me out. It has to be interrupt driven because there's a bunch of other stuff going on at the same time. Currently using a PIC16F877.

Anyone got any ideas? TIA.

Kresho Sprem
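
P.S. To make the equation concrete, here is a minimal sketch of the timer-reload math I had in mind, written as plain C so the numbers can be checked on a desktop compiler. The clock figures are assumptions for illustration only: a 4 MHz oscillator (Timer1 counts at Fosc/4 = 1 MHz) with a 1:8 prescaler, i.e. one tick every 8 us. Note that 1 step/sec would need 125,000 ticks, which doesn't fit the 16-bit timer, so the very lowest speeds would need a software postscaler on top of this.

/*
 * Sketch only -- assumed clocking, not from my actual setup:
 *   Fosc = 4 MHz         -> Timer1 counts at Fosc/4 = 1 MHz
 *   Timer1 prescaler 1:8 -> one tick every 8 us, 125,000 ticks/s
 */
#include <stdio.h>
#include <stdint.h>

#define TICKS_PER_SEC 125000UL   /* 4 MHz / 4 / 8 (prescale) */

/* 16-bit reload value so Timer1 overflows after 1/rate seconds,
 * i.e. time between interrupts = 1 / (steps per second).
 * Valid for rate = 2..1000 with these clocks; rate = 1 needs
 * 125,000 ticks (> 65,535), hence the software postscaler note. */
uint16_t timer1_reload(uint16_t steps_per_sec)
{
    uint32_t ticks = TICKS_PER_SEC / steps_per_sec;
    return (uint16_t)(65536UL - ticks);   /* timer counts up to overflow */
}

int main(void)
{
    uint16_t rates[] = { 2, 10, 100, 250, 500, 1000 };
    unsigned i;

    /* Print the reload and the actual period the integer division
     * gives, to check how close to linear the scale stays. */
    for (i = 0; i < sizeof rates / sizeof rates[0]; i++) {
        uint16_t r = timer1_reload(rates[i]);
        uint32_t ticks = 65536UL - r;
        printf("%4u steps/s -> reload 0x%04X (%lu ticks = %lu us)\n",
               (unsigned)rates[i], (unsigned)r,
               (unsigned long)ticks, (unsigned long)(ticks * 8));
    }
    return 0;
}

On the real chip the ISR would write the reload into TMR1H:TMR1L at each overflow instead of printing it; e.g. at 1000 steps/s the sketch gives 125 ticks = 1000 us, matching the 1 ms figure above.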