I've been doing pretty well with my A/D, interrupts, timers, etc. But one thing has me genuinely confused, and I'm not sure whether it's something I'm misunderstanding (possibly including my test instruments). I'm working with Timer1 on a 16F877 at 20 MHz, trying to generate an interrupt every millisecond... and I'm really, really close, but not perfect. Here are the important code fragments, using CCS's compiler:

#include <16F877.H>
#device ICD=TRUE
#fuses HS,NOPROTECT,NOWDT

// Timer1 ticks at Fosc/4 = 5 MHz (assuming a 1:1 prescaler),
// so preload -5000 to overflow after 1 ms
#define TIMERVAL -(20000000 / 4 / 1000)

int LED0;

#int_timer1
void Timer1ISR(void)
{
    set_timer1(TIMERVAL);      // immediately reload the timer
    output_bit(PIN_D0, LED0);  // toggle D0 once per interrupt
    LED0 = !LED0;
}

There is a lot of other code in the actual Timer1ISR routine, but that doesn't matter here. Since the ISR toggles D0 once per interrupt, a 1 ms interrupt should give a 500 Hz square wave on that pin.

OK, so I ran all of this, put my scope on D0, and it measured 495 Hz. (Honest... I didn't even know until this morning that the scope could measure that!) I also have a Fluke DMM with a frequency counter that showed 495 Hz too, so I feel pretty confident it really is 495 Hz. The question is: where are the 5 Hz I'm missing? I did check the listing file, and the compiler is putting 0xEC78 (decimal -5000) into the TMR1L/TMR1H counter.

Replacing the define with:

#define TIMERVAL (46 - (20000000 / 4 / 1000))

...nails it at 500 Hz perfectly. But why the "46" offset?
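
Working the numbers backward suggests the reading is consistent with a fixed loss of about 46 timer ticks per interrupt. Here's a quick sanity check in plain C (meant to run on a PC, not the PIC; the 46 is just the offset that fixed it, treated as an assumed "cycles lost between overflow and reload"):

#include <stdio.h>

int main(void)
{
    const double fosc    = 20000000.0;  /* 20 MHz crystal                     */
    const double ftick   = fosc / 4.0;  /* Timer1 tick rate = Fosc/4 = 5 MHz  */
    const double reload  = 5000.0;      /* ticks programmed via TIMERVAL      */
    const double latency = 46.0;        /* ASSUMED ticks lost before reload   */

    double isr_hz = ftick / (reload + latency);  /* actual interrupt rate   */
    double pin_hz = isr_hz / 2.0;                /* D0 toggles once per ISR */

    printf("interrupt rate: %.1f Hz\n", isr_hz); /* prints ~990.9 */
    printf("pin D0:         %.1f Hz\n", pin_hz); /* prints ~495.4 */
    return 0;
}

That lands almost exactly on the 495 Hz I measured, which makes me suspect interrupt latency plus the instructions executed before set_timer1() actually writes the registers.

One alternative I'm considering (just a sketch using the same CCS built-ins as above; I haven't verified how many cycles of residual error remain): instead of overwriting Timer1 with a constant, add the reload value to whatever it has already counted since the overflow, so most of the latency cancels out:

#int_timer1
void Timer1ISR(void)
{
    /* Timer1 kept running during interrupt latency, so add the (negative)
       reload to its current count rather than overwriting it.  A few
       cycles still pass between the get_timer1() read and the
       set_timer1() write, so a small constant offset may remain. */
    set_timer1(get_timer1() + TIMERVAL);

    output_bit(PIN_D0, LED0);
    LED0 = !LED0;
}

The 16F877's CCP module also has a compare "special event trigger" mode that resets Timer1 in hardware on a match, which should give an exact period with no software reload at all, but I haven't tried that here.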