Jan-Erik, I think I finally understand. I did this:

   Reset display;
   Turn the Gate on;
   call delay(numberseconds);         // this will be for one second
   Turn the Gate off;
   call delay(numberseconds) again;   // this will be for 15 seconds
   do forever

   delay(numberseconds)
   {
       for (count = 0; count <= numberseconds*10; )
           ;    // just wait; the interrupt increments count
       return;
   }

The interrupt code goes here:

   void Interrupt()
   {
       if (TMR1IF_bit)
       {
           TMR1IF_bit = 0;
           TMR1H = 0x0B;
           TMR1L = 0xDC;
           count++;   // increments the count variable, which was declared as global
       }
   }

It compiles. Now I have to test it and see what the output is. If this
works I'll then add the pot and the ADC so that the display time can be
varied. I'm getting there.

Thanks, rich!

P.S. Allen, this also uses your suggestion. Thanks, rich!

On 7/19/2014 4:39 PM, Jan-Erik Soderholm wrote:
>
> > Jan-Erik,
> >
> > The interrupt happens every 100 ms. So for a one second delay I need
> > to execute ten interrupts.
>
> No, that is done automatically. You don't have to
> do anything, the 10 interrupts will happen each
> second by the hardware.
>
> > But I just don't understand how to do this.
>
> You don't have to do anything. Just let your interrupts
>

--
http://www.piclist.com/techref/piclist PIC/SX FAQ & list archive
View/change your membership options at
http://mailman.mit.edu/mailman/listinfo/piclist
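
For anyone following along, here is one way rich's pieces could fit together
as a single mikroC-style program. It is only a sketch under several
assumptions: a mid-range PIC with a 20 MHz crystal and a 1:8 Timer1 prescaler
(which is what makes the 0x0BDC reload come out to about 100 ms), the gate
driven from RB0, and a placeholder Reset_Display() routine; none of those
names or values come from rich's actual hardware. The one detail worth
copying either way is declaring count volatile, so the compiler cannot
optimize the empty wait loop away.

   // Sketch only: assumes mikroC PRO for PIC, a mid-range PIC (e.g. 16F887),
   // a 20 MHz crystal and a 1:8 Timer1 prescaler so the 0x0BDC reload gives
   // a 100 ms overflow. GATE_PIN and Reset_Display() are made-up placeholders.

   volatile unsigned int count;   // bumped by the ISR every 100 ms

   #define GATE_PIN PORTB.F0      // assumption: gate driven from RB0

   void Reset_Display(void) {
       // placeholder for whatever clears/initialises the display
   }

   void Interrupt() {             // Timer1 overflow ISR, as in rich's post
       if (TMR1IF_bit) {
           TMR1IF_bit = 0;        // clear the overflow flag
           TMR1H = 0x0B;          // reload for the next 100 ms period
           TMR1L = 0xDC;
           count++;
       }
   }

   void delay(unsigned int numberseconds) {
       // Busy-wait for numberseconds * 10 overflows (10 x 100 ms = 1 s).
       count = 0;
       while (count < numberseconds * 10)
           ;                      // nothing to do here; the ISR does the counting
   }

   void main() {
       TRISB.F0 = 0;              // gate pin as output (assumption)

       T1CON = 0x31;              // internal clock, 1:8 prescaler, Timer1 on (assumption)
       TMR1H = 0x0B;              // first 100 ms period
       TMR1L = 0xDC;
       TMR1IF_bit = 0;
       TMR1IE_bit = 1;            // enable Timer1 overflow interrupt
       PEIE_bit = 1;              // peripheral interrupts
       GIE_bit = 1;               // global interrupts

       while (1) {                // "do forever"
           Reset_Display();
           GATE_PIN = 1;          // turn the gate on
           delay(1);              // one second
           GATE_PIN = 0;          // turn the gate off
           delay(15);             // fifteen seconds
       }
   }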
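
As for the pot-and-ADC step rich mentions, one possible direction (untested;
the channel, the pot wiring and the 1 to 15 second range are all guesses on
my part) would be to read the pot with mikroC's ADC library and scale the
10-bit result into the number of seconds passed to delay():

   // Rough idea only: pot on AN0, 10-bit reading 0..1023 mapped to 1..15 seconds.
   unsigned int pot_seconds(void) {
       unsigned int raw = ADC_Read(0);   // mikroC ADC library call
       return 1 + raw * 14u / 1023u;     // 0 -> 1 s, 1023 -> 15 s
   }

With ADC_Init() called once at startup, the main loop could then use
delay(pot_seconds()) in place of the fixed delay(15).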