Is the following bit of code valid for continuously digitizing an 8-bit value on channel 0 and a 10-bit value on channel 1? My main question is about the for() loop: whether this is a valid method to get 10-bit resolution with an 8-bit ADC, and whether there should be any additional delays inside this loop. Delays are only needed when you change the channel mux, right?

thanks, Rob

unsigned char  value0;    // 8 bit resolution
unsigned short value1;    // 10 bit resolution
unsigned char  channel;
unsigned char  i;

// initialize a/d:
channel = 0;
ADCON0 = (ADCON0 & ~0x38) | (channel << 3);  // select channel 0
                                             // (CHS bits assumed at ADCON0<5:3>)
delay();                                     // delay 50 microseconds for acquisition

// main loop:
while (1)
{
    // do other things
    digitize();    // gets here after at least 50 microseconds
    // do other things
}

void digitize()
{
    ADGO = 1;                 // start conversion
    while (ADGO) {}           // wait for conversion to complete

    if (channel == 0)
    {
        value0 = ADRES;       // 8-bit reading from channel 0
        channel = 1;
    }
    else
    {
        value1 = ADRES;       // first digitization
        for (i = 0; i < 3; i++)
        {
            ADGO = 1;
            while (ADGO) {}
            value1 += ADRES;  // sum of four 8-bit readings
        }
        // do I have a valid 10 bit 'value1' here?
        channel = 0;
    }

    // select the other channel for the next pass; the CHS bits must be
    // cleared first, since OR alone cannot switch from channel 1 back to 0
    ADCON0 = (ADCON0 & ~0x38) | (channel << 3);
    return;
}
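For comparison, here is my understanding of the textbook oversample-and-decimate pattern: 4^n samples for n extra bits, then a right shift by n, so 16 samples and >>2 for two extra bits. This is just a sketch; the 16-sample count and the helper name oversample10 are my own, and it uses the same ADGO/ADRES registers as above:

unsigned short oversample10(void)
{
    unsigned short sum = 0;
    unsigned char  n;

    for (n = 0; n < 16; n++)   // 4^2 = 16 samples for 2 extra bits
    {
        ADGO = 1;              // start a conversion
        while (ADGO) {}        // wait for it to finish
        sum += ADRES;          // accumulate the 8-bit result
    }
    return sum >> 2;           // decimate the 12-bit sum to a 10-bit result
}

The loop in digitize() sums only four samples, which spans the 10-bit range (0..1020) but, as I understand it, is closer to plain averaging, hence my question about whether it really buys two extra bits.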