(the posting was rejected by piclist first, so the refs are inconsistent)

> knew about (hardware) synchronous detectors, that I was just seeing a beat
> effect between the sample and input frequencies. I see no easy way to turn
> that into a useful output for a DTMF detector.

You can auto-correlate the signal. Just XOR the current 1-bit input with a copy of it delayed by exactly half a period (180 degrees) of the expected frequency (for example 1/(2*697) sec for the 697 Hz row tone). Check whether the XOR gives a 1 or a 0, and count the 1s. Do this over a certain amount of time (for example 20 ms), and then take the counter value as an indicator of how closely the input signal matches 697 Hz. The higher the counter, the better the match.

A "perfect" DTMF tone pair has 50% of its energy at the row frequency and 50% at the column frequency. Normalize your counter values to account for the shorter period (= more measurements) of the higher frequencies, then compare them. Take the two highest frequencies and check them against the 3rd highest; they must win by a clear margin. Also, the two peaks must belong to different groups (row/column). Next, both frequencies must have similar counter values (otherwise you are looking at forward/reverse twist). Then compare their sum to the mathematical maximum (which is set by your measurement time). If it is far below, there are a lot of other frequency components in the input signal (for example a voice that happens to contain 697 Hz for a moment). It might be worth doing a 9th measurement at 435 Hz and including it in the sum before the energy comparison, to account for dial tone (425-450 Hz in most countries) as a tolerable sidetone.

I haven't done this with 1 bit yet, but I did do it with 8 bits and it works like a charm. I have also done other one-bit projects which turned out to work, but which were far more sensitive to noise than their real-ADC brothers. So don't expect to meet ITU specs with a 1-bit PIC :-)
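
To make the recipe concrete, here is a minimal sketch in C of how the pieces could fit together. Everything beyond the post itself is an assumption: the 40 kHz 1-bit sample rate, the detector_* function names, and the MARGIN/TWIST/ENERGY_MIN thresholds, which are guesses you would tune on real input.

/* 1-bit autocorrelation DTMF sketch. Assumes a 1-bit input (e.g. a
   comparator pin) sampled at 40 kHz; thresholds are guesses to tune. */

#include <stdint.h>

#define FS    40000u              /* sample rate in Hz (assumption)     */
#define WIN   (FS / 50u)          /* 20 ms window = 800 samples         */

#define MARGIN     (WIN / 8u)     /* illustrative thresholds only --    */
#define TWIST      (WIN / 4u)     /* tune them on real signals          */
#define ENERGY_MIN (WIN + WIN / 4u)

enum { NFREQ = 8 };
static const uint16_t dtmf_hz[NFREQ] =
    { 697, 770, 852, 941, 1209, 1336, 1477, 1633 };

static uint8_t  half_delay[NFREQ];  /* round(FS/(2*f)) in samples       */
static uint16_t match[NFREQ];       /* XOR hit counters for this window */
static uint32_t history;            /* last 32 input bits, newest = bit 0 */

void detector_init(void)
{
    for (int i = 0; i < NFREQ; i++)   /* (FS + f)/(2f) rounds FS/(2f)   */
        half_delay[i] = (uint8_t)((FS + dtmf_hz[i]) / (2u * dtmf_hz[i]));
}

void detector_reset(void)
{
    for (int i = 0; i < NFREQ; i++) match[i] = 0;
}

/* Feed one 1-bit sample; call this at FS, e.g. from a timer interrupt. */
void detector_sample(uint8_t bit)
{
    history = (history << 1) | (bit & 1u);
    for (int i = 0; i < NFREQ; i++)
        /* XOR the current bit with the bit from half a period ago; a
           tone at dtmf_hz[i] inverts over that delay, so the XOR is 1
           nearly every sample and the counter runs up high.            */
        match[i] += ((history >> half_delay[i]) ^ bit) & 1u;
}

/* Evaluate after WIN samples; returns the key, or 0 if any check fails. */
char detector_decide(void)
{
    static const char keys[4][4] = { {'1','2','3','A'},
                                     {'4','5','6','B'},
                                     {'7','8','9','C'},
                                     {'*','0','#','D'} };
    int row = 0, col = 4;
    for (int i = 1; i < 4; i++)     if (match[i] > match[row]) row = i;
    for (int i = 5; i < NFREQ; i++) if (match[i] > match[col]) col = i;

    /* Both peaks must beat the best of the other six by a margin.      */
    uint16_t third = 0;
    for (int i = 0; i < NFREQ; i++)
        if (i != row && i != col && match[i] > third) third = match[i];
    if (match[row] < third + MARGIN || match[col] < third + MARGIN)
        return 0;

    /* Twist check: row and column counters must be similar.            */
    if (match[row] > match[col] + TWIST || match[col] > match[row] + TWIST)
        return 0;

    /* Energy check: the sum of the two peaks must approach the
       mathematical maximum (2*WIN), otherwise the window is mostly
       something else, e.g. speech.                                     */
    if ((uint32_t)match[row] + match[col] < ENERGY_MIN)
        return 0;

    return keys[row][col - 4];
}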
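
One simplification in the sketch: because every correlator is clocked from the same 40 kHz timer, each counter sees the same 800 comparisons per window, so the normalization step above falls out implicitly; it only matters if you clock each correlator at twice its own frequency instead. In use you would call detector_sample() from the sample interrupt and, every 20 ms, call detector_decide() followed by detector_reset(). The optional 9th correlator at 435 Hz would just be one more table entry whose counter is added to the sum before the energy check, but excluded from the peak search.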