I have a challenge. I need to convert a 48-bit binary number (stored in 6 consecutive memory registers) into an ASCII string of decimal digits ('0' to '9'), which will then be transmitted to a PC. The string will be up to 15 characters long.

I can do this by going first to BCD and then from BCD to ASCII (although this may well not be the best way). Microchip have published "Binary 16 to BCD" code in AN526, and I also have "B24toBCD", which is just an extended version of the 16-bit routine. It looks straightforward to extend it further to 48 bits.

The thing is, I'd simply be working "parrot fashion": I can't suss out how the binary-to-BCD algorithm actually works! Can anyone explain the basic principle in plain English? Also, with a bit of luck, that might allow me to work out how to avoid the intermediate BCD stage altogether and go straight from binary to ASCII. Or has anyone done that already?

Thanks,

Steve Thackery
Suffolk, England.
Web Site: http://www.btinternet.com/~stevethack/

"Having children is hereditary. If your parents didn't have any, neither will you." - Henry Morgan
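P.S. To show where I've got to, here is what I think the routine boils down to, sketched in C rather than PIC assembly. This is only my guess at it - I'm assuming AN526 uses the usual shift-and-add-3 ("double dabble") method, and the function name, digit layout and comments are just my own for the sketch - so please correct anything that's wrong:

#include <stdint.h>
#include <stdio.h>

/* Sketch: convert a 48-bit value to an ASCII decimal string by
 * shift-and-add-3 ("double dabble"), then BCD -> ASCII.
 * 2^48 - 1 = 281474976710655, i.e. at most 15 decimal digits.
 */
#define DIGITS 15

void bin48_to_ascii(uint64_t value, char out[DIGITS + 1])
{
    uint8_t bcd[DIGITS] = {0};   /* one BCD digit (0..9) per byte, MSD first */
    int i, bit;

    value &= 0xFFFFFFFFFFFFULL;  /* keep only the low 48 bits */

    for (bit = 47; bit >= 0; bit--) {
        /* 1. Adjust: add 3 to every BCD digit that is 5 or more.  The idea,
         *    as I understand it, is that after doubling such a digit would
         *    exceed 9, and the +3 makes the carry into the next digit happen
         *    at 10 rather than 16. */
        for (i = 0; i < DIGITS; i++)
            if (bcd[i] >= 5)
                bcd[i] += 3;

        /* 2. Shift the whole BCD array left one bit and feed in the next
         *    binary bit (MSB first), which doubles the BCD number and adds
         *    the new bit - just as each binary bit doubles-and-adds. */
        for (i = 0; i < DIGITS - 1; i++)
            bcd[i] = ((bcd[i] << 1) | (bcd[i + 1] >> 3)) & 0x0F;
        bcd[DIGITS - 1] = ((bcd[DIGITS - 1] << 1) | ((value >> bit) & 1)) & 0x0F;
    }

    /* 3. BCD to ASCII is just adding '0' (0x30) to each digit.
     *    Result is zero-padded to 15 digits; leading-zero suppression
     *    is left out for clarity. */
    for (i = 0; i < DIGITS; i++)
        out[i] = (char)('0' + bcd[i]);
    out[DIGITS] = '\0';
}

int main(void)
{
    char buf[DIGITS + 1];
    bin48_to_ascii(0xFFFFFFFFFFFFULL, buf);   /* largest 48-bit value */
    printf("%s\n", buf);                      /* prints 281474976710655 */
    return 0;
}

If that is indeed what the assembly is doing, then the step I'd most like explained is the add-3 adjustment. As for skipping BCD entirely, the obvious direct route would seem to be repeated division by 10, taking the remainders as digits - but I suspect that's more painful than double dabble on a PIC with no hardware divide.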