At 01:31 PM 9/5/99 +0100, you wrote:
>I have a challenge. I need to convert a 48-bit binary number (stored in 6
>consecutive memory registers) into an ASCII string (i.e. '0' to '9') which
>will then get transmitted to a PC. The string will be up to 15 characters
>long.

We cheat. An 8-bit byte is composed of two 4-bit nibbles. Four bits cover the hex digits 0-F, so we break each byte into nibbles, convert each nibble to the ASCII representation of its hex digit, and send that. Thus the byte 3FH gets sent as two ASCII characters: "3" and "F". HOWEVER, this doubles the number of characters sent. The reasons we did it this way are buried in the dim and mystical past--somewhere before the invention of dirt.

Why not just send each 8-bit byte as-is? One possible reason would be ambiguity between data and a sync byte. On one of our new systems, the host sends a single-character trigger which causes the slave to return a packet of data--since the packet is pre-defined, it can contain anything.

Hope this helps.

Kelly

William K. Borsum, P.E. -- OEM Dataloggers and Instrumentation Systems &
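
[Editor's note: a minimal C sketch of the nibble-to-hex-ASCII scheme described above. The function name byte_to_hex and the example byte are illustrative, not from the original message.]

#include <stdio.h>

/* Split one byte into two 4-bit nibbles and emit each as the ASCII
   character for its hex digit, high nibble first. */
static void byte_to_hex(unsigned char b, char out[2])
{
    static const char hex[] = "0123456789ABCDEF";
    out[0] = hex[(b >> 4) & 0x0F];  /* high nibble */
    out[1] = hex[b & 0x0F];         /* low nibble  */
}

int main(void)
{
    char buf[3] = {0};
    byte_to_hex(0x3F, buf);  /* the 3FH example from the text */
    printf("%s\n", buf);     /* prints "3F" */
    return 0;
}

As noted, this costs two transmitted characters per data byte, but every character on the wire is printable ASCII, so a data byte can never be mistaken for a sync or trigger byte.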