At 10:27 PM 9/6/99 +0100, you wrote:

>Thanks for your thoughts, Kelly. You raise an interesting point:
>
>> Why not just send an 8-bit byte as the ASCII equivalent?
>> One possible reason would be ambiguity between data and a sync byte.
>
>That's the reason, in a nutshell. My current implementation sends the
>pure binary as a "packet" of six bytes (topped and tailed with the usual
>start and stop bits, of course). Trouble is, how do you tell when you've
>got to the end of one packet and the start of the next? You can't
>implement a "framing" byte because it isn't possible to assign a unique
>bit pattern to it. The real data can contain all 256 possible values.
>
>At the moment I'm cheating: I'm relying on the fact that there is always a
>short interval of time between packets. This means I have to implement a
>"timeout" in the receiving software. This works fine in NT, which is
>almost-real-time multitasking, but under W95 the software fails
>intermittently when the operating system is away doing something else.
>
>Hence the decision to convert the whole lot to an ASCII string. Then I
>can use a null, or even a carriage return, to signify the end of a string.
>It also makes the data file that gets recorded on the PC human readable.
>
>The hex approach is attractive. It means I can stick to ASCII characters
>and the conversion becomes trivial: just a sixteen-entry look-up table.
>I'll have to think about how to sort it out in the receiving software.
>Shouldn't be too difficult.

It's not. Extract the pair of ASCII characters that form a byte with the
usual Mid$(string, x, 2), then derive the value of the hex string:
(value of nibble 1) * 16 + (value of nibble 2). VBasic has a bunch of
direct conversions which we put into functions to do all this
automatically--we just passed in a string of two characters and got back
an integer, and an ASCII character corresponding to the integer.

Kelly

William K. Borsum, P.E. -- OEM Dataloggers and Instrumentation Systems &
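
P.S. For reference, here is a minimal sketch of that decoding in classic VB
(VB5/VB6-era syntax assumed; the function names are just for illustration --
the "&H" prefix is one of the direct conversions mentioned above):

    ' Convert a two-character hex string, e.g. "3F", to its byte value.
    ' Val("&H" & Pair) gives the same result as (nibble 1) * 16 + (nibble 2).
    Function HexPairToByte(ByVal Pair As String) As Integer
        HexPairToByte = Val("&H" & Pair)
    End Function

    ' Convert the same pair to the corresponding ASCII character.
    Function HexPairToChar(ByVal Pair As String) As String
        HexPairToChar = Chr$(HexPairToByte(Pair))
    End Function

    ' Walk a received string two characters at a time with Mid$().
    Sub DecodePacket(ByVal Packet As String)
        Dim i As Integer, b As Integer
        For i = 1 To Len(Packet) - 1 Step 2
            b = HexPairToByte(Mid$(Packet, i, 2))
            Debug.Print b, Chr$(b)
        Next i
    End Sub

Calling DecodePacket("0A3CFF") would print 10, 60 and 255 along with their
characters; the carriage-return (or null) terminator can be stripped off
before the loop runs.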