On Tue, 23 Jul 2002, Jinx wrote:

>> Maybe a little late...
>>
>> Remember the good old check of PRNG output entropy. If you
>> gather a fair amount of randomness, try to compress it with common
>> utilities, e.g. gzip. The higher the entropy, i.e. "randomness", the
>> less compressible the data is. Of course, the quality of the
>> compression algorithm is a factor, but today's best compression
>> utilities should be fairly good at detecting any kind of pattern or
>> non-randomness.
>
> I've got just a basic knowledge of compressors, but not modern
> ones. So you're saying a file made of supposedly random numbers
> probably shouldn't decrease in size? At all, or maybe just a little?
> The algorithms I used to write for compression would actually
> make a file bigger the more disparate the data, and really were
> suitable for repetitive or regular data like black & white line drawings.

If you try to compress a file with too much entropy, the file *grows*
(by the size of the compression headers at least).

Peter
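For illustration, a rough sketch of the test being discussed, assuming
Python and its standard zlib module (the same deflate algorithm behind
gzip); the filenames and sizes here are just placeholders:

  import os
  import zlib

  # High-entropy data: 4 KB straight from the OS random source.
  random_data = os.urandom(4096)
  # Low-entropy data: 4 KB of the same byte repeated.
  repetitive_data = b"\x00" * 4096

  for label, data in (("random", random_data), ("repetitive", repetitive_data)):
      compressed = zlib.compress(data, 9)   # level 9 = maximum effort
      print(label, len(data), "->", len(compressed), "bytes")

Typically the random block comes out a few bytes *larger* than 4096
(deflate has to store incompressible input as raw blocks plus its own
headers and checksum), while the repetitive block collapses to a few
dozen bytes.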