On Mon, Apr 28, 2014 at 3:30 PM, YES NOPE9 wrote:
> My question has to do with preventing slow deterioration of the data stored. I apologize if this has been discussed and I missed it.
>
> There are many ways to keep multiple copies of important data. How does one ensure that the data has not been corrupted and you are merely continuing to back up corrupt data? This may be data you do not look at for years. Is there a file system that manages data integrity with some form of checksum? (I mean checksum in the generic sense, which could include polynomials, etc.)

The simplest method is to generate MD5 or SHA hashes of all your files and verify them on a regular schedule.

I do this for my files, and I have never found corruption of the media itself ("bit rot"), but I did have an interesting experience moving data from a bunch of bare IDE disks onto a NAS I built a while back. Because the NAS had no free IDE ports, I used USB-IDE dongles to copy the data onto the server. I was getting quite a number of hash failures on the target, and it took me *quite* a while to track the problem down to flaky firmware in the USB-IDE dongle. I ended up putting the drive in a desktop and copying the files over the network instead, and then everything worked fine.

-p.

--
http://www.piclist.com/techref/piclist PIC/SX FAQ & list archive
View/change your membership options at
http://mailman.mit.edu/mailman/listinfo/piclist
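In case it helps, here is a rough sketch of the hash-manifest idea in Python. This is just one way to do it (the function names are mine, and I picked SHA-256; MD5 or SHA-1 would work the same way for detecting bit rot, though not against deliberate tampering):

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path, chunk_size: int = 1 << 20) -> str:
    """Hash a file in 1 MiB chunks so large archives never load into memory."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        while block := f.read(chunk_size):
            h.update(block)
    return h.hexdigest()

def build_manifest(root: Path) -> dict[str, str]:
    """Map each regular file under root (relative path) to its hash."""
    return {str(p.relative_to(root)): sha256_of(p)
            for p in sorted(root.rglob("*")) if p.is_file()}

def verify(root: Path, manifest: dict[str, str]) -> list[str]:
    """Return the paths whose current hash no longer matches the manifest."""
    return [rel for rel, digest in manifest.items()
            if sha256_of(root / rel) != digest]
```

You would run build_manifest() once after a known-good copy, store the result somewhere safe, and run verify() periodically; anything it returns should be restored from another copy. The command-line equivalent is keeping a checksum file and re-checking it with the hash tool's verify mode.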