Well, you do need to have some sort of involvement in the backup process. Backing up need not be a painful, all-encompassing effort that overwhelms you for weeks on end; I do sufficient backups in about 15 minutes per machine per week. But you still must involve yourself by thinking through what to back up, and then by testing your process for restoring the backup files.

I keep all my user documents in one directory, and I have a sense of which other directories contain important data. Depending on which operating system I'm using, I do the following:

* Archive the important directories to archive files using tar or WinZip. There is no need to pay for software to do this -- I suppose I could even use Java's jar tool if I wanted to. The important bit here is that archive files preserve directory structures and filenames. Notice that I say archive important *directories*. Don't waste your time hunting for specific files to archive with some silly idea of saving space on your drive or the CD. Instead, just archive entire directories, even if they contain only one or two files of interest. As a bonus, you will probably archive files you forgot about, didn't think of, or didn't realize were important.

* On a Linux system, create an ISO image of the archive files using mkisofs. On Windows, just launch the Roxio Easy CD Creator software. You can escape paying for the Roxio product if you download Cygwin, install it, and install the 'cdrecord' application.

* Write the archive ISO image to a CD.

* Really, really important files I should probably put on a disk key -- that is, a USB flash drive -- and wear the thing around my neck.

Is my system perfect? Well, I still have some learning to do on Linux filesystems. I've gotten smart and tarred up my /home directories, but I usually forget very important config files in /etc, and I end up kicking myself for it because I test a lot of Fedora Core betas. In fact I have Fedora Core 2 running on one machine right now, and it is looking great.
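The Linux side of the routine above can be sketched as a short script. The paths, the date-stamped filenames, and the burner device address are all just examples, not the One True Way -- adjust them for your own machine. (The tar step below runs on a throwaway demo directory so you can try it safely; point it at your real directories when you mean it.)

```shell
#!/bin/sh
# Sketch of the weekly archive-and-burn routine.  Example paths only.
set -e
STAMP=$(date +%Y%m%d)

# 1. Archive whole directories -- tar preserves structure and filenames.
#    (Demonstrated on a throwaway directory; use /home, /etc, etc. for real.)
mkdir -p /tmp/demo-docs
echo "important stuff" > /tmp/demo-docs/notes.txt
tar czf "/tmp/docs-$STAMP.tar.gz" -C /tmp demo-docs

# 2. Roll the archive(s) into an ISO image (requires mkisofs):
#      mkisofs -r -J -o /tmp/backup-$STAMP.iso /tmp/docs-$STAMP.tar.gz
# 3. Burn the image (device address varies; run `cdrecord -scanbus` first):
#      cdrecord dev=0,0,0 /tmp/backup-$STAMP.iso

# 4. Test the restore path -- at minimum, list the archive's contents.
tar tzf "/tmp/docs-$STAMP.tar.gz"
```

Step 4 is the part people skip: an archive you have never listed or test-extracted is not a backup, it's a hope.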
That machine has PHP5 RC1, MySQL 4.1.1, and so on, all installed and running. But I need to remember to tar up /etc before I overwrite it with the next beta.

Once you are practiced at backing up your data, it should only take 10-15 minutes once a week or so. I'm assuming that you only need to think of yourself and one or two machines at most, and that your needs are pretty basic and simple. If your responsibilities exceed that, you should do some serious book reading. "Unix Backup & Recovery" by Curtis Preston is required reading for anyone doing backups on large Unix networks. Be sure to visit his web site for updates, because the book is a bit old now, but it is still well worth buying and reading. I'm not sure whether Preston or anyone else has written something equivalent for Windows machines; Google and Amazon are your friends.

Bob Cochran

Robert Ussery wrote:
> Hi, Folks.
>
> After losing some pretty important data recently, I decided it's finally
> time to get some backup/version control software. I googled around a bit,
> and found that there're hundreds of options out there. Here are my basic
> requirements:
>
> - Periodic backup of selected files and folders, preferably to DVD
>
> - Version control, so that each backup doesn't overwrite earlier backups
>
> - Preferably invisible operation (i.e., I don't want to have to take part in
> the process; basically just leave my computer on and let it back itself up
> once in a while)
>
> - Free! (I'm just a hobbyist/home user - no need for anything fancy or
> enterprise level!)
>
> Do any of you have any suggestions? With the plethora of options, I don't
> really know if there are any really standard/prevalent or superior packages
> out there. Thanks!
>
> - Robert
>
> --
> http://www.piclist.com#nomail Going offline? Don't AutoReply us!
> email listserv@mitvma.mit.edu with SET PICList DIGEST in the body

--
Bob Cochran
Greenbelt, Maryland, USA
http://greenbeltcomputer.biz/

--
http://www.piclist.com hint: To leave the PICList
mailto:piclist-unsubscribe-request@mitvma.mit.edu