On Fri, 22 Jun 2007 16:14:56 -0700, James Newtons Massmind wrote:

>> You might think backing up 5 workstations every hour would be time
>> consuming and network intensive but because XXCopy has some extensive
>> differencing options the whole process only takes 6-8 minutes per hour.
>>
>> It's awfully nice having automatic backups of everything back to the
>> last hour, day, week or month when you have one of those "oops" moments.
>> Obviously having a rotation of 4 weekly off-site backups is a good
>> thing too.
>
> Does it have an option to store (not transmit, but store) only the
> difference between an original version of the file and the current
> version? E.g. if there is hugefile.dat and I copy it over once a month,
> then every day copy over hugefile.dat.200706##.dif so that I have the
> ability to restore to any point from the first to the last of the month
> and the space requirement is not sizeof(hugefile.dat)*31 ?

I don't believe so, James. XXCopy is basically a file copy program on steroids, HGH and crack combined. ;-)

However, the program does have some hundreds of command line options, so you might want to download the user manual (which is refreshingly complete and thorough, unusual for software these days!) and see if that feature might be hidden away somewhere.

I don't have many (any?) huge data files that change frequently, and my backup storage requirements are fairly stable; the workstations and servers only add a little data each day, plus an occasional new application. I looked at the total size of the data stored on my systems and doubled it to size my backup drives. They are currently at about 60% usage.

The one computer I do have with massive data on it is an audio workstation in my basement lab. I do audio editing on it and also store/serve all my MP3 files there. I back it up separately to an external USB drive.

I supplement the regular file-based backups with periodic images of my system drives using TrueImage.
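For what it's worth, the "one full copy per month plus a small dated .dif per day" scheme James describes is simple enough to sketch. The following is purely illustrative, not anything XXCopy does: a naive block-level binary diff in Python (the function names are made up; real delta tools like xdelta or bsdiff are far smarter about moved data):

```python
# Hypothetical sketch of the "monthly full + daily diff" idea.  Only the
# 4 KB blocks that changed since the base copy get stored, so 31 daily
# diffs of a mostly-stable hugefile.dat cost far less than 31 full copies.
BLOCK = 4096

def make_diff(base: bytes, current: bytes) -> dict:
    """Return the blocks of `current` that differ from `base`."""
    diff = {"size": len(current), "blocks": {}}
    for i in range(0, len(current), BLOCK):
        if current[i:i + BLOCK] != base[i:i + BLOCK]:
            diff["blocks"][i] = current[i:i + BLOCK]
    return diff

def apply_diff(base: bytes, diff: dict) -> bytes:
    """Rebuild any day's version from the base copy plus that day's diff."""
    out = bytearray(base[:diff["size"]].ljust(diff["size"], b"\x00"))
    for i, block in diff["blocks"].items():
        out[i:i + len(block)] = block
    return bytes(out)
```

Each day's diff would then be pickled out to something like hugefile.dat.YYYYMMDD.dif, and restoring to any day is just base plus that one file.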
This allows a quick recovery from system drive failure (it's happened a time or two over the years). Most of my computers have at least two hard drives, and I generally try to keep only system-related things on my "C" drive partition, despite what Microsoft and other software vendors would have you do.

The reason I was drawn to XXCopy was that I wanted something that would make an exact copy of the files, and remove all orphan files, on every drive and directory on every system on the network. It does that really well. Mostly I'm trying to protect against data loss due to hardware failure or "oops" moments. So far my way has worked well even though it may not be optimal. It is automatic and just quietly sits there and works, and that was also a major consideration for me.

My "servers" are really nothing more than stripped-down installations of Windows 2000 workstation. Configured like that, I've found Win2K to be *extremely* robust. My main server has over 6 months of uptime right now. My workstations are configured similarly, and about the only time I have to reboot them is when I install software that insists on rebooting. My network is a simple peer-to-peer network, and it's all I need.

Every other backup program I've tried (Novaback and Retrospect mainly) had problems for me. Novaback, while easy to set up and use, would only use its own scheduler, and it would occasionally crash. It also puked its database once it got above about 50 GB of backup data. I'm not sure I ever got a reliable restore from it once I started backing up my whole network. Retrospect I found really cumbersome to set up and restore from, though it does have extensive backup options (including several different ways of doing differential backups).

Again, I really wanted something that just kept a separate copy of everything I do and can restore simply and quickly. Frankly, I found commercial backup software to be woefully inadequate and unreliable. I bought and demo'd many packages and they all had problems.
They were aimed either at single-workstation backups or at full domain networks with hundreds of workstations ($$$$); anything simple, robust, and cost-effective for a small workgroup was pretty much non-existent. That's why I ended up rolling my own.

Matt Pobursky
Maximum Performance Systems

--
http://www.piclist.com PIC/SX FAQ & list archive
View/change your membership options at
http://mailman.mit.edu/mailman/listinfo/piclist