Thanks to all who responded, it's helped me quite a bit. I did get one solution working last night - enough that it might be the one (one line, wrapped):

7za.exe a -v4g -scsWIN -t7z "K:\gz\Backup of Profile Untitled made on Thursday, 12-Jun-2008.7z" -mx=5 -w"c:\TEMP\" @"K:\gz\Backup of Profile Untitled made on Thursday, 12-Jun-2008~.txt"

That's 7-Zip, telling it to split at 4GB, use the 7z format, work in the temp dir, and take the file list from the .txt file. I used a program called Abakt, essentially a front end that lets you pick the files and dirs you want done, builds the command line, and makes the file list from that. Its command-line params don't include splitting on the 7z side (they do on the zip side), so I added that. 7-Zip also needed to be told which character set my dirs were using... I picked Win (from Win, DOS, Unicode), and it still had about a dozen cases where it said it couldn't find a listed file, but the file IS there on the disk. That will need some looking into.

It isn't perfect yet, and as I write I'm seeing if I can get anything out of these 4GB files... ;) It takes a while - maybe 4GB isn't the right size for each. I also don't have a way to decide which archive to look into, so selective restore will be a pain.

The \t=tab idea has merit. I did see the dir name split one time, but I don't remember which time. I do remember using \\ to escape the backslashes on one try, but maybe I need to try all the variations, \\ included, again.

I'll have to look at rsync. I did years ago and the Windows support/apps just weren't reliable - I remember a lot of hangs. Maybe it's better now, or maybe I'll be better at it. ;) Trying to use partly ported Unix apps could be the problem, but I tried it for two reasons. I used to use Unix a lot, even though I forget most of it now, and it seems better suited to the task here - if I can get it to work. The other is that I deal with a lot of executive-level folks who challenge me to defend this-or-that about Linux vs Windows (I must be the Linux whipping boy). My approach to that is multi-tiered, and sometimes using a Unix tool in Windows carries a lot of weight - especially with someone not willing to take a close look at Linux itself. So an elegant tar-by-directory solution, without making it look complex, carries some 'opinion weight' potential, or is at least the 'thin end of the wedge'.

Winzip/zip is limited to 65k files, so it fails early. Last night's stats appear to be 61GB and 250k+ files compressed to 21GB and 6 files. Decompressing may be a problem though.

And lastly, I agree that it's a shared problem of quotes and spaces, but it's really the spaces being allowed that started it all. I could maybe accept spaces in file and dir names - since I refrain from using them myself, I'd be OK - but there are so many Microsoft-defined dirs that have them that having to rework all the names before backup breaks any sort of elegant solution. I believe they did this to offer convenience to novices AND frustrate any Unix cross-pollination. I'm not sure which was the steak and which was the gravy though! ;)

So far, no luck on decompressing with 7-Zip Portable... it may be too ugly after all.
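In case it helps with the selective-restore side once the decompress issue is sorted out: as far as I can tell from the 7-Zip docs, with a split set you always point 7za at the first volume (the .7z.001 file) and keep the rest of the volumes in the same directory, so you never have to guess which 4GB piece a file landed in. A sketch only, untested against this archive - the index.txt name, the C:\restore dir, and the example file/wildcard are made up, only the K:\gz paths come from the command above:

  REM List the whole set and save a searchable index of what's inside.
  7za.exe l "K:\gz\Backup of Profile Untitled made on Thursday, 12-Jun-2008.7z.001" > "K:\gz\index.txt"

  REM Pull out a single file (x keeps the stored path, -o sets the output dir).
  7za.exe x "K:\gz\Backup of Profile Untitled made on Thursday, 12-Jun-2008.7z.001" -o"C:\restore" "Some Dir\Some File.txt"

  REM Or everything matching a wildcard, recursing into stored subdirs.
  7za.exe x "K:\gz\Backup of Profile Untitled made on Thursday, 12-Jun-2008.7z.001" -o"C:\restore" -ir!*.pst

The catch seems to be that 7za wants the whole volume set reachable just to open the archive, so keeping that index.txt somewhere separate is probably the only way to decide what's worth restoring before shuffling 4GB files around.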
BTW, I should mention a few other methods I've tried. Since a handful of files will always be locked, a full backup will take booting with an alternative. I've tried Knoppix to just copy the files and folders, but a byte compare afterward always shows massive differences, even though the copy seemed to work fine. I don't know if I could trust a tar from it until I know more about why.

I've used Bart PE, and it's the best so far in conjunction with XXcopy. However, there's no compression, and as you can see I can get 3:1, which is a BIG help - not to mention the 'size on disk' issue of having small vs large files. AND I can do things like make the archive read-only without affecting the internal files, where doing that to a file-by-file copy would mess up the real bits.

I did get Cygwin last night too, but haven't installed it. I remember on previous machines, where I use apps that bundle cygwin.dll, there would be conflicts after installing the latest Cygwin. I'll have to contain that and see what happens.

So far, it looks like there isn't a nice way to list and select a file to get out of these big 7z files - just dump them all out. Not a good thing, so I'm still looking. From a process point of view, the top-level folders compressed into a single file each is still the best idea, and preferably in something Winzip or such can read and index. If anyone has any more ideas, I'm still not done. Maybe using FOR with a command-line Winzip (a rough sketch of that idea is below the quoted message)... I may need therapy after all this... ;)

Thanks for all the ideas so far.

-Skip

Tamas Rudnai wrote:
> Hi,
>
> Just woke up in the morning, and realized what could be the problem. For
> some reason the GNU tar+gzip are parsing the parameters, so you can use the
> C string like backslash encodings. Therefore the '\t' is a tab character...
> Therefore the "d:\Temp.install\*.*" will be seen as "d:\ emp.install\*.*" for
> tar, as '\T' from the "\Temp" will be replaced by a tabulator... Weird and also
> sad that when Mr Bill Gates took over the directory handling from Unix and
> put it into the CP/M they reversed the slash (as slash was the parameter
> separator on CP/M)...
>
> Anyway, thanks god if you start using normal slashes it uses slashes for the
> entire path, plus thanks god XP and Vista can handle slashes too. So this
> should work:
>
> C:\>FOR /D %G IN (*.*) DO tar cv "C:/%G/*.*" | gzip > "BACKUP_%G.tgz"
>
> BTW: Did you try WinRar? I am using it for huge archives for backing up the
> malware and threat samples and so far had no problem with that, plus it can
> split up the archive into a given size.
>
> Tamas
>
> On Wed, Jun 11, 2008 at 9:52 PM, Dr Skip wrote:
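Following up on the FOR-with-a-command-line-archiver idea (and Tamas's FOR /D loop above), here's a rough sketch of the one-archive-per-top-level-folder version, using 7za instead of Winzip since that's what's already working here. Untested, and the paths (C:\Documents and Settings as the source tree, K:\gz as the destination, C:\TEMP as the work dir) are just carried over from the earlier command - adjust to taste. In a .bat file the FOR variable needs %%; typed straight at the prompt it would be a single %.

  @echo off
  REM folders-to-7z.bat (name made up) - one .7z per top-level folder.
  REM Assumes 7za.exe is on the PATH.
  cd /d "C:\Documents and Settings"
  REM Quoting "%%G" keeps folder names with spaces in one piece.
  for /d %%G in (*) do 7za.exe a -t7z -mx=5 -w"C:\TEMP\" "K:\gz\%%G.7z" "%%G"

Each folder's archive then lists and restores on its own, and adding -v4g to the 7za line would split any single folder that still blows past 4GB. Swapping -t7z for -tzip should give something Winzip can open and index directly, at the cost of the compression ratio (and the old 65k-file limit could come back, though per-folder it may not bite).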