Alan B. Pearce wrote:
>
> You mean the way IE8 will save a web page as a .mht file, with pictures and
> all in it? I don't know how compressed it is.
>

Not sure about IE8, but Firefox with addons will do that, though only a page
at a time. There are other addons that suck down whole web sites or
directories too, but they don't create one file with all the pages
concatenated together.

> As Tamas has mentioned, this can be done with CHMs in Windows. I think the
> HTMLHelp tool from Microsoft compiles these - but I've only ever used that
> utility as part of the Sandcastle toolchain (auto-documents .NET XML commented
> code), so don't know how flexible it is. You'll need to grab it from the MS
> website - there's two of them IIRC, version 1 and version 2. Hope this
> helps.
>
> Regards,
>
> Pete Restall

I'll try to take a look at this, but I suspect it will be: view a page, save
the page, give the page a name, go to the next page and do the same, and so
on. I may never be heard from again.... :-O

Optimally, I need something that will cat a directory's worth of HTML files,
with some limited intelligence to strip out headers, metadata and such, so
the whole lot ends up as one readable file. Maybe with a page break between
what used to be the individual pages. Even a command-line tool would do. I'm
no Perl expert, but I think Perl would be well suited to this (though it's
beyond my abilities these days). It would have to incorporate a fair amount
of HTML knowledge, though, to selectively strip out that stuff as it wrote
everything into one big HTML document...

-Skip

--
http://www.piclist.com PIC/SX FAQ & list archive
View/change your membership options at
http://mailman.mit.edu/mailman/listinfo/piclist
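[Editor's note] The tool Skip describes (cat a directory of HTML files, keep only each page's body, insert a page break between what used to be separate pages) can be sketched in a few lines. This is a rough illustration in Python rather than the Perl mentioned above; the function names, the output layout, and the regex-based body extraction are all assumptions, and a real tool would want a proper HTML parser rather than a regex.

```python
import re
from pathlib import Path

def extract_body(html: str) -> str:
    """Return the inner content of <body>...</body>; if no body tag is
    found, fall back to the whole document (crude, but fine for a sketch)."""
    m = re.search(r"<body[^>]*>(.*?)</body>", html, re.IGNORECASE | re.DOTALL)
    return m.group(1) if m else html

def concat_pages(src_dir: str, out_file: str) -> None:
    """Concatenate every .htm/.html file in src_dir into one document,
    stripping each page's <head> (headers, metadata) and forcing a print
    page break between what used to be the individual pages."""
    parts = ["<html><head><title>Combined pages</title></head><body>"]
    for page in sorted(Path(src_dir).glob("*.htm*")):
        parts.append('<div style="page-break-before: always;">')
        parts.append(extract_body(page.read_text(errors="replace")))
        parts.append("</div>")
    parts.append("</body></html>")
    Path(out_file).write_text("\n".join(parts))
```

Something like `concat_pages("site", "combined.html")` would then give one file a browser can read straight through; the page-break-before style only takes effect when printing.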