The older tool from Canon (it was commercial ware, but in the deep-discount $5 bin when I got it for Win 95) did all that. Point it at a page, give it some settings for how deep to go and which domains (much like HTTrack or wget), or dirs and filespecs if any, and it fetched everything and put it in one big doc. I don't remember its file format in the end, but it could be printed to any printer, including PDF (if one had Acrobat in those days). I never got it to work on XP or NT, so I don't even know where it is now. Probably went to the thrift shop. It was very useful, but I think Canon just didn't want to be in the software biz unless it was tied to a specific hardware product of theirs.

I also thought it was such an obviously useful tool that there would be more like it as the web took off. Now I can't find anything like it, only web spiders that will recreate the site dir locally but not put it all in one doc...

Budgeting here is an odd activity - don't ask. ;) Given that the function seems so obvious for a tool like the one Canon had, and the state of the economy, et al., it just needs to get done in one's 'extra' time...

-Skip

AGSCalabrese wrote:
> I presume there will be images to include in the concatenated
> documents. How do you plan to deal with them ?
> Does one find an image URL in the HTML and grab the image, put it
> in a local folder with an updated name and updated link in the
> concatenated HTML ?
>
> Gus
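For the image step Gus asks about, here is a rough sketch of one way it could go (just an illustration in Python; the function name, the images/ folder, and the naive regex are all made up for the example, and it's certainly not how the Canon tool did it):

# Sketch: pull a page's images down next to the concatenated HTML and
# rewrite each <img src> to point at the local copy. Hypothetical helper.
import os
import re
import hashlib
from urllib.parse import urljoin
from urllib.request import urlopen

def fetch_images_locally(page_url, html, out_dir="images"):
    """Download every <img> referenced by `html` and rewrite its src."""
    os.makedirs(out_dir, exist_ok=True)

    def grab(match):
        src = match.group(2)
        abs_url = urljoin(page_url, src)          # resolve relative links
        ext = os.path.splitext(abs_url)[1] or ".img"
        # hash the URL so images from different pages get unique local names
        local_name = hashlib.md5(abs_url.encode()).hexdigest() + ext
        local_path = os.path.join(out_dir, local_name)
        if not os.path.exists(local_path):
            try:
                with urlopen(abs_url) as resp, open(local_path, "wb") as f:
                    f.write(resp.read())
            except OSError:
                return match.group(0)             # leave the link alone on failure
        return match.group(1) + local_path + match.group(3)

    # naive <img src="..."> match; a real tool would use a proper HTML parser
    return re.sub(r'(<img\b[^>]*\bsrc=")([^"]+)(")', grab, html, flags=re.IGNORECASE)

You'd run each fetched page through something like this before appending it to the big doc, so the concatenated HTML only refers to files sitting in the local folder.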