> I have pretty good success with a program called wget. The
> program has lots of command line switches, but it isn't too
> hard to figure out. Some sites expect the HTTP request to
> contain user agent and referrer information, but with wget you
> can fool those sites pretty easily. ;)

Grumble, grumble. Yes, you can... But I... Err... I mean, site hosts can
still look for patterns in the way the links are followed, the speed with
which pages are requested, and the number of pages requested over time. I
do all of those: patterns and speed are checked automatically, and page
count and patterns are also checked manually when the server notices more
activity than expected.

If you have to rip a site, wget is one of the better tools for the job. It
can be set to wait a while between requests, which I very much suggest you
do to avoid locking up the server.

---
James Newton: PICList webmaster/Admin
mailto:jamesnewton@piclist.com  1-619-652-0593 phone
http://www.piclist.com/member/JMN-EFP-786
PIC/PICList FAQ: http://www.piclist.com
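
[For reference, a sketch of the kind of polite invocation described above,
using standard GNU Wget options. The URL, user agent string, wait time, and
rate limit are illustrative placeholders, not values from the original post.]

  # Mirror a site politely: pause between requests and throttle bandwidth
  # (--wait, --random-wait, --limit-rate), supply the user agent and
  # referrer headers the quoted poster mentions (--user-agent, --referer),
  # and stay within the starting directory (--no-parent).
  # All values below are placeholders.
  wget --mirror --no-parent \
       --wait=5 --random-wait --limit-rate=50k \
       --user-agent="Mozilla/5.0 (compatible; example)" \
       --referer="http://www.example.com/" \
       http://www.example.com/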