> For a while I had some code that would detect the Google spider and
> simply disable that stuff. But I noticed that every new page I put up
> would work... then about a week or two later that log crap would show up
> again.

It's easier just to go to their webmaster site and put in a request for them not to index your site, if that's the goal. They also honor the robots.txt file just fine (unlike some unscrupulous, crappy search engines). In fact, I threw up a robots.txt file to stop them and others from indexing the site while it was on low bandwidth, and their webmaster tools still show it as blocked even though the robots.txt was removed a month ago... I don't care, but it's worth pointing out how "effectively" they follow that file.

-- 
Nate Duehr, nate@natetech.com

-- 
http://www.piclist.com PIC/SX FAQ & list archive
View/change your membership options at
http://mailman.mit.edu/mailman/listinfo/piclist
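For reference, a site-wide robots.txt like the one described would be just two lines — this is a generic sketch of the standard "block everything for all crawlers" rule, not necessarily the exact file the author used:

```
# Ask all well-behaved crawlers (Googlebot included) not to index anything
User-agent: *
Disallow: /
```

The file must live at the site root (e.g. /robots.txt). Note that, as the post implies, compliance is voluntary: reputable crawlers honor it, but nothing forces a crawler to, and removal of the file doesn't guarantee prompt re-crawling.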