19 Oct 2004
This script will parse robots.txt files and store
a hash of forbidden paths. This is very useful for
webbots or spiders of any kind (at least if obeying
the robots exclusion standard is desirable).
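
Below is a minimal Python sketch of that idea (illustrative only; it is
not the original script, and the names parse_robots and is_forbidden are
invented for this example). It fetches a site's /robots.txt, collects the
Disallow rules into a dict keyed by user-agent, and provides a small
helper to test whether a given path is forbidden for a given agent.

    from urllib.request import urlopen
    from urllib.parse import urljoin

    def parse_robots(site_url):
        """Collect Disallow rules from a site's robots.txt into a dict
        mapping each user-agent to its list of forbidden path prefixes."""
        forbidden = {}
        agents = []        # user-agents of the record currently being read
        in_rules = False   # True once a Disallow line has been seen for this record
        with urlopen(urljoin(site_url, "/robots.txt")) as resp:
            text = resp.read().decode("utf-8", errors="replace")
        for raw in text.splitlines():
            line = raw.split("#", 1)[0].strip()       # drop comments and whitespace
            if not line:
                agents, in_rules = [], False          # a blank line ends a record
                continue
            if ":" not in line:
                continue
            field, value = (p.strip() for p in line.split(":", 1))
            field = field.lower()
            if field == "user-agent":
                if in_rules:                          # a new record is starting
                    agents, in_rules = [], False
                agents.append(value)
                forbidden.setdefault(value, [])
            elif field == "disallow":
                in_rules = True
                if value:                             # an empty Disallow forbids nothing
                    for agent in agents:
                        forbidden[agent].append(value)
        return forbidden

    def is_forbidden(forbidden, path, agent="*"):
        """Check a path against the rules for the given user-agent (plus '*')."""
        rules = forbidden.get(agent, []) + forbidden.get("*", [])
        return any(path.startswith(prefix) for prefix in rules)

For example, parse_robots("http://www.htdig.org/") would return that
site's rule table, and is_forbidden(rules, "/some/path") checks a path
against it before the spider fetches it.
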
I looked at the script in the ht://dig package (http://www.htdig.org/)
that does the same thing as this script, to see if I [...]