robotstxt (version 0.7.15)
A 'robots.txt' Parser and 'Webbot'/'Spider'/'Crawler'
Permissions Checker
Description
Provides functions to download and parse 'robots.txt' files.
Ultimately, the package makes it easy to check whether bots
(spiders, crawlers, scrapers, ...) are allowed to access specific
resources on a domain.
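A minimal usage sketch of the permission check described above, assuming the package's `paths_allowed()` helper; the domain and path used here are purely illustrative:

```r
# Check whether a bot may fetch a given path on a domain.
# Requires internet access: the domain's robots.txt is downloaded
# and parsed, then the path is checked against its rules.
library(robotstxt)

paths_allowed(
  paths  = "/images/",      # illustrative path
  domain = "example.com",   # illustrative domain
  bot    = "*"              # user agent to check; "*" matches any bot
)
# Returns a logical: TRUE if access is permitted, FALSE otherwise.
```

The raw rules can also be inspected directly by downloading the file first (e.g. with the package's `get_robotstxt()` function) before deciding how a crawler should behave.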