Analyze any website's robots.txt file. View parsed rules, sitemaps, issues, and test if specific paths are crawlable.
Test whether a specific path is allowed or blocked for a given user-agent.
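
To reproduce this kind of check locally, Python's standard-library `urllib.robotparser` covers the basics. This is a minimal sketch, not this tool's implementation; the site URL, paths, and user-agent strings below are illustrative assumptions.

```python
from urllib.robotparser import RobotFileParser

# Point the parser at the target site's robots.txt (assumed URL).
parser = RobotFileParser()
parser.set_url("https://example.com/robots.txt")
parser.read()  # fetch and parse the file

# List any Sitemap URLs declared in the file (Python 3.8+;
# returns None if no Sitemap lines are present).
print(parser.site_maps())

# Test whether specific paths are crawlable for a given user-agent.
print(parser.can_fetch("Googlebot", "https://example.com/private/page"))
print(parser.can_fetch("*", "https://example.com/blog/post"))
```

`can_fetch()` applies the Allow/Disallow rules that match the given user-agent, falling back to the `*` group when no specific group applies, which mirrors the allowed/blocked check described above.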