Test and validate your robots.txt with this testing tool. Check whether a URL is blocked, which statement is blocking it, and for which user agent. You can also check whether the page's resources (CSS, JavaScript, images) are disallowed.
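The same allow/block check can be scripted with Python's standard-library robots.txt parser. This is a minimal sketch using an inline robots.txt and a hypothetical example.com URL, not the tool's own implementation:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt contents, for illustration only.
robots_txt = """\
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Googlebot falls back to the '*' group here, since no Googlebot-specific
# group exists in this file.
print(parser.can_fetch("Googlebot", "https://example.com/private/a.html"))  # False
print(parser.can_fetch("Googlebot", "https://example.com/index.html"))      # True
```

Note that `urllib.robotparser` does not report *which* rule matched; it only answers allowed or blocked per user agent.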



For each test, the tool reports:

robots.txt: the URL of the fetched robots.txt with its HTTP status (e.g. "200 OK"), or "Editor" when testing against the live-editor contents

URL Path: the path portion of the tested URL

Result: whether that path is allowed or blocked for the selected user agent


Sitemap check: each sitemap URL referenced in the robots.txt is listed with its HTTP status code and status text.
Resource check: each page resource (CSS, JavaScript, images) is listed with its URL, HTTP status, type, crawl result (allowed or disallowed, together with the rule that applied), host, and the robots.txt URL that governs it.
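The crawl result names the rule that applied to each resource. A rough sketch of that rule selection, assuming the RFC 9309 longest-match convention (the most specific matching rule wins, and Allow beats Disallow on a tie) and ignoring the `*` and `$` wildcard forms:

```python
def applied_rule(rules, path):
    """Pick the rule that applies to `path` under longest-match semantics.

    `rules` is a list of ("allow" | "disallow", path_prefix) pairs.
    Returns the winning pair, or None when no rule matches (in which
    case the URL is allowed by default). Simplified: no wildcard support.
    """
    matches = [r for r in rules if path.startswith(r[1])]
    if not matches:
        return None
    # Longest path prefix wins; on equal length, "allow" outranks "disallow".
    return max(matches, key=lambda r: (len(r[1]), r[0] == "allow"))

# Hypothetical rule set: block /assets/ but carve out /assets/css/.
rules = [("disallow", "/assets/"), ("allow", "/assets/css/")]
print(applied_rule(rules, "/assets/css/site.css"))  # ('allow', '/assets/css/')
print(applied_rule(rules, "/assets/app.js"))        # ('disallow', '/assets/')
```

This illustrates why a stylesheet can be crawlable while the directory that contains it is blocked: the longer Allow prefix outranks the shorter Disallow.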