# check_robots_txt
Analyze a website's robots.txt file to identify allowed and disallowed paths, sitemap references, and crawl-delay settings for SEO optimization.
## Instructions
Checks and analyzes a site's robots.txt file, reporting which paths are allowed or disallowed, which sitemaps are referenced, and any crawl-delay settings.
## Input Schema
| Name | Required | Description | Default |
|---|---|---|---|
| url | Yes | Website URL or domain to check robots.txt | |
## Output Schema
No structured output fields are defined for this tool.
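As a rough sketch of what such an analysis involves, Python's standard-library `urllib.robotparser` can parse a robots.txt body and answer the same questions this tool addresses. The sample file, the `analyze_robots` helper, and the `/admin/` path below are illustrative assumptions, not part of the tool itself:

```python
from urllib.robotparser import RobotFileParser

def analyze_robots(robots_txt: str, base_url: str = "https://example.com") -> dict:
    """Parse robots.txt content and summarize allow/deny rules,
    crawl delay, and referenced sitemaps for the wildcard agent."""
    rp = RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return {
        # Is the site root crawlable for any user agent?
        "can_fetch_root": rp.can_fetch("*", base_url + "/"),
        # Example of a specific path check (hypothetical path)
        "admin_blocked": not rp.can_fetch("*", base_url + "/admin/"),
        # Crawl-delay for the wildcard agent, or None if unset
        "crawl_delay": rp.crawl_delay("*"),
        # List of Sitemap: URLs, or None if none declared
        "sitemaps": rp.site_maps(),
    }

# Illustrative robots.txt content
sample = """\
User-agent: *
Disallow: /admin/
Crawl-delay: 5
Sitemap: https://example.com/sitemap.xml
"""

result = analyze_robots(sample)
```

A real implementation would first fetch `https://<domain>/robots.txt` over the network and then feed the response body into the parser as above; `RobotFileParser` can also fetch directly via `set_url()` and `read()`.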