Commit fe8794a

Update src/content/changelog/ai-crawl-control/2025-10-21-track-robots-txt.mdx
Parent: 073eadb

File tree

1 file changed: +1 −1 lines changed


src/content/changelog/ai-crawl-control/2025-10-21-track-robots-txt.mdx

Lines changed: 1 addition & 1 deletion
@@ -11,7 +11,7 @@ AI Crawl Control now includes a **Robots.txt** tab that provides insights into h
 The Robots.txt tab allows you to:
 
 - Monitor the health status of `robots.txt` files across all your hostnames, including HTTP status codes, and identify hostnames that need a `robots.txt` file.
-- Track the total number of requests to each `robots.txt` file, with breakdowns of allowed versus unsuccessful requests.
+- Track the total number of requests to each `robots.txt` file, with breakdowns of successful versus unsuccessful requests.
 - Check whether your `robots.txt` files contain [Content Signals](https://contentsignals.org/) directives for AI training, search, and AI input.
 - Identify crawlers that request paths explicitly disallowed by your `robots.txt` directives, including the crawler name, operator, violated path, specific directive, and violation count.
 - Filter `robots.txt` request data by crawler, operator, category, and custom time ranges.
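The "disallowed path" violation check described in the diff can be illustrated with a minimal sketch (not part of this commit, and not how AI Crawl Control is implemented) using Python's standard-library `urllib.robotparser`; the `ExampleBot` user agent and the paths below are hypothetical:

```python
# Hypothetical sketch: parse a robots.txt body and test whether a given
# crawler user agent is allowed to fetch a given path. A crawler fetching
# a path for which this returns False would count as a directive violation.
from urllib import robotparser

def is_path_allowed(robots_txt: str, user_agent: str, path: str) -> bool:
    """Parse robots.txt text and report whether user_agent may fetch path."""
    parser = robotparser.RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return parser.can_fetch(user_agent, path)

robots_txt = """\
User-agent: ExampleBot
Disallow: /private/
"""

print(is_path_allowed(robots_txt, "ExampleBot", "/private/data"))  # False: violation
print(is_path_allowed(robots_txt, "ExampleBot", "/public/page"))   # True
```

In a real deployment the robots.txt body would be fetched over HTTP, and its status code would feed the health check the tab reports.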
