| Requests by AI crawlers | A graph displaying the number of crawl requests from each AI crawler |
| Summary | A list of the AI crawlers with the highest number of crawl requests |
| Most popular paths by AI crawlers | The most popular pages crawled by AI crawlers, for each AI crawler |
The **Summary** table also enables you to [Enforce your robots.txt](/ai-audit/features/enforce-robots-txt/).
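As an illustration, a `robots.txt` file that restricts AI crawlers might look like the sketch below. GPTBot and CCBot are real, commonly seen AI crawler user agents, but they are used here only as examples; which crawlers you list and which paths you disallow depend on your site's policy.

```txt
# Example only: block one AI crawler from the whole site
User-agent: GPTBot
Disallow: /

# Example only: keep another AI crawler out of a specific path
User-agent: CCBot
Disallow: /private/
```

With enforcement enabled, requests from these crawlers to disallowed paths can be blocked rather than merely advised against.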
From the dropdown, select either **All subdomains** or the specific subdomain you want to review.
Selecting a specific subdomain allows you to access:
- **Violations only** toggle: Toggles the AI Audit page to display only bots that are violating your configured rules.
- [**Enforce robots.txt policy**](/ai-audit/features/enforce-robots-txt/): Ensure bots cannot access webpages which are off-limits, as specified in your `robots.txt` file.
The values of the AI Audit metrics will update according to your filter.
`src/content/docs/ai-audit/get-started.mdx`
To use AI Audit:
## 2. Block specific bot categories (Enterprise plan only)
Customers on the Enterprise plan with a [Bot Management subscription](/bots/plans/bm-subscription/) can choose to block only specific AI crawlers, while allowing others.