Commit 44073bb

Some fixes
1 parent a49bddf commit 44073bb

File tree: 1 file changed, +2 -2 lines changed


src/content/docs/support/troubleshooting/general-troubleshooting/troubleshooting-crawl-errors.mdx

Lines changed: 2 additions & 2 deletions
```diff
@@ -41,9 +41,9 @@ Review the following recommendations to prevent crawler errors:
 * Do not block the United States via [custom rules](/waf/custom-rules/) or [IP Access rules](/waf/tools/ip-access-rules/).
 
-* Do not block User-Agents in your `.htaccess` file, server configuration, [`robots.txt`](http://support.google.com/webmasters/bin/answer.py?answer=35303), or web application.
+* Do not block Google User-Agents in your `.htaccess` file, server configuration, [`robots.txt`](http://support.google.com/webmasters/bin/answer.py?answer=35303), or web application.
 
-  Google uses a [variety of User-Agents](https://support.google.com/webmasters/answer/1061943) to crawl your website. You can [test your `robots.txt` via Google](https://support.google.com/webmasters/answer/6062598?hl=en).
+  Google uses a [variety of User-Agents](https://developers.google.com/search/docs/crawling-indexing/overview-google-crawlers) to crawl your website. You can [test your `robots.txt` via Google](https://support.google.com/webmasters/answer/6062598?hl=en).
 
 * Do not allow crawling of files in the `/cdn-cgi/` directory. This path is used internally by Cloudflare and Google encounters errors when crawling it. Disallow crawls of `cdn-cgi` via `robots.txt`:
```
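The final bullet in the hunk ends with a colon, but the `robots.txt` snippet it introduces falls outside this diff's context lines. A minimal sketch of such a rule, using the standard Robots Exclusion Protocol directives (the exact block in the docs file may differ), would be:

```txt
# Hypothetical robots.txt fragment: block all crawlers
# from Cloudflare's internal /cdn-cgi/ path.
User-agent: *
Disallow: /cdn-cgi/
```

`User-agent: *` applies the rule to every crawler, and `Disallow: /cdn-cgi/` excludes the directory and everything beneath it.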
