Commit 7f6ee7d

fix: update noindex example URL pattern for pages.dev
1 parent f6cecbd commit 7f6ee7d

File tree

1 file changed: +2 −2 lines changed

src/content/partials/workers/custom_headers.mdx

Lines changed: 2 additions & 2 deletions

```diff
@@ -117,14 +117,14 @@ This applies the `Access-Control-Allow-Origin` header to any incoming URL. To be
 
 [Google](https://developers.google.com/search/docs/advanced/robots/robots_meta_tag#directives) and other search engines often support the `X-Robots-Tag` header to instruct its crawlers how your website should be indexed.
 
-For example, to prevent your <code>{props.product === 'workers' ? '\*.\*.workers.dev' : '\*.pages.dev'}</code> URLs from being indexed, add the following to your `_headers` file:
+For example, to prevent your {props.product === 'workers' ? <code>\*.\*.workers.dev</code> : <><code>\*.pages.dev</code> and <code>\*.\*.pages.dev</code></>} URLs from being indexed, add the following to your `_headers` file:
 
 <Code
 	lang="txt"
 	code={
 		props.product === "workers"
 			? `https://:version.:subdomain.workers.dev/*\n\tX-Robots-Tag: noindex`
-			: `https://*.pages.dev/*\n\tX-Robots-Tag: noindex`
+			: `https://:project.pages.dev/*\n\tX-Robots-Tag: noindex\n\nhttps://:version.:project.pages.dev/*\n\tX-Robots-Tag: noindex`
 	}
 />
 
```
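For context, here is a sketch of what the new Pages branch of the template string renders to in the `_headers` file shown to readers, assuming the `\n` and `\t` escapes expand as usual (the `:project` and `:version` placeholder names come directly from the diff):

```txt
https://:project.pages.dev/*
	X-Robots-Tag: noindex

https://:version.:project.pages.dev/*
	X-Robots-Tag: noindex
```

This matches the commit's intent: the old `*.pages.dev` pattern is split into one rule for the production `pages.dev` subdomain and one for preview deployment URLs, so both are marked `noindex`.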
