Update Fetch Metadata positioning #1875
Conversation
Extended core guidance to mention Fetch Metadata request headers (Sec-Fetch-*) as an alternative to CSRF tokens for state-changing requests. Added clarification that developers can use CSRF tokens or Fetch Metadata depending on project needs and client compatibility.
Updated Fetch Metadata positioning
removed old Fetch Metadata section
Fix typos and markdown issues
fix heading increment
Thanks for taking the lead on this! You'll definitely want a review by someone more knowledgeable (e.g. @FiloSottile), but here's some feedback. I believe it was agreed in #1803 that "Fetch metadata is a complete and robust fix for CSRF, not just defense in depth." However, this PR reads as unnecessarily precautionary.
Hi, @nickchomey! Thanks a lot for the thorough review and for the pointers; here are some thoughts:
You’re right — that’s on me. The phrasing ended up sounding more precautionary than intended; it reflects my personal stance on the topic.
I moved the Fetch Metadata section because we’re proposing it be treated as one of the primary mitigations. If we leave it buried, readers may miss that change.
I understand your point — my intent was to reflect the current state of adoption and confidence rather than to imply it’s inherently secondary. As I mentioned in the issue discussion, Fetch Metadata is still relatively new compared to traditional CSRF mitigations. While it’s an elegant and highly effective mechanism for modern browsers, it hasn’t yet gone through the same long cycle of real-world testing, widespread deployment, and maturity that CSRF tokens have. In security, mechanisms often take time to earn broad trust — a good parallel might be the SameSite cookie attribute, which was proposed years before browsers began enforcing it by default and before most frameworks integrated it as a standard protection. That said, I’m open to rephrasing. Which parts stand out to you most?
My point wasn’t to question the coverage but rather to acknowledge that, as a community, we can’t dictate every project’s goals or constraints. If the software targets modern browsers, then Fetch Metadata is clearly the way to go. However, as we discussed, for environments where these headers aren’t yet supported, developers will still need to rely on CSRF tokens — even if that feels less practical.
Agree
Regarding the note on blocking legitimate CORS or third-party flows — I intentionally kept it, just to highlight that these areas need extra care, since enabling Fetch Metadata protections without considering such flows could break them. The relevant mitigations are already covered in the 3.2 (“How to treat Fetch Metadata headers on the server-side”) section.
Good catch! I hadn’t considered mentioning it explicitly, but that’s a great recommendation and would indeed address the “potentially trustworthy” issue as well. I’ll incorporate that.
I can't help but think that your response was written almost completely by AI... It makes me wary of collaborating any further on this, as it's not clear to me that a thoughtful human is actually on the other side... Perhaps incorporate my suggestions into a new commit and we can see where we're at then.
I don’t like where this is headed, especially with things starting to feel a bit personal. It feels like the focus is mainly on getting Fetch Metadata headers recognized as the primary CSRF protection and moving away from CSRF tokens entirely, which doesn’t sit well with me. That’s just my personal view, but I worry it could lead to CSRF tokens falling out of use altogether, since most developers would naturally choose the simpler Fetch Metadata approach. I get your point though, and as I mentioned in my previous comment, this section reflects my own perspective. It’s admittedly an awkward position — I don’t have solid evidence that Fetch Metadata isn’t as robust as you’d like to present it, but at the same time, “lack of evidence doesn’t mean lack of existence” isn’t a strong argument either. I’m just not sure how to balance both sides — treating Fetch Metadata as “the best” while still keeping CSRF tokens relevant.
Feel free to open your own PR — I’m not comfortable positioning Fetch Metadata as THE MAIN CSRF defense.
These last 2 comments, and those from the previous issue, have no hint of AI. I don't think anyone was ever advocating for fetch metadata being THE primary CSRF protection. Just that it is suitable as A primary/standalone protection, if a few niche caveats and associated mitigations are presented.
Though, this is precisely what people are ultimately advocating for - CSRF tokens are a headache (and therefore much more likely to be implemented poorly due to human error). They also make caching extremely difficult. I see no conflict or even issue with effectively saying "these are two worthy options. Take your pick". Most would, indeed, eventually pick fetch metadata, and tokens would eventually be forgotten. What's wrong with that? That's just another example of good technical progress for the web platform.
I’m not a native speaker, so in the first comment I was just trying to politely make my point. When I saw you didn’t like my style, I just dropped it :)
I know, and that makes perfect sense. I was just trying to play it safe and leave room for discussion. Let’s do it this way: I’ll fix the language according to the suggestions in your first message, add the missing bits about HSTS, keep the Metadata section at the beginning (so we don’t have to open another PR to move it later), and then ask for a review from the rest — especially the Go folks, where this all started.
Added guidance that all Fetch Metadata implementations must include a mandatory fallback to Origin/Referer verification for compatibility. Reworked the browser compatibility notes and the “Limitations and gotchas” section. Changed language to avoid undermining Fetch Metadata headers.
Hi, @nickchomey
As the founder of the cheatsheet series and project lead, I agree 100%. Fetch Metadata is a very useful first line of defense against CSRF, but it is not safe to rely on it alone for all production workloads. Use it as part of defense-in-depth. Combine it with SameSite cookies, origin checks, and per-request CSRF tokens. The spec’s notion of “user-initiated” navigation and some navigation flows (top-level navigations, prerender/prefetch, PaymentRequest-like flows) can result in values that permit requests you’d expect to be blocked; attackers can sometimes craft flows to exploit those behaviors. There is research showing odd corner cases. So let's revisit this in a year, but for now, Fetch Metadata is one defense with limitations, not a be-all defense for CSRF. Anywhere we say otherwise, I want it softened. And again, we can revisit this in a year.
artis3n left a comment:
First-time commenter, but I am very motivated by making security simple and I like the Sec-Fetch-Site pattern very much. Added some recommendations about structuring this page.
> Because some legacy browsers may not send `Sec-Fetch-*` headers, a fallback to [standard origin verification](#using-standard-headers-to-verify-origin) using the `Origin` and `Referer` headers **is a mandatory requirement** for any Fetch Metadata implementation.
>
> The Fetch Metadata request headers are:
Should this section be focused on the subset of Sec-Fetch-* headers that are used to protect against CSRF? (Just Sec-Fetch-Site and Origin/Host header validation, and link off to MDN or something: "if you'd like to learn more about what other Sec-Fetch-* headers are available for other purposes, see...")
Hi @artis3n, sorry for the wait on the review
If I understood your comment correctly, you’re suggesting narrowing this section down to just Sec-Fetch-Site and Origin/Host header validation. I agree this would simplify things, but my thinking was that it’s important to keep the broader context of Fetch Metadata headers, since they work as a family and the spec encourages using them together.
> and link off to MDN or something

We already link out to the W3C spec for readers who want deeper detail.
Maybe you can draft a short suggestion of what you’d like this section to look like?
Also, there are examples of Sec-Fetch-Mode, Sec-Fetch-Dest, and Sec-Fetch-User further down in the section.
Because of that, I think it makes sense to keep the broader description here, so all the related information stays in one place and developers don’t need to jump between different resources to understand how the full set of headers works together.
That makes sense if we want to keep them co-located. I do think we should distinguish between the headers you need to use for CSRF protection vs. the ones you can use for additional checks. E.g. mode, dest, and user.
`mode`, `dest`, and `user` are still part of the overall Fetch Metadata–based CSRF protection strategy; the Sec-Fetch-Site header is just the primary signal we rely on. But I get your point, I’ll try to update the wording to make that distinction clearer.
I agree that making it clear that sec-fetch-site is the primary signal would be important, so as to not overcomplicate/overwhelm people. The others can be included for reference/deep dive.
Updated
> 3.2 Whitelist explicit cross-origin flows. If certain endpoints intentionally accept cross-origin requests (CORS JSON APIs, third-party integrations, webhooks), explicitly exempt those endpoints from the global Sec-Fetch deny policy and secure them with proper CORS configuration, authentication, and logging.
>
> ### Things to consider
How about a section for Requirements / Pre-Conditions/ etc. to indicate what must already be present in order to rely on this method? (Must enforce HTTPS only or be using localhost, must not use GET requests for state-changing requests, recommended to set HSTS).
Agree, this originally started as a “things to consider” section, but during the PR review we already narrowed it down to notes about trustworthy URLs and HSTS. We can definitely rephrase it as requirements rather than optional considerations.
Regarding “must not use GET requests for state-changing requests,” there’s already a dedicated section for that, but we can briefly mention it here as well.
Updated, feel free to resolve the conversation
> For the rare cases of outdated or embedded browsers that lack `Sec-Fetch-*` support, a fallback to [standard origin verification](#using-standard-headers-to-verify-origin) should provide the required coverage. If this is acceptable for your project, consider prompting users to update their browsers, as they are running on outdated and potentially insecure versions.
>
> ### How to treat Fetch Metadata headers on the server-side
I'd recommend we update this section. Keeping it language-agnostic, I'd personally like to see the pattern outlined in https://words.filippo.io/csrf/#protecting-against-csrf-in-2025:

1. Validate the Origin header against an allowlist (for the JS example, recommend configuring it as a `new Set([domains])` for an easy `set.has(Origin)` as this validation step).
2. Check if the Sec-Fetch-Site header is present.
    a. If present, allow the request if the value is `same-origin` or `none`, else deny.
3. If the Sec-Fetch-Site and Origin headers are both missing, pass the request through.
4. Validate the Origin header against the Host header and pass the request if they match, else reject (do this at the end and not with the initial Origin validation, due to step 3's support of significantly legacy, pre-2020 browsers).

Also the pattern described, basically, in https://web.dev/articles/fetch-metadata#how_to_use_fetch_metadata_to_protect_against_cross-origin_attacks.
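For illustration only, here is a minimal Go sketch of that check order (the package name, allowlist entries, and wrapper are hypothetical, not taken from the cheat sheet or from the linked posts):

```go
// Sketch of the four-step check order above; package name, allowlist and
// handler wiring are illustrative, not part of the cheat sheet.
package csrfsketch

import "net/http"

// Hypothetical allowlist of trusted origins (step 1).
var trustedOrigins = map[string]bool{
	"https://app.example.com": true,
}

// allowRequest applies the checks in order for non-safe methods.
func allowRequest(r *http.Request) bool {
	origin := r.Header.Get("Origin")

	// 1. Origin matches the allowlist: allow.
	if trustedOrigins[origin] {
		return true
	}

	// 2. Sec-Fetch-Site present: allow only same-origin or none, else deny.
	if site := r.Header.Get("Sec-Fetch-Site"); site != "" {
		return site == "same-origin" || site == "none"
	}

	// 3. Neither Sec-Fetch-Site nor Origin: likely a non-browser client, pass through.
	if origin == "" {
		return true
	}

	// 4. Legacy browsers: compare Origin against Host as a last resort.
	return origin == "https://"+r.Host
}

// Protect wraps a handler and rejects disallowed non-safe requests.
func Protect(next http.Handler) http.Handler {
	return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		switch r.Method {
		case http.MethodGet, http.MethodHead, http.MethodOptions:
			next.ServeHTTP(w, r) // safe methods, assumed non-state-changing
			return
		}
		if !allowRequest(r) {
			http.Error(w, "cross-origin request rejected", http.StatusForbidden)
			return
		}
		next.ServeHTTP(w, r)
	})
}
```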
And I'm partial since it is the language I am usually in, but it would be nice to note with the examples that Go developers can follow the above pattern by just using https://pkg.go.dev/net/http#CrossOriginProtection in the standard library as of Go 1.25.
Perhaps similar to https://cheatsheetseries.owasp.org/cheatsheets/Cross-Site_Request_Forgery_Prevention_Cheat_Sheet.html#javascript-automatically-including-csrf-tokens-as-an-ajax-request-header, a section for Go: Use net/http CrossOriginProtection middleware with a link out to those docs.
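Such a section could be as short as the following sketch (assuming the Go 1.25 `net/http` API as documented on pkg.go.dev; the route and the extra trusted origin are invented for the example):

```go
// Hypothetical wiring of Go 1.25's net/http CrossOriginProtection; check the
// package documentation for the authoritative API and defaults.
package main

import (
	"log"
	"net/http"
)

func main() {
	mux := http.NewServeMux()
	// Made-up state-changing endpoint for the example.
	mux.HandleFunc("POST /transfer", func(w http.ResponseWriter, r *http.Request) {
		w.Write([]byte("ok"))
	})

	protection := http.NewCrossOriginProtection()
	// Optionally trust an extra origin (e.g. a separately hosted frontend).
	if err := protection.AddTrustedOrigin("https://app.example.com"); err != nil {
		log.Fatal(err)
	}

	// Non-safe cross-origin requests are rejected before reaching the mux.
	log.Fatal(http.ListenAndServe(":8080", protection.Handler(mux)))
}
```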
This PR's approach was actually based primarily on the approach outlined in https://web.dev/articles/fetch-metadata#how_to_use_fetch_metadata_to_protect_against_cross-origin_attacks; however, I intentionally changed the flow to make it deny-by-default, rather than the allow-by-default flow used in both that article and Filippo’s blog.
Both external sources propose flows that can short-circuit early and allow requests without evaluating Fetch Metadata. For example:
> If the Origin header matches an allow-list of trusted origins, allow the request.
In this example, the request never reaches any Fetch Metadata checks. That effectively makes the Origin header the primary defense.
> These requests are not from (post-2020) browsers, and can’t be affected by CSRF.
I’m not fully comfortable relying on that assumption. My interpretation is that it’s based on the earlier point that older (pre-2020) browsers sometimes omitted the Origin header on POST. The wording could also be misread as “pre-2020 browsers can’t be affected by CSRF,” which I’m sure isn’t the intended meaning.
If, as a community, we agree that Fetch Metadata is the stronger and more reliable signal in modern browsers, then it seems logical to evaluate Sec-Fetch-Site first and use standard headers (Origin/Referer/Host) only as a fallback for legacy or non-compliant clients.
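As a rough sketch of the deny-by-default ordering described here (the function and parameter names are hypothetical, not from the cheat sheet):

```go
// Deny-by-default sketch: Sec-Fetch-Site is evaluated first; standard origin
// verification is only a fallback for clients that do not send it.
package csrfsketch

import "net/http"

func allowStateChange(r *http.Request, trusted map[string]bool) bool {
	if site := r.Header.Get("Sec-Fetch-Site"); site != "" {
		// Modern browser: trust the Fetch Metadata signal.
		return site == "same-origin" || site == "none"
	}

	// Legacy or non-compliant client: fall back to standard origin verification.
	origin := r.Header.Get("Origin")
	if origin == "" {
		return false // no usable signal: deny by default
	}
	return trusted[origin] || origin == "https://"+r.Host
}
```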
> a section for Go: Use net/http CrossOriginProtection middleware with a link out to those docs.
Regarding your second suggestion, that’s a great idea. We should definitely add a Go section to the “CSRF Prevention in Modern Frameworks” area. I can write a short part for it and include a reference to the CrossOriginProtection middleware along with a link to the documentation.
While the blog post uses Origin first, the actual Go source code checks Sec-Fetch-Site first.
I have to assume this concern was raised while implementing the feature, and thus appropriately uses origin as a fallback.
Thanks for bringing this up; I went straight to the blog.
As we discussed, we’ll definitely add a reference to the Go implementation in the Frameworks section at the end of the document. Do you feel the current “### How to treat Fetch Metadata headers on the server-side” section needs any updates? @nickchomey
Added this as part of "### Use Built-In Or Existing CSRF Implementations for CSRF Protection"
Seems OK to me.
Like this, go for it.
Co-authored-by: Ari Kalfus <[email protected]>
Sorry I couldn't look at this til now. I'm generally happy with how this is taking shape, and have no qualms with making a qualified recommendation for fetch metadata.
I made a few small suggestions.
I also wonder if @jmanico's remaining concerns should be enumerated somewhere, along with possible mitigations?
- The concern about “user-initiated” navigation and prefetch/prerender flows isn’t theoretical. There are ongoing W3C TAG and WHATWG discussions about how speculative browsing contexts propagate fetch metadata. For example, prerendered pages sometimes emit Sec-Fetch-Site: same-origin even when the final navigation resolves cross-site. Chrome’s prerender2 work (tracked in Chromium #1170321 and #1261932) notes inconsistencies in how speculative requests report fetch metadata headers. PaymentRequest and similar APIs also create browser-initiated POSTs that don’t always surface Sec-Fetch-* in predictable ways. None of these are critical exploits, but they show that interpretation and propagation aren’t yet frozen.
- Regarding proxies: you’re totally right in that stripping Origin or Sec-* headers is a misconfiguration or vulnerability, but in large enterprise networks (and some healthcare/ICS environments) those conditions unfortunately frequently exist. From a standards perspective, we try to write guidance that remains safe even in sub-optimal deployments. And CSRF tokens and SameSite cookies both work well in these realistic environments.
Co-authored-by: nickchomey <[email protected]>
@nickchomey Another great idea, we should definitely add that, otherwise it’ll get lost. My first thought was to put it under the “Things to consider” subsection, but we already discussed converting that into Requirements. Now I’m thinking that at the end of the Fetch Metadata section we should have three subsections: Concerns (including @jmanico’s remaining points), Requirements (trusted URLs, localhost, HSTS), and the existing “Rollout & testing recommendations”.

That sounds fine to me.
Updated "The Fetch Metadata request headers are" part. Updated "Use Built-In Or Existing CSRF Implementations for CSRF Protection" to include reference to Go CrossOriginProtection. Updated "### Things to consider"
fix list indentation
> - **If the framework does not have built-in CSRF protection, add [CSRF tokens](#token-based-mitigation) to all state changing requests (requests that cause actions on the site) and validate them on the backend, or validate [Fetch Metadata headers](#fetch-metadata-headers) on the backend for all state-changing requests.**
> - **If your software is intended to be used only on modern browsers, you may rely on [Fetch Metadata headers](#fetch-metadata-headers) to block cross-site state-changing requests, so long as you use the fallback options detailed below.**
I don't think the 2nd half of the first bullet should have been added. It is completely redundant with the 2nd bullet (and also has a lot of repetitive wording with the rest of the 1st bullet).
Suggested change:

> - **If the framework does not have built-in CSRF protection, add [CSRF tokens](#token-based-mitigation) to all state-changing requests (requests that cause actions on the site) and validate them on the backend.**
> - **If your software is intended to be used only on modern browsers, you may rely on [Fetch Metadata headers](#fetch-metadata-headers) to block cross-site state-changing requests, so long as you use the fallback options detailed below.**
> Fetch Metadata request headers provide extra context about how an HTTP request was made, and how the resource will be used, enabling servers to reject suspicious cross-site requests. Servers can use these headers — most importantly `Sec-Fetch-Site` — as a lightweight and reliable method to block obvious cross-site requests. See the [Fetch Metadata specification](https://www.w3.org/TR/fetch-metadata/) for details.
>
> Because some legacy browsers may not send `Sec-Fetch-*` headers, a fallback to [standard origin verification](#using-standard-headers-to-verify-origin) using the `Origin` and `Referer` headers **is a mandatory requirement** for any Fetch Metadata implementation. `Sec-Fetch-*` [is supported](https://caniuse.com/?search=sec-fetch-site) in all browsers since March 2023 (with the exception of IE as it no longer receives updates and does not support `Sec-Fetch-*` headers).
Is Referer actually necessary, or even a good idea? I've seen it mentioned in various places that it has some issues. And Origin has essentially 100% browser coverage. Hence that's all Go 1.25 uses. I don't think we mention Referer elsewhere, either.
> ### How to treat Fetch Metadata headers on the server-side
>
> `Sec-Fetch-Site` is the most useful Fetch Metadata header for blocking CSRF-like cross-origin requests and should be the primary signal in a Fetch-Metadata-based policy. Use other Fetch Metadata headers (`Sec-Fetch-Mode`, `Sec-Fetch-Dest`, `Sec-Fetch-User`) to further refine or tailor policies to your application (for example, allowing navigate mode top-level requests or permitting specific Dest values for resource endpoints).
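As an illustration of such a refinement (loosely following the web.dev resource isolation policy; the function name and exact policy choices are examples, not wording from the cheat sheet):

```go
// Illustrative refinement using the other Fetch Metadata headers: permit
// cross-site top-level navigations (e.g. a user clicking a link) for safe
// methods only, while still rejecting embedded or object destinations.
package csrfsketch

import "net/http"

func isAllowedCrossSiteNavigation(r *http.Request) bool {
	dest := r.Header.Get("Sec-Fetch-Dest")
	return r.Method == http.MethodGet &&
		r.Header.Get("Sec-Fetch-Mode") == "navigate" &&
		dest != "object" && dest != "embed"
}
```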
is navigate mode defined anywhere?
> **Policy (high level)**
>
> 1. If `Sec-Fetch-Site` is present:
>     1.1. Treat cross-site as untrusted for state-changing actions. By default, reject non-safe methods (POST / PUT / PATCH / DELETE) when `Sec-Fetch-Site: cross-site`.
This isn't really a comment about the doc, but a related question.
In WordPress, Ajax requests are often made with GET to make mutations. This is, of course, wrong in many ways, but does happen.
Though, WP encourages the use of their nonce mechanism for Ajax, and doesn't use fetch metadata at all. Still, I will likely put in some effort to see if Sec-Fetch can replace the WP nonces in a drop-in manner (just turn off/ignore nonces if they're there and use Sec-Fetch).
So, I wonder what can be done to handle state-changing GET requests?
Should anything be mentioned here for the benefit of other frameworks or sites that use GET incorrectly?
> ### Rollout & testing recommendations
>
> - Include an appropriate `Vary` header, in order to ensure that caches handle the response appropriately. For example, `Vary: Accept-Encoding, Sec-Fetch-Site`. See more [Fetch Metadata specification](https://w3c.github.io/webappsec-fetch-metadata/#vary).
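For context: if allow/deny decisions or response contents depend on `Sec-Fetch-Site`, caches need to be told so, otherwise a response produced for a same-origin request could be replayed for a cross-site one. A minimal illustrative middleware (names are hypothetical):

```go
// Illustrative middleware: advertise to caches that the response depends on
// Sec-Fetch-Site so cached responses are keyed on that header as well.
package csrfsketch

import "net/http"

func varySecFetchSite(next http.Handler) http.Handler {
	return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		w.Header().Add("Vary", "Sec-Fetch-Site")
		next.ServeHTTP(w, r)
	})
}
```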
Can we add any info on what the issue is here that is being addressed? It isn't obvious to me, and surely most others.
> For the rare cases of outdated or embedded browsers that lack `Sec-Fetch-*` support, a fallback to [standard origin verification](#using-standard-headers-to-verify-origin) should provide the required coverage. If this is acceptable for your project, consider prompting users to update their browsers, as they are running on outdated and potentially insecure versions.
>
> ### How to treat Fetch Metadata headers on the server-side
Seems OK to me.
As agreed in #1803:

- Updated Fetch Metadata positioning
- Extended core guidance to mention Fetch Metadata request headers (Sec-Fetch-*) as an alternative to CSRF tokens for state-changing requests.
- Added clarification that developers can use CSRF tokens or Fetch Metadata depending on project needs and client compatibility.
This PR fixes issue #1803.