feat: add sitemaps for all apps and robots.txt at root #709
Merged
sean-brydon merged 1 commit into main on Mar 2, 2026
Co-Authored-By: pasquale <pasqualevitiello@gmail.com>
sean-brydon approved these changes on Mar 2, 2026
Summary
Adds per-app sitemaps for the `www` and `ui` apps (the `origin` app already had one), and moves the canonical robots.txt to the `www` app so it serves at `coss.com/robots.txt` with references to all three sitemaps.

Changes:

- `apps/www/app/sitemap.ts`: static sitemap listing all www pages (home, scheduling, calendar, email, sms, video, payments, notifications, auth)
- `apps/www/app/robots.ts`: programmatic robots.txt that references all three app sitemaps (`/sitemap.xml`, `/origin/sitemap.xml`, `/ui/sitemap.xml`)
- `apps/ui/app/sitemap.ts`: dynamic sitemap using fumadocs `source.getPages()` for all doc pages, plus the home and particles pages
- `apps/origin/app/robots.txt`: was serving at `/origin/robots.txt` (due to `basePath`), not at the root domain, so it was effectively invisible to crawlers

Review & Testing Checklist for Human
- `ui` sitemap URL format: `source.getPages()` returns pages with a `.url` property (e.g. `/docs`, `/docs/components/button`). Confirm that `https://coss.com/ui${page.url}` produces correct URLs (no missing `/` separator). Deploy a preview or build the `ui` app locally and check the `/ui/sitemap.xml` output.
- `www` sitemap page list is complete: the static list was based on the current `apps/www/app/` directory. Confirm no pages are missing. Note: if new pages are added to `www` in the future, this file needs to be manually updated.
- Hit `coss.com/sitemap.xml`, `coss.com/origin/sitemap.xml`, and `coss.com/ui/sitemap.xml` and verify valid XML output. Also verify that `coss.com/robots.txt` lists all three.

Notes
- The `www` sitemap is a static list; consider whether it should be dynamic in the future if pages are added frequently.
- The `Disallow: /private/` rule from `origin/app/robots.txt` was preserved in the new root `robots.ts`.
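The static `www` sitemap described above could look roughly like the following sketch. The page paths are taken from the list in the summary; everything else (entry fields such as `lastModified` or `priority`, and the exact file contents) is an assumption, not the actual diff:

```typescript
// Hypothetical sketch of apps/www/app/sitemap.ts (Next.js app router convention).
// Page paths come from the PR summary; the real file may differ.
const WWW_PAGES = [
  "", // home
  "/scheduling",
  "/calendar",
  "/email",
  "/sms",
  "/video",
  "/payments",
  "/notifications",
  "/auth",
];

export default function sitemap() {
  // One entry per page, rooted at the canonical domain.
  return WWW_PAGES.map((path) => ({ url: `https://coss.com${path}` }));
}
```

Because the list is hardcoded, any new `www` page silently stays out of the sitemap until this array is updated, which is the maintenance concern raised in the notes.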
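The root `robots.ts` described above (references to all three sitemaps, plus the preserved `Disallow: /private/` rule) might be sketched as follows. The plain object mirrors the shape a Next.js `robots.ts` route returns; the `MetadataRoute.Robots` type import from `next` is omitted so the sketch stays self-contained, and the exact rule set is an assumption:

```typescript
// Hypothetical sketch of apps/www/app/robots.ts.
export default function robots() {
  return {
    rules: [
      {
        userAgent: "*",
        disallow: "/private/", // preserved from origin/app/robots.txt
      },
    ],
    // All three app sitemaps, as listed in the PR summary.
    sitemap: [
      "https://coss.com/sitemap.xml",
      "https://coss.com/origin/sitemap.xml",
      "https://coss.com/ui/sitemap.xml",
    ],
  };
}
```

Serving this from the `www` app is what makes it reachable at `coss.com/robots.txt`; the old `origin` copy sat behind that app's `basePath` and never appeared at the root.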
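The checklist's first item is about URL joining in the `ui` sitemap. A minimal sketch of just that concern, with `source.getPages()` stubbed out (per the summary, fumadocs pages expose a `.url` that already begins with `/`, so plain template concatenation should not drop the separator); the helper name and `Page` type here are illustrative, not the actual code:

```typescript
// Hypothetical sketch of the URL-building logic in apps/ui/app/sitemap.ts.
// `Page` stands in for the fumadocs page type; only `.url` matters here.
type Page = { url: string };

const BASE = "https://coss.com/ui";

function buildUiSitemap(pages: Page[]) {
  // A doc page URL like "/docs/components/button" starts with "/",
  // so `${BASE}${page.url}` yields "https://coss.com/ui/docs/components/button".
  return pages.map((page) => ({ url: `${BASE}${page.url}` }));
}
```

The reviewer check still matters: if `source.getPages()` ever returned URLs without the leading `/`, this concatenation would silently produce broken entries, which is why inspecting the deployed `/ui/sitemap.xml` output is recommended.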