
feat: add sitemaps for all apps and robots.txt at root#709

Merged
sean-brydon merged 1 commit into main from devin/1772274580-add-sitemaps on Mar 2, 2026
Conversation

@pasqualevitiello
Contributor

Summary

Adds per-app sitemaps for the www and ui apps (the origin app already had one), and moves the canonical robots.txt to the www app so it serves at coss.com/robots.txt with references to all three sitemaps.

Changes:

  • apps/www/app/sitemap.ts — Static sitemap listing all www pages (home, scheduling, calendar, email, sms, video, payments, notifications, auth)
  • apps/www/app/robots.ts — Programmatic robots.txt that references all three app sitemaps (/sitemap.xml, /origin/sitemap.xml, /ui/sitemap.xml)
  • apps/ui/app/sitemap.ts — Dynamic sitemap using fumadocs source.getPages() for all doc pages, plus the home and particles pages
  • Deleted apps/origin/app/robots.txt — Was serving at /origin/robots.txt (due to basePath), not at the root domain, so was effectively invisible to crawlers
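In the Next.js app router, a per-app sitemap is an `app/sitemap.ts` file that default-exports a function returning sitemap entries. A minimal sketch of what `apps/www/app/sitemap.ts` could look like, with the page list and base URL taken from this description (the actual file may differ in fields such as `changeFrequency` or `priority`):

```typescript
// apps/www/app/sitemap.ts (sketch)
import type { MetadataRoute } from "next";

const BASE_URL = "https://coss.com";

// Static list of www pages, per the PR description. New pages
// must be added here manually (see the notes below).
const pages = [
  "",
  "/scheduling",
  "/calendar",
  "/email",
  "/sms",
  "/video",
  "/payments",
  "/notifications",
  "/auth",
];

export default function sitemap(): MetadataRoute.Sitemap {
  return pages.map((path) => ({
    url: `${BASE_URL}${path}`,
    lastModified: new Date(),
  }));
}
```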

Review & Testing Checklist for Human

  • Verify ui sitemap URL format: source.getPages() returns pages with a .url property (e.g. /docs, /docs/components/button). Confirm that https://coss.com/ui${page.url} produces correct URLs (no missing / separator). Use a deploy preview or local build of the ui app and check the /ui/sitemap.xml output.
  • Verify www sitemap page list is complete: The static list was based on the current apps/www/app/ directory. Confirm no pages are missing. Note: if new pages are added to www in the future, this file needs to be manually updated.
  • Test all three sitemaps after deploy: Hit coss.com/sitemap.xml, coss.com/origin/sitemap.xml, coss.com/ui/sitemap.xml and verify valid XML output. Also verify coss.com/robots.txt lists all three.
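For context on the last checklist item, a programmatic robots file in the Next.js app router is an `app/robots.ts` that returns rules plus sitemap URLs. A sketch of what the new root `apps/www/app/robots.ts` might contain; the Disallow: /private/ rule and the three sitemap URLs come from this PR's description, while the allow rule is an assumption:

```typescript
// apps/www/app/robots.ts (sketch)
import type { MetadataRoute } from "next";

export default function robots(): MetadataRoute.Robots {
  return {
    rules: {
      userAgent: "*",
      allow: "/",
      disallow: "/private/", // preserved from the old origin/app/robots.txt
    },
    // One canonical robots.txt at the root domain can point
    // crawlers at all three per-app sitemaps.
    sitemap: [
      "https://coss.com/sitemap.xml",
      "https://coss.com/origin/sitemap.xml",
      "https://coss.com/ui/sitemap.xml",
    ],
  };
}
```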

Notes

  • The www sitemap is a static list — consider whether this should be dynamic in the future if pages are added frequently.
  • The old origin/app/robots.txt Disallow: /private/ rule was preserved in the new root robots.ts.
  • Requested by: @pasqualevitiello
  • Link to Devin run
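The dynamic ui sitemap could look roughly like the following. The `source` import path and the list of extra static pages are assumptions based on the description, and `page.url` is assumed to start with a leading slash, which is exactly what the checklist item about URL format asks a reviewer to confirm:

```typescript
// apps/ui/app/sitemap.ts (sketch)
import type { MetadataRoute } from "next";
import { source } from "@/lib/source"; // hypothetical fumadocs source location

const BASE_URL = "https://coss.com/ui";

export default function sitemap(): MetadataRoute.Sitemap {
  // Home and particles pages, plus every doc page known to fumadocs.
  const staticPaths = ["", "/particles"];
  const docPaths = source.getPages().map((page) => page.url); // e.g. "/docs/components/button"

  return [...staticPaths, ...docPaths].map((path) => ({
    url: `${BASE_URL}${path}`,
    lastModified: new Date(),
  }));
}
```

Because this enumerates pages at build time, new doc pages are picked up automatically, unlike the static www list.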

Co-Authored-By: pasquale <pasqualevitiello@gmail.com>
@devin-ai-integration
Contributor

🤖 Devin AI Engineer

I'll be helping with this pull request! Here's what you should know:

✅ I will automatically:

  • Address comments on this PR that start with 'DevinAI' or '@devin'.
  • Look at CI failures and help fix them

Note: I can only respond to comments from users who have write access to this repository.

⚙️ Control Options:

  • Disable automatic comment and CI monitoring

@vercel

vercel bot commented Feb 28, 2026

The latest updates on your projects. Learn more about Vercel for GitHub.

Project               Deployment  Actions           Updated (UTC)
coss-com              Ready       Preview, Comment  Feb 28, 2026 10:32am
coss-com-origin       Building    Preview, Comment  Feb 28, 2026 10:32am
coss-com-ui           Building    Preview, Comment  Feb 28, 2026 10:32am
coss-examples-calcom  Ready       Preview, Comment  Feb 28, 2026 10:32am


@cubic-dev-ai cubic-dev-ai bot left a comment


No issues found across 4 files

@sean-brydon merged commit 013d1a2 into main Mar 2, 2026
14 checks passed
@sean-brydon deleted the devin/1772274580-add-sitemaps branch March 2, 2026 09:44
