added seo things #11
Conversation
Pull Request Overview
Adds SEO configuration and assets to improve crawlability and sharing previews.
- Introduces centralized SEO constants and metadata for Next.js.
- Adds a sitemap generator for core routes.
- Adds a robots.txt with crawl rules and sitemap reference.
Reviewed Changes
Copilot reviewed 4 out of 4 changed files in this pull request and generated 7 comments.
| File | Description |
|---|---|
| public/robots.txt | Adds crawl directives and sitemap URL. |
| lib/seoMetadata.ts | Defines Next.js Metadata object using constants. |
| lib/constantSEO.ts | Centralizes base URL, titles, descriptions, and keywords. |
| app/sitemap.ts | Exposes a sitemap for homepage, generate, and collections routes. |
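From the file descriptions above, the wiring between lib/constantSEO.ts and lib/seoMetadata.ts presumably looks something like the following sketch; the constant names here are assumptions for illustration, not the PR's actual identifiers.

```ts
// Hypothetical shape of lib/seoMetadata.ts, inferred from the reviewed-files table.
import type { Metadata } from "next";
// Assumed exports; the PR centralizes these values in lib/constantSEO.ts.
import { BASE_URL, SITE_TITLE, SITE_DESCRIPTION, SITE_KEYWORDS } from "./constantSEO";

export const seoMetadata: Metadata = {
  metadataBase: new URL(BASE_URL), // resolves relative OG and canonical URLs
  title: SITE_TITLE,
  description: SITE_DESCRIPTION,
  keywords: SITE_KEYWORDS,
};
```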
public/robots.txt (excerpt):

```
# Block common sensitive paths
Disallow: /admin
Disallow: /api
Disallow: /private
Disallow: /.netlify
```
Copilot AI commented on Oct 6, 2025:
User-agent groups do not inherit rules. Because Googlebot and Bingbot have their own groups without any Disallow directives, those bots are currently allowed to crawl everything (including /admin, /api, etc.). Add the same Disallow lines under the Googlebot and Bingbot groups, or remove the UA-specific groups and rely on the wildcard group. For example, under each of Googlebot and Bingbot add: Disallow: /admin, Disallow: /api, Disallow: /private, Disallow: /.netlify.
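For reference, the second option the comment mentions would collapse the file to a single wildcard group, roughly like this sketch (assuming the Allow/Disallow paths shown in this PR are the complete rule set, and with example.com standing in for the real domain):

```
User-agent: *
Allow: /generate
Allow: /collections
Disallow: /admin
Disallow: /api
Disallow: /private
Disallow: /.netlify

Sitemap: https://example.com/sitemap.xml
```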
public/robots.txt (excerpt):

```
Allow: /collections

# Bingbot
User-agent: Bingbot
Allow: /generate
Allow: /collections
```
Copilot AI commented on Oct 6, 2025:

Same issue as above: the Bingbot group defines only Allow directives, so it inherits none of the wildcard group's Disallow rules.
Suggested change:

```diff
 Allow: /collections
+Disallow: /admin
+Disallow: /api
+Disallow: /private
+Disallow: /.netlify
 # Bingbot
 User-agent: Bingbot
 Allow: /generate
 Allow: /collections
+Disallow: /admin
+Disallow: /api
+Disallow: /private
+Disallow: /.netlify
```
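As an aside, Next.js can also generate robots.txt from an app/robots.ts metadata route, which keeps everything in one rule group and sidesteps the inheritance pitfall. A minimal sketch, with example.com again standing in for the site's real base URL (presumably the one centralized in lib/constantSEO.ts):

```ts
import type { MetadataRoute } from "next";

// Generates /robots.txt at build time; would replace public/robots.txt.
export default function robots(): MetadataRoute.Robots {
  return {
    rules: {
      userAgent: "*",
      allow: ["/generate", "/collections"],
      disallow: ["/admin", "/api", "/private", "/.netlify"],
    },
    sitemap: "https://example.com/sitemap.xml", // placeholder domain
  };
}
```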
app/sitemap.ts (excerpt):

```ts
export const metadata = {
  title: "Sitemap",
};
```
Copilot AI commented on Oct 6, 2025:
[nitpick] app/sitemap.ts is a metadata route (not a page) and Next.js ignores page metadata here. Removing this export avoids confusion and dead code.
Suggested change:

```diff
-export const metadata = {
-  title: "Sitemap",
-};
```
lib/seoMetadata.ts (excerpt):

```ts
robots: {
  index: true,
  follow: true,
  googleBot: {
    index: true,
    follow: true,
    "max-video-preview": -1,
    "max-image-preview": "large",
    "max-snippet": -1,
  },
},
```
Copilot AI commented on Oct 6, 2025:
[nitpick] These directives advertise index/follow, but robots.txt currently blocks crawling for the wildcard agent. Align metadata and robots.txt to avoid mixed signals (either relax robots.txt as suggested, or adjust these flags to reflect the intended crawl policy).
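If the crawl-blocking intent of the current robots.txt were the one to keep, the aligned metadata would be the inverse of what's here; a hypothetical sketch of that variant:

```ts
import type { Metadata } from "next";

// Hypothetical variant: only if the site is actually meant to stay unindexed.
export const metadata: Metadata = {
  robots: {
    index: false, // matches a robots.txt that blocks the wildcard agent
    follow: false,
  },
};
```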
app/sitemap.ts (excerpt):

```ts
export const metadata = {
  title: "Sitemap",
};

export default function sitemap() {
```
Copilot AI commented on Oct 6, 2025:
[nitpick] Add an explicit return type for stronger typing with Next.js metadata routes. For example: import type { MetadataRoute } from 'next'; export default function sitemap(): MetadataRoute.Sitemap { ... }.
Suggested change:

```diff
+import type { MetadataRoute } from "next";
 export const metadata = {
   title: "Sitemap",
 };
-export default function sitemap() {
+export default function sitemap(): MetadataRoute.Sitemap {
```
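Applying both suggestions (dropping the dead metadata export and typing the return value) would leave app/sitemap.ts looking roughly like this sketch; the base URL and per-route fields are assumptions based on the PR overview, not the PR's actual values:

```ts
import type { MetadataRoute } from "next";

// Assumption: the real value is imported from lib/constantSEO.ts.
const BASE_URL = "https://example.com";

export default function sitemap(): MetadataRoute.Sitemap {
  return [
    { url: BASE_URL, changeFrequency: "weekly", priority: 1 },
    { url: `${BASE_URL}/generate`, changeFrequency: "weekly", priority: 0.8 },
    { url: `${BASE_URL}/collections`, changeFrequency: "weekly", priority: 0.8 },
  ];
}
```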
Co-authored-by: Copilot <[email protected]>