Conversation

@Shitanshukumar607 (Owner)

No description provided.

Copilot AI review requested due to automatic review settings October 6, 2025 12:02
@vercel vercel bot commented Oct 6, 2025

The latest updates on your projects. Learn more about Vercel for GitHub.

Project      Deployment   Preview   Comments   Updated (UTC)
img-to-svg   Ready        Preview   Comment    Oct 6, 2025 0:10am

Copilot AI left a comment

Pull Request Overview

Adds SEO configuration and assets to improve crawlability and sharing previews.

  • Introduces centralized SEO constants and metadata for Next.js.
  • Adds a sitemap generator for core routes.
  • Adds a robots.txt with crawl rules and sitemap reference.

Reviewed Changes

Copilot reviewed 4 out of 4 changed files in this pull request and generated 7 comments.

File                 Description
public/robots.txt    Adds crawl directives and sitemap URL.
lib/seoMetadata.ts   Defines Next.js Metadata object using constants.
lib/constantSEO.ts   Centralizes base URL, titles, descriptions, and keywords.
app/sitemap.ts       Exposes a sitemap for homepage, generate, and collections routes.
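
For orientation, here is a minimal sketch of how the two lib files could fit together. Only the file names and their stated roles come from this PR; every identifier and value below (SEO, baseUrl, the title and description strings) is an assumption for illustration.

// lib/constantSEO.ts (sketch; values are placeholders, not the PR's actual constants)
export const SEO = {
  baseUrl: "https://example.com", // the PR centralizes a base URL here
  title: "img-to-svg",
  description: "Convert raster images to SVG.",
  keywords: ["svg", "image converter"],
};

// lib/seoMetadata.ts (sketch; builds the Next.js Metadata object from the constants)
import type { Metadata } from "next";
import { SEO } from "./constantSEO";

export const seoMetadata: Metadata = {
  metadataBase: new URL(SEO.baseUrl),
  title: SEO.title,
  description: SEO.description,
  keywords: SEO.keywords,
  openGraph: {
    title: SEO.title,
    description: SEO.description,
    url: SEO.baseUrl,
  },
};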


Comment on lines +12 to +16
# Block common sensitive paths
Disallow: /admin
Disallow: /api
Disallow: /private
Disallow: /.netlify

Copilot AI Oct 6, 2025

User-agent groups do not inherit rules. Because Googlebot and Bingbot have their own groups without any Disallow directives, those bots are currently allowed to crawl everything (including /admin, /api, etc.). Add the same Disallow lines under the Googlebot and Bingbot groups, or remove the UA-specific groups and rely on the wildcard group. For example, under each of Googlebot and Bingbot add: Disallow: /admin, Disallow: /api, Disallow: /private, Disallow: /.netlify.
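
Applied to one of the UA-specific groups, that fix would look roughly like this (a sketch: the Allow lines are copied from the Bingbot hunk quoted below, and the full contents of the file's per-agent groups are not visible in this diff):

# Googlebot gets its own group, so the shared Disallow lines must be repeated
User-agent: Googlebot
Allow: /generate
Allow: /collections
Disallow: /admin
Disallow: /api
Disallow: /private
Disallow: /.netlify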

Comment on lines +28 to +33
Allow: /collections

# Bingbot
User-agent: Bingbot
Allow: /generate
Allow: /collections

Copilot AI Oct 6, 2025

Same issue as the previous comment: this Bingbot group defines no Disallow directives of its own, so Bingbot may crawl /admin, /api, /private, and /.netlify. Repeat the Disallow lines here (as in the suggested change below) or drop the UA-specific groups and rely on the wildcard group.

Suggested change

  Allow: /collections
+ Disallow: /admin
+ Disallow: /api
+ Disallow: /private
+ Disallow: /.netlify

  # Bingbot
  User-agent: Bingbot
  Allow: /generate
  Allow: /collections
+ Disallow: /admin
+ Disallow: /api
+ Disallow: /private
+ Disallow: /.netlify

Comment on lines +3 to +6
export const metadata = {
title: "Sitemap",
};

Copilot AI Oct 6, 2025

[nitpick] app/sitemap.ts is a metadata route (not a page) and Next.js ignores page metadata here. Removing this export avoids confusion and dead code.

Suggested change

- export const metadata = {
-   title: "Sitemap",
- };

Comment on lines +75 to +85
robots: {
index: true,
follow: true,
googleBot: {
index: true,
follow: true,
"max-video-preview": -1,
"max-image-preview": "large",
"max-snippet": -1,
},
},

Copilot AI Oct 6, 2025

[nitpick] These directives advertise index/follow, but robots.txt currently blocks crawling for the wildcard agent. Align metadata and robots.txt to avoid mixed signals (either relax robots.txt as suggested, or adjust these flags to reflect the intended crawl policy).
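
One way to rule out this kind of drift is to generate robots.txt from the same constants as the page metadata, using a Next.js app/robots.ts metadata route. A sketch, assuming the constants object in lib/constantSEO.ts exposes a baseUrl field (the export name and field are assumptions, not confirmed by this PR):

import type { MetadataRoute } from "next";
import { SEO } from "@/lib/constantSEO"; // assumed export name

export default function robots(): MetadataRoute.Robots {
  return {
    rules: [
      {
        // One wildcard group carrying both the allow and the block list,
        // so there are no UA-specific groups to fall out of sync.
        userAgent: "*",
        allow: "/",
        disallow: ["/admin", "/api", "/private", "/.netlify"],
      },
    ],
    sitemap: `${SEO.baseUrl}/sitemap.xml`,
  };
}

Note that a static public/robots.txt and an app/robots.ts route conflict in Next.js, so adopting this would mean deleting the static file.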

Comment on lines +2 to +7

export const metadata = {
title: "Sitemap",
};

export default function sitemap() {

Copilot AI Oct 6, 2025

[nitpick] Add an explicit return type for stronger typing with Next.js metadata routes. For example: import type { MetadataRoute } from 'next'; export default function sitemap(): MetadataRoute.Sitemap { ... }.

Suggested change

+ import type { MetadataRoute } from "next";

  export const metadata = {
    title: "Sitemap",
  };

- export default function sitemap() {
+ export default function sitemap(): MetadataRoute.Sitemap {
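
Putting both sitemap comments together (typed return, dead metadata export removed), the whole route could look like this sketch. The three paths come from the PR description; BASE_URL and the changeFrequency/priority values are illustrative assumptions, and the PR may instead import the base URL from lib/constantSEO.ts:

import type { MetadataRoute } from "next";

const BASE_URL = "https://example.com"; // assumed placeholder

export default function sitemap(): MetadataRoute.Sitemap {
  // One entry per core route: homepage, generate, collections.
  return [
    { url: `${BASE_URL}/`, changeFrequency: "weekly", priority: 1 },
    { url: `${BASE_URL}/generate`, changeFrequency: "weekly", priority: 0.8 },
    { url: `${BASE_URL}/collections`, changeFrequency: "weekly", priority: 0.8 },
  ];
}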

@Shitanshukumar607 Shitanshukumar607 merged commit eb54e3a into main Oct 6, 2025
5 of 6 checks passed
@Shitanshukumar607 Shitanshukumar607 deleted the seo branch October 10, 2025 04:34