diff --git a/.wordlist.txt b/.wordlist.txt
index 5306e83e1..542b20f04 100644
--- a/.wordlist.txt
+++ b/.wordlist.txt
@@ -355,6 +355,8 @@ Enums
EqualsAny
ErrorsFacade
EventListener
+EventListeners
+EventSubscriber
EventSubscriberInterface
Everytime
ExampleAppWithCookies
@@ -424,6 +426,7 @@ GuestWishlistPage
GuestWishlistPagelet
HMAC
HMR
+Hacktoberfest
HeaderPagelet
Homebrew
Hono
@@ -1852,6 +1855,7 @@ triggerDeprecated
truthy
tsx
twigjs
+txt
typeAndCheck
typeAndCheckSearchField
typeLegacySelectAndCheck
diff --git a/guides/plugins/plugins/content/seo/extend-robots-txt.md b/guides/plugins/plugins/content/seo/extend-robots-txt.md
new file mode 100644
index 000000000..83abac59e
--- /dev/null
+++ b/guides/plugins/plugins/content/seo/extend-robots-txt.md
@@ -0,0 +1,265 @@
+---
+nav:
+  title: Extend robots.txt configuration
+  position: 20
+---
+
+# Extend robots.txt configuration
+
+## Overview
+
+Since Shopware 6.7.1, the platform provides full `robots.txt` support with all standard directives and user-agent blocks.
+This feature was developed as an open-source contribution during Hacktoberfest 2024 ([learn more](https://www.shopware.com/en/news/hacktoberfest-2024-outcome-a-robots-txt-for-shopware/)).
+For general configuration, refer to the [user documentation](https://docs.shopware.com/en/shopware-6-en/tutorials-and-faq/creation-of-robots-txt).
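+
+For reference, a `robots.txt` file consists of user-agent blocks containing directives such as `Disallow`, `Allow`, and `Crawl-delay`, plus standalone directives like `Sitemap`. The following snippet is purely illustrative; all paths and URLs are placeholders:
+
+```text
+User-agent: *
+Disallow: /account/
+Allow: /account/login
+
+User-agent: Googlebot
+Crawl-delay: 2
+
+Sitemap: https://your-shop.example/sitemap.xml
+```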
+
+::: info
+The events and features described in this guide are available since Shopware 6.7.5.
+:::
+
+You can extend the `robots.txt` functionality through events to:
+
+* Add custom validation rules during parsing
+* Modify or generate directives dynamically
+* Support custom or vendor-specific directives
+* Prevent warnings for known non-standard directives
+
+## Prerequisites
+
+This guide requires you to have a basic plugin running. If you don't know how to create a plugin, head over to the plugin base guide:
+
+<PageRef page="../../plugin-base-guide" />
+
+You should also be familiar with [Event listeners](../../plugin-fundamentals/listening-to-events).
+
+::: info
+This guide uses EventListeners since each example listens to a single event. If you need to subscribe to multiple events in the same class, consider using an [EventSubscriber](../../plugin-fundamentals/listening-to-events#listening-to-events-via-subscriber) instead.
+:::
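+
+For orientation, a subscriber handling both robots.txt events could look like the following minimal sketch. The class name is an example, and the event namespaces are assumptions; check your Shopware version for the exact class locations:
+
+```PHP
+// <plugin root>/src/Subscriber/RobotsSubscriber.php
+<?php declare(strict_types=1);
+
+namespace Swag\BasicExample\Subscriber;
+
+use Shopware\Core\Content\Robots\Event\RobotsDirectiveParsingEvent;
+use Shopware\Core\Content\Robots\Event\RobotsUnknownDirectiveEvent;
+use Symfony\Component\EventDispatcher\EventSubscriberInterface;
+
+class RobotsSubscriber implements EventSubscriberInterface
+{
+    public static function getSubscribedEvents(): array
+    {
+        return [
+            RobotsDirectiveParsingEvent::class => 'onParsing',
+            RobotsUnknownDirectiveEvent::class => 'onUnknownDirective',
+        ];
+    }
+
+    public function onParsing(RobotsDirectiveParsingEvent $event): void
+    {
+        // Inspect or modify $event->getParsedRobots() here
+    }
+
+    public function onUnknownDirective(RobotsUnknownDirectiveEvent $event): void
+    {
+        // Mark known custom directives as handled here
+    }
+}
+```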
+
+## Modifying parsed directives
+
+The `RobotsDirectiveParsingEvent` is dispatched after `robots.txt` content is parsed. You can modify the parsed result, add validation, or inject dynamic directives.
+
+This example shows how to dynamically add restrictions for AI crawlers:
+
+<Tabs>
+
+<Tab title="PHP">
+
+```PHP
+// <plugin root>/src/Service/AiCrawlerRobotsListener.php
+<?php declare(strict_types=1);
+
+namespace Swag\BasicExample\Service;
+
+use Psr\Log\LoggerInterface;
+// Adjust the following imports to the actual namespaces in your Shopware version.
+use Shopware\Core\Content\Robots\Event\RobotsDirectiveParsingEvent;
+use Shopware\Core\Content\Robots\RobotsDirective;
+use Shopware\Core\Content\Robots\RobotsDirectiveType;
+use Shopware\Core\Content\Robots\RobotsUserAgentBlock;
+
+class AiCrawlerRobotsListener
+{
+    public function __construct(private readonly LoggerInterface $logger)
+    {
+    }
+
+    public function __invoke(RobotsDirectiveParsingEvent $event): void
+    {
+        $parsedRobots = $event->getParsedRobots();
+
+        // Add restrictions for AI crawlers
+        $aiCrawlers = ['GPTBot', 'ChatGPT-User', 'CCBot', 'anthropic-ai'];
+
+        $aiBlock = new RobotsUserAgentBlock(
+            userAgents: $aiCrawlers,
+            directives: [
+                new RobotsDirective(
+                    type: RobotsDirectiveType::DISALLOW,
+                    value: '/checkout/',
+                ),
+            ],
+        );
+
+        $parsedRobots->addUserAgentBlock($aiBlock);
+
+        $this->logger->info('Extended robots.txt with AI crawler rules');
+    }
+}
+```
+
+</Tab>
+
+<Tab title="XML">
+
+```XML
+<!-- <plugin root>/src/Resources/config/services.xml -->
+<?xml version="1.0" ?>
+<container xmlns="http://symfony.com/schema/dic/services"
+           xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+           xsi:schemaLocation="http://symfony.com/schema/dic/services http://symfony.com/schema/dic/services/services-1.0.xsd">
+    <services>
+        <service id="Swag\BasicExample\Service\AiCrawlerRobotsListener">
+            <argument type="service" id="logger"/>
+            <!-- The event is the fully qualified class name; adjust the namespace to your Shopware version -->
+            <tag name="kernel.event_listener" event="Shopware\Core\Content\Robots\Event\RobotsDirectiveParsingEvent"/>
+        </service>
+    </services>
+</container>
+```
+
+</Tab>
+
+</Tabs>
+
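+With the listener above registered, the generated `robots.txt` can be expected to contain an additional block along these lines (the exact formatting depends on the generator):
+
+```text
+User-agent: GPTBot
+User-agent: ChatGPT-User
+User-agent: CCBot
+User-agent: anthropic-ai
+Disallow: /checkout/
+```
+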
+## Handling custom directives
+
+The `RobotsUnknownDirectiveEvent` is dispatched when an unknown directive is encountered. Use this to support vendor-specific directives or prevent warnings for known non-standard directives:
+
+<Tabs>
+
+<Tab title="PHP">
+
+```PHP
+// <plugin root>/src/Service/CustomDirectiveListener.php
+<?php declare(strict_types=1);
+
+namespace Swag\BasicExample\Service;
+
+// Adjust the following import to the actual namespace in your Shopware version.
+use Shopware\Core\Content\Robots\Event\RobotsUnknownDirectiveEvent;
+
+class CustomDirectiveListener
+{
+    public function __invoke(RobotsUnknownDirectiveEvent $event): void
+    {
+        // Example vendor-specific directives this shop knowingly uses
+        $knownCustomDirectives = ['clean-param', 'request-rate'];
+
+        if (\in_array(strtolower($event->getDirectiveName()), $knownCustomDirectives, true)) {
+            $event->setHandled(true); // Prevent "unknown directive" warning
+        }
+    }
+}
+```
+
+</Tab>
+
+<Tab title="XML">
+
+```XML
+<!-- <plugin root>/src/Resources/config/services.xml -->
+<?xml version="1.0" ?>
+<container xmlns="http://symfony.com/schema/dic/services"
+           xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+           xsi:schemaLocation="http://symfony.com/schema/dic/services http://symfony.com/schema/dic/services/services-1.0.xsd">
+    <services>
+        <service id="Swag\BasicExample\Service\CustomDirectiveListener">
+            <!-- The event is the fully qualified class name; adjust the namespace to your Shopware version -->
+            <tag name="kernel.event_listener" event="Shopware\Core\Content\Robots\Event\RobotsUnknownDirectiveEvent"/>
+        </service>
+    </services>
+</container>
+```
+
+</Tab>
+
+</Tabs>
+
+## Validation and parse issues
+
+You can add validation warnings or errors during parsing using the `ParseIssue` class. This example shows common validation scenarios:
+
+<Tabs>
+
+<Tab title="PHP">
+
+```PHP
+// <plugin root>/src/Service/RobotsValidationListener.php
+<?php declare(strict_types=1);
+
+namespace Swag\BasicExample\Service;
+
+// Adjust the following imports to the actual namespaces in your Shopware version.
+use Shopware\Core\Content\Robots\Event\RobotsDirectiveParsingEvent;
+use Shopware\Core\Content\Robots\ParseIssue;
+use Shopware\Core\Content\Robots\ParseIssueSeverity;
+use Shopware\Core\Content\Robots\RobotsDirectiveType;
+
+class RobotsValidationListener
+{
+    public function __invoke(RobotsDirectiveParsingEvent $event): void
+    {
+        $parsedRobots = $event->getParsedRobots();
+
+        // Validate crawl-delay values
+        foreach ($parsedRobots->getUserAgentBlocks() as $block) {
+            foreach ($block->getDirectives() as $directive) {
+                if ($directive->getType() === RobotsDirectiveType::CRAWL_DELAY) {
+                    $value = (int) $directive->getValue();
+
+                    if ($value <= 0) {
+                        $event->addIssue(new ParseIssue(
+                            severity: ParseIssueSeverity::ERROR,
+                            message: 'Invalid crawl-delay value: must be a positive integer',
+                            lineNumber: null,
+                        ));
+                    }
+
+                    if ($value > 10) {
+                        $event->addIssue(new ParseIssue(
+                            severity: ParseIssueSeverity::WARNING,
+                            message: 'Crawl-delay value is very high. This may significantly slow down indexing.',
+                            lineNumber: null,
+                        ));
+                    }
+                }
+            }
+        }
+
+        // Check for conflicting Allow/Disallow directives
+        foreach ($parsedRobots->getUserAgentBlocks() as $block) {
+            $disallowed = [];
+            $allowed = [];
+
+            foreach ($block->getDirectives() as $directive) {
+                if ($directive->getType() === RobotsDirectiveType::DISALLOW) {
+                    $disallowed[] = $directive->getValue();
+                } elseif ($directive->getType() === RobotsDirectiveType::ALLOW) {
+                    $allowed[] = $directive->getValue();
+                }
+            }
+
+            foreach ($allowed as $allowPath) {
+                foreach ($disallowed as $disallowPath) {
+                    if ($allowPath === $disallowPath) {
+                        $event->addIssue(new ParseIssue(
+                            severity: ParseIssueSeverity::WARNING,
+                            message: sprintf('Conflicting directives: Path "%s" is both allowed and disallowed', $allowPath),
+                            lineNumber: null,
+                        ));
+                    }
+                }
+            }
+        }
+    }
+}
+```
+
+</Tab>
+
+<Tab title="XML">
+
+```XML
+<!-- <plugin root>/src/Resources/config/services.xml -->
+<?xml version="1.0" ?>
+<container xmlns="http://symfony.com/schema/dic/services"
+           xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
+           xsi:schemaLocation="http://symfony.com/schema/dic/services http://symfony.com/schema/dic/services/services-1.0.xsd">
+    <services>
+        <service id="Swag\BasicExample\Service\RobotsValidationListener">
+            <!-- The event is the fully qualified class name; adjust the namespace to your Shopware version -->
+            <tag name="kernel.event_listener" event="Shopware\Core\Content\Robots\Event\RobotsDirectiveParsingEvent"/>
+        </service>
+    </services>
+</container>
+```
+
+</Tab>
+
+</Tabs>
+
+Issues are automatically logged when the `robots.txt` configuration is saved in the Administration. Use `WARNING` for recommendations and `ERROR` for critical problems that prevent proper generation.
diff --git a/guides/plugins/plugins/content/seo/index.md b/guides/plugins/plugins/content/seo/index.md
index a3a18cf29..2bc55d434 100644
--- a/guides/plugins/plugins/content/seo/index.md
+++ b/guides/plugins/plugins/content/seo/index.md
@@ -7,6 +7,16 @@ nav:
# SEO

-The Shopware SEO feature offers the ability to add custom SEO URLs to optimize the search engine visibility of the e-commerce platform.
+The Shopware SEO feature offers comprehensive tools to optimize the search engine visibility of your e-commerce platform.

-With the SEO plugin, you can create custom SEO URLs for product pages, categories, content pages, and other relevant sections of the website. The plugin allows businesses to customize meta tags for each page, including meta titles, descriptions, and keywords. This enables users to optimize the on-page SEO elements, providing search engines with relevant information about the content of the page.
+## SEO URLs
+
+You can create custom SEO URLs for product pages, categories, content pages, and other relevant sections of the website. You can also customize meta tags for each page, including meta titles, descriptions, and keywords, providing search engines with relevant information about the content of each page.
+
+<PageRef page="add-custom-seo-url" />
+
+## Robots configuration
+
+Shopware provides full support for `robots.txt` configuration, including all standard directives, user-agent blocks, and extensibility through events. You can customize how search engine crawlers interact with your shop by extending the `robots.txt` parsing and generation.
+
+<PageRef page="extend-robots-txt" />
diff --git a/guides/plugins/plugins/framework/data-handling/add-data-translations.md b/guides/plugins/plugins/framework/data-handling/add-data-translations.md
index f69e1daf3..73ada1fd6 100644
--- a/guides/plugins/plugins/framework/data-handling/add-data-translations.md
+++ b/guides/plugins/plugins/framework/data-handling/add-data-translations.md
@@ -297,3 +297,4 @@ class ExampleDefinition extends EntityDefinition
]);
}
}
+```