frosh/robots-txt

Generate robots.txt

Installs: 11 824

Dependents: 0

Suggesters: 0

Security: 0

Stars: 9

Watchers: 2

Forks: 3

Open Issues: 1

Type: shopware-platform-plugin

0.2.2 2024-10-22 14:45 UTC

This package is auto-updated.

Last update: 2024-11-22 14:56:00 UTC


README


This plugin provides a robots.txt for a Shopware 6 shop. Currently, it is not possible to distinguish between different user agents. Note that, in general, only the robots.txt at the root of a domain is considered by crawlers.

Allow and Disallow rules

Currently, the following default rules exist:

Allow: /
Disallow: /*?
Allow: /*theme/
Allow: /media/*?ts=

If you need to modify these defaults, do so via a template modification. If there are other general rules that would be useful to others, consider creating a pull request.

It is possible to configure the Disallow and Allow rules in the plugin configuration. Each line needs to start with Allow: or Disallow:, followed by the URI path. The generated robots.txt will contain each path prefixed with the absolute base path of the sales channel domain.
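To illustrate the expected format, here is a minimal validation sketch (a hypothetical helper, not part of the plugin): it accepts only lines that start with Allow: or Disallow: followed by a URI path.

```python
import re

# Hypothetical validator (an assumption for illustration, not the
# plugin's actual code): each non-empty configuration line must be
# "Allow: <path>" or "Disallow: <path>" with a path starting at "/".
RULE_RE = re.compile(r"^(Allow|Disallow): (/\S*)$")

def invalid_rules(config_text: str) -> list[str]:
    """Return the configuration lines that do not match the format."""
    return [
        line.strip()
        for line in config_text.splitlines()
        if line.strip() and not RULE_RE.match(line.strip())
    ]
```

An empty result means every rule line is well formed; for example, `invalid_rules("Deny: /foo")` would flag the line because `Deny:` is not a valid directive.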

For example, suppose you have two domains configured for a sales channel, example.com and example.com/en, and the following plugin configuration:

Disallow: /account/
Disallow: /checkout/
Disallow: /widgets/
Allow: /widgets/cms/
Allow: /widgets/menu/offcanvas

The robots.txt at example.com/robots.txt contains:

Disallow: /en/account/
Disallow: /en/checkout/
Disallow: /en/widgets/
Allow: /en/widgets/cms/
Allow: /en/widgets/menu/offcanvas
Disallow: /account/
Disallow: /checkout/
Disallow: /widgets/
Allow: /widgets/cms/
Allow: /widgets/menu/offcanvas
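The prefixing behaviour above can be sketched as follows. This is a hypothetical Python model of the assumed logic, not the plugin's actual PHP implementation: each configured rule is emitted once per domain, prefixed with that domain's base path, with longer base paths first so the more specific rules come before the root rules.

```python
from urllib.parse import urlparse

def build_rules(domains: list[str], config_lines: list[str]) -> list[str]:
    """Prefix each configured rule with the base path of every domain.

    Longer base paths (e.g. "/en") are emitted first, matching the
    ordering of the example output above.
    """
    base_paths = sorted(
        (urlparse("//" + d).path for d in domains),  # "" for a root domain
        key=len,
        reverse=True,
    )
    lines = []
    for base in base_paths:
        for rule in config_lines:
            directive, path = rule.split(": ", 1)
            lines.append(f"{directive}: {base}{path}")
    return lines

rules = [
    "Disallow: /account/",
    "Disallow: /checkout/",
    "Disallow: /widgets/",
    "Allow: /widgets/cms/",
    "Allow: /widgets/menu/offcanvas",
]
print("\n".join(build_rules(["example.com", "example.com/en"], rules)))
```

Running the sketch with the configuration above reproduces the ten lines shown in the example.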

Sitemaps

In addition to the rules, the sitemaps for each domain will be linked. Again, suppose the domains example.com and example.com/en are configured; the robots.txt will then contain:

Sitemap: https://example.com/en/sitemap.xml
Sitemap: https://example.com/sitemap.xml
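A minimal sketch of the assumed sitemap listing (hypothetical helper, not the plugin's code): one Sitemap line per configured domain, pointing at that domain's sitemap.xml.

```python
def sitemap_lines(domains: list[str], scheme: str = "https") -> list[str]:
    """Emit one "Sitemap:" line per configured domain."""
    return [
        f"Sitemap: {scheme}://{domain.rstrip('/')}/sitemap.xml"
        for domain in domains
    ]

print("\n".join(sitemap_lines(["example.com/en", "example.com"])))
```

With the two example domains, this yields exactly the two Sitemap lines shown above.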