frosh / robots-txt
Generate robots.txt
Installs: 11 000
Dependents: 0
Suggesters: 0
Security: 0
Stars: 9
Watchers: 2
Forks: 3
Open Issues: 1
Type: shopware-platform-plugin
Requires
- shopware/core: ~6.5.8 || ~6.6.0
README
This plugin provides a `robots.txt` for a Shopware 6 shop. Currently it is not possible to distinguish between different user agents. Note that in general only the `robots.txt` at the root will be considered.
`Allow` and `Disallow` rules
Currently, the following default rules exist:
```
Allow: /
Disallow: /*?
Allow: /*theme/
Allow: /media/*?ts=
```
If you need to modify them, this should be done via a template modification; a minimal sketch follows below. If there are other general rules that would be useful to others, consider creating a pull request.
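As a rough illustration, such a template override might look like the following Twig sketch. The template path `@Storefront/storefront/page/robots-txt/robots-txt.twig` and the block name `robots_txt_rules` are assumptions made for this example, not names taken from the plugin; check the plugin's `Resources/views` directory for the actual path and block names.

```twig
{# Hypothetical override: the extended template path and the block name are
   assumptions for illustration only — look up the real ones in the plugin's
   Resources/views directory. #}
{% sw_extends '@Storefront/storefront/page/robots-txt/robots-txt.twig' %}

{% block robots_txt_rules %}
    {{ parent() }}
Disallow: /internal/
{% endblock %}
```

Shopware resolves template inheritance across extensions, so the override only needs to sit at the same relative path inside your theme's or plugin's `Resources/views` folder.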
It is possible to configure the `Disallow` and `Allow` rules in the plugin configuration. Each line needs to start with `Allow:` or `Disallow:`, followed by the URI path. The generated `robots.txt` will contain each path once per domain, prefixed with that domain's absolute base path.
For example, suppose you have two "domains" configured for a sales channel, `example.com` and `example.com/en`, and the following plugin configuration:
```
Disallow: /account/
Disallow: /checkout/
Disallow: /widgets/
Allow: /widgets/cms/
Allow: /widgets/menu/offcanvas
```
The `robots.txt` at `example.com/robots.txt` then contains:
```
Disallow: /en/account/
Disallow: /en/checkout/
Disallow: /en/widgets/
Allow: /en/widgets/cms/
Allow: /en/widgets/menu/offcanvas
Disallow: /account/
Disallow: /checkout/
Disallow: /widgets/
Allow: /widgets/cms/
Allow: /widgets/menu/offcanvas
```
Sitemaps
In addition to the rules, a sitemap link for each configured domain is included. Again, suppose the domains `example.com` and `example.com/en` are configured; the `robots.txt` will then contain:
```
Sitemap: https://example.com/en/sitemap.xml
Sitemap: https://example.com/sitemap.xml
```