mediawiki / crawlable-all-pages
Extension to remove robot restrictions from Special:AllPages in MediaWiki
1.0.0
2024-08-24 04:26 UTC
Requires (Dev)
- mediawiki/mediawiki-codesniffer: 44.0.0
- mediawiki/minus-x: 1.1.3
- php-parallel-lint/php-console-highlighter: 1.0.0
- php-parallel-lint/php-parallel-lint: 1.4.0
- phpmd/phpmd: ~2.1
README
This extension overrides Special:AllPages by changing the HTML `<head>` of the page. This is a simple way to allow a search engine crawler to index all the pages in your wiki.
The HTML removed is simply:
<meta name="robots" content="noindex,nofollow"/>
Installation without Composer
- Download the files and place them in a directory called `CrawlableAllPages` in your `extensions/` folder (see the sketch after this list).
- Add the following code at the bottom of your `LocalSettings.php`:
  wfLoadExtension( 'CrawlableAllPages' );
- ✓ Done – Navigate to Special:Version on your wiki to verify that the extension is successfully installed.
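For the download step above, a hedged sketch assuming you fetch the code with Git (substitute the repository URL you obtained the extension from):

```bash
# Clone into the wiki's extensions/ directory; <repository-url> is a placeholder.
cd /path/to/mediawiki/extensions
git clone <repository-url> CrawlableAllPages
```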
Installation with Composer
- If you do not have a `composer.local.json` file in your MediaWiki installation, create one:
  echo '{ "require": { "mediawiki/crawlable-all-pages": "dev-master" } }' > composer.local.json
- If you already have a `composer.local.json` and have jq and moreutils’ sponge installed, you can use the following command to add this extension to your `composer.local.json` file:
  jq '.require += { "mediawiki/crawlable-all-pages": "dev-master" }' \
      composer.local.json | sponge composer.local.json
- Run `composer update`
- ✓ Done – Navigate to Special:Version on your wiki to verify that the extension is successfully installed.
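With either installation method, you can also verify the effect directly: fetch Special:AllPages and check that the restrictive meta tag is gone (the wiki URL below is a placeholder for your own):

```bash
# Before installing, this prints the noindex,nofollow meta tag;
# after installing, it should print no restrictive robots policy.
curl -s 'https://wiki.example.org/index.php/Special:AllPages' | grep -i 'name="robots"'
```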