mediawiki / crawlable-all-pages
Extension to remove robot restrictions from Special:AllPages in MediaWiki
Type: mediawiki-extension
pkg:composer/mediawiki/crawlable-all-pages
Requires (Dev)
- mediawiki/mediawiki-codesniffer: 44.0.0
- mediawiki/minus-x: 1.1.3
- php-parallel-lint/php-console-highlighter: 1.0.0
- php-parallel-lint/php-parallel-lint: 1.4.0
- phpmd/phpmd: ~2.1
This package is auto-updated.
Last update: 2025-10-21 23:30:34 UTC
README
This extension overrides Special:AllPages by changing the HTML head of the page. This is a relatively easy way to allow search engine crawlers to index all the pages in your wiki.
The HTML removed is simply:
<meta name="robots" content="noindex,nofollow"/>
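Once the extension is active, a quick grep of the served HTML should confirm the tag is gone. Below is a minimal sketch of that check, run against a sample head string rather than a live wiki; the curl URL in the comment is a placeholder, not part of this package:

```shell
# On a live wiki you would fetch the page and grep for the tag, e.g.
# (replace the placeholder URL with your wiki's Special:AllPages):
#   curl -s 'https://example.org/index.php/Special:AllPages' | grep 'name="robots"'
# Self-contained version of the same check against sample markup:
page='<head><title>All pages</title></head>'   # head with the meta tag removed
if printf '%s' "$page" | grep -q 'noindex,nofollow'; then
  echo "still blocked"
else
  echo "crawlable"
fi
```

With the extension installed, the grep finds no `noindex,nofollow` tag and the check reports the page as crawlable.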
Installation without composer
- Download and place the files in a directory called CrawlableAllPages in your extensions/ folder.
- Add the following code at the bottom of your LocalSettings.php:
  wfLoadExtension( 'CrawlableAllPages' );
- ✓ Done – Navigate to Special:Version on your wiki to verify that the extension is successfully installed.
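The manual steps above can be rehearsed in a throwaway directory first. In this sketch, /tmp/wiki-demo stands in for a real MediaWiki root (assumption: LocalSettings.php sits next to the extensions/ folder, as in a standard install):

```shell
# Placeholder layout standing in for a MediaWiki root (not a real install)
mkdir -p /tmp/wiki-demo/extensions/CrawlableAllPages
touch /tmp/wiki-demo/LocalSettings.php
# Register the extension, as in the second step above
echo "wfLoadExtension( 'CrawlableAllPages' );" >> /tmp/wiki-demo/LocalSettings.php
# Confirm the load line is present
grep CrawlableAllPages /tmp/wiki-demo/LocalSettings.php
```

On a real wiki the extension files go into the directory and the `echo` line is instead added to your actual LocalSettings.php by hand.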
Installation with composer
- If you do not have a composer.local.json file in your MediaWiki installation, create one:
  echo '{ "require": { "mediawiki/crawlable-all-pages": "dev-master" } }' > composer.local.json
- If you have jq and moreutils' sponge installed and an existing composer.local.json, you can use the following command to add this extension to your composer.local.json file:
  jq '.require += { "mediawiki/crawlable-all-pages": "dev-master" }' \
    composer.local.json | sponge composer.local.json
- Run composer update
- ✓ Done – Navigate to Special:Version on your wiki to verify that the extension is successfully installed.
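The composer route can likewise be sketched end-to-end in a scratch directory: create composer.local.json with the quoted-key form of the echo command, then sanity-check it before running composer update (python3 is used here only for JSON validation, because it is commonly available):

```shell
# Work in a scratch directory so nothing touches a real install
cd "$(mktemp -d)"
# Create composer.local.json requiring the extension
echo '{ "require": { "mediawiki/crawlable-all-pages": "dev-master" } }' \
  > composer.local.json
# Confirm the file is valid JSON before handing it to `composer update`
python3 -m json.tool composer.local.json
```

A malformed composer.local.json (for example, an unquoted key or a missing brace) makes `composer update` fail, so validating the file first saves a round trip.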