fuelviews / laravel-robots-txt
Laravel robots txt package
Requires
- php: ^8.3
- illuminate/contracts: ^10.0||^11.0||^12.0
- spatie/laravel-package-tools: ^1.92
Requires (Dev)
- driftingly/rector-laravel: ^2.0
- laravel/pint: ^1.14
- nunomaduro/collision: ^8.1.1||^7.10.0
- orchestra/testbench: ^10.2||^9.0.0||^8.22.0
- pestphp/pest: ^3.0||^2.34
- pestphp/pest-plugin-arch: ^3.0||^2.7
- pestphp/pest-plugin-laravel: ^3.2||^2.3
- rector/rector: ^2.0
README
Laravel Robots.txt is a robust and easy-to-use solution designed to automatically generate and serve dynamic robots.txt files for your Laravel application. The package provides intelligent caching, environment-based rules, and seamless integration with your application's routing system.
Requirements
- PHP ^8.3
- Laravel ^10.0 || ^11.0 || ^12.0
Installation
Install the package via Composer:
composer require fuelviews/laravel-robots-txt
Publish the configuration file:
php artisan vendor:publish --tag="robots-txt-config"
Basic Usage
Automatic Route Registration
The package automatically registers a route at /robots.txt that serves your dynamic robots.txt file:
https://yoursite.com/robots.txt
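For a quick sanity check that the route is registered in your application, a minimal feature test along these lines works (a sketch assuming Pest with Laravel's TestCase bound via uses(); the generated output always contains a User-agent line):

it('serves a dynamic robots.txt', function () {
    // Hit the route registered by the package
    $this->get('/robots.txt')
        ->assertOk()
        ->assertSee('User-agent:');
});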
Configuration
Configure your robots.txt rules in config/robots-txt.php:
<?php

return [
    /**
     * The disk where the robots.txt file will be saved
     */
    'disk' => 'public',

    /**
     * User agent rules for different paths
     */
    'user_agents' => [
        '*' => [
            'Allow' => [
                '/',
            ],
            'Disallow' => [
                '/admin',
                '/dashboard',
            ],
        ],
        'Googlebot' => [
            'Allow' => [
                '/',
            ],
            'Disallow' => [
                '/admin',
            ],
        ],
    ],

    /**
     * Sitemaps to include in robots.txt
     */
    'sitemap' => [
        'sitemap.xml',
    ],
];
Environment Behavior
Development/Staging Environments
In non-production environments (app.env !== 'production'), the package automatically generates a restrictive robots.txt:
User-agent: *
Disallow: /
This prevents search engines from indexing your development or staging sites.
Production Environment
In production, the package uses your configured rules to generate the robots.txt file.
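The environment check reads the app.env config value, which by Laravel convention mirrors APP_ENV in your .env file:

# .env — any value other than "production" triggers the restrictive output shown above
APP_ENV=production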
Advanced Usage
Using the Facade
use Fuelviews\RobotsTxt\Facades\RobotsTxt;

// Get robots.txt content
$content = RobotsTxt::getContent();

// Generate fresh content (bypasses cache)
$content = RobotsTxt::generate();

// Save to a specific disk and path
RobotsTxt::saveToFile('s3', 'seo/robots.txt');
Direct Class Usage
use Fuelviews\RobotsTxt\RobotsTxt;

$robotsTxt = app(RobotsTxt::class);

// Get content (regenerates only if needed)
$content = $robotsTxt->getContent();

// Generate and save to custom location
$robotsTxt->saveToFile('public', 'custom-robots.txt');
Named Routes
The package registers a named route that you can reference:
// In your views
<link rel="robots" href="{{ route('robots') }}">

// Generate URL
$robotsUrl = route('robots');
Configuration Options
Disk Configuration
Specify which Laravel filesystem disk to use for storing the robots.txt file:
'disk' => 'public', // or 's3', 'local', etc.
User Agent Rules
Define rules for different user agents:
'user_agents' => [
    '*' => [
        'Allow' => ['/'],
        'Disallow' => ['/admin', '/dashboard'],
    ],
    'Googlebot' => [
        'Allow' => ['/api/public/*'],
        'Disallow' => ['/api/private/*'],
    ],
    'Bingbot' => [
        'Crawl-delay' => ['1'],
        'Disallow' => ['/admin'],
    ],
],
Sitemap Integration
Include sitemap URLs in your robots.txt:
'sitemap' => [
    'sitemap.xml',
    'posts-sitemap.xml',
    'categories-sitemap.xml',
],
This generates:
Sitemap: https://yoursite.com/sitemap.xml
Sitemap: https://yoursite.com/posts-sitemap.xml
Sitemap: https://yoursite.com/categories-sitemap.xml
Caching System
The package uses an intelligent caching system that regenerates the robots.txt file only when:
- The configuration changes
- The application environment changes
- The application URL changes
- The cached file doesn't exist
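Conceptually, this can be pictured as comparing a checksum of those inputs against the cached robots-txt.checksum value (a simplified sketch, not the package's exact internals):

use Illuminate\Support\Facades\Cache;
use Illuminate\Support\Facades\Storage;

// Hypothetical illustration of checksum-based invalidation
$checksum = md5(serialize([
    config('robots-txt'),  // the configuration
    config('app.env'),     // the application environment
    config('app.url'),     // the application URL
]));

$needsRegeneration = Cache::get('robots-txt.checksum') !== $checksum
    || ! Storage::disk(config('robots-txt.disk'))->exists('robots-txt/robots.txt');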
Cache Management
Cache is automatically managed, but you can clear it manually:
use Illuminate\Support\Facades\Cache;

// Clear the robots.txt cache
Cache::forget('robots-txt.checksum');
File Storage
Automatic Storage
The package automatically stores the generated robots.txt file to your configured disk at robots-txt/robots.txt.
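If you need that stored copy elsewhere (for example, to serve it from a CDN), the standard Storage facade can read it back (a sketch assuming the default path shown above):

use Illuminate\Support\Facades\Storage;

// Read the generated file from the configured disk
$contents = Storage::disk(config('robots-txt.disk'))->get('robots-txt/robots.txt');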
Custom Storage
use Fuelviews\RobotsTxt\Facades\RobotsTxt;

// Save to specific location
RobotsTxt::saveToFile('s3', 'seo/robots.txt');

// Save to multiple locations
RobotsTxt::saveToFile('public', 'robots.txt');
RobotsTxt::saveToFile('backup', 'robots-backup.txt');
Example Generated Output
Production Environment
User-agent: *
Allow: /
Disallow: /admin
Disallow: /dashboard
User-agent: Googlebot
Allow: /
Disallow: /admin
Sitemap: https://yoursite.com/sitemap.xml
Sitemap: https://yoursite.com/posts-sitemap.xml
Non-Production Environment
User-agent: *
Disallow: /
Testing
Run the package tests:
composer test
Troubleshooting
Robots.txt Not Updating
If your robots.txt isn't reflecting configuration changes:
- Clear the application cache:
php artisan cache:clear
- Ensure your configuration is valid
- Check file permissions for the storage disk
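If you cache configuration in production, also clear the config cache so changes to config/robots-txt.php are picked up (standard Laravel command):

php artisan config:clear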
Route Conflicts
If you have an existing /robots.txt route or static file:
- Remove any static public/robots.txt file (the package automatically removes it)
- Ensure no other routes conflict with /robots.txt
Changelog
Please see CHANGELOG for more information on what has changed recently.
Contributing
Please see CONTRIBUTING for details.
Security Vulnerabilities
Please review our security policy on how to report security vulnerabilities.
Credits
📜 License
The MIT License (MIT). Please see License File for more information.
Built with ❤️ by the Fuelviews team