fuelviews/laravel-robots-txt

Laravel robots.txt package

Laravel Robots.txt is a robust and easy-to-use solution designed to automatically generate and serve dynamic robots.txt files for your Laravel application. The package provides intelligent caching, environment-based rules, and seamless integration with your application's routing system.

Requirements

  • PHP ^8.3
  • Laravel ^10.0 || ^11.0 || ^12.0

Installation

Install the package via Composer:

composer require fuelviews/laravel-robots-txt

Publish the configuration file:

php artisan vendor:publish --tag="robots-txt-config"

Basic Usage

Automatic Route Registration

The package automatically registers a route at /robots.txt that serves your dynamic robots.txt file:

https://yoursite.com/robots.txt
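
Under the hood, the route is roughly equivalent to the following sketch (a simplified illustration, not the package's exact code; the robots route name and the RobotsTxt::getContent() call are documented in the sections below):

use Illuminate\Support\Facades\Route;
use Fuelviews\RobotsTxt\Facades\RobotsTxt;

// Simplified sketch: serve the generated content as plain text
Route::get('robots.txt', function () {
    return response(RobotsTxt::getContent(), 200)
        ->header('Content-Type', 'text/plain');
})->name('robots');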

Configuration

Configure your robots.txt rules in config/robots-txt.php:

<?php

return [
    /**
     * The disk where the robots.txt file will be saved
     */
    'disk' => 'public',

    /**
     * User agent rules for different paths
     */
    'user_agents' => [
        '*' => [
            'Allow' => [
                '/',
            ],
            'Disallow' => [
                '/admin',
                '/dashboard',
            ],
        ],
        'Googlebot' => [
            'Allow' => [
                '/',
            ],
            'Disallow' => [
                '/admin',
            ],
        ],
    ],

    /**
     * Sitemaps to include in robots.txt
     */
    'sitemap' => [
        'sitemap.xml',
    ],
];

Environment Behavior

Development/Staging Environments

In non-production environments (app.env !== 'production'), the package automatically generates a restrictive robots.txt:

User-agent: *
Disallow: /

This prevents search engines from indexing your development or staging sites.

Production Environment

In production, the package uses your configured rules to generate the robots.txt file.
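
Conceptually, the switch between the two behaviors boils down to an environment check like the one below (a simplified sketch, not the package's internal code):

use Illuminate\Support\Facades\App;
use Fuelviews\RobotsTxt\Facades\RobotsTxt;

if (App::environment('production')) {
    // Production: content is built from the configured user agents and sitemaps
    $content = RobotsTxt::generate();
} else {
    // Any other environment: block all crawlers
    $content = "User-agent: *\nDisallow: /\n";
}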

Advanced Usage

Using the Facade

use Fuelviews\RobotsTxt\Facades\RobotsTxt;

// Get robots.txt content
$content = RobotsTxt::getContent();

// Generate fresh content (bypasses cache)
$content = RobotsTxt::generate();

// Save to a specific disk and path
RobotsTxt::saveToFile('s3', 'seo/robots.txt');

Direct Class Usage

use Fuelviews\RobotsTxt\RobotsTxt;

$robotsTxt = app(RobotsTxt::class);

// Get the current content (regenerates only if needed)
$content = $robotsTxt->getContent();

// Generate and save to custom location
$robotsTxt->saveToFile('public', 'custom-robots.txt');

Named Routes

The package registers a named route that you can reference:

// In your views
<link rel="robots" href="{{ route('robots') }}">

// Generate URL
$robotsUrl = route('robots');

Configuration Options

Disk Configuration

Specify which Laravel filesystem disk to use for storing the robots.txt file:

'disk' => 'public', // or 's3', 'local', etc.

User Agent Rules

Define rules for different user agents:

'user_agents' => [
    '*' => [
        'Allow' => ['/'],
        'Disallow' => ['/admin', '/dashboard'],
    ],
    'Googlebot' => [
        'Allow' => ['/api/public/*'],
        'Disallow' => ['/api/private/*'],
    ],
    'Bingbot' => [
        'Crawl-delay' => ['1'],
        'Disallow' => ['/admin'],
    ],
],
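
Assuming each directive is emitted on its own line, the Bingbot entry above renders roughly as:

User-agent: Bingbot
Crawl-delay: 1
Disallow: /admin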

Sitemap Integration

Include sitemap URLs in your robots.txt:

'sitemap' => [
    'sitemap.xml',
    'posts-sitemap.xml',
    'categories-sitemap.xml',
],

This generates:

Sitemap: https://yoursite.com/sitemap.xml
Sitemap: https://yoursite.com/posts-sitemap.xml
Sitemap: https://yoursite.com/categories-sitemap.xml

Caching System

The package uses a checksum-based caching system that regenerates the robots.txt file only when one of the following holds (see the sketch after this list):

  • The configuration changes
  • The application environment changes
  • The application URL changes
  • The cached file doesn't exist
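
A minimal sketch of how such a check can work, assuming the robots-txt.checksum cache key shown under Cache Management below (the actual fingerprint inputs may differ):

use Fuelviews\RobotsTxt\Facades\RobotsTxt;
use Illuminate\Support\Facades\Cache;
use Illuminate\Support\Facades\Storage;

// Fingerprint the inputs that should trigger regeneration
$checksum = md5(serialize([
    config('robots-txt'),  // published configuration
    config('app.env'),     // application environment
    config('app.url'),     // application URL
]));

// Regenerate when the fingerprint changed or the stored file is missing
$fileMissing = ! Storage::disk('public')->exists('robots-txt/robots.txt');

if ($fileMissing || Cache::get('robots-txt.checksum') !== $checksum) {
    RobotsTxt::saveToFile('public', 'robots-txt/robots.txt');
    Cache::forever('robots-txt.checksum', $checksum);
}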

Cache Management

Cache is automatically managed, but you can clear it manually:

use Illuminate\Support\Facades\Cache;

// Clear the robots.txt cache
Cache::forget('robots-txt.checksum');

File Storage

Automatic Storage

The package automatically stores the generated robots.txt file to your configured disk at robots-txt/robots.txt.
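
If you need to read the stored file yourself, Laravel's Storage facade works against that default path (this sketch assumes the disk key mirrors the published config file):

use Illuminate\Support\Facades\Storage;

// Read the generated file from the configured disk at its default path
$robots = Storage::disk(config('robots-txt.disk', 'public'))
    ->get('robots-txt/robots.txt');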

Custom Storage

use Fuelviews\RobotsTxt\Facades\RobotsTxt;

// Save to specific location
RobotsTxt::saveToFile('s3', 'seo/robots.txt');

// Save to multiple locations
RobotsTxt::saveToFile('public', 'robots.txt');
RobotsTxt::saveToFile('backup', 'robots-backup.txt');

Example Generated Output

Production Environment

User-agent: *
Allow: /
Disallow: /admin
Disallow: /dashboard

User-agent: Googlebot
Allow: /
Disallow: /admin

Sitemap: https://yoursite.com/sitemap.xml
Sitemap: https://yoursite.com/posts-sitemap.xml

Non-Production Environment

User-agent: *
Disallow: /

Testing

Run the package tests:

composer test

Troubleshooting

Robots.txt Not Updating

If your robots.txt isn't reflecting configuration changes:

  1. Clear the application cache: php artisan cache:clear
  2. Ensure your configuration is valid
  3. Check file permissions for the storage disk

Route Conflicts

If you have an existing /robots.txt route or static file:

  1. Remove any static public/robots.txt file if one remains (the package normally removes it automatically)
  2. Ensure no other routes conflict with /robots.txt

Changelog

Please see CHANGELOG for more information on what has changed recently.

Contributing

Please see CONTRIBUTING for details.

Security Vulnerabilities

Please review our security policy on how to report security vulnerabilities.

Credits

📜 License

The MIT License (MIT). Please see License File for more information.

Built with ❤️ by the Fuelviews team

⭐ Star us on GitHub · 📦 View on Packagist