moe-mizrak / laravel-log-reader
Lightweight Laravel package for reading, searching, and filtering logs from both file and database sources.
- Installs: 53
- Dependents: 1
- Suggesters: 0
- Security: 0
- Stars: 0
- Watchers: 0
- Forks: 0
- Open Issues: 0
- Type: package
- pkg:composer/moe-mizrak/laravel-log-reader
Requires
- php: ^8.4
- spatie/laravel-data: ^4.17
Requires (Dev)
- laravel/pint: ^1.24
- orchestra/testbench: ^9.6.1
- phpunit/phpunit: ^11.4
This package is not auto-updated.
Last update: 2025-10-21 12:14:07 UTC
README
Lightweight Laravel package for searching and filtering logs from both file and database sources.
This package provides the core log reading functionality used by Laravel MCP Log (an MCP tool for analysing Laravel logs with AI).
Installation
You can install the package via Composer:

```bash
composer require moe-mizrak/laravel-log-reader
```
You can publish the package resources (such as the config file) with:

```bash
php artisan vendor:publish --tag="laravel-log-reader"
```
Usage
You can use the package to read, search, and filter logs from both file and database sources.
If your logs are stored in files (`laravel.log`), set the driver to `file` in the config file (`laravel-log-reader.php`):

```php
'driver' => env('LOG_READER_DRIVER', LogDriverType::FILE->value), // in .env file: LOG_READER_DRIVER=file
```
And set the log file path:

```php
'path' => env('LOG_FILE_PATH', storage_path('logs/laravel.log')), // in .env file: LOG_FILE_PATH=/full/path/to/laravel.log
```
You can also set a query limit and chunk size:

```php
'chunk_size' => env('LOG_READER_FILE_CHUNK_SIZE', 512 * 1024), // 512KB for file reading
'limit' => env('LOG_READER_FILE_QUERY_LIMIT', 10000), // max number of records to fetch in queries
```
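Putting the file-driver settings together, the relevant part of `config/laravel-log-reader.php` could look like the sketch below (assembled from the fragments above; the key nesting in the actual published config may differ):

```php
<?php

use MoeMizrak\LaravelLogReader\Enums\LogDriverType;

return [
    // Which backend to read logs from: 'file' or 'db'
    'driver' => env('LOG_READER_DRIVER', LogDriverType::FILE->value),

    // Full path to the log file
    'path' => env('LOG_FILE_PATH', storage_path('logs/laravel.log')),

    // 512KB per chunk for file reading
    'chunk_size' => env('LOG_READER_FILE_CHUNK_SIZE', 512 * 1024),

    // Max number of records to fetch in queries
    'limit' => env('LOG_READER_FILE_QUERY_LIMIT', 10000),
];
```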
The service provider automatically resolves the correct log reader (`FileLogReader`), and you can use it as:

```php
use MoeMizrak\LaravelLogReader\Facades\LogReader;
use MoeMizrak\LaravelLogReader\Enums\FilterKeyType;

$query = 'User authentication';
$filters = [FilterKeyType::LEVEL->value => 'info'];

$result = LogReader::search($query)->filter($filters)->chunk()->execute();
```
If your logs are stored in the database (`log_entries` table), set the driver to `db` in the config file (`laravel-log-reader.php`):

```php
'driver' => env('LOG_READER_DRIVER', LogDriverType::DB->value), // in .env file: LOG_READER_DRIVER=db
```
Set the connection and table name:

```php
'table' => env('LOG_DB_TABLE', 'log_entries'), // in .env file: LOG_DB_TABLE=log_entries
'connection' => env('LOG_DB_CONNECTION'), // in .env file: LOG_DB_CONNECTION=mysql
```
And set the database column mapping and searchable columns:

```php
// Column mapping: maps DB columns to LogData properties
'columns' => [
    LogTableColumnType::ID->value => 'id',
    LogTableColumnType::LEVEL->value => 'level', // e.g. 'ERROR', 'INFO'
    LogTableColumnType::MESSAGE->value => 'message', // main log message
    LogTableColumnType::TIMESTAMP->value => 'created_at', // time of the log entry (e.g. 'created_at' or 'logged_at')
    LogTableColumnType::CHANNEL->value => 'channel', // e.g. 'production', 'local'
    LogTableColumnType::CONTEXT->value => 'context', // additional context info, often JSON e.g. '{"action":"UserLogin"}'
    LogTableColumnType::EXTRA->value => 'extra', // any extra data, often JSON e.g. '{"ip":"172.0.0.1","session_id":"abc","user_id":123}'
],
'searchable_columns' => [
    ['name' => LogTableColumnType::MESSAGE->value, 'type' => ColumnType::TEXT->value],
    ['name' => LogTableColumnType::CONTEXT->value, 'type' => ColumnType::JSON->value],
    ['name' => LogTableColumnType::EXTRA->value, 'type' => ColumnType::JSON->value],
],
```
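If your application does not already have a matching table, a migration along these lines would create one that fits the default column mapping above (a sketch only; the column types and indexes are assumptions, and any migration published by the package may differ):

```php
<?php

use Illuminate\Database\Migrations\Migration;
use Illuminate\Database\Schema\Blueprint;
use Illuminate\Support\Facades\Schema;

return new class extends Migration
{
    public function up(): void
    {
        Schema::create('log_entries', function (Blueprint $table) {
            $table->id();                             // maps to ID
            $table->string('level', 20)->index();     // e.g. 'ERROR', 'INFO'
            $table->text('message');                  // main log message
            $table->string('channel')->nullable();    // e.g. 'production', 'local'
            $table->json('context')->nullable();      // additional context info
            $table->json('extra')->nullable();        // any extra data
            $table->timestamp('created_at')->index(); // time of the log entry
        });
    }

    public function down(): void
    {
        Schema::dropIfExists('log_entries');
    }
};
```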
You can also set a query limit and chunk size:

```php
'limit' => env('LOG_READER_DB_QUERY_LIMIT', 10000), // max number of records to fetch in queries
'chunk_size' => env('LOG_READER_DB_CHUNK_SIZE', 500), // number of records per chunk when chunking is enabled
```
And you can use it as:

```php
use MoeMizrak\LaravelLogReader\Facades\LogReader;
use MoeMizrak\LaravelLogReader\Enums\FilterKeyType;

$query = 'User authentication';
$filters = [
    FilterKeyType::DATE_FROM->value => '2025-01-01',
    FilterKeyType::DATE_TO->value => '2025-12-31',
];

$result = LogReader::search($query)->filter($filters)->chunk()->execute();
```
Note: You can chain the `search`, `filter`, and `chunk` methods in any order before calling `execute`. The `search` method searches the searchable fields (like message, context, etc.) for the provided query; these fields can be customized via `searchable_columns` in the config.
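For instance, because the order of the chained calls does not matter (assuming the builder defers work until `execute` is called), the two snippets below are equivalent; the query and filter values are illustrative:

```php
use MoeMizrak\LaravelLogReader\Facades\LogReader;
use MoeMizrak\LaravelLogReader\Enums\FilterKeyType;

$filters = [FilterKeyType::LEVEL->value => 'error'];

// search -> filter -> chunk
$resultA = LogReader::search('payment failed')->filter($filters)->chunk()->execute();

// chunk -> filter -> search: same outcome, only execute() triggers the read
$resultB = LogReader::chunk()->filter($filters)->search('payment failed')->execute();
```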
TODO
- Add a `log_insights` migration/table which will be a normalized, summarized, and searchable table.
  - It unifies different log mechanisms into a single canonical format, enabling faster lookups over large data.
  - A background task should sync new log data periodically; basically, every day it summarizes the previous day's logs and inserts them into `log_insights`.
  - Be aware that summarization may lose some details (e.g., exact errors or stack traces).
- Add support for cloud log readers (AWS CloudWatch, Azure Monitor, Google Cloud Logging).
- Add streaming responses, either as a parameter to search/filter methods or as a new method like `searchStream` using cursors, yields, or `$builder->lazy($chunkSize)`.
- Use a cheap/free model to summarize large log files before search/filter (experimental approach).
- Refine `LOG_PATTERN` in `FileLogReader` to handle more real-world log formats.
- Move `user_id`, `request_id`, and `ip_address` into dedicated columns instead of using the `extra` field.
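On the `LOG_PATTERN` item: a default-format single-line Laravel log entry can be captured with a regex along the lines of the hypothetical sketch below (not the package's actual pattern); multi-line entries such as stack traces are exactly the kind of real-world format that needs extra handling:

```php
<?php

// Hypothetical pattern for a default-format Laravel log line, e.g.
// [2025-01-15 10:30:00] production.ERROR: Something went wrong
$pattern = '/^\[(?<timestamp>[^\]]+)\]\s+(?<channel>\w+)\.(?<level>\w+):\s+(?<message>.*)$/';

$line = '[2025-01-15 10:30:00] production.ERROR: Undefined variable $user';

if (preg_match($pattern, $line, $matches)) {
    echo $matches['timestamp'] . PHP_EOL; // 2025-01-15 10:30:00
    echo $matches['level'] . PHP_EOL;     // ERROR
    echo $matches['message'] . PHP_EOL;   // Undefined variable $user
}
```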
Contributing
Your contributions are welcome! If you'd like to improve this project, simply create a pull request with your changes. Your efforts help enhance its functionality and documentation.
If you find this project useful, please consider ⭐ it to show your support!
Authors
This project is created and maintained by Moe Mizrak.
License
Laravel Log Reader is open-sourced software licensed under the MIT license.