# bluefly/llm

Core LLM functionality and integration for the platform.
**Requires**

- php: >=8.1
- drupal/core: ^10.3 || ^11

**Requires (dev)**

- drupal/core-dev: ^10.3 || ^11
- phpunit/phpunit: ^9.5
Drupal module for AI model management and configuration with multi-provider support, security auditing, and usage tracking.
## Features

- **Multi-Provider Support**: OpenAI, Anthropic, and Ollama with automatic failover and smart routing
- **Model Discovery**: Automatic discovery and configuration of AI models across providers
- **Visual Configuration**: Drag-and-drop interface for building AI pipelines and workflows
- **Security Auditing**: OWASP compliance scanning and vulnerability detection
- **Usage Tracking**: Token monitoring, cost calculation, and analytics dashboard
- **Setup Wizard**: Interactive setup with CLI and web interfaces
- **Multi-Tenancy**: Organization- and domain-based isolation with Group/Domain module integration
## Installation

### Prerequisites

- Drupal 10.3+ or Drupal 11.x
- PHP 8.3+
- Composer
- Drupal AI module (`drupal/ai`)
### Setup

```bash
# Install via Composer
composer require drupal/llm

# Enable the module
drush en llm -y

# Run database updates and clear caches
drush updatedb && drush cr
```
## Quick Start

```bash
# Run the interactive setup wizard
drush llm:setup

# Auto-discover models from a provider
drush llm:discovery auto --provider=ollama

# Test provider connectivity
drush llm:test-provider openai

# Run a security audit
drush llm:security:audit
```
## Usage

### Basic AI Operations

```php
// Get the AI chat service.
$chatService = \Drupal::service('llm.ai_chat');

// Send a message.
$response = $chatService->sendMessage('Explain quantum computing', [
  'provider' => 'openai',
  'model' => 'gpt-4',
  'temperature' => 0.7,
]);

$content = $response['content'];
$usage = $response['usage'];
```
### Security Auditing

```php
// Run a comprehensive security audit.
$auditor = \Drupal::service('llm.security.owasp_auditor');
$results = $auditor->performSecurityAudit(['all']);

// Filter the findings down to critical severity.
$critical = array_filter($results['findings'], function ($finding) {
  return $finding['severity'] === 'critical';
});
```
## API Reference

Core services and endpoints:

### REST Endpoints

```http
# Chat completion
POST /api/llm/v1/chat
Content-Type: application/json

{
  "message": "Hello",
  "provider": "openai",
  "model": "gpt-4"
}

# List providers
GET /api/llm/v1/providers

# Get usage statistics
GET /api/llm/v1/usage/{user_id}
```
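The chat endpoint can also be exercised from the command line. A minimal sketch with `curl`, assuming a site reachable at `$BASE_URL` and that authentication (session cookie, key header, or similar) is configured separately — the payload and path come from the chat completion example above:

```shell
# JSON body matching the chat completion example above.
BODY='{"message": "Hello", "provider": "openai", "model": "gpt-4"}'

# Sanity-check that the payload is valid JSON before sending.
echo "$BODY" | python3 -m json.tool > /dev/null && echo "payload ok"

# Send it (uncomment once the module is installed and the site is running;
# BASE_URL and any auth headers are placeholders for your site's values):
# curl -sS -X POST "$BASE_URL/api/llm/v1/chat" \
#      -H "Content-Type: application/json" \
#      -d "$BODY"
```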
### Services

```text
# Core services
llm.platform_manager          # Central coordinator
llm.ai_chat                   # Chat operations
llm.usage_tracker             # Usage tracking
llm.cost_calculator           # Cost calculation
llm.security.owasp_auditor    # Security auditing
```
### Drush Commands

```bash
# Setup and discovery
drush llm:setup                              # Launch setup wizard
drush llm:discovery auto --provider=ollama   # Auto-discover models
drush llm:health                             # Check model health
drush llm:test-provider openai               # Test provider connectivity

# Security and usage
drush llm:security:audit                     # Run security audit
drush llm:usage:stats                        # Get usage statistics
drush llm:usage:export                       # Export usage data
```
## Configuration

Configure the module through Drupal's `settings.php`:

```php
// settings.php
$config['llm.settings']['providers'] = [
  'openai' => [
    'api_key' => getenv('OPENAI_API_KEY'),
    'default_model' => 'gpt-4',
  ],
  'ollama' => [
    'base_url' => 'http://localhost:11434',
    'default_model' => 'llama3.2',
  ],
];

$config['llm.settings']['security'] = [
  'audit_frequency' => 'daily',
  'compliance_standards' => ['owasp', 'fedramp'],
];
```
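Because the OpenAI key is read with `getenv()`, it must be present in the environment of both the web server and any Drush process. A minimal sketch (the key value is a placeholder, not a real credential):

```shell
# Export the key before running drush or starting the web server.
# Placeholder value; in practice, load it from a secrets manager or .env file.
export OPENAI_API_KEY="sk-REPLACE-ME"

# Confirm it is visible to child processes.
[ -n "$OPENAI_API_KEY" ] && echo "OPENAI_API_KEY is set"
```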
## Contributing

See CONTRIBUTING.md for development guidelines.

## License

GPL-2.0+; see the LICENSE file.
## Documentation

- Service documentation in PHPDoc comments
- Plugin development guides in `src/Plugin/*/README.md`
- Configuration schemas co-located with config files
- Integration examples in each service directory
Built with ❤️ by the LLM Platform Team