bluefly / llm
Clean AI coordination platform for Drupal. Orchestrates contrib AI modules with native patterns, multi-provider support, and ECA workflows.
Requires
- php: ^8.1
- drupal/core: ^10.3 || ^11
Suggests
- drupal/advancedqueue: Enterprise queue management
- drupal/ai: Base AI module for provider system (^1.0@alpha)
- drupal/ai_agents: Advanced agent orchestration
- drupal/ai_automators: AI automation tools
- drupal/ai_provider_anthropic: Anthropic Claude provider
- drupal/ai_provider_ollama: Local AI model provider
- drupal/ai_provider_openai: OpenAI GPT provider
- drupal/eca: Event-Condition-Action workflows (^2.0)
- drupal/encrypt: Key encryption (^3.0)
- drupal/facets: Faceted search capabilities
- drupal/key: Secure API key management (^1.0)
- drupal/mcp: Model Context Protocol (MCP) integration
- drupal/mcp_client: MCP client implementation
- drupal/redis: High-performance caching
- drupal/search_api: Search framework for vector operations
- drupal/search_api_solr: Solr backend for advanced search
- drupal/views_bulk_operations: Bulk operations for AI content
- guzzlehttp/guzzle: HTTP client library (^7.0)
README
AI coordination layer extending the Drupal contrib AI ecosystem with conversation management, usage tracking, and provider orchestration.
Overview
The LLM Platform Core module serves as the central coordination layer for Drupal's AI ecosystem. Rather than reinventing functionality, it leverages and extends the contrib AI modules to provide a unified platform for AI integration, conversation management, and multi-provider orchestration.
Features
- Provider Orchestration: Coordinates multiple AI providers through the contrib AI module ecosystem
- Conversation Management: Entity-based conversation storage with proper access control and encryption
- Usage Tracking: Comprehensive token usage monitoring across all providers for cost management
- AI-Powered Development: Generate modules, recipes, and tests using AI assistance
- Vector Embeddings: Store and search vector embeddings for RAG workflows
- Workflow Automation: ECA integration for AI-powered content workflows
- Security & Compliance: Built-in audit logging, encryption, and compliance features
- Platform Dashboard: Centralized overview of AI providers and system status
Requirements
- Drupal 10.3 or higher
- PHP 8.1 or higher
- AI module (drupal/ai) and ecosystem
- Key module (drupal/key) for secure credential storage
- Encrypt module (drupal/encrypt) for data encryption
- ECA module suite for workflow automation
Installation
Install via Composer:

```shell
composer require bluefly/llm
```

Enable the module and dependencies:

```shell
drush en llm -y
```

Configure at /admin/config/ai/llm.
Core Concepts
Provider Coordination
The module acts as an orchestration layer for contrib AI providers:
```php
// Discover available providers.
$providerManager = \Drupal::service('llm.provider_manager');
$providers = $providerManager->getAvailableProviders();

// Use the best provider for the task.
$response = $providerManager->execute('chat', [
  'message' => 'Explain Drupal architecture',
  'preferences' => ['speed' => 'fast', 'cost' => 'low'],
]);
```
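The execute() call above implies some provider-scoring logic behind the scenes. As a rough, framework-free sketch of how preference matching might work (the function name, scoring rule, and capability keys are illustrative assumptions, not the module's actual llm.provider_manager API):

```php
<?php

// Hypothetical sketch: pick the provider whose advertised capabilities
// match the most caller preferences. Capability keys are assumptions.

/**
 * Returns the ID of the best-matching provider, or NULL if none exist.
 *
 * @param array $providers   Map of provider ID => capability map.
 * @param array $preferences Desired capabilities, e.g. ['cost' => 'low'].
 */
function select_provider(array $providers, array $preferences): ?string {
  $best = NULL;
  $bestScore = -1;
  foreach ($providers as $id => $capabilities) {
    $score = 0;
    foreach ($preferences as $key => $value) {
      if (($capabilities[$key] ?? NULL) === $value) {
        // Each satisfied preference counts one point.
        $score++;
      }
    }
    if ($score > $bestScore) {
      $bestScore = $score;
      $best = $id;
    }
  }
  return $best;
}
```

With preferences like `['speed' => 'fast', 'cost' => 'low']`, a local Ollama provider would outrank a fast but expensive hosted one under this rule.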
Conversation Management
Conversations are stored as entities with full access control:
```php
// Create a conversation.
$conversation = \Drupal::entityTypeManager()
  ->getStorage('ai_conversation')
  ->create([
    'title' => 'Architecture Discussion',
    'provider' => 'openai',
    'model' => 'gpt-4',
  ]);
$conversation->save();

// Add messages.
$conversation->addMessage('user', 'What is Drupal?');
$conversation->addMessage('assistant', 'Drupal is an open-source CMS...');
```
Usage Tracking
Monitor token usage and costs across all providers:
```php
// Get usage statistics.
$usageTracker = \Drupal::service('llm.usage_tracker');
$stats = $usageTracker->getStatistics('monthly');

// Set usage limits.
$usageTracker->setLimit('openai', 1000000); // 1M tokens.
```
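Conceptually, a limit like the one set above reduces to per-provider bookkeeping. A minimal sketch with in-memory counters (the real llm.usage_tracker service persists counts and tracks cost, so this class is only an illustration):

```php
<?php

// Illustrative in-memory token budget; not the module's storage-backed
// usage tracker. Class and method names are assumptions.

class TokenBudget {
  /** @var array<string, int> Tokens consumed per provider. */
  private array $used = [];
  /** @var array<string, int> Configured limit per provider. */
  private array $limits = [];

  public function setLimit(string $provider, int $maxTokens): void {
    $this->limits[$provider] = $maxTokens;
  }

  public function record(string $provider, int $tokens): void {
    $this->used[$provider] = ($this->used[$provider] ?? 0) + $tokens;
  }

  public function remaining(string $provider): int {
    // Providers without a configured limit are effectively unlimited.
    $limit = $this->limits[$provider] ?? PHP_INT_MAX;
    return max(0, $limit - ($this->used[$provider] ?? 0));
  }
}
```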
AI-Powered Development
Module Generation
Generate complete Drupal modules with AI assistance:
```shell
# Basic module generation
drush ai:generate-module my_feature \
  --description="User analytics dashboard" \
  --ai-provider=ollama

# Advanced generation with requirements
drush ai:generate-module commerce_assistant \
  --type=api \
  --requirements='{"features":["product_search","recommendations"]}' \
  --tddai-init \
  --coverage-target=90
```
Recipe Creation
Create comprehensive Drupal recipes:
```shell
# Enterprise platform recipe
drush ai:create-recipe enterprise_platform \
  --type=enterprise_ai_platform \
  --modules='["llm","ai_agent_orchestra","gov_compliance"]' \
  --deployment-ready

# Custom solution recipe
drush ai:create-recipe healthcare_portal \
  --requirements='{"compliance":["HIPAA"],"features":["patient_chat"]}' \
  --ai-enhanced
```
Test Generation
Automatically generate comprehensive test suites:
```shell
# Generate tests for a service
drush ai:generate-tests src/Service/MyService.php \
  --test-types=unit,functional \
  --coverage-target=95

# Generate tests for entire module
drush ai:generate-tests modules/custom/my_module \
  --comprehensive \
  --drupal-standards
```
Code Analysis
AI-powered code analysis and improvements:
```shell
# Analyze code quality
drush ai:analyze-code src/Service/MyService.php \
  --checks=quality,security,performance

# Get improvement suggestions
drush ai:assist src/Controller/MyController.php \
  --intent=optimize \
  --drupal-best-practices
```
Vector Embeddings and RAG
Storing Embeddings
```php
// Generate and store embeddings.
$embeddingService = \Drupal::service('llm.embedding_service');
$vector = $embeddingService->generateEmbedding($text);

// Store with metadata.
$storage = \Drupal::service('llm.vector_storage');
$storage->store([
  'vector' => $vector,
  'content' => $text,
  'entity_type' => 'node',
  'entity_id' => $node->id(),
]);
```
Similarity Search
```php
// Search for similar content.
$query = "Drupal performance optimization";
$queryVector = $embeddingService->generateEmbedding($query);
$results = $storage->search($queryVector, [
  'limit' => 10,
  'threshold' => 0.8,
]);
```
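The threshold above is typically a cosine-similarity cutoff: 1.0 means identical direction, 0.0 means orthogonal (unrelated). A self-contained reference implementation, assuming both vectors have the same length:

```php
<?php

// Reference cosine similarity between two equal-length vectors.
// This is the standard formula, not the module's storage backend.

/**
 * @param float[] $a
 * @param float[] $b
 */
function cosine_similarity(array $a, array $b): float {
  $dot = 0.0;
  $normA = 0.0;
  $normB = 0.0;
  foreach ($a as $i => $value) {
    $dot += $value * $b[$i];
    $normA += $value ** 2;
    $normB += $b[$i] ** 2;
  }
  // Guard against zero vectors to avoid division by zero.
  if ($normA == 0.0 || $normB == 0.0) {
    return 0.0;
  }
  return $dot / (sqrt($normA) * sqrt($normB));
}
```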
Workflow Automation
ECA Integration
Create AI-powered workflows using ECA:
- Navigate to Configuration > ECA > Models
- Create a new model
- Add LLM Platform actions:
  - Generate Content
  - Analyze Sentiment
  - Extract Entities
  - Summarize Text
Example Workflow
```yaml
# Auto-generate content summaries
events:
  - plugin: content_entity:insert
    entity_type: node
    bundle: article
conditions:
  - plugin: llm:content_length
    operator: '>'
    value: 1000
actions:
  - plugin: llm:generate_summary
    field: field_summary
    max_length: 200
    provider: best_available
```
Configuration
Global Settings
Configure at /admin/config/ai/llm:
- Default Provider: Primary AI provider selection
- Fallback Providers: Backup providers for reliability
- Usage Limits: Token and cost limits per provider
- Encryption: Configure encryption for conversations
- Audit Logging: Enable compliance logging
Provider Configuration
Each provider can be configured with:
- API credentials (via Key module)
- Model preferences
- Rate limits
- Cost tracking
- Custom parameters
API Endpoints
The module provides REST endpoints:
Conversations
```
GET  /api/llm/conversations
POST /api/llm/conversations
GET  /api/llm/conversations/{id}
POST /api/llm/conversations/{id}/messages
```
Providers
```
GET  /api/llm/providers
GET  /api/llm/providers/{id}/status
POST /api/llm/providers/{id}/test
```
Usage
```
GET /api/llm/usage
GET /api/llm/usage/{provider}
GET /api/llm/usage/report
```
Security
- Encryption: All conversations encrypted at rest
- Access Control: Entity-based permissions
- API Security: OAuth2/JWT authentication
- Audit Logging: Complete audit trail
- Input Validation: Prompt injection protection
Performance
- Response Caching: Intelligent caching for repeated queries
- Async Processing: Queue API for long-running tasks
- Connection Pooling: Efficient provider connections
- Token Optimization: Automatic prompt optimization
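Response caching for repeated queries hinges on a deterministic cache key: identical provider/model/prompt triples must map to the same key so a stored response can be reused. A minimal sketch (the key layout and function name are assumptions, not the module's actual scheme):

```php
<?php

// Illustrative cache key derivation for LLM responses. Hashing the
// prompt gives a fixed-size key regardless of prompt length.

function llm_cache_key(string $provider, string $model, string $prompt): string {
  return sprintf('llm:%s:%s:%s', $provider, $model, hash('sha256', $prompt));
}
```

Because the key is a pure function of its inputs, any change to the provider, model, or prompt yields a different key and therefore a cache miss, which is exactly the invalidation behavior a response cache needs.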
Extending the Module
Custom Providers
```php
namespace Drupal\my_module\Plugin\LLM\Provider;

use Drupal\llm\Plugin\LLM\ProviderBase;

/**
 * @LLMProvider(
 *   id = "my_provider",
 *   label = @Translation("My AI Provider"),
 *   description = @Translation("Custom AI provider integration")
 * )
 */
class MyProvider extends ProviderBase {

  public function chat($message, $model = NULL) {
    // Implementation.
  }

}
```
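Stripped of Drupal's plugin machinery, the provider contract reduces to a small abstract class. The following hypothetical EchoProvider illustrates the shape under those simplifying assumptions (the real ProviderBase also handles credentials, rate limiting, and usage tracking):

```php
<?php

// Simplified stand-in for the provider contract, outside Drupal.
// SimpleProviderBase and EchoProvider are hypothetical names.

abstract class SimpleProviderBase {

  abstract public function chat(string $message, ?string $model = NULL): string;

}

class EchoProvider extends SimpleProviderBase {

  public function chat(string $message, ?string $model = NULL): string {
    // A real provider would call its HTTP API here (e.g. via Guzzle).
    return sprintf('[%s] %s', $model ?? 'default', $message);
  }

}
```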
Custom Actions
```php
namespace Drupal\my_module\Plugin\ECA\Action;

use Drupal\llm\Plugin\ECA\Action\LLMActionBase;

/**
 * @ECAAction(
 *   id = "my_llm_action",
 *   label = @Translation("My LLM Action")
 * )
 */
class MyLLMAction extends LLMActionBase {

  public function execute() {
    // Implementation.
  }

}
```
Troubleshooting
Provider Connection Issues
- Verify API credentials in Key module
- Check provider status at /admin/config/ai/llm/providers
- Review logs at /admin/reports/dblog
Performance Issues
- Enable caching in LLM settings
- Use queue processing for bulk operations
- Monitor token usage for optimization
Support
- Issue Queue: https://github.com/bluefly/llm/issues
- Documentation: https://docs.bluefly.dev/llm-platform
- Packagist: https://packagist.org/packages/bluefly/llm
License
This project is licensed under the GPL-2.0-or-later license.