bluefly / llm_platform
AI integration platform recipe for Drupal with LLM provider support and vector database capabilities
Requires
- bluefly/ai_agent_orchestra: *
- bluefly/ai_provider_apple: *
- bluefly/ai_provider_langchain: *
- bluefly/api_normalization: *
- bluefly/llm: *
- bluefly/llm_platform_manager: *
- bluefly/mcp_client_extras: *
- bluefly/recipe_onboarding: *
- drupal/admin_toolbar: ^3.0
- drupal/advancedqueue: ^1.0
- drupal/ai: ^1.0
- drupal/ai_agents: ^1.0
- drupal/ai_automators: ^1.0
- drupal/ai_content_suggestions: ^1.0
- drupal/ai_eca: ^1.0
- drupal/ai_external_moderation: ^1.0
- drupal/ai_logging: ^1.0
- drupal/ai_provider_amazeeio: ^1.0
- drupal/ai_provider_anthropic: ^1.0
- drupal/ai_provider_azure: ^1.0
- drupal/ai_provider_deepseek: ^1.0
- drupal/ai_provider_google_vertex: ^1.0
- drupal/ai_provider_huggingface: ^1.0
- drupal/ai_provider_litellm: ^1.0
- drupal/ai_provider_lmstudio: ^1.0
- drupal/ai_provider_mistral: ^1.0
- drupal/ai_provider_ollama: ^1.0
- drupal/ai_provider_openai: ^1.0
- drupal/ai_provider_x: ^1.0
- drupal/ai_search: ^1.0
- drupal/ai_vdb_provider_milvus: ^1.0
- drupal/ai_vdb_provider_pinecone: ^1.0
- drupal/config_split: ^2.0
- drupal/consumers: ^1.0
- drupal/core: ^10.1 || ^11.0
- drupal/devel: ^4.0
- drupal/eca: ^2.0
- drupal/eca_content: ^2.0
- drupal/elevenlabs: ^1.0
- drupal/feeds: ^3.0
- drupal/field_group: ^3.0
- drupal/gemini_provider: ^1.0
- drupal/hook_event_dispatcher: ^3.0
- drupal/jsonapi_extras: ^3.0
- drupal/openapi: ^3.0
- drupal/pathauto: ^1.0
- drupal/plugin: ^1.0
- drupal/queue_ui: ^1.0
- drupal/redis: ^1.0
- drupal/restui: ^1.0
- drupal/search_api: ^1.0
- drupal/search_api_solr: ^4.0
- drupal/simple_oauth: ^5.0
- drupal/token: ^1.0
- drupal/ultimate_cron: ^2.0
- drupal/webprofiler: ^4.0
Suggests
- bluefly/alternative_services: Service abstraction layer
- bluefly/api_normalization: API response normalization
- bluefly/mcp_client_extras: Model Control Plane client
- drupal/ai: Core AI framework for Drupal
- drupal/elasticsearch_connector: Elasticsearch integration
- drupal/facets: Faceted search interface
- drupal/memcached: Alternative caching backend
- drupal/search_api_attachments: File content indexing
- drupal/search_api_autocomplete: Search autocomplete functionality
- drupal/search_api_elasticsearch: Alternative search backend
README
Status: Production Ready
Drupal: 10.1+ | 11
Type: Site Recipe
Security & Compliance: All security modules, compliance frameworks, and hardening are provided by the Secure Drupal Recipe, which is included as a dependency. The LLM Platform recipe focuses on AI/LLM features and depends on Secure Drupal for all security and compliance.
A comprehensive Drupal recipe that sets up a production-ready AI/LLM integration platform with unified orchestration, vector database management, and enterprise-grade monitoring.
Overview
This recipe installs and configures a complete AI platform for Drupal, providing:
- Core AI Integration: LLM module with multi-provider support
- Modern Admin Interface: Gin admin theme optimized for AI workflows
- Vector Database Support: Semantic search and AI-powered content discovery
- Enterprise Monitoring: Comprehensive analytics and performance tracking
- Content Automation: AI-powered content workflows and moderation
- Dependency Management: Proper module configuration and optimization
- Security & Compliance: All security, encryption, and compliance are managed by the Secure Drupal recipe
Requirements
- Drupal 10.1+ or Drupal 11
- PHP 8.1+
- Node.js 20+ (for complementary TypeScript services)
- One or more AI provider API keys (OpenAI, Anthropic, etc.)
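A minimal install sketch, assuming the recipe is pulled in via Composer and applied with Drupal core's recipe runner (the recipe path below is a placeholder for wherever Composer places it in your project):
# Add the recipe and its dependencies
composer require bluefly/llm_platform
# Apply the recipe with Drupal core's recipe runner (run from the web root)
php core/scripts/drupal recipe <path-to-recipes>/llm_platform
# Rebuild caches after applying
drush cache:rebuild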
Architecture Overview
Unified AI Platform
                      AI LLM Platform - Unified
                                  |
      +-------------------------------------------------------+
      |                 UnifiedAiOrchestrator                  |
      |              (Central Coordination Hub)                |
      +-------------------------------------------------------+
                                  |
      +-------------+-------------+-------------+-------------+
      |             |             |             |             |
+----------+  +----------+  +----------+  +----------+  +----------+
|  Vector  |  |Autonomous|  |Monitoring|  | Security |  |Deployment|
| Database |  |  Agents  |  | Service  |  |   (via   |  | Manager  |
| Manager  |  | Manager  |  |          |  |  Secure  |  |          |
|          |  |          |  |          |  |  Drupal) |  |          |
+----------+  +----------+  +----------+  +----------+  +----------+
Included Modules
Core Drupal Modules
- Node, Field, Text, Filter, User, System, Path
- Image, File, Media (for AI-generated content)
Contrib Modules
- Gin - Modern admin theme optimized for AI workflows
- ECA - Event-Condition-Action framework for AI workflows
- AI Module Suite - Core AI framework and provider integrations
- ai
- ai_external_moderation
- ai_ckeditor
- ai_image
- ai_automators
- Advanced Queue - Enterprise queue management
Custom AI Platform Modules
- LLM - Unified AI orchestration and provider management
- AI Agent Orchestra - Multi-agent workflow coordination
- AI Provider Apple - Apple Intelligence integration
- AI Provider LangChain - LangChain framework integration
- MCP Client Extras - Model Control Plane support
- Alternative Services - Service abstraction layer
- API Normalization - Consistent API responses
- Recipe Onboarding - Guided setup wizard
- Gov Compliance - Government compliance and audit tools
Note: All security modules (key, encrypt, tfa, seckit, login_security, autologout, captcha, honeypot, password_policy, admin_toolbar, etc.) are managed by the Secure Drupal recipe and should not be listed or configured here.
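After applying the recipe, a quick way to confirm the modules listed above are enabled (the --filter option assumes a recent Drush release):
# List enabled modules whose names contain "ai"
drush pm:list --type=module --status=enabled --filter=ai
# List the custom LLM platform modules
drush pm:list --type=module --status=enabled --filter=llm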
Security & Compliance
All security, encryption, audit logging, access control, and compliance features are provided by the Secure Drupal Recipe, which is included as a dependency. This includes:
- Field-level encryption
- API key security and rotation
- Audit logging and access control
- GDPR, SOC 2, and government compliance
- Content safety and moderation
- All security configuration and validation
To update or customize security, edit the Secure Drupal recipe.
Performance Optimization
Caching Strategy
- Response Caching: Intelligent AI response caching
- Vector Caching: Optimized embedding storage
- Result Memoization: Avoid duplicate processing
- CDN Integration: Static asset optimization
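The response cache can be inspected and toggled from the CLI; this sketch assumes the llm.settings keys shown under Development Mode below:
# Show the current LLM platform settings, including caching flags
drush config:get llm.settings
# Keep AI response caching enabled in production
drush config:set llm.settings cache_responses true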
Scalability Features
- Queue Management: Async processing for heavy workloads
- Load Balancing: Multi-provider failover
- Resource Monitoring: Automatic scaling triggers
- Database Optimization: Optimized queries and indexes
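Queue processing can also be driven from the CLI; the sketch below assumes Advanced Queue's standard Drush command and a hypothetical queue name of llm_requests:
# Inspect queues and their backlog
drush queue:list
# Process a specific Advanced Queue queue (queue name is hypothetical)
drush advancedqueue:queue:process llm_requests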
Troubleshooting
Common Issues
Provider API Errors
- Verify API keys are correctly configured
- Check rate limits and quotas
- Review error logs at /admin/reports/dblog
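Provider errors can also be pulled from the log with Drush; the type filter below is an assumption, since the exact logger channel depends on the provider module:
# Show the most recent log entries
drush watchdog:show --count=20
# Narrow to error-level entries from an assumed "ai" channel
drush watchdog:show --type=ai --severity=Error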
Vector Database Connection
- Verify database credentials and connectivity
- Check firewall and network settings
- Review vector database logs
Performance Issues
- Monitor resource usage in admin dashboard
- Check queue processing status
- Review caching configuration
Service Dependencies
- Run drush llm:service-check to validate services
- Check module dependencies and versions
- Review service definition conflicts
Debug Mode
Enable detailed logging:
drush config:set llm.settings debug_mode true
drush config:set llm.settings log_level debug
Support Resources
- Module documentation at /admin/help/llm
- Error logs at /admin/reports/dblog
- System status at /admin/reports/status
- Service health at /admin/config/llm/health
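The status report listed above can also be checked from the command line; a minimal sketch using standard Drush commands plus the recipe's own health check:
# Show failing requirements from the status report (errors only)
drush core:requirements --severity=2
# Platform health check provided by the llm module
drush llm:validate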
MACH Alliance Compliance
The LLM Platform recipe is designed to enable MACH Alliance architecture principles for enterprise AI implementations:
Microservices Architecture
- Service-Oriented Design: Each AI provider integration operates as an independent service
- Container-First Deployment: Docker support for all AI services with health checks
- Event-Driven Communication: ECA framework enables async communication between services
- Independent Scaling: AI processing components can scale independently based on demand
API-First Implementation
- Comprehensive API Coverage: REST, GraphQL, and JSON-RPC endpoints for all functionality
- OpenAPI Documentation: Machine-readable API specifications for all AI services
- SDK Generation: Automatic TypeScript SDK generation from OpenAPI specs
- Headless Ready: Complete separation of AI processing from presentation layer
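As one example of the API-first surface, Drupal core's JSON:API entry point can be queried directly once the recipe is installed (the hostname is a placeholder):
# Discover available resource types via the JSON:API entry point
curl -H "Accept: application/vnd.api+json" https://example.com/jsonapi
# Fetch a collection; the resource type depends on your content model
curl -H "Accept: application/vnd.api+json" https://example.com/jsonapi/node/article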
Cloud-Native Features
- Kubernetes Support: Helm charts and deployment manifests for orchestration
- Multi-Cloud Deployment: Platform-agnostic configuration supporting AWS, Azure, GCP
- Auto-Scaling: Horizontal pod autoscaling based on AI workload metrics
- Service Discovery: Automatic service registration and health monitoring
- Circuit Breaker Pattern: Resilient AI provider integration with graceful degradation
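A deployment sketch assuming a hypothetical Helm chart named llm-platform shipped alongside the recipe (chart path, namespace, and values file are illustrative only):
# Install the platform chart into its own namespace (names are hypothetical)
helm install llm-platform ./charts/llm-platform --namespace ai-platform --create-namespace -f values.production.yaml
# Watch pods and autoscaling status
kubectl get pods -n ai-platform
kubectl get hpa -n ai-platform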
Headless Architecture
- Frontend Agnostic: AI capabilities accessible via APIs from any frontend framework
- Mobile-First APIs: Optimized endpoints for mobile and progressive web applications
- Edge Computing: Lightweight AI inference suitable for edge deployment
- Multi-Channel Support: Same AI backend serving web, mobile, IoT, and voice interfaces
Enterprise Integration Patterns
- Event Sourcing: Complete audit trail of AI operations for compliance
- CQRS Implementation: Separate read/write models for AI data and analytics
- Message Queues: Advanced queue management for AI workload distribution
- Distributed Caching: Multi-layer caching strategy for AI responses and vectors
This recipe transforms Drupal into a cloud-native AI platform that supports modern enterprise architecture while maintaining the flexibility to operate in traditional environments.
Development & Testing
Test Suite
# Run full test suite (requires Node.js)
npm run test:full
# PHP unit tests only
vendor/bin/phpunit web/modules/custom/llm/tests
# Validate platform health
drush llm:validate
Development Mode
# Enable development settings
drush config:set llm.settings development_mode true
# Show detailed error messages
drush config:set system.logging error_level verbose
# Disable caching for development
drush config:set llm.settings cache_responses false
Related Documentation
- AI Integration Guide
- Production Readiness Report
- Fresh Install Guide
- Environment Variables
- Secure Drupal Recipe
License
GPL-2.0-or-later