bluefly / ai_provider_langchain
LangChain provider for the Drupal AI platform with advanced chain orchestration
Requires
- php: >=8.1
- bluefly/llm: ^0.1
- drupal/ai: ^1.0
- drupal/ai_provider_anthropic: ^1.0
- drupal/ai_provider_huggingface: ^1.0@beta
- drupal/ai_provider_litellm: ^1.1
- drupal/ai_provider_ollama: ^1.0
- drupal/ai_provider_openai: ^1.0
- drupal/ai_search: ^1.1
- drupal/core: ^10.3 || ^11
- drupal/search_api: ^1.0
- drupal/search_api_solr: ^4.3
Suggests
- drupal/ai_automators: AI automation features for enhanced workflows
- drupal/search_api: Search API integration for vector search
- drupal/vault: Additional secure credential storage options (^3.0)
- drupal/vector_database: Vector database integration for embeddings
- theodo-group/llphant: LLPhant, a LangChain-style generative AI library for PHP, for advanced features (^0.7)
This package is auto-updated.
Last update: 2025-08-25 12:21:04 UTC
README
LangChain integration as an AI provider for the Drupal AI ecosystem.
What This Module Does
This module adds LangChain as an AI provider to Drupal's AI system. It enables LangChain orchestration, agents, and tools through the standard Drupal AI provider interface.
Features
LangChain Provider Integration
- Chains: LangChain chain execution
- Agents: LangChain agent orchestration
- Tools: LangChain tool integration
- Memory: Conversation memory
- Retrievers: RAG retrieval
- Document Loaders: Document processing
- Embeddings: Vector embeddings
- Callbacks: Execution monitoring
Drupal AI Integration
- Native AI provider plugin
- Integration with ai_automators
- Integration with ai_assistant_api
- Integration with ai_agentic_workflows
- Integration with ai_agent_crewai
Installation
Prerequisites
composer require drupal/ai
drush en ai
Install LangChain Provider
drush en ai_provider_langchain
Configure LangChain Server
Navigate to /admin/config/ai/providers/langchain and configure:
- LangChain server URL
- API authentication
- Default models and chains
Usage
Basic LangChain Provider Usage
$ai_provider = \Drupal::service('plugin.manager.ai_provider');
$langchain = $ai_provider->createInstance('langchain');

// Execute a LangChain chain
$result = $langchain->executeChain('conversation', [
  'input' => 'Hello, how are you?',
  'context' => ['user_id' => 123],
]);

// Execute a LangChain agent
$result = $langchain->executeAgent('research_agent', [
  'query' => 'Find information about Drupal AI',
  'tools' => ['web_search', 'document_loader'],
]);
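Under the hood, these provider calls translate into HTTP requests against the configured LangChain server. A minimal Python sketch of that round-trip, assuming the server URL from the provider configuration and the request shape shown in the API Endpoints section:

```python
import json
import urllib.request

def build_chain_payload(chain_name, inputs):
    """Shape of the body POSTed to the server's /chain route."""
    return {"chain": chain_name, "inputs": inputs}

def execute_chain(server_url, chain_name, inputs, timeout=30):
    """Send a chain-execution request and return the decoded JSON response."""
    req = urllib.request.Request(
        server_url.rstrip("/") + "/chain",
        data=json.dumps(build_chain_payload(chain_name, inputs)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req, timeout=timeout) as resp:
        return json.load(resp)
```

The PHP LangChainClient service performs the equivalent request with Drupal's HTTP client.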
Integration with ai_agentic_workflows
$workflow_integration = \Drupal::service('ai_provider_langchain.workflow_integration');

// Use LangChain in workflow steps
$workflow_step = [
  'type' => 'langchain_chain',
  'chain_name' => 'qa_chain',
  'inputs' => ['question' => 'What is Drupal?'],
  'provider' => 'langchain',
];
Integration with ai_agent_crewai
$crewai_integration = \Drupal::service('ai_agent_crewai.workflow_integration');

// Use LangChain agents in CrewAI crews
$crew = [
  'agents' => [
    'langchain_agent' => [
      'role' => 'Research Agent',
      'provider' => 'langchain',
      'chain' => 'research_chain',
      'tools' => ['web_search', 'document_loader'],
    ],
  ],
];
Configuration
LangChain Server Setup
You need a LangChain server endpoint. Example Python setup:
# langchain_server.py
from flask import Flask, request, jsonify
from langchain.agents import initialize_agent, Tool
from langchain.llms import OpenAI
from langchain.chains import LLMChain
from langchain.prompts import PromptTemplate
import os

app = Flask(__name__)

@app.route('/chain', methods=['POST'])
def execute_chain():
    data = request.json
    # Initialize LangChain components
    llm = OpenAI(openai_api_key=os.getenv('OPENAI_API_KEY'))
    # Execute chain based on request
    chain_name = data.get('chain', 'conversation')
    if chain_name != 'conversation':
        # Reject chains this example server does not implement
        return jsonify({'error': f'Unknown chain: {chain_name}', 'success': False}), 400
    prompt = PromptTemplate(
        input_variables=["input"],
        template="You are a helpful assistant. {input}"
    )
    chain = LLMChain(llm=llm, prompt=prompt)
    result = chain.run(data['inputs']['input'])
    return jsonify({
        'output': result,
        'chain': chain_name,
        'success': True
    })

@app.route('/agent', methods=['POST'])
def execute_agent():
    data = request.json
    # Initialize agent with tools
    llm = OpenAI(openai_api_key=os.getenv('OPENAI_API_KEY'))
    tools = []
    for tool_name in data.get('tools', []):
        if tool_name == 'web_search':
            tools.append(Tool(
                name="web_search",
                func=lambda x: f"Search results for: {x}",
                description="Search the web for information"
            ))
    agent = initialize_agent(tools, llm, agent="zero-shot-react-description", verbose=True)
    result = agent.run(data['inputs']['query'])
    return jsonify({
        'output': result,
        'agent': data.get('agent', 'default'),
        'success': True
    })

if __name__ == '__main__':
    app.run(host='0.0.0.0', port=5000)
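To run the example server locally (package names, versions, and the smoke test below are illustrative; pin versions to whatever your LangChain code targets):

```shell
# Install server dependencies
pip install flask langchain openai

# Run the server with your OpenAI key in the environment
export OPENAI_API_KEY=your-key-here
python langchain_server.py

# Smoke-test the /chain route
curl -X POST http://localhost:5000/chain \
  -H 'Content-Type: application/json' \
  -d '{"chain": "conversation", "inputs": {"input": "Hello"}}'
```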
Services
LangChainProvider
Main AI provider plugin that implements Drupal's AI provider interface.
LangChainClient
HTTP client for communicating with the LangChain server.
LangChainChainManager
Manages LangChain chain execution and caching.
LangChainAgentManager
Manages LangChain agent execution and tool integration.
LangChainWorkflowIntegration
Integrates LangChain with ai_agentic_workflows.
API Endpoints
Chain Execution
POST /langchain/chain
{
  "chain": "conversation",
  "inputs": {
    "input": "Hello, how are you?"
  }
}
Agent Execution
POST /langchain/agent
{
  "agent": "research_agent",
  "inputs": {
    "query": "Find information about Drupal"
  },
  "tools": ["web_search", "document_loader"]
}
Embeddings
POST /langchain/embeddings
{
  "texts": ["Hello world", "Drupal is great"],
  "model": "text-embedding-ada-002"
}
Dependencies
Required
- drupal/ai: Core AI framework
Suggested
- drupal/ai_automators: LangChain can power automators
- drupal/ai_assistant_api: LangChain agents as assistants
- drupal/ai_agentic_workflows: LangChain-powered workflows
- drupal/ai_agent_crewai: LangChain + CrewAI integration
- drupal/eca: Trigger LangChain workflows
- drupal/comprehensivequeue: Async LangChain execution
- drupal/redis: Caching for LangChain results
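The suggested drupal/redis integration caches chain results so that identical requests skip the round-trip to the LangChain server. The pattern can be sketched as follows, with a plain dict standing in for the Redis backend (the key format and eviction policy are assumptions):

```python
import hashlib
import json

def cache_key(chain_name, inputs):
    """Deterministic key: same chain + same inputs -> same cache entry."""
    blob = json.dumps({"chain": chain_name, "inputs": inputs}, sort_keys=True)
    return "langchain:chain:" + hashlib.sha256(blob.encode("utf-8")).hexdigest()

def cached_execute(chain_name, inputs, cache, execute_fn):
    """Return a cached result if present; otherwise execute and store."""
    key = cache_key(chain_name, inputs)
    if key in cache:
        return cache[key]
    result = execute_fn(chain_name, inputs)
    cache[key] = result
    return result
```

With Redis, `cache` would be replaced by GET/SETEX calls and a TTL, since chain outputs for the same inputs are not guaranteed stable forever.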
License
GPL-2.0+