bluefly/ai_provider_langchain

LangChain provider for the Drupal AI platform with advanced chain orchestration

0.1.0 2025-08-05 17:56 UTC

README

LangChain integration as an AI provider for the Drupal AI ecosystem.

What This Module Does

This module registers LangChain as an AI provider within Drupal's AI module (drupal/ai). It exposes LangChain chains, agents, and tools through the standard Drupal AI provider interface.

Features

LangChain Provider Integration

  • Chains: LangChain chain execution
  • Agents: LangChain agent orchestration
  • Tools: LangChain tool integration
  • Memory: Conversation memory
  • Retrievers: RAG retrieval
  • Document Loaders: Document processing
  • Embeddings: Vector embeddings
  • Callbacks: Execution monitoring

Drupal AI Integration

  • Native AI provider plugin
  • Integration with ai_automators
  • Integration with ai_assistant_api
  • Integration with ai_agentic_workflows
  • Integration with ai_agent_crewai

Installation

Prerequisites

composer require drupal/ai
drush en ai

Install LangChain Provider

composer require bluefly/ai_provider_langchain
drush en ai_provider_langchain

Configure LangChain Server

Navigate to /admin/config/ai/providers/langchain and configure:

  • LangChain server URL
  • API authentication
  • Default models and chains
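
The same settings can also be applied programmatically, for example from a deployment script. The config object name and keys below are assumptions inferred from the form fields above, not a confirmed schema:

// Assumed config name and keys; check the module's config schema before relying on them.
$config = \Drupal::configFactory()->getEditable('ai_provider_langchain.settings');
$config
  ->set('server_url', 'http://localhost:5000')
  ->set('api_key', getenv('LANGCHAIN_API_KEY'))
  ->set('default_chain', 'conversation')
  ->save();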

Usage

Basic LangChain Provider Usage

$ai_provider = \Drupal::service('plugin.manager.ai_provider');
$langchain = $ai_provider->createInstance('langchain');

// Execute a LangChain chain
$result = $langchain->executeChain('conversation', [
  'input' => 'Hello, how are you?',
  'context' => ['user_id' => 123],
]);

// Execute a LangChain agent
$result = $langchain->executeAgent('research_agent', [
  'query' => 'Find information about Drupal AI',
  'tools' => ['web_search', 'document_loader'],
]);

Integration with ai_agentic_workflows

$workflow_integration = \Drupal::service('ai_provider_langchain.workflow_integration');

// Use LangChain in workflow steps
$workflow_step = [
  'type' => 'langchain_chain',
  'chain_name' => 'qa_chain',
  'inputs' => ['question' => 'What is Drupal?'],
  'provider' => 'langchain',
];
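
A hypothetical dispatch of the step through the integration service loaded above; the method name below is illustrative only and not part of the module's documented API:

// Hypothetical call: the integration service's actual method may differ.
$result = $workflow_integration->executeStep($workflow_step);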

Integration with ai_agent_crewai

$crewai_integration = \Drupal::service('ai_agent_crewai.workflow_integration');

// Use LangChain agents in CrewAI crews
$crew = [
  'agents' => [
    'langchain_agent' => [
      'role' => 'Research Agent',
      'provider' => 'langchain',
      'chain' => 'research_chain',
      'tools' => ['web_search', 'document_loader'],
    ],
  ],
];

Configuration

LangChain Server Setup

The provider communicates with a LangChain server over HTTP, so you need a server endpoint. An example Python setup using Flask:

# langchain_server.py
from flask import Flask, request, jsonify
from langchain.agents import initialize_agent, Tool
from langchain.llms import OpenAI
from langchain.chains import LLMChain
from langchain.prompts import PromptTemplate
import os

app = Flask(__name__)

@app.route('/chain', methods=['POST'])
def execute_chain():
    data = request.json
    
    # Initialize LangChain components
    llm = OpenAI(openai_api_key=os.getenv('OPENAI_API_KEY'))
    
    # Execute chain based on request
    chain_name = data.get('chain', 'conversation')
    
    if chain_name == 'conversation':
        prompt = PromptTemplate(
            input_variables=["input"],
            template="You are a helpful assistant. {input}"
        )
        chain = LLMChain(llm=llm, prompt=prompt)
        result = chain.run(data['inputs']['input'])
    else:
        # Avoid returning an undefined result for unknown chain names
        return jsonify({'error': f'Unknown chain: {chain_name}', 'success': False}), 400

    return jsonify({
        'output': result,
        'chain': chain_name,
        'success': True
    })

@app.route('/agent', methods=['POST'])
def execute_agent():
    data = request.json
    
    # Initialize agent with tools
    llm = OpenAI(openai_api_key=os.getenv('OPENAI_API_KEY'))
    
    tools = []
    for tool_name in data.get('tools', []):
        if tool_name == 'web_search':
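            # Placeholder tool: the lambda below only echoes the query; replace
            # it with a real web search implementation.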
            tools.append(Tool(
                name="web_search",
                func=lambda x: f"Search results for: {x}",
                description="Search the web for information"
            ))
    
    agent = initialize_agent(tools, llm, agent="zero-shot-react-description", verbose=True)
    result = agent.run(data['inputs']['query'])
    
    return jsonify({
        'output': result,
        'agent': data.get('agent', 'default'),
        'success': True
    })

if __name__ == '__main__':
    app.run(host='0.0.0.0', port=5000)
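
To run the example, install its dependencies and start the server; it listens on port 5000, so a local setup would typically use http://localhost:5000 as the LangChain server URL in the provider configuration:

pip install flask langchain openai
python langchain_server.py

Note that the example relies on LangChain's legacy langchain.llms and initialize_agent APIs; pin an older LangChain release or adapt the imports if you run a current version.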

Services

LangChainProvider

Main AI provider plugin that implements Drupal's AI provider interface.

LangChainClient

HTTP client for communicating with the LangChain server.
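
As a rough sketch of the kind of request this client sends, using Drupal's http_client service against the /chain route from the example server above (not the module's actual implementation):

// Sketch only: the real client also handles authentication, timeouts, and errors.
$response = \Drupal::httpClient()->post('http://localhost:5000/chain', [
  'json' => [
    'chain' => 'conversation',
    'inputs' => ['input' => 'Hello, how are you?'],
  ],
]);
$data = json_decode((string) $response->getBody(), TRUE);
// $data['output'] holds the chain result returned by the server.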

LangChainChainManager

Manages LangChain chain execution and caching.
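
For illustration, caching a chain result with Drupal's cache API could look like the sketch below, where $langchain is the provider instance from the Usage section; the cache bin, key scheme, and lifetime are assumptions, not the manager's documented behaviour:

// Assumed cache key scheme and one-hour lifetime; the real manager may differ.
$cid = 'ai_provider_langchain:chain:' . md5($chain_name . serialize($inputs));
$cache = \Drupal::cache('default');
if ($cached = $cache->get($cid)) {
  $result = $cached->data;
}
else {
  $result = $langchain->executeChain($chain_name, $inputs);
  $cache->set($cid, $result, time() + 3600);
}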

LangChainAgentManager

Manages LangChain agent execution and tool integration.

LangChainWorkflowIntegration

Integrates LangChain with ai_agentic_workflows.

API Endpoints

Chain Execution

POST /langchain/chain
{
  "chain": "conversation",
  "inputs": {
    "input": "Hello, how are you?"
  }
}

Agent Execution

POST /langchain/agent
{
  "agent": "research_agent",
  "inputs": {
    "query": "Find information about Drupal"
  },
  "tools": ["web_search", "document_loader"]
}

Embeddings

POST /langchain/embeddings
{
  "texts": ["Hello world", "Drupal is great"],
  "model": "text-embedding-ada-002"
}

Dependencies

Required

  • drupal/ai: Core AI framework

Suggested

  • drupal/ai_automators: LangChain can power automators
  • drupal/ai_assistant_api: LangChain agents as assistants
  • drupal/ai_agentic_workflows: LangChain-powered workflows
  • drupal/ai_agent_crewai: LangChain + CrewAI integration
  • drupal/eca: Trigger LangChain workflows
  • drupal/comprehensivequeue: Async LangChain execution
  • drupal/redis: Caching for LangChain results

License

GPL-2.0+