vluzrmos / ollama
PHP client for the Ollama/OpenAI API
Requires
- php: >=5.6.0
- ext-curl: *
- ext-json: *
- guzzlehttp/guzzle: ^6.5
Requires (Dev)
- ext-xdebug: *
- phpunit/phpunit: 5.7.*
This package is not auto-updated.
Last update: 2025-08-31 16:18:52 UTC
README
PHP client for the Ollama/OpenAI API, compatible with PHP 5.6+. This library provides an easy-to-use interface for interacting with an Ollama server and also includes OpenAI API compatibility.
Features
- ✅ Compatible with PHP 5.6+
- ✅ Full support for Ollama native API
- ✅ OpenAI API endpoints compatibility
- ✅ `Model` class for reusable model configuration
- ✅ Chat completions with history
- ✅ Response streaming
- ✅ Image support (vision models like Llava)
- ✅ Function calling (tools)
- ✅ Embeddings
- ✅ Model management
- ✅ Robust error handling
- ✅ Complete documentation
Installation
composer require vluzrmos/ollama
Quick Usage
Ollama Client (Native API)
```php
<?php

require_once 'vendor/autoload.php';

use Vluzrmos\Ollama\Ollama;
use Vluzrmos\Ollama\Models\Message;

// Create client
$ollama = new Ollama('http://localhost:11434');

// Simple chat
$response = $ollama->chat([
    'model' => 'llama3.2',
    'messages' => [
        Message::system('You are a helpful assistant.'),
        Message::user('Hello!')
    ]
]);

echo $response['message']['content'];
```
OpenAI Client (Compatible)
```php
<?php

use Vluzrmos\Ollama\OpenAI;
use Vluzrmos\Ollama\Models\Message;

// Create OpenAI-compatible client
$openai = new OpenAI('http://localhost:11434/v1', 'ollama');

// Chat using OpenAI methods
$response = $openai->chat('llama3.2', [
    Message::system('You are a helpful assistant.'),
    Message::user('Hello!')
]);

echo $response['choices'][0]['message']['content'];
```
Model Class for Reuse
```php
<?php

use Vluzrmos\Ollama\Models\Model;
use Vluzrmos\Ollama\Models\Message;

// Create model
$model = (new Model('llama3.2'))
    ->setTemperature(0.8)
    ->setTopP(0.9)
    ->setNumCtx(4096)
    ->setSeed(42);

// Use with OpenAI client
$response = $openai->chat($model, [
    Message::user('Tell me a story')
]);

// Or use with Ollama client
$params = [
    'model' => $model,
    'messages' => [
        Message::user('Tell me a story')
    ]
];

$response = $ollama->chat($params);
```
Advanced Examples
Streaming
```php
<?php

// With Ollama client
$ollama->generate([
    'model' => 'llama3.2',
    'prompt' => 'Tell me a story',
    'stream' => true
], function ($chunk) {
    if (isset($chunk['response'])) {
        echo $chunk['response'];
    }
});

// With OpenAI client
$openai->chatStream('llama3.2', [
    Message::user('Tell me a story')
], function ($chunk) {
    if (isset($chunk['choices'][0]['delta']['content'])) {
        echo $chunk['choices'][0]['delta']['content'];
    }
});
```
Vision Models (Images)
```php
<?php

use Vluzrmos\Ollama\Models\Message;
use Vluzrmos\Ollama\Utils\ImageHelper;

// With OpenAI client
$response = $openai->chat('llava', [
    Message::image(
        'What do you see in this image?',
        'data:image/png;base64,iVBORw0KGg...' // or ImageHelper::encodeImageUrl('path/to/your/image.png')
    )
]);
```
Function Calling (Tools)
```php
<?php

$tools = [
    [
        'type' => 'function',
        'function' => [
            'name' => 'get_weather',
            'description' => 'Get weather information',
            'parameters' => [
                'type' => 'object',
                'properties' => [
                    'location' => [
                        'type' => 'string',
                        'description' => 'Location'
                    ]
                ],
                'required' => ['location']
            ]
        ]
    ]
];

$response = $openai->chatCompletions([
    'model' => 'llama3.2',
    'messages' => [
        Message::user('How is the weather in São Paulo?')
    ],
    'tools' => $tools
]);
```
Advanced Tool System
This library provides a comprehensive tool system for creating and executing custom tools with the `ToolManager` class.
Creating Custom Tools
Create tools by implementing `ToolInterface` or extending `AbstractTool`:
```php
<?php

use Vluzrmos\Ollama\Tools\AbstractTool;
use Vluzrmos\Ollama\Exceptions\ToolExecutionException;

class CalculatorTool extends AbstractTool
{
    public function getName()
    {
        return 'calculator';
    }

    public function getDescription()
    {
        return 'Performs basic mathematical operations';
    }

    public function getParametersSchema()
    {
        return [
            'type' => 'object',
            'properties' => [
                'operation' => [
                    'type' => 'string',
                    'enum' => ['add', 'subtract', 'multiply', 'divide']
                ],
                'a' => ['type' => 'number'],
                'b' => ['type' => 'number']
            ],
            'required' => ['operation', 'a', 'b']
        ];
    }

    public function execute(array $arguments)
    {
        $a = $arguments['a'];
        $b = $arguments['b'];

        switch ($arguments['operation']) {
            case 'add':
                return $a + $b;
            case 'subtract':
                return $a - $b;
            case 'multiply':
                return $a * $b;
            case 'divide':
                if ($b == 0) {
                    throw new ToolExecutionException('Division by zero');
                }

                return $a / $b;
        }
    }
}
```
Using Tool Manager
```php
<?php

use Vluzrmos\Ollama\Tools\ToolManager;

// Create and register tools
$toolManager = new ToolManager();
$toolManager->registerTool(new CalculatorTool());

// Make API call with tools
$response = $openai->chatCompletions([
    'model' => 'llama3.2',
    'messages' => [
        ['role' => 'user', 'content' => 'What is 15 + 27?']
    ],
    'tools' => $toolManager
]);

// Handle tool calls from response
if (isset($response['choices'][0]['message']['tool_calls'])) {
    $toolCalls = $response['choices'][0]['message']['tool_calls'];

    // Execute all tool calls
    $results = $toolManager->executeToolCalls($toolCalls);

    // Convert results to message format
    $toolMessages = $toolManager->toolCallResultsToMessages($results);

    // Send results back to model (array_merge keeps PHP 5.6 compatibility)
    $messages = array_merge(
        [
            ['role' => 'user', 'content' => 'What is 15 + 27?'],
            $response['choices'][0]['message'] // Original response with tool_calls
        ],
        $toolMessages // Tool results
    );

    $finalResponse = $openai->chatCompletions([
        'model' => 'llama3.2',
        'messages' => $messages
    ]);

    echo $finalResponse['choices'][0]['message']['content'];
}
```
Tool Call Execution Methods
The `ToolManager` provides several methods for handling tool calls:

- `executeToolCalls($toolCalls)` - Executes multiple tool calls and returns results
- `toolCallResultsToMessages($results)` - Converts tool results to API message format
- `registerTool($tool)` - Registers a new tool
- `listTools()` - Lists all registered tool names
- `getStats()` - Gets statistics about registered tools
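As a quick sketch, the registration and introspection methods above can be combined before wiring the manager into a chat call (assuming a `CalculatorTool` like the one defined earlier; the printed values depend on the library's implementation):

```php
<?php

use Vluzrmos\Ollama\Tools\ToolManager;

$toolManager = new ToolManager();
$toolManager->registerTool(new CalculatorTool());

// Inspect what is registered
print_r($toolManager->listTools()); // registered tool names
print_r($toolManager->getStats());  // statistics about registered tools
```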
Error Handling in Tools
```php
<?php

// Tool execution handles errors gracefully
$toolCalls = [
    [
        'id' => 'call_001',
        'type' => 'function',
        'function' => [
            'name' => 'non_existent_tool',
            'arguments' => '{}'
        ]
    ]
];

$results = $toolManager->executeToolCalls($toolCalls);

foreach ($results as $result) {
    if ($result['success']) {
        echo "Tool executed successfully: " . $result['result'];
    } else {
        echo "Tool execution failed: " . $result['error'];
    }
}
```
JSON Mode
```php
<?php

$response = $openai->chat('llama3.2', [
    Message::system('Always respond in valid JSON.'),
    Message::user('List 3 primary colors')
], [
    'response_format' => 'json_object'
]);
```
JSON Schema
```php
<?php

$response = $openai->chat('llama3.2', [
    Message::user('What are the primary colors?')
], [
    'response_format' => [
        'type' => 'json_schema',
        'json_schema' => [
            'name' => 'primary_colors',
            'description' => 'List of primary colors',
            'strict' => true,
            'schema' => [
                'type' => 'object',
                'properties' => [
                    'colors' => [
                        'type' => 'array',
                        'description' => 'List of primary colors in user language',
                        'items' => ['type' => 'string']
                    ]
                ],
                'required' => ['colors']
            ]
        ]
    ]
]);

echo $response['choices'][0]['message']['content'];
```
{ "colors": ["red", "blue", "yellow"] }
Note: JSON Schema format is useful for validating response structure and ensuring it meets user expectations. Not all models support this format, so check the specific model documentation.
Embeddings
```php
<?php

// Ollama
$response = $ollama->embeddings([
    'model' => 'all-minilm',
    'input' => 'Text for embedding'
]);

// OpenAI
$response = $openai->embed('all-minilm', [
    'First text',
    'Second text'
]);
```
OpenAI Compatibility
This library implements the following OpenAI API endpoints:
- ✅ `/v1/chat/completions`
- ✅ `/v1/completions`
- ✅ `/v1/embeddings`
- ✅ `/v1/models`
- ✅ `/v1/models/{model}`
Supported Parameters
Chat Completions
- `model`, `messages`, `temperature`, `top_p`, `max_tokens`
- `stream`, `stream_options`, `stop`, `seed`
- `frequency_penalty`, `presence_penalty`
- `response_format` (JSON mode: `json_object`, `json_schema`)
- `tools` (function calling)
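As an illustration, several of these parameters can be combined in a single request via `chatCompletions()` (the specific values here are arbitrary):

```php
<?php

$response = $openai->chatCompletions([
    'model' => 'llama3.2',
    'messages' => [
        ['role' => 'user', 'content' => 'Write a haiku about the sea.']
    ],
    'temperature' => 0.7,  // sampling randomness
    'top_p' => 0.9,        // nucleus sampling cutoff
    'max_tokens' => 128,   // cap on generated tokens
    'seed' => 42,          // reproducible sampling where supported
    'stop' => ["\n\n"]     // stop sequence(s)
]);

echo $response['choices'][0]['message']['content'];
```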
Completions
- `model`, `prompt`, `temperature`, `top_p`, `max_tokens`
- `stream`, `stream_options`, `stop`, `seed`, `suffix`
- `frequency_penalty`, `presence_penalty`
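Assuming a `completions()` method that mirrors `chatCompletions()` (the `/v1/completions` endpoint is supported, but verify the exact method name in the client class), a text-completion request might look like:

```php
<?php

// Hypothetical method name; check the OpenAI client class for the exact API
$response = $openai->completions([
    'model' => 'llama3.2',
    'prompt' => 'Once upon a time',
    'max_tokens' => 64,
    'temperature' => 0.8
]);

echo $response['choices'][0]['text'];
```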
Embeddings
- `model`, `input` (string or array)
Model Management
```php
<?php

// List models
$models = $ollama->listModels();

// Download model
$ollama->pullModel(['model' => 'llama3.2']);

// Model information
$info = $ollama->showModel('llama3.2');

// Delete model
$ollama->deleteModel('old-model');
```
Error Handling
```php
<?php

use Vluzrmos\Ollama\Exceptions\OllamaException;

try {
    $response = $ollama->chat([
        'model' => 'non-existent-model',
        'messages' => [Message::user('Hello')]
    ]);
} catch (OllamaException $e) {
    echo "Error: " . $e->getMessage();
    echo "Code: " . $e->getCode();
}
```
Configuration
Client Options
```php
<?php

$ollama = new Ollama('http://localhost:11434', [
    'timeout' => 60,
    'connect_timeout' => 10,
    'verify_ssl' => false
]);

$openai = new OpenAI('http://localhost:11434/v1', 'ollama', [
    'timeout' => 120
]);
```
Requirements
- PHP >= 5.6.0
- ext-curl
- ext-json
Complete Examples
See example files in the `examples/` folder:

- `basic_usage.php` - Basic Ollama API usage
- `openai_usage.php` - OpenAI API examples
- `advanced_chat.php` - Advanced chat with tools
- `tool_execution_demo.php` - Comprehensive tool system demonstration
- `simple_tool_test.php` - Simple tool execution test
Tool Examples
Tool implementations can be found in `examples/tools/`:

- `CalculatorTool.php` - Mathematical operations tool
- `WeatherTool.php` - Weather information tool (mock)
Testing
Build the docker image and run tests:
```shell
docker build -t ollama-php56 .

docker run -it --rm \
    -e OPENAI_API_URL="http://localhost:11434/v1" \
    -e OLLAMA_API_URL="http://localhost:11434" \
    -e RUN_INTEGRATION_TESTS=1 \
    -e TEST_MODEL="llama3.2:1b" \
    ollama-php56
```
License
MIT
Contributions
Contributions are welcome! See CONTRIBUTING.md for guidelines.