nicobleiler / mistral-php
A PHP client library for Mistral AI API with Laravel support
Requires
- php: ^8.3|^8.4
- guzzlehttp/guzzle: ^7.10
- illuminate/support: ^8.0|^9.0|^10.0|^11.0|^12.0
- logiscape/mcp-sdk-php: ^1.2
Requires (Dev)
- mockery/mockery: ^1.6
- orchestra/testbench: ^7.0|^8.0|^9.0|^10.0|^10.6
- phpunit/phpunit: ^12.3
README
A comprehensive PHP client library for the Mistral AI API with Laravel support.
Features
- Full Mistral AI API support (Chat Completions, Embeddings, Models, Conversations)
- Model Context Protocol (MCP) client integration for external tool calling
- Laravel integration with service provider and facade
- Streaming support for chat completions
- Type-safe responses with PHP classes
- PSR-4 autoloading
- Comprehensive test suite
- Backward compatibility with array-based API
Installation
Install the package via Composer:
composer require nicobleiler/mistral-php
Laravel
The package will automatically register its service provider in Laravel 5.5+.
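If you have disabled package auto-discovery, a manual registration along these lines should work; the provider class name below is an assumption based on the package namespace, so check the package source for the exact name:

```php
// config/app.php — only needed when package auto-discovery is disabled.
// Note: the provider class name is assumed here and may differ.
'providers' => [
    // ...
    Nicobleiler\Mistral\MistralServiceProvider::class,
],
```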
Publish the configuration file:
php artisan vendor:publish --tag=mistral-config
Add your Mistral AI API key to your `.env` file:
MISTRAL_API_KEY=your-api-key-here
Usage
Basic Usage
```php
use Nicobleiler\Mistral\Client;

$client = new Client('your-api-key');

// Chat completion
$response = $client->chat()->create([
    'model' => 'mistral-tiny',
    'messages' => [
        ['role' => 'user', 'content' => 'Hello, how are you?']
    ]
]);

echo $response['choices'][0]['message']['content'];
```
Laravel Usage
```php
use Nicobleiler\Mistral\Facades\Mistral;

// Using the facade
$response = Mistral::chat()->create([
    'model' => 'mistral-tiny',
    'messages' => [
        ['role' => 'user', 'content' => 'Hello from Laravel!']
    ]
]);

// Or inject the client
use Nicobleiler\Mistral\Client;

class ChatController extends Controller
{
    public function __construct(private Client $mistral) {}

    public function chat(Request $request)
    {
        $response = $this->mistral->chat()->create([
            'model' => 'mistral-tiny',
            'messages' => $request->get('messages')
        ]);

        return response()->json($response);
    }

    public function uploadTrainingFile(Request $request)
    {
        $file = $this->mistral->files()->upload([
            'file' => $request->file('training_file')->getRealPath(),
            'purpose' => 'fine-tune'
        ]);

        return response()->json($file);
    }

    public function createFineTuneJob(Request $request)
    {
        $job = $this->mistral->fineTuning()->create([
            'model' => $request->get('base_model', 'mistral-tiny'),
            'training_file' => $request->get('training_file_id'),
            'hyperparameters' => $request->get('hyperparameters', [])
        ]);

        return response()->json($job);
    }
}
```
Streaming Chat
```php
$client->chat()->stream([
    'model' => 'mistral-tiny',
    'messages' => [
        ['role' => 'user', 'content' => 'Tell me a story']
    ]
], function ($chunk) {
    if (isset($chunk['choices'][0]['delta']['content'])) {
        echo $chunk['choices'][0]['delta']['content'];
    }
});
```
Embeddings
```php
$response = $client->embeddings()->create([
    'model' => 'mistral-embed',
    'input' => ['Hello world', 'How are you?']
]);

$embeddings = $response['data'];
```
Models
```php
// List all models
$models = $client->models()->list();

// Get specific model
$model = $client->models()->get('mistral-tiny');
```
Files
Upload and manage files for use with fine-tuning:
```php
// Upload a file
$file = $client->files()->upload([
    'file' => '/path/to/training-data.jsonl',
    'purpose' => 'fine-tune'
]);

// List files
$files = $client->files()->list();

// Retrieve a file
$fileInfo = $client->files()->retrieve($file['id']);

// Download file content
$content = $client->files()->download($file['id']);

// Delete a file
$deleted = $client->files()->delete($file['id']);
```
Fine-tuning
Create and manage fine-tuning jobs:
```php
// Create a fine-tuning job
$job = $client->fineTuning()->create([
    'model' => 'mistral-tiny',
    'training_file' => $file['id'],
    'hyperparameters' => [
        'n_epochs' => 4,
        'batch_size' => 32,
        'learning_rate' => 0.0001
    ]
]);

// List fine-tuning jobs
$jobs = $client->fineTuning()->list();

// Retrieve a job
$jobInfo = $client->fineTuning()->retrieve($job['id']);

// Cancel a job
$cancelled = $client->fineTuning()->cancel($job['id']);

// List job events
$events = $client->fineTuning()->listEvents($job['id']);
```
Agents
Create and manage conversational AI agents:
```php
// Create an agent
$agent = $client->agents()->create([
    'model' => 'mistral-large',
    'name' => 'Math Tutor',
    'description' => 'A helpful math tutoring agent',
    'instructions' => 'You are a personal math tutor. Help students with math problems step by step.',
    'tools' => [
        [
            'type' => 'function',
            'function' => [
                'name' => 'calculate',
                'description' => 'Perform mathematical calculations',
                'parameters' => [
                    'type' => 'object',
                    'properties' => [
                        'expression' => ['type' => 'string']
                    ]
                ]
            ]
        ]
    ]
]);

// List agents
$agents = $client->agents()->list();

// Retrieve an agent
$agentInfo = $client->agents()->retrieve($agent['id']);

// Update an agent
$updated = $client->agents()->update($agent['id'], [
    'name' => 'Advanced Math Tutor',
    'instructions' => 'You are an advanced math tutor specializing in calculus and linear algebra.'
]);

// Delete an agent
$deleted = $client->agents()->delete($agent['id']);
```
Type Safety
This package provides type safety through real PHP classes rather than PHPDoc annotations alone, so IDE autocompletion and static analysis tools get full type information for:
- Input parameters: Typed request classes with fluent builder patterns
- Return values: Typed response classes with proper object hierarchies
- Backward compatibility: Existing array-based code continues to work
- IDE support: Full autocompletion and IntelliSense in modern IDEs
Example with full type support:
```php
use Nicobleiler\Mistral\Types\Chat\ChatRequest;
use Nicobleiler\Mistral\Types\Chat\Message;

// Create typed request objects
$messages = [
    new Message('user', 'Hello world', 'user123'), // Full IDE completion
    new Message('assistant', 'Hello! How can I help?')
];

$request = new ChatRequest(
    model: 'mistral-tiny',  // string (required)
    messages: $messages,    // Message[] - strongly typed
    temperature: 0.7,       // float (optional)
    max_tokens: 100         // int (optional)
);

// Make request and get typed response
$response = $client->chat()->create($request); // Returns ChatResponse object

// Access response with full type safety
echo $response->choices[0]->message->content; // IDE knows exact types
echo $response->usage->total_tokens;          // No guessing about structure
```
API Reference
Chat Completions
Create chat completions with the Mistral AI models using either arrays (for backward compatibility) or typed objects (for enhanced type safety):
```php
// Using arrays (backward compatible)
$response = $client->chat()->create([
    'model' => 'mistral-tiny',
    'messages' => [
        ['role' => 'system', 'content' => 'You are a helpful assistant.'],
        ['role' => 'user', 'content' => 'What is the capital of France?']
    ],
    'temperature' => 0.7,
    'max_tokens' => 100,
    'top_p' => 1,
    'stream' => false
]);

// Using typed objects (recommended for new code)
use Nicobleiler\Mistral\Types\Chat\ChatRequest;
use Nicobleiler\Mistral\Types\Chat\Message;

$messages = [
    new Message('system', 'You are a helpful assistant.'),
    new Message('user', 'What is the capital of France?')
];

$request = new ChatRequest(
    model: 'mistral-tiny',
    messages: $messages,
    temperature: 0.7,
    max_tokens: 100
);

$response = $client->chat()->create($request); // Returns ChatResponse object
echo $response->choices[0]->message->content;  // Full IDE autocompletion
```
Conversations
Manage conversations with AI agents:
```php
use Nicobleiler\Mistral\Types\Conversations\ConversationRequest;

// Create a conversation
$request = new ConversationRequest(
    agent_id: 'agent-123',
    metadata: ['topic' => 'programming-help']
);
$conversation = $client->conversations()->create($request);

// List conversations
$conversations = $client->conversations()->list([
    'limit' => 10,
    'order' => 'desc'
]);

// Retrieve a conversation
$conversation = $client->conversations()->retrieve('conv-abc123');

// Update a conversation
$updated = $client->conversations()->update('conv-abc123', [
    'metadata' => ['status' => 'active']
]);

// Delete a conversation
$deleted = $client->conversations()->delete('conv-abc123');
```
Streaming
For real-time responses:
```php
$client->chat()->stream([
    'model' => 'mistral-tiny',
    'messages' => [['role' => 'user', 'content' => 'Count to 10']]
], function ($chunk) {
    // Handle each chunk of the response
    if (isset($chunk['choices'][0]['delta']['content'])) {
        echo $chunk['choices'][0]['delta']['content'];
        flush();
    }
});
```
Model Context Protocol (MCP) Client Integration
This package includes MCP client support via the logiscape/mcp-sdk-php package, allowing Mistral AI to call external MCP tools during conversations and enabling integrations with external services.
Quick Start with MCP
Connect to external MCP servers and use their tools in Mistral conversations:
```php
use Nicobleiler\Mistral\Client;

$client = new Client('your-api-key');

// Add an MCP server (stdio transport)
$client->addMcpServer('file-tools', 'stdio', [
    'command' => 'python',
    'args' => ['file_server.py'],
    'working_dir' => '/path/to/server'
]);

// Add an MCP server (HTTP transport)
$client->addMcpServer('web-tools', 'http', [
    'url' => 'http://localhost:8080'
]);

// Use MCP-enabled chat
$mcpChat = $client->mcpChat();

// Connect to servers
$mcpChat->connectToMcpServer('file-tools');
$mcpChat->connectToMcpServer('web-tools');

// Chat with access to MCP tools
$response = $mcpChat->create([
    'model' => 'mistral-large',
    'messages' => [
        ['role' => 'user', 'content' => 'Can you read the file config.json and summarize it?']
    ]
]);

echo $response->choices[0]->message->content;
```
Available MCP Tools
View available tools from connected MCP servers:
```php
$mcpChat = $client->mcpChat();
$mcpChat->connectToMcpServer('file-tools');

$tools = $mcpChat->getAvailableMcpTools();

foreach ($tools as $serverName => $serverTools) {
    echo "Server: {$serverName}\n";
    foreach ($serverTools as $tool) {
        echo "  - {$tool['name']}: {$tool['description']}\n";
    }
}
```
Manual Tool Execution
You can also call MCP tools directly:
```php
$mcpManager = $client->getMcpManager();

$mcpManager->addServer('calculator', 'stdio', [
    'command' => 'python',
    'args' => ['calculator_server.py']
]);

$mcpManager->connect('calculator');

$result = $mcpManager->callTool('calculator', 'add', [
    'a' => 5,
    'b' => 3
]);

if ($result['success']) {
    echo "Result: " . $result['content'];
} else {
    echo "Error: " . $result['error'];
}
```
MCP Server Configuration
STDIO Transport (subprocess)
```php
$client->addMcpServer('my-server', 'stdio', [
    'command' => 'python',            // Executable command
    'args' => ['server.py'],          // Command arguments
    'working_dir' => '/path/to/dir',  // Working directory
    'timeout' => 30,                  // Request timeout in seconds
    'env' => [                        // Environment variables
        'API_KEY' => 'secret'
    ]
]);
```
HTTP Transport
```php
$client->addMcpServer('my-server', 'http', [
    'url' => 'http://localhost:8080',  // Server URL
    'timeout' => 30,                   // Request timeout
    'headers' => [                     // Additional HTTP headers
        'Authorization' => 'Bearer token'
    ]
]);
```
Laravel Integration
In Laravel, you can configure MCP servers in your service provider:
```php
use Nicobleiler\Mistral\Facades\Mistral;

// In a service provider or controller
$mcpChat = Mistral::mcpChat();

$mcpChat->addMcpServer('tools', 'stdio', [
    'command' => 'python',
    'args' => [storage_path('mcp/tools_server.py')]
]);

$response = $mcpChat->create([
    'model' => 'mistral-large',
    'messages' => [
        ['role' => 'user', 'content' => 'Use the weather tool to get current weather for Paris']
    ]
]);
```
Automatic Tool Integration
When using `mcpChat()`, available MCP tools are automatically added to the conversation context, and Mistral can choose to call these tools as needed during the conversation.
The tool calls happen automatically:
- Mistral decides to use an MCP tool based on the conversation
- The tool is called on the appropriate MCP server
- The results are fed back to Mistral
- Mistral incorporates the results into its response
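As a rough illustration of what `mcpChat()` does for you, the sketch below wires that loop together by hand using the documented manager methods. The shape of the tool-call payload (`tool_calls`, `function.name`, `function.arguments`) and the `tools` key on the array request mirror the Mistral chat API and are assumptions about this library's array responses, not documented behavior:

```php
use Nicobleiler\Mistral\Client;

$client = new Client('your-api-key');
$client->addMcpServer('file-tools', 'stdio', [
    'command' => 'python',
    'args' => ['file_server.py'],
]);

$mcpChat = $client->mcpChat();
$mcpChat->connectToMcpServer('file-tools');

// 1. Advertise the MCP tools to Mistral. The conversion to the
//    function-calling tool format is a simplified assumption here;
//    mcpChat() performs this step internally.
$mcpTools = [];
foreach ($mcpChat->getAvailableMcpTools() as $serverName => $serverTools) {
    foreach ($serverTools as $tool) {
        $mcpTools[] = [
            'type' => 'function',
            'function' => [
                'name' => $tool['name'],
                'description' => $tool['description'],
            ],
        ];
    }
}

$messages = [['role' => 'user', 'content' => 'Summarize config.json for me']];

$response = $client->chat()->create([
    'model' => 'mistral-large',
    'messages' => $messages,
    'tools' => $mcpTools,
]);

// 2. + 3. Execute any requested tools on the MCP server and feed the
//    results back. The 'tool_calls' array shape below is an assumption.
$assistantMessage = $response['choices'][0]['message'];
$messages[] = $assistantMessage;

foreach ($assistantMessage['tool_calls'] ?? [] as $toolCall) {
    $result = $client->getMcpManager()->callTool(
        'file-tools', // server that advertised the tool
        $toolCall['function']['name'],
        json_decode($toolCall['function']['arguments'], true) ?? []
    );

    $messages[] = [
        'role' => 'tool',
        'tool_call_id' => $toolCall['id'],
        'content' => $result['success'] ? $result['content'] : $result['error'],
    ];
}

// 4. Let Mistral incorporate the tool results into its final answer.
$final = $client->chat()->create([
    'model' => 'mistral-large',
    'messages' => $messages,
]);
```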
Error Handling
MCP operations include comprehensive error handling:
```php
try {
    $mcpChat = $client->mcpChat();
    $mcpChat->connectToMcpServer('my-server');

    $response = $mcpChat->create([
        'model' => 'mistral-large',
        'messages' => [['role' => 'user', 'content' => 'Hello']]
    ]);
} catch (\PhpMcp\Client\Exception\McpClientException $e) {
    echo "MCP Error: " . $e->getMessage();
} catch (\Exception $e) {
    echo "General Error: " . $e->getMessage();
}
```
Configuration
Environment Variables
- `MISTRAL_API_KEY` - Your Mistral AI API key (required)
- `MISTRAL_BASE_URL` - Custom base URL (optional, defaults to https://api.mistral.ai/v1)
- `MISTRAL_DEFAULT_MODEL` - Default model to use (optional, defaults to mistral-tiny)
- `MISTRAL_TIMEOUT` - Request timeout in seconds (optional, defaults to 30)
Laravel Configuration
After publishing the config file, you can customize settings in `config/mistral.php`:
```php
return [
    'api_key' => env('MISTRAL_API_KEY'),
    'base_url' => env('MISTRAL_BASE_URL', 'https://api.mistral.ai/v1'),
    'default_model' => env('MISTRAL_DEFAULT_MODEL', 'mistral-tiny'),
    'timeout' => env('MISTRAL_TIMEOUT', 30),
];
```
Available Models
- `mistral-tiny` - Fast and efficient for simple tasks
- `mistral-small` - Good balance of speed and capability
- `mistral-medium` - Higher capability for complex tasks
- `mistral-large` - Most capable model
- `mistral-embed` - For generating embeddings
Error Handling
The client throws `GuzzleHttp\Exception\GuzzleException` for HTTP errors:
```php
use GuzzleHttp\Exception\GuzzleException;

try {
    $response = $client->chat()->create([
        'model' => 'mistral-tiny',
        'messages' => [['role' => 'user', 'content' => 'Hello']]
    ]);
} catch (GuzzleException $e) {
    echo "API Error: " . $e->getMessage();
}
```
Testing
Run the test suite:
composer test
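If you are testing your own code that consumes this package, one option is to stub the client with Mockery (already listed in the dev dependencies). The sketch below only shows how the chained `chat()->create()` call can be mocked and what response shape the array API documents; how you inject the mocked client into your own classes is up to your application:

```php
use Mockery;
use Nicobleiler\Mistral\Client;
use PHPUnit\Framework\TestCase;

class MistralClientStubTest extends TestCase
{
    public function test_chat_create_can_be_stubbed(): void
    {
        // Stub the chat resource returned by $client->chat().
        $chat = Mockery::mock();
        $chat->shouldReceive('create')
            ->once()
            ->andReturn([
                'choices' => [
                    ['message' => ['role' => 'assistant', 'content' => 'Hi there!']],
                ],
            ]);

        // Stub the client itself; in a real test you would pass this
        // mocked client into your own service under test.
        $client = Mockery::mock(Client::class);
        $client->shouldReceive('chat')->andReturn($chat);

        $response = $client->chat()->create([
            'model' => 'mistral-tiny',
            'messages' => [['role' => 'user', 'content' => 'Hello']],
        ]);

        $this->assertSame('Hi there!', $response['choices'][0]['message']['content']);
    }

    protected function tearDown(): void
    {
        Mockery::close();
        parent::tearDown();
    }
}
```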
Contributing
Contributions are welcome! Please feel free to submit a Pull Request.
License
This package is open-sourced software licensed under the MIT license.