alle-ai / anthropic-api-php
The go-to PHP library for the Anthropic API — Messages, streaming, tool use, vision, prompt caching, extended thinking, MCP, Files, Batches. Maintained by Alle-AI.
Requires
- php: ^8.2
- ext-curl: *
- ext-json: *
- php-http/discovery: ^1.19
- psr/http-client: ^1.0
- psr/http-factory: ^1.0
- psr/http-message: ^1.1 || ^2.0
- psr/log: ^2.0 || ^3.0
- ramsey/uuid: ^4.7
Requires (Dev)
- aws/aws-sdk-php: ^3.300
- google/auth: ^1.40
- guzzlehttp/guzzle: ^7.9
- infection/infection: ^0.29
- laravel/pint: ^1.18
- nyholm/psr7: ^1.8
- phpstan/phpstan: ^1.12
- phpstan/phpstan-deprecation-rules: ^1.2
- phpstan/phpstan-strict-rules: ^1.6
- phpunit/phpunit: ^10.5 || ^11.0
Suggests
- aws/aws-sdk-php: Required for Auth\BedrockAuth — AWS SigV4 signing for the Bedrock runtime endpoint
- google/auth: Required for Auth\VertexAuth — Google ADC token acquisition for Vertex AI
- guzzlehttp/guzzle: Default PSR-18 HTTP client used when one isn't injected
- infection/infection: Mutation testing for contributors
This package is auto-updated.
Last update: 2026-05-03 11:44:56 UTC
README
The go-to PHP library for the Anthropic API. Maintained by Alle-AI.
A first-class PHP client for the Anthropic Messages API. Built for Claude 4 and beyond — Messages, streaming, tool use, vision, prompt caching, extended thinking, MCP connector, Files, Batches. Works with the direct API, AWS Bedrock, and Google Vertex AI.
Looking for the v1.x docs? See the `1.x` branch. The v1 surface (`Alle_AI\Anthropic\AnthropicAPI`) is preserved as a deprecation shim through the v2.x line and will be removed in v3.0. See UPGRADING.md.
Features
- Messages API — typed requests and responses for `POST /v1/messages`
- Streaming — Generator-based SSE iterator with `->toMessage()` aggregator
- Tool use — closure tools, class-based tools with reflection-driven JSON Schema, automatic tool-loop helper
- Vision — `ImageBlock::fromFile()` / `fromUrl()` / `fromBase64()`
- Prompt caching — `cache_control` on any block; `Usage::$cacheReadInputTokens` in the response
- Extended thinking — `ThinkingConfig::enabled(budgetTokens: 10_000)` for reasoning models
- MCP connector — call any remote MCP server via Anthropic's hosted connector
- Files API — `Resources\Files::upload()` / `get()` / `list()` / `delete()` / `downloadTo()`
- Batches API — `Resources\Batches` with JSONL results streaming and a `pollUntilDone()` helper
- Models listing — `$client->models()->list()` paginated catalog
- PSR-18 / PSR-17 — bring any HTTP client (Guzzle, Symfony HttpClient, Buzz, …)
- Retries — exponential backoff with jitter, honors `Retry-After`, idempotency keys auto-attached
- PSR-3 logging — opt-in `LoggingMiddleware` with correlation ids, latency, and request-id
- Pluggable auth — API key, Bearer token, AWS Bedrock (SigV4), Google Vertex AI (ADC + OAuth)
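The retry behavior listed above (exponential backoff with jitter, with `Retry-After` taking precedence) can be illustrated in plain PHP. This is a standalone sketch of the general technique, not the SDK's actual `RetryPolicy` internals, and the `retryDelay()` helper is hypothetical:

```php
<?php
// Illustrative sketch only: the SDK's RetryPolicy may differ in detail.

/**
 * Compute the delay (seconds) before retry attempt $attempt (1-based).
 * A server-provided Retry-After value, when present, takes precedence.
 */
function retryDelay(int $attempt, float $baseDelay = 1.0, ?float $retryAfter = null): float
{
    if ($retryAfter !== null) {
        return $retryAfter; // honor the server's hint
    }
    $exponential = $baseDelay * (2 ** ($attempt - 1)); // 1s, 2s, 4s, ...
    $jitter = mt_rand(0, 1000) / 1000;                 // up to 1s of random jitter
    return $exponential + $jitter;
}

echo retryDelay(3, 1.0, null) >= 4.0 ? "backoff grows\n" : "unexpected\n";
echo retryDelay(3, 1.0, 30.0) === 30.0 ? "Retry-After honored\n" : "unexpected\n";
```

Jitter spreads simultaneous retries from many clients across time, which avoids synchronized "thundering herd" retry storms against a rate-limited endpoint.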
Installation
Requires PHP 8.2 or newer.
```sh
composer require alle-ai/anthropic-api-php
```
You'll also need a PSR-18 HTTP client and PSR-17 factories. The most popular choice:
```sh
composer require guzzlehttp/guzzle nyholm/psr7
```
Quick start
```php
<?php

require_once __DIR__ . '/vendor/autoload.php';

use AlleAI\Anthropic\Client;
use AlleAI\Anthropic\Models\Model;

$client = Client::fromApiKey(getenv('ANTHROPIC_API_KEY'));

$response = $client->messages()->create(
    model: Model::CLAUDE_SONNET_4_7,
    maxTokens: 1024,
    messages: [
        ['role' => 'user', 'content' => 'Write a haiku about PHP.'],
    ],
);

echo $response->text();
echo "\n\nUsed {$response->usage->inputTokens} input + {$response->usage->outputTokens} output tokens.\n";
```
From environment
```php
$client = Client::fromEnvironment(); // reads ANTHROPIC_API_KEY
```
Builder for advanced configuration
```php
use AlleAI\Anthropic\Http\RetryPolicy;

$client = Client::builder()
    ->withApiKey(getenv('ANTHROPIC_API_KEY'))
    ->withAnthropicVersion('2023-06-01')
    ->withAnthropicBeta('prompt-caching-2024-07-31')
    ->withRetryPolicy(new RetryPolicy(maxAttempts: 5, baseDelay: 1.0))
    ->withTimeout(120.0)
    ->build();
```
Streaming
```php
use AlleAI\Anthropic\Streaming\Events\ContentBlockDeltaEvent;
use AlleAI\Anthropic\Streaming\Events\Deltas\TextDelta;

$stream = $client->messages()->stream(
    model: Model::CLAUDE_SONNET_4_7,
    maxTokens: 1024,
    messages: [['role' => 'user', 'content' => 'Tell me a story.']],
);

foreach ($stream as $event) {
    if ($event instanceof ContentBlockDeltaEvent && $event->delta instanceof TextDelta) {
        echo $event->delta->text;
    }
}

$final = $stream->toMessage(); // aggregated MessageResponse
echo "\nDone in {$final->usage->outputTokens} tokens.\n";
```
Tool use
```php
use AlleAI\Anthropic\Tools\ClassTool;
use AlleAI\Anthropic\Tools\Schema\Attributes\Enum;
use AlleAI\Anthropic\Tools\Schema\Attributes\Param;
use AlleAI\Anthropic\Tools\ToolSet;

final class GetWeather extends ClassTool
{
    public function name(): string
    {
        return 'get_weather';
    }

    public function description(): string
    {
        return 'Get current weather';
    }

    /** @return array<string, mixed> */
    protected function runTool(
        #[Param('City name')] string $city,
        #[Param('Units')] #[Enum('c', 'f')] string $units = 'c',
    ): array {
        return ['city' => $city, 'temp' => 24, 'units' => $units];
    }
}

$loop = $client->messages()->toolLoop(
    model: Model::CLAUDE_SONNET_4_7,
    maxTokens: 4096,
    messages: [['role' => 'user', 'content' => 'Weather in Accra and Tokyo?']],
    tools: new ToolSet(new GetWeather()),
);

$final = $loop->run(); // automatic tool-call round-trips until end_turn
echo $final->text();
```
Vision
```php
use AlleAI\Anthropic\Messages\Content\ImageBlock;
use AlleAI\Anthropic\Messages\Content\TextBlock;

$response = $client->messages()->create(
    model: Model::CLAUDE_SONNET_4_7,
    maxTokens: 1024,
    messages: [[
        'role' => 'user',
        'content' => [
            ImageBlock::fromFile(__DIR__ . '/diagram.png'),
            TextBlock::of('Describe this diagram.'),
        ],
    ]],
);
```
Prompt caching
```php
use AlleAI\Anthropic\Messages\Content\CacheControl;
use AlleAI\Anthropic\Messages\Content\TextBlock;

$response = $client->messages()->create(
    model: Model::CLAUDE_SONNET_4_7,
    maxTokens: 1024,
    system: [
        TextBlock::of('You are a helpful assistant.'),
        TextBlock::of($longCorpus)->withCacheControl(CacheControl::ephemeral('1h')),
    ],
    messages: [['role' => 'user', 'content' => 'Summarize.']],
);

echo "Cache read: {$response->usage->cacheReadInputTokens}\n";
```
Extended thinking
```php
use AlleAI\Anthropic\Messages\ThinkingConfig;
use AlleAI\Anthropic\Messages\Content\ThinkingBlock;
use AlleAI\Anthropic\Messages\Content\TextBlock;

$response = $client->messages()->create(
    model: Model::CLAUDE_OPUS_4_7,
    maxTokens: 16_000,
    thinking: ThinkingConfig::enabled(budgetTokens: 10_000),
    messages: [['role' => 'user', 'content' => 'Prove there are infinitely many primes.']],
);

foreach ($response->content as $block) {
    if ($block instanceof ThinkingBlock) {
        // internal reasoning — typically logged or hidden from end users
    }
    if ($block instanceof TextBlock) {
        echo $block->text;
    }
}
```
Error handling
```php
use AlleAI\Anthropic\Exceptions\AnthropicException;
use AlleAI\Anthropic\Exceptions\AuthenticationException;
use AlleAI\Anthropic\Exceptions\RateLimitException;
use AlleAI\Anthropic\Exceptions\OverloadedException;

try {
    $response = $client->messages()->create(/* ... */);
} catch (AuthenticationException $e) {
    // 401 — bad API key
} catch (RateLimitException $e) {
    // 429 — back off
    sleep($e->retryAfter() ?? 30);
} catch (OverloadedException $e) {
    // 529 — Anthropic capacity issue
} catch (AnthropicException $e) {
    // anything else from this SDK
    error_log('Anthropic API call failed: ' . $e->getMessage());
}
```
Alternative deployments
Same client, different backend.
AWS Bedrock
```sh
composer require aws/aws-sdk-php
```
```php
use AlleAI\Anthropic\Auth\BedrockAuth;

$client = Client::builder()
    ->withAuth(BedrockAuth::fromEnvironment(region: 'us-east-1'))
    ->build();

// Use Bedrock model id format:
$response = $client->messages()->create(
    model: 'anthropic.claude-sonnet-4-7-v1:0',
    maxTokens: 1024,
    messages: [['role' => 'user', 'content' => 'Hello']],
);
```
`BedrockAuth::fromEnvironment()` uses the AWS default credentials chain (env vars, `~/.aws/credentials`, IAM roles, etc.). The auth provider rewrites the URL to `bedrock-runtime.{region}.amazonaws.com`, transforms the body to Bedrock's expected shape, and signs with SigV4.
Google Vertex AI
```sh
composer require google/auth
```
```php
use AlleAI\Anthropic\Auth\VertexAuth;

$client = Client::builder()
    ->withAuth(VertexAuth::fromEnvironment(
        projectId: 'my-gcp-project',
        region: 'us-east5',
    ))
    ->build();

$response = $client->messages()->create(
    model: 'claude-sonnet-4-7@20260101', // Vertex publisher format
    maxTokens: 1024,
    messages: [['role' => 'user', 'content' => 'Hello']],
);
```
`VertexAuth::fromEnvironment()` uses Google ADC for tokens. Pass `projectId` explicitly or set `GOOGLE_CLOUD_PROJECT`; likewise for `region` with `GOOGLE_CLOUD_REGION`.
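For example, the project and region can come from the environment instead of constructor arguments (the variable names are from the paragraph above; the values are placeholders):

```sh
# Placeholder values: substitute your own GCP project id and region.
export GOOGLE_CLOUD_PROJECT=my-gcp-project
export GOOGLE_CLOUD_REGION=us-east5
```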
Custom HTTP client
The SDK auto-discovers a PSR-18 client via php-http/discovery. Inject your own to gain full control:
```php
use GuzzleHttp\Client as GuzzleClient;

$guzzle = new GuzzleClient([
    'timeout' => 60,
    'proxy' => 'http://corporate-proxy:3128',
]);

$client = Client::builder()
    ->withApiKey(getenv('ANTHROPIC_API_KEY'))
    ->withHttpClient($guzzle)
    ->build();
```
Roadmap
| Tag | Status | Adds |
|---|---|---|
| v2.0.0 | shipped | Messages create + stream, tool use, vision, prompt caching, extended thinking, citations, Files, Batches, MCP connector, Models listing, PSR-3 logging, Bedrock + Vertex auth, server-side tools (web search / computer use / bash / text editor), `Messages::createMany()` concurrent fan-out, 14 examples, 186 tests, PHPStan level 9, weekly mutation testing, deprecation shim |
| v2.x | rolling | Bug fixes, new Anthropic features as they ship, additional examples |
| v2.x | future | Alle-AI sister client (`AlleAI\AlleAI\Client`) for multi-model fan-out |
| v3.0.0 | future | Remove `Alle_AI\Anthropic\AnthropicAPI` deprecation shim |
Migration from v1.x
The single-class v1 surface (`Alle_AI\Anthropic\AnthropicAPI::generateText()`) is preserved as a deprecation shim. Existing code keeps working — every call emits an `E_USER_DEPRECATED` notice. Set `ALLE_AI_ANTHROPIC_FAIL_ON_DEPRECATED=1` to convert the notices into exceptions during migration.
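The same notice-to-exception behavior can also be reproduced with plain PHP while you migrate. This is a generic sketch using `set_error_handler()`, not the shim's internal mechanism:

```php
<?php
// Promote E_USER_DEPRECATED notices (as emitted by the v1 shim) to
// exceptions so deprecated call sites fail loudly during migration.
set_error_handler(function (int $errno, string $message): bool {
    throw new ErrorException($message, 0, $errno);
}, E_USER_DEPRECATED);

try {
    // Stand-in for a deprecated v1 call; the shim triggers a similar notice.
    trigger_error('AnthropicAPI::generateText() is deprecated', E_USER_DEPRECATED);
} catch (ErrorException $e) {
    echo "caught: {$e->getMessage()}\n";
}
```

Unlike the environment variable, this handler also surfaces `E_USER_DEPRECATED` notices from other libraries, so scope it to test runs rather than production.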
See UPGRADING.md for the full v1 → v2 guide.
About Alle-AI
This library is built and maintained by Alle-AI — Your All-In-One AI Platform. Alle-AI gives you a single interface to compare and combine outputs from frontier models (Claude, GPT, Gemini, Llama, and more). If you build on Anthropic and want to evaluate alternatives in the same workflow, check out the platform.
Support
- Bugs / feature requests: GitHub Issues
- Discussion: GitHub Discussions
- Email: dickson@alle-ai.com
- Security disclosures: see SECURITY.md