omnicado / nette-ai-bundle
Nette integration for Symfony AI components
Requires
- php: >=8.2
- nette/di: ^3.2
- nette/schema: ^1.3
- nette/utils: ^4.0
- symfony/ai-platform: ^0.5
Requires (Dev)
- nette/caching: ^3.4
- nette/security: ^3.2
- nette/tester: ^2.5
- phpstan/phpstan: ^2.0
- phpstan/phpstan-nette: ^2.0
- symfony/ai-agent: ^0.5
- symfony/ai-anthropic-platform: ^0.5
- symfony/ai-open-ai-platform: ^0.5
- symfony/console: ^7.4
- tracy/tracy: ^2.10
Suggests
- contributte/console: For console commands
- symfony/ai-agent: For agent support with tools and memory
- symfony/ai-anthropic-platform: Anthropic platform bridge
- symfony/ai-chat: For chat conversation support
- symfony/ai-open-ai-platform: OpenAI platform bridge
- symfony/ai-store: For vector store and RAG support
- tracy/tracy: For Tracy debug bar panel
README
Nette Framework integration for Symfony AI components.
Provides a seamless bridge between Symfony AI's framework-agnostic platform, agent, and tooling libraries and the Nette DI container, Tracy debugger, and console infrastructure.
Features
- Platform registration — configure OpenAI, Anthropic, Google, Ollama (or custom) platforms via NEON
- Agent wiring — define agents with model, system prompt, tools, and memory
- Tool auto-discovery — services with `#[AsTool]` are automatically registered
- Processor auto-discovery — services with `#[AsInputProcessor]`/`#[AsOutputProcessor]` are wired into agents
- Security — restrict tool access with `#[IsGrantedTool]` based on `Nette\Security\User` roles
- Memory — persist conversation memory via `Nette\Caching\Cache`
- Tracy panel — inspect API calls, duration, token usage, and request/response payloads
- Console commands — `ai:platform:invoke` and `ai:agent:call`
Requirements
- PHP 8.2+
- Nette 3.2+
- Symfony AI Platform ^0.5
Installation
```bash
composer require omnicado/nette-ai-bundle
```
Install the platform bridge package for your provider:
```bash
# OpenAI
composer require symfony/ai-open-ai-platform

# Anthropic
composer require symfony/ai-anthropic-platform

# Google Gemini
composer require symfony/ai-google-platform

# Ollama (local)
composer require symfony/ai-ollama-platform
```
For agent support (tools, processors, memory):
```bash
composer require symfony/ai-agent
```
Configuration
Register the extension in your NEON config:
```neon
extensions:
    ai: Omnicado\NetteAiBundle\DI\AiExtension
```
Full configuration reference
```neon
ai:
    # Platform definitions (keyed by name)
    platforms:
        openai:
            api_key: %env.OPENAI_API_KEY%
            model: gpt-4o
        anthropic:
            api_key: %env.ANTHROPIC_API_KEY%
            model: claude-sonnet-4-5-20250929
        ollama:
            base_url: http://127.0.0.1:11434  # default
            model: llama3
        # Custom platform with explicit factory class
        custom:
            factory: App\AI\MyPlatformFactory
            api_key: %env.CUSTOM_API_KEY%
            model: my-model

    # Agent definitions (keyed by name)
    agents:
        default:
            platform: openai        # required, references a platform name
            model: gpt-4o-mini      # optional, overrides platform model
            system_prompt: "You are a helpful assistant."
            tools: auto             # auto | [class list] | false
            memory: false           # inject memory processor
        researcher:
            platform: anthropic
            system_prompt: "You are a research assistant."
            tools:
                - App\Tool\WebSearchTool
                - App\Tool\WikipediaTool
            memory: true

    # Memory provider (Nette Cache backed)
    memory:
        enabled: false
        cache_namespace: ai.memory

    # Security (role-based tool access)
    security:
        enabled: false

    # Tracy debug panel
    debug: %debugMode%
```
Configuration options
| Option | Type | Default | Description |
|---|---|---|---|
| `platforms.<name>.factory` | `?string` | `null` | Custom `PlatformFactory` class. Auto-resolved for `openai`, `anthropic`, `google`, `ollama`. |
| `platforms.<name>.api_key` | `?string` | `null` | API key (supports `%env.*%` parameters). |
| `platforms.<name>.model` | `?string` | `null` | Default model identifier. |
| `platforms.<name>.base_url` | `?string` | `null` | Base URL (`ollama` defaults to `http://127.0.0.1:11434`). |
| `agents.<name>.platform` | `string` | required | References a platform name. |
| `agents.<name>.model` | `?string` | `null` | Model override. Falls back to the platform's model. |
| `agents.<name>.system_prompt` | `?string` | `null` | System prompt injected into conversations. |
| `agents.<name>.tools` | `'auto'\|string[]\|false` | `'auto'` | Tool wiring mode (see Tools). |
| `agents.<name>.memory` | `bool` | `false` | Enable memory injection for this agent. |
| `memory.enabled` | `bool` | `false` | Register `NetteCacheMemoryProvider`. |
| `memory.cache_namespace` | `string` | `'ai.memory'` | Nette Cache namespace for memories. |
| `security.enabled` | `bool` | `false` | Enable `#[IsGrantedTool]` role checks. |
| `debug` | `bool` | `false` | Enable Tracy panel and event collection. |
Autowiring
The first registered platform is autowired as `PlatformInterface`. The first registered agent is autowired as `Agent`. Additional platforms/agents are accessible by service name:
```php
// Autowired (first registered)
public function __construct(
    private PlatformInterface $platform,
    private Agent $agent,
) {}

// By name
$container->getService('ai.platform.anthropic');
$container->getService('ai.agent.researcher');
```
Usage
Direct platform invocation
```php
use Symfony\AI\Platform\Message\Message;
use Symfony\AI\Platform\Message\MessageBag;
use Symfony\AI\Platform\PlatformInterface;

final class MyService
{
    public function __construct(
        private PlatformInterface $platform,
    ) {}

    public function ask(string $question): string
    {
        $messages = new MessageBag(
            Message::forSystem('You are a helpful assistant.'),
            Message::ofUser($question),
        );

        $result = $this->platform->invoke('gpt-4o', $messages);

        return $result->getResult()->getContent();
    }
}
```
Using agents
```php
use Symfony\AI\Agent\Agent;
use Symfony\AI\Platform\Message\Message;
use Symfony\AI\Platform\Message\MessageBag;

final class ChatService
{
    public function __construct(
        private Agent $agent,
    ) {}

    public function chat(string $userMessage): string
    {
        $messages = new MessageBag(
            Message::ofUser($userMessage),
        );

        $result = $this->agent->call($messages);

        return $result->getContent();
    }
}
```
Tools
Tools are PHP classes annotated with `#[AsTool]` from `symfony/ai-agent`. They are automatically discovered and injected into agents.
Creating a tool
```php
use Symfony\AI\Agent\Toolbox\Attribute\AsTool;

#[AsTool('weather', 'Get current weather for a city')]
final class WeatherTool
{
    public function __invoke(string $city): string
    {
        // Fetch weather data...
        return "The weather in {$city} is sunny, 22°C.";
    }
}
```
Register the tool as a service:
```neon
services:
    - App\Tool\WeatherTool
```
Multi-method tools
A single class can expose multiple tools using the repeatable `#[AsTool]` attribute:
```php
#[AsTool(name: 'weather_current', description: 'Get current weather', method: 'current')]
#[AsTool(name: 'weather_forecast', description: 'Get weather forecast', method: 'forecast')]
final class WeatherTool
{
    public function current(string $city): string { /* ... */ }

    public function forecast(string $city, int $days = 3): string { /* ... */ }
}
```
Tool wiring modes
```neon
agents:
    # Auto-discover all #[AsTool] services
    assistant:
        platform: openai
        tools: auto

    # Only specific tool classes
    researcher:
        platform: openai
        tools:
            - App\Tool\WebSearchTool
            - App\Tool\WikipediaTool

    # No tools
    simple:
        platform: openai
        tools: false
```
Security
Restrict tool access based on `Nette\Security\User` roles using the `#[IsGrantedTool]` attribute.
Setup
```neon
ai:
    security:
        enabled: true
```
Requires `nette/security` with a `User` service in the container.
Usage
```php
use Omnicado\NetteAiBundle\Security\IsGrantedTool;
use Symfony\AI\Agent\Toolbox\Attribute\AsTool;

// Only users with the 'admin' role can use this tool
#[AsTool('delete_user', 'Deletes a user account')]
#[IsGrantedTool('admin')]
final class DeleteUserTool
{
    public function __invoke(int $userId): string
    {
        // ...
    }
}

// Users with either 'admin' or 'editor' role can use this tool
#[AsTool('publish_article', 'Publishes an article')]
#[IsGrantedTool('admin')]
#[IsGrantedTool('editor')]
final class PublishArticleTool
{
    public function __invoke(int $articleId): string
    {
        // ...
    }
}

// No #[IsGrantedTool] = available to everyone
#[AsTool('search', 'Searches documents')]
final class SearchTool
{
    public function __invoke(string $query): string
    {
        // ...
    }
}
```
When security is enabled:

- `getTools()` filters out tools the current user cannot access (the LLM won't even know about them)
- `execute()` returns an "Access denied" result if a tool is called without the required role
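The filtering step can be illustrated with a simplified, self-contained sketch. The data shapes below (a plain map of tool names to required roles) are illustrative only; the real `NetteToolAccessChecker` decorates the toolbox and reads `#[IsGrantedTool]` attributes:

```php
<?php

// Simplified model of role-based tool filtering. Each tool maps its name to
// the roles allowed to use it; an empty list means available to everyone.
// This is NOT the bundle's actual API, just the underlying idea.

/**
 * @param array<string, list<string>> $toolRoles
 * @param list<string>                $userRoles
 * @return list<string> names of tools visible to the user
 */
function filterTools(array $toolRoles, array $userRoles): array
{
    return array_keys(array_filter(
        $toolRoles,
        // Keep a tool if it requires no role, or the user has at least one
        fn (array $required): bool => $required === []
            || array_intersect($required, $userRoles) !== [],
    ));
}

$tools = [
    'delete_user'     => ['admin'],
    'publish_article' => ['admin', 'editor'],
    'search'          => [],
];

// An editor sees 'publish_article' and 'search', but not 'delete_user'
$visible = filterTools($tools, ['editor']);
```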
Memory
Persist conversation memories across requests using Nette Cache.
Setup
```neon
ai:
    memory:
        enabled: true
        cache_namespace: ai.memory  # optional

    agents:
        default:
            platform: openai
            memory: true  # enable for this agent
```
Programmatic usage
The `NetteCacheMemoryProvider` can also be used directly:
```php
use Omnicado\NetteAiBundle\Memory\NetteCacheMemoryProvider;

final class MemoryService
{
    public function __construct(
        private NetteCacheMemoryProvider $memoryProvider,
    ) {}

    public function saveContext(string $fact): void
    {
        $this->memoryProvider->remember($fact);
    }

    public function clearMemory(): void
    {
        $this->memoryProvider->forget();
    }

    public function listMemories(): array
    {
        return $this->memoryProvider->all();
    }
}
```
When an agent has `memory: true`, a `MemoryInputProcessor` is prepended to its input processors. It loads stored memories and injects them into the system message before each call.
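Conceptually, the injection step amounts to appending stored memories to the system prompt. The sketch below is a pure-PHP illustration of that idea; function name and the exact prompt layout are made up for illustration, not taken from `MemoryInputProcessor`:

```php
<?php

// Illustrative sketch of memory injection: stored memories are merged into
// the system prompt before the model is called. Layout is hypothetical.

/** @param list<string> $memories */
function injectMemories(string $systemPrompt, array $memories): string
{
    if ($memories === []) {
        return $systemPrompt;  // nothing stored, prompt unchanged
    }

    return $systemPrompt
        . "\n\n# Memories\n- "
        . implode("\n- ", $memories);
}

$prompt = injectMemories(
    'You are a helpful assistant.',
    ['The user prefers metric units.', 'The user lives in Prague.'],
);
```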
Processors
Custom input/output processors are automatically discovered and wired into agents.
Input processor
```php
use Symfony\AI\Agent\Attribute\AsInputProcessor;
use Symfony\AI\Agent\Input;
use Symfony\AI\Agent\InputProcessorInterface;

// Applied to all agents
#[AsInputProcessor]
final class LoggingInputProcessor implements InputProcessorInterface
{
    public function processInput(Input $input): void
    {
        // Log, modify messages, inject context, etc.
    }
}

// Applied only to the 'researcher' agent, with high priority
#[AsInputProcessor(agent: 'researcher', priority: 100)]
final class ResearchContextProcessor implements InputProcessorInterface
{
    public function processInput(Input $input): void
    {
        // Add research-specific context
    }
}
```
Output processor
```php
use Symfony\AI\Agent\Attribute\AsOutputProcessor;
use Symfony\AI\Agent\Output;
use Symfony\AI\Agent\OutputProcessorInterface;

#[AsOutputProcessor(agent: 'default', priority: 10)]
final class ResponseFilterProcessor implements OutputProcessorInterface
{
    public function processOutput(Output $output): void
    {
        // Filter, transform, or validate agent output
    }
}
```
Register processors as services:
```neon
services:
    - App\Processor\LoggingInputProcessor
    - App\Processor\ResearchContextProcessor
    - App\Processor\ResponseFilterProcessor
```
Processors are sorted by priority (higher = runs first). When `agent` is `null`, the processor applies to all agents.
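The selection rule can be sketched in plain PHP. The array shapes here are simplified stand-ins (the extension actually works with DI service definitions), but the filter-then-sort logic matches the rule above:

```php
<?php

// Illustrative sketch of processor selection: keep processors whose agent
// matches (null = all agents), then order by priority, highest first.

/**
 * @param array<array{name: string, agent: ?string, priority: int}> $processors
 * @return list<string> processor names in execution order
 */
function processorsFor(string $agentName, array $processors): array
{
    $applicable = array_values(array_filter(
        $processors,
        fn (array $p): bool => $p['agent'] === null || $p['agent'] === $agentName,
    ));

    // Higher priority runs first (usort is stable in PHP 8)
    usort($applicable, fn (array $a, array $b): int => $b['priority'] <=> $a['priority']);

    return array_column($applicable, 'name');
}

$processors = [
    ['name' => 'logging',  'agent' => null,         'priority' => 0],
    ['name' => 'research', 'agent' => 'researcher', 'priority' => 100],
    ['name' => 'other',    'agent' => 'default',    'priority' => 50],
];

// For 'researcher': research (priority 100) runs before logging (priority 0)
$order = processorsFor('researcher', $processors);
```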
Tracy Debug Panel
Enable the Tracy panel to monitor AI platform calls during development:
```neon
ai:
    debug: %debugMode%
```
The panel displays:
- Bar tab — invocation count and total duration (e.g., "2 calls / 1234.5 ms")
- Summary table — total API calls, total duration, prompt/completion/total token counts
- Per-call details:
- Model name
- Duration in milliseconds
- Token breakdown (prompt, completion, thinking, cached)
- Input messages (role + content)
- Output content (truncated to 2000 chars)
- Options dump (temperature, etc.)
The panel hooks into Symfony AI Platform's event system via a lightweight PSR-14 dispatcher that intercepts `InvocationEvent` and `ResultEvent`.
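The idea behind that event collection can be shown with a minimal PSR-14-style dispatcher. Class and method names below are illustrative, not the bundle's actual `AiEventDispatcher` API; only `dispatch(object $event): object` follows the PSR-14 `EventDispatcherInterface` signature:

```php
<?php

// Minimal sketch of event collection: every dispatched event is handed to
// registered listeners, so a panel collector can record call details.

final class CollectingDispatcher
{
    /** @var list<callable(object): void> */
    private array $listeners = [];

    public function addListener(callable $listener): void
    {
        $this->listeners[] = $listener;
    }

    // PSR-14 shape: dispatch(object $event): object
    public function dispatch(object $event): object
    {
        foreach ($this->listeners as $listener) {
            $listener($event);
        }

        return $event;
    }
}

$records = [];
$dispatcher = new CollectingDispatcher();
$dispatcher->addListener(function (object $event) use (&$records): void {
    $records[] = $event::class;  // a real collector would record model, tokens, duration
});

$dispatcher->dispatch(new ArrayObject());  // stand-in for an InvocationEvent
```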
Console Commands
Requires `contributte/console` or any other integration of `symfony/console` into Nette.
ai:platform:invoke
Invoke a platform directly from the command line:
```bash
# Basic usage
php bin/console ai:platform:invoke "What is PHP?"

# With options
php bin/console ai:platform:invoke "Explain quantum computing" \
    --model gpt-4o \
    --system "You are a physics professor." \
    --temperature 0.7
```
| Option | Short | Description |
|---|---|---|
| `prompt` (argument) | — | The prompt to send (required) |
| `--model` | `-m` | Model to use (defaults to platform's model) |
| `--system` | `-s` | System prompt |
| `--temperature` | `-t` | Temperature (0.0–2.0) |
ai:agent:call
Call an agent (with tools, memory, processors):
```bash
php bin/console ai:agent:call "Summarize the latest news about AI"

php bin/console ai:agent:call "Find weather in Prague" \
    --system "Always respond in Czech."
```
| Option | Short | Description |
|---|---|---|
| `prompt` (argument) | — | The prompt to send (required) |
| `--system` | `-s` | System prompt |
Architecture
```
src/
├── DI/
│   ├── AiExtension.php                # Main compiler extension
│   ├── AiEventDispatcher.php          # PSR-14 dispatcher for Tracy integration
│   └── Pass/
│       ├── ToolDiscoveryPass.php      # #[AsTool] discovery
│       └── ProcessorDiscoveryPass.php # #[AsInputProcessor] / #[AsOutputProcessor] discovery
├── Console/
│   ├── PlatformInvokeCommand.php      # ai:platform:invoke
│   └── AgentCallCommand.php           # ai:agent:call
├── Memory/
│   └── NetteCacheMemoryProvider.php   # MemoryProviderInterface over Nette Cache
├── Security/
│   ├── IsGrantedTool.php              # #[IsGrantedTool] attribute
│   └── NetteToolAccessChecker.php     # ToolboxInterface decorator with role checks
└── Tracy/
    ├── AiPanel.php                    # Tracy\IBarPanel implementation
    ├── AiPanelCollector.php           # Event data collector
    ├── InvocationRecord.php           # Single invocation value object
    └── templates/
        └── panel.phtml                # Panel HTML template
```
Minimal example
```neon
# config/common.neon
extensions:
    ai: Omnicado\NetteAiBundle\DI\AiExtension

ai:
    platforms:
        openai:
            api_key: %env.OPENAI_API_KEY%
            model: gpt-4o

    agents:
        default:
            platform: openai
            system_prompt: "You are a helpful assistant."
            tools: auto

    debug: %debugMode%

services:
    - App\Tool\WeatherTool
```
```php
// app/Tool/WeatherTool.php
#[AsTool('weather', 'Get weather for a city')]
final class WeatherTool
{
    public function __invoke(string $city): string
    {
        return "Sunny, 22°C in {$city}.";
    }
}
```
```php
// app/Presenters/ChatPresenter.php
final class ChatPresenter extends Nette\Application\UI\Presenter
{
    public function __construct(
        private Agent $agent,
    ) {}

    public function actionDefault(): void
    {
        $result = $this->agent->call(
            new MessageBag(Message::ofUser('What is the weather in Prague?')),
        );

        $this->sendJson(['response' => $result->getContent()]);
    }
}
```
License
MIT