elliottlawson / converse-prism
Seamless integration between Laravel Converse and Prism PHP for AI conversations
Requires
- php: ^8.2
- elliottlawson/converse: ^0.2.0
- illuminate/support: ^11.0|^12.0
Requires (Dev)
- echolabsdev/prism: ^0.71
- laravel/pint: ^1.22
- mockery/mockery: ^1.6
- orchestra/testbench: ^9.4
- pestphp/pest: ^3.0
- pestphp/pest-plugin-laravel: ^3.0
- spatie/laravel-ray: ^1.40
Suggests
- echolabsdev/prism: Required to use this integration package (^0.71)
Versions
- dev-master
- v0.2.0
- v0.1.6
- v0.1.5
- v0.1.4
- v0.1.3
- 0.1.0
This package is auto-updated.
Last update: 2025-06-15 06:26:32 UTC
README
Converse Prism - Seamless AI Integration for Laravel Converse
Managing conversation context is hard. Managing AI provider APIs is harder. Doing both is a nightmare.
Converse Prism bridges Laravel Converse with Prism PHP to make AI conversations effortless. Write `$conversation->toPrismText()` and your entire conversation history flows to any AI provider: OpenAI, Anthropic, Google, or beyond. No manual message formatting. No provider lock-in. Just conversations that work.
Documentation
View the full documentation - comprehensive guides, API reference, and examples.
The Magic
Without Converse Prism, you're juggling two complex systems:
```php
// Extract messages from Converse
$messages = [];
foreach ($conversation->messages as $message) {
    $messages[] = [
        'role' => $message->role,
        'content' => $message->content,
    ];
}

// Manually configure Prism
$prism = Prism::text()
    ->using(Provider::OpenAI, 'gpt-4')
    ->withMessages($messages)
    ->withMaxTokens(500);

// Make the call
$response = $prism->generate();

// Figure out metadata storage...
$conversation->messages()->create([
    'role' => 'assistant',
    'content' => $response->text,
    // What about tokens? Model info?
]);
```
With Converse Prism, it's seamless:
```php
// Everything flows automatically
$response = $conversation
    ->toPrismText()
    ->using(Provider::OpenAI, 'gpt-4')
    ->withMaxTokens(500)
    ->asText();

// Store the response with all metadata
$conversation->addPrismResponse($response->text);
```
That's it. Your conversation history becomes your AI context. Automatically.
Features
- Automatic Message Passing - conversation history flows to AI providers without manual formatting
- Direct Prism Integration - first-class support for all Prism features and providers
- Elegant Streaming - real-time responses with automatic chunk collection and storage
- Tool & Function Support - handle complex AI workflows with automatic message type management
- Complete Metadata - token counts, model info, and response metadata stored automatically
- Drop-in Enhancement - works with all existing Converse code, just adds Prism superpowers
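The streaming feature can be sketched as follows. This is a rough illustration, not the package's confirmed API: `asStream()` and the chunk's `text` property come from Prism's streaming interface, and manually accumulating the text before calling `addPrismResponse()` is an assumption here; check the full documentation for the package's own streaming helpers.

```php
// Sketch: stream a response while keeping the conversation as the context source.
$stream = $conversation
    ->toPrismText()
    ->using(Provider::OpenAI, 'gpt-4')
    ->withMaxTokens(500)
    ->asStream(); // Prism's streaming entry point

$text = '';
foreach ($stream as $chunk) {
    $text .= $chunk->text; // accumulate for storage
    echo $chunk->text;     // push each chunk to the client as it arrives
}

// Persist the completed response once streaming finishes
$conversation->addPrismResponse($text);
```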
Installation
```bash
composer require elliottlawson/converse-prism
```
Prism is a suggested dependency and is not installed automatically; require `echolabsdev/prism` alongside this package if you haven't already. Then run the migrations:
```bash
php artisan migrate
```
Quick Start
Update your User model to use the Converse Prism trait:
```php
use ElliottLawson\ConversePrism\Concerns\HasAIConversations;

class User extends Authenticatable
{
    use HasAIConversations; // Replaces the base Converse trait
}
```
Start having AI conversations:
```php
use Prism\Enums\Provider;

// Build the conversation context
$conversation = $user->startConversation(['title' => 'My Chat'])
    ->addSystemMessage('You are a helpful assistant')
    ->addUserMessage('Hello! What is Laravel?');

// Make your AI call with automatic message passing
$response = $conversation
    ->toPrismText()
    ->using(Provider::OpenAI, 'gpt-4')
    ->withMaxTokens(500)
    ->asText();

// Store the AI's response with metadata
$conversation->addPrismResponse($response->text);
```
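Because the history lives on the conversation, a follow-up turn is just another add-and-call cycle. This sketch uses only the methods shown above; each new call automatically includes every prior message as context:

```php
// Ask a follow-up question in the same conversation
$conversation->addUserMessage('How do migrations work?');

// The full history, including the first exchange, flows to the provider
$response = $conversation
    ->toPrismText()
    ->using(Provider::OpenAI, 'gpt-4')
    ->withMaxTokens(500)
    ->asText();

$conversation->addPrismResponse($response->text);
```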
Requirements
- PHP 8.2+
- Laravel 11.0+
- Converse ^0.2
- Prism ^0.71
License
The MIT License (MIT). Please see License File for more information.