A set of decoupled PHP components for AI integrations. Each component works on its own, without the full framework; the foundation layer hides third-party specifics.
Send prompts to any AI model and get responses with a single method call. The Platform abstracts away provider differences — switch between OpenAI, Anthropic, Mistral, or local models without changing your application code.
$platform = PlatformFactory::create($apiKey, http_client());

$messages = new MessageBag(
    Message::forSystem('You are a helpful assistant.'),
    Message::ofUser('What is the Symfony framework?'),
);

$result = $platform->invoke('gpt-5-mini', $messages);

echo $result->asText();
Stream responses token by token for real-time output. Instead of waiting for the full response, display text as it's generated — just like ChatGPT does. Works with any supported model and platform.
$result = $platform->invoke('gpt-5-mini', $messages, [
    'stream' => true,
]);

foreach ($result->asStream() as $chunk) {
    echo $chunk; // process chunks as they arrive
}
Convert text to natural-sounding speech or transcribe audio recordings to text. Build voice assistants, podcast transcription tools, or accessibility features with providers like ElevenLabs or OpenAI Whisper — all through the same interface.
// Text to Speech
$result = $platform->invoke('eleven_multilingual_v2', new Text('Hello!'), [
    'voice' => 'pqHfZKP75CvOlQylNhV4',
]);
$result->asFile('output.mp3');

// Speech to Text
$result = $platform->invoke('scribe_v1', Audio::fromFile('recording.mp3'));
echo $result->asText();
Send images, PDFs, and audio alongside text to any supported model. Build apps that can analyze photos, extract data from documents, or process voice recordings — all through the same simple API.
$messages = new MessageBag(
    Message::ofUser(
        'Describe this image',
        Image::fromFile('/path/to/photo.jpg'),
    ),
);

$result = $platform->invoke('gpt-5-mini', $messages);

echo $result->asText();
Map AI responses directly to PHP objects with automatic deserialization. Instead of parsing JSON strings yourself, get type-safe PHP objects back from the model — ready to use in your application logic.
$messages = new MessageBag(
    Message::forSystem('You are a helpful math tutor.'),
    Message::ofUser('Solve 8x + 7 = -23'),
);

$result = $platform->invoke('gpt-5-mini', $messages, [
    'response_format' => MathReasoning::class,
]);

$math = $result->asObject(); // MathReasoning instance
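The `response_format` target is a plain PHP class whose typed properties describe the shape the model should fill. A minimal sketch of what such a class could look like (the `Step`/`MathReasoning` shapes here are illustrative, not taken from the library):

```php
final class Step
{
    public function __construct(
        public string $explanation, // reasoning for this step
        public string $output,      // intermediate result, e.g. '8x = -30'
    ) {
    }
}

final class MathReasoning
{
    /**
     * @param Step[] $steps
     */
    public function __construct(
        public array $steps,
        public string $finalAnswer,
    ) {
    }
}
```

The serializer maps the model's JSON response onto these constructor parameters, so the property names and types effectively double as the schema sent to the provider.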
Build autonomous agents that reason, plan, and execute multi-step tasks. An Agent wraps a model with tools and processors, handling the full loop of calling tools, reading results, and deciding what to do next — until the task is complete.
$agent = new Agent($platform, 'gpt-5-mini');

$messages = new MessageBag(
    Message::forSystem('You are a helpful assistant.'),
    Message::ofUser('What is the Symfony framework?'),
);

$result = $agent->call($messages);

echo $result->getContent();
Track exactly how many tokens each request consumes — prompt, completion, and total. Essential for monitoring costs, optimizing prompts, and staying within budget when running AI features in production.
$result = $agent->call($messages);

$usage = $result->getMetadata()->get('token_usage');

echo $usage->getPromptTokens();
echo $usage->getCompletionTokens();
echo $usage->getTotalTokens();
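Token counts translate directly into spend. A back-of-the-envelope cost estimator (the per-1K prices below are placeholders, not real provider rates; check your provider's pricing page):

```php
/**
 * Estimates the cost of one request in USD from its token usage.
 * Prompt and completion tokens are usually billed at different rates.
 */
function estimateCost(
    int $promptTokens,
    int $completionTokens,
    float $inputPricePer1K,   // $ per 1K prompt tokens (assumed)
    float $outputPricePer1K,  // $ per 1K completion tokens (assumed)
): float {
    return $promptTokens / 1000 * $inputPricePer1K
        + $completionTokens / 1000 * $outputPricePer1K;
}

printf("%.6f USD\n", estimateCost(1200, 350, 0.00025, 0.002)); // prints 0.001000 USD
```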
Let AI models call your PHP functions to fetch data or trigger actions. Define tools as simple classes — the framework handles schema generation, argument passing, and result formatting automatically.
$toolbox = new Toolbox([new YoutubeTranscriber(http_client())]);
$processor = new AgentProcessor($toolbox);

$agent = new Agent($platform, 'gpt-5-mini', [$processor], [$processor]);
$result = $agent->call($messages);
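A tool itself is just an attributed class. A minimal sketch, assuming the `AsTool` attribute name and namespace from the Symfony AI docs (verify against your installed version):

```php
use Symfony\AI\Agent\Toolbox\Attribute\AsTool;

// Illustrative tool: the model can call it whenever the user asks for the time.
#[AsTool('current_time', 'Returns the current date and time')]
final class ClockTool
{
    public function __invoke(): string
    {
        return (new \DateTimeImmutable())->format('Y-m-d H:i:s');
    }
}
```

The toolbox derives the JSON schema for the call from the method signature, so parameter names, types, and docblock descriptions are what the model sees.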
Compose complex workflows by using agents as tools for other agents. A coordinator agent can delegate specialized tasks — like math, research, or translation — to focused subagents, each with their own model and instructions.
$mathAgent = new Agent($platform, 'gpt-5.2', [$mathPrompt]);
$subagent = new Subagent($mathAgent);

$toolbox = new Toolbox([$subagent]);
$processor = new AgentProcessor($toolbox);

$agent = new Agent($platform, 'gpt-5-mini', [$processor], [$processor]);
$result = $agent->call($messages);
Index documents into vector stores and retrieve relevant context automatically. Let your AI answer questions about your own data — manuals, blog posts, knowledge bases — by finding the most relevant passages before generating a response.
$vectorizer = new Vectorizer($platform, 'text-embedding-3-small');
$processor = new DocumentProcessor($vectorizer, $store);
$indexer = new SourceIndexer($loader, $processor);
$indexer->index(['/path/to/docs']);

$retriever = new Retriever($vectorizer, $store);
$docs = $retriever->retrieve('How do I create a command?');
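The retrieved passages are then typically folded into the prompt before invoking the model. A small helper sketching that step (assuming each retrieved document exposes its text via a `getContent()` method — check the actual document class your store returns):

```php
/**
 * Joins retrieved document texts into a single context string for the prompt.
 *
 * @param iterable<object> $docs documents with an assumed getContent() accessor
 */
function buildContext(iterable $docs): string
{
    $parts = [];
    foreach ($docs as $doc) {
        $parts[] = $doc->getContent();
    }

    return implode("\n\n", $parts);
}

// The joined context can then be prepended as a system message, e.g.:
// Message::forSystem("Answer using only this context:\n".buildContext($docs))
```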
Clone the demo, add your OpenAI API key, and start the server. The app includes a chatbot UI, RAG pipeline, and multiple demo scenarios ready to explore.
$ composer create-project symfony/ai-demo
$ cd ai-demo
$ echo "OPENAI_API_KEY='sk-...'" > .env.local
$ docker compose up -d
$ symfony serve -d
The official MCP PHP SDK, built in collaboration with the PHP Foundation. Create MCP servers that expose tools and resources to any AI agent, or build clients that connect to existing MCP servers.
class WeatherTool
{
    /**
     * @param string $city Name of the city
     */
    #[McpTool(name: 'weather')]
    public function getWeather(string $city): string
    {
        return 'sunny, 24°C';
    }
}
# Install Mate with extensions
$ composer require --dev symfony/ai-mate
# Symfony extension: profiler, services
$ composer require --dev symfony/ai-symfony-mate-extension
# Monolog extension: log search, filtering
$ composer require --dev symfony/ai-monolog-mate-extension
# Initialize and start the MCP server
$ vendor/bin/mate init
$ vendor/bin/mate serve
An MCP server for AI-assisted development. Mate gives AI assistants like Claude Code, Copilot, or Codex direct access to your Symfony application — profiler data, logs, container services — so they can help you debug and understand your app in real time.
Mate docs