Building AI-Driven Features in Symfony

AI is transforming web development — and with php-llm/llm-chain, PHP developers can easily add powerful LLM features to Symfony apps. This guide shows you how to get started with chatbots, smart assistants, and more.
Why AI in Symfony?
AI is no longer a futuristic concept — it's part of today's tech stack. From chatbots to content enrichment and semantic search, AI-driven features are everywhere. Thanks to the php-llm/llm-chain library, integrating these capabilities into your Symfony project has never been easier.
Introduction
php-llm/llm-chain is a PHP-native library for interacting with Large Language Models (LLMs). It supports a wide variety of platforms (OpenAI, Azure, Replicate, etc.) and allows you to:
Generate content.
Call external tools (functions) with LLMs.
Embed and semantically search documents.
Chain multiple AI calls together with logic.
Symfony integration is provided via php-llm/llm-chain-bundle, which gives you automatic service registration, DI support, and config-driven setup.
Installing the Symfony Bundle
Install the package via Composer:
composer require php-llm/llm-chain-bundle
Add your API key to .env:
OPENAI_API_KEY=your-api-key-here
Configure the service in config/packages/llm_chain.yaml:
llm_chain:
    platform:
        openai:
            api_key: '%env(OPENAI_API_KEY)%'
    chain:
        default:
            model:
                name: 'gpt-4o-mini'
Using AI in Your Symfony Service
Here’s a simple example of how to create a Symfony service that sends a message to an LLM and gets a response.
use PhpLlm\LlmChain\ChainInterface;
use PhpLlm\LlmChain\Model\Message\Message;
use PhpLlm\LlmChain\Model\Message\MessageBag;
use PhpLlm\LlmChain\Model\Response\ResponseInterface;

final class SmartAssistant
{
    public function __construct(
        private ChainInterface $chain,
    ) {
    }

    public function ask(string $question): ResponseInterface
    {
        // The system message sets the assistant's behavior;
        // the user message carries the actual question.
        $messages = new MessageBag(
            Message::forSystem('You are a helpful assistant.'),
            Message::ofUser($question),
        );

        return $this->chain->call($messages);
    }
}
You can now use this service in any controller, console command, or background worker.
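For instance, wiring it into a controller might look like the following sketch. The route, route name, and JSON shape are illustrative, and the example assumes the chain's response exposes its generated text via getContent():

```php
use Symfony\Component\HttpFoundation\JsonResponse;
use Symfony\Component\HttpFoundation\Request;
use Symfony\Component\Routing\Attribute\Route;

final class AssistantController
{
    #[Route('/ask', name: 'app_ask', methods: ['POST'])]
    public function ask(Request $request, SmartAssistant $assistant): JsonResponse
    {
        // Read the question from the JSON request body.
        $question = $request->getPayload()->getString('question');

        // SmartAssistant::ask() returns the chain's ResponseInterface;
        // getContent() is assumed here to return the generated text.
        $response = $assistant->ask($question);

        return new JsonResponse(['answer' => $response->getContent()]);
    }
}
```

Thanks to Symfony's autowiring, SmartAssistant is injected automatically with no extra service configuration.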
Tool Calling: Make the AI Interactive
Want your LLM to call real PHP functions? Mark a class with the #[AsTool] attribute:
use PhpLlm\LlmChain\Toolbox\Attribute\AsTool;

#[AsTool('current_time', 'Returns the current server time')]
final class ClockTool
{
    public function __invoke(): string
    {
        return (new \DateTimeImmutable())->format('Y-m-d H:i:s');
    }
}
The LLM can now decide on its own when to use this function during a conversation. Think of it like ChatGPT Plugins… but in PHP.
Embeddings & Search
llm-chain also supports embeddings for semantic search and integrates with several vector store providers.
This is great for implementing Retrieval-Augmented Generation (RAG) — a technique where you fetch contextually relevant documents before asking the LLM a question.
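The RAG flow can be sketched as follows. DocumentRetriever and its findSimilar() method are hypothetical stand-ins for whatever wraps your vector store; only MessageBag, Message, and ChainInterface come from llm-chain, and getContent() is assumed to return the generated text:

```php
use PhpLlm\LlmChain\ChainInterface;
use PhpLlm\LlmChain\Model\Message\Message;
use PhpLlm\LlmChain\Model\Message\MessageBag;

final class DocumentationAssistant
{
    public function __construct(
        private ChainInterface $chain,
        private DocumentRetriever $retriever, // hypothetical: wraps your vector store
    ) {
    }

    public function answer(string $question): string
    {
        // 1. Fetch documents semantically similar to the question (hypothetical API).
        $documents = $this->retriever->findSimilar($question, maxResults: 3);

        // 2. Prepend them as context so the model answers from your data.
        $context = implode("\n---\n", $documents);

        $messages = new MessageBag(
            Message::forSystem("Answer using only this context:\n".$context),
            Message::ofUser($question),
        );

        return (string) $this->chain->call($messages)->getContent();
    }
}
```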
Try It: Symfony Demo Project
Want to test this out? The php-llm/llm-chain team provides a demo Symfony application showing chatbot interaction and vector search:
➡ php-llm/llm-chain-symfony-demo
Controlling Costs and Tokens
LLMs aren’t free, so stay efficient:
Cache repeated responses
Keep prompts short
Monitor token usage via logs
Keep system prompts lean
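The first tip can be as simple as wrapping the chain call in Symfony's cache contracts. A minimal sketch, assuming getContent() returns the generated text; the cache key strategy and TTL are up to you:

```php
use PhpLlm\LlmChain\ChainInterface;
use PhpLlm\LlmChain\Model\Message\Message;
use PhpLlm\LlmChain\Model\Message\MessageBag;
use Symfony\Contracts\Cache\CacheInterface;
use Symfony\Contracts\Cache\ItemInterface;

final class CachedAssistant
{
    public function __construct(
        private ChainInterface $chain,
        private CacheInterface $cache,
    ) {
    }

    public function ask(string $question): string
    {
        // Identical questions hit the cache instead of the LLM API.
        $key = 'llm_'.hash('xxh128', $question);

        return $this->cache->get($key, function (ItemInterface $item) use ($question): string {
            $item->expiresAfter(3600); // keep answers for one hour

            $messages = new MessageBag(
                Message::forSystem('You are a helpful assistant.'),
                Message::ofUser($question),
            );

            return (string) $this->chain->call($messages)->getContent();
        });
    }
}
```

The callback only runs on a cache miss, so repeated questions cost no tokens at all.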
Conclusion
With just a few lines of configuration and code, you can integrate powerful AI features into your Symfony app. Whether you want to automate tasks, answer questions, or enrich content — llm-chain is a solid tool to get started.
Symfony is ready for the AI age. Are you?