Building AI-Native Applications with the Laravel AI SDK: A Comprehensive Guide
Overview: The Unified AI Strategy for Laravel
Building AI features often feels like a fragmented journey. Developers jump between specialized APIs for text, separate services for images, and complex libraries for audio transcription. The Laravel AI SDK changes this by providing a unified, first-party toolkit that handles the heavy lifting of AI integration. It treats AI as a core application concern, much like Laravel treats databases or queues. By abstracting the differences between providers such as OpenAI, Anthropic, and Gemini, it lets you write cleaner, more maintainable code that isn't locked into a single vendor's API.
The Laravel team designed the SDK to feel "Laravel-esque." This means leaning into conventions like class-based agents, fluent API chains, and deep integration with the existing ecosystem. Whether you need to summarize an issue in a project management tool, generate realistic speech via ElevenLabs, or perform semantic search on a mountain of PDFs, the SDK provides the scaffolding to do it efficiently. It moves AI from being an experimental add-on to a standard part of the modern developer's workflow.
Prerequisites and Environment Setup
Before you begin building, ensure you have a standard Laravel environment ready. You should be comfortable with PHP, Laravel, and basic concepts like controllers and service providers. You will also need API keys from at least one AI provider. While the SDK supports local models via Ollama, production applications typically require keys from a hosted provider such as OpenAI or Anthropic.
To get started, install the package via Composer:
composer require laravel/ai
After installation, publish the configuration and migrations:
php artisan vendor:publish --tag="ai-config"
php artisan vendor:publish --tag="ai-migrations"
php artisan migrate
The configuration file (config/ai.php) allows you to define your default providers. You can set different defaults for different modalities: for instance, one provider for text and another for images. This flexibility is a core strength of the SDK.
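As a rough sketch of what per-modality defaults could look like in config/ai.php (the key and env-variable names here are assumptions for illustration, not the published schema):

```php
<?php

// config/ai.php (sketch; key names are illustrative assumptions)
return [
    // Provider used when an agent does not specify one.
    'default' => env('AI_PROVIDER', 'openai'),

    // Separate defaults can be set per modality.
    'images' => env('AI_IMAGE_PROVIDER', 'gemini'),
    'audio'  => env('AI_AUDIO_PROVIDER', 'elevenlabs'),
];
```

Keeping provider choice in config and .env means you can swap vendors per environment without touching application code.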
Key Libraries & Tools
- Laravel AI SDK: The primary toolkit for interacting with LLMs, image generators, and audio services.
- Prism: A community package by TJ Miller that serves as the query-builder layer for the SDK's text generation.
- ElevenLabs: Integrated for high-quality text-to-speech capabilities.
- Ollama: Enables running local models for development and testing without incurring API costs.
- Laravel Boost: A local server that provides AI agents with context about your specific codebase.
- PostgreSQL with PGVector: Used for storing and searching vector embeddings locally.
Code Walkthrough: Implementing Agents and Tools
1. Creating an Agent
The Agent class is the heart of the SDK. It encapsulates the identity of your AI. Instead of passing long strings of instructions in every controller, you define them once in a reusable class. You can generate one using the Artisan command:
php artisan make:agent SalesCoachAgent
In the generated class, you define the system prompt and the models to use. The instructions method is where you set the "personality" and guardrails for the agent.
namespace App\Agents;

use Laravel\AI\Agent;

class SalesCoachAgent extends Agent
{
    public function instructions(): string
    {
        return "You are an expert sales coach. Analyze the provided transcript and offer three actionable improvements.";
    }
}
2. Using Structured Output
One of the most powerful features is getting the AI to return data in a specific format rather than a messy string. The SDK uses a fluent Schema builder to ensure the model follows your rules. This makes it possible to save AI responses directly into your database without fragile regex parsing.
use App\Agents\SalesCoachAgent;
use Laravel\AI\Schema;

$agent = new SalesCoachAgent();

$response = $agent->predict(
    prompt: "Analyze the call from yesterday.",
    schema: Schema::object([
        'sentiment' => Schema::string()->description('Overall tone of the customer'),
        'score' => Schema::integer()->description('Score from 1-10'),
        'follow_up_needed' => Schema::boolean(),
    ])
);

// Access data directly as an array
echo $response['sentiment'];
3. Integrating Tools (Function Calling)
Tools allow your AI to actually do things. You can give an agent the ability to search the web, fetch a URL, or even query your own database. The SDK comes with several provider tools built-in, but you can also write your own custom tools by extending the Tool class and implementing a handle method.
use Laravel\AI\Tools\WebSearch;

class ResearcherAgent extends Agent
{
    public function tools(): array
    {
        return [
            new WebSearch(),
        ];
    }
}
When you prompt this agent, it will realize it needs more info, call the WebSearch tool, and then use the results to finish its answer. This turns a static LLM into a dynamic assistant.
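To complement the built-in WebSearch tool above, here is a sketch of a custom tool following the pattern the text describes (extend the Tool class, implement a handle method). The class name, description method, and the exact handle() signature are assumptions for illustration:

```php
namespace App\Tools;

use App\Models\Order;
use Laravel\AI\Tool;

// Hypothetical custom tool: lets an agent look up order status.
// The base class comes from the SDK; everything else is illustrative.
class LookupOrderStatus extends Tool
{
    public function description(): string
    {
        return 'Look up the shipping status of an order by its ID.';
    }

    // The SDK passes the model's chosen arguments into handle();
    // the return value is fed back to the model as tool output.
    public function handle(string $orderId): string
    {
        $order = Order::findOrFail($orderId);

        return "Order {$orderId} is currently: {$order->status}";
    }
}
```

Registering it works the same way as WebSearch: return an instance from your agent's tools() array.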
Syntax Notes: Attributes and Traits
The SDK makes heavy use of PHP attributes to simplify configuration. These attributes allow you to stay current with the latest model advancements without changing your code logic.
- #[UseCheapestModel]: Instructs the SDK to use the most cost-effective model a provider offers. This is perfect for simple tasks like summarization.
- #[UseSmartestModel]: Forces the use of the provider's flagship model for tasks requiring strong reasoning capabilities.
- RemembersConversations trait: Adding this to your agent automatically manages database storage for chat history, ensuring the AI remembers previous messages without you manually passing a growing array of context.
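Putting the attribute and trait together, a cheap summarizer agent might look like the following sketch. The attribute and trait names come from the SDK as described; the import namespaces are assumptions:

```php
namespace App\Agents;

use Laravel\AI\Agent;
use Laravel\AI\Attributes\UseCheapestModel;   // namespace assumed
use Laravel\AI\Concerns\RemembersConversations; // namespace assumed

// Cheap, stateful agent: the attribute picks the provider's most
// cost-effective model, and the trait persists chat history.
#[UseCheapestModel]
class SummarizerAgent extends Agent
{
    use RemembersConversations;

    public function instructions(): string
    {
        return 'Summarize the provided text in three bullet points.';
    }
}
```

Because model selection lives in an attribute rather than a hard-coded model string, upgrading to a provider's newer cheap tier requires no code change.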
Practical Examples: The 'Larvis' Workflow
A practical application of this tech is building a voice-enabled assistant like "Larvis." The workflow demonstrates the multi-modal nature of the SDK:
- Transcription: The user uploads an audio file of their question, and the SDK's transcribe method converts the audio to text.
- Context Retrieval: The agent fetches relevant local documents and injects them into the prompt to provide specific knowledge the LLM wasn't trained on.
- Inference: The agent generates a text response based on the transcription and the local documents.
- Speech Synthesis: The text response is passed to the audio method, which uses a text-to-speech provider such as ElevenLabs to generate a high-quality voice response that is sent back to the user.
This entire pipeline, which would previously have required stitching together several separate library integrations, can now be handled in a single controller in under 50 lines of code.
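The four steps above can be sketched as one controller action. The transcribe, predict, and audio method names follow the workflow description; their exact signatures, and the LarvisAgent class itself, are assumptions for illustration:

```php
namespace App\Http\Controllers;

use App\Agents\LarvisAgent; // hypothetical agent for this walkthrough
use Illuminate\Http\Request;

class LarvisController extends Controller
{
    public function ask(Request $request)
    {
        $agent = new LarvisAgent();

        // 1. Transcription: convert the uploaded audio question to text.
        $question = $agent->transcribe($request->file('audio'));

        // 2 & 3. Context retrieval + inference: the agent's instructions
        // and tools pull in local documents before generating a reply.
        $answer = $agent->predict(prompt: $question);

        // 4. Speech synthesis: turn the text answer back into audio.
        return response($agent->audio($answer))
            ->header('Content-Type', 'audio/mpeg');
    }
}
```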
Tips & Gotchas
- Context Bloat: Be careful not to attach too many tools or files to every request. Every tool definition and conversation message consumes tokens, which increases latency and cost. Use the RemembersConversations settings to prune old messages.
- Failover Logic: In production, always define a fallback provider. If your primary provider is experiencing downtime or you hit a rate limit, the SDK can automatically switch to the fallback to keep your app running.
- Local Development: Use Ollama for your daily coding to save money. You can switch your local .env to point the AI provider at http://localhost:11434 to test your logic for free.
- Async Processing: For long-running tasks like transcribing a massive video file or generating a complex image, use the queue method. This offloads the work to a queue worker and prevents your web request from timing out.
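If the queue method's exact shape is unclear in your version, the same effect can be had with a plain Laravel queued job that wraps the slow call. Only standard framework pieces are used here; the agent class and its transcribe call are the hypothetical ones from the walkthrough above:

```php
namespace App\Jobs;

use App\Agents\LarvisAgent; // hypothetical agent from the walkthrough
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Queue\Queueable;

// A standard queued job that moves a long transcription off the
// web request and onto a worker process.
class TranscribeRecording implements ShouldQueue
{
    use Queueable;

    public function __construct(public string $path)
    {
    }

    public function handle(): void
    {
        $text = (new LarvisAgent())->transcribe(storage_path($this->path));

        // Persist or broadcast the finished transcript here.
    }
}
```

Dispatch it with `TranscribeRecording::dispatch($path)` and the request returns immediately.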