Building RAG Chatbots with Laravel AI SDK: Documents and Vector Stores
Overview
Modern applications increasingly require the ability to interact with internal knowledge bases. The Laravel AI SDK addresses this with built-in abstractions for documents and vector stores, enabling Retrieval-Augmented Generation (RAG): the chatbot retrieves relevant passages from your own files and grounds its answers in them.
Prerequisites
To follow this implementation, you should be familiar with the Laravel framework and comfortable writing PHP. You will also need API credentials for OpenAI, which the examples use for embeddings and LLM processing.
Key Libraries & Tools
- Laravel AI SDK: The core framework for AI interactions.
- OpenAI: The external provider used for embeddings and LLM processing.
- PostgreSQL: A self-hosted alternative for storing vector data locally (via the pgvector extension).
- Livewire: Used for the real-time chatbot interface.

Code Walkthrough
1. Processing the Document
When a user uploads a file, we move it to local storage and then dispatch a job to sync it with our AI provider's vector store.
// In DocumentController or a queued job
// Create the vector store once; reuse its ID for subsequent uploads
// rather than creating a new store per document.
$store = AI::stores()->create(['name' => 'Internal Docs']);

// Wrap the uploaded file and push it to the provider's vector store.
$document = AI::document(storage_path('app/' . $path));
$store->add($document);
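The "dispatch a job" step mentioned above keeps the upload request fast while the provider sync happens in the background. A minimal sketch of the controller side might look like this; SyncDocumentToStore is a hypothetical job name wrapping the store->add() call shown above, not part of the SDK:

```php
// In the upload controller: persist the file to local storage,
// then queue the provider sync. SyncDocumentToStore is an
// illustrative job class, not an SDK-provided one.
public function store(Request $request)
{
    $path = $request->file('document')->store('documents'); // local disk

    SyncDocumentToStore::dispatch($path);

    return back()->with('status', 'Document queued for indexing.');
}
```

Queuing matters here because provider-side ingestion (chunking plus embedding) can take several seconds, which you do not want blocking the HTTP response.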
2. Creating the Chatbot Agent
We define an agent using the make:agent command. This agent encapsulates the instructions and tools required to search the documents.
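Assuming the SDK follows Laravel's standard generator conventions (an assumption; check the SDK documentation for the exact signature), scaffolding the agent would look like:

```shell
php artisan make:agent DocumentQA
```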
class DocumentQA extends Agent
{
    #[Provider('openai')]
    #[Model('gpt-4-turbo')]
    public function instructions(): string
    {
        return 'You are an assistant answering questions based on the provided files.';
    }

    public function tools(): array
    {
        return [Tool::fileSearch()];
    }
}
3. Implementing the Livewire Chat
The Livewire component handles the user input and prompts the agent with the specific store_id where the documents are located.
public function askQuestion()
{
    $agent = new DocumentQA();
    $response = $agent->prompt($this->question, ['store_id' => $this->storeId]);
    $this->messages[] = ['user' => $this->question, 'bot' => $response->text()];
}
Syntax Notes
The SDK uses PHP attributes such as #[Provider] and #[Model] to declaratively configure agents, removing the need for separate configuration files or manual factory instantiation. The Tool::fileSearch() method is particularly notable: it bridges the gap between the LLM and the vector store without requiring any manual embedding logic.
Practical Examples
- Internal Wikis: Companies can upload HR manuals or technical documentation for employee self-service.
- Customer Support: Automated agents can reference product manuals to resolve specific user queries without human intervention.
Tips & Gotchas
- Cost Management: External providers charge for both storage and token usage. A single complex query on a small document can cost upwards of $0.20 when using premium models.
- Latency: Expect delays of 10+ seconds for comprehensive file searches. Use Livewire loading states to keep users informed.
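The loading-state tip above can be handled almost entirely in the Blade template. A minimal sketch, assuming the component and method names from the earlier example:

```html
<button wire:click="askQuestion" wire:loading.attr="disabled">Ask</button>

<!-- Livewire shows this automatically while askQuestion is in flight -->
<div wire:loading wire:target="askQuestion">
    Searching documents… this can take 10+ seconds.
</div>
```

Scoping the indicator with wire:target ensures it only appears for the file-search request, not for unrelated component actions.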