Laravel benchmark reveals bulk native inserts beat Eloquent by 20,000%

Laravel Daily · 2 min read

The performance gap in high-volume seeding

Inserting a million rows isn't just a database task; it's a test of architectural efficiency. While Eloquent offers a developer-friendly syntax, its overhead becomes a massive bottleneck at scale: every model instance triggers events, observers, and timestamp generation. For a standard users table, plain Eloquent inserts crawl at roughly five rows per second, primarily because of CPU-intensive password hashing. Without optimizations, seeding a million users would theoretically take over 50 hours.
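As a baseline, the naive per-row approach looks roughly like this (a minimal sketch; the model name and column values are illustrative):

use App\Models\User;
use Illuminate\Support\Facades\Hash;

// Each create() fires model events and observers, fills timestamps,
// and issues one INSERT per row; the bcrypt hash dominates the per-row cost
for ($i = 1; $i <= 1_000_000; $i++) {
    User::create([
        'name'     => "User {$i}",
        'email'    => "user{$i}@example.com",
        'password' => Hash::make('password'),
    ]);
}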

Comparison of five database strategies

To optimize this, we look at five distinct approaches within the Laravel ecosystem:

  1. Eloquent Individual Creates: High overhead; triggers all model logic.
  2. Model Factories: Uses static properties to avoid re-hashing passwords, boosting speed significantly.
  3. Query Builder with Transactions: Bypasses model logic entirely for direct SQL execution.
  4. Extended Insert Statements: Bundles multiple records into a single INSERT query.
  5. Bulk Native (SQL Load File): The absolute winner, hitting 100,000+ rows per second by bypassing the application layer entirely.
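The fifth strategy writes the data to a file first and hands it straight to MySQL. A minimal sketch, assuming `local_infile` is enabled on both the server and the PDO connection, and using a placeholder file path:

use Illuminate\Support\Facades\DB;

// Hand the file directly to MySQL; no PHP-side per-row work at all
DB::connection()->getPdo()->exec(<<<'SQL'
LOAD DATA LOCAL INFILE '/path/to/users.csv'
INTO TABLE users
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
(name, email, password)
SQL);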

Prerequisites and Key Tools

Before implementing these benchmarks, ensure you are comfortable with PHP and the Laravel framework. You will need:

  • Laravel Framework: The core environment for Eloquent and Query Builder.
  • MySQL: The primary relational database used for these benchmarks.
  • Artisan Command Line: To run the seeding benchmarks.

Code Walkthrough: Optimizing with Streaming

To prevent your server from crashing due to memory exhaustion, use a streaming approach. Instead of loading a million-row array into memory, you process data in chunks and flush the buffer.

use Illuminate\Support\Facades\DB;

// Insert in 1,000-row chunks so only one chunk lives in memory at a time
foreach ($rows->chunk(1000) as $chunk) {
    DB::table('users')->insert($chunk->toArray());
    // Release the chunk so PHP can reclaim the memory before the next iteration
    unset($chunk);
}

This method keeps memory usage around 500MB regardless of total row count, whereas non-buffered approaches can easily scale to several gigabytes and trigger an Out-of-Memory (OOM) error.

Syntax Notes and Best Practices

When using Laravel factories, remember that the password field is typically defined via a static property: the first record computes the hash, and every subsequent record reuses it, saving hours of CPU time across a million rows. Always use database transactions when performing bulk inserts with the Query Builder; committing 1,000 inserts in a single transaction is dramatically faster than 1,000 individual commits.
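Both practices can be sketched as follows. The static-property pattern mirrors Laravel's default UserFactory; the transaction wrapper is a separate seeder fragment that assumes $chunks holds pre-built row arrays:

use Illuminate\Support\Facades\DB;
use Illuminate\Support\Facades\Hash;

// In the factory class: hash once, reuse the result for every record
protected static ?string $password = null;

public function definition(): array
{
    return [
        'password' => static::$password ??= Hash::make('password'),
        // ...other columns
    ];
}

// In the seeder: one commit for many inserts instead of one commit per row
DB::transaction(function () use ($chunks) {
    foreach ($chunks as $chunk) {
        DB::table('users')->insert($chunk);
    }
});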

Source video: Insert 1M Rows to MySQL Laravel DB: 6 Ways (Benchmarks)

Laravel Daily · 9:55

Tutorials and demo projects with the Laravel framework. Host: Povilas Korop