
Rap2h


Export 10M+ rows in XLSX with Laravel without memory issues

TL;DR: This post introduces FastExcel for Laravel, gives a quick overview of PHP generators, and finally shows how to use both to generate Excel files from collections while preserving memory.

About FastExcel

Laravel FastExcel is intended to be a Laravel-flavoured Spout, with the goal of simplifying imports and exports. It can be considered a faster (and memory-friendly) alternative to Laravel Excel, with a different approach and fewer features. Get started in 2 steps.

Install with composer:

composer require rap2hpoutre/fast-excel

Then export a Model or a Collection to XLSX, CSV or ODS:

fastexcel($collection)->export('file.xlsx');

More information is available in the README on the project homepage.

Generators

Generators were introduced in PHP 5.5, many years ago. A generator function looks just like a normal function, except that instead of returning a single value, it yields as many values as it needs to. You can then iterate over the generator.

One of the goals of such a function is to lazily iterate over data without building an array, which preserves memory when manipulating large datasets. Quoting the PHP documentation:

A generator allows you to write code that uses foreach to iterate over a set of data without needing to build an array in memory, which may cause you to exceed a memory limit, or require a considerable amount of processing time to generate.
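For example, here is a minimal, hypothetical generator that produces a range of numbers without ever building the full array:

function numbers(int $from, int $to) {
    for ($i = $from; $i <= $to; $i++) {
        yield $i;
    }
}

// Only one integer lives in memory at a time, even for a huge range.
foreach (numbers(1, 10000000) as $n) {
    // do something with $n
}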

Assuming you have a User model with 10M+ entries in the database and you want to iterate over them in your code, instead of just calling User::all(), you could use a generator:

function usersGenerator() {
    foreach (User::cursor() as $user) {
        yield $user;
    }
}

$users = usersGenerator();
foreach($users as $user) {
    // Do something with each user without hitting memory limit
}

The example above hydrates users one by one: at any point it only uses the memory needed to hold a single user, instead of loading all N users at once.
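For contrast, here is a quick sketch of the approach to avoid on a table this big:

// Memory-hungry: hydrates all 10M+ users into a single Collection at once
$users = User::all();

// Memory-friendly: one query, models hydrated one at a time while iterating
foreach (User::cursor() as $user) {
    // ...
}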

Export large dataset using FastExcel and Generators

Since v1.3.0, FastExcel accepts a generator as a parameter. Using the previous example, you can just pass the generator to the fastexcel() function:

function usersGenerator() {
    foreach (User::cursor() as $user) {
        yield $user;
    }
}

// Export consumes only a few MB, even with 10M+ rows.
$users = usersGenerator();
fastexcel($users)->export('test.xlsx');

When given a generator, FastExcel internally creates rows one by one, so it does not consume extra memory. The export will still take a long time, so make sure it does not hit max_execution_time (you could use a queue or another asynchronous technique, increase the max execution time, or run it from the CLI). Either way, your server will be grateful not to burn all its memory on a single export.
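For instance, here is a minimal sketch of a queued job running the export in the background (the ExportUsers class name, the App\Models\User namespace and the target path are assumptions, not part of FastExcel):

<?php

namespace App\Jobs;

use App\Models\User;
use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;

class ExportUsers implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    public function handle()
    {
        // The generator keeps memory flat; the queue worker runs from the CLI,
        // where max_execution_time is usually not a concern.
        $users = function () {
            foreach (User::cursor() as $user) {
                yield $user;
            }
        };

        fastexcel($users())->export(storage_path('app/users.xlsx'));
    }
}

Dispatch it with ExportUsers::dispatch(); and process it with a queue worker (php artisan queue:work).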

So thanks to generators, you can now export millions of models to XLSX, CSV and ODS files with a few lines of code in a Laravel project.

Learn more about the library in the repository: https://github.com/rap2hpoutre/fast-excel

Top comments (9)

inquisitive-stha

Looks like cursor does not work with eager loading, for example if we need to fetch data from multiple models:

function reviewsGenerator($property_id) {
    foreach (Review::with(['model1', 'model1.model2', 'model1.model2.model3'])
                 ->orderBy('created_at', 'desc')
                 ->cursor() as $review) { // with cursor it returns only the base model, i.e. no model2, no model3
        // yield $review;
        dd($review);
    }
}

What would be your recommendation to deal with this? How could I export such data, which would obviously be huge?
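One possible workaround (a sketch only, reusing the relation names from the snippet above and guessing that $property_id is meant as a filter) is to eager load per chunk inside the generator instead of per row:

function reviewsGenerator($property_id) {
    $page = 0;
    do {
        // Each chunk is eager loaded normally, then yielded row by row,
        // so only one chunk of models sits in memory at a time.
        $reviews = Review::with(['model1', 'model1.model2', 'model1.model2.model3'])
            ->where('property_id', $property_id)
            ->orderBy('created_at', 'desc')
            ->skip($page * 500)
            ->take(500)
            ->get();

        foreach ($reviews as $review) {
            yield $review;
        }

        $page++;
    } while ($reviews->isNotEmpty());
}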

batistacte

You can use database views
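For instance (a sketch with a hypothetical reviews_export view that already joins the related tables):

// The view flattens the joins, so a plain query-builder cursor stays memory friendly
$rows = function () {
    foreach (DB::table('reviews_export')->cursor() as $row) {
        yield (array) $row;
    }
};

fastexcel($rows())->export('reviews.xlsx');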

reachrama

Hi Raphael,
I was trying to download the xlsx in the browser, but I am not able to do it. Is there any other parameter we need to pass to achieve this?

return (new FastExcel($dumpdata))->download(__DIR__.'file.xlsx');
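For what it's worth, here is a minimal sketch of what usually works: pass download() a plain file name and return its response from a controller action (the action name and $dumpdata contents are hypothetical):

// In a controller
public function export()
{
    $dumpdata = collect([['name' => 'Alice'], ['name' => 'Bob']]);

    // download() builds an HTTP response; it must be returned to the browser
    return (new FastExcel($dumpdata))->download('file.xlsx');
}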

mengdodo

thanks!

Javerleo

I was trying to use this very promising package.
The problem is, even the basic users download is not working. No file gets downloaded to the browser.
I'm calling the class from a Livewire component.
Any advice?

Marcos Teodoro

I already used the Spout package in an application at my work, and it solved a lot of the memory-related problems we faced! Thanks for the post =]

Rahul Raveendran

Showing Error: Can use "yield from" only with arrays and Traversables. Can you help?

Rap2h • Edited

The method I originally wrote does not work with models, thank you for your feedback! I fixed the article using cursor, which is simpler:

foreach (User::cursor() as $user) {
    yield $user;
}

Still, you could chunk in larger slices manually, like in this code:

// Generator function
function getUsersOneByOne() {
    // build your chunks as you want (200 chunks of 10 in this example)
    for ($i = 0; $i < 200; $i++) {
        $users = DB::table('users')->skip($i * 10)->take(10)->get();
        // Yield user one by one
        foreach($users as $user) {
            yield $user;
        }
    }
}

// Export consumes only a few MB
(new FastExcel(getUsersOneByOne()))->export('test.xlsx');

Thank you!!

otsabi94

I want to import 10M rows, any idea please?
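For reference, FastExcel also handles imports; a minimal sketch based on the project README (the column names and the User model are assumptions):

// Import each line and persist it; returns a collection of the created models
$users = (new FastExcel)->import('file.xlsx', function ($line) {
    return User::create([
        'name'  => $line['Name'],
        'email' => $line['Email'],
    ]);
});

With 10M rows the returned collection itself can get large, so running the import from a queued job or splitting the file is probably safer.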