
Laravel Excel queue consuming too much RAM

I have set up a Laravel queue to read Excel files using the Laravel Excel package, and it works great for small files.

But for large files (around 100 MB, 400k records) it takes far too long and consumes nearly 40 GB of the server's RAM.

I have set up Supervisor to run the queue:work command. The server has 60 GB of memory. For small files everything works fine, but for large files it doesn't.
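
The Supervisor entry looks roughly like this (paths, process count, and options are simplified):

    [program:laravel-worker]
    ; paths and numbers are illustrative
    command=php /var/www/artisan queue:work --sleep=3 --tries=3
    numprocs=4
    autostart=true
    autorestart=true
    user=www-data
    stdout_logfile=/var/www/storage/logs/worker.log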

I also checked the query times using Telescope, and no query took a long time.

P粉884667022 · 235 days ago

Replies (2)

  • P粉726234648 · 2024-03-22 16:23:26

    There is no direct answer to a question like this; a lot depends on the results you are after, and you will have to devise your own approach.

    The first thing I would consider is chunking or partitioning the large Excel file and pushing the chunks onto a queue. You may also be able to take advantage of Laravel job batching.
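
    A minimal sketch of a queued, chunked import with Laravel Excel (the model and column mapping are placeholders; the chunk and batch sizes are just starting points):

        <?php

        namespace App\Imports;

        use App\Models\Record;
        use Illuminate\Contracts\Queue\ShouldQueue;
        use Maatwebsite\Excel\Concerns\ToModel;
        use Maatwebsite\Excel\Concerns\WithBatchInserts;
        use Maatwebsite\Excel\Concerns\WithChunkReading;

        class RecordsImport implements ToModel, WithChunkReading, WithBatchInserts, ShouldQueue
        {
            // Map one spreadsheet row to a model (hypothetical columns)
            public function model(array $row)
            {
                return new Record([
                    'name'  => $row[0],
                    'email' => $row[1],
                ]);
            }

            // Read the file 1,000 rows at a time; with ShouldQueue,
            // each chunk is dispatched as its own queued job
            public function chunkSize(): int
            {
                return 1000;
            }

            // Insert models 1,000 at a time instead of one query per row
            public function batchSize(): int
            {
                return 1000;
            }
        }

    Kicked off with Excel::import(new RecordsImport, 'big-file.xlsx'); because the import implements ShouldQueue, every chunk becomes its own queued job, so no single worker has to hold the whole file in memory.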

    Another option is to offload this work to a separate service, so that these heavy imports run on another, better-provisioned machine.

    But as I said, there is no single solution to a problem like this; you will have to work it out for your own case.

  • P粉455093123 · 2024-03-22 10:38:32

    For anyone facing this kind of problem, I recommend Spout. It works like a charm. I tried three PHP libraries for this, and in the end only Spout worked.

    https://opensource.box.com/spout/

    https://github.com/box/spout
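
    A minimal sketch of streaming a large XLSX with Spout 3 (the file path and row handling are placeholders):

        <?php

        require 'vendor/autoload.php';

        use Box\Spout\Reader\Common\Creator\ReaderEntityFactory;

        $path = 'storage/app/large-file.xlsx';

        // Picks the right reader from the file extension
        $reader = ReaderEntityFactory::createReaderFromFile($path);
        $reader->open($path);

        foreach ($reader->getSheetIterator() as $sheet) {
            foreach ($sheet->getRowIterator() as $row) {
                // Rows are streamed one at a time, so memory use stays flat
                $cells = $row->toArray();
                // ... process $cells (e.g. buffer them for a bulk insert)
            }
        }

        $reader->close();

    Spout streams rows instead of loading the whole workbook, which is why it copes with files this size.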
