Using the Laravel Excel extension, how can I import more than 10,000 rows of Excel data without modifying the server configuration and without getting a 500 error? Thank you.
PHPz2017-05-16 16:50:23
This usually happens when memory usage exceeds PHP's configured limit, or execution runs long enough to hit the time limit. If you import directly from a web upload with the Laravel Excel extension, you are bound to time out and exceed the memory limit.
The solution is to use Laravel's scheduled tasks or task queues.
Scheduled tasks
After the file is uploaded, a scheduled task periodically checks whether any newly uploaded files exist and, if so, runs the import (see the sketch below).
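A minimal sketch of the scheduler hook; the excel:import-pending artisan command is a hypothetical name for a command that scans the upload directory:

// app/Console/Kernel.php
protected function schedule(Schedule $schedule)
{
    // Every minute, look for uploaded files that have not been imported yet;
    // withoutOverlapping() prevents two import runs from racing each other.
    $schedule->command('excel:import-pending')
             ->everyMinute()
             ->withoutOverlapping();
}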
Task queue
After the file is uploaded, dispatch an asynchronous event (or job), and let a queue listener import the Excel contents in the background (see the sketch below).
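A minimal sketch of the queued-job approach; ImportExcel and RowsImport are hypothetical class names, and the Excel::import() call follows the Maatwebsite\Excel 3.x style (adjust to the version you have installed):

namespace App\Jobs;

use App\Imports\RowsImport; // hypothetical importer class
use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Queue\SerializesModels;
use Maatwebsite\Excel\Facades\Excel;

class ImportExcel implements ShouldQueue
{
    use Queueable, SerializesModels;

    protected $path;

    public function __construct($path)
    {
        $this->path = $path;
    }

    public function handle()
    {
        // Runs in a queue worker process, so the web request's time
        // and memory limits no longer apply to the parsing work.
        Excel::import(new RowsImport, $this->path);
    }
}

// From the upload controller, after storing the file:
dispatch(new ImportExcel($storedPath));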
When the import finally completes, write the result to the database or some other store. This way the data is imported asynchronously and no longer fails on memory or timeout errors.
曾经蜡笔没有小新2017-05-16 16:50:23
It’s normal; I run into this often. That’s why I developed an extension:
https://git.oschina.net/xavie...
https://github.com/xavieryang...
Importing xls with PHP is a serious problem. xlsx fares better because it is just a zip archive of XML files, which can be parsed incrementally; xls is a binary format that must be loaded in full before it can be parsed, and the parsing is comparatively complex. The currently popular PHPExcel is very slow at processing xls and often fails to parse it on import.
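Because xlsx can be read as a stream, libraries such as box/spout parse it row by row in roughly constant memory. A sketch against the Spout 2.x-era API (adjust to your installed version):

use Box\Spout\Reader\ReaderFactory;
use Box\Spout\Common\Type;

$reader = ReaderFactory::create(Type::XLSX);
$reader->open('data.xlsx');
foreach ($reader->getSheetIterator() as $sheet) {
    foreach ($sheet->getRowIterator() as $row) {
        // $row is a plain array of cell values;
        // only one row is held in memory at a time
    }
}
$reader->close();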
巴扎黑2017-05-16 16:50:23
You don’t need to read all 10,000 rows into memory at once. Read them one by one and save them one by one:
// Generator: yields one CSV row at a time, so memory use stays
// constant regardless of how large the file is.
function getRows($file)
{
    $handle = fopen($file, 'rb');
    if ($handle === false) {
        throw new Exception("Unable to open {$file}");
    }
    while (!feof($handle)) {
        $row = fgetcsv($handle);
        if ($row === false) {
            continue; // fgetcsv() returns false at EOF
        }
        yield $row;
    }
    fclose($handle);
}
foreach (getRows('data.csv') as $row) {
    // process one row here, e.g. save it, then let it be garbage-collected
}
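To keep the one-by-one saves fast, the rows can be buffered into multi-row INSERTs. A sketch using Laravel's query builder; the rows table and the column mapping are assumptions for illustration:

use Illuminate\Support\Facades\DB;

$batch = [];
foreach (getRows('data.csv') as $row) {
    $batch[] = ['name' => $row[0], 'email' => $row[1]]; // assumed columns
    if (count($batch) === 500) {
        DB::table('rows')->insert($batch); // one multi-row INSERT per 500 rows
        $batch = [];
    }
}
if ($batch) {
    DB::table('rows')->insert($batch); // flush the remainder
}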