PHP CSV export

不言 (Original) · 2018-04-20 10:04:39



This article introduces PHP CSV export and should serve as a useful reference. I am sharing it here for anyone who needs it.


Overview:


A recent company project needed an export feature in addition to the usual page output. Although I had built several export features before, the data volume this time was much larger: almost 200,000 rows per day, so an export covering 7 or 30 days easily runs past one million or even ten million rows. While developing it I ran into a number of pitfalls of big-data export, which I would like to share so we can learn from each other.

Preparation:

1. PHP configuration pitfalls:

  • set_time_limit – Set the maximum execution time of the script:

By default PHP limits script execution to 30 seconds. With small data sets you may never notice this setting, but once the data reaches the million-row level, 30 seconds is often not enough, so you need to add set_time_limit(0) to your script so that it has no execution time limit (a short sketch follows after this list).

  • memory_limit – PHP memory limit:

PHP's default for this setting is usually 128M. If you have worked with small data sets before, you may have solved many problems simply by raising it, so it is tempting to raise it again for big data. That would be naive: setting it to 1G, or removing the limit, might be fine on your local machine, but in production something will go wrong sooner or later. A single PHP process occupying that much memory is a luxury the server hardware should not have to pay for, and if you ask your company's ops team to change the configuration for you, I suspect they will be very reluctant. So we should avoid raising this setting whenever possible.
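As a minimal sketch of the two settings above (assuming the export runs inside a normal web request): the time limit is lifted per script with set_time_limit(), while memory_limit is best left alone and only inspected.

    <?php
    // Lift the execution time limit for this script only
    // (the default max_execution_time is usually 30 seconds).
    set_time_limit(0);

    // For reference only: memory_limit usually defaults to 128M. It can be raised with
    // ini_set('memory_limit', '256M'), but as argued above it is better to keep memory flat
    // by processing the data in batches instead.
    echo ini_get('max_execution_time') . PHP_EOL; // prints 0 after set_time_limit(0)
    echo ini_get('memory_limit') . PHP_EOL;       // prints the current limit, e.g. 128M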

2. Excel pitfalls:

Since we are exporting data, everyone immediately thinks of the Excel format, which is so convenient for viewing. What we did not expect is that Excel has a temper of its own!

  • Sheet size limits:

Excel 2003 and earlier: a single sheet holds at most 65,536 rows and 256 columns.
Excel 2007-2010: a single sheet holds at most 1,048,576 rows and 16,384 columns.

In other words, you cannot simply dump millions of rows into a single Excel sheet in one go. At a minimum you have to split the data so that no sheet exceeds roughly 1,040,000 rows.

  • PHPExcel memory overflow:

Fine, the data is capped at roughly 1,040,000 rows per sheet, so you split it. You try importing 500,000 rows per sheet, but somewhere inside PHPExcel a function reports a memory-overflow error. You keep reducing the batch size, and even at 50,000 rows per import the memory still overflows. Why? Even though you split the data across multiple sheets, PHPExcel still assembles all of the sheet data in a single variable before writing the file... With millions of rows held in one variable, it is very hard to keep memory from overflowing.
(I later read that PHPExcel does offer a remedy: PHPExcel_Settings::setCacheStorageMethod can switch the cell caching method to reduce memory usage.)
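A minimal sketch of that cache setting, assuming the (now archived) PHPExcel library is available; the phpTemp cache method and the 8MB threshold are just example values from PHPExcel's documentation, so check which methods your version supports.

    <?php
    require_once 'PHPExcel.php'; // assumes PHPExcel can be loaded from the include path or via Composer

    // Swap cells out to php://temp once the in-memory cache exceeds 8MB,
    // instead of keeping every cell object in RAM at once.
    $cacheMethod   = PHPExcel_CachedObjectStorageFactory::cache_to_phpTemp;
    $cacheSettings = array('memoryCacheSize' => '8MB');

    if (!PHPExcel_Settings::setCacheStorageMethod($cacheMethod, $cacheSettings)) {
        die($cacheMethod . ' caching method is not available' . PHP_EOL);
    }

    // Only create the workbook after the cache method has been chosen
    $objPHPExcel = new PHPExcel();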

3. CSV pitfalls:

If Excel is that much trouble, why not skip it? Store the data as CSV files instead: there is no row limit, you can still open them in Excel to view them, and you can import the files into a database later. Two birds with one stone! A great idea, young hero, but CSV has its own pitfalls!

  • Output buffer overflow:

When you use PHP's native fputcsv(), the output actually goes through the output buffer. If you push several million rows through this function, the output buffer grows too large and an error is reported. So every certain number of rows we must flush the contents of the output buffer and mark them as ready to be sent. Concretely:

ob_flush();
flush();

For a detailed explanation, see the article on the difference between PHP flush() and ob_flush().
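To make the pattern concrete, here is a minimal sketch of flushing every so many rows while streaming CSV straight to the browser through php://output. It assumes output buffering is enabled; fetchRowsSomehow() and the 10,000-row interval are placeholders, not part of the original code.

    <?php
    header('Content-Type: text/csv; charset=utf-8');
    header('Content-Disposition: attachment; filename="export.csv"');

    $fp  = fopen('php://output', 'w'); // rows written here go through PHP's output buffer
    $cnt = 0;

    fputcsv($fp, array('id', 'ip', 'created_at')); // header row (placeholder columns)

    foreach (fetchRowsSomehow() as $row) { // hypothetical batched row source
        fputcsv($fp, $row);
        if (++$cnt % 10000 == 0) {
            ob_flush(); // push PHP's output buffer down to the web server
            flush();    // ask the web server to send it on to the client
        }
    }
    fclose($fp);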

  • Excel's limit when viewing CSV files:

Most people open CSV files directly in Excel. Er, doesn't that drop us straight back into the Excel pit? Excel has a display limit: of your several million rows it will only show about 1,040,000. What, you don't care, because people are just opening the file the wrong way? No, no. The fix is not hard anyway: split the data and save it to separate CSV files; you would have to split it regardless, or the data variable would overflow memory.

4. The overall approach:

Having analysed the pitfalls above, here is the resulting plan, assuming a data volume of several million rows.

1. Read the data from the database in batches, so that no single variable overflows memory.

2. Save the data as CSV files, which are easy to read afterwards and easy to import back into a database.

3. So that Excel can still open the CSV files comfortably, split the data before the 1,040,000-row mark and save it across multiple CSV files.

4. Handing the user several CSV files to download is unfriendly, so compress them and offer a single ZIP archive for download.

Code:

    // Export notes: a single Excel sheet can only display about 1,040,000 rows, and PHPExcel
    // tends to use far too much memory when the data set is large. The data is therefore written
    // out in CSV format; but since Excel shows the same 1,040,000-row limit when opening a CSV,
    // the data is split across several CSV files, which are finally compressed into a single ZIP
    // file offered for download.
    function putCsv(array $head, $data, $mark = 'attack_ip_info', $fileName = "test.csv")
    {
        set_time_limit(0);
        $sqlCount = $data->count();
        // Send the download headers; replace the csv file name with whatever you need
        header('Content-Type: application/vnd.ms-excel;charset=utf-8');
        header('Content-Disposition: attachment;filename="' . $fileName . '"');
        header('Cache-Control: max-age=0');
        $sqlLimit = 100000; // fetch only 100,000 rows from the database at a time so no variable grows too large
        // Flush the output buffer every $limit rows; not too large, not too small
        $limit = 100000;
        // Buffer counter
        $cnt = 0;
        $fileNameArr = array();
        // Fetch the data batch by batch instead of all at once, so memory is not wasted
        for ($i = 0; $i < ceil($sqlCount / $sqlLimit); $i++) {
            $fp = fopen($mark . '_' . $i . '.csv', 'w'); // create a temporary file
            // chmod('attack_ip_info_' . $i . '.csv', 777); // adjust permissions if needed
            $fileNameArr[] = $mark . '_' . $i . '.csv';
            // Write the data to the file handle with fputcsv
            fputcsv($fp, $head);
            $dataArr = $data->offset($i * $sqlLimit)->limit($sqlLimit)->get()->toArray();
            foreach ($dataArr as $a) {
                $cnt++;
                if ($limit == $cnt) {
                    // Flush the output buffer to avoid problems caused by too much buffered data
                    ob_flush();
                    flush();
                    $cnt = 0;
                }
                fputcsv($fp, $a);
            }
            fclose($fp); // close each file as soon as it has been written
        }
        // Compress the CSV files into one archive
        $zip = new ZipArchive();
        $filename = $mark . ".zip";
        $zip->open($filename, ZipArchive::CREATE); // open the archive
        foreach ($fileNameArr as $file) {
            $zip->addFile($file, basename($file)); // add each CSV file to the archive
        }
        $zip->close(); // close the archive
        foreach ($fileNameArr as $file) {
            unlink($file); // delete the temporary CSV files
        }
        // Send the archive for download (these headers replace the ones sent above)
        header("Cache-Control: max-age=0");
        header("Content-Description: File Transfer");
        header('Content-disposition: attachment; filename=' . basename($filename)); // file name
        header("Content-Type: application/zip"); // ZIP format
        header("Content-Transfer-Encoding: binary");
        header('Content-Length: ' . filesize($filename));
        @readfile($filename); // output the file
        unlink($filename); // delete the temporary archive
    }
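For illustration, a hypothetical call to the function above. It assumes $data is a Laravel-style query builder, since count(), offset(), limit(), get() and toArray() are the only methods the function relies on; the table and column names are made up.

    // Hypothetical usage (Laravel-style query builder assumed; table and column names are examples)
    $head = array('ip', 'attack_type', 'created_at');
    $data = DB::table('attack_ip_info')->select('ip', 'attack_type', 'created_at');

    putCsv($head, $data, 'attack_ip_info', 'attack_ip_info.csv');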
Summary:

There is still room to improve the code above. For example, exceptions should be caught so that an error part-way through does not leave temporary files behind undeleted. PHPExcel's cache setting might also solve the memory-overflow problem, in which case the export could be produced as a single Excel file with multiple worksheets, which is friendlier for the reader.

That is my take on exporting big data with PHP. I hope it helps, and please point out anything I have got wrong!

———————————————————————————————————— 
December 17, 2017
PS: I have since learned that the memory-overflow problem is much easier to handle with an iterator (generator).
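As a rough sketch of that idea, a PHP generator can yield rows batch by batch so that only one batch ever sits in memory; the PDO query and table name below are placeholders, not code from the original project.

    <?php
    // Yield rows in fixed-size batches; callers iterate row by row while memory stays flat.
    function rowIterator(PDO $pdo, $batchSize = 100000)
    {
        $offset = 0;
        do {
            // %d casts the values to integers, keeping the LIMIT/OFFSET clause safe
            $sql  = sprintf('SELECT * FROM attack_ip_info LIMIT %d OFFSET %d', $batchSize, $offset);
            $rows = $pdo->query($sql)->fetchAll(PDO::FETCH_NUM);
            foreach ($rows as $row) {
                yield $row;
            }
            $offset += $batchSize;
        } while (count($rows) === $batchSize);
    }

    // The export loop then consumes one row at a time:
    // foreach (rowIterator($pdo) as $row) { fputcsv($fp, $row); }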

Copyright notice: every original article here represents real effort. Reposting is welcome, but please leave a comment before doing so. Thank you for your support! https://blog.csdn.net/Tim_phper/article/details/77581071

Related recommendations:

PHP: reading CSV data into the database line by line

PHP: reading CSV file data and generating CSV files

The above is the detailed content of PHP CSV export. For more information, please follow other related articles on the PHP Chinese website!
