3 Ways to Download Remote Files in PHP, from a Performance Perspective
While working on an Excel export today, I repeatedly had to download and open the exported file to test it, which was tedious, so I wrote a small script to do the whole round trip in one go: export the Excel file on the server ==> download it to the local machine ==> open it.
Here are three PHP approaches to downloading a remote file, noted down as a reminder. The third one addresses the performance problem that arises when the file is very large.
3 options:
-rw-rw-r-- 1 liuyuan liuyuan 470 Feb 20 18:12 test1_fopen.php
-rw-rw-r-- 1 liuyuan liuyuan 541 Feb 20 18:06 test2_curl.php
-rw-rw-r-- 1 liuyuan liuyuan 547 Feb 20 18:12 test3_curl_better.php
Option 1: suitable for small files
Use fopen() or file_get_contents() directly to obtain the file contents, and write them out with file_put_contents().
<?php
// An example .xls file from Baidu Wenku
$url = 'http://bs.baidu.com/wenku4/%2Fe43e6732eba84a316af36c5c67a7c6d6?sign=MBOT:y1jXjmMD4FchJHFHIGN4z:lfZAx1Nrf44aCyD6tJqJ2FhosLY%3D&time=1392893977&response-content-disposition=attachment;%20filename=%22php%BA%AF%CA%FD.xls%22&response-content-type=application%2foctet-stream';
$fp_input = fopen($url, 'r');
// file_put_contents() accepts a stream resource and copies its contents
file_put_contents('./test.xls', $fp_input);
exec("libreoffice ./test.xls", $out, $status);
?>
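The same idea can be packaged as a small helper using file_get_contents(), which pulls the whole response into a string before writing it out. This is a sketch, not part of the original script; the function name is my own, and it accepts any URL or path that PHP's stream wrappers understand:

```php
<?php
// A minimal sketch of option 1 as a reusable helper (hypothetical name).
// Only suitable for files comfortably smaller than memory_limit, since
// the entire body is buffered in memory as a string.
function download_small_file(string $url, string $dest): bool
{
    $data = file_get_contents($url);
    if ($data === false) {
        return false; // download failed
    }
    return file_put_contents($dest, $data) !== false;
}
```

Note that file_get_contents() returns false on failure, so the result must be checked with a strict comparison before writing.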
Option 2: fetch the content through curl
<?php
// An example .xls file from Baidu Wenku
$url = 'http://bs.baidu.com/wenku4/%2Fe43e6732eba84a316af36c5c67a7c6d6?sign=MBOT:y1jXjmMD4FchJHFHIGN4z:lfZAx1Nrf44aCyD6tJqJ2FhosLY%3D&time=1392893977&response-content-disposition=attachment;%20filename=%22php%BA%AF%CA%FD.xls%22&response-content-type=application%2foctet-stream';
$ch = curl_init($url);
// Return the response as a string instead of printing it
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
file_put_contents('./test.xls', curl_exec($ch));
curl_close($ch);
exec("libreoffice ./test.xls", $out, $status);
?>
Options 1 and 2 share a problem: the entire file is read into memory before it is written to the local disk. When the file is large, this can exceed the memory limit and crash the script.
Even if memory_limit is set generously, buffering the whole file is unnecessary overhead.
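If curl is not available, the same constant-memory behaviour can be approximated with plain stream functions: open both ends as streams and let PHP copy between them in fixed-size chunks. This is a sketch of my own, not part of the original article, and the function name is hypothetical:

```php
<?php
// A curl-free sketch: copy a remote stream to a local file in chunks,
// so memory use stays flat regardless of file size (hypothetical helper).
function download_streaming(string $url, string $dest): bool
{
    $in = fopen($url, 'r');
    if ($in === false) {
        return false;
    }
    $out = fopen($dest, 'w');
    if ($out === false) {
        fclose($in);
        return false;
    }
    // stream_copy_to_stream() moves data in buffered chunks rather than
    // loading the whole body into memory at once.
    $bytes = stream_copy_to_stream($in, $out);
    fclose($in);
    fclose($out);
    return $bytes !== false;
}
```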
With curl, the cleaner fix is to hand it a writable file stream via the CURLOPT_FILE option and let it write directly to disk; this requires creating a file pointer for it first.
<?php
// An example .xls file from Baidu Wenku
$url = 'http://bs.baidu.com/wenku4/%2Fe43e6732eba84a316af36c5c67a7c6d6?sign=MBOT:y1jXjmMD4FchJHFHIGN4z:lfZAx1Nrf44aCyD6tJqJ2FhosLY%3D&time=1392893977&response-content-disposition=attachment;%20filename=%22php%BA%AF%CA%FD.xls%22&response-content-type=application%2foctet-stream';
$fp_output = fopen('./test.xls', 'w');
$ch = curl_init($url);
// Write the response directly to the file stream instead of memory
curl_setopt($ch, CURLOPT_FILE, $fp_output);
curl_exec($ch);
curl_close($ch);
exec("libreoffice ./test.xls", $out, $status);
?>
This article has introduced three ways to download remote files in PHP from a performance perspective. I hope you find it useful.