PHP: How to Execute Parallel cURL Requests
When you need to retrieve data from multiple URLs efficiently, parallel cURL requests become a necessity. In this context, consider a scenario where you want to fetch JSON data from 15 different URLs, currently using file_get_contents($url) for each one.
Making the requests sequentially in a simple loop leads to significant slowdowns because file_get_contents is synchronous: the script waits for each request to finish before starting the next, so the total runtime is roughly the sum of all the individual request times.
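For contrast, the sequential version looks something like the sketch below (the URL list is hypothetical, standing in for the 15 endpoints in the scenario):

```php
<?php
// Sequential approach: each file_get_contents() call blocks until its
// response arrives, so total time is roughly the SUM of all request times.
$urls = array(
    'https://api.example.com/feed/1', // hypothetical endpoint
    'https://api.example.com/feed/2', // hypothetical endpoint
    // ... up to 15 URLs
);

$results = array();
foreach ($urls as $url) {
    // Nothing else happens while this request is in flight.
    $results[] = file_get_contents($url);
}
```

With 15 URLs at, say, 200 ms each, this loop takes about 3 seconds; the parallel version below takes roughly as long as the slowest single request.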
To overcome this, we can use PHP's curl_multi API, which drives many transfers concurrently from a single script. Here's how:
```php
$nodes = array($url1, $url2, $url3);
$node_count = count($nodes);

$curl_arr = array();
$master = curl_multi_init();

// Create one easy handle per URL and attach it to the multi handle.
for ($i = 0; $i < $node_count; $i++) {
    $url = $nodes[$i];
    $curl_arr[$i] = curl_init($url);
    curl_setopt($curl_arr[$i], CURLOPT_RETURNTRANSFER, true);
    curl_multi_add_handle($master, $curl_arr[$i]);
}

// Drive all transfers until every one has completed.
do {
    curl_multi_exec($master, $running);
    curl_multi_select($master); // wait for activity instead of busy-looping
} while ($running > 0);

// Collect the response bodies, then clean up the handles.
$results = array();
for ($i = 0; $i < $node_count; $i++) {
    $results[] = curl_multi_getcontent($curl_arr[$i]);
    curl_multi_remove_handle($master, $curl_arr[$i]);
    curl_close($curl_arr[$i]);
}
curl_multi_close($master);

print_r($results);
```
This script creates a multi-cURL context and adds each URL to it as a separate easy handle (CURLOPT_RETURNTRANSFER makes each handle return its response body instead of printing it). The curl_multi_exec loop then drives all transfers concurrently, with curl_multi_select blocking until there is network activity rather than spinning the CPU. Once every transfer has finished, curl_multi_getcontent collects each response, the handles are cleaned up, and the retrieved content from all the URLs is printed.
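Since the endpoints in this scenario return JSON, each entry in $results still needs to be decoded before use. A minimal sketch, using sample strings in place of real response bodies:

```php
<?php
// Each element of $results is a raw JSON body, as returned by
// curl_multi_getcontent(). Sample data stands in for real responses here.
$results = array(
    '{"id":1,"name":"alpha"}',
    '{"id":2,"name":"beta"}',
);

$decoded = array();
foreach ($results as $json) {
    $data = json_decode($json, true); // true => associative arrays
    if ($data !== null) {
        $decoded[] = $data; // skip bodies that failed to parse
    }
}

print_r($decoded);
```

Checking for null from json_decode matters in practice: a timed-out or failed transfer yields an empty or partial body, and silently indexing into it would mask the error.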
By running the requests in parallel, the total time drops from the sum of all request times to roughly the duration of the slowest single request, which is a substantial win when fetching 15 URLs.
The above is the detailed content of How Can I Execute Parallel cURL Requests to Retrieve Data from Multiple URLs Efficiently in PHP?. For more information, please follow other related articles on the PHP Chinese website!