This article discusses two concrete implementation methods and makes a simple performance comparison between them.
1. Classic cURL concurrency mechanism and its existing problems
The classic cURL implementation mechanism is easy to find online. For example, refer to the following implementation method in the PHP online manual:
function classic_curl($urls, $delay) {
    $queue = curl_multi_init();
    $map = array();
    foreach ($urls as $url) {
        // create a cURL resource
        $ch = curl_init();
        // set the URL and other appropriate options
        curl_setopt($ch, CURLOPT_URL, $url);
        curl_setopt($ch, CURLOPT_TIMEOUT, 1);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
        curl_setopt($ch, CURLOPT_HEADER, 0);
        curl_setopt($ch, CURLOPT_NOSIGNAL, true);
        // add the handle to the multi queue
        curl_multi_add_handle($queue, $ch);
        $map[$url] = $ch;
    }
    $active = null;
    // execute the handles
    do {
        $mrc = curl_multi_exec($queue, $active);
    } while ($mrc == CURLM_CALL_MULTI_PERFORM);
    while ($active > 0 && $mrc == CURLM_OK) {
        if (curl_multi_select($queue, 0.5) != -1) {
            do {
                $mrc = curl_multi_exec($queue, $active);
            } while ($mrc == CURLM_CALL_MULTI_PERFORM);
        }
    }
    // all requests have finished: collect and process the results
    $responses = array();
    foreach ($map as $url => $ch) {
        $responses[$url] = callback(curl_multi_getcontent($ch), $delay);
        curl_multi_remove_handle($queue, $ch);
        curl_close($ch);
    }
    curl_multi_close($queue);
    return $responses;
}
All URLs are first pushed into the concurrent queue, the concurrent requests are then executed, and only after every request has returned does parsing and other follow-up processing begin. In practice, network transfer times vary, so the content of some URLs comes back well before others, yet classic cURL concurrency cannot start processing until the slowest URL has returned. Waiting means CPU idleness and waste: with a short URL queue this is still acceptable, but with a long queue the idle time becomes unacceptable.
2. Improved Rolling cURL concurrency method
A careful analysis shows that classic cURL concurrency still leaves room for optimization: as soon as any single URL request completes, process it as quickly as possible, and keep waiting for the other URLs while that processing runs, instead of waiting for the slowest request before starting any work. This avoids the CPU idleness and waste described above. Without further ado, here is the concrete implementation:
function rolling_curl($urls, $delay) {
    $queue = curl_multi_init();
    $map = array();
    foreach ($urls as $url) {
        $ch = curl_init();
        curl_setopt($ch, CURLOPT_URL, $url);
        curl_setopt($ch, CURLOPT_TIMEOUT, 1);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
        curl_setopt($ch, CURLOPT_HEADER, 0);
        curl_setopt($ch, CURLOPT_NOSIGNAL, true);
        curl_multi_add_handle($queue, $ch);
        // index handles by their string id so completed handles can be mapped back to URLs
        $map[(string) $ch] = $url;
    }
    $responses = array();
    do {
        while (($code = curl_multi_exec($queue, $active)) == CURLM_CALL_MULTI_PERFORM);
        if ($code != CURLM_OK) { break; }
        // a request was just completed -- find out which one
        while ($done = curl_multi_info_read($queue)) {
            // get the info and content returned for the request
            $info = curl_getinfo($done['handle']);
            $error = curl_error($done['handle']);
            $results = callback(curl_multi_getcontent($done['handle']), $delay);
            $responses[$map[(string) $done['handle']]] = compact('info', 'error', 'results');
            // remove the cURL handle that just completed
            curl_multi_remove_handle($queue, $done['handle']);
            curl_close($done['handle']);
        }
        // block until there is activity on any handle; error handling is done by curl_multi_exec
        if ($active > 0) {
            curl_multi_select($queue, 0.5);
        }
    } while ($active);
    curl_multi_close($queue);
    return $responses;
}
3. Performance comparison of two concurrent implementations
The performance comparison between the original and the improved implementation was run on a Linux host. The concurrent queue used during the test is as follows:
http://item.taobao.com/item.htm?id=14392877692
http://item.taobao.com/item.htm?id=16231676302
http://item.taobao.com/item.htm?id=17037160462
http://item.taobao.com/item.htm?id=5522416710
http://item.taobao.com/item.htm?id=16551116403
http://item.taobao.com/item.htm?id=14088310973
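In code, this test queue is simply an array of URL strings; a sketch assembled from the list above, used by the snippets that follow, could look like this:

// the test queue as a PHP array (taken from the URLs listed above)
$urls = array(
    'http://item.taobao.com/item.htm?id=14392877692',
    'http://item.taobao.com/item.htm?id=16231676302',
    'http://item.taobao.com/item.htm?id=17037160462',
    'http://item.taobao.com/item.htm?id=5522416710',
    'http://item.taobao.com/item.htm?id=16551116403',
    'http://item.taobao.com/item.htm?id=14088310973',
);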
A brief note on the experimental design and the format of the results: to keep the figures reliable, each group of experiments is repeated 20 times. In a single run, both concurrency mechanisms are given the same set of interface URLs and their elapsed time is measured in seconds: Classic refers to the classic mechanism and Rolling to the improved one. The faster of the two is the winner (Winner), and the time saved (Excellence, in seconds) and the performance improvement ratio (Excel. %) are calculated from the difference. To stay close to a real request while keeping the experiment simple, the returned content is only run through a simple regular expression match, with no other complex processing. In addition, to gauge how the result-processing callback affects the comparison, usleep is used to simulate the heavier data-processing logic found in practice (extraction, word segmentation, writing to files or databases, and so on).
The callback function used in the performance test is:
function callback($data, $delay) {
    // simple regular expression match on the returned page (here the <title> tag)
    preg_match_all('/<title>(.+)<\/title>/iU', $data, $matches);
    // simulate heavier result processing; $delay is in microseconds
    usleep($delay);
    return compact('data', 'matches');
}
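For reference, a minimal timing harness along the following lines can drive the comparison. It is only a sketch: it assumes the $urls array shown earlier, both classic_curl() and rolling_curl(), and the callback above are already defined, and the run count of 20 matches the experiment design described above.

// minimal benchmark sketch: time both implementations over repeated runs
$delay = 5000;   // callback delay in microseconds (5 ms); use 0 for the no-delay case
$runs  = 20;     // each experiment group is repeated 20 times
$total = array('classic' => 0.0, 'rolling' => 0.0);

for ($i = 0; $i < $runs; $i++) {
    $start = microtime(true);
    classic_curl($urls, $delay);
    $total['classic'] += microtime(true) - $start;

    $start = microtime(true);
    rolling_curl($urls, $delay);
    $total['rolling'] += microtime(true) - $start;
}

printf("Classic: %.3fs  Rolling: %.3fs  Excel.: %.1f%%\n",
    $total['classic'] / $runs,
    $total['rolling'] / $runs,
    100 * ($total['classic'] - $total['rolling']) / $total['classic']);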
When the data processing callback has no delay: Rolling cURL is slightly better, but the improvement is not obvious.
When the data processing callback has a 5 millisecond delay: Rolling cURL wins, with a performance improvement of about 40%.
The comparison above suggests that Rolling cURL is the better choice for scenarios that process a queue of URLs concurrently. When the amount of concurrency is very large (1000+ URLs), the maximum length of the concurrent queue can be capped, for example at 20: whenever one URL returns and is processed, a URL that has not yet been requested is immediately added to the queue, as sketched below. Code written this way is more robust and will not stall or crash when the concurrency gets too high. For a detailed implementation, please refer to: http://code.google.com/p/rolling-curl/
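A minimal sketch of that bounded-queue variant, built on the rolling_curl() code above, might look as follows. The function name rolling_curl_windowed, the $window parameter and its default of 20 are illustrative assumptions, not part of the rolling-curl library linked above.

// sketch of a bounded rolling window on top of the rolling approach shown earlier
function rolling_curl_windowed($urls, $delay, $window = 20) {
    $queue = curl_multi_init();
    $map = array();
    $next = 0; // index of the next URL waiting to be enqueued

    // helper that adds a single URL to the multi handle
    $add = function ($url) use ($queue, &$map) {
        $ch = curl_init();
        curl_setopt($ch, CURLOPT_URL, $url);
        curl_setopt($ch, CURLOPT_TIMEOUT, 1);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
        curl_setopt($ch, CURLOPT_HEADER, 0);
        curl_setopt($ch, CURLOPT_NOSIGNAL, true);
        curl_multi_add_handle($queue, $ch);
        $map[(string) $ch] = $url;
    };

    // fill the initial window
    while ($next < count($urls) && $next < $window) {
        $add($urls[$next++]);
    }

    $responses = array();
    do {
        while (($code = curl_multi_exec($queue, $active)) == CURLM_CALL_MULTI_PERFORM);
        if ($code != CURLM_OK) { break; }
        while ($done = curl_multi_info_read($queue)) {
            $info = curl_getinfo($done['handle']);
            $error = curl_error($done['handle']);
            $results = callback(curl_multi_getcontent($done['handle']), $delay);
            $responses[$map[(string) $done['handle']]] = compact('info', 'error', 'results');
            curl_multi_remove_handle($queue, $done['handle']);
            curl_close($done['handle']);
            // a slot just freed up: immediately enqueue the next pending URL, if any
            if ($next < count($urls)) {
                $add($urls[$next++]);
                // kick off the new transfer so that $active reflects it
                while (curl_multi_exec($queue, $active) == CURLM_CALL_MULTI_PERFORM);
            }
        }
        if ($active > 0) {
            curl_multi_select($queue, 0.5);
        }
    } while ($active);
    curl_multi_close($queue);
    return $responses;
}

The window size trades open connections and memory against throughput; for a complete, production-ready treatment of this bookkeeping, see the rolling-curl project linked above.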
