
How to optimize file writing speed

WBOY (Original)
2016-10-11 14:23:44 · 1606 views

How can file writing speed be optimized? Because the shard (sub-database/sub-table) location is computed per row, every write targets a different file, and with 1 million records this per-row file IO makes the program very slow.
Please look at the code and suggest optimizations.

<code>$inputfile = "logs.txt"; // input log file: fields separated by \t, lines terminated by \n
$fp = fopen($inputfile, 'r');
if(!is_resource($fp)){
    echo "Failed to open file ".$inputfile."; aborting log analysis";
    exit(1);
}
$success = 0;
$failure = 0;
while(!feof($fp)){
    $row = fgets($fp, 4096);
    $row = str_replace(array("\r","\n"), array("",""), $row); // strip line breaks from the row
    if($row == ''){
        continue;
    }
    $arrRec = explode("\t", $row); // split the line into fields on \t
    $accountid = $arrRec[6]; // account id
    /**
     * Based on $accountid in each row, a lower-level function computes
     * the shard location, e.g. database group groupid = 5 and
     * table name table_0001.
     */
    $groupid = 5;
    $tableid = "table_0001";
    // Build the output file path here; it changes on every row read,
    // following the value computed from accountid.
    $outputfile = "tmp/".$groupid."/".$tableid;

    $content = "\t".$arrRec[0]."\t".$accountid."\t".$groupid."\n"; // line to write to the file
    if(file_put_contents($outputfile, $content, FILE_APPEND)){
        $success++;
    }else{
        $failure++;
    }
}
fclose($fp);</code>

Only one record is written per call, and writing 1 million records takes about 30 minutes, which is far too slow. How can it be optimized? In testing, when the target file path stays constant, concatenating everything to be written into one long string and calling file_put_contents once after the while loop ends is much faster. The problem is that here the target path changes from write to write. Please suggest an optimization.
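Since every row that maps to the same shard lands in the same file, one way around the changing path is to group rows by their computed output path in memory and write each file exactly once after the loop. Below is a minimal sketch of that idea; the hardcoded $groupid/$tableid values stand in for the real shard lookup, just as in the code above, and it assumes the buffered content fits comfortably in memory.

<code>$inputfile = "logs.txt";
$fp = fopen($inputfile, 'r');
if(!is_resource($fp)){
    echo "Failed to open file ".$inputfile;
    exit(1);
}
$buffers = array(); // output path => accumulated content
while(($row = fgets($fp, 4096)) !== false){
    $row = str_replace(array("\r","\n"), array("",""), $row);
    if($row == ''){
        continue;
    }
    $arrRec = explode("\t", $row);
    $accountid = $arrRec[6];
    // Stand-in for the real shard computation based on $accountid.
    $groupid = 5;
    $tableid = "table_0001";
    $outputfile = "tmp/".$groupid."/".$tableid;
    // Append to the in-memory buffer instead of touching the disk per row.
    if(!isset($buffers[$outputfile])){
        $buffers[$outputfile] = '';
    }
    $buffers[$outputfile] .= "\t".$arrRec[0]."\t".$accountid."\t".$groupid."\n";
}
fclose($fp);
// One file_put_contents call per distinct output path.
foreach($buffers as $path => $content){
    file_put_contents($path, $content, FILE_APPEND);
}</code>

If a million buffered rows would exceed the memory limit, a variant is to open each distinct path once (fopen in append mode, kept in an array keyed by path), fwrite each row to the matching handle, and fclose them all at the end; that still avoids the open/write/close cycle that file_put_contents performs on every call.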

Replies:


File writing speed has little to do with the program itself; the bottleneck is hardware performance.

Switch to an SSD.
