mysql - ThinkPHP 3.2: batch importing tens of thousands of rows from a CSV times out. Is there a good way to handle this?

$filename = $_FILES['data']['tmp_name'];
$handle = fopen($filename, 'r');
$result = input_csv($handle); // parse the CSV into an array
$length = count($result);

for ($i = 0; $i < $length; $i++) {
    $ip = $result[$i][0];
    $port = $result[$i][1];
    //...
    $data = array(
        "ip" => $ip,
        "port" => $port,
        //...
    );
    // (ip, port) must be unique: update the row if it exists, otherwise insert it
    $where = array("ip" => $ip, "port" => $port);
    $count = $Property->where($where)->find();
    if ($count) {
        $query = $Property->where($where)->save($data);
    } else {
        $query = $Property->add($data);
    }
}
fclose($handle); // close the file handle, not the parsed array
$this->success('Import successful!');
exit();

I upload the CSV file, but an error is reported partway through execution. Please give me some advice.
ps: The (ip, port) pair must be unique: if a row already exists it should be overwritten, otherwise it should be added.

大家讲道理 · asked 2834 days ago

Replies (1)

  • 仅有的幸福 · 2017-05-25 15:10:12

    There are two approaches: one is to raise the script's timeout limit, the other is to assemble the rows into SQL and hand the whole batch to the database to execute. I recommend the second one, which performs much better.
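    A minimal sketch of both approaches, assuming the table is named property, has a unique index on (ip, port), and that $result is the array parsed from the CSV as in the question; the chunk size, escaping, and table/model names are illustrative assumptions, not from the original post:

    // approach 1: lift the PHP execution time limit for this request
    set_time_limit(0);

    // approach 2: build multi-row INSERTs and let MySQL do the upsert.
    // Assumes a unique index exists, e.g.:
    //   ALTER TABLE property ADD UNIQUE KEY uk_ip_port (ip, port);
    $Property = M('Property');           // hypothetical model name
    $values = array();
    foreach ($result as $row) {
        $ip   = addslashes($row[0]);     // use bound parameters / real escaping in production
        $port = intval($row[1]);
        // the other CSV columns (the "..." in the question) would be added here too
        $values[] = "('{$ip}', {$port})";

        // flush every 1000 rows to keep each statement a reasonable size
        if (count($values) >= 1000) {
            $sql = "INSERT INTO property (ip, port) VALUES " . implode(',', $values)
                 . " ON DUPLICATE KEY UPDATE port = VALUES(port)"; // list the non-key columns here
            $Property->execute($sql);
            $values = array();
        }
    }
    if ($values) {
        $sql = "INSERT INTO property (ip, port) VALUES " . implode(',', $values)
             . " ON DUPLICATE KEY UPDATE port = VALUES(port)";
        $Property->execute($sql);
    }

    Chunking into batches of 1000 rows keeps each statement small; doing one round trip per chunk instead of two queries per row (find, then save/add) is where the speedup comes from.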
