PHP method to quickly import large amounts of data into the database
The first method: use a separate insert into for every row. The code is as follows:
$params = array('value' => '50');
set_time_limit(0);
echo date("H:i:s");
for ($i = 0; $i < 2000000; $i++) {
    $connect_mysql->insert($params);
}
echo date("H:i:s");
The output was 23:25:05 and 01:32:05, which means inserting the 2 million rows took more than 2 hours!
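The article's $connect_mysql wrapper is not shown, so here is a minimal PDO-based sketch of the same per-row approach; the DSN, credentials, and the twenty_million(value) table (borrowed from the later examples) are assumptions:

<?php
// A sketch of method 1 with PDO: one INSERT per row. Each statement runs in
// its own implicit transaction, which is why this is by far the slowest way.
set_time_limit(0);

$pdo = new PDO('mysql:host=127.0.0.1;dbname=test;charset=utf8mb4', 'user', 'password', [
    PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
]);

echo date('H:i:s'), PHP_EOL;

$stmt = $pdo->prepare('INSERT INTO twenty_million (value) VALUES (?)');
for ($i = 0; $i < 2000000; $i++) {
    $stmt->execute(['50']);   // one round trip and one implicit commit per row
}

echo date('H:i:s'), PHP_EOL;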
The second method: use transactions, committing in batches. Inserting the same data while committing every 100,000 rows took from 22:56:13 to 23:04:00, roughly 8 minutes in total. The code is as follows:
echo date("H:i:s");
$connect_mysql->query('BEGIN');
$params = array('value' => '50');
for ($i = 0; $i < 2000000; $i++) {
    $connect_mysql->insert($params);
    if ($i % 100000 == 0) {
        $connect_mysql->query('COMMIT');
        $connect_mysql->query('BEGIN');
    }
}
$connect_mysql->query('COMMIT');
echo date("H:i:s");
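The same batching idea can be written with PDO's transaction API instead of raw BEGIN/COMMIT queries. This is only a sketch: the connection details and the twenty_million(value) table are assumptions, and the batch size of 100,000 mirrors the article's choice:

<?php
// A sketch of method 2 with PDO transactions: group rows so the server only
// flushes a transaction every 100,000 inserts instead of on every row.
set_time_limit(0);

$pdo = new PDO('mysql:host=127.0.0.1;dbname=test;charset=utf8mb4', 'user', 'password', [
    PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
]);

echo date('H:i:s'), PHP_EOL;

$stmt = $pdo->prepare('INSERT INTO twenty_million (value) VALUES (?)');
$pdo->beginTransaction();
for ($i = 1; $i <= 2000000; $i++) {
    $stmt->execute(['50']);
    if ($i % 100000 === 0) {          // commit every 100,000 rows
        $pdo->commit();
        $pdo->beginTransaction();
    }
}
$pdo->commit();                        // flush whatever remains in the last batch

echo date('H:i:s'), PHP_EOL;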
The third method: Use optimized SQL statements
Concatenate the SQL statement, using insert into table (field) values (...),(...),(...),(...),
then execute the insert in one go. If the resulting string is too long, MySQL needs to be configured accordingly.
Run in the mysql command line:
set global max_allowed_packet = 2*1024*1024*10;
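Note that SET GLOBAL only affects new connections and is lost when the server restarts. To make the setting permanent, the equivalent value (2*1024*1024*10 bytes = 20M) can go in the MySQL configuration file; the file path and exact section depend on your installation:

[mysqld]
max_allowed_packet = 20M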
The time consumed was 11:24:06 to 11:25:06:
it took only 1 minute to insert 2 million rows of test data! The code is as follows:
$sql = "insert into twenty_million (value) values";
for ($i = 0; $i < 2000000; $i++) {
    $sql .= "('50'),";
}
$sql = substr($sql, 0, strlen($sql) - 1);
$connect_mysql->query($sql);
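A variation worth considering is sending the multi-value INSERT in chunks instead of one enormous string, which keeps each statement comfortably below max_allowed_packet. This is a sketch with PDO under the same assumptions as above (placeholder connection details, twenty_million(value) table, batch size chosen arbitrarily at 10,000):

<?php
// A chunked variant of method 3: build "INSERT ... VALUES ('50'),('50'),..."
// for a batch of rows at a time and execute it, repeating until all rows are in.
set_time_limit(0);

$pdo = new PDO('mysql:host=127.0.0.1;dbname=test;charset=utf8mb4', 'user', 'password', [
    PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
]);

echo date('H:i:s'), PHP_EOL;

$total     = 2000000;
$batchSize = 10000;

for ($offset = 0; $offset < $total; $offset += $batchSize) {
    $rows   = min($batchSize, $total - $offset);
    // Repeat the value tuple for this batch and strip the trailing comma.
    $values = rtrim(str_repeat("('50'),", $rows), ',');
    $pdo->exec("INSERT INTO twenty_million (value) VALUES $values");
}

echo date('H:i:s'), PHP_EOL;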
Summary: when inserting data in large batches, the first method is by far the slowest, the second method is the most widely used in practice, and the third method is well suited to inserting test data or other undemanding workloads, and it really is fast.