
Three methods of bulk insertion into the database with PHP, and a speed comparison

WBOY | Original | 2016-07-13 10:23:12


The first method: issue a plain INSERT INTO for each row. The code is as follows:

$params = array('value' => '50');
set_time_limit(0);
echo date("H:i:s");
for ($i = 0; $i < 2000000; $i++) {
    // One INSERT statement per row.
    $connect_mysql->insert($params);
}
echo date("H:i:s");

The output was 23:25:05 and 01:32:05: inserting the 2,000,000 rows took more than 2 hours!
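The examples in this article rely on a $connect_mysql wrapper object that is never shown. A minimal sketch of what such a wrapper might look like, built on PDO and hard-wired to the twenty_million test table used later (the class name, credentials and table are assumptions, not part of the original code):

// Minimal sketch of the assumed $connect_mysql wrapper (hypothetical).
class SimpleMysql
{
    private $pdo;

    public function __construct()
    {
        $this->pdo = new PDO(
            'mysql:host=127.0.0.1;dbname=test;charset=utf8mb4',
            'user',
            'password',
            array(PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION)
        );
    }

    // Insert a single row into the twenty_million table.
    public function insert(array $params)
    {
        $stmt = $this->pdo->prepare('INSERT INTO twenty_million (value) VALUES (?)');
        $stmt->execute(array($params['value']));
    }

    // Run a raw SQL string (used for BEGIN/COMMIT and the spliced insert).
    public function query($sql)
    {
        return $this->pdo->exec($sql);
    }
}

$connect_mysql = new SimpleMysql();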

The second method: wrap the inserts in a transaction and commit in batches (one commit every 100,000 rows). The timestamps were 22:56:13 and 23:04:00, roughly 8 minutes in total. The code is as follows:

echo date("H:i:s");
$connect_mysql->query('BEGIN');
$params = array('value' => '50');
for ($i = 0; $i < 2000000; $i++) {
    $connect_mysql->insert($params);
    // Commit every 100,000 rows and start a new transaction.
    if ($i % 100000 == 0) {
        $connect_mysql->query('COMMIT');
        $connect_mysql->query('BEGIN');
    }
}
$connect_mysql->query('COMMIT');
echo date("H:i:s");
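As a side note on the design: issuing BEGIN and COMMIT as raw queries works, but if the underlying connection is PDO, its native transaction API is the more idiomatic choice. A rough sketch of the same batching loop, assuming a plain PDO handle $pdo and the twenty_million table used later:

$stmt = $pdo->prepare('INSERT INTO twenty_million (value) VALUES (?)');
$pdo->beginTransaction();
for ($i = 0; $i < 2000000; $i++) {
    $stmt->execute(array('50'));
    // Commit every 100,000 rows and open a new transaction.
    if ($i > 0 && $i % 100000 == 0) {
        $pdo->commit();
        $pdo->beginTransaction();
    }
}
$pdo->commit();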

The third method: build one optimized SQL statement by splicing all the rows together, i.e. insert into table (...) values (),(),(),... and send them in a single query. If the resulting string is too long, MySQL must be reconfigured first; run the following on the mysql command line: set global max_allowed_packet = 2*1024*1024*10; The timestamps were 11:24:06 and 11:25:06, meaning it took only about 1 minute to insert the 2,000,000 test rows! The code is as follows:

$sql = "insert into twenty_million (value) values";
for ($i = 0; $i < 2000000; $i++) {
    $sql .= "('50'),";
}
// Strip the trailing comma before sending the statement.
$sql = substr($sql, 0, strlen($sql) - 1);
$connect_mysql->query($sql);
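Splicing all 2,000,000 rows into one string only works when max_allowed_packet is large enough. A more defensive variant of the same idea, assuming the same twenty_million table and the $connect_mysql->query() method above, sends the rows in fixed-size chunks so that each statement stays well below the packet limit:

$batchSize = 10000; // rows per INSERT statement; keeps every query comfortably small
$values = array();
for ($i = 0; $i < 2000000; $i++) {
    $values[] = "('50')";
    if (count($values) == $batchSize) {
        $connect_mysql->query('insert into twenty_million (value) values ' . implode(',', $values));
        $values = array();
    }
}
// Flush any remaining rows that did not fill a whole batch.
if (!empty($values)) {
    $connect_mysql->query('insert into twenty_million (value) values ' . implode(',', $values));
}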

In conclusion, the first method is clearly the worst choice for inserting data in large batches, while the second is the one most widely used in practice. The third method is genuinely fast, but is best suited to inserting test data or other cases where the requirements are low.

The fastest way for PHP to insert large amounts of data into a MySQL database

Inserting 1,000 rows in a single statement is many times faster than inserting them one at a time. The trick is simply in how the SQL is written:

insert into table1 values (v1, v2, v3), (x1, x2, x3), ...

instead of inserting the rows one by one like this:

insert into table1 values (v1, v2, v3);
insert into table1 values (x1, x2, x3);

(A prepared-statement version of the multi-row insert is sketched below, after the reference.)
Reference: Niu Ren Xiaohai
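If the values come from user input rather than a constant, the same multi-row idea can be combined with a prepared statement so that every value is still bound safely. This is only a sketch under assumptions: a PDO connection $pdo and a hypothetical table1 with columns c1, c2, c3.

// Hypothetical example: multi-row insert with bound parameters.
$rows = array(
    array('v1', 'v2', 'v3'),
    array('x1', 'x2', 'x3'),
    // ... up to roughly 1,000 rows per statement
);
// Build one '(?,?,?)' group per row.
$placeholders = implode(',', array_fill(0, count($rows), '(?,?,?)'));
$stmt = $pdo->prepare('INSERT INTO table1 (c1, c2, c3) VALUES ' . $placeholders);
// Flatten the rows into a single parameter list.
$params = array();
foreach ($rows as $row) {
    foreach ($row as $value) {
        $params[] = $value;
    }
}
$stmt->execute($params);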

PHP and MySQL: how do you import large batches of Excel data into the database? I have recorded data this way before, by first converting it into CSV.

About the CSV format: if a field contains commas, enclose the whole field in quotation marks (see the Baidu Encyclopedia entry on CSV for details). For example:

Field 1,"Field 2 with,",Field 3

In fact, the key is in how PHP reads the CSV. CSV does not have to use commas as the delimiter; semicolons and other symbols also work, as long as the corresponding PHP reading code is changed to match (see the sketch below).
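For reference, PHP's built-in fgetcsv() already handles quoted fields and takes the delimiter as an argument, so switching from commas to semicolons is a one-character change. A minimal sketch, assuming a hypothetical data.csv file read row by row:

$handle = fopen('data.csv', 'r');
// The third argument is the delimiter: ',' for standard CSV, ';' for semicolon-separated files.
while (($row = fgetcsv($handle, 0, ';')) !== false) {
    // $row is an array of fields; quoted fields containing the
    // delimiter have already been unescaped by fgetcsv().
    // Collect the rows here, e.g. into a multi-row INSERT as shown above.
}
fclose($handle);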
