Using MySQL for bulk data import in ThinkPHP6
As data volumes grow, many companies need to import large amounts of data into their databases, and for developers, doing this efficiently is a question worth exploring. In this article, we introduce how to use MySQL for bulk data import in the ThinkPHP6 framework.
Before starting the import, we need to prepare the data. Data can come from formats such as CSV or Excel, or be generated directly in code. In this article we will generate the data in code.
First, create a `user` table:
```sql
CREATE TABLE `user` (
  `id` int(11) NOT NULL AUTO_INCREMENT,
  `name` varchar(255) DEFAULT NULL,
  `age` int(11) DEFAULT NULL,
  `created_at` timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP,
  `updated_at` timestamp NOT NULL DEFAULT CURRENT_TIMESTAMP ON UPDATE CURRENT_TIMESTAMP,
  PRIMARY KEY (`id`)
) ENGINE=InnoDB AUTO_INCREMENT=1 DEFAULT CHARSET=utf8mb4;
```
Then write the following code to generate 1,000 rows of data:
```php
use think\facade\Db;

// Generate test data
$data = [];
for ($i = 1; $i <= 1000; $i++) {
    $data[] = [
        'name' => 'User' . $i,
        'age'  => mt_rand(18, 60),
    ];
}

// Bulk-insert the data
Db::name('user')->insertAll($data);
```
MySQL provides a very convenient statement, `LOAD DATA`, which imports data from a file into a table. We just need to save the data to a CSV file and then use the `LOAD DATA` statement to import it into the table.
First, save all the data to a CSV file. In this example, we save it to `user.csv`:
```php
use think\facade\Db;

$data = Db::name('user')->select();

$fp = fopen('user.csv', 'w');
// Write each row to the CSV file
foreach ($data as $item) {
    fputcsv($fp, [$item['name'], $item['age']]);
}
fclose($fp);
```
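As an aside, `fputcsv` handles quoting and escaping automatically, which matters once field values contain commas or quotes. Here is a minimal self-contained sketch using an in-memory stream instead of a real file (sample values are made up for illustration; no database is required):

```php
// Write two sample rows to an in-memory stream instead of a file
$fp = fopen('php://temp', 'r+');
fputcsv($fp, ['User1', 25]);
fputcsv($fp, ['User, with comma', 42]); // the comma forces fputcsv to quote the field

// Read the generated CSV text back
rewind($fp);
$csv = stream_get_contents($fp);
fclose($fp);

echo $csv;
```

This prints `User1,25` on the first line and `"User, with comma",42` on the second, showing that quoting is applied only where needed.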
Then use the following code to import the data into the database:
```php
use think\facade\Db;

$filename = 'user.csv';
$sql = <<<EOF
LOAD DATA LOCAL INFILE '{$filename}' INTO TABLE `user`
CHARACTER SET utf8mb4
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\\n'
(`name`, `age`)
EOF;
Db::query($sql);
```
In the code above, we first specify the path of the file to import, then build the SQL statement and use `LOAD DATA` to import the data into the database. Note that before importing, you need to make sure MySQL's `local_infile` option is enabled.
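As a sketch, you can check and enable the option from a MySQL session (changing the global value requires sufficient privileges, and the client connection must also permit local infile loading):

```sql
-- Check whether local_infile is enabled on the server
SHOW GLOBAL VARIABLES LIKE 'local_infile';

-- Enable it (requires administrative privileges)
SET GLOBAL local_infile = 1;
```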
MySQL's `LOAD DATA` statement is convenient and fast, but it requires saving the data to a file and writing the SQL statement by hand, which is cumbersome. In ThinkPHP6, we can instead use the built-in batch-insert method to insert a large amount of data at once.
In the above example, we use the following code to insert data into the database:
```php
use think\facade\Db;

$data = Db::name('user')->select();
$result = Db::name('user')->insertAll($data);
```
In the code above, we first query all the data and then use the `insertAll` method to insert it in batches. Note that by default `insertAll` inserts at most 1,000 rows at a time; to insert more, you need to control the batch size yourself, for example via the `$limit` parameter of `insertAll`. The following code inserts at most 500 rows per batch:
```php
use think\facade\Db;

$data  = Db::name('user')->select();
$limit = 500;
$total = count($data);

// Insert the rows in batches of at most $limit
for ($i = 0; $i < $total; $i += $limit) {
    $result = Db::name('user')->insertAll(array_slice($data, $i, $limit));
}
```
In the code above, we use a loop to insert the data in batches of at most 500 rows each, until all rows have been inserted.
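The same batching logic can be written more concisely with `array_chunk`. Below is a self-contained sketch of the splitting step; the actual `insertAll` call is left as a comment because it assumes a configured ThinkPHP6 database connection:

```php
// Generate sample rows (a stand-in for Db::name('user')->select())
$data = [];
for ($i = 1; $i <= 1000; $i++) {
    $data[] = ['name' => 'User' . $i, 'age' => 30];
}

// Split the rows into batches of at most 500
$chunks = array_chunk($data, 500);

foreach ($chunks as $chunk) {
    // In a real ThinkPHP6 app, insert each batch:
    // Db::name('user')->insertAll($chunk);
}

echo count($chunks) . "\n";    // number of batches
echo count($chunks[0]) . "\n"; // rows in the first batch
```

With 1,000 rows and a batch size of 500, this produces 2 batches of 500 rows each.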
Summary:
When importing large batches of data in ThinkPHP6, you may encounter problems such as memory exhaustion and performance bottlenecks. However, MySQL's `LOAD DATA` statement, ThinkPHP6's built-in batch insert, and manual batched insertion can all effectively improve import efficiency. In actual development, choose the method that fits the situation to achieve the best import performance.
The above is the detailed content of Using MySQL for bulk data import in ThinkPHP6. For more information, please follow other related articles on the PHP Chinese website!