How to Split 100 Million Rows of Data into 100 Tables in a MySQL Database (PHP)
This article demonstrates how to split 100 million rows of data across 100 tables; see the code below for details.
When the volume of data grows sharply, a common optimization is to shard data across databases and tables by hashing a key, which speeds up both reads and writes. I made a simple attempt at this, splitting 100 million rows into 100 tables. The implementation process is as follows:
First create 100 tables:
$i = 0;
while ($i <= 99) {
    // Tables are named code_0 through code_99.
    $sql = "CREATE TABLE `code_".$i."` (
        `full_code` char(10) NOT NULL,
        `create_time` int(10) unsigned NOT NULL,
        PRIMARY KEY (`full_code`)
    ) ENGINE=MyISAM DEFAULT CHARSET=utf8";
    mysql_query($sql);
    $i++;
}
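Note that the mysql_* extension used above is deprecated and was removed in PHP 7. Below is a minimal sketch of the same loop using mysqli; the connection parameters ('localhost', 'user', 'password', 'test') are hypothetical placeholders, not values from the original article.

// Minimal mysqli sketch of the table-creation loop (connection
// parameters are placeholders for illustration).
$mysqli = new mysqli('localhost', 'user', 'password', 'test');
if ($mysqli->connect_error) {
    die('Connect failed: ' . $mysqli->connect_error);
}
for ($i = 0; $i <= 99; $i++) {
    $sql = "CREATE TABLE IF NOT EXISTS `code_{$i}` (
                `full_code` char(10) NOT NULL,
                `create_time` int(10) unsigned NOT NULL,
                PRIMARY KEY (`full_code`)
            ) ENGINE=MyISAM DEFAULT CHARSET=utf8";
    if (!$mysqli->query($sql)) {
        echo "Failed to create code_{$i}: " . $mysqli->error . "\n";
    }
}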
Now for my table-splitting rule: full_code is used as the primary key, and we hash full_code to determine which table a row belongs to. The hash function is as follows:
$table_name = get_hash_table('code', $full_code);

function get_hash_table($table, $code, $s = 100) {
    // crc32() can return a negative integer on 32-bit systems;
    // formatting with "%u" yields the unsigned value.
    $hash = sprintf("%u", crc32($code));
    // Reduce the hash modulo the number of tables ($s, 100 by default).
    $hash1 = intval(fmod($hash, $s));
    return $table."_".$hash1;
}
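A quick usage check shows how codes spread across the tables. The sample code values here are made up, and the resulting table suffixes depend entirely on their CRC32 values:

// Illustrative only: sample codes are hypothetical, and the suffix
// printed for each depends on crc32() of that code.
foreach (array('abc1234567', 'def7654321', 'ghi0000001') as $c) {
    echo $c . ' => ' . get_hash_table('code', $c) . "\n";  // e.g. "code_37"
}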
Before inserting a row, call get_hash_table to determine which table the data should be stored in.
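For example, an insertion might look like the following sketch. It assumes the mysqli connection $mysqli from the earlier sketch, and the code value is hypothetical; the original article does not show this step.

// Minimal insertion sketch (assumes the $mysqli connection above).
$full_code = 'abc1234567';                        // hypothetical 10-char code
$table_name = get_hash_table('code', $full_code); // e.g. "code_37"
$create_time = time();
// $table_name comes from our own hash function, so interpolating it
// into the SQL is safe; the values are bound as parameters.
$stmt = $mysqli->prepare("INSERT INTO `{$table_name}` (full_code, create_time) VALUES (?, ?)");
$stmt->bind_param('si', $full_code, $create_time);
$stmt->execute();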
Finally, we use the MERGE storage engine to assemble a complete code table:
CREATE TABLE IF NOT EXISTS `code` (
    `full_code` char(10) NOT NULL,
    `create_time` int(10) unsigned NOT NULL,
    INDEX(full_code)
) ENGINE=MERGE UNION=(code_0,code_1,code_2,...,code_99) INSERT_METHOD=LAST;
In this way, a single SELECT * FROM code retrieves the full_code data from all 100 underlying tables.
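Because the hash is deterministic, a point lookup on full_code can also skip the MERGE table and query the specific shard directly, which is the main performance benefit of this scheme. A sketch, again assuming the $mysqli connection and a hypothetical code value:

// Point lookup routed directly to the shard holding $full_code
// (assumes the $mysqli connection from the earlier sketches).
$full_code = 'abc1234567';                    // hypothetical code
$table = get_hash_table('code', $full_code);  // resolves the shard table
$stmt = $mysqli->prepare("SELECT full_code, create_time FROM `{$table}` WHERE full_code = ?");
$stmt->bind_param('s', $full_code);
$stmt->execute();
$stmt->bind_result($code, $created);
while ($stmt->fetch()) {
    echo $code . ' created at ' . $created . "\n";
}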
That is the entire content of this article. I hope it is helpful to everyone.