one. Preface
How do we usually implement paging?
SELECT * FROM table ORDER BY id LIMIT 1000, 10;
But what happens when the amount of data increases sharply?
SELECT * FROM table ORDER BY id LIMIT 1000000, 10;
The second query above is painfully slow; at that offset it can take forever to return.
The root cause lies in MySQL's query mechanism:
it does not skip first and then query;
it queries first and then skips. (Explained below.)
What does this mean? Take LIMIT 100000,10 as an example: to return the 10 rows it needs, MySQL walks through the first 100,000 rows, reading their column data and discarding each one as useless, until it finally reaches the 10 required rows.
two. Analysis
With LIMIT offset,N, efficiency is extremely low when the offset is very large.
The reason is that MySQL does not skip offset rows and then read just N rows;
it reads offset+N rows, discards the first offset rows, and returns N rows [the same query-first-skip-later behavior described above].
The larger the offset, the lower the efficiency.
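You can watch this happen yourself. A minimal sketch, assuming an InnoDB table t with primary key id (t is a hypothetical stand-in for your own table): the Handler_read status counters record how many rows MySQL actually touched.
mysql> FLUSH STATUS;   -- reset this session's status counters
mysql> SELECT * FROM t ORDER BY id LIMIT 1000000, 10;
mysql> SHOW SESSION STATUS LIKE 'Handler_read%';
-- expect the Handler_read_first/Handler_read_next totals to be on the order
-- of offset+N (roughly 1,000,010 here), not the 10 rows actually returned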
three. Three optimization suggestions
1: Solve it from a business perspective.
Method: do not allow users to page past, say, page 100.
Take Baidu as an example: its paging generally stops at around page 70.
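A minimal sketch of the idea, assuming 10 rows per page and a hypothetical cap of 100 pages enforced by the application; the deepest query the database can then see stays cheap:
mysql> SELECT * FROM t ORDER BY id LIMIT 990, 10;  -- page 100, the deepest page allowed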
2: Drop the offset and query with a WHERE condition instead.
Example: mysql> select id,name from lx_com limit 5000000,10;
+---------+--------------------------------------------+
| id      | name                                       |
+---------+--------------------------------------------+
| 5554609 | 温泉县人民政府供暖中心                     |
..................
| 5554618 | 温泉县邮政鸿盛公司                         |
+---------+--------------------------------------------+
10 rows in set (5.33 sec)
mysql> select id,name from lx_com where id>5000000 limit 10;
+---------+--------------------------------------------------------+
| id      | name                                                   |
+---------+--------------------------------------------------------+
| 5000001 | 南宁市嘉氏百货有限责任公司                             |
.................
| 5000002 | 南宁市友达电线电缆有限公司                             |
+---------+--------------------------------------------------------+
10 rows in set (0.00 sec)
Phenomenon: the time drops from 5.33 seconds to under 100 milliseconds; the query is vastly faster, but the result rows are different.
Advantage: the WHERE condition avoids the query-first-skip-later problem; instead, the condition narrows the range and skips directly to the target rows.
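In practice this becomes keyset pagination: the application remembers the id of the last row on the current page and asks for the rows after it. A minimal sketch (5000010 is a hypothetical last-seen id; the explicit ORDER BY pins the row order):
mysql> select id,name from lx_com where id > 5000010 order by id limit 10;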
Problem: the results of this method and of LIMIT M,N are sometimes inconsistent [as the example above shows].
Cause: rows have been physically deleted, leaving holes in the id sequence.
Solution: do not physically delete data (delete it logically instead). When the data is finally displayed on the page, simply hide the logically deleted entries. (Generally speaking, large sites never physically delete data; they only delete it logically, e.g. with a flag such as is_delete=1.)
3: Deferred index.
What if rows must be physically deleted, exact offset queries are still required, and users' paging cannot be restricted?
Optimization idea: use a covering index to quickly find the primary-key ids that satisfy the condition, then use those ids as the WHERE condition to fetch the rows quickly.
(Where is the speed gained? With a covering index, the matching ids can be found without reading the full rows; that is where the time is saved.)
Since we must query, we query only the index, not the data, to get the ids; then we use those ids to fetch the concrete entries. This technique is the deferred index.
Why the plain query is slow: querying id,name across 1,000,000 rows reads each row and throws it away, only reaching the wanted data after crossing the 1,000,000 mark. [Exactly the query-first-skip-later behavior described above.]
Why the optimized query is fast:
a. The covering index finds the primary-key ids first; the information is available in the index itself, so row lookups are avoided.
b. Once the target primary keys are known, the query locates exactly those rows directly instead of crossing a huge range of rows to find them.
mysql> select id,name from lx_com inner join (select id from lx_com limit 5000000,10) as tmp using(id);
+---------+-----------------------------------------------+
| id      | name                                          |
+---------+-----------------------------------------------+
| 5050425 | 陇县河北乡大谈湾小学                          |
........
| 5050434 | 陇县堎底下镇水管站                            |
+---------+-----------------------------------------------+
10 rows in set (1.35 sec)
four. Summary
In terms of approach, method one always comes first: satisfy the requirement at the business level and ask whether users really need to turn that many pages.
If the business requires it, replace LIMIT n,m with the id>n LIMIT m form; the drawback is that rows must not be physically deleted.
If physical deletion has left holes and method two cannot be used, use the deferred index method: in essence, use a covering index to quickly pull out the matching index values, then fetch the rows by those target index values in one pass. The effect is very noticeable.
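One footnote on method three: the deferred-index query above has no ORDER BY, so exactly which 10 rows the offset lands on is not strictly defined. A minimal sketch of the same pattern with an explicit ORDER BY (assuming id is lx_com's primary key) makes the page deterministic:
mysql> select c.id, c.name
    -> from lx_com as c
    -> inner join (select id from lx_com order by id limit 5000000,10) as tmp using(id)
    -> order by c.id;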
