
mysql - paging over tens of millions of rows: how do you optimize speed when the offset gets large?

Take MySQL as an example. When a table holds a large amount of data, paging deep into the results means a large offset, and query efficiency drops accordingly.

For example, if a post has 2 million comments and 10 are shown per page, that is 200,000 pages. How should the query be handled when page 200,000 is requested? Is there a good way to meet this requirement?

For example, on NetEase Music a single song can have more than 1.4 million comments, yet paging through them is still very fast.
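To make the cost concrete, here is a rough sketch of the naive offset-style query this kind of paging produces; the comments table and its columns are made up for illustration, and Python is used only to build the SQL string.

```python
# Naive offset paging: "page 200,000 of a 2-million-comment post".
# Table/column names (comments, post_id, content) are hypothetical.
page, page_size = 200_000, 10
offset = (page - 1) * page_size   # 1,999,990 rows MySQL must scan and discard
sql = (
    "SELECT id, content FROM comments "
    "WHERE post_id = %s "
    "ORDER BY id "
    f"LIMIT {offset}, {page_size}"
)
# However this is executed, the cost grows with `offset`, because InnoDB has to
# walk the index from the start of the result set on every request.
```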

伊谢尔伦 · 2712 days ago

All replies (2)

  • 给我你的怀抱 · 2017-05-18 10:50:01

    First, I believe NetEase Music almost certainly serves comment data from a NoSQL store rather than reading MySQL directly.
    Of course, if the table is not accessed that frequently, you can still read from the database directly. MySQL's InnoDB does have an awkward weakness, though: the deeper the page you query, the worse the performance. So we generally use the id to improve query performance. The old way is limit 100000,10; the recommended way is where id > 100000 limit 10, which keeps the lookup on the index (see the sketch below).
    You can also split the data across multiple tables to reduce the amount in any single table and further improve query efficiency.
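    A minimal sketch of that keyset-style paging, assuming a comments table with an auto-increment primary key id, an index on (post_id, id), and the pymysql client; all of these names are illustrative, not from the reply above.

    ```python
    import pymysql

    PAGE_SIZE = 10

    def fetch_next_page(conn, post_id, last_seen_id=0):
        # Seek directly to the index position with "id > last_seen_id" instead of
        # "LIMIT <offset>, 10", so InnoDB never scans and discards the skipped rows.
        sql = (
            "SELECT id, content FROM comments "
            "WHERE post_id = %s AND id > %s "
            "ORDER BY id LIMIT %s"
        )
        with conn.cursor() as cur:
            cur.execute(sql, (post_id, last_seen_id, PAGE_SIZE))
            return cur.fetchall()

    conn = pymysql.connect(host="localhost", user="app", password="secret", database="forum")
    first = fetch_next_page(conn, post_id=42)                              # first page
    second = fetch_next_page(conn, post_id=42, last_seen_id=first[-1][0])  # continue from the last id seen
    ```

    The trade-off is that this style only pages forward from a known id; jumping straight to an arbitrary page number still needs an offset or a precomputed mapping.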

  • PHPz · 2017-05-18 10:50:01

    If you use Redis, store the comment IDs in a list, use Redis to fetch the slice of the list for the requested page, and then query MySQL with the IDs in that slice to get the actual rows (see the sketch below).

    Of course, if you need to sort by something else, this approach won't help.
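    A rough sketch of that Redis-list idea, assuming a per-post key such as post:{id}:comment_ids plus the redis-py and pymysql clients; the key name, table, and columns are hypothetical.

    ```python
    import pymysql
    import redis

    PAGE_SIZE = 10
    r = redis.Redis()
    conn = pymysql.connect(host="localhost", user="app", password="secret", database="forum")

    def fetch_page(post_id, page):
        start = page * PAGE_SIZE
        # Slice the ID list in Redis instead of paging with a large OFFSET in MySQL.
        ids = [int(x) for x in r.lrange(f"post:{post_id}:comment_ids", start, start + PAGE_SIZE - 1)]
        if not ids:
            return []
        placeholders = ", ".join(["%s"] * len(ids))
        # FIELD() keeps the rows in the same order as the Redis list.
        sql = (f"SELECT id, content FROM comments WHERE id IN ({placeholders}) "
               f"ORDER BY FIELD(id, {placeholders})")
        with conn.cursor() as cur:
            cur.execute(sql, ids + ids)
            return cur.fetchall()
    ```

    The list only preserves the one order in which the IDs were pushed, which is why the reply notes that other sort orders are not covered by this approach.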
