
SQL scheduled update optimization issues

WBOY · Original · 2016-09-21 14:13:06 · 850 views

A MySQL table contains a large amount of data, say 10 million rows occupying 5 GB, and new rows are inserted continuously. How can the status of the newly inserted rows be updated on a schedule as efficiently as possible?

For example: 1,000 rows are inserted, and the status of those 1,000 rows must be updated every 10 minutes. How can this be made efficient?

Replies:


Use something like a cron job.
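In production this would typically be a crontab entry (e.g. `*/10 * * * *` invoking a script). A minimal in-process sketch of the same idea in Python, with an illustrative `run_periodically` helper (not part of any library):

```python
import time

def run_periodically(job, interval_seconds, max_runs=None):
    """Call `job` every `interval_seconds`; stop after `max_runs` calls if given."""
    runs = 0
    while max_runs is None or runs < max_runs:
        job()
        runs += 1
        if max_runs is not None and runs >= max_runs:
            break
        time.sleep(interval_seconds)
    return runs

# Usage (illustrative): run the status update every 10 minutes, forever:
#   run_periodically(update_new_row_status, 600)
```

A real cron job is usually preferable to a long-running loop, since cron survives process crashes and restarts.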

It's not clear what "status" you mean. If it is a field in that table, just update it by primary key.

The table is large, and update speed depends on the table's indexes. If the updated field is not part of any index and the update filters by primary key, it should be very fast, even with tens of millions of rows.
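A sketch of the batch update by primary key, using SQLite in memory as a stand-in for the MySQL table (table and column names are illustrative):

```python
import sqlite3

# In-memory stand-in for the large MySQL table; the same pattern applies at scale.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE items (id INTEGER PRIMARY KEY, status TEXT)")
conn.executemany("INSERT INTO items VALUES (?, ?)",
                 [(i, "pending") for i in range(1, 1001)])

# Update the whole batch of new rows in one statement, filtering by primary key.
new_ids = list(range(1, 1001))
placeholders = ",".join("?" * len(new_ids))
conn.execute(f"UPDATE items SET status = 'done' WHERE id IN ({placeholders})",
             new_ids)
conn.commit()
```

One `UPDATE ... WHERE id IN (...)` over the batch is far cheaper than 1,000 single-row statements, since it avoids per-statement round trips and commits.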

If you need to find the 1,000 new rows among the 10 million first, that can indeed be slow; it depends on whether your query conditions are covered by an index.
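A sketch of finding the new rows via a secondary index on the status column, again with SQLite standing in for MySQL (schema and values are illustrative):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE items (id INTEGER PRIMARY KEY, status TEXT)")
# Secondary index so the lookup for 'pending' rows need not scan the whole table.
conn.execute("CREATE INDEX idx_items_status ON items (status)")
conn.executemany("INSERT INTO items VALUES (?, ?)",
                 [(i, "done") for i in range(1, 9001)])
conn.executemany("INSERT INTO items VALUES (?, ?)",
                 [(i, "pending") for i in range(9001, 10001)])

# The periodic job: locate only the pending rows (index lookup), then flip them.
pending = conn.execute(
    "SELECT id FROM items WHERE status = 'pending'").fetchall()
conn.execute("UPDATE items SET status = 'done' WHERE status = 'pending'")
conn.commit()
```

Without the index on `status`, each 10-minute run would scan all 10 million rows to find the 1,000 pending ones; with it, the lookup cost is proportional to the matching rows.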
