1. Do not call functions on indexed columns
Doing so prevents the database from using the index. The problem even affects partitioned tables: instead of reading only the relevant partition, the database scans the entire table space. For tables with large data volumes, this is a catastrophic performance hit.
Don’t do this:
WHERE TIME_ID+14 > to_number(to_char(sysdate,'J'))
Do this instead:
WHERE TIME_ID > to_number(to_char(sysdate-14,'J'))
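The same principle applies to date columns. As a minimal sketch (the ORDERS table and ORDER_DATE column below are hypothetical names, not from the original), compare a predicate that wraps the column in a function with one that leaves the column untouched:
-- Index on ORDER_DATE cannot be used, because the column is wrapped in TRUNC:
WHERE TRUNC(ORDER_DATE) = TO_DATE('2024-01-15','YYYY-MM-DD')
-- Index on ORDER_DATE can be used, because the column stands alone:
WHERE ORDER_DATE >= TO_DATE('2024-01-15','YYYY-MM-DD')
  AND ORDER_DATE <  TO_DATE('2024-01-16','YYYY-MM-DD')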
2. Use ANALYZE to optimize complex SQL
Skipping this step means refusing the help of the database's query optimizer and giving up its ability to choose good join plans. Suppose you create a temporary table with one million records. If the table is not analyzed, the optimizer has no real statistics to work from, so it may fall back on a nested loop join and scan the table row by row. With a small data set you may not notice the loss, but as the data grows, your database performance will get steadily worse.
It is recommended to do this:
ANALYZE TABLE <TABLE_NAME> COMPUTE STATISTICS
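For example, applying this to the temporary table described above (TMP_ORDERS and CUSTOMERS are made-up names for illustration), the statistics would be gathered after the table is populated and before it is joined:
-- Gather statistics on the freshly loaded temporary table
-- so the optimizer can pick a suitable join method:
ANALYZE TABLE TMP_ORDERS COMPUTE STATISTICS;

SELECT o.ORDER_ID, c.CUSTOMER_NAME
  FROM TMP_ORDERS o
  JOIN CUSTOMERS c ON c.CUSTOMER_ID = o.CUSTOMER_ID;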
3. Divide complex SQL into several steps for execution
Think of a SQL statement as a pizza: you would not stuff the whole thing into your mouth at once and try to chew it.
When writing a complex SQL query, it is usually better to break it into three or four steps, as sketched below. The simpler each statement is, the better the optimizer can handle it, and the easier it is to check the intermediate data at each step.
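As an illustrative sketch (all table names here are hypothetical), a multi-table aggregation could be split into an intermediate step that is materialized and analyzed before the final join:
-- Step 1: materialize an intermediate result
CREATE TABLE TMP_SALES_BY_CUST AS
SELECT CUSTOMER_ID, SUM(AMOUNT) AS TOTAL_AMOUNT
  FROM SALES
 GROUP BY CUSTOMER_ID;

-- Step 2: gather statistics on the intermediate table (see tip 2)
ANALYZE TABLE TMP_SALES_BY_CUST COMPUTE STATISTICS;

-- Step 3: join the much smaller intermediate result to the customer table
SELECT c.CUSTOMER_NAME, t.TOTAL_AMOUNT
  FROM TMP_SALES_BY_CUST t
  JOIN CUSTOMERS c ON c.CUSTOMER_ID = t.CUSTOMER_ID;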
4. Use DISTINCT only when necessary
This is a good rule of thumb. DISTINCT is often added to queries that return duplicate records in order to filter the duplicates out. But its purpose must be clear: use it only when you are certain the result should be unique, for example when selecting a user ID. Overusing DISTINCT, especially in multi-table join queries, can mask errors and produce unexpected results.
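A hedged illustration (ORDERS and ORDER_ITEMS are hypothetical tables): when a one-to-many join multiplies rows, adding DISTINCT hides the duplication instead of fixing the query, and it may silently collapse rows you actually wanted.
-- DISTINCT papering over a join that multiplies rows:
SELECT DISTINCT o.ORDER_ID, o.ORDER_DATE
  FROM ORDERS o
  JOIN ORDER_ITEMS i ON i.ORDER_ID = o.ORDER_ID;

-- Clearer: query only the table you need and test for existence instead
SELECT o.ORDER_ID, o.ORDER_DATE
  FROM ORDERS o
 WHERE EXISTS (SELECT 1 FROM ORDER_ITEMS i WHERE i.ORDER_ID = o.ORDER_ID);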
5. Create indexes reasonably
The last point is to create table indexes sensibly. Put simply, suppose a table holds 100,000 records and you frequently run queries like "is this customer in the table?". With an index, retrieving that customer's information is very fast; without one, the optimizer falls back on a full table scan, which is a nightmare at large data volumes.
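For instance (the table and column names below are hypothetical), a frequent lookup by customer number benefits from a dedicated index on that column:
-- Create an index on the column used in the lookup:
CREATE INDEX IDX_CUSTOMERS_CUSTNO ON CUSTOMERS (CUSTOMER_NO);

-- This query can now use the index instead of a full table scan:
SELECT * FROM CUSTOMERS WHERE CUSTOMER_NO = '1000123';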