In computer science, the time complexity of an algorithm is a function that qualitatively describes the running time of the algorithm. It is expressed as a function of the length of the string representing the algorithm's input. Time complexity is often written in big O notation, which discards the low-order terms and the leading coefficient of this function. Expressed this way, time complexity is asymptotic: it describes the algorithm's behavior as the size of the input approaches infinity.
To calculate the time complexity, we usually estimate the number of elementary operations the algorithm performs, assuming each operation takes the same fixed amount of time. Under this assumption, the total running time and the number of operations differ by at most a constant factor.
Different inputs of the same size may still cause the algorithm to run for different amounts of time, so we usually use the worst-case complexity of the algorithm, denoted T(n): the maximum running time over all inputs of size n. A less common alternative is average-case complexity, which is normally used only when stated explicitly. Time complexity can be classified by the nature of the function T(n). For example, an algorithm with T(n) = O(n) is called a "linear time algorithm", while an algorithm with T(n) = O(M^n) for some constant M > 1 is called an "exponential time algorithm".
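As a concrete illustration, here is a minimal Python sketch of a linear time algorithm (the function name linear_search and the comparison counting are our own, for illustration only):

def linear_search(items, target):
    # Return the index of target in items, or -1 if it is absent.
    for i, value in enumerate(items):
        if value == target:
            return i
    return -1

# Best case: target is the first element and one comparison is made.
# Worst case: target is absent and all n elements are examined,
# so T(n) = n comparisons and the running time is O(n).

The worst-case figure is the one T(n) refers to: no input of size n can make this algorithm examine more than n elements.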
Generally speaking, the number of times the basic operations in an algorithm are repeated is a function of the problem size n, written T(n). If there is an auxiliary function f(n) such that, as n approaches infinity, the limit of T(n)/f(n) is a nonzero constant, then f(n) is said to be of the same order of magnitude as T(n). This is denoted T(n) = O(f(n)), and O(f(n)) is called the asymptotic time complexity of the algorithm, or time complexity for short.
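As a hedged worked example of this definition (the operation tally below is a simplified counting convention we have assumed, not a universal rule):

def sum_first_n(n):
    total = 0              # 1 operation
    for i in range(n):     # n iterations
        total += i         # counted as 2 operations per iteration (add, assign)
    return total

# Under this convention, T(n) = 2n + 1.
# Taking f(n) = n, T(n)/f(n) approaches 2, a nonzero constant,
# as n approaches infinity, so T(n) = O(n).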
Among various algorithms, if the number of statement executions is a constant, the time complexity is O(1). Algorithms with different time frequencies can also share the same time complexity: for example, T(n) = n^2 + 3n + 4 and T(n) = 4n^2 + 2n + 1 have different frequencies, but both have time complexity O(n^2).
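For example, two nested loops over the same input produce a quadratic time frequency (a sketch of our own; count_pairs is not from the original text):

def count_pairs(items):
    # Count ordered pairs (i, j), i != j, whose elements are equal.
    n = len(items)
    count = 0
    for i in range(n):         # n iterations
        for j in range(n):     # n iterations for each i
            if i != j and items[i] == items[j]:
                count += 1
    return count

# The innermost comparison executes n * n times, so T(n) is n^2 plus
# lower-order bookkeeping terms, and the time complexity is O(n^2).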
Time Frequency
The time it takes to execute an algorithm cannot, in general, be calculated theoretically; it can only be known by running the program on a computer. But it is neither possible nor necessary to test every algorithm this way. We only need to know which algorithm takes more time and which takes less. The time an algorithm takes is proportional to the number of executions of statements in the algorithm: the more statements are executed, the more time it takes. The number of statement executions in an algorithm is called the statement frequency or time frequency, denoted T(n).
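One way to observe the time frequency directly (a rough sketch, not a rigorous benchmark; bubble_sort_counted is our own name) is to instrument an algorithm with a counter:

def bubble_sort_counted(items):
    # Bubble sort that also reports how many comparisons it executed.
    a = list(items)
    comparisons = 0
    for i in range(len(a)):
        for j in range(len(a) - 1 - i):
            comparisons += 1
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
    return a, comparisons

for n in (10, 100, 1000):
    _, t = bubble_sort_counted(range(n, 0, -1))  # reversed input
    print(n, t)  # t = n(n-1)/2, which grows quadratically with n

Doubling n roughly quadruples the comparison count, which is exactly the kind of growth the statement frequency T(n) captures.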
Similar to time complexity, space complexity measures the storage space an algorithm requires while it executes on a computer. It is denoted as:
S(n)=O(f(n))
The storage space required during algorithm execution includes 3 parts:
The space occupied by the algorithm program;
The storage space occupied by the initial data input;
The extra space required during the execution of the algorithm.
In many practical problems, compressed storage techniques are often used to reduce the storage space the algorithm occupies.
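To make the distinction concrete, here is a hedged Python sketch contrasting O(n) and O(1) extra space (both function names are our own):

def sum_with_list(n):
    values = list(range(1, n + 1))   # materializes n integers: O(n) extra space
    return sum(values)

def sum_in_place(n):
    total = 0                        # a fixed number of variables: O(1) extra space
    for i in range(1, n + 1):
        total += i
    return total

# Both functions run in O(n) time, but S(n) = O(n) for the first
# and S(n) = O(1) for the second.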