Sometimes we need to write simple performance-testing code, and I happened to come across an answer on Stack Overflow with practical advice on how to write benchmarks so that the influence of the environment is kept to a minimum.
Translated and posted here:
Tips for writing microbenchmarks from the author of Java HotSpot:
Rule 0: Read reputable papers on JVMs and microbenchmarking. Do not expect too much from such tests; they measure only a limited range of JVM performance characteristics.
Rule 1: Always include a warm-up phase that runs the test code all the way through, enough to trigger all initializations and compilations before the timing phase. (Fewer iterations are OK in the warm-up phase; the rule of thumb is several tens of thousands of inner-loop iterations.)
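As a rough illustration of Rule 1, a minimal hand-rolled harness might look like the sketch below. The `benchmarkTarget` method and the iteration counts are placeholders of my own, not part of the original answer.

```java
public class WarmupSketch {

    // Placeholder for the code under test.
    static long benchmarkTarget(int n) {
        long sum = 0;
        for (int i = 0; i < n; i++) {
            sum += i;
        }
        return sum;
    }

    public static void main(String[] args) {
        // Warm-up phase: tens of thousands of iterations so that class
        // initialization and JIT compilation happen before timing starts.
        long sink = 0;
        for (int i = 0; i < 50_000; i++) {
            sink += benchmarkTarget(1_000);
        }

        // Timed phase.
        long start = System.nanoTime();
        for (int i = 0; i < 50_000; i++) {
            sink += benchmarkTarget(1_000);
        }
        long elapsed = System.nanoTime() - start;

        // Use the accumulated result so the JIT cannot eliminate the work.
        System.out.println("elapsed ns: " + elapsed + " (sink=" + sink + ")");
    }
}
```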
Rule 2: Always run with -XX:+PrintCompilation, -verbose:gc, etc., so you can verify that the compiler and other parts of the JVM are not doing unexpected work during the timing phase.
Rule 2.1: Print messages at the beginning and end of the warm-up and timing phases, so you can verify that none of the output from the flags in Rule 2 falls inside the timing phase.
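One possible way to combine Rules 2 and 2.1 is sketched below; the class name, the marker strings, and the workload are illustrative assumptions, while the flags come from the rules above. Any compiler or GC output that lands between the timing markers tells you the JVM was still busy while you were measuring.

```java
public class PhaseMarkers {

    static long work() {
        long sum = 0;
        for (int i = 0; i < 1_000; i++) {
            sum += i;
        }
        return sum;
    }

    public static void main(String[] args) {
        // Run with, for example:
        //   java -XX:+PrintCompilation -verbose:gc PhaseMarkers
        // so that JIT and GC activity shows up in the output.

        System.out.println("=== warm-up start ===");
        long sink = 0;
        for (int i = 0; i < 50_000; i++) {
            sink += work();
        }
        System.out.println("=== warm-up end, timing start ===");

        long start = System.nanoTime();
        for (int i = 0; i < 50_000; i++) {
            sink += work();
        }
        long elapsed = System.nanoTime() - start;
        System.out.println("=== timing end ===");

        // Any -XX:+PrintCompilation or -verbose:gc lines printed between the
        // timing markers mean the JVM was compiling or collecting while timing.
        System.out.println("elapsed ns: " + elapsed + " (sink=" + sink + ")");
    }
}
```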
Rule 3: Understand the difference between -client and -server, and between OSR and regular compilation. If you are after best performance, prefer -server to -client, and regular compilation to OSR.
Rule 4: Be aware of initialization effects. Do not print anything for the first time during the timing phase, since printing loads and initializes classes. Do not load new classes outside the warm-up phase unless you are testing class loading itself. Rule 2 is your first line of defense against such effects.
Rule 5: Be aware of deoptimization and recompilation effects. Do not take any code path for the first time during the timing phase, because the compiler may have optimized based on the optimistic assumption that the path would never be taken, and will throw away and recompile the code when it is. Rule 2 is your first line of defense against such effects.
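One way to guard against the deoptimization scenario in Rule 5 (and the initialization effects in Rule 4) is to make the warm-up exercise every branch the timed phase will take. The sketch below is only my illustration of the idea, not code from the original answer.

```java
public class WarmAllPaths {

    static long process(int value, boolean rare) {
        // Two distinct code paths; if "rare" is never seen during warm-up,
        // the JIT may compile an optimistic version and later deoptimize
        // and recompile when that path is first taken during timing.
        if (rare) {
            return value * 31L;
        }
        return value + 7L;
    }

    public static void main(String[] args) {
        long sink = 0;

        // Warm-up: take both branches so compilation covers them.
        for (int i = 0; i < 50_000; i++) {
            sink += process(i, false);
            sink += process(i, true);
        }

        // Timed phase: no branch is taken here for the first time.
        long start = System.nanoTime();
        for (int i = 0; i < 50_000; i++) {
            sink += process(i, (i & 1023) == 0);
        }
        System.out.println("elapsed ns: " + (System.nanoTime() - start)
                + " (sink=" + sink + ")");
    }
}
```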
Rule 6: Use appropriate tools to read the compiler's mind, and expect to be surprised by the code it produces. Inspect the code yourself before forming theories about what makes something faster or slower.
Rule 7: Reduce noise in your measurements. Run the benchmark on a quiet machine, run it several times, and discard outliers. Use -Xbatch to serialize the compiler with the application, and consider setting -XX:CICompilerCount=1 to prevent the compiler from running in parallel with itself.
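A simple way to apply the "run several times and discard outliers" part of Rule 7 in a hand-rolled harness is sketched below; the run count and the choice of the median are my own assumptions, not prescriptions from the original. The JVM flags mentioned in the rule would still be passed on the command line.

```java
import java.util.Arrays;

public class RepeatedRuns {

    static long work() {
        long sum = 0;
        for (int i = 0; i < 1_000_000; i++) {
            sum += i;
        }
        return sum;
    }

    public static void main(String[] args) {
        final int runs = 15;
        long[] timings = new long[runs];
        long sink = 0;

        // Warm-up before measuring anything.
        for (int i = 0; i < 10_000; i++) {
            sink += work();
        }

        // Measure several runs.
        for (int r = 0; r < runs; r++) {
            long start = System.nanoTime();
            sink += work();
            timings[r] = System.nanoTime() - start;
        }

        // Report the median, which is robust to outliers caused by GC pauses,
        // scheduling noise, and similar disturbances.
        Arrays.sort(timings);
        System.out.println("median ns: " + timings[runs / 2] + " (sink=" + sink + ")");
    }
}
```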
Rule 8: Use a benchmarking library, since it will probably do the job better than a hand-rolled harness. Examples include JMH, Caliper, and UCSD Benchmarks for Java.
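For Rule 8, a minimal JMH benchmark might look like the sketch below. The class name, workload, and iteration settings are illustrative; building it requires the JMH dependencies and annotation processor.

```java
import java.util.concurrent.TimeUnit;

import org.openjdk.jmh.annotations.Benchmark;
import org.openjdk.jmh.annotations.BenchmarkMode;
import org.openjdk.jmh.annotations.Fork;
import org.openjdk.jmh.annotations.Measurement;
import org.openjdk.jmh.annotations.Mode;
import org.openjdk.jmh.annotations.OutputTimeUnit;
import org.openjdk.jmh.annotations.Scope;
import org.openjdk.jmh.annotations.State;
import org.openjdk.jmh.annotations.Warmup;

@State(Scope.Thread)
@BenchmarkMode(Mode.AverageTime)
@OutputTimeUnit(TimeUnit.NANOSECONDS)
@Warmup(iterations = 5)       // JMH handles the warm-up phase for you
@Measurement(iterations = 5)  // ...and the measured iterations
@Fork(1)                      // run in a separate, freshly started JVM
public class SumBenchmark {

    @Benchmark
    public long sum() {
        long total = 0;
        for (int i = 0; i < 1_000; i++) {
            total += i;
        }
        // Returning the result lets JMH consume it, preventing dead-code elimination.
        return total;
    }
}
```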