Despite its challenges, multithreading is still widely used because it offers several benefits, including:
Better resource utilization
Simpler programming in some scenarios
More responsive programs
Better Resource Utilization
Imagine an application that reads and processes files from the local file system. Let's say reading a file from disk takes 5 seconds and processing it takes 2 seconds. Reading and processing two files then takes:
  5 seconds reading file A
  2 seconds processing file A
  5 seconds reading file B
  2 seconds processing file B
-----------------------
 14 seconds total
When reading a file from disk, most of the CPU time is spent waiting for the disk to read the data. During that time the CPU is largely idle and could be doing something else. By changing the order of the operations, the CPU can be better utilized. Look at this sequence:
  5 seconds reading file A
  5 seconds reading file B + 2 seconds processing file A
  2 seconds processing file B
-----------------------
 12 seconds total
The CPU waits for the first file to be read. Then it starts the read of the second file. While the second file is being read, the CPU processes the first file. Remember, the CPU is mostly idle while waiting for a file to be read from disk.
In general, the CPU can do other work while waiting for IO. It does not have to be disk IO; it could also be network IO, or input from a user. Network and disk IO are typically much slower than the CPU and memory access.
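To make the arithmetic above concrete, here is a minimal sketch (not part of the original example) that simulates the two schedules with Thread.sleep, scaling the 5 and 2 second figures down to 500 and 200 milliseconds. The readFile and processFile methods are stand-ins for real disk reads and processing.

public class OverlapDemo {

    // Simulated disk read; stands in for the 5 second read above (scaled down).
    static void readFile(String name) throws InterruptedException {
        Thread.sleep(500);
    }

    // Simulated processing; stands in for the 2 second processing above (scaled down).
    static void processFile(String name) throws InterruptedException {
        Thread.sleep(200);
    }

    public static void main(String[] args) throws InterruptedException {
        // Sequential schedule: read A, process A, read B, process B.
        long start = System.currentTimeMillis();
        readFile("A"); processFile("A");
        readFile("B"); processFile("B");
        System.out.println("Sequential: " + (System.currentTimeMillis() - start) + " ms");

        // Overlapped schedule: start reading B, then process A while B is being read.
        start = System.currentTimeMillis();
        readFile("A");
        Thread readB = new Thread(() -> {
            try { readFile("B"); } catch (InterruptedException ignored) { }
        });
        readB.start();      // read B in the background
        processFile("A");   // process A while B is being read
        readB.join();       // wait until B has been fully read
        processFile("B");   // then process B
        System.out.println("Overlapped: " + (System.currentTimeMillis() - start) + " ms");
    }
}

Running it prints roughly 1400 ms for the sequential schedule and roughly 1200 ms for the overlapped one, mirroring the 14 versus 12 second totals above.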
Simpler Programming
If you wrote the above sequence of reading and processing files by hand in a single-threaded application, you would have to keep track of the read and processing state of each file. Instead, you can start two threads, each of which simply reads and processes one file. Each thread blocks while waiting for the disk to read its file; while it waits, the other thread can use the CPU to process the part of its file it has already read. The result is that the disk is kept busy reading the files into memory, giving better disk and CPU utilization. It is also simpler to program, since each thread only has to keep track of a single file.
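As a rough sketch of that design (assuming two hypothetical files, fileA.txt and fileB.txt, in the working directory), each thread below simply reads its file and then processes it, so neither thread needs to track any other file's state:

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class TwoFileDemo {

    // Read one file completely, then "process" it (here: just count its bytes).
    static void readAndProcess(String path) {
        try {
            byte[] data = Files.readAllBytes(Path.of(path)); // blocks on disk IO
            System.out.println(path + ": " + data.length + " bytes processed");
        } catch (IOException e) {
            System.err.println("Failed to read " + path + ": " + e.getMessage());
        }
    }

    public static void main(String[] args) throws InterruptedException {
        // One thread per file; each thread only tracks its own file.
        Thread a = new Thread(() -> readAndProcess("fileA.txt"));
        Thread b = new Thread(() -> readAndProcess("fileB.txt"));
        a.start();
        b.start();
        a.join();
        b.join();
    }
}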
More Responsive Programs
Another common reason to convert a single-threaded application into a multithreaded one is to make it more responsive. Imagine a server application that listens on a port for incoming requests. When a request is received, it processes the request and then goes back to listening. The server loop can be summarized as follows:
while(server is active){
    listen for request
    process request
}
If a request takes a long time to process, no new client can have its request accepted during that time, because requests are only received while the server is listening.
An alternative design is for the listening thread to hand each request to a worker thread and return to listening immediately. The worker thread processes the request and sends the response to the client. This design is outlined below:
while(server is active){
    listen for request
    hand request to worker thread
}
This way, the server thread returns to listening quickly, so more clients can send requests to the server. The server has become more responsive.
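A minimal sketch of this design uses plain sockets and a fixed-size thread pool; the port number and the handle method are illustrative placeholders, not part of the original text.

import java.io.IOException;
import java.net.ServerSocket;
import java.net.Socket;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class WorkerThreadServer {

    public static void main(String[] args) throws IOException {
        ExecutorService workers = Executors.newFixedThreadPool(10);

        try (ServerSocket server = new ServerSocket(8080)) {
            while (true) {
                Socket client = server.accept();        // listen for request
                workers.submit(() -> handle(client));   // hand request to worker thread
                // the loop immediately returns to accept() and keeps listening
            }
        }
    }

    // Hypothetical request handling; a real server would read and parse the request here.
    static void handle(Socket client) {
        try (client) {
            client.getOutputStream().write("hello\n".getBytes());
        } catch (IOException e) {
            System.err.println("Request failed: " + e.getMessage());
        }
    }
}

The accept loop never blocks on request processing, so new connections can be accepted while earlier requests are still being handled by the pool.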
The same applies to desktop applications. If you click a button that starts a long-running task, and the thread executing the task is the same thread that updates the windows, buttons, and so on, then the application appears to hang while the task is running. Instead, the task can be handed off to a worker thread. While the worker thread is busy with the task, the window thread is free to respond to other user input. When the worker thread completes, it signals the window thread, which can then update the application window with the result of the task. Programs designed with worker threads appear more responsive to the user.
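Here is a small sketch of the same idea in Swing; longRunningTask is a hypothetical placeholder for the slow work. The button handler starts a worker thread and, when the work finishes, hands the result back to the window (event dispatch) thread with SwingUtilities.invokeLater:

import javax.swing.JButton;
import javax.swing.JFrame;
import javax.swing.JLabel;
import javax.swing.SwingUtilities;

public class ResponsiveUiDemo {

    // Hypothetical long-running task; sleeps to simulate heavy work.
    static String longRunningTask() {
        try { Thread.sleep(3000); } catch (InterruptedException ignored) { }
        return "done";
    }

    public static void main(String[] args) {
        SwingUtilities.invokeLater(() -> {
            JFrame frame = new JFrame("Worker thread demo");
            JLabel status = new JLabel("idle");
            JButton button = new JButton("Start task");

            button.addActionListener(e -> {
                status.setText("working...");
                // Run the task on a worker thread so the window thread stays responsive.
                new Thread(() -> {
                    String result = longRunningTask();
                    // Signal the window thread to update the UI with the result.
                    SwingUtilities.invokeLater(() -> status.setText(result));
                }).start();
            });

            frame.add(button, java.awt.BorderLayout.NORTH);
            frame.add(status, java.awt.BorderLayout.SOUTH);
            frame.setSize(300, 120);
            frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
            frame.setVisible(true);
        });
    }
}

While the worker thread sleeps, the window remains responsive; clicking the button again or moving the window still works, and the label updates when the task completes.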