The difference between parallel computing and distributed computing
1. In parallel computing, adding more machines keeps the data size the same but makes the computation finish faster; in distributed computing, adding more machines makes it possible to process larger data;
2. Parallel computing requires the processors to work in tight time synchronization, while distributed computing imposes no such timing constraint.
Parallel Computing
Parallel computing refers to the simultaneous use of multiple computing resources to solve a computational problem, and it is an effective way to improve the computing speed and processing power of a computer system. Its basic idea is to have multiple processors cooperatively solve the same problem: the problem is decomposed into several parts, and each part is computed in parallel by an independent processor. A parallel computing system can be either a specially designed supercomputer containing multiple processors or a cluster of independent computers interconnected in some way. Data processing is completed by the parallel computing cluster, and the results are returned to the user.
Parallel computing can be divided into temporal parallelism and spatial parallelism.
Temporal parallelism: refers to pipelining (assembly-line) technology. For example, when a factory processes food, the work is divided into the following steps:
1. Rinse: Rinse food thoroughly.
2. Disinfection: Disinfect food.
3. Cutting: Cut food into small pieces.
4. Packaging: Put food into packaging bags.
Without a pipeline, the next item cannot be processed until the previous one has completed all four steps, which wastes time and hurts efficiency. With pipelining, up to four items can be in processing at once, each at a different stage. This is temporal parallelism in parallel algorithms: starting two or more operations at overlapping times greatly improves computing performance.
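The four-stage food pipeline above can be sketched in Python. This is a minimal illustration, not production code: each stage is a thread connected to its neighbors by a queue, and the stage names and string transformations are invented for the example.

```python
import queue
import threading

def stage(work, inbox, outbox):
    """Pull items from inbox, apply this stage's work, pass the result on."""
    while True:
        item = inbox.get()
        if item is None:          # sentinel: shut down and tell the next stage
            outbox.put(None)
            break
        outbox.put(work(item))

# The four food-processing steps, modelled as string transformations.
steps = [
    ("rinse",     lambda f: f + ":rinsed"),
    ("disinfect", lambda f: f + ":disinfected"),
    ("cut",       lambda f: f + ":cut"),
    ("package",   lambda f: f + ":packaged"),
]

# One queue between each pair of adjacent stages, plus input and output ends.
queues = [queue.Queue() for _ in range(len(steps) + 1)]
threads = [
    threading.Thread(target=stage, args=(work, queues[i], queues[i + 1]))
    for i, (_name, work) in enumerate(steps)
]
for t in threads:
    t.start()

# Feed four items in; while one is being packaged, another can be rinsed.
for food in ["apple", "carrot", "potato", "onion"]:
    queues[0].put(food)
queues[0].put(None)               # end-of-stream sentinel

results = []
while (item := queues[-1].get()) is not None:
    results.append(item)
for t in threads:
    t.join()

print(results)
```

Because every queue is FIFO and each stage is a single worker, items leave the pipeline in the order they entered, each having passed through all four steps.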
Spatial parallelism: refers to the concurrent execution of a computation by multiple processors, that is, connecting two or more processors through a network to compute different parts of the same task at the same time, or to solve large-scale problems that a single processor cannot handle.
For example, suppose Xiao Li plans to plant three trees on Arbor Day. Working alone, he would need 6 hours to finish. Instead, he invites his good friends Xiao Hong and Xiao Wang, the three of them start at the same time, and each digs a hole and plants one tree; everyone finishes in 2 hours. This is spatial parallelism in parallel algorithms: a large task is divided into several identical subtasks to speed up problem solving.
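The same divide-the-work idea can be sketched with a CPU-bound task split across worker processes. This is a minimal sketch: the function names and the equal-chunk scheme are invented for illustration, and the "fork" start method is assumed to keep the example self-contained on Unix (on Windows you would use "spawn" with an `if __name__ == "__main__":` guard).

```python
import multiprocessing
from concurrent.futures import ProcessPoolExecutor

def partial_sum(bounds):
    """The share one processor handles: sum of squares over [lo, hi)."""
    lo, hi = bounds
    return sum(i * i for i in range(lo, hi))

def parallel_sum_of_squares(n, workers=3):
    """Divide [0, n) into equal chunks, compute each in its own process,
    then combine the partial results -- like three friends, one tree each."""
    step = n // workers
    chunks = [(i * step, n if i == workers - 1 else (i + 1) * step)
              for i in range(workers)]
    ctx = multiprocessing.get_context("fork")   # Unix-only assumption
    with ProcessPoolExecutor(max_workers=workers, mp_context=ctx) as pool:
        return sum(pool.map(partial_sum, chunks))

print(parallel_sum_of_squares(9_000))
```

Each worker computes an independent chunk, so with three processors the wall-clock time for the summation itself drops to roughly a third, just as three planters finish in 2 hours instead of 6.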
Distributed computing
Broad definition
Distributed computing studies how to divide a problem that requires enormous computing power into many small parts, assign those parts to many computers for processing, and finally combine the partial results to obtain the final answer.
Recent distributed computing projects harness the idle computing power of thousands of volunteer computers around the world over the Internet: analyzing electrical signals from outer space to search for hidden black holes and for possible extraterrestrial intelligent life; searching for Mersenne primes with more than 10 million digits; and searching for more effective drugs against HIV. These projects are so large, and demand such an astonishing amount of computation, that no single computer or individual could possibly complete them in an acceptable time.
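The divide-assign-combine pattern in the broad definition can be shown with a toy scatter-gather sketch. Everything here is illustrative: `worker_node` is a plain function standing in for a remote machine (in a real system it would be an RPC call), and the word-counting problem is made up.

```python
from collections import Counter

def worker_node(documents):
    """What each remote machine would run on its own share of the data."""
    counts = Counter()
    for doc in documents:
        counts.update(doc.split())
    return counts

def distribute(documents, n_nodes):
    """Scatter the documents across nodes, then gather and merge results."""
    shards = [documents[i::n_nodes] for i in range(n_nodes)]
    partials = [worker_node(shard) for shard in shards]  # would be network calls
    total = Counter()
    for p in partials:
        total.update(p)          # combine: partial counts add up exactly
    return total

docs = ["to be or not to be", "be here now", "not now"]
print(distribute(docs, n_nodes=2))
```

The key property is that the combine step is cheap and correct regardless of how the data was split, which is what lets projects like volunteer computing farm work out to thousands of unrelated machines.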
Definition of the Chinese Academy of Sciences
Distributed computing occurs when two or more pieces of software share information with each other; these pieces of software may run on the same computer or on multiple computers connected through a network. Compared with other approaches, distributed computing has the following advantages:
1. Scarce resources can be shared.
2. The computing load can be balanced across multiple computers.
3. A program can be placed on the computer best suited to run it.
Among these, sharing scarce resources and balancing load across machines are core ideas of distributed computing.
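The load-balancing idea in point 2 can be sketched with a simple greedy scheduler: each incoming task goes to whichever machine currently has the least pending work. The machine names and task costs are invented for illustration; real schedulers also weigh data locality, machine capacity, and failures.

```python
import heapq

def balance(tasks, machines):
    """Assign each task (given as a cost) to the least-loaded machine."""
    heap = [(0, m) for m in machines]          # (current load, machine name)
    heapq.heapify(heap)
    assignment = {m: [] for m in machines}
    for cost in tasks:
        load, m = heapq.heappop(heap)          # machine with least work so far
        assignment[m].append(cost)
        heapq.heappush(heap, (load + cost, m)) # its load grows by this task
    return assignment

tasks = [5, 3, 8, 2, 4, 7]
assignment = balance(tasks, ["node-a", "node-b", "node-c"])
print(assignment)
```

Greedy least-loaded assignment keeps the per-machine totals close together, so no single computer becomes the bottleneck while others sit idle.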