nginx - Send 10,000 requests to a certain URL, will PHP crash?
If I send 10,000 requests, each containing an SQL command, to a URL almost simultaneously, so that PHP has to update 10,000 rows of data, will PHP hang? Or will the requests be handled one by one?
Do I need a queue? How much would using one help?
How many servers do you have?
It depends on the concurrency level.
First of all, 10,000 requests will not spawn 10,000 PHP-FPM processes. Check your FPM configuration: there is a hard cap on the number of worker processes (pm.max_children), and once the workers and the listen backlog are saturated, further requests are rejected and nginx returns a 502 error. That should be a familiar sight.
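For reference, a minimal php-fpm pool configuration sketch; the path and values below are illustrative and need tuning to your hardware:

```ini
; /etc/php-fpm.d/www.conf (path varies by distro)
pm = dynamic
pm.max_children = 50       ; hard cap on concurrent worker processes
pm.start_servers = 10
pm.min_spare_servers = 5
pm.max_spare_servers = 20
listen.backlog = 511       ; connections beyond this queue are refused; nginx answers 502
```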
Even if you have the resources to run 1,000 processes, PHP itself runs very fast; each request spends most of its time on initialization, and the actual logic finishes quickly. But each worker also opens a connection to MySQL. Can MySQL accept 1,000 connections at the same time? If not, those requests fail with connection/SQL errors ("Too many connections") rather than crashing anything.
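If you are unsure of the cap on your own server, you can ask MySQL directly; a small sketch (the DSN and credentials are placeholders):

```php
<?php
// Placeholder DSN/credentials -- adjust for your environment.
$pdo = new PDO('mysql:host=127.0.0.1;dbname=test', 'user', 'pass');

// MySQL's connection cap; the stock default is 151.
$row = $pdo->query("SHOW VARIABLES LIKE 'max_connections'")->fetch(PDO::FETCH_ASSOC);
echo "max_connections = {$row['Value']}\n";
```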
If both PHP and MySQL can handle the load, then there is nothing more to say: the server certainly will not be paralyzed.
Given your server resources (a single server), I recommend using a queue, preferably consumed by a single worker process, so the updates do not affect your other business too much.
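A minimal sketch of such a single-consumer setup, assuming a Redis-backed queue via the phpredis extension (the queue name, table, and columns are hypothetical):

```php
<?php
// worker.php -- run exactly ONE instance (e.g. under supervisord) so the
// updates are serialized and MySQL only ever sees this one connection.
$redis = new Redis();
$redis->connect('127.0.0.1', 6379);

$pdo  = new PDO('mysql:host=127.0.0.1;dbname=test', 'user', 'pass');
$stmt = $pdo->prepare('UPDATE items SET qty = :qty WHERE id = :id');

while (true) {
    // Block until a job arrives, waking every 5 seconds to loop again.
    $job = $redis->blPop(['update_queue'], 5);
    if (!$job) {
        continue;
    }
    $data = json_decode($job[1], true);
    $stmt->execute([':qty' => $data['qty'], ':id' => $data['id']]);
}
```

On the web side, each request would only do `$redis->rPush('update_queue', json_encode([...]))` and return immediately, so the FPM workers are freed within milliseconds.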
If you use a queue, you can optimize further: pull the data from the 10,000 requests off the queue and merge it. Can it all be updated in one statement? Or at least merged into far fewer batches (fewer than 10,000)?
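A sketch of that merging step, continuing with the same hypothetical Redis queue and `items` table: drain up to 1,000 jobs, keep only the latest value per id, and write everything back in a single upsert:

```php
<?php
$redis = new Redis();
$redis->connect('127.0.0.1', 6379);
$pdo = new PDO('mysql:host=127.0.0.1;dbname=test', 'user', 'pass');

// Drain up to 1000 jobs; later jobs for the same id overwrite earlier ones.
$batch = [];
for ($i = 0; $i < 1000; $i++) {
    $payload = $redis->lPop('update_queue');
    if ($payload === false) {
        break;                              // queue drained
    }
    $data = json_decode($payload, true);
    $batch[$data['id']] = $data['qty'];
}

if ($batch) {
    // One multi-row upsert replaces up to 1000 single-row UPDATEs.
    $placeholders = implode(',', array_fill(0, count($batch), '(?, ?)'));
    $sql = "INSERT INTO items (id, qty) VALUES $placeholders
            ON DUPLICATE KEY UPDATE qty = VALUES(qty)";
    $params = [];
    foreach ($batch as $id => $qty) {
        $params[] = $id;
        $params[] = $qty;
    }
    $pdo->prepare($sql)->execute($params);
}
```

Whether everything can collapse into one statement depends on the workload: "set to X" updates merge cleanly (last write wins), while increments would need to be summed rather than overwritten.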