Concurrent vs. Parallel Requests in FastAPI
In FastAPI, a common point of confusion is why async def endpoints don't always handle requests in parallel. This is not a quirk of FastAPI's design; it follows directly from how asynchronous programming works.
When you declare endpoints with plain def, FastAPI runs them in a threadpool on separate threads, so multiple requests can be handled concurrently. Endpoints declared with async def, by contrast, run directly on the event loop: they are processed concurrently, but not in parallel, because the event loop runs on a single thread and only makes progress on other requests while the current one awaits an asynchronous I/O operation.
Synchronous vs. Asynchronous Code in FastAPI
FastAPI supports asynchronous code through async def, which allows a handler to pass control back to the event loop with await. This enables non-blocking operations, such as waiting for data from a client or for a database response. However, if a synchronous blocking call such as time.sleep() is used inside an async def endpoint, it blocks the event loop (and with it the entire server), so requests are processed sequentially.
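The difference can be demonstrated with a minimal asyncio sketch (no FastAPI required); the function names `good` and `bad` are illustrative, not part of any API. Five concurrent tasks that await asyncio.sleep() finish in roughly the time of one, while five tasks that call time.sleep() block the loop and run one after another:

```python
import asyncio
import time

async def good(delay: float) -> None:
    # await hands control back to the event loop, so other tasks run meanwhile
    await asyncio.sleep(delay)

async def bad(delay: float) -> None:
    # time.sleep() blocks the whole event loop; nothing else runs meanwhile
    time.sleep(delay)

async def timed(coro_factory, delay: float, n: int) -> float:
    start = time.perf_counter()
    await asyncio.gather(*(coro_factory(delay) for _ in range(n)))
    return time.perf_counter() - start

# Five simulated "requests", each sleeping 0.1 s:
concurrent = asyncio.run(timed(good, 0.1, 5))  # ~0.1 s total: sleeps overlap
sequential = asyncio.run(timed(bad, 0.1, 5))   # ~0.5 s total: loop is blocked
```

This mirrors what happens inside a FastAPI worker: an awaited operation lets other requests proceed, while a blocking call stalls every request behind it.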
External Threadpool
To ensure that blocking tasks don't hinder the event loop, FastAPI uses an external threadpool, which runs tasks defined with def on separate threads and awaits them before resuming event loop execution. This approach achieves concurrency for def endpoints, even though it's not true parallelization.
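FastAPI's threadpool comes from Starlette/AnyIO, but the same offloading pattern can be sketched with the standard library's asyncio.to_thread(): the blocking function runs on a worker thread and is awaited, leaving the event loop free. The names `blocking_work` and `main` below are illustrative:

```python
import asyncio
import time

def blocking_work(delay: float) -> str:
    # A synchronous, blocking task -- like the body of a `def` endpoint
    time.sleep(delay)
    return "done"

async def main() -> float:
    start = time.perf_counter()
    # asyncio.to_thread() offloads each call to a worker thread and awaits
    # the result, so the event loop keeps serving other requests meanwhile.
    results = await asyncio.gather(
        *(asyncio.to_thread(blocking_work, 0.1) for _ in range(5))
    )
    assert results == ["done"] * 5
    return time.perf_counter() - start

elapsed = asyncio.run(main())  # ~0.1 s: the five blocking calls overlap
```

The calls overlap on separate threads, so the total time is close to a single call's duration rather than the sum, which is exactly the concurrency FastAPI provides for def endpoints.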
Best Practices
- Use async def only when the endpoint body awaits asynchronous I/O (for example, an async database driver or HTTP client).
- Put blocking calls such as time.sleep() or synchronous database queries in def endpoints, so FastAPI offloads them to the threadpool.
- Inside async def endpoints, prefer non-blocking equivalents, such as asyncio.sleep() instead of time.sleep().