
How can I run multiple subprocesses in parallel and collect their output without using multiprocessing or threading in Python?


Running Subprocesses in Parallel with Output Collection

In the given scenario, multiple cat | zgrep commands are being executed sequentially on a remote server. The goal is to run these commands concurrently and gather each one's output, without resorting to multiprocessing or threading.

A simple solution is to employ the Popen class from the subprocess module. By creating an individual Popen object for each command with shell=True, we start them all in parallel; the wait method then collects each exit code once the commands finish. Here's an example:

<code class="python">from subprocess import Popen

# Create a list of commands
commands = [f'echo {i:d}; sleep 2; echo {i:d}' for i in range(5)]

# Run commands in parallel
processes = [Popen(command, shell=True) for command in commands]

# Collect statuses
exitcodes = [p.wait() for p in processes]</code>

This code runs the five commands simultaneously and collects their exit codes once they're completed.
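To see that the commands really overlap rather than run one after another, we can time the whole batch. This is an illustrative sketch, assuming a POSIX shell where sleep is available; the ~2-second total (instead of ~10 seconds for five sequential 2-second sleeps) confirms the parallelism:

```python
import time
from subprocess import Popen

# Five shell commands that each sleep for 2 seconds
commands = [f'echo {i:d}; sleep 2; echo {i:d}' for i in range(5)]

start = time.monotonic()
processes = [Popen(command, shell=True) for command in commands]
exitcodes = [p.wait() for p in processes]
elapsed = time.monotonic() - start

# Because the five sleeps overlap, elapsed is roughly 2 seconds, not 10
```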

To capture the commands' output as well, stdout must be redirected to a pipe and read via the communicate method. One option is a thread pool (note that this does use threads under the hood, unlike the asyncio approach below):

<code class="python">from multiprocessing.dummy import Pool # thread pool, not a process pool
from subprocess import Popen, PIPE

# Run commands in parallel (commands as defined above),
# capturing stdout via a pipe
processes = [Popen(command, shell=True, stdout=PIPE, close_fds=True)
             for command in commands]

# Collect output in parallel
def get_output(process):
    return process.communicate()[0]

outputs = Pool(len(processes)).map(get_output, processes)</code>

This code runs all commands concurrently in a thread pool and gathers their output into a list, where each element corresponds to an individual command's output.
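Note that communicate() returns bytes, so each element usually needs decoding before use. A minimal sketch, assuming a POSIX shell (the line-{i} commands here are just placeholders for illustration):

```python
from multiprocessing.dummy import Pool  # thread pool
from subprocess import Popen, PIPE

commands = [f'echo line-{i}' for i in range(3)]
processes = [Popen(c, shell=True, stdout=PIPE) for c in commands]

# Pool.map preserves order, so outputs line up with commands
raw_outputs = Pool(len(processes)).map(lambda p: p.communicate()[0], processes)

# Decode each command's bytes output to text
texts = [out.decode().strip() for out in raw_outputs]
# texts == ['line-0', 'line-1', 'line-2']
```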

Another alternative is the asyncio module (Python 3.7 and above, where asyncio.run is available), which collects all output concurrently in a single thread, with no multiprocessing or threading at all:

<code class="python">import asyncio

async def get_output(command):
    process = await asyncio.create_subprocess_shell(
        command, stdout=asyncio.subprocess.PIPE)
    stdout, _ = await process.communicate()
    return stdout.decode()

async def main():
    # Get commands output in parallel
    coros = [get_output(command) for command in commands]
    return await asyncio.gather(*coros)

outputs = asyncio.run(main())</code>

This code creates coroutines that execute the commands concurrently in a single thread and gathers their outputs into a list, in the same order as the commands.

The above is the detailed content of How can I run multiple subprocesses in parallel and collect their output without using multiprocessing or threading in Python?. For more information, please follow other related articles on the PHP Chinese website!
