How to Implement Multiprocessing-Aware Logging in Python
Multiprocessing in Python allows multiple processes to run independently. However, access to shared resources such as log files becomes problematic when several processes try to write to the same file at once, because their output can interleave and garble the log.
To address this, the Python multiprocessing module provides a module-level, multiprocessing-aware logger (available via multiprocessing.get_logger()). It prevents log messages from being garbled by ensuring that only one process writes to a given file descriptor at a time.
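For illustration, here is a minimal sketch of that built-in support, using the standard-library helpers multiprocessing.log_to_stderr() and multiprocessing.get_logger(). It assumes a fork-based start method, so child processes inherit the configured handler; the log level and process count are arbitrary:

import logging
import multiprocessing

def worker():
    # Child processes log through the same module-level logger
    multiprocessing.get_logger().info(
        "work done in %s", multiprocessing.current_process().name)

if __name__ == "__main__":
    # Attach a stderr handler to the multiprocessing module's logger
    logger = multiprocessing.log_to_stderr()
    logger.setLevel(logging.INFO)

    processes = [multiprocessing.Process(target=worker) for _ in range(3)]
    for p in processes:
        p.start()
    for p in processes:
        p.join()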
However, existing modules within a framework typically use the standard logging package and are not multiprocessing-aware, so an alternative is needed. One approach is a custom log handler that forwards log records to the parent process through a multiprocessing queue (backed by a pipe), so that only the parent ever writes to the log file.
An implementation of this approach is provided below:
from logging.handlers import RotatingFileHandler
import multiprocessing, threading, logging, sys, traceback

class MultiProcessingLog(logging.Handler):
    def __init__(self, name, mode, maxsize, rotate):
        logging.Handler.__init__(self)

        # Set up the rotating file handler used by the parent process
        self._handler = RotatingFileHandler(name, mode, maxsize, rotate)

        # Create a queue to receive log records from child processes
        self.queue = multiprocessing.Queue(-1)

        # Start a thread in the parent process that receives and writes records
        t = threading.Thread(target=self.receive)
        t.daemon = True
        t.start()

    def receive(self):
        while True:
            try:
                # Get a log record from the queue
                record = self.queue.get()
                # Write the record using the parent process's file handler
                self._handler.emit(record)
            except (KeyboardInterrupt, SystemExit):
                raise
            except EOFError:
                # The queue has been closed; exit the receiving thread
                break
            except:
                traceback.print_exc(file=sys.stderr)

    def send(self, s):
        # Put the log record on the queue for the receiving thread
        self.queue.put_nowait(s)

    def _format_record(self, record):
        # Stringify any objects in the record so it can be pickled
        # and sent over the queue
        if record.args:
            record.msg = record.msg % record.args
            record.args = None
        if record.exc_info:
            dummy = self.format(record)
            record.exc_info = None

        return record

    def emit(self, record):
        try:
            # Format the record and send it to the parent process
            s = self._format_record(record)
            self.send(s)
        except (KeyboardInterrupt, SystemExit):
            raise
        except:
            self.handleError(record)

    def close(self):
        # Close the wrapped file handler and the handler itself
        self._handler.close()
        logging.Handler.close(self)
This custom log handler lets modules within the framework keep using standard logging practices without having to be multiprocessing-aware themselves. Log records are sent from the child processes to the parent process over the queue, and only the parent's file handler writes them, so they reach the log file intact rather than interleaved.
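A rough usage sketch follows, assuming the MultiProcessingLog class above and a fork-based start method (so child processes inherit the configured root logger); the file name, size limit and backup count are illustrative:

import logging
import multiprocessing

def worker(n):
    # Child code uses plain stdlib logging and never touches the file itself
    logging.getLogger(__name__).info("message from worker %d", n)

if __name__ == "__main__":
    # Attach the queue-backed handler to the root logger in the parent
    mp_handler = MultiProcessingLog("app.log", mode="a", maxsize=1024 * 1024, rotate=5)
    root = logging.getLogger()
    root.addHandler(mp_handler)
    root.setLevel(logging.INFO)

    processes = [multiprocessing.Process(target=worker, args=(i,)) for i in range(4)]
    for p in processes:
        p.start()
    for p in processes:
        p.join()

Because emit() only places pickle-safe records on the queue, the parent's receive thread is the only code that ever writes to app.log.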