
Concurrency model - Why does a process pool wrapped in a Python decorator neither take effect nor report an error?

I want to encapsulate a process pool in a decorator, but it neither takes effect nor reports an error.

# coding:utf-8
import multiprocessing
import tornado
from tornado.httpclient import AsyncHTTPClient

process_num = 20  # number of processes
url = "https://www.baidu.com"

def handle_request(response):
    print str(response)

def run_in_process(process_num):
    def _run_in_process(f):
        def __run_in_process(*args, **kwargs):
            pool = multiprocessing.Pool(processes=process_num)
            for i in range(process_num):
                pool.apply_async(f, args=args, kwds=kwargs, callback=kwargs.get("callback"))
            pool.close()
            pool.join()

        return __run_in_process

    return _run_in_process


@run_in_process(process_num)
def main():
    http_client = AsyncHTTPClient()
    http_client.fetch(url, callback=handle_request)
    global loop
    loop = tornado.ioloop.IOLoop.instance()
    if loop._running is False:
        loop.start()


if __name__ == '__main__':
    main()

The output is as follows:

/usr/bin/python2.7 /home/xxx/workspace/py_project/crawler/center/sample.py

Process finished with exit code 0
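[Editor's note] The silent exit above is consistent with how `Pool.apply_async` behaves: exceptions raised while submitting or running a task are not printed anywhere; they only surface when `.get()` is called on the returned `AsyncResult`. A minimal reproduction (hypothetical names, not the asker's code) using a nested, and therefore unpicklable, function:

```python
import multiprocessing

def make_worker():
    def worker(x):  # nested function: cannot be pickled by reference
        return x * 2
    return worker

def demo():
    pool = multiprocessing.Pool(processes=2)
    # Submission appears to succeed; the pickling failure is deferred.
    result = pool.apply_async(make_worker(), args=(1,))
    pool.close()
    pool.join()
    try:
        result.get(timeout=5)  # the hidden exception surfaces here
        return None
    except Exception as exc:
        return exc

if __name__ == "__main__":
    print("apply_async failed with:", demo())
```

Because the question's decorator never calls `.get()`, the pool fails to ship the task, no callback ever fires, and `close()`/`join()` return normally with exit code 0.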

Strangely, when I rewrote it to spawn the processes directly, it does work:

# coding:utf-8
import multiprocessing
import tornado
from tornado.httpclient import AsyncHTTPClient

process_num = 20  # number of processes
url = "https://www.baidu.com"

def handle_request(response):
    print str(response)

def run_in_process(process_num):
    def _run_in_process(f):
        def __run_in_process(*args, **kwargs):
            _processes = []
            for i in xrange(process_num):
                p = multiprocessing.Process(target=f, args=args, kwargs=kwargs)
                p.start()
                _processes.append(p)

            for p in _processes:
                p.join()

        return __run_in_process
    return _run_in_process


@run_in_process(process_num)
def main():
    http_client = AsyncHTTPClient()
    http_client.fetch(url, callback=handle_request)
    global loop
    loop = tornado.ioloop.IOLoop.instance()
    if loop._running is False:
        loop.start()


if __name__ == '__main__':
    main()

The log is as follows

/usr/bin/python2.7 /home/shufeng/workspace/private_project/jobscrawler/center/sample.py
HTTPResponse(_body=None,buffer=<_io.BytesIO object at 0x7f2fdaa21ef0>,code=200,effective_url='http://www.baidu.com',error=None,headers=<tornado.httputil.HTTPHeaders object at 0x7f2fdaa425d0>,reason='OK',request=<tornado.httpclient.HTTPRequest object at 0x7f2fdaa42250>,request_time=0.014312028884887695,time_info={})
HTTPResponse(_body=None,buffer=<_io.BytesIO object at 0x7f2fdaa21ef0>,code=200,effective_url='http://www.baidu.com',error=None,headers=<tornado.httputil.HTTPHeaders object at 0x7f2fdaa43450>,reason='OK',request=<tornado.httpclient.HTTPRequest object at 0x7f2fdaa430d0>,request_time=0.02327895164489746,time_info={})
HTTPResponse(_body=None,buffer=<_io.BytesIO object at 0x7f2fdaa21ef0>,code=200,effective_url='http://www.baidu.com',error=None,headers=<tornado.httputil.HTTPHeaders object at 0x7f2fdaa43510>,reason='OK',request=<tornado.httpclient.HTTPRequest object at 0x7f2fdaa43190>,request_time=0.026951074600219727,time_info={})
HTTPResponse(_body=None,buffer=<_io.BytesIO object at 0x7f2fdaa21ef0>,code=200,effective_url='http://www.baidu.com',error=None,headers=<tornado.httputil.HTTPHeaders object at 0x7f2fdaa42690>,reason='OK',request=<tornado.httpclient.HTTPRequest object at 0x7f2fdaa42310>,request_time=0.0552978515625,time_info={})
HTTPResponse(_body=None,buffer=<_io.BytesIO object at 0x7f2fdaa24ef0>,code=200,effective_url='http://www.baidu.com',error=None,headers=<tornado.httputil.HTTPHeaders object at 0x7f2fdaa39e10>,reason='OK',request=<tornado.httpclient.HTTPRequest object at 0x7f2fdaa39a90>,request_time=0.05612993240356445,time_info={})
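[Editor's note] A likely reason the second version behaves differently: on Unix, `multiprocessing.Process` starts children with `fork`, which copies the parent's memory rather than pickling the `target`, so even a closure runs fine. `Pool`, by contrast, always pickles each task to send it through a queue to already-running workers. A sketch with hypothetical names (requires the `fork` start method, the default on Linux):

```python
import multiprocessing

def make_worker(q):
    def worker():
        # A nested function is fine here: fork copies it, no pickling needed.
        q.put("ran")
    return worker

def demo():
    q = multiprocessing.Queue()
    p = multiprocessing.Process(target=make_worker(q))
    p.start()
    p.join()
    return q.get()

if __name__ == "__main__":
    print(demo())
```

On platforms that default to the `spawn` start method (e.g. Windows, macOS on Python 3.8+), this sketch would fail with the same pickling error as the pool version.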

The same thing happens with thread pools and coroutines. Does anyone know what is going on?

仅有的幸福 · asked 2669 days ago

2 replies

  • 大家讲道理 · 2017-06-28 09:26:20

    Zhihu Spirit Sword Master has answered this question: https://www.zhihu.com/questio...

  • 巴扎黑 · 2017-06-28 09:26:20

    When running under Linux, you will get the following error:

    PicklingError: Can't pickle <type 'function'>: attribute lookup __builtin__.function failed

    This error is raised when an object that cannot be pickled is passed into the process pool — in this case, an instance method. You can try running it under Python 3, since instance methods there support serialization.

    Reference: https://virusdefender.net/ind...
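    [Editor's note] A further detail that likely explains the quoted error: the decorator rebinds the name `main` to the wrapper, so when `Pool` pickles the captured function by qualified name, the lookup of `__main__.main` no longer matches the original object. One workaround, sketched with hypothetical names, is to keep the worker at module level under a name the decorator does not shadow:

```python
import multiprocessing

def worker(url):
    # Top-level function under its own, unshadowed name: picklable by reference.
    return "fetched %s" % url

def run_in_process(process_num):
    def deco(f):
        def wrapper(*args, **kwargs):
            pool = multiprocessing.Pool(processes=process_num)
            results = [pool.apply_async(f, args=args, kwds=kwargs)
                       for _ in range(process_num)]
            pool.close()
            pool.join()
            # .get() re-raises any worker-side (or pickling) exception.
            return [r.get() for r in results]
        return wrapper
    return deco

# Bind the wrapper to a *different* name: using `@run_in_process(2)` directly
# on `worker` would rebind `worker` and reintroduce the pickling failure.
run_workers = run_in_process(2)(worker)

if __name__ == "__main__":
    print(run_workers("https://www.baidu.com"))
```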
