
Concurrency model – Why does wrapping a process pool in a decorator in Python have no effect and report no error?

I want to encapsulate a process pool in a decorator, but it neither takes effect nor raises an error.

# coding:utf-8
import multiprocessing
import tornado
from tornado.httpclient import AsyncHTTPClient

process_num = 20  # number of processes
url = "https://www.baidu.com"

def handle_request(response):
    print str(response)

def run_in_process(process_num):
    def _run_in_process(f):
        def __run_in_process(*args, **kwargs):
            pool = multiprocessing.Pool(processes=process_num)
            for i in range(process_num):
                pool.apply_async(f, args=args, kwds=kwargs, callback=kwargs.get("callback"))
            pool.close()
            pool.join()

        return __run_in_process

    return _run_in_process


@run_in_process(process_num)
def main():
    http_client = AsyncHTTPClient()
    http_client.fetch(url, callback=handle_request)
    global loop
    loop = tornado.ioloop.IOLoop.instance()
    if loop._running is False:
        loop.start()


if __name__ == '__main__':
    main()

The output is as follows:

/usr/bin/python2.7 /home/xxx/workspace/py_project/crawler/center/sample.py

Process finished with exit code 0

But strangely, when I rewrote it to start the processes directly with multiprocessing.Process, it does take effect:

# coding:utf-8
import multiprocessing
import tornado
from tornado.httpclient import AsyncHTTPClient

process_num = 20  # number of processes
url = "https://www.baidu.com"

def handle_request(response):
    print str(response)

def run_in_process(process_num):
    def _run_in_process(f):
        def __run_in_process(*args, **kwargs):
            _processes = []
            for i in xrange(process_num):
                p = multiprocessing.Process(target=f, args=args, kwargs=kwargs)
                p.start()
                _processes.append(p)

            for p in _processes:
                p.join()

        return __run_in_process
    return _run_in_process


@run_in_process(process_num)
def main():
    http_client = AsyncHTTPClient()
    http_client.fetch(url, callback=handle_request)
    global loop
    loop = tornado.ioloop.IOLoop.instance()
    if loop._running is False:
        loop.start()


if __name__ == '__main__':
    main()

The log is as follows:

/usr/bin/python2.7 /home/shufeng/workspace/private_project/jobscrawler/center/sample.py
HTTPResponse(_body=None,buffer=<_io.BytesIO object at 0x7f2fdaa21ef0>,code=200,effective_url='http://www.baidu.com',error=None,headers=<tornado.httputil.HTTPHeaders object at 0x7f2fdaa425d0>,reason='OK',request=<tornado.httpclient.HTTPRequest object at 0x7f2fdaa42250>,request_time=0.014312028884887695,time_info={})
HTTPResponse(_body=None,buffer=<_io.BytesIO object at 0x7f2fdaa21ef0>,code=200,effective_url='http://www.baidu.com',error=None,headers=<tornado.httputil.HTTPHeaders object at 0x7f2fdaa43450>,reason='OK',request=<tornado.httpclient.HTTPRequest object at 0x7f2fdaa430d0>,request_time=0.02327895164489746,time_info={})
HTTPResponse(_body=None,buffer=<_io.BytesIO object at 0x7f2fdaa21ef0>,code=200,effective_url='http://www.baidu.com',error=None,headers=<tornado.httputil.HTTPHeaders object at 0x7f2fdaa43510>,reason='OK',request=<tornado.httpclient.HTTPRequest object at 0x7f2fdaa43190>,request_time=0.026951074600219727,time_info={})
HTTPResponse(_body=None,buffer=<_io.BytesIO object at 0x7f2fdaa21ef0>,code=200,effective_url='http://www.baidu.com',error=None,headers=<tornado.httputil.HTTPHeaders object at 0x7f2fdaa42690>,reason='OK',request=<tornado.httpclient.HTTPRequest object at 0x7f2fdaa42310>,request_time=0.0552978515625,time_info={})
HTTPResponse(_body=None,buffer=<_io.BytesIO object at 0x7f2fdaa24ef0>,code=200,effective_url='http://www.baidu.com',error=None,headers=<tornado.httputil.HTTPHeaders object at 0x7f2fdaa39e10>,reason='OK',request=<tornado.httpclient.HTTPRequest object at 0x7f2fdaa39a90>,request_time=0.05612993240356445,time_info={})

The same thing happens when I use a thread pool or coroutines. Does anyone know what is going on?

仅有的幸福 · asked 2669 days ago

Answers (2)

  • 大家讲道理

    2017-06-28 09:26:20

    Zhihu user 灵剑 has already answered this question: https://www.zhihu.com/questio...

  • 巴扎黑

    2017-06-28 09:26:20

    Running this on Linux produces the following error:

    PicklingError: Can't pickle <type 'function'>: attribute lookup __builtin__.function failed

    The error is raised because an object that cannot be serialized is being passed into the process pool — in this case the method object. Try running it under Python 3: in Python 3, instance methods can be pickled.

    Reference: https://virusdefender.net/ind...

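The silent exit and the pickling diagnosis can both be reproduced with a minimal sketch (Python 3; the names `run_pool`, `doubled`, `make_nested`, and `Fetcher` are illustrative, not from the question). `Pool.apply_async` pickles the task in a background thread, and a pickling failure only surfaces when `.get()` is called on the returned `AsyncResult` — which the decorator never does, so the program exits with code 0. The sketch also checks the second answer's claim that bound methods pickle fine under Python 3:

```python
import multiprocessing
import pickle

def doubled(x):  # module-level function: picklable, so the pool can run it
    return x * 2

def make_nested():
    def nested(x):  # local function: cannot be pickled by the stdlib pickle
        return x * 2
    return nested

def run_pool(func, arg):
    """Submit one task and report the result or the otherwise-hidden error."""
    # The "fork" context mirrors the asker's Linux environment.
    pool = multiprocessing.get_context("fork").Pool(processes=2)
    result = pool.apply_async(func, args=(arg,))
    pool.close()
    pool.join()
    try:
        # .get() is the only place where a pickling failure surfaces;
        # without it the program simply exits with code 0, as in the question.
        return "ok: %s" % result.get(timeout=10)
    except Exception as exc:
        return "failed: %s" % exc

class Fetcher:
    def fetch(self):
        return "done"

if __name__ == "__main__":
    print(run_pool(doubled, 21))        # ok: 42
    print(run_pool(make_nested(), 21))  # failed: pickling error surfaces here
    # In Python 3 (unlike Python 2) bound methods pickle fine:
    print(pickle.loads(pickle.dumps(Fetcher().fetch))())
```

This is why the `multiprocessing.Process` version in the question works: with `Process` on a fork-based platform the target function is inherited by the child via fork and never pickled, whereas `Pool` always sends tasks through a pickled queue.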