I wrote a Python script that deploys and distributes a project to various nodes; different command-line arguments select different projects. The problem is that I have to wait for the script to finish before I can start the next deployment. Is there any way to run this script as multiple concurrent processes that don't interfere with each other? Thanks!
黄舟2017-04-18 10:27:23
Answering my own question — it turns out the question was based on a misunderstanding. Python runs scripts through the Python interpreter, and each interpreter invocation is an independent process, so I can simply launch the script again from the client at any time and the runs won't interfere with each other. The question arose because I didn't have a clear picture of where multi-threading actually applies. Consider this question closed, and thanks for your attention!
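The point above can be sketched with the standard library: `subprocess.Popen` starts a new interpreter process and returns immediately, so several deployments can run side by side. The deploy command here is a hypothetical stand-in (a `-c` one-liner) so the sketch runs locally; in practice you would substitute the path to the real script.

```python
import subprocess
import sys

# Stand-in for the real deployment script: each invocation gets its own
# interpreter process. Replace the -c one-liner with ["deploy.py", ...]
# in a real setup.
DEPLOY_CMD = [sys.executable, "-c", "import sys; print('deploying', sys.argv[1])"]

projects = ["project-a", "project-b", "project-c"]

# Popen returns immediately, so all three deployments run concurrently.
procs = [subprocess.Popen(DEPLOY_CMD + [name]) for name in projects]

# Wait for every process to finish and collect the exit codes.
exit_codes = [p.wait() for p in procs]
print(exit_codes)  # [0, 0, 0] when all deployments succeed
```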
迷茫2017-04-18 10:27:23
You could consider using Fabric for deployment; it also supports batch deployment across multiple nodes.
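As a rough sketch of that suggestion, a Fabric 1.x fabfile might look like the following. The host names, user, and deploy commands are all hypothetical placeholders; running it requires Fabric installed and SSH access to the nodes, so treat it as a configuration sketch rather than a drop-in script.

```python
# fabfile.py — invoke with, e.g.:  fab deploy:project=project-b
from fabric.api import env, run, parallel

env.hosts = ["node1.example.com", "node2.example.com"]  # hypothetical nodes
env.user = "deploy"                                     # hypothetical SSH user

@parallel
def deploy(project="project-a"):
    # Runs on every host in env.hosts concurrently thanks to @parallel.
    # The path and restart command below are placeholders.
    run("cd /opt/%s && git pull && ./restart.sh" % project)
```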
PHP中文网2017-04-18 10:27:23
There are many options. For example, you can hand the tasks off to Celery, or restructure your script to use multiple processes/threads, passing the per-project parameters in via a set or list.