
How Python's distributed task queue huey implements asynchronous tasks

In this article we share huey, a lightweight Python task queue that lets you run tasks asynchronously and distribute them across workers. Interested readers, take a look.

huey is a lightweight task queue. Its features and broker support are not as extensive as celery's; it focuses on staying lightweight, and the code is relatively easy to read.


Introduction to huey (lighter than celery, easier to use than mrq and rq!):

a lightweight alternative.

Written in python

No deps outside stdlib, except redis (or roll your own backend)

Support for django

supports:

Multi-threaded task execution

Scheduled execution at a given time

Periodic execution, like a crontab

Retrying tasks that fail

Task result storage


Installation:

huey can be installed very easily using pip.
 
pip install huey
huey has no dependencies outside the standard library, but currently the only fully-implemented queue backend it ships with requires redis. To use the redis backend, you will need to install the python client.
 
pip install redis
Using git
If you want to run the very latest, feel free to pull down the repo from github and install by hand.
 
git clone https://github.com/coleifer/huey.git
cd huey
python setup.py install
You can run the tests using the test-runner:
 
python setup.py test
The code is as follows
from huey import RedisHuey, crontab
 
huey = RedisHuey('my-app', host='redis.myapp.com')
 
@huey.task()
def add_numbers(a, b):
    return a + b
 
@huey.periodic_task(crontab(minute='0', hour='3'))
def nightly_backup():
    sync_all_data()
huey's API and its parameters are introduced in detail below.




When huey runs as a worker, it accepts some CLI parameters.


Commonly used ones are:

-l  The log file to write execution logs to.

-w, --workers  The number of worker threads; raising it increases how many tasks can be processed concurrently.

-p, --periodic  When the huey worker starts, it looks through tasks.py for tasks that need crontab scheduling and spins up threads to handle them.

-n  Do not run the periodic tasks defined by crontab; periodic tasks then execute only when you trigger them yourself.

--threads  Self-explanatory: the number of threads.

The code is as follows

# Original text:
The following table lists the options available for the consumer as well as their default values.
 
-l, --logfile
Path to file used for logging. When a file is specified, by default Huey will use a rotating file handler (1MB / chunk) with a maximum of 3 backups. You can attach your own handler (huey.logger) as well. The default loglevel is INFO.
-v, --verbose
Verbose logging (equates to DEBUG level). If no logfile is specified and verbose is set, then the consumer will log to the console. This is very useful for testing/debugging.
-q, --quiet
Only log errors. The default loglevel for the consumer is INFO.
-w, --workers
Number of worker threads, the default is 1 thread but for applications that have many I/O bound tasks, increasing this number may lead to greater throughput.
-p, --periodic
Indicate that this consumer process should start a thread dedicated to enqueueing “periodic” tasks (crontab-like functionality). This defaults to True, so should not need to be specified in practice.
-n, --no-periodic
Indicate that this consumer process should not enqueue periodic tasks.
-d, --delay
When using a “polling”-type queue backend, the amount of time to wait between polling the backend. Default is 0.1 seconds.
-m, --max-delay
The maximum amount of time to wait between polling, if using weighted backoff. Default is 10 seconds.
-b, --backoff
The amount to back-off when polling for results. Must be greater than one. Default is 1.15.
-u, --utc
Indicates that the consumer should use UTC time for all tasks, crontabs and scheduling. Default is True, so in practice you should not need to specify this option.
--localtime
Indicates that the consumer should use localtime for all tasks, crontabs and scheduling. Default is False.
Examples
 
Running the consumer with 8 threads, a logfile for errors only, and a very short polling interval:
 
huey_consumer.py my.app.huey -l /var/log/app.huey.log -w 8 -b 1.1 -m 1.0


The task queue huey relies on redis for queue and task storage, so we need to install redis-server and redis-py in advance. I won't go into the installation method; look it up yourself. First we create a huey instance:
The code is as follows
# config.py
from huey import Huey
from huey.backends.redis_backend import RedisBlockingQueue

queue = RedisBlockingQueue('test-queue', host='localhost', port=6379)
huey = Huey(queue)


Then come the tasks, i.e. the functions you want to put into the task queue. Like celery, rq, and mrq, huey keeps them in tasks.py.

The code is as follows
from config import huey  # import the huey we instantiated in config.py


@huey.task()
def count_beans(num):
    print '-- counted %s beans --' % num

The code is as follows
main.py
from config import huey  # import our "huey" object
from tasks import count_beans  # import our task
 
 
if __name__ == '__main__':
    beans = raw_input('How many beans? ')
    count_beans(int(beans))
    print 'Enqueued job to count %s beans' % beans


Ensure you have Redis running locally

Ensure you have installed huey

Start the consumer: huey_consumer.py main.huey (notice this is “main.huey” and not “config.huey”).

Run the main program: python main.py


Here is the part that actually executes. main.py acts as the producer and tasks.py as the consumer; main.py is responsible for feeding data in.
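The producer/consumer split described above can be sketched with nothing but the standard library. This is only a simplified illustration of the pattern huey implements, using an in-process queue.Queue instead of redis; the names here are hypothetical, not huey's:

```python
import threading
import queue

task_queue = queue.Queue()
results = []

def worker():
    # consumer side: pull (func, args) pairs off the queue and execute them
    while True:
        func, args = task_queue.get()
        if func is None:              # sentinel used to shut the worker down
            break
        results.append(func(*args))
        task_queue.task_done()

def count_beans(num):
    return 'Counted %s beans' % num

# start one worker thread (huey's consumer starts -w of these)
t = threading.Thread(target=worker)
t.start()

# producer side: enqueue a task, like calling count_beans() in main.py
task_queue.put((count_beans, (100,)))
task_queue.put((None, None))          # stop the worker
t.join()

print(results[0])  # -> Counted 100 beans
```

With redis in between, the producer and consumer can of course live in different processes or on different machines, which is what makes huey "distributed".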


Like celery and rq, to obtain results you need to specify a result store in your config.py or main code. Currently huey only supports redis, but given its focus and size, that is enough! It takes just a few lines: import the RedisDataStore class and declare the storage address.
The code is as follows
from huey import Huey
from huey.backends.redis_backend import RedisBlockingQueue
from huey.backends.redis_backend import RedisDataStore  # ADD THIS LINE

queue = RedisBlockingQueue('test-queue', host='localhost', port=6379)
result_store = RedisDataStore('results', host='localhost', port=6379)  # ADDED

huey = Huey(queue, result_store=result_store)  # ADDED result store
Now, if we try again in ipython, we find that we can get the return value from tasks.py. When you fetch it in main.py, it is actually retrieved from redis by its uuid.
The code is as follows
>>> from main import count_beans
>>> res = count_beans(100)
>>> res  # what is "res" ?

>>> res.get()  # get the result of this task
'Counted 100 beans'
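To illustrate the "retrieved from redis by its uuid" point: when a task is enqueued it is assigned a uuid, the worker stores the return value under that uuid, and res.get() looks it up by the same key. A minimal in-memory sketch of the idea, where a plain dict stands in for redis and all names are hypothetical rather than huey's actual code:

```python
import uuid

result_store = {}  # stands in for the redis store huey writes results to

class AsyncResult(object):
    def __init__(self, task_id):
        self.task_id = task_id

    def get(self):
        # look the result up by uuid; None while the worker hasn't run yet
        return result_store.get(self.task_id)

def enqueue(func, *args):
    task_id = str(uuid.uuid4())   # every enqueued task gets a unique id
    # a real worker would run this later; we run it inline for the demo
    result_store[task_id] = func(*args)
    return AsyncResult(task_id)

def count_beans(num):
    return 'Counted %s beans' % num

res = enqueue(count_beans, 100)
print(res.get())  # -> Counted 100 beans
```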




huey also supports celery's delayed execution and crontab features. These are very important: priorities can be customized, and there is no need to rely on Linux's own crontab.


Usage is very simple: just pass an extra delay time. Looking at huey's source code, tasks are executed immediately by default; of course, that also depends on whether all of your threads are tied up with pending tasks.

The code is as follows
>>> import datetime
>>> res = count_beans.schedule(args=(100,), delay=60)
>>> res

>>> res.get()  # this returns None, no data is ready
>>> res.get()  # still no data...
>>> res.get(blocking=True)  # ok, let's just block until its ready
'Counted 100 beans'
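Under the hood, schedule(args=..., delay=60) amounts to computing an ETA (now + delay) and having the consumer hold the task until that time arrives. A tiny in-memory sketch of that scheduling idea, hypothetical rather than huey's actual implementation:

```python
import heapq
import time

schedule_heap = []  # (eta, func, args) tuples, earliest ETA first

def schedule(func, args, delay=0):
    eta = time.time() + delay           # run `delay` seconds from now
    heapq.heappush(schedule_heap, (eta, func, args))

def run_due_tasks():
    # the consumer's polling loop: pop and run tasks whose ETA has passed
    done = []
    while schedule_heap and schedule_heap[0][0] <= time.time():
        eta, func, args = heapq.heappop(schedule_heap)
        done.append(func(*args))
    return done

def count_beans(num):
    return 'Counted %s beans' % num

schedule(count_beans, (100,), delay=0.2)
print(run_due_tasks())  # -> [] -- not due yet, like res.get() returning None
time.sleep(0.3)
print(run_due_tasks())  # -> ['Counted 100 beans'] -- the ETA has passed
```

The -d/--delay and -b/--backoff consumer options listed above control how often this kind of polling loop wakes up.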



Here is an introduction to retry. huey also supports retries, which is very practical. If you have read my earlier article introducing celery's retry mechanism, you will recognize what huey does: it wraps the task function in a decorator, and that decorator contains the try/except retry logic.

The code is as follows

# tasks.py
from datetime import datetime

from config import huey

@huey.task(retries=3, retry_delay=10)
def try_thrice():
    print 'trying....%s' % datetime.now()
    raise Exception('nope')
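The decorator-based retry mechanism described above can be sketched in a few lines. This is a simplified stand-in (it skips the retry_delay sleep between attempts) and is not huey's actual code:

```python
import functools

def task_with_retries(retries=3):
    """Re-run the wrapped function if it raises, up to `retries` extra times."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            attempts = retries + 1            # the first try plus the retries
            for attempt in range(attempts):
                try:
                    return func(*args, **kwargs)
                except Exception:
                    if attempt == attempts - 1:
                        raise                 # out of retries, propagate
                    # a real consumer would also sleep(retry_delay) here
        return wrapper
    return decorator

calls = []

@task_with_retries(retries=3)
def flaky():
    calls.append(1)
    if len(calls) < 3:
        raise Exception('nope')
    return 'ok after %d attempts' % len(calls)

print(flaky())  # -> ok after 3 attempts
```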






huey also gives you a chance to change your mind: after you enqueue a delayed or scheduled task, you can cancel it by calling revoke().

The code is as follows
# count some beans
res = count_beans(10000000)
res.revoke()

The same applies to tasks that are scheduled in the future:

res = count_beans.schedule(args=(100000,), eta=in_the_future)
res.revoke()
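One way to picture how revoke() can work: the producer records the task's id in a shared set of revoked ids, and the consumer checks that set just before executing. A minimal in-memory sketch; in reality huey keeps these flags in its backend, and these names are hypothetical:

```python
import uuid

revoked = set()   # stands in for the revoke flags huey keeps in its backend
executed = []

class Task(object):
    def __init__(self, func, args):
        self.task_id = str(uuid.uuid4())
        self.func = func
        self.args = args

    def revoke(self):
        revoked.add(self.task_id)   # just mark it; nothing leaves the queue

    def run(self):
        # the consumer side: skip tasks whose id has been revoked
        if self.task_id in revoked:
            return None
        executed.append(self.task_id)
        return self.func(*self.args)

def count_beans(num):
    return 'Counted %s beans' % num

task = Task(count_beans, (10000000,))
task.revoke()
print(task.run())  # -> None, the task was skipped
print(executed)    # -> []
```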
A periodic task that runs every minute:

@huey.periodic_task(crontab(minute='*'))
def print_time():
    print datetime.now()

A quick summary of huey's API:

task() - a transparent decorator that turns your function into a task.
periodic_task() - declares a periodic task.
crontab() - the crontab schedule attached to a periodic task when starting the worker.
BaseQueue - the task queue.
BaseDataStore - where results are stuffed after a task executes; BaseDataStore can be subclassed and rewritten by yourself.

The official huey git repository provides relevant test code: main.py
The code is as follows
from config import huey
from tasks import count_beans


if __name__ == '__main__':
    beans = raw_input('How many beans? ')
    count_beans(int(beans))
    print('Enqueued job to count %s beans' % beans)




tasks.py

The code is as follows

import random
import time
from huey import crontab

from config import huey
 
 
@huey.task()
def count_beans(num):
    print "start..."
    print('-- counted %s beans --' % num)
    time.sleep(3)
    print "end..."
    return 'Counted %s beans' % num
 
@huey.periodic_task(crontab(minute='*/5'))
def every_five_mins():
    print('Consumer prints this every 5 mins')
 
@huey.task(retries=3, retry_delay=10)
def try_thrice():
    if random.randint(1, 3) == 1:
        print('OK')
    else:
        print('About to fail, will retry in 10 seconds')
        raise Exception('Crap something went wrong')
 
@huey.task()
def slow(n):
    time.sleep(n)
    print('slept %s' % n)
 
run.sh

The code is as follows
#!/bin/bash
echo "HUEY CONSUMER"
echo "-------------"
echo "In another terminal, run 'python main.py'"
echo "Stop the consumer using Ctrl+C"
PYTHONPATH=.:$PYTHONPATH
python ../../huey/bin/huey_consumer.py main.huey --threads=2

The code is as follows
[xiaorui@devops /tmp ]$ git clone https://github.com/coleifer/huey.git
Cloning into 'huey'...
remote: Counting objects: 1423, done.
remote: Compressing objects: 100% (9/9), done.
Receiving objects:  34% (497/1423), 388.00 KiB | 29.00 KiB/s   KiB/s
 
Receiving objects:  34% (498/1423), 628.00 KiB | 22.00 KiB/s
 
 
remote: Total 1423 (delta 0), reused 0 (delta 0)
Receiving objects: 100% (1423/1423), 2.24 MiB | 29.00 KiB/s, done.
Resolving deltas: 100% (729/729), done.
Checking connectivity... done.
[xiaorui@devops /tmp ]$cd huey/examples/simple
[xiaorui@devops simple (master)]$ ll
total 40
-rw-r--r--  1 xiaorui  wheel    79B  9  8 08:49 README
-rw-r--r--  1 xiaorui  wheel     0B  9  8 08:49 __init__.py
-rw-r--r--  1 xiaorui  wheel    56B  9  8 08:49 config.py
-rwxr-xr-x  1 xiaorui  wheel   227B  9  8 08:49 cons.sh
-rw-r--r--  1 xiaorui  wheel   205B  9  8 08:49 main.py
-rw-r--r--  1 xiaorui  wheel   607B  9  8 08:49 tasks.py
[xiaorui@devops simple (master)]$
We can clone huey's code repository first, as shown above. It contains an examples directory, and you can see that huey supports django, but that is not the point!


