
Python Redis list insertion speed is too slow

import json
import redis

# host, port, IDLE_TASKS, and args are defined elsewhere in the script
pool = redis.ConnectionPool(host=host, port=port)
client = redis.StrictRedis(connection_pool=pool)

# push the same serialized args 10000 * 30 = 300,000 times
for i in range(10000):
    for j in range(30):
        client.lpush(IDLE_TASKS, json.dumps(args))

The execution efficiency of this is terrible: the insertion takes dozens of seconds to complete.
Is there a more efficient way to handle this?

args is just a small tuple, e.g. (1, 2, "3") or something like that.

typecho · 2669 days ago

1 reply

  • 怪我咯 · 2017-06-28 09:23:58

    I have never used the redis library myself, so I can only offer a few suggestions based on the code you posted; take them for what they are worth:

    1. I don't know where your args comes from, but it does not seem to change inside the loop body, so you could move the json.dumps(args) call outside the loop and serialize only once:

    args_dump = json.dumps(args)
    for i in range(10000):
        for j in range(30):
            client.lpush(IDLE_TASKS, args_dump)

    2. Since you need to generate about 300,000 copies of the same data, you could build the data first and then push it with client.lpush in batches, because each individual call also pays a TCP round-trip cost; see the sketch below.
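    A minimal sketch of that idea, assuming the redis-py client from your snippet (host, port, IDLE_TASKS, and args are the names from your code; the 300,000 count and the chunk size of 1000 are just illustrative):

    import json
    import redis

    pool = redis.ConnectionPool(host=host, port=port)
    client = redis.StrictRedis(connection_pool=pool)

    args_dump = json.dumps(args)        # serialize once
    payload = [args_dump] * 300000      # generate all items up front

    # lpush accepts several values in one call, so push in chunks;
    # each call then costs a single network round trip for the whole chunk
    CHUNK = 1000
    for start in range(0, len(payload), CHUNK):
        client.lpush(IDLE_TASKS, *payload[start:start + CHUNK])

    A client.pipeline() achieves a similar effect by queueing commands on the client and sending them to the server in one batch.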

    3. You can use the cProfile module to find out where the time actually goes, or try another client library to implement it (you will have to google the details); a profiling sketch follows below.
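    If you go the cProfile route, a minimal sketch looks like this (it just wraps the insertion loop from your question; client, IDLE_TASKS, and args are the same names as above, and lpush.prof is an arbitrary output file name):

    import cProfile
    import pstats
    import json

    def insert_all():
        args_dump = json.dumps(args)
        for i in range(10000):
            for j in range(30):
                client.lpush(IDLE_TASKS, args_dump)

    # profile the insertion and print the 10 slowest calls by cumulative time
    cProfile.run("insert_all()", "lpush.prof")
    pstats.Stats("lpush.prof").sort_stats("cumulative").print_stats(10)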
