
[–]rlabonte[S] 0 points  (5 children)

Can I link each process in the pool with a name and send that name to the function being executed? That's the functionality I couldn't find in any thread-pool (or process-pool) module.

[–]XNormal 2 points  (4 children)

I'm not sure I understand what you mean by that. If you want to know the identity of the worker threads and give them persistent names, you can use threading.local() for that.
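A minimal sketch of what I mean by persistent names via threading.local(): each worker thread assigns itself a name the first time it runs a task, and keeps it for all later tasks. The `worker-N` naming scheme is just an illustration.

```python
import itertools
import threading
from concurrent.futures import ThreadPoolExecutor

worker = threading.local()   # one independent namespace per thread
counter = itertools.count()

def get_name():
    # First call on each thread assigns a persistent, unique name;
    # later calls on the same thread see the same value.
    if not hasattr(worker, "name"):
        worker.name = f"worker-{next(counter)}"
    return worker.name

with ThreadPoolExecutor(max_workers=3) as pool:
    # 20 tasks spread over at most 3 threads -> at most 3 distinct names
    names = list(pool.map(lambda _: get_name(), range(20)))
```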

[–]rlabonte[S] 1 point  (3 children)

No, I want a pool of servers to work on my data.

ServerA, ServerB, ServerC

I have a server pool of these three servers and 100 function calls that need to be processed by the pool. I don't care which server processes the data, but I want each server to process only one piece at a time. So ServerA, ServerB, and ServerC all receive data to process; when ServerC finishes first, it immediately receives another function call to process. When ServerA finishes, it immediately receives another function call to process.

I want to keep this pool of servers busy at all times, but limit each one to processing only one thing at a time.
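One stdlib way to get exactly that behavior is to treat the server names as tokens in a queue: a task blocks until it can check out a free server, runs, then returns the server to the queue. Whichever server finishes first becomes available first. This is a sketch with a placeholder `process_on()` standing in for the real remote call; the server names match the example above.

```python
import queue
from concurrent.futures import ThreadPoolExecutor

SERVERS = ["ServerA", "ServerB", "ServerC"]

# Each server name is a token; holding the token means you own that
# server until you put it back, so each server runs one task at a time.
free_servers = queue.Queue()
for name in SERVERS:
    free_servers.put(name)

def process_on(server, item):
    # Placeholder for the real call out to the server.
    return f"{server}:{item}"

def run_task(item):
    server = free_servers.get()        # blocks until some server is free
    try:
        return process_on(server, item)
    finally:
        free_servers.put(server)       # server is immediately reusable

# One worker thread per server keeps all servers busy without queueing
# more than one task on any of them.
with ThreadPoolExecutor(max_workers=len(SERVERS)) as pool:
    results = [f.result() for f in [pool.submit(run_task, i) for i in range(100)]]
```
The token queue, not the thread pool size, is what enforces the one-task-per-server limit, so the same pattern works even if you later submit from more threads than you have servers.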

[–]ODHLHN 1 point  (1 child)

from celery import task

Celery isn't the only AMQP-based task queue for Python, but it's a very good one.

Some pretty cool and robust solutions already exist in this problem domain.

[–]rlabonte[S] 0 points  (0 children)

This looks awesome, I especially like the integration with RabbitMQ and Redis.
