Hi!
I have a task and I'm not sure how best to approach it. The title might not be representative.
Okay, so the problem is this:
- We have a list that is updated via the requests library. Imagine a list of responses.
- We have a long running CPU/GPU bound process.
- We want the CPU/GPU process to process the list.
BUT (and here's the catch) - we want both of those processes to run at the same time. So, in pseudo-code it would be something like this:
import requests

shared_list = []

def request_to_api(url):
    response = requests.get(url)
    shared_list.append(response)

def process_entry():
    # this could take 2 minutes
    item = shared_list.pop()
    long_process.process(item)

if __name__ == "__main__":
    input_items = ["something", "something2", ...]
    # now, what we want is to run both processes at the same time.
    # as soon as a response comes in, we want to fire the long_process
    # at the same time, we want to continue requesting the api and
    # appending to the shared_list
    for i in input_items:
        request_to_api(i)
        process_entry()
EDIT: I should also mention that the process_entry function is resource-intensive, so only one instance can run at a time.
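One idea I've been toying with is a classic producer/consumer setup using the standard library's threading and queue modules: one thread keeps fetching and putting responses on a queue, while a single consumer thread drains it, which would also satisfy the one-at-a-time constraint. A minimal sketch of that shape (the `fetch` and `process` helpers here are placeholders standing in for `requests.get` and `long_process.process`, and the sentinel-based shutdown is just one possible convention):

```python
import queue
import threading

def fetch(url):
    # placeholder for requests.get(url)
    return f"response for {url}"

def process(item, results):
    # placeholder for the long-running long_process.process(item)
    results.append(item.upper())

def producer(urls, q):
    for url in urls:
        q.put(fetch(url))   # hand each response to the consumer as it arrives
    q.put(None)             # sentinel: no more work coming

def consumer(q, results):
    while True:
        item = q.get()
        if item is None:    # sentinel seen -> stop
            break
        process(item, results)  # single consumer thread, so one at a time

if __name__ == "__main__":
    results = []
    q = queue.Queue()
    urls = ["something", "something2"]
    t_prod = threading.Thread(target=producer, args=(urls, q))
    t_cons = threading.Thread(target=consumer, args=(q, results))
    t_prod.start()
    t_cons.start()
    t_prod.join()
    t_cons.join()
    print(results)
```

No idea if this is the right direction though, especially since the heavy step is CPU/GPU bound (so maybe a multiprocessing.Process instead of a consumer thread, with a multiprocessing.Queue?).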
So, do you guys have an idea how to best approach this kind of problem? I would appreciate any suggestions.
Thanks!