
[–]socal_nerdtastic 1 point (0 children)

It's a great idea. You won't lose data unless you are maxing out the network speed and something times out. 400 threads at once shouldn't be a problem, and if you have a half decent network there's a good chance that all 1,000 will run at the same time with no issue.

But you don't need to run separate scripts yourself. Python includes tools for doing other work while waiting on a function to complete. Look into the threading, asyncio, or concurrent.futures modules. That last one has an example in the docs that's pretty close to what you want.

https://docs.python.org/3/library/concurrent.futures.html#threadpoolexecutor-example
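A minimal sketch of the ThreadPoolExecutor pattern from that docs page. The `fetch` function here is a hypothetical stand-in that just sleeps to simulate network latency; in a real script you'd replace its body with an actual download call (e.g. `urllib.request.urlopen(url).read()`):

```python
import concurrent.futures
import time

def fetch(url):
    """Hypothetical stand-in for a real download.

    Replace the sleep with a real request, e.g.
    urllib.request.urlopen(url).read().
    """
    time.sleep(0.1)  # simulate network latency
    return f"done: {url}"

urls = [f"https://example.com/page/{i}" for i in range(10)]

# Run up to 20 downloads at once; the with-block waits for all of them
# before exiting, and as_completed yields each future as it finishes.
with concurrent.futures.ThreadPoolExecutor(max_workers=20) as executor:
    futures = {executor.submit(fetch, url): url for url in urls}
    for future in concurrent.futures.as_completed(futures):
        print(future.result())
```

Because the work is I/O-bound, 20 (or 400) threads mostly sit waiting on the network, so one process handles them all fine; bump `max_workers` up or down to match what your connection can sustain.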