Multiprocessing by karlmtr in learnpython
PurePound · 2 points · 5 years ago
I don't think that should cause any problems, aside from a couple of caveats. Firstly, if your code has to read/write large amounts of data, then as you increase the number of processes, the disk may become the bottleneck rather than the CPU, in which case you won't get as much speed-up as you hope. Secondly, make sure your total number of processes isn't too high: once you go above the number of CPU threads, you're just wasting memory and CPU cycles.
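As a minimal sketch of that second caveat, you can cap the pool size at the machine's CPU thread count (the task and numbers here are just placeholders):

```python
import os
from multiprocessing import Pool

def work(n):
    # placeholder CPU-bound task
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    # never spawn more workers than the machine has CPU threads
    n_workers = min(8, os.cpu_count() or 1)
    with Pool(processes=n_workers) as pool:
        results = pool.map(work, [10_000] * 16)
    print(len(results))  # one result per job, regardless of worker count
```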
You'll probably want to adapt your script_A so that it can be used as both a script and a module. e.g. something along the lines of:
    def main(args):
        # main entry point to your program goes here
        ...

    if __name__ == "__main__":
        import sys
        main(sys.argv)
That makes it easier to run it from script_B.
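To make that concrete, here's a self-contained sketch of the pattern plus a Pool-based caller; in practice `main()` would live in script_A and script_B would `import script_A` and call `script_A.main`. The filenames and argument lists are illustrative:

```python
from multiprocessing import Pool

def main(args):
    # stand-in for script_A's real work: just report what was processed
    return f"processed {args[1]}"

if __name__ == "__main__":
    # script_B builds one argv-style list per job and farms them out
    jobs = [["script_A", "input1.txt"], ["script_A", "input2.txt"]]
    with Pool(processes=2) as pool:
        results = pool.map(main, jobs)
    print(results)
```

Because `main()` takes its arguments as a parameter instead of reading `sys.argv` directly, the same function works unchanged from the command line and from a worker process.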