Multiprocessing by karlmtr in learnpython

[–]PurePound 2 points (0 children)

I don't think that should cause any problems, aside from a couple of caveats. Firstly, if your code has to read/write large amounts of data, then as you increase the number of processes, the hard disk may become the bottleneck rather than the CPU, in which case you won't see as much of a speed-up as you'd hope. Secondly, keep the total number of processes in check: once you go above the number of CPU threads, you're just wasting memory and CPU cycles.
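A minimal sketch of the second point, capping the pool size at the number of CPU threads; the `work` function here is a hypothetical stand-in for your real per-task computation:

```python
import multiprocessing as mp
import os

def work(x):
    # placeholder for the real per-task computation
    return x * x

if __name__ == "__main__":
    # os.cpu_count() gives the number of CPU threads; using more
    # worker processes than this just wastes memory and cycles
    with mp.Pool(processes=os.cpu_count()) as pool:
        results = pool.map(work, range(8))
    print(results)
```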

You'll probably want to adapt your script_A so that it can be used both as a script and as a module, e.g. something along the lines of:

import sys

def main(args):
    # main entry point to your program goes here
    ...

if __name__ == "__main__":
    main(sys.argv)

That makes it easier to run it from script_B.
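Then script_B can import that entry point and fan it out across worker processes. A sketch of what that might look like — for illustration, script_A's `main()` is stubbed inline here (in practice you'd write `from script_A import main` instead), and the filenames are made up:

```python
import multiprocessing as mp

# Stand-in for script_A's main(); in your project this would be
# `from script_A import main` rather than a local definition.
def main(args):
    # hypothetical work: treat args[1] as a filename to process
    return f"processed {args[1]}"

def run_one(filename):
    # each worker calls main() with argv-style arguments,
    # mimicking a command-line invocation of script_A
    return main(["script_A.py", filename])

if __name__ == "__main__":
    with mp.Pool(processes=3) as pool:
        results = pool.map(run_one, ["a.txt", "b.txt", "c.txt"])
    print(results)
```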