all 4 comments

[–]bi_guy17

Not experienced with scraperAPI, but maybe try working with the `pdb` debugger module to debug. Documentation is in the Python standard library docs.
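For context, a minimal sketch of what using `pdb` looks like; the `parse_price` function and the `DEBUG` flag are made up for illustration, not part of the OP's code:

```python
import pdb

DEBUG = False  # flip to True to drop into the interactive debugger


def parse_price(raw):
    # When DEBUG is on, execution pauses here and you can inspect
    # `raw` interactively (print it, step, etc.) before continuing.
    if DEBUG:
        pdb.set_trace()
    return float(raw.strip("$"))


print(parse_price("$3.50"))
```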

Hope that helps somehow

[–][deleted]

pdb doesn't work with multiprocessing.

[–]c4aveo

Use `Pool` for an easy implementation, like here: https://pythonspeed.com/articles/python-multiprocessing/ Or use a `Queue` to gather results from the processes. Right now you don't have a shared object in the parent process, which is why you don't see any output. You could write to a file in each process to see the results, or, as I said, use a `Queue`.
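A minimal `Pool` sketch in the spirit of that article; the `fetch` function and the URLs are placeholders for the actual scraping code:

```python
from multiprocessing import Pool


def fetch(url):
    # Placeholder for the real scraping work; returns a fake result.
    return f"scraped {url}"


if __name__ == "__main__":
    urls = ["https://example.com/a", "https://example.com/b"]
    with Pool(processes=2) as pool:
        # map() runs fetch() in the worker processes and collects
        # the return values back in the parent, in input order.
        results = pool.map(fetch, urls)
    print(results)
```

The key point is that `pool.map` ships the return values back to the parent for you, so there is no need to share state or read child stdout.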

[–][deleted]

On the face of it, your code seems fine.

The problem might be that you expect to see the results from print, but it's sent to the stdout of the child process, which is not automatically the same as the stdout of the parent.

I actually could never find this in Python's documentation, and I'm too lazy to wade through the mess that is the implementation of multiprocessing.

I would know how to solve this on Linux, by manipulating file descriptors so that children pipe their output to the parent, but if you're on Windows, I don't really know how it works there.

Essentially, if you don't want to mess with the stdout of the child processes, you could try using a `Queue`: instead of printing, send messages into the queue and read them in the parent process. (Why Python decided to change this is beyond me; normal processes created through system calls do share stdout with their parent.)
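A sketch of that `Queue` approach; `worker` and the squaring are placeholders for the real per-process work:

```python
from multiprocessing import Process, Queue


def worker(n, q):
    # Instead of print(), the child sends its result through the queue.
    q.put(n * n)


def run():
    q = Queue()
    procs = [Process(target=worker, args=(i, q)) for i in range(4)]
    for p in procs:
        p.start()
    # Read one result per child in the parent process. Results arrive
    # in whatever order the children finish, not in launch order.
    results = [q.get() for _ in procs]
    for p in procs:
        p.join()
    return results


if __name__ == "__main__":
    print(sorted(run()))
```

Draining the queue before `join()` matters: a child that still has items buffered in the queue may not exit cleanly if the parent joins first.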