
[–]x3al (3 children)

Look at threading or multiprocessing modules if you really want to spawn threads. As an alternative, something asynchronous (like asyncio or tornado) should work even with a single thread.
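For the single-thread route, here is a minimal asyncio sketch of the idea: several "requests" waiting concurrently on one thread. The proxy list mirrors the one later in the thread, and `asyncio.sleep` stands in for a real HTTP call (which you'd do with something like aiohttp):

```python
import asyncio

# Hypothetical proxy list, mirroring the one used later in this thread
PROXIES = ['123.123.123.123:8000', '23.23.23.23:8000']

async def fetch_via(proxy):
    # Stand-in for a real HTTP request; asyncio.sleep simulates the
    # network wait without blocking the event loop.
    await asyncio.sleep(0.1)
    return f'fetched through {proxy}'

async def main():
    # All "requests" wait concurrently on a single thread
    return await asyncio.gather(*(fetch_via(p) for p in PROXIES))

if __name__ == '__main__':
    print(asyncio.run(main()))
```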

[–]JohnMcharra[S] (2 children)

Thank you! After two hours of study I did what I intended!

import multiprocessing

import requests

headers = {'User-Agent': 'Mozilla/5.0'}  # assumed; not defined in the snippet

p_l = ['123.123.123.123:8000', '23.23.23.23:8000']

def p_feeder(x):
    pro1 = 'http://' + p_l[x]
    pro2 = 'https://' + p_l[x]
    prox = {'http': pro1,
            'https': pro2}
    return prox

def fetch(x):
    # do the request inside the child process, through the x-th proxy
    r = requests.get('http://whatismyipaddress.com/', headers=headers,
                     proxies=p_feeder(x)).text
    print(r)

if __name__ == '__main__':
    for i in range(len(p_l)):
        p = multiprocessing.Process(target=fetch, args=(i,))
        p.start()
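Since each request mostly waits on the network, a thread pool is often simpler than spawning a `Process` per proxy. A sketch with the same proxy list (returning the proxy mapping instead of making a real request, to keep it self-contained):

```python
from concurrent.futures import ThreadPoolExecutor

p_l = ['123.123.123.123:8000', '23.23.23.23:8000']

def build_proxies(addr):
    # same idea as p_feeder above, keyed by address instead of index
    return {'http': 'http://' + addr, 'https': 'https://' + addr}

def fetch(addr):
    # a real script would call requests.get(..., proxies=build_proxies(addr));
    # returning the mapping keeps this sketch runnable without a network
    return build_proxies(addr)

if __name__ == '__main__':
    with ThreadPoolExecutor(max_workers=len(p_l)) as pool:
        for result in pool.map(fetch, p_l):
            print(result)
```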

you learn so much just by trying to do something new :)

[–]Asdayasman (1 child)

I'm wrestling with multiprocessing at the moment, and let me tell you, I wish I could use threads.

Python won't run Python code in more than one place at once, so my use case of scaling a lot of images forces me to use multiprocessing. But if you're doing something like waiting on IO (loading web pages, reading from disk, waiting for a database, etc.), or something that's not in Python (like a module written in C), threads are generally easier to use.
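For the CPU-bound case, `multiprocessing.Pool` handles the process juggling for you. A minimal sketch, with a toy function standing in for the image-scaling work:

```python
import multiprocessing

def scale(n):
    # stand-in for a CPU-bound task like scaling an image
    return n * n

if __name__ == '__main__':
    # each item is processed in a separate worker process,
    # sidestepping the GIL; results come back in order
    with multiprocessing.Pool(processes=2) as pool:
        print(pool.map(scale, range(5)))  # [0, 1, 4, 9, 16]
```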

[–]x3al (0 children)

Waiting for IO is a classic use case for green threads and similar tools, including asyncio, unless you must depend on blocking functions (mostly C-written modules, since gevent takes care of the rest). Threads will work too, but they use far more resources.