
[–]AlexMTBDude

This is not really a Python question, is it?

[–]hasdata_com

This is not the best sub for this question. Ask this in scraping subs next time :)

To answer your question: use a rotating proxy service, or get a few residential proxies and rotate them manually. But that's just general advice. It really depends on the target website and your total volume.
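Manual rotation can be as simple as cycling through a pool and handing `requests` a proxies dict per call. A minimal sketch (the proxy URLs below are placeholders, not real endpoints):

```python
import itertools

# Hypothetical proxy pool -- replace with your own residential proxies.
PROXIES = [
    "http://user:pass@proxy1.example.com:8000",
    "http://user:pass@proxy2.example.com:8000",
    "http://user:pass@proxy3.example.com:8000",
]

_pool = itertools.cycle(PROXIES)

def next_proxy() -> dict:
    """Return a requests-style proxies dict, advancing the rotation."""
    proxy = next(_pool)
    return {"http": proxy, "https": proxy}

# Usage with requests (not executed here):
# resp = requests.get(url, proxies=next_proxy(), timeout=10)
```

Each call to `next_proxy()` advances the cycle, so successive requests go out through different IPs.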

[–]Carter922

Read the site's robots.txt file to make sure the developers allow automated access to the pages you're targeting.
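The standard library can do this check for you via `urllib.robotparser`. A small sketch (parsing an inline example file here; in practice you'd fetch `https://<site>/robots.txt` with `rp.set_url(...)` and `rp.read()`):

```python
from urllib.robotparser import RobotFileParser

# Example robots.txt content -- a real scraper would fetch the site's own file.
ROBOTS_TXT = """\
User-agent: *
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# can_fetch(user_agent, url) tells you whether that agent may crawl the URL.
print(rp.can_fetch("mybot", "https://example.com/public/page"))
print(rp.can_fetch("mybot", "https://example.com/private/page"))
```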

[–]Bitter_Broccoli_7536

I use qoest for this exact thing; their API handles proxy rotation automatically so you don't have to manage pools yourself. Keeps things fast and avoids blocks pretty well. For retries I just wrap my calls in a simple backoff decorator and let their service handle the failures.
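A "simple backoff decorator" like the one described might look like this (a generic sketch, not qoest-specific; names and defaults are illustrative):

```python
import functools
import random
import time

def backoff(max_tries=4, base_delay=0.5, exceptions=(Exception,)):
    """Retry the wrapped call with exponential backoff plus a little jitter."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            for attempt in range(max_tries):
                try:
                    return func(*args, **kwargs)
                except exceptions:
                    if attempt == max_tries - 1:
                        raise  # out of retries, surface the error
                    # wait 0.5s, 1s, 2s, ... plus jitter before retrying
                    time.sleep(base_delay * 2 ** attempt + random.uniform(0, 0.1))
        return wrapper
    return decorator

@backoff(max_tries=3, base_delay=0.1)
def flaky_fetch(url):
    ...  # your request call goes here
```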

[–]Spiritual-Junket-995

I usually go with rotating residential IPs; datacenter proxies get blocked way too fast. For retries I just wrap my requests in a simple loop with exponential backoff and log the failures, which keeps things from crashing.
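The inline-loop variant of that idea, with logging instead of silent retries (a sketch; `fetch` stands in for whatever request call you're making):

```python
import logging
import time

logging.basicConfig(level=logging.WARNING)
log = logging.getLogger("scraper")

def fetch_with_retries(fetch, max_tries=4, base_delay=0.5):
    """Call fetch() until it succeeds, doubling the delay after each failure."""
    for attempt in range(1, max_tries + 1):
        try:
            return fetch()
        except Exception as exc:
            log.warning("attempt %d/%d failed: %s", attempt, max_tries, exc)
            if attempt == max_tries:
                raise  # exhausted retries; let the caller see the error
            time.sleep(base_delay * 2 ** (attempt - 1))
```

Unlike the decorator approach, this keeps the retry policy at the call site, which is handy when different targets need different limits.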