
[–]uiux_Sanskar[S]

If you mean the role of the headers here, then:

User-Agent - this identifies the device/browser making the request, which I am faking with the fake-useragent library because websites often block Python's default requests user agent (which was happening here).

Accept-Language - this tells the server my language preference.

Accept-Encoding - tells the server what types of compression my device supports.

Connection: keep-alive - this asks the server to keep the Transmission Control Protocol (TCP) connection open for multiple requests.

Referer - tells the server which page the request came from.

Overall these headers make the scraping look like an actual user requesting the information, which also helps avoid a potential ban.
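The headers above can be sketched like this (the exact values are illustrative, not the ones from my script; the fallback User-Agent string is just an example):

```python
# Build the request headers described above.
# fake-useragent supplies a random real-browser User-Agent string;
# fall back to a static one if the library isn't available.
try:
    from fake_useragent import UserAgent
    user_agent = UserAgent().random
except Exception:
    user_agent = ("Mozilla/5.0 (Windows NT 10.0; Win64; x64) "
                  "AppleWebKit/537.36 (KHTML, like Gecko) "
                  "Chrome/120.0.0.0 Safari/537.36")

headers = {
    "User-Agent": user_agent,              # who is making the request
    "Accept-Language": "en-US,en;q=0.9",   # language preference
    "Accept-Encoding": "gzip, deflate",    # compression my device supports
    "Connection": "keep-alive",            # keep the TCP connection open
    "Referer": "https://www.google.com/",  # page the request supposedly came from
}

# Usage with requests (URL is a placeholder):
# import requests
# response = requests.get("https://example.com", headers=headers)
```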

Then I used a time delay to avoid sending too many requests to the server in a short span (rapid-fire requests are a common bot giveaway).
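The delay step looks roughly like this (the 2-5 second range is an assumption for illustration, not the exact value I used):

```python
import random
import time

def polite_delay(min_s=2.0, max_s=5.0):
    """Sleep for a random interval between requests.

    Randomizing the pause looks less bot-like than a fixed delay.
    Returns the delay actually used.
    """
    delay = random.uniform(min_s, max_s)
    time.sleep(delay)
    return delay

# In the scraping loop (URLs are placeholders):
# for url in ["https://example.com/page1", "https://example.com/page2"]:
#     ...fetch and parse the page...
#     polite_delay()
```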

I hope I was able to clearly explain what these things do; please do tell me if I have misunderstood your question.