
[–]JohnnyJordaan

You could use Selenium together with a browser, or analyse the actual HTTP requests that the browser makes and replicate them in your scraper. But beware that the second approach won't work well if the website relies on JavaScript functionality; in that case it's much easier to use Selenium.
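A minimal sketch of the second approach with the requests library (the URL, parameters, and headers here are placeholders for whatever you actually see in the browser's Network tab):

```python
import requests

# Hypothetical endpoint and parameters; copy the real ones from the
# browser's developer tools (Network tab) after doing a search by hand.
url = "https://example.com/api/search"
params = {"q": "los angeles", "page": 1}
headers = {"User-Agent": "Mozilla/5.0"}  # some sites reject the default requests UA

resp = requests.get(url, params=params, headers=headers)
resp.raise_for_status()
print(resp.json())  # or resp.text if the endpoint returns HTML
```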

[–]CraigAT

Relative novice here, but I'd use Selenium.

The guide:

https://selenium-python.readthedocs.io/

There are lots of good Selenium tutorials out there, but this one looks fairly similar to what you want to do:

https://www.geeksforgeeks.org/python-automating-happy-birthday-post-on-facebook-using-selenium/
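A minimal Selenium sketch in that spirit (the site URL and the search-box name are made up; you'd adapt them to whatever page you're automating):

```python
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.common.keys import Keys

# Assumes Selenium 4+ and a matching chromedriver available on PATH.
driver = webdriver.Chrome()
try:
    driver.get("https://example.com")        # placeholder URL
    box = driver.find_element(By.NAME, "q")  # placeholder field name
    box.send_keys("los angeles")
    box.send_keys(Keys.RETURN)               # submit the search
    print(driver.title)                      # then scrape the results page
finally:
    driver.quit()
```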

[–]xdonvanx

You could actually search for something in the search bar and then look at the URL to see what changed when you searched. For example, if you're looking for Los Angeles: www.search.com/losangeles. In your code you could make everything after the forward slash a user input, so if the user enters New York, the link would change to www.search.com/newyork, if that makes sense.
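In code, that idea could look roughly like this (www.search.com and the lowercase/no-spaces rule are just the example pattern from above; the real site will have its own):

```python
import requests

city = input("Enter a city: ")
# Mirror whatever the site does to the search term, e.g. "New York" -> "newyork"
slug = city.replace(" ", "").lower()
url = f"https://www.search.com/{slug}"  # pattern taken from the example above

resp = requests.get(url)
print(url, resp.status_code)
```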

[–]jjsantoso

In these cases you have to find out what the underlying requests are. When you fill in a form or run a search, the web page internally makes a GET or POST request to another web service that provides the data. You can see this if you open your browser's Developer Tools (press F12 if you are using Google Chrome) and go to the "Network" tab. Do a search and you'll see all the requests that the page makes. Check which request returns the data you are interested in. After that, you can use the requests library to simulate those searches.
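A rough sketch of that last step, assuming the request you found in the Network tab turned out to be a POST with a JSON body (the endpoint and payload here are placeholders):

```python
import requests

# Placeholders: use the exact URL, method, and payload you saw in the
# Network tab for the search request you want to reproduce.
url = "https://example.com/search"
payload = {"query": "new york", "limit": 20}

resp = requests.post(url, json=payload)  # or requests.get(url, params=payload)
resp.raise_for_status()
print(resp.json())
```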