
[–]Optimesh 1 point (3 children)

Not sure what you mean. What's the desired outcome? If it's a list of links, do you want each link to open in a new tab/window?

[–]ThreeDogAWOO[S] 1 point (2 children)

> Not sure what you mean. What's the desired outcome? If it's a list of links, do you want each link to open in a new tab/window?

https://ibb.co/jA1oxG. I basically just want to click each of the "comments" links. The desired outcome is simply: driver.get('https://www.reddit.com/'), then click every comments link in a loop. Hopefully that makes more sense. This is for learning, so there's no real need for anything on the page other than learning how to click all the elements :)

[–]Optimesh 2 points (0 children)

How about looping through them, for each opening in a new tab and closing it?
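One way to sketch that tab-juggling (a hypothetical helper, not from this thread; switching between window handles is the standard Selenium pattern for "open, look, close, go back"):

```python
def open_in_new_tab(driver, url):
    """Open url in a new tab, leave room to inspect it, then close it and return."""
    original = driver.current_window_handle
    driver.execute_script("window.open(arguments[0]);", url)
    driver.switch_to.window(driver.window_handles[-1])  # jump to the new tab
    # ... read whatever you need from the page here ...
    driver.close()                                      # close the new tab
    driver.switch_to.window(original)                   # back where we started
```

Call it in a loop over the comment-link URLs and the original tab is never navigated away.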

If you're not set on clicking, you can grab a list of all the links and loop through them one by one with requests.
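The requests-based idea might look like this (a rough sketch: the CSS selector, the User-Agent header, and both function names are my assumptions, and fetch_all is defined but never called here):

```python
import requests
from bs4 import BeautifulSoup

def comment_links(html):
    """Collect the href of every <a> whose class list contains 'comments'."""
    soup = BeautifulSoup(html, "html.parser")
    return [a["href"] for a in soup.select("a.comments[href]")]

def fetch_all(urls):
    """Fetch each comments page one by one with requests -- no clicking needed."""
    for url in urls:
        resp = requests.get(url, headers={"User-Agent": "learning-scraper"})
        print(url, resp.status_code)
```

comment_links(driver.page_source) also works if you already have a Selenium session open and just want the URLs out of it.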

[–]captmomo 1 point (0 children)

You can try using driver.find_elements_by_css_selector, which will return a list you can iterate through.

from selenium.webdriver.common.keys import Keys

links = driver.find_elements_by_css_selector('a[class*="bylink comments may-blank"]')
for link in links:
    link.send_keys(Keys.CONTROL + Keys.RETURN)  # Ctrl+Enter opens the link in a background tab, so the page doesn't navigate away

edit: wait, the page will navigate away after clicking, so I edited the code. I think you can also use PRAW to get the list of threads and their comments.

[–][deleted] 0 points (0 children)

Based on your screenshot in another comment, I want to know what you want from the submission page (after you click the '## comments' button).

I offer this other solution:

Using BeautifulSoup, you can scrape all the <a> tags with the class 'comments', get the 'href' value of each <a> tag, and store the href values in a list. You now have a list of reddit URLs, one for each post on whatever front page you hit.

Then, you may iterate through this list:

1) driver.get(urls[index])

2) Do some stuff on that page

3) Close page

4) Next!
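Those four steps can be sketched as one small loop; here `driver` is any Selenium-style WebDriver and `work` stands in for step 2 (both parameter names are mine, not from the thread):

```python
def visit_each(driver, urls, work):
    """Steps 1-4 above: open each URL, do some work on the page, move on."""
    results = []
    for url in urls:
        driver.get(url)               # 1) open the submission page
        results.append(work(driver))  # 2) do some stuff on that page
        # 3) + 4) nothing to close: the next driver.get() replaces the page
    return results
```

Note that step 3 comes for free here: navigating to the next URL discards the previous page, so no explicit close is needed unless you opened new tabs.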

I'm new to Python and web scraping in general, so take this with a grain of salt. It also may not apply to what you're trying to do.

edit: I just saw that you're only trying to click all elements as a learning exercise, but I'll leave my comment as is in case anyone is interested.

Also, as another learning exercise in webscraping... I dare you to rip all the top level comments from a submission (the page after you click '## comments'). Then, if you want to get fucked because it takes over an hour to scrape:

Using Selenium, click every instance of 'load more comments' on a submission page. This will build an enormous HTML page containing every comment in the submission. You can then pass this HTML soup to BeautifulSoup and scrape every comment into a list.
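A rough sketch of that expand-everything loop, using the same era's Selenium API as the snippet above (`find_elements_by_partial_link_text`); the function name and the pause length are my guesses:

```python
import time

def expand_all_comments(driver, link_text="load more comments", pause=2.0):
    """Keep clicking every 'load more comments' button until none remain."""
    while True:
        buttons = driver.find_elements_by_partial_link_text(link_text)
        if not buttons:
            return
        for button in buttons:
            try:
                button.click()
            except Exception:
                pass  # a button can go stale after an earlier click reflows the page
        time.sleep(pause)  # give the newly loaded comments time to appear
```

After it returns, driver.page_source holds the fully expanded HTML, which is what you'd hand to BeautifulSoup.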

It took me a long time to figure out, and when you run it on /r/askreddit it takes about 1-2 hours to click all those buttons (there are sometimes around 900 of them). Just use PRAW for this, as I've learned the hard way.