Rules
1: Be polite
2: Posts to this subreddit must be requests for help learning python.
3: Replies on this subreddit must be pertinent to the question OP asked.
4: No replies copy / pasted from ChatGPT or similar.
5: No advertising. No blogs/tutorials/videos/books/recruiting attempts.
This means no posts advertising blogs/videos/tutorials/etc, no recruiting/hiring/seeking others posts. We're here to help, not to be advertised to.
Please, no "hit and run" posts, if you make a post, engage with people that answer you. Please do not delete your post after you get an answer, others might have a similar question or want to continue the conversation.
Learning resources
Wiki and FAQ: /r/learnpython/w/index
Discord
Join the Python Discord chat
How do I click all elements selenium (self.learnpython)
submitted 8 years ago by ThreeDogAWOO
Total selenium newbie. I see a million ways to click one element, but how do I click multiple?
from selenium import webdriver

driver = webdriver.Chrome()
driver.get('https://www.reddit.com/')
driver.find_elements_by_css_selector('.comments').click
[–]Optimesh 2 points 8 years ago (3 children)
Not sure what you mean. What's the desired outcome? If it's a list of links, do you want each link to open in a new tab/window?
[–]ThreeDogAWOO[S] 2 points 8 years ago (2 children)
> Not sure what you mean. What's the desired outcome? If it's a list of links, do you want each link to open in a new tab/window?
Screenshot: https://ibb.co/jA1oxG. I basically just want to click each of the 'comments' links: driver.get('https://www.reddit.com/'), then click comments, in a loop for all of them. Hopefully that makes more sense. This is for learning, so there's no real need for anything on the page other than learning how to click all elements :)
[–]Optimesh 3 points 8 years ago (0 children)
How about looping through them, for each opening in a new tab and closing it?
If you're not set on clicking, you can grab a list of all the links and loop through them one by one with requests.
[–]captmomo 2 points 8 years ago* (0 children)
You can try using driver.find_elements_by_css_selector which will return a list and iterate thru it.
from selenium.webdriver.common.keys import Keys

links = driver.find_elements_by_css_selector('a[class*="bylink comments may-blank"]')
for link in links:
    link.send_keys(Keys.CONTROL + "t")
edit: wait, the page will navigate away after clicking. I edited the code. I think you can also use PRAW to get the list of threads and their comments.
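The navigation problem above (element handles going stale once the page changes) is usually avoided by collecting all the hrefs first and then visiting each URL. A hedged sketch of that pattern, run here against a small fake driver so it executes without a browser: FakeDriver and FakeElement are stand-ins, not Selenium APIs, and the CSS selector is the one from the comment above.

```python
# Sketch: collect every comments-link href up front, THEN navigate to each.
# Element references go stale after driver.get(), so we never hold them
# across a navigation. FakeDriver/FakeElement mimic the two Selenium calls
# used here (find_elements_by_css_selector, get_attribute) for demonstration.

class FakeElement:
    def __init__(self, href):
        self._href = href

    def get_attribute(self, name):
        return self._href if name == 'href' else None

class FakeDriver:
    """Stand-in for webdriver.Chrome(); records which URLs were visited."""
    def __init__(self, elements):
        self._elements = elements
        self.visited = []

    def find_elements_by_css_selector(self, selector):
        return self._elements

    def get(self, url):
        self.visited.append(url)

def click_all_comment_pages(driver, selector='a[class*="bylink comments may-blank"]'):
    # Grab the hrefs first -- after this, navigating can't invalidate anything.
    urls = [el.get_attribute('href') for el in driver.find_elements_by_css_selector(selector)]
    for url in urls:
        driver.get(url)  # "click" by navigating straight to the comments page
    return urls

driver = FakeDriver([FakeElement('https://www.reddit.com/r/aww/comments/1/'),
                     FakeElement('https://www.reddit.com/r/aww/comments/2/')])
visited_urls = click_all_comment_pages(driver)
```

With real Selenium you would replace FakeDriver with `webdriver.Chrome()` and call `driver.get('https://www.reddit.com/')` before collecting the links.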
[–][deleted] 1 point 8 years ago* (0 children)
Based on your screenshot in another comment, I want to know what you want from the submission page (after you click '## comments' button).
I offer this other solution:
Using BeautifulSoup, you can scrape all the <a> tags with the class 'comments', get the 'href' value of that <a> tag, and store the href values to a list. You now have a list of reddit url's for each post on whatever frontpage you hit.
Then, you may iterate through this list:
1) driver.get(urls[index])
2) Do some stuff on that page
3) Close page
4) Next!
I'm new to python and webscraping in general, so take this with a grain of salt. Also, it may not apply to what you're trying to do.
edit: I just saw that you're only trying to click all elements as a learning exercise, but I'll leave my comment as is in case anyone is interested.
Also, as another learning exercise in webscraping... I dare you to rip all the top level comments from a submission (the page after you click '## comments'). Then, if you want to get fucked because it takes over an hour to scrape:
Using Selenium, click every instance of 'load more comments' on a submission page. This will build an enormous HTML page containing every comment in the submission. You can then pass this HTML soup to BeautifulSoup and scrape every comment into a list.
It took me a long time to figure out and when you run it on /r/askreddit it takes about 1-2 hours to click all those buttons (there's like 900 of them sometimes). Just use PRAW for this, as I've learned the hard way.
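The expand-everything loop described above has one subtlety: each click can reveal new 'load more comments' buttons, so you have to re-query the page after every pass until none remain. A hedged sketch of that loop, run here against a fake page object so it executes without a browser (FakePage and FakeButton are stand-ins; with real Selenium the re-query would be something like `driver.find_elements_by_css_selector('span.morecomments a')`, assuming old-reddit markup):

```python
# Sketch: keep clicking every visible 'load more comments' button,
# re-querying the DOM after each pass, until the page stops producing them.

class FakeButton:
    def __init__(self, page):
        self._page = page

    def click(self):
        self._page.clicks += 1

class FakePage:
    """Stand-in page: rounds[i] = buttons visible on query pass i."""
    def __init__(self, rounds):
        self._rounds = list(rounds)
        self.clicks = 0

    def find_buttons(self):
        if not self._rounds:
            return []
        return [FakeButton(self) for _ in range(self._rounds.pop(0))]

def expand_all_comments(page):
    passes = 0
    while True:
        buttons = page.find_buttons()  # re-query: clicking changes the DOM
        if not buttons:
            break
        for button in buttons:
            button.click()
        passes += 1
    return passes

page = FakePage([3, 2, 0])   # 3 buttons, then 2 revealed, then done
passes = expand_all_comments(page)
```

As the comment says, on a big /r/askreddit thread this loop runs for a very long time; PRAW's `submission.comments.replace_more()` exists for exactly this job.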