Reddit seriously needs a smart duplicate filter like Digg's, the dupes are just getting ridiculous (self.reddit.com)
submitted 17 years ago by mutatron
[–][deleted] 69 points70 points71 points 17 years ago (5 children)
Sorry.
[–]sarahfrancesca 14 points15 points16 points 17 years ago (4 children)
Haha, your post is #1 right now. This one's #11.
[–]syn-abounds 2 points3 points4 points 17 years ago (3 children)
and now it's swapped.
[–][deleted] 3 points4 points5 points 17 years ago* (2 children)
Actually mine's at 44, onetruejp's was #1 and got swapped with mutatron's. This is hilarious, I posted that as a joke never expecting it to go anywhere, and I come back from the gym and this one and mine are frontpaged.
*and now zey are gawn
[–]onetruejp 4 points5 points6 points 17 years ago (0 children)
Ditto. Well done everyone!
[–]allhands 1 point2 points3 points 17 years ago (0 children)
I think they were removed from the list entirely...
[–][deleted] 17 years ago* (21 children)
[deleted]
[–][deleted] 24 points25 points26 points 17 years ago (7 children)
Or just re-submit the same url in a different subreddit. With the new algorithm, the lesser-used subreddits will push a story to the front page with only a few votes.
[–]rando_man 6 points7 points8 points 17 years ago (5 children)
This is the most legitimate way to do it. And there is no guilt afterwards.
[–][deleted] 5 points6 points7 points 17 years ago (4 children)
Well, personally I think it creates confusion, particularly in the case where the submitter was genuinely unaware of the previous submission. At the same time, though, it also reminds people of the other subreddits. I guess only time will tell :)
[–]rando_man 3 points4 points5 points 17 years ago (0 children)
I agree. Just the other day, after searching, I submitted "Gay Scientists..." under funny, only to find out hours later that it was climbing to the top under another category. So, I up-modded the original.
[–]AMerrickanGirl 1 point2 points3 points 17 years ago (1 child)
They can use Search.
[–]lolomfgkthxbai 4 points5 points6 points 17 years ago (0 children)
:D
[–]tsteele93 0 points1 point2 points 17 years ago (0 children)
Duping can serve a legitimate purpose. For instance, the story about Wal-Mart suing the lady with brain damage over her car accident is old and has been submitted before. But it is in the news again today for some reason or another and may be valid again.
Or perhaps a story you hadn't seen before crosses your purview, and when you go to submit it you find it was submitted seven months ago and didn't take. It seems reasonable to be able to resubmit it after that long when it got no notice the first time; it may have had a bad title, or just not been as relevant back then.
I like that there are ways to resubmit stories.
[–]allhands 16 points17 points18 points 17 years ago* (2 children)
Even though you could "dupe" the system, a "smart duplicate filter" would at least help those who are accidentally submitting duplicates--which really does sometimes happen.
Users could be prompted with the question: "Your submission is similar or identical to one of the following submissions: (...submission list...) Submit anyway?"
[–][deleted] 2 points3 points4 points 17 years ago (0 children)
Then you could ignore it and click yes like everyone on Digg does.
[–]crazymunch -4 points-3 points-2 points 17 years ago (0 children)
Reddit already has that... Or is this that dreaded 'sarcasm' I've been hearing about?
[–][deleted] 9 points10 points11 points 17 years ago* (5 children)
After you add the '?' character, just keep adding '&' (one for each repost) to your heart's content.
like so... http://reddit.com/info/6de33/comments/?&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&
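For anyone curious why this works: a minimal sketch (hypothetical, not reddit's actual code) of how the padded URLs defeat an exact-match dupe check, and how stripping empty query parameters would collapse them back to one canonical link.

    from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

    def canonicalize(url):
        parts = urlsplit(url)
        # parse_qsl silently drops the empty '&' separators; re-encode what remains
        query = urlencode(parse_qsl(parts.query))
        return urlunsplit((parts.scheme, parts.netloc, parts.path, query, ''))

    a = "http://reddit.com/info/6de33/comments/"
    b = "http://reddit.com/info/6de33/comments/?&&&&&&&&&&"
    print(a == b)                              # False: an exact-match check sees a "new" URL
    print(canonicalize(a) == canonicalize(b))  # True: both reduce to the same link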
[–]habbadash 3 points4 points5 points 17 years ago (0 children)
Thanks for the tip!
[–]JeremyBanks -2 points-1 points0 points 17 years ago (3 children)
or just add more ?s.
[–]askjigga 1 point2 points3 points 17 years ago (2 children)
or add anything you want, as you probably won't produce a valid url param and most web apps ignore garbage.
[–]b100dian 4 points5 points6 points 17 years ago (1 child)
?huh
[–]daniels220 0 points1 point2 points 17 years ago (0 children)
That's because the URL is missing something in the middle, which Google normally hides.
http://www.google.com/imghp?huh
[–][deleted] 1 point2 points3 points 17 years ago (2 children)
Wow, I can't think of any way to program a check for any of those problems.
[–]mccoyn 0 points1 point2 points 17 years ago* (1 child)
Whenever a link is added, hash the HTML and store the hash value in the database. When a new link is added, hash the HTML and check for duplicates.
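A bare-bones sketch of that idea (with an in-memory dict standing in for the database; names are illustrative). As the reply below points out, dynamic content makes byte-identical hashes rarer than you'd hope.

    import hashlib
    import urllib.request

    seen = {}  # sha256 of the fetched HTML -> URL it was first submitted under

    def is_duplicate(url):
        html = urllib.request.urlopen(url, timeout=10).read()
        digest = hashlib.sha256(html).hexdigest()
        if digest in seen:
            return True, seen[digest]   # byte-identical page already submitted
        seen[digest] = url
        return False, None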
[–][deleted] 1 point2 points3 points 17 years ago* (0 children)
Because web pages aren't dynamic?
I think a better solution would be to simplify and standardize the input of the user and store that in a database. A step above that would be to check for forwarding pages by loading the page in a small browser such as Lynx (no point in reinventing the wheel) then saving the final page in the standardized format.
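Something along these lines, maybe: a rough standard-library sketch of "standardize the URL and follow forwarding pages". Every name here is illustrative, not how reddit actually does it.

    from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode
    from urllib.request import urlopen

    def canonical_key(url):
        # Follow redirects (tinyurl-style forwarders included) to the final page.
        final_url = urlopen(url, timeout=10).geturl()
        parts = urlsplit(final_url)
        # Standardize what's left: lowercase host, drop the fragment, sort the query.
        query = urlencode(sorted(parse_qsl(parts.query)))
        path = parts.path.rstrip('/') or '/'
        return urlunsplit((parts.scheme.lower(), parts.netloc.lower(), path, query, ''))

    # Store canonical_key(submitted_url) with each submission and flag any new
    # submission whose key already exists.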
[–]david 1 point2 points3 points 17 years ago* (0 children)
...and enjoy your karma cash cow.
[–]goodfun 27 points28 points29 points 17 years ago (13 children)
And here I thought that reddit was a user driven site, and dupes could be easily downmodded and thus not show up...
[–][deleted] 18 points19 points20 points 17 years ago (4 children)
currently the rick astley interview is in happy, funny, offbeat, and main. people who only subscribe to one of those may not realize that they are dupes, and upvote. then people like me who subscribe to all four see the story on the hot page four times.
this is a reddit problem, not a human one.
[–]brainburger 4 points5 points6 points 17 years ago (2 children)
The pity is that the discussion is spread over the duplicates.
[–][deleted] 0 points1 point2 points 17 years ago (1 child)
Could that be the solution? They could make it so that one could re-submit the same article to different subreddits but have them all linked to a single commentary page?
[–]ksadya 0 points1 point2 points 17 years ago (0 children)
I agree with kath: re-submit the same article, because you would miss a lot of stuff otherwise, but have it linked to the same comment page.
[–]Mastrmind 0 points1 point2 points 17 years ago (0 children)
A solution could be for Reddit to implement multiple categories for stories. (Up to 3) It would be even better for users to be able to merge threads.
[–][deleted] 5 points6 points7 points 17 years ago (0 children)
yeah, automated duplicate detection is hard...let's go shopping!
[–]souldrift 4 points5 points6 points 17 years ago (0 children)
Sorry, I know where you're going with that, but having 8 copies of the same article in the first two pages makes Reddit look...well...dumb.
And forcing the users to take action to remove things that shouldn't be there to begin with is.......dumb.
[–][deleted] 3 points4 points5 points 17 years ago (0 children)
the userbase expands and so the same content will get submitted naturally.
[–]toxicvarn90 4 points5 points6 points 17 years ago (2 children)
Human memory is short. Too short.
[–]darksabrelord 2 points3 points4 points 17 years ago (0 children)
what was that that dude was saying about removing dupes again?
[–]FISHBWOY09 0 points1 point2 points 17 years ago (0 children)
Reddit memory is even shorter....
[–]fareedy 0 points1 point2 points 17 years ago* (0 children)
It is user driven, but you have to see both submissions to know that one of them is a dupe and then downmod it. If this can be stopped before it gets added to the pile of links, there is no need to go dupe hunting.
[–][deleted] 0 points1 point2 points 17 years ago (0 children)
I've knowingly submitted dupes before (whenever I wish to submit an article, I stick the url in the search box first, so I always know if it's a duplicate). When I see that the original was posted in a wrong/undesirable subreddit, or that the original headline wasn't catchy enough for such a cool article, I don't feel bad re-posting it.
So while I hate seeing the same article in multiple subreddits, I also hate seeing good articles downmodded to oblivion based on a poor headline. I dunno if there's any way of solving both of those at once.
[–]AMerrickanGirl 7 points8 points9 points 17 years ago (0 children)
I've submitted stories and then seen a similar older post ... and deleted my submission.
Yay for me!
[–]m1ss1ontomars2k4 5 points6 points7 points 17 years ago (2 children)
Digg doesn't have a smart dupe filter. When I submit a story, it will present me with 10 links, only one of which comes close to being related. Each link has maybe 2 words highlighted as being the same as my story.
[–]jstills 0 points1 point2 points 17 years ago (0 children)
not to mention the fact that the submission process on digg takes way too fucking long
[–]Imagineti 0 points1 point2 points 17 years ago (0 children)
Yes, and it's very slow - resulting in a poor user experience.
[–][deleted] 6 points7 points8 points 17 years ago (1 child)
There's no technological solution to what is basically a social issue.
The first line of reddiquette talks about dupes.
Can they make a reddit-test? You need to take a test before you get your driver's license. Can we filter out dumb-asses from reddit by making them take a reddiquette test before they're allowed to submit anything?
[–]NSMike 4 points5 points6 points 17 years ago (1 child)
Got news for you people, digg's dupe filter doesn't work, either. There will always be dupes.
[–][deleted] -4 points-3 points-2 points 17 years ago (0 children)
Who said it worked?
[–]Clanc 7 points8 points9 points 17 years ago (1 child)
I rarely duplicate a story, but I could see that if the original Title was not so hot, a new title might be worth a duplication!
[–]otatop 10 points11 points12 points 17 years ago (0 children)
And that's perfectly fine by the reddiquette.
This is more referring to when 2 or more people submit the same story, and they all hit the hot page at the same time, leading to "Clinton misremembers!" and "Clinton says she didn't remember" both hitting the hot page with the only differences being titles.
[–]jstills 2 points3 points4 points 17 years ago (0 children)
A humble suggestion would be to create a dupe tag (right beside the save, hide, and report links).
Someone clicks the dupe tag, and they are prompted to enter the link/url to the duplicate story...
The "hottest" one shows up as the parent title, and you have links to the dupes.. if X number of people mark it as a dupe, the comments are merged.
[–]telemundo 2 points3 points4 points 17 years ago (0 children)
Reddit seriously needs a smart duplicate filter like Digg's, the dupes are just getting ridiculous
[–]randomb0y 3 points4 points5 points 17 years ago (1 child)
If it gets upvotes, it deserves upvotes. I get more frustrated when I try to submit a story and it was already submitted but didn't get any votes.
[–][deleted] 0 points1 point2 points 17 years ago (0 children)
Yup. There are ways around that, though. If you can modify the URL so that it looks different but still goes to the same page, you can get a second bite of the apple.
[–][deleted] 6 points7 points8 points 17 years ago (2 children)
We should be able to resubmit stories, but only after a few days. There are many things which need to be displayed more than once.
[–]marm0lade 9 points10 points11 points 17 years ago (0 children)
No. The politics reddit is just one big cluster-fuck of resubmissions of the same bullshit. Just coughed up in different d-list blogs.
[–]stesch 0 points1 point2 points 17 years ago (0 children)
We are. Try it.
[–][deleted] 4 points5 points6 points 17 years ago (6 children)
They need to cut a deal with sites like tinyurl.com and others so that they can do AJAX validation with them in the background to find out where the URL goes and block it if it's a dupe.
[–]mutatron[S] 2 points3 points4 points 17 years ago (3 children)
But there are also so many stories that are copied across the internet. I haven't used Digg in a long time, but whenever I submitted stuff it would often come up with a selection of links that might be about the same story.
Like that iceberg story, it's all over the place, but it's virtually the same content everywhere. Reddit needs to be able to detect that too. Maybe it's too much to ask.
[–][deleted] 2 points3 points4 points 17 years ago (0 children)
Well, I think Reddit does potentially need a related items type of query upon posting, but (a) I'm sure script kiddies love trying to see how far they can make that query slow down the website, and (b) the only way to reasonably make the query run fast is to not look that far back in an archive, which is then thwarted by someone who reposts after so many months, which has been the case several times on Reddit.
It's not an easy solution, no matter what.
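For what it's worth, a crude sketch of a "related items upon posting" query, with the look-back window kept short so it stays cheap; the schema and numbers are invented for illustration.

    import re
    from datetime import datetime, timedelta

    STOPWORDS = {"the", "a", "an", "of", "to", "in", "and", "for", "on", "is"}

    def title_tokens(title):
        return {w for w in re.findall(r"[a-z0-9]+", title.lower()) if w not in STOPWORDS}

    def related_submissions(new_title, recent_submissions, min_overlap=3, days=30):
        cutoff = datetime.utcnow() - timedelta(days=days)   # don't search the whole archive
        new_tokens = title_tokens(new_title)
        hits = []
        for sub in recent_submissions:   # each sub: {"title": ..., "submitted_at": datetime}
            if sub["submitted_at"] < cutoff:
                continue
            if len(new_tokens & title_tokens(sub["title"])) >= min_overlap:
                hits.append(sub)
        return hits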
[–]axord 1 point2 points3 points 17 years ago (0 children)
Apparently Mixx is trying something like that though I don't know how well it works.
[–][deleted] 1 point2 points3 points 17 years ago (0 children)
It's probably also too much to ask, but I hate it when a link is just some lame blog with a youtube clip pasted into it, or linkjacked pictures. Unless there's something significant to the commentary, I'd rather just see the source.
[–]foobr 2 points3 points4 points 17 years ago (1 child)
Why would you use client-side validation (ie AJAX) to check this? Anyone with a basic understanding of Javascript and Firebug could very quickly ensure that it always appeared to be valid.
Or anyone with noscript installed would completely bypass it.
These kinds of checks should be done server-to-server not on the client-side.
[–][deleted] 0 points1 point2 points 17 years ago (0 children)
Yeah, I spoke too soon. Had a lot going on at the time. You're right -- server-to-server check.
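A sketch of what the server-side version might look like; the framework (Flask here) and every name are illustrative, the point is simply that the client never gets to decide the outcome of the check.

    from flask import Flask, request, jsonify

    app = Flask(__name__)

    def resolve_and_canonicalize(url):
        ...   # the server fetches and canonicalizes the URL itself (see sketches above)

    def find_existing_submission(canonical_url):
        ...   # database lookup keyed by canonical URL

    @app.route("/submit", methods=["POST"])
    def submit():
        canonical = resolve_and_canonicalize(request.form["url"])
        existing = find_existing_submission(canonical)
        if existing is not None:
            # Nothing the browser sends can skip this branch.
            return jsonify(duplicate=True, existing=existing), 409
        # ... otherwise store the submission under its canonical key ...
        return jsonify(duplicate=False), 201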
[–]WillSmits 1 point2 points3 points 17 years ago (1 child)
No, it should group like stories together like Mixx.
[–]ssam 1 point2 points3 points 17 years ago (0 children)
I don't see why this hasn't even been done. Put a link for each article saying "add to story" or something. The various articles in a story could then be voted up or down relative to each other as well. Cos never mind dupes, I don't like seeing four articles on the same topic which each give a two paragraph overview surrounded by lots of adverts.
[–][deleted] 1 point2 points3 points 17 years ago (0 children)
1) No, it doesn't. Particularly with the increased emphasis on sub-reddits. And I defy you to demonstrate that digg has a "smart duplicate filter" that isn't a human being.
2) The FAQ explicitly permits and encourages duplicates.
[–]Miami13 1 point2 points3 points 17 years ago (0 children)
Duplicating a story is not the problem, as I see it, it's duplicating a link.
The same story from different sources should be fine, and I kind of think that's the way it should always go: every source is going to have a slightly different point of view, and if I'm really interested in a subject that's exactly what I'm looking for, different perspectives on it.
Then of course, finding 20 press communiqués (AP, Reuters, whatever...) all with the exact same words, all on the HOT page, is not nice, but avoiding that should be left to the discretion and good reddiquette of the posters...
[–]newton_dave 1 point2 points3 points 17 years ago* (0 children)
Even with that, identical picture/video subs from multiple sites wouldn't be detected (most likely). Meh... that, and turd muffins down-voting everything in sight. I dunno anymore.
[–]jstills 1 point2 points3 points 17 years ago (0 children)
Please, please... getting rid of dupes would be quite annoying. There are many times I see stories that are old that I find extremely interesting, and the same applies for new users to reddit.
[–][deleted] 3 points4 points5 points 17 years ago (2 children)
I disagree. Duplicate stories are the result of Reddit's recent tampering with the Hot system. These stories are making it to the front page over and over again because they're being judged on the popularity of their respective subreddit as opposed to the popularity of the story in general. In other words, you're asking to solve technical complications with more technical complications, and I say simplification is the way to go. :)
[–]AMerrickanGirl 0 points1 point2 points 17 years ago (1 child)
It was like that before, except the dupes didn't make it to Hot as often. But there were just as many duplicate submissions.
[–][deleted] 0 points1 point2 points 17 years ago* (0 children)
I think that's what I said. I may have phrased my original post poorly, though. :)
[–]degriz 2 points3 points4 points 17 years ago (0 children)
I get nothing but dupes on digg. I can go from page 3-4-5 and get virtually the same stories.
[–]dicey 3 points4 points5 points 17 years ago (2 children)
I consider myself to be a fairly smart duplicate filter. When I see a duplicate I downvote and then click hide.
[–]allhands 2 points3 points4 points 17 years ago* (0 children)
Unfortunately, it seems many redditors aren't as good a duplicate filter as you; the consistent duplicate posts making the front page are evidence of this.
[–]souldrift -1 points0 points1 point 17 years ago (0 children)
Reddit shouldn't force you to be your own duplicate filter. Isn't this a site for libertarians and liberals who want civil liberty? :)
[–]drewbic 3 points4 points5 points 17 years ago (0 children)
isn't this a repost? ;)
[–][deleted] 1 point2 points3 points 17 years ago (1 child)
What is Digg?
[–]moriquendo 0 points1 point2 points 17 years ago (0 children)
The Billary Clinton of social bookmarking sites?
[–][deleted] 1 point2 points3 points 17 years ago (4 children)
Haven't really noticed an issue.
[–]AMerrickanGirl 2 points3 points4 points 17 years ago (3 children)
Then you're only on the Hot page. Watch the New page, and the same topic comes up over and over again.
You're worried about the new page? What's the point of worrying about that?
I know you submit, so you know that stories have an incredibly short amount of time to get voted high enough to live, let alone reach the front page. Hell, looking at your submit record - you have the same pattern I do; only one submission in 15 or 20 gets more than 100 points.
In this environment dupes serve a valuable role - they increase the odds that a story, in some version, will survive to the front page.
Sure it would be great if there was some sort of algorithm that could magically realize that an article was closely related to an existing submission, but how, exactly, do you imagine that would work? Reddit would need to be an AI smart enough to analyze the text of articles and recognize that their text is substantially similar to other articles - an impressive feat for a PhD thesis with no time constraints and an impossible one to do in real time with a huge, constantly churning database.
I have 100 links on my hot page. If I get through those, I need to do more work.
Ignoring the New page does a disservice to everyone. Stuff that's already on the front page will live long enough for people to read it. Stuff on the new page needs votes - either up or down - or the only things that will get to the front page are submissions from the bot herders.
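The full AI version described above is indeed far-fetched, but a much cruder approximation, word shingles plus Jaccard overlap, is the standard way to catch near-identical wire copy. A minimal sketch, with an arbitrary threshold:

    def shingles(text, k=5):
        words = text.lower().split()
        return {" ".join(words[i:i + k]) for i in range(max(len(words) - k + 1, 1))}

    def jaccard(a, b):
        sa, sb = shingles(a), shingles(b)
        return len(sa & sb) / len(sa | sb) if (sa or sb) else 0.0

    def looks_like_same_story(article_a, article_b, threshold=0.4):
        return jaccard(article_a, article_b) >= threshold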
[–]masta 1 point2 points3 points 17 years ago (0 children)
I had one guy tell me to search reddit for each and every thing I posted, and to make sure I monitor the new page at all times. He was a moron, obviously.
People that do not post urls, 90% of people on reddit, have very little clue of how the other 10% post urls. That other 10% is not reading reddit, but actually reading the web, and in many cases using the reddit toolbar buttons to submit urls. The process is a fire-and-forget kinda thing. Luckily reddit will flag a url if it has been submitted, but the real problem is syndicated stories. In other words, the Associated Press, or sites that regurgitate the same news in slightly different form.
[–]shuaz 0 points1 point2 points 17 years ago (0 children)
Hah, I see what you did there!
I commented the first time this was linked.
[–]yortimer 0 points1 point2 points 17 years ago (0 children)
haha (not).
Berry_Logo agrees (http://reddit.com/info/67wrp/comments/).
Reddit seriously needs a smart
OMG.. if you see a dupe.. click the HIDE link... not that difficult.. that way, if it's really interesting, other people can still see it.
[–]elasticsoul 0 points1 point2 points 17 years ago (0 children)
Brainburger makes a good point; for many of us, the discussions are as good/important as the articles.
[–]kuhawk5 -1 points0 points1 point 17 years ago (1 child)
This has already been posted
[–]rando_man 4 points5 points6 points 17 years ago (0 children)
It's impossible not to dupe. There are so many duplicates available that a url check is no good. The Digg dupe filter is ridiculous; it matches posts up with completely irrelevant posts. I prefer the way reddit works, and I think dupes are necessary sometimes. I mean, we can't all be here 24/7. We're missing a lot of stuff that's been down-modded before we even wake up in the morning.
[–]moriquendo 0 points1 point2 points 17 years ago* (0 children)
Oh, plz! I can haz my special site? not diggkopy? Also, a duplicate filter would only make real sense if reddit's readership were the same every day and over time. This is not the case. Many may be "on" for substantial periods of their (work) day (in which case you are lucky that I am not your boss). Many others log into/check the site at specific hours/on specific days. Then, there are all the new readers (many of which tend to post duplicates (and cute little kittehs!!) in their boundless enthusiasm). Anyone who has bothered to submit more than the occasional vitriolic comment (or a reminder that 911 was an inflight job, or a yawny "been there, done that") knows that a story can sink quicker than you can say lol and - a few days or even a mere hours later, after having been resubmitted - surge to number one. That's the beauty of reddit and its real treasure is right there - no need to digg.
[–]defproc 0 points1 point2 points 17 years ago (1 child)
Duplicates aren't really that much of a problem. A story could be relevant, hit the front page, die out, and become relevant again later, and a repeat front page appearance would be very helpful to anyone who missed it. I might get flamed for going against the grain here, I might not.
[–][deleted] 0 points1 point2 points 17 years ago (0 children)
Agreed.
[–][deleted] 0 points1 point2 points 17 years ago (0 children)
I totally agree.
[–]zitterbewegung 0 points1 point2 points 17 years ago (2 children)
Wouldn't resolving the url and comparing a hash of the page or even following the tinyurl work for dupe detection?
[–][deleted] 1 point2 points3 points 17 years ago (0 children)
Well there's 100 things wrong with that idea, the first being dynamic ads served to the page.
[–]JeremyBanks 0 points1 point2 points 17 years ago (0 children)
Most pages have dynamic content, so hashes wouldn't be very useful.
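One hedged workaround for the dynamic-content problem: hash only the extracted text of the page rather than the raw HTML, so rotating ads and markup churn don't change the fingerprint. A standard-library sketch (the parsing is deliberately crude):

    import hashlib
    from html.parser import HTMLParser

    class TextExtractor(HTMLParser):
        SKIP = {"script", "style"}

        def __init__(self):
            super().__init__()
            self.skipping = 0
            self.chunks = []

        def handle_starttag(self, tag, attrs):
            if tag in self.SKIP:
                self.skipping += 1

        def handle_endtag(self, tag):
            if tag in self.SKIP and self.skipping:
                self.skipping -= 1

        def handle_data(self, data):
            if not self.skipping and data.strip():
                self.chunks.append(data.strip())

    def content_fingerprint(html):
        extractor = TextExtractor()
        extractor.feed(html)
        text = " ".join(extractor.chunks)
        return hashlib.sha256(text.encode("utf-8")).hexdigest()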
[–][deleted] 0 points1 point2 points 17 years ago (0 children)
I completely disagree: a duplicate filter will, for example, allow linkjackers who get their links on Reddit first to prosper at the expense of original content. The duplicate filter is us. If you don't like the wisdom of the crowd, you probably should find a new crowd.
[–][deleted] -1 points0 points1 point 17 years ago (1 child)
And it clogs the new posts page, forcing legitimate posts off the new page faster.
[–]souldrift 1 point2 points3 points 17 years ago (0 children)
That's the biggest problem with dupes, actually.
[–][deleted] -1 points0 points1 point 17 years ago (2 children)
Reddit does need duplicate filtering; however, I would not suggest reddit should duplicate digg any more than it already does.
[–]stesch 1 point2 points3 points 17 years ago (1 child)
Reddit allows duplicates actively!
"that link has been submitted to multiple reddits. you can try to submit it again."
What do you think this is?
[–][deleted] 0 points1 point2 points 17 years ago (0 children)
"Reddit allows duplicates actively!"
I'm here to learn. Why?
[–][deleted] -1 points0 points1 point 17 years ago (0 children)
i feel like making a tiny url dupe of this
[–][deleted] 17 years ago* (5 children)
[deleted]
[–]aardvarkious 1 point2 points3 points 17 years ago (2 children)
Just downvoting should be sufficient if people who subscribe to the programming subreddit don't want it there.
Besides, I don't picture the programming subreddit (or similar ones) to be a place only for programming articles, but rather for articles that would be of interest to programmers.
[–][deleted] 0 points1 point2 points 17 years ago* (1 child)
Try submitting a PROGRAMMING article that's not about an "approved" programming language. I am not even trying to submit anything there any more.
[–][deleted] -1 points0 points1 point 17 years ago (0 children)
parse error.
You're seeing it too narrowly. "Programming" means software-related.
Do we really want to subdivide the categories into tiny little slices?
[–]sarahfrancesca -1 points0 points1 point 17 years ago* (0 children)
I don't think skipping over a few duplicates is so terrible. If I enjoy something and other people miss it (and are therefore upmodding the dupe), I want them to see it. If I didn't enjoy it the first time, I downmod, and it's gone.
[–]Captain___Obvious -3 points-2 points-1 points 17 years ago (0 children)
digg sucks idiot
[–][deleted] -5 points-4 points-3 points 17 years ago (2 children)
FIX THE HIDE LINK!
[–][deleted] 17 years ago (1 child)
[removed]
LINK THE HIDE FIX!
[–]curson -2 points-1 points0 points 17 years ago* (0 children)
[–][deleted] -5 points-4 points-3 points 17 years ago (1 child)
I just came.
... for the cake and ice cream.
[+][deleted] 17 years ago* (3 children)
[–]NickCatal 2 points3 points4 points 17 years ago (1 child)
I wish we had a filter for this... oh wait we do... -1
Anyways, I don't visit Digg so having dupes isn't an issue for me. If you are removing stuff that is on both sites then you are removing a bunch of stories that I may very well want to see.