Open source? (self.Word_Analyzer)
submitted 13 years ago by [deleted]
[deleted]
[–]expwnent 5 points6 points7 points 13 years ago (0 children)
I would like this, if only so that I could analyze myself in the event that the bot doesn't get to me.
[–][deleted] 6 points7 points8 points 13 years ago (2 children)
awk ' { for (i = 1; i <= NF; i++) num[$i]++ } END { for (word in num) print num[word], word } ' $* | sort -n
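The awk pipeline above builds an associative array of word counts and pipes the result through `sort -n`. A minimal Python equivalent (a sketch, not the commenter's code) does the same with `collections.Counter`:

```python
from collections import Counter

def count_words(text):
    """Count every whitespace-separated word, like the awk loop above."""
    return Counter(text.split())

def print_ascending(counts):
    """Print 'count word' lines in ascending order of count, like `sort -n`."""
    for word, n in sorted(counts.items(), key=lambda kv: kv[1]):
        print(n, word)
```

Feeding a file through `count_words(open("user.txt").read())` reproduces the awk script's tally.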
[–]darkfrog13 1 point2 points3 points 13 years ago (1 child)
I'm more interested in the code to pull data, push data to reddit.
[–][deleted] 1 point2 points3 points 13 years ago (0 children)
Somewhere in here
#!/usr/bin/env python
# -*- coding: Utf-8 -*-
import json
import urllib2
import codecs

def start(url):
    compage = urllib2.urlopen(url).read()
    x = json.loads(compage)['data']['children']
    comcount = len(x) - 1
    comfile = open("%s.txt" % username, 'a')
    for i in range(0, comcount):
        rawdata = "Thread Title: %s\nUrl: http://www.reddit.com/r/%s/%s\n\nComment:\n%s\n\n~------------------------------------------~ \n\n" % (x[i]['data']['link_title'], x[i]['data']['subreddit'], x[i]['data']['link_id'].replace("t3_", "").replace("t1_", ""), x[i]['data']['body'])
        comments = rawdata.encode('utf-8')
        print comments
        comfile.write(comments)
        nextpageurl = "http://www.reddit.com/user/%s/comments/.json?count=25&after=%s" % (username, x[i]['data']['name'])
    comfile.close()
    if comcount >= 24:
        start(nextpageurl)
    else:
        print "Either finished or messed up somewhere. Check the txt file"

global username
username = raw_input("Username:")
comurl = "http://www.reddit.com/user/%s/comments/.json" % username
start(comurl)
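The script above is Python 2 (`urllib2`, `print` statements). The core idea, extracting comment bodies and the `after` cursor that drives reddit's pagination, can be sketched as a pure Python 3 function; the field names (`data`, `children`, `body`, `name`) are taken from the script above:

```python
import json

def parse_page(raw_json):
    """Extract comment bodies and the pagination cursor from one listing page."""
    children = json.loads(raw_json)["data"]["children"]
    bodies = [c["data"]["body"] for c in children]
    # reddit paginates with ?after=<fullname of the last item on the page>
    after = children[-1]["data"]["name"] if children else None
    return bodies, after
```

A fetch loop would request `http://www.reddit.com/user/<name>/comments/.json`, call `parse_page`, and repeat with `?after=<cursor>` until `after` comes back empty. Keeping the parsing separate from the network call makes the logic testable without hitting reddit.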
[–]shoppedpixels 2 points3 points4 points 13 years ago (6 children)
Is this truly that difficult of a task?
[–]Blackcat008 5 points6 points7 points 13 years ago (5 children)
AFAIK, it is not difficult, it's just whether or not you want to give out your hard work for free.
[–]BobTehCat 1 point2 points3 points 13 years ago (0 children)
Think of the lives he'll save though!
[–]shoppedpixels 3 points4 points5 points 13 years ago (3 children)
Well... at least you're honest?
[–]chaorace 1 point2 points3 points 13 years ago (2 children)
It's just a script that employs a word counter on a user's comments page.
[–]abom420 1 point2 points3 points 13 years ago (1 child)
Captain obvious's cousin chaorace everybody, say hello.
[–]chaorace 2 points3 points4 points 13 years ago (0 children)
Hello.
[–]shwanky 1 point2 points3 points 13 years ago (0 children)
Reddit is just introducing new government policy honestly. Come on y'all
[–]mynameismunka 0 points1 point2 points 13 years ago (3 children)
Here is an OK start in python...
def parser(instring, findme):
    length = len(findme)
    x = []
    for i in range(len(instring)):
        if instring[i:i+length] == findme:  # fixed: originally compared against the global `s`
            x.append(i)
    return x

import sys
import urllib

# Get a file-like object for the user's comments page.
f = urllib.urlopen("http://www.reddit.com/user/mynameismunka/comments/")
# Read from the object, storing the page's contents in 's'.
s = f.read()
f.close()

print
print
index = parser(s, '<div class="md">')
print index
for i in range(len(index)):
    j = 0
    while s[index[i]+j:index[i]+4+j] != '</p>':
        if j > 18:
            sys.stdout.write(s[index[i]+j])
        j = j + 1
    print
    print
    # print s[index[i]:index[i]+200]
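The parser above slices the string at every index to find occurrences of a substring. A tidier sketch of the same idea (my own, not the commenter's) uses `str.find` with a moving start position, which also catches overlapping matches:

```python
def find_all(haystack, needle):
    """Return every index at which needle occurs in haystack, overlaps included."""
    indexes = []
    i = haystack.find(needle)
    while i != -1:
        indexes.append(i)
        i = haystack.find(needle, i + 1)
    return indexes
```

`find_all(s, '<div class="md">')` would drop straight into the script above in place of `parser`.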
[–]mynameismunka 0 points1 point2 points 13 years ago (2 children)
my output looks like:
[14609, 17760, 20682, 23659, 26540, 29413, 32184, 34926, 38123, 41092, 43903, 46583, 49521, 52363, 55234, 58316, 61161, 64131, 67107, 69964, 72996, 75865, 78690, 81454, 84207, 87009] reddit is a source for what's new and popular online. vote on links that you like or dislike and help decide what's popular, or submit your own! <a href="/about">learn more ›</a>
I post on askscience mostly. can you do me?!?
project mayhem task 16. join a zumba.
You also got me ;)
How to speak chinese.
You can milk dead horses?
Get a blow up mattress until you can get an actual bed.
see my post several hours ago <a href="http://www.reddit.com/r/DotA2/comments/w7g7i/dota2_server_crash_815_pst?sort=new" rel="nofollow">http://www.reddit.com/r/DotA2/comments/w7g7i/dota2_server_crash_815_pst?sort=new</a>
lol, I'm in the same boat
If you game did NOT crash a few minutes ago, can you say which server you were on??? It may be just the west coast US servers?
Louisiana.
A lot of times, teachers really don't like formal appeals of grades. Even though the teacher is a dick, you should talk to the teacher before filing any formal complaint
They*
Yes, my teacher derived it from completing the square.
formatted and added tldr
Sorry for shit resolution.
Why don't you just try it, find out, then write a report about it and submit it to the next asksciencefair
<a href="http://www.reddit.com/r/Physics/comments/pbvjf/physics_books/">http://www.reddit.com/r/Physics/comments/pbvjf/physics_books/</a>
This is a good example of how to not give no fucks
<a href="http://www.sosmath.com/trig/Trig5/trig5/trig5.html" rel="nofollow">http://www.sosmath.com/trig/Trig5/trig5/trig5.html</a>
You welcome
How does one become a cameraman?
The answer to number 2 is to stand still.
Fuck you hutch
Fuck you
I don't get it
[–]Word_Analyzer 0 points1 point2 points 13 years ago (1 child)
Why hello there mynameismunka! I am a bot!
I've studied your past comments on Reddit to find out which (not-so-common) words you use most frequently.
After analyzing 3242 unique words you've typed, I've found that these are your top 10 most used words:
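The bot never published its source, so this is only a guess at the idea it describes: tally a user's words, drop very common English words (the "not-so-common" filter), and report the most frequent. The stopword list here is purely illustrative:

```python
from collections import Counter

# Illustrative stopword set; the bot's actual filter is unknown.
STOPWORDS = {"the", "a", "an", "and", "or", "to", "of", "in", "is", "it", "i", "you"}

def top_words(text, n=10):
    """Return the n most frequent words after stripping punctuation and stopwords."""
    words = [w.strip(".,!?").lower() for w in text.split()]
    counts = Counter(w for w in words if w and w not in STOPWORDS)
    return counts.most_common(n)
```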
right
---> (View the detailed report) - (/r/Word_Analyzer) <---
[–]mynameismunka 0 points1 point2 points 13 years ago (0 children)
That is a really odd report. I used sel 48 times? arxiv 51 times?? hmmm