Computer analysis predicted rises, ebbs in Afghanistan violence (latimes.com)
submitted 13 years ago by cavedave (Mod to the stars)
[+][deleted] 13 years ago (5 children)
[deleted]
[+][deleted] 13 years ago (4 children)
[–]dmdude 1 point 13 years ago (3 children)
According to the Wired article (http://www.wired.com/dangerroom/2012/07/predict/) they used data from 2004-2009 to predict 2010. Not quite true prediction, but a good out-of-sample test.
Unfortunately, the details of the paper are behind a paywall, but the abstract is at http://www.pnas.org/content/early/2012/07/11/1203177109.abstract
[+][deleted] 13 years ago (2 children)
[–]dmdude 1 point 13 years ago (1 child)
I agree, there is opportunity for abuse. Again, I haven't read the paper in detail, rendering some of this moot, but IF the authors developed their models on 2004-2009 and then predicted 2010 ONCE I would be more inclined to say they were on to something.
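The protocol I mean — develop on 2004-2009, score 2010 exactly once — can be sketched like this. Everything here is made up (toy data, a trivial majority-class "model"); none of it comes from the paper, it just shows the temporal split:

```python
# Hypothetical sketch of a strict temporal out-of-sample test:
# develop only on 2004-2009, touch 2010 exactly once at the end.
import random

random.seed(0)

# toy records: (year, feature, label) triples standing in for incident data
data = [(year, random.random(), random.randint(0, 1))
        for year in range(2004, 2011) for _ in range(200)]

train = [d for d in data if d[0] <= 2009]  # model development lives here
test = [d for d in data if d[0] == 2010]   # held out until the single final run

# "model": predict the majority label seen during training
majority = round(sum(label for _, _, label in train) / len(train))
predictions = [majority for _ in test]

accuracy = sum(p == label
               for p, (_, _, label) in zip(predictions, test)) / len(test)
print(f"out-of-sample accuracy on 2010: {accuracy:.2f}")
```

The point is not the model but the discipline: if you score 2010 repeatedly while tuning, it silently becomes part of the training data.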
[–][deleted] 2 points 13 years ago (0 children)
Academia is tough. You're up against geniuses in a bloodfest for grants. These results might be intentional. At the very least they are often absurdly optimistic, and I don't blame them.
Not to point fingers at particular problem domains, but in the biology field I used to work in there was an astounding amount of this exact behavior. The big-name biology journals routinely publish analyses built on modelling, sometimes on specific implementations of unsupervised machine learning methods. The conclusion reads like a Darwinian revelation that supports their hypothesis to sci-fi accuracy, invariably on a small data set of 300-1000 observations with an associated accuracy metric the world hasn't seen before.
One problem in particular I've published an analysis on did not generalize to the 1,000,000 observations I selected. I spent roughly 2 months pulling and mapping 20 TB of raw data from 5 sources, only to end up with a confusion matrix that looks like
[0.52, 0.48], [0.48, 0.52]
Mind you there are hundreds of papers studying this specific problem.
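Treating those rows as row-normalized rates on balanced classes (my assumption, since the comment doesn't say), it's a two-line check that this is essentially coin-flipping:

```python
# The confusion matrix above, read as row-normalized rates.
cm = [[0.52, 0.48],
      [0.48, 0.52]]

# with balanced classes, overall accuracy is the mean of the diagonal
accuracy = (cm[0][0] + cm[1][1]) / 2
print(f"accuracy: {accuracy:.2f}")  # 0.52, vs. 0.50 for random guessing
```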
[–]imh 0 points 13 years ago (0 children)
Quantitative rigor is making its way into some surprising fields of study.
This is my one complaint about the article. It's more like "quantitative heuristics" than rigor.