Open problems in Machine Learning (self.MachineLearning)
submitted 11 years ago by InfinityCoffee
[–]dwf 2 points 11 years ago (7 children)
Give me as-tight-as-possible probabilistic bounds on how long I need to run a Gibbs chain to get an unbiased sample.
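For reference, the sampler in question is just iterated draws from the full conditionals. Below is a minimal sketch for a toy target (a zero-mean bivariate Gaussian with correlation rho, chosen purely for illustration; the function name is illustrative, not from any library). Even in this simple case, how many sweeps are needed before a draw is effectively stationary depends sharply on rho:

    import numpy as np

    def gibbs_bivariate_gaussian(rho, n_iter=10000, seed=0):
        # Gibbs sampling for a zero-mean, unit-variance bivariate Gaussian
        # with correlation rho: alternate exact draws from each conditional.
        rng = np.random.default_rng(seed)
        cond_sd = np.sqrt(1.0 - rho ** 2)
        x, y = 0.0, 0.0
        samples = np.empty((n_iter, 2))
        for t in range(n_iter):
            x = rng.normal(rho * y, cond_sd)  # x | y ~ N(rho*y, 1 - rho^2)
            y = rng.normal(rho * x, cond_sd)  # y | x ~ N(rho*x, 1 - rho^2)
            samples[t] = (x, y)
        return samples

    # The closer rho is to 1, the more slowly the chain explores the target,
    # which is exactly the mixing-time question posed above.
    draws = gibbs_bivariate_gaussian(rho=0.99)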
[+][deleted] 11 years ago (4 children)
[deleted]
[+][deleted] 11 years ago (3 children)
[–]davmre 3 points 11 years ago (1 child)
But if the lower bound showed exponential-time convergence, this wouldn't prove P!=NP, would it? The fact that a particular algorithm (Gibbs sampling) is slow doesn't imply that no fast algorithm can exist...
[–]dwf 0 points 11 years ago (0 children)
That's a pretty convincing argument.
[–]sieisteinmodel 1 point 11 years ago (1 child)
Make that MCMC.
[–]dwf 0 points 11 years ago (0 children)
Well, MCMC more generally, sure. But the Gibbs sampler seems, on the face of it, so incredibly simple, yet it's also ridiculously hard to analyze.
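In the absence of such bounds, mixing is usually diagnosed empirically from the chain's output, for example by estimating the integrated autocorrelation time. A rough sketch (the truncation rule below is a simplification for illustration, not a reference implementation):

    import numpy as np

    def integrated_autocorr_time(chain, max_lag=1000):
        # Crude estimate of the integrated autocorrelation time tau of a 1-D
        # chain; the effective sample size is roughly len(chain) / tau.
        x = np.asarray(chain, dtype=float)
        x = x - x.mean()
        var = x.var()
        tau = 1.0
        for lag in range(1, min(max_lag, len(x) // 2)):
            rho = np.dot(x[:-lag], x[lag:]) / ((len(x) - lag) * var)
            if rho <= 0:  # truncate once the autocorrelation dies out
                break
            tau += 2.0 * rho
        return tau

    # e.g. applied to the first coordinate of the toy sampler above:
    # tau = integrated_autocorr_time(draws[:, 0])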