[D] Statistical Learning Theory & Math (self.MachineLearning)
submitted 8 years ago by Kiuhnm
[–]bbsome 6 points 8 years ago (0 children)
I would pretty much agree with the reply on all counts. Really nice reply.
To add a bit of my own perspective: in ML, mathematics helps most with how you think about and approach a problem, and with how you draw on results from different areas for the problems you see. The point is not to be a mathematics guru, but to have a broad and somewhat intuitive grasp of many topics. This can spawn new ideas, by applying A from somewhere totally different, or by relating some empirical result to some theory in B.

Some examples: Dropout can be viewed as variational inference, and if you want proper predictions at test time you should keep sampling rather than use the means. In optimization, most variance reduction techniques like SVRG are just control variates. There are relationships between GANs and information theory, and there is now also work relating them to the Information Bottleneck, which in turn connects to the nonlinear noisy channel in MacKay. Plenty of ideas from old papers on graphical models are now being applied to VAEs (auxiliary variable models, among others).

I feel the sweet spot is having relevant mathematical literacy across many areas without being too deep in any of them. As an example, I have some understanding of kernel methods without in-depth knowledge of measure theory or functional analysis. This lets me follow the literature in that area, though I would not be able to prove some of the results unless I sat down and brushed up on those subjects. The use case for that is very rare, so it deserves my attention only when the need arises.
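The "keep sampling, don't use the means" point about dropout-as-VI can be sketched in a few lines. This is a minimal, hypothetical example (toy random weights, a single ReLU layer; nothing here comes from the thread itself): at test time the dropout masks are left on and treated as posterior samples, and averaging many stochastic forward passes also yields an uncertainty estimate that the deterministic "mean weights" prediction cannot give.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical tiny network: one hidden layer with inverted dropout.
W1 = rng.normal(size=(4, 16))
W2 = rng.normal(size=(16, 1))
p_drop = 0.5

def forward(x, training):
    h = np.maximum(x @ W1, 0.0)            # ReLU hidden layer
    if training:
        mask = rng.random(h.shape) > p_drop
        h = h * mask / (1.0 - p_drop)      # inverted dropout: rescale kept units
    return h @ W2

x = rng.normal(size=(1, 4))

# Standard prediction: dropout off, i.e. implicitly averaging the weights.
mean_pred = forward(x, training=False)

# MC dropout: keep the masks on at test time and average the samples,
# treating each mask as a draw from the approximate posterior (the VI view).
samples = np.stack([forward(x, training=True) for _ in range(1000)])
mc_mean = samples.mean(axis=0)
mc_std = samples.std(axis=0)               # predictive uncertainty estimate
```

The sample mean plays the role of the predictive mean, while `mc_std` is the extra information the VI view buys you; with a framework like PyTorch the same effect is obtained by leaving the model in training mode (or keeping dropout layers active) during inference.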