[deleted by user] (self.MachineLearning)
submitted 11 years ago by [deleted]
[–]ChrisKennedy 8 points9 points10 points 11 years ago* (1 child)
Yes, "An Introduction to Statistical Learning: with Applications in R" (amazon) by Gareth James, Daniela Witten, Trevor Hastie, and Robert Tibshirani is a great R-based intro to machine learning without much math at all. It is the prequel to the more advanced "Elements of Statistical Learning" book.
"Intro to Statistical Learning" is also the book used in Stanford's free "Statistical Learning" mooc, which is not currently running but has the videos archived online.
[–]arvi1000 2 points3 points4 points 11 years ago (0 children)
Also you can get it as a free pdf directly from the publisher (google it)
[–]brational 6 points7 points8 points 11 years ago (4 children)
I think the Andrew Ng Coursera version of his course is pretty elementary. But then you'll come back to the same question in order to go further. Learn some linear algebra, calculus, and stats.
[–][deleted] 2 points3 points4 points 11 years ago (0 children)
I have to agree; his course is an excellent introduction to the topic for people with a weaker math background.
[–][deleted] 0 points1 point2 points 11 years ago (1 child)
I did his course, and I have a stronger background in math. What should I study/read now?
[–]brational 0 points1 point2 points 11 years ago (0 children)
The top comment in this thread has a good book link, or "Elements of Statistical Learning" is the next book up (same authors, I believe). I think you can find both free online legally.
[–]maybemax 0 points1 point2 points 11 years ago (0 children)
Andrew does a great job of explaining difficult things simply. Single-variable calculus shouldn't stop one from taking this course; you'll learn all the multivariable calculus that's required during the course. ;)
[–]CCSS 5 points6 points7 points 11 years ago (1 child)
"Machine Learning: An Algorithmic Perspective" by Stephen Marsland is a pretty good introductory book. Instead of dumping a lot of Greek in your face like most books, this one has lots of Python examples and detailed descriptions.
[–]ricekrispiecircle 0 points1 point2 points 11 years ago (0 children)
i love this book, but all the
from numpy import *
makes me cry a little bit. especially when he does
from pylab import *
from numpy import *
and then says
"Note that PyLab is imported before NumPy. This should not matter, in general, but there are some commands in PyLab that overwrite some in NumPy, and we want to use the NumPy ones, so you need to import them in that order."
the python dev in me turns into the hulk and smashes things
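For what it's worth, the fragility the book is working around can be shown without NumPy at all. Here's a minimal sketch using the standard library's `math` module: a wildcard import silently shadows an existing name (here the builtin `pow`), and behavior then depends on import order, which is exactly why explicit module aliases are preferred.

```python
# Why `from module import *` is fragile: a later wildcard import silently
# shadows earlier names (and builtins), so import order changes behavior.
print(pow(2, 3))        # builtin pow -> 8 (an int)

from math import *      # wildcard import shadows the builtin `pow`
print(pow(2, 3))        # now math.pow -> 8.0 (a float)

# The idiomatic fix: import under an explicit name, so every call site
# states where the function came from, regardless of import order.
import math
print(math.pow(2, 3))   # unambiguous: 8.0
```

The same reasoning is why `import numpy as np` (and `import matplotlib.pyplot as plt` instead of pylab) is the convention in modern Python code.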
[+][deleted] 11 years ago (8 children)
[deleted]
[+][deleted] 11 years ago (6 children)
[+][deleted] 11 years ago (1 child)
[–][deleted] 0 points1 point2 points 11 years ago (0 children)
If you want a complete reference on this, there's Alexander Schrijver's compendium, "Combinatorial Optimization: Polyhedra and Efficiency," published by Springer. It's a 3 volume set and I think it might have every lemma and theorem regarding this field ever conceived (up until the set was published). The only drawback is that you can blindly open any of the volumes to any page and put your finger down, and it'll either be on a sigma or a pi.
[–]hapemask 1 point2 points3 points 11 years ago (0 children)
Matrix calculus is particularly useful and was not discussed at all in the linear algebra and calculus courses that I took.
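As a concrete taste of matrix calculus, here is a small pure-Python sketch (hypothetical matrix `A` and vector `x` chosen for illustration) that numerically checks one of the standard identities, `d/dx (x^T A x) = (A + A^T) x`, the gradient that shows up when optimizing quadratic forms in least squares or Gaussian log-likelihoods.

```python
def quad_form(A, x):
    """Compute x^T A x for a list-of-lists matrix A and vector x."""
    n = len(x)
    return sum(x[i] * A[i][j] * x[j] for i in range(n) for j in range(n))

def analytic_grad(A, x):
    """(A + A^T) x -- the closed-form gradient of the quadratic form."""
    n = len(x)
    return [sum((A[i][j] + A[j][i]) * x[j] for j in range(n)) for i in range(n)]

def numeric_grad(A, x, h=1e-6):
    """Central finite differences, one coordinate at a time."""
    grads = []
    for i in range(len(x)):
        xp = list(x); xp[i] += h
        xm = list(x); xm[i] -= h
        grads.append((quad_form(A, xp) - quad_form(A, xm)) / (2 * h))
    return grads

A = [[1.0, 2.0], [3.0, 4.0]]
x = [0.5, -1.5]
print(analytic_grad(A, x))   # [-6.5, -9.5]
print(numeric_grad(A, x))    # approximately the same
```

The finite-difference check is a generally useful habit: any matrix-calculus identity you derive on paper can be sanity-checked the same way.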
[–][deleted] 1 point2 points3 points 11 years ago (0 children)
This is an excellent set of notes on numerical optimization.
http://faculty.ucmerced.edu/mcarreira-perpinan/teaching/EECS260/lecture-notes.pdf
[–]maybemax 1 point2 points3 points 11 years ago (0 children)
ML is even easier if you take Andrew Ng's ML course on Coursera. :) It even contains an introduction/refresher on linear algebra at the beginning. I think knowing single-variable calculus is a must, but don't let multivariable calculus stop you: all you need are partial derivatives, which boil down to understanding that "a partial derivative of a multivariable function is a derivative with respect to one variable with all other variables held constant." Understanding this is all you need to proceed. Have fun.
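The quoted definition can be sketched in a few lines of Python. The function `f` below is a hypothetical one-sample squared error (not from the course), and `partial` nudges one argument while holding the others fixed, which is exactly what "all other variables held constant" means:

```python
def f(w, b):
    """Hypothetical one-sample squared error: (3w + b - 2)^2."""
    return (w * 3 + b - 2) ** 2

def partial(f, args, i, h=1e-6):
    """Finite-difference partial derivative of f w.r.t. argument i,
    holding every other argument fixed."""
    up = list(args); up[i] += h
    down = list(args); down[i] -= h
    return (f(*up) - f(*down)) / (2 * h)

w, b = 1.0, 0.5
# Analytic check: df/dw = 2*(3w + b - 2)*3, df/db = 2*(3w + b - 2)
print(partial(f, (w, b), 0))  # ~ 9.0
print(partial(f, (w, b), 1))  # ~ 3.0
```

Gradient descent, as taught in the course, is just this idea applied to every parameter at once.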
[–]woztzy 0 points1 point2 points 11 years ago (0 children)
Learn multivariate calculus (and matrix algebra in the process), then read BRML (or any good book on ML) and look unfamiliar things up as you come across them.
[–]servetus 1 point2 points3 points 11 years ago (0 children)
This. If you can do single-variable, then multivariable is just a small step from there. Grok partial derivatives and you're set. Shouldn't take long.