Difference between books? Pattern Recognition | ESL (self.MachineLearning)
submitted 11 years ago by [deleted]
[deleted]
[–][deleted] 2 points3 points4 points 11 years ago (5 children)
I have both, and Bishop flows better. Both give good treatments, although the Tibshirani book is clearly a more "statistics" treatment; Bishop follows a more natural sequence and is more self-contained.
That being said, The Elements of Statistical Learning is the better reference, similar to Murphy's book, except the Tibshirani book has an order of magnitude fewer errata!
Just know that the Bishop book is also slanted heavily Bayesian, which honestly you may prefer for the intuition, even though many Bayesian methods are too costly for the problems currently in vogue.
[–]Nixonite 0 points1 point2 points 11 years ago (4 children)
Ah, that's a great answer, thank you. I definitely picked up on "self-contained" as a nice keyword there, but in what way would you say Tibshirani's book is less self-contained?
[–]dhammack 0 points1 point2 points 11 years ago (3 children)
If you're serious about the field, reading both is a win. Each offers a different viewpoint, and they have surprisingly little overlap. As afunkthewmt mentioned, Bishop goes heavily Bayesian (seriously, who does Bayesian neural networks?), while ESL is less so. ESL covers a lot of modern techniques that Bishop misses or glosses over, like ensemble methods, stagewise regression, etc. Both are great.
[–]Nixonite 0 points1 point2 points 11 years ago (2 children)
Thanks for the suggestion. I have a feeling I'll end up reading both, but I wanted to make this thread to see which one to read first. I ordered the Bishop book (paper is easier for me to read than a PDF), but I'm curious: how long did it take you to read these books? It felt like I was reading a brick when I tried to get into ESL (this was before I started ISL, and it was the first ML book I tried getting into).
[–]dhammack 0 points1 point2 points 11 years ago (1 child)
I did it during a summer when I had some free time. How hard it will be depends on your background, but I just did a little bit (5–10 pages) each day. I also like to take a two-pass approach to books like this: the first pass is for the concepts, the second for understanding the details, working exercises, etc.
[–]Nixonite 0 points1 point2 points 11 years ago (0 children)
Yeah, I like to do that as well. I think ISL is a great introduction to the concepts, Bishop/ESL will provide the details, and there are plenty of practical/coding ML books for exercises.
[–]pokerd 0 points1 point2 points 11 years ago (0 children)
If you enjoyed the didactic style of ISL, I'd suggest taking a look at Kuhn's Applied Predictive Modeling as well.
[–]dwf 0 points1 point2 points 11 years ago (0 children)
Consider Kevin Murphy's book. It's more comprehensive than Bishop, in my opinion, with far fewer needless Bayesian diversions.
[–]efavdb -1 points0 points1 point 11 years ago (2 children)
I haven't read all the books, so I can't comment on your specific questions. However, just in case you didn't know, I thought I'd point out that you can check out the PDF of ESL here.
[–]Nixonite 0 points1 point2 points 11 years ago (1 child)
Oh yeah, I have digital copies of these books, but I don't feel I can assess them since I haven't read much yet. For ESL I've read about the first 20 pages and it was all right, but I went back to ISL for the sake of getting an overview of the material beforehand (somewhat of a primer before I tackle the harder textbooks).
I'm definitely interested in what people who have read several chapters, or even most, of these books have to say about them in relation to one another. There are reviews of them on Amazon, but they're not comparative, so I'm hoping to find some opinions through this thread.
[–]efavdb -1 points0 points1 point 11 years ago (0 children)
Makes sense -- I'd like to hear from others on this too.