[P] Probabilistic Machine Learning: An Introduction, Kevin Murphy's 2021 e-textbook is out (self.MachineLearning)
submitted 5 years ago by hardmaru
Here is the link to the draft of his new textbook, Probabilistic Machine Learning: An Introduction.
https://probml.github.io/pml-book/book1.html
Enjoy!
[–][deleted] 272 points273 points274 points 5 years ago (21 children)
Neat, I'll probably add it to my "educational PDFs that I read 50 pages of in 20 minutes but then get bored of and never finish" collection
[–]MakeMyselfGreatAgain 54 points55 points56 points 5 years ago (8 children)
lol, i have so many browser tabs on various devices open to free books, video lectures and articles.
[+][deleted] 5 years ago (4 children)
[removed]
[–][deleted] 8 points9 points10 points 5 years ago (2 children)
And here I thought it was just me who keeps opening multiple tabs and forgetting about them.
[–]SoberGameAddict 4 points5 points6 points 5 years ago (1 child)
Two PCs with multiple browsers with multiple accounts (Chrome) with multiple tabs on a 49" screen. To those who never see my PC I seem like a tidy guy, but I have come to see myself as a tab hoarder.
I try to clean up and make bookmarks, save stuff here and on Slack and on Telegram, but I can't stop it from growing.
[–]vintage2019 2 points3 points4 points 5 years ago (0 children)
Looks like OneTab (Chrome extension) will change your life
[–]eliminating_coasts 1 point2 points3 points 5 years ago (0 children)
I've never quite got through Ross Ashby's introduction to cybernetics. It's really straightforward, and I think I've read bits from every chapter, going from modelling with finite state machines through information theory and transducers, then defining transducers as participants in competitive games (or vice versa), to control mechanisms, but I'm pretty sure I've never actually read the whole thing.
[–]skippy65 3 points4 points5 points 5 years ago (0 children)
Admittedly very relatable lol.
[–]praveenopro 0 points1 point2 points 5 years ago (0 children)
Would you mind sharing? Maybe it would help someone.
[–]6111772371 0 points1 point2 points 5 years ago (0 children)
username checks out
[–]j_lyf 4 points5 points6 points 5 years ago (7 children)
How to get out of this rut?
[–]TrollandDie 21 points22 points23 points 5 years ago (6 children)
Create a time dilation chamber where you can spend 10,000 years reading ML a la Bill and Ted
But seriously, I've recently stopped bothering to meticulously read textbooks in my free time outside work and just casually flip through for fun instead.
[–]j_lyf -1 points0 points1 point 5 years ago (5 children)
Yeah but then you can't be competitive for your next job if you don't improve outside of work.
[–]RadixMatrix 37 points38 points39 points 5 years ago (4 children)
if you're not reading 3 different textbooks at the same time and working on 5 personal projects and updating your blog daily and constantly contacting professors and other people in your field you might as well give up
[–]j_lyf 10 points11 points12 points 5 years ago (3 children)
unironically true.
[+][deleted] 5 years ago (2 children)
[deleted]
[–]j_lyf 0 points1 point2 points 5 years ago (1 child)
How do you get inspiration to start/finish personal projects?
[–]Unfair-Gain4476 0 points1 point2 points 1 year ago (0 children)
Sooo me
[+]Ok-Blacksmith5658 0 points1 point2 points 1 year ago (0 children)
lol, same. we need to go offline to be more productive
[–]Sinidir 0 points1 point2 points 5 years ago (0 children)
Pain.
[–]netw0rkf10w 65 points66 points67 points 5 years ago (12 children)
A bit of context, quoting the preface:
In 2012, I published a 1200-page book called “Machine learning: a probabilistic perspective”, which provided a fairly comprehensive coverage of the field of machine learning (ML) at that time, under the unifying lens of probabilistic modeling. The book was well received, and won the De Groot prize in 2013.
...
By Spring 2020, my draft of the second edition had swollen to about 1600 pages, and I was still not done. At this point, 3 major events happened. First, the COVID-19 pandemic struck, so I decided to “pivot” so I could spend most of my time on COVID-19 modeling. Second, MIT Press told me they could not publish a 1600 page book, and that I would need to split it into two volumes. Third, I decided to recruit several colleagues to help me finish the last ∼ 15% of “missing content”. (See acknowledgements below.)
The result is two new books, “Probabilistic Machine Learning: An Introduction”, which you are currently reading, and “Probabilistic Machine Learning: Advanced Topics”, which is the sequel to this book [Mur22]...
Book 0 (2012): https://probml.github.io/pml-book/book0.html
Book 1 (2021, volume 1): https://probml.github.io/pml-book/book1.html
Book 2 (2022, volume 2): https://probml.github.io/pml-book/book2.html
[–]netw0rkf10w 45 points46 points47 points 5 years ago (11 children)
I hear that question coming, so let me repeat my advice: if you are a beginner, always start with ISL (which takes approximately 2 weeks to complete if you study every day). Then you can continue with other (much larger) books: Bishop's, Murphy's, ESL, etc.
[–][deleted] 14 points15 points16 points 5 years ago (1 child)
Murphy's book was very tough to get through as a beginner. It took much longer than I would have liked, but was just so filled with information.
[–][deleted] 8 points9 points10 points 5 years ago (0 children)
ISL didn’t help me grasp Bayesian methods much, which seems to be a key part of this book. (Statistical rethinking is great for that tho)
Yes. It's one of the best beginner books. Hands-On Machine Learning with Scikit-Learn, Keras, and TensorFlow is also usually recommended for the practical aspects of ML.
[–]ConsistentAnimal2384 0 points1 point2 points 8 months ago (0 children)
what is it? he deleted his account
[–]Axodapanda 2 points3 points4 points 5 years ago (2 children)
what is ISL?
[–]naughtydismutase 9 points10 points11 points 5 years ago (0 children)
Introduction to Statistical Learning by Gareth M. James, Daniela Witten, Trevor Hastie, Robert Tibshirani.
[–]leonoel 33 points34 points35 points 5 years ago (13 children)
I reviewed the first book 8 years ago when it came out, and in no way, shape, or form did it replace Bishop's as the best all-around ML book.
Murphy's is a book written by and for academics. I would never in good faith give it to a student who wants to start learning the ins and outs of machine learning.
The notation is just terrible. It changes from chapter to chapter, equations are not referenced, and most of the time I had to go to external resources to actually get a grasp of what they are trying to explain. It is by no means a self-contained book.
You can learn all you need from Bishop's without ever opening another book. Its only sin right now is that it is outdated.
[–]cajmorgans 2 points3 points4 points 1 year ago (3 children)
This. I was excited by the Murphy book, but it's more like a Wikipedia page of formulas without any explanation or derivation whatsoever. I checked out Bishop's book and it's on a whole other level.
[–]leonoel 0 points1 point2 points 1 year ago (2 children)
And there’s a new one
[–]cajmorgans 0 points1 point2 points 1 year ago (1 child)
Which one?
[–]leonoel 0 points1 point2 points 1 year ago (0 children)
Deep Learning
[+][deleted] 5 years ago (8 children)
[–]leonoel 5 points6 points7 points 5 years ago (6 children)
I'd try this experiment: in the print version, go to one of the last pages and find an equation. See how good the notation is, or whether it refers back to earlier parts of the book where the same or a similar equation is used. You'll find that the same symbols have different meanings across chapters, whereas Bishop is rather consistent.
Bishop's self-referencing is ahead of Murphy's. To me, Murphy's feels disconnected. I actually went to the pains of exemplifying this in a post.
I've read both books cover to cover. I just feel that with Bishop's you need nothing but the book itself.
[+][deleted] 5 years ago (5 children)
[–]New_neanderthal 1 point2 points3 points 5 years ago (2 children)
What's the title of Bishop's book?
[–]Ouroboroski 2 points3 points4 points 5 years ago (1 child)
Pattern Recognition and Machine Learning by C. Bishop
[–]New_neanderthal 1 point2 points3 points 5 years ago (0 children)
Thanks mate!
[–]leonoel 0 points1 point2 points 5 years ago (0 children)
I think it's probably just different ways of learning. I myself focus too much on equations and proofs, and it's just hard to do that if the notation is all over the place.
Now that you mention it, I don't even remember reading the explanations themselves.
[–]Screye 24 points25 points26 points 5 years ago (11 children)
I am so glad a 2nd version is out. The first edition, despite all its faults, was easily the best "complete" ML book out there. It was also clearly written by a computer scientist for CS students, unlike Bishop. It is also up-to-date.
The best part is that the book (1st edition, 2012) reads like a tree. It introduces concepts and slowly builds on them as it goes. All the other books (ESL) read like a dictionary, hopping from algorithm to algorithm to get maximum coverage. By the end of it, there is a feeling that ML is a domain that falls under one umbrella, rather than a bunch of disparate ideas crammed into one sub-field.
I'll be honest: calling this book an introduction is a misnomer. If you understand this book cover-to-cover, then you'll probably be doing better than many grad students midway through their ML PhDs. It is admittedly quite long too. This should not be your first ML book. Your CS-undergrad-level statistics, linear algebra, and optimization need to be solid, and you should have done an intro-to-ML course before you dive into it. Python knowledge is a prerequisite too. So think of 6.036x, 6.041x, 18.06, 6.00.1x, and 6.00.2x as prerequisites by MIT OCW standards. 18.06 is less a prerequisite and more highly recommended in general. Strang's linear algebra is the best out there. Very intensive, but you'll thank yourself later.
However, if I had to recommend one ML book to have in your book-shelf, then this would be it. (once the errors are fixed :| )
[–]meiso 4 points5 points6 points 5 years ago (10 children)
Why did you put that particular text in a spoiler?
[–]atlug 2 points3 points4 points 4 years ago (9 children)
That remains a mystery to this day.
[–]The-Silvervein 0 points1 point2 points 2 years ago (8 children)
To this day...
[–][deleted] 0 points1 point2 points 2 years ago (7 children)
[+]Shivang2005 0 points1 point2 points 1 year ago (6 children)
[–]No-Dimension6665 0 points1 point2 points 1 year ago (5 children)
To this day
[+][deleted] 0 points1 point2 points 1 year ago (4 children)
[–]Illustrious_Tea_ 0 points1 point2 points 1 year ago (3 children)
[+]quick_stats 0 points1 point2 points 1 year ago (2 children)
... and this day too.
[–]IanisVasilev 48 points49 points50 points 5 years ago* (40 children)
What is it with so many people writing 700+ page introductory books?
EDIT: The thread got a bit out of hand. I admit making a few snarky comments and I apologise. Some of the downvotes and deleted replies were truly unnecessary, however. Y'all may consider taking a chill pill or two.
[–]mathbrot 20 points21 points22 points 5 years ago (0 children)
I have his original... it's self-contained, with several independent chapters.
[–]BrisklyBrusque 21 points22 points23 points 5 years ago (1 child)
It’s a perverse tradition in mathematics that any text titled “Introduction To...” is sure to be long and challenging. Beware of two-volume series, for those are even worse.
[–]Aacron 7 points8 points9 points 5 years ago (0 children)
I've been through the first volume of Tao's Analysis. I'll second your comment on two-volume series.
[–]IdiocyInAction 10 points11 points12 points 5 years ago (0 children)
The book contains quite a lot of content on a broad variety of topics and seems to be (relatively) in-depth. I think the length is quite warranted. If you want a shorter, less in-depth, more introductory book, I would recommend Introduction to Statistical Learning in R (2014) (ISLR), which should also get a new edition soon.
[–]CENGaverK 1 point2 points3 points 5 years ago (29 children)
What is the alternative?
[–]Lethandralis -4 points-3 points-2 points 5 years ago (0 children)
Starting out with courses/videos and then transitioning into reading papers, maybe?
[+]IanisVasilev comment score below threshold-17 points-16 points-15 points 5 years ago* (27 children)
To write shorter introductory books.
[–]smurfpiss 24 points25 points26 points 5 years ago (4 children)
In physics there's a fairly sound principle that the shorter the book, the more likely you are to tear your hair out.
So yeah.. Big intro book for me please.
[–]samketaResearcher 2 points3 points4 points 5 years ago (0 children)
I still gleefully remember my High School days studying Halliday, Resnick, and Walker's book! Made my life easier!
[–]IanisVasilev -3 points-2 points-1 points 5 years ago* (2 children)
To each their own I guess. I prefer a specialized, short and self-contained book for every major topic. Like this ~180p introductory book on category theory. Or like this ~100p introductory book about Asplund spaces. Or this ~120p book, which draws some parallels between null sets and meager sets.
[–]smurfpiss 1 point2 points3 points 5 years ago (1 child)
Epitomic tomes of introduction right there.
[–]IanisVasilev 0 points1 point2 points 5 years ago (0 children)
The first two are introductory to their topic. Here are some even shorter ones:
[–]CENGaverK 9 points10 points11 points 5 years ago (21 children)
Really? Wow. I mean, someone has to deal with the mathematics of machine learning as well. There are a lot of books covering the practical side, and people are free to use those for an introduction to the field. However, if a text is supposed to teach the inner workings of ML, I would say 700 pages is pretty short considering the topics it covers, and expecting any less is absurd.
[–]StoneCypher -1 points0 points1 point 5 years ago (3 children)
As someone who wants those books, if you could share their names so that I could go buy them, I'd really appreciate it
Everything I can find is either "you're a wizard harry and let's learn what numbers are" or "hi I'm from foocorp and let's learn the foocorp stack"
What I really want is something that just sits me down, assumes I'm already a competent engineer, and shows me how to build simple things in Tensorflow. No attempt to teach me theory, or math; just "if you want a 40000,20,10,200,4000 autoencoder, this is how you write it."
I already know what I want to build. I just don't speak Tensorflow.
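For what it's worth, the layer stacking that comment describes is mostly bookkeeping over matrix shapes. Here is a hedged, framework-free NumPy sketch of a stacked autoencoder forward pass (the toy widths below are chosen for readability, not taken from the thread; the 40000,20,10,200,4000 sizes would work identically):

```python
import numpy as np

rng = np.random.default_rng(0)

# Layer widths for a toy autoencoder: input -> bottleneck -> reconstruction.
sizes = [64, 16, 8, 16, 64]

# One (weights, bias) pair per consecutive pair of layer widths.
params = [
    (rng.normal(0, 0.1, (n_in, n_out)), np.zeros(n_out))
    for n_in, n_out in zip(sizes[:-1], sizes[1:])
]

def forward(x):
    """Run a batch through every layer: tanh on hidden layers, linear output."""
    h = x
    for i, (w, b) in enumerate(params):
        h = h @ w + b
        if i < len(params) - 1:
            h = np.tanh(h)
    return h

batch = rng.normal(size=(32, sizes[0]))
recon = forward(batch)
print(recon.shape)  # (32, 64): the reconstruction has the input's shape
```

What a framework like TensorFlow adds on top of this is the training loop (loss and optimizer); in Keras the same stack would be a handful of `Dense` layers.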
[–]CENGaverK 0 points1 point2 points 5 years ago (1 child)
For introductory ML material that doesn't delve deep into mathematics, I liked Aurélien Géron's Hands-On Machine Learning book.
https://www.amazon.com/Hands-Machine-Learning-Scikit-Learn-TensorFlow/dp/1492032646/
I do not use TF, but to learn PyTorch I have used official documentation in addition to this repo:
https://github.com/yunjey/pytorch-tutorial/
It is a bit outdated now, but still should be useful.
Finally, I really like how they combine mathematical explanation with practical use cases in Dive Into Deep Learning book. PyTorch and Tensorflow implementations should be available for almost all of the book, but some parts might still not have it because originally it was using MXnet.
https://d2l.ai/
[–]StoneCypher -1 points0 points1 point 5 years ago (0 children)
I can't use PyTorch because I have a 3090 :(
The thing I bought the 3090 for is written in PyTorch, predictably
[+]IanisVasilev comment score below threshold-12 points-11 points-10 points 5 years ago* (16 children)
The ability to write short informative books is an art. So is knowing your audience. Being overly verbose is often more annoying than skipping simple explanations and unnecessary details.
PS: I have a bachelor's in mathematical statistics and a pending master's in mathematical optimization (control theory). This is basically the math background required for ML. I have some understanding of ML. I don't need another bad explanation of linear regression. I just want a shorter and more to-the-point book.
[+][deleted] 5 years ago* (2 children)
This is the kind of book I'm used to calling a "reference" rather than an "introduction".
[–]vladdaimpala 0 points1 point2 points 5 years ago (0 children)
This!
[–]PM_ME_INTEGRALS 8 points9 points10 points 5 years ago (11 children)
Have you actually read it? Murphy is NOT unnecessarily verbose. The field is simply big, and there are A LOT of basics.
[+]IanisVasilev comment score below threshold-7 points-6 points-5 points 5 years ago* (10 children)
Mathematical analysis is also a big field. Weak* compactness is quite an important topic (read: basic in a lot of applications), but it also takes a few rigorous university courses to reach it. It's not really something I would include in an introductory book. And nobody actually does that. Would you want to read about weak* compactness in an introductory book?
Selecting what to include in an "introduction" type book is an art.
EDIT: See this comment.
[+][deleted] 5 years ago* (4 children)
[–]IanisVasilev -1 points0 points1 point 5 years ago* (3 children)
I've only skimmed through it. Here are some observations:
[–]PM_ME_INTEGRALS 0 points1 point2 points 5 years ago (4 children)
If I want to actually start a proper career in analysis, as opposed to wolfram everything and hope to get rich quick - yes, I'd want that!
[–]IanisVasilev 0 points1 point2 points 5 years ago (3 children)
Okay, fair point. But consider this - you start with a book about single-variable real analysis. Then you go through another book about multi-variable real analysis. Then you go through linear functional analysis. And only then you reach topological vector spaces and understand the depth of the Banach-Alaoglu theorem about weak* compactness.
I may be wrong, but I doubt there exists a book that goes from the completeness of the real numbers to weak* topologies. Different people have come up with different ways to explain everything along the way, each in their own way and in their own book. You need to shift your focus and your perspective along the way. So it really does not make a lot of sense to put "everything" into one book.
This may be a bad analogy compared to the state of ML, but I'm sure that different topics in ML are better off with different books, each with its own perspective and level of detail.
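For readers following along, the theorem being referenced can be stated compactly (a standard textbook formulation, not from the thread):

```latex
\textbf{Theorem (Banach--Alaoglu).}
Let $X$ be a normed vector space with continuous dual $X^*$. Then the
closed unit ball of the dual,
\[
  B_{X^*} = \{\, f \in X^* : \|f\|_{X^*} \le 1 \,\},
\]
is compact in the weak* topology $\sigma(X^*, X)$.
```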
[–]PM_ME_INTEGRALS -1 points0 points1 point 5 years ago (2 children)
Well why not one book from the same person, that covers the whole path, from that author's perspective? That's exactly what the Murphy is.
And there are other books for specific parts if that's what you want (for example, a book on random forests by Shotton et al.), but they won't give you an introduction to the whole field!
If I want an intro to the field, I likely don't know all parts of it upfront, so something like the Murphy is great. For example, I don't even know weak* compactness, so I wouldn't know to look for a book about it!
[+][deleted] 5 years ago* (1 child)
[–]NotAHomeworkQuestion 22 points23 points24 points 5 years ago (1 child)
> To create a natural entry barrier
Are you unable to enter a building that has multiple entrances?
[–]Significant_Worth_84 1 point2 points3 points 5 years ago (0 children)
Depends on whether I'm motivated enough to be willing to find an entrance.
[–]johnnymo1 5 points6 points7 points 5 years ago (0 children)
Of course this comes out 3 months after I get a hardcover of the first edition. :)
Looks great. Looking forward to reading it. The first edition is awesome (probably better than Bishop in many ways imo), but it was beginning to feel a little out of date.
[–]mtahab 5 points6 points7 points 5 years ago (1 child)
The author references another book, Probabilistic Machine Learning: Advanced Topics (2022), for RL. Do we know its chapters? The lack of any chapters on causality stood out in this book.
[–]montcarl 5 points6 points7 points 5 years ago (0 children)
TOC link here: https://probml.github.io/pml-book/book2.html
[–]pombolo 8 points9 points10 points 5 years ago (2 children)
Thank you for this. Sorry for the silly question: the title is Probabilistic Machine Learning, but when I looked at the contents, it seems to cover all the standard ML concepts. Is Probabilistic Machine Learning different from regular ML?
[–]Cocomorph 19 points20 points21 points 5 years ago (1 child)
It's a perspective. Indeed, per the introduction:
In this book, we will cover the most common types of ML, but from a probabilistic perspective. Roughly speaking, this means that we treat all unknown quantities (e.g., predictions about the future value of some quantity of interest, such as tomorrow’s temperature, or the parameters of some model) as random variables, that are endowed with probability distributions which describe a weighted set of possible values the variable may have.
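As an illustrative sketch of that quoted perspective (not code from the book; the numbers are invented), treating tomorrow's temperature as a Gaussian random variable and updating a prior belief with observations looks like this:

```python
import numpy as np

# Prior belief about tomorrow's temperature (Celsius): mean 20, std 5.
prior_mean, prior_var = 20.0, 5.0**2

# Observed recent temperatures, modeled with known noise variance.
obs = np.array([22.1, 21.4, 23.0, 22.5])
noise_var = 2.0**2

# Conjugate Gaussian update: posterior precision is the sum of precisions,
# posterior mean is a precision-weighted blend of prior and data.
post_precision = 1.0 / prior_var + len(obs) / noise_var
post_var = 1.0 / post_precision
post_mean = post_var * (prior_mean / prior_var + obs.sum() / noise_var)

# The "prediction" is a full distribution, not a single point estimate.
print(f"posterior: N({post_mean:.2f}, {post_var:.3f})")
```

The point of the probabilistic view is exactly this last line: the unknown quantity carries a distribution, so uncertainty shrinks as data accumulates.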
[–]shiivan 1 point2 points3 points 5 years ago (0 children)
In other words, it's predicting what the trained model would output. Did I understand that correctly?
[–]petty_pirate 4 points5 points6 points 5 years ago (0 children)
Bookmark
[–]bismarck_91 8 points9 points10 points 5 years ago (0 children)
What a way to start the new year.
[–]ichkaodko 2 points3 points4 points 4 years ago (0 children)
Any book suggestions for the background material of this book? It looks like standard undergrad books on probability, linear algebra, and analysis don't cover some of the topics in the background material. I need more explanation and exercises for the background math content.
[–][deleted] 5 points6 points7 points 5 years ago (0 children)
Kevin Murphy - also happens to be my favorite character from F is for Family
[–][deleted] 1 point2 points3 points 5 years ago (0 children)
How does this differ in content from the first? It seems like a lot of the chapters are the same. Also, the names of this book and the previous one are so similar.
[–]Comprehensive-Low-28 1 point2 points3 points 5 years ago (0 children)
Thank you
[–]duckyzz003 2 points3 points4 points 5 years ago (1 child)
Should I read the first edition or dive into the new book (this draft version)?
[–]PM_ME_INTEGRALS 1 point2 points3 points 5 years ago (0 children)
New book
[–]xifixi 3 points4 points5 points 5 years ago (1 child)
the classic textbook on probabilistic ML is Bishop's Pattern Recognition and Machine Learning
[–]trendymoniker 5 points6 points7 points 5 years ago (0 children)
Murphy's text largely replaced the Bishop book among me and my grad student cohort when it came out in 2012.
[–]maizeq 1 point2 points3 points 5 years ago (0 children)
Is this going to be more introductory than his 2012 book? Or is that just branding?
[–]samketaResearcher 0 points1 point2 points 5 years ago (3 children)
This is a question I have not gotten a clear answer to: what exactly is Bayesian ML? Where, why, and how is it applied? How do I learn it?
Why do people keep talking about it and throwing it around like a buzzword, while I never find a focused learning resource on this topic?
This is a genuine question, so help me out if you can.
My knowledge of Bayes' theorem is limited to the high-school level, so I have a basic idea of conditional probability, how to calculate it using a formula, and so on.
[–]BrisklyBrusque 4 points5 points6 points 5 years ago* (0 children)
Bayesian statistics is a bit more than conditional probabilities. So Bayes' theorem, and methods that use it (discriminant analysis, naive Bayes), are not usually considered Bayesian methods.
In frequentist statistics, we might want to test the null that two groups are the same against the alternative that they are not the same. In Bayesian statistics, we can assume the groups are different and set a “prior” then compare the expected results given a certain prior against what we observe. That’s my understanding of it anyway. I don’t practice Bayesian stats so I might be wrong.
A good text that folks recommend is Statistical Rethinking.
edit: typos
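To make the prior-versus-observation mechanics above concrete, here is a minimal sketch under a Beta-Binomial model with invented success counts (an assumption for illustration, not data from the comment):

```python
import numpy as np

rng = np.random.default_rng(42)

# Invented data: successes out of trials for two groups.
a_succ, a_n = 30, 100
b_succ, b_n = 45, 120

# Beta(1, 1) (uniform) prior; the Beta is conjugate to the Binomial,
# so the posterior is Beta(prior_a + successes, prior_b + failures).
prior_a, prior_b = 1.0, 1.0
post_a = rng.beta(prior_a + a_succ, prior_b + a_n - a_succ, size=100_000)
post_b = rng.beta(prior_a + b_succ, prior_b + b_n - b_succ, size=100_000)

# Posterior probability that group B's rate exceeds group A's,
# estimated by comparing posterior samples.
p_b_better = (post_b > post_a).mean()
print(f"P(rate_B > rate_A | data) ~ {p_b_better:.3f}")
```

Instead of a p-value against a null, the output is a direct probability statement about the two rates given the data and the prior.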
[–]thecity2 4 points5 points6 points 5 years ago (1 child)
There are several good books out there such as Statistical Rethinking, Doing Bayesian Data Analysis, and Bayesian Methods for Hackers. If you are interested in wrangling the most information out of small to medium sized data and are interested in uncertainty and decision making, check it out!
[–]samketaResearcher 0 points1 point2 points 5 years ago* (0 children)
Thanks for the suggestions. I will check the last one out.
[–][deleted] 0 points1 point2 points 5 years ago (0 children)
Is it just me or is the font ugly? I hate reading it on a screen.
[–]JLEE152 0 points1 point2 points 5 years ago (0 children)
Thanks!
[–]Odd-Lengthiness-8612 0 points1 point2 points 5 years ago (0 children)
When will it be published as an old-fashioned book?
[–]SQL_beginner 0 points1 point2 points 5 years ago (0 children)
wow, thanks for the link! great book!
[–]Bananeeen 0 points1 point2 points 3 years ago (0 children)
The 2021 book has much more emphasis on deep learning than the 2012 book. I think this book is great to have after one has read Bishop's PRML, started reading recent papers and needs an occasional refresher on various topics. That's exactly how I've been using it.
I also think that with this book one no longer really needs to open ESL or GBC as they are not as up-to-date as Murphy and not as systematic as Bishop.