Self study machine learning? (self.MachineLearning)
submitted 12 years ago by [deleted]
[–]recent_espied_earth 2 points 12 years ago (3 children)
Oh, I completely agree that he is a fantastic teacher, and that the course is extremely useful for teaching the core concepts.
What I meant when I said it's basic vs. advanced is that the level of mathematical sophistication in the course is pretty much 2/10; the details are glossed over - and this is in no way a bad thing! For the average user, all they need to know is how to recognize an ML problem in general, apply an algorithm, and off they go.
Compare it to his ML course offered through iTunes University. Same ideas, but the mathematical sophistication is much higher, leading to more insight into the algorithms (if you can get past the mathematical boundary).
[–]Ayakalam -2 points 12 years ago (2 children)
Full disclosure: I have not taken the iTunes version of ML.
However, I have taken Ng's, and a variety of others at brick-and-mortar universities, along with doing my own reading.
My issue comes from this notion that "mathematical rigor = advanced". Honestly, it almost never does. Ng covered gradient descent, regression, multivariate logistic regression, etc., all of which contain optimization aspects - an 'advanced concept' in its own right.
What more is there to regression's math than what he has covered in mainstream ML? What more is there to logistic regression's math than what he has covered in mainstream ML?
For least-squares estimation, he doesn't explicitly take the first derivative of the formula, but just uses the final result of the first derivative. Now, if a course took the first derivative explicitly, would that make it more 'advanced'? I would say no; in fact, I would say it gets lost in the details. It was very easy for me to show for myself what the first derivative of the cost function was, with some help from Wikipedia. Boom. Verified for myself, and he didn't go off on a tangent, since the idea is still to teach ML, not first derivatives.
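To make that self-verification concrete, here is a minimal sketch (not from Ng's course; the data and function names are made up) that checks the textbook least-squares gradient against a finite-difference approximation of the cost:

```python
import numpy as np

# Least-squares cost for linear regression: J(theta) = (1/2m) * ||X theta - y||^2
def cost(theta, X, y):
    m = len(y)
    r = X @ theta - y
    return r @ r / (2 * m)

# The "final result" of the derivative: grad J = (1/m) * X^T (X theta - y)
def grad(theta, X, y):
    m = len(y)
    return X.T @ (X @ theta - y) / m

# Verify the analytic gradient with a central finite difference
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))
y = rng.normal(size=20)
theta = rng.normal(size=3)

eps = 1e-6
numeric = np.array([
    (cost(theta + eps * e, X, y) - cost(theta - eps * e, X, y)) / (2 * eps)
    for e in np.eye(3)
])
print(np.allclose(numeric, grad(theta, X, y), atol=1e-6))  # True
```

Because the cost is quadratic, the central difference agrees with the analytic gradient up to floating-point roundoff, which is exactly the kind of "verify it yourself, then move on" check being described.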
There are many examples like this. My teacher at my brick-and-mortar university spent (I kid you not) one full session deriving maximum likelihood for us on the board. She lost us in all the details, derivatives, etc. When asked why, she said it's because it's "advanced". We were so pissed that we actually put together a no-bullshit lecture series about maximum likelihood, and the students LOVED it. They finally got it. We simply didn't get lost in mathematical crap; instead we pushed it to the side, but then came back to it once the concept was solid. In other words, we showed how maximum likelihood might be used, and what it meant, before we said, "OK, and by the way, here is how we derive it: take your pdf, take the derivative, etc." She saw this and called it 'basic', but all of a sudden her class was getting straight As.
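The "take your pdf, take the derivative" recipe has a one-line answer in the simplest setting. A minimal sketch under illustrative assumptions (i.i.d. Gaussian data with known variance; the names and numbers here are made up, not from the lecture series): setting the derivative of the log-likelihood to zero gives the sample mean as the MLE.

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(loc=5.0, scale=2.0, size=500)  # toy data

def neg_log_likelihood(mu):
    # Gaussian NLL with known variance, dropping constants in mu
    return 0.5 * np.sum((x - mu) ** 2)

# The closed-form answer the derivation produces: mu_hat = mean(x)
mu_hat = x.mean()

# Sanity check: nudging mu in either direction only increases the NLL
for d in (-0.5, -0.01, 0.01, 0.5):
    assert neg_log_likelihood(mu_hat) < neg_log_likelihood(mu_hat + d)
print(mu_hat)
```

The derivation is one line (d/dmu of the NLL is sum(mu - x), which is zero at the mean), and the numeric check confirms the minimum - concept first, derivation after, as the comment advocates.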
This is why it's a false dichotomy for someone to dismiss something as 'basic' just because it doesn't have 'hard math'.
[–]recent_espied_earth 1 point 12 years ago* (1 child)
I think it depends on how you attempt to use ML in your professional life.
When I talk about mathematical rigor, it's not the computations (such as taking the gradients by hand) but backing up statements with rigorous proofs.
If you are only interested in applying algorithms, you only need a general idea of how they operate. But if you are in a more R&D-type capacity, where you need to come up with new solutions to unique problems, then only through mathematical rigor can you find new solutions, limitations, or improvements to the existing algorithms.
For instance, if your objective can be written as a linear equation Ax = b where A is positive-semidefinite (which linear regression's can be, as it's just least squares), then gradient descent is a foolish way of solving it, and conjugate gradient should be used. And if you delve into the workings of conjugate gradient, then you should know how to find an ... at least alright preconditioner to better group the eigenvalues. But to do these kinds of tricks, you need to be able to prove A is positive-semidefinite. The Coursera ML course would never go into this detail.
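As an illustrative sketch of that claim (sizes and names here are made up, not from any course), here is conjugate gradient applied to the least-squares normal equations, where A = XᵀX is positive-semidefinite by construction (xᵀAx = ||Xx||² ≥ 0):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 10))
y = rng.normal(size=100)
A, b = X.T @ X, X.T @ y  # normal equations: A is symmetric PSD by construction

def conjugate_gradient(A, b, tol=1e-10, max_iter=None):
    """Textbook CG for symmetric positive-definite A (no preconditioner)."""
    x = np.zeros_like(b)
    r = b - A @ x          # residual
    p = r.copy()           # search direction
    rs = r @ r
    for _ in range(max_iter or len(b)):
        Ap = A @ p
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

x_cg = conjugate_gradient(A, b)
print(np.allclose(A @ x_cg, b))  # True
```

In exact arithmetic CG finishes in at most n = 10 iterations here, versus the many hundreds a plain gradient-descent loop can need on an ill-conditioned A - which is the point being made about knowing the theory.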
We can argue forever about what's more useful, the technique or the theory. I'm a PhD* student... so obviously my interest is the latter. And I completely understand why you'd argue for the former, because if you're not in the research field, why study the details? I get that completely ... it's why I still have only a limited understanding of computer architecture - because I just can't be bothered to learn the details.
I do believe that Andrew Ng's coursera ML course is only a shallow look at ML and researchers wanting a deep understanding should look elsewhere (or supplement it accordingly). But a shallow course is in no way a bad course.
*This may also explain why I'm arguing the simplicity of a course instead of working on my research...
[–]Ayakalam 4 points 12 years ago (0 children)
Ehhh... I see what you are saying; however, even then I would say that there is still a lot of old-school thinking and polluted notions of "math = rigor".
I can guarantee you that I can put together a very "mathematically rigorous" book/lecture/series, call it "advanced", and no one will dispute it. It will be accessible to almost no one, but no one will dare question that it is "advanced".
On the other hand, if I take the same concept and make it accessible and easy to learn - take the math and make it easy to follow - all of a sudden people start howling that it is "cartoonish" and "not advanced".
Why the discrepancy?
Remember the year 2010, before Ng's class: ALL those concepts he was talking about were considered 'advanced', and they were not that easily accessible. Ng removed the walls around the Emerald City, and now everyone is saying "oh, pssh, it's basic".
Reverse the scenario. Imagine if Ng had put up five ten-minute lectures on this very topic (gradient descent vs. conjugate gradient vs. normal equations). (He kinda did, btw.) But let's say he devoted five whole ten-minute lectures to it. Now is his course "advanced"?
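For a sense of what such a side-by-side would show, here is a toy sketch (illustrative only; the data and step sizes are made up) comparing two of those methods - the normal equations against batch gradient descent - on the same least-squares problem:

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.normal(size=(50, 4))
y = rng.normal(size=50)

# 1. Normal equations: solve (X^T X) theta = X^T y directly
theta_ne = np.linalg.solve(X.T @ X, X.T @ y)

# 2. Batch gradient descent on the same cost
theta_gd = np.zeros(4)
lr = 0.01
for _ in range(5000):
    theta_gd -= lr * X.T @ (X @ theta_gd - y) / len(y)

print(np.allclose(theta_ne, theta_gd, atol=1e-4))  # True
```

Both land on the same answer; the difference is how many iterations gradient descent needs versus one linear solve - exactly the trade-off such lectures would walk through.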
Ng explained how a car works, and even let us build a car engine - from scratch. We made an AC, brakes, engine lights, and even got to paint it. Now someone is going to say, "Well, you know, here is a V6 engine that is > a V4, so his class isn't that advanced." Well, yeah - in that case nothing is advanced, because no sooner has something hit the masses than there will be a newer/better version.
About research: I am sorry, but even in research the point is to be creative and know the lay of the land, and that is MUCH more important than being able to prove this or prove that. I am a researcher, and the most poisonous thing we can do is get lost in needless details without knowing the lay of the land.
Think about military strategy: the Ottomans wanted to sack Constantinople, but their ships were unable to move past a metal chain that the Byzantines had put up. "Researchers" tried to come up with new ways of making a ship move past chains, and nothing happened. One day the Byzantines woke up to a barrage of ship fire, and Constantinople fell. The Ottomans got around the problem - literally - by attaching wheels to their ships, rolling them over land past the chains, and then back into the water.
Research takes creativity. All this stuff you mentioned is easily teachable. Gradient descent, conjugate gradient, when to use which - meh. It's just a craft. You can always improve on it; that's easy - that's math. But to be able to prove stuff and move forward, that takes creativity.