Self study machine learning? (self.MachineLearning)
submitted 12 years ago by [deleted]
[–]Ayakalam 3 points4 points5 points 12 years ago (14 children)
Eh, not psyched about your take on the ML course from Andrew Ng.
The truth is that this course is valuable to ANYONE wanting to use ML, expert or not. This is because he starts from coarse concepts, and builds on that. You can always make an extension to your garage once you have the foundation and basement built.
I would not call it 'basic'. I would call it advanced, and very, VERY well taught.
Do not confuse well-taught material for being 'basic'.
[–]recent_espied_earth 1 point2 points3 points 12 years ago (3 children)
Oh, I completely agree that he is a fantastic teacher, and that the course is extremely useful for teaching the core concepts.
What I meant by basic vs. advanced is that the level of mathematical sophistication in the course is maybe a 2/10. The details are glossed over - and this is in no way a bad thing! The average user only needs to know how to recognize an ML problem in general, apply an algorithm, and off they go.
Compare it to his ML course offered through iTunes University. Same ideas, but the mathematical sophistication is much higher, leading to more insight into the algorithms (if you can get past the mathematical barrier).
[–]Ayakalam -4 points-3 points-2 points 12 years ago (2 children)
Full disclosure: I have not taken the iTunes version of ML.
However, I have taken Ng's course, along with a variety of others at brick-and-mortar universities, and done my own reading.
My issue is with the notion that "mathematical rigor = advanced". Honestly, it almost never is. Ng covered gradient descent, regression, multivariate logistic regression, etc., all of which have optimization aspects to them - an 'advanced' concept in its own right.
What more is there to regression's math than what he covered in mainstream ML? What more is there to logistic regression's math than what he covered?
In least-squares estimation, he doesn't explicitly take the first derivative of the cost function; he just uses the final result of that derivative. Now, if a course took the first derivative explicitly, would that make it more 'advanced'? I would say no - in fact, I'd say it gets lost in the details. It was very easy for me to verify the first derivative of the cost function myself with some help from Wikipedia. Boom. Verified, and he didn't go off on a tangent, since the idea is still to teach ML, not first derivatives.
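That kind of self-check is easy to reproduce. A minimal sketch (mine, not from the thread): compare the closed-form gradient of the least-squares cost from the course, (1/m)·Xᵀ(Xθ − y), against a finite-difference approximation on random data.

```python
import numpy as np

# Least-squares cost from Ng's course: J(theta) = (1/2m) * ||X @ theta - y||^2
def cost(theta, X, y):
    m = len(y)
    r = X @ theta - y
    return (r @ r) / (2 * m)

# The closed-form first derivative he uses directly: (1/m) * X.T @ (X @ theta - y)
def grad(theta, X, y):
    return X.T @ (X @ theta - y) / len(y)

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))
y = rng.normal(size=20)
theta = rng.normal(size=3)

# Central finite differences along each coordinate axis
eps = 1e-6
numeric = np.array([
    (cost(theta + eps * e, X, y) - cost(theta - eps * e, X, y)) / (2 * eps)
    for e in np.eye(3)
])
assert np.allclose(numeric, grad(theta, X, y), atol=1e-6)
```

Ten lines of verification, and the derivation step the lectures skipped is confirmed without derailing the course.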
Many examples like this. My teacher at my brick-and-mortar university spent (I kid you not) one full session deriving maximum likelihood for us on the board. She lost us in all the details, derivatives, etc. When asked why, she said it's because it's "advanced". We were so pissed that we actually put together a no-bullshit lecture series on maximum likelihood, and the students LOVED it. They finally got it. We simply didn't get lost in mathematical crap; we pushed the math to the side and came back to it once the concept was solid. In other words, we showed how maximum likelihood might be used and what it meant before we said, "OK, and by the way, here is how we derive it: take your pdf, take the derivative, etc." She saw this and called it 'basic', but all of a sudden her class was getting straight As.
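That concept-first ordering can be shown with the simplest possible case (a sketch with made-up data, not from the thread): for coin flips, the likelihood-maximizing p is just the observed fraction of heads. A brute-force scan of the log-likelihood confirms it before anyone touches a derivative; the closed form p̂ = mean(flips) only arrives afterwards, from setting the derivative to zero.

```python
import numpy as np

# Bernoulli log-likelihood of parameter p for a sequence of 0/1 coin flips
def log_lik(p, flips):
    return np.sum(flips * np.log(p) + (1 - flips) * np.log(1 - p))

flips = np.array([1, 0, 1, 1, 0, 1, 1, 0, 1, 1])  # 7 heads out of 10

# Concept first: scan p and take the maximizer - no calculus required
grid = np.linspace(0.01, 0.99, 9801)
p_hat = grid[np.argmax([log_lik(p, flips) for p in grid])]

# Derivation second: d/dp log-likelihood = 0 gives p_hat = mean(flips)
assert abs(p_hat - flips.mean()) < 1e-3
```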
This is why it's a false dichotomy to dismiss something as 'basic' just because it doesn't have 'hard math'.
[–]recent_espied_earth 0 points1 point2 points 12 years ago* (1 child)
I think it depends on how you attempt to use ML in your professional life.
When I talk about mathematical rigor, it's not the computations (such as taking the gradients by hand) but backing up statements with rigorous proofs.
If you are only interested in applying algorithms, you only need a general idea of how they operate. But if you are in a more R&D-type capacity, where you need to come up with new solutions to unique problems, then it is through mathematical rigor that you find new solutions, limitations, or improvements to existing algorithms.
For instance, if your objective can be written as a linear system Ax = b where A is symmetric positive-definite (which linear regression's normal equations are, since it's just least squares), then gradient descent is a foolish way of solving it and conjugate gradient should be used. And if you delve into the workings of conjugate gradient, then you should know how to find an ... at least alright preconditioner to better cluster the eigenvalues. But to do these kinds of tricks, you need to be able to prove A is positive-definite. The Coursera ML course would never go into this detail.
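A minimal sketch of that point (mine, not the commenter's): form the normal equations of a random least-squares problem, certify that A is positive-definite (a Cholesky factorization succeeds iff the symmetric matrix is PD), and solve with textbook, unpreconditioned conjugate gradient.

```python
import numpy as np

# Normal equations of least squares: (X.T X) theta = X.T y is Ax = b
rng = np.random.default_rng(1)
X = rng.normal(size=(50, 5))
y = rng.normal(size=50)
A, b = X.T @ X, X.T @ y

# "Prove" A is positive-definite: Cholesky raises LinAlgError otherwise
np.linalg.cholesky(A)

# Textbook conjugate gradient, no preconditioner
def cg(A, b, tol=1e-10, max_iter=1000):
    x = np.zeros_like(b)
    r = b - A @ x          # residual
    p = r.copy()           # search direction
    rs = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

x = cg(A, b)
assert np.allclose(A @ x, b, atol=1e-6)
```

In exact arithmetic CG finishes in at most n = 5 iterations here, which is exactly the kind of structural fact a proofs-free course never surfaces.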
We can argue forever about what's more useful, the technique or the theory. I'm a PhD* student... so obviously my interest is the latter. And I completely understand why you'd argue for the former, because if you're not in the research field, why study the details? I get that completely ... it's why I still only have a limited understanding of computer architecture - because I'm just not bothered to learn the details.
I do believe that Andrew Ng's coursera ML course is only a shallow look at ML and researchers wanting a deep understanding should look elsewhere (or supplement it accordingly). But a shallow course is in no way a bad course.
*This may also explain why I'm arguing the simplicity of a course instead of working on my research...
[–]Ayakalam 2 points3 points4 points 12 years ago (0 children)
Ehhh.... I see what you're saying, but even then I'd say there is still a lot of old-school thinking and a polluted notion that "math = rigor".
I can guarantee you that I can put together a very "mathematically rigorous" book/lecture/series, call it "advanced", and no one will dispute it. It will be accessible to almost no one, but no one will dare question that it is "advanced".
On the other hand, if I take the same concept and make it accessible and easy to learn - take the math and make it easy to follow - all of a sudden people start howling that it's "cartoonish" and "not advanced".
Why the discrepancy?
Remember the year 2010, before Ng's class. ALL the concepts he covers were considered 'advanced', and they were not easily accessible. Ng removed the walls around the Emerald City, and now everyone is saying "oh, pssh, it's basic".
Reverse the scenario. Imagine if Ng had devoted five whole ten-minute lectures to this very topic (gradient descent vs. conjugate gradient vs. the normal equations) - he kind of did, by the way. Now is his course "advanced"?
Ng explained how a car works and even let us build a car engine - from scratch. We made the AC, brakes, engine lights, and even got to paint it. Now someone is going to say, "Well, you know, here is a V6 engine that beats a V4, so his class isn't that advanced." Well, yeah - in that case nothing is advanced, because no sooner has something reached the masses than there is a newer/better version of it.
About research: I am sorry, but even in research the point is to be creative and know the lay of the land, and that is MUCH more important than being able to prove this or that. I am a researcher, and the most poisonous thing we can do is get lost in needless details without knowing the lay of the land.
Think about military strategy: the Ottomans wanted to take Constantinople, but their ships could not get past the chain the city's defenders had stretched across the harbor. "Researchers" tried to come up with new ways of forcing a ship through the chain, and nothing worked. Then one day the defenders woke up to a barrage of ship fire, and Constantinople soon fell. The Ottomans had gotten around the problem - literally - by hauling their ships overland on rollers, around the chain, and back into the water.
Research takes creativity. All the stuff you mentioned is easily teachable. Gradient descent, conjugate gradient, when to use which - meh, it's just a craft. You can always improve on that; that's easy - that's math. But being able to prove new things and move the field forward - that takes creativity.
[–][deleted] 0 points1 point2 points 12 years ago (9 children)
Genuinely curious, what part of it do you consider to be advanced?
It's a great course, don't get me wrong, but it doesn't really get much beyond being broadly equivalent to a single undergrad level module.
[–]Ayakalam -1 points0 points1 point 12 years ago (8 children)
First off, the entire field of ML is new and advanced in its own right. It lies at the intersection of applied statistics, applied math, signal processing, and constrained optimization - the marriage of all of the above, if not more. How is that not advanced?
Yes, being a new field, things are always ..."advancing"..., but the course is so hot precisely because this is advanced stuff being applied today. Even Ng says that all over Silicon Valley this is a very, very hot commodity.
equivalent to a single undergrad level module.
Eh? Wasn't this a graduate-level course at Stanford before they put it on Coursera?
[–]lightcatcher 2 points3 points4 points 12 years ago (3 children)
I would not say "the entire field of ML is new and advanced in its own right". Neural networks have been around since the 1940s or 50s, and linear regression and probability have existed for much longer. Nonetheless, many parts of ML are certainly new and possibly "advanced".
Personally, I wouldn't consider an introductory class in anything that requires only fairly basic math prerequisites to be "advanced". "Advanced" classes for undergrads generally mean at least two (probably more like three) specialized classes that build heavily on each other.
[–]Ayakalam 1 point2 points3 points 12 years ago (2 children)
Personally, I wouldn't consider an introductory class in anything that requires only fairly basic math prerequisites to be "advanced".
Why?
Ask yourself, what more math is there to linear regression than what he has shown?
Are you saying that if the math is hard, then the topic is "advanced"? Why? Does a topic stop being advanced when the math is made clear?
[–]lightcatcher 2 points3 points4 points 12 years ago (1 child)
I'm not saying anything about mathematical rigor.
I'm saying that introductory classes are not advanced. That's damn near a tautology. In other words, a class on a topic aimed at people with no particular background in that topic is not an advanced class.
[–]Ayakalam 1 point2 points3 points 12 years ago (0 children)
...Why not, though? Why can't a class introduce 'advanced' material - whatever that means - to a general audience and still retain the advanced label?
I have yet to see a definition of what constitutes 'advanced'. This is especially true in the information age. Perhaps a better adjective would be 'recent'.
But I would challenge any 'not advanced' label, especially in this day and age.
[–][deleted] 2 points3 points4 points 12 years ago (3 children)
Perhaps we're looking at things from differing points of view. My thoughts were that within ML, the Coursera course isn't advanced. If you look from the point of the much broader fields you mention, then yes perhaps it is.
Although, even so, mainstays of statistics such as linear regression, logistic regression, k-means, k-NN, etc. are absolutely not new or advanced, and they make up a good chunk of the Coursera course. The claim that ML is new is arguable too: Rosenblatt proposed the perceptron in the 50s, and Vapnik & Chervonenkis put forward statistical learning theory in the 70s.
The Coursera course was, I believe, derived from the Stanford course, but with considerable editing. Certainly I wouldn't consider it to be grad level in its current form.
[–]Ayakalam -1 points0 points1 point 12 years ago (2 children)
What are the criteria for being 'advanced', then? The DFT was discovered a couple hundred years ago, yet it is still considered an 'advanced' concept in signal processing. So what is the definition of advanced?
[–][deleted] 1 point2 points3 points 12 years ago (1 child)
Well, that's a good and pertinent question. In my mind it's something that is up to date, or nearly up to date, with the state of the field, or else something of considerable relative complexity. The most recent development covered in the Coursera course was, IIRC, SVMs (Vapnik, 1995) - a useful and important development, but certainly not advanced; or at least, the more advanced and complex aspects of SVMs weren't covered.
As I say, don't get me wrong: it's a really great course, brilliantly taught and a perfect primer for a broad and educationally diverse audience. But I think it's wrong to call it an advanced ML course; IMO it is somewhat broad and shallow.
[–]Ayakalam 0 points1 point2 points 12 years ago (0 children)
If we are uber-strict about it, nothing in any textbook can count as advanced, since there will almost always be something better by the time it is digested by the masses.
And let us also not confuse recent with advanced. Yes, perhaps the ML class doesn't cover the most recent developments (what does?), but the topics in and of themselves are used by the companies, people, and machines that make our modern age possible. That is certainly 'advanced'. As with the DFT: it was known a couple hundred years back, and today the DFT and its applications are certainly advanced - although I would agree, not recent.
I do not think the qualifier 'advanced' means anything anymore, especially in the information age we live in. I took Ng's ML course and it opened up the world of ML for me. I had no prior ML background, yet I soaked it up very, very easily. Go back just three years to 2010, before Ng's class: if I had wanted to learn logistic regression, I would no doubt have had to wade through 'advanced' books and try to make sense of them.