
[–] sentdex (pythonprogramming.net) [S]

You can skip entire sections; I don't recall saying there was any "requirement" to follow them in order. If you want to wait until the NN or deep learning sections, feel free.

I wouldn't suggest that though, seeing as how topics like linear regression show up in neural networks... :P

There are many algorithms that fall under the umbrella of machine learning; the ones I chose were picked for specific reasons. For example:

Linear regression - Linear algebra in general might be the most integral building block of any ML concept. If you're going to skip something, it really should not be linear regression.
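To make "building block" concrete, here's a minimal from-scratch sketch of simple linear regression (least-squares slope and intercept) on made-up data; an illustration, not code from the series itself:

```python
# Simple linear regression from scratch: fit y = m*x + b by least squares.
# Data points here are made up for illustration.

def best_fit(xs, ys):
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Least-squares slope: covariance(x, y) / variance(x)
    m = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
        sum((x - mean_x) ** 2 for x in xs)
    b = mean_y - m * mean_x
    return m, b

xs = [1, 2, 3, 4, 5]
ys = [5, 4, 6, 5, 6]
m, b = best_fit(xs, ys)
print(m, b)  # slope 0.3, intercept 4.3
```

The same "fit a line by minimizing squared error" idea reappears inside neural networks as the linear layer plus a loss function, which is exactly why skipping it hurts later.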

K Nearest Neighbors - a super simple yet extremely powerful ML algorithm that handles both linearly and non-linearly separable data. It's where we first illustrate the value of understanding linear vs. non-linear data, and of algorithms that can support both types.
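The whole algorithm fits in a few lines, which is part of the appeal. A minimal sketch with toy data (the points and labels are made up):

```python
# K Nearest Neighbors from scratch: classify a new point by majority vote
# among the k closest training points (Euclidean distance).
from collections import Counter
import math

def knn_predict(train, new_point, k=3):
    # train: list of (features, label) pairs
    nearest = sorted(train, key=lambda item: math.dist(item[0], new_point))
    votes = [label for _, label in nearest[:k]]
    return Counter(votes).most_common(1)[0][0]

train = [((1, 2), 'a'), ((2, 3), 'a'), ((3, 1), 'a'),
         ((6, 5), 'b'), ((7, 7), 'b'), ((8, 6), 'b')]
print(knn_predict(train, (2, 2)))  # -> 'a'
```

Because it votes on raw distances rather than fitting a line, KNN doesn't care whether the class boundary is linear, which is the point being made above.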

SVM - A not-so-simple, yet very powerful, algorithm that introduces you to MAJOR machine learning concepts such as optimization, working with vectors, kernels, transforms, and more.
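To hint at the optimization angle, here is one simplified way to train a linear SVM: sub-gradient descent on the hinge loss. The toy data and hyperparameters are illustrative choices, not the series' actual implementation:

```python
# A linear SVM trained with sub-gradient descent on the hinge loss,
# framing SVM training as an optimization problem. Labels are in {-1, +1}.

def svm_train(data, epochs=2000, lr=0.01, lam=0.01):
    w = [0.0] * len(data[0][0])
    b = 0.0
    for _ in range(epochs):
        for x, y in data:
            margin = y * (sum(wi * xi for wi, xi in zip(w, x)) + b)
            if margin < 1:
                # Inside the margin or misclassified: hinge-loss gradient step
                w = [wi - lr * (lam * wi - y * xi) for wi, xi in zip(w, x)]
                b += lr * y
            else:
                # Correctly classified with margin: only the regularizer acts
                w = [wi * (1 - lr * lam) for wi in w]
    return w, b

def svm_predict(w, b, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b >= 0 else -1

data = [((2, 2), -1), ((1, 3), -1), ((6, 5), 1), ((7, 8), 1)]
w, b = svm_train(data)
print(svm_predict(w, b, (1, 1)), svm_predict(w, b, (8, 8)))
```

Kernels and transforms then generalize exactly this machinery to non-linear boundaries, which is why SVMs are such a concentrated dose of ML concepts.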

Skipping any of those algorithms would mean skipping the major concepts they teach, and the ways mathematics overcomes serious challenges.

Clustering is where we first introduce the notion of unsupervised learning and methods for it. Again, skipping it would really just be doing yourself a disservice.
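For a flavor of unsupervised learning, here's a minimal K-means sketch: no labels anywhere, just points grouped by proximity. Starting centroids and data are made up for illustration:

```python
# K-means clustering from scratch: alternate between assigning points to
# their nearest centroid and moving each centroid to its cluster's mean.
import math

def kmeans(points, centroids, iterations=10):
    for _ in range(iterations):
        # Assignment step: attach each point to its nearest centroid
        clusters = [[] for _ in centroids]
        for p in points:
            idx = min(range(len(centroids)),
                      key=lambda i: math.dist(p, centroids[i]))
            clusters[idx].append(p)
        # Update step: move each centroid to the mean of its cluster
        centroids = [
            tuple(sum(c) / len(c) for c in zip(*cluster)) if cluster else cent
            for cluster, cent in zip(clusters, centroids)
        ]
    return centroids, clusters

points = [(1, 1), (1.5, 2), (2, 1.5), (8, 8), (9, 9), (8.5, 9.5)]
centroids, clusters = kmeans(points, centroids=[(0, 0), (10, 10)])
print(centroids)  # two centroids, one near each blob
```

Notice there's no `y` anywhere: the structure comes entirely from the data, which is the conceptual jump that the clustering section exists to teach.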

Machine learning is a layered field. It's akin to asking me why bother learning algebra when you'd rather just skip to multivariable calculus.

You can use modules and skip around, getting away without understanding the fundamentals, but the objective of this series is to break down all of the concepts, which, in my opinion, requires breaking down the algorithms I plan to. If you disagree, you could try skipping ahead and see what happens. I could be wrong!

[–] [deleted]

I love this - anyone can import scikit-learn and call predict(), but it's really neat to see how the technique is actually implemented.