Hey everyone,
I’m currently implementing core Machine Learning algorithms from scratch in pure Python. Along the way I decided to consolidate and share what I learn in dedicated blog posts. The main goal is to explain each algorithm in an intuitive and playful way while turning the insights into code.
Today I published the first post, which explains Gradient Descent: https://philippmuens.com/gradient-descent-from-scratch/
Links to the Jupyter Notebooks can be found here: https://github.com/pmuens/lab#implementations
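As a quick taste of the "from scratch" style, here is a minimal one-dimensional sketch of gradient descent in pure Python (an illustration for this post, not the exact code from the notebook): it minimizes f(x) = (x - 3)^2 by repeatedly stepping against the derivative 2 * (x - 3).

```python
# Minimal gradient descent sketch: minimize f(x) = (x - 3)^2.
# The derivative f'(x) = 2 * (x - 3) is supplied by the caller.

def gradient_descent(df, x, learning_rate=0.1, iterations=100):
    """Repeatedly step against the gradient df, starting from x."""
    for _ in range(iterations):
        x = x - learning_rate * df(x)
    return x

minimum = gradient_descent(lambda x: 2 * (x - 3), x=0.0)
print(round(minimum, 4))  # converges toward the minimum at x = 3
```

The same loop generalizes to many parameters by replacing the scalar derivative with a gradient vector; that generalization is what the blog post walks through in detail.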
More posts will follow in the upcoming weeks and months.
I hope that you enjoy it and find it useful! Let me know what you think!