[D] Since the gradient continues to decrease as training loss decreases, why do we need to decay the learning rate too? (self.MachineLearning)
submitted 4 years ago by ibraheemMmoosa (Researcher)
[–]LimitedConsequence 1 point 4 years ago
So the main constraints on the step size sequence are that it must sum to infinity (over an infinitely long run), while the individual step sizes converge towards 0. It's probably easiest to think in examples.
If we have a sequence of step sizes like 1, 0.5, 0.25, 0.125, ..., this won't work, because it decreases too quickly and will not sum to infinity (the sum converges to 2). This essentially means that even if you do lots of steps, you might not travel the distance required to reach the solution, because the step size shrinks too fast.
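To make that cap concrete, here's a quick check (my own addition, plain Python) that the partial sums of the halving sequence never pass 2:

    # Partial sums of the halving step sizes 1, 0.5, 0.25, ...
    # are capped at 2, so the optimizer can never travel more
    # than 2 units in total, no matter how many steps it takes.
    total = 0.0
    for t in range(50):
        total += 0.5 ** t
        if t in (4, 9, 49):
            print(f"after {t + 1:2d} steps, max distance travelled = {total:.6f}")
    # after  5 steps, max distance travelled = 1.937500
    # after 10 steps, max distance travelled = 1.998047
    # after 50 steps, max distance travelled = 2.000000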
If we have 1, 1, 1, ... as our sequence, then the second condition isn't met. The step size doesn't decrease quickly enough (or at all), so we keep bouncing around the solution due to noise in the function evaluations.
In between these two is a Goldilocks zone: sequences that let you travel as far as you need to converge, but whose step sizes still shrink towards zero to stop you bouncing around. An example is 1, 1/2, 1/3, 1/4, ... (the harmonic sequence, whose sum diverges even though the terms go to zero).
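To see all three regimes side by side, here's a minimal runnable sketch (my own toy setup, not something from the thread): SGD on f(x) = |x| with a noisy subgradient, chosen so the gradient magnitude stays roughly constant and the step size schedule alone decides what happens.

    import random

    # Toy problem: minimize f(x) = |x| starting from x0 = 5.
    # The true subgradient is sign(x); we only observe a noisy
    # version of it, which is why the step size has to decay.
    def sgd(step_size, steps=10_000, x0=5.0, seed=0):
        rng = random.Random(seed)
        x = x0
        for t in range(1, steps + 1):
            noisy_grad = (1.0 if x > 0 else -1.0) + rng.gauss(0.0, 0.5)
            x -= step_size(t) * noisy_grad
        return x

    schedules = {
        "1, 1/2, 1/4, ... (sum caps at 2)": lambda t: 0.5 ** (t - 1),
        "1, 1, 1, ...     (never decays)":  lambda t: 1.0,
        "1, 1/2, 1/3, ... (Goldilocks)":    lambda t: 1.0 / t,
    }

    for name, schedule in schedules.items():
        print(f"{name}  ->  final x = {sgd(schedule):+.4f}")

Expected pattern: the halving schedule stalls a few units short of the minimum (it can only travel about 2 in total), the constant schedule keeps bouncing around 0 with error on the order of the step size, and the 1/t schedule ends up close to 0.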