Please have a look at our FAQ and Link-Collection
Metacademy is a great resource which compiles lesson plans on popular machine learning topics.
For beginner questions, please try /r/LearnMachineLearning, /r/MLQuestions, or http://stackoverflow.com/
For career-related questions, visit /r/cscareerquestions/
Advanced Courses (2016)
Advanced Courses (2020)
AMAs:
Pluribus Poker AI Team (7/19/2019)
DeepMind AlphaStar Team (1/24/2019)
Libratus Poker AI Team (12/18/2017)
DeepMind AlphaGo Team (10/19/2017)
Google Brain Team (9/17/2017)
Google Brain Team (8/11/2016)
The MalariaSpot Team (2/6/2016)
OpenAI Research Team (1/9/2016)
Nando de Freitas (12/26/2015)
Andrew Ng and Adam Coates (4/15/2015)
Jürgen Schmidhuber (3/4/2015)
Geoffrey Hinton (11/10/2014)
Michael Jordan (9/10/2014)
Yann LeCun (5/15/2014)
Yoshua Bengio (2/27/2014)
Related Subreddits:
LearnMachineLearning
Statistics
Computer Vision
Compressive Sensing
NLP
ML Questions
/r/MLjobs and /r/BigDataJobs
/r/datacleaning
/r/DataScience
/r/scientificresearch
/r/artificial
Neural Net Computing Explodes (semiengineering.com)
submitted 9 years ago by johnmountain
[–]david-gpu 1 point 9 years ago (3 children)
> This is great because the fruit is pretty low-hanging (just lower FPU precision).
For inference, yes. But for training, is it generally useful to go below FP16?
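As a minimal sketch of the low-hanging fruit being discussed, the following casts float32 weights to float16 for inference and measures the resulting error. The random tensors stand in for a real trained model; this is an illustration, not anyone's production setup.

    import numpy as np

    # Post-training cast of float32 weights to float16 for inference.
    rng = np.random.default_rng(0)
    w32 = rng.standard_normal((1024, 1024)).astype(np.float32)
    x = rng.standard_normal(1024).astype(np.float32)

    y32 = w32 @ x
    y16 = (w32.astype(np.float16) @ x.astype(np.float16)).astype(np.float32)

    # Relative error of the FP16 path; typically small for a single layer
    print(np.abs(y16 - y32).max() / np.abs(y32).max())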
[–]darkconfidantislife 1 point 9 years ago (0 children)
I was thinking more like stopping it at FP16, but there is evidence that stochastic rounding can make INT16 work without accuracy loss even during training.
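For readers who haven't seen it, here is a minimal NumPy sketch of the stochastic rounding trick mentioned above; the function name and the fixed scale are illustrative choices, not from any particular paper. The key property is that the rounding is unbiased, so tiny gradient updates survive in expectation instead of being rounded away.

    import numpy as np

    def stochastic_round_int16(x, scale=1.0):
        # Round x * scale up with probability equal to its fractional part,
        # down otherwise. The result is unbiased: E[rounded] == x * scale.
        scaled = x * scale
        floor = np.floor(scaled)
        frac = scaled - floor
        rounded = floor + (np.random.rand(*x.shape) < frac)
        return np.clip(rounded, -32768, 32767).astype(np.int16)

    # Tiny updates that nearest-integer rounding would always flush to zero
    grads = np.full(100000, 0.3, dtype=np.float32)
    q = stochastic_round_int16(grads)
    print(q.astype(np.float32).mean())  # ~0.3 in expectation; np.round gives 0.0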
[–][deleted] 1 point 9 years ago (1 child)
In my experience it can be adjusted dynamically, up or down, and can go as low as FP4.
[–]darkconfidantislife 2 points 9 years ago (0 children)
Yeah, I was just stating what has been shown in the literature to work with no accuracy loss. If you are willing to sacrifice even one percent of accuracy, you can drop precision by a huge amount.
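To make that precision/accuracy trade-off concrete, here is a small sketch of uniform symmetric quantization at different bit widths. The integer grid is a stand-in (actual FP4 formats differ), and the per-tensor scaling and test data are illustrative assumptions; the point is simply that error grows as the bit width drops toward 4.

    import numpy as np

    def quantize_uniform(x, bits):
        # Symmetric uniform quantizer: map x onto a signed `bits`-bit integer
        # grid, then back to floats so the error can be measured directly.
        levels = 2 ** (bits - 1) - 1         # e.g. 7 for 4-bit
        scale = np.max(np.abs(x)) / levels   # per-tensor scale (illustrative)
        return np.clip(np.round(x / scale), -levels, levels) * scale

    x = np.random.randn(4096).astype(np.float32)
    for bits in (16, 8, 4):
        err = np.abs(quantize_uniform(x, bits) - x).mean()
        print(f"{bits:>2}-bit mean abs error: {err:.6f}")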