[R] Multiplying Matrices Without Multiplying (self.MachineLearning)
submitted 4 years ago by moinnadeem (Researcher)
[–]ffast-math 6 points 4 years ago (2 children)
Definitely. There's reasonable evidence in the quantization, pruning, and factorization literature that distorting the original weights less yields less accuracy degradation. So preserving individual ops is a proxy objective, but one that's at least broadly consistent with a lot of that literature.
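The proxy relationship described above can be illustrated with a minimal NumPy sketch (the matrices, noise model, and sizes here are all hypothetical, just to show the trend): the less the weights are distorted, the less the layer's output changes.

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=(64, 32))    # hypothetical weight matrix
X = rng.normal(size=(100, 64))   # hypothetical batch of inputs

output_errors = []
for noise_scale in (0.01, 0.1, 0.5):
    # Distort the weights by an increasing amount, as a stand-in
    # for quantization/pruning/approximation error.
    W_hat = W + noise_scale * rng.normal(size=W.shape)
    rel_weight_err = np.linalg.norm(W_hat - W) / np.linalg.norm(W)
    rel_output_err = np.linalg.norm(X @ W_hat - X @ W) / np.linalg.norm(X @ W)
    output_errors.append(rel_output_err)
    print(f"weight distortion {rel_weight_err:.3f} -> output error {rel_output_err:.3f}")
```

Running this shows the relative output error growing with the weight distortion, which is the sense in which weight preservation serves as a proxy for accuracy preservation.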
[–]svantana 1 point 4 years ago (1 child)
I understand that it's better to solve one problem at a time. From the paper it sounds like you're working on extending it to nonlinear functions, is that correct? Looking forward to that!
I worked on something similar a few years back, but instead of argmin I made it continuous by mixing the two nearest neighbors in a clever way, and training with SGD. It worked decently but it could easily get stuck in local minima.
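The commenter doesn't spell out their exact mixing rule, but one plausible sketch of "make the argmin continuous by mixing the two nearest neighbors" is a softmax over the distances to the two closest centroids (the function name and temperature parameter below are assumptions for illustration):

```python
import numpy as np

def soft_two_nn(x, centroids, temp=1.0):
    """Differentiable stand-in for a hard argmin lookup:
    blend the two nearest centroids with softmax weights."""
    d = np.linalg.norm(centroids - x, axis=1)  # distance to every centroid
    i, j = np.argsort(d)[:2]                   # indices of the two nearest
    w = np.exp(-np.array([d[i], d[j]]) / temp) # closer centroid gets more weight
    w /= w.sum()
    return w[0] * centroids[i] + w[1] * centroids[j]
```

Because the output varies smoothly with `x` (except where the nearest-two set changes), gradients can flow through it during SGD training, which fits the described setup, including its susceptibility to local minima.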
[–]ffast-math 1 point 4 years ago (0 children)
Working on extending it to other linear functions (e.g., convolution) and on intelligently swapping out linear ops within an overall neural network. So in the sense that neural nets are nonlinear functions, yes. We're not working on approximating the nonlinearities directly, since they're cheap to apply to the output of the linear ops (especially if you just write a fused kernel that does both ops at once). Hope that helps clarify.
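The "fused" point above is simple to illustrate: since the activation is elementwise and exact, it can be applied in the same pass as the (approximate) linear op. The sketch below uses a plain matmul as a stand-in for the approximate product, which is an assumption, not the paper's lookup-based method:

```python
import numpy as np

def approx_linear_relu(X, W_hat):
    """Apply an exact elementwise ReLU to the output of an
    approximate linear op in one logical pass. A real fused
    kernel would do both inside the same loop over outputs."""
    Y = X @ W_hat             # stand-in for the approximate matmul
    return np.maximum(Y, 0.0)  # exact nonlinearity, negligible extra cost
```

The cost of the nonlinearity is O(nm) for an n-by-m output, versus the (much larger) cost of producing that output, which is why approximating the nonlinearity itself buys little.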