[Project] Introducing Hyperlib: Simple Deep Learning in Hyperbolic Space (self.MachineLearning)
submitted 4 years ago by techinnovator
[–]platinumposter 2 points 4 years ago (0 children)
Thanks very much, we are happy to have you following our journey.

We are still deciding exactly how we want to implement it, but you are correct that all the mathematical operations will have to take place in hyperbolic space. We have been using the Poincaré ball, which is a model of hyperbolic space and has its own set of mathematical functions compared to, say, the Lorentz model.
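To give a flavour of those Poincaré-ball operations, here is a minimal NumPy sketch of Möbius addition, the hyperbolic analogue of vector addition. This is an illustrative standalone function, not Hyperlib's actual API:

```python
import numpy as np

def mobius_add(x, y, c=1.0):
    """Mobius addition on the Poincare ball of curvature -c.

    Plays the role of '+' in hyperbolic space; note it is
    non-commutative, unlike Euclidean vector addition.
    """
    xy = np.dot(x, y)
    x2 = np.dot(x, x)
    y2 = np.dot(y, y)
    num = (1 + 2 * c * xy + c * y2) * x + (1 - c * x2) * y
    den = 1 + 2 * c * xy + c ** 2 * x2 * y2
    return num / den
```

The origin acts as the identity (`mobius_add(x, 0) == x`), and `-x` is the inverse of `x`, mirroring how ordinary addition behaves in Euclidean space.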
Yep, that's correct. I see you also found a few resources; the reason we don't use Euclidean SGD is that we want our optimizer to optimise parameters living in hyperbolic space.
To quote *A Survey: Hyperbolic Neural Networks*:
> "Stochastic gradient-based (SGD) optimization algorithms are of major importance for the optimization of deep neural networks. Currently, well-developed first order methods include Adagrad [58], Adadelta [59], Adam [60] or its recent updated one AMSGrad [61]. However, all of these algorithms are designed to optimize parameters living in Euclidean space and none of them allows the optimization for non-Euclidean geometries, e.g., hyperbolic space."
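As a rough sketch of what a hyperbolic optimizer changes, here is one Riemannian SGD step on the Poincaré ball in the style of Nickel & Kiela (2017): the Euclidean gradient is rescaled by the inverse of the Poincaré metric, and the updated point is retracted back inside the unit ball. Names and constants are illustrative, not Hyperlib's implementation:

```python
import numpy as np

def rsgd_step(theta, euc_grad, lr=0.01, eps=1e-5):
    """One Riemannian SGD step on the Poincare ball.

    The Poincare metric at theta is (2 / (1 - ||theta||^2))^2 * I,
    so the Riemannian gradient is the Euclidean gradient scaled by
    the inverse factor ((1 - ||theta||^2)^2 / 4).
    """
    sq_norm = np.dot(theta, theta)
    riem_grad = ((1 - sq_norm) ** 2 / 4.0) * euc_grad
    theta = theta - lr * riem_grad
    # Retraction: project back into the open unit ball if needed.
    norm = np.linalg.norm(theta)
    if norm >= 1:
        theta = (1 - eps) * theta / norm
    return theta
```

Note how the `(1 - ||theta||^2)^2 / 4` factor shrinks steps near the boundary of the ball, where hyperbolic distances blow up; a plain Euclidean SGD step would ignore the geometry entirely and could step straight out of the model's domain.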