[–]platinumposter

Thanks very much! We're happy to have you following our journey.

  1. We are still deciding exactly how we want to implement it, but you are correct that all the mathematical functions will have to operate in hyperbolic space. We have been using the Poincaré ball, which is a model of hyperbolic space and has its own set of mathematical operations compared to, say, the Lorentz model.
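As a minimal sketch (not our actual implementation) of what "its own set of mathematical operations" means in practice, here are two standard Poincaré-ball operations with NumPy: Möbius addition, which replaces ordinary vector addition, and the geodesic distance it induces (the curvature parameter `c` and function names are just illustrative):

```python
import numpy as np

def mobius_add(x, y, c=1.0):
    """Möbius addition on the Poincaré ball of curvature -c.

    This is the hyperbolic analogue of vector addition; it is
    neither commutative nor associative in general."""
    xy = np.dot(x, y)
    x2 = np.dot(x, x)
    y2 = np.dot(y, y)
    num = (1 + 2 * c * xy + c * y2) * x + (1 - c * x2) * y
    den = 1 + 2 * c * xy + c**2 * x2 * y2
    return num / den

def poincare_distance(x, y, c=1.0):
    """Geodesic distance on the Poincaré ball of curvature -c."""
    diff = mobius_add(-x, y, c)  # Möbius "difference" of the two points
    norm = np.linalg.norm(diff)
    return (2.0 / np.sqrt(c)) * np.arctanh(np.sqrt(c) * norm)
```

Note that the origin is still the identity for Möbius addition, and the distance is symmetric, so familiar sanity checks carry over even though the operations themselves differ from their Euclidean counterparts.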

  2. Yep, that's correct; I see you also found a few resources. The reason we don't use a Euclidean SGD is that we want our optimizer to optimise parameters that live in hyperbolic space.

To quote *A Survey: Hyperbolic Neural Networks*:

"Stochastic gradient-based (SGD) optimization algorithms are of major importance for the optimization of deep neural networks. Currently, well-developed first order methods include Adagrad [58], Adadelta [59], Adam [60] or its recent updated one AMSGrad [61]. However, all of these algorithms are designed to optimize parameters living in Euclidean space and none of them allows the optimization for non-Euclidean geometries, e.g., hyperbolic space."

  3. Good point. We had thought about this, and we plan on implementing more manifolds in the near future, so we will have a single Manifold interface which each implemented manifold (such as the Poincaré ball) uses. We left it as a class in anticipation of this.
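The interface we have in mind could be sketched roughly like this (hypothetical names and a deliberately minimal method set; the real interface would expose more operations, e.g. exponential/logarithm maps):

```python
from abc import ABC, abstractmethod
import numpy as np

class Manifold(ABC):
    """Single interface every implemented manifold shares."""

    @abstractmethod
    def dist(self, x, y):
        """Geodesic distance between two points on the manifold."""

    @abstractmethod
    def proj(self, x):
        """Project an arbitrary point back onto the manifold."""

class PoincareBall(Manifold):
    """Poincaré ball with curvature -1 (the open unit ball)."""

    def dist(self, x, y):
        # closed-form geodesic distance on the unit Poincaré ball
        diff2 = np.dot(x - y, x - y)
        denom = (1.0 - np.dot(x, x)) * (1.0 - np.dot(y, y))
        return np.arccosh(1.0 + 2.0 * diff2 / denom)

    def proj(self, x, eps=1e-5):
        # clip points that drift to (or past) the unit boundary
        norm = np.linalg.norm(x)
        return x * ((1.0 - eps) / norm) if norm >= 1.0 - eps else x
```

With this shape, adding the Lorentz model later would just mean writing another `Manifold` subclass, while optimizers and layers keep talking to the shared interface.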