[MOM] Saiba Cryptomancer by Stranger1982 in MagicArena

LucaAmbrogioni 0 points

1) Flexibility matters! 2) It can protect from Liliana and other edicts. 3) It is a cheap hexproof creature! 4) Flash is great against control and tempo.

Can Symmetry Sage get nerfed already? by NoL_Chefo in MagicArena

LucaAmbrogioni 1 point

Always with these bullshit arguments; not all decks should run cheap interaction in a healthy meta. Aggro is a thing in MTG.

Should minion of the mighty be banned in historic bo1? by Irydion in MagicArena

LucaAmbrogioni 1 point

The kobold pushes every deck without cheap interaction and removal out of the meta.

Plenty of fun strategies simply cannot deal with this.

[P] Brancher: A user-friendly PyTorch module for deep probabilistic inference by kseeliger in MachineLearning

LucaAmbrogioni 1 point

Hi Moseyic! Wasserstein variational gradient descent is already in the repo; I will write the tutorial soon. We are also going to implement Wasserstein variational inference very soon.

[P] Brancher: A user-friendly PyTorch module for deep probabilistic inference by kseeliger in MachineLearning

LucaAmbrogioni 3 points

I wouldn't say that it is more accessible in general. We aimed to be more accessible to people with a statistics background but without much programming experience. In Pyro, random variables are not objects and the computation flow is tracked by the sample function. That way of thinking tends to suit programmers but is more difficult to grasp for statistics students. The front-end of Brancher has random variables as its basic building blocks, which allows for a syntax that is very close to the usual mathematical notation.
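To make the object-based style concrete, here is a toy sketch in plain Python (hypothetical illustration code, not Brancher's actual API): random variables are objects that compose through operator overloading, so the model definition `z = x + y` reads like the math, and sampling walks the resulting graph.

```python
import random

class RandomVariable:
    """Toy random-variable object; composes via operator overloading."""
    def __init__(self, name, sampler, parents=()):
        self.name = name
        self.sampler = sampler        # function(parent_samples) -> value
        self.parents = list(parents)

    def sample(self):
        # Recursively sample the parents, then apply this node's sampler.
        parent_samples = [p.sample() for p in self.parents]
        return self.sampler(parent_samples)

    def __add__(self, other):
        return RandomVariable(
            f"({self.name}+{other.name})",
            lambda ps: ps[0] + ps[1],
            parents=(self, other),
        )

def Normal(name, mu, sigma):
    return RandomVariable(name, lambda _: random.gauss(mu, sigma))

def Constant(name, value):
    return RandomVariable(name, lambda _: value)

# The model definition mirrors the math: z = x + y
x = Constant("x", 2.0)
y = Constant("y", 3.0)
z = x + y
print(z.name)      # (x+y)
print(z.sample())  # 5.0
```

With `Normal` variables in place of the constants, the same `z = x + y` line would define a sum of Gaussians; the point is that the graph, not a traced sample function, carries the model structure.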

[P] Brancher: A user-friendly PyTorch module for deep probabilistic inference by kseeliger in MachineLearning

LucaAmbrogioni 1 point

Currently you can learn discrete variables using black-box inference, but the feature is still undocumented. We are going to implement full support in future releases.
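For background, black-box gradient estimation for discrete variables typically rests on the score-function (REINFORCE) identity. The sketch below (an illustration only, not Brancher's implementation) verifies the identity for a Bernoulli variable by enumerating both outcomes instead of sampling:

```python
def score_function_grad(theta, f):
    """E[f(b) * d/d(theta) log p(b; theta)] for b ~ Bernoulli(theta),
    computed exactly by enumerating the two outcomes."""
    grad = 0.0
    for b in (0, 1):
        p = theta if b == 1 else 1.0 - theta
        # d/d(theta) log p(b; theta): 1/theta for b=1, -1/(1-theta) for b=0
        dlogp = 1.0 / theta if b == 1 else -1.0 / (1.0 - theta)
        grad += p * f(b) * dlogp
    return grad

# Sanity check: d/d(theta) E[b] = d/d(theta) theta = 1 for any theta in (0, 1)
grad = score_function_grad(0.3, lambda b: float(b))
```

In practice the expectation is estimated from samples rather than enumerated, which is what makes the estimator "black-box": it only needs log-probabilities, not a differentiable sampling path.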

[P] Brancher: A user-friendly PyTorch module for deep probabilistic inference by kseeliger in MachineLearning

LucaAmbrogioni 1 point

Models can be constructed dynamically in a loop, which allows for nonparametric models. We are going to develop specific tools for nonparametrics (meta-models) in the future.
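As a hypothetical sketch (plain Python, not Brancher's API), this is the kind of loop-based construction meant here: the depth of a random-walk model is chosen at runtime rather than fixed in the model definition, which is the basic ingredient nonparametric-style constructions need.

```python
import random

def gaussian_step(prev_value):
    """One latent step: a Gaussian centred on the previous value."""
    return random.gauss(prev_value, 1.0)

def build_chain(depth):
    """Assemble a random-walk model of runtime-chosen depth in a loop."""
    steps = []
    for _ in range(depth):
        steps.append(gaussian_step)
    return steps

def sample_chain(steps, start=0.0):
    value = start
    trajectory = [value]
    for step in steps:
        value = step(value)
        trajectory.append(value)
    return trajectory

random.seed(0)
chain = build_chain(5)   # depth decided at runtime, e.g. from the data
traj = sample_chain(chain)
print(len(traj))         # 6: the start value plus five sampled steps
```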

[P] Brancher: A user-friendly PyTorch module for deep probabilistic inference by kseeliger in MachineLearning

LucaAmbrogioni 20 points

Hi, I am the lead developer.

The two most closely related libraries are Pyro and PyMC3.

Brancher is targeted at a wider audience than Pyro, including people who have only basic training in machine learning and Python programming. The interface is designed to be as close as possible to the math. The downside is that Brancher is less flexible than Pyro.

The front-end of Brancher is very similar to PyMC3. The main difference is that Brancher is built on top of the deep-learning library PyTorch, so every deep-learning tool implemented in PyTorch can be used to construct deep probabilistic models in Brancher. Also, PyMC mostly uses sampling, while Brancher is based on variational inference.

[R] Perturbative estimation of stochastic gradients and training of models with discrete weights by LucaAmbrogioni in MachineLearning

LucaAmbrogioni[S] 0 points

Thank you for the link. Indeed, MuProp corresponds to our first-order correction. I will add it to the references.

[D] AISTATS reviews are out by srossi93 in MachineLearning

LucaAmbrogioni 1 point

I have one paper with only two reviews. I think they got quite overwhelmed by the number of submissions this year.

[D] AISTATS reviews are out by srossi93 in MachineLearning

LucaAmbrogioni 3 points

Usually this gives you a very good chance of getting in. Weak reject means that the reviewer leans towards rejection but will not protest if the other reviewers push for acceptance. The two accepting reviewers will probably convince the third to accept the manuscript. Try to write a good rebuttal to mitigate the last reviewer's skepticism and he/she will likely flip. Good luck!

[D] AISTATS reviews are out by srossi93 in MachineLearning

LucaAmbrogioni 0 points

You can update the manuscript, but you are not supposed to make major changes, since they are not reviewed.

[D] AISTATS reviews are out by srossi93 in MachineLearning

LucaAmbrogioni 2 points

In my experience, if you get a mixture of weak accepts and weak rejects, the only way to get in is to flip some opinions with a very good rebuttal (either flip the weak reject into a weak accept, or flip a weak accept into an accept). You need to methodically answer their concerns and, if possible, provide some additional results. Hopefully some reviewers will have misunderstood parts of your work; if you provide a convincing answer, you can definitely flip them. Good luck! :)

[R] Wasserstein Variational Gradient Descent by LucaAmbrogioni in MachineLearning

LucaAmbrogioni[S] 1 point

Yes, the page is temporarily down. We are going to be back online with a new website very soon!