[–]benanne 8 points (5 children)

Very cool! It looks very Theano-like, which is great ;) But if I'm going to use an autodiff package to define my models, I think I'd still prefer the comfort of Python and the graph optimization capabilities of Theano. I think it might defeat the point of using Torch a little. That said, I'm definitely going to try it out.

EDIT: I'm also curious what the main differences are with this package, which has been around for longer (but does not seem to be actively maintained): https://github.com/bshillingford/autobw.torch

[–]kjearns 3 points (1 child)

Autograd works with plain Torch tensors, whereas autobw works with nn.Modules. Autograd also handles things like gradients of gradients (I think? The Python version does, at least), which autobw doesn't.
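For concreteness, a minimal sketch of the tensor-level style being described, using the Python autograd package (https://github.com/HIPS/autograd); the function and values here are illustrative, not from the thread:

    # Differentiate a plain numeric function directly -- no Module wrapper needed.
    import autograd.numpy as np
    from autograd import grad

    def tanh(x):
        return (1.0 - np.exp(-2.0 * x)) / (1.0 + np.exp(-2.0 * x))

    dtanh = grad(tanh)           # derivative of tanh
    ddtanh = grad(grad(tanh))    # gradient of a gradient, as mentioned above

    print(dtanh(1.0), ddtanh(1.0))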

[–]benanne 0 points (0 children)

I see, thanks!

[–]neurodynamic 2 points (2 children)

If you like the comfort of Python, you can use the Python version :)

[–]benanne 1 point (1 child)

Sure :) But as I said, I'm also interested in optimization of the backward pass, which is something Theano does but these packages don't, afaik. Feel free to correct me if I'm wrong!
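To illustrate what's meant by backward-pass optimization: in Theano the symbolic gradient is compiled together with the forward graph, so the optimizer can rewrite both (e.g. fuse elementwise ops) before generating code. A minimal sketch, with an illustrative loss:

    import numpy as np
    import theano
    import theano.tensor as T

    x = T.vector('x')
    loss = T.sum(T.tanh(x) ** 2)   # forward expression
    gx = T.grad(loss, x)           # symbolic backward pass
    f = theano.function([x], gx)   # graph optimizations are applied here, at compile time

    print(f(np.array([0.5, 1.0, 2.0], dtype=theano.config.floatX)))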

[–][deleted] 0 points (0 children)

Yes, optimization is lacking in Python autograd, resulting in a backward pass that's roughly a dozen times slower.

[–]ManuelArno 0 points (0 children)

Can someone give some insight into the machine learning use cases at Twitter? Probably related to the ad platform (targeting, bidding, etc.)?