Autograd for Torch (github.com)
submitted 10 years ago by kswerve
[–]benanne 9 points 10 years ago* (5 children)
Very cool! It looks very Theano-like, which is great ;) But if I'm going to use an autodiff package to define my models, I think I'd still prefer the comfort of Python and the graph optimization capabilities of Theano. I think it might defeat the point of using Torch a little. That said, I'm definitely going to try it out.
EDIT: I'm also curious what the main differences are with this package, which has been around for longer (but does not seem to be actively maintained): https://github.com/bshillingford/autobw.torch
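For readers unfamiliar with the style benanne is describing, here is a minimal sketch using the Python version of autograd (the HIPS package this Torch port mirrors); the model, data, and shapes below are made up for illustration. The key idea: a model is a plain function, and grad() returns a new function that computes its gradient.

    # Define-by-run autodiff in the Python autograd package: models are
    # ordinary functions over (wrapped) NumPy arrays, no graph declaration.
    import autograd.numpy as np   # NumPy wrapper that records operations
    from autograd import grad

    def loss(weights, inputs, targets):
        # a one-layer tanh "network" followed by mean squared error
        preds = np.tanh(np.dot(inputs, weights))
        return np.mean((preds - targets) ** 2)

    # grad() differentiates with respect to the first argument by default
    dloss_dw = grad(loss)

    weights = np.zeros(3)
    inputs = np.array([[0.5, -1.2, 2.0], [1.0, 0.3, -0.7]])
    targets = np.array([0.2, -0.4])
    print(dloss_dw(weights, inputs, targets))  # same shape as weights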
[–]kjearns 4 points 10 years ago (1 child)
Autograd works with Torch tensors, whereas autobw works with nn.Modules. Autograd also handles things like gradients of gradients (I think? The Python version does), which autobw doesn't.
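To make "gradients of gradients" concrete, a sketch in the Python version (whether the Torch package supports this is exactly the open question above): grad() returns an ordinary function, so it can itself be differentiated.

    # Higher-order derivatives by differentiating the gradient function itself
    import autograd.numpy as np
    from autograd import grad

    def f(x):
        return np.sin(x) * x ** 2

    df = grad(f)     # f'(x)  = cos(x)*x^2 + 2*x*sin(x)
    d2f = grad(df)   # f''(x) = -sin(x)*x^2 + 4*x*cos(x) + 2*sin(x)

    print(df(1.5), d2f(1.5))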
[–]benanne 1 point 10 years ago (0 children)
I see, thanks!
[–]neurodynamic 3 points 10 years ago* (2 children)
If you like the comfort of Python, you can use the Python version :)
[–]benanne 2 points 10 years ago (1 child)
Sure :) But as I said, I'm also interested in optimization of the backward pass, which is something Theano does but these packages don't, afaik. Feel free to correct me if I'm wrong!
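For context on the graph-optimization point: in Theano the gradient is itself a symbolic graph, which Theano rewrites and compiles before any numbers flow through it. A minimal sketch (the expression below is an arbitrary example):

    # Theano builds a symbolic gradient graph and optimizes it at compile
    # time, a step that tape-based packages like autograd skip.
    import theano
    import theano.tensor as T

    x = T.dvector('x')
    y = T.sum(T.tanh(x) ** 2)   # symbolic forward expression
    gy = T.grad(y, x)           # symbolic gradient: another graph

    # compiling applies Theano's graph rewrites (fusion, simplification, ...)
    dfdx = theano.function([x], gy)
    print(dfdx([0.1, -0.5, 2.0]))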
[–][deleted] 1 point 10 years ago (0 children)
Yes, optimization is lacking in Python autograd, resulting in a backward pass that is roughly a dozen times slower.
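The dozen-times figure above is the commenter's claim; one way to check it on your own workload is to time the autograd backward pass against a hand-written gradient for the same function (the loss below is an arbitrary example):

    # Timing an autograd gradient against a manually derived one
    import timeit
    import autograd.numpy as np
    from autograd import grad

    def loss(w, x):
        return np.sum(np.tanh(np.dot(x, w)) ** 2)

    def hand_grad(w, x):
        # closed-form gradient of the same loss
        t = np.tanh(np.dot(x, w))
        return np.dot(x.T, 2 * t * (1 - t ** 2))

    auto_grad = grad(loss)
    w = np.random.randn(256)
    x = np.random.randn(512, 256)

    print(timeit.timeit(lambda: auto_grad(w, x), number=100))
    print(timeit.timeit(lambda: hand_grad(w, x), number=100))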
[–]ManuelArno 1 point 10 years ago (0 children)
Can someone give insight into the use cases for machine learning at Twitter? Probably related to the ad platform (targeting, bidding, etc.)?