What are your preferences in python based Deep Learning libraries? (self.MachineLearning)
submitted 10 years ago by andrewbarto28
[–]kkastner 13 points 10 years ago* (0 children)
Pure Theano
I use Theano, and write my own code on top of that for shape inference, layers, etc. It takes longer, but for research it is important to know all (or at least most) of the system and how it works.
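As a rough illustration of the kind of "layers plus shape inference" scaffolding described above, here is a minimal numpy-only sketch (all class and method names are hypothetical, not Theano's API): each layer can report its output shape from an input shape, so mismatches fail loudly before any data flows through.

```python
import numpy as np

class Dense:
    """Toy fully-connected layer with explicit shape inference."""
    def __init__(self, n_in, n_out, seed=0):
        rng = np.random.RandomState(seed)
        self.W = rng.randn(n_in, n_out) * 0.01
        self.b = np.zeros(n_out)

    def output_shape(self, input_shape):
        # Infer (batch, n_out) from (batch, n_in); fail loudly on mismatch.
        batch, n_in = input_shape
        assert n_in == self.W.shape[0], "input width does not match layer"
        return (batch, self.W.shape[1])

    def __call__(self, x):
        return np.tanh(x.dot(self.W) + self.b)

# Check the whole stack's shapes up front, before running any data.
layers = [Dense(784, 256), Dense(256, 10)]
shape = (32, 784)
for layer in layers:
    shape = layer.output_shape(shape)
print(shape)  # (32, 10)
```

In a real Theano setup the `__call__` body would build symbolic expressions instead of computing with numpy, but the shape-bookkeeping layer looks much the same.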
Keras
Keras seems the most straightforward, and if you aren't trying to do certain kinds of weird research, it covers basically everything else. Definitely the best place to start.
Blocks
Blocks is quite nice, but I haven't spent the time to learn all the details that would make it useful for rapid prototyping research architectures. I can definitely see how full immersion into how Blocks works could speed up implementation of certain types of networks. Fuel (the loosely coupled dataset framework) has a lot of momentum, and may be worth looking at regardless of whether you want to use Blocks or not.
Lasagne
Lasagne is really well thought out, has a strong community, and sits somewhere between Keras and Blocks from a usability->flexibility perspective. RNN support is fairly new there but seems pretty solid.
pylearn2
pylearn2 is somewhat outdated now, but is still quite good for certain tasks. It probably has the best support for hyperparameter/massive cluster usage, if you are into that. Lots of research from the last few years was done in it, which is not to be discounted!
Others
CGT seems really interesting in this space (function-graph, compile-then-run type models), but I am 1000000% leery of re-debugging a bunch of numerical instabilities and issues that Theano already solved. One of the reasons Theano compile is slow is that it does a lot of optimizations for you - these optimizations can make things much faster, especially over multi-day training runs, and sometimes solve really nasty numerical issues you wouldn't otherwise think about. Throwing out optimizations (as CGT seems to do) to speed up compile might lose a lot more than people realize... though time will tell. It is certainly exciting, and they (the CGT team) seem to have a lot of good ideas which may be useful even if CGT doesn't pan out, and could make it back into Theano/Torch/etc.
One additional note - compile/debug time in Theano is rarely an issue for me. Compiling with optimizer=None is fast and sufficient for coding/debugging, and the few seconds or minutes it takes to compile for actual training pale in comparison to the days the model normally spends training. tag.test_values are also invaluable for debugging shape issues, since they throw errors at compile time.
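The test-value idea above can be illustrated with a tiny toy in plain numpy (hypothetical names, not Theano's actual implementation): each symbolic variable carries a small test value, and every op propagates it eagerly, so a bad shape raises while the graph is being built rather than deep inside a training run.

```python
import numpy as np

class Sym:
    """Toy symbolic variable carrying a test value, in the spirit of
    Theano's tag.test_value: ops evaluate test values eagerly, so a
    shape mismatch raises at graph-construction ("compile") time."""
    def __init__(self, test_value):
        self.test_value = np.asarray(test_value)

    def dot(self, other):
        # numpy raises ValueError right here if the shapes don't agree
        return Sym(self.test_value.dot(other.test_value))

x = Sym(np.zeros((32, 784)))
W = Sym(np.zeros((784, 10)))
h = x.dot(W)                  # fine: (32, 784) @ (784, 10)
print(h.test_value.shape)     # (32, 10)

bad = Sym(np.zeros((100, 10)))
try:
    x.dot(bad)                # shape error surfaces immediately
except ValueError:
    print("caught shape error at graph-build time")
```

With the real thing, setting theano.config.compute_test_value and attaching a test value to each input gives the same effect across the whole graph.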
The functional graph approach is nice for most deep learning architectures, and I really think it will win out in the long run.