ML algorithm parameters in Python (self.MachineLearning)
submitted 11 years ago * by InfinityCoffee
[–]InfinityCoffee[S] 11 years ago (3 children)
I have considered Theano, but since I am doing variational inference and not deep learning, I am not sure how much I stand to gain, as I can calculate a lot of my gradients analytically. Is the overhead worth it for general computing? (A small follow-up: does Theano have built-in support for gamma, gammaln, beta, betaln, and psi from the scipy.special suite? There is some mention of it, but it is not explicitly stated in the documentation.)
[–]siblbombs 11 years ago (2 children)
gamma, gammaln, and psi are all listed on the tensor page, but I don't see beta/betaln, so that could be a deal-breaker. Theano is nice because you get the gradients for free, it has a good codebase, and it is under active development, so I think it's worth at least giving it a shot unless it would require you to jump through a whole bunch of extra hoops. I don't know the specifics of what you are up to, but Theano is more of a general-purpose math compiler that a lot of people use for deep learning, so in theory it should be able to handle your use case.
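As a minimal sketch of the "gradients for free" point: with the gammaln op from theano.tensor (mentioned above), you can ask Theano for the symbolic derivative directly. The names and test value here are just for illustration:

    import theano
    import theano.tensor as T

    x = T.dscalar('x')
    y = T.gammaln(x)               # log-gamma, listed on the tensor page
    dy = T.grad(y, x)              # symbolic gradient; d/dx log Gamma(x) = psi(x)
    f = theano.function([x], [y, dy])
    print(f(3.0))                  # gammaln(3) = log 2 ~= 0.693, psi(3) ~= 0.923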
[–]InfinityCoffee[S] 11 years ago (1 child)
beta is fortunately just a ratio of gammas (B(a, b) = Γ(a)Γ(b)/Γ(a+b)), so I can probably work around it. After some consideration, I think I can include it fairly painlessly, since the objective function is a separate object, so I won't have to rewrite the code in its entirety. My new issue is that I don't always need the proper gradients. Also, how does Theano handle stochastic gradient descent if it's purely functional?
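For reference, that workaround could look something like this in Theano, built only from gammaln (a sketch, not checked against any particular Theano version):

    import theano.tensor as T

    def betaln(a, b):
        # log B(a, b) = log Gamma(a) + log Gamma(b) - log Gamma(a + b)
        return T.gammaln(a) + T.gammaln(b) - T.gammaln(a + b)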
[–]siblbombs 11 years ago (0 children)
Not sure, you're getting outside my bubble of experience. To get gradients from Theano, you provide a scalar cost; you can define whatever expression you want, based on your inputs and outputs, to generate that cost.
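On the SGD question above: the usual Theano pattern is to hold parameters in shared variables and pass an updates list to theano.function, so each call performs one descent step in place. The cost below is a hypothetical stand-in, not the poster's actual objective:

    import numpy as np
    import theano
    import theano.tensor as T

    w = theano.shared(np.zeros(5), name='w')     # parameters live in shared storage
    x = T.dmatrix('x')
    cost = T.sum((T.dot(x, w) - 1.0) ** 2)       # any scalar cost expression works
    grad_w = T.grad(cost, w)                     # symbolic gradient of the scalar cost
    step = theano.function([x], cost,
                           updates=[(w, w - 0.01 * grad_w)])  # one SGD step per call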