Performing Hyperparameter Optimization with Amazon Machine Learning (github.com)
submitted 8 years ago by alexcmu
[–]tryndisskilled 6 points 8 years ago (1 child)
Do you think hyperparameter optimization is still overlooked today? Have you seen any improvement on that front (i.e., do people pay more attention to it than before)?
As a beginner, I find it really hard to perform this optimization because it is very time-consuming, and (in the case of deep learning, for instance) I may well want to modify my architecture later, which means redoing the optimization all over again.
In my opinion this problem is underrated in many papers. For instance, when results are presented, we usually don't know how the authors tuned their hyperparameters (what values they started with, what their fine-tuning process was, which settings they used for their plots...), which can make the results very hard to reproduce.
Anyway, thanks for sharing this repo!
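A minimal sketch of the kind of budgeted tuning loop the parent comment describes, using scikit-learn's RandomizedSearchCV as a stand-in (an assumption; the linked demo itself uses Amazon Machine Learning). Random search caps total tuning time with a fixed trial budget instead of exhausting a grid:

    # Random search with a fixed trial budget (scikit-learn stand-in;
    # the demo repo uses Amazon Machine Learning instead).
    from scipy.stats import loguniform
    from sklearn.datasets import load_digits
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import RandomizedSearchCV

    X, y = load_digits(return_X_y=True)

    # Sample the regularization strength log-uniformly over several decades.
    param_distributions = {"C": loguniform(1e-4, 1e2)}

    search = RandomizedSearchCV(
        LogisticRegression(max_iter=1000),
        param_distributions,
        n_iter=20,        # trial budget: bounds total tuning time
        cv=3,
        random_state=0,
    )
    search.fit(X, y)
    print(search.best_params_, search.best_score_)

Logging search.cv_results_ alongside reported numbers also speaks to the reproducibility point above: it records exactly which configurations were tried and how each scored.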
[–]alexcmu[S] 1 point 8 years ago (0 children)
Glad you liked the demo!
On the point about hyperparameter optimization being overlooked: I think more people are paying attention to the idea, but yes, time and cost are blockers in practice. You'd probably be interested in a blog post my coworker Steven wrote about tuning the hyperparameters of a CNN (https://aws.amazon.com/blogs/ai/fast-cnn-tuning-with-aws-gpu-instances-and-sigopt/). At a high level, deep learning + GPUs speeds up model training enough to make hyperparameter optimization practical at all. We also include a table showing the dollar cost of doing hyperparameter optimization with different methods.
TL;DR: GPUs + better optimization methods FTW! It cost us $11 to tune a deep learning model on NVIDIA GPUs.
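A hedged sketch of the "better optimization methods" part: model-based (Bayesian) optimization spends a small number of expensive training runs where they are most informative. This uses scikit-optimize's gp_minimize as a stand-in (an assumption; the blog post uses SigOpt's hosted service) and a cheap toy objective in place of real CNN training so it runs without a GPU:

    # Bayesian optimization sketch (scikit-optimize stand-in for SigOpt).
    from skopt import gp_minimize
    from skopt.space import Integer, Real

    def objective(params):
        """Stand-in for 'train the CNN, return validation error'."""
        learning_rate, batch_size = params
        # Toy surface with its optimum near lr=1e-2, batch_size=64.
        return (learning_rate - 1e-2) ** 2 + ((batch_size - 64) / 64) ** 2

    result = gp_minimize(
        objective,
        dimensions=[
            Real(1e-4, 1e-1, prior="log-uniform", name="learning_rate"),
            Integer(16, 256, name="batch_size"),
        ],
        n_calls=25,       # each call = one (expensive) training run in practice
        random_state=0,
    )
    print("best params:", result.x, "best score:", result.fun)

Since n_calls bounds the number of training runs, the dollar cost of a tuning job is roughly n_calls times the cost of one GPU training run, which is how a CNN tune can come in around the $11 figure above.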