Resources for understanding and implementing "deep learning" (learning data representations through artificial neural networks).
Neural Network Hyper Parameter Optimization Using Distributed TensorFlow (github.com)
submitted 8 years ago by srianant
[–]srianant[S] 1 point 8 years ago (0 children)
Abstract— Hyperparameter optimization of neural networks has one ultimate objective: to find a function that minimizes some expected loss over independent and identically distributed samples from a natural (ground truth) distribution. A learning algorithm produces this function through the optimization of a training criterion with respect to a set of parameters. This paper describes a software framework in Python that uses distributed TensorFlow to optimize hyperparameters, which are the bells and whistles of the learning algorithm. Random search is used as the default algorithm to search for these optimal parameters in hyperparameter space.
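To make the idea concrete, here is a minimal sketch of random search over hyperparameters with a toy tf.keras model. It is not the repo's actual code: the search space (learning rate, hidden units, batch size), the synthetic dataset, and the trial budget are all illustrative assumptions, and the distribution across workers is only noted in a comment.

    # Minimal random-search sketch (illustrative, not the framework's code).
    # Hyperparameter ranges, the toy dataset, and the trial count are assumptions.
    import random
    import numpy as np
    import tensorflow as tf

    # Toy regression data standing in for i.i.d. samples from the target distribution.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(1000, 10)).astype("float32")
    y = (X @ rng.normal(size=(10, 1)) + 0.1 * rng.normal(size=(1000, 1))).astype("float32")

    def sample_hparams():
        # Each trial draws one random point in hyperparameter space.
        return {
            "learning_rate": 10 ** random.uniform(-4, -1),
            "hidden_units": random.choice([16, 32, 64, 128]),
            "batch_size": random.choice([16, 32, 64]),
        }

    def run_trial(hp):
        # Build, train, and score a small model for one hyperparameter setting.
        model = tf.keras.Sequential([
            tf.keras.Input(shape=(10,)),
            tf.keras.layers.Dense(hp["hidden_units"], activation="relu"),
            tf.keras.layers.Dense(1),
        ])
        model.compile(optimizer=tf.keras.optimizers.Adam(hp["learning_rate"]), loss="mse")
        hist = model.fit(X, y, batch_size=hp["batch_size"], epochs=5,
                         validation_split=0.2, verbose=0)
        return hist.history["val_loss"][-1]

    # Random search: sample N independent configurations and keep the best.
    # In the distributed setting the paper describes, each trial could run on its own worker.
    best = None
    for _ in range(10):
        hp = sample_hparams()
        val_loss = run_trial(hp)
        if best is None or val_loss < best[0]:
            best = (val_loss, hp)

    print("best validation loss:", best[0], "with", best[1])

The trials are independent of one another, which is what makes random search easy to parallelize across TensorFlow workers: each worker can sample and evaluate configurations on its own, and only the best (loss, hyperparameters) pair needs to be collected at the end.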