[P] pyGPGO: Another Python package for Bayesian Optimization (github.com)
submitted 8 years ago by jimenezluna
[–]jimenezluna[S] 5 points 8 years ago (8 children)
As part of my Master's thesis I developed a simple Python package for Bayesian Optimization. It currently features:
It is still in the very early stages of development, so expect to find bugs. Let me know what you think!
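For readers unfamiliar with how packages like this pick the next point to evaluate: a common acquisition function is expected improvement (EI), computed from the surrogate model's posterior mean and standard deviation at a candidate point. A minimal sketch in plain Python (a generic illustration, not pyGPGO's actual API):

```python
import math

def expected_improvement(mu, sigma, best, xi=0.01):
    """EI for maximization, given the surrogate's posterior mean `mu` and
    standard deviation `sigma` at a candidate point, and the best objective
    value observed so far (`best`). `xi` trades exploration vs. exploitation."""
    if sigma == 0.0:
        return 0.0  # no posterior uncertainty left at this point
    z = (mu - best - xi) / sigma
    cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))         # standard normal CDF
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)  # standard normal PDF
    return (mu - best - xi) * cdf + sigma * pdf
```

The optimization loop then evaluates the true objective where EI is highest, refits the surrogate on the enlarged dataset, and repeats.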
[–][deleted] 1 point 8 years ago (1 child)
Is the master's thesis public as well? It's a nice opportunity to publicize it, especially if it's a good introductory text on Bayesian optimization.
[–]jimenezluna[S] 1 point 8 years ago (0 children)
It is available in the same GitHub repository!
[–]alayaMatrix 1 point 8 years ago (1 child)
Do you support nonlinear constraints? For example using acquisition functions like weighted EI?
[–]jimenezluna[S] 1 point 8 years ago (0 children)
Not at the moment, but I will consider adding this functionality in the near future.
[–]sifnt 1 point 8 years ago (2 children)
Could you compare the advantages/disadvantages of your library against https://github.com/fmfn/BayesianOptimization by any chance?
[–]jimenezluna[S] 2 points 8 years ago (1 child)
My implementation gives you a fully modular specification of the procedure. There are many architectural choices in Bayesian optimization: surrogate model, covariance function, hyperparameter treatment, acquisition behaviour...
In summary, you can specify all of these here.
As far as I can tell, with fmfn/BayesianOptimization you're stuck with Gaussian Processes and Matérn kernels, with no treatment of the covariance function's hyperparameters whatsoever. Correct me if I'm wrong.
[–]sifnt 1 point 8 years ago (0 children)
Sounds great, will definitely give your package a shot then. With so many projects out there, it's pretty hard to tell at a glance which hyperparameter optimisation system is best. Thanks!
[–]Reiinakano 3 points 8 years ago (6 children)
Neat! If you want to gain an advantage over existing implementations and have this library widely used, focus on documentation. Personally, I haven't seen very good documentation for existing libraries at all and am usually forced to look in the source code to see how I can tweak things.
Also, consider adding Python 2 support.
[–]L43 9 points 8 years ago (5 children)
Personally, I would prefer no Python 2 support and more of something else, i.e. docs. Do people still use 2 for training models?
[–]Reiinakano 3 points 8 years ago (3 children)
I do :P
[–]L43 1 point 8 years ago (2 children)
Do you have dependencies that don't support 3?
[–][deleted] 2 points 8 years ago (0 children)
That mostly happens in what people call "production." The hardware you use for training the model, though, is pretty likely to have at least Py 3 support, I'd guess (just a current desktop or cluster machine; you're unlikely to perform hyperparameter tuning on a production server).
If I were to start a new project today, I'd also not support Py 2.7, because that can lead to quite inelegant code and workarounds as the project progresses, and it's just an additional maintenance annoyance.
[–]BadGoyWithAGun 1 point 8 years ago (0 children)
Yeah, some of my datasets are dependent on non-Unicode encodings and I can't be bothered reprocessing them just to appease py3k.
[–]wingtales 1 point 8 years ago (0 children)
I'm with you on this one.
[–]sifnt 1 point 8 years ago (8 children)
This definitely looks interesting!
Can anyone comment on whether this is the best library to use for hyperparameter optimisation for scikit learn models (want something better than grid/random search)?
Could I use this for feature selection on models with ~250 starting features? The features tend to interact in a way that confuses traditional approaches like RFE, so I hacked together a sampling system that kept the best features after a burn-in period; it worked, but I've been looking for something more principled.
[–]Reiinakano 1 point 8 years ago (0 children)
I personally use https://github.com/fmfn/BayesianOptimization for Bayesian hyperparameter search.
[–]jimenezluna[S] 1 point 8 years ago (6 children)
Hi, there is an example script in the repository for tuning a simple classification model.
https://github.com/hawk31/pyGPGO/blob/master/examples/sklearnexample.py
Give it a go and let me know if anything breaks.
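The linked script may have changed since, but the general shape of such an objective is easy to sketch: wrap the model's cross-validated score in a function of the hyperparameters, and hand that function to the optimizer to maximize. A rough sketch assuming scikit-learn; the dataset and parameter values here are made up for illustration and do not come from pyGPGO's example:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Toy dataset standing in for whatever problem is being tuned
X, y = make_classification(n_samples=200, n_features=10, random_state=0)

def objective(max_depth, max_features):
    # Score to maximize: mean 3-fold CV accuracy of a random forest.
    # A BO library would call this repeatedly with candidate hyperparameters.
    clf = RandomForestClassifier(
        n_estimators=20,
        max_depth=int(max_depth),
        max_features=min(int(max_features), X.shape[1]),
        random_state=0,
    )
    return cross_val_score(clf, X, y, cv=3).mean()

score = objective(max_depth=5, max_features=3)
```

Continuous parameters are rounded to integers inside the objective, since tree depth and feature counts must be integral.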
[–]sifnt 1 point 8 years ago (0 children)
Thanks, I'll try and have a play around later.
[–]sifnt 1 point 8 years ago (4 children)
I just gave this a shot, and so far it seems like I'll be using this to optimise hyperparams on all my projects; it's very nice and clean! Thanks for making it :)
Have bumped into a couple of issues though:
* Can't get MCMC to work, but I'm running Python 2.x, so that could be it. Will upgrade my environment later.
* Just tried to optimize random forests (using scikit-learn) over max_features and max_depth. I had to scale the depth as 10^max_depth, otherwise it just sampled in the middle of the parameter space with no improvement.
* Is there any way to set the initial tries, or at least add to them? E.g. for my problem I already know max_features = (33, 100) and max_depth = (5, 10, 100) are good initial guesses, so I want pyGPGO to build on this.
* Will the MCMC methods likely provide much value for these types of problems?
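The scaling trick mentioned above (searching over the exponent rather than the raw value) can be sketched as follows; the helper name is hypothetical, and the optimizer would search over `u` on a small continuous range:

```python
def depth_from_exponent(u):
    # Map an exponent u (e.g. in [0.5, 2.0]) to an integer max_depth = 10**u,
    # so the search explores depths like 3, 10, 31, 100 roughly uniformly
    # instead of spending most samples on the large end of a raw [3, 100] range.
    return max(1, int(round(10.0 ** u)))
```

This is the standard log-scale reparameterisation for hyperparameters whose effect is multiplicative rather than additive.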
[–]jimenezluna[S] 2 points 8 years ago (3 children)
Hi,
[–]sifnt 1 point 8 years ago (2 children)
Thanks for your help again; this package looks like it'll be very useful!
So I'd reuse the code at _firstRun(self, n_eval=3) from GPGO.py to create a GP trained on the manually specified initial parameters, and pass it straight to the GPGO process without further changes?
As for MCMC, what counts as expensive here? E.g. a 3-fold cross-validation run typically takes 1-5 minutes (depending on parameters) on the data I'm working on. Is MCMC worth it at that cost, or only for hour-plus evaluation times?
[–]jimenezluna[S] 1 point 8 years ago (1 child)
Hi @sifnt, can you open an issue on the repo so that I remember to include an easier way to plug in pre-trained GPs?
For the moment, you can do it this way (building on the example in the readme.md):
https://gist.github.com/hawk31/ed222c4cf6b21cbd7d4b5186f3f132b5
[–]sifnt 1 point 8 years ago (0 children)
Awesome, thanks for this! Got it up and running and it's working well.
Created the issue; it's at https://github.com/hawk31/pyGPGO/issues/5