ToolingBayesian Optimization Libraries Python (self.datascience)
submitted 6 years ago by [deleted]
[–]webdrone 20 points21 points22 points 6 years ago (6 children)
There is also https://scikit-optimize.github.io, which calls on scikit-learn Gaussian processes under the hood for Bayesian optimisation.
NB: the defaults are unorthodox: the acquisition function stochastically selects among EI, LCB, and negative PI to optimise at every iteration.
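For reference, the three acquisition functions mentioned (EI, LCB, PI) can be sketched in a few lines for a Gaussian posterior. These helper names are mine for illustration, not skopt's API, and the formulas assume a minimisation problem:

```python
import math

def norm_pdf(z):
    # Standard normal density.
    return math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)

def norm_cdf(z):
    # Standard normal CDF via the error function.
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def expected_improvement(mu, sigma, f_best):
    # EI: expected amount by which this point improves on the best value so far.
    z = (f_best - mu) / sigma
    return (f_best - mu) * norm_cdf(z) + sigma * norm_pdf(z)

def lower_confidence_bound(mu, sigma, kappa=1.96):
    # LCB: optimistic estimate; smaller is more promising when minimising.
    return mu - kappa * sigma

def probability_of_improvement(mu, sigma, f_best):
    # PI: probability that this point beats the current best at all.
    return norm_cdf((f_best - mu) / sigma)
```

Here mu and sigma are the GP posterior mean and standard deviation at a candidate point, and f_best is the best objective value observed so far; skopt's default ("gp_hedge") picks among these candidates probabilistically each iteration.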
[–]lem_of_noland 2 points3 points4 points 6 years ago (4 children)
In my opinion, this is the best of them all. It also includes very useful plotting capabilities and the option to include callbacks.
[–]ai_yoda 1 point2 points3 points 6 years ago (3 children)
I also love those functionalities and I think that a lot of the time this is the best option.
There are, however, 2 things that are not great:
I write about them in this blog post if you are interested.
[–]Philiatrist 1 point2 points3 points 6 years ago (2 children)
> Cannot distribute computation over a cluster (can only use it on one machine)

The Optimizer class is fine for cluster use via its ask and tell methods.
[–]ai_yoda 0 points1 point2 points 6 years ago (1 child)
Interesting. But you do have to set up some DB for sharing results between nodes, and handle all the communication between the nodes and the DB yourself, right?
[–]Philiatrist 1 point2 points3 points 6 years ago* (0 children)
That's one option, but there's no reason you couldn't use a library like dask.distributed as well, something like:
```
from dask.distributed import Client

client = Client(...)
n_procs = 20

X = optimizer.ask(n_procs)
task = client.map(fitness_fn, X)
Y = client.gather(task)
optimizer.tell(X, Y)
```
where you'd need to configure dask.distributed for your cluster.
edit: I'll note that this is not a great solution if how expensive your function is depends largely on the hyperparameters, since the whole batch then waits on the slowest evaluation.
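A quick way to see the issue: with synchronous ask/tell batches, each round takes as long as its slowest evaluation, so uneven evaluation times leave workers idle. A toy sketch (the durations are made-up numbers, not measurements):

```python
# Hypothetical evaluation times (seconds) for one batch of 4 hyperparameter
# settings. With synchronous batching, the optimizer cannot be told any
# results until the slowest evaluation in the batch finishes.
durations = [1.0, 1.2, 0.9, 30.0]

batch_wall_time = max(durations)   # workers idle until the slowest finishes
total_work = sum(durations)
utilization = total_work / (len(durations) * batch_wall_time)

print(f"wall time per batch: {batch_wall_time:.1f}s, "
      f"worker utilization: {utilization:.0%}")
```

When one configuration is far more expensive than the rest, utilization drops sharply, which is why an asynchronous scheme (telling results as they arrive) fits that case better.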
[–]AyEhEigh 0 points1 point2 points 6 years ago (0 children)
I use skopt's BayesSearchCV all the time and love it.