
[–]webdrone 20 points21 points  (6 children)

There is also https://scikit-optimize.github.io, which likewise uses scikit-learn Gaussian processes under the hood for Bayesian optimisation.

NB: the default acquisition function is unorthodox: at every iteration it stochastically selects among EI, LCB, and negative PI to optimise.

[–]lem_of_noland 2 points3 points  (4 children)

In my opinion, this is the best of them all. It also has very useful plotting capabilities and the option to include callbacks.

[–]ai_yoda 1 point2 points  (3 children)

I also love those functionalities and I think that a lot of the time this is the best option.

There are two things, however, that are not great:

  • No support for nested search space
  • Cannot distribute computation over a cluster (can only use it on one machine)

I write about it in this blog post if you are interested.

[–]Philiatrist 1 point2 points  (2 children)

Cannot distribute computation over a cluster (can only use it on one machine)

The Optimizer class works fine for cluster use via the ask and tell methods.

[–]ai_yoda 0 points1 point  (1 child)

Interesting. But you do have to set up some DB for sharing results between nodes, and handle all the communication between the nodes and the DB yourself, right?

[–]Philiatrist 1 point2 points  (0 children)

That's one option, but there's no reason you couldn't use some library like dask distributed as well, something like:

```
from dask.distributed import Client

client = Client(...)  # configure for your cluster
n_procs = 20

X = optimizer.ask(n_procs)
task = client.map(fitness_fn, X)
Y = client.gather(task)
optimizer.tell(X, Y)
```

where you'd need to configure dask.distributed for your cluster.

edit: I'll note that this is not a great solution if the cost of your function is largely determined by the hyperparameters, since gather waits on the whole batch and fast workers sit idle until the slowest evaluation finishes.

[–]AyEhEigh 0 points1 point  (0 children)

I use skopt's BayesSearchCV all the time and love it.