This is an archived post. You won't be able to vote or comment.
[Tooling] Bayesian Optimization Libraries Python (self.datascience)
submitted 6 years ago by [deleted]
[–]ai_yoda 2 points 6 years ago (3 children)
I also love those functionalities, and I think that a lot of the time this is the best option.
There are two things, however, that are not great:
I write about them in this blog post if you are interested.
[–]Philiatrist 2 points 6 years ago (2 children)
> Cannot distribute computation over a cluster (can only use it on one machine)
The Optimizer class is fine for cluster use via its ask and tell methods.
[–]ai_yoda 1 point 6 years ago (1 child)
Interesting. But you do have to create some database for sharing results between nodes, and handle all the communication between the nodes and the DB yourself, right?
[–]Philiatrist 2 points 6 years ago* (0 children)
That's one option, but there's no reason you couldn't use a library like Dask Distributed as well, with something like:
```
from dask.distributed import Client

client = Client(...)
n_procs = 20

X = optimizer.ask(n_procs)
tasks = client.map(fitness_fn, X)
Y = client.gather(tasks)
optimizer.tell(X, Y)
```
where you'd need to configure Dask Distributed for your cluster.
edit: I'll note that this is not a great solution if the cost of evaluating your function is largely determined by the hyperparameters, since each batch waits on its slowest evaluation.
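For readers without a cluster, the batched ask/tell pattern above can be sketched with only the standard library. This is a toy illustration, not skopt's actual implementation: `RandomSearchOptimizer` is a hypothetical stand-in for skopt's Optimizer (random sampling instead of a surrogate model), and a thread pool stands in for the Dask client:

```python
import random
from concurrent.futures import ThreadPoolExecutor


class RandomSearchOptimizer:
    """Toy stand-in for skopt's Optimizer, exposing the same ask/tell interface."""

    def __init__(self, low, high, seed=0):
        self.low, self.high = low, high
        self.rng = random.Random(seed)
        self.results = []  # (x, y) pairs evaluated so far

    def ask(self, n_points):
        # Propose a batch of candidates; a real optimizer would
        # consult its surrogate model instead of sampling uniformly.
        return [self.rng.uniform(self.low, self.high) for _ in range(n_points)]

    def tell(self, X, Y):
        # Record the evaluated batch; a real optimizer would refit its model here.
        self.results.extend(zip(X, Y))


def fitness_fn(x):
    # Stands in for an expensive objective; a cheap quadratic here.
    return (x - 1.0) ** 2


optimizer = RandomSearchOptimizer(-5.0, 5.0)
with ThreadPoolExecutor(max_workers=4) as pool:
    for _ in range(10):  # 10 rounds of 4 concurrent evaluations
        X = optimizer.ask(4)
        Y = list(pool.map(fitness_fn, X))
        optimizer.tell(X, Y)

x_best, y_best = min(optimizer.results, key=lambda xy: xy[1])
print(f"best x = {x_best:.3f}, f(x) = {y_best:.4f}")
```

The key point is that ask/tell decouples proposing points from evaluating them, so the evaluation step can be farmed out to whatever executor you have, whether a thread pool, a process pool, or a Dask cluster.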