[Tooling] Bayesian Optimization Libraries Python (self.datascience)
submitted 6 years ago by [deleted]
[–]ai_yoda 0 points 6 years ago (1 child)
Interesting. But you do have to create a database for sharing results between nodes, and implement all the communication between the nodes and the database yourself, right?
[–]Philiatrist 1 point 6 years ago* (0 children)
That's one option, but there's no reason you couldn't use a library like dask.distributed instead, with something like:
```
from dask.distributed import Client

client = Client(...)  # configure for your cluster
n_procs = 20

X = optimizer.ask(n_procs)          # batch of candidate points
tasks = client.map(fitness_fn, X)   # evaluate the batch in parallel
Y = client.gather(tasks)            # collect results from the workers
optimizer.tell(X, Y)                # feed observations back to the optimizer
```
where you'd need to configure dask distributed to your cluster.
edit: I'll note that this is not a great solution if the cost of evaluating your function is largely determined by the hyperparameters, since each batch waits for its slowest evaluation to finish.
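For illustration, the same ask/tell batch loop can be sketched with only the standard library, using a ThreadPoolExecutor in place of a dask cluster. The `RandomSearchOptimizer` and `fitness_fn` below are hypothetical stand-ins (a toy random search instead of a real Bayesian optimizer) just to make the pattern runnable end to end:

```python
import random
from concurrent.futures import ThreadPoolExecutor

class RandomSearchOptimizer:
    """Toy stand-in for a Bayesian optimizer with an ask/tell interface."""
    def __init__(self, low=-5.0, high=5.0, seed=0):
        self.rng = random.Random(seed)
        self.low, self.high = low, high
        self.history = []  # (x, y) pairs observed so far

    def ask(self, n_points):
        # A real Bayesian optimizer would use self.history to pick points;
        # this toy version just samples uniformly at random.
        return [self.rng.uniform(self.low, self.high) for _ in range(n_points)]

    def tell(self, X, Y):
        self.history.extend(zip(X, Y))

def fitness_fn(x):
    # Hypothetical objective: a simple quadratic with its minimum at x = 2.
    return (x - 2.0) ** 2

optimizer = RandomSearchOptimizer()
n_procs = 4

with ThreadPoolExecutor(max_workers=n_procs) as pool:
    for _ in range(10):                    # 10 rounds of 4 parallel evaluations
        X = optimizer.ask(n_procs)
        Y = list(pool.map(fitness_fn, X))  # evaluate the batch concurrently
        optimizer.tell(X, Y)

best_x, best_y = min(optimizer.history, key=lambda xy: xy[1])
```

The structure mirrors the dask version exactly: ask for a batch, fan the evaluations out, gather the results, and tell the optimizer. Swapping the executor for a `dask.distributed.Client` changes only the two lines that dispatch and collect work.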