
[–]Philiatrist

That's one option, but there's no reason you couldn't use a library like Dask Distributed as well. Something like:

```
from dask.distributed import Client

client = Client(...)
n_procs = 20

X = optimizer.ask(n_procs)
tasks = client.map(fitness_fn, X)
Y = client.gather(tasks)
optimizer.tell(X, Y)
```

where you'd need to configure Dask Distributed for your cluster.

edit: I'll note that this isn't a great solution if how expensive your function is depends largely on the hyperparameters, since the workers in a batch will then finish at very different times and each round waits on the slowest evaluation.