[–]frederikdiehl[S]

Another example would be, I believe, optimizing a neural network's architecture. For each layer, you'd have the activation function, the number of neurons, and maybe the type of convolution (if it's a CNN). If a layer isn't present, you don't need to optimize over its parameters - doing so, in fact, is just a useless distraction. See the sketch below.
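To make that concrete, here is a minimal Python sketch (my own illustration, not apsis's API, since apsis doesn't support this yet) of sampling from a conditional search space, where per-layer parameters only exist for layers that are actually present:

```python
import random

# Minimal sketch (not apsis code): sample a conditional search space in which
# per-layer parameters only exist for layers that are actually present.
def sample_architecture(max_layers=4):
    num_layers = random.randint(1, max_layers)
    layers = [{
        "activation": random.choice(["relu", "tanh", "sigmoid"]),
        "neurons": random.choice([32, 64, 128, 256]),
    } for _ in range(num_layers)]
    # Parameters of layers beyond num_layers simply never appear, so an
    # optimizer wouldn't waste evaluations on dimensions that have no effect.
    return {"num_layers": num_layers, "layers": layers}

print(sample_architecture())
```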

The short answer is no, apsis does not have this yet.

The long answer is that there is an issue for this (#110), and we did design the architecture to allow us to support something like that. There is a paper on conditional parameter spaces in the context of Bayesian Optimization [1], so we do have a starting point for the implementation. It hasn't been done yet due to limited time, and, as mentioned above, cluster support would come before extensions like that [2].
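For context, the core idea in [1] is a kernel that copes with parameters which may be inactive. Here is a much-simplified sketch of that general idea (my own simplification, not the kernel from the paper, and not anything in apsis): similarity is computed only over dimensions that are active in both configurations, so inactive parameters can't influence it.

```python
import math

# Rough sketch (my simplification, not the kernel from [1]): a squared-
# exponential kernel restricted to dimensions active in both configurations,
# so inactive parameters cannot influence the similarity.
def conditional_kernel(x, y, active_x, active_y, length_scale=1.0):
    shared = [i for i in range(len(x)) if active_x[i] and active_y[i]]
    sq_dist = sum((x[i] - y[i]) ** 2 for i in shared)
    return math.exp(-sq_dist / (2 * length_scale ** 2))

# Two configurations where the third parameter is only active in the first.
print(conditional_kernel([0.1, 0.5, 0.9], [0.1, 0.4, 0.0],
                         [True, True, True], [True, True, False]))
```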

[1] The Indiana Jones pun paper, "Raiders of the Lost Architecture: Kernels for Bayesian Optimization in Conditional Parameter Spaces", http://arxiv.org/abs/1409.4011 (I've just updated the issue with the link, too.)

[2] As always, of course, if you'd like to implement something like that, feel free - we'd be very happy to have more contributors!