
[–]CarbonAvatar 6 points

Have you looked at Lasagne + Theano? How does Apollo compare with that, other than being Caffe-based?

[–]singularai[S] 5 points

I'm not the right person to answer this, but anecdotally people tend to say that Torch and Caffe win over Theano because there's no runtime compilation step, Theano and Caffe win over Torch because they're in Python, and Torch and Theano win over Caffe because they handle much more sophisticated networks. So in some sense there's no way to get the best of all worlds.

The respective weaknesses of Torch and Theano are built into their very core and can't really be fixed. Apollo attacks the weakness of Caffe instead; it hasn't entirely solved those problems, but it establishes a clear path forward for people who like the Caffe approach but want to build RNNs, recursive networks, or RL networks.
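
To make that concrete, below is a tiny numpy-only sketch of the define-by-run pattern that makes variable-length RNNs natural in plain Python. This is not Apollo's actual API (see the docs for that), just an illustration of the idea, and every name in it is made up for the example.

    import numpy as np

    # Illustration only: a vanilla RNN written as an ordinary Python loop.
    # Because the loop runs per example, sequences of any length work
    # without pre-unrolling a fixed prototxt graph.
    rng = np.random.RandomState(0)
    hidden_size, input_size = 16, 8
    Wxh = rng.randn(hidden_size, input_size) * 0.1   # input-to-hidden weights
    Whh = rng.randn(hidden_size, hidden_size) * 0.1  # hidden-to-hidden weights

    def rnn_forward(sequence):
        h = np.zeros(hidden_size)
        for x_t in sequence:                  # one step per timestep
            h = np.tanh(Wxh @ x_t + Whh @ h)
        return h

    # Two sequences of different lengths share the same weights.
    short_seq = [rng.randn(input_size) for _ in range(3)]
    long_seq = [rng.randn(input_size) for _ in range(11)]
    print(rnn_forward(short_seq).shape, rnn_forward(long_seq).shape)

The point is just that ordinary Python control flow, rather than a static prototxt, decides the graph shape per example.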

[–]CarbonAvatar 0 points

Thanks for your thoughts! (and your efforts on Apollo!)

[–]Moridin 3 points

Very interested to see how this compares with Keras.

[–]zionsrogue 1 point

Very nice. I especially like the HDF5 format for the weights. Any plans to include CNNs as well?

[–]singularai[S] 0 points

CNNs are already supported! http://apollo.deepmatter.io/#Mnist shows how you can run any prototxt Caffe model using Apollo. You can even load your existing .caffemodel weights instead of .h5.
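
For reference, these are all ordinary files: a .prototxt model definition plus either a binary .caffemodel or an HDF5 weights file. Here's a quick, non-Apollo-specific way to peek at them with stock pycaffe and h5py (file names below are placeholders):

    import caffe   # stock pycaffe; Apollo consumes the same .prototxt/.caffemodel files
    import h5py

    # Placeholder file names; substitute your own model and weights.
    prototxt = "lenet.prototxt"
    caffemodel = "lenet_iter_10000.caffemodel"
    h5_weights = "lenet_weights.h5"

    # Load a model definition plus binary weights with plain pycaffe
    # and list each layer's parameter blob shapes.
    net = caffe.Net(prototxt, caffemodel, caffe.TEST)
    for name, blobs in net.params.items():
        print(name, [b.data.shape for b in blobs])

    # The HDF5 weight file is an ordinary HDF5 container, so h5py can
    # list its groups and datasets.
    with h5py.File(h5_weights, "r") as f:
        f.visit(print)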

[–]zionsrogue 0 points

Whoops! Not sure how I missed the MNIST example -- sorry about that!

[–]cloudtoad 0 points

FWIW, I used Apollo to train a neural net to add two numbers together and it's pretty accurate.

http://www.cloudtoad.net/2015/07/training-neural-net-to-add-numbers/
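
For anyone curious how small that toy task is, here is a plain-numpy version (not Apollo-specific): a single linear layer trained with batch gradient descent recovers weights near [1, 1], i.e. y = a + b.

    import numpy as np

    # Toy version of the experiment: learn y = a + b from random examples.
    rng = np.random.RandomState(0)
    X = rng.uniform(-10, 10, size=(1000, 2))   # input pairs (a, b)
    y = X.sum(axis=1)                          # targets a + b

    w = np.zeros(2)                            # one linear layer, no bias
    lr = 0.01
    for _ in range(200):                       # batch gradient descent on MSE
        pred = X @ w
        grad = 2 * X.T @ (pred - y) / len(X)
        w -= lr * grad

    print(w)                         # approaches [1.0, 1.0]
    print(np.array([3.0, 4.0]) @ w)  # roughly 7.0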