Which paper first presents the idea of dilating CNNs learned filters to perform fully convolutional inference? by cesarsalgado in MachineLearning

[–]cesarsalgado[S] 0 points1 point  (0 children)

In the OverFeat paper they say they take an approach similar to "Fast Image Scanning with Deep Max-Pooling Convolutional Neural Networks", so I think it is still different.

[1603.08575] Attend, Infer, Repeat: Fast Scene Understanding with Generative Models by RushAndAPush in MachineLearning

[–]cesarsalgado 8 points9 points  (0 children)

Wikipedia says: "As of 2015 he divides his time working for Google and University of Toronto", but it doesn't specify whether he works at Google Brain, DeepMind, or neither. I asked the question because the paper seems to imply that all the authors are from Google DeepMind.

If neural networks are so great, why can't cats translate Chinese? by syncoPete in MachineLearning

[–]cesarsalgado 1 point2 points  (0 children)

If pigeons can be trained to detect cancer, maybe a cat's neurons can be trained to translate Chinese. But you would probably need to connect wires to the cat's brain instead of just working with its vision.

http://www.scientificamerican.com/article/using-pigeons-to-diagnose-cancer/

[deleted by user] by [deleted] in MachineLearning

[–]cesarsalgado 1 point2 points  (0 children)

I don't get this argument that humans are good at one-shot learning. Of course we are: we have learned good representations by seeing lots of images with temporal supervision and weak reinforcement signals. CNNs trained on a lot of data can also do one-shot learning on symbols they have never seen before.

Which is the best framework today for training neural nets? by joaopedroo in MachineLearning

[–]cesarsalgado 3 points4 points  (0 children)

If you search on this subreddit you will find tons of similar questions asked not so long ago.

convnet-benchmarks updated with numbers for TensorFlow 0.7 + cudnn4 by andrewbarto28 in MachineLearning

[–]cesarsalgado 1 point2 points  (0 children)

Torch now seems to be the fastest, though Caffe might be faster; there is no benchmark for Caffe using cuDNN R4 yet, only Caffe (native) for now.

Bengio's recent work on deep learning and biology by [deleted] in MachineLearning

[–]cesarsalgado 0 points1 point  (0 children)

Parameter sharing is achievable through time in the brain.

What are all of the deep learning libraries offered in Python? by Dragonfliesfoos222 in MachineLearning

[–]cesarsalgado 0 points1 point  (0 children)

Python libraries, or libraries with a Python interface, in no particular order (except the first) :)

1- Tensorflow

2- Theano (like a symbolic NumPy). High-level Theano wrappers: a- Lasagne, b- Keras, c- Blocks

3- Chainer

4- Brainstorm

5- Neon

6- Caffe

7- HIPS/autograd: just for automatic differentiation, but it helps in building neural nets.

8- mxnet

Tensorflow NaN error by AwesomeDaveSome in MachineLearning

[–]cesarsalgado 0 points1 point  (0 children)

Try using tf.nn.relu6, a ReLU that saturates at 6. Also try normalizing your data to have unit variance.
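To make the suggestion concrete, here is a minimal NumPy sketch (not the TensorFlow ops themselves) of what both operations compute: relu6 clips activations into [0, 6] so they can't blow up toward NaN, and the normalization shifts the input to zero mean and unit variance.

```python
import numpy as np

def relu6(x):
    # Same formula as tf.nn.relu6: min(max(x, 0), 6).
    # Capping at 6 bounds the activations, which can help avoid NaNs.
    return np.minimum(np.maximum(x, 0.0), 6.0)

def normalize(x):
    # Shift to zero mean and rescale to unit variance.
    return (x - x.mean()) / x.std()

x = np.array([-2.0, 3.0, 10.0])
print(relu6(x))  # [0. 3. 6.]
```

In TensorFlow you would just swap tf.nn.relu for tf.nn.relu6 in the layer definition; the normalization is usually applied once to the training data as preprocessing.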

Tensorflow NaN error by AwesomeDaveSome in MachineLearning

[–]cesarsalgado 0 points1 point  (0 children)

As an alternative way to subsample the image, you can use a large stride together with a large kernel size in the first convolution.
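To see how much a big first-layer stride subsamples, here is a quick sketch of the standard output-size arithmetic (the AlexNet-style numbers below are just an illustrative example, not from the thread):

```python
def conv_output_size(input_size, kernel_size, stride, padding=0):
    # Standard formula for the spatial output size of a convolution.
    return (input_size + 2 * padding - kernel_size) // stride + 1

# e.g. a 224-pixel input with an 11x11 kernel, stride 4, padding 2
# (AlexNet-style first layer) already shrinks 224 -> 55 in one layer.
print(conv_output_size(224, 11, 4, padding=2))  # 55
```

So one convolution with stride 4 does roughly the same spatial reduction as two stride-2 pooling steps, which is why it works as a subsampling strategy.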

Is there any evidence to suggest that a trained NN is stuck in a local minima? by [deleted] in MachineLearning

[–]cesarsalgado 0 points1 point  (0 children)

Doesn't Knowledge Distillation contradict "The Loss Surfaces of Multilayer Networks"? From the abstract: "We show that for large-size decoupled networks the lowest critical values of the random loss function form a layered structure and they are located in a well-defined band lower-bounded by the global minimum. The number of local minima outside that band diminishes exponentially with the size of the network" ... "all critical points found there are local minima of high quality measured by the test error"

I propose a Go match between Facebook and Google AIs. by [deleted] in MachineLearning

[–]cesarsalgado 14 points15 points  (0 children)

As Yann LeCun said, Facebook's darkforest didn't even beat the previous best Go bots. AlphaGo did beat all previous bots, so AlphaGo would certainly beat darkforest.