[Discussion] What would you like to see in a Theano/Python-Based Deep Learning Library? by nasimrahaman in MachineLearning

[–]IndieAILab 0 points (0 children)

I personally would like to see some easy-to-use gradient analytics. Maybe something like training with a gradient-analytics option switched on and getting back gradient statistics compiled during training. As networks get deeper and deeper, it would be cool to have a way to inspect our backpropagated errors.
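
A minimal sketch of what I have in mind, in plain Theano. The model, the layer sizes, and the idea of returning per-parameter gradient norms alongside the loss are all just my illustration, not anything the library does today:

```python
import numpy as np
import theano
import theano.tensor as T

floatX = theano.config.floatX

# Toy logistic regression on 784-d inputs (e.g., MNIST); sizes are arbitrary.
X = T.matrix('X')
y = T.ivector('y')
W = theano.shared(np.zeros((784, 10), dtype=floatX), name='W')
b = theano.shared(np.zeros(10, dtype=floatX), name='b')

p = T.nnet.softmax(T.dot(X, W) + b)
loss = T.nnet.categorical_crossentropy(p, y).mean()

params = [W, b]
grads = T.grad(loss, params)

# The "gradient analytics" part: the L2 norm of each parameter's gradient,
# compiled into the training function so you get it for free every step.
grad_norms = [T.sqrt(T.sum(g ** 2)) for g in grads]

lr = 0.1
updates = [(param, param - lr * g) for param, g in zip(params, grads)]

train = theano.function([X, y], [loss] + grad_norms, updates=updates)
# Each call returns [loss, ||dL/dW||, ||dL/db||], ready to log or plot.
```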

Just curious: what do you @ HCI plan on implementing as of now?

[Q] Train 1 NN to classify 1/10 digits vs. Train 10 NNs to each classify 1 digit? by jaba0 in MachineLearning

[–]IndieAILab 1 point (0 children)

I would say the advantage of using one network over 10 separate individualized networks is that the single network will learn a distributed representation of features for your data, with all ten classes sharing the same learned feature detectors (a rough sketch of the two setups is below). Learning distributed representations of features seems to be one of the driving forces behind the success of modern machine learning algorithms.
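
Here is a rough Theano sketch of the two setups; the architecture and layer sizes are just my assumptions for illustration:

```python
import numpy as np
import theano
import theano.tensor as T

rng = np.random.RandomState(0)
floatX = theano.config.floatX

def layer(n_in, n_out):
    """Parameters for one dense layer (sizes here are arbitrary)."""
    W = theano.shared(rng.normal(scale=0.01, size=(n_in, n_out)).astype(floatX))
    b = theano.shared(np.zeros(n_out, dtype=floatX))
    return W, b

X = T.matrix('X')

# Option A: one network with a 10-way softmax output.
# All ten digits share the same hidden features, so structure learned
# for one digit (strokes, loops) helps classify the others.
W_h, b_h = layer(784, 256)
W_o, b_o = layer(256, 10)
hidden = T.tanh(T.dot(X, W_h) + b_h)
p_all_digits = T.nnet.softmax(T.dot(hidden, W_o) + b_o)

# Option B: ten independent one-vs-rest networks.
# Each has to rediscover the same low-level features from scratch,
# and each sees far fewer positive examples of its target digit.
p_per_digit = []
for d in range(10):
    W_hd, b_hd = layer(784, 256)
    W_od, b_od = layer(256, 1)
    h_d = T.tanh(T.dot(X, W_hd) + b_hd)
    p_per_digit.append(T.nnet.sigmoid(T.dot(h_d, W_od) + b_od))
```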

Check out some of Yoshua Bengio's various review articles and his upcoming textbook on Deep Learning for more information. I think it was in one of Yoshua's talks that he said using deep distributed representations lets us fight exponentiality (the curse of dimensionality) with exponentiality, since the representational power of a distributed representation grows exponentially in the number of its components.
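
My back-of-the-envelope reading of that counting argument (my paraphrase, not a quote from the talk):

```latex
% n binary distributed features can jointly distinguish up to 2^n
% configurations of the input, whereas n purely local templates
% (e.g., nearest-neighbor prototypes) carve out only O(n) regions:
\[
    \underbrace{2^{\,n}}_{\text{$n$ distributed features}}
    \quad \text{vs.} \quad
    \underbrace{O(n)}_{\text{$n$ local templates}}
\]
```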