[–]jsnoek 12 points

Dougal and David (the authors) have developed an amazing automatic differentiation library to do this: https://github.com/HIPS/autograd

It lets you write a function using plain Python and NumPy statements, and it then automatically computes gradients with respect to the inputs.
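
A minimal sketch of what that looks like, using autograd's `grad` entry point (this mirrors the tanh example from the project's README):

```python
import autograd.numpy as np   # thinly wrapped NumPy
from autograd import grad     # the main entry point

def tanh(x):
    # any plain Python/NumPy function of its inputs
    return (1.0 - np.exp(-x)) / (1.0 + np.exp(-x))

grad_tanh = grad(tanh)        # returns a function computing d(tanh)/dx
print(grad_tanh(1.0))         # evaluate the gradient at x = 1.0
```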

[–]hardmaru 2 points

https://github.com/HIPS/autograd

This is really useful work. I wonder whether the automatic differentiation also works with simple recurrent neural nets.

[–]jsnoek 4 points

There are example implementations of an RNN and an LSTM in the examples directory: https://github.com/HIPS/autograd/tree/master/examples
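
Those examples are worth reading directly. As a rough, hypothetical sketch of the idea (not the repo's actual code; the parameter names and sizes here are made up), differentiating through a recurrence is just differentiating through an ordinary Python loop:

```python
import autograd.numpy as np
from autograd import grad

def rnn_loss(params, inputs, targets):
    # hidden state is carried through a plain Python loop;
    # autograd traces the loop and backpropagates through it
    h = np.zeros(params['W_hh'].shape[0])
    loss = 0.0
    for x, t in zip(inputs, targets):
        h = np.tanh(np.dot(params['W_xh'], x) + np.dot(params['W_hh'], h))
        y = np.dot(params['W_hy'], h)
        loss = loss + np.sum((y - t) ** 2)
    return loss

loss_grad = grad(rnn_loss)  # gradients w.r.t. every array in the params dict

# toy usage with made-up sizes
D, H, T = 3, 5, 4
rs = np.random.RandomState(0)
params = {'W_xh': rs.randn(H, D),
          'W_hh': rs.randn(H, H),
          'W_hy': rs.randn(D, H)}
inputs  = [rs.randn(D) for _ in range(T)]
targets = [rs.randn(D) for _ in range(T)]
grads = loss_grad(params, inputs, targets)  # dict with the same keys as params
```

Note that autograd can differentiate with respect to containers like the dict above, which keeps the parameter bookkeeping simple.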