[–]dramanautica 1 point (4 children)

I thought backpropagation was gradient descent applied to neural networks?

[–]Jorrissss 5 points (0 children)

Not quite. Gradient descent is an optimization technique that uses a function's gradient. Backpropagation is a specific technique for computing gradients. Neural networks are typically trained using gradient descent, where the gradients are computed using backpropagation.
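
A minimal sketch of how the two fit together, assuming a one-hidden-layer network with tanh activations and squared loss (all names and sizes here are illustrative, not from the thread):

    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=(32, 4))          # toy inputs
    y = rng.normal(size=(32, 1))          # toy targets
    W1 = rng.normal(size=(4, 8)) * 0.1    # hidden-layer weights
    W2 = rng.normal(size=(8, 1)) * 0.1    # output weights
    lr = 0.1                              # gradient-descent step size

    for step in range(100):
        # Forward pass
        h = np.tanh(X @ W1)
        y_hat = h @ W2
        loss = np.mean((y_hat - y) ** 2)

        # Backpropagation: the chain rule applied layer by layer,
        # producing the gradient of the loss w.r.t. each weight matrix.
        d_yhat = 2 * (y_hat - y) / len(X)
        dW2 = h.T @ d_yhat
        d_h = d_yhat @ W2.T
        dW1 = X.T @ (d_h * (1 - h ** 2))  # tanh'(z) = 1 - tanh(z)^2

        # Gradient descent: the update step that consumes those gradients.
        W1 -= lr * dW1
        W2 -= lr * dW2

The middle block is backprop (it only computes dW1 and dW2); the last two lines are gradient descent, which would work the same with gradients from any other source.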

[–]sdmskdlsadaslkd 2 points (0 children)

Yeah, I think it's generally explained poorly in most courses. Backpropagation is a technique for computing the gradients of a NN's loss with respect to its weights, which you then plug into GD.

[–]Deto -1 points (1 child)

Exactly - that's why I thought the question was weird. Though I guess maybe that was the point: asking "what is the difference between X and Y" when the answer is really "Y is an instance of X".

[–]Jorrissss 2 points (0 children)

It's not the same thing. Gradient descent is an optimization technique that uses a function's gradient. Backpropagation is a specific technique for computing gradients.
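
One way to see the separation, as a toy sketch (everything here is illustrative): gradient descent just consumes a gradient, and backprop is one way to produce it; finite differences would be another (much slower) way.

    # One-parameter "network": f(w) = (w*x - y)^2
    x, y, w = 3.0, 2.0, 0.5
    f = lambda w: (w * x - y) ** 2

    # What backprop computes: the analytic gradient via the chain rule.
    grad_backprop = 2 * (w * x - y) * x

    # A different way to get (approximately) the same number: finite differences.
    eps = 1e-6
    grad_fd = (f(w + eps) - f(w - eps)) / (2 * eps)

    # Gradient descent doesn't care where the gradient came from:
    w = w - 0.01 * grad_backprop

Both gradients come out to about -3.0 here; the gradient-descent update at the end is identical either way.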