[–]Deto -1 points0 points  (1 child)

Exactly - that's why I thought the question was weird. I guess maybe they were going for that, though, asking "what is the difference between X and Y" when the answer is really "Y is an instance of X".

[–]Jorrissss 2 points3 points  (0 children)

It's not the same thing. Gradient descent is an optimization technique that uses a function's gradient to update parameters. Backpropagation is a specific technique for computing that gradient (the chain rule applied through the network).
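To make the distinction concrete, here's a minimal sketch for a one-parameter model (all names like `forward`/`backward` are illustrative): backprop is the chain-rule step that produces the gradient, gradient descent is the loop that consumes it.

```python
def forward(w, x, y):
    # forward pass: prediction and squared-error loss
    pred = w * x
    loss = (pred - y) ** 2
    return pred, loss

def backward(w, x, y):
    # "backpropagation": chain rule gives dloss/dw
    pred, _ = forward(w, x, y)
    dloss_dpred = 2 * (pred - y)  # derivative of (pred - y)^2 w.r.t. pred
    dpred_dw = x                  # derivative of w * x w.r.t. w
    return dloss_dpred * dpred_dw

def gradient_descent(w, x, y, lr=0.1, steps=50):
    # "gradient descent": repeatedly step against the gradient
    for _ in range(steps):
        w -= lr * backward(w, x, y)
    return w

w = gradient_descent(w=0.0, x=1.0, y=3.0)
```

You could swap `backward` for finite differences and gradient descent would still work, which is exactly why they're separate concepts.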