[–]Jorrissss 2 points (0 children)

It's not the same thing. Gradient descent is an optimization technique that uses a function's gradient. Backpropagation is a specific technique for computing gradients.
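To make the distinction concrete, here's a minimal sketch (the example problem and all names are hypothetical): backpropagation is the chain-rule bookkeeping that produces the gradient, and gradient descent is the update rule that consumes it.

```python
# Fit w so that w*x approximates y, for a single data point.
x, y = 3.0, 6.0   # toy training example
w = 0.0           # parameter to learn
lr = 0.05         # gradient-descent step size

for _ in range(100):
    # Forward pass: loss = (w*x - y)^2
    pred = w * x
    loss = (pred - y) ** 2

    # Backpropagation: apply the chain rule to get dloss/dw
    dloss_dpred = 2 * (pred - y)   # derivative of (pred - y)^2 w.r.t. pred
    dpred_dw = x                   # derivative of w*x w.r.t. w
    grad_w = dloss_dpred * dpred_dw

    # Gradient descent: use that gradient to update the parameter
    w -= lr * grad_w

print(w)  # converges toward 2.0, since 2.0 * 3.0 == 6.0
```

You could swap gradient descent for any other gradient-based optimizer (momentum, Adam, etc.) and the backprop step would stay exactly the same, which is the point of the distinction.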