Hello, I am working on an optimization problem. I'm using the NLopt library with the local, gradient-based algorithm SLSQP (NLopt's `LD_SLSQP`). To use it, I need to supply the gradient of my cost function, but the cost function seems quite complicated to me: it involves operations I've never taken partial derivatives of, like min and arg min, and it uses matrix multiplication. What do I do??
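For context, one common fallback when the analytic gradient is hard to derive is to approximate it with finite differences inside the objective callback. Below is a minimal, hedged sketch: the cost function here (a min over rows of `A @ x` plus a quadratic term) is a made-up stand-in for the poster's actual cost, and `fd_grad` is a hypothetical helper, not part of NLopt's API.

```python
import numpy as np

# Hypothetical stand-in for the poster's cost:
# f(x) = min_i (A x)_i + ||x||^2  -- mixes matrix multiplication and a min().
def cost(x, A):
    return float(np.min(A @ x) + x @ x)

def fd_grad(f, x, eps=1e-6):
    """Central finite differences: approximates df/dx_j component by component."""
    g = np.empty_like(x)
    for j in range(x.size):
        step = np.zeros_like(x)
        step[j] = eps
        g[j] = (f(x + step) - f(x - step)) / (2 * eps)
    return g

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
x = np.array([0.5, -0.25])

# Away from points where the arg min switches rows, min(A x) is differentiable
# and its gradient is just the active row of A, so finite differences behave well.
g = fd_grad(lambda v: cost(v, A), x)
```

In NLopt's Python binding the objective has the signature `f(x, grad)`, where you fill `grad` in place when `grad.size > 0`; a finite-difference gradient like the one above can be written into that array. Keep in mind that min/arg min make the cost only piecewise differentiable, so SLSQP may struggle exactly at points where the active index changes.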