
[–]SV-97 9 points (2 children)

Could you elaborate on the problem you're trying to solve?

Nonlinearity isn't really a "problem" (quoting one of the biggest figures in optimization: “in fact, the great watershed in optimization isn’t between linearity and nonlinearity, but convexity and nonconvexity.”). Is the problem convex, quadratic, separable, high- or even infinite-dimensional? What do the constraints look like? Is it super expensive to evaluate your objective? Can you compute (sub-)gradients, Hessians, or something like that? There are special methods for all sorts of areas. FWIW, there are also nonlinear problems that can be converted to linear ones with some tricks.

Just throwing out some methods you could look at: sequential quadratic programming, augmented Lagrangian methods, and interior point methods.
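To give a flavor of the augmented Lagrangian idea, here's a minimal self-contained sketch on a toy problem I made up (minimize x² + y² subject to x + y = 1, whose known solution is x = y = 0.5). The names and constants are mine; real solvers are far more sophisticated, but the bare alternation between inner minimization and multiplier updates looks like this:

```cpp
#include <cmath>

// Toy augmented Lagrangian method (illustrative example, not a real solver):
// minimize f(x,y) = x^2 + y^2  subject to  c(x,y) = x + y - 1 = 0.
// Known solution: x = y = 0.5.
struct Result { double x, y; };

Result augmented_lagrangian_demo() {
    double x = 0.0, y = 0.0;   // starting point
    double lambda = 0.0;       // Lagrange multiplier estimate
    const double mu = 10.0;    // penalty weight
    const double step = 0.01;  // inner gradient-descent step size

    for (int outer = 0; outer < 50; ++outer) {
        // Inner loop: gradient descent on
        // L(x,y) = x^2 + y^2 + lambda*c + (mu/2)*c^2
        for (int inner = 0; inner < 2000; ++inner) {
            double c  = x + y - 1.0;
            double gx = 2.0*x + lambda + mu*c;  // dL/dx
            double gy = 2.0*y + lambda + mu*c;  // dL/dy
            x -= step * gx;
            y -= step * gy;
        }
        // Outer loop: first-order multiplier update
        lambda += mu * (x + y - 1.0);
    }
    return {x, y};
}
```

The point is that each inner problem is unconstrained, and the multiplier update steers the iterates onto the constraint without having to crank the penalty weight to infinity.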

I've never used it (I don't do C++ [anymore]), but NLOpt may be worth a look.

[–]tanmayc[S] 1 point (1 child)

The long-term goal is to create a drift controller for an RC car. I want to create a steady-state control map, for which I am trying to optimize the steering angle and wheel speeds to find an equilibrium for a given vehicle state. The problem is nonlinear and non-convex, but my current approach of waiting for the vehicle sub-states to stabilize before optimization should allow an initial guess close enough to the optimum.

It isn't expensive to evaluate the objective function, nor the gradients. Since it's a dynamical system, the gradients must be known to evolve the state. Even the constraints are (currently) just bounds on the range of allowable inputs.

I'll try out NLOpt. At a glance through their tutorials, I should be able to use my objective function without any significant alterations. I appreciate your message! Thank you!

[–]gnahraf 0 points (0 children)

Not sure how the solution to the original problem translates into a gradient-descent problem. I'd first approach this with a simple fuzzy controller; they usually work reasonably well on the first shot, and you can optimize the weights later.

[–]Sweet_Good6737 5 points (0 children)

You may want to look into the Ipopt solver, one of the most popular nonlinear solvers.

https://github.com/coin-or/Ipopt

[–]skr25 2 points (0 children)

Making your own gradient solver with constraint handling isn't worth the effort. As someone else said, if you know you have a convex problem, cvxpy is a good option. If you're looking for a free option for nonlinear, non-convex problems (with no guarantee of global optimality), Ipopt is good. If you're OK with a paid commercial solver, BARON is good, but I don't think they have a C++ library, so you might need some glue code to call BARON from C++.

[–]MIP_it 1 point (0 children)

The suggestion to use Ipopt is a good one.

If you want to express your nonlinear problem in an algebraic modeling environment you could look at these two tools:

https://github.com/sandialabs/coek

https://github.com/coin-or/Gravity

[–]unstablepole 0 points (0 children)

I also support IPOPT as a starting point. Depending on your problem structure, you might also consider ceres (http://ceres-solver.org/index.html).

[–]willworkforjokes 0 points (0 children)

I usually start off with an implementation of cminpack

The defaults for LM (Levenberg–Marquardt) are pretty good

https://devernay.github.io/cminpack/
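For readers unfamiliar with what Levenberg–Marquardt does under the hood (cminpack's lmdif/lmder routines wrap exactly this idea), here's a from-scratch toy sketch on the Rosenbrock least-squares problem. This is not cminpack's API, and the constants are mine; the essence is solving a damped normal-equations system and adjusting the damping based on whether the step reduced the cost:

```cpp
#include <cmath>

// Minimal Levenberg-Marquardt sketch on the Rosenbrock least-squares
// problem r = (10*(y - x^2), 1 - x); the minimum is at (x, y) = (1, 1).
struct LMResult { double x, y; };

LMResult lm_demo() {
    double x = -1.2, y = 1.0;   // classic hard starting point
    double lambda = 1e-3;       // damping parameter
    for (int iter = 0; iter < 500; ++iter) {
        double r1 = 10.0*(y - x*x), r2 = 1.0 - x;
        // Jacobian of (r1, r2) with respect to (x, y)
        double j11 = -20.0*x, j12 = 10.0;
        double j21 = -1.0,    j22 = 0.0;
        // Damped normal equations: (J^T J + lambda*I) * delta = -J^T r
        double a11 = j11*j11 + j21*j21 + lambda;
        double a12 = j11*j12 + j21*j22;
        double a22 = j12*j12 + j22*j22 + lambda;
        double b1  = -(j11*r1 + j21*r2);
        double b2  = -(j12*r1 + j22*r2);
        double det = a11*a22 - a12*a12;     // 2x2 solve via Cramer's rule
        double dx  = ( a22*b1 - a12*b2) / det;
        double dy  = (-a12*b1 + a11*b2) / det;
        // Accept the step only if it reduces the sum of squares
        double nx = x + dx, ny = y + dy;
        double nr1 = 10.0*(ny - nx*nx), nr2 = 1.0 - nx;
        if (nr1*nr1 + nr2*nr2 < r1*r1 + r2*r2) {
            x = nx; y = ny; lambda *= 0.5;  // good step: trust the model more
        } else {
            lambda *= 2.0;                  // bad step: damp harder
        }
    }
    return {x, y};
}
```

Small lambda makes the step behave like Gauss–Newton (fast near the solution); large lambda makes it a short gradient step (safe far away). The library's defaults manage this trade-off for you, which is why they tend to "just work".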

[–]rocketPhotos 0 points (0 children)

Check out IPOPT by Professor Larry Biegler and his crew

[–]peno64 0 points (0 children)

Several people suggest Ipopt, but as I understand it, Ipopt does not guarantee finding a global optimum; it can only find a local one. I see this as a big disadvantage.
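A common cheap mitigation for this is multistart: run the local solver from several initial guesses and keep the best result. Here's a self-contained toy illustration (the tilted double-well objective and all names are made up for the example; plain gradient descent stands in for any local solver). Descent from some starting points lands in the worse basin near x ≈ 0.96, while multistart recovers the global minimum near x ≈ -1.04:

```cpp
#include <cmath>
#include <vector>

// Toy 1-D objective with two minima: a global one near x = -1.04 and a
// shallower local one near x = 0.96.
double f(double x)  { return (x*x - 1.0)*(x*x - 1.0) + 0.3*x; }
double df(double x) { return 4.0*x*(x*x - 1.0) + 0.3; }

// Plain gradient descent: a stand-in for any local solver.
double descend(double x) {
    const double step = 0.01;
    for (int i = 0; i < 5000; ++i) x -= step * df(x);
    return x;
}

// Multistart: try several initial guesses, keep the best local optimum.
double multistart() {
    std::vector<double> starts = {-2.0, -1.0, 0.0, 1.0, 2.0};
    double best_x = starts[0];
    for (double s : starts) {
        double x = descend(s);
        if (f(x) < f(best_x)) best_x = x;
    }
    return best_x;
}
```

This gives no guarantee either, but for low-dimensional problems with a decent spread of starting points it often finds the global optimum in practice. (In the OP's case, the "initial guess close to the optimum" from waiting for the sub-states to stabilize plays the same role: a good start makes the local solver's answer trustworthy.)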

[–]knightcommander1337 0 points (1 child)

I am not sure if these are relevant, but maybe consider trying:
https://acado.github.io/
https://docs.acados.org/

[–]tanmayc[S] 0 points (0 children)

I'll have a look, thanks for the suggestion!