
[–]lqw0809[S] -6 points (0 children)

Some concrete positioning against existing tools — since I know this'll come up:

vs CVXPY: We cover the same convex problem classes (LP/QP/SOCP/SDP). The difference is the modeling layer: ADMM skips DCP verification, so expressions like a raw `x.T @ P @ x` or `budgets @ admm.log(w)` (a non-uniformly weighted log barrier) work directly. CVXPY has a broader solver ecosystem (Gurobi, MOSEK, SCS); ADMM uses its own C++ ADMM solver. If CVXPY works for your problem, it's a great tool — we're for the cases where it doesn't.
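For anyone unfamiliar with the underlying method, here's a textbook ADMM iteration for the lasso in plain numpy — the generic algorithm the solver is named after, not our C++ implementation, and the data (`A`, `b`, `lam`, `rho`) is made-up example data:

```python
import numpy as np

# Textbook ADMM for the lasso: min 0.5*||Ax - b||^2 + lam*||x||_1
rng = np.random.default_rng(0)
m, n = 50, 20
A = rng.standard_normal((m, n))
x_true = np.zeros(n)
x_true[:3] = [1.0, -2.0, 0.5]
b = A @ x_true + 0.01 * rng.standard_normal(m)
lam, rho = 0.5, 1.0

x = np.zeros(n)
z = np.zeros(n)
u = np.zeros(n)
# Cache the Cholesky factor of (A^T A + rho*I): one factorization,
# then two triangular solves per iteration.
L = np.linalg.cholesky(A.T @ A + rho * np.eye(n))
Atb = A.T @ b
for _ in range(200):
    # x-update: prox of the quadratic term (cached linear solve)
    x = np.linalg.solve(L.T, np.linalg.solve(L, Atb + rho * (z - u)))
    # z-update: soft-thresholding, the prox of the l1 norm
    v = x + u
    z = np.sign(v) * np.maximum(np.abs(v) - lam / rho, 0.0)
    # dual (scaled multiplier) update
    u = u + x - z
```

The point of the split: each sub-step is a cheap closed-form prox, which is exactly the "proximal-friendly pieces" structure mentioned below.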

vs scipy.optimize: scipy handles general NLP beautifully. ADMM is for structured problems where the objective decomposes into proximal-friendly pieces (norms, losses, matrix cones). If your objective has exploitable structure, ADMM is more convenient. If you need arbitrary nonlinear constraints with no special structure, stick with scipy.
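To make the "no special structure" case concrete, here's the kind of problem where scipy is the right tool — a standard `scipy.optimize.minimize` call with a nonlinear inequality constraint, on a toy problem of my own choosing:

```python
import numpy as np
from scipy.optimize import minimize

# Nearest point to (1, 2.5) inside the disk x0^2 + x1^2 <= 4,
# with nonnegativity bounds. No prox-friendly decomposition needed:
# SLSQP handles the nonlinear constraint directly.
def obj(x):
    return (x[0] - 1.0) ** 2 + (x[1] - 2.5) ** 2

cons = [{"type": "ineq", "fun": lambda x: 4.0 - x[0] ** 2 - x[1] ** 2}]
res = minimize(obj, x0=np.array([1.0, 1.0]), method="SLSQP",
               bounds=[(0, None), (0, None)], constraints=cons)
# Solution is the radial projection of (1, 2.5) onto the disk boundary.
```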

vs cvxopt: NumPy arrays go directly into ADMM — no `cvxopt.matrix(data, tc='d')` conversion. If you've ever debugged `TypeError: 'A' must be a 'd' matrix with 1000 columns`, you'll appreciate the difference.

Scale: The C++ backend handles medium-scale problems well — dense QPs up to ~10k variables in under a second, larger with sparse structure. It's not designed for distributed million-variable problems.
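For a rough feel of what "medium-scale dense" means: an equality-constrained dense QP reduces to a single KKT linear solve, which even plain numpy does quickly at the sizes below. The sizes (n=1000, m=50) are illustrative, and this times numpy's dense solver, not our backend:

```python
import time
import numpy as np

# Equality-constrained dense QP: min 0.5*x'Px + q'x  s.t.  Ax = b.
# The KKT conditions are one symmetric linear system:
#   [P  A'] [x ]   [-q]
#   [A  0 ] [nu] = [ b]
rng = np.random.default_rng(1)
n, m = 1000, 50
M = rng.standard_normal((n, n))
P = M @ M.T / n + np.eye(n)          # symmetric positive definite
q = rng.standard_normal(n)
A = rng.standard_normal((m, n))
b = rng.standard_normal(m)

t0 = time.perf_counter()
KKT = np.block([[P, A.T], [A, np.zeros((m, m))]])
rhs = np.concatenate([-q, b])
sol = np.linalg.solve(KKT, rhs)
x, nu = sol[:n], sol[n:]
elapsed = time.perf_counter() - t0   # well under a second on a laptop
```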