- ralg is a free constrained NLP/NSP solver written in Python and NumPy by Dmitrey.
- It is distributed with the OpenOpt framework and is one of OpenOpt's main features.
- The BSD license of ralg (and OpenOpt) allows it to be used in both free and proprietary code.
- The solver can handle user-provided first derivatives df, dc, dh (a subgradient is acceptable as well).
- The current implementation is intended for medium-scale problems (nVars up to 1000..1500; nConstraints: any).
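For illustration, passing the objective, a constraint, and their user-provided derivatives to ralg through OpenOpt's NLP class looks roughly like this. This is a sketch based on OpenOpt's documented interface; the exact keyword names may differ between OpenOpt versions, and OpenOpt itself must be installed for it to run:

```python
from numpy import asarray, zeros
from openopt import NLP  # requires the OpenOpt framework

f  = lambda x: (x ** 2).sum()                    # objective
df = lambda x: 2 * x                             # its first derivative
c  = lambda x: x[0] + x[1] - 1.0                 # inequality constraint c(x) <= 0
dc = lambda x: asarray([1.0, 1.0] + [0.0] * 8)   # derivative of c w.r.t. x

# equality constraints h and their derivatives dh can be passed analogously
p = NLP(f, zeros(10), df=df, c=c, dc=dc)
r = p.solve('ralg')                              # select the ralg solver
print(r.xf, r.ff)                                # solution point and objective value
```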
- The solver handles ill-conditioned, piecewise-linear, polynomial, non-smooth, and noisy problems rather well.
- Another well-known r-algorithm implementation is SolvOpt; it has Fortran 90, C, and MATLAB APIs; see also the SolvOpt paper.
- ralg is based on the r-algorithm with adaptive space dilation, invented by the Ukrainian academician Naum Z. Shor (//that was my teacher// - D.) and Nikolay Zhurbenko. For details see the paper N. Z. Shor and N. G. Zhurbenko, "The minimization method using space dilatation in direction of difference of two sequential gradients," Kibernetika, No. 3, 51-59 (1971).
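As a rough illustration of the method named above, here is a minimal NumPy sketch of an r-algorithm iteration: a normalized subgradient step taken in a transformed space, followed by dilation of that space along the difference of two successive subgradients. All parameter names and the geometric step-size schedule are illustrative assumptions of this sketch; it is not OpenOpt's actual ralg code.

```python
import numpy as np

def ralg_sketch(f, subgrad, x0, alpha=2.0, h0=1.0, q=0.97, n_iter=800):
    # Sketch of Shor's r-algorithm with space dilation in the direction
    # of the difference of two successive subgradients.
    n = len(x0)
    B = np.eye(n)                       # accumulated space-dilation transform
    x = np.asarray(x0, dtype=float)
    g = subgrad(x)
    h = h0
    best_x, best_f = x.copy(), f(x)
    for _ in range(n_iter):
        gt = B.T @ g                    # subgradient seen in the dilated space
        norm = np.linalg.norm(gt)
        if norm < 1e-12:                # (sub)gradient vanished: stop
            break
        x = x - h * (B @ gt) / norm     # normalized step in the dilated space
        g_new = subgrad(x)
        r = B.T @ (g_new - g)           # dilation direction: subgradient difference
        rn = np.linalg.norm(r)
        if rn > 1e-12:
            xi = r / rn
            # dilate the space by factor alpha along xi:
            # B <- B (I + (1/alpha - 1) xi xi^T)
            B = B + (1.0 / alpha - 1.0) * np.outer(B @ xi, xi)
        g = g_new
        h *= q                          # crude geometric step decrease (real codes adapt h)
        fx = f(x)
        if fx < best_f:
            best_x, best_f = x.copy(), fx
    return best_x, best_f

# demo on a non-smooth function f(x) = |x1| + 3|x2| (minimum 0 at the origin)
f = lambda x: abs(x[0]) + 3.0 * abs(x[1])
sg = lambda x: np.array([np.sign(x[0]), 3.0 * np.sign(x[1])])
x_best, f_best = ralg_sketch(f, sg, [4.0, -3.0])
```

On such non-smooth test functions the dilation step suppresses the zigzagging across the kinks that plain subgradient descent suffers from.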
- Most of the optimization software written by our department of optimization (Cybernetics Institute, Ukrainian NAS) is based on the r-algorithm (mostly Fortran code).
- The current Python implementation is a mix of the classic r-algorithm and filter methods. I did not intend to implement filter methods in ralg, but that is what it became after a series of changes made for my dissertation.
- Currently both ralg and SolvOpt are direct solvers, while some other implementations solve a dual subproblem instead.
- Known issue: the current implementation has experimental handling of linear equality constraints that is still far from perfect (especially when the number of such constraints is large).
- The r-algorithm is a heuristic; its convergence has not been proved even for smooth convex unconstrained functions.
- The current implementation stores a matrix of size nVars x nVars in memory, and each iteration requires on the order of nVars^2 multiplication operations.
- There are other (Fortran-written) r-algorithm modifications that allow storing a matrix of size nVars x m instead (where m << nVars), with a further decrease in the number of multiplications per iteration.
- improve handling of:
  - linear equality constraints (when a large number of them is present)
  - non-linear equality constraints
- some other changes to be done
- OpenOpt NLP example with ralg solver
- FuncDesigner NLP example (with automatic differentiation) using ralg solver
- interalg - global solver for NLP/NSP/GLP/IP/SNLE with user-provided required accuracy
- gsubg - large-scale solver for finding a local extremum of convex NLP/NSP problems with user-provided required accuracy