GLP

From OpenOpt

Global Problems (GLP)
\mathbf{f(x) \to \min,\ \max} (global)
subject to
\mathbf{lb \le x \le ub}
\mathbf{A x \le b}
\mathbf{\forall\, i = 0, \ldots, I:\ c_i(x) \le 0}
\mathbf{\forall\, j = 0, \ldots, J:\ h_j(x) = 0}
\mathbf{f, c_i, h_j : R^n \to R}
\mathbf{x \in R^n}
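The formulation above maps directly to code: a candidate point is feasible when it satisfies the box bounds, the linear constraints and the nonlinear constraints. A minimal numpy sketch of such a feasibility check (an illustration only; the function and argument names are hypothetical, not part of the OpenOpt API):

```python
import numpy as np

def is_feasible(x, lb, ub, A, b, c_list, h_list, tol=1e-6):
    """Check x against the four GLP constraint types (illustrative helper)."""
    x = np.asarray(x, dtype=float)
    if np.any(x < lb - tol) or np.any(x > ub + tol):   # box bounds lb <= x <= ub
        return False
    if A is not None and np.any(A @ x > b + tol):      # linear constraints A x <= b
        return False
    if any(c(x) > tol for c in c_list):                # nonlinear inequalities c_i(x) <= 0
        return False
    if any(abs(h(x)) > tol for h in h_list):           # nonlinear equalities h_j(x) = 0
        return False
    return True
```

For example, with lb = (0, 0), ub = (2, 2), the single linear constraint x0 + x1 <= 2.5, c = [x0 - x1] and h = [x0 + x1 - 2], the point (0.5, 1.5) is feasible while (1.5, 0.5) violates c.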

Note! GLP solvers are slower than NLP/NSP solvers and cannot handle problems with a large number of variables (typical sizes are 1..10..100). If you know your problem has only one local (hence global) optimum, or you are only looking for a local one, use NLP/NSP solvers instead; they will solve the problem much faster.
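The note above is easy to see on a small example: a local descent method started in the wrong basin returns a nearby local minimum, while even crude global sampling over the box bounds finds the global one. A minimal numpy illustration (the test function, start point and step sizes are made up for this sketch):

```python
import numpy as np

# A 1-D function with several local minima on [-3, 3];
# its global minimum is near x = -0.31 (hypothetical example).
def f(x):
    return 0.2 * x**2 + np.sin(5.0 * x)

# Crude local descent (fixed-step gradient descent) started at x0 = 2.0
# gets trapped in the nearest basin, around x = 2.17:
x = 2.0
for _ in range(2000):
    g = (f(x + 1e-6) - f(x - 1e-6)) / 2e-6   # numerical derivative
    x -= 0.01 * g
local_min = x

# Dense sampling over the box bounds finds the global minimum instead:
grid = np.linspace(-3.0, 3.0, 100001)
global_min = grid[np.argmin(f(grid))]
```

Here f(local_min) is only about -0.05, while f(global_min) is about -0.98: a local solver is fast but answers a different question than a GLP solver.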



GLP solvers connected to OpenOpt:

interalg
    License: BSD
    Made by: Dmitrey
    Finite box bounds required: Yes, but they can be very large (since v. 0.36)
    Constraints handled: all
    Info: yields the exact optimum subject to the required accuracy fTol: |f - f*| < fTol. See also the interalg benchmark vs Direct, intsolver and the commercial BARON.
    Parameters: maxNodes = 15000; maxActiveNodes = 1500; fStart = None; dataHandling = 'auto' | 'raw' | 'sorted' (unestablished)

de
    License: BSD
    Made by: Stepan Hlushak (stepanko - at - gmail - dot - com), connected to OO by Dmitrey
    Finite box bounds required: Yes (since v. 0.37)
    Constraints handled: all
    Info: two-array differential evolution algorithm (Feoktistov V. Differential Evolution: In Search of Solutions; Springer, 2006, ISBN 0387368965). The code is included in OO, and Stepan has Subversion commit rights for it.
    Parameters: baseVectorStrategy = {'random'}/'best'; searchDirectionStrategy = {'random'}/'best'; differenceFactorStrategy = {'random'}/'constant'; population (default: 10*nVars); differenceFactor = 0.8; crossoverRate = 0.5; hndvi = 1 (hndvi will probably get a more informative name in the next OO release)

galileo
    License: GPL
    Made by: Donald Goodman
    Finite box bounds required: Yes
    Constraints handled: box bounds only
    Info: GA-based solver; cannot handle constraints other than box bounds. The code is included in OO. This solver doesn't work with Python 3.x yet.
    Parameters: population = 15; crossoverRate = 1.0; mutationRate = 0.05; useInteger = False (if useInteger = True, search for a solution with all-integer variables)

pswarm
    License: LGPL
    Made by: A. I. F. Vaz
    Finite box bounds required: Seems like no; linear constraints Ax <= b that restrict optimization to a finite volume may be enough
    Constraints handled: box bounds, linear inequalities
    Info: can handle a user-provided x0. Download and install pswarm from the URL mentioned, ensure the author-provided RunPSwarm.py works, and make sure pswarm_py.so is on PYTHONPATH. Note: when installing from sources you should use "make py_linear" to enable general linear constraints (Ax <= b). The documentation says pswarm is capable of parallel computation (via MPI), but it is unclear whether this applies to the Python API. The algorithm combines pattern search and particle swarm: it applies a directional direct search in the poll step (coordinate search in the pure box-bound case) and particle swarm in the search step. See also the recent paper on PSwarm published at optimization-online.org.
    Parameters: social = 0.5; cognitial = 0.5; fweight = 0.4; iweight = 0.9; size = 42; tol = 1e-5; ddelta = 0.5; idelta = 2.0

stogo
    License: LGPL
    Made by: Kaj Madsen
    Finite box bounds required: Yes
    Constraints handled: lb, ub
    Info: can use derivatives. Requires nlopt installed.
    Parameters: useRand = True (use the GD_STOGO or STOGO_RAND routine; see the nlopt documentation for details)

isres
    License: LGPL
    Made by: S. G. Johnson
    Finite box bounds required: Yes
    Constraints handled: all
    Info: isres = "Improved Stochastic Ranking Evolution Strategy", by S. G. Johnson. Requires nlopt ver >= 2.2 installed.
    Parameters: population (default 20*(nVars+1))

mlsl
    License: LGPL
    Made by: S. G. Johnson
    Finite box bounds required: Yes
    Constraints handled: lb, ub
    Info: mlsl = "Multi-Level Single-Linkage". Intended for smooth multi-extremum functions (derivatives are passed to the local optimizer). Requires nlopt ver >= 2.2 installed. G_MLSL_LDS is used with LD_TNEWTON_PRECOND_RESTART as the local optimizer.
    Parameters: population (number of local solvers, default 4)
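As a rough illustration of the differential evolution idea behind the de solver, here is a minimal plain-numpy rand/1/bin DE loop under box bounds. This is a sketch only, not the OpenOpt implementation; the names population, F and CR merely echo the population, differenceFactor and crossoverRate parameters listed above:

```python
import numpy as np

def de_minimize(f, lb, ub, population=40, F=0.8, CR=0.5, iters=200, seed=0):
    """Basic rand/1/bin differential evolution under box bounds (sketch)."""
    rng = np.random.default_rng(seed)
    n = len(lb)
    pop = rng.uniform(lb, ub, size=(population, n))   # random initial population
    fit = np.array([f(x) for x in pop])
    for _ in range(iters):
        for i in range(population):
            # pick three distinct members, all different from i
            a, b, c = rng.choice([k for k in range(population) if k != i],
                                 3, replace=False)
            # mutation: base vector plus scaled difference, clipped to bounds
            mutant = np.clip(pop[a] + F * (pop[b] - pop[c]), lb, ub)
            # binomial crossover with rate CR; force at least one mutant component
            mask = rng.random(n) < CR
            mask[rng.integers(n)] = True
            trial = np.where(mask, mutant, pop[i])
            # greedy selection: keep the trial point if it is no worse
            ft = f(trial)
            if ft <= fit[i]:
                pop[i], fit[i] = trial, ft
    best = np.argmin(fit)
    return pop[best], fit[best]
```

For example, de_minimize(lambda v: float(np.sum(v**2)), np.full(2, -5.0), np.full(2, 5.0)) drives a 2-D sphere function close to its optimum at the origin. The production solvers above add the strategy choices (base vector, search direction, difference factor) that this sketch fixes to the plain random variants.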


See also:

Why SciPy anneal is not included
MOP (multi-objective problems)
NLP (nonlinear problems)
NSP (nonsmooth problems)
Global Optimization portal by Arnold Neumaier
Retrieved from "http://openopt.org/GLP"