NLopt¶

class NLopt(*args)¶
Interface to NLopt.

This class exposes the solvers from the non-linear optimization library [nlopt2009].
More details about the available algorithms are given here.

Parameters
    problem : OptimizationProblem
        Optimization problem to solve.
    algoName : str
        The NLopt identifier of the algorithm. Use GetAlgorithmNames() to list available names.

See also
    AbdoRackwitz, Cobyla, SQP, TNC
Notes
Here are some properties of the different algorithms:

Algorithm                   Derivative info    Constraint support
--------------------------  -----------------  ---------------------------
AUGLAG                      no derivative      all
AUGLAG_EQ                   no derivative      all
GD_MLSL                     first derivative   bounds required
GD_MLSL_LDS                 first derivative   bounds required
GD_STOGO (optional)         first derivative   bounds required
GD_STOGO_RAND (optional)    first derivative   bounds required
GN_AGS (optional)           no derivative      bounds required, inequality
GN_CRS2_LM                  no derivative      bounds required
GN_DIRECT                   no derivative      bounds required
GN_DIRECT_L                 no derivative      bounds required
GN_DIRECT_L_NOSCAL          no derivative      bounds required
GN_DIRECT_L_RAND            no derivative      bounds required
GN_DIRECT_L_RAND_NOSCAL     no derivative      bounds required
GN_ESCH                     no derivative      bounds required
GN_ISRES                    no derivative      bounds required, all
GN_MLSL                     no derivative      bounds required
GN_MLSL_LDS                 no derivative      bounds required
GN_ORIG_DIRECT              no derivative      bounds required, inequality
GN_ORIG_DIRECT_L            no derivative      bounds required, inequality
G_MLSL                      no derivative      bounds required
G_MLSL_LDS                  no derivative      bounds required
LD_AUGLAG                   first derivative   all
LD_AUGLAG_EQ                first derivative   all
LD_CCSAQ                    first derivative   bounds, inequality
LD_LBFGS                    first derivative   bounds
LD_MMA                      first derivative   bounds, inequality
LD_SLSQP                    first derivative   all
LD_TNEWTON                  first derivative   bounds
LD_TNEWTON_PRECOND          first derivative   bounds
LD_TNEWTON_PRECOND_RESTART  first derivative   bounds
LD_TNEWTON_RESTART          first derivative   bounds
LD_VAR1                     first derivative   bounds
LD_VAR2                     first derivative   bounds
LN_AUGLAG                   no derivative      all
LN_AUGLAG_EQ                no derivative      all
LN_BOBYQA                   no derivative      bounds
LN_COBYLA                   no derivative      all
LN_NELDERMEAD               no derivative      bounds
LN_NEWUOA                   no derivative      bounds
LN_NEWUOA_BOUND             no derivative      bounds
LN_PRAXIS                   no derivative      bounds
LN_SBPLX                    no derivative      bounds
Availability of algorithms marked as optional may vary depending on the NLopt version or compilation options used.
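NLopt identifiers follow a naming convention: a leading G or L marks global vs. local search, and a following N or D marks derivative-free vs. gradient-based methods; names without this pattern (e.g. AUGLAG, G_MLSL) are meta-algorithms or wrappers. As a rough illustration, here is a plain-Python sketch (not using the OpenTURNS API) that decodes a few of the names from the table above:

```python
# Classify NLopt algorithm identifiers by their prefix convention:
# G* = global, L* = local; *N_ = derivative-free, *D_ = gradient-based.
# Names that do not match the pattern are meta-algorithms or wrappers.

ALGORITHMS = [
    "AUGLAG", "GN_DIRECT", "GD_STOGO", "LN_COBYLA", "LD_MMA", "LD_SLSQP",
]

def classify(name):
    """Return a (name, description) pair for an NLopt identifier."""
    scope = {"G": "global", "L": "local"}.get(name[0])
    deriv = {"N": "derivative-free", "D": "gradient-based"}.get(name[1:2])
    if scope is None or deriv is None or name[2:3] != "_":
        return (name, "meta-algorithm / wrapper")
    return (name, f"{scope}, {deriv}")

for algo in ALGORITHMS:
    print(classify(algo))
```

This only decodes the naming scheme; the table above remains the authoritative list of derivative and constraint support.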
Examples
>>> import openturns as ot
>>> dim = 4
>>> bounds = ot.Interval([-3.0] * dim, [5.0] * dim)
>>> linear = ot.SymbolicFunction(['x1', 'x2', 'x3', 'x4'], ['x1+2*x2-3*x3+4*x4'])
>>> problem = ot.OptimizationProblem(linear, ot.Function(), ot.Function(), bounds)
>>> print(ot.NLopt.GetAlgorithmNames())
[AUGLAG,AUGLAG_EQ,GD_MLSL,GD_MLSL_LDS,...
>>> algo = ot.NLopt(problem, 'LD_MMA')
>>> algo.setStartingPoint([0.0] * 4)
>>> algo.run()
>>> result = algo.getResult()
>>> x_star = result.getOptimalPoint()
>>> y_star = result.getOptimalValue()
Methods

GetAlgorithmNames()
    Accessor to the list of algorithms provided by NLopt, by names.
IsAvailable()
    Ask whether NLopt support is available.
SetSeed(seed)
    Initialize the random generator seed.
computeLagrangeMultipliers(self, x)
    Compute the Lagrange multipliers of a problem at a given point.
getAlgorithmName(self)
    Accessor to the algorithm name.
getClassName(self)
    Accessor to the object's name.
getId(self)
    Accessor to the object's id.
getInitialStep(self)
    Initial local derivative-free algorithms step accessor.
getLocalSolver(self)
    Local solver accessor.
getMaximumAbsoluteError(self)
    Accessor to maximum allowed absolute error.
getMaximumConstraintError(self)
    Accessor to maximum allowed constraint error.
getMaximumEvaluationNumber(self)
    Accessor to maximum allowed number of evaluations.
getMaximumIterationNumber(self)
    Accessor to maximum allowed number of iterations.
getMaximumRelativeError(self)
    Accessor to maximum allowed relative error.
getMaximumResidualError(self)
    Accessor to maximum allowed residual error.
getName(self)
    Accessor to the object's name.
getProblem(self)
    Accessor to optimization problem.
getResult(self)
    Accessor to optimization result.
getShadowedId(self)
    Accessor to the object's shadowed id.
getStartingPoint(self)
    Accessor to starting point.
getVerbose(self)
    Accessor to the verbosity flag.
getVisibility(self)
    Accessor to the object's visibility state.
hasName(self)
    Test if the object is named.
hasVisibleName(self)
    Test if the object has a distinguishable name.
run(self)
    Launch the optimization.
setAlgorithmName(self, algoName)
    Accessor to the algorithm name.
setInitialStep(self, initialStep)
    Initial local derivative-free algorithms step accessor.
setLocalSolver(self, localSolver)
    Local solver accessor.
setMaximumAbsoluteError(self, maximumAbsoluteError)
    Accessor to maximum allowed absolute error.
setMaximumConstraintError(self, maximumConstraintError)
    Accessor to maximum allowed constraint error.
setMaximumEvaluationNumber(self, maximumEvaluationNumber)
    Accessor to maximum allowed number of evaluations.
setMaximumIterationNumber(self, maximumIterationNumber)
    Accessor to maximum allowed number of iterations.
setMaximumRelativeError(self, maximumRelativeError)
    Accessor to maximum allowed relative error.
setMaximumResidualError(self, maximumResidualError)
    Accessor to maximum allowed residual error.
setName(self, name)
    Accessor to the object's name.
setProblem(self, problem)
    Accessor to optimization problem.
setProgressCallback(self, *args)
    Set up a progress callback.
setResult(self, result)
    Accessor to optimization result.
setShadowedId(self, id)
    Accessor to the object's shadowed id.
setStartingPoint(self, startingPoint)
    Accessor to starting point.
setStopCallback(self, *args)
    Set up a stop callback.
setVerbose(self, verbose)
    Accessor to the verbosity flag.
setVisibility(self, visible)
    Accessor to the object's visibility state.
__init__(self, *args)¶
Initialize self. See help(type(self)) for accurate signature.

static GetAlgorithmNames()¶
Accessor to the list of algorithms provided by NLopt, by names.

Returns
    names : Description
        List of algorithm names provided by NLopt, according to its naming convention.

Examples
>>> import openturns as ot
>>> print(ot.NLopt.GetAlgorithmNames())
[AUGLAG,AUGLAG_EQ,GD_MLSL,...
static IsAvailable()¶
Ask whether NLopt support is available.

Returns
    available : bool
        Whether NLopt support is available.

static SetSeed(seed)¶
Initialize the random generator seed.

Parameters
    seed : int
        The RNG seed.
computeLagrangeMultipliers(self, x)¶
Compute the Lagrange multipliers of a problem at a given point.

Parameters
    x : sequence of float
        Point at which the Lagrange multipliers are computed.

Returns
    lagrangeMultiplier : sequence of float
        Lagrange multipliers of the problem at the given point.

Notes
The Lagrange multipliers are associated with the following Lagrangian formulation of the optimization problem:

    \cal L(x, \lambda_{eq}, \lambda_{\ell}, \lambda_{u}, \lambda_{ineq}) = J(x) + \lambda_{eq}^T g(x) + \lambda_{\ell}^T (x - \ell)^{+} + \lambda_{u}^T (u - x)^{+} + \lambda_{ineq}^T h^{+}(x)

where \alpha^{+} = (\max(0, \alpha_1), \dots, \max(0, \alpha_n)).

The Lagrange multipliers are stored as (\lambda_{eq}, \lambda_{\ell}, \lambda_{u}, \lambda_{ineq}), where:
    - \lambda_{eq} is of dimension 0 if there is no equality constraint, else of dimension the dimension of g(x), i.e. the number of scalar equality constraints
    - \lambda_{\ell} and \lambda_{u} are of dimension 0 if there is no bound constraint, else of the dimension of x
    - \lambda_{ineq} is of dimension 0 if there is no inequality constraint, else of dimension the dimension of h(x), i.e. the number of scalar inequality constraints

The vector \lambda is solution of the following linear system:

    \lambda^T \begin{bmatrix} \nabla g(x) \\ \nabla (x - \ell)^{+} \\ \nabla (u - x)^{+} \\ \nabla h^{+}(x) \end{bmatrix} = -\nabla J(x)

If there is no constraint of any kind, \lambda is of dimension 0, as well as if no constraint is active.
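To make the stationarity condition concrete, here is a minimal plain-Python sketch (not the OpenTURNS implementation) for a toy problem with a single equality constraint, where the linear system above reduces to a one-unknown least-squares fit of the multiplier:

```python
# Toy check of the stationarity condition  lambda * grad_g(x) = -grad_J(x)
# for:  min J(x) = x1^2 + x2^2   subject to   g(x) = x1 + x2 - 1 = 0.
# The constrained optimum is x* = (0.5, 0.5).

def grad_J(x):
    return [2.0 * x[0], 2.0 * x[1]]

def grad_g(x):
    return [1.0, 1.0]

x_star = [0.5, 0.5]

# With one scalar constraint the linear system collapses to a scalar
# least-squares problem:  lambda = -<grad_J, grad_g> / <grad_g, grad_g>.
gj, gg = grad_J(x_star), grad_g(x_star)
lam = -sum(a * b for a, b in zip(gj, gg)) / sum(b * b for b in gg)
print(lam)  # -1.0
```

At the optimum the gradient of the objective is exactly cancelled by the multiplier times the constraint gradient, which is what computeLagrangeMultipliers solves for in the general case.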
getAlgorithmName(self)¶
Accessor to the algorithm name.

Returns
    algoName : str
        The NLopt identifier of the algorithm.

getClassName(self)¶
Accessor to the object's name.

Returns
    class_name : str
        The object class name (object.__class__.__name__).

getId(self)¶
Accessor to the object's id.

Returns
    id : int
        Internal unique identifier.

getInitialStep(self)¶
Initial local derivative-free algorithms step accessor.

Returns
    dx : Point
        The initial step.
getMaximumAbsoluteError(self)¶
Accessor to maximum allowed absolute error.

Returns
    maximumAbsoluteError : float
        Maximum allowed absolute error, where the absolute error is defined by \epsilon^a = \|x_{n+1} - x_n\|, where x_n and x_{n+1} are two consecutive approximations of the optimum.
getMaximumConstraintError(self)¶
Accessor to maximum allowed constraint error.

Returns
    maximumConstraintError : float
        Maximum allowed constraint error, where the constraint error is defined by \gamma = \|g(x_n)\|, where x_n is the current approximation of the optimum and g is the function that gathers all the equality and inequality constraints (violated values only).
getMaximumEvaluationNumber(self)¶
Accessor to maximum allowed number of evaluations.

Returns
    N : int
        Maximum allowed number of evaluations.

getMaximumIterationNumber(self)¶
Accessor to maximum allowed number of iterations.

Returns
    N : int
        Maximum allowed number of iterations.
getMaximumRelativeError(self)¶
Accessor to maximum allowed relative error.

Returns
    maximumRelativeError : float
        Maximum allowed relative error, where the relative error is defined by \epsilon^r = \|x_{n+1} - x_n\| / \|x_{n+1}\| if \|x_{n+1}\| \neq 0, else \epsilon^r = -1.

getMaximumResidualError(self)¶
Accessor to maximum allowed residual error.

Returns
    maximumResidualError : float
        Maximum allowed residual error, where the residual error is defined by \epsilon^{res} = \|f(x_{n+1}) - f(x_n)\| / \|f(x_{n+1})\| if \|f(x_{n+1})\| \neq 0, else \epsilon^{res} = -1.
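The absolute, relative and residual criteria are all computed from two consecutive iterates. A minimal plain-Python sketch of these definitions (with hypothetical iterate values, not the OpenTURNS internals):

```python
import math

def norm(v):
    """Euclidean norm of a sequence of floats."""
    return math.sqrt(sum(c * c for c in v))

def convergence_errors(x_prev, x_next, f_prev, f_next):
    """Absolute, relative and residual errors between two consecutive
    iterates, following the definitions given by the accessors above.
    A criterion that cannot be normalized is reported as -1."""
    absolute = norm([a - b for a, b in zip(x_next, x_prev)])
    relative = absolute / norm(x_next) if norm(x_next) != 0.0 else -1.0
    residual = abs(f_next - f_prev) / abs(f_next) if f_next != 0.0 else -1.0
    return absolute, relative, residual

# Hypothetical consecutive iterates of a 2-d optimization:
a, r, s = convergence_errors([1.0, 1.0], [1.1, 0.9], 2.0, 1.8)
print(a, r, s)
```

The solver stops as soon as one of these errors falls below the corresponding maximum allowed value set through the setMaximum*Error accessors.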
getName(self)¶
Accessor to the object's name.

Returns
    name : str
        The name of the object.

getProblem(self)¶
Accessor to optimization problem.

Returns
    problem : OptimizationProblem
        Optimization problem.

getResult(self)¶
Accessor to optimization result.

Returns
    result : OptimizationResult
        Result class.
getShadowedId(self)¶
Accessor to the object's shadowed id.

Returns
    id : int
        Internal unique identifier.

getVerbose(self)¶
Accessor to the verbosity flag.

Returns
    verbose : bool
        Verbosity flag state.

getVisibility(self)¶
Accessor to the object's visibility state.

Returns
    visible : bool
        Visibility flag.

hasName(self)¶
Test if the object is named.

Returns
    hasName : bool
        True if the name is not empty.

hasVisibleName(self)¶
Test if the object has a distinguishable name.

Returns
    hasVisibleName : bool
        True if the name is not empty and not the default one.
run(self)¶
Launch the optimization.

setAlgorithmName(self, algoName)¶
Accessor to the algorithm name.

Parameters
    algoName : str
        The NLopt identifier of the algorithm.

setInitialStep(self, initialStep)¶
Initial local derivative-free algorithms step accessor.

Parameters
    dx : sequence of float
        The initial step.
setMaximumAbsoluteError(self, maximumAbsoluteError)¶
Accessor to maximum allowed absolute error.

Parameters
    maximumAbsoluteError : float
        Maximum allowed absolute error, where the absolute error is defined by \epsilon^a = \|x_{n+1} - x_n\|, where x_n and x_{n+1} are two consecutive approximations of the optimum.

setMaximumConstraintError(self, maximumConstraintError)¶
Accessor to maximum allowed constraint error.

Parameters
    maximumConstraintError : float
        Maximum allowed constraint error, where the constraint error is defined by \gamma = \|g(x_n)\|, where x_n is the current approximation of the optimum and g is the function that gathers all the equality and inequality constraints (violated values only).
setMaximumEvaluationNumber(self, maximumEvaluationNumber)¶
Accessor to maximum allowed number of evaluations.

Parameters
    N : int
        Maximum allowed number of evaluations.

setMaximumIterationNumber(self, maximumIterationNumber)¶
Accessor to maximum allowed number of iterations.

Parameters
    N : int
        Maximum allowed number of iterations.
setMaximumRelativeError(self, maximumRelativeError)¶
Accessor to maximum allowed relative error.

Parameters
    maximumRelativeError : float
        Maximum allowed relative error, where the relative error is defined by \epsilon^r = \|x_{n+1} - x_n\| / \|x_{n+1}\| if \|x_{n+1}\| \neq 0, else \epsilon^r = -1.

setMaximumResidualError(self, maximumResidualError)¶
Accessor to maximum allowed residual error.

Parameters
    maximumResidualError : float
        Maximum allowed residual error, where the residual error is defined by \epsilon^{res} = \|f(x_{n+1}) - f(x_n)\| / \|f(x_{n+1})\| if \|f(x_{n+1})\| \neq 0, else \epsilon^{res} = -1.
setName(self, name)¶
Accessor to the object's name.

Parameters
    name : str
        The name of the object.

setProblem(self, problem)¶
Accessor to optimization problem.

Parameters
    problem : OptimizationProblem
        Optimization problem.
setProgressCallback(self, *args)¶
Set up a progress callback.

Can be used to programmatically report the progress of an optimization.

Parameters
    callback : callable
        Takes a float as argument, the percentage of progress.

Examples
>>> import sys
>>> import openturns as ot
>>> rosenbrock = ot.SymbolicFunction(['x1', 'x2'], ['(1-x1)^2+100*(x2-x1^2)^2'])
>>> problem = ot.OptimizationProblem(rosenbrock)
>>> solver = ot.OptimizationAlgorithm(problem)
>>> solver.setStartingPoint([0, 0])
>>> solver.setMaximumResidualError(1.e-3)
>>> solver.setMaximumIterationNumber(100)
>>> def report_progress(progress):
...     sys.stderr.write('-- progress=' + str(progress) + '%\n')
>>> solver.setProgressCallback(report_progress)
>>> solver.run()
setResult(self, result)¶
Accessor to optimization result.

Parameters
    result : OptimizationResult
        Result class.

setShadowedId(self, id)¶
Accessor to the object's shadowed id.

Parameters
    id : int
        Internal unique identifier.

setStartingPoint(self, startingPoint)¶
Accessor to starting point.

Parameters
    startingPoint : Point
        Starting point.
setStopCallback(self, *args)¶
Set up a stop callback.

Can be used to programmatically stop an optimization.

Parameters
    callback : callable
        Returns an int deciding whether to stop or continue.

Examples
>>> import openturns as ot
>>> rosenbrock = ot.SymbolicFunction(['x1', 'x2'], ['(1-x1)^2+100*(x2-x1^2)^2'])
>>> problem = ot.OptimizationProblem(rosenbrock)
>>> solver = ot.OptimizationAlgorithm(problem)
>>> solver.setStartingPoint([0, 0])
>>> solver.setMaximumResidualError(1.e-3)
>>> solver.setMaximumIterationNumber(100)
>>> def ask_stop():
...     return True
>>> solver.setStopCallback(ask_stop)
>>> solver.run()
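A common pattern for a stop callback is a wall-clock budget. Here is a minimal plain-Python sketch; `make_time_budget_callback` is a hypothetical helper, and only the zero-argument callable it returns is what setStopCallback expects (returning a nonzero int requests a stop):

```python
import time

def make_time_budget_callback(budget_seconds):
    """Return a zero-argument callback that asks the solver to stop
    (returns 1) once the wall-clock budget is exhausted, else 0."""
    deadline = time.monotonic() + budget_seconds
    def ask_stop():
        return 1 if time.monotonic() >= deadline else 0
    return ask_stop

# Hypothetical usage with a solver instance:
#   solver.setStopCallback(make_time_budget_callback(10.0))
cb = make_time_budget_callback(0.0)
print(cb())  # budget already exhausted, so the callback returns 1
```

Because the callback closes over its own deadline, a fresh one should be created for each run rather than reused across optimizations.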
setVerbose(self, verbose)¶
Accessor to the verbosity flag.

Parameters
    verbose : bool
        Verbosity flag state.

setVisibility(self, visible)¶
Accessor to the object's visibility state.

Parameters
    visible : bool
        Visibility flag.