HiGHS¶
- class HiGHS(*args)¶
Linear optimization solver.
Warning
This class is experimental and likely to be modified in future releases. To use it, import the openturns.experimental submodule.
This class exposes the linear solver from the HiGHS library. It is only meant to be used with linear mixed-integer optimization problems.
- Parameters:
- problem : OptimizationProblem
The problem to solve.
Methods
getCheckStatus()  Accessor to check status flag.
getClassName()  Accessor to the object's name.
getMaximumAbsoluteError()  Accessor to maximum allowed absolute error.
getMaximumCallsNumber()  Accessor to maximum allowed number of calls.
getMaximumConstraintError()  Accessor to maximum allowed constraint error.
getMaximumIterationNumber()  Accessor to maximum allowed number of iterations.
getMaximumRelativeError()  Accessor to maximum allowed relative error.
getMaximumResidualError()  Accessor to maximum allowed residual error.
getMaximumTimeDuration()  Accessor to the maximum duration.
getName()  Accessor to the object's name.
getProblem()  Accessor to optimization problem.
getResult()  Accessor to optimization result.
getStartingPoint()  Accessor to starting point.
getStartingSample()  Accessor to starting sample.
hasName()  Test if the object is named.
run()  Launch the optimization.
setCheckStatus(checkStatus)  Accessor to check status flag.
setMaximumAbsoluteError(maximumAbsoluteError)  Accessor to maximum allowed absolute error.
setMaximumCallsNumber(maximumCallsNumber)  Accessor to maximum allowed number of calls.
setMaximumConstraintError(maximumConstraintError)  Accessor to maximum allowed constraint error.
setMaximumIterationNumber(maximumIterationNumber)  Accessor to maximum allowed number of iterations.
setMaximumRelativeError(maximumRelativeError)  Accessor to maximum allowed relative error.
setMaximumResidualError(maximumResidualError)  Accessor to maximum allowed residual error.
setMaximumTimeDuration(maximumTime)  Accessor to the maximum duration.
setName(name)  Accessor to the object's name.
setProblem(problem)  Accessor to optimization problem.
setProgressCallback(*args)  Set up a progress callback.
setResult(result)  Accessor to optimization result.
setStartingPoint(startingPoint)  Accessor to starting point.
setStartingSample(startingSample)  Accessor to starting sample.
setStopCallback(*args)  Set up a stop callback.
Notes
The HiGHS solver can be tuned using the parameters described in the HiGHS options documentation. These parameters can be modified through the ResourceMap: for every option optionName, add a key named HiGHS-optionName with the value to use, as shown below:

>>> import openturns as ot
>>> ot.ResourceMap.AddAsBool('HiGHS-output_flag', True)
>>> ot.ResourceMap.AddAsUnsignedInteger('HiGHS-threads', 4)
HiGHS prefers to solve problems with at least one integer (or binary) variable, but this is not actually enforced.
Examples
Solve a mixed integer linear optimization problem with objective:

    \min 1.1 x_1 + x_2

with inequality constraints:

    x_2 \leq 7
    5 \leq x_1 + 2 x_2 \leq 15
    6 \leq 3 x_1 + 2 x_2

and bound constraints:

    0 \leq x_1 \leq 4
    1 \leq x_2

>>> import openturns as ot
>>> import openturns.experimental as otexp
>>> cost = [1.1, 1.0]
>>> bounds = ot.Interval([0.0, 1.0], [4.0, 1e30])
>>> A = ot.Matrix([[0.0, 1.0], [1.0, 2.0], [3.0, 2.0]])
>>> cb = ot.Interval([-1e9, 5.0, 6.0], [7.0, 15.0, 1e9])
>>> problem = otexp.LinearProblem(cost, bounds, A, cb)
>>> problem.setVariablesType([ot.OptimizationProblemImplementation.INTEGER] * 2)
>>> algo = otexp.HiGHS(problem)
>>> algo.run()
>>> result = algo.getResult()
>>> x_star = result.getOptimalPoint()
>>> y_star = result.getOptimalValue()
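As an independent sanity check (plain Python, not part of the OpenTURNS API), this small problem can also be solved by brute-force enumeration of the integer feasible points:

```python
# Brute-force check of the mixed-integer LP above:
# minimize 1.1*x1 + x2
# subject to x2 <= 7, 5 <= x1 + 2*x2 <= 15, 6 <= 3*x1 + 2*x2,
#            0 <= x1 <= 4, 1 <= x2, with x1 and x2 integer.
best_point, best_value = None, float('inf')
for x1 in range(0, 5):            # bound constraint 0 <= x1 <= 4
    for x2 in range(1, 8):        # bound 1 <= x2 combined with x2 <= 7
        if 5 <= x1 + 2 * x2 <= 15 and 3 * x1 + 2 * x2 >= 6:
            value = 1.1 * x1 + x2
            if value < best_value:
                best_point, best_value = (x1, x2), value
print(best_point, best_value)     # (0, 3) 3.0
```

Enumeration only works here because the feasible integer set is tiny; for real problems the HiGHS solver is the way to go.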
- __init__(*args)¶
- getCheckStatus()¶
Accessor to check status flag.
- Returns:
- checkStatus : bool
Whether to check the termination status. If set to False, run() will not throw an exception if the algorithm does not fully converge, allowing one to still retrieve a feasible candidate.
- getClassName()¶
Accessor to the object’s name.
- Returns:
- class_name : str
The object class name (object.__class__.__name__).
- getMaximumAbsoluteError()¶
Accessor to maximum allowed absolute error.
- Returns:
- maximumAbsoluteError : float
Maximum allowed absolute error, where the absolute error is defined by \epsilon^a_n = \|x_{n+1} - x_n\|_\infty where x_{n+1} and x_n are two consecutive approximations of the optimum.
- getMaximumCallsNumber()¶
Accessor to maximum allowed number of calls.
- Returns:
- maximumCallsNumber : int
Maximum allowed number of direct objective function calls through the () operator. Does not take into account possible indirect calls through finite difference gradient evaluations.
- getMaximumConstraintError()¶
Accessor to maximum allowed constraint error.
- Returns:
- maximumConstraintError : float
Maximum allowed constraint error, where the constraint error is defined by \epsilon^c_n = \|g(x_n)\|_\infty where x_n is the current approximation of the optimum and g is the function that gathers all the equality and inequality constraints (violated values only).
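This definition can be illustrated with a minimal, self-contained sketch (plain Python, independent of OpenTURNS, assuming the inequality constraints are written as g_i(x) >= 0 so that negative values are violations):

```python
def constraint_error(g_values):
    """Infinity norm of the violated constraint values (those with g_i(x) < 0)."""
    violated = [abs(v) for v in g_values if v < 0.0]
    return max(violated) if violated else 0.0

# Two constraints are violated; the largest violation magnitude is 1.3.
print(constraint_error([0.5, -0.2, -1.3]))  # 1.3
# No violation: the constraint error is zero.
print(constraint_error([0.1, 2.0]))         # 0.0
```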
- getMaximumIterationNumber()¶
Accessor to maximum allowed number of iterations.
- Returns:
- maximumIterationNumber : int
Maximum allowed number of iterations.
- getMaximumRelativeError()¶
Accessor to maximum allowed relative error.
- Returns:
- maximumRelativeError : float
Maximum allowed relative error, where the relative error is defined by \epsilon^r_n = \epsilon^a_n / \|x_{n+1}\|_\infty if \|x_{n+1}\|_\infty \neq 0, else \epsilon^r_n = -1.
- getMaximumResidualError()¶
Accessor to maximum allowed residual error.
- Returns:
- maximumResidualError : float
Maximum allowed residual error, where the residual error is defined by \epsilon^{res}_n = \frac{|f(x_{n+1}) - f(x_n)|}{|f(x_{n+1})|} if |f(x_{n+1})| \neq 0, else \epsilon^{res}_n = -1.
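For illustration, the absolute, relative and residual error definitions can be sketched in plain Python for two consecutive iterates (a minimal sketch, independent of OpenTURNS; the library computes these quantities internally):

```python
def max_norm(v):
    return max(abs(c) for c in v)

def absolute_error(x_next, x_prev):
    # ||x_{n+1} - x_n||_inf
    return max_norm([a - b for a, b in zip(x_next, x_prev)])

def relative_error(x_next, x_prev):
    # absolute error scaled by ||x_{n+1}||_inf, or -1 when that norm is zero
    n = max_norm(x_next)
    return absolute_error(x_next, x_prev) / n if n > 0.0 else -1.0

def residual_error(f_next, f_prev):
    # |f(x_{n+1}) - f(x_n)| / |f(x_{n+1})|, or -1 when f(x_{n+1}) is zero
    return abs(f_next - f_prev) / abs(f_next) if f_next != 0.0 else -1.0

x_prev, x_next = [1.0, 2.0], [1.1, 1.9]
print(absolute_error(x_next, x_prev))  # close to 0.1
```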
- getMaximumTimeDuration()¶
Accessor to the maximum duration.
- Returns:
- maximumTime : float
Maximum optimization duration in seconds.
- getName()¶
Accessor to the object’s name.
- Returns:
- name : str
The name of the object.
- getProblem()¶
Accessor to optimization problem.
- Returns:
- problem : OptimizationProblem
Optimization problem.
- getResult()¶
Accessor to optimization result.
- Returns:
- result : OptimizationResult
Result class.
- hasName()¶
Test if the object is named.
- Returns:
- hasName : bool
True if the name is not empty.
- run()¶
Launch the optimization.
- setCheckStatus(checkStatus)¶
Accessor to check status flag.
- Parameters:
- checkStatus : bool
Whether to check the termination status. If set to False, run() will not throw an exception if the algorithm does not fully converge, allowing one to still retrieve a feasible candidate.
- setMaximumAbsoluteError(maximumAbsoluteError)¶
Accessor to maximum allowed absolute error.
- Parameters:
- maximumAbsoluteError : float
Maximum allowed absolute error, where the absolute error is defined by \epsilon^a_n = \|x_{n+1} - x_n\|_\infty where x_{n+1} and x_n are two consecutive approximations of the optimum.
- setMaximumCallsNumber(maximumCallsNumber)¶
Accessor to maximum allowed number of calls.
- Parameters:
- maximumCallsNumber : int
Maximum allowed number of direct objective function calls through the () operator. Does not take into account possible indirect calls through finite difference gradient evaluations.
- setMaximumConstraintError(maximumConstraintError)¶
Accessor to maximum allowed constraint error.
- Parameters:
- maximumConstraintError : float
Maximum allowed constraint error, where the constraint error is defined by \epsilon^c_n = \|g(x_n)\|_\infty where x_n is the current approximation of the optimum and g is the function that gathers all the equality and inequality constraints (violated values only).
- setMaximumIterationNumber(maximumIterationNumber)¶
Accessor to maximum allowed number of iterations.
- Parameters:
- maximumIterationNumber : int
Maximum allowed number of iterations.
- setMaximumRelativeError(maximumRelativeError)¶
Accessor to maximum allowed relative error.
- Parameters:
- maximumRelativeError : float
Maximum allowed relative error, where the relative error is defined by \epsilon^r_n = \epsilon^a_n / \|x_{n+1}\|_\infty if \|x_{n+1}\|_\infty \neq 0, else \epsilon^r_n = -1.
- setMaximumResidualError(maximumResidualError)¶
Accessor to maximum allowed residual error.
- Parameters:
- maximumResidualError : float
Maximum allowed residual error, where the residual error is defined by \epsilon^{res}_n = \frac{|f(x_{n+1}) - f(x_n)|}{|f(x_{n+1})|} if |f(x_{n+1})| \neq 0, else \epsilon^{res}_n = -1.
- setMaximumTimeDuration(maximumTime)¶
Accessor to the maximum duration.
- Parameters:
- maximumTime : float
Maximum optimization duration in seconds.
- setName(name)¶
Accessor to the object’s name.
- Parameters:
- name : str
The name of the object.
- setProblem(problem)¶
Accessor to optimization problem.
- Parameters:
- problem : OptimizationProblem
Optimization problem.
- setProgressCallback(*args)¶
Set up a progress callback.
Can be used to programmatically report the progress of an optimization.
- Parameters:
- callback : callable
Takes a float as argument, the percentage of progress.
Examples
>>> import sys
>>> import openturns as ot
>>> rosenbrock = ot.SymbolicFunction(['x1', 'x2'], ['(1-x1)^2+100*(x2-x1^2)^2'])
>>> problem = ot.OptimizationProblem(rosenbrock)
>>> solver = ot.OptimizationAlgorithm(problem)
>>> solver.setStartingPoint([0, 0])
>>> solver.setMaximumResidualError(1.e-3)
>>> solver.setMaximumCallsNumber(10000)
>>> def report_progress(progress):
...     sys.stderr.write('-- progress=' + str(progress) + '%\n')
>>> solver.setProgressCallback(report_progress)
>>> solver.run()
- setResult(result)¶
Accessor to optimization result.
- Parameters:
- result : OptimizationResult
Result class.
- setStartingPoint(startingPoint)¶
Accessor to starting point.
- Parameters:
- startingPoint : Point
Starting point.
- setStartingSample(startingSample)¶
Accessor to starting sample.
- Parameters:
- startingSample : Sample
Starting sample.
- setStopCallback(*args)¶
Set up a stop callback.
Can be used to programmatically stop an optimization.
- Parameters:
- callback : callable
Returns an int deciding whether to stop or continue.
Examples
>>> import openturns as ot
>>> rosenbrock = ot.SymbolicFunction(['x1', 'x2'], ['(1-x1)^2+100*(x2-x1^2)^2'])
>>> problem = ot.OptimizationProblem(rosenbrock)
>>> solver = ot.OptimizationAlgorithm(problem)
>>> solver.setStartingPoint([0, 0])
>>> solver.setMaximumResidualError(1.e-3)
>>> solver.setMaximumCallsNumber(10000)
>>> def ask_stop():
...     return True
>>> solver.setStopCallback(ask_stop)
>>> solver.run()