.. only:: html

    .. note::
        :class: sphx-glr-download-link-note

        Click :ref:`here ` to download the full example code

.. rst-class:: sphx-glr-example-title

.. _sphx_glr_auto_numerical_methods_optimization_plot_optimization_dlib.py:
Optimization using dlib
=======================
In this example we explore optimization using OpenTURNS' `dlib `_ interface.
.. code-block:: default

    from __future__ import print_function
    import openturns as ot
    import openturns.viewer as viewer
    from matplotlib import pylab as plt

    ot.Log.Show(ot.Log.NONE)
List the available algorithms:

.. code-block:: default

    for algo in ot.Dlib.GetAlgorithmNames():
        print(algo)
.. rst-class:: sphx-glr-script-out

 Out:

 .. code-block:: none

    CG
    BFGS
    LBFGS
    Newton
    Global
    LSQ
    LSQLM
    TrustRegion
More details on dlib algorithms are available `here `_.
Solving an unconstrained problem with conjugate gradient algorithm
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
The following example demonstrates the use of dlib's conjugate gradient algorithm to find the minimum of the `Rosenbrock function `_. The optimal point can be computed analytically, and its value is [1.0, 1.0].
Define the problem based on the Rosenbrock function:

.. code-block:: default

    rosenbrock = ot.SymbolicFunction(['x1', 'x2'], ['(1-x1)^2+(x2-x1^2)^2'])
    problem = ot.OptimizationProblem(rosenbrock)
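
As a quick sanity check in plain Python (independent of OpenTURNS), this variant of the Rosenbrock function indeed vanishes at the analytical optimum:

```python
# Plain-Python version of the Rosenbrock variant used above:
# f(x1, x2) = (1 - x1)^2 + (x2 - x1^2)^2
def rosenbrock_py(x1, x2):
    return (1.0 - x1) ** 2 + (x2 - x1 ** 2) ** 2

# The analytical optimum is [1.0, 1.0], where the function is zero
print(rosenbrock_py(1.0, 1.0))  # 0.0
print(rosenbrock_py(1.5, 0.5))  # 3.3125 at the starting point used below
```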
The optimization algorithm is instantiated from the problem to solve and the name of the algorithm:

.. code-block:: default

    algo = ot.Dlib(problem, 'CG')
    print("Dlib algorithm, type ", algo.getAlgorithmName())
    print("Maximum iteration number: ", algo.getMaximumIterationNumber())
    print("Maximum evaluation number: ", algo.getMaximumEvaluationNumber())
    print("Maximum absolute error: ", algo.getMaximumAbsoluteError())
    print("Maximum relative error: ", algo.getMaximumRelativeError())
    print("Maximum residual error: ", algo.getMaximumResidualError())
    print("Maximum constraint error: ", algo.getMaximumConstraintError())
.. rst-class:: sphx-glr-script-out

 Out:

 .. code-block:: none

    Dlib algorithm, type CG
    Maximum iteration number: 100
    Maximum evaluation number: 1000
    Maximum absolute error: 1e-05
    Maximum relative error: 1e-05
    Maximum residual error: 1e-05
    Maximum constraint error: 1e-05
When using the conjugate gradient, BFGS/LBFGS, Newton, least squares or trust region methods, optimization proceeds until one of the following criteria is met:

- the errors (absolute, relative, residual, constraint) are all below the limits set by the user;
- the process reaches the maximum number of iterations or function evaluations.
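
The combined stopping rule above can be sketched as follows (an illustration of the criteria, not dlib's actual code):

```python
# Stop when all error measures fall below their thresholds, or when the
# iteration/evaluation budget is exhausted.
def should_stop(errors, limits, iteration, evaluations,
                max_iterations, max_evaluations):
    # errors / limits: dicts with 'absolute', 'relative', 'residual', 'constraint'
    all_converged = all(errors[k] <= limits[k] for k in limits)
    budget_exhausted = (iteration >= max_iterations
                        or evaluations >= max_evaluations)
    return all_converged or budget_exhausted

limits = dict(absolute=1e-5, relative=1e-5, residual=1e-5, constraint=1e-5)
errors = dict(absolute=2e-6, relative=3e-6, residual=1e-7, constraint=0.0)
print(should_stop(errors, limits, 41, 85, 100, 1000))  # True: all errors below limits
```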
Adjust the number of iterations/evaluations:

.. code-block:: default

    algo.setMaximumIterationNumber(1000)
    algo.setMaximumEvaluationNumber(10000)
    algo.setMaximumAbsoluteError(1e-3)
    algo.setMaximumRelativeError(1e-3)
    algo.setMaximumResidualError(1e-3)
Solve the problem:

.. code-block:: default

    startingPoint = [1.5, 0.5]
    algo.setStartingPoint(startingPoint)
    algo.run()
Retrieve the results:

.. code-block:: default

    result = algo.getResult()
    print('x^ = ', result.getOptimalPoint())
    print("f(x^) = ", result.getOptimalValue())
    print("Iteration number: ", result.getIterationNumber())
    print("Evaluation number: ", result.getEvaluationNumber())
    print("Absolute error: ", result.getAbsoluteError())
    print("Relative error: ", result.getRelativeError())
    print("Residual error: ", result.getResidualError())
    print("Constraint error: ", result.getConstraintError())
.. rst-class:: sphx-glr-script-out

 Out:

 .. code-block:: none

    x^ =  [0.995311,0.989195]
    f(x^) =  [2.4084e-05]
    Iteration number:  41
    Evaluation number:  85
    Absolute error:  0.0009776096028751445
    Relative error:  0.0006966679389276846
    Residual error:  4.302851151659242e-06
    Constraint error:  0.0
Solving a problem with bounds, using the LBFGS strategy
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

In the following example, the input variables are bounded so that the function's global optimum is not included in the search interval.
The problem is solved using the LBFGS strategy, which allows the user to limit the amount of memory used by the optimization process.
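
The ``maxSize`` parameter used below bounds how many past correction pairs L-BFGS keeps in memory. A minimal sketch of such a bounded history (an illustration of the idea, not dlib's internals):

```python
from collections import deque

max_size = 15  # analogous to boundedAlgo.setMaxSize(15) below
history = deque(maxlen=max_size)  # stores (s_k, y_k) correction pairs

# Push 20 dummy correction pairs; only the most recent 15 are retained
for k in range(20):
    s_k = [0.1 * k, 0.1 * k]  # step x_{k+1} - x_k (dummy values)
    y_k = [0.2 * k, 0.2 * k]  # gradient difference (dummy values)
    history.append((s_k, y_k))

print(len(history))  # 15: the oldest 5 pairs were discarded
```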
Define the bounds and the problem:

.. code-block:: default

    bounds = ot.Interval([0.0, 0.0], [0.8, 2.0])
    boundedProblem = ot.OptimizationProblem(rosenbrock, ot.Function(), ot.Function(), bounds)
Define the Dlib algorithm:

.. code-block:: default

    boundedAlgo = ot.Dlib(boundedProblem, "LBFGS")
    boundedAlgo.setMaxSize(15)  # Default value for LBFGS' maxSize parameter is 10

    startingPoint = [0.5, 1.5]
    boundedAlgo.setStartingPoint(startingPoint)
    boundedAlgo.run()
Retrieve the results:

.. code-block:: default

    result = boundedAlgo.getResult()
    print('x^ = ', result.getOptimalPoint())
    print("f(x^) = ", result.getOptimalValue())
    print("Iteration number: ", result.getIterationNumber())
    print("Evaluation number: ", result.getEvaluationNumber())
    print("Absolute error: ", result.getAbsoluteError())
    print("Relative error: ", result.getRelativeError())
    print("Residual error: ", result.getResidualError())
    print("Constraint error: ", result.getConstraintError())
.. rst-class:: sphx-glr-script-out

 Out:

 .. code-block:: none

    x^ =  [0.8,0.64]
    f(x^) =  [0.04]
    Iteration number:  6
    Evaluation number:  10
    Absolute error:  0.0
    Relative error:  0.0
    Residual error:  0.0
    Constraint error:  0.0
**Remark:**
The bounds defined for input variables are always strictly respected when using dlib algorithms. Consequently, the constraint error is always 0.
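
One way to picture why the constraint error stays at 0: every candidate point can be clipped back into the box before it is evaluated (a generic projection sketch, not necessarily how dlib enforces bounds internally):

```python
def project_onto_box(point, lower, upper):
    # Clip each coordinate into [lower_i, upper_i]
    return [min(max(p, lo), up) for p, lo, up in zip(point, lower, upper)]

lower, upper = [0.0, 0.0], [0.8, 2.0]  # same bounds as in the example
candidate = [1.2, -0.3]                # a step that leaves the box
feasible = project_onto_box(candidate, lower, upper)
print(feasible)  # [0.8, 0.0] -- the constraint error is 0 by construction
```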
Draw the optimal value history:

.. code-block:: default

    graph = result.drawOptimalValueHistory()
    view = viewer.View(graph)

.. image:: /auto_numerical_methods/optimization/images/sphx_glr_plot_optimization_dlib_001.png
    :alt: Optimal value history
    :class: sphx-glr-single-img
Solving a least squares problem
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

In a least squares problem, the user provides the residual function to minimize. Here the underlying OptimizationProblem is defined as a LeastSquaresProblem.

dlib least squares algorithms use the same stopping criteria as the CG, BFGS/LBFGS and Newton algorithms. However, optimization stops earlier if no significant improvement can be achieved during the process.
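
The "no significant improvement" condition can be pictured as a relative-improvement test between consecutive objective values (illustrative only; ``rel_tol`` and this exact formula are assumptions, not dlib's precise rule):

```python
# Stop early when the objective's improvement between iterations is negligible
def no_significant_improvement(previous, current, rel_tol=1e-7):
    return abs(previous - current) <= rel_tol * max(abs(previous), 1.0)

print(no_significant_improvement(1.0e-3, 9.0e-4))        # False: still improving
print(no_significant_improvement(1.939e-14, 1.938e-14))  # True: negligible gain
```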
Define the residual function:

.. code-block:: default

    import numpy as np

    n = 3
    m = 20

    x = [[0.5 + 0.1*i] for i in range(m)]

    model = ot.SymbolicFunction(['a', 'b', 'c', 'x'], ['a + b * exp(-c * x^2)'])
    p_ref = [2.8, 1.2, 0.5]  # Reference values of a, b, c
    modelx = ot.ParametricFunction(model, [0, 1, 2], p_ref)

    # Generate reference sample (with normal noise)
    y = np.multiply(modelx(x), np.random.normal(1.0, 0.05, m))

    def residualFunction(params):
        modelx = ot.ParametricFunction(model, [0, 1, 2], params)
        return [modelx(x[i])[0] - y[i, 0] for i in range(m)]
Define the residual as an ot.PythonFunction and the optimization problem:

.. code-block:: default

    residual = ot.PythonFunction(n, m, residualFunction)
    lsqProblem = ot.LeastSquaresProblem(residual)
Define the Dlib solver and set the starting point:

.. code-block:: default

    lsqAlgo = ot.Dlib(lsqProblem, "LSQ")
    lsqAlgo.setStartingPoint([0.0, 0.0, 0.0])
    lsqAlgo.run()
Retrieve the results:

.. code-block:: default

    result = lsqAlgo.getResult()
    print('x^ = ', result.getOptimalPoint())
    print("f(x^) = ", result.getOptimalValue())
    print("Iteration number: ", result.getIterationNumber())
    print("Evaluation number: ", result.getEvaluationNumber())
    print("Absolute error: ", result.getAbsoluteError())
    print("Relative error: ", result.getRelativeError())
    print("Residual error: ", result.getResidualError())
    print("Constraint error: ", result.getConstraintError())
.. rst-class:: sphx-glr-script-out

 Out:

 .. code-block:: none

    x^ =  [2.96227,1.26954,0.5]
    f(x^) =  [3.3287e-27]
    Iteration number:  7
    Evaluation number:  8
    Absolute error:  1.9163904173253564e-07
    Relative error:  5.875962531713081e-08
    Residual error:  1.939643249220384e-14
    Constraint error:  0.0
Draw the error history:

.. code-block:: default

    graph = result.drawErrorHistory()
    view = viewer.View(graph)

.. image:: /auto_numerical_methods/optimization/images/sphx_glr_plot_optimization_dlib_002.png
    :alt: Error history
    :class: sphx-glr-single-img
Draw the optimal value history:

.. code-block:: default

    graph = result.drawOptimalValueHistory()
    view = viewer.View(graph)
    plt.show()

.. image:: /auto_numerical_methods/optimization/images/sphx_glr_plot_optimization_dlib_003.png
    :alt: Optimal value history
    :class: sphx-glr-single-img
.. rst-class:: sphx-glr-timing

   **Total running time of the script:** ( 0 minutes 0.386 seconds)
.. _sphx_glr_download_auto_numerical_methods_optimization_plot_optimization_dlib.py:
.. only:: html

 .. container:: sphx-glr-footer
    :class: sphx-glr-footer-example

  .. container:: sphx-glr-download sphx-glr-download-python

     :download:`Download Python source code: plot_optimization_dlib.py `

  .. container:: sphx-glr-download sphx-glr-download-jupyter

     :download:`Download Jupyter notebook: plot_optimization_dlib.ipynb `
.. only:: html

 .. rst-class:: sphx-glr-signature

    `Gallery generated by Sphinx-Gallery `_