.. DO NOT EDIT.
.. THIS FILE WAS AUTOMATICALLY GENERATED BY SPHINX-GALLERY.
.. TO MAKE CHANGES, EDIT THE SOURCE PYTHON FILE:
.. "auto_numerical_methods/optimization/plot_optimization_dlib.py"
.. LINE NUMBERS ARE GIVEN BELOW.

.. only:: html

    .. note::
        :class: sphx-glr-download-link-note

        Click :ref:`here <sphx_glr_download_auto_numerical_methods_optimization_plot_optimization_dlib.py>` to download the full example code

.. rst-class:: sphx-glr-example-title

.. _sphx_glr_auto_numerical_methods_optimization_plot_optimization_dlib.py:

Optimization using dlib
=======================

.. GENERATED FROM PYTHON SOURCE LINES 6-7

In this example we explore optimization using the `dlib <http://dlib.net/>`_ interface of OpenTURNS.

.. GENERATED FROM PYTHON SOURCE LINES 9-16

.. code-block:: default

    from __future__ import print_function
    import numpy as np
    import openturns as ot
    import openturns.viewer as viewer
    from matplotlib import pylab as plt
    ot.Log.Show(ot.Log.NONE)

.. GENERATED FROM PYTHON SOURCE LINES 17-18

List the available algorithms:

.. GENERATED FROM PYTHON SOURCE LINES 18-21

.. code-block:: default

    for algo in ot.Dlib.GetAlgorithmNames():
        print(algo)

.. rst-class:: sphx-glr-script-out

 Out:

 .. code-block:: none

    cg
    bfgs
    lbfgs
    newton
    global
    least_squares
    least_squares_lm
    trust_region

.. GENERATED FROM PYTHON SOURCE LINES 22-23

More details on the dlib algorithms are available `here <http://dlib.net/optimization.html>`_.

.. GENERATED FROM PYTHON SOURCE LINES 25-29

Solving an unconstrained problem with the conjugate gradient algorithm
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

The following example demonstrates the use of the dlib conjugate gradient algorithm to find the minimum of the `Rosenbrock function <https://en.wikipedia.org/wiki/Rosenbrock_function>`_. The optimal point can be computed analytically, and its value is [1.0, 1.0].

.. GENERATED FROM PYTHON SOURCE LINES 31-32

Define the problem based on the Rosenbrock function:

.. GENERATED FROM PYTHON SOURCE LINES 32-35

.. code-block:: default

    rosenbrock = ot.SymbolicFunction(['x1', 'x2'], ['(1-x1)^2+(x2-x1^2)^2'])
    problem = ot.OptimizationProblem(rosenbrock)
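As a quick sanity check of the stated optimum, the same formula can be evaluated in plain Python (a minimal sketch, independent of OpenTURNS; note that the symbolic expression above omits the usual factor of 100 on the second term of the Rosenbrock function, which does not change the location of the minimum):

.. code-block:: python

    def rosenbrock_value(x1, x2):
        # Same formula as the SymbolicFunction above
        return (1.0 - x1) ** 2 + (x2 - x1 ** 2) ** 2

    print(rosenbrock_value(1.0, 1.0))  # prints 0.0: the analytic optimum
    print(rosenbrock_value(1.5, 0.5))  # strictly positive away from (1, 1)
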
.. GENERATED FROM PYTHON SOURCE LINES 36-37

The optimization algorithm is instantiated from the problem to solve and the name of the algorithm:

.. GENERATED FROM PYTHON SOURCE LINES 37-46

.. code-block:: default

    algo = ot.Dlib(problem, 'cg')
    print("Dlib algorithm, type ", algo.getAlgorithmName())
    print("Maximum iteration number: ", algo.getMaximumIterationNumber())
    print("Maximum evaluation number: ", algo.getMaximumEvaluationNumber())
    print("Maximum absolute error: ", algo.getMaximumAbsoluteError())
    print("Maximum relative error: ", algo.getMaximumRelativeError())
    print("Maximum residual error: ", algo.getMaximumResidualError())
    print("Maximum constraint error: ", algo.getMaximumConstraintError())

.. rst-class:: sphx-glr-script-out

 Out:

 .. code-block:: none

    Dlib algorithm, type  cg
    Maximum iteration number:  100
    Maximum evaluation number:  1000
    Maximum absolute error:  1e-05
    Maximum relative error:  1e-05
    Maximum residual error:  1e-05
    Maximum constraint error:  1e-05

.. GENERATED FROM PYTHON SOURCE LINES 47-51

When using conjugate gradient, BFGS/LBFGS, Newton, least squares or trust region methods, optimization proceeds until one of the following stopping criteria is met:

- the errors (absolute, relative, residual, constraint) are all below the limits set by the user;
- the process reaches the maximum number of iterations or function evaluations.

.. GENERATED FROM PYTHON SOURCE LINES 53-54

Adjust the number of iterations and evaluations:

.. GENERATED FROM PYTHON SOURCE LINES 54-60

.. code-block:: default

    algo.setMaximumIterationNumber(1000)
    algo.setMaximumEvaluationNumber(10000)
    algo.setMaximumAbsoluteError(1e-3)
    algo.setMaximumRelativeError(1e-3)
    algo.setMaximumResidualError(1e-3)

.. GENERATED FROM PYTHON SOURCE LINES 61-62

Solve the problem:

.. GENERATED FROM PYTHON SOURCE LINES 62-67

.. code-block:: default

    startingPoint = [1.5, 0.5]
    algo.setStartingPoint(startingPoint)

    algo.run()

.. GENERATED FROM PYTHON SOURCE LINES 68-69

Retrieve the results:

.. GENERATED FROM PYTHON SOURCE LINES 69-79
.. code-block:: default

    result = algo.getResult()
    print('x^ = ', result.getOptimalPoint())
    print("f(x^) = ", result.getOptimalValue())
    print("Iteration number: ", result.getIterationNumber())
    print("Evaluation number: ", result.getEvaluationNumber())
    print("Absolute error: ", result.getAbsoluteError())
    print("Relative error: ", result.getRelativeError())
    print("Residual error: ", result.getResidualError())
    print("Constraint error: ", result.getConstraintError())

.. rst-class:: sphx-glr-script-out

 Out:

 .. code-block:: none

    x^ =  [0.995311,0.989195]
    f(x^) =  [2.4084e-05]
    Iteration number:  41
    Evaluation number:  85
    Absolute error:  0.0009776096028751445
    Relative error:  0.0006966679389276845
    Residual error:  4.302851151659242e-06
    Constraint error:  0.0

.. GENERATED FROM PYTHON SOURCE LINES 80-86

Solving a problem with bounds, using the LBFGS strategy
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

In the following example, the input variables are bounded so that the global optimum of the function lies outside the search interval. The problem is solved with the LBFGS strategy, which allows the user to limit the amount of memory used by the optimization process.

.. GENERATED FROM PYTHON SOURCE LINES 88-89

Define the bounds and the problem:

.. GENERATED FROM PYTHON SOURCE LINES 89-93

.. code-block:: default

    bounds = ot.Interval([0.0, 0.0], [0.8, 2.0])
    boundedProblem = ot.OptimizationProblem(
        rosenbrock, ot.Function(), ot.Function(), bounds)

.. GENERATED FROM PYTHON SOURCE LINES 94-95

Define the Dlib algorithm:

.. GENERATED FROM PYTHON SOURCE LINES 95-103

.. code-block:: default

    boundedAlgo = ot.Dlib(boundedProblem, "lbfgs")
    boundedAlgo.setMaxSize(15)  # Default value for LBFGS' maxSize parameter is 10

    startingPoint = [0.5, 1.5]
    boundedAlgo.setStartingPoint(startingPoint)

    boundedAlgo.run()

.. GENERATED FROM PYTHON SOURCE LINES 104-105

Retrieve the results:

.. GENERATED FROM PYTHON SOURCE LINES 105-115
.. code-block:: default

    result = boundedAlgo.getResult()
    print('x^ = ', result.getOptimalPoint())
    print("f(x^) = ", result.getOptimalValue())
    print("Iteration number: ", result.getIterationNumber())
    print("Evaluation number: ", result.getEvaluationNumber())
    print("Absolute error: ", result.getAbsoluteError())
    print("Relative error: ", result.getRelativeError())
    print("Residual error: ", result.getResidualError())
    print("Constraint error: ", result.getConstraintError())

.. rst-class:: sphx-glr-script-out

 Out:

 .. code-block:: none

    x^ =  [0.8,0.64]
    f(x^) =  [0.04]
    Iteration number:  6
    Evaluation number:  10
    Absolute error:  0.0
    Relative error:  0.0
    Residual error:  0.0
    Constraint error:  0.0

.. GENERATED FROM PYTHON SOURCE LINES 116-118

**Remark:** The bounds defined for the input variables are always strictly respected when using dlib algorithms. Consequently, the constraint error is always 0.

.. GENERATED FROM PYTHON SOURCE LINES 120-121

Draw the optimal value history:

.. GENERATED FROM PYTHON SOURCE LINES 121-124

.. code-block:: default

    graph = result.drawOptimalValueHistory()
    view = viewer.View(graph)

.. image-sg:: /auto_numerical_methods/optimization/images/sphx_glr_plot_optimization_dlib_001.png
   :alt: Optimal value history
   :srcset: /auto_numerical_methods/optimization/images/sphx_glr_plot_optimization_dlib_001.png
   :class: sphx-glr-single-img

.. GENERATED FROM PYTHON SOURCE LINES 125-131

Solving a least squares problem
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

In a least squares problem, the user provides the residual function to minimize. Here the underlying OptimizationProblem is defined as a LeastSquaresProblem. The dlib least squares algorithms use the same stopping criteria as the CG, BFGS/LBFGS and Newton algorithms. However, optimization stops earlier if no significant improvement can be achieved during the process.

.. GENERATED FROM PYTHON SOURCE LINES 133-134

Define the residual function:

.. GENERATED FROM PYTHON SOURCE LINES 134-151
.. code-block:: default

    n = 3
    m = 20

    x = [[0.5 + 0.1*i] for i in range(m)]

    model = ot.SymbolicFunction(['a', 'b', 'c', 'x'], ['a + b * exp(-c * x^2)'])
    p_ref = [2.8, 1.2, 0.5]  # Reference values of a, b, c
    modelx = ot.ParametricFunction(model, [0, 1, 2], p_ref)

    # Generate reference sample (with normal noise);
    # the noise is drawn with shape (m, 1) to match the shape of modelx(x)
    y = np.multiply(modelx(x), np.random.normal(1.0, 0.05, (m, 1)))


    def residualFunction(params):
        modelx = ot.ParametricFunction(model, [0, 1, 2], params)
        return [modelx(x[i])[0] - y[i, 0] for i in range(m)]

.. GENERATED FROM PYTHON SOURCE LINES 152-153

Define the residual as an ot.PythonFunction and the optimization problem:

.. GENERATED FROM PYTHON SOURCE LINES 153-156

.. code-block:: default

    residual = ot.PythonFunction(n, m, residualFunction)
    lsqProblem = ot.LeastSquaresProblem(residual)

.. GENERATED FROM PYTHON SOURCE LINES 157-158

Define the Dlib solver and set the starting point:

.. GENERATED FROM PYTHON SOURCE LINES 158-163

.. code-block:: default

    lsqAlgo = ot.Dlib(lsqProblem, "least_squares")
    lsqAlgo.setStartingPoint([0.0, 0.0, 0.0])

    lsqAlgo.run()

.. GENERATED FROM PYTHON SOURCE LINES 164-165

Retrieve the results:

.. GENERATED FROM PYTHON SOURCE LINES 165-175

.. code-block:: default

    result = lsqAlgo.getResult()
    print('x^ = ', result.getOptimalPoint())
    print("f(x^) = ", result.getOptimalValue())
    print("Iteration number: ", result.getIterationNumber())
    print("Evaluation number: ", result.getEvaluationNumber())
    print("Absolute error: ", result.getAbsoluteError())
    print("Relative error: ", result.getRelativeError())
    print("Residual error: ", result.getResidualError())
    print("Constraint error: ", result.getConstraintError())

.. rst-class:: sphx-glr-script-out

 Out:

 .. code-block:: none

    x^ =  [2.85086,1.2218,0.5]
    f(x^) =  [8.87469e-31]
    Iteration number:  11
    Evaluation number:  15
    Absolute error:  1.0659712679890398e-09
    Relative error:  3.3929872448316826e-10
    Residual error:  6.494805357414886e-18
    Constraint error:  0.0

.. GENERATED FROM PYTHON SOURCE LINES 176-177

Draw the errors history:
.. GENERATED FROM PYTHON SOURCE LINES 177-180

.. code-block:: default

    graph = result.drawErrorHistory()
    view = viewer.View(graph)

.. image-sg:: /auto_numerical_methods/optimization/images/sphx_glr_plot_optimization_dlib_002.png
   :alt: Error history
   :srcset: /auto_numerical_methods/optimization/images/sphx_glr_plot_optimization_dlib_002.png
   :class: sphx-glr-single-img

.. GENERATED FROM PYTHON SOURCE LINES 181-182

Draw the optimal value history:

.. GENERATED FROM PYTHON SOURCE LINES 182-186

.. code-block:: default

    graph = result.drawOptimalValueHistory()
    view = viewer.View(graph)
    plt.show()

.. image-sg:: /auto_numerical_methods/optimization/images/sphx_glr_plot_optimization_dlib_003.png
   :alt: Optimal value history
   :srcset: /auto_numerical_methods/optimization/images/sphx_glr_plot_optimization_dlib_003.png
   :class: sphx-glr-single-img

.. rst-class:: sphx-glr-timing

   **Total running time of the script:** ( 0 minutes 0.531 seconds)

.. _sphx_glr_download_auto_numerical_methods_optimization_plot_optimization_dlib.py:

.. only:: html

  .. container:: sphx-glr-footer
    :class: sphx-glr-footer-example

    .. container:: sphx-glr-download sphx-glr-download-python

      :download:`Download Python source code: plot_optimization_dlib.py <plot_optimization_dlib.py>`

    .. container:: sphx-glr-download sphx-glr-download-jupyter

      :download:`Download Jupyter notebook: plot_optimization_dlib.ipynb <plot_optimization_dlib.ipynb>`

.. only:: html

 .. rst-class:: sphx-glr-signature

    `Gallery generated by Sphinx-Gallery <https://sphinx-gallery.github.io>`_