Ceres¶
class Ceres(*args)¶
Interface to Ceres Solver.

This class exposes the solvers from the non-linear least squares optimization library [ceres2012]. More details about least squares algorithms are available here. Algorithms are also available for general unconstrained optimization.

Parameters
problem : OptimizationProblem
    Optimization problem to solve, either least-squares or general (unconstrained).
algoName : str
    The identifier of the algorithm. Use GetAlgorithmNames() to list available names.
 
Notes

Solvers use first order derivative information.

As for constraint support, only the trust-region solvers allow for bound constraints:

============================  ============  ======================  ==================
Algorithm                     Method type   Problem type support    Constraint support
============================  ============  ======================  ==================
LEVENBERG_MARQUARDT           trust-region  least-squares           bounds
DOGLEG                        trust-region  least-squares           bounds
STEEPEST_DESCENT              line-search   least-squares, general  none
NONLINEAR_CONJUGATE_GRADIENT  line-search   least-squares, general  none
LBFGS                         line-search   least-squares, general  none
BFGS                          line-search   least-squares, general  none
============================  ============  ======================  ==================

The Ceres least squares solver can be further tweaked through the following ResourceMap parameters; refer to nlls solver options for more details.

==========================================================  =====
Key                                                         Type
==========================================================  =====
Ceres-minimizer_type                                        str
Ceres-line_search_direction_type                            str
Ceres-line_search_type                                      str
Ceres-nonlinear_conjugate_gradient_type                     str
Ceres-max_lbfgs_rank                                        int
Ceres-use_approximate_eigenvalue_bfgs_scaling               bool
Ceres-line_search_interpolation_type                        str
Ceres-min_line_search_step_size                             float
Ceres-line_search_sufficient_function_decrease              float
Ceres-max_line_search_step_contraction                      float
Ceres-min_line_search_step_contraction                      float
Ceres-max_num_line_search_step_size_iterations              int
Ceres-max_num_line_search_direction_restarts                int
Ceres-line_search_sufficient_curvature_decrease             float
Ceres-max_line_search_step_expansion                        float
Ceres-trust_region_strategy_type                            str
Ceres-dogleg_type                                           str
Ceres-use_nonmonotonic_steps                                bool
Ceres-max_consecutive_nonmonotonic_steps                    int
Ceres-max_num_iterations                                    int
Ceres-max_solver_time_in_seconds                            float
Ceres-num_threads                                           int
Ceres-initial_trust_region_radius                           float
Ceres-max_trust_region_radius                               float
Ceres-min_trust_region_radius                               float
Ceres-min_relative_decrease                                 float
Ceres-min_lm_diagonal                                       float
Ceres-max_lm_diagonal                                       float
Ceres-max_num_consecutive_invalid_steps                     int
Ceres-function_tolerance                                    float
Ceres-gradient_tolerance                                    float
Ceres-parameter_tolerance                                   float
Ceres-preconditioner_type                                   str
Ceres-visibility_clustering_type                            str
Ceres-dense_linear_algebra_library_type                     str
Ceres-sparse_linear_algebra_library_type                    str
Ceres-use_explicit_schur_complement                         bool
Ceres-use_postordering                                      bool
Ceres-dynamic_sparsity                                      bool
Ceres-min_linear_solver_iterations                          int
Ceres-max_linear_solver_iterations                          int
Ceres-eta                                                   float
Ceres-jacobi_scaling                                        bool
Ceres-use_inner_iterations                                  bool
Ceres-inner_iteration_tolerance                             float
Ceres-logging_type                                          str
Ceres-minimizer_progress_to_stdout                          bool
Ceres-trust_region_problem_dump_directory                   str
Ceres-trust_region_problem_dump_format_type                 str
Ceres-check_gradients                                       bool
Ceres-gradient_check_relative_precision                     float
Ceres-gradient_check_numeric_derivative_relative_step_size  float
Ceres-update_state_every_iteration                          bool
==========================================================  =====

The Ceres unconstrained solver can be further tweaked through the following ResourceMap parameters; refer to gradient solver options for more details.

===============================================  =====
Key                                              Type
===============================================  =====
Ceres-line_search_direction_type                 str
Ceres-line_search_type                           str
Ceres-nonlinear_conjugate_gradient_type          str
Ceres-max_lbfgs_rank                             int
Ceres-use_approximate_eigenvalue_bfgs_scaling    bool
Ceres-line_search_interpolation_type             str
Ceres-min_line_search_step_size                  float
Ceres-line_search_sufficient_function_decrease   float
Ceres-max_line_search_step_contraction           float
Ceres-min_line_search_step_contraction           float
Ceres-max_num_line_search_step_size_iterations   int
Ceres-max_num_line_search_direction_restarts     int
Ceres-line_search_sufficient_curvature_decrease  float
Ceres-max_line_search_step_expansion             float
Ceres-max_num_iterations                         int
Ceres-max_solver_time_in_seconds                 float
Ceres-function_tolerance                         float
Ceres-gradient_tolerance                         float
Ceres-parameter_tolerance                        float
Ceres-logging_type                               str
Ceres-minimizer_progress_to_stdout               bool
===============================================  =====

Examples

List available algorithms:

>>> import openturns as ot
>>> print(ot.Ceres.GetAlgorithmNames())
[LEVENBERG_MARQUARDT,DOGLEG,...

Solve a least-squares problem:

>>> dim = 2
>>> residualFunction = ot.SymbolicFunction(['x0', 'x1'], ['10*(x1-x0^2)', '1-x0'])
>>> problem = ot.LeastSquaresProblem(residualFunction)
>>> problem.setBounds(ot.Interval([-3.0] * dim, [5.0] * dim))
>>> ot.ResourceMap.AddAsScalar('Ceres-gradient_tolerance', 1e-5)
>>> algo = ot.Ceres(problem, 'LEVENBERG_MARQUARDT')
>>> algo.setStartingPoint([0.0] * dim)
>>> algo.run()
>>> result = algo.getResult()
>>> x_star = result.getOptimalPoint()
>>> y_star = result.getOptimalValue()

Or, solve a general optimization problem:

>>> dim = 4
>>> objective = ot.SymbolicFunction(['x1', 'x2', 'x3', 'x4'], ['(x1-1)^2+(x2-2)^2+(x3-3)^2+(x4-4)^2'])
>>> problem = ot.OptimizationProblem(objective)
>>> ot.ResourceMap.AddAsScalar('Ceres-gradient_tolerance', 1e-5)
>>> algo = ot.Ceres(problem, 'BFGS')
>>> algo.setStartingPoint([0.0] * dim)
>>> algo.run()
>>> result = algo.getResult()
>>> x_star = result.getOptimalPoint()
>>> y_star = result.getOptimalValue()

Methods

GetAlgorithmNames()
    Accessor to the list of algorithms provided, by names.
IsAvailable()
    Ask whether Ceres support is available.
getAlgorithmName()
    Accessor to the algorithm name.
getClassName()
    Accessor to the object's name.
getId()
    Accessor to the object's id.
getMaximumAbsoluteError()
    Accessor to maximum allowed absolute error.
getMaximumConstraintError()
    Accessor to maximum allowed constraint error.
getMaximumEvaluationNumber()
    Accessor to maximum allowed number of evaluations.
getMaximumIterationNumber()
    Accessor to maximum allowed number of iterations.
getMaximumRelativeError()
    Accessor to maximum allowed relative error.
getMaximumResidualError()
    Accessor to maximum allowed residual error.
getName()
    Accessor to the object's name.
getProblem()
    Accessor to optimization problem.
getResult()
    Accessor to optimization result.
getShadowedId()
    Accessor to the object's shadowed id.
getStartingPoint()
    Accessor to starting point.
getVerbose()
    Accessor to the verbosity flag.
getVisibility()
    Accessor to the object's visibility state.
hasName()
    Test if the object is named.
hasVisibleName()
    Test if the object has a distinguishable name.
run()
    Launch the optimization.
setAlgorithmName(algoName)
    Accessor to the algorithm name.
setMaximumAbsoluteError(maximumAbsoluteError)
    Accessor to maximum allowed absolute error.
setMaximumConstraintError(maximumConstraintError)
    Accessor to maximum allowed constraint error.
setMaximumEvaluationNumber(maximumEvaluationNumber)
    Accessor to maximum allowed number of evaluations.
setMaximumIterationNumber(maximumIterationNumber)
    Accessor to maximum allowed number of iterations.
setMaximumRelativeError(maximumRelativeError)
    Accessor to maximum allowed relative error.
setMaximumResidualError(maximumResidualError)
    Accessor to maximum allowed residual error.
setName(name)
    Accessor to the object's name.
setProblem(problem)
    Accessor to optimization problem.
setProgressCallback(*args)
    Set up a progress callback.
setResult(result)
    Accessor to optimization result.
setShadowedId(id)
    Accessor to the object's shadowed id.
setStartingPoint(startingPoint)
    Accessor to starting point.
setStopCallback(*args)
    Set up a stop callback.
setVerbose(verbose)
    Accessor to the verbosity flag.
setVisibility(visible)
    Accessor to the object's visibility state.

__init__(*args)¶
static GetAlgorithmNames()¶

Accessor to the list of algorithms provided, by names.

Returns
names : Description
    List of algorithm names provided, according to its naming convention.

The trust region methods are not able to solve general optimization problems; in that case a warning is printed and the default line search method is used instead.
 
Examples

>>> import openturns as ot
>>> print(ot.Ceres.GetAlgorithmNames())
[LEVENBERG_MARQUARDT,DOGLEG,STEEPEST_DESCENT,NONLINEAR_CONJUGATE_GRADIENT,LBFGS,BFGS]
static IsAvailable()¶

Ask whether Ceres support is available.

Returns
available : bool
    Whether Ceres support is available.
 
 
getAlgorithmName()¶

Accessor to the algorithm name.

Returns
algoName : str
    The identifier of the algorithm.
 
 
getClassName()¶

Accessor to the object’s name.

Returns
class_name : str
    The object class name (object.__class__.__name__).
 
 
getId()¶

Accessor to the object’s id.

Returns
id : int
    Internal unique identifier.
 
 
getMaximumAbsoluteError()¶

Accessor to maximum allowed absolute error.

Returns
maximumAbsoluteError : float
    Maximum allowed absolute error, where the absolute error is defined by \(\epsilon^a_n = \|x_{n+1} - x_n\|_{\infty}\), where \(x_n\) and \(x_{n+1}\) are two consecutive approximations of the optimum.
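As an illustration only (plain Python, not part of the OpenTURNS API; `absolute_error` is a hypothetical helper), the criterion above can be sketched as:

```python
def absolute_error(x_prev, x_next):
    """Infinity-norm distance between two consecutive iterates."""
    return max(abs(b - a) for a, b in zip(x_prev, x_next))

# Two made-up consecutive approximations of the optimum
x_n = [1.0, 2.0]
x_n1 = [1.0005, 2.001]

# A solver would stop once this drops below the maximum allowed absolute error
err = absolute_error(x_n, x_n1)
```

In practice the stopping test is performed internally by the solver; this sketch only mirrors the definition.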
 
 
getMaximumConstraintError()¶

Accessor to maximum allowed constraint error.

Returns
maximumConstraintError : float
    Maximum allowed constraint error, where the constraint error is defined by \(\gamma_n = \|g(x_n)\|_{\infty}\), where \(x_n\) is the current approximation of the optimum and \(g\) is the function that gathers all the equality and inequality constraints (violated values only).
 
 
getMaximumEvaluationNumber()¶

Accessor to maximum allowed number of evaluations.

Returns
N : int
    Maximum allowed number of evaluations.
 
 
getMaximumIterationNumber()¶

Accessor to maximum allowed number of iterations.

Returns
N : int
    Maximum allowed number of iterations.
 
 
getMaximumRelativeError()¶

Accessor to maximum allowed relative error.

Returns
maximumRelativeError : float
    Maximum allowed relative error, where the relative error is defined by \(\epsilon^r_n = \epsilon^a_n / \|x_{n+1}\|_{\infty}\) if \(\|x_{n+1}\|_{\infty} \neq 0\), else \(\epsilon^r_n = -1\).
 
 
getMaximumResidualError()¶

Accessor to maximum allowed residual error.

Returns
maximumResidualError : float
    Maximum allowed residual error, where the residual error is defined by \(\epsilon^{res}_n = \|f(x_{n+1}) - f(x_n)\|_{\infty} / \|f(x_{n+1})\|_{\infty}\) if \(\|f(x_{n+1})\|_{\infty} \neq 0\), else \(\epsilon^{res}_n = -1\).
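For illustration (hypothetical helpers in plain Python, not the OpenTURNS API), the relative and residual criteria of these two accessors can be sketched as:

```python
def relative_error(x_prev, x_next):
    """eps_r = ||x_{n+1} - x_n||_inf / ||x_{n+1}||_inf, or -1 when the iterate is zero."""
    norm_next = max(abs(v) for v in x_next)
    if norm_next == 0.0:
        return -1.0
    return max(abs(b - a) for a, b in zip(x_prev, x_next)) / norm_next

def residual_error(f_prev, f_next):
    """eps_res = |f(x_{n+1}) - f(x_n)| / |f(x_{n+1})|, or -1 when f(x_{n+1}) is zero."""
    if f_next == 0.0:
        return -1.0
    return abs(f_next - f_prev) / abs(f_next)
```

The `-1` sentinel marks the degenerate case where the normalizing value vanishes and the relative quantity is undefined.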
 
 
getName()¶

Accessor to the object’s name.

Returns
name : str
    The name of the object.
 
 
getProblem()¶

Accessor to optimization problem.

Returns
problem : OptimizationProblem
    Optimization problem.
 
 
getResult()¶

Accessor to optimization result.

Returns
result : OptimizationResult
    Result class.
 
 
getShadowedId()¶

Accessor to the object’s shadowed id.

Returns
id : int
    Internal unique identifier.
 
 
getVerbose()¶

Accessor to the verbosity flag.

Returns
verbose : bool
    Verbosity flag state.
 
 
getVisibility()¶

Accessor to the object’s visibility state.

Returns
visible : bool
    Visibility flag.
 
 
hasName()¶

Test if the object is named.

Returns
hasName : bool
    True if the name is not empty.
 
 
hasVisibleName()¶

Test if the object has a distinguishable name.

Returns
hasVisibleName : bool
    True if the name is not empty and not the default one.
 
 
run()¶

Launch the optimization.
setAlgorithmName(algoName)¶

Accessor to the algorithm name.

Parameters
algoName : str
    The identifier of the algorithm.
 
 
setMaximumAbsoluteError(maximumAbsoluteError)¶

Accessor to maximum allowed absolute error.

Parameters
maximumAbsoluteError : float
    Maximum allowed absolute error, where the absolute error is defined by \(\epsilon^a_n = \|x_{n+1} - x_n\|_{\infty}\), where \(x_n\) and \(x_{n+1}\) are two consecutive approximations of the optimum.
 
 
setMaximumConstraintError(maximumConstraintError)¶

Accessor to maximum allowed constraint error.

Parameters
maximumConstraintError : float
    Maximum allowed constraint error, where the constraint error is defined by \(\gamma_n = \|g(x_n)\|_{\infty}\), where \(x_n\) is the current approximation of the optimum and \(g\) is the function that gathers all the equality and inequality constraints (violated values only).
 
 
setMaximumEvaluationNumber(maximumEvaluationNumber)¶

Accessor to maximum allowed number of evaluations.

Parameters
N : int
    Maximum allowed number of evaluations.
 
 
setMaximumIterationNumber(maximumIterationNumber)¶

Accessor to maximum allowed number of iterations.

Parameters
N : int
    Maximum allowed number of iterations.
 
 
setMaximumRelativeError(maximumRelativeError)¶

Accessor to maximum allowed relative error.

Parameters
maximumRelativeError : float
    Maximum allowed relative error, where the relative error is defined by \(\epsilon^r_n = \epsilon^a_n / \|x_{n+1}\|_{\infty}\) if \(\|x_{n+1}\|_{\infty} \neq 0\), else \(\epsilon^r_n = -1\).
 
 
setMaximumResidualError(maximumResidualError)¶

Accessor to maximum allowed residual error.

Parameters
maximumResidualError : float
    Maximum allowed residual error, where the residual error is defined by \(\epsilon^{res}_n = \|f(x_{n+1}) - f(x_n)\|_{\infty} / \|f(x_{n+1})\|_{\infty}\) if \(\|f(x_{n+1})\|_{\infty} \neq 0\), else \(\epsilon^{res}_n = -1\).
 
 
setName(name)¶

Accessor to the object’s name.

Parameters
name : str
    The name of the object.
 
 
setProblem(problem)¶

Accessor to optimization problem.

Parameters
problem : OptimizationProblem
    Optimization problem.
 
 
setProgressCallback(*args)¶

Set up a progress callback.

Can be used to programmatically report the progress of an optimization.

Parameters
callback : callable
    Takes a float as argument, the percentage of progress.
 
Examples

>>> import sys
>>> import openturns as ot
>>> rosenbrock = ot.SymbolicFunction(['x1', 'x2'], ['(1-x1)^2+100*(x2-x1^2)^2'])
>>> problem = ot.OptimizationProblem(rosenbrock)
>>> solver = ot.OptimizationAlgorithm(problem)
>>> solver.setStartingPoint([0, 0])
>>> solver.setMaximumResidualError(1.e-3)
>>> solver.setMaximumEvaluationNumber(10000)
>>> def report_progress(progress):
...     sys.stderr.write('-- progress=' + str(progress) + '%\n')
>>> solver.setProgressCallback(report_progress)
>>> solver.run()
setResult(result)¶

Accessor to optimization result.

Parameters
result : OptimizationResult
    Result class.
 
 
setShadowedId(id)¶

Accessor to the object’s shadowed id.

Parameters
id : int
    Internal unique identifier.
 
 
setStartingPoint(startingPoint)¶

Accessor to starting point.

Parameters
startingPoint : Point
    Starting point.
 
 
setStopCallback(*args)¶

Set up a stop callback.

Can be used to programmatically stop an optimization.

Parameters
callback : callable
    Returns an int deciding whether to stop or continue.
 
Examples

>>> import openturns as ot
>>> rosenbrock = ot.SymbolicFunction(['x1', 'x2'], ['(1-x1)^2+100*(x2-x1^2)^2'])
>>> problem = ot.OptimizationProblem(rosenbrock)
>>> solver = ot.OptimizationAlgorithm(problem)
>>> solver.setStartingPoint([0, 0])
>>> solver.setMaximumResidualError(1.e-3)
>>> solver.setMaximumEvaluationNumber(10000)
>>> def ask_stop():
...     return True
>>> solver.setStopCallback(ask_stop)
>>> solver.run()
setVerbose(verbose)¶

Accessor to the verbosity flag.

Parameters
verbose : bool
    Verbosity flag state.
 
 
setVisibility(visible)¶

Accessor to the object’s visibility state.

Parameters
visible : bool
    Visibility flag.
 
 
 