Function

class Function(*args)

Function base class.

Methods

draw(*args)

Draw the output of function as a Graph.

drawCrossCuts(*args)

Draw the 2D and 1D cross cuts of a 1D output function as a GridLayout.

getCallsNumber()

Accessor to the number of direct calls to the function.

getClassName()

Accessor to the object's class name.

getDescription()

Accessor to the description of the inputs and outputs.

getEvaluation()

Accessor to the evaluation function.

getEvaluationCallsNumber()

Accessor to the number of times the evaluation of the function has been called.

getGradient()

Accessor to the gradient function.

getGradientCallsNumber()

Accessor to the number of times the gradient of the function has been called.

getHessian()

Accessor to the hessian function.

getHessianCallsNumber()

Accessor to the number of times the hessian of the function has been called.

getId()

Accessor to the object's id.

getImplementation()

Accessor to the underlying implementation.

getInputDescription()

Accessor to the description of the input vector.

getInputDimension()

Accessor to the dimension of the input vector.

getMarginal(*args)

Accessor to marginal.

getName()

Accessor to the object's name.

getOutputDescription()

Accessor to the description of the output vector.

getOutputDimension()

Accessor to the number of outputs.

getParameter()

Accessor to the parameter values.

getParameterDescription()

Accessor to the parameter description.

getParameterDimension()

Accessor to the dimension of the parameter.

gradient(inP)

Return the Jacobian transposed matrix of the function at a point.

hessian(inP)

Return the hessian of the function at a point.

isLinear()

Accessor to the linearity of the function.

isLinearlyDependent(index)

Accessor to the linearity of the function with regard to a specific variable.

parameterGradient(inP)

Accessor to the gradient against the parameter.

setDescription(description)

Accessor to the description of the inputs and outputs.

setEvaluation(evaluation)

Accessor to the evaluation function.

setGradient(gradient)

Accessor to the gradient function.

setHessian(hessian)

Accessor to the hessian function.

setInputDescription(inputDescription)

Accessor to the description of the input vector.

setName(name)

Accessor to the object's name.

setOutputDescription(outputDescription)

Accessor to the description of the output vector.

setParameter(parameter)

Accessor to the parameter values.

setParameterDescription(description)

Accessor to the parameter description.

setStopCallback(callBack[, state])

Set up a stop callback.

Notes

A function acts on points to produce points: f: \Rset^d \rightarrow \Rset^{d'}.

A function can also evaluate its gradient and its hessian when they are mathematically defined.

Examples

Create a Function from a list of analytical formulas and descriptions of the input vector and the output vector:

>>> import openturns as ot
>>> f = ot.SymbolicFunction(['x0', 'x1'],
...                         ['x0 + x1', 'x0 - x1'])
>>> print(f([1, 2]))
[3,-1]

Create a Function from strings:

>>> import openturns as ot
>>> f = ot.SymbolicFunction('x', '2.0*sqrt(x)')
>>> print(f(([16],[4])))
    [ y0 ]
0 : [ 8  ]
1 : [ 4  ]

Create a Function from a Python function:

>>> def a_function(X):
...     return [X[0] + X[1]]
>>> f = ot.PythonFunction(2, 1, a_function)
>>> print(f(((10, 5),(6, 7))))
    [ y0 ]
0 : [ 15 ]
1 : [ 13 ]

See PythonFunction for further details.

Create a Function from a Python class:

>>> class FUNC(ot.OpenTURNSPythonFunction):
...     def __init__(self):
...         super(FUNC, self).__init__(2, 1)
...         self.setInputDescription(['R', 'S'])
...         self.setOutputDescription(['T'])
...     def _exec(self, X):
...         Y = [X[0] + X[1]]
...         return Y
>>> F = FUNC()
>>> func = ot.Function(F)
>>> print(func((1.0, 2.0)))
[3]

See PythonFunction for further details.

Create a Function from another Function:

>>> f = ot.SymbolicFunction(ot.Description.BuildDefault(4, 'x'),
...                         ['x0', 'x0 + x1', 'x0 + x2 + x3'])

Then create another function by setting x1=4 and x3=10:

>>> g = ot.ParametricFunction(f, [3, 1], [10.0, 4.0], True)
>>> print(g.getInputDescription())
[x0,x2]
>>> print(g((1, 2)))
[1,5,13]

Or by setting x0=6 and x2=5:

>>> g = ot.ParametricFunction(f, [3, 1], [6.0, 5.0], False)
>>> print(g.getInputDescription())
[x3,x1]
>>> print(g((1, 2)))
[6,8,12]

Create a Function from another Function and by using a comparison operator:

>>> analytical = ot.SymbolicFunction(['x0','x1'], ['x0 + x1'])
>>> indicator = ot.IndicatorFunction(ot.LevelSet(analytical, ot.Less(), 0.0))
>>> print(indicator([2, 3]))
[0]
>>> print(indicator([2, -3]))
[1]
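The level-set indicator above can be mimicked in plain Python (an illustrative sketch with hypothetical helper names, not the OpenTURNS implementation): the indicator returns 1 when the comparison against the threshold holds and 0 otherwise.

```python
# Hypothetical sketch of an indicator of the level set {x : g(x) < threshold}.
def indicator_function(g, threshold=0.0):
    def indicator(x):
        return [1.0 if g(x) < threshold else 0.0]
    return indicator

g = lambda x: x[0] + x[1]       # same function as the example above
ind = indicator_function(g)
# ind([2, 3]) -> [0.0], ind([2, -3]) -> [1.0]
```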

Create a Function from a collection of functions:

>>> functions = list()
>>> functions.append(ot.SymbolicFunction(['x1', 'x2', 'x3'],
...                                      ['x1^2 + x2', 'x1 + x2 + x3']))
>>> functions.append(ot.SymbolicFunction(['x1', 'x2', 'x3'],
...                                      ['x1 + 2 * x2 + x3', 'x1 + x2 - x3']))
>>> function = ot.AggregatedFunction(functions)
>>> print(function([1.0, 2.0, 3.0]))
[3,6,8,0]

Create a Function which is the linear combination linComb of the functions defined in functionCollection with scalar weights defined in scalarCoefficientColl:

If functionCollection = (f_1, \hdots, f_N) where \forall 1 \leq i \leq N, f_i: \Rset^n \rightarrow \Rset^{p}, and scalarCoefficientColl = (c_1, \hdots, c_N) \in \Rset^N, then the linear combination is:

linComb: \left|\begin{array}{rcl}
              \Rset^n & \rightarrow & \Rset^{p} \\
              \vect{X} & \mapsto & \displaystyle \sum_i c_if_i (\vect{X})
          \end{array}\right.

>>> function2 = ot.LinearCombinationFunction(functions, [2.0, 4.0])
>>> print(function2([1.0, 2.0, 3.0]))
[38,12]

Create a Function which is the linear combination vectLinComb of the scalar functions defined in scalarFunctionCollection with vectorial weights defined in vectorCoefficientColl:

If scalarFunctionCollection = (f_1, \hdots, f_N) where \forall 1 \leq i \leq N, f_i: \Rset^n \rightarrow \Rset, and vectorCoefficientColl = (\vect{c}_1, \hdots, \vect{c}_N) where \forall 1 \leq i \leq N, \vect{c}_i \in \Rset^p, then the linear combination is:

vectLinComb: \left|\begin{array}{rcl}
                 \Rset^n & \rightarrow & \Rset^{p} \\
                 \vect{X} & \mapsto & \displaystyle \sum_i \vect{c}_if_i (\vect{X})
             \end{array}\right.

>>> functions=list()
>>> functions.append(ot.SymbolicFunction(['x1', 'x2', 'x3'],
...                                      ['x1 + 2 * x2 + x3']))
>>> functions.append(ot.SymbolicFunction(['x1', 'x2', 'x3'],
...                                      ['x1^2 + x2']))
>>> function2 = ot.DualLinearCombinationFunction(functions, [[2., 4.], [3., 1.]])
>>> print(function2([1, 2, 3]))
[25,35]
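The vector-weighted combination above can be sketched in plain Python (hypothetical helper names, not the OpenTURNS implementation): each scalar function f_i is multiplied by its vector coefficient \vect{c}_i and the results are summed.

```python
# Sketch of vectLinComb(x) = sum_i c_i * f_i(x),
# with scalar functions f_i and vector weights c_i in R^p.
def dual_linear_combination(functions, coefficients):
    def combined(x):
        p = len(coefficients[0])
        result = [0.0] * p
        for f, c in zip(functions, coefficients):
            v = f(x)                  # scalar value of f_i at x
            for k in range(p):
                result[k] += c[k] * v
        return result
    return combined

f1 = lambda x: x[0] + 2 * x[1] + x[2]
f2 = lambda x: x[0] ** 2 + x[1]
g = dual_linear_combination([f1, f2], [[2.0, 4.0], [3.0, 1.0]])
# g([1, 2, 3]) -> [25.0, 35.0], matching the example above
```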

If the input and output dimensions of two functions are equal, then we can perform arithmetic operations on these functions using the +, - and * operators.

>>> g1 = ot.SymbolicFunction(['x1', 'x2', 'x3'],
...                          ['x1 + 2 * x2 + x3'])
>>> g2 = ot.SymbolicFunction(['x1', 'x2', 'x3'],
...                          ['x1^2 + x2'])
>>> g_sum = g1 + g2
>>> g_difference = g1 - g2
>>> g_product = g1 * g2

Create a Function from values of the inputs and the outputs:

>>> inputSample = [[1.0, 1.0], [2.0, 2.0]]
>>> outputSample = [[4.0], [5.0]]
>>> database = ot.DatabaseFunction(inputSample, outputSample)
>>> x = [1.8]*database.getInputDimension()
>>> print(database(x))
[5]
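In this example the database function returns the stored output associated with the closest input point ([1.8, 1.8] is nearest to [2.0, 2.0]). A minimal nearest-neighbour sketch in plain Python (hypothetical helper names, not the OpenTURNS implementation):

```python
# Sketch of a database lookup: return the output of the nearest stored input.
def database_function(input_sample, output_sample):
    def lookup(x):
        def sqdist(a, b):
            return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
        best = min(range(len(input_sample)),
                   key=lambda i: sqdist(x, input_sample[i]))
        return output_sample[best]
    return lookup

db = database_function([[1.0, 1.0], [2.0, 2.0]], [[4.0], [5.0]])
# db([1.8, 1.8]) -> [5.0]
```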

Create a Function which is the composition function f\circ g:

>>> g = ot.SymbolicFunction(['x1', 'x2'],
...                         ['x1 + x2','3 * x1 * x2'])
>>> f = ot.SymbolicFunction(['x1', 'x2'], ['2 * x1 - x2'])
>>> composed = ot.ComposedFunction(f, g)
>>> print(composed([3, 4]))
[-22]
__init__(*args)
draw(*args)

Draw the output of function as a Graph.

Available usages:

draw(inputMarg, outputMarg, centralPoint, xiMin, xiMax, ptNb, scale)

draw(firstInputMarg, secondInputMarg, outputMarg, centralPoint, xiMin_xjMin, xiMax_xjMax, ptNbs, scale, isFilled)

draw(xiMin, xiMax, ptNb, scale)

draw(xiMin_xjMin, xiMax_xjMax, ptNbs, scale)

Parameters:
outputMarg, inputMarg : int, outputMarg, inputMarg \geq 0

outputMarg is the index of the marginal to draw as a function of the marginal with index inputMarg.

firstInputMarg, secondInputMarg : int, firstInputMarg, secondInputMarg \geq 0

In the 2D case, the marginal outputMarg is drawn as a function of the two marginals with indexes firstInputMarg and secondInputMarg.

centralPoint : sequence of float

Central point with dimension equal to the input dimension of the function.

xiMin, xiMax : float

Define the interval where the curve is plotted.

xiMin_xjMin, xiMax_xjMax : sequence of float of dimension 2

In the 2D case, define the intervals where the curves are plotted.

ptNb : int, ptNb > 0

The number of points to draw the curves.

ptNbs : list of int of dimension 2, ptNbs_k > 0, k=1,2

The number of points to draw the contour in the 2D case.

scale : bool

scale indicates whether a logarithmic scale is used for one or both axes:

  • ot.GraphImplementation.NONE or 0: no log scale is used,

  • ot.GraphImplementation.LOGX or 1: log scale is used only for horizontal data,

  • ot.GraphImplementation.LOGY or 2: log scale is used only for vertical data,

  • ot.GraphImplementation.LOGXY or 3: log scale is used for both data.

isFilled : bool

isFilled indicates whether the contour graph is filled or not.

Notes

We note f: \Rset^n \rightarrow \Rset^p where \vect{x} = (x_1, \dots, x_n) and f(\vect{x}) = (f_1(\vect{x}), \dots, f_p(\vect{x})), with n\geq 1 and p\geq 1.

  • In the first usage:

Draws graph of the given 1D outputMarg marginal f_k: \Rset^n \rightarrow \Rset as a function of the given 1D inputMarg marginal with respect to the variation of x_i in the interval [x_i^{min}, x_i^{max}], when all the other components of \vect{x} are fixed to the corresponding components of the centralPoint \vect{c}. Then OpenTURNS draws the graph:

y = f_k^{(i)}(s)

for any s \in [x_i^{min}, x_i^{max}] where f_k^{(i)}(s) is defined by the equation:

f_k^{(i)}(s) = f_k(c_1, \dots, c_{i-1}, s,  c_{i+1} \dots, c_n).

  • In the second usage:

Draws the iso-curves of the given outputMarg marginal f_k as a function of the given 2D firstInputMarg and secondInputMarg marginals with respect to the variation of (x_i, x_j) in the interval [x_i^{min}, x_i^{max}] \times [x_j^{min}, x_j^{max}], when all the other components of \vect{x} are fixed to the corresponding components of the centralPoint \vect{c}. Then OpenTURNS draws the graph:

y = f_k^{(i,j)}(s, t)

for any (s, t) \in [x_i^{min}, x_i^{max}] \times [x_j^{min}, x_j^{max}] where f_k^{(i,j)} is defined by the equation:

f_k^{(i,j)}(s,t) = f_k(c_1, \dots, c_{i-1}, s, c_{i+1}, \dots, c_{j-1}, t,  c_{j+1} \dots, c_n).

  • In the third usage:

The same as the first usage but only for function f: \Rset \rightarrow \Rset.

  • In the fourth usage:

The same as the second usage but only for function f: \Rset^2 \rightarrow \Rset.
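The cut function f_k^{(i)} used by all these usages can be sketched in plain Python (a hypothetical helper, not the OpenTURNS implementation): freeze every coordinate at the central point \vect{c} except x_i, which varies.

```python
# Build the 1D cut s -> f(c_1, ..., c_{i-1}, s, c_{i+1}, ..., c_n).
def cross_cut(f, central_point, i):
    def cut(s):
        x = list(central_point)
        x[i] = s
        return f(x)
    return cut

f = lambda x: x[0] ** 2 + 3.0 * x[1]  # example function on R^2
g = cross_cut(f, [1.0, 2.0], 0)       # s -> f(s, 2)
# g(2.0) -> 10.0, g(0.0) -> 6.0
```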

Examples

>>> import openturns as ot
>>> from openturns.viewer import View
>>> f = ot.SymbolicFunction('x', 'sin(2*pi_*x)*exp(-x^2/2)')
>>> graph = f.draw(-1.2, 1.2, 100)
>>> View(graph).show()
drawCrossCuts(*args)

Draw the 2D and 1D cross cuts of a 1D output function as a GridLayout.

Parameters:
centralPoint : list of float

Central point with dimension equal to the input dimension of the function.

xMin, xMax : list of float

Define the interval where the curve is plotted.

pointNumber : Indices

The number of points to draw the contours and the curves.

withMonoDimensionalCuts : bool, optional

withMonoDimensionalCuts indicates whether the mono-dimensional cuts are drawn. The default value is specified by the CrossCuts-DefaultWithMonoDimensionalCuts ResourceMap key.

isFilled : bool, optional

isFilled indicates whether the contour graphs are filled. The default value is specified by the Contour-DefaultIsFilled ResourceMap key.

vMin, vMax : float, optional

Define the interval used to build the color map for the contours. If not specified, these values are computed to best fit the graphs. Either specify both values or neither.

Notes

We note f: \Rset^n \rightarrow \Rset^p where \vect{x} = (x_1, \dots, x_n) and f(\vect{x}) = (f_1(\vect{x}), \dots, f_p(\vect{x})), with n\geq 1 and p\geq 1.

In all usages, draw the 1D and 2D cross cuts of f_k: \Rset^n \rightarrow \Rset as a function of all input coordinates for 1D cuts and all couples of coordinates for 2D cuts. Variable coordinates x_i are sampled regularly using ptNb[i] points in the interval [x_i^{min}, x_i^{max}], when all the other components of \vect{x} are fixed to the corresponding components of the centralPoint \vect{c}. In the first usage, vMin and vMax are evaluated as the min and max of all samples of the function value calculated in all cross cuts performed.

For 1D cross cuts the graph shows:

y = f_k^{(i)}(s)

for any s \in [x_i^{min}, x_i^{max}] where f_k^{(i)}(s) is defined by the equation:

f_k^{(i)}(s) = f_k(c_1, \dots, c_{i-1}, s,  c_{i+1} \dots, c_n).

  • For 2D cross cuts:

y = f_k^{(i,j)}(s, t)

for any (s, t) \in [x_i^{min}, x_i^{max}] \times [x_j^{min}, x_j^{max}] where f_k^{(i,j)} is defined by the equation:

f_k^{(i,j)}(s,t) = f_k(c_1, \dots, c_{i-1}, s, c_{i+1}, \dots, c_{j-1}, t,  c_{j+1} \dots, c_n).

Examples

>>> import openturns as ot
>>> from openturns.viewer import View
>>> f = ot.SymbolicFunction(['x0', 'x1', 'x2'], ['sin(1*pi_*x0) + x1 - x2 ^ 2'])
>>> grid = f.drawCrossCuts([0., 0., 0.], [-3., -3, -3], [3, 3, 3], [100, 20, 20], True, True)
>>> View(grid).show()
getCallsNumber()

Accessor to the number of direct calls to the function.

Returns:
calls_number : int

Integer that counts the number of times the function has been called directly through the () operator.

getClassName()

Accessor to the object's class name.

Returns:
class_name : str

The object class name (object.__class__.__name__).

getDescription()

Accessor to the description of the inputs and outputs.

Returns:
description : Description

Description of the inputs and the outputs.

Examples

>>> import openturns as ot
>>> f = ot.SymbolicFunction(['x1', 'x2'],
...                         ['2 * x1^2 + x1 + 8 * x2 + 4 * cos(x1) * x2 + 6'])
>>> print(f.getDescription())
[x1,x2,y0]
getEvaluation()

Accessor to the evaluation function.

Returns:
evaluation : EvaluationImplementation

The evaluation function.

Examples

>>> import openturns as ot
>>> f = ot.SymbolicFunction(['x1', 'x2'],
...                         ['2 * x1^2 + x1 + 8 * x2 + 4 * cos(x1) * x2 + 6'])
>>> print(f.getEvaluation())
[x1,x2]->[2 * x1^2 + x1 + 8 * x2 + 4 * cos(x1) * x2 + 6]
getEvaluationCallsNumber()

Accessor to the number of times the evaluation of the function has been called.

Returns:
evaluation_calls_number : int

Integer that counts the number of times the evaluation of the function has been called since its creation. This may include indirect calls via finite-difference gradient or Hessian.

getGradient()

Accessor to the gradient function.

Returns:
gradient : GradientImplementation

The gradient function.

Examples

Get the gradient formulas when the function is a SymbolicFunction. The indices are (i, j) and the corresponding formula is \left( \dfrac{\partial f_j(\vect{x})}{\partial x_i} \right).

>>> import openturns as ot
>>> f = ot.SymbolicFunction(['x1', 'x2'],
...                         ['2 * x1^2 + x1 + 8 * x2 + 4 * cos(x1) * x2 + 6'])
>>> formula_deriv_x1 = f.getGradient().getImplementation().getFormula(0, 0)
>>> formula_deriv_x2 = f.getGradient().getImplementation().getFormula(1, 0)

getGradientCallsNumber()

Accessor to the number of times the gradient of the function has been called.

Returns:
gradient_calls_number : int

Integer that counts the number of times the gradient of the Function has been called since its creation. Note that if the gradient is implemented by a finite difference method, the gradient calls number is equal to 0 and the different calls are counted in the evaluation calls number.
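The counting behaviour described above can be illustrated with a small plain-Python sketch (hypothetical names, not the OpenTURNS internals): a central finite-difference gradient only ever invokes the evaluation, so each gradient request adds 2n evaluation calls and no native gradient calls.

```python
# Sketch: a finite-difference gradient is implemented purely with
# evaluation calls, so only the evaluation counter increases.
class CountingFunction:
    def __init__(self, f, n):
        self.f = f
        self.n = n
        self.evaluation_calls = 0

    def __call__(self, x):
        self.evaluation_calls += 1
        return self.f(x)

    def gradient(self, x, h=1e-6):
        g = []
        for i in range(self.n):
            xp, xm = list(x), list(x)
            xp[i] += h
            xm[i] -= h
            g.append((self(xp) - self(xm)) / (2.0 * h))  # two evaluations per input
        return g

func = CountingFunction(lambda x: x[0] ** 2 + x[1], 2)
g = func.gradient([1.0, 2.0])  # 2 * 2 = 4 evaluation calls, 0 gradient calls
```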

getHessian()

Accessor to the hessian function.

Returns:
hessian : HessianImplementation

The hessian function.

Examples

Get the hessian formulas when the function is a SymbolicFunction. The indices are (i, j, k) and the corresponding formula is \left( \dfrac{\partial^2 f_k(\vect{x})}{\partial x_i \partial x_j} \right).

>>> import openturns as ot
>>> f = ot.SymbolicFunction(['x1', 'x2'],
...                         ['2 * x1^2 + x1 + 8 * x2 + 4 * cos(x1) * x2 + 6'])
>>> formula_hessian_000 = f.getHessian().getImplementation().getFormula(0, 0, 0)
>>> formula_hessian_010 = f.getHessian().getImplementation().getFormula(0, 1, 0)
>>> formula_hessian_110 = f.getHessian().getImplementation().getFormula(1, 1, 0)

getHessianCallsNumber()

Accessor to the number of times the hessian of the function has been called.

Returns:
hessian_calls_number : int

Integer that counts the number of times the hessian of the Function has been called since its creation. Note that if the hessian is implemented by a finite difference method, the hessian calls number is equal to 0 and the different calls are counted in the evaluation calls number.

getId()

Accessor to the object’s id.

Returns:
id : int

Internal unique identifier.

getImplementation()

Accessor to the underlying implementation.

Returns:
impl : Implementation

A copy of the underlying implementation object.

getInputDescription()

Accessor to the description of the input vector.

Returns:
description : Description

Description of the input vector.

Examples

>>> import openturns as ot
>>> f = ot.SymbolicFunction(['x1', 'x2'],
...                          ['2 * x1^2 + x1 + 8 * x2 + 4 * cos(x1) * x2 + 6'])
>>> print(f.getInputDescription())
[x1,x2]
getInputDimension()

Accessor to the dimension of the input vector.

Returns:
inputDim : int

Dimension of the input vector d.

Examples

>>> import openturns as ot
>>> f = ot.SymbolicFunction(['x1', 'x2'],
...                          ['2 * x1^2 + x1 + 8 * x2 + 4 * cos(x1) * x2 + 6'])
>>> print(f.getInputDimension())
2
getMarginal(*args)

Accessor to marginal.

Parameters:
indices : int or list of ints

Set of indices for which the marginal is extracted.

Returns:
marginal : Function

Function corresponding to either f_i or (f_i)_{i \in indices}, with f:\Rset^n \rightarrow \Rset^p and f=(f_0 , \dots, f_{p-1}).

getName()

Accessor to the object’s name.

Returns:
name : str

The name of the object.

getOutputDescription()

Accessor to the description of the output vector.

Returns:
description : Description

Description of the output vector.

Examples

>>> import openturns as ot
>>> f = ot.SymbolicFunction(['x1', 'x2'],
...                          ['2 * x1^2 + x1 + 8 * x2 + 4 * cos(x1) * x2 + 6'])
>>> print(f.getOutputDescription())
[y0]
getOutputDimension()

Accessor to the number of outputs.

Returns:
number_outputs : int

Dimension of the output vector d'.

Examples

>>> import openturns as ot
>>> f = ot.SymbolicFunction(['x1', 'x2'],
...                          ['2 * x1^2 + x1 + 8 * x2 + 4 * cos(x1) * x2 + 6'])
>>> print(f.getOutputDimension())
1
getParameter()

Accessor to the parameter values.

Returns:
parameter : Point

The parameter values.

getParameterDescription()

Accessor to the parameter description.

Returns:
parameter : Description

The parameter description.

getParameterDimension()

Accessor to the dimension of the parameter.

Returns:
parameterDimension : int

Dimension of the parameter.

gradient(inP)

Return the Jacobian transposed matrix of the function at a point.

Parameters:
point : sequence of float

Point where the Jacobian transposed matrix is calculated.

Returns:
gradient : Matrix

The Jacobian transposed matrix of the function at point.

Notes

Let f : \Rset^\inputDim \rightarrow \Rset^p where the f_i are the component functions. Then, the gradient evaluated at \vect{x} is the matrix with \inputDim rows and p columns defined by:

\nabla f(\vect{x}) & = \left( \dfrac{\partial f_j(\vect{x})}{\partial x_i} \right)_{i,j} \\
 & = \left(
 \begin{array}{lcr}
\dfrac{\partial f_1(\vect{x})}{\partial x_1} & \hdots & \dfrac{\partial f_p(\vect{x})}{\partial x_1}\\
\vdots & \vdots & \vdots \\
\dfrac{\partial f_1(\vect{x})}{\partial x_\inputDim} & \hdots & \dfrac{\partial f_p(\vect{x})}{\partial x_\inputDim}
\end{array}
\right)
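The convention above, entry (i, j) equal to \partial f_j / \partial x_i with \inputDim rows and p columns, can be checked with a small finite-difference sketch in plain Python (a hypothetical helper, not the OpenTURNS implementation):

```python
# Central finite differences for the transposed Jacobian:
# entry (i, j) approximates d f_j / d x_i, so the matrix has
# n rows (inputs) and p columns (outputs).
def transposed_jacobian(f, x, h=1e-6):
    n, p = len(x), len(f(x))
    grad = [[0.0] * p for _ in range(n)]
    for i in range(n):
        xp, xm = list(x), list(x)
        xp[i] += h
        xm[i] -= h
        fp, fm = f(xp), f(xm)
        for j in range(p):
            grad[i][j] = (fp[j] - fm[j]) / (2.0 * h)
    return grad

f = lambda x: [x[0] + x[1], x[0] * x[1]]  # f : R^2 -> R^2
J_T = transposed_jacobian(f, [3.0, 4.0])  # approximately [[1, 4], [1, 3]]
```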

Examples

>>> import openturns as ot
>>> f = ot.SymbolicFunction(['x1', 'x2'],
...                ['2 * x1^2 + x1 + 8 * x2 + 4 * cos(x1) * x2 + 6','x1 + x2'])
>>> print(f.gradient([3.14, 4]))
[[ 13.5345   1       ]
 [  4.00001  1       ]]
hessian(inP)

Return the hessian of the function at a point.

Parameters:
point : sequence of float

Point where the hessian of the function is calculated.

Returns:
hessian : SymmetricTensor

Hessian of the function at point.

Notes

Let f : \Rset^\inputDim \rightarrow \Rset^p where the f_i are the component functions. Then, the hessian evaluated at \vect{x} is a symmetric tensor with p sheets, each composed of \inputDim rows and \inputDim columns. The k-th sheet is defined by:

\left( \dfrac{\partial^2 f_k(\vect{x})}{\partial x_i \partial x_j} \right)_{i,j}
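One sheet of this tensor can be approximated with central finite differences in plain Python (a hypothetical helper, not the OpenTURNS implementation):

```python
# Entry (i, j) of sheet k approximates d^2 f_k / (dx_i dx_j).
def hessian_sheet(f, x, k, h=1e-4):
    n = len(x)

    def fk(y):
        return f(y)[k]

    sheet = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            xpp, xpm, xmp, xmm = list(x), list(x), list(x), list(x)
            xpp[i] += h
            xpp[j] += h
            xpm[i] += h
            xpm[j] -= h
            xmp[i] -= h
            xmp[j] += h
            xmm[i] -= h
            xmm[j] -= h
            sheet[i][j] = (fk(xpp) - fk(xpm) - fk(xmp) + fk(xmm)) / (4.0 * h * h)
    return sheet

f = lambda x: [x[0] ** 2 * x[1]]       # f : R^2 -> R, true hessian [[2*x1, 2*x0], [2*x0, 0]]
H0 = hessian_sheet(f, [1.0, 2.0], 0)   # approximately [[4, 2], [2, 0]]
```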

Examples

>>> import openturns as ot
>>> f = ot.SymbolicFunction(['x1', 'x2'],
...                ['2 * x1^2 + x1 + 8 * x2 + 4 * cos(x1) * x2 + 6','x1 + x2'])
>>> print(f.hessian([3.14, 4]))
sheet #0
[[ 20          -0.00637061 ]
 [ -0.00637061  0          ]]
sheet #1
[[  0           0          ]
 [  0           0          ]]
isLinear()

Accessor to the linearity of the function.

Returns:
linear : bool

True if the function is linear, False otherwise.

isLinearlyDependent(index)

Accessor to the linearity of the function with regard to a specific variable.

Parameters:
index : int

The index of the variable with regard to which linearity is evaluated.

Returns:
linear : bool

True if the function is linearly dependent on the specified variable, False otherwise.

parameterGradient(inP)

Accessor to the gradient against the parameter.

Returns:
gradient : Matrix

The gradient.

setDescription(description)

Accessor to the description of the inputs and outputs.

Parameters:
description : sequence of str

Description of the inputs and the outputs.

Examples

>>> import openturns as ot
>>> f = ot.SymbolicFunction(['x1', 'x2'],
...                          ['2 * x1^2 + x1 + 8 * x2 + 4 * cos(x1) * x2 + 6'])
>>> print(f.getDescription())
[x1,x2,y0]
>>> f.setDescription(['a','b','y'])
>>> print(f.getDescription())
[a,b,y]
setEvaluation(evaluation)

Accessor to the evaluation function.

Parameters:
function : EvaluationImplementation

The evaluation function.

setGradient(gradient)

Accessor to the gradient function.

Parameters:
gradient_function : GradientImplementation

The gradient function.

Examples

>>> import openturns as ot
>>> f = ot.SymbolicFunction(['x1', 'x2'],
...                          ['2 * x1^2 + x1 + 8 * x2 + 4 * cos(x1) * x2 + 6'])
>>> f.setGradient(ot.CenteredFiniteDifferenceGradient(
...  ot.ResourceMap.GetAsScalar('CenteredFiniteDifferenceGradient-DefaultEpsilon'),
...  f.getEvaluation()))
setHessian(hessian)

Accessor to the hessian function.

Parameters:
hessian_function : HessianImplementation

The hessian function.

Examples

>>> import openturns as ot
>>> f = ot.SymbolicFunction(['x1', 'x2'],
...                         ['2 * x1^2 + x1 + 8 * x2 + 4 * cos(x1) * x2 + 6'])
>>> f.setHessian(ot.CenteredFiniteDifferenceHessian(
...  ot.ResourceMap.GetAsScalar('CenteredFiniteDifferenceHessian-DefaultEpsilon'),
...  f.getEvaluation()))
setInputDescription(inputDescription)

Accessor to the description of the input vector.

Parameters:
description : Description

Description of the input vector.

setName(name)

Accessor to the object’s name.

Parameters:
name : str

The name of the object.

setOutputDescription(outputDescription)

Accessor to the description of the output vector.

Parameters:
description : Description

Description of the output vector.

setParameter(parameter)

Accessor to the parameter values.

Parameters:
parameter : sequence of float

The parameter values.

setParameterDescription(description)

Accessor to the parameter description.

Parameters:
parameter : Description

The parameter description.

setStopCallback(callBack, state=None)

Set up a stop callback.

Can be used to programmatically stop an evaluation.

Parameters:
callback : callable

Returns a bool deciding whether to stop or continue.
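The stop-callback idea can be sketched in plain Python (an illustrative sketch with hypothetical names, not the OpenTURNS internals): a wrapper consults the user callback before each evaluation and aborts the computation as soon as the callback returns True.

```python
# Sketch: check a user-supplied stop callback before every evaluation.
class StopRequested(Exception):
    pass

def with_stop_callback(f, callback):
    def wrapped(x):
        if callback():
            raise StopRequested("evaluation stopped by callback")
        return f(x)
    return wrapped

state = {"evaluations": 0}

def stop_after_three():
    return state["evaluations"] >= 3

f = with_stop_callback(lambda x: x[0] * 2.0, stop_after_three)
results = []
try:
    for v in range(10):
        results.append(f([float(v)]))
        state["evaluations"] += 1
except StopRequested:
    pass  # results holds only the values computed before the stop
```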

Examples using the class

Customize your Metropolis-Hastings algorithm

Customize your Metropolis-Hastings algorithm

Bayesian calibration of a computer code

Bayesian calibration of a computer code

Bayesian calibration of the flooding model

Bayesian calibration of the flooding model

Gibbs sampling of the posterior distribution

Gibbs sampling of the posterior distribution

Linear Regression with interval-censored observations

Linear Regression with interval-censored observations

Bayesian calibration of hierarchical fission gas release models

Bayesian calibration of hierarchical fission gas release models

Sampling from an unscaled probability density

Sampling from an unscaled probability density

Posterior sampling using a PythonDistribution

Posterior sampling using a PythonDistribution

Calibration of the Chaboche mechanical model

Calibration of the Chaboche mechanical model

Calibration of the deflection of a tube

Calibration of the deflection of a tube

Calibration of the flooding model

Calibration of the flooding model

Calibration of the logistic model

Calibration of the logistic model

Calibrate a parametric model: a quick-start guide to calibration

Calibrate a parametric model: a quick-start guide to calibration

Calibration without observed inputs

Calibration without observed inputs

Generate observations of the Chaboche mechanical model

Generate observations of the Chaboche mechanical model

Generate flooding model observations

Generate flooding model observations

Estimate a GEV on the Fremantle sea-levels data

Estimate a GEV on the Fremantle sea-levels data

Estimate a GEV on the Port Pirie sea-levels data

Estimate a GEV on the Port Pirie sea-levels data

Estimate a GEV on race times data

Estimate a GEV on race times data

Estimate a GPD on the daily rainfall data

Estimate a GPD on the daily rainfall data

Model a singular multivariate distribution

Model a singular multivariate distribution

Estimate a non stationary covariance function

Estimate a non stationary covariance function

Visualize pairs between two samples

Visualize pairs between two samples

Estimate moments from sample

Estimate moments from sample

Estimate a confidence interval of a quantile

Estimate a confidence interval of a quantile

Sample manipulation

Sample manipulation

Compute squared SRC indices confidence intervals

Compute squared SRC indices confidence intervals

Kolmogorov-Smirnov : get the statistics distribution

Kolmogorov-Smirnov : get the statistics distribution

Kolmogorov-Smirnov : understand the p-value

Kolmogorov-Smirnov : understand the p-value

Function manipulation

Function manipulation

Logistic growth model

Logistic growth model

Create a process sample from a sample

Create a process sample from a sample

Value function

Value function

Vertex value function

Vertex value function

Define a connection function with a field output

Define a connection function with a field output

Create a multivariate basis of functions from scalar multivariable functions

Create a multivariate basis of functions from scalar multivariable functions

Link to a computer code with coupling tools

Link to a computer code with coupling tools

Create an aggregated function

Create an aggregated function

Create a composed function

Create a composed function

Create multivariate functions

Create multivariate functions

Get the gradient and hessian as functions

Get the gradient and hessian as functions

Increase the input dimension of a function

Increase the input dimension of a function

Increase the output dimension of a function

Increase the output dimension of a function

Create a linear combination of functions

Create a linear combination of functions

Create a parametric function

Create a parametric function

Create a Python function

Create a Python function

Create a quadratic function

Create a quadratic function

Defining Python and symbolic functions: a quick start introduction to functions

Defining Python and symbolic functions: a quick start introduction to functions

Create a symbolic function

Create a symbolic function

Getting started

Getting started

Gaussian Process Regression vs KrigingAlgorithm

Gaussian Process Regression vs KrigingAlgorithm

A quick start guide to graphs

A quick start guide to graphs

A quick start guide to contours

A quick start guide to contours

Plot the log-likelihood contours of a distribution

Plot the log-likelihood contours of a distribution

Metamodel of a field function

Metamodel of a field function

Viscous free fall: metamodel of a field function

Viscous free fall: metamodel of a field function

Create a linear least squares model

Create a linear least squares model

Distribution of estimators in linear regression

Distribution of estimators in linear regression

Mixture of experts

Mixture of experts

Export a metamodel

Export a metamodel

Create a general linear model metamodel

Create a general linear model metamodel

Create a linear model

Create a linear model

Over-fitting and model selection

Over-fitting and model selection

Perform stepwise regression

Perform stepwise regression

Taylor approximations

Taylor approximations

Gaussian Process Regression: multiple input dimensions

Gaussian Process Regression: multiple input dimensions

Gaussian Process Regression : quick-start

Gaussian Process Regression : quick-start

Gaussian Process-based active learning for reliability

Gaussian Process-based active learning for reliability

Advanced Gaussian process regression

Advanced Gaussian process regression

Gaussian Process Regression: choose an arbitrary trend

Gaussian Process Regression: choose an arbitrary trend

Gaussian Process Regression: choose a polynomial trend on the beam model

Gaussian Process Regression: choose a polynomial trend on the beam model

Gaussian Process Regression : cantilever beam model

Gaussian Process Regression : cantilever beam model

Gaussian Process Regression: surrogate model with continuous and categorical variables

Gaussian Process Regression: surrogate model with continuous and categorical variables

Gaussian Process Regression: choose a polynomial trend

Gaussian Process Regression: choose a polynomial trend

Gaussian process fitter: configure the optimization solver

Gaussian process fitter: configure the optimization solver

Gaussian Process Regression: use an isotropic covariance kernel

Gaussian Process Regression: use an isotropic covariance kernel

Gaussian process regression: draw the likelihood

Gaussian process regression: draw the likelihood

Gaussian Process Regression : generate trajectories from the metamodel

Gaussian Process Regression : generate trajectories from the metamodel

Gaussian Process Regression: metamodel of the Branin-Hoo function

Gaussian Process Regression: metamodel of the Branin-Hoo function

Example of multi output Gaussian Process Regression on the fire satellite model

Example of multi output Gaussian Process Regression on the fire satellite model

Sequentially adding new points to a Gaussian Process metamodel

Sequentially adding new points to a Gaussian Process metamodel

Gaussian Process Regression: propagate uncertainties

Gaussian Process Regression: propagate uncertainties

Polynomial chaos is sensitive to the degree

Fit a distribution from an input sample

Create a polynomial chaos metamodel by integration on the cantilever beam

Conditional expectation of a polynomial chaos expansion

Polynomial chaos expansion cross-validation

Apply a transform or inverse transform on your polynomial chaos

Validate a polynomial chaos

Create a polynomial chaos for the Ishigami function: a quick start guide to polynomial chaos

Compute grouped indices for the Ishigami function

Compute Sobol’ indices confidence intervals

Plot enumeration rules

Create a polynomial chaos metamodel from a data set

Advanced polynomial chaos construction

Create a full or sparse polynomial chaos expansion

Polynomial chaos exploitation

Integrate a function with the Gauss-Kronrod algorithm

Estimate a multivariate integral with IteratedQuadrature

Iterated Functions System

Compute leave-one-out error of a polynomial chaos expansion

Compute confidence intervals of a regression model from data

Compute confidence intervals of a univariate noisy function

Save/load a study

Control algorithm termination

EfficientGlobalOptimization examples

Min/max search and sensitivity from design

Min/max search using optimization

Optimization using Bonmin

Optimization with constraints

Optimization using dlib

Optimization using NLopt

Multi-objective optimization using Pagmo

Optimization of the Rastrigin test function

Quick start guide to optimization

Create a Joint by Conditioning distribution

Create your own distribution given its quantile function

Create a deconditioned distribution

Transform a distribution

Generate random variates by inverting the CDF

Overview of univariate distribution management

Compare frequentist and Bayesian estimation

Composite random vector

Use the Ratio of Uniforms algorithm to sample a distribution

Add a trend to a process

Use the Box-Cox transformation

Create a mesh

Create a functional basis process

Compare covariance models

Create a process from random vectors and processes

Manipulate stochastic processes

Trend computation

Create a custom covariance model

Estimate moments from Taylor expansions

Compute the L2 error between two functions

LOLA-Voronoi sequential design of experiment

Use the Smolyak quadrature

Create a domain event

Create a threshold event

Use the FORM - SORM algorithms

Using the FORM - SORM algorithms on a nonlinear function

Simulate an Event

Create unions or intersections of events

Estimate a flooding probability

An illustrated example of a FORM probability estimate

Estimate a probability using Line Sampling

Use the FORM algorithm in case of several design points

Non-parametric Adaptive Importance Sampling (NAIS)

Use the post-analytical importance sampling algorithm

Time variant system reliability problem

Specify a simulation algorithm

Exploitation of simulation algorithm results

Test the design point with the Strong Maximum Test

Subset Sampling

Estimate Sobol’ indices on a field to point function

Sobol’ sensitivity indices from chaos

The HSIC sensitivity indices: the Ishigami model

Use the ANCOVA indices

Parallel coordinates graph as sensitivity tool

Estimate Sobol’ indices for a function with multivariate output

Example of sensitivity analyses on the wing weight model