PythonFunction

class PythonFunction(*args)

Override Function from Python.

Parameters
inputDim : positive int

Dimension of the input vector

outputDim : positive int

Dimension of the output vector

func : a callable Python object, optional

Called when evaluated on a single point. Default is None.

func_sample : a callable Python object, optional

Called when evaluated on multiple points at once. Default is None.

gradient : a callable Python object, optional

Returns the gradient as a 2-d sequence of float. Default is None (uses finite-difference).

hessian : a callable Python object, optional

Returns the hessian as a 3-d sequence of float. Default is None (uses finite-difference).

n_cpus : int, optional

Number of CPUs on which func should be distributed using multiprocessing. If -1, all available CPUs are used. If 1, no parallelism is used. Default is None.

copy : bool, optional

If True, the input sample is converted into a Python 2-d sequence before calling func_sample. Otherwise, it is passed directly to func_sample. Default is False.

functionLinearity : bool, optional

Indicates whether the function is linear. Default is False.

variablesLinearity : list of bool, optional

Indicates, for each input variable, whether the function is linear with respect to that variable. Default is [False]*inputDim.

Notes

You must provide at least one of the func or func_sample arguments. For efficiency reasons, these functions do not receive a Point or Sample as arguments, but a proxy object which gives access to the internal object data. This object supports indexing, but nothing more. It must be wrapped into another object, for instance a Point in func or a Sample in func_sample, or into a NumPy array for vectorized operations.

Note that if the PythonFunction is distributed (n_cpus > 1), the traceback of an exception raised in a func call is lost due to the way multiprocessing dispatches and handles func calls. This can be worked around by temporarily deactivating n_cpus while developing the wrapper, or by handling the distribution manually with external libraries such as joblib, which keep track of raised exceptions and show the traceback to the user.
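As an illustration of handling the distribution manually, here is a minimal sketch using the standard-library multiprocessing module (not joblib, and not how PythonFunction dispatches internally); the wrapper func below is a hypothetical single-point evaluation. With a plain Pool, an exception raised in a worker is re-raised in the parent with the remote traceback attached.

```python
from multiprocessing import Pool

def func(X):
    # Hypothetical single-point wrapper: Y = 3*x0 - x1.
    return [3.0 * X[0] - X[1]]

def run_sample(sample, n_cpus=2):
    # Distribute single-point evaluations over a process pool; exceptions
    # raised in func propagate to the caller with a usable traceback.
    with Pool(n_cpus) as pool:
        return pool.map(func, sample)

if __name__ == '__main__':
    print(run_sample([[100.0, 100.0], [1.0, 2.0]]))  # [[200.0], [1.0]]
```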

Examples

>>> import openturns as ot
>>> def a_exec(X):
...     Y = [3.0 * X[0] - X[1]]
...     return Y
>>> def a_grad(X):
...     dY = [[3.0], [-1.0]]
...     return dY
>>> f = ot.PythonFunction(2, 1, a_exec, gradient=a_grad)
>>> X = [100.0, 100.0]
>>> Y = f(X)
>>> print(Y)
[200]
>>> dY = f.gradient(X)
>>> print(dY)
[[  3 ]
 [ -1 ]]

Same example, but optimized for best performance with NumPy when the function is going to be evaluated on large samples.

>>> import openturns as ot
>>> import numpy as np
>>> def a_exec_sample(X):
...     Xarray = np.array(X, copy=False)
...     Y = 3.0 * Xarray[:,0] - Xarray[:,1]
...     return np.expand_dims(Y, axis=1)
>>> def a_grad(X):
...     dY = [[3.0], [-1.0]]
...     return dY
>>> f = ot.PythonFunction(2, 1, func_sample=a_exec_sample, gradient=a_grad)
>>> X = [100.0, 100.0]
>>> Y = f(X)
>>> print(Y)
[200]
>>> dY = f.gradient(X)
>>> print(dY)
[[  3 ]
 [ -1 ]]
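When a hessian callable is supplied, it must return a 3-d sequence of float. Judging from the hessian example further below (one "sheet" printed per output), the layout is one square sheet of second derivatives per output component; treat that indexing as an assumption. A plain-Python sketch of the expected shape for the linear example above, where all second derivatives vanish:

```python
input_dim, output_dim = 2, 1

def a_hess(X):
    # Assumed layout: one input_dim x input_dim sheet per output
    # component. The example function is linear, so every second
    # derivative is zero.
    return [[[0.0] * input_dim for _ in range(input_dim)]
            for _ in range(output_dim)]

H = a_hess([100.0, 100.0])
print(len(H), len(H[0]), len(H[0][0]))  # 1 2 2
```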

Methods

__call__(self, *args)

Call self as a function.

draw(self, *args)

Draw the output of function as a Graph.

getCallsNumber(self)

Accessor to the number of times the function has been called.

getClassName(self)

Accessor to the object’s name.

getDescription(self)

Accessor to the description of the inputs and outputs.

getEvaluation(self)

Accessor to the evaluation function.

getEvaluationCallsNumber(self)

Accessor to the number of times the function has been called.

getGradient(self)

Accessor to the gradient function.

getGradientCallsNumber(self)

Accessor to the number of times the gradient of the function has been called.

getHessian(self)

Accessor to the hessian function.

getHessianCallsNumber(self)

Accessor to the number of times the hessian of the function has been called.

getId(self)

Accessor to the object’s id.

getImplementation(self, *args)

Accessor to the underlying implementation.

getInputDescription(self)

Accessor to the description of the input vector.

getInputDimension(self)

Accessor to the dimension of the input vector.

getMarginal(self, *args)

Accessor to marginal.

getName(self)

Accessor to the object’s name.

getOutputDescription(self)

Accessor to the description of the output vector.

getOutputDimension(self)

Accessor to the number of outputs.

getParameter(self)

Accessor to the parameter values.

getParameterDescription(self)

Accessor to the parameter description.

getParameterDimension(self)

Accessor to the dimension of the parameter.

gradient(self, inP)

Return the Jacobian transposed matrix of the function at a point.

hessian(self, inP)

Return the hessian of the function at a point.

parameterGradient(self, inP)

Accessor to the gradient against the parameter.

setDescription(self, description)

Accessor to the description of the inputs and outputs.

setEvaluation(self, evaluation)

Accessor to the evaluation function.

setGradient(self, gradient)

Accessor to the gradient function.

setHessian(self, hessian)

Accessor to the hessian function.

setInputDescription(self, inputDescription)

Accessor to the description of the input vector.

setName(self, name)

Accessor to the object’s name.

setOutputDescription(self, outputDescription)

Accessor to the description of the output vector.

setParameter(self, parameter)

Accessor to the parameter values.

setParameterDescription(self, description)

Accessor to the parameter description.

isLinear

isLinearlyDependent

__init__(self, *args)

Initialize self. See help(type(self)) for accurate signature.

draw(self, *args)

Draw the output of function as a Graph.

Available usages:

draw(inputMarg, outputMarg, CP, xiMin, xiMax, ptNb)

draw(firstInputMarg, secondInputMarg, outputMarg, CP, xiMin_xjMin, xiMax_xjMax, ptNbs)

draw(xiMin, xiMax, ptNb)

draw(xiMin_xjMin, xiMax_xjMax, ptNbs)

Parameters
outputMarg, inputMarg : int, outputMarg, inputMarg \geq 0

outputMarg is the index of the marginal to draw as a function of the marginal with index inputMarg.

firstInputMarg, secondInputMarg : int, firstInputMarg, secondInputMarg \geq 0

In the 2D case, the marginal outputMarg is drawn as a function of the two marginals with indices firstInputMarg and secondInputMarg.

CP : sequence of float

Central point.

xiMin, xiMax : float

Define the interval where the curve is plotted.

xiMin_xjMin, xiMax_xjMax : sequence of float of dimension 2

In the 2D case, define the intervals where the curves are plotted.

ptNb : int, ptNb > 0, or list of 2 ints, ptNb_k > 0, k=1,2

The number of points used to draw the curves.

Notes

We note f: \Rset^n \rightarrow \Rset^p where \vect{x} = (x_1, \dots, x_n) and f(\vect{x}) = (f_1(\vect{x}), \dots,f_p(\vect{x})), with n\geq 1 and p\geq 1.

  • In the first usage:

Draws graph of the given 1D outputMarg marginal f_k: \Rset^n \rightarrow \Rset as a function of the given 1D inputMarg marginal with respect to the variation of x_i in the interval [x_i^{min}, x_i^{max}], when all the other components of \vect{x} are fixed to the corresponding ones of the central point CP. Then OpenTURNS draws the graph: t\in [x_i^{min}, x_i^{max}] \mapsto f_k(CP_1, \dots, CP_{i-1}, t,  CP_{i+1} \dots, CP_n).

  • In the second usage:

Draws the iso-curves of the given outputMarg marginal f_k as a function of the given 2D firstInputMarg and secondInputMarg marginals with respect to the variation of (x_i, x_j) in the interval [x_i^{min}, x_i^{max}] \times [x_j^{min}, x_j^{max}], when all the other components of \vect{x} are fixed to the corresponding ones of the central point CP. Then OpenTURNS draws the graph: (t,u) \in [x_i^{min}, x_i^{max}] \times [x_j^{min}, x_j^{max}] \mapsto f_k(CP_1, \dots, CP_{i-1}, t, CP_{i+1}, \dots, CP_{j-1}, u,  CP_{j+1} \dots, CP_n).

  • In the third usage:

The same as the first usage but only for function f: \Rset \rightarrow \Rset.

  • In the fourth usage:

The same as the second usage but only for function f: \Rset^2 \rightarrow \Rset.

Examples

>>> import openturns as ot
>>> from openturns.viewer import View
>>> f = ot.SymbolicFunction('x', 'sin(2*pi_*x)*exp(-x^2/2)')
>>> graph = f.draw(-1.2, 1.2, 100)
>>> View(graph).show()
getCallsNumber(self)

Accessor to the number of times the function has been called.

Returns
calls_number : int

Integer that counts the number of times the function has been called since its creation.

getClassName(self)

Accessor to the object’s name.

Returns
class_name : str

The object class name (object.__class__.__name__).

getDescription(self)

Accessor to the description of the inputs and outputs.

Returns
description : Description

Description of the inputs and the outputs.

Examples

>>> import openturns as ot
>>> f = ot.SymbolicFunction(['x1', 'x2'],
...                         ['2 * x1^2 + x1 + 8 * x2 + 4 * cos(x1) * x2 + 6'])
>>> print(f.getDescription())
[x1,x2,y0]
getEvaluation(self)

Accessor to the evaluation function.

Returns
function : EvaluationImplementation

The evaluation function.

Examples

>>> import openturns as ot
>>> f = ot.SymbolicFunction(['x1', 'x2'],
...                         ['2 * x1^2 + x1 + 8 * x2 + 4 * cos(x1) * x2 + 6'])
>>> print(f.getEvaluation())
[x1,x2]->[2 * x1^2 + x1 + 8 * x2 + 4 * cos(x1) * x2 + 6]
getEvaluationCallsNumber(self)

Accessor to the number of times the function has been called.

Returns
evaluation_calls_number : int

Integer that counts the number of times the function has been called since its creation.

getGradient(self)

Accessor to the gradient function.

Returns
gradient : GradientImplementation

The gradient function.

getGradientCallsNumber(self)

Accessor to the number of times the gradient of the function has been called.

Returns
gradient_calls_number : int

Integer that counts the number of times the gradient of the Function has been called since its creation. Note that if the gradient is implemented by a finite difference method, the gradient calls number is equal to 0 and the different calls are counted in the evaluation calls number.
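To see why the counters shift: with the default finite-difference gradient, each gradient call is carried out as a set of function evaluations. A plain-Python sketch of a central-difference Jacobian transpose (illustrative only; not OpenTURNS's actual scheme or step size):

```python
def central_fd_gradient(func, x, eps=1e-6):
    # Entry [i][j] approximates d f_j / d x_i (Jacobian transposed),
    # using two extra function evaluations per input component.
    n = len(x)
    p = len(func(x))
    grad = [[0.0] * p for _ in range(n)]
    for i in range(n):
        xp, xm = list(x), list(x)
        xp[i] += eps
        xm[i] -= eps
        yp, ym = func(xp), func(xm)
        for j in range(p):
            grad[i][j] = (yp[j] - ym[j]) / (2.0 * eps)
    return grad

f = lambda X: [3.0 * X[0] - X[1]]
print(central_fd_gradient(f, [100.0, 100.0]))  # approximately [[3.0], [-1.0]]
```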

getHessian(self)

Accessor to the hessian function.

Returns
hessian : HessianImplementation

The hessian function.

getHessianCallsNumber(self)

Accessor to the number of times the hessian of the function has been called.

Returns
hessian_calls_number : int

Integer that counts the number of times the hessian of the Function has been called since its creation. Note that if the hessian is implemented by a finite difference method, the hessian calls number is equal to 0 and the different calls are counted in the evaluation calls number.

getId(self)

Accessor to the object’s id.

Returns
id : int

Internal unique identifier.

getImplementation(self, *args)

Accessor to the underlying implementation.

Returns
impl : Implementation

The implementation class.

getInputDescription(self)

Accessor to the description of the input vector.

Returns
description : Description

Description of the input vector.

Examples

>>> import openturns as ot
>>> f = ot.SymbolicFunction(['x1', 'x2'],
...                          ['2 * x1^2 + x1 + 8 * x2 + 4 * cos(x1) * x2 + 6'])
>>> print(f.getInputDescription())
[x1,x2]
getInputDimension(self)

Accessor to the dimension of the input vector.

Returns
inputDim : int

Dimension of the input vector d.

Examples

>>> import openturns as ot
>>> f = ot.SymbolicFunction(['x1', 'x2'],
...                          ['2 * x1^2 + x1 + 8 * x2 + 4 * cos(x1) * x2 + 6'])
>>> print(f.getInputDimension())
2
getMarginal(self, *args)

Accessor to marginal.

Parameters
indices : int or list of ints

Set of indices for which the marginal is extracted.

Returns
marginal : Function

Function corresponding to either f_i or (f_i)_{i \in indices}, with f:\Rset^n \rightarrow \Rset^p and f=(f_0 , \dots, f_{p-1}).
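Conceptually, a marginal keeps only the selected output components of the function. A plain-Python sketch of the idea (not the OpenTURNS implementation, which returns a full Function with its own gradient and hessian):

```python
def marginal(func, indices):
    # Build a new callable that keeps only the outputs listed in `indices`.
    return lambda X: [func(X)[i] for i in indices]

f = lambda X: [X[0] + X[1], X[0] * X[1], X[0] - X[1]]
g = marginal(f, [0, 2])
print(g([3.0, 2.0]))  # [5.0, 1.0]
```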

getName(self)

Accessor to the object’s name.

Returns
name : str

The name of the object.

getOutputDescription(self)

Accessor to the description of the output vector.

Returns
description : Description

Description of the output vector.

Examples

>>> import openturns as ot
>>> f = ot.SymbolicFunction(['x1', 'x2'],
...                          ['2 * x1^2 + x1 + 8 * x2 + 4 * cos(x1) * x2 + 6'])
>>> print(f.getOutputDescription())
[y0]
getOutputDimension(self)

Accessor to the number of outputs.

Returns
number_outputs : int

Dimension of the output vector d'.

Examples

>>> import openturns as ot
>>> f = ot.SymbolicFunction(['x1', 'x2'],
...                          ['2 * x1^2 + x1 + 8 * x2 + 4 * cos(x1) * x2 + 6'])
>>> print(f.getOutputDimension())
1
getParameter(self)

Accessor to the parameter values.

Returns
parameter : Point

The parameter values.

getParameterDescription(self)

Accessor to the parameter description.

Returns
parameter : Description

The parameter description.

getParameterDimension(self)

Accessor to the dimension of the parameter.

Returns
parameterDimension : int

Dimension of the parameter.

gradient(self, inP)

Return the Jacobian transposed matrix of the function at a point.

Parameters
point : sequence of float

Point where the Jacobian transposed matrix is calculated.

Returns
gradient : Matrix

The Jacobian transposed matrix of the function at point.

Examples

>>> import openturns as ot
>>> f = ot.SymbolicFunction(['x1', 'x2'],
...                ['2 * x1^2 + x1 + 8 * x2 + 4 * cos(x1) * x2 + 6','x1 + x2'])
>>> print(f.gradient([3.14, 4]))
[[ 13.5345   1       ]
 [  4.00001  1       ]]
hessian(self, inP)

Return the hessian of the function at a point.

Parameters
point : sequence of float

Point where the hessian of the function is calculated.

Returns
hessian : SymmetricTensor

Hessian of the function at point.

Examples

>>> import openturns as ot
>>> f = ot.SymbolicFunction(['x1', 'x2'],
...                ['2 * x1^2 + x1 + 8 * x2 + 4 * cos(x1) * x2 + 6','x1 + x2'])
>>> print(f.hessian([3.14, 4]))
sheet #0
[[ 20          -0.00637061 ]
 [ -0.00637061  0          ]]
sheet #1
[[  0           0          ]
 [  0           0          ]]
parameterGradient(self, inP)

Accessor to the gradient against the parameter.

Returns
gradient : Matrix

The gradient.

setDescription(self, description)

Accessor to the description of the inputs and outputs.

Parameters
description : sequence of str

Description of the inputs and the outputs.

Examples

>>> import openturns as ot
>>> f = ot.SymbolicFunction(['x1', 'x2'],
...                          ['2 * x1^2 + x1 + 8 * x2 + 4 * cos(x1) * x2 + 6'])
>>> print(f.getDescription())
[x1,x2,y0]
>>> f.setDescription(['a','b','y'])
>>> print(f.getDescription())
[a,b,y]
setEvaluation(self, evaluation)

Accessor to the evaluation function.

Parameters
function : EvaluationImplementation

The evaluation function.

setGradient(self, gradient)

Accessor to the gradient function.

Parameters
gradient_function : GradientImplementation

The gradient function.

Examples

>>> import openturns as ot
>>> f = ot.SymbolicFunction(['x1', 'x2'],
...                          ['2 * x1^2 + x1 + 8 * x2 + 4 * cos(x1) * x2 + 6'])
>>> f.setGradient(ot.CenteredFiniteDifferenceGradient(
...  ot.ResourceMap.GetAsScalar('CenteredFiniteDifferenceGradient-DefaultEpsilon'),
...  f.getEvaluation()))
setHessian(self, hessian)

Accessor to the hessian function.

Parameters
hessian_function : HessianImplementation

The hessian function.

Examples

>>> import openturns as ot
>>> f = ot.SymbolicFunction(['x1', 'x2'],
...                         ['2 * x1^2 + x1 + 8 * x2 + 4 * cos(x1) * x2 + 6'])
>>> f.setHessian(ot.CenteredFiniteDifferenceHessian(
...  ot.ResourceMap.GetAsScalar('CenteredFiniteDifferenceHessian-DefaultEpsilon'),
...  f.getEvaluation()))
setInputDescription(self, inputDescription)

Accessor to the description of the input vector.

Parameters
description : Description

Description of the input vector.

setName(self, name)

Accessor to the object’s name.

Parameters
name : str

The name of the object.

setOutputDescription(self, outputDescription)

Accessor to the description of the output vector.

Parameters
description : Description

Description of the output vector.

setParameter(self, parameter)

Accessor to the parameter values.

Parameters
parameter : sequence of float

The parameter values.

setParameterDescription(self, description)

Accessor to the parameter description.

Parameters
parameter : Description

The parameter description.