ParametricFunction

class ParametricFunction(*args)

Parametric function.

Available constructor:

ParametricFunction(function, indices, referencePoint, parametersSet)

It defines a parametric function from function by freezing the variables at the positions given by indices to the corresponding values of referencePoint.

Parameters:
function : Function

Function with full parameters from which the parametric function is built.

indices : sequence of int

Indices of the frozen variables.

referencePoint : sequence of float

Values of the frozen variables. Must be of the same size as indices if parametersSet is True (default); otherwise its size must be the input dimension of function minus the size of indices.

parametersSet : bool

If True (default), the frozen variables are the ones referenced in indices. Otherwise, the frozen variables are the ones in the complementary set of indices.

Examples

>>> import openturns as ot
>>> f = ot.SymbolicFunction(['x', 'y', 'z'], ['x+y', 'x*z+y'])
>>> print(f)
[x,y,z]->[x+y,x*z+y]

Then create another function by setting x=2 and y=3:

>>> g = ot.ParametricFunction(f, [0,1], [2,3])
>>> print(g)
ParametricEvaluation([x,y,z]->[x+y,x*z+y], parameters positions=[0,1], parameters=[x : 2, y : 3], input positions=[2])

Or by setting z=4 using the complementary set flag:

>>> g = ot.ParametricFunction(f, [0,1], [4], False)
>>> print(g.getInputDescription())
[x,y]
>>> print(g)
ParametricEvaluation([x,y,z]->[x+y,x*z+y], parameters positions=[2], parameters=[z : 4], input positions=[0,1])
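
The parametric function can then be evaluated on its remaining free variables. A minimal sketch, reusing the first construction above where x=2 and y=3 are frozen (the expected values are worked out in the comments rather than asserted as exact printed output):

>>> g = ot.ParametricFunction(f, [0, 1], [2, 3])
>>> y1 = g([1.0])   # f(2, 3, 1) = [2+3, 2*1+3], i.e. [5, 5]
>>> y2 = g([2.0])   # f(2, 3, 2) = [5, 7]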

Methods

__call__(*args)

Call self as a function.

draw(*args)

Draw the output of function as a Graph.

getCallsNumber()

Accessor to the number of times the function has been called.

getClassName()

Accessor to the object's name.

getDescription()

Accessor to the description of the inputs and outputs.

getEvaluation()

Accessor to the evaluation function.

getEvaluationCallsNumber()

Accessor to the number of times the function has been called.

getGradient()

Accessor to the gradient function.

getGradientCallsNumber()

Accessor to the number of times the gradient of the function has been called.

getHessian()

Accessor to the hessian function.

getHessianCallsNumber()

Accessor to the number of times the hessian of the function has been called.

getId()

Accessor to the object's id.

getImplementation()

Accessor to the underlying implementation.

getInputDescription()

Accessor to the description of the input vector.

getInputDimension()

Accessor to the dimension of the input vector.

getMarginal(*args)

Accessor to marginal.

getName()

Accessor to the object's name.

getOutputDescription()

Accessor to the description of the output vector.

getOutputDimension()

Accessor to the number of outputs.

getParameter()

Accessor to the parameter values.

getParameterDescription()

Accessor to the parameter description.

getParameterDimension()

Accessor to the dimension of the parameter.

gradient(inP)

Return the transposed Jacobian matrix of the function at a point.

hessian(inP)

Return the hessian of the function at a point.

isLinear()

Accessor to the linearity of the function.

isLinearlyDependent(index)

Accessor to the linearity of the function with regard to a specific variable.

parameterGradient(inP)

Accessor to the gradient against the parameter.

setDescription(description)

Accessor to the description of the inputs and outputs.

setEvaluation(evaluation)

Accessor to the evaluation function.

setGradient(gradient)

Accessor to the gradient function.

setHessian(hessian)

Accessor to the hessian function.

setInputDescription(inputDescription)

Accessor to the description of the input vector.

setName(name)

Accessor to the object's name.

setOutputDescription(outputDescription)

Accessor to the description of the output vector.

setParameter(parameter)

Accessor to the parameter values.

setParameterDescription(description)

Accessor to the parameter description.

__init__(*args)
draw(*args)

Draw the output of function as a Graph.

Available usages:

draw(inputMarg, outputMarg, centralPoint, xiMin, xiMax, ptNb)

draw(firstInputMarg, secondInputMarg, outputMarg, centralPoint, xiMin_xjMin, xiMax_xjMax, ptNbs)

draw(xiMin, xiMax, ptNb)

draw(xiMin_xjMin, xiMax_xjMax, ptNbs)

Parameters:
outputMarg, inputMarg : int, outputMarg, inputMarg \geq 0

outputMarg is the index of the marginal to draw as a function of the marginal with index inputMarg.

firstInputMarg, secondInputMarg : int, firstInputMarg, secondInputMarg \geq 0

In the 2D case, the marginal outputMarg is drawn as a function of the two marginals with indexes firstInputMarg and secondInputMarg.

centralPoint : sequence of float

Central point with dimension equal to the input dimension of the function.

xiMin, xiMax : float

Define the interval where the curve is plotted.

xiMin_xjMin, xiMax_xjMax : sequence of float of dimension 2

In the 2D case, define the intervals where the curves are plotted.

ptNb : int (ptNb > 0) or list of 2 ints (ptNb_k > 0, k=1,2)

The number of points to draw the curves.

Notes

We note f: \Rset^n \rightarrow \Rset^p where \vect{x} = (x_1, \dots, x_n) and f(\vect{x}) = (f_1(\vect{x}), \dots, f_p(\vect{x})), with n\geq 1 and p\geq 1.

  • In the first usage:

Draws graph of the given 1D outputMarg marginal f_k: \Rset^n \rightarrow \Rset as a function of the given 1D inputMarg marginal with respect to the variation of x_i in the interval [x_i^{min}, x_i^{max}], when all the other components of \vect{x} are fixed to the corresponding components of the centralPoint \vect{c}. Then OpenTURNS draws the graph:

y = f_k^{(i)}(s)

for any s \in [x_i^{min}, x_i^{max}] where f_k^{(i)}(s) is defined by the equation:

f_k^{(i)}(s) = f_k(c_1, \dots, c_{i-1}, s,  c_{i+1} \dots, c_n).

  • In the second usage:

Draws the iso-curves of the given outputMarg marginal f_k as a function of the given 2D firstInputMarg and secondInputMarg marginals with respect to the variation of (x_i, x_j) in the interval [x_i^{min}, x_i^{max}] \times [x_j^{min}, x_j^{max}], when all the other components of \vect{x} are fixed to the corresponding components of the centralPoint \vect{c}. Then OpenTURNS draws the graph:

y = f_k^{(i,j)}(s, t)

for any (s, t) \in [x_i^{min}, x_i^{max}] \times [x_j^{min}, x_j^{max}] where f_k^{(i,j)} is defined by the equation:

f_k^{(i,j)}(s,t) = f_k(c_1, \dots, c_{i-1}, s, c_{i+1}, \dots, c_{j-1}, t,  c_{j+1} \dots, c_n).

  • In the third usage:

The same as the first usage but only for function f: \Rset \rightarrow \Rset.

  • In the fourth usage:

The same as the second usage but only for function f: \Rset^2 \rightarrow \Rset.

Examples

>>> import openturns as ot
>>> from openturns.viewer import View
>>> f = ot.SymbolicFunction('x', 'sin(2*pi_*x)*exp(-x^2/2)')
>>> graph = f.draw(-1.2, 1.2, 100)
>>> View(graph).show()
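
The 2D overloads draw iso-curves. A minimal sketch of the fourth usage, under the assumption that the bounds and the numbers of points are passed as sequences of dimension 2 as described above (the function and bounds are chosen here for illustration only):

>>> g2 = ot.SymbolicFunction(['x', 'y'], ['sin(x) * cos(y)'])
>>> graph2 = g2.draw([-3.0, -3.0], [3.0, 3.0], [50, 50])
>>> View(graph2).show()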
getCallsNumber()

Accessor to the number of times the function has been called.

Returns:
calls_number : int

Integer that counts the number of times the function has been called since its creation.
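
A minimal sketch of the counter behaviour (the expected count assumes no other call has been made on the function since its creation):

>>> import openturns as ot
>>> f = ot.SymbolicFunction(['x'], ['x^2'])
>>> y = f([2.0])
>>> n = f.getCallsNumber()   # expected: 1 after a single evaluation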

getClassName()

Accessor to the object’s name.

Returns:
class_name : str

The object class name (object.__class__.__name__).

getDescription()

Accessor to the description of the inputs and outputs.

Returns:
description : Description

Description of the inputs and the outputs.

Examples

>>> import openturns as ot
>>> f = ot.SymbolicFunction(['x1', 'x2'],
...                         ['2 * x1^2 + x1 + 8 * x2 + 4 * cos(x1) * x2 + 6'])
>>> print(f.getDescription())
[x1,x2,y0]
getEvaluation()

Accessor to the evaluation function.

Returns:
function : EvaluationImplementation

The evaluation function.

Examples

>>> import openturns as ot
>>> f = ot.SymbolicFunction(['x1', 'x2'],
...                         ['2 * x1^2 + x1 + 8 * x2 + 4 * cos(x1) * x2 + 6'])
>>> print(f.getEvaluation())
[x1,x2]->[2 * x1^2 + x1 + 8 * x2 + 4 * cos(x1) * x2 + 6]
getEvaluationCallsNumber()

Accessor to the number of times the function has been called.

Returns:
evaluation_calls_number : int

Integer that counts the number of times the function has been called since its creation.

getGradient()

Accessor to the gradient function.

Returns:
gradient : GradientImplementation

The gradient function.

getGradientCallsNumber()

Accessor to the number of times the gradient of the function has been called.

Returns:
gradient_calls_number : int

Integer that counts the number of times the gradient of the Function has been called since its creation. Note that if the gradient is implemented by a finite difference method, the gradient calls number is equal to 0 and the different calls are counted in the evaluation calls number.

getHessian()

Accessor to the hessian function.

Returns:
hessian : HessianImplementation

The hessian function.

getHessianCallsNumber()

Accessor to the number of times the hessian of the function has been called.

Returns:
hessian_calls_number : int

Integer that counts the number of times the hessian of the Function has been called since its creation. Note that if the hessian is implemented by a finite difference method, the hessian calls number is equal to 0 and the different calls are counted in the evaluation calls number.

getId()

Accessor to the object’s id.

Returns:
id : int

Internal unique identifier.

getImplementation()

Accessor to the underlying implementation.

Returns:
impl : Implementation

A copy of the underlying implementation object.

getInputDescription()

Accessor to the description of the input vector.

Returns:
description : Description

Description of the input vector.

Examples

>>> import openturns as ot
>>> f = ot.SymbolicFunction(['x1', 'x2'],
...                          ['2 * x1^2 + x1 + 8 * x2 + 4 * cos(x1) * x2 + 6'])
>>> print(f.getInputDescription())
[x1,x2]
getInputDimension()

Accessor to the dimension of the input vector.

Returns:
inputDim : int

Dimension of the input vector d.

Examples

>>> import openturns as ot
>>> f = ot.SymbolicFunction(['x1', 'x2'],
...                          ['2 * x1^2 + x1 + 8 * x2 + 4 * cos(x1) * x2 + 6'])
>>> print(f.getInputDimension())
2
getMarginal(*args)

Accessor to marginal.

Parameters:
indices : int or list of ints

Set of indices for which the marginal is extracted.

Returns:
marginal : Function

Function corresponding to either f_i or (f_i)_{i \in indices}, with f:\Rset^n \rightarrow \Rset^p and f=(f_0 , \dots, f_{p-1}).
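
Examples

A minimal sketch with a two-output symbolic function (the choice of function and of the extracted index is illustrative only):

>>> import openturns as ot
>>> f = ot.SymbolicFunction(['x1', 'x2'], ['x1 + x2', 'x1 * x2'])
>>> g = f.getMarginal(1)          # keeps only the second output, x1 * x2
>>> d = g.getOutputDimension()    # expected: 1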

getName()

Accessor to the object’s name.

Returns:
name : str

The name of the object.

getOutputDescription()

Accessor to the description of the output vector.

Returns:
description : Description

Description of the output vector.

Examples

>>> import openturns as ot
>>> f = ot.SymbolicFunction(['x1', 'x2'],
...                          ['2 * x1^2 + x1 + 8 * x2 + 4 * cos(x1) * x2 + 6'])
>>> print(f.getOutputDescription())
[y0]
getOutputDimension()

Accessor to the number of outputs.

Returns:
number_outputs : int

Dimension of the output vector d'.

Examples

>>> import openturns as ot
>>> f = ot.SymbolicFunction(['x1', 'x2'],
...                          ['2 * x1^2 + x1 + 8 * x2 + 4 * cos(x1) * x2 + 6'])
>>> print(f.getOutputDimension())
1
getParameter()

Accessor to the parameter values.

Returns:
parameter : Point

The parameter values.
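
Examples

For a ParametricFunction, the parameter holds the frozen values. A minimal sketch reusing the constructor example above, where x=2 and y=3 are frozen (the expected values are given as a comment, not as exact printed output):

>>> import openturns as ot
>>> f = ot.SymbolicFunction(['x', 'y', 'z'], ['x+y', 'x*z+y'])
>>> g = ot.ParametricFunction(f, [0, 1], [2, 3])
>>> p = g.getParameter()   # expected: the frozen values [2, 3]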

getParameterDescription()

Accessor to the parameter description.

Returns:
parameter : Description

The parameter description.

getParameterDimension()

Accessor to the dimension of the parameter.

Returns:
parameterDimension : int

Dimension of the parameter.

gradient(inP)

Return the transposed Jacobian matrix of the function at a point.

Parameters:
point : sequence of float

Point where the transposed Jacobian matrix is calculated.

Returns:
gradient : Matrix

The transposed Jacobian matrix of the function at point.

Examples

>>> import openturns as ot
>>> f = ot.SymbolicFunction(['x1', 'x2'],
...                ['2 * x1^2 + x1 + 8 * x2 + 4 * cos(x1) * x2 + 6','x1 + x2'])
>>> print(f.gradient([3.14, 4]))
[[ 13.5345   1       ]
 [  4.00001  1       ]]
hessian(inP)

Return the hessian of the function at a point.

Parameters:
point : sequence of float

Point where the hessian of the function is calculated.

Returns:
hessian : SymmetricTensor

Hessian of the function at point.

Examples

>>> import openturns as ot
>>> f = ot.SymbolicFunction(['x1', 'x2'],
...                ['2 * x1^2 + x1 + 8 * x2 + 4 * cos(x1) * x2 + 6','x1 + x2'])
>>> print(f.hessian([3.14, 4]))
sheet #0
[[ 20          -0.00637061 ]
 [ -0.00637061  0          ]]
sheet #1
[[  0           0          ]
 [  0           0          ]]
isLinear()

Accessor to the linearity of the function.

Returns:
linear : bool

True if the function is linear, False otherwise.

isLinearlyDependent(index)

Accessor to the linearity of the function with regard to a specific variable.

Parameters:
index : int

The index of the variable with regard to which linearity is evaluated.

Returns:
linear : bool

True if the function is linearly dependent on the specified variable, False otherwise.
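
Examples

A minimal usage sketch of both linearity accessors (the returned booleans depend on whether the underlying implementation detects linearity, so no specific result is asserted here):

>>> import openturns as ot
>>> h = ot.SymbolicFunction(['x', 'y'], ['2 * x + y'])
>>> b0 = h.isLinear()               # linearity of the whole function
>>> b1 = h.isLinearlyDependent(1)   # linearity with regard to the variable of index 1 (y)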

parameterGradient(inP)

Accessor to the gradient against the parameter.

Returns:
gradient : Matrix

The gradient.
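
Examples

For a ParametricFunction, this is the gradient of the outputs with respect to the frozen parameters, evaluated at a point of the remaining inputs. A minimal sketch reusing the constructor example above (the expected partial derivatives are worked out in the comment, not asserted as exact printed output):

>>> import openturns as ot
>>> f = ot.SymbolicFunction(['x', 'y', 'z'], ['x+y', 'x*z+y'])
>>> g = ot.ParametricFunction(f, [0, 1], [2, 3])
>>> dp = g.parameterGradient([1.0])
>>> # with z=1: d(x+y)/dx = 1, d(x+y)/dy = 1, d(x*z+y)/dx = z = 1, d(x*z+y)/dy = 1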

setDescription(description)

Accessor to the description of the inputs and outputs.

Parameters:
description : sequence of str

Description of the inputs and the outputs.

Examples

>>> import openturns as ot
>>> f = ot.SymbolicFunction(['x1', 'x2'],
...                          ['2 * x1^2 + x1 + 8 * x2 + 4 * cos(x1) * x2 + 6'])
>>> print(f.getDescription())
[x1,x2,y0]
>>> f.setDescription(['a','b','y'])
>>> print(f.getDescription())
[a,b,y]
setEvaluation(evaluation)

Accessor to the evaluation function.

Parameters:
function : EvaluationImplementation

The evaluation function.

setGradient(gradient)

Accessor to the gradient function.

Parameters:
gradient_function : GradientImplementation

The gradient function.

Examples

>>> import openturns as ot
>>> f = ot.SymbolicFunction(['x1', 'x2'],
...                          ['2 * x1^2 + x1 + 8 * x2 + 4 * cos(x1) * x2 + 6'])
>>> f.setGradient(ot.CenteredFiniteDifferenceGradient(
...  ot.ResourceMap.GetAsScalar('CenteredFiniteDifferenceGradient-DefaultEpsilon'),
...  f.getEvaluation()))
setHessian(hessian)

Accessor to the hessian function.

Parameters:
hessian_function : HessianImplementation

The hessian function.

Examples

>>> import openturns as ot
>>> f = ot.SymbolicFunction(['x1', 'x2'],
...                         ['2 * x1^2 + x1 + 8 * x2 + 4 * cos(x1) * x2 + 6'])
>>> f.setHessian(ot.CenteredFiniteDifferenceHessian(
...  ot.ResourceMap.GetAsScalar('CenteredFiniteDifferenceHessian-DefaultEpsilon'),
...  f.getEvaluation()))
setInputDescription(inputDescription)

Accessor to the description of the input vector.

Parameters:
description : Description

Description of the input vector.

setName(name)

Accessor to the object’s name.

Parameters:
name : str

The name of the object.

setOutputDescription(outputDescription)

Accessor to the description of the output vector.

Parameters:
description : Description

Description of the output vector.

setParameter(parameter)

Accessor to the parameter values.

Parameters:
parameter : sequence of float

The parameter values.
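
Examples

For a ParametricFunction, setParameter changes the frozen values without rebuilding the function. A minimal sketch reusing the constructor example above (expected evaluations are given as comments, not as exact printed output):

>>> import openturns as ot
>>> f = ot.SymbolicFunction(['x', 'y', 'z'], ['x+y', 'x*z+y'])
>>> g = ot.ParametricFunction(f, [0, 1], [2, 3])
>>> g.setParameter([5.0, 6.0])   # now x=5 and y=6 are frozen
>>> y = g([1.0])                 # f(5, 6, 1) = [5+6, 5*1+6], i.e. [11, 11]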

setParameterDescription(description)

Accessor to the parameter description.

Parameters:
parameter : Description

The parameter description.

Examples using the class

Estimate correlation coefficients

Compare unconditional and conditional histograms

Compute SRC indices confidence intervals

Visualize sensitivity

Create your own distribution given its quantile function

Generate random variates by inverting the CDF

Create a linear least squares model

Create a general linear model metamodel

Taylor approximations

Compute grouped indices for the Ishigami function

Validate a polynomial chaos

Create a polynomial chaos metamodel

Create a polynomial chaos for the Ishigami function: a quick start guide to polynomial chaos

Polynomial chaos expansion cross-validation

Create a sparse chaos by integration

Kriging: propagate uncertainties

Kriging : multiple input dimensions

Advanced kriging

Kriging : choose a trend vector space

Evaluate the mean of a random vector by simulations

Estimate a flooding probability

FAST sensitivity indices

Sobol’ sensitivity indices from chaos

Estimate Sobol’ indices for the Ishigami function by a sampling method: a quick start guide to sensitivity analysis

The HSIC sensitivity indices: the Ishigami model

Example of sensitivity analyses on the wing weight model

Create a parametric function

Create a quadratic function

Logistic growth model

Generate flooding model observations

Calibrate a parametric model: a quick-start guide to calibration

Generate observations of the Chaboche mechanical model

Calibration without observed inputs

Calibration of the logistic model

Calibration of the deflection of a tube

Calibration of the flooding model

Calibration of the Chaboche mechanical model

Bayesian calibration of a computer code

Bayesian calibration of the flooding model

Compute leave-one-out error of a polynomial chaos expansion

Optimization using dlib