# Configuring an arbitrary trend in Kriging

The goal of this example is to show how to configure an arbitrary trend in a Kriging metamodel.

In general, any collection of multivariate functions can be used as the basis argument of a KrigingAlgorithm. In practice, it might not be convenient to create a multivariate basis and this is why we sometimes create it by tensorization of univariate functions. In this example, we first use Legendre polynomials as our univariate functions, then we create an orthogonal polynomial basis corresponding to the input marginals.

For this purpose, we use the cantilever beam example.

## Definition of the model

[1]:

import openturns as ot
ot.RandomGenerator.SetSeed(0)


We define the symbolic function which evaluates the output Y depending on the inputs E, F, L and I.

[2]:

model = ot.SymbolicFunction(["E", "F", "L", "I"], ["F*L^3/(3*E*I)"])
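
Since the deflection formula is simple, we can evaluate it directly in plain Python as a sanity check (the input values below are illustrative, not taken from the example):

```python
# Deflection of a cantilever beam: Y = F*L^3 / (3*E*I).
def deflection(E, F, L, I):
    return F * L**3 / (3.0 * E * I)

y = deflection(E=3.0e7, F=3.0e4, L=255.0, I=400.0)
print(y)  # about 13.82, in the same length unit as L
```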


Then we define the distribution of the input random vector.

[3]:

# Young's modulus E
E = ot.Beta(0.9, 2.27, 2.5e7, 5.0e7) # in N/m^2
E.setDescription("E")
# Load F
F = ot.LogNormal() # in N
F.setParameter(ot.LogNormalMuSigma()([30.e3, 9e3, 15.e3]))
F.setDescription("F")
# Length L
L = ot.Uniform(250., 260.) # in cm
L.setDescription("L")
# Moment of inertia I
I = ot.Beta(2.5, 1.5, 310, 450) # in cm^4
I.setDescription("I")
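
The LogNormalMuSigma parametrization above specifies the mean, standard deviation and location of F. A sketch of the conversion to the native log-scale parameters, assuming the usual shifted-lognormal relations:

```python
import math

mu, sigma, gamma = 30.0e3, 9.0e3, 15.0e3  # mean, std. dev. and location of F

# Shifted lognormal: if X - gamma ~ LogNormal(muLog, sigmaLog), then
# mean(X) = exp(muLog + sigmaLog^2 / 2) + gamma.
sigmaLog = math.sqrt(math.log(1.0 + (sigma / (mu - gamma)) ** 2))
muLog = math.log(mu - gamma) - 0.5 * sigmaLog**2

# Round-trip check: recover the mean from the native parameters.
mean_back = math.exp(muLog + 0.5 * sigmaLog**2) + gamma
print(muLog, sigmaLog, mean_back)
```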


Finally, we define the dependency using a NormalCopula.

[4]:

dimension = 4 # number of inputs
R = ot.CorrelationMatrix(dimension)
R[2, 3] = -0.2
myCopula = ot.NormalCopula(ot.NormalCopula.GetCorrelationFromSpearmanCorrelation(R))
myDistribution = ot.ComposedDistribution([E, F, L, I], myCopula)
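
For a normal copula, GetCorrelationFromSpearmanCorrelation applies the classical relation r = 2 sin(πρ_S/6) elementwise. A quick stdlib check of the value used above:

```python
import math

def pearson_from_spearman(rho_s):
    # Normal-copula relation between Spearman and Pearson correlation.
    return 2.0 * math.sin(math.pi * rho_s / 6.0)

r = pearson_from_spearman(-0.2)
print(r)  # about -0.2091: slightly larger in magnitude than the Spearman value
```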


## Create the design of experiments

We use a simple Monte-Carlo sampling as the design of experiments: we generate an input sample using the getSample method of the distribution, then evaluate the corresponding output using the model function.

[5]:

sampleSize_train = 20
X_train = myDistribution.getSample(sampleSize_train)
Y_train = model(X_train)


## Create the Legendre basis

We first create a Legendre basis of univariate polynomials. In order to convert them into multivariate polynomials, we use a linear enumerate function.

The LegendreFactory class creates Legendre polynomials.

[6]:

univariateFactory = ot.LegendreFactory()


This factory corresponds to the Uniform distribution on the [-1, 1] interval.

[7]:

univariateFactory.getMeasure()

[7]:


Uniform(a = -1, b = 1)

This interval does not correspond to the intervals on which the input marginals are defined (we will come back to this topic later), but it nevertheless yields a valid trend for the kriging metamodel.
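The factor √3 ≈ 1.73205 that appears in the printed polynomials below comes from normalizing the Legendre polynomial P₁(x) = x with respect to the uniform measure on [-1, 1]. A small stdlib check, using our own midpoint-quadrature helper:

```python
import math

def uniform_inner_product(f, g, n=2000):
    # <f, g> with respect to the Uniform(-1, 1) measure (density 1/2),
    # approximated with a midpoint quadrature rule.
    h = 2.0 / n
    return sum(f(-1.0 + (i + 0.5) * h) * g(-1.0 + (i + 0.5) * h) * 0.5 * h
               for i in range(n))

p1 = lambda x: math.sqrt(3.0) * x  # orthonormal Legendre polynomial of degree 1
print(uniform_inner_product(p1, p1))  # ≈ 1.0: unit norm under the uniform measure
```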

[8]:

polyColl = [univariateFactory]*dimension

[9]:

enumerateFunction = ot.LinearEnumerateFunction(dimension)
productBasis = ot.OrthogonalProductPolynomialFactory(polyColl, enumerateFunction)
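
The LinearEnumerateFunction walks through multi-indices by increasing total degree. A hypothetical pure-Python re-implementation of this ordering, which matches the polynomials printed below:

```python
from itertools import product

def linear_enumerate(dimension, count):
    # Multi-indices grouped by increasing total degree; within a degree,
    # ordered lexicographically with the first variable dominating.
    result = []
    degree = 0
    while len(result) < count:
        stratum = [m for m in product(range(degree + 1), repeat=dimension)
                   if sum(m) == degree]
        stratum.sort(reverse=True)
        result.extend(stratum)
        degree += 1
    return result[:count]

for m in linear_enumerate(4, 12):
    print(m)  # (0,0,0,0), (1,0,0,0), ..., (2,0,0,0), (1,1,0,0), ...
```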

[10]:

functions = []
numberOfTrendCoefficients = 12
for i in range(numberOfTrendCoefficients):
    multivariatepolynomial = productBasis.build(i)
    print(multivariatepolynomial)
    functions.append(multivariatepolynomial)

1
1.73205 * x0
1.73205 * x1
1.73205 * x2
1.73205 * x3
-1.11803 + 3.3541 * x0^2
(1.73205 * x0) * (1.73205 * x1)
(1.73205 * x0) * (1.73205 * x2)
(1.73205 * x0) * (1.73205 * x3)
-1.11803 + 3.3541 * x1^2
(1.73205 * x1) * (1.73205 * x2)
(1.73205 * x1) * (1.73205 * x3)

[11]:

basis = ot.Basis(functions)


## Create the metamodel

In order to create the kriging metamodel, we use the Legendre trend basis that we just created. Then we use a squared exponential covariance model. Finally, we use the KrigingAlgorithm class to create the kriging metamodel, taking the training sample, the covariance model and the trend basis as input arguments.

[12]:

covarianceModel = ot.SquaredExponential([1.]*dimension, [1.0])

[13]:

algo = ot.KrigingAlgorithm(X_train, Y_train, covarianceModel, basis)
algo.run()
result = algo.getResult()
krigingWithLegendreTrend = result.getMetaModel()


The getTrendCoefficients method returns the coefficients of the trend.

[14]:

result.getTrendCoefficients()

[14]:

[class=Point name=Unnamed dimension=12 values=[11.5089,-1.26338,1.83793,0.183117,-0.471548,0.103725,-0.193355,-0.0308367,0.0220305,0.000673478,0.0361209,-0.0816526]]


We see that the number of coefficients in the trend corresponds to the number of functions in the basis.

[15]:

result.getCovarianceModel()

[15]:


SquaredExponential(scale=[0.0100001,1.60074,1.07073,0.01], amplitude=[0.0653644])

The SquaredExponential model has one amplitude coefficient and 4 scale coefficients. This is because this covariance model is anisotropic: each of the 4 input variables is associated with its own scale coefficient.
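
Under the convention used by OpenTURNS, the squared exponential kernel is C(s, t) = σ² exp(−½ Σᵢ (sᵢ−tᵢ)²/θᵢ²), with one scale θᵢ per input. A stdlib sketch illustrating the anisotropy, with the scale values rounded from the output above:

```python
import math

def squared_exponential(s, t, scale, amplitude=1.0):
    # Anisotropic squared exponential kernel: each coordinate has its own scale.
    z = sum(((si - ti) / th) ** 2 for si, ti, th in zip(s, t, scale))
    return amplitude**2 * math.exp(-0.5 * z)

scale = [0.01, 1.6, 1.07, 0.01]  # one correlation length per input
print(squared_exponential([0, 0, 0, 0], [0, 0, 0, 0], scale))    # 1.0 at zero lag
print(squared_exponential([0, 0, 0, 0], [0, 0.8, 0, 0], scale))  # decays with distance
```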

## Create an orthogonal multivariate polynomial factory

In order to create a Legendre basis which better corresponds to the input marginals, we could consider the orthogonal basis associated with uniform marginals. To compute the bounds of these uniform distributions, we may consider the 1% and 99% quantiles of each marginal.

There is, however, a simpler way to proceed. We can directly create the orthogonal polynomial basis corresponding to the input marginals. This is the method we would use in a polynomial chaos expansion.

We first create the polynomial basis which corresponds to the inputs.

[16]:

multivariateBasis = ot.OrthogonalProductPolynomialFactory([E, F, L, I])


Then we create the multivariate basis which has maximum degree equal to 2.

[17]:

totalDegree = 2
enumerateFunction = multivariateBasis.getEnumerateFunction()
numberOfTrendCoefficients = enumerateFunction.getStrataCumulatedCardinal(totalDegree)
numberOfTrendCoefficients

[17]:

15
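
With a linear enumeration, the cumulated cardinal of the first p+1 strata is the number of multivariate monomials of total degree at most p in d variables, i.e. binom(d+p, p). A quick stdlib check of the value above:

```python
from math import comb

dimension, totalDegree = 4, 2
# Number of multivariate monomials of total degree <= totalDegree.
print(comb(dimension + totalDegree, totalDegree))  # 15
```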

[18]:

functions = []
for i in range(numberOfTrendCoefficients):
    multivariatepolynomial = multivariateBasis.build(i)
    functions.append(multivariatepolynomial)


The polynomials built this way are orthonormal with respect to the corresponding input marginals, instead of being Legendre polynomials for every input.

[19]:

basis = ot.Basis(functions)

[20]:

algo = ot.KrigingAlgorithm(X_train, Y_train, covarianceModel, basis)
algo.run()
result = algo.getResult()
krigingWithMarginalTrend = result.getMetaModel()


The getTrendCoefficients method returns the coefficients of the trend.

[21]:

result.getTrendCoefficients()

[21]:

[class=Point name=Unnamed dimension=15 values=[11.4567,-1.23651,1.81254,0.161436,-0.450257,0.096574,-0.184094,-0.0354149,0.0113673,-0.0137544,0.0311788,-0.0575616,0.0153745,-0.0264187,0.0251764]]


## Conclusion

The trend that we have configured corresponds to the basis that we would have used in a full polynomial chaos computed with least squares.

Other extensions of this work would be:

- to use a Fourier basis with FourierSeriesFactory,
- to use wavelets with HaarWaveletFactory,
- or to use any other univariate factory.