.. DO NOT EDIT.
.. THIS FILE WAS AUTOMATICALLY GENERATED BY SPHINX-GALLERY.
.. TO MAKE CHANGES, EDIT THE SOURCE PYTHON FILE:
.. "auto_meta_modeling/kriging_metamodel/plot_gpr_beam_arbitrary_trend.py"
.. LINE NUMBERS ARE GIVEN BELOW.

.. only:: html

    .. note::
        :class: sphx-glr-download-link-note

        :ref:`Go to the end <sphx_glr_download_auto_meta_modeling_kriging_metamodel_plot_gpr_beam_arbitrary_trend.py>`
        to download the full example code.

.. rst-class:: sphx-glr-example-title

.. _sphx_glr_auto_meta_modeling_kriging_metamodel_plot_gpr_beam_arbitrary_trend.py:

Gaussian Process Regression: choose an arbitrary trend
======================================================

.. GENERATED FROM PYTHON SOURCE LINES 7-21

The goal of this example is to show how to configure an arbitrary trend in a Kriging metamodel.
In the :doc:`/auto_meta_modeling/kriging_metamodel/plot_gpr_choose_trend` and
:doc:`/auto_meta_modeling/kriging_metamodel/plot_gpr_beam_trend` examples, we show how to
configure a polynomial trend.

In general, any collection of multivariate functions can be used as the `basis` argument
of a :class:`~openturns.experimental.GaussianProcessFitter` or a
:class:`~openturns.experimental.GaussianProcessRegression`.
In practice, it might not be convenient to create a multivariate basis directly,
which is why we sometimes create it by tensorization of univariate functions.
In this example, we first use Legendre polynomials as our univariate functions;
then we create an orthogonal polynomial basis corresponding to the input marginals.

For this purpose, we use the :ref:`cantilever beam <use-case-cantilever-beam>` example.

.. GENERATED FROM PYTHON SOURCE LINES 21-26

.. code-block:: Python

    from openturns.usecases import cantilever_beam
    import openturns as ot
    import openturns.experimental as otexp

.. GENERATED FROM PYTHON SOURCE LINES 27-31

Definition of the model
-----------------------

We load the cantilever beam use case.

.. GENERATED FROM PYTHON SOURCE LINES 31-33

.. code-block:: Python

    cb = cantilever_beam.CantileverBeam()

.. GENERATED FROM PYTHON SOURCE LINES 34-35

We load the function (model) which evaluates the output `Y` depending on the inputs.

.. GENERATED FROM PYTHON SOURCE LINES 35-37

.. code-block:: Python

    model = cb.model

.. GENERATED FROM PYTHON SOURCE LINES 38-39

Then we define the distribution of the input random vector.

.. GENERATED FROM PYTHON SOURCE LINES 39-42

.. code-block:: Python

    dimension = cb.dim  # number of inputs
    input_dist = cb.distribution

.. GENERATED FROM PYTHON SOURCE LINES 43-45

Create the design of experiments
--------------------------------

.. GENERATED FROM PYTHON SOURCE LINES 47-50

We consider a simple Monte-Carlo sampling as a design of experiments:
we generate an input sample using the `getSample` method of the distribution,
then evaluate the output using the `model` function.

.. GENERATED FROM PYTHON SOURCE LINES 50-54

.. code-block:: Python

    sampleSize_train = 20
    X_train = input_dist.getSample(sampleSize_train)
    Y_train = model(X_train)

.. GENERATED FROM PYTHON SOURCE LINES 55-61

Create the Legendre basis
-------------------------

We first create a Legendre basis of univariate polynomials.

The :class:`~openturns.LegendreFactory` class creates Legendre polynomials.

.. GENERATED FROM PYTHON SOURCE LINES 61-63

.. code-block:: Python

    univariateFactory = ot.LegendreFactory()
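As a quick illustration (this snippet is not part of the generated example), we can
print the first univariate polynomials produced by the factory: its `build` method
returns the orthonormalized polynomial of a given degree.

.. code-block:: Python

    # Print the first three univariate Legendre polynomials:
    # build(i) returns the orthonormalized polynomial of degree i.
    for i in range(3):
        print(univariateFactory.build(i))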
.. GENERATED FROM PYTHON SOURCE LINES 64-65

These polynomials are orthogonal with respect to the `Uniform` distribution on :math:`[-1, 1]`, as we can see.

.. GENERATED FROM PYTHON SOURCE LINES 65-67

.. code-block:: Python

    univariateFactory.getMeasure()

.. rst-class:: sphx-glr-script-out

 .. code-block:: none

    Uniform
    • name=Uniform
    • dimension=1
    • weight=1
    • range=[-1, 1]
    • description=[X0]
    • isParallel=true
    • isCopula=false
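As a sanity check (a sketch, not part of the generated example), we can estimate the
inner product of two distinct polynomials by Monte Carlo sampling from this measure;
by orthogonality, the estimate should be close to zero.

.. code-block:: Python

    # Monte Carlo estimate of E[P1(X) * P2(X)] with X ~ Uniform(-1, 1):
    # orthogonality implies that this expectation is exactly zero.
    p1 = univariateFactory.build(1)
    p2 = univariateFactory.build(2)
    uniform_sample = ot.Uniform(-1.0, 1.0).getSample(100000)
    estimate = sum(p1(x[0]) * p2(x[0]) for x in uniform_sample) / uniform_sample.getSize()
    print(estimate)  # approximately zero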
.. GENERATED FROM PYTHON SOURCE LINES 68-73

Even though the measure `Uniform([-1, 1])` does not correspond to the input marginal
distributions, these polynomials still provide a consistent trend for the Gaussian
process regression metamodel.

We use the same polynomial family for each input.
The multivariate polynomials are created by tensorization of the univariate polynomials.
The linear enumerate function specifies the order followed to create the multivariate polynomials.

.. GENERATED FROM PYTHON SOURCE LINES 73-77

.. code-block:: Python

    polyColl = [univariateFactory] * dimension
    enumerateFunction = ot.LinearEnumerateFunction(dimension)
    productBasis = ot.OrthogonalProductPolynomialFactory(polyColl, enumerateFunction)
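To see the order in which the tensorized polynomials are enumerated (an illustration,
not part of the generated script), we can apply the enumerate function to the first
flat indices: for each index, it returns the multi-index of marginal degrees of the
corresponding multivariate polynomial.

.. code-block:: Python

    # enumerateFunction(i) returns the multi-index of marginal degrees of the
    # i-th multivariate polynomial, e.g. [1, 0, 0, 0] for the polynomial in x0.
    for i in range(6):
        print(i, enumerateFunction(i))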
.. GENERATED FROM PYTHON SOURCE LINES 78-79

The basis contains the first multivariate polynomials ordered according to the linear enumerate function.

.. GENERATED FROM PYTHON SOURCE LINES 79-86

.. code-block:: Python

    functions = []
    numberOfTrendCoefficients = 12
    for i in range(numberOfTrendCoefficients):
        multivariatepolynomial = productBasis.build(i)
        print(multivariatepolynomial)
        functions.append(multivariatepolynomial)

.. rst-class:: sphx-glr-script-out

 .. code-block:: none

    1
    1.73205 * x0
    1.73205 * x1
    1.73205 * x2
    1.73205 * x3
    -1.11803 + 3.3541 * x0^2
    1.73205 * x0 * 1.73205 * x1
    1.73205 * x0 * 1.73205 * x2
    1.73205 * x0 * 1.73205 * x3
    -1.11803 + 3.3541 * x1^2
    1.73205 * x1 * 1.73205 * x2
    1.73205 * x1 * 1.73205 * x3

.. GENERATED FROM PYTHON SOURCE LINES 87-89

.. code-block:: Python

    basis = ot.Basis(functions)

.. GENERATED FROM PYTHON SOURCE LINES 90-95

Create the Gaussian Process Regression metamodel
------------------------------------------------

In order to create the Gaussian process regression metamodel, we choose the function
basis created previously and a squared exponential covariance model.

.. GENERATED FROM PYTHON SOURCE LINES 95-97

.. code-block:: Python

    covariance_model = ot.SquaredExponential([1.0] * dimension, [1.0])

.. GENERATED FROM PYTHON SOURCE LINES 98-100

First, we estimate a Gaussian process approximating the model with the class
:class:`~openturns.experimental.GaussianProcessFitter`.

.. GENERATED FROM PYTHON SOURCE LINES 100-106

.. code-block:: Python

    algo_fit = otexp.GaussianProcessFitter(X_train, Y_train, covariance_model, basis)
    print("First run: algo GPFitter = ", algo_fit.getOptimizationAlgorithm())
    algo_fit.setOptimizationAlgorithm(ot.TNC())
    algo_fit.run()
    fitter_result = algo_fit.getResult()

.. rst-class:: sphx-glr-script-out

 .. code-block:: none

    First run: algo GPFitter = class=OptimizationAlgorithm implementation=class=Cobyla class=OptimizationAlgorithmImplementation problem=class=OptimizationProblem implementation=class=OptimizationProblemImplementation objective=class=Function name=Unnamed implementation=class=FunctionImplementation name=Unnamed description=[] evaluationImplementation=class=NoEvaluation name=Unnamed gradientImplementation=class=NoGradient name=Unnamed hessianImplementation=class=NoHessian name=Unnamed equality constraint=none inequality constraint=none bounds=none minimization=true dimension=0 startingPoint=class=Point name=Unnamed dimension=0 values=[] maximumIterationNumber=100 maximumCallsNumber=1000 maximumAbsoluteError=1e-05 maximumRelativeError=1e-05 maximumResidualError=1e-05 maximumConstraintError=1e-05 rhoBeg=0.1

.. GENERATED FROM PYTHON SOURCE LINES 107-110

Then, we condition the estimated Gaussian process to make it interpolate the data set
using the class :class:`~openturns.experimental.GaussianProcessRegression`.

.. GENERATED FROM PYTHON SOURCE LINES 110-114

.. code-block:: Python

    gpr_algo = otexp.GaussianProcessRegression(fitter_result)
    gpr_algo.run()
    gpr_result = gpr_algo.getResult()

.. GENERATED FROM PYTHON SOURCE LINES 115-116

We get the Gaussian process regression metamodel.

.. GENERATED FROM PYTHON SOURCE LINES 116-118

.. code-block:: Python

    gpr_metamodel = gpr_result.getMetaModel()
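At this point, one may assess the accuracy of the metamodel on an independent test
sample with :class:`~openturns.MetaModelValidation`. The following sketch is not part
of the generated example; the test sample size of 100 is an arbitrary choice.

.. code-block:: Python

    # Validate the metamodel on an independent test sample: the R2 score
    # should be close to 1 if the metamodel predicts the model accurately.
    X_test = input_dist.getSample(100)
    Y_test = model(X_test)
    validation = ot.MetaModelValidation(Y_test, gpr_metamodel(X_test))
    print(validation.computeR2Score())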
.. GENERATED FROM PYTHON SOURCE LINES 119-122

The method :meth:`~openturns.experimental.GaussianProcessRegressionResult.getTrendCoefficients`
returns the coefficients of the trend.
We see that the number of coefficients in the trend corresponds to the number of functions in the basis.

.. GENERATED FROM PYTHON SOURCE LINES 122-124

.. code-block:: Python

    print(gpr_result.getTrendCoefficients())

.. rst-class:: sphx-glr-script-out

 .. code-block:: none

    [7.28015e-46,8.51624e-35,3.75184e-43,3.21875e-45,1.81292e-52,1.11494e-23,4.38873e-32,3.76531e-34,2.12187e-41,2.17629e-40,1.65929e-42,9.3468e-50]#12

.. GENERATED FROM PYTHON SOURCE LINES 125-128

We get the updated covariance model. The :class:`~openturns.SquaredExponential` model
has one amplitude coefficient and 4 scale coefficients.
This is because this covariance model is anisotropic: each of the 4 input variables
is associated with its own scale coefficient.

.. GENERATED FROM PYTHON SOURCE LINES 128-130

.. code-block:: Python

    print(gpr_result.getCovarianceModel())

.. rst-class:: sphx-glr-script-out

 .. code-block:: none

    SquaredExponential(scale=[1,1,0.17976,0.01], amplitude=[0.0407412])

.. GENERATED FROM PYTHON SOURCE LINES 131-139

Create an orthogonal multivariate polynomial factory
----------------------------------------------------

We now create a multivariate polynomial basis where each marginal polynomial family
is orthogonal to the corresponding marginal input distribution.
We use the class :class:`~openturns.OrthogonalProductPolynomialFactory` created from
the input marginal distributions.
This corresponds to the basis we usually use in the polynomial chaos expansion.

We first create the multivariate polynomial basis as a tensorized product of the
univariate polynomial bases orthogonal to the marginal input distributions.

.. GENERATED FROM PYTHON SOURCE LINES 139-141

.. code-block:: Python

    multivariateBasis = ot.OrthogonalProductPolynomialFactory([cb.E, cb.F, cb.L, cb.II])

.. GENERATED FROM PYTHON SOURCE LINES 142-143

Then we create the multivariate basis which has maximum degree equal to 2.
There are 15 functions contained in the basis.

.. GENERATED FROM PYTHON SOURCE LINES 143-148

.. code-block:: Python

    totalDegree = 2
    enumerateFunction = multivariateBasis.getEnumerateFunction()
    numberOfTrendCoefficients = enumerateFunction.getStrataCumulatedCardinal(totalDegree)
    print("Number of functions in the basis: ", numberOfTrendCoefficients)

.. rst-class:: sphx-glr-script-out

 .. code-block:: none

    Number of functions in the basis: 15

.. GENERATED FROM PYTHON SOURCE LINES 149-155

.. code-block:: Python

    functions = []
    for i in range(numberOfTrendCoefficients):
        # Build the polynomials from the basis orthogonal to the input marginals.
        multivariatepolynomial = multivariateBasis.build(i)
        print(multivariatepolynomial)
        functions.append(multivariatepolynomial)

.. rst-class:: sphx-glr-script-out

 .. code-block:: none

    1
    1.73205 * x0
    1.73205 * x1
    1.73205 * x2
    1.73205 * x3
    -1.11803 + 3.3541 * x0^2
    1.73205 * x0 * 1.73205 * x1
    1.73205 * x0 * 1.73205 * x2
    1.73205 * x0 * 1.73205 * x3
    -1.11803 + 3.3541 * x1^2
    1.73205 * x1 * 1.73205 * x2
    1.73205 * x1 * 1.73205 * x3
    -1.11803 + 3.3541 * x2^2
    1.73205 * x2 * 1.73205 * x3
    -1.11803 + 3.3541 * x3^2

.. GENERATED FROM PYTHON SOURCE LINES 156-158

.. code-block:: Python

    basis = ot.Basis(functions)

.. GENERATED FROM PYTHON SOURCE LINES 159-169

.. code-block:: Python

    algo_fit = otexp.GaussianProcessFitter(X_train, Y_train, covariance_model, basis)
    print("Second run: algo GPFitter = ", algo_fit.getOptimizationAlgorithm())
    algo_fit.setOptimizationAlgorithm(ot.TNC())
    algo_fit.run()
    fitter_result = algo_fit.getResult()
    gpr_algo = otexp.GaussianProcessRegression(fitter_result)
    gpr_algo.run()
    gpr_result = gpr_algo.getResult()
    print(gpr_result.getTrendCoefficients())

.. rst-class:: sphx-glr-script-out

 .. code-block:: none

    Second run: algo GPFitter = class=OptimizationAlgorithm implementation=class=Cobyla class=OptimizationAlgorithmImplementation problem=class=OptimizationProblem implementation=class=OptimizationProblemImplementation objective=class=Function name=Unnamed implementation=class=FunctionImplementation name=Unnamed description=[] evaluationImplementation=class=NoEvaluation name=Unnamed gradientImplementation=class=NoGradient name=Unnamed hessianImplementation=class=NoHessian name=Unnamed equality constraint=none inequality constraint=none bounds=none minimization=true dimension=0 startingPoint=class=Point name=Unnamed dimension=0 values=[] maximumIterationNumber=100 maximumCallsNumber=1000 maximumAbsoluteError=1e-05 maximumRelativeError=1e-05 maximumResidualError=1e-05 maximumConstraintError=1e-05 rhoBeg=0.1
    [7.28015e-46,8.51624e-35,3.75184e-43,3.21875e-45,1.81292e-52,1.11494e-23,4.38873e-32,3.76531e-34,2.12187e-41,2.17629e-40,1.65929e-42,9.3468e-50,1.50988e-44,8.01436e-52,-8.13946e-46]#15

.. GENERATED FROM PYTHON SOURCE LINES 170-181

Conclusion
----------

The trend that we have configured corresponds to the basis that we would have used in a full
polynomial chaos expansion computed with least squares.

Other extensions of this work would be:

* to use a Fourier basis with :class:`~openturns.FourierSeriesFactory`,
* to use wavelets with :class:`~openturns.HaarWaveletFactory`,

or any other univariate factory.


.. _sphx_glr_download_auto_meta_modeling_kriging_metamodel_plot_gpr_beam_arbitrary_trend.py:

.. only:: html

  .. container:: sphx-glr-footer sphx-glr-footer-example

    .. container:: sphx-glr-download sphx-glr-download-jupyter

      :download:`Download Jupyter notebook: plot_gpr_beam_arbitrary_trend.ipynb <plot_gpr_beam_arbitrary_trend.ipynb>`

    .. container:: sphx-glr-download sphx-glr-download-python

      :download:`Download Python source code: plot_gpr_beam_arbitrary_trend.py <plot_gpr_beam_arbitrary_trend.py>`

    .. container:: sphx-glr-download sphx-glr-download-zip

      :download:`Download zipped: plot_gpr_beam_arbitrary_trend.zip <plot_gpr_beam_arbitrary_trend.zip>`