.. DO NOT EDIT.
.. THIS FILE WAS AUTOMATICALLY GENERATED BY SPHINX-GALLERY.
.. TO MAKE CHANGES, EDIT THE SOURCE PYTHON FILE:
.. "auto_meta_modeling/kriging_metamodel/plot_gpr_cantilever_beam.py"
.. LINE NUMBERS ARE GIVEN BELOW.

.. only:: html

    .. note::
        :class: sphx-glr-download-link-note

        :ref:`Go to the end <sphx_glr_download_auto_meta_modeling_kriging_metamodel_plot_gpr_cantilever_beam.py>`
        to download the full example code.

.. rst-class:: sphx-glr-example-title

.. _sphx_glr_auto_meta_modeling_kriging_metamodel_plot_gpr_cantilever_beam.py:

Gaussian Process Regression: cantilever beam model
===================================================

.. GENERATED FROM PYTHON SOURCE LINES 7-9

In this example, we create a Gaussian Process Regression (GPR) metamodel of the :ref:`cantilever beam <use-case-cantilever-beam>`.
We use a squared exponential covariance kernel for the Gaussian process.
In order to estimate the hyperparameters, we use a design of experiments of size 20.

.. GENERATED FROM PYTHON SOURCE LINES 9-16

.. code-block:: Python

    from openturns.usecases import cantilever_beam
    import openturns as ot
    import openturns.experimental as otexp
    import openturns.viewer as viewer

    # sphinx_gallery_thumbnail_number = 3

.. GENERATED FROM PYTHON SOURCE LINES 17-21

Definition of the model
-----------------------

We load the cantilever beam use case:

.. GENERATED FROM PYTHON SOURCE LINES 21-23

.. code-block:: Python

    cb = cantilever_beam.CantileverBeam()

.. GENERATED FROM PYTHON SOURCE LINES 24-25

We define the function which evaluates the output depending on the inputs.

.. GENERATED FROM PYTHON SOURCE LINES 25-27

.. code-block:: Python

    model = cb.model

.. GENERATED FROM PYTHON SOURCE LINES 28-29

Then we define the distribution of the input random vector.

.. GENERATED FROM PYTHON SOURCE LINES 29-31

.. code-block:: Python

    myDistribution = cb.distribution

.. GENERATED FROM PYTHON SOURCE LINES 32-38

Create the design of experiments
--------------------------------

We consider a simple Monte-Carlo sample as a design of experiments.
This is why we generate an input sample using the method :meth:`~openturns.Distribution.getSample` of the distribution.
Then we evaluate the output using the `model` function.

.. GENERATED FROM PYTHON SOURCE LINES 40-44

.. code-block:: Python

    sampleSize_train = 20
    X_train = myDistribution.getSample(sampleSize_train)
    Y_train = model(X_train)

.. GENERATED FROM PYTHON SOURCE LINES 45-46

The following figure presents the distribution of the vertical deviations Y on the training sample.
We observe that large deviations occur less often.

.. GENERATED FROM PYTHON SOURCE LINES 46-52

.. code-block:: Python

    histo = ot.HistogramFactory().build(Y_train).drawPDF()
    histo.setXTitle("Vertical deviation (cm)")
    histo.setTitle("Distribution of the vertical deviation")
    histo.setLegends([""])
    view = viewer.View(histo)

.. image-sg:: /auto_meta_modeling/kriging_metamodel/images/sphx_glr_plot_gpr_cantilever_beam_001.svg
   :alt: Distribution of the vertical deviation
   :srcset: /auto_meta_modeling/kriging_metamodel/images/sphx_glr_plot_gpr_cantilever_beam_001.svg
   :class: sphx-glr-single-img

.. GENERATED FROM PYTHON SOURCE LINES 53-59

Create the metamodel
--------------------

In order to create the GPR metamodel, we first select a constant trend with the :class:`~openturns.ConstantBasisFactory` class.
Then we use a squared exponential covariance kernel.
The :class:`~openturns.SquaredExponential` kernel has one amplitude coefficient and 4 scale coefficients.
This is because this covariance kernel is anisotropic: each of the 4 input variables is associated with its own scale coefficient.
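To make this concrete, here is a minimal sketch (not part of the generated example; it assumes the library defaults of unit scale and unit amplitude) of the parameters carried by a 4-dimensional squared exponential kernel.

.. code-block:: Python

    # Illustrative sketch: a 4-dimensional SquaredExponential kernel carries
    # one scale coefficient per input dimension and a single amplitude.
    k = ot.SquaredExponential(4)
    print(k.getScale())      # 4 scale coefficients, [1,1,1,1] with the assumed defaults
    print(k.getAmplitude())  # 1 amplitude coefficient, [1] with the assumed defaults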
.. GENERATED FROM PYTHON SOURCE LINES 59-62

.. code-block:: Python

    basis = ot.ConstantBasisFactory(cb.dim).build()
    covarianceModel = ot.SquaredExponential(cb.dim)

.. GENERATED FROM PYTHON SOURCE LINES 63-65

Typically, the optimization algorithm is quite good at setting sensible optimization bounds.
In this case, however, the range of the input domain is extreme.

.. GENERATED FROM PYTHON SOURCE LINES 65-68

.. code-block:: Python

    print("Lower and upper bounds of X_train:")
    print(X_train.getMin(), X_train.getMax())

.. rst-class:: sphx-glr-script-out

 .. code-block:: none

    Lower and upper bounds of X_train:
    [6.50289e+10,248.4,2.50959,1.33551e-07] [7.19913e+10,332.67,2.59947,1.57999e-07]

.. GENERATED FROM PYTHON SOURCE LINES 69-71

We need to manually define sensible optimization bounds.
Note that since the amplitude parameter is computed analytically (this is possible when the output dimension is 1), we only need to set bounds on the scale parameter.

.. GENERATED FROM PYTHON SOURCE LINES 71-75

.. code-block:: Python

    scaleOptimizationBounds = ot.Interval(
        [1.0, 1.0, 1.0, 1.0e-10], [1.0e11, 1.0e3, 1.0e1, 1.0e-5]
    )

.. GENERATED FROM PYTHON SOURCE LINES 76-80

Finally, we use the :class:`~openturns.experimental.GaussianProcessFitter` and :class:`~openturns.experimental.GaussianProcessRegression` classes to create the GPR metamodel.
The fitter requires a training sample, a covariance kernel and a trend basis as input arguments.
We need to set the initial scale parameter for the optimization.
The upper bound of the input domain is a sensible choice here.
We must not forget to actually set the optimization bounds defined above.

.. GENERATED FROM PYTHON SOURCE LINES 80-85

.. code-block:: Python

    covarianceModel.setScale(X_train.getMax())
    fitter_algo = otexp.GaussianProcessFitter(X_train, Y_train, covarianceModel, basis)
    fitter_algo.setOptimizationBounds(scaleOptimizationBounds)

.. GENERATED FROM PYTHON SOURCE LINES 86-93

The method :meth:`~openturns.experimental.GaussianProcessFitter.run` of the class :class:`~openturns.experimental.GaussianProcessFitter` optimizes the Gaussian process hyperparameters,
and the method :meth:`~openturns.experimental.GaussianProcessRegression.run` of the class :class:`~openturns.experimental.GaussianProcessRegression` conditions the Gaussian process on the data set.
We can then print the constant trend of the metamodel, estimated using the least squares method.

.. GENERATED FROM PYTHON SOURCE LINES 93-100

.. code-block:: Python

    fitter_algo.run()
    fitter_result = fitter_algo.getResult()
    gpr_algo = otexp.GaussianProcessRegression(fitter_result)
    gpr_algo.run()
    gpr_result = gpr_algo.getResult()
    gprMetamodel = gpr_result.getMetaModel()

.. GENERATED FROM PYTHON SOURCE LINES 101-104

The method :meth:`~openturns.experimental.GaussianProcessRegressionResult.getTrendCoefficients` of the class :class:`~openturns.experimental.GaussianProcessRegressionResult` returns the coefficients of the trend.

.. GENERATED FROM PYTHON SOURCE LINES 104-106

.. code-block:: Python

    print(gpr_result.getTrendCoefficients())

.. rst-class:: sphx-glr-script-out

 .. code-block:: none

    [0.427435]
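Since the trend basis was built with :class:`~openturns.ConstantBasisFactory`, it contains a single constant function, which is why there is exactly one trend coefficient. The following sketch (illustrative, assuming the generic :class:`~openturns.Basis` accessors `getSize` and `build`) makes this explicit.

.. code-block:: Python

    # Illustrative check: the constant basis holds one function phi(x) = 1.
    print(basis.getSize())                       # number of basis functions: 1
    print(basis.build(0)([1.0, 2.0, 3.0, 4.0]))  # the constant function returns [1]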

.. GENERATED FROM PYTHON SOURCE LINES 107-108

We can also print the hyperparameters of the covariance model, which have been estimated by maximizing the likelihood.

.. GENERATED FROM PYTHON SOURCE LINES 108-110

.. code-block:: Python

    gpr_result.getCovarianceModel()

.. rst-class:: sphx-glr-script-out

 .. code-block:: none

    SquaredExponential(scale=[7.19913e+10,332.782,2.59947,1.57999e-07], amplitude=[0.528744])
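Before the formal validation below, we can run a quick, purely illustrative sanity check: evaluate the model and the metamodel at a single point, for instance the mean point of the input distribution, and compare the outputs.

.. code-block:: Python

    # Illustrative sanity check: the metamodel should reproduce the model
    # closely at a point near the center of the training data.
    x = myDistribution.getMean()
    print("model(x)     =", model(x))
    print("metamodel(x) =", gprMetamodel(x))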
.. GENERATED FROM PYTHON SOURCE LINES 111-115

Validate the metamodel
----------------------

We finally want to validate the GPR metamodel.
This is why we generate a validation sample of size 100 and evaluate the output of the model on this sample.

.. GENERATED FROM PYTHON SOURCE LINES 115-119

.. code-block:: Python

    sampleSize_test = 100
    X_test = myDistribution.getSample(sampleSize_test)
    Y_test = model(X_test)

.. GENERATED FROM PYTHON SOURCE LINES 120-121

The class :class:`~openturns.MetaModelValidation` makes the validation easy.
To create it, we use the validation samples and the metamodel.

.. GENERATED FROM PYTHON SOURCE LINES 121-123

.. code-block:: Python

    val = ot.MetaModelValidation(Y_test, gprMetamodel(X_test))

.. GENERATED FROM PYTHON SOURCE LINES 124-125

The method :meth:`~openturns.MetaModelValidation.computeR2Score` computes the R2 score.

.. GENERATED FROM PYTHON SOURCE LINES 125-128

.. code-block:: Python

    R2 = val.computeR2Score()[0]
    print(R2)

.. rst-class:: sphx-glr-script-out

 .. code-block:: none

    0.9999355169838081

.. GENERATED FROM PYTHON SOURCE LINES 129-130

The residuals are the differences between the model outputs and the metamodel outputs on the validation sample.

.. GENERATED FROM PYTHON SOURCE LINES 130-137

.. code-block:: Python

    r = val.getResidualSample()
    graph = ot.HistogramFactory().build(r).drawPDF()
    graph.setXTitle("Residuals (cm)")
    graph.setTitle("Distribution of the residuals")
    graph.setLegends([""])
    view = viewer.View(graph)

.. image-sg:: /auto_meta_modeling/kriging_metamodel/images/sphx_glr_plot_gpr_cantilever_beam_002.svg
   :alt: Distribution of the residuals
   :srcset: /auto_meta_modeling/kriging_metamodel/images/sphx_glr_plot_gpr_cantilever_beam_002.svg
   :class: sphx-glr-single-img

.. GENERATED FROM PYTHON SOURCE LINES 138-140

We observe that negative residuals occur with nearly the same frequency as positive residuals: this is a first sign of good quality.
The method :meth:`~openturns.MetaModelValidation.drawValidation` compares the observed outputs and the metamodel outputs.

.. GENERATED FROM PYTHON SOURCE LINES 140-144

.. code-block:: Python

    graph = val.drawValidation()
    graph.setTitle("R2 = %.2f%%" % (100 * R2))
    view = viewer.View(graph)

.. image-sg:: /auto_meta_modeling/kriging_metamodel/images/sphx_glr_plot_gpr_cantilever_beam_003.svg
   :alt: R2 = 99.99%
   :srcset: /auto_meta_modeling/kriging_metamodel/images/sphx_glr_plot_gpr_cantilever_beam_003.svg
   :class: sphx-glr-single-img

.. GENERATED FROM PYTHON SOURCE LINES 145-146

Display all figures.

.. GENERATED FROM PYTHON SOURCE LINES 146-147

.. code-block:: Python

    viewer.View.ShowAll()

.. _sphx_glr_download_auto_meta_modeling_kriging_metamodel_plot_gpr_cantilever_beam.py:

.. only:: html

  .. container:: sphx-glr-footer sphx-glr-footer-example

    .. container:: sphx-glr-download sphx-glr-download-jupyter

      :download:`Download Jupyter notebook: plot_gpr_cantilever_beam.ipynb <plot_gpr_cantilever_beam.ipynb>`

    .. container:: sphx-glr-download sphx-glr-download-python

      :download:`Download Python source code: plot_gpr_cantilever_beam.py <plot_gpr_cantilever_beam.py>`

    .. container:: sphx-glr-download sphx-glr-download-zip

      :download:`Download zipped: plot_gpr_cantilever_beam.zip <plot_gpr_cantilever_beam.zip>`