.. DO NOT EDIT.
.. THIS FILE WAS AUTOMATICALLY GENERATED BY SPHINX-GALLERY.
.. TO MAKE CHANGES, EDIT THE SOURCE PYTHON FILE:
.. "auto_meta_modeling/kriging_metamodel/plot_kriging_cantilever_beam_hmat.py"
.. LINE NUMBERS ARE GIVEN BELOW.

.. only:: html

   .. note::
      :class: sphx-glr-download-link-note

      Click :ref:`here <sphx_glr_download_auto_meta_modeling_kriging_metamodel_plot_kriging_cantilever_beam_hmat.py>` to download the full example code

.. rst-class:: sphx-glr-example-title

.. _sphx_glr_auto_meta_modeling_kriging_metamodel_plot_kriging_cantilever_beam_hmat.py:


Kriging the cantilever beam model using HMAT
============================================

.. GENERATED FROM PYTHON SOURCE LINES 6-7

In this example, we create a Kriging metamodel of the :ref:`cantilever beam `.
We use a squared exponential covariance kernel for the Gaussian process.
In order to estimate the hyperparameters, we use a design of experiments of size 20.

.. GENERATED FROM PYTHON SOURCE LINES 10-12

Definition of the model
-----------------------

.. GENERATED FROM PYTHON SOURCE LINES 14-19

.. code-block:: default

    import openturns as ot
    import openturns.viewer as viewer
    from matplotlib import pylab as plt

    ot.Log.Show(ot.Log.NONE)

.. GENERATED FROM PYTHON SOURCE LINES 20-21

We load the cantilever beam use case:

.. GENERATED FROM PYTHON SOURCE LINES 21-24

.. code-block:: default

    from openturns.usecases import cantilever_beam

    cb = cantilever_beam.CantileverBeam()

.. GENERATED FROM PYTHON SOURCE LINES 25-26

We define the function which evaluates the output depending on the inputs.

.. GENERATED FROM PYTHON SOURCE LINES 26-28

.. code-block:: default

    model = cb.model

.. GENERATED FROM PYTHON SOURCE LINES 29-30

Then we define the distribution of the input random vector.

.. GENERATED FROM PYTHON SOURCE LINES 30-33

.. code-block:: default

    dim = cb.dim  # number of inputs
    myDistribution = cb.distribution

.. GENERATED FROM PYTHON SOURCE LINES 34-36

Create the design of experiments
--------------------------------

.. GENERATED FROM PYTHON SOURCE LINES 38-39

We consider a simple Monte-Carlo sample as a design of experiments.
This is why we generate an input sample using the `getSample` method of the distribution.
Then we evaluate the output using the `model` function.

.. GENERATED FROM PYTHON SOURCE LINES 41-45

.. code-block:: default

    sampleSize_train = 20
    X_train = myDistribution.getSample(sampleSize_train)
    Y_train = model(X_train)
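A Monte-Carlo design is the simplest option, but not the only one.
As a sketch of an alternative (not part of the generated example), a space-filling Latin Hypercube design could be used at the same cost in model evaluations:

.. code-block:: default

    # Sketch (not part of the generated example): a Latin Hypercube design
    # covers the input domain more evenly than pure Monte-Carlo sampling.
    experiment = ot.LHSExperiment(myDistribution, sampleSize_train)
    X_train_lhs = experiment.generate()
    Y_train_lhs = model(X_train_lhs)

The rest of the example keeps the Monte-Carlo sample `X_train`.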
.. GENERATED FROM PYTHON SOURCE LINES 46-47

The following figure presents the distribution of the vertical deviations Y on the training sample.
We observe that large deviations occur less often.

.. GENERATED FROM PYTHON SOURCE LINES 49-55

.. code-block:: default

    histo = ot.HistogramFactory().build(Y_train).drawPDF()
    histo.setXTitle("Vertical deviation (cm)")
    histo.setTitle("Distribution of the vertical deviation")
    histo.setLegends([""])
    view = viewer.View(histo)

.. image:: /auto_meta_modeling/kriging_metamodel/images/sphx_glr_plot_kriging_cantilever_beam_hmat_001.png
   :alt: Distribution of the vertical deviation
   :class: sphx-glr-single-img

.. GENERATED FROM PYTHON SOURCE LINES 56-58

Create the metamodel
--------------------

.. GENERATED FROM PYTHON SOURCE LINES 60-62

We rely on the `H-Matrix` approximation to accelerate the evaluations.
We change the default parameters (compression, recompression) to higher values.
The resulting model is less accurate, but very fast to build and evaluate.

.. GENERATED FROM PYTHON SOURCE LINES 64-68

.. code-block:: default

    ot.ResourceMap.SetAsString("KrigingAlgorithm-LinearAlgebra", "HMAT")
    ot.ResourceMap.SetAsScalar("HMatrix-AssemblyEpsilon", 1e-3)
    ot.ResourceMap.SetAsScalar("HMatrix-RecompressionEpsilon", 1e-4)
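These settings are global, so it can be useful to check or undo them later.
A minimal sketch using the same `ResourceMap` interface:

.. code-block:: default

    # Sketch: the settings can be queried back from the ResourceMap;
    # ot.ResourceMap.Reload() would restore all default values.
    print(ot.ResourceMap.GetAsString("KrigingAlgorithm-LinearAlgebra"))
    print(ot.ResourceMap.GetAsScalar("HMatrix-AssemblyEpsilon"))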
.. GENERATED FROM PYTHON SOURCE LINES 69-71

In order to create the Kriging metamodel, we first select a constant trend with the `ConstantBasisFactory` class.
Then we use a squared exponential covariance kernel.
The `SquaredExponential` kernel has one amplitude coefficient and 4 scale coefficients.
This is because this covariance kernel is anisotropic: each of the 4 input variables is associated with its own scale coefficient.

.. GENERATED FROM PYTHON SOURCE LINES 73-76

.. code-block:: default

    basis = ot.ConstantBasisFactory(dim).build()
    covarianceModel = ot.SquaredExponential(dim)

.. GENERATED FROM PYTHON SOURCE LINES 77-79

Typically, the optimization algorithm is quite good at setting sensible optimization bounds.
In this case, however, the range of the input domain is extreme.

.. GENERATED FROM PYTHON SOURCE LINES 81-84

.. code-block:: default

    print("Lower and upper bounds of X_train:")
    print(X_train.getMin(), X_train.getMax())

.. rst-class:: sphx-glr-script-out

 Out:

 .. code-block:: none

    Lower and upper bounds of X_train:
    [6.50131e+10,262.222,2.5196,1.309e-07] [7.07581e+10,340.736,2.5983,1.6534e-07]

.. GENERATED FROM PYTHON SOURCE LINES 85-87

We need to manually define sensible optimization bounds.
Note that since the amplitude parameter is computed analytically (this is possible when the output dimension is 1), we only need to set bounds on the scale parameter.

.. GENERATED FROM PYTHON SOURCE LINES 89-91

.. code-block:: default

    scaleOptimizationBounds = ot.Interval(
        [1.0, 1.0, 1.0, 1.0e-10], [1.0e11, 1.0e3, 1.0e1, 1.0e-5]
    )
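Instead of hard-coding the bounds, one could also derive them from the observed range of each input.
The following sketch is an illustration only: the factors 0.01 and 100.0 are assumptions, not values taken from this example.

.. code-block:: default

    # Hypothetical sketch: derive the scale bounds from the observed range
    # of each input. The factors 0.01 and 100.0 are illustrative assumptions.
    xMin, xMax = X_train.getMin(), X_train.getMax()
    lower = [0.01 * (xMax[i] - xMin[i]) for i in range(dim)]
    upper = [100.0 * (xMax[i] - xMin[i]) for i in range(dim)]
    dataDrivenBounds = ot.Interval(lower, upper)

The example proceeds with the hand-picked `scaleOptimizationBounds` above.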
.. GENERATED FROM PYTHON SOURCE LINES 92-96

Finally, we use the `KrigingAlgorithm` class to create the Kriging metamodel.
It requires a training sample, a covariance kernel and a trend basis as input arguments.
We need to set the initial scale parameter for the optimization.
The upper bound of the input domain is a sensible choice here.
We must not forget to actually set the optimization bounds defined above.

.. GENERATED FROM PYTHON SOURCE LINES 98-103

.. code-block:: default

    covarianceModel.setScale(X_train.getMax())
    algo = ot.KrigingAlgorithm(X_train, Y_train, covarianceModel, basis)
    algo.setOptimizationBounds(scaleOptimizationBounds)

.. GENERATED FROM PYTHON SOURCE LINES 104-107

The `run` method optimizes the hyperparameters of the metamodel.
We can then print the constant trend of the metamodel, which has been estimated using the least squares method.

.. GENERATED FROM PYTHON SOURCE LINES 109-113

.. code-block:: default

    algo.run()
    result = algo.getResult()
    krigingMetamodel = result.getMetaModel()

.. GENERATED FROM PYTHON SOURCE LINES 114-115

The `getTrendCoefficients` method returns the coefficients of the trend.

.. GENERATED FROM PYTHON SOURCE LINES 117-119

.. code-block:: default

    print(result.getTrendCoefficients())

.. rst-class:: sphx-glr-script-out

 Out:

 .. code-block:: none

    [[0.178989]]

.. GENERATED FROM PYTHON SOURCE LINES 120-121

We can also print the hyperparameters of the covariance model, which have been estimated by maximizing the likelihood.

.. GENERATED FROM PYTHON SOURCE LINES 123-125

.. code-block:: default

    result.getCovarianceModel()

.. rst-class:: sphx-glr-script-out

 Out:

 .. code-block:: none

    SquaredExponential(scale=[7.07581e+10,340.736,2.5983,1.6534e-07], amplitude=[0.0633272])
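A minimal sketch showing how the optimized kernel can be inspected programmatically, using the `getScale` and `getAmplitude` accessors of the covariance model:

.. code-block:: default

    # Sketch: inspect the optimized kernel programmatically. A scale estimate
    # stuck at one of its bounds may indicate that the bounds are too tight.
    optimizedKernel = result.getCovarianceModel()
    print(optimizedKernel.getScale())      # one scale per input variable
    print(optimizedKernel.getAmplitude())  # computed analytically here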
.. GENERATED FROM PYTHON SOURCE LINES 126-128

Validate the metamodel
----------------------

.. GENERATED FROM PYTHON SOURCE LINES 130-131

We finally want to validate the Kriging metamodel.
This is why we generate a validation sample of size 100 and evaluate the output of the model on this sample.

.. GENERATED FROM PYTHON SOURCE LINES 133-137

.. code-block:: default

    sampleSize_test = 100
    X_test = myDistribution.getSample(sampleSize_test)
    Y_test = model(X_test)

.. GENERATED FROM PYTHON SOURCE LINES 138-139

The `MetaModelValidation` class makes the validation easy.
To create it, we use the validation samples and the metamodel.

.. GENERATED FROM PYTHON SOURCE LINES 141-143

.. code-block:: default

    val = ot.MetaModelValidation(X_test, Y_test, krigingMetamodel)

.. GENERATED FROM PYTHON SOURCE LINES 144-145

The `computePredictivityFactor` method computes the Q2 factor.

.. GENERATED FROM PYTHON SOURCE LINES 147-150

.. code-block:: default

    Q2 = val.computePredictivityFactor()[0]
    print(Q2)

.. rst-class:: sphx-glr-script-out

 Out:

 .. code-block:: none

    0.7692513766306599
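To make the definition of Q2 concrete, the following sketch recomputes it by hand.
Depending on the variance normalization convention used by the library, the value may differ marginally from the one printed above:

.. code-block:: default

    # Sketch: recompute Q2 from its definition,
    # Q2 = 1 - (sum of squared prediction errors) / (total sum of squares).
    Y_pred = krigingMetamodel(X_test)
    yMean = Y_test.computeMean()[0]
    ssr = sum((Y_test[i, 0] - Y_pred[i, 0]) ** 2 for i in range(sampleSize_test))
    sst = sum((Y_test[i, 0] - yMean) ** 2 for i in range(sampleSize_test))
    print(1.0 - ssr / sst)  # close to the Q2 value printed above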
.. GENERATED FROM PYTHON SOURCE LINES 151-152

The residuals are the differences between the model and the metamodel outputs.

.. GENERATED FROM PYTHON SOURCE LINES 154-161

.. code-block:: default

    r = val.getResidualSample()
    graph = ot.HistogramFactory().build(r).drawPDF()
    graph.setXTitle("Residuals (cm)")
    graph.setTitle("Distribution of the residuals")
    graph.setLegends([""])
    view = viewer.View(graph)

.. image:: /auto_meta_modeling/kriging_metamodel/images/sphx_glr_plot_kriging_cantilever_beam_hmat_002.png
   :alt: Distribution of the residuals
   :class: sphx-glr-single-img

.. GENERATED FROM PYTHON SOURCE LINES 162-163

We observe that negative residuals occur with nearly the same frequency as positive residuals: this is a first sign of good quality.
.. GENERATED FROM PYTHON SOURCE LINES 165-166

The `drawValidation` method allows one to compare the observed outputs and the metamodel outputs.

.. GENERATED FROM PYTHON SOURCE LINES 168-169

sphinx_gallery_thumbnail_number = 3
.. GENERATED FROM PYTHON SOURCE LINES 169-174

.. code-block:: default

    graph = val.drawValidation()
    graph.setTitle("Q2 = %.2f%%" % (100 * Q2))
    view = viewer.View(graph)
    plt.show()

.. image:: /auto_meta_modeling/kriging_metamodel/images/sphx_glr_plot_kriging_cantilever_beam_hmat_003.png
   :alt: Q2 = 76.93%
   :class: sphx-glr-single-img

.. rst-class:: sphx-glr-timing

   **Total running time of the script:** ( 0 minutes 0.284 seconds)


.. _sphx_glr_download_auto_meta_modeling_kriging_metamodel_plot_kriging_cantilever_beam_hmat.py:

.. only:: html

   .. container:: sphx-glr-footer
      :class: sphx-glr-footer-example

      .. container:: sphx-glr-download sphx-glr-download-python

         :download:`Download Python source code: plot_kriging_cantilever_beam_hmat.py <plot_kriging_cantilever_beam_hmat.py>`

      .. container:: sphx-glr-download sphx-glr-download-jupyter

         :download:`Download Jupyter notebook: plot_kriging_cantilever_beam_hmat.ipynb <plot_kriging_cantilever_beam_hmat.ipynb>`

.. only:: html

   .. rst-class:: sphx-glr-signature

      `Gallery generated by Sphinx-Gallery <https://sphinx-gallery.github.io>`_