.. DO NOT EDIT.
.. THIS FILE WAS AUTOMATICALLY GENERATED BY SPHINX-GALLERY.
.. TO MAKE CHANGES, EDIT THE SOURCE PYTHON FILE:
.. "auto_meta_modeling/kriging_metamodel/plot_propagate_gpr_ishigami.py"
.. LINE NUMBERS ARE GIVEN BELOW.

.. only:: html

    .. note::
        :class: sphx-glr-download-link-note

        :ref:`Go to the end <sphx_glr_download_auto_meta_modeling_kriging_metamodel_plot_propagate_gpr_ishigami.py>`
        to download the full example code.

.. rst-class:: sphx-glr-example-title

.. _sphx_glr_auto_meta_modeling_kriging_metamodel_plot_propagate_gpr_ishigami.py:


Gaussian Process Regression: propagate uncertainties
=====================================================

In this example we propagate uncertainties through a GP metamodel of the
:ref:`Ishigami model`.

.. GENERATED FROM PYTHON SOURCE LINES 7-13

.. code-block:: Python

    import openturns as ot
    import openturns.experimental as otexp
    import openturns.viewer as otv

.. GENERATED FROM PYTHON SOURCE LINES 14-18

We first build the metamodel and then estimate its mean with a Monte Carlo
simulation.

We load the Ishigami model from the usecases module:

.. GENERATED FROM PYTHON SOURCE LINES 18-23

.. code-block:: Python

    from openturns.usecases import ishigami_function

    im = ishigami_function.IshigamiModel()

.. GENERATED FROM PYTHON SOURCE LINES 24-26

We build a design of experiments with a Latin Hypercube Sampling (LHS) for the
three input variables, assumed to be independent.

.. GENERATED FROM PYTHON SOURCE LINES 26-29

.. code-block:: Python

    experiment = ot.LHSExperiment(im.inputDistribution, 30, False, True)
    xdata = experiment.generate()

.. GENERATED FROM PYTHON SOURCE LINES 30-32

We get the exact model and evaluate it at the input training data `xdata` to
build the output data `ydata`.

.. GENERATED FROM PYTHON SOURCE LINES 32-35

.. code-block:: Python

    model = im.model
    ydata = model(xdata)

.. GENERATED FROM PYTHON SOURCE LINES 36-41

We define our Gaussian process model with:

- a constant basis in :math:`\mathbb{R}^3`;
- a squared exponential covariance function.

.. GENERATED FROM PYTHON SOURCE LINES 41-51

.. code-block:: Python

    dimension = 3
    basis = ot.ConstantBasisFactory(dimension).build()
    covarianceModel = ot.SquaredExponential([0.1] * dimension, [1.0])
    fitter = otexp.GaussianProcessFitter(xdata, ydata, covarianceModel, basis)
    fitter.run()
    fitter_result = fitter.getResult()
    algo = otexp.GaussianProcessRegression(fitter_result)
    algo.run()
    result = algo.getResult()

.. GENERATED FROM PYTHON SOURCE LINES 52-53

We finally get the metamodel to use with Monte Carlo.

.. GENERATED FROM PYTHON SOURCE LINES 53-55

.. code-block:: Python

    metamodel = result.getMetaModel()

.. GENERATED FROM PYTHON SOURCE LINES 56-60

We want to estimate the mean of the Ishigami model with Monte Carlo sampling,
using the metamodel instead of the exact model.

We first create a random vector following the input distribution:

.. GENERATED FROM PYTHON SOURCE LINES 60-62

.. code-block:: Python

    X = ot.RandomVector(im.inputDistribution)

.. GENERATED FROM PYTHON SOURCE LINES 63-65

Then we create a random vector as the image of the input random vector through
the metamodel:

.. GENERATED FROM PYTHON SOURCE LINES 65-67

.. code-block:: Python

    Y = ot.CompositeRandomVector(metamodel, X)

.. GENERATED FROM PYTHON SOURCE LINES 68-69

We now set up our :class:`~openturns.ExpectationSimulationAlgorithm` object:

.. GENERATED FROM PYTHON SOURCE LINES 69-74

.. code-block:: Python

    algo = ot.ExpectationSimulationAlgorithm(Y)
    algo.setMaximumOuterSampling(50000)
    algo.setBlockSize(1)
    algo.setCoefficientOfVariationCriterionType("NONE")
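The criterion type is set to ``"NONE"`` above, so the algorithm always spends the
full outer sampling budget of 50000 metamodel evaluations. As a side note, the run
could instead be stopped on an accuracy target; the sketch below assumes a ``"MAX"``
criterion type and an illustrative :math:`10^{-2}` threshold, neither of which is
part of the original example. In the rest of this example we keep the fixed-budget
configuration above.

.. code-block:: Python

    # Sketch (illustrative settings, not used below): stop once the coefficient
    # of variation of the mean estimate falls below 1 %.
    algo_cv = ot.ExpectationSimulationAlgorithm(Y)
    algo_cv.setMaximumOuterSampling(50000)
    algo_cv.setBlockSize(1)
    algo_cv.setCoefficientOfVariationCriterionType("MAX")
    algo_cv.setMaximumCoefficientOfVariation(1.0e-2)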
.. GENERATED FROM PYTHON SOURCE LINES 75-76

We run it and store the results:

.. GENERATED FROM PYTHON SOURCE LINES 76-79

.. code-block:: Python

    algo.run()
    result = algo.getResult()

.. GENERATED FROM PYTHON SOURCE LINES 80-81

The expectation :math:`\mathbb{E}(Y)` (the mean) is obtained with:

.. GENERATED FROM PYTHON SOURCE LINES 81-83

.. code-block:: Python

    expectation = result.getExpectationEstimate()

.. GENERATED FROM PYTHON SOURCE LINES 84-85

The mean estimate of the metamodel is:

.. GENERATED FROM PYTHON SOURCE LINES 85-87

.. code-block:: Python

    print("Mean of the Ishigami metamodel : %.3e" % expectation[0])

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    Mean of the Ishigami metamodel : 3.532e+00

.. GENERATED FROM PYTHON SOURCE LINES 88-89

We draw the convergence history.

.. GENERATED FROM PYTHON SOURCE LINES 89-92

.. code-block:: Python

    graph = algo.drawExpectationConvergence()
    view = otv.View(graph)

.. image-sg:: /auto_meta_modeling/kriging_metamodel/images/sphx_glr_plot_propagate_gpr_ishigami_001.svg
   :alt: Expectation convergence graph at level 0.95
   :srcset: /auto_meta_modeling/kriging_metamodel/images/sphx_glr_plot_propagate_gpr_ishigami_001.svg
   :class: sphx-glr-single-img

.. GENERATED FROM PYTHON SOURCE LINES 93-94

For reference, the exact mean of the Ishigami model is:

.. GENERATED FROM PYTHON SOURCE LINES 94-96

.. code-block:: Python

    print("Mean of the Ishigami model : %.3e" % im.expectation)

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    Mean of the Ishigami model : 3.500e+00

.. GENERATED FROM PYTHON SOURCE LINES 97-98

.. code-block:: Python

    otv.View.ShowAll()


.. _sphx_glr_download_auto_meta_modeling_kriging_metamodel_plot_propagate_gpr_ishigami.py:

.. only:: html

  .. container:: sphx-glr-footer sphx-glr-footer-example

    .. container:: sphx-glr-download sphx-glr-download-jupyter

      :download:`Download Jupyter notebook: plot_propagate_gpr_ishigami.ipynb <plot_propagate_gpr_ishigami.ipynb>`

    .. container:: sphx-glr-download sphx-glr-download-python

      :download:`Download Python source code: plot_propagate_gpr_ishigami.py <plot_propagate_gpr_ishigami.py>`

    .. container:: sphx-glr-download sphx-glr-download-zip

      :download:`Download zipped: plot_propagate_gpr_ishigami.zip <plot_propagate_gpr_ishigami.zip>`