.. DO NOT EDIT.
.. THIS FILE WAS AUTOMATICALLY GENERATED BY SPHINX-GALLERY.
.. TO MAKE CHANGES, EDIT THE SOURCE PYTHON FILE:
.. "auto_meta_modeling/polynomial_chaos_metamodel/plot_chaos_ishigami.py"
.. LINE NUMBERS ARE GIVEN BELOW.

.. only:: html

    .. note::
        :class: sphx-glr-download-link-note

        Click :ref:`here <sphx_glr_download_auto_meta_modeling_polynomial_chaos_metamodel_plot_chaos_ishigami.py>`
        to download the full example code

.. rst-class:: sphx-glr-example-title

.. _sphx_glr_auto_meta_modeling_polynomial_chaos_metamodel_plot_chaos_ishigami.py:


Create a polynomial chaos for the Ishigami function: a quick start guide to polynomial chaos
=============================================================================================

.. GENERATED FROM PYTHON SOURCE LINES 6-8

In this example, we create a polynomial chaos expansion for the :ref:`Ishigami function`.
We create a sparse polynomial with a maximum total degree equal to 8.

.. GENERATED FROM PYTHON SOURCE LINES 11-13

Define the model
----------------

.. GENERATED FROM PYTHON SOURCE LINES 15-23

.. code-block:: default

    from openturns.usecases import ishigami_function
    import openturns as ot
    import openturns.viewer as viewer
    from matplotlib import pylab as plt
    import numpy as np
    ot.Log.Show(ot.Log.NONE)
    ot.RandomGenerator.SetSeed(0)

.. GENERATED FROM PYTHON SOURCE LINES 24-25

We load the Ishigami model:

.. GENERATED FROM PYTHON SOURCE LINES 25-27

.. code-block:: default

    im = ishigami_function.IshigamiModel()

.. GENERATED FROM PYTHON SOURCE LINES 28-30

The `IshigamiModel` data class contains the input distribution
:math:`X=(X_1, X_2, X_3)` in `im.distributionX` and the Ishigami function in
`im.model`. We also have access to the input variable names with:

.. GENERATED FROM PYTHON SOURCE LINES 30-33

.. code-block:: default

    input_names = im.distributionX.getDescription()

.. GENERATED FROM PYTHON SOURCE LINES 34-36

Draw the function
-----------------

.. GENERATED FROM PYTHON SOURCE LINES 38-39

Create a sample of the input random vector and evaluate the model on it, in order to plot the function.

.. GENERATED FROM PYTHON SOURCE LINES 41-46

.. code-block:: default

    N = 1000
    inputSample = im.distributionX.getSample(N)
    outputSample = im.model(inputSample)

.. GENERATED FROM PYTHON SOURCE LINES 47-64

.. code-block:: default

    def plotXvsY(sampleX, sampleY, figsize=(15, 3)):
        """Plot each input component of sampleX against the output sampleY."""
        import pylab as pl
        import openturns.viewer
        dimX = sampleX.getDimension()
        inputdescr = sampleX.getDescription()
        fig = pl.figure(figsize=figsize)
        for i in range(dimX):
            ax = fig.add_subplot(1, dimX, i + 1)
            graph = ot.Graph('', inputdescr[i], 'Y', True, '')
            cloud = ot.Cloud(sampleX[:, i], sampleY)
            graph.add(cloud)
            _ = ot.viewer.View(graph, figure=fig, axes=[ax])


    plotXvsY(inputSample, outputSample)

.. image-sg:: /auto_meta_modeling/polynomial_chaos_metamodel/images/sphx_glr_plot_chaos_ishigami_001.png
   :alt: plot chaos ishigami
   :srcset: /auto_meta_modeling/polynomial_chaos_metamodel/images/sphx_glr_plot_chaos_ishigami_001.png
   :class: sphx-glr-single-img

.. GENERATED FROM PYTHON SOURCE LINES 65-68

.. code-block:: default

    graph = ot.HistogramFactory().build(outputSample).drawPDF()
    view = viewer.View(graph)

.. image-sg:: /auto_meta_modeling/polynomial_chaos_metamodel/images/sphx_glr_plot_chaos_ishigami_002.png
   :alt: y0 PDF
   :srcset: /auto_meta_modeling/polynomial_chaos_metamodel/images/sphx_glr_plot_chaos_ishigami_002.png
   :class: sphx-glr-single-img

.. GENERATED FROM PYTHON SOURCE LINES 69-70

We see that the distribution of the output has two modes.
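As a quick check of the output distribution, we can also estimate its first moments
from this sample. The following short sketch is not part of the generated script; it
simply reuses the `outputSample` computed above, and its results can be compared with
the exact expectation and variance derived at the end of this example.

.. code-block:: default

    # Empirical moments of the output sample (not part of the original script).
    # They can be compared with the exact expectation a/2 = 3.5 and the exact
    # variance of the Ishigami function computed in the "Computing the accuracy" section.
    mean_estimate = outputSample.computeMean()[0]
    variance_estimate = outputSample.computeVariance()[0]
    print("Sample mean     = %.4f" % mean_estimate)
    print("Sample variance = %.4f" % variance_estimate)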
.. GENERATED FROM PYTHON SOURCE LINES 72-74

Create the polynomial chaos model
---------------------------------

.. GENERATED FROM PYTHON SOURCE LINES 76-77

Create a training sample.

.. GENERATED FROM PYTHON SOURCE LINES 79-83

.. code-block:: default

    N = 100
    inputTrain = im.distributionX.getSample(N)
    outputTrain = im.model(inputTrain)

.. GENERATED FROM PYTHON SOURCE LINES 84-87

Create the chaos. We could use only the input and output training samples: in that
case, the distribution of the input sample is inferred by selecting the candidate
distribution that best fits the data.

.. GENERATED FROM PYTHON SOURCE LINES 89-91

.. code-block:: default

    chaosalgo = ot.FunctionalChaosAlgorithm(inputTrain, outputTrain)

.. GENERATED FROM PYTHON SOURCE LINES 92-93

Since the input distribution is known in our particular case, we instead create
the multivariate basis from the distribution.

.. GENERATED FROM PYTHON SOURCE LINES 95-107

.. code-block:: default

    multivariateBasis = ot.OrthogonalProductPolynomialFactory(
        [im.X1, im.X2, im.X3])
    selectionAlgorithm = ot.LeastSquaresMetaModelSelectionFactory()
    projectionStrategy = ot.LeastSquaresStrategy(
        inputTrain, outputTrain, selectionAlgorithm)
    totalDegree = 8
    enumfunc = multivariateBasis.getEnumerateFunction()
    P = enumfunc.getStrataCumulatedCardinal(totalDegree)
    adaptiveStrategy = ot.FixedStrategy(multivariateBasis, P)
    chaosalgo = ot.FunctionalChaosAlgorithm(
        inputTrain, outputTrain, im.distributionX, adaptiveStrategy, projectionStrategy)

.. GENERATED FROM PYTHON SOURCE LINES 108-112

.. code-block:: default

    chaosalgo.run()
    result = chaosalgo.getResult()
    metamodel = result.getMetaModel()

.. GENERATED FROM PYTHON SOURCE LINES 113-114

In order to validate the metamodel, we generate a test sample.

.. GENERATED FROM PYTHON SOURCE LINES 116-123

.. code-block:: default

    n_valid = 1000
    inputTest = im.distributionX.getSample(n_valid)
    outputTest = im.model(inputTest)
    val = ot.MetaModelValidation(inputTest, outputTest, metamodel)
    Q2 = val.computePredictivityFactor()[0]
    Q2

.. rst-class:: sphx-glr-script-out

 Out:

 .. code-block:: none

    0.9994752470145457

.. GENERATED FROM PYTHON SOURCE LINES 124-125

The Q2 is very close to 1: the metamodel is excellent.

.. GENERATED FROM PYTHON SOURCE LINES 127-131

.. code-block:: default

    graph = val.drawValidation()
    graph.setTitle("Q2=%.2f%%" % (Q2 * 100))
    view = viewer.View(graph)

.. image-sg:: /auto_meta_modeling/polynomial_chaos_metamodel/images/sphx_glr_plot_chaos_ishigami_003.png
   :alt: Q2=99.95%
   :srcset: /auto_meta_modeling/polynomial_chaos_metamodel/images/sphx_glr_plot_chaos_ishigami_003.png
   :class: sphx-glr-single-img

.. GENERATED FROM PYTHON SOURCE LINES 132-133

The metamodel has a good predictivity, since the points are close to the first diagonal.

.. GENERATED FROM PYTHON SOURCE LINES 135-137

Compute and print Sobol' indices
--------------------------------

.. GENERATED FROM PYTHON SOURCE LINES 139-142

.. code-block:: default

    chaosSI = ot.FunctionalChaosSobolIndices(result)
    print(chaosSI.summary())

.. rst-class:: sphx-glr-script-out

 Out:

 .. code-block:: none

     input dimension: 3
     output dimension: 1
     basis size: 21
     mean: [3.51725]
     std-dev: [3.70631]
    ------------------------------------------------------------
    Index | Multi-indice | Part of variance
    ------------------------------------------------------------
        6 | [0,4,0]      | 0.273917
        1 | [1,0,0]      | 0.191725
        5 | [1,0,2]      | 0.135136
       11 | [0,6,0]      | 0.134366
        4 | [3,0,0]      | 0.120877
        9 | [3,0,2]      | 0.0850971
        3 | [0,2,0]      | 0.0253318
       10 | [1,0,4]      | 0.0122786
    ------------------------------------------------------------

    ------------------------------------------------------------
    Component | Sobol index | Sobol total index
    ------------------------------------------------------------
            0 | 0.315184    | 0.557148
            1 | 0.442823    | 0.442894
            2 | 4.76385e-07 | 0.241993
    ------------------------------------------------------------
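The summary above reports a basis of 21 selected terms, which is much smaller than the
number `P` of candidate terms of total degree at most 8: the selection algorithm produced
a sparse expansion. The following sketch is not part of the generated script; it shows one
way to check this sparsity from the `result` object, reusing the `P` variable defined above.

.. code-block:: default

    # Compare the full candidate basis with the terms kept by the
    # least squares selection algorithm (sparse expansion).
    selectedIndices = result.getIndices()    # ranks of the selected multi-indices
    coefficients = result.getCoefficients()  # one coefficient per selected term
    print("Candidate basis size   =", P)
    print("Selected basis size    =", selectedIndices.getSize())
    print("Number of coefficients =", coefficients.getSize())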
.. GENERATED FROM PYTHON SOURCE LINES 143-144

We notice that a coefficient with marginal degree equal to 6 has a significant impact
on the output variance. Hence, we cannot expect a satisfactory polynomial chaos with a
total degree less than 6.

.. GENERATED FROM PYTHON SOURCE LINES 146-147

Draw the Sobol' indices.

.. GENERATED FROM PYTHON SOURCE LINES 149-158

.. code-block:: default

    dim_input = im.distributionX.getDimension()
    first_order = [chaosSI.getSobolIndex(i) for i in range(dim_input)]
    total_order = [chaosSI.getSobolTotalIndex(i) for i in range(dim_input)]
    input_names = im.model.getInputDescription()
    graph = ot.SobolIndicesAlgorithm.DrawSobolIndices(
        input_names, first_order, total_order)
    view = viewer.View(graph)

.. image-sg:: /auto_meta_modeling/polynomial_chaos_metamodel/images/sphx_glr_plot_chaos_ishigami_004.png
   :alt: Sobol' indices
   :srcset: /auto_meta_modeling/polynomial_chaos_metamodel/images/sphx_glr_plot_chaos_ishigami_004.png
   :class: sphx-glr-single-img

.. GENERATED FROM PYTHON SOURCE LINES 159-164

Taking interactions into account, the variable which has the largest impact on the
output is X1. We see that X1 interacts with other variables, since its first order
index is less than its total order index. The first order index of X3 is close to
zero, so X3 has almost no influence on its own; however, its total order index is
not negligible, which shows that X3 influences the output through its interactions
with the other variables.

.. GENERATED FROM PYTHON SOURCE LINES 166-168

Computing the accuracy
----------------------

.. GENERATED FROM PYTHON SOURCE LINES 170-173

The interesting point with the Ishigami function is that the exact Sobol' indices are
known. We can use that property in order to compute the absolute error on the Sobol'
indices of the polynomial chaos. The following function computes the exact mean,
variance and Sobol' indices of the Ishigami function.

.. GENERATED FROM PYTHON SOURCE LINES 175-199

.. code-block:: default

    def ishigamiSA(a, b):
        """Exact sensitivity indices of the Ishigami function for given a and b."""
        var = 1.0 / 2 + a ** 2 / 8 + b * np.pi ** 4 / 5 + b ** 2 * np.pi ** 8 / 18
        S1 = (1.0 / 2 + b * np.pi ** 4 / 5 + b ** 2 * np.pi ** 8 / 50) / var
        S2 = (a ** 2 / 8) / var
        S3 = 0
        S13 = b ** 2 * np.pi ** 8 / 2 * (1.0 / 9 - 1.0 / 25) / var
        exact = {
            'expectation': a / 2,
            'variance': var,
            'S1': S1,
            'S2': S2,
            'S3': S3,
            'S12': 0,
            'S23': 0,
            'S13': S13,
            'S123': 0,
            'ST1': S1 + S13,
            'ST2': S2,
            'ST3': S3 + S13
        }
        return exact

.. GENERATED FROM PYTHON SOURCE LINES 200-205

.. code-block:: default

    a = 7.0
    b = 0.1
    exact = ishigamiSA(a, b)
    exact

.. rst-class:: sphx-glr-script-out

 Out:

 .. code-block:: none

    {'expectation': 3.5, 'variance': 13.844587940719254, 'S1': 0.31390519114781146, 'S2': 0.4424111447900409, 'S3': 0, 'S12': 0, 'S23': 0, 'S13': 0.2436836640621477, 'S123': 0, 'ST1': 0.5575888552099592, 'ST2': 0.4424111447900409, 'ST3': 0.2436836640621477}
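The exact interaction index `S13` is approximately 0.244, which explains the gap between
the first order and total order indices of X1 and X3. The chaos decomposition can also
estimate this interaction. The following sketch is not part of the generated script; it
assumes that the installed OpenTURNS version provides the group overload of `getSobolIndex`,
which returns the index of the interaction of the given variables.

.. code-block:: default

    # Estimate of the (X1, X3) interaction index from the chaos decomposition,
    # compared with the exact value (assumes getSobolIndex accepts a list of indices).
    S13_chaos = chaosSI.getSobolIndex([0, 2])
    print("S13 (chaos) = %.4f" % S13_chaos)
    print("S13 (exact) = %.4f" % exact["S13"])
    print("Abs. error  = %.1e" % abs(S13_chaos - exact["S13"]))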
.. GENERATED FROM PYTHON SOURCE LINES 206-207

To make the comparisons simpler, we gather the results into lists.

.. GENERATED FROM PYTHON SOURCE LINES 209-212

.. code-block:: default

    S_exact = [exact["S1"], exact["S2"], exact["S3"]]
    ST_exact = [exact["ST1"], exact["ST2"], exact["ST3"]]

.. GENERATED FROM PYTHON SOURCE LINES 213-214

Then we perform a loop over the input dimension and compute the absolute error on the Sobol' indices.

.. GENERATED FROM PYTHON SOURCE LINES 216-223

.. code-block:: default

    for i in range(im.dim):
        absoluteErrorS = abs(first_order[i] - S_exact[i])
        absoluteErrorST = abs(total_order[i] - ST_exact[i])
        print("X%d, Abs.Err. on S=%.1e, Abs.Err. on ST=%.1e" % (
            i + 1, absoluteErrorS, absoluteErrorST))
    plt.show()

.. rst-class:: sphx-glr-script-out

 Out:

 .. code-block:: none

    X1, Abs.Err. on S=1.3e-03, Abs.Err. on ST=4.4e-04
    X2, Abs.Err. on S=4.1e-04, Abs.Err. on ST=4.8e-04
    X3, Abs.Err. on S=4.8e-07, Abs.Err. on ST=1.7e-03

.. GENERATED FROM PYTHON SOURCE LINES 224-225

We see that the Sobol' indices are estimated with good accuracy, even though we used
only 100 function evaluations. This shows the good performance of the polynomial chaos
metamodel in this case.


.. rst-class:: sphx-glr-timing

   **Total running time of the script:** ( 0 minutes  0.661 seconds)


.. _sphx_glr_download_auto_meta_modeling_polynomial_chaos_metamodel_plot_chaos_ishigami.py:


.. only:: html

  .. container:: sphx-glr-footer
    :class: sphx-glr-footer-example


    .. container:: sphx-glr-download sphx-glr-download-python

      :download:`Download Python source code: plot_chaos_ishigami.py <plot_chaos_ishigami.py>`

    .. container:: sphx-glr-download sphx-glr-download-jupyter

      :download:`Download Jupyter notebook: plot_chaos_ishigami.ipynb <plot_chaos_ishigami.ipynb>`


.. only:: html

 .. rst-class:: sphx-glr-signature

    `Gallery generated by Sphinx-Gallery <https://sphinx-gallery.github.io>`_