.. DO NOT EDIT. .. THIS FILE WAS AUTOMATICALLY GENERATED BY SPHINX-GALLERY. .. TO MAKE CHANGES, EDIT THE SOURCE PYTHON FILE: .. "auto_meta_modeling/polynomial_chaos_metamodel/plot_chaos_ishigami.py" .. LINE NUMBERS ARE GIVEN BELOW. .. only:: html .. note:: :class: sphx-glr-download-link-note :ref:`Go to the end ` to download the full example code. .. rst-class:: sphx-glr-example-title .. _sphx_glr_auto_meta_modeling_polynomial_chaos_metamodel_plot_chaos_ishigami.py: Create a polynomial chaos for the Ishigami function: a quick start guide to polynomial chaos ============================================================================================ .. GENERATED FROM PYTHON SOURCE LINES 7-12 In this example, we create a polynomial chaos for the :ref:`Ishigami function`. We create a sparse polynomial with maximum total degree equal to 8 using the :class:`~openturns.FunctionalChaosAlgorithm` class. .. GENERATED FROM PYTHON SOURCE LINES 15-17 Define the model ---------------- .. GENERATED FROM PYTHON SOURCE LINES 19-27 .. code-block:: Python from openturns.usecases import ishigami_function import openturns as ot import openturns.viewer as otv import math ot.Log.Show(ot.Log.NONE) ot.RandomGenerator.SetSeed(0) .. GENERATED FROM PYTHON SOURCE LINES 28-29 We load the Ishigami model : .. GENERATED FROM PYTHON SOURCE LINES 29-31 .. code-block:: Python im = ishigami_function.IshigamiModel() .. GENERATED FROM PYTHON SOURCE LINES 32-34 The `IshigamiModel` data class contains the input distribution :math:`\vect{X}=(X_1, X_2, X_3)` in `im.inputDistribution` and the Ishigami function in `im.model`. We also have access to the input variable names with .. GENERATED FROM PYTHON SOURCE LINES 34-37 .. code-block:: Python input_names = im.inputDistribution.getDescription() .. GENERATED FROM PYTHON SOURCE LINES 38-40 Draw the function ----------------- .. GENERATED FROM PYTHON SOURCE LINES 42-43 Create a training sample .. GENERATED FROM PYTHON SOURCE LINES 45-50 .. code-block:: Python sampleSize = 1000 inputSample = im.inputDistribution.getSample(sampleSize) outputSample = im.model(inputSample) .. GENERATED FROM PYTHON SOURCE LINES 51-52 Display relationships between the output and the inputs .. GENERATED FROM PYTHON SOURCE LINES 54-57 .. code-block:: Python grid = ot.VisualTest.DrawPairsXY(inputSample, outputSample) view = otv.View(grid, figure_kw={"figsize": (12.0, 4.0)}) .. image-sg:: /auto_meta_modeling/polynomial_chaos_metamodel/images/sphx_glr_plot_chaos_ishigami_001.png :alt: plot chaos ishigami :srcset: /auto_meta_modeling/polynomial_chaos_metamodel/images/sphx_glr_plot_chaos_ishigami_001.png :class: sphx-glr-single-img .. GENERATED FROM PYTHON SOURCE LINES 58-61 .. code-block:: Python graph = ot.HistogramFactory().build(outputSample).drawPDF() view = otv.View(graph) .. image-sg:: /auto_meta_modeling/polynomial_chaos_metamodel/images/sphx_glr_plot_chaos_ishigami_002.png :alt: y PDF :srcset: /auto_meta_modeling/polynomial_chaos_metamodel/images/sphx_glr_plot_chaos_ishigami_002.png :class: sphx-glr-single-img .. GENERATED FROM PYTHON SOURCE LINES 62-63 We see that the distribution of the output has two modes. .. GENERATED FROM PYTHON SOURCE LINES 65-67 Create the polynomial chaos model --------------------------------- .. GENERATED FROM PYTHON SOURCE LINES 69-70 Create a training sample .. GENERATED FROM PYTHON SOURCE LINES 72-76 .. code-block:: Python sampleSize = 100 inputTrain = im.inputDistribution.getSample(sampleSize) outputTrain = im.model(inputTrain) .. 
GENERATED FROM PYTHON SOURCE LINES 77-80 Create the chaos. We could use only the input and output training samples: in this case, the input distribution is inferred from the input sample by selecting the distribution with the best fit. .. GENERATED FROM PYTHON SOURCE LINES 82-84 .. code-block:: Python chaosalgo = ot.FunctionalChaosAlgorithm(inputTrain, outputTrain) .. GENERATED FROM PYTHON SOURCE LINES 85-90 Since the input distribution is known in our particular case, we instead create the multivariate basis from the distribution. We define the orthogonal basis used to expand the function. We see that each input follows a uniform distribution, which corresponds to the Legendre polynomials. .. GENERATED FROM PYTHON SOURCE LINES 92-95 .. code-block:: Python multivariateBasis = ot.OrthogonalProductPolynomialFactory([im.X1, im.X2, im.X3]) multivariateBasis .. raw:: html
OrthogonalProductPolynomialFactory
  • dimension: 3
  • enumerate function: class=LinearEnumerateFunction dimension=3
Index Name Distribution Univariate polynomial
0 X1 Uniform LegendreFactory
1 X2 Uniform LegendreFactory
2 X3 Uniform LegendreFactory
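
The table above confirms that each marginal basis is a family of Legendre polynomials.
As an optional check (this snippet is not part of the original script), we can print the
first members of the multivariate basis together with their multi-indices, using the
enumerate function of the factory.

.. code-block:: Python

    # Optional sketch: inspect the first multivariate basis polynomials.
    # The enumerate function maps a global index to the multi-index of
    # marginal degrees; build() returns the corresponding polynomial.
    enumerateFunction = multivariateBasis.getEnumerateFunction()
    for i in range(6):
        print(i, enumerateFunction(i), multivariateBasis.build(i))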


.. GENERATED FROM PYTHON SOURCE LINES 96-98 Then we create the sparse polynomial chaos expansion using least squares regression and the LARS model selection method. .. GENERATED FROM PYTHON SOURCE LINES 100-110 .. code-block:: Python selectionAlgorithm = ot.LeastSquaresMetaModelSelectionFactory() projectionStrategy = ot.LeastSquaresStrategy(selectionAlgorithm) totalDegree = 8 enumerateFunction = multivariateBasis.getEnumerateFunction() basisSize = enumerateFunction.getBasisSizeFromTotalDegree(totalDegree) adaptiveStrategy = ot.FixedStrategy(multivariateBasis, basisSize) chaosAlgo = ot.FunctionalChaosAlgorithm( inputTrain, outputTrain, im.inputDistribution, adaptiveStrategy, projectionStrategy ) .. GENERATED FROM PYTHON SOURCE LINES 111-113 The coefficients of the polynomial expansion can then be estimated using the :meth:`~openturns.FunctionalChaosAlgorithm.run` method. .. GENERATED FROM PYTHON SOURCE LINES 115-117 .. code-block:: Python chaosAlgo.run() .. GENERATED FROM PYTHON SOURCE LINES 118-120 The :meth:`~openturns.FunctionalChaosAlgorithm.getResult` method returns the result. .. GENERATED FROM PYTHON SOURCE LINES 120-122 .. code-block:: Python chaosResult = chaosAlgo.getResult() .. GENERATED FROM PYTHON SOURCE LINES 123-124 The polynomial chaos result provides a pretty-print. .. GENERATED FROM PYTHON SOURCE LINES 126-128 .. code-block:: Python chaosResult .. raw:: html
FunctionalChaosResult
  • input dimension: 3
  • output dimension: 1
  • distribution dimension: 3
  • transformation: 3 -> 3
  • inverse transformation: 3 -> 3
  • orthogonal basis dimension: 3
  • indices size: 21
  • relative errors: [9.08703e-06]
  • residuals: [0.00577149]
Index Multi-index Coeff.
0 [0,0,0] 3.517248
1 [1,0,0] 1.622862
2 [0,0,1] 0.0009628309
3 [0,2,0] -0.5898958
4 [3,0,0] -1.288589
5 [1,0,2] 1.36247
6 [0,4,0] -1.939777
7 [0,3,1] 0.0006953569
8 [5,0,0] 0.1883195
9 [3,0,2] -1.081184
10 [1,0,4] 0.4106924
11 [0,6,0] 1.358584
12 [0,0,6] -0.002370007
13 [5,0,2] 0.1602851
14 [3,0,4] -0.319932
15 [1,4,2] -0.02412582
16 [0,1,6] -0.0001260937
17 [3,0,5] 0.02958038
18 [1,0,7] -0.01846124
19 [0,8,0] -0.3556474
20 [0,3,5] -0.01987799
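
In the result above, the indices size equals 21: the LARS selection kept only a sparse
subset of the candidate basis of total degree 8. The following snippet (a sketch, not
part of the original script) makes this comparison explicit.

.. code-block:: Python

    # Sketch: compare the number of basis functions selected by LARS
    # with the size of the full candidate basis of total degree 8.
    print("Selected basis functions:", chaosResult.getIndices().getSize())
    print("Candidate basis functions:", basisSize)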


.. GENERATED FROM PYTHON SOURCE LINES 129-130 Get the metamodel. .. GENERATED FROM PYTHON SOURCE LINES 132-134 .. code-block:: Python metamodel = chaosResult.getMetaModel() .. GENERATED FROM PYTHON SOURCE LINES 135-136 In order to validate the metamodel, we generate a test sample. .. GENERATED FROM PYTHON SOURCE LINES 138-146 .. code-block:: Python n_valid = 1000 inputTest = im.inputDistribution.getSample(n_valid) outputTest = im.model(inputTest) metamodelPredictions = metamodel(inputTest) val = ot.MetaModelValidation(outputTest, metamodelPredictions) r2Score = val.computeR2Score()[0] r2Score .. rst-class:: sphx-glr-script-out .. code-block:: none 0.9994752470145457 .. GENERATED FROM PYTHON SOURCE LINES 147-148 The :math:`R^2` is very close to 1: the metamodel is accurate. .. GENERATED FROM PYTHON SOURCE LINES 150-154 .. code-block:: Python graph = val.drawValidation() graph.setTitle("R2=%.2f%%" % (r2Score * 100)) view = otv.View(graph) .. image-sg:: /auto_meta_modeling/polynomial_chaos_metamodel/images/sphx_glr_plot_chaos_ishigami_003.png :alt: R2=99.95% :srcset: /auto_meta_modeling/polynomial_chaos_metamodel/images/sphx_glr_plot_chaos_ishigami_003.png :class: sphx-glr-single-img .. GENERATED FROM PYTHON SOURCE LINES 155-156 The metamodel has good predictive accuracy, since the points are almost on the first diagonal. .. GENERATED FROM PYTHON SOURCE LINES 158-160 Compute mean and variance ------------------------- .. GENERATED FROM PYTHON SOURCE LINES 162-165 The mean and variance of a polynomial chaos expansion are computed using the coefficients of the expansion. This can be done using :class:`~openturns.FunctionalChaosRandomVector`. .. GENERATED FROM PYTHON SOURCE LINES 167-178 .. code-block:: Python randomVector = ot.FunctionalChaosRandomVector(chaosResult) mean = randomVector.getMean() print("Mean=", mean) covarianceMatrix = randomVector.getCovariance() print("Covariance=", covarianceMatrix) outputDimension = outputTrain.getDimension() stdDev = ot.Point(outputDimension) for i in range(outputDimension): stdDev[i] = math.sqrt(covarianceMatrix[i, i]) print("Standard deviation=", stdDev) .. rst-class:: sphx-glr-script-out .. code-block:: none Mean= [3.51725] Covariance= [[ 13.7368 ]] Standard deviation= [3.70631] .. GENERATED FROM PYTHON SOURCE LINES 179-181 Compute and print Sobol' indices -------------------------------- .. GENERATED FROM PYTHON SOURCE LINES 183-190 By default, printing the object displays the Sobol' indices and the multi-indices, ordered by decreasing part of variance. If the part of variance of a multi-index is lower than some threshold, it is not printed. This threshold can be customized using the `FunctionalChaosSobolIndices-VariancePartThreshold` key of the :class:`~openturns.ResourceMap`. .. GENERATED FROM PYTHON SOURCE LINES 192-195 .. code-block:: Python chaosSI = ot.FunctionalChaosSobolIndices(chaosResult) chaosSI .. raw:: html
FunctionalChaosSobolIndices
  • input dimension: 3
  • output dimension: 1
  • basis size: 21
  • mean: [3.51725]
  • std-dev: [3.70631]
Input Variable Sobol' index Total index
0 X1 0.315184 0.557148
1 X2 0.442823 0.442894
2 X3 0.000000 0.241993
Index Multi-index Part of variance
6 [0,4,0] 0.273917
1 [1,0,0] 0.191725
5 [1,0,2] 0.135136
11 [0,6,0] 0.134366
4 [3,0,0] 0.120877
9 [3,0,2] 0.085097
3 [0,2,0] 0.025332
10 [1,0,4] 0.012279
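
As mentioned above, multi-indices whose part of variance is below the
`FunctionalChaosSobolIndices-VariancePartThreshold` value are omitted from this print.
The snippet below is a sketch (not part of the original script) showing how to lower
the threshold so that smaller contributions are also listed.

.. code-block:: Python

    # Sketch: lower the variance part threshold so that the pretty-print
    # of FunctionalChaosSobolIndices also lists smaller contributions.
    ot.ResourceMap.SetAsScalar(
        "FunctionalChaosSobolIndices-VariancePartThreshold", 1.0e-3
    )
    print(chaosSI)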


.. GENERATED FROM PYTHON SOURCE LINES 196-198 We notice that a multi-index with marginal degree equal to 6 explains a significant part of the output variance. Hence, we cannot get a satisfactory polynomial chaos with total degree less than 6. .. GENERATED FROM PYTHON SOURCE LINES 200-201 Draw Sobol' indices. .. GENERATED FROM PYTHON SOURCE LINES 203-211 .. code-block:: Python dim_input = im.inputDistribution.getDimension() first_order = [chaosSI.getSobolIndex(i) for i in range(dim_input)] total_order = [chaosSI.getSobolTotalIndex(i) for i in range(dim_input)] input_names = im.model.getInputDescription() graph = ot.SobolIndicesAlgorithm.DrawSobolIndices(input_names, first_order, total_order) view = otv.View(graph) .. image-sg:: /auto_meta_modeling/polynomial_chaos_metamodel/images/sphx_glr_plot_chaos_ishigami_004.png :alt: Sobol' indices :srcset: /auto_meta_modeling/polynomial_chaos_metamodel/images/sphx_glr_plot_chaos_ishigami_004.png :class: sphx-glr-single-img .. GENERATED FROM PYTHON SOURCE LINES 212-218 Taking interactions into account, the variable with the largest impact on the output is :math:`X_1`. We see that :math:`X_1` interacts with other variables, since its first order index is smaller than its total order index. The first order index of :math:`X_3` is close to zero: :math:`X_3` has almost no direct effect on the output and influences it only through interactions. .. GENERATED FROM PYTHON SOURCE LINES 220-222 Computing the accuracy ---------------------- .. GENERATED FROM PYTHON SOURCE LINES 224-226 An interesting property of the Ishigami function is that the exact Sobol' indices are known. We can use that property to compute the absolute error on the Sobol' indices estimated by the polynomial chaos. .. GENERATED FROM PYTHON SOURCE LINES 228-229 To make the comparisons simpler, we gather the results into a list. .. GENERATED FROM PYTHON SOURCE LINES 231-234 .. code-block:: Python S_exact = [im.S1, im.S2, im.S3] ST_exact = [im.ST1, im.ST2, im.ST3] .. GENERATED FROM PYTHON SOURCE LINES 235-236 Then we perform a loop over the input dimension and compute the absolute error on the Sobol' indices. .. GENERATED FROM PYTHON SOURCE LINES 238-246 .. code-block:: Python for i in range(im.dim): absoluteErrorS = abs(first_order[i] - S_exact[i]) absoluteErrorST = abs(total_order[i] - ST_exact[i]) print( "X%d, Abs.Err. on S=%.1e, Abs.Err. on ST=%.1e" % (i + 1, absoluteErrorS, absoluteErrorST) ) .. rst-class:: sphx-glr-script-out .. code-block:: none X1, Abs.Err. on S=1.3e-03, Abs.Err. on ST=4.4e-04 X2, Abs.Err. on S=4.1e-04, Abs.Err. on ST=4.8e-04 X3, Abs.Err. on S=4.8e-07, Abs.Err. on ST=1.7e-03 .. GENERATED FROM PYTHON SOURCE LINES 247-249 We see that the indices are estimated with small absolute errors even though we used only 100 function evaluations. This shows the good performance of the polynomial chaos in this case. .. GENERATED FROM PYTHON SOURCE LINES 251-252 We can compute the part of the variance of the output explained by each multi-index. .. GENERATED FROM PYTHON SOURCE LINES 254-271 .. 
code-block:: Python partOfVariance = chaosSI.getPartOfVariance() chaosResult = chaosSI.getFunctionalChaosResult() coefficients = chaosResult.getCoefficients() orthogonalBasis = chaosResult.getOrthogonalBasis() enumerateFunction = orthogonalBasis.getEnumerateFunction() indices = chaosResult.getIndices() basisSize = indices.getSize() print("Index, global index, multi-index, coefficient, part of variance") for i in range(basisSize): globalIndex = indices[i] multiIndex = enumerateFunction(globalIndex) if partOfVariance[i] > 1.0e-3: print( "%d, %d, %s, %.4f, %.4f" % (i, globalIndex, multiIndex, coefficients[i, 0], partOfVariance[i]) ) .. rst-class:: sphx-glr-script-out .. code-block:: none Index, global index, multi-index, coefficient, part of variance 1, 1, [1,0,0], 1.6229, 0.1917 3, 7, [0,2,0], -0.5899, 0.0253 4, 10, [3,0,0], -1.2886, 0.1209 5, 15, [1,0,2], 1.3625, 0.1351 6, 30, [0,4,0], -1.9398, 0.2739 8, 35, [5,0,0], 0.1883, 0.0026 9, 40, [3,0,2], -1.0812, 0.0851 10, 49, [1,0,4], 0.4107, 0.0123 11, 77, [0,6,0], 1.3586, 0.1344 13, 89, [5,0,2], 0.1603, 0.0019 14, 98, [3,0,4], -0.3199, 0.0075 19, 156, [0,8,0], -0.3556, 0.0092 .. GENERATED FROM PYTHON SOURCE LINES 272-273 .. code-block:: Python view.show() .. _sphx_glr_download_auto_meta_modeling_polynomial_chaos_metamodel_plot_chaos_ishigami.py: .. only:: html .. container:: sphx-glr-footer sphx-glr-footer-example .. container:: sphx-glr-download sphx-glr-download-jupyter :download:`Download Jupyter notebook: plot_chaos_ishigami.ipynb ` .. container:: sphx-glr-download sphx-glr-download-python :download:`Download Python source code: plot_chaos_ishigami.py ` .. container:: sphx-glr-download sphx-glr-download-zip :download:`Download zipped: plot_chaos_ishigami.zip `