.. DO NOT EDIT.
.. THIS FILE WAS AUTOMATICALLY GENERATED BY SPHINX-GALLERY.
.. TO MAKE CHANGES, EDIT THE SOURCE PYTHON FILE:
.. "auto_meta_modeling/polynomial_chaos_metamodel/plot_functional_chaos_database.py"
.. LINE NUMBERS ARE GIVEN BELOW.

.. only:: html

    .. note::
        :class: sphx-glr-download-link-note

        :ref:`Go to the end ` to download the full example code.

.. rst-class:: sphx-glr-example-title

.. _sphx_glr_auto_meta_modeling_polynomial_chaos_metamodel_plot_functional_chaos_database.py:


Create a full or sparse polynomial chaos expansion
==================================================

.. GENERATED FROM PYTHON SOURCE LINES 6-16

In this example we create a global approximation of a model using
polynomial chaos expansion based on a design of experiments.
The goal of this example is to show how we can create a full or sparse
polynomial chaos expansion depending on our needs
and depending on the number of observations we have.
In general, we should have more observations than coefficients to estimate.
This is why a sparse polynomial chaos expansion may be interesting:
by carefully selecting the coefficients we estimate,
we may reduce overfitting and improve the predictions of the metamodel.

.. GENERATED FROM PYTHON SOURCE LINES 18-22

.. code-block:: Python

    import openturns as ot

    ot.Log.Show(ot.Log.NONE)

.. GENERATED FROM PYTHON SOURCE LINES 23-25

Define the model
~~~~~~~~~~~~~~~~

.. GENERATED FROM PYTHON SOURCE LINES 27-28

Create the function.

.. GENERATED FROM PYTHON SOURCE LINES 28-32

.. code-block:: Python

    myModel = ot.SymbolicFunction(
        ["x1", "x2", "x3", "x4"], ["1 + x1 * x2 + 2 * x3^2 + x4^4"]
    )

.. GENERATED FROM PYTHON SOURCE LINES 33-34

Create a multivariate distribution.

.. GENERATED FROM PYTHON SOURCE LINES 34-38

.. code-block:: Python

    distribution = ot.JointDistribution(
        [ot.Normal(), ot.Uniform(), ot.Gamma(2.75, 1.0), ot.Beta(2.5, 1.0, -1.0, 2.0)]
    )

.. GENERATED FROM PYTHON SOURCE LINES 39-45

In order to create the PCE, we can specify the distribution of the
input parameters.
If it is not known, statistical inference can be used to select a possible
candidate, and fitting tests can validate such a hypothesis.
Please read :doc:`Fit a distribution from an input sample ` for an example
of this method.

.. GENERATED FROM PYTHON SOURCE LINES 47-49

Create a training sample
~~~~~~~~~~~~~~~~~~~~~~~~

.. GENERATED FROM PYTHON SOURCE LINES 51-52

Create a pair of input and output samples.

.. GENERATED FROM PYTHON SOURCE LINES 52-56

.. code-block:: Python

    sampleSize = 250
    inputSample = distribution.getSample(sampleSize)
    outputSample = myModel(inputSample)

.. GENERATED FROM PYTHON SOURCE LINES 57-59

Build the orthogonal basis
~~~~~~~~~~~~~~~~~~~~~~~~~~

.. GENERATED FROM PYTHON SOURCE LINES 61-63

In the next cell, we create the univariate orthogonal polynomial basis
for each marginal.

.. GENERATED FROM PYTHON SOURCE LINES 63-71

.. code-block:: Python

    inputDimension = inputSample.getDimension()
    coll = [
        ot.StandardDistributionPolynomialFactory(distribution.getMarginal(i))
        for i in range(inputDimension)
    ]
    enumerateFunction = ot.LinearEnumerateFunction(inputDimension)
    productBasis = ot.OrthogonalProductPolynomialFactory(coll, enumerateFunction)

.. GENERATED FROM PYTHON SOURCE LINES 72-73

We can achieve the same result using :class:`~openturns.OrthogonalProductPolynomialFactory`.

.. GENERATED FROM PYTHON SOURCE LINES 73-81

.. code-block:: Python

    marginalDistributionCollection = [
        distribution.getMarginal(i) for i in range(inputDimension)
    ]
    multivariateBasis = ot.OrthogonalProductPolynomialFactory(
        marginalDistributionCollection
    )
    multivariateBasis

.. raw:: html
    <table>
      <tr><th>Index</th><th>Name</th><th>Distribution</th><th>Univariate polynomial</th></tr>
      <tr><td>0</td><td>X0</td><td>Normal</td><td>HermiteFactory</td></tr>
      <tr><td>1</td><td>X1</td><td>Uniform</td><td>LegendreFactory</td></tr>
      <tr><td>2</td><td>X2</td><td>Gamma</td><td>LaguerreFactory</td></tr>
      <tr><td>3</td><td>X3</td><td>Beta</td><td>JacobiFactory</td></tr>
    </table>

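As a side note, the size of the candidate basis produced by the linear
enumeration rule can be checked by hand: the number of multi-indices of
total degree at most :math:`p` in dimension :math:`d` is
:math:`\binom{d+p}{p}`. Below is a quick standard-library cross-check of the
basis sizes used later in this example; the helper name ``full_basis_size``
is ours, not an OpenTURNS function.

```python
from math import comb


def full_basis_size(dimension, total_degree):
    # Number of multi-indices alpha with |alpha| <= total_degree:
    # binomial(dimension + total_degree, total_degree).
    return comb(dimension + total_degree, total_degree)


print(full_basis_size(4, 3))  # 35, the degree-3 candidate basis below
print(full_basis_size(4, 6))  # 210, the degree-6 candidate basis below
```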
.. GENERATED FROM PYTHON SOURCE LINES 82-84

Create a full PCE
~~~~~~~~~~~~~~~~~

.. GENERATED FROM PYTHON SOURCE LINES 86-98

Create the algorithm.
We compute the basis size from the total degree.
The next lines use the :class:`~openturns.LeastSquaresStrategy` class
with default parameters (the default is the
:class:`~openturns.PenalizedLeastSquaresAlgorithmFactory` class).
This creates a full polynomial chaos expansion, i.e.
we keep all the candidate coefficients produced by the enumeration rule.
In order to create a sparse polynomial chaos expansion, we must use
the :class:`~openturns.LeastSquaresMetaModelSelectionFactory` class instead.

.. GENERATED FROM PYTHON SOURCE LINES 98-110

.. code-block:: Python

    totalDegree = 3
    candidateBasisSize = enumerateFunction.getBasisSizeFromTotalDegree(totalDegree)
    print("Candidate basis size = ", candidateBasisSize)
    adaptiveStrategy = ot.FixedStrategy(productBasis, candidateBasisSize)
    projectionStrategy = ot.LeastSquaresStrategy()
    algo = ot.FunctionalChaosAlgorithm(
        inputSample, outputSample, distribution, adaptiveStrategy, projectionStrategy
    )
    algo.run()
    result = algo.getResult()
    result

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    Candidate basis size =  35

.. raw:: html
    <p>FunctionalChaosResult</p>
    <table>
      <tr><th>Index</th><th>Multi-index</th><th>Coeff.</th></tr>
      <tr><td>0</td><td>[0,0,0,0]</td><td>26.10131</td></tr>
      <tr><td>1</td><td>[1,0,0,0]</td><td>0.01833501</td></tr>
      <tr><td>2</td><td>[0,1,0,0]</td><td>0.02643858</td></tr>
      <tr><td>3</td><td>[0,0,1,0]</td><td>24.89616</td></tr>
      <tr><td>4</td><td>[0,0,0,1]</td><td>3.998141</td></tr>
      <tr><td>5</td><td>[2,0,0,0]</td><td>-0.007498848</td></tr>
      <tr><td>6</td><td>[1,1,0,0]</td><td>0.5836605</td></tr>
      <tr><td>7</td><td>[1,0,1,0]</td><td>0.0008685963</td></tr>
      <tr><td>8</td><td>[1,0,0,1]</td><td>-0.01786958</td></tr>
      <tr><td>9</td><td>[0,2,0,0]</td><td>0.005226953</td></tr>
      <tr><td>10</td><td>[0,1,1,0]</td><td>0.02195566</td></tr>
      <tr><td>11</td><td>[0,1,0,1]</td><td>-0.06375109</td></tr>
      <tr><td>12</td><td>[0,0,2,0]</td><td>9.063002</td></tr>
      <tr><td>13</td><td>[0,0,1,1]</td><td>-0.02439593</td></tr>
      <tr><td>14</td><td>[0,0,0,2]</td><td>2.366967</td></tr>
      <tr><td>15</td><td>[3,0,0,0]</td><td>0.003033619</td></tr>
      <tr><td>16</td><td>[2,1,0,0]</td><td>-0.01237667</td></tr>
      <tr><td>17</td><td>[2,0,1,0]</td><td>-0.01342518</td></tr>
      <tr><td>18</td><td>[2,0,0,1]</td><td>0.01335615</td></tr>
      <tr><td>19</td><td>[1,2,0,0]</td><td>-0.009671841</td></tr>
      <tr><td>20</td><td>[1,1,1,0]</td><td>0.004816446</td></tr>
      <tr><td>21</td><td>[1,1,0,1]</td><td>-0.02295916</td></tr>
      <tr><td>22</td><td>[1,0,2,0]</td><td>-0.01663346</td></tr>
      <tr><td>23</td><td>[1,0,1,1]</td><td>0.007606394</td></tr>
      <tr><td>24</td><td>[1,0,0,2]</td><td>0.03900244</td></tr>
      <tr><td>25</td><td>[0,3,0,0]</td><td>0.02087256</td></tr>
      <tr><td>26</td><td>[0,2,1,0]</td><td>-0.01402968</td></tr>
      <tr><td>27</td><td>[0,2,0,1]</td><td>-0.04129633</td></tr>
      <tr><td>28</td><td>[0,1,2,0]</td><td>-0.01182831</td></tr>
      <tr><td>29</td><td>[0,1,1,1]</td><td>-0.01354742</td></tr>
      <tr><td>30</td><td>[0,1,0,2]</td><td>0.09136447</td></tr>
      <tr><td>31</td><td>[0,0,3,0]</td><td>0.02163227</td></tr>
      <tr><td>32</td><td>[0,0,2,1]</td><td>0.02992845</td></tr>
      <tr><td>33</td><td>[0,0,1,2]</td><td>0.04678639</td></tr>
      <tr><td>34</td><td>[0,0,0,3]</td><td>1.057842</td></tr>
    </table>

.. GENERATED FROM PYTHON SOURCE LINES 111-112

Get the number of coefficients in the PCE.

.. GENERATED FROM PYTHON SOURCE LINES 112-115

.. code-block:: Python

    selectedBasisSizeFull = result.getIndices().getSize()
    print("Selected basis size = ", selectedBasisSizeFull)

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    Selected basis size =  35

.. GENERATED FROM PYTHON SOURCE LINES 116-119

We see that the number of coefficients in the selected basis is equal to
the number of coefficients in the candidate basis.
This is, indeed, a *full* PCE.

.. GENERATED FROM PYTHON SOURCE LINES 121-123

Use the PCE
~~~~~~~~~~~

.. GENERATED FROM PYTHON SOURCE LINES 125-126

Get the metamodel function.

.. GENERATED FROM PYTHON SOURCE LINES 126-128

.. code-block:: Python

    metamodel = result.getMetaModel()

.. GENERATED FROM PYTHON SOURCE LINES 129-131

In order to evaluate the metamodel on a single point, we just use it
as any other :class:`~openturns.Function`.

.. GENERATED FROM PYTHON SOURCE LINES 131-135

.. code-block:: Python

    xPoint = distribution.getMean()
    yPoint = metamodel(xPoint)
    print("Value at ", xPoint, " is ", yPoint)

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    Value at  [0,0,2.75,1.14286]  is  [17.7455]

.. GENERATED FROM PYTHON SOURCE LINES 136-137

Print the residuals.

.. GENERATED FROM PYTHON SOURCE LINES 137-139

.. code-block:: Python

    result.getResiduals()

.. raw:: html
    <p>class=Point name=Unnamed dimension=1 values=[0.010775]</p>

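The residual printed above aggregates the pointwise errors of the metamodel
on the training sample into a single number. A minimal standard-library
sketch of one common convention (the root of the sum of squared errors
divided by the sample size) is shown below; the exact normalization used by
OpenTURNS may differ in detail, and the toy data is ours, not taken from
this example.

```python
from math import sqrt


def normalized_residual(observed, predicted):
    # Hedged sketch: root-sum-of-squares of the pointwise errors,
    # divided by the sample size. OpenTURNS may normalize differently.
    n = len(observed)
    return sqrt(sum((y - yhat) ** 2 for y, yhat in zip(observed, predicted))) / n


# Toy data: a perfect fit gives 0, a 0.5 error on one of three points
# gives sqrt(0.25) / 3, i.e. about 0.167.
print(normalized_residual([1.0, 2.0, 3.0], [1.0, 2.0, 3.5]))
```

A residual close to zero on the training sample is necessary but not
sufficient; it says nothing about behavior on unseen points.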
.. GENERATED FROM PYTHON SOURCE LINES 140-143

Based on these results, we may want to validate our metamodel.
More details on this topic are presented in
:doc:`Validate a polynomial chaos `.

.. GENERATED FROM PYTHON SOURCE LINES 145-147

Create a sparse PCE
~~~~~~~~~~~~~~~~~~~

.. GENERATED FROM PYTHON SOURCE LINES 149-153

In order to create a sparse polynomial chaos expansion, we use the
:class:`~openturns.LeastSquaresMetaModelSelectionFactory` class instead.

.. GENERATED FROM PYTHON SOURCE LINES 153-166

.. code-block:: Python

    totalDegree = 6
    candidateBasisSize = enumerateFunction.getBasisSizeFromTotalDegree(totalDegree)
    print("Candidate basis size = ", candidateBasisSize)
    adaptiveStrategy = ot.FixedStrategy(productBasis, candidateBasisSize)
    selectionAlgorithm = ot.LeastSquaresMetaModelSelectionFactory()
    projectionStrategy = ot.LeastSquaresStrategy(selectionAlgorithm)
    algo = ot.FunctionalChaosAlgorithm(
        inputSample, outputSample, distribution, adaptiveStrategy, projectionStrategy
    )
    algo.run()
    result = algo.getResult()
    result

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    Candidate basis size =  210

.. raw:: html
    <p>FunctionalChaosResult</p>
    <table>
      <tr><th>Index</th><th>Multi-index</th><th>Coeff.</th></tr>
      <tr><td>0</td><td>[0,0,0,0]</td><td>26.07564</td></tr>
      <tr><td>1</td><td>[1,0,0,0]</td><td>-0.003290232</td></tr>
      <tr><td>2</td><td>[0,0,1,0]</td><td>24.86277</td></tr>
      <tr><td>3</td><td>[0,0,0,1]</td><td>4.00878</td></tr>
      <tr><td>4</td><td>[1,1,0,0]</td><td>0.5800043</td></tr>
      <tr><td>5</td><td>[1,0,1,0]</td><td>0.005553137</td></tr>
      <tr><td>6</td><td>[0,1,0,1]</td><td>-0.03536237</td></tr>
      <tr><td>7</td><td>[0,0,2,0]</td><td>9.006102</td></tr>
      <tr><td>8</td><td>[0,0,0,2]</td><td>2.32328</td></tr>
      <tr><td>9</td><td>[2,0,1,0]</td><td>-0.01830508</td></tr>
      <tr><td>10</td><td>[1,1,1,0]</td><td>-0.001079013</td></tr>
      <tr><td>11</td><td>[1,0,2,0]</td><td>-0.01280951</td></tr>
      <tr><td>12</td><td>[0,1,0,2]</td><td>0.07909247</td></tr>
      <tr><td>13</td><td>[0,0,0,3]</td><td>1.0497</td></tr>
      <tr><td>14</td><td>[0,2,0,2]</td><td>0.03390499</td></tr>
      <tr><td>15</td><td>[0,0,2,2]</td><td>-0.09301932</td></tr>
      <tr><td>16</td><td>[2,3,0,0]</td><td>0.006792632</td></tr>
      <tr><td>17</td><td>[2,2,1,0]</td><td>-0.001824762</td></tr>
      <tr><td>18</td><td>[1,0,1,3]</td><td>-0.00731192</td></tr>
      <tr><td>19</td><td>[0,1,0,4]</td><td>0.04903297</td></tr>
      <tr><td>20</td><td>[0,4,2,0]</td><td>0.01100148</td></tr>
      <tr><td>21</td><td>[0,2,2,2]</td><td>0.04061781</td></tr>
      <tr><td>22</td><td>[0,1,1,4]</td><td>-0.01867403</td></tr>
      <tr><td>23</td><td>[0,0,2,4]</td><td>-0.09888105</td></tr>
    </table>

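The benefit of selection is easy to quantify with a back-of-the-envelope
check using only the standard library: with 250 observations, the full
degree-6 candidate basis of 210 terms leaves barely more than one
observation per coefficient, while the 24 coefficients retained above
(indices 0 to 23) leave roughly ten. The figures below come from this
example; only the arithmetic is ours.

```python
from math import comb

sample_size = 250                  # size of the training sample above
dimension, total_degree = 4, 6
candidate = comb(dimension + total_degree, total_degree)  # 210 candidate terms
selected = 24                      # terms kept by the selection algorithm

print(sample_size / candidate)     # about 1.2 observations per candidate term
print(sample_size / selected)      # about 10.4 observations per selected term
```

A higher observations-per-coefficient ratio is what makes the sparse
least-squares fit better conditioned and less prone to overfitting.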
.. GENERATED FROM PYTHON SOURCE LINES 167-168

Get the number of coefficients in the PCE.

.. GENERATED FROM PYTHON SOURCE LINES 168-171

.. code-block:: Python

    selectedBasisSizeSparse = result.getIndices().getSize()
    print("Selected basis size = ", selectedBasisSizeSparse)

.. rst-class:: sphx-glr-script-out

.. code-block:: none

    Selected basis size =  24

.. GENERATED FROM PYTHON SOURCE LINES 172-176

We see that the number of selected coefficients is lower than the number of
candidate coefficients.
This may reduce overfitting and can produce a PCE with more accurate
predictions.


.. _sphx_glr_download_auto_meta_modeling_polynomial_chaos_metamodel_plot_functional_chaos_database.py:

.. only:: html

  .. container:: sphx-glr-footer sphx-glr-footer-example

    .. container:: sphx-glr-download sphx-glr-download-jupyter

      :download:`Download Jupyter notebook: plot_functional_chaos_database.ipynb `

    .. container:: sphx-glr-download sphx-glr-download-python

      :download:`Download Python source code: plot_functional_chaos_database.py `