.. DO NOT EDIT.
.. THIS FILE WAS AUTOMATICALLY GENERATED BY SPHINX-GALLERY.
.. TO MAKE CHANGES, EDIT THE SOURCE PYTHON FILE:
.. "auto_meta_modeling/polynomial_chaos_metamodel/plot_chaos_cantilever_beam_integration.py"
.. LINE NUMBERS ARE GIVEN BELOW.

.. only:: html

    .. note::
        :class: sphx-glr-download-link-note

        :ref:`Go to the end <sphx_glr_download_auto_meta_modeling_polynomial_chaos_metamodel_plot_chaos_cantilever_beam_integration.py>`
        to download the full example code.

.. rst-class:: sphx-glr-example-title

.. _sphx_glr_auto_meta_modeling_polynomial_chaos_metamodel_plot_chaos_cantilever_beam_integration.py:


Create a polynomial chaos metamodel by integration on the cantilever beam
==========================================================================

.. GENERATED FROM PYTHON SOURCE LINES 7-17

In this example, we create a polynomial chaos metamodel by integration on the
:ref:`cantilever beam ` example.
We choose to evaluate the coefficients of the chaos decomposition by
integration using various kinds of design of experiments:

- Gauss product,
- Latin hypercube sampling,
- quasi-Monte Carlo with a Sobol' sequence.

We will compare the results obtained on each design.

.. GENERATED FROM PYTHON SOURCE LINES 19-24

.. code-block:: Python

    from openturns.usecases import cantilever_beam
    import openturns as ot
    import openturns.viewer as otv

.. GENERATED FROM PYTHON SOURCE LINES 25-26

We first load the model from the usecases module:

.. GENERATED FROM PYTHON SOURCE LINES 26-28

.. code-block:: Python

    cb = cantilever_beam.CantileverBeam()

.. GENERATED FROM PYTHON SOURCE LINES 29-31

In this example, we consider all marginals independent.
They are defined in the
:class:`~openturns.usecases.cantilever_beam.CantileverBeam` class:

.. GENERATED FROM PYTHON SOURCE LINES 31-37

.. code-block:: Python

    dist_E = cb.E
    dist_F = cb.F
    dist_L = cb.L
    dist_I = cb.II
    distribution = cb.independentDistribution

.. GENERATED FROM PYTHON SOURCE LINES 38-39

We load the model.

.. GENERATED FROM PYTHON SOURCE LINES 39-43

.. code-block:: Python

    dim_input = cb.dim  # dimension of the input
    dim_output = 1  # dimension of the output
    g = cb.model

.. GENERATED FROM PYTHON SOURCE LINES 44-46

Create a polynomial chaos decomposition
---------------------------------------

.. GENERATED FROM PYTHON SOURCE LINES 48-50

We create the multivariate polynomial basis by tensorization of the
univariate polynomials and the default linear enumerate rule.

.. GENERATED FROM PYTHON SOURCE LINES 50-54

.. code-block:: Python

    multivariateBasis = ot.OrthogonalProductPolynomialFactory(
        [dist_E, dist_F, dist_L, dist_I]
    )

.. GENERATED FROM PYTHON SOURCE LINES 55-59

In this case, we select :math:`P` using the
:meth:`~openturns.EnumerateFunction.getBasisSizeFromTotalDegree` method,
so that all polynomials with total degree less than or equal to 5 are used.
This leads to the computation of :math:`\binom{4 + 5}{5} = 126` coefficients.

.. GENERATED FROM PYTHON SOURCE LINES 59-64

.. code-block:: Python

    totalDegree = 5
    enum_func = multivariateBasis.getEnumerateFunction()
    basisSize = enum_func.getBasisSizeFromTotalDegree(totalDegree)
    print(f"Basis size = {basisSize}")

.. rst-class:: sphx-glr-script-out

 .. code-block:: none

    Basis size = 126

.. GENERATED FROM PYTHON SOURCE LINES 65-67

We select the :class:`~openturns.FixedStrategy` truncation rule, which
corresponds to using the first :math:`P` polynomials of the polynomial basis.

.. GENERATED FROM PYTHON SOURCE LINES 67-69

.. code-block:: Python

    adaptiveStrategy = ot.FixedStrategy(multivariateBasis, basisSize)
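As a quick sketch (not part of the generated example), we can print the first
multi-indices produced by the linear enumerate rule: each multi-index gives the
degree of the univariate polynomial attached to each of the 4 inputs, and the
truncation keeps the first :math:`P` of them.

.. code-block:: Python

    # Illustrative check only: list the first multi-indices of the basis.
    for i in range(5):
        print(f"Multi-index {i}: {enum_func(i)}")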
.. GENERATED FROM PYTHON SOURCE LINES 70-73

We begin by getting the standard measure associated with the multivariate
polynomial basis.
We see that the range of the `Beta` distribution has been standardized into
the `[-1, 1]` interval.
This is the same for the `Uniform` distribution and the second `Beta`
distribution.

.. GENERATED FROM PYTHON SOURCE LINES 73-76

.. code-block:: Python

    measure = multivariateBasis.getMeasure()
    print(f"Measure = {measure}")

.. rst-class:: sphx-glr-script-out

 .. code-block:: none

    Measure = JointDistribution(Beta(alpha = 0.9, beta = 3.5, a = -1, b = 1), LogNormal(muLog = 5.69881, sigmaLog = 0.0997513, gamma = 0), Uniform(a = -1, b = 1), Beta(alpha = 2.5, beta = 4, a = -1, b = 1), IndependentCopula(dimension = 4))

.. GENERATED FROM PYTHON SOURCE LINES 77-79

The choice of the :class:`~openturns.GaussProductExperiment` rule with 4 nodes
in each of the 4 dimensions leads to :math:`4^4 = 256` evaluations of the model.

.. GENERATED FROM PYTHON SOURCE LINES 79-85

.. code-block:: Python

    marginalSizes = [4] * dim_input
    experiment = ot.GaussProductExperiment(distribution, marginalSizes)
    print(f"N={experiment.getSize()}")
    X, W = experiment.generateWithWeights()
    Y = g(X)

.. rst-class:: sphx-glr-script-out

 .. code-block:: none

    N=256

.. GENERATED FROM PYTHON SOURCE LINES 86-87

We now set the method used to compute the coefficients; we select the
integration method.

.. GENERATED FROM PYTHON SOURCE LINES 87-89

.. code-block:: Python

    projectionStrategy = ot.IntegrationStrategy()

.. GENERATED FROM PYTHON SOURCE LINES 90-91

We can now create the functional chaos.

.. GENERATED FROM PYTHON SOURCE LINES 91-96

.. code-block:: Python

    algo = ot.FunctionalChaosAlgorithm(
        X, W, Y, distribution, adaptiveStrategy, projectionStrategy
    )
    algo.run()

.. GENERATED FROM PYTHON SOURCE LINES 97-98

Get the result.

.. GENERATED FROM PYTHON SOURCE LINES 98-100

.. code-block:: Python

    result = algo.getResult()

.. GENERATED FROM PYTHON SOURCE LINES 101-102

The :meth:`~openturns.FunctionalChaosResult.getMetaModel` method returns the
metamodel function.

.. GENERATED FROM PYTHON SOURCE LINES 102-104

.. code-block:: Python

    metamodel = result.getMetaModel()

.. GENERATED FROM PYTHON SOURCE LINES 105-107

Validate the metamodel
----------------------

.. GENERATED FROM PYTHON SOURCE LINES 109-110

Generate a new validation sample (which is independent of the training sample).

.. GENERATED FROM PYTHON SOURCE LINES 110-114

.. code-block:: Python

    n_valid = 1000
    X_test = distribution.getSample(n_valid)
    Y_test = g(X_test)

.. GENERATED FROM PYTHON SOURCE LINES 115-117

The :class:`~openturns.MetaModelValidation` class validates the metamodel
based on a validation sample.

.. GENERATED FROM PYTHON SOURCE LINES 117-120

.. code-block:: Python

    metamodelPredictions = metamodel(X_test)
    val = ot.MetaModelValidation(Y_test, metamodelPredictions)

.. GENERATED FROM PYTHON SOURCE LINES 121-122

Compute the :math:`R^2` coefficient of determination.

.. GENERATED FROM PYTHON SOURCE LINES 122-125

.. code-block:: Python

    r2Score = val.computeR2Score()[0]
    r2Score

.. rst-class:: sphx-glr-script-out

 .. code-block:: none

    0.9999971399358337

.. GENERATED FROM PYTHON SOURCE LINES 126-127

Plot the observed versus the predicted outputs.

.. GENERATED FROM PYTHON SOURCE LINES 127-131

.. code-block:: Python

    graph = val.drawValidation()
    graph.setTitle(f"Gauss product N={experiment.getSize()} - R2={r2Score * 100:.2f}")
    view = otv.View(graph)

.. image-sg:: /auto_meta_modeling/polynomial_chaos_metamodel/images/sphx_glr_plot_chaos_cantilever_beam_integration_001.svg
   :alt: Gauss product N=256 - R2=100.00
   :srcset: /auto_meta_modeling/polynomial_chaos_metamodel/images/sphx_glr_plot_chaos_cantilever_beam_integration_001.svg
   :class: sphx-glr-single-img
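To make the link with the integration method explicit, recall that each
coefficient is estimated by the quadrature rule
:math:`c_k \approx \sum_{i=1}^N w_i \, \psi_k(z_i) \, y_i`, where the
:math:`z_i` are the standardized nodes.
Since the first basis polynomial is the constant :math:`\psi_0 = 1`, the first
coefficient reduces to the weighted sum of the outputs.
The sketch below (not part of the generated example; the variable names are
illustrative) recomputes it from the Gauss nodes and weights and compares it
with the stored coefficient:

.. code-block:: Python

    # Illustrative check only: recompute the first chaos coefficient by
    # quadrature.  Since psi_0 = 1, c_0 is the weighted sum of the outputs.
    c0_by_hand = sum(W[i] * Y[i][0] for i in range(len(W)))
    c0_stored = result.getCoefficients()[0][0]
    print(f"c0 by hand = {c0_by_hand}, c0 stored = {c0_stored}")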
.. GENERATED FROM PYTHON SOURCE LINES 132-133

Now repeat the same process on various designs.
The helper below generates the design of experiments from the given rule,
evaluates the model on it, and validates the resulting metamodel on a fresh
test sample.

.. GENERATED FROM PYTHON SOURCE LINES 133-155

.. code-block:: Python

    def draw_validation(experiment):
        # Generate the design (local X, W, Y) and evaluate the model on it.
        X, W = experiment.generateWithWeights()
        Y = g(X)
        projectionStrategy = ot.IntegrationStrategy()
        algo = ot.FunctionalChaosAlgorithm(
            X, W, Y, distribution, adaptiveStrategy, projectionStrategy
        )
        algo.run()
        result = algo.getResult()
        metamodel = result.getMetaModel()
        X_test = distribution.getSample(n_valid)
        Y_test = g(X_test)
        metamodelPredictions = metamodel(X_test)
        val = ot.MetaModelValidation(Y_test, metamodelPredictions)
        r2Score = val.computeR2Score()[0]
        graph = val.drawValidation()
        graph.setTitle(
            f"{experiment.__class__.__name__} - N={experiment.getSize()} - R2={r2Score * 100:.2f}"
        )
        return graph

.. GENERATED FROM PYTHON SOURCE LINES 156-157

Use an LHS design.

.. GENERATED FROM PYTHON SOURCE LINES 157-161

.. code-block:: Python

    experiment = ot.LHSExperiment(distribution, int(1e6))
    graph = draw_validation(experiment)
    view = otv.View(graph)

.. image-sg:: /auto_meta_modeling/polynomial_chaos_metamodel/images/sphx_glr_plot_chaos_cantilever_beam_integration_002.svg
   :alt: LHSExperiment - N=1000000 - R2=-1165584.94
   :srcset: /auto_meta_modeling/polynomial_chaos_metamodel/images/sphx_glr_plot_chaos_cantilever_beam_integration_002.svg
   :class: sphx-glr-single-img

.. GENERATED FROM PYTHON SOURCE LINES 162-163

Use a low-discrepancy experiment (quasi-Monte Carlo).

.. GENERATED FROM PYTHON SOURCE LINES 163-170

.. code-block:: Python

    sequence = ot.SobolSequence()
    experiment = ot.LowDiscrepancyExperiment(sequence, distribution, int(1e5))
    graph = draw_validation(experiment)
    view = otv.View(graph)

    otv.View.ShowAll()

.. image-sg:: /auto_meta_modeling/polynomial_chaos_metamodel/images/sphx_glr_plot_chaos_cantilever_beam_integration_003.svg
   :alt: LowDiscrepancyExperiment - N=100000 - R2=-254100.88
   :srcset: /auto_meta_modeling/polynomial_chaos_metamodel/images/sphx_glr_plot_chaos_cantilever_beam_integration_003.svg
   :class: sphx-glr-single-img

.. GENERATED FROM PYTHON SOURCE LINES 171-178

Conclusion
----------

With the Gauss product rule, the coefficients are computed accurately: the
:math:`R^2` score is excellent even with a relatively small number of model
evaluations (256 points).
On the other hand, the LHS and low-discrepancy experiments require many more
points to reach an :math:`R^2` score greater than 99%.


.. _sphx_glr_download_auto_meta_modeling_polynomial_chaos_metamodel_plot_chaos_cantilever_beam_integration.py:

.. only:: html

  .. container:: sphx-glr-footer sphx-glr-footer-example

    .. container:: sphx-glr-download sphx-glr-download-jupyter

      :download:`Download Jupyter notebook: plot_chaos_cantilever_beam_integration.ipynb <plot_chaos_cantilever_beam_integration.ipynb>`

    .. container:: sphx-glr-download sphx-glr-download-python

      :download:`Download Python source code: plot_chaos_cantilever_beam_integration.py <plot_chaos_cantilever_beam_integration.py>`

    .. container:: sphx-glr-download sphx-glr-download-zip

      :download:`Download zipped: plot_chaos_cantilever_beam_integration.zip <plot_chaos_cantilever_beam_integration.zip>`