.. rst-class:: sphx-glr-example-title

.. _sphx_glr_auto_reliability_sensitivity_central_dispersion_plot_central_tendency.py:

Central tendency analysis on the cantilever beam example
========================================================

In this example we perform a central tendency analysis of a random variable
:math:`Y` using the various methods available. We consider the
:ref:`cantilever beam ` example and show how to use the
`TaylorExpansionMoments` and `ExpectationSimulationAlgorithm` classes.

.. code-block:: default

    from __future__ import print_function
    import openturns as ot
    import openturns.viewer as viewer
    from matplotlib import pylab as plt
    ot.Log.Show(ot.Log.NONE)

We first load the data class from the usecases module:

.. code-block:: default

    from openturns.usecases import cantilever_beam as cantilever_beam
    cb = cantilever_beam.CantileverBeam()

We want to create the random variable of interest :math:`Y=g(X)` where
:math:`g(.)` is the physical model and :math:`X` is the input vector. For this
example we consider independent marginals. We set a `mean` vector and a
unitary standard deviation:

.. code-block:: default

    dim = cb.dim
    mean = [50.0, 1.0, 10.0, 5.0]
    sigma = ot.Point(dim, 1.0)
    R = ot.IdentityMatrix(dim)

We create the input parameters distribution and make a random vector:

.. code-block:: default

    distribution = ot.Normal(mean, sigma, R)
    X = ot.RandomVector(distribution)
    X.setDescription(['E', 'F', 'L', 'I'])

`f` is the cantilever beam model:

.. code-block:: default

    f = cb.model

The random variable of interest :math:`Y` is then

.. code-block:: default

    Y = ot.CompositeRandomVector(f, X)
    Y.setDescription('Y')

Taylor expansion
----------------

Perform a Taylor approximation to get the expected value of :math:`Y` and the
importance factors.
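Before calling `TaylorExpansionMoments`, it may help to see what the first-order expansion computes. Below is a minimal plain-Python sketch, assuming the usual cantilever beam deflection formula :math:`Y = FL^3/(3EI)` as a stand-in for `cb.model`, together with the means and unit standard deviations defined above. For independent inputs, the first-order variance is the sum of the squared partial derivatives at the mean point, each scaled by its standard deviation, and each term's share of that sum is the importance factor.

```python
# First-order Taylor moments by hand, assuming the deflection model
# Y = F * L^3 / (3 * E * I) (a stand-in for cb.model in this example).

def deflection(E, F, L, I):
    return F * L**3 / (3.0 * E * I)

mu = {'E': 50.0, 'F': 1.0, 'L': 10.0, 'I': 5.0}
sigma = 1.0  # unit standard deviation for every input

# Analytic partial derivatives of g at the mean point, written via g(mu)
g = deflection(**mu)
grad = {
    'E': -g / mu['E'],      # dg/dE = -F L^3 / (3 E^2 I)
    'F': g / mu['F'],       # dg/dF =  L^3 / (3 E I)
    'L': 3.0 * g / mu['L'], # dg/dL =  F L^2 / (E I)
    'I': -g / mu['I'],      # dg/dI = -F L^3 / (3 E I^2)
}

mean_first_order = g  # E[Y] ~ g(mu) at first order
variance = sum((grad[k] * sigma) ** 2 for k in grad)
importance = {k: (grad[k] * sigma) ** 2 / variance for k in grad}
```

The mean at the nominal point (about 1.33333), the first-order variance (about 2.0096) and the importance factor of :math:`F` (about 0.8846) reproduce the `TaylorExpansionMoments` results shown in the output of this example.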
.. code-block:: default

    taylor = ot.TaylorExpansionMoments(Y)
    taylor_mean_fo = taylor.getMeanFirstOrder()
    taylor_mean_so = taylor.getMeanSecondOrder()
    taylor_cov = taylor.getCovariance()
    taylor_if = taylor.getImportanceFactors()
    print('model evaluation calls number=', f.getEvaluationCallsNumber())
    print('model gradient calls number=', f.getGradientCallsNumber())
    print('model hessian calls number=', f.getHessianCallsNumber())
    print('taylor mean first order=', taylor_mean_fo)
    print('taylor variance=', taylor_cov)
    print('taylor importance factors=', taylor_if)

.. rst-class:: sphx-glr-script-out

 Out:

 .. code-block:: none

    model evaluation calls number= 1
    model gradient calls number= 1
    model hessian calls number= 1
    taylor mean first order= [1.33333]
    taylor variance= [[ 2.0096 ]]
    taylor importance factors= [E : 0.000353857, F : 0.884642, L : 0.079618, I : 0.0353857]

.. code-block:: default

    graph = taylor.drawImportanceFactors()
    view = viewer.View(graph)

.. image:: /auto_reliability_sensitivity/central_dispersion/images/sphx_glr_plot_central_tendency_001.png
    :alt: Importance Factors from Taylor expansions - Y
    :class: sphx-glr-single-img

We see that, at first order, the variable :math:`F` explains 88.5% of the
variance of the output :math:`Y`. On the other hand, the variable :math:`E` is
not significant in the variance of the output: at first order, the random
variable :math:`E` could be replaced by a constant with no change to the
output variance.

Monte-Carlo simulation
----------------------

Perform a Monte Carlo simulation of :math:`Y` to estimate its mean.

.. code-block:: default

    algo = ot.ExpectationSimulationAlgorithm(Y)
    algo.setMaximumOuterSampling(1000)
    algo.setCoefficientOfVariationCriterionType('NONE')
    algo.run()
    print('model evaluation calls number=', f.getEvaluationCallsNumber())
    expectation_result = algo.getResult()
    expectation_mean = expectation_result.getExpectationEstimate()
    print('monte carlo mean=', expectation_mean, 'var=', expectation_result.getVarianceEstimate())
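The variance returned by `getVarianceEstimate` is the variance of the mean estimator (roughly the sample variance of :math:`Y` divided by the number of outer iterations), so it yields an asymptotic confidence interval for the mean. A quick sketch using the values printed in this run:

```python
import math

mc_mean = 1.39543          # expectation estimate from this run
var_of_mean = 0.00271142   # variance of the mean estimator, not of Y itself

# Asymptotic 95% confidence interval for E[Y] (1.96 = normal quantile)
half_width = 1.96 * math.sqrt(var_of_mean)
ci = (mc_mean - half_width, mc_mean + half_width)
```

The first-order Taylor mean (1.33333) falls inside this interval, so the two estimates are consistent at the 95% level.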
.. rst-class:: sphx-glr-script-out

 Out:

 .. code-block:: none

    model evaluation calls number= 1001
    monte carlo mean= [1.39543] var= [0.00271142]

Central dispersion analysis based on a sample
---------------------------------------------

Directly compute statistical moments based on a sample of :math:`Y`. Sometimes
the probabilistic model is not available and the study has to start from the
data.

.. code-block:: default

    Y_s = Y.getSample(1000)
    y_mean = Y_s.computeMean()
    y_stddev = Y_s.computeStandardDeviationPerComponent()
    y_quantile_95p = Y_s.computeQuantilePerComponent(0.95)
    print('mean=', y_mean, 'stddev=', y_stddev, 'quantile@95%', y_quantile_95p)

.. rst-class:: sphx-glr-script-out

 Out:

 .. code-block:: none

    mean= [1.3887] stddev= [1.61762] quantile@95% [4.21421]

.. code-block:: default

    graph = ot.KernelSmoothing().build(Y_s).drawPDF()
    graph.setTitle("Kernel smoothing approximation of the output distribution")
    view = viewer.View(graph)
    plt.show()

.. image:: /auto_reliability_sensitivity/central_dispersion/images/sphx_glr_plot_central_tendency_002.png
    :alt: Kernel smoothing approximation of the output distribution
    :class: sphx-glr-single-img

.. rst-class:: sphx-glr-timing

   **Total running time of the script:** ( 0 minutes 0.133 seconds)
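For reference, the sample statistics used in the last section can also be sketched with the Python standard library alone. The helper names below are ours, not the OpenTURNS API, and the quantile rule is one common convention (smallest order statistic whose empirical CDF reaches `p`), which may differ slightly from `computeQuantilePerComponent`:

```python
import math

def sample_mean(xs):
    # Arithmetic mean, as in Sample.computeMean
    return sum(xs) / len(xs)

def sample_stddev(xs):
    # Unbiased (n - 1) standard deviation estimator
    m = sample_mean(xs)
    return math.sqrt(sum((x - m) ** 2 for x in xs) / (len(xs) - 1))

def empirical_quantile(xs, p):
    # Smallest order statistic x such that (#points <= x) / n >= p
    s = sorted(xs)
    k = max(0, math.ceil(p * len(s)) - 1)
    return s[k]
```

For instance, on the toy sample `[1, 2, ..., 10]` these return a mean of 5.5, a standard deviation of about 3.028, and a 95% empirical quantile of 10.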